So the more blunt-spoken listeners and watchers told me whenever I turned up the TV volume.
I pretended it wasn’t true, because I could still communicate one-on-one at close range; but in meetings all I heard was “Rhubarb, rhubarb.”
Sue, of Westshore Hearing Solutions, put me in a glass-fronted test cubicle. Between the beeps of varied pitch and increasing loudness that she sent through earphones (“press the button when you hear it”) I wondered whether my ears could be tuned up as good as new. This seemed unlikely. Was I really hearing the faintest sounds, or just imagining them?
Sue told me her gadgets were calibrated to mark the difference, but I still felt edgy. I was neurotic about my ears. Me versus auditory science – that seemed to be the conflict. It’s lucky for me I can engage in it. For some people, the only hopes are surgery or sign-language.
Sue rolled out words and diagrams to explain the mechanics of hearing. She checked my ear canals for wax and showed how my ears were failing to pick up the full sound of the higher notes in women’s voices (the left ear worse than the right).
She then calculated how much amplification she would ask the technicians to build into the ear-aids to make my eardrums vibrate more precisely.
At home I rummaged through the Internet in search of information about ear-aids of the future. I skipped the 2012 stuff. Sue and colleagues were in charge of that. Could humans be transformed into cyborgs, part human, part machine, with distant hearing and X-ray vision? I am already a mild cyborg because of the artificial lenses that the eye-surgeon grafted in to fix the cataracts. On the Internet I found a picture of a further-advanced cyborg: Michael Chorost, born deaf but endowed with hearing by a brain implant.
Pop-science commentator Michael Anissimov of the Lifeboat Foundation Scientific Advisory Board predicts in his essay “Top Ten Transhumanist Technologies” that the upgrades of the 2020s and ’30s will include not only hearing implants, but also artificial bones, muscles and organs, and brain-computer interfaces.
Back in the 2012 world, my ear-aids were ready. Instructor Sue pulled open each sound-catching device and showed me how to insert the tiny round batteries, plug in the earpieces and slip the receivers behind my ears.
The ear-aids did a great job — I think. They brought back the long-forgotten buzzing, shrilling and rustling of cars and trees. Sometimes I still asked people to repeat what they had said, unsure whether I was hearing accurately or making up meanings.
Putting on the ear-aids had seemed easy when Sue showed me. At home it became complicated. Distinguishing the back of the ear from the front was tricky.
“So I’m spatially challenged, I have to live with it,” I told myself, and finally twisted the gadgets into place. Next day I left them in their box.
Why do I still leave them in the box 15 days out of 30, efficient though they are, after three return visits and adjustments at the hearing centre?
I suspect it’s because of the fiddly concentration that is needed to manoeuvre these small objects, which are as valuable as jewels.
It’s also because of indolent tech-avoidance.
I doubt that ear-improving behaviour will ever come as naturally to me as putting on glasses.
• G.E. Mortimore is a longtime columnist with the Goldstream Gazette.