By: Quinn Keast
Can you hear me? No, but I can read you. A hearing-impaired product designer has his team experience his world and says we need to make work more inclusive for people living with disabilities.
At my company, we’re all-remote. That means we write a lot—in emails, Google Docs, GitHub, and Slack. We work hard to cultivate a written-first culture.
And that works beautifully for me. Because, you see, I have a hearing problem. Or as I like to say, I’m deaf as a post.
While I can hear sound, I can’t turn that sound into words, and rely on lip reading for day-to-day conversations.
For me, a product designer, this approach to work is amazing: long stretches of focus time, and the freedom to decide when and how to best do my work. And for someone with hearing loss, it almost inherently creates a more inclusive environment.
Still, from time to time, we rely on synchronous methods of communication: team syncs to catch up on a more personal level and bring teams together, moderated user research, and facilitated collaborative activities and workshops. And more often than not, synchronous means video call.
Video calls have always been a challenge for me. Lip-reading doesn’t work well over video, because it relies on far more visual information than just the lips, and video calls don’t carry the full visual and emotional bandwidth needed to read lips easily. So instead, I use a series of hacks and built-in tools that help me out by providing real-time speech-to-text.
These hacks and tools work well, but sometimes they cut out, or they lag and fall behind. Then simple things—like someone asking for my thoughts—can create awkward moments of silence before I realize my name was said, because I was relying on a transcript running five seconds behind.