
February 17, 2016

Infrequently Asked Questions: How do blind people use smartphones?

Philadelphia-based accessibility consultant Austin Seraphin explains screen readers

The world is full of questions we all want answers to but are either too embarrassed, time-crunched or intimidated to actually ask. In the spirit of that shared experience, we've embarked on a journey to answer all of the questions that burn in the minds of Philadelphians -- everything from universal curiosities (Why do disposable coffee cups still leak?) to Philly-specific musings (How does one clean the Liberty Bell?). 

For the bulk of the 64 percent of Americans who own a smartphone, using one is a cinch -- a reflex, almost. But how does someone with vision loss make the most of what's debatably today's most valuable tech tool?

We reached out to accessibility consultant and Philly Touch Tours co-founder Austin Seraphin, who has been blind since birth, for an answer.

How does a person who is completely blind go about using a smartphone?

A blind person uses a smartphone or computer by using a piece of software called a screen reader. A screen reader, as the name implies, reads the screen. It usually does this through synthesized speech. Some people use Braille displays as well. And basically, it talks to us.

So, on a desktop computer, in the old days with text interfaces, it was pretty straightforward. You can imagine navigating by lines and chapters and words and things like that. Then, once graphical interfaces came along, it got more complicated, because now we’re dealing with presenting a visual screen auditorily. It’s interesting, because with something like an iPhone, you’d think a blind person can’t use a touch screen – it’s almost counterintuitive that you can use it through a piece of glass. The challenge is turning a two-dimensional screen into a one-dimensional speech output.
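The idea of turning a two-dimensional screen into one-dimensional speech can be sketched as a toy model: hit-test the element under the finger, then speak its label. Every name and type below is a hypothetical illustration of that idea, not Apple's actual VoiceOver API.

```python
# Toy model of the "finger under glass" interaction described above:
# the screen is a set of labeled rectangles, and touching one speaks its label.
from dataclasses import dataclass

@dataclass
class Element:
    label: str
    x: int
    y: int
    w: int
    h: int

    def contains(self, px, py):
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

def element_under_finger(screen, px, py):
    """Hit-test: turn a 2-D touch point into the single element to speak."""
    for el in screen:
        if el.contains(px, py):
            return el
    return None

def speak(el):
    # A real screen reader would hand this string to a speech synthesizer.
    return el.label if el else ""

screen = [Element("Unlock Button", 0, 0, 100, 40),
          Element("Camera", 0, 50, 100, 40)]
print(speak(element_under_finger(screen, 10, 10)))  # -> Unlock Button
```

Dragging a finger across the glass amounts to repeating this hit-test as the touch point moves, speaking each new element it lands on.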

In the case of the iPhone, VoiceOver is the screen reader installed on Apple’s devices -- iPhone, Apple Watch, Mac, they all have their own versions of VoiceOver -- but it’s the same basic idea. So if I turn on my phone, I just have my launch screen, and if I move my finger, it will see what’s under my finger. [He moves his finger to the "Unlock Button," and a voice reads it out loud.]

... It’s really cool. Android does something similar and so does Microsoft. So they’re all getting in, but Apple is leading the way as far as mobile and tablets go.

Does that limit the apps you can use?

That gets into one of the things I do, which is accessibility. And the job of an app is to communicate the proper information to the screen reader. The cool thing Apple has done -- and there are equivalents on other platforms as well -- is if you’re using standard controls they give you, then accessibility features are built in. So if you’re using standard labeling of buttons and labeling everything properly, it will take care of all that for you and you don’t have to do anything extra. Which is why I really encourage developers to make apps accessible, because, in a lot of cases, it’s so easy to do -- comparatively.
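His point about standard controls can be sketched in a few lines: a control built from a standard component gets an accessible name for free, while a custom view stays silent unless the developer labels it explicitly. This is a hypothetical toy model in Python, not the real UIKit or Android accessibility API.

```python
# Toy model: how a screen reader might derive the name it speaks for a control.
# Standard controls carry their label along; custom views need an explicit one.
def accessible_name(control):
    if control.get("accessibility_label"):        # an explicit label always wins
        return control["accessibility_label"]
    if control.get("kind") == "button" and control.get("title"):
        return control["title"]                   # standard control: label for free
    return ""                                     # unlabeled custom view: silence

print(accessible_name({"kind": "button", "title": "Send"}))   # -> Send
print(accessible_name({"kind": "custom_view"}))               # -> (nothing spoken)
print(accessible_name({"kind": "custom_view",
                       "accessibility_label": "Share"}))      # -> Share
```

The second case is why an unlabeled custom control reads as nothing at all, and why adding one label string is often the entire fix.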

Ten or 20 years ago, a company would say, ‘Oh, we can’t make this accessible; it costs too much money,’ and it was regrettably true. Now, that’s no longer the case. But they’re still kind of acting like it is. So it’s actually really easy in a lot of cases to make an app accessible, and the same is true for the Web in a lot of cases -- the limitation is if the screen reader can make sense of it.

If you’re sending a text message, are you using voice-to-text?

That’s going the other way -- that’s dictation. That's speech-to-text, as opposed to text-to-speech. And oftentimes I will use dictation, as will a lot of people. Apple is big on that. I use dictation a lot on my [Apple Watch]; I find it works well. And the new iOS software also has a Braille input mode, which is cool -- especially if you have iPhone 6S Plus, the bigger phone, or an iPad. You can enter Braille on the keyboard.

How does that work?

You put your fingers on the keyboard and orient your iPhone in landscape mode. The Braille keyboard has six dots – 1, 2, 3 on the left hand; 4, 5, 6 on the right hand. If you picture two columns with three dots each, that’s what a Braille cell looks like. A different combination of those dots makes the letters and contractions, so you calibrate the keyboard to your hand position and then enter the combinations on the virtual keyboard on your iPhone screen. [The Braille combinations translate into letters.] It’s neat. It takes getting used to and isn’t perfect, but it’s pretty cool. Especially on one of the bigger phones.
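The six-dot cell he describes maps directly to code: each letter is just a set of raised dots, with dots 1–3 in the left column and 4–6 in the right. A minimal sketch with a handful of standard Braille letter patterns:

```python
# Each Braille letter is a combination of the six dots; a few standard patterns.
BRAILLE = {
    frozenset({1}): "a",
    frozenset({1, 2}): "b",
    frozenset({1, 4}): "c",
    frozenset({1, 4, 5}): "d",
    frozenset({1, 5}): "e",
    frozenset({1, 2, 3}): "l",
}

def decode(chords):
    """Turn a sequence of dot combinations (one per keypress) into text."""
    return "".join(BRAILLE.get(frozenset(c), "?") for c in chords)

print(decode([{1, 2}, {1}, {1, 2, 3}, {1, 2, 3}]))  # -> ball
```

On the phone, each "chord" is the set of fingers touching the glass at once, which is why the keyboard calibrates itself to where your hands happen to rest.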

Dictation is great. Typing on a standard keyboard can take a while.

How are you able to tell where the dots are?

If you go to Explore Mode it will tell you ‘Dot 1, Dot 2, Dot 3’ when you drag your finger around, and … it will calibrate the keyboard to where your fingers are at that moment.

How about Web browsing?

The Web is kind of wild. It varies. Some websites are pretty good, some are pretty horrible. There are a lot of problem areas that are easily fixed. Images have "alt tags" that give alternative text telling a screen reader what to read for the image. For instance, I’ve noticed a lot of social networking icons on websites; instead of labels like "Facebook, Twitter, etc.," oftentimes they won’t have a label at all, so it will just read, "Link, Link, Link," and you have no idea what they do.

Mobile dialog [boxes] can present a problem. If you hit a link and it pops up a dialog, it won’t give you any alert that it happened … That can be challenging. So there are challenges, but also solutions to overcome them -- things like ARIA (Accessible Rich Internet Applications). It comes down to standards in coding.
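The two problems he names -- images without alt text, and icon-only links that read as just "link" -- can be caught mechanically. Below is a minimal sketch using only Python's standard library; real audits use dedicated tools, and this deliberately ignores subtleties such as an image's alt text naming its parent link.

```python
# Minimal audit for two common accessibility problems: images with no alt
# text, and links with no accessible name (no text and no aria-label).
from html.parser import HTMLParser

class A11yAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.issues = []
        self._link_attrs = None   # attrs of the <a> we are currently inside
        self._link_text = ""      # visible text accumulated inside that <a>

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "img" and not a.get("alt"):
            self.issues.append("img missing alt text")
        if tag == "a":
            self._link_attrs, self._link_text = a, ""

    def handle_data(self, data):
        if self._link_attrs is not None:
            self._link_text += data

    def handle_endtag(self, tag):
        if tag == "a" and self._link_attrs is not None:
            if not self._link_text.strip() and not self._link_attrs.get("aria-label"):
                self.issues.append("link with no accessible name")
            self._link_attrs = None

audit = A11yAudit()
audit.feed('<a href="/fb"><img src="fb.png"></a>'
           '<a href="/tw" aria-label="Twitter"><img src="tw.png" alt="Twitter"></a>')
print(audit.issues)  # -> ['img missing alt text', 'link with no accessible name']
```

The first link is exactly the "Link, Link, Link" case: an icon with no alt text inside an unlabeled anchor. The second link passes both checks because of its `aria-label` and `alt` attributes.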

How would you say you use your smartphone differently from someone with sight?

I have a whole bunch of cool apps that help. For instance, the first app I ever got for my iPhone, back in 2010 when they came out with VoiceOver -- I didn’t know what apps I could get. And I was thinking about all these different devices for the blind. Technology for the blind has a small market, so it costs a lot of money -- supply and demand. A dedicated device you point at an object to identify its color costs $200. So I found an app that identified colors for $2. I downloaded it, so excited, and ran it, and it kept saying, "Black, black, black," and I thought, "This sucks. This isn’t the next big thing!" Then I realized it was 2 a.m. and I had all my lights off and it was pitch black. I turned on the lights and it was quite a magical moment.

So, color ID. I like TapTapSee a lot: you take a picture of a can or a box and it will give you a description of that item -- handy if you’re cooking. Now there are a few apps that can actually read print text -- one that costs $100 and one that’s even free. I’m so glad a high school student made a free one. It’s OCR (Optical Character Recognition).

And then just having the Internet at your fingertips. Getting a menu at a restaurant. For the blind, that’s doubly meaningful, all that access.

Have a question you're dying to have answered? Send an email to, and we'll find an expert who can give you the answer you're craving.