How iOS 15 Will Help Those Without Sight Interact With (And, ‘See’ More Details In) Photos

FEATURE: 06.14.21 – Seeing the writing on the wall is never a good thing… unless you’re a visually impaired individual looking at a photograph of just that.

iOS, Apple’s mobile operating system, will be getting a number of new features that use what the company calls “on-device intelligence” (a.k.a. the Apple Neural Engine) to enhance the user experience on an Apple smartphone. Announced and previewed by Craig Federighi, Apple’s senior vice president of Software Engineering, at the Cupertino, California-based company’s annual Worldwide Developers Conference, WWDC 2021, these features are among the many upgrades that iPhone users can expect to see when the next version of the operating system, iOS 15, is released later this year.

Live Text, which uses “on-device intelligence” to identify text contained within images, is a new feature coming later this year to iOS 15 designed to enhance the user experience while viewing photographs in the Photos app on an iPhone. (Photo: Apple, Inc.)

While Federighi did not classify the new features coming later this year to iOS 15 as accessibility-oriented in nature, these software updates will benefit sighted iPhone users and users without sight alike, particularly those who rely on an Apple smartphone to assist them in their daily lives.

AD: For the lowest prices, best deals, and latest discounts on all Apple products and accessories from Apple and Apple Authorized Resellers, check out our exclusive and award-winning price trackers at MacPrices.net. Pricing updated daily, seven days a week!

Seeing What’s In Your Photos

These are the new features in iOS 15 that will let iPhone users interact with the information contained within the images in their camera roll while viewing photographs in the Photos app:

  • Live Text: recognizes words in photographs and lets users act on them (e.g., capturing the phone number from a photograph of a business’s storefront, with the option to place a call to that very business right from the Photos app); a developer-level sketch of this kind of on-device text recognition follows this list
  • Spotlight: while not a new feature by itself, Apple’s built-in system search engine will now be able to search for photographs by location, or specifically for people, scenes, and objects; using Live Text, it can also find specific text contained within images, including handwriting (e.g., finding a photograph of a handwritten family recipe)
  • Visual Look Up: lets users learn more about specific items in their photographs (e.g., popular landmarks around the world, pieces of art, plants found in nature, and breeds of pets)
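Apple has not published how Live Text works under the hood, but its Vision framework already exposes on-device text recognition to app developers, which gives a feel for the underlying capability. The following is a minimal sketch using that public API; the function name and the image handling are this writer’s own illustration, not Apple’s Live Text code:

```swift
import UIKit
import Vision

// A minimal sketch of on-device text recognition with Apple's Vision
// framework. Live Text's internals are not public; this only illustrates
// the same category of capability: recognizing words in a saved photo.
func recognizeText(in image: UIImage, completion: @escaping ([String]) -> Void) {
    guard let cgImage = image.cgImage else {
        completion([])
        return
    }

    // Vision hands back one observation per detected line of text.
    let request = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        // Keep the top recognition candidate for each line.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        completion(lines)
    }
    request.recognitionLevel = .accurate // favor accuracy over speed

    // All processing happens on the device; the image never leaves the phone.
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```

The same request can also be run against live camera frames rather than saved photos, which is essentially the developer-facing cousin of the Camera app’s “in the moment” detection described below.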

And that’s not all. The on-device intelligence built into an Apple smartphone will also enhance the user experience while iPhone users are actively taking a photograph. For example, with Live Text enabled, the Camera app will be able to quickly detect text on the fly, or, as the company described it, “in the moment” (e.g., copying a Wi-Fi password displayed on the wall of a coffee shop without first taking a picture and saving it to the camera roll).
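Both of the examples so far (calling a business from a storefront photo, grabbing a Wi-Fi password off a wall) boil down to two steps: recognize the text, then detect something actionable in it. Apple has not said what Live Text uses for that second step, but Foundation’s long-standing NSDataDetector shows how a developer could approximate it; the function below is a hypothetical sketch:

```swift
import Foundation

// A hedged sketch of turning recognized text into an action, in the spirit
// of Live Text's tap-to-call behavior. NSDataDetector is a real, long-standing
// Foundation API, but Apple has not said Live Text uses it internally.
func phoneNumbers(in recognizedText: String) -> [String] {
    guard let detector = try? NSDataDetector(
        types: NSTextCheckingResult.CheckingType.phoneNumber.rawValue
    ) else { return [] }

    let range = NSRange(recognizedText.startIndex..., in: recognizedText)
    return detector
        .matches(in: recognizedText, options: [], range: range)
        .compactMap { $0.phoneNumber }
}

// Example with made-up storefront text:
// phoneNumbers(in: "Joe's Pizza - Call (555) 010-0199") // ["(555) 010-0199"]
```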

“For many customers, iPhone has become indispensable, and this year we’ve created even more ways it can enhance our daily lives,” said Federighi in a press release about the new features coming later this year to iOS 15.

Recognizing What’s In Your Photos

Some context for the technology behind this on-device intelligence, and how it recognizes the content of photographs, comes from a Wired magazine article published back in 2017 about a then-new innovation buried deep inside the circuitry of the iPhone’s processor: the Apple Neural Engine.

According to Wired, the Apple Neural Engine has circuits that are tuned to accelerate certain kinds of artificial intelligence (AI) called “artificial neural networks” which are good at processing speech and images. In the past few years, applications such as Siri, Apple’s AI virtual assistant, have gotten much better at recognizing speech as Apple has rebuilt its speech recognition software around these artificial neural networks. The magazine reported that this same technology also powers the feature that allows users of the iPhone to search their images in the Photos app using specific terms (e.g., keywords like “dog”).
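That keyword search has a public-facing analog that developers can try today: the Vision framework’s built-in, on-device image classifier. The sketch below is illustrative only; it is not Apple’s actual Photos pipeline, and the confidence cutoff is an arbitrary choice:

```swift
import UIKit
import Vision

// A sketch of on-device image classification, the kind of neural-network
// inference (accelerated by hardware such as the Apple Neural Engine when
// available) that makes searching photos for terms like "dog" possible.
func classify(image: UIImage, completion: @escaping ([(label: String, confidence: Float)]) -> Void) {
    guard let cgImage = image.cgImage else {
        completion([])
        return
    }

    let request = VNClassifyImageRequest { request, _ in
        let observations = request.results as? [VNClassificationObservation] ?? []
        // Keep only reasonably confident labels; 0.5 is an arbitrary threshold.
        let labels = observations
            .filter { $0.confidence > 0.5 }
            .map { (label: $0.identifier, confidence: $0.confidence) }
        completion(labels)
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```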

In an interview with Wired, Eugenio Culurciello, a Purdue professor who works on chips for machine learning (ML), said at the time that because more powerful algorithms could be deployed right on the iPhone itself, Apple’s neural engine could open up new uses for ML, such as image recognition. The magazine reported that the circuits found in the Apple Neural Engine allow these machine-learning algorithms to analyze data more quickly (e.g., recognizing images of a user’s pet in the Photos app).

“Apple’s new silicon could improve the iPhone’s ability to understand your voice and the world around you,” said Culurciello (speaking to Wired).

Exploring What’s In Your Photos

In May, Apple announced and previewed, via a press release, a number of what it described as “next-generation technologies” for people with mobility, cognitive, hearing, and vision disabilities.

The announcement indicated that these new features, which are designed to assist persons with disabilities, were coming later this year via software updates across all of Apple’s operating systems. Interestingly enough, however, Apple did not specify which operating systems would be getting the upgrades (e.g., iOS or macOS).

Specifically for those who are blind or have low vision, VoiceOver, Apple’s screen reader software for the visually impaired, will get even smarter using on-device intelligence. Users will be able to explore even more details about the text, people, and objects contained within images (e.g., a description of a photograph that says, “slight right profile of a person’s face with curly brown hair smiling”). And with Markup, Apple’s existing annotation tool, users will be able to add their own image descriptions to photographs so they can relive memories in detail.
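Developers already have a manual version of this: a hand-written description attached to an image that VoiceOver reads aloud, set through standard UIKit accessibility properties. The sketch below (with a hypothetical image name and description) shows that existing path; what iOS 15 adds is generating such descriptions automatically, on-device, when no one has written one:

```swift
import UIKit

// How an app supplies an image description today: VoiceOver speaks the
// accessibilityLabel when the user touches the image. The asset name and
// description here are hypothetical examples, not Apple sample code.
let photoView = UIImageView(image: UIImage(named: "family-recipe"))
photoView.isAccessibilityElement = true
photoView.accessibilityLabel = "Handwritten recipe card for grandma's apple pie"
photoView.accessibilityTraits = .image // tells VoiceOver this element is an image
```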

According to Apple, these new features advance its long history of delivering industry-leading technologies that make Apple products and accessories accessible for all users, supporting the company’s belief that accessibility is a human right.

“At Apple, we’ve long felt that the world’s best technology should respond to everyone’s needs, and our teams work relentlessly to build accessibility into everything we make. With these new features, we’re pushing the boundaries of innovation with next-generation technologies that bring the fun and function of Apple technology to even more people,” said Sarah Herrlinger, Apple’s senior director of Global Accessibility Policy and Initiatives, in the press release.


A Note from the Author: this article is one of a number of stories on accessibility- and disability-related topics that this writer, who is visually impaired and partially hearing impaired, features periodically in this column whenever the opportunity arises to share them.

Some of the links above are affiliate links to the retailer's site. That means we may earn a small commission from any sales (Thank you!).

