Apple is pulling more information out of photos: its new Live Text feature coming to iOS 15, announced at its developer conference, looks like a variation on the clever computer vision smarts Google has baked into Lens.
The feature will recognize text in photos or through Apple's Camera app, and will support seven languages to start. The computer vision-based tool will be able to search for text found in photos, much as Google's Lens already does.
While Live Text isn't quite augmented reality, it could very well be a key tool for upcoming AR software. The ability to recognize text and pull in information on the fly looks like a necessary trick for smart glasses, although how it plays out in iOS 15 remains to be seen.