Apple Live Text takes on Google Lens, can read your photos


Apple is pulling more information out of photos: its new Live Text feature, coming to iOS 15 and announced at the company's virtual WWDC developer conference, looks like a variation on the clever computer vision smarts Google baked into Lens.

The feature will recognize text in existing photos or live through Apple's Camera app, supporting seven languages at launch. Like Google Lens, the computer vision-based tool will let you search for text found in your photos.
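Apple hasn't detailed how Live Text works under the hood, but developers can get a feel for this kind of on-device text recognition with the existing Vision framework. Below is a minimal sketch using VNRecognizeTextRequest; it isn't Live Text itself, just the closest public API for the same basic OCR task:

```swift
import Vision
import UIKit

// A minimal sketch of on-device text recognition with Apple's Vision
// framework. This is not Live Text's actual implementation, only an
// illustration of the kind of OCR the feature performs.
func recognizeText(in image: UIImage) {
    guard let cgImage = image.cgImage else { return }

    let request = VNRecognizeTextRequest { request, error in
        guard let observations = request.results as? [VNRecognizedTextObservation] else { return }
        // Take the top candidate string from each detected text region.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        print(lines.joined(separator: "\n"))
    }
    request.recognitionLevel = .accurate
    request.recognitionLanguages = ["en-US"] // one of the supported languages

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```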

While Live Text isn't quite augmented reality as defined by technologies like Apple's ARKit, it could very well become a key tool for upcoming AR software (or hardware). The ability to recognize and pull in information on the fly looks like a necessary trick for smart glasses, though how it plays out in iOS 15 remains to be seen.
