Apple’s version of Google Lens comes in the form of two features: Live Text and Visual Look Up. Both arrive in the iPhone’s Camera and Photos apps with the release of iOS 15, and, like Google Lens, they give users a powerful tool for searching and interacting with the real world through the smartphone camera.
Live Text is a feature that extracts text from images in real time. Point your iPhone camera at a sign, a document, or any other text-based object, and the text is recognized and made selectable. You can then copy and paste it, look up definitions, translate it, or even call phone numbers directly from the captured text. This can be incredibly useful in scenarios such as capturing information from a whiteboard during a meeting or quickly saving contact details from a business card.
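For developers, Apple exposes the same kind of on-device text recognition through the Vision framework. Apple hasn’t documented Live Text’s internal pipeline, so the sketch below is only an approximation of the idea using the public VNRecognizeTextRequest API, not Live Text itself:

```swift
import Vision
import UIKit

/// Recognize text in an image using Vision's on-device OCR.
/// Calls `completion` with one string per detected text region.
func recognizeText(in image: UIImage, completion: @escaping ([String]) -> Void) {
    guard let cgImage = image.cgImage else {
        completion([])
        return
    }

    let request = VNRecognizeTextRequest { request, error in
        guard error == nil,
              let observations = request.results as? [VNRecognizedTextObservation] else {
            completion([])
            return
        }
        // Keep the highest-confidence transcription for each region.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        completion(lines)
    }
    request.recognitionLevel = .accurate   // .fast trades accuracy for latency
    request.usesLanguageCorrection = true  // helps with natural-language text

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        do {
            try handler.perform([request])
        } catch {
            completion([])
        }
    }
}
```

The `.accurate` recognition level is the natural choice for still photos; a live camera feed would more likely run `.fast` on each frame to keep up with the preview.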
Visual Look Up, on the other hand, lets users learn more about objects and scenes captured by their iPhone camera. By tapping on a recognized object in a photo, you can access detailed information about it, including its name, a description, and related articles. The feature uses machine learning and computer vision to recognize objects and surface relevant information, which is particularly handy when you come across something unfamiliar and want to learn more about it quickly.
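Visual Look Up’s recognition model isn’t exposed to third-party developers, but the Vision framework’s built-in image classifier gives a rough sense of how this kind of on-device object recognition works. The sketch below uses the public VNClassifyImageRequest API as a stand-in, with an arbitrary confidence threshold:

```swift
import Vision
import UIKit

/// Classify the objects and scenes in an image with Vision's
/// built-in on-device classifier, returning labels with confidences.
func classifyImage(_ image: UIImage,
                   completion: @escaping ([(label: String, confidence: Float)]) -> Void) {
    guard let cgImage = image.cgImage else {
        completion([])
        return
    }

    let request = VNClassifyImageRequest { request, error in
        guard error == nil,
              let observations = request.results as? [VNClassificationObservation] else {
            completion([])
            return
        }
        // Drop low-confidence labels; 0.3 is an arbitrary cutoff for this example.
        let labels = observations
            .filter { $0.confidence > 0.3 }
            .map { (label: $0.identifier, confidence: $0.confidence) }
        completion(labels)
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        do {
            try handler.perform([request])
        } catch {
            completion([])
        }
    }
}
```

Recognizing a label is only the first step; the name, description, and related articles that Visual Look Up shows presumably come from a knowledge-base lookup layered on top of the classifier.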
Both Live Text and Visual Look Up are direct rivals to Google Lens, offering similar functionality. Google Lens has been a popular feature on Android devices for quite some time, letting users search for information, identify objects, and translate text with their smartphone camera. With Apple’s introduction of these features, iPhone users will have native tools for the same tasks without relying on third-party apps.
I personally find the addition of Live Text and Visual Look Up exciting and practical. As someone who frequently uses my iPhone camera to capture notes, recipes, and important documents, the ability to easily extract text from images will save me time and effort. Visual Look Up, meanwhile, looks like a valuable tool for learning about the world around me, especially when I encounter unfamiliar objects or places.
Apple’s version of Google Lens, consisting of Live Text and Visual Look Up, brings powerful visual search to the iPhone’s Camera and Photos apps. These features enhance the user experience by extracting text and providing detailed information about objects and scenes captured through the camera. With their introduction in iOS 15, iPhone users will have native tools to search the real world with their device, competing directly with Google Lens.