Just point your iPhone at anything and let it speak what it sees

Have you ever wished you could see nearby things more clearly than your eyes allow? Maybe you want to read the small print on a label, a menu, or a book. Or maybe you just want to have some fun with your iPhone and discover new details in your surroundings.

Whatever the reason, there is a handy feature on your iPhone that can help you see things more easily and even tell you what it’s looking at. It’s called the ‘Point and Speak’ feature, and it does exactly that: point your iPhone at something with text, and it will read it aloud to you.

Android users, follow these tips.

 

What is ‘Point and Speak’?

The ‘Point and Speak’ feature is part of the Magnifier app, which is a built-in accessibility tool for people with visual impairments. But anyone can use it, whether you have low vision or not.

The Magnifier app turns your iPhone into a digital magnifying glass that can zoom in and out, adjust brightness and contrast, and apply filters to enhance the image. The ‘Point and Speak’ feature adds another layer of functionality: it can recognize text in the image and read it aloud using Siri’s voice.
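
If you’re curious how this works behind the scenes, the basic recipe, recognizing text in a camera image and then speaking it, can be reproduced with Apple’s public Vision and AVFoundation frameworks. Here is a minimal Swift sketch of that technique. To be clear, this is not Apple’s actual Magnifier code, and the image you pass in is a placeholder you would capture from the camera yourself:

```swift
import Vision
import AVFoundation

// Keep a strong reference to the synthesizer so speech isn't cut off mid-sentence.
let synthesizer = AVSpeechSynthesizer()

// Recognize any text in `image` and read it aloud.
// `image` is a placeholder: in a real app you'd capture it from the camera.
func speakText(in image: CGImage) {
    let request = VNRecognizeTextRequest { request, _ in
        guard let observations = request.results as? [VNRecognizedTextObservation] else { return }

        // Take the best candidate string from each detected text region.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        guard !lines.isEmpty else { return }

        // Speak the recognized text with the system voice.
        synthesizer.speak(AVSpeechUtterance(string: lines.joined(separator: " ")))
    }
    request.recognitionLevel = .accurate  // favor accuracy over speed

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try? handler.perform([request])
}
```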

 

Requirements to use the ‘Point and Speak’ feature

To use the ‘Point and Speak’ feature, you need iOS 17 or later installed on your iPhone, and your iPhone model must have a LiDAR sensor. LiDAR is a laser-based depth sensor that measures how far away objects are. You can check whether your iPhone has one by looking at the back of the phone: if you see a small black circle next to the main camera lenses, your iPhone has a LiDAR sensor. (Developers can also check in code, as sketched after this list.) The iPhone models that have a LiDAR sensor are:

  • iPhone 12 Pro and Pro Max
  • iPhone 13 Pro and Pro Max
  • iPhone 14 Pro and Pro Max
  • iPhone 15 Pro and Pro Max
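
If you’d rather not eyeball the camera bump, developers can check for the LiDAR scanner in code. One common approach, sketched below, uses ARKit: scene reconstruction is only offered on LiDAR-equipped devices, so the capability check doubles as a LiDAR test.

```swift
import ARKit

// Scene reconstruction is only offered on devices with the LiDAR scanner,
// so this capability check doubles as a "does this iPhone have LiDAR?" test.
if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
    print("LiDAR found: Point and Speak is supported on this device")
} else {
    print("No LiDAR: Point and Speak won't be available")
}
```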

MORE: IPHONE 15 PRO’S BEST NEW SECRET WEAPON. HOW TO USE THE ACTION BUTTON  

 

If you don’t already have the Magnifier app on your iPhone

  • Download the Magnifier app from the Apple App Store
  • To access the Magnifier app quickly, you can add it to your Control Center by going to Settings
  • Then tap Control Center
  • Next, scroll down and tap the green plus icon next to Magnifier
  • Now you can use the Magnifier app anytime you need it. Just swipe down from the top right corner of your screen to open the Control Center, and tap the Magnifier icon. It looks like a magnifying glass with a plus sign in the middle.

 

How to use the ‘Point and Speak’ feature

If your iPhone has a LiDAR sensor and iOS 17 or later, you can use the Point and Speak feature by following these steps:

  • Open the Magnifier app on your iPhone. You can find it in the Utilities folder, search for it in Spotlight, or open it from Control Center by swiping down from the top right of your screen
  • Tap the Detect mode icon on the bottom right corner. It looks like a square with a circle inside it
  • Tap the Point and Speak icon on the bottom left corner. It looks like a hand pointing to three lines
  • Hold your iPhone about 12 inches away from the text you want to read and point the camera at it
  • Use your other hand to point at the text you want your iPhone to speak. Your iPhone will highlight the text in yellow and read it aloud
  • If you don’t hear the speech feedback, tap the gear icon in the upper left of the screen. Then tap Point and Speak and make sure Speech is toggled on. Tap the Back arrow in the upper left, then tap Done.
  • When you’re finished with the Point and Speak feature, tap Done in the upper right of the screen to return to the Magnifier screen.

MORE: HOW TO FIND ANY RECIPE WITH JUST A PHOTO ON IPHONE  

 

How to receive live image descriptions

There is also a feature that can tell you what your iPhone is seeing. It’s called live image descriptions, and it describes the objects, people, and text in your camera view. Live image descriptions are also part of the Magnifier app. Here’s how to turn them on and use them:

  • Open the Magnifier app on your iPhone. You can find it in the Utilities folder, search for it in Spotlight, or open it from Control Center by swiping down from the top right of your screen
  • Tap the Detect mode icon on the bottom right corner. It looks like a square with a circle inside it
  • Then tap the Live Image Descriptions icon. It looks like a chat bubble and will turn yellow when you tap it.
  • Position your iPhone so the rear camera can get image descriptions of the world around you. For example, I pointed the camera at a telescope, and Siri said, “A telescope on a tripod on a wooden surface.”
  • If you don’t hear the speech feedback, tap the gear icon in the upper left of the screen. Then tap Image Descriptions and make sure Speech is toggled on. Tap the Back arrow in the upper left, then tap Done.
  • When you’re finished with Live Image Descriptions, tap Done to return to the Magnifier screen.

Live image descriptions are a useful and fun feature that can help you explore and learn more about your environment. Try them out and see what your iPhone can see.
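
For the curious, here is a rough idea of how a description like “a telescope on a tripod” can be generated. Apple doesn’t publish the Magnifier’s exact pipeline, but its public Vision framework includes a basic image classifier, so the Swift sketch below, which labels a single camera frame and speaks the most confident guesses, gives a flavor of the technique. The frame you pass in is a placeholder for an image captured from the camera.

```swift
import Vision
import AVFoundation

// Keep a strong reference so speech isn't cut off mid-sentence.
let descriptionSpeaker = AVSpeechSynthesizer()

// Classify a single camera frame and speak the most confident labels.
// Apple's Magnifier uses richer on-device models; this only produces
// simple labels like "telescope" or "tripod".
func describe(frame: CGImage) {
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(cgImage: frame, options: [:])
    try? handler.perform([request])

    // Keep only reasonably confident labels, best first, at most three.
    let labels = (request.results as? [VNClassificationObservation] ?? [])
        .filter { $0.confidence > 0.3 }
        .prefix(3)
        .map { $0.identifier }

    guard !labels.isEmpty else { return }
    descriptionSpeaker.speak(AVSpeechUtterance(string: labels.joined(separator: ", ")))
}
```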

MORE: UNFORGETTABLE TRICKS TO CONTROL YOUR IPHONE WITH VOICE COMMANDS AND TOUCH 

 

Kurt’s key takeaways

The ‘Point and Speak’ and the Live Image Descriptions features are amazing tools that can help you see and hear things better using your iPhone. They are not only useful for people with visual impairments but also for anyone who wants to have some fun and discover new things in their surroundings.

Whether you want to read small print, a sign, or a book, or you want to know what objects, people, and text are in your camera view, you can use these features to make your iPhone more than just a phone. Try them out and see what your iPhone can do for you.

What are some situations where you would use the Point and Speak or the Live Image Descriptions feature? Let us know in the comments below.

FOR MORE OF MY TECH TIPS & SECURITY ALERTS, SUBSCRIBE TO MY FREE CYBERGUY REPORT NEWSLETTER HERE

 
