
Google Gemini eases web surfing for users with vision and hearing issues

Accessibility improvements to Android.
Google

Android devices have offered a built-in screen reader called TalkBack for years. It helps people with vision problems make sense of what appears on their phone’s screen and lets them control the device with their voice. In 2024, Google added its Gemini AI to the mix to give users more detailed descriptions of images.

Google is now adding a new layer of interactivity for users. So far, Gemini has only described images. Now, users can ask follow-up questions about an image and have a more detailed conversation about it.


How does it help users with vision difficulties?

“The next time a friend texts you a photo of their new guitar, you can get a description and ask follow-up questions about the make and color, or even what else is in the image,” says Google. This builds on the accessibility upgrade that integrated Gemini into the TalkBack system late last year.

The TalkBack menu on Android now shows a dedicated Describe Screen feature that puts Gemini in the driving seat. If users are browsing a garment catalogue, for instance, Gemini will not only describe what appears on the screen but also answer relevant questions.

For example, users can ask questions such as “Which dress would be the best for a cold winter night outing?” or “What sauce would go best with a sandwich?” Gemini can also analyse the entire screen and inform users about granular product details, or whether any discounts are available.

Making captions expressive and improving text zoom

In the Chrome browser, Google is giving a small lift to the auto-generated captions for videos. Let’s say you are watching a football match. The captions will no longer just follow the commentator’s words, but will also match their emotions and expressions.

For example, instead of “goal,” users with hearing issues will see a resounding “goooaaal” for an added dash of emotional emphasis. Google calls these Expressive Captions.

In addition to human speech, they will now also cover important sounds such as whistles, cheering, or even a speaker clearing their throat. Expressive Captions will be available on all devices running Android 15 or later in the US, UK, Canada, and Australia.

Another meaningful change coming to the Chrome browser is adaptive text zoom, which is essentially an update to the Page Zoom system available on Android phones. Now, when users increase the size of text, it will not affect the layout of the rest of the web page.

“You can customize how much you want to zoom in and easily apply the preference to all the pages you visit or just specific ones,” says Google. Users will be able to make zoom range adjustments using a slider at the bottom of the page.

Nadeem Sarwar