Google’s “Voice Access” is decent for controlling the device through verbal commands, but you have to be looking at the screen to get results - it won’t read anything back to you.
Google’s “TalkBack” will read what’s on screen aloud, but you have to interact with the screen physically. (Never mind that TalkBack significantly changes how touch interactions work - I understand the need for that - but it’s still a serious mental PITA to switch between the two interaction methodologies frequently.)
Is there no way to just interact with it entirely verbally? A (very) simple example of what I’m looking for:
- "What are the current Google News headlines?"
- Starts reading each one aloud, along with the names of the sources.
- "Read the article about Trump caught making out with Elon from AP News."
- Proceeds to load the article & read it aloud.
(Yeah, I know there are podcasts for this - it’s meant to illustrate the basic idea of completely verbal interaction with the device, not be an actual problem I’m looking for someone to provide a solution to.)
It just seems to me that we should be able to do this by now - especially with all the AI blow-up over the past couple of years. Can anybody point me to a usable solution to accomplish this?
TIA.
Catoblepas@lemmy.blahaj.zone 1 day ago
Sorry if I’m overlooking something obvious here, but you’re basically asking about accessibility features, right? Whatever settings and features blind users use should let you navigate without looking at it.
Although I will say you’d probably need to get used to listening to text at a very high speed to use it at anywhere near the speed you read.
SanctimoniousApe@lemmings.world 1 day ago
Imagine someone blind who also has Parkinson’s: they can’t see the screen to use Voice Access, and they can’t control their hands well enough to interact with the screen reliably. You can’t actually use those two accessibility features together - they’re mutually exclusive, in that one requires you to be able to see the screen, while the other requires you to be able to touch it physically as it reads out what you’re touching. Why is there no way to interact entirely verbally?
Catoblepas@lemmy.blahaj.zone 1 day ago
Someone who knows more about whether (and how) you can do this on Android will have to answer the specifics. I know on iOS you can use custom Voice Control actions (along with the default Voice Control phone navigation mode) to do more or less what you’re describing. I’d be surprised if Android has no accessibility features that work similarly.