Something I've thought about today:
Android is kind of less "blind friendly" than iOS. By that I mean how well the OS, the accessibility frameworks, and the screen reader work together to give an experience that doesn't assume a visual user. A really good showcase of this is scrolling. On iOS, if you swipe, you barely notice the screen scrolling when you reach the bottom of it. On Android, though, you can hear the half second or so it takes to scroll. There also technically are no screen reader commands to scroll up, down, left, or right; there's just "scroll forward" and "scroll backward," which means that if you scroll forward in an app with tabs, you might find yourself on the next tab rather than the next list of items.
Now, for those who only use speech, this is usable. But a lot of blind Android users who just explore by touch don't seem to get that swiping is all a Braille user can do. Like, the system shouldn't care which way one navigates. And even though you can scroll in any direction on a touch screen using two fingers, that isn't screen-reader-specific, so a Braille user can't do it. But who cares about Braille, it's dead, don'cha know? /s
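For any developers reading along: the "forward"/"backward" thing isn't TalkBack being lazy; it mirrors the accessibility framework's classic scroll actions, which genuinely have no direction. Here's a minimal sketch of a hypothetical screen reader service (not TalkBack's actual code; the class name and gesture hook are made up, only the framework calls are real):

```kotlin
import android.accessibilityservice.AccessibilityService
import android.view.accessibility.AccessibilityEvent
import android.view.accessibility.AccessibilityNodeInfo

class SketchScreenReaderService : AccessibilityService() {

    // Imagine this runs when the user performs the "scroll forward" gesture.
    fun onScrollForwardGesture() {
        val scrollable = findScrollable(rootInActiveWindow) ?: return
        // The classic action has no direction: on a vertical list it pages
        // down, but on a horizontal pager it moves to the next page or tab --
        // which is exactly how "scroll forward" can land you on another tab
        // instead of the next screenful of items.
        scrollable.performAction(AccessibilityNodeInfo.ACTION_SCROLL_FORWARD)
    }

    // Depth-first search for the first node that reports itself scrollable.
    private fun findScrollable(node: AccessibilityNodeInfo?): AccessibilityNodeInfo? {
        if (node == null) return null
        if (node.isScrollable) return node
        for (i in 0 until node.childCount) {
            findScrollable(node.getChild(i))?.let { return it }
        }
        return null
    }

    override fun onAccessibilityEvent(event: AccessibilityEvent?) {}
    override fun onInterrupt() {}
}
```

Directional actions (ACTION_SCROLL_UP, ACTION_SCROLL_DOWN, ACTION_SCROLL_LEFT, ACTION_SCROLL_RIGHT) have existed in the framework since API 23, but they only help if the view reports them and the screen reader actually gives you commands for them.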
Another thing that really gets on my nerves sometimes is putting in my PIN. I really need to try a password and see if that works better. The on-screen PIN pad isn't an actual keyboard; it's just an interface that looks like one. So, using a Braille display, I have to navigate one number at a time and enter each digit by pressing Enter on the one I want. Sometimes I can press Space with dot 4 to move down a row of numbers, but sometimes that puts me on the bottom row instead of the next one. Of course, on iOS, I can type my passcode as expected.
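To make that concrete for anyone who builds these things, here's roughly the difference as I understand it. The function names and digit handler below are hypothetical, and I obviously don't know how Google actually built the lock screen; the point is just what a button grid versus a real edit field means for a Braille display:

```kotlin
import android.content.Context
import android.text.InputType
import android.widget.Button
import android.widget.EditText
import android.widget.GridLayout

// A real editable field: characters typed on a Braille display's keyboard
// arrive as ordinary text input, just like in any other edit box.
fun buildTypablePinField(context: Context): EditText =
    EditText(context).apply {
        inputType = InputType.TYPE_CLASS_NUMBER or
            InputType.TYPE_NUMBER_VARIATION_PASSWORD
    }

// A look-alike PIN pad: ten buttons arranged like a keypad. There's no text
// field behind it, so a Braille keyboard has nothing to type into; the only
// way through is to move focus digit by digit and press Enter on each one.
fun buildLookAlikePinPad(context: Context, onDigit: (Int) -> Unit): GridLayout =
    GridLayout(context).apply {
        columnCount = 3
        listOf(1, 2, 3, 4, 5, 6, 7, 8, 9, 0).forEach { digit ->
            addView(Button(context).apply {
                text = digit.toString()
                setOnClickListener { onDigit(digit) }
            })
        }
    }
```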
It's also kind of baffling to me that Gemini on Android doesn't automatically speak or Braille its responses whenever I type to it. It could easily send those responses to TalkBack. But, as usual, the hearing, speaking blind are the testers Google has, so of course the feedback is that it works, it's fine, and any dissenting voices are either drowned out or unheard. And this is AI, the current money-maker and time-waster for all these companies. Even there, they still can't get accessibility right. Just look at Gemini on the web: it says Gemini replied before the response has even finished generating. Imagine if VoiceOver did that in iMessage when the other person had just started typing, and then didn't say anything when they actually sent the message. The NFB would have all their resolutions about just that one topic.
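On the "it could easily send those responses to TalkBack" point: the framework has had a mechanism for exactly this since Android 4.4, called live regions. A minimal sketch, assuming the reply ends up in a plain TextView (whatever Gemini actually uses under the hood, the concept is the same; the function name here is made up):

```kotlin
import android.view.View
import android.widget.TextView

fun showFinishedResponse(responseView: TextView, finishedText: String) {
    // A polite live region: TalkBack announces the new contents (and routes
    // them to a connected Braille display) whenever the text changes, without
    // the user having to go hunting for the reply.
    responseView.accessibilityLiveRegion = View.ACCESSIBILITY_LIVE_REGION_POLITE
    // Only set the text once generation is complete, so what gets announced
    // is the actual reply, not a half-finished one.
    responseView.text = finishedText
}
```

Waiting for the response to finish before announcing it would also fix the web-style "Gemini replied" that fires while the answer is still being generated.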
Don't get me wrong, a lot of things in Android work well. But it's these things that remind me there really needs to be a big shift at Google regarding accessibility, and not just a surface-level cleanup, for Android to really lose that speech-only attitude of workarounds. I'm also not saying iOS is anywhere near perfect, even for Braille. But when I do use Braille on iOS, I feel like a second-class citizen rather than the third- or fourth-class one I feel like on Android.