“All I want to do is search the internet and watch my films”
–AbilityNet Client
Sounds straightforward? But in practice, when you are:
- totally blind
- without any hand or finger function
it is incredibly difficult to do!
I have been working with this client for over 18 months now, and we have looked at a variety of solutions. The Echo Dot has been successful in giving the client a leisure activity: playing music and accessing books and games using only her voice. The Amazon Fire Stick is the next project, integrating it with the Echo Dot to extend the range of programmes with audio description available. The Fire Stick does have a spoken menu, but it requires the remote to scan through the options, which is not possible with no hand or finger function.
So it was a joy when, using Dragon Dictate 13 with Sero (samobile.net), we found that simple voice commands such as "Search", "Press Enter" and "Tab" could navigate and select items on Sero, restoring tasks the client used to be able to do.
I now have the task of getting the laptop to start up and shut down. Once the power button has been pressed, the client can control all the computer's functions. One solution is never to shut the laptop down at all: have it go into hibernation and then wake it with a command. This is my next goal, along with making the Amazon Fire Stick do more than "stop" and "play" a channel using the Echo Dot as input.
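As a rough sketch of that hibernation approach on a Windows laptop (an assumption on my part, since Dragon Dictate runs on Windows), the standard `powercfg` tool can enable hibernation and show which devices are allowed to wake the machine. The device name below is a placeholder, not a real identifier:

```shell
:: Enable hibernation so the laptop never needs a full shutdown
:: (run from an elevated Command Prompt)
powercfg /hibernate on

:: Hibernate automatically after 30 idle minutes on mains power
powercfg /change hibernate-timeout-ac 30

:: List the devices currently permitted to wake the machine,
:: then arm a suitable one (e.g. a keyboard or switch device)
powercfg /devicequery wake_armed
powercfg /deviceenablewake "Example Wake Device"
```

Whether a voice command alone can actually wake the machine depends on the hardware: the microphone must stay powered in the low-power state, which not all laptops support, so this remains something to test on the client's own machine.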
Simple questions like the one this post started with often have quite complex answers, which can frustrate the client. In an ideal world we would just love the tech to work and integrate seamlessly. Ah well… roll on Artificial Intelligence, which does offer that possibility!