Do you remember the scene in Star Trek: The Voyage Home when Scotty talks to a 1980s PC and asks it to do something… only to be told to use the mouse? "How quaint" is, I believe, his comment.
I remember my first car with speech recognition – it was rubbish, and pretty much every car since has been rubbish as well. However, my iPhone isn't. (I'm sure Android is just as good.) I've started giving it commands – "set an alarm for 6:45am tomorrow", "remind me to pay the X bill on the 24th of August", "send a text to Rebecca – I'll pick you up in 10 mins" – and all seem to work. In none of these cases do I actually need to know how to do any of those things myself.
Now, we have had speech dictation for some time – aimed at putting text into our notes – but why don't we have something similar for our EPRs? "Send a sick note for 1 month to Mr John Smith's email." "Issue a course of amoxicillin suspension for 5 days and send it as a one-off to Chemist X." "Patient needs bloods for X, Y and Z and an ECG, and then a follow-up appointment a week later with me." How about the task system – "yes, let Mr X know he can take his tablets in the morning and close this task" – which would link a tasking system and a messaging system.
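Under the hood, a command like these has to be turned from free text into a structured action the EPR can execute – an "intent" plus its details. As a minimal sketch (the patterns, intent names and field names below are all illustrative assumptions, not any real EPR's or speech engine's API):

```python
import re

# Hypothetical command patterns: each maps a dictated phrase shape
# to an intent name, capturing the details as named fields.
PATTERNS = [
    (re.compile(r"send a sick note for (?P<duration>.+?) to (?P<recipient>.+?)(?:'s)? email", re.I),
     "issue_sick_note"),
    (re.compile(r"issue a course of (?P<drug>.+?) for (?P<duration>\d+\s*\w+)", re.I),
     "prescribe"),
]

def parse_command(text):
    """Return (intent, fields) for a recognised command, else (None, {})."""
    for pattern, intent in PATTERNS:
        match = pattern.search(text)
        if match:
            return intent, match.groupdict()
    return None, {}

intent, fields = parse_command("Send a sick note for 1 month to Mr John Smith's email")
# intent == "issue_sick_note"
# fields == {"duration": "1 month", "recipient": "Mr John Smith"}
```

Real systems would use a proper natural-language model rather than regexes, but the shape is the same: recognise the intent, extract the fields, then hand a structured request to the clinical system – which is exactly why the EPR needs an API for each action before a voice front end can exist.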
I understand automation companies are looking at workflows; are they creating a voice-receptive front end? Do we want one?