In the near future, you may simply go to your wall screen and talk to Robodoc. You will be able to change the face, and even the personality, of the Robodoc you see with the push of a button. The friendly face on your wall screen will ask a simple set of questions: How do you feel? Where does it hurt? When did the pain start? How often does it hurt?
Each time, you will respond by choosing from a simple set of answers, speaking them aloud rather than typing on a keyboard.
Each of your answers, in turn, will prompt the next set of questions. After a series of such questions, Robodoc will be able to give you a diagnosis based on the best experience of the world’s doctors. Robodoc will also analyze the data from your bathroom, clothes, and furniture, which have been continually monitoring your health via DNA chips. It might also ask you to scan your body with a portable MRI scanner, with the scan then analyzed by supercomputers. (Some primitive versions of these heuristic programs already exist, such as WebMD, but they lack the nuance and full power of heuristics.)
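The branching interview described above, in which each answer determines the next question until a diagnosis is reached, can be sketched as a simple decision tree. The following toy example is purely illustrative: every question, answer, and "diagnosis" string is an invented placeholder, not medical logic, and a real expert system would be vastly larger and statistically weighted.

```python
# A toy decision tree for a Robodoc-style interview.
# Each node is either a question whose answers point to the next node,
# or a leaf holding a placeholder "diagnosis".
TREE = {
    "start": {
        "question": "Where does it hurt?",
        "answers": {"head": "head_onset", "stomach": "stomach_onset"},
    },
    "head_onset": {
        "question": "When did the pain start?",
        "answers": {"today": "leaf_rest", "weeks ago": "leaf_refer"},
    },
    "stomach_onset": {
        "question": "How often does it hurt?",
        "answers": {"after meals": "leaf_diet", "constantly": "leaf_refer"},
    },
    "leaf_rest": {"diagnosis": "likely minor; rest and monitor"},
    "leaf_diet": {"diagnosis": "possibly diet-related; keep a food log"},
    "leaf_refer": {"diagnosis": "persistent symptoms; see a human doctor"},
}

def interview(tree, scripted_answers):
    """Walk the tree, consuming one scripted answer per question,
    until a leaf node with a diagnosis is reached."""
    node = tree["start"]
    for answer in scripted_answers:
        node = tree[node["answers"][answer]]
        if "diagnosis" in node:
            return node["diagnosis"]
    return None

print(interview(TREE, ["head", "today"]))
```

A production system would replace the hand-written tree with knowledge mined from medical records and sensor data, but the control flow, answer in, next question out, is the same.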
The majority of visits to the doctor’s office could be eliminated in this way, greatly relieving the stress on our health care system.