Machines may say: move over doctor

March 04, 2012 01:29 am | Updated July 26, 2016 12:28 am IST

A man wears a brain-machine interface that measures slight electrical currents and changes in blood flow in the brain, at Honda's headquarters in Tokyo. File Photo

The diverse and myriad responses to my articles “Do doctors listen to patients or computers?” (Open Page, January 15) and “Do patients listen to doctors?” (January 29) have provoked this response. Have we indeed come full circle? Will the H2H (human-to-human) communication of yesteryear, then the basis of healthcare, now slowly becoming H2M (human-to-machine), eventually become M2M (machine-to-machine)? The Internet of Things will make today's human internet look like Jurassic Park. The present network of interconnected computers will become a network of interconnected “things,” each with its own unique IP address.

If an intelligent refrigerator can directly contact eBay, why can't an embedded sensor send my cardiac rhythms to a specialised cardiac server without my intervention? Will machines eventually communicate among themselves to provide “better” healthcare to humans with minimal or no human intervention? Will computers listen at all, and to whom: to a patient, to a doctor, or to one of their own kin? The only constant in the universe is change, and what a 360-degree change this will be. To quote Mark Twain, “The future ain't what it used to be.”

M2M uses a device (a surface or implanted sensor) to capture an event (e.g., blood pressure, blood sugar, ECG), which is relayed to an application (a software programme) that translates the captured event into meaningful information. This information is then analysed by an Artificial Intelligence System (AIS) and reviewed by a physician, and instructions for corrective measures (e.g., the release of appropriate insulin from an implanted pump) are sent back.
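The pipeline just described — sensor event, application-layer translation, AIS analysis, corrective instruction — can be sketched in a few lines of Python. This is a hypothetical illustration only; the function names, the blood-sugar thresholds and the recommended actions are assumptions for the example, not any real medical system's logic.

```python
# Minimal sketch of the M2M pipeline: sensor -> application -> AIS.
# All names and thresholds here are illustrative assumptions.

def capture_event(sensor):
    """Sensor layer: an implanted sensor reports a blood-sugar reading (mg/dL)."""
    return {"type": "blood_sugar", "value": sensor["reading"]}

def translate(event):
    """Application layer: turn the raw event into meaningful information."""
    value = event["value"]
    status = "high" if value > 180 else "low" if value < 70 else "normal"
    return {"measurement": event["type"], "value": value, "status": status}

def ais_recommend(info):
    """AIS layer: propose a corrective measure for physician review."""
    if info["status"] == "high":
        return "release insulin from implanted pump"
    if info["status"] == "low":
        return "alert patient and physician"
    return "no action"

reading = capture_event({"reading": 210})
info = translate(reading)
print(ais_recommend(info))  # -> release insulin from implanted pump
```

In the article's scheme, the physician sits between the AIS recommendation and the pump — the machine proposes, the human still disposes.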

The H2M interaction is possible today without a mouse, keyboard or touch screen. Gesture-based computing (Microsoft Kinect) is already being used by physically challenged patients. Unique virtual reality activities and simulated tasks using gestures are more exciting than traditional physiotherapy regimes. At present, an automatic blood analyser gives a printout of various tests on scores of patients. It is possible to instruct the analyser to send abnormal results directly by SMS to the primary consultant, an example of M2H (machine-to-human) interaction. In the “Brave New World,” these results could be sent automatically, along with clinical data, to an AIS which, in turn, would recommend appropriate action, with a copy to a human.

Surface or embedded devices can send details of calorie consumption or sleep patterns to help consumers tailor their habits. A medical alert pendant can, in the case of patient incapacity due to a fall, pacemaker failure and the like, automatically inform a response centre, from where a PC with AI (artificial intelligence) will contact the nearest ambulance. GlowCaps is a commercial product that uses light and sound to signal when it is time to take a pill. When the bottle is opened, a wireless signal is sent to a secure network. If the bottle is not opened two hours after a scheduled dose, the user is reminded with a call. Each week a compliance report is automatically sent by the container to the caregiver, and the container directly contacts the pharmacy for a refill.
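The GlowCaps-style reminder rule above — remind by phone if the bottle has not been opened within two hours of a scheduled dose — reduces to a small decision function. This is a sketch under assumed names and behaviour, not GlowCaps' actual protocol:

```python
from datetime import datetime, timedelta

# Hypothetical sketch of the two-hour reminder rule described above.
REMINDER_DELAY = timedelta(hours=2)

def check_dose(scheduled, opened_at, now):
    """Return the action the network should take at time `now`."""
    if opened_at is not None and opened_at >= scheduled:
        return "log adherence"        # bottle opened: pill taken
    if now - scheduled >= REMINDER_DELAY:
        return "place reminder call"  # missed dose, past the two-hour window
    return "wait"                     # still within the window

dose_time = datetime(2012, 3, 4, 8, 0)
# Bottle never opened, 2.5 hours after the scheduled dose:
print(check_dose(dose_time, None, datetime(2012, 3, 4, 10, 30)))  # -> place reminder call
# Bottle opened five minutes after the dose was due:
print(check_dose(dose_time, datetime(2012, 3, 4, 8, 5), datetime(2012, 3, 4, 10, 30)))  # -> log adherence
```

Accumulating a week of these outcomes would yield the compliance report the container sends to the caregiver.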

The Gates Foundation is funding a project in which one can cough into one's mobile phone; acoustic vocalisation analysis software may then suggest the next step. LAN, WAN and PAN (Personal Area Network) will be supplemented by the BAN (Body Area Network). In the event of a fall, help will reach the elderly living alone within minutes, thanks to video cameras and pressure sensors in beds. The contact lens of the future will be connected to the internet for reading emails, an M2M activity.

Today, a “connected” moving vehicle with sophisticated detectors can call 1066 seconds before an anticipated collision. Inbuilt sensors that detect an unacceptable condition in the driver (e.g., alcohol content in the exhaled breath) can directly send a message to a server, which will automatically inform the police patrol.

Instrumented intelligent clothes will provide remote monitoring of vital signs, sweat and even infection. Through the mobile phone, real-time information will be uploaded to a central server which, if necessary, will inform a human (your doctor), who will contact you. The doctor will be told, in a humanoid synthesised voice, “The patient will now see you.” What a fall, my countrymen, for keeping our patients waiting all these decades! The tables will truly turn!

Epilogue: Year 2032. My four-year-old great-grandson says, “Big grandpa, did you really record the blood pressure and talk to your patient? How funny! Enthiran informed the pharmacy server to change the medicines that he will be giving to big grandmother.”

(The author is a Chennai-based neurosurgeon and telemedicine specialist. His email ID is drganapathy@apollohospitals.com)

