Viewpoint: Chatbots creating a culture of health
Excellent healthcare generally goes hand in hand with a superior standard of living. However, with the rising cost of medical care, and in a region like SA where a large section of the population lives in rural areas, access to quality healthcare and sound medical knowledge is not a reality for many. That is changing, though, thanks to artificial intelligence.
That is, a bot can assist where people fail.
Chatbots are becoming widely adopted in medicine. According to Chatbots Magazine, "the first 'operative' bot in the healthcare sphere dates back 50 years. ELIZA was created to mimic a Rogerian psychologist, that is, a therapist who asks questions of the patient simply by rearranging what the patients have said themselves."
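The "rearranging" technique ELIZA relied on can be sketched in a few lines. This is a minimal, hypothetical illustration of that idea, not ELIZA's actual script: the pattern and word list below are invented for the example.

```python
import re

# Word swaps that turn the speaker's statement back into a question,
# e.g. "my job" becomes "your job". Invented for illustration.
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are"}

def reflect(phrase: str) -> str:
    # Swap first-person words for second-person ones.
    return " ".join(REFLECTIONS.get(w, w) for w in phrase.lower().split())

def respond(statement: str) -> str:
    # One toy rule: "I feel X" -> "Why do you feel X?"
    match = re.match(r"i feel (.*)", statement.lower().rstrip("."))
    if match:
        return f"Why do you feel {reflect(match.group(1))}?"
    return "Please tell me more."

print(respond("I feel anxious about my job."))
# -> Why do you feel anxious about your job?
```

No understanding is involved: the bot simply mirrors the patient's own words, which is exactly why it can feel surprisingly therapist-like in short exchanges.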
Artificial intelligence (AI) and its applications, including machine learning and deep learning, have advanced medical chatbots greatly. "Some chatbots are compact medical reference books, which are useful for patients and for those who want to learn more about health. Bots can remind us to take our pills," continues Chatbots Magazine.
Although most of the available literature states that chatbots can never replace real doctors, Wired magazine published a piece on "Woebot", created by a team of Stanford psychologists and AI experts. "Woebot uses brief daily chat conversations, mood tracking, curated videos, and word games to help people manage mental health," writes Megan Molteni in Wired. "After spending the last year building a beta and collecting clinical data, Woebot Labs just launched the full commercial product - a cheeky, personalised chatbot that checks on you once a day for the price of $39 a month."
This amount is relatively cheap in the US, and the low cost, plus the bot's availability 24 hours a day, seven days a week, makes therapy accessible to more people than ever before. It has, however, also opened up a debate about ethics and legal liability. "Being the only therapy chatbot with peer-reviewed clinical data to back it up separates Woebot from the pack. But using those results to claim it can significantly reduce depression may expose Woebot to legal liabilities that bots in supporting roles have managed to avoid. Without moral agency, autonomous code can't be found guilty of any criminal acts. But if it causes harm, it could be subject to civil laws governing product liability. Most manufacturers deal with those risks by putting labels on their products warning of possible hazards; Woebot has a similar disclaimer that states people shouldn't use it as a replacement for getting help," continues Molteni.
In general, however, chatbots in medicine may be just what the doctor ordered. A recent Juniper Research study found that the annual cost savings derived from the adoption of chatbots in healthcare will climb 320% year over year, reaching $3.6 billion globally by 2022.
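As a rough sanity check of those figures: 320% year-over-year growth means savings multiply by 4.2 each year. Assuming a 2017 baseline of roughly $2.8 million (a figure reported alongside the Juniper projection but not quoted in this article), five years of compounding lands close to the $3.6 billion mark.

```python
savings = 2.8e6           # assumed 2017 baseline, in US dollars
growth_factor = 1 + 3.20  # +320% per year means savings multiply by 4.2
for year in range(2018, 2023):
    savings *= growth_factor
print(f"Projected 2022 savings: ${savings / 1e9:.2f} billion")
# -> Projected 2022 savings: $3.66 billion
```

The baseline is an assumption here; the point is only that the two headline numbers are consistent with each other under straightforward compounding.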
Early and correct diagnosis, understanding symptoms, a personal nurse reminding you to take your pills, universal and superior healthcare - the hope of an excellent quality of life for all people, everywhere, is within our reach thanks to advances in medical bots.