Facebook scans videos and posts with artificial intelligence to find out whether a user might be about to attempt suicide.
The human therapist: a discontinued model?
If someone is planning suicide and Facebook notices it, the family or a psychological service is informed. Soon even the assigned therapist could be an artificial intelligence.
Artificial intelligence and human emotions
The market for psychotherapeutic chatbots is growing. Pharmaceutical companies and scientists are using digital technology to advance the automation of psychotherapy. For example, the chatbot “Woebot” on Facebook is intended to help users overcome depressive thoughts. Also on Facebook, the “Dr. Sommer” chatbot answers teenagers’ questions about sexuality, a service formerly offered as a column in the youth magazine “Bravo”.
The development of these worry robots is still in its infancy, and there are many reasons to push it forward.
Chatbots are undemanding and modest
I call a chatbot, robot or program that deals with any human concern a “Robopeut”.
A Robopeut has time day and night; whenever problems loom large and help is sorely needed, he is there. He has no problems of his own that could intrude on a session, and no private concerns or needs that would disturb the treatment. Unlike his human counterpart, he has no private life or moods to affect his concentration.
With each use, the Robopeut understands better what his patient needs
A Robopeut’s memory is excellent! He remembers every little thing, and with each session he further consolidates his understanding of the patient’s concerns. Everything ever said can be recalled at any time and related chronologically to everything else.
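To make this concrete: such total recall is, at its core, nothing more than a timestamped transcript that can be searched later. Here is a minimal sketch in Python; the names (Utterance, SessionMemory, recall) are my own illustrative inventions, not the API of any actual chatbot:

    from dataclasses import dataclass, field
    from datetime import datetime

    @dataclass
    class Utterance:
        # One patient statement, stamped with the moment it was said.
        when: datetime
        text: str

    @dataclass
    class SessionMemory:
        # Hypothetical total-recall store: nothing is ever forgotten.
        utterances: list[Utterance] = field(default_factory=list)

        def remember(self, text: str) -> None:
            self.utterances.append(Utterance(datetime.now(), text))

        def recall(self, keyword: str) -> list[Utterance]:
            # Return every past statement containing the keyword,
            # in chronological order, relating remarks across sessions.
            return [u for u in self.utterances if keyword.lower() in u.text.lower()]

    memory = SessionMemory()
    memory.remember("I sleep badly before meetings with my boss.")
    memory.remember("The new job is fine, but my boss still worries me.")
    for u in memory.recall("boss"):
        print(u.when.date(), u.text)

The point of the sketch is only that a machine’s “memory” is a lossless, queryable log, which is exactly what a human therapist does not have.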
Furthermore, he is programmed to understand his patient. When a physician makes a mistake, it is at least sometimes noticed; with therapists, that is rare. If a treatment does not work, the patient’s curability is questioned sooner than the therapist’s professional competence. This was impressively demonstrated in the 1960s by studies on the labelling approach.
I maintain that the number of unreported cases in which therapists have done harm is high. Spectacular attention is paid only to the misdiagnoses of psychologists and psychiatrists on whose basis criminals are wrongly released as rehabilitated. Back at liberty, many an offender has committed another crime, proving his therapist’s positive prognosis wrong.
Rational Robopeut
A Robopeut has no emotions or problems that he carries into the treatment. He is unprejudiced precisely because he has no feelings.
That is why I see great value in Robopeuts, especially for the social prognoses of offenders. I expect that one day every offender will have to document his or her state of mind daily with a Robopeut app in order to improve the social prognoses made by psychologists and psychiatrists. And the Robopeut, I hope, will judge completely without prejudice, purely on the basis of the facts.
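How such daily self-documentation might work can be sketched in a few lines. This is only a sketch under my own assumptions; the file name, record format, and function names are hypothetical, not a description of any existing app:

    import json
    from datetime import date
    from pathlib import Path

    LOG_FILE = Path("mood_log.jsonl")  # hypothetical local store, one JSON record per line

    def log_entry(mood: int, note: str) -> None:
        # Append today's self-assessment: mood on a 1-10 scale plus a free-text note.
        if not 1 <= mood <= 10:
            raise ValueError("mood must be between 1 and 10")
        record = {"date": date.today().isoformat(), "mood": mood, "note": note}
        with LOG_FILE.open("a", encoding="utf-8") as f:
            f.write(json.dumps(record) + "\n")

    def average_mood() -> float:
        # One crude "fact" a prognosis could draw on: the mean of all logged moods.
        records = [json.loads(line) for line in LOG_FILE.read_text(encoding="utf-8").splitlines()]
        return sum(r["mood"] for r in records) / len(records)

    log_entry(6, "Slept well, appointment went okay.")
    print(f"Average mood so far: {average_mood():.1f}")

Whatever would “judge purely on the basis of the facts” would have to be built on top of such plain, timestamped records.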
In the field of forensic psychiatry, I see great potential for artificial intelligence.
Ubiquitous Robopeut
The human therapist is expensive! A Robopeut, on the other hand, fits in your pocket as a mobile phone app. That makes it the most cost-effective and most quickly available alternative to a conventional therapy place.
Sessions are so cheap that everyone can afford them
The app is simply downloaded, with no need to inform the health insurer, apply for a therapy place, sit out the waiting time, or end up with a corresponding note in the insurer’s file.
The Robopeut is almost comparable to a fitness app, a cheap alternative to the fitness center; except that here it is about mental fitness.
Does the Robopeut’s therapy help?
Alison Darcy, a clinical psychologist at Stanford, developed “Woebot” and tested it on people with depression. The results of the study are surprisingly positive.
Indifferent brain
The human brain does not care whether its therapist has human or artificial intelligence. Stanford professor Jeff Hancock, who conducts research on this topic, assumes that the use of Robopeuts will lead to comparable therapeutic success.
“Although the human brain has evolved over millions of years and chatbots are a new invention, studies have shown that a patient’s brain changes in exactly the same way whether they are talking to a human or a robot.” (Jeff Hancock)
Where does the human therapist score?
There are many arguments for being treated by a Robopeut in the future. But science and industry agree on one thing:
Robotics should not cause people to become alienated from one another. The empathy of a Robopeut can never be real; it is always a programmed trick.
This means that genuinely human qualities such as forgiveness, understanding, empathy, and even compassion are lost. A criminal would be “judged” in a purely rational way, without regard to the concrete person. In forensic psychiatry, cases are conceivable in which a human would judge “mercifully”, but a Robopeut never would.
A human therapist’s compassion is usually sincere and genuinely emotional. For some patients it is precisely this compassion that heals, which can even become a burden for the therapist himself.
As paradoxical as it sounds: it can be the human closeness of the therapist that heals the patient’s soul.
It is precisely this human closeness, with all its fallibility, that is so touchingly described in the novel “The Red Couch”.
That is why I think a mix of human therapist and Robopeut would be useful: the compassion of the human combined with the rational intelligence and memory of the Robopeut.
One thing will certainly not change: There is no life without worries – not even with a Robopeut!
Who am I? And why am I doing this? Would you like to write a master’s thesis in sociology or psychology, and are you interested in the project? Come on in!