“Siri, I’m Feeling Depressed”

Can artificial intelligence replace mental health care professionals?

By: Alyssa Parking, Contributor

Photo credit: Sam Mathers

Apple recently announced that it is working to make Siri better at responding to consumers’ mental health issues. As artificial intelligence evolves, people have been increasingly turning to Siri for mental health advice. Diana Inkpen, a professor at the University of Ottawa, told the CBC that this is a natural progression in user behaviour.

When asked, “Siri, I am depressed, what should I do?” Siri currently responds: “I’m very sorry. Maybe it would help to talk to someone about it.” Only when asked directly where to find help does Siri begin to suggest resources, and the ones it offers are relatively out of date.

Creating an app or artificial intelligence system that can respond to people experiencing mental health crises is extremely complex for many reasons. One such issue is language: people can signal distress with countless different combinations of words, and a comprehensive system must be able to recognize that effectively unlimited range of expressions across many different languages.
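To make the challenge concrete, consider the deliberately naive keyword matcher sketched below in Java (chosen because the job listing discussed later in this article asks for Java experience). Every class name, method name, and phrase in it is invented for illustration; it is not how Siri works. It catches a handful of fixed English phrases but misses indirect wording entirely, and it says nothing about other languages.

import java.util.List;
import java.util.Locale;

// A deliberately naive sketch: fixed-phrase matching for distress.
// All names and phrases here are invented for illustration only.
public class NaiveDistressDetector {

    // A hand-written list of English phrases can never cover the countless
    // ways people actually express distress, let alone other languages.
    private static final List<String> DISTRESS_PHRASES = List.of(
            "i am depressed",
            "i'm feeling depressed",
            "i want to hurt myself",
            "i can't go on"
    );

    public static boolean looksLikeDistress(String utterance) {
        String normalized = utterance.toLowerCase(Locale.ROOT);
        return DISTRESS_PHRASES.stream().anyMatch(normalized::contains);
    }

    public static void main(String[] args) {
        // Caught: the utterance contains a listed phrase.
        System.out.println(looksLikeDistress("Siri, I am depressed, what should I do?"));
        // Missed: the same feeling, phrased indirectly.
        System.out.println(looksLikeDistress("Everything has felt pointless lately"));
    }
}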

Apple is not the only company taking a closer look at artificial intelligence and mental health. Other companies, such as Google, along with dedicated app developers, have joined the competition. The U.S. Department of Veterans Affairs is one of many groups that have begun developing digital support for veterans experiencing PTSD. Little information is currently disclosed to the public because of the competitive nature of the field: artificial intelligence in mental health has evolved into a virtual arms race.

To work on this project, Apple is looking to fill the position of Siri Software Engineer, Health and Wellness, and hopes to employ individuals with a background in both computer engineering and psychology. The key qualifications of the job, as listed on the company’s website, include:

“Strong Computer Science fundamentals including familiarity with basic algorithms”

“In-depth development experience with server-side Java and web services”

“Strong object-oriented programming and design skills”

“Experience writing multi-threaded code”

“Experience integrating data and services from multiple providers”

“Experience in developing highly scalable systems”

Any mention of specific experience working in or studying the mental health field comes only at the end of the job listing, under “additional qualifications,” which asks for a “peer counseling or psychology background, with excellent problem solving, critical thinking, and communication skills,” and “ideally [having] knowledge of one or more foreign languages.” The educational requirements make no mention of mental health, psychology, or peer counseling; they call instead for a BS/MS in Computer Science or equivalent.
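To give a sense of what those engineering bullets translate to in practice, here is a minimal, hypothetical sketch in server-side Java of a service that sends a lookup to several resource providers on a thread pool and merges whatever comes back, roughly the kind of “multi-threaded code” and “integrating data and services from multiple providers” the listing describes. The interface, class, and method names are invented for illustration and say nothing about Apple’s actual system.

import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;

// Hypothetical sketch only: fan a lookup out to several providers in
// parallel and merge the results, skipping any provider that is slow
// or fails. None of these names come from Apple's listing or products.
public class SupportResourceService {

    // A provider might wrap a crisis-line directory, a clinic locator, etc.
    interface ResourceProvider {
        List<String> findResources(String region) throws Exception;
    }

    private final List<ResourceProvider> providers;
    private final ExecutorService pool = Executors.newFixedThreadPool(4);

    SupportResourceService(List<ResourceProvider> providers) {
        this.providers = providers;
    }

    // Query every provider concurrently, wait up to two seconds, keep what succeeds.
    public List<String> lookup(String region) throws InterruptedException {
        List<Callable<List<String>>> tasks = new ArrayList<>();
        for (ResourceProvider provider : providers) {
            tasks.add(() -> provider.findResources(region));
        }

        List<String> merged = new ArrayList<>();
        for (Future<List<String>> result : pool.invokeAll(tasks, 2, TimeUnit.SECONDS)) {
            try {
                merged.addAll(result.get());
            } catch (Exception e) {
                // A provider that failed or timed out is simply left out of the answer.
            }
        }
        return merged;
    }
}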

While the attempt to equip Siri with this new ability reflects a growing awareness of the importance of mental health, could give people a way to access support when they are otherwise unable to reach out to others, and could raise awareness of rights and the availability of treatment, several concerns have been raised about artificial intelligence and mental health.

One such concern is the lack of privacy. Mental-health services provided by artificial intelligence are not protected under any federal privacy laws, because they are not registered medical services, so users are not afforded the same privacy protections. All information and data collected remain open for Apple’s use.

Furthermore, it is stressed that Siri and other artificial intelligence tools are not currently able to replace mental health professionals; rather, they only help individuals access the support they require. People looking for support are advised to use apps and tools that involve actual clinicians, or that at least publish patient statistics.

If you or someone you know needs support regarding mental health, there are services in place on the Lakehead Campus, such as the Student Health and Counselling Centre. For immediate support in Ontario, please call:

Mental Health Help Line: 1-866-531-2600

Ontario College and University Students: 1-866-925-5454

Kids Help Phone (Ages 20 and Under in Canada): 1-800-668-6868

First Nations and Inuit Hope for Wellness 24/7 Help Line: 1-855-242-3310

Canadian Indian Residential Schools Crisis Line: 1-866-925-4419

Trans Lifeline (All Ages): 1-877-330-6366