A young person holding a cell phone.

Artificial intelligence could help crisis center workers to better intervene with people having suicidal thoughts.

Photo: Shutterstock / Motortion Films

Speech analysis using an artificial intelligence tool will make it possible to assess suicide risks, says data science engineer Alaa Nfissi, who is currently studying for a doctorate at Concordia University in Montreal.

His doctoral project focuses on the development of artificial intelligence techniques allowing the automatic detection of emotions from telephone conversations.

The idea is to detect emotions in the speech of people who regularly call support lines because of suicidal thoughts, explains Mr. Nfissi, who is also a member of the Center for Research and Intervention on Suicide, Ethical Issues and End-of-Life Practices (CRISE).

To achieve this, he created a deep learning model. Ultimately, the researcher hopes his model will lead to a real-time dashboard that telephone counsellors can use during interventions with callers struggling with their emotions, helping them choose the right strategy.

Two telephone operators work face to face.

Researcher Alaa Nfissi’s model could help telephone advisors choose the right intervention strategy.

Photo: Radio-Canada / ALEXANDRE LAMIC

Speech, mirror of the soul

Speaking is essential to understanding the emotional state of a person in crisis. For this reason, it plays a major role in detecting suicidal thoughts.

The people who answer the phone on help lines are trained to understand the emotions of the callers and know very well how to detect them.


But no system is perfect: the interpretation of a suicidal person's words can sometimes be wrong.

The reality is that an intervention of this type requires a lot of expertise. What we want to do is standardize part of the emotion detection process to help all agents, including those who are less experienced, make the best decisions to help callers.

A quote from Alaa Nfissi, doctoral student at Concordia University


  • Approximately 4,500 people commit suicide each year in Canada, which equates to 12 people per day.
  • In Quebec, 1,030 people take their own lives annually, which is equivalent to 3 people per day.
  • More than 200 people attempt suicide every day in the country.
  • There are around thirty suicide prevention centers in Quebec.
A telephone operator wears a call headset and looks at a computer screen.

Speech plays a major role in detecting suicidal thoughts.

Photo: Radio-Canada / ALEXANDRE LAMIC

The model used in the study

To create the model, Alaa Nfissi used a database made up of real calls recorded by organizations dedicated to suicide prevention, and a compilation of recordings used in research made by actors expressing specific emotions.

As part of a protocol adapted to this type of task, the two sets were segmented, then annotated either by CRISE-trained researchers or by the participants who made the recordings themselves, specifies Mr. Nfissi.

Each annotated segment thus reflects a particular state of mind, such as anger, indifference, sadness, fear, worry or concern.

The data was then analyzed using Alaa Nfissi’s deep learning model.

We take a vocal segment as is and pass it through the model, which extracts the characteristics of the voice needed to detect the emotion.

A quote from Alaa Nfissi, doctoral student at Concordia University
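The pipeline Mr. Nfissi describes takes raw audio in and produces an emotion label out. As a very rough sketch of the first stage, a waveform can be cut into short frames and summarized by classic acoustic features; the frame sizes and features below are illustrative assumptions, not details from the study, whose model learns its own features end to end:

```python
import numpy as np

def frame_features(signal, sr=16000, frame_ms=25, hop_ms=10):
    """Split a waveform into overlapping frames and compute two
    classic acoustic features per frame: short-time energy and
    zero-crossing rate. A deep model would learn richer features."""
    frame = int(sr * frame_ms / 1000)
    hop = int(sr * hop_ms / 1000)
    feats = []
    for start in range(0, len(signal) - frame + 1, hop):
        w = signal[start:start + frame]
        energy = float(np.mean(w ** 2))                      # loudness proxy
        zcr = float(np.mean(np.abs(np.diff(np.sign(w)))) / 2)  # pitch/noisiness proxy
        feats.append((energy, zcr))
    return np.array(feats)

# Synthetic one-second "call segment": a 220 Hz tone plus light noise.
sr = 16000
t = np.linspace(0, 1, sr, endpoint=False)
segment = 0.5 * np.sin(2 * np.pi * 220 * t) \
    + 0.01 * np.random.default_rng(0).normal(size=sr)

features = frame_features(segment, sr=sr)
print(features.shape)  # one (energy, zcr) pair per frame
```

In the actual model, a classifier head would then map such per-frame representations to one of the emotion categories.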

According to the researcher, the results of his model are encouraging. The model correctly identified:

fear, worry or concern in 82% of cases;
  • indifference in 78% of cases;
  • sadness in 77% of cases;
  • anger in 72% of cases.
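Figures like these correspond to per-class recall: out of all segments truly expressing an emotion, the share the model labeled correctly. A minimal sketch with a made-up confusion matrix (the counts are illustrative, not the study's data) shows how such percentages are computed:

```python
import numpy as np

# Hypothetical confusion matrix: rows = true emotion, columns = predicted.
# Each row sums to 100 segments for easy reading; numbers are invented.
labels = ["fear/worry/concern", "indifference", "sadness", "anger"]
cm = np.array([
    [82,  6,  7,  5],
    [ 9, 78,  8,  5],
    [ 8,  8, 77,  7],
    [10,  9,  9, 72],
])

# Per-class recall: correct predictions divided by true instances per row.
recall = cm.diagonal() / cm.sum(axis=1)
for name, r in zip(labels, recall):
    print(f"{name}: {r:.0%}")
```

With these invented counts, the diagonal entries reproduce the percentages quoted in the article (82%, 78%, 77%, 72%).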

The model was very effective in recognizing segments recorded by actors, with its accuracy rate ranging from 78% for sadness to 100% for anger.

Our model is more effective than models developed by other researchers, and we are always working to improve it. This is ongoing work, since more data is important for greater precision, insists the researcher.

The model is still at the prototype stage and is not yet used in call centers. Getting there will require financial investment to refine the tool and build a reliable dashboard.

Researching and producing a call-center-ready tool are two different worlds. Implementing a model like ours requires a significant investment in artificial intelligence.

A quote from Alaa Nfissi, doctoral student at Concordia University

The researcher estimates that if adequate funding were secured soon, the model could reach call centers within a year or two.

Alaa Nfissi’s model was first presented in February 2024, at the 18th IEEE International Conference on Semantic Computing (ICSC), held in Laguna Hills, California.

Photo of a cell phone where the number 988 is dialed.

The AI model could reach call centers in a year or two.

Photo: Radio-Canada / Kheira Morellon


Using artificial intelligence for vocal emotion recognition comes with challenges, such as privacy and the risk of bias, estimates the engineer.

In artificial intelligence, the risk of bias refers to the tendency of deep learning systems to reproduce biases present in the data they are trained on.

If the data contains unequal or stereotypical representations of certain groups of people, the AI might be less accurate for those groups. For example, if a system is trained predominantly on voices from certain regions or ethnic groups, it may be less able to correctly recognize emotions expressed by people from other regions or backgrounds. This can lead to unintentional discrimination and undermine the fairness of the technology.

A quote from Alaa Nfissi, doctoral student at Concordia University

It is therefore essential to adopt data protection measures and carefully monitor bias in order to make the best use of these technologies while respecting ethical and legal standards, adds Alaa Nfissi, who insists that AI also offers significant opportunities to improve communication and human interaction.

Other uses

Speech analysis using an artificial intelligence tool could also be useful in many areas, including telephone customer service and distance education.


It really is a very broad area of research that can be applied to several aspects of society, concludes Alaa Nfissi.

Do you need help for yourself or a loved one?

  • On the web: www.suicide.ca
  • By telephone: 1 866 APPELLE (277-3553)
  • By text: 535353
