The suicide rate in the United States has been climbing steadily over the last 20 years, with a 35% increase between 1999 and 2018. While data on the impact of the COVID-19 pandemic on suicidality remains limited to date, experts concur that the pandemic’s exacerbation of suicide risk factors, such as financial strain, social isolation, and stress, will result in further tragic increases in suicide deaths. There is no doubt that we are facing a suicide crisis. The good news is that suicide is preventable, but prevention requires everyone to do their part.

In order to make progress, we all have a role to play in preventing suicides. However, like most complex societal issues, there is no simple solution. As a voluntary outpatient digital health provider specializing in substance use, Confidant is not a suicide crisis hotline and is not suited to treat patients who are actively suicidal. Active suicidality is of the utmost seriousness and concern to all medical providers, and because safety cannot be properly managed over a virtual platform, anyone experiencing suicidal thoughts must always be referred to in-person care. Like many virtual care providers, we have an upfront screening process for suicidality so that individuals experiencing suicidal thoughts are promptly connected to resources that can meet their needs.
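As a concrete illustration of what such an upfront gate can look like, here is a minimal sketch in Python; the names and single screening flag are hypothetical assumptions meant only to show the routing logic, not Confidant’s actual implementation.

```python
# Minimal, hypothetical sketch of an upfront suicidality screen in a virtual intake flow.
# Names (ScreeningResult, screen_for_suicidality) are illustrative assumptions only.
from dataclasses import dataclass


@dataclass
class ScreeningResult:
    eligible_for_virtual_care: bool
    next_step: str


def screen_for_suicidality(reports_suicidal_thoughts: bool) -> ScreeningResult:
    """Route anyone reporting suicidal thoughts to crisis resources and in-person care
    instead of continuing virtual onboarding."""
    if reports_suicidal_thoughts:
        return ScreeningResult(
            eligible_for_virtual_care=False,
            next_step="Show crisis resources and refer to in-person care",
        )
    return ScreeningResult(eligible_for_virtual_care=True, next_step="Continue onboarding")
```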

“If you or your loved one are experiencing an emergency situation, you should call 911 or the National Suicide Prevention Lifeline at 1-800-273-8255 (en Español: 1-888-628-9454; deaf and hard of hearing: 1-800-799-4889) or the 24/7 Crisis Text Line by texting 741741.”

This message is also clearly posted throughout the app to ensure Confidant guests have easy access to the hotlines and are always aware that we are not an emergency service provider. But digital health providers have an opportunity to go beyond screening questions for new clients and posted messages and to play a more active role in suicide prevention.

At Confidant, we embrace this idea by ensuring that all of our team members, not just clinical staff, are gatekeepers trained to screen for and respond to suicidality with the appropriate referrals and protocols. We have also developed proactive crisis plans for clients already under our care who did not come into the app with suicidality as a concern. As we all know, life can change at any moment; as healthcare providers we are mindful of that reality and want to ensure that our team, as well as the Confidant community, is as well equipped as possible to prevent suicide.

Our strategies for doing so are detailed below, and we encourage other digital health providers to adopt them:


Train every team member to ask about suicidal thoughts and what to do:

It’s critical to reverse the harmful myth that only trained behavioral health professionals should ask about thoughts of suicide. Waiting for someone experiencing suicidal thoughts to reach a trained behavioral health professional before asking directly about suicide can delay a potentially life-saving intervention.

Individuals who are not trained as clinicians or therapists should be empowered to ask about and respond to suicide as part of standard protocol. This creates more points within your organization at which risk can be detected and an intervention made.

The National Suicide Prevention Lifeline created the #BeThe1To campaign to spread the word about actions we can all take to prevent suicide. Their website details the five steps to take when interacting with someone who may be suicidal. The first is to ask clearly and directly about suicide. They refute the misunderstanding that asking about suicide may “plant the seed” of suicidal thoughts: “Studies show that asking at-risk individuals if they are suicidal does not increase suicides or suicidal thoughts. In fact, studies suggest the opposite: findings suggest acknowledging and talking about suicide may in fact reduce rather than increase suicidal ideation.”

The five steps of #BeThe1To also align with the three-step process of “Question, Persuade, Refer” (QPR). Training on these steps is available for both clinical and non-clinical professionals, also known as “gatekeepers.”

At Confidant, a prospective client who expresses suicidal thoughts to non-clinical staff, such as a Matchmaker or Coach, would not be appropriate for our services. Instead of onboarding the client, the staff member’s role is to follow the QPR protocol: listen without judgement, explain that there are resources better suited to help the individual and why, refer the individual to appropriate care, and report the incident to the appropriate authorities. Our non-clinical staff also immediately report any incident of this nature to clinicians for support in handling the situation and debriefing.

At Confidant, every human interaction can be an opportunity to prevent suicide, one that goes beyond the constant but passive automated message.


Train every team member on how to develop crisis plans:

In addition to training team members to ask about suicide and intervene if necessary, we train our team to work with clients to develop crisis plans. A crisis plan is only effective if it is in place before an actual crisis occurs. This is always an optional exercise, and it does not pertain directly to suicide but rather to whatever our clients determine may be a “crisis” situation for them. Given our specialization in substance use, this may be related to triggers or cravings.

Crisis plans, such as the Stanley-Brown Safety Plan Template, include warning signs; coping strategies, or what we refer to simply as “activities,” that can change the situation or the person’s mindset; and social and professional supports. This also creates the opportunity to connect a client to both clinical services and community resources. Digital health providers in other specialty areas, especially those that treat conditions that commonly co-occur with suicidality, should consider working on crisis plans with their clients. Importantly, developing a crisis plan is never an option for a client who is actively suicidal; that client instead requires the QPR protocol and the involvement of the appropriate authorities to ensure their safety.
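For digital providers representing these plans in an app, a minimal sketch of one possible structure follows; the field names are illustrative assumptions that mirror the elements above, not a standard or Confidant-specific schema.

```python
# Minimal sketch of a crisis plan record: warning signs, coping "activities," and
# social and professional supports. Field names are illustrative assumptions only.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Support:
    name: str
    contact: str
    kind: str  # e.g. "friend", "clinician", "community resource"


@dataclass
class CrisisPlan:
    warning_signs: List[str] = field(default_factory=list)   # thoughts, feelings, or situations that signal a crisis
    activities: List[str] = field(default_factory=list)      # coping strategies that can change the situation or mindset
    social_supports: List[Support] = field(default_factory=list)
    professional_supports: List[Support] = field(default_factory=list)


# Example: a plan built collaboratively with a client working on substance use
plan = CrisisPlan(
    warning_signs=["strong craving after a stressful day"],
    activities=["go for a walk", "call a friend", "breathing exercise"],
    social_supports=[Support("Jamie", "555-0100", "friend")],
    professional_supports=[Support("Care team clinician", "555-0101", "clinician")],
)
```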


Create a collaborative virtual environment:

As mentioned, Confidant’s team includes a mix of clinical and non-clinical staff. Creating an environment with collaboration across disciplines, specialties, and professional designations is critical to supporting our clients well. This applies both to providing better clinical care, since team members can ask one another for input and support, and to managing our own team’s mental wellbeing.

The last year and a half has been hard on everyone. Working in an environment where you may never have met your own colleagues in person (despite spending a lot of time in their homes via Zoom), and where every person faces additional stressors, is difficult. Additionally, the fields of mental health and substance use disorder treatment are notorious for burnout. Fostering a culture that enables team members to live true to their values and feel fulfilled and supported in their duties, and that prioritizes wellbeing, is a critical foundation for the helping professions.


Include information about suicide beyond just crisis support resources:

People may feel more comfortable disclosing suicidality to a chatbot or through a virtual platform than in person. Virtual platforms allow people to gather information without fear of judgement or stigma. Recognizing this, virtual health platforms should include information about suicidal thoughts beyond simply referring users to a suicide helpline. A direct referral to a lifeline without any additional information may feel like an overreaction, scary, or discouraging to those seeking support, particularly people with passive suicidal ideation or those looking for information on how to help a loved one.

At Confidant, we start by reiterating that suicidal thoughts must always be taken seriously. We then offer brief information to answer common questions: that suicidal thoughts are not normal but can be treated; that active suicidality must be treated in person rather than virtually; that suicidal thoughts may be associated with underlying mental health and substance use conditions or with medications; what the warning signs of suicide are; and how to support someone who is suicidal (the five action steps).

Digital health platforms, particularly those that intersect with mental and behavioral health, will inevitably have clients who are experiencing suicidal thoughts to some degree. Providing factual and trustworthy information, in addition to directing clients where to go for help, may be the gateway a client needs to seek support.


Create off ramps for clinical interventions further upstream:

At Confidant, we specialize in substance use, but our goal is to help people thrive. We treat mental health conditions and support people in improving their quality of life. As such, we gather information on many wellbeing indicators within our app. These indicators, such as sleep hygiene or anxiety, as well as metrics related to the frequency and intensity of substance use, can be used as “off ramps” to suggest that an individual meet with a clinician.

For example, Confidant guests may choose to use our app to learn strategies to reduce their drinking and adopt healthier habits, and in the process report that their sleep is worsening and their energy levels are low. A negative change in quality of life creates an opportunity to suggest meeting with a clinician. We include these prompts throughout a user’s engagement with our app, using motivational interviewing techniques shown to increase willingness to seek support and move through the stages of change. This presents an opportunity to intervene earlier in the course of a mental health downturn. For clients who are already meeting with Confidant clinicians, negative trends inform measurement-based care and are used to tweak and optimize interventions to get back on track.
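As one way this can work in practice, here is a minimal sketch of the “off ramp” check, assuming self-reported indicators on a simple numeric scale and a made-up trend rule; the indicator names, window, and threshold are illustrative assumptions, not Confidant’s actual logic.

```python
# Minimal sketch of the "off ramp" idea: watch self-reported wellbeing indicators and
# prompt a clinician visit when a negative trend appears. The scale, window, and
# one-point threshold are illustrative assumptions only.
from statistics import mean
from typing import Dict, List

# Lower scores mean worse self-reported wellbeing (e.g. sleep quality or energy on a 1-10 scale)
WellbeingHistory = Dict[str, List[float]]


def has_negative_trend(scores: List[float], window: int = 7) -> bool:
    """Flag an indicator when the recent average falls meaningfully below the earlier average."""
    if len(scores) < 2 * window:
        return False
    earlier, recent = scores[-2 * window:-window], scores[-window:]
    return mean(recent) < mean(earlier) - 1.0  # assumed threshold: a one-point drop


def indicators_to_discuss(history: WellbeingHistory) -> List[str]:
    """Return the indicators whose decline should prompt an invitation to meet with a clinician."""
    return [name for name, scores in history.items() if has_negative_trend(scores)]


# Example: a guest working on reducing their drinking whose sleep and energy are slipping
history = {
    "sleep_quality": [7, 7, 6, 7, 7, 6, 7, 5, 5, 4, 5, 4, 4, 3],
    "energy": [6, 6, 7, 6, 6, 6, 6, 5, 4, 4, 4, 3, 4, 3],
}
declining = indicators_to_discuss(history)
if declining:
    print(f"We've noticed changes in {', '.join(declining)}. Would you like to meet with a clinician?")
```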


Moving forward

If digital health providers implement only the bare minimum, directing potentially suicidal users to the National Suicide Prevention Lifeline and Crisis Text Line, as their “Suicide Prevention Strategy,” they are falling short. Suicide is in everyone’s lane, and we encourage all technology platforms and professionals operating in this space to adopt and share more proactive and creative strategies for addressing this crisis at each step in the care journey.