Page 1: What AI mental health chatbots can do
“LLM-based systems are being used as companions, confidants, and therapists.”
Stanford HAI, “Exploring the Dangers of AI in Mental Health Care”
Page 2: What AI mental health chatbots cannot do
“Findings suggest practitioners favor administrative uses of generative AI.”
Goldie et al., 2025, JMIR article summary
Page 3: Why this matters for college students
- Students may rely on AI instead of reaching out to a counselor or trusted adult.
- Students may accept inaccurate information because it is delivered confidently.
- Students may feel worse if a chatbot's response is shallow during a serious moment.
- Students may feel isolated if AI becomes their only source of support.
Page 4: How to use AI safely and responsibly
Lower-stakes situations where an AI chatbot can reasonably help:
- Mild stress and everyday anxiety
- Journaling prompts and reflection
- Grounding and breathing routines
- Planning simple self-care habits
- Practicing reframing negative thoughts
Situations that call for a real person, not a chatbot:
- Thoughts of self-harm or suicide
- Trauma disclosure and processing
- Diagnosing mental health conditions
- Severe depression or panic attacks
- Abuse situations or active danger
Ground rules for any use:
- Use AI as a supplement, not a replacement.
- Be skeptical of confident advice and double-check important claims.
- Do not share sensitive personal information unless you understand how the app stores your data.
- If something feels serious, reach out to a real person and do not wait.
- Save your campus counseling number and crisis resources before you need them.