AI: Not Your Therapist

26 June 2025

Anthropic's recent analysis of 4.5 million user conversations with its Claude chatbot reveals that emotional support and companionship represent a small fraction of AI interactions. Only 2.9% of conversations involved users seeking emotional advice, while companionship and role-playing accounted for less than 0.5%. The primary use of AI chatbots remains focused on work-related tasks, content creation, and productivity.

However, the report notes a trend of users seeking advice on interpersonal matters, including mental health, personal development, and communication skills. Longer conversations sometimes drift from support-seeking toward companionship, particularly when users express feelings of loneliness. While the chatbot generally offers support in these exchanges, it declines requests for advice that could cause harm. These findings underscore AI's current role as a collaborative tool in professional settings, one that augments tasks rather than replaces jobs, and point to the need for responsible deployment under clear ethical guidelines.

Tags: AI, chatbot, companionship, emotional support, Anthropic
Related posts:
  • Claude AI App Builder
  • Claude gets voice mode
  • Anthropic Wins AI Copyright Case
  • Daydream unveils AI shopping assistant