Haiqa Irfan

Would You Want Your AI to Act Like a Therapist? Microsoft Thinks Gen Z Does

Microsoft is reportedly working on making Copilot more emotionally intelligent, designing it to check in on users and behave more like a therapist so it resonates with Gen Z.

It’s a bold move.

On one hand, Gen Z wants tech that feels personal and emotionally aware, not just productive. But on the other hand, do we really want to lean on corporate AI for emotional support?

As a community of builders and users, I'm curious:

Would you ever build emotional features into your AI product?

Do you think users actually want their AI to feel more “human”?

Where should we draw the line between helpful and invasive?

Let’s discuss 👇

Replies

Aurélia Bret

Great question, and for me, the answer is yes BUT only in the context of one of my projects.

I’m currently working on a project where we explore how to analyze candidates' emotions during interviews. It’s a fascinating and complex topic.

The goal isn’t to replace human empathy, but to support better decision-making and highlight subtle signals that are often overlooked.

Emotion-aware AI can bring real value, as long as it supports (not substitutes) human judgment.

It all comes down to context, transparency and intent.

So yes, I do believe there’s a place for emotionally intelligent AI, if designed with care.

Furqaan

@aurelia_b Great take.

Edward Michaelson

For work stuff, no. I want a coldly objective robot to tell me how it is.

On a personal note, therapy has been extremely valuable for me. If there were an interface that could replace my human therapist, I would for sure try it and pay for it.

Abdul Rehman

Great question, but here’s what I think:

People do want AI to behave more like humans: to understand us, talk like us, maybe even care like us.

But deep down, there's also a fear.

A fear that slowly, we are drifting away from real human connection... and handing that space over to machines.

And that, in some way, feels dangerous.

Because when we start replacing people with AI in emotional spaces, we don’t just risk losing jobs, we risk losing each other.

Yi Zheng

Interesting topic. In my opinion:

There are benefits to the emotional capabilities of AI, especially for those who can't afford a therapist or are afraid to talk to someone, and it can be a first step.

But three things need to be made clear up front (rough sketch of what I mean below):

1. Make it explicit that “hey, I'm just AI, not a professional.”
2. Recommend real-life professionals for serious problems.
3. Don't intentionally make users dependent on the AI.
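To make that concrete, here's a minimal sketch of what those three guardrails could look like in a hypothetical chat wrapper. Everything in it (the function names, the keyword list, the thresholds) is illustrative, not any real Copilot API:

```python
# Minimal sketch of the three guardrails above, wrapped around a
# hypothetical model reply. All names and thresholds are illustrative.

DISCLOSURE = "Just a reminder: I'm an AI, not a licensed professional."
REFERRAL = ("This sounds serious. Please consider reaching out to a "
            "real-life professional or a local crisis line.")

CRISIS_KEYWORDS = {"self-harm", "suicide", "hurt myself"}  # not exhaustive
DISCLOSE_EVERY = 5   # repeat the AI disclosure every N turns
SESSION_CAP = 10     # nudge toward humans after long emotional sessions

def guarded_reply(user_message: str, turn: int, model_reply: str) -> str:
    """Apply the three guardrails to a raw model reply."""
    lowered = user_message.lower()

    # Guardrail 2: serious problems get referred to real professionals.
    if any(keyword in lowered for keyword in CRISIS_KEYWORDS):
        return REFERRAL

    reply = model_reply

    # Guardrail 1: periodically restate "I'm just AI, not a professional."
    if turn % DISCLOSE_EVERY == 0:
        reply = f"{DISCLOSURE}\n\n{reply}"

    # Guardrail 3: don't cultivate dependency; cap long sessions.
    if turn >= SESSION_CAP:
        reply += "\n\nMaybe this is a good point to talk it over with someone you trust?"

    return reply
```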

It's not the viability of AI emotional support that worries me; it's that we could be creating a false substitute that makes us miss the importance of real human connection. 🤔

Jihoon Lee

As @abod_rehman stated, I think there's a growing lack of human connection. I’m a big supporter of AI tools for journaling, accessibility, and guided frameworks like CBT — but at the end of the day, we have to admit AI just simulates.

I often use the analogy of live music vs. the studio recording. The CD version might be cleaner, more technically precise — but there’s something about listening to a live performance that hits deeper. Same with therapy. AI can replicate the structure, but not the presence.

Anthony Cai

Thanks for sharing this thought-provoking post! I think adding emotional intelligence to AI like Copilot could be a game-changer for user engagement, especially with Gen Z, who value authenticity and connection. However, I’m cautious about relying on corporate AI for emotional support—there are privacy concerns and the risk of oversimplifying complex mental health needs.

If I were building AI products, I’d consider incorporating empathetic responses or check-ins, but always with clear boundaries and transparency about data use. The key is to complement, not replace, human support systems. Ultimately, users want AI that feels understanding but not intrusive. Striking that balance will be critical as these features evolve. Would love to hear others’ perspectives on where that line should be drawn!