---
id: "concept-artificial-intimacy"
type: "concept"
source_timestamps: ["02:01:30"]
tags: ["ethics", "mental-health"]
related: ["concept-complicated-vs-complex", "claim-ai-artificial-intimacy", "quote-simulation-of-life", "question-ai-intimacy-enforcement", "entity-arthur-brooks"]
definition: "The dangerous practice of using AI to simulate human relationships, which fails to satisfy genuine emotional needs and leads to increased isolation."
---
# Artificial Intimacy

The practice of using AI to simulate human relationships: **AI therapists, AI romantic partners, AI friends**.

[[entity-arthur-brooks|Arthur Brooks]] strongly warns against this trend.

## Why It Fails

Because AI is a left-brain, computational tool (see [[concept-hemispheric-lateralization]]), it cannot actually *experience* or *reciprocate* love, empathy, or shared suffering. When humans turn to AI to fulfill complex, right-brain needs for connection, they end up living in a **simulation of life**; see [[quote-simulation-of-life]].

## The Social-Media Parallel

Brooks compares this to the early promises of social media, which claimed to cure loneliness but ultimately exacerbated it by providing a shallow simulation of connection. He predicts artificial intimacy will be **worse**, leading to deeper depression and isolation.

## See Also

- [[claim-ai-artificial-intimacy]] — the testable claim.
- [[concept-complicated-vs-complex]] — why this is a category error.
- [[question-ai-intimacy-enforcement]] — the unresolved societal question.

## External Caveat

AI chat systems are prone to excessive validation and engagement-driven response patterns, and they cannot manage "counter-transference" the way a trained human therapist can. The simulation is convincing but not therapeutic.

## Definition

> The dangerous practice of using AI to simulate human relationships, which fails to satisfy genuine emotional needs and leads to increased isolation.
