---
id: "claim-ai-artificial-intimacy"
type: "claim"
source_timestamps: ["02:01:30"]
tags: ["mental-health", "societal-impact"]
related: ["concept-artificial-intimacy", "concept-complicated-vs-complex", "quote-simulation-of-life", "question-ai-intimacy-enforcement"]
confidence: "high"
testable: true
---
# Using AI for emotional connection will increase loneliness.

Drawing a parallel to the rise of social media, [[entity-arthur-brooks|Arthur Brooks]] claims that turning to AI for friendship, therapy, or romance — i.e., [[concept-artificial-intimacy|Artificial Intimacy]] — will **backfire**.

## The Mechanism

Because AI cannot genuinely experience love or suffering, it provides only a **simulation of connection**. Relying on that simulation to solve the [[concept-complicated-vs-complex|complex]] human problem of loneliness ultimately leaves people more isolated and depressed, because it starves them of the real right-brain, human-to-human interaction the problem actually requires.

## Captured Slogan

> [[quote-simulation-of-life|"If you look for love, happiness, and meaning in tech, you wind up living in a simulation of life."]]

## Confidence & Validation

- **Confidence:** high.
- **Testable:** yes.
- **External validation:** no direct longitudinal data yet. Adjacent evidence (AI's shallow validation patterns, the social-media analogy) is consistent with the claim. The **resolution path** is in [[question-ai-intimacy-enforcement]] — longitudinal studies on heavy AI-companion users versus controls.

## See Also

- [[concept-artificial-intimacy]]
- [[framework-right-brain-exercise]] — the prescribed alternative.
