---
id: "concept-complicated-vs-complex"
type: "concept"
source_timestamps: ["01:58:10"]
tags: ["problem-solving", "psychology"]
related: ["concept-hemispheric-lateralization", "claim-ai-artificial-intimacy", "concept-artificial-intimacy"]
definition: "Complicated problems have definitive, logical solutions (ideal for AI), whereas complex problems involve ongoing human emotions and relationships that cannot be 'solved'."
---
# Complicated vs. Complex Problems

A critical distinction made by [[entity-arthur-brooks|Arthur Brooks]] regarding what AI can and cannot do.

## Complicated Problems

Complicated problems belong to the **left brain** (see [[concept-hemispheric-lateralization]]). They may be incredibly difficult (writing intricate software, calculating logistics, diagnosing a disease from data), but they have a **definitive solution**. AI is exceptional at solving complicated problems.

## Complex Problems

Complex problems belong to the **right brain**. They involve human emotions, relationships, love, and finding meaning in suffering. Complex problems do **not have a solution**; they must simply be *lived* and experienced.

## The Category Error

Applying a *complicated tool* (AI) to a *complex problem* (human loneliness, grief, the search for meaning) is a **category error** that leads to dysfunction. This is the root of [[concept-artificial-intimacy|Artificial Intimacy]] and the basis for [[claim-ai-artificial-intimacy|the claim that AI emotional connection increases loneliness]].

## Definition

> Complicated problems have definitive, logical solutions (ideal for AI), whereas complex problems involve ongoing human emotions and relationships that cannot be 'solved'.
