Experiments

V35: Language Emergence

Status: Complete (10 seeds).

Hypothesis: Language emerges when: (1) partial observability creates information asymmetry (obs_radius=1), (2) discrete channel forces categorical representation (K=8 symbols), (3) cooperative pressure rewards signaling (1.5x bonus for co-consumption), (4) communication range exceeds visual range (comm_radius=5 > obs_radius=1).
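The four conditions can be stated as a simple configuration check. This is an illustrative sketch only; the parameter names (`obs_radius`, `num_symbols`, `coop_bonus`, `comm_radius`) mirror the hypothesis text, not the experiment's actual API.

```python
# Hypothetical V35 configuration; names are illustrative, not the real code's.
V35_CONFIG = {
    "obs_radius": 1,    # (1) partial observability -> information asymmetry
    "num_symbols": 8,   # (2) discrete channel, K = 8 categorical symbols
    "coop_bonus": 1.5,  # (3) reward multiplier for co-consumption
    "comm_radius": 5,   # (4) communication range exceeds visual range
}

def conditions_hold(cfg: dict) -> bool:
    """Check the four hypothesized preconditions for language emergence."""
    return (
        cfg["obs_radius"] >= 1          # agents see only a local patch
        and cfg["num_symbols"] > 1      # channel is genuinely categorical
        and cfg["coop_bonus"] > 1.0     # cooperation pays more than solo play
        and cfg["comm_radius"] > cfg["obs_radius"]  # symbols carry non-visual info
    )
```

The last check is the key asymmetry: a symbol is only worth attending to if it can report something the receiver cannot already see.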

Result: Referential communication emerges in 10/10 seeds (100%). Mean symbol entropy 2.48 ± 0.14 bits (83% of maximum, range 2.18–2.67). Resource MI proxy 0.001–0.005 (all positive). All 8 symbols remain in active use. This breaks the V20b null, where continuous z-gate signals never departed from 0.5. Under partial observability and cooperative pressure, discrete symbols make referential communication an inevitability, not a rarity.
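The entropy figures above follow from the standard Shannon entropy of the emitted-symbol distribution: with K = 8 symbols the maximum is log2(8) = 3 bits, so 2.48 bits is 2.48/3 ≈ 83% of maximum. A minimal sketch (the function name is ours, not the experiment's):

```python
import math
from collections import Counter

def symbol_entropy_bits(symbols) -> float:
    """Shannon entropy in bits of an emitted-symbol stream."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A perfectly uniform stream over 8 symbols reaches the 3-bit ceiling.
uniform = list(range(8)) * 100
H = symbol_entropy_bits(uniform)
print(H, H / math.log2(8))  # → 3.0 1.0
```

A collapsed protocol (one symbol dominating) drives this toward 0 bits; the observed 2.48 ± 0.14 means all 8 symbols stay in near-uniform use.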

But communication does NOT lift integration. Mean comm-ablation Φ lift ≈ 0. Late Φ = 0.074 ± 0.013, below the V27 baseline (0.090, t = -1.78). Distribution: 0 HIGH / 7 MOD / 3 LOW. Φ-MI correlation ρ = 0.07 (null): language and integration are orthogonal. Communication neither helps nor hurts; it operates on a different axis entirely.
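The two null statistics above are straightforward to pin down: the ablation lift is the difference in Φ between intact and comm-ablated runs, and the Φ-MI relationship is a rank correlation across seeds. A self-contained sketch under those assumptions (helper names are ours; ties in ranks are ignored for brevity):

```python
import math

def ablation_lift(phi_intact: float, phi_ablated: float) -> float:
    """Φ lift attributable to communication: intact minus comm-ablated."""
    return phi_intact - phi_ablated

def spearman_rho(xs, ys) -> float:
    """Spearman rank correlation (assumes no ties, for brevity)."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order):
            r[i] = rank
        return r
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / math.sqrt(vx * vy)
```

A lift ≈ 0 across seeds plus ρ ≈ 0.07 between per-seed Φ and symbol-resource MI is what licenses the orthogonality claim: ablating the channel leaves integration untouched, and better communicators are not more integrated.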

Language is cheap. Like affect geometry, referential communication emerges under minimal conditions — partial observability plus cooperative pressure. It sits at rung 4–5 of the emergence ladder. Language does not create dynamics any more than geometry does. The expensive transition remains at rung 8, requiring embodied agency and gradient coupling. Adding communication channels does not help cross it.

Language Is Cheap. 10 agents under partial observability develop referential communication using 8 discrete symbols. Symbol entropy rises to 2.48 bits (83% of maximum). But Φ stays flat at ~0.074: language and integration are orthogonal. Like geometry, language is an inevitability of survival under information asymmetry. It sits at rung 4–5, not rung 8.

Source code