Back Channel.
AI Accountability Infrastructure

Keeping AI safe takes a coordinated movement.

Back Channel is how we build one.

The AI accountability ecosystem is full of brilliant organizations, committed funders, and workers inside AI companies who know what's broken. But many of us are working in silos — duplicating effort, diluting impact, and losing ground to an industry moving faster than any single org can track. Organizations that should be finding each other aren't. Projects that need each other don't connect. A researcher with a hypothesis has no way to find the insider who could test it.

Back Channel is the infrastructure that changes that.

Introducing The AI Accountability Codex

A living, queryable knowledge base for AI accountability — built collectively, updated quarterly, and owned by everyone who contributes to it. It turns disparate efforts into a far more powerful shared playbook.

Who it connects:
AI safety + accountability orgs
Workers inside AI companies
Funders
01 — The Codex
A shared playbook

A living, AI-powered knowledge base built collectively and updated continuously. It tells organizations what has already been done so they stop duplicating it, shows funders where the gaps are, and surfaces collaborative projects no single organization could identify alone.

02 — Back Channel
Where needs meet people

The right people find each other — privately, securely, and behind the scenes. Needs get met. Gaps get filled. Connections happen that couldn't have happened any other way.

03 — Change Channel
Where collaboration gets resourced

Funders identify collaborative multi-org projects and are matched with grantees — incentivizing the ecosystem to pool resources rather than compete for the same small pot. Bigger bets. Better outcomes. More coordinated impact.

If you're working on AI accountability — or funding it — get in touch.

Early members are shaping the Codex from the ground up.