Governance Is the New Differentiator: AI Policy, Moderation, and the Truth Social Cautionary Tale
Part 3 of Exploring Mastodon - the final post in this mini-series. Part 1 maps the AI policy spectrum. Part 2 covers the communities that give the network its texture. This post is the governance layer underneath both.
I applied for an account on a PeerTube instance and noted in the registration form that my channel covers AI-assisted tooling and self-hosting. The moderator's reply was polite and direct: "we are highly opposed to AI, including education on how to use AI-assisted tooling and selfhosting." One more line: "As far as we are aware, we are one of the very few, if not the only, instance with such a strong take. So, you might want to try a different instance."
I'm now on spectra.video, where the same content posts without friction.

That email isn't a failure of the open web. It's evidence the open web is working. The instance knew what it was for, knew its position was unusual, and communicated both clearly. "Try a different instance" is a complete answer on a federated network in a way it can never be on a platform that is the only place to be.
The fediverse is structured to make that answer possible. Understanding why requires going one level below the communities themselves, into the governance mechanics that create and protect them.
Who a community is for
Part 1 of this series mapped the AI policy spectrum as a user-choice decision tree: which servers ban AI-generated content, which require disclosure, which restrict training on user data, which say nothing. That framing is correct. It's also incomplete.
These policies aren't rules a community enforces after the fact. They're descriptions of who the community is for, written in advance.
mastodon.art's ban on AI-generated content didn't create the community of working illustrators and photographers who live there. It reflected them. The artists who chose mastodon.art after the generative AI wave of 2022 and 2023 chose it because it protected hand-made work. The ban is an accurate description of who the community was already becoming: not a restriction imposed on new members, but a statement that said people like you are what this is for. Part 2 covered the texture of that community in detail. The ban is a self-portrait, not a rule.
The PeerTube rejection works the same way. That instance's opposition to AI education isn't a moderation quirk. It's a statement about what community they're building and which members fit inside it. They volunteered that their position is unusual. They offered a path forward. None of that was hostile. It was precise, which is its own form of respect.
This distinction matters because it changes what you're actually doing when you read a server's code of conduct. You're not checking a list of rules to avoid trouble. You're reading a self-portrait: what this community values, who it imagines its members are, what it's built to protect. Communities whose policies feel like a fit are ones where the ambient assumptions already match what you're trying to do. Communities whose policies feel like friction are pointing you toward a different room.
The governance mechanics that follow are how communities protect those self-portraits once they've written them down: defederation, coordinated block policies, and the full Truth Social story.
What defederation actually is
Every major Mastodon instance publishes its block list. Browse to an instance's /about page (sfba.social/about, for example) and there's a full list of moderated servers, with reasons where the admin chose to provide them. The list is public. What's less obvious is how it shapes what you don't see: when your instance blocks another, posts from that server stop appearing in searches, in hashtags, and in the federated timeline, with no indication anything is missing. The other server's users simply don't exist from where you're standing.
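For instances that publish their list, the same data is available over Mastodon's public API at GET /api/v1/instance/domain_blocks (added in Mastodon 4.0; the server refuses the request when the admin keeps the list private). A minimal sketch in Python, with the network call left as a commented example:

```python
# Sketch: read an instance's public block list through Mastodon's
# GET /api/v1/instance/domain_blocks endpoint (Mastodon 4.0+). The list
# is only served when the admin has chosen to publish it.
import json
import urllib.request


def blocklist_url(instance: str) -> str:
    """Build the domain_blocks API URL for an instance hostname."""
    return f"https://{instance}/api/v1/instance/domain_blocks"


def parse_blocks(payload: str) -> list[dict]:
    """Reduce the JSON response to the fields a reader cares about:
    the blocked domain, the severity, and the admin's published reason."""
    return [
        {
            "domain": b.get("domain"),
            "severity": b.get("severity"),  # "silence" or "suspend"
            "comment": b.get("comment"),    # reason, where the admin provided one
        }
        for b in json.loads(payload)
    ]


# Example usage (requires network, and a server that publishes its list):
#   with urllib.request.urlopen(blocklist_url("mastodon.social")) as resp:
#       for block in parse_blocks(resp.read().decode()):
#           print(block["domain"], block["severity"], block["comment"] or "")
```

The `severity` field is the part the /about page flattens: "silence" hides a server from shared timelines while still allowing follows, where "suspend" severs the connection entirely.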
What admins see that users don't: incoming reports about behavior on other servers, patterns of harassment or spam originating from specific instances, requests from peer admins asking whether you're aware of what's coming from a particular server. Much of this is mundane: obvious spam farms, coordinated trolling, servers with no moderation at all. Some requires judgment: a mostly-functional community with occasional bad actors, a large instance whose moderation decisions you disagree with, a server whose policies are too different from yours to produce clean federation without regular friction.
The decision process varies widely. Some admins act unilaterally, updating their block list as patterns emerge. Others open the decision to their community, posting proposed defederations to a local channel and giving members a chance to weigh in before anything is finalized. Some instances publish their full block lists publicly; others treat defederation as internal administration. The transparency spectrum is wide, and it's not obvious which end is better. Publishing block lists lets users understand the shape of their network. It also tells a targeted server exactly where they've been cut off.
Most creators only discover the reach implications after the fact. If a significant cluster of instances has defederated your home server, the posts, replies, and follow requests you send toward them may never arrive: no bounce, no error, no indication that anything is wrong. A follow request that never got accepted might have nothing to do with the other person. It might be infrastructure.
The asymmetry matters: if you're on a small instance that defederates a large one, you've removed the large instance from your timeline. The large instance's users can still see your public posts if their admin hasn't blocked you back. Defederation is directional. The direction affects reach.
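The directionality can be reduced to a toy model (hypothetical instance names; real Mastodon also distinguishes silencing from suspension, which this sketch collapses into a single block):

```python
# Toy model of directional defederation. Each instance keeps its own
# block set; blocking only changes what *your* users see, not what the
# other side sees. Instance names are hypothetical.

def can_see(viewer_instance: str, author_instance: str,
            blocks: dict[str, set[str]]) -> bool:
    """A viewer sees an author's public posts unless the viewer's own
    instance has blocked the author's instance."""
    return author_instance not in blocks.get(viewer_instance, set())


blocks = {
    "small.example": {"big.example"},  # small instance defederates the big one
    "big.example": set(),              # big instance has not blocked back
}

# small.example users no longer see big.example...
assert not can_see("small.example", "big.example", blocks)
# ...but big.example users can still see small.example's public posts.
assert can_see("big.example", "small.example", blocks)
```

The two assertions are the asymmetry in miniature: the block removes content from the blocker's view, while the blocker's own public posts keep flowing the other way until the other admin acts.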
This is governance at the instance level: individual admins making calls that shape what their community sees. When those decisions get coordinated across hundreds of instances, it becomes something else.
Fedipact
In mid-2023, Meta announced that Threads, its Instagram-adjacent text platform, would implement ActivityPub (the open protocol that makes Mastodon part of the fediverse) and federate with the fediverse. The announcement split fediverse administrators almost immediately.

One camp saw it as validation: a platform backed by Meta's scale and resources joining the open protocol would expand the fediverse's reach, and individual admins could always defederate if Meta misbehaved. The other camp named the pattern they expected from Meta's history with open standards: embrace, extend, extinguish. The concern wasn't that Meta would federate badly in year one. It was that a platform with Meta's engineering resources and growth incentives could selectively implement ActivityPub, build dependency on features only available through Threads, and eventually make full fediverse participation impractical for anyone who wanted access to its user base.
The Anti-Meta Fedipact emerged from that second camp: a coordinated pledge by instance administrators to preemptively defederate from any Meta-owned Threads instance once it joined the fediverse. Over 700 instances signed. Hundreds more blocked Threads without formally signing the pact.
The Fedipact is defederation operating at network scale: not one admin making a judgment call, but a coordinated policy across a significant portion of fediverse communities, implemented simultaneously and publicly. An instance that signs the Fedipact is making a governance statement: we think this federation creates more risk for our community than it creates value, and we're committing to that position in advance. The fact that hundreds of different admins, running different kinds of instances for different communities, reached the same conclusion tells you something about how governance norms travel across the fediverse.
The controversy that followed was predictable and legitimate. Critics argued that Fedipact concentrated power in the hands of whoever maintains the list: a small group of signatories could effectively shape what a large portion of the fediverse federates with, without any democratic process inside those individual communities. Supporters argued that coordinated block policies are the only practical defense against a well-resourced platform at scale, and that an individual instance acting alone is simply outmatched. Both positions are defensible. The fediverse hasn't resolved this tension; it's living with it.
For creators choosing an instance, the Fedipact question is concrete: if you're on a signatory instance, you cannot interact with Threads accounts. Your instance's governance decision is also your network's boundary. That's not a bug. It's the system being legible about what it values. Understanding which coordinated policies your home instance has adopted is part of understanding what your network actually looks like.
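Whether your own instance has taken that position is checkable, assuming the admin publishes the block list through Mastodon's public domain_blocks endpoint (some instances serve the list with domain names partially redacted, which this sketch does not handle):

```python
# Sketch: check whether an instance has Threads on its published block
# list, using Mastodon's public GET /api/v1/instance/domain_blocks
# endpoint. Only works when the admin publishes the list.
import json
import urllib.request


def blocks_threads(block_domains: list[str]) -> bool:
    """True if threads.net, or any subdomain of it, is on the list."""
    return any(d == "threads.net" or d.endswith(".threads.net")
               for d in block_domains)


def fetch_block_domains(instance: str) -> list[str]:
    """Fetch the block list and return just the domain names."""
    url = f"https://{instance}/api/v1/instance/domain_blocks"
    with urllib.request.urlopen(url) as resp:
        return [b["domain"] for b in json.loads(resp.read().decode())]


# Example usage (requires network; substitute your own home instance):
#   print(blocks_threads(fetch_block_domains("mastodon.art")))
```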
Truth Social: what happens when you take the protocol out
In 2022, Trump Media & Technology Group launched Truth Social using Mastodon's open-source code as the backend. The frontend was Soapbox, another open-source project licensed under the AGPL (Affero General Public License), which Truth Social adapted for the platform. From a technical standpoint, Truth Social's architecture was recognizably Mastodon-derived.

What made it categorically different from a Mastodon instance was a single decision: Truth Social removed ActivityPub federation entirely. No incoming connections from other fediverse servers. No outgoing federation. No shared social graph with any instance, anywhere. Users on mastodon.social, mastodon.art, infosec.exchange, or any other fediverse server cannot follow Truth Social accounts, and Truth Social accounts cannot follow them. The protocol that makes Mastodon part of a network was the first thing they stripped out.
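That absence is observable from outside. Every federating server answers WebFinger lookups at /.well-known/webfinger, the discovery step an ActivityPub follow begins with. A sketch of the probe (the handle in the example is Mastodon's founder on mastodon.social; any real fediverse handle works):

```python
# Sketch: probe a server's WebFinger endpoint, the account-discovery
# step every ActivityPub federation attempt starts with. A federating
# instance answers for its accounts; a server with federation stripped
# out has nothing to say here.
import urllib.error
import urllib.parse
import urllib.request


def webfinger_url(handle: str) -> str:
    """Build the WebFinger lookup URL for a user@host handle."""
    user, host = handle.split("@", 1)
    query = urllib.parse.urlencode({"resource": f"acct:{user}@{host}"})
    return f"https://{host}/.well-known/webfinger?{query}"


def is_discoverable(handle: str, timeout: float = 5.0) -> bool:
    """True if the host answers the WebFinger lookup with HTTP 200."""
    try:
        with urllib.request.urlopen(webfinger_url(handle), timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False


# Example usage (requires network):
#   print(is_discoverable("Gargron@mastodon.social"))
```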
The AGPL licensing situation compounded it. Mastodon's code carries the AGPL license, which requires anyone who deploys the software to publish their modifications. When Truth Social's test version surfaced without any modified source code, the Software Freedom Conservancy flagged the violation in October 2021. Mastodon's lawyers sent a formal letter to Truth Social's legal team the same month: release the modified source within 30 days, or face permanent license revocation.
Truth Social's response had the form of compliance without the substance. On November 12, 2021, they published a ZIP file of unaltered, upstream Mastodon code. The AGPL requires you to publish your modifications, not the original you started from, so releasing the unchanged base doesn't satisfy the license; it demonstrates they understood the requirement well enough to attempt to satisfy it without actually doing so. Source code uploads continued sporadically until December 2022, when they stopped entirely. Eventually, after sustained community pressure, Truth Social published an automatically updated source dump that the open-source community confirmed as current.
Soapbox's founder, Alex Gleason, had joined Truth Social as head of engineering in January 2022, hired specifically to adapt his open-source frontend for the platform. Eighteen months later, he resigned. He left to bring Soapbox to Nostr, an open protocol backed by a Jack Dorsey-funded nonprofit, specifically to build the parts of decentralized social media that Truth Social had used his work to close off. The person most responsible for Truth Social's open-source frontend left to build the open version instead.

The Truth Social story is the mirror image of the PeerTube opening. The PeerTube instance said: "this isn't for you, go find your community." That's governance working: a community knowing what it is, communicating clearly, and trusting the network to provide alternatives. Truth Social said: "we'll take the technology, strip the protocol, and close the door." Same underlying codebase. Opposite outcome.
The cautionary tale for any creator who cares about social graph ownership is specific: a platform built on Mastodon's code produced a network with no portability, no federation, no ability for users to leave and take their connections with them. It's the maximally complete version of everything the first post in this series warned against: centralized control dressed in open-source infrastructure. They took the code that enables communities to own their networks and used it to build a walled garden instead.
What you're signing up for
On every major centralized platform, governance is set by the company. Rules change when leadership changes. Algorithms shift when quarterly targets shift. Users have no mechanism to make their preferences binding on the people running the platform. The fediverse inverts this: the governance is what you're choosing when you choose an instance.
The PeerTube rejection made that choice legible. An instance that knows its policy well enough to acknowledge it's unusual, communicates that clearly, and offers a constructive path forward is governance doing exactly what it's supposed to do. Not every fediverse community operates at that level of clarity. Many don't publish block lists. Many change policies without announcements. The range of admin quality is real and wide. But the structure makes this kind of governance possible in a way no centralized platform does, because the people making the decisions are accountable to the people funding the server, which is usually the same group of people posting on it.
The harder version of this argument: if you use AI tools and publish about them on the open web, you will encounter communities that don't want that work. The feature image at the top of this post — a person standing at the intersection of two networked worlds — was generated with Google Gemini. On mastodon.art, that image would violate the community policy. On another instance, it's unremarkable. Both responses are the governance working correctly. That's fine. The network is structured so that "try a different instance" is always a complete answer. You don't need to fight a community's culture or spend energy on friction with a room that organized itself around something else. You find the community whose ambient assumptions already match what you're building.
The fediverse is not frictionless, and governance is a significant part of the friction. Defederation shapes your reach without notifying you. Coordinated policies like Fedipact make governance decisions visible at network scale. The AGPL story shows what happens when someone tries to use the infrastructure while rejecting the ethic that produced it. None of that is a reason to stay on platforms that govern on behalf of advertisers. It's a reason to understand the governance of the community you're joining: read the code of conduct as a self-portrait, understand which coordinated policies your instance has signed, and pick the community whose values are actually yours.
Governance isn't a tax you pay to use the fediverse. It's the reason the fediverse is worth using.
The free Fediverse Quick-Start Checklist is a 12-step PDF that walks you through picking a server, setting up your profile, and finding your first follows.
What's next in this series
This post closes the Exploring Mastodon mini-series. The Exploring the Fediverse umbrella continues with a mini-series on what happens when ActivityPub meets the other open protocols — AT Protocol (Bluesky), Nostr, and Threads — and on the broader fediverse beyond Mastodon: Pixelfed, PeerTube, GoToSocial, and others. The governance questions from this post don't stay contained to Mastodon. They travel with the protocol.
Sources and further reading
AGPL and Truth Social
- Mastodon blog: Trump's new social media platform found using Mastodon code — the initial notice, October 2021
- Vice: Mastodon Lawyers Tell Trump Social Network to Make Source Code Public
- Newsweek: Trump's Truth Social Could Have Software License Revoked
- Heather Meeker (IP/OSS attorney): Trump's Truth Social Platform Accused of Violating AGPL
- boehs.org: You Have Power: Making Truth Social Comply with the AGPL — most thorough account of the full compliance arc
- PocketNow: Truth Social removes most freedom-friendly features of the Fediverse
- Wikipedia: Truth Social
Alex Gleason / Soapbox
- PR Newswire: Truth Social Head of Engineering Leaves for Nostr
- No BS Bitcoin: Truth Social Head of Engineering Resigns to Build Ditto
- The Hill: Top exec at Trump's Truth Social resigns
Fedipact
- Anti-Meta Fedipact
- Fedipact — instances blocking Threads
- El Platt: Threads, The Fediverse, and the #FediPact
