Interoperability or Isolation: The Standards Battle Shaping Social Media

Picture the digital landscape as a crowded marketplace where every stall speaks a different dialect. Your tweet exists in one linguistic universe, your Mastodon post in another, and your Bluesky thread in yet another. They all express fundamentally similar ideas, yet they cannot understand one another. This is not merely an inconvenience; it represents one of the most significant technical and political challenges facing the contemporary internet.

The question of how platforms and API providers might converge on a minimal interoperable content schema seems deceptively simple. After all, content is content. A post is a post. A like is a like. Yet beneath this apparent simplicity lies a tangle of competing interests, technical philosophies, and governance models that has resisted resolution for nearly three decades.

The stakes have never been higher. In 2024, Meta's Threads began implementing federation through ActivityPub, making Joe Biden the first United States President with a presence on the fediverse when his official Threads account enabled federation in April 2024. Bluesky opened its doors to the public in February 2024 and announced plans to submit the AT Protocol to the Internet Engineering Task Force for standardisation. The European Union's Digital Services Act now requires very large online platforms to submit statements of reasons for their content moderation decisions to a transparency database that has accumulated over 735 million such decisions since September 2023.

Something is shifting. The walled gardens that defined the social web for the past two decades are developing cracks, and through those cracks, we can glimpse the possibility of genuine interoperability. But possibility and reality remain separated by formidable obstacles, not least the fundamental question of what such interoperability should actually look like.

The challenge extends beyond mere technical specification. Every schema reflects assumptions about what content is, who creates it, how it should be moderated, and what metadata deserves preservation. These are not neutral engineering decisions; they are deeply political choices that will shape communication patterns for generations. Getting the schema right matters immensely. Getting the governance right matters even more.

The promise of interoperability is not merely technical efficiency. It represents a fundamental shift in the balance of power between platforms and users. When content can flow freely between services, network effects cease to function as lock-in mechanisms. Users gain genuine choice. Competition flourishes on features rather than audience capture. The implications for market dynamics, user agency, and the future of digital communication are profound.

Learning from the Graveyard of Standards Past

Before plotting a course forward, it pays to examine the tombstones of previous attempts. The history of internet standards offers both inspiration and cautionary tales, often in equal measure.

The RSS and Atom Saga

Consider RSS and Atom, the feed standards that once promised to liberate content from platform silos. RSS emerged in 1997 at UserLand, evolved through Netscape in 1999, and fragmented into competing versions that confused developers and users alike. The format's roots trace back to 1995, when Ramanathan V. Guha developed the Meta Content Framework at Apple, drawing from knowledge representation systems including CycL, KRL, and KIF. By September 2002, Dave Winer released RSS 2.0, redubbing its initials “Really Simple Syndication,” but the damage from years of versioning confusion was already done.

Atom arose in 2003 specifically to address what its proponents viewed as RSS's limitations and ambiguities. Ben Trott and other advocates believed RSS suffered from flaws that could only be remedied through a fresh start rather than incremental improvement. The project initially lacked even a settled name, cycling through “Pie,” “Echo,” and “Whatever” before settling on Atom. The format gained traction quickly, with Atom 0.3 achieving widespread adoption in syndication tools and integration into Google services including Blogger, Google News, and Gmail.

Atom achieved technical superiority in many respects. It became an IETF proposed standard through RFC 4287 in December 2005, offering cleaner XML syntax, mandatory unique identifiers for entries, and proper language support through the xml:lang attribute. The Atom Publishing Protocol followed as RFC 5023 in October 2007. Unlike RSS, which lacked any date tag until version 2.0, Atom made temporal metadata mandatory from the outset. Where RSS's vocabulary could not be easily reused in other XML contexts, Atom's elements were specifically designed for reuse.

Yet the market never cleanly converged on either format. Both persist to this day, with most feed readers supporting both, essentially forcing the ecosystem to maintain dual compatibility indefinitely. The existence of multiple standards confused the market and may have contributed to the decline of feed usage overall in favour of social media platforms.

The lesson here cuts deep: technical excellence alone does not guarantee adoption, and competing standards can fragment an ecosystem even when both serve substantially similar purposes. As one developer noted, the RSS versus Atom debate was “at best irrelevant to most people and at worst a confusing market-damaging thing.”

The Dublin Core Success Story

Dublin Core offers a more optimistic precedent. When 52 invitees gathered at OCLC headquarters in Dublin, Ohio, in March 1995, they faced a web with approximately 500,000 addressable objects and no consistent way to categorise them. The gathering was co-hosted by the National Center for Supercomputing Applications and OCLC, bringing together experts who explored the usefulness of a core set of semantics for categorising the web.

The fifteen-element Dublin Core metadata set they developed became an IETF RFC in 1998, an American national standard (ANSI/NISO Z39.85) in 2001, and an ISO international standard (ISO 15836) in 2003. Today, Dublin Core underpins systems from the EPUB e-book format to the DSpace archival software. The Australian Government Locator Service metadata standard is an application profile of Dublin Core, as is PBCore. Zope CMF's Metadata products, used by Plone, ERP5, and Nuxeo CPS content management systems, implement Dublin Core, as does Fedora Commons.

What distinguished Dublin Core's success? Several factors emerged: the specification remained deliberately minimal, addressing a clearly defined problem; it achieved formal recognition through multiple standards bodies; and it resisted the temptation to expand beyond its core competence. As Bradley Allen observed at the 2016 Dublin Core conference, metadata standards have become “pervasive in the infrastructure of content curation and management, and underpin search infrastructure.” A single thread, Allen noted, runs from the establishment of Dublin Core through Open Linked Data to the emergence of Knowledge Graphs.

Since 2002, the Dublin Core Metadata Initiative has maintained its own documentation for DCMI Metadata Terms and emerged as the de facto agency to develop metadata standards for the web. As of December 2008, the Initiative operates as a fully independent, public not-for-profit company limited by guarantee in Singapore, an open organisation engaged in developing interoperable online metadata standards.

ActivityPub and AT Protocol

The present landscape features two primary contenders for decentralised social media interoperability, each embodying distinct technical philosophies and governance approaches.

The Rise of ActivityPub and the Fediverse

ActivityPub, which became a W3C recommended standard in January 2018, now defines the fediverse, a decentralised social network of independently managed instances running software such as Mastodon, Pixelfed, and PeerTube. The protocol provides both a client-to-server API for creating and modifying content and a federated server-to-server protocol for delivering notifications and content to other servers.

The protocol's foundation rests on Activity Streams 2.0, a JSON-based serialisation syntax that conforms to JSON-LD constraints whilst not requiring full JSON-LD processing. The standardisation of Activity Streams began with the independent Activity Streams Working Group publishing JSON Activity Streams 1.0 in May 2011. The W3C chartered its Social Web Working Group in July 2014, leading to iterative working drafts from 2014 to 2017.

Activity Streams 2.0 represents a carefully considered vocabulary. Its core structure includes an actor (the entity performing an action, such as a person or group), a type property denoting the action taken (Create, Like, Follow), an object representing the primary target of the action, and an optional target for secondary destinations. The format uses the media type application/activity+json and supports over 50 properties across its core and vocabulary definitions. Documents should include a @context referencing the Activity Streams namespace for enhanced interoperability with linked data.
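
To make the structure concrete, here is a minimal sketch of a Create activity in that vocabulary, expressed as a TypeScript literal. The actor, identifier, and content values are hypothetical placeholders, whilst the property names and @context follow the specification.

```typescript
// A minimal Activity Streams 2.0 "Create" activity. The identifiers and
// URLs below are hypothetical placeholders; the property names and the
// @context value come from the W3C specification.
const activity = {
  "@context": "https://www.w3.org/ns/activitystreams",
  type: "Create",                                // the action taken
  actor: "https://example.social/users/alice",   // who performed it
  published: "2024-04-01T12:00:00Z",             // when it happened
  object: {
    type: "Note",                                // the primary target
    id: "https://example.social/notes/1",
    content: "Hello, fediverse!",
    to: ["https://www.w3.org/ns/activitystreams#Public"]
  }
};

// Serialised with the media type application/activity+json.
console.log(JSON.stringify(activity, null, 2));
```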

The format's compatibility with JSON-LD enables semantic richness and flexibility, allowing implementations to extend or customise objects whilst maintaining interoperability. Implementations wishing to fully support extensions must support Compact URI expansion as defined by the JSON-LD specification. Extensions for custom properties are achieved through JSON-LD contexts with prefixed namespaces, preventing conflicts with the standard vocabulary and ensuring forward compatibility.

Fediverse Adoption and Platform Integration

The fediverse has achieved considerable scale. By late 2025, Mastodon alone reported over 1.75 million active users, with nearly 6,000 instances across the broader network. Following Elon Musk's acquisition of Twitter, Mastodon gained more than two million users within two months. Mastodon was registered in Germany as a nonprofit organisation between 2021 and 2024, with a US nonprofit established in April 2024.

Major platforms have announced or implemented ActivityPub support, including Tumblr, Flipboard, and Meta's Threads. In March 2024, Threads implemented a beta version of fediverse support, allowing Threads users to view the number of fediverse users that liked their posts and allowing fediverse users to view posts from Threads on their own instances. The ability to view replies from the fediverse within Threads was added in August 2024. Ghost, the blogging platform and content management system, announced in April 2024 that they would implement fediverse support via ActivityPub. In December 2023, Flipboard CEO Mike McCue stated the move was intended to break away from “walled garden” ecosystems.

AT Protocol and Bluesky's Alternative Vision

The AT Protocol, developed by Bluesky, takes a markedly different approach. Where ActivityPub grew from W3C working groups following traditional standards processes, AT Protocol emerged from a venture-backed company with explicit plans to eventually submit the work to a standards body. The protocol aims to address perceived issues with other decentralised protocols, including user experience, platform interoperability, discoverability, network scalability, and portability of user data and social graphs.

Bluesky opened to the public in February 2024, a year after its release as an invitation-required beta, and reached over 10 million registered users by October 2024. The company opened federation through the AT Protocol soon after public launch, allowing users to build apps within the protocol and provide their own storage for content sent to Bluesky Social. In August 2024, Bluesky introduced a set of “anti-toxicity features” including the ability to detach posts from quote posts and hide replies.

AT Protocol's architecture emphasises what its creators call “credible exit,” based on the principle that every part of the system can be run by multiple competing providers, with users able to switch providers with minimal friction. The protocol employs a modular microservice architecture rather than ActivityPub's typically monolithic server design. Users are identified by domain names that map to cryptographic URLs securing their accounts and data. The system utilises a dual identifier system: a mutable handle (domain name) and an immutable decentralised identifier (DID).

Clients and services interoperate through an HTTP API called XRPC that primarily uses JSON for data serialisation. All data that must be authenticated, referenced, or stored is encoded in CBOR. User data is exchanged in signed data repositories containing records including posts, comments, likes, follows, and media blobs.
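
As an illustration of the XRPC pattern, the sketch below resolves a handle to its DID via the com.atproto.identity.resolveHandle lexicon method; the host and handle used are examples.

```typescript
// Sketch of an XRPC call: resolving a handle (the mutable domain name)
// to its immutable DID. The method com.atproto.identity.resolveHandle is
// part of the AT Protocol lexicon; the host and handle are examples.
async function resolveHandle(handle: string): Promise<string> {
  const url = new URL("https://bsky.social/xrpc/com.atproto.identity.resolveHandle");
  url.searchParams.set("handle", handle);
  const res = await fetch(url);
  if (!res.ok) throw new Error(`XRPC call failed: ${res.status}`);
  const body = (await res.json()) as { did: string };
  return body.did; // e.g. a "did:plc:..." identifier
}

resolveHandle("bsky.app").then((did) => console.log(did));
```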

As described in Bluesky's 2024 Protocol Roadmap, the company planned to submit AT Protocol to an existing standards body such as the IETF in summer 2024. However, after consulting with those experienced in standardisation processes, they decided to wait until more developers had explored the protocol's design. The goal, they stated, was to have multiple organisations with AT Protocol experience collaborate on the standards process together.

What Actually Matters Most

When constructing a minimal interoperable content schema, certain elements demand priority attention. The challenge lies not in cataloguing every conceivable property, but in identifying the irreducible core that enables meaningful interoperability whilst leaving room for extension.

Foundational Metadata Requirements

Metadata forms the foundation. At minimum, any content object requires a unique identifier, creation timestamp, and author attribution. The history of RSS, where the guid tag did not appear until version 2.0 and remained optional, demonstrates the chaos that ensues when basic identification remains undefined. Without a guid tag, RSS clients must reread the same feed items repeatedly, guessing what items have been seen before, with no guidance in the specification for doing so. Atom's requirement of mandatory id elements for entries reflected hard-won lessons about content deduplication and reference.

The Dublin Core elements provide a useful starting framework: title, creator, date, and identifier address the most fundamental questions about any piece of content. Activity Streams 2.0 builds on this with actor, type, object, and published properties that capture the essential “who did what to what and when” structure of social content. Any interoperable schema must treat these elements as non-optional, ensuring that even minimal implementations can participate meaningfully in the broader ecosystem.

Content Type Classification

Content type specification requires particular care. The IANA media type registry, which evolved from the original MIME specification in RFC 2045 in November 1996, demonstrates both the power and complexity of type systems. Media types were originally introduced for email messaging and were used as values for the Content-Type MIME header. The IANA and IETF now use the term “media type” and consider “MIME type” obsolete, since media types have become used in contexts unrelated to email, particularly HTTP.

The registry now encompasses structured syntax suffixes, first defined for +xml in RFC 3023 in January 2001 and formally collected into a Structured Syntax Suffix Registry alongside +json, +ber, +der, +fastinfoset, +wbxml, and +zip through RFC 6839 in January 2013. These suffixes enable parsers to understand content structure even for novel types. Any content schema should leverage this existing infrastructure rather than reinventing type identification.
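
To see why the suffixes matter in practice, consider the illustrative helper below: a consumer that has never encountered application/activity+json can still recognise it as JSON-structured. This is a sketch, not a full RFC 6838 media type parser.

```typescript
// Illustrative (not a full RFC 6838 parser): recover the structured
// syntax suffix of a media type so an unfamiliar type can still be
// handed to a generic JSON or XML parser.
function structuredSuffix(mediaType: string): string | null {
  const subtype = mediaType.split(";")[0].trim().split("/")[1] ?? "";
  const plus = subtype.lastIndexOf("+");
  return plus === -1 ? null : subtype.slice(plus + 1);
}

console.log(structuredSuffix("application/activity+json")); // "json"
console.log(structuredSuffix("application/atom+xml"));      // "xml"
console.log(structuredSuffix("text/html"));                 // null
```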

The Moderation Metadata Challenge

Moderation flags present the thorniest challenge. The Digital Services Act transparency database reveals the scale of this problem: researchers analysed 1.58 billion moderation actions from major platforms to examine how social media services handled content moderation during the 2024 European Parliament elections. The database, which has been operating since September 2023, has revealed significant inconsistencies in how different services categorise and report their decisions.

The European Commission adopted an implementing regulation in November 2024 establishing uniform reporting templates, recognising that meaningful transparency requires standardised vocabulary. The regulation addresses previous inconsistencies by establishing uniform reporting periods. Providers must start collecting data according to the Implementing Regulation from 1 July 2025, with the first harmonised reports due in early 2026.

A minimal moderation schema might include: visibility status (public, restricted, removed), restriction reason category, restriction timestamp, and appeals status. INHOPE's Global Standard project aims to harmonise terminology for classifying illegal content, creating interoperable hash sets for identification. Such efforts demonstrate that even in sensitive domains, standardisation remains possible when sufficient motivation exists.
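
Rendered as a type definition, such a schema might look like the sketch below; the field names and enumerated values are illustrative suggestions rather than an adopted standard.

```typescript
// Illustrative sketch of a minimal moderation metadata record; the
// names and enumerations are suggestions, not an adopted standard.
type Visibility = "public" | "restricted" | "removed";
type AppealsStatus = "none" | "open" | "upheld" | "overturned";

interface ModerationRecord {
  contentId: string;        // identifier of the affected content object
  visibility: Visibility;   // current visibility status
  reasonCategory: string;   // e.g. a code drawn from a shared taxonomy
  restrictedAt: string;     // ISO 8601 timestamp of the restriction
  appeals: AppealsStatus;   // where any appeal currently stands
}

const example: ModerationRecord = {
  contentId: "https://example.social/notes/1",
  visibility: "restricted",
  reasonCategory: "spam",
  restrictedAt: "2025-07-01T09:30:00Z",
  appeals: "open",
};
```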

Extensibility and Schema Evolution

Extensibility mechanisms deserve equal attention. Activity Streams 2.0 handles extensions through JSON-LD contexts with prefixed namespaces, preventing conflicts with the standard vocabulary whilst ensuring forward compatibility. This approach allows platforms to add proprietary features without breaking interoperability for core content types.
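
In practice, the pattern amounts to adding a second entry to the @context array that maps a short prefix to a vendor namespace. In the sketch below, the ext namespace and the ext:mood property are hypothetical; consumers without the extension context simply ignore the unfamiliar property.

```typescript
// The @context array maps a short prefix to a vendor namespace, so the
// custom property cannot collide with the core vocabulary. The "ext"
// namespace and "ext:mood" property here are hypothetical.
const extended = {
  "@context": [
    "https://www.w3.org/ns/activitystreams",
    { ext: "https://example.com/ns#" }
  ],
  type: "Note",
  id: "https://example.social/notes/2",
  content: "Extended note",
  "ext:mood": "optimistic"  // safely ignored by consumers lacking the context
};

console.log(JSON.stringify(extended, null, 2));
```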

The JSON Schema project has taken a similar approach to managing complexity. After 10 different releases over 15 years, the specification had become, by the project's own admission, “a very complex document too focused on tooling creators but difficult to understand for general JSON Schema users.” The project's evolution toward a JavaScript-style staged release process, where most features are declared stable whilst others undergo extended vetting, offers a model for managing schema evolution.

Who Decides and How

The governance question may ultimately prove more decisive than technical design. Three broad models have emerged for developing and maintaining technical standards, each with distinct advantages and limitations.

Open Standards Bodies

Open standards bodies such as the W3C and IETF have produced much of the infrastructure underlying the modern internet. In August 2012, five leading organisations (the IEEE, the Internet Architecture Board, the IETF, the Internet Society, and the W3C) signed a statement affirming jointly developed OpenStand principles. These principles specify that standards should be developed through open, participatory processes, support interoperability, foster global competition, and be voluntarily adopted.

The W3C's governance has evolved considerably since its founding in 1994. Tim Berners-Lee, who founded the consortium at MIT, described its mission as overseeing web development whilst keeping the technology “free and nonproprietary.” The W3C ensures its specifications can be implemented on a royalty-free basis, requiring authors to transfer copyright to the consortium whilst making documentation freely available.

The IETF operates as a large open international community of network designers, operators, vendors, and researchers concerned with the evolution of the internet architecture and the smooth operation of the internet. Unlike more formal organisations, participation requires no membership fees; anyone can contribute through working groups and mailing lists. The IETF has produced standards including TCP/IP, DNS, and email protocols that form the internet's core infrastructure. As the Internet Society noted in its policy brief, “Policy makers and regulators should reference the use of open standards so that both governments and the broader economies can benefit from the services, products, and technologies built on such standards.”

The Activity Streams standardisation process illustrates this model's strengths and limitations. Work began with the independent Activity Streams Working Group publishing JSON Activity Streams 1.0 in May 2011. The W3C chartered its Social Web Working Group in July 2014, leading to iterative working drafts from 2014 to 2017 before Activity Streams 2.0 achieved recommendation status in January 2018. In December 2024, the group received a renewed charter to pursue backwards-compatible updates for improved clarity and potential new features.

This timeline spanning nearly a decade from initial publication to W3C recommendation reflects both the thoroughness and deliberate pace of open standards processes. For rapidly evolving domains, such timescales can seem glacial. Yet the model of voluntary standards not funded by government has been, as the Internet Society observed, “extremely successful.”

Consortium-Based Governance

Consortium-based governance offers a middle path. OASIS (Organization for the Advancement of Structured Information Standards) began in 1993 as SGML Open, a trade association of Standard Generalised Markup Language tool vendors cooperating to promote SGML adoption through educational activities. In 1998, with the industry's movement to XML, SGML Open changed its emphasis and name to OASIS Open, reflecting an expanded scope of technical work.

In July 2000, a new technical committee process was approved. At adoption, there were five technical committees; by 2004, there were nearly 70. OASIS is distinguished by its transparent governance and operating procedures. Members themselves set the technical agenda using a lightweight process designed to promote industry consensus and unite disparate efforts.

OASIS technical committees follow a structured approval pathway: proposal, committee formation, public review, consensus approval, and ongoing maintenance. The OASIS Intellectual Property Rights Policy requires Technical Committee participants to disclose any patent claims they might have and requires all contributors to make specific rights available to the public for implementing approved specifications.

The OpenID Foundation's governance of OpenID Connect demonstrates consortium effectiveness. Published in 2014, OpenID Connect learned lessons from earlier efforts including SAML and OpenID 1.0 and 2.0. Its success derived partly from building atop OAuth 2.0, which had already achieved tremendous adoption, and partly from standardising elements that OAuth left flexible. One of its most important contributions was a standard set of scopes. In OAuth 2.0, scopes are whatever the provider wants them to be, making interoperability effectively impossible. OpenID Connect standardises scopes such as openid, profile, email, and address, enabling cross-implementation compatibility.
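
A brief sketch shows what that standardisation buys: the same scope values work against any conforming provider. The provider URL, client identifier, and redirect URI below are placeholders.

```typescript
// Building an OpenID Connect authorisation request. The scope values
// (openid, profile, email) are standardised by OIDC Core; the provider
// URL, client ID, and redirect URI are placeholders.
const authUrl = new URL("https://provider.example.com/authorize");
authUrl.searchParams.set("response_type", "code");
authUrl.searchParams.set("client_id", "my-client-id");
authUrl.searchParams.set("redirect_uri", "https://app.example.com/callback");
authUrl.searchParams.set("scope", "openid profile email");
authUrl.searchParams.set("state", "random-state-value"); // anti-CSRF; generate randomly in practice

console.log(authUrl.toString());
```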

Vendor-Led Standardisation

Vendor-led standardisation presents the most contentious model. When a single company develops and initially controls a standard, questions of lock-in and capture inevitably arise. The Digital Standards Organization (DIGISTAN) states that “an open standard must be aimed at creating unrestricted competition between vendors and unrestricted choice for users.” Its brief definition: “a published specification that is immune to vendor capture at all stages in its life-cycle.”

Yet vendor-led efforts have produced genuinely open results. Google's development of Kubernetes proceeded in the open with community involvement, and the project is now available across all three major commercial clouds. Bluesky's approach with AT Protocol represents a hybrid model: a venture-backed company developing technology with explicit commitment to eventual standardisation.

The Art of Evolution Without Breakage

Any interoperable schema will require change over time. Features that seem essential today may prove inadequate tomorrow, whilst unanticipated use cases will demand new capabilities. Managing this evolution without fragmenting the ecosystem requires disciplined approaches to backward compatibility.

Learning from Schema Evolution

The JSON Schema project's recent evolution offers instructive lessons. The project chose to base its new process on the one used to evolve the JavaScript language. In the next release, most keywords and features will be declared stable and will never again change in a backward-incompatible way. Features not yet considered ready for stability will enter a staged release process that ensures sufficient implementation, testing, and real-world vetting.

API versioning strategies have converged on several best practices. URI path versioning, placing version numbers directly in URL paths, has been adopted by Facebook, Twitter, and Airbnb, among others. This approach makes versioning explicit and allows clients to target specific versions deliberately. Testing and automation play crucial roles: backward compatibility can be protected by unit tests that verify behaviour remains consistent across different versions of an API.
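
A minimal sketch of the pattern follows; the routes and field names are illustrative. The version lives in the path, and a small compatibility check pins down what v1 clients may rely on.

```typescript
// Illustrative URI path versioning: each version is an explicit route,
// so old clients keep a stable contract whilst new clients opt in.
type Handler = () => { status: number; body: unknown };

const routes: Record<string, Handler> = {
  "/v1/posts": () => ({ status: 200, body: [{ id: "1", text: "hello" }] }),
  // v2 adds a field; v1's response shape is left untouched.
  "/v2/posts": () => ({ status: 200, body: [{ id: "1", text: "hello", lang: "en" }] }),
};

// A compatibility test: v1 responses must keep the fields v1 clients use.
const v1 = routes["/v1/posts"]();
const first = (v1.body as Array<Record<string, unknown>>)[0];
console.assert(v1.status === 200 && "id" in first && "text" in first,
  "v1 contract broken");
```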

Stability Contracts and Deprecation

Crucially, backward compatibility requires understanding what must never change. Root URLs, existing query parameters, and element semantics all constitute stability contracts. HTTP response codes deserve particular attention: if an API returns 500 when failing to connect to a database, changing that to 200 breaks clients that depend on the original behaviour.

The principle of additive change provides a useful heuristic: add new fields or endpoints rather than altering existing ones. This ensures older clients continue functioning whilst newer clients access additional features. Feature flags enable gradual rollout, hiding new capabilities behind toggles until the ecosystem has adapted.
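
A sketch of additive change behind a flag, with a hypothetical flag name and field:

```typescript
// Additive change behind a feature flag: the new field is only emitted
// once the flag is on, and old clients can ignore it either way. The
// flag name and field are hypothetical.
const FLAGS = { emitReactions: false };

interface PostV1 { id: string; text: string }
interface PostNext extends PostV1 { reactions?: number } // optional, hence additive

function renderPost(post: PostNext): PostNext {
  const out: PostNext = { id: post.id, text: post.text };
  if (FLAGS.emitReactions && post.reactions !== undefined) {
    out.reactions = post.reactions; // hidden until the ecosystem adapts
  }
  return out;
}

console.log(renderPost({ id: "1", text: "hello", reactions: 4 }));
```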

Deprecation requires equal care. Best practices include providing extensive notice before deprecating features, offering clear migration guides, implementing gradual deprecation with defined timelines, and maintaining documentation for all supported versions. Atlassian's REST API policy exemplifies mature deprecation practice, documenting expected compatibility guarantees and providing systematic approaches to version evolution.

Practical Steps Toward Convergence

Given the technical requirements and governance considerations, what concrete actions might platforms and API providers take to advance interoperability?

Establishing Core Vocabulary and Building on Existing Foundations

First, establish a minimal core vocabulary through multi-stakeholder collaboration. The Dublin Core model suggests focusing on the smallest possible set of elements that enable meaningful interoperability: unique identifier, creation timestamp, author attribution, content type, and content body. Everything else can be treated as optional extension.
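
Expressed as a type, that minimal core might look like the following sketch; the field names are illustrative, and Activity Streams and Dublin Core already offer established equivalents for each.

```typescript
// Sketch of the irreducible core: every field here is required, and
// everything beyond it lives in optional, namespaced extensions. The
// field names are illustrative.
interface CoreContent {
  id: string;          // globally unique identifier (e.g. a URL)
  published: string;   // ISO 8601 creation timestamp
  author: string;      // attribution, ideally a resolvable identifier
  mediaType: string;   // IANA media type of the body
  body: string;        // the content itself
  extensions?: Record<string, unknown>; // namespaced, optional additions
}

const post: CoreContent = {
  id: "https://example.social/notes/3",
  published: "2025-01-15T08:00:00Z",
  author: "https://example.social/users/alice",
  mediaType: "text/plain",
  body: "Minimal, interoperable, extensible.",
};
```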

Activity Streams 2.0 provides a strong foundation, having already achieved W3C recommendation status and proven adoption across the fediverse. Rather than designing from scratch, new efforts should build upon this existing work, extending rather than replacing it. The renewed W3C charter for backwards-compatible updates to Activity Streams 2.0 offers a natural venue for such coordination.

Second, prioritise moderation metadata standardisation. The EU's Digital Services Act has forced platforms to report moderation decisions using increasingly harmonised categories. This regulatory pressure, combined with the transparency database's accumulation of over 735 million decisions, creates both data and incentive for developing common vocabularies.

A working group focused specifically on moderation schema could draw participants from platforms subject to DSA requirements, academic researchers analysing the transparency database, and civil society organisations concerned with content governance. INHOPE's work on harmonising terminology for illegal content provides a model for domain-specific standardisation within a broader framework.

Extension Mechanisms and Infrastructure Reuse

Third, adopt formal extension mechanisms from the outset. Activity Streams 2.0's use of JSON-LD contexts for extensions demonstrates how platforms can add proprietary features without breaking core interoperability. Any content schema should specify how extensions are namespaced, versioned, and discovered.

This approach acknowledges that platforms will always seek differentiation. Rather than fighting this tendency, good schema design channels it into forms that do not undermine the shared foundation. Platforms can compete on features whilst maintaining basic interoperability, much as email clients offer different experiences whilst speaking common SMTP and IMAP protocols.

Fourth, leverage existing infrastructure wherever possible. The IANA media type registry offers a mature, well-governed system for content type identification. Dublin Core provides established metadata semantics. JSON-LD enables semantic extension whilst remaining compatible with standard JSON parsing. Building on such foundations reduces the amount of novel work requiring consensus and grounds new standards in proven precedents.

Compatibility Commitments and Governance Structures

Fifth, commit to explicit backward compatibility guarantees. Every element of a shared schema should carry clear stability classifications: stable (will never change incompatibly), provisional (may change with notice), or experimental (may change without notice). The JSON Schema project's move toward this model reflects growing recognition that ecosystem confidence requires predictable evolution.
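
One lightweight way to publish such classifications is alongside the schema itself, as in this hypothetical illustration:

```typescript
// Hypothetical illustration: stability classifications published with
// the schema, so consumers know what may change and how.
type Stability = "stable" | "provisional" | "experimental";

const fieldStability: Record<string, Stability> = {
  id: "stable",             // will never change incompatibly
  published: "stable",
  author: "stable",
  mediaType: "stable",
  body: "stable",
  reactions: "provisional", // may change with notice
  mood: "experimental",     // may change without notice
};

console.log(fieldStability);
```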

Sixth, establish governance that balances openness with efficiency. Pure open-standards processes can move too slowly for rapidly evolving domains. Pure vendor control raises capture concerns. A consortium model with clear membership pathways, defined decision procedures, and royalty-free intellectual property commitments offers a workable middle ground.

The OpenID Foundation's stewardship of OpenID Connect provides a template: standards developed collaboratively, certified implementations ensuring interoperability, and membership open to any interested organisation.

The Political Economy of Interoperability

Technical standards do not emerge in a vacuum. They reflect and reinforce power relationships among participants. The governance model chosen for content schema standardisation will shape which voices are heard and whose interests are served.

Platform Power and Regulatory Pressure

Large platforms possess obvious advantages: engineering resources, market leverage, and the ability to implement standards unilaterally. When Meta's Threads implements ActivityPub federation, however imperfectly, it matters far more for adoption than when a small Mastodon instance does the same thing. Yet this asymmetry creates risks of standards capture, where dominant players shape specifications to entrench their positions.

Regulatory pressure increasingly factors into this calculus. The EU's Digital Services Act, with its requirements for transparency and potential fines up to 6 percent of annual global revenue for non-compliance, creates powerful incentives for platforms to adopt standardised approaches. The Commission has opened formal proceedings against multiple platforms including TikTok and X, demonstrating willingness to enforce.

Globally, 71 regulations now explicitly require APIs for interoperability, data sharing, and composable services. This regulatory trend suggests that content schema standardisation may increasingly be driven not by voluntary industry coordination but by legal mandates. Standards developed proactively by the industry may offer more flexibility than those imposed through regulation.

Government Policy and Middleware Approaches

The UK Cabinet Office recommends that government departments specify requirements using open standards when undertaking procurement, explicitly to promote interoperability and avoid technological lock-in.

The “middleware” approach to content moderation, as explored by researchers at the Integrity Institute, would require basic standards for data portability and interoperability. At the contractual layer, it would reshape the relationship between dominant platforms and content moderation providers; at the technical layer, it would require adequate interoperability between those providers. A widespread implementation of middleware would fundamentally change how content flows across platforms.

The Stakes of Success and Failure

If platforms and API providers succeed in converging on a minimal interoperable content schema, the implications extend far beyond technical convenience. True interoperability would mean that users could choose platforms based on features and community rather than network effects. Content could flow across boundaries, reaching audiences regardless of which service they prefer. Moderation approaches could be compared meaningfully, with shared vocabularies enabling genuine transparency.

Failure, by contrast, would entrench the current fragmentation. Each platform would remain its own universe, with content trapped within walled gardens. Users would face impossible choices between communities that cannot communicate. The dream of a genuinely open social web, articulated since the web's earliest days, would recede further from realisation.

Three Decades of Web Standards

Tim Berners-Lee, in founding the W3C in 1994, sought to ensure the web remained “free and nonproprietary.” Three decades later, that vision faces its sternest test. The protocols underlying the web itself achieved remarkable standardisation. The applications built atop those protocols have not.

The fediverse, AT Protocol, and tentative moves toward federation by major platforms suggest the possibility of change. Activity Streams 2.0 provides a proven foundation. Regulatory pressure creates urgency. The technical challenges, whilst real, appear surmountable.

An Open Question

What remains uncertain is whether the various stakeholders, from venture-backed startups to trillion-dollar corporations to open-source communities to government regulators, can find sufficient common ground to make interoperability a reality rather than merely an aspiration.

The answer will shape the internet's next decade. The schema we choose, and the governance structures through which we choose it, will determine whether the social web becomes more open or more fragmented, more competitive or more captured, more user-empowering or more platform-serving.

That choice remains, for now, open.


References and Sources

  1. W3C. “Activity Streams 2.0.” W3C Recommendation.
  2. Wikipedia. “ActivityPub.”
  3. Wikipedia. “AT Protocol.”
  4. Bluesky Documentation. “2024 Protocol Roadmap.”
  5. European Commission. “Digital Services Act: Commission launches Transparency Database.”
  6. Wikipedia. “Dublin Core.”
  7. Wikipedia. “Atom (web standard).”
  8. Wikipedia. “RSS.”
  9. W3C. “Leading Global Standards Organizations Endorse 'OpenStand' Principles.”
  10. Wikipedia. “OASIS (organization).”
  11. Wikipedia. “Fediverse.”
  12. Wikipedia. “Mastodon (social network).”
  13. Wikipedia. “Bluesky.”
  14. FediDB. “Mastodon – Fediverse Network Statistics.”
  15. JSON Schema. “Towards a stable JSON Schema.”
  16. Wikipedia. “Media type.”
  17. IANA. “Media Types Registry.”
  18. W3C. “History.”
  19. Wikipedia. “Tim Berners-Lee.”
  20. Zuplo Learning Center. “API Backwards Compatibility Best Practices.”
  21. Okta. “What is OpenID Connect?”
  22. Wikipedia. “Vendor lock-in.”
  23. ARTICLE 19. “Why decentralisation of content moderation might be the best way to protect freedom of expression online.”
  24. ArXiv. “Bluesky and the AT Protocol: Usable Decentralized Social Media.”
  25. Internet Society. “Policy Brief: Open Internet Standards.”
  26. European Commission. “How the Digital Services Act enhances transparency online.”
  27. Centre for Emerging Technology and Security, Alan Turing Institute. “Privacy-preserving Moderation of Illegal Online Content.”
  28. Integrity Institute. “Middleware and the Customization of Content Moderation.”
  29. O'Reilly Media. “A Short History of RSS and Atom.”
  30. Connect2ID. “OpenID Connect explained.”

Tim Green

UK-based Systems Theorist & Independent Technology Writer

Tim explores the intersections of artificial intelligence, decentralised cognition, and posthuman ethics. His work, published at smarterarticles.co.uk, challenges dominant narratives of technological progress while proposing interdisciplinary frameworks for collective intelligence and digital stewardship.

His writing has been featured on Ground News and shared by independent researchers across both academic and technological communities.

ORCID: 0009-0002-0156-9795 Email: tim@smarterarticles.co.uk
