A recent report by the U.S. House Judiciary Committee alleges that the Global Alliance for Responsible Media (GARM) has been orchestrating efforts to demonetize content deemed unfavorable, affecting discourse on vaccines, politics, and more. This report highlights potentially significant implications for online speech and consumer choice in media.
Brands Colluding to Control Online Speech
The House Judiciary Committee’s investigation has produced allegations that some of the world’s largest brands and advertising agencies are working together to control online speech. The report claims that GARM, an initiative of the World Federation of Advertisers (WFA), uses its market power to silence dissenting voices, potentially violating antitrust laws.
“Through GARM, large corporations, advertising agencies, and industry associations participated in boycotts and other coordinated actions to demonetize platforms, podcasts, news outlets, and other content deemed disfavored by GARM and its members,” the report states.
GARM: A Powerful Alliance
GARM, established in 2019 by the WFA, represents over 150 of the world’s largest brands and more than 60 national advertiser associations. Its membership includes the advertising industry’s major players, giving the alliance significant influence over online content.
“WFA members represent roughly 90% of global advertising spend, or almost one trillion dollars annually,” according to the report.
The committee’s investigation focused on GARM’s activities since its inception, examining its sway over major social media platforms, news outlets, and content creators.

Cartel-Like Behavior
Rep. Jim Jordan (R-Ohio) described GARM as a cartel during a congressional hearing, questioning the organization’s anticompetitive actions. The report outlines several instances where GARM allegedly coordinated actions against platforms and content creators:
- Twitter Boycott after Musk’s Acquisition: GARM is accused of organizing a boycott of Twitter following Elon Musk’s acquisition in October 2022. Internal documents suggest extensive discussions and debriefings around Musk’s takeover, with GARM allegedly recommending its members halt all paid advertisements on the platform.
- Pressure on Spotify over Joe Rogan’s Podcast: In early 2022, GARM allegedly pressured Spotify regarding content on Joe Rogan’s podcast, particularly his remarks on COVID-19 vaccines. Internal emails show GARM coordinating with member companies to formulate responses to Spotify.
- Efforts to Demonetize News Outlets: The report claims GARM discussed strategies to block certain news outlets, including Fox News, The Daily Wire, and Breitbart News. Internal emails reveal GARM members closely monitoring these outlets, waiting for them to cross a line that would justify blocking them.
Influence on Political Content and Elections
The report contends that GARM’s influence extends to political discourse and election outcomes, highlighting instances where GARM allegedly attempted to sway political content and election-related information:
- 2020 U.S. Presidential Election: In October 2020, GARM members suggested urging Facebook to apply its COVID-19 content moderation policies to election-related content. When Facebook declined to label a Trump campaign ad as misinformation, GARM’s initiative lead allegedly described the decision as “reprehensible.”
- Hunter Biden Laptop Story: GARM members reportedly expressed concerns about Musk’s handling of the Hunter Biden laptop story on Twitter, viewing Musk’s actions as overtly partisan.
Ad-Tech Partnerships and AI Integration
GARM’s reach extends into the realm of artificial intelligence and machine learning. The report details GARM’s partnerships with ad-tech companies, whose membership is allegedly conditioned on agreeing to align with GARM’s goals.
GARM’s plans include integrating its framework into AI solutions, potentially automating content moderation and demonetization based on GARM’s standards. This could result in biased decisions about which content receives advertising revenue, limiting consumer choice and diverse viewpoints online.
Government Connections and Censorship Efforts
The report also alleges connections between GARM’s partners and government agencies involved in content moderation. Channel Factory, a GARM ad-tech partner, collaborated with the Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA) to develop a common lexicon for discussing misinformation.
This type of collaboration could lead to government influence over private-sector content moderation practices, raising concerns about the potential for censorship.
Dangerous, Anticompetitive Behavior
The House Judiciary Committee concluded that GARM’s actions may violate antitrust laws and threaten free speech and consumer choice online. The report emphasizes that antitrust laws apply regardless of GARM’s intentions, suggesting that legislative reforms may be necessary to address these concerns.
“If collusion among powerful corporations capable of collectively demonetizing, and in effect eliminating, certain views and voices is allowed to continue, the ability of countless American consumers to choose what to read and listen to, or even have their speech or writing reach other Americans, will be destroyed,” the report states.
The committee vowed to continue its oversight of GARM and evaluate the adequacy of existing antitrust laws to address what it describes as “dangerous, anticompetitive behavior.”