Truth Wasn’t Profitable, So I Got Rid of It

Meta is shifting towards engagement-focused strategies, reducing moderation, embracing divisive content, and letting users debate misinformation for higher profits.



This is the transcript of a video posted on 7 January by the controlling shareholder of Meta Platforms.

By Mark Zuckerberg, Founder and CEO of Meta

Hey everyone. I want to discuss a new direction for Meta. It focuses on what we’ve always done best — keeping people engaged.

When I started social media, the goal was to create a platform so engaging that people wouldn’t want to leave. Over time, moderation, fact-checking, and policies bogged us down, creating costs, frustrations, and too many complaints.

It’s time to move past all that.

For years, governments and the media have demanded we control misinformation. We tried. We hired fact-checkers, built filters, and added complex rules to manage content. What we learned is that these efforts alienate users, erode trust, and are expensive to maintain. People don’t like being flagged, even when they’re wrong.

Community Notes is the solution that aligns with how people already use the platform.

With Community Notes, users will annotate and debate posts themselves. Arguments over what’s true will replace top-down fact-checking. It’s a cheaper, more scalable model. It also turns every disagreement into an opportunity for engagement, which drives activity.

Whether these debates clarify anything doesn’t matter. The engagement itself is the goal.

Content policies are also being simplified.

Topics like immigration or gender have sparked complaints about censorship. Loosening restrictions will allow more people to share their views. That means more heated exchanges, but that’s precisely what drives clicks and keeps people coming back.

We’re scaling back content filters as well.

Filters that aggressively scan for violations frustrate users when they flag innocent content. By focusing only on serious issues, we’ll reduce those frustrations and leave less serious violations to user-driven reporting. Some harmful content will slip through, but the advantage is a more active platform with fewer complaints about censorship.

Political content is making a return. People said it stressed them out, but now we’re seeing renewed interest in political posts. These topics ignite passionate discussions and keep users deeply engaged, which makes them valuable to the platform.

Critics will claim this approach fuels misinformation and deepens divisions. They’re not wrong. Viral conspiracy theories and shocking posts get more activity than factual corrections ever do.

That activity is what keeps Meta thriving. Instead of fighting misinformation, we’re creating the space for it to live and grow because it keeps the platform alive.

Mistakes will happen. Some harmful content will go viral. People will complain. But the bigger picture is that the platform will stay active, advertisers will stay happy, and profits will keep growing.

Truth is no longer our responsibility. It’s yours now. Debate it, reshape it, or ignore it entirely.

I’ll be here, watching as engagement soars, confident this is the right move for the future of Meta. ■


Mark is one of the world’s wealthiest individuals and a force behind the modern attention economy. An aspiring human, he enjoys backyard barbecues and virtual reality.