In a blog post titled “More Speech and Fewer Mistakes”, Meta’s new policy chief, Joel Kaplan, outlined dramatic changes to the company’s content moderation strategy:
No more third-party fact-checkers.
Instead, Meta is rolling out a Community Notes model, following the path of other platforms like X.
Loosening topic restrictions.
Only “illegal and high-severity” content will face the hammer, while mainstream (even polarizing) discussions get more breathing room.
Tailored political feeds.
You’ll see more of what aligns with your views – think a hyper-personalized echo chamber.
Why now?
Some see this as Meta aligning with changing political winds, especially as a new U.S. administration takes charge.
Others believe it’s a long-overdue correction of policies that often felt too heavy-handed or mistake-prone (Meta itself concedes that one to two out of every ten of its censorship actions may have been mistakes).
Platforms like Meta walk a tightrope between free expression and user safety. These changes could foster openness, but the risk of misinformation resurging is real.
The big question:
Will more “free speech” actually help create healthier online spaces, or are we about to witness the rise of even more echo chambers?
What’s your take? Comment below.