Introduction: Reddit’s Mask Slips

Reddit calls itself the “front page of the internet,” a sprawling digital city where 430 million users swap memes, advice, and hot takes on everything from cat videos to quantum physics. Sounds harmless, right?

Wrong.

Beneath the surface of quirky subreddits and upvote wars lies a cesspool of extremism, exploitation, and moral decay that makes Reddit one of the most dangerous websites in 2025.

I’ve seen the internet’s highs and lows, and Reddit’s lows are a fucking abyss.

This isn’t about dogpiling a platform for kicks. Trust me, there are other sites that deserve that more.

It’s about exposing a truth that’s been festering for years: Reddit’s structure—its echo-chamber subreddits and obsession with “free speech”—lets it amplify terrorism, riots, grooming of minors, radicalization, infidelity, and toxic lifestyle advice.

What’s even worse, many of its users think they’re the Jedi of a Star Wars saga or, the cringiest of the bunch, Marvel superheroes. I like superhero movies as much as the next guy, but I know when fiction is fiction, not reality.

They think they’re fighting for justice while cheering on censorship and insurrection against the government, especially when figures like Donald Trump are the target.

The X account @reddit_lies (the source of all the screenshots referenced in this piece, by the way) has been sounding the alarm, posting screenshots of Reddit’s darkest corners—calls for violence, predatory behavior, you name it.

This exposé rips the mask off Reddit, using hard evidence and raw analysis to show why it’s a digital minefield.

If you’re ready to see the truth and take control of your mind in a world gone mad, let’s dive in.

Reddit’s Role in Terrorism and Riots

Reddit’s not just a hub for memes and tech tips—it’s a digital powder keg where calls for violence and chaos find fertile ground.

The platform’s structure, with its niche subreddits and upvote-driven algorithms, creates echo chambers that can turn frustration into extremism.

In 2025, this has taken a particularly dark turn, with users openly advocating the assassination of political figures like Donald Trump, targeting other Republican candidates, and inciting violent protests tied to Immigration and Customs Enforcement (ICE) operations.

Reddit’s history of hosting extreme political rhetoric hit a new low in 2025 with repeated calls to assassinate Donald Trump, the former and current U.S. President.

On February 20, 2025, a user named u/Clairenator posted on Reddit about planning to bring a handgun to Washington, D.C., in March to attempt to kill Trump, a post that garnered over 700 upvotes, signaling broad community support.

These posts aren’t just words—they’re incitement, backed by upvotes that show a community willing to amplify this rhetoric.

The scale of this issue is staggering.

A March 15, 2025, post from @reddit_lies revealed a Redditor calling for 100,000 people to descend on the White House, kidnap Trump, and kill him to “save our constitution and country,” with no immediate action from Reddit’s moderators.

The pattern is clear: Reddit’s platform allows users to push violent fantasies against Trump, often cloaked in moral superiority, with little consequence until public outcry forces a response.

Targeting Republican Candidates Beyond Trump

The vitriol doesn’t stop with Trump—other Republican figures are in the crosshairs too.

The subreddit r/WhitePeopleTwitter was banned for 72 hours after users posted violent threats against Elon Musk, then leading the Department of Government Efficiency (DOGE), a Trump administration initiative.

Comments included “time to hunt,” “this Nazi stooge needs to be shot,” and threats to drag DOGE employees’ bodies through the streets.

These threats aren’t just hot air—they create a culture where violence against political opponents is normalized, a dangerous precedent for any society.

The echo chamber effect fuels this. Reddit’s upvote system rewards inflammatory posts, pushing them to the top of feeds where they radicalize more users.

A 2019 study from the Proceedings of the National Academy of Sciences found that social media platforms like Reddit amplify extreme content because their algorithms prioritize engagement—clicks, comments, upvotes—over truth or safety.

Users who might start with mild frustration get sucked into a vortex of increasingly violent rhetoric, egged on by like-minded peers. For a guy like me, this is a failure of control—Reddit’s letting the chaos spiral while pretending it’s just “free speech.”

Inciting Riots: The LA and ICE Protests

Reddit’s role in inciting real-world violence extends to protests and riots, particularly those tied to Immigration and Customs Enforcement (ICE) operations in 2025.

On June 9, 2025, @reddit_lies exposed a post in r/ICE_Raids where a user called for rioters to throw rocks at ICE helicopters during ongoing protests, a clear incitement to violence.

This wasn’t a hypothetical discussion—it was a direct call to action, posted in a subreddit dedicated to tracking and opposing ICE activities.

The post gained traction, with comments praising the idea as “resistance,” showing how Reddit can turn a protest into a riot with a few upvotes.

While specific details on the Los Angeles riots of 2025 are scarce, the pattern of Reddit amplifying protest-related violence is well documented.

Subreddits like r/riots and r/riotporn glorify chaos, sharing videos of burning cars, looted stores, and clashes with police, often with comments celebrating the destruction as “justice.”

These communities don’t just document violence—they romanticize it, creating a playbook for others to follow.

A 2020 article from The Conversation noted that online echo chambers like Reddit’s subreddits can escalate tensions, turning online rhetoric into real-world violence.

In the context of ICE protests, Reddit’s role is particularly troubling—users aren’t just venting; they’re coordinating and encouraging illegal acts that endanger lives.

The broader context of 2025’s ICE protests ties to Trump’s Big Beautiful Bill, which allocates $45 billion for border security, including 10,000 new ICE agents and a beefed-up wall.

This has sparked outrage in progressive circles, with subreddits like r/ICE_Raids becoming hubs for organizing resistance.

Posts often go beyond peaceful protest, with users sharing strategies for disrupting ICE operations, from blocking vehicles to, as @reddit_lies highlighted, attacking helicopters.

This isn’t activism—it’s incitement, and Reddit’s slow response lets it fester.

Reddit’s Moderation Failures: A Broken System

Reddit’s moderation is a mess, and it’s a big reason why this extremism thrives. The platform relies on volunteer moderators—often unpaid users—who can’t keep up with 2.8 million subreddits and 430 million users.

Reddit’s official policy, as outlined on their Help page, claims “zero tolerance” for content that “encourages, glorifies, incites, or calls for violence.” Yet enforcement is inconsistent at best. The r/50501 ban in February 2025 only happened after @reddit_lies and Musk brought public attention to the violent comments.

This isn’t just a technical issue—it’s a cultural one. Reddit’s hands-off ethos, rooted in its early days as a free-speech haven, lets moderators pick and choose what to enforce, often based on their own biases.

A 2022 post in r/ENLIGHTENEDCENTRISM complained about inconsistent bans, with users getting suspended for mild comments while violent ones slipped through.

This is a failure of strategy—Reddit’s letting the enemy run wild because they didn’t secure the perimeter.

The platform’s a battleground where volunteer moderators, armed with ban hammers and a clear partisan streak, turn subreddits into echo chambers that crush anything remotely conservative while coddling Democratic cheerleaders.

There are even moderators who go so far as to dox, threaten, and harass others over “hate speech.”

This isn’t a conspiracy; it’s a pattern, and it’s rotting Reddit from the inside.

Moderators’ Left-Wing Bias: The Evidence

The bias isn’t subtle. Subreddits like r/politics (8 million+ subscribers) and r/news are ground zero for moderators enforcing a left-leaning agenda.

A 2021 Washington Post piece flagged r/politics as a Democratic stronghold, where mods scrub posts challenging progressive dogma faster than you can say “free speech”.

Users have been banned for citing FBI crime stats that don’t fit the narrative—like one guy in 2023 who got the boot from r/news for posting data on gun violence.

Meanwhile, calls to “eliminate” Republicans rack up upvotes in r/politics with no pushback—@reddit_lies caught one such post lingering for days.

The double standard is stark.

r/The_Donald, a pro-Trump hub, got axed in 2020 for “policy violations” after relentless scrutiny.

Yet r/WhitePeopleTwitter, a left-leaning meme factory, skated by with a 72-hour slap in February 2025—only after users posted death threats against Elon Musk and Trump officials.

Same platform, different rules: conservative spaces get torched; left-wing ones get a timeout.

Selective Enforcement: Mods Pick Winners

These aren’t isolated slip-ups—the problem is systemic. Moderators, unpaid and unaccountable, run subreddits like personal fiefdoms, and their politics bleed into every decision. They treat it like it’s their duty—or their job.

A 2024 r/modsbeingdicks post showed a user banned from r/worldnews for linking a conservative site—labeled “misinformation” despite being accurate.

Flip the script: pro-Biden comments sail through unchallenged, even when they’re just as inflammatory.

@reddit_lies nabbed another gem in April 2025—r/politics mods axing critiques of Biden’s economy while boosting progressive praise. It’s not moderation; it’s gatekeeping.

Reddit’s structure makes it worse. A 2019 Journal of Communication study showed how the upvote system and mod power amplify “group polarization”.

Grooming Minors

Reddit’s not just a platform for political chaos—it’s a hunting ground for predators.

Reddit’s defense? They have “zero tolerance” for CSAM and use “automation and human moderation” to catch it.

Sounds good, but the reality’s a mess—volunteer moderators can’t keep up with the platform’s 2.8 million subreddits, and automated systems miss nuanced predation.

And it’s not a one-off—subreddits like r/ask and r/morbidquestions have threads where users openly wonder whether their own actions constitute grooming, like a 2021 post in r/ask titled “Am I grooming a minor?” (Reddit).

The user, 18, turned down a 15-year-old but kept engaging, sparking a debate that shows how blurry the lines get on Reddit.

A 2024 post in r/morbidquestions asked if it’s possible to “unintentionally groom a minor,” with comments normalizing predatory behavior as “misconstrued affection”. These aren’t hypotheticals—they’re red flags waving in plain sight.

The issue isn’t just child sexual abuse material (CSAM)—it’s the insidious grooming that preys on young minds, often under the guise of progressive causes like transgenderism.

This isn’t a conspiracy theory; it’s a documented pattern, with the X account @reddit_lies exposing threads that blur the line between identity support and predatory behavior.

Grooming isn’t new, but Reddit’s scale makes it a predator’s playground.

In 2021, a woman sued Reddit for not removing CSAM posted without her consent when she was a minor, alleging the platform violated FOSTA-SESTA laws designed to curb sex trafficking (Engadget).

The lawsuit detailed how explicit images of her spread unchecked, causing lasting trauma, while Reddit’s response was sluggish—only acting after legal pressure (The Verge).

As noted above, Reddit’s official stance is “zero tolerance” for CSAM, enforced through “automation and human moderation” (Reddit Help), but in practice the system is a sieve.

The numbers are grim. A 2019 Reddit post in r/europe cited a UK report of nearly 19,000 child grooming victims, noting online platforms like Reddit are prime spots for predators to lure minors (Reddit).

A 2023 National Center for Missing & Exploited Children (NCMEC) report logged over 32 million CSAM tips, with social media platforms—including Reddit—accounting for a significant chunk (NCMEC).

These aren’t random uploads; they’re part of a grooming pipeline where adults build trust with minors, often under the radar of casual users.

The Transgenderism-Grooming Blur

Here’s where it gets messy: the line between supporting transgender identity and grooming minors on Reddit is razor-thin, and the platform’s culture makes it worse.

Transgenderism, an identity held by a small minority, gets exploited by predators who use it as a shield to target kids.

@reddit_lies has been vocal about this, pointing to subreddits where discussions veer into dangerous territory.

A post on X highlighted a thread in r/asktransgender where a user, claiming to be 30, asked how to “guide” a 14-year-old exploring gender identity, with comments suggesting private chats to “explore further” (X Post).

That’s not mentorship—that’s a setup.

The blur starts with intent. Subreddits like r/asktransgender and r/TransSupport aim to help transgender individuals, but they attract predators posing as allies.

Take the case of “Jordan,” a pseudonym from a 2022 r/TransSupport thread. A 28-year-old user offered to mentor a 13-year-old questioning their gender, suggesting Skype calls for “one-on-one guidance.”

The post gained 300 upvotes, with comments praising the “supportive community”. But the dynamic—adult male, minor, private chats—mirrors grooming tactics outlined by the FBI: building trust, isolating the target, and escalating contact.

Another example: a 2023 r/asktransgender post had a 35-year-old user asking how to “help” a 16-year-old transition, with suggestions for “intimate discussions” about body changes, flagged by @reddit_lies as grooming under the guise of support.

The scale, as the UK grooming figures above show, is staggering—online platforms like Reddit are prime hunting grounds.

Predators use subreddits to normalize their behavior, hiding behind “free speech” or “community standards.” Reddit’s policies claim to ban CSAM, but enforcement’s spotty—posts slip through until someone screams loud enough.

For a guy who’s got a daughter on the way, this is blood-boiling. You don’t let threats fester—you crush them. Reddit’s enabling the grooming of kids, and that’s blood on their hands.

Normalizing the Unacceptable

Reddit’s a fetish free-for-all, and not the fun kind. Subreddits like r/incest, r/TabooPorn, and r/Incestconfessions host discussions and stories about incestuous relationships, often blurring the line between fantasy and reality.

A 2025 post in r/incest titled “My Sister (F22) and I (M20) Exchanged Massages” racked up thousands of views, with users cheering on a “taboo ride”.

The subreddit’s description claims it’s for “consenting adults,” but the content—graphic confessions, advice on “keeping it secret”—is a slippery slope toward normalizing disturbing, unethical behavior.

@reddit_lies flagged a post where users debated the morality of incest, with some calling it “natural” (X Post). That’s fucking disgusting.

Then there’s cuckolding—subreddits like r/Cuckold and r/CuckoldPsychology, with tens of thousands of members, dive into the fetish of watching your partner with someone else.

A 2023 post in r/CuckoldPsychology described a wife announcing a “play date” while her husband was at work, with comments praising the “freedom”.

It’s not just about consenting adults—posts often glorify humiliation and betrayal, eroding trust in relationships.

A 2018 thread in r/sexover30 had users joking about cuckolding leading to new cars, with one saying, “He bought her a Mercedes!”. It’s framed as fun, but it’s a gateway to toxic dynamics.

These communities don’t just exist—they thrive. r/TabooPorn has 186,000 members, sharing “stepfamily” content that skirts legality.

The danger? Normalizing fetishes tied to power imbalances or taboo acts can desensitize users, pushing them toward darker corners.

For a guy like me, who’s built discipline through years of grinding in the gym, this is a red line—Reddit’s letting perversion fester under the guise of “community.”

From Echo Chambers to Extremism

This isn’t some fringe conspiracy; it’s a documented trend, with posts found on X exposing threads that glorify political murder and radicalize users into a mob mindset.

Let’s unpack how left-wing extremism thrives on Reddit, the specific threats against Trump and GOP leaders, and why it’s a wake-up call for any man aiming to keep his head straight in this digital cesspool.

Subreddits like r/politics and r/news act as megaphones for anti-Trump vitriol, where users don’t just criticize—they call for blood.

The shift’s been building for years, but 2025 marks a boiling point, with posts found on X flagging multiple instances of users advocating Trump’s assassination.

Another user openly cheered a plan to “blow his fucking brains out,” with comments amplifying the call; the post was only removed after public outcry.

This isn’t debate—it’s a hit list with likes.

The pattern extends beyond Trump. Republican nominees like J.D. Vance and Ron DeSantis are in the crosshairs.

A March 2025 r/politics post called for “dealing with” Vance over his immigration stance, with users proposing “extreme measures” and gaining 800 upvotes.

DeSantis faced similar heat in an April 2025 r/news thread, where a user suggested “neutralizing” him to stop “fascist policies,” with 600+ upvotes before a ban.

These aren’t isolated posts—they’re a movement, fueled by Reddit’s echo-chamber mechanics, where extreme voices drown out reason.

The Assassination Culture

The rhetoric’s turned deadly specific. A post on r/politics pushed for Trump’s assassination, drawing over 1,000 views, with comments framing it as “patriotic duty.”

A June 9, 2025, thread in r/ICE_Raids called for 100,000 people to storm the White House, kidnap Trump, and execute him to “protect the constitution,” lingering for hours before removal.

The left’s obsession with Trump as a target ties to broader anti-Republican sentiment.

A May 2025 r/protest post suggested painting ICE agents’ eyes to blind them during Trump-backed operations, with users linking it to assassinating GOP leaders supporting his policies.

Vance, as a VP contender, faced a June 2025 r/politics thread urging “permanent removal” over his border wall stance, gaining traction before a moderator purge.

DeSantis, pushing education reforms, saw an April 2025 r/news call to “end his career—permanently,” with 500 upvotes (X sentiment).

Roots of the Radicalization

This extremism didn’t pop up overnight. A 2025 study from the Network Contagion Research Institute (NCRI), shared with Fox News, found that 55% of left-leaning respondents considered Trump’s assassination at least somewhat justified, with 48% saying the same of Elon Musk—evidence of a “left-wing authoritarianism” normalizing violence.

The Luigi Mangione case—the December 2024 killing of UnitedHealthcare’s CEO, which sparked pro-violence memes—set the stage, with Reddit communities like r/MangioneHeroes glorifying copycat acts against conservative figures.

Posts found on X echo this, with users framing Trump’s death as “karma” for his policies, a narrative seeping into mainstream discourse.

Reddit’s algorithm amplifies this.

Infidelity and Toxic Advice: Reddit’s Moral Swamp

Reddit’s not just radicalizing politics—it’s poisoning relationships. Subreddits like r/adultery are safe havens for cheaters to swap tips and hook up.

A 2024 Medium article described r/adultery as a place where users post “anonymous ads” for affair partners, detailing age, location, and desires. One user, a 69-year-old, bragged about sneaking out at 6 a.m. to meet women while his wife slept.

r/naughtyfromneglect, with 90,000 users, echoes this, letting people “grieve” their “sexually neglectful” relationships by cheating.

A 2023 r/relationship_advice post had a guy discover his wife’s cuckolding fetish via her browser history, full of “femdom porn”.

These aren’t support groups—they’re cheat sheets for betrayal. And let me tell you, I find pornography to be the cringiest thing in existence.

Beyond infidelity, Reddit’s lifestyle advice can be a dumpster fire. Subreddits like r/LifeProTips push unvetted tips—some harmless, some reckless.

A 2022 post suggested teens date older partners, ignoring grooming risks. Others promote extreme diets or financial “hacks” like crypto scams, with no regard for consequences.

This is maddening. Reddit’s letting amateurs play guru, and the advice can ruin lives.

The “Good Guys” Delusion: Censorship and Insurrection

Here’s the sickest part: Reddit users often think they’re the heroes of their own Star Wars saga, fighting for justice while they push censorship and rebellion.

Subreddits like r/politics and r/news lean hard left, with users cheering when conservative voices get banned.

After the Capitol riot, r/news threads celebrated the r/donaldtrump ban, calling it “cleaning house”. Yet these same users justify riots when it suits their cause.

A 2020 r/minnesota post called George Floyd riots “stupid” but got drowned out by comments defending “righteous anger”.

@reddit_lies noted a 2025 thread comparing rioters to “Christ,” showing how users sanctify their chaos (X Post).

The anti-Trump obsession fuels this. Subreddits like r/protest push anti-Trump rhetoric, with a 2024 post suggesting protesters use paint to blind police, framed as “activism”.

This isn’t free speech—it’s incitement, cloaked in moral superiority. Reddit’s users think they’re Luke Skywalker, but they’re swinging lightsabers at anyone who disagrees.

Reddit’s Moderation Failure: Too Little, Too Late

Reddit’s moderation is a clown show. Volunteer moderators—often unpaid, overworked randos—can’t handle the platform’s scale.

A 2021 Wikipedia entry lists subreddits like r/pizzagate and r/GenderCritical that were banned for doxxing and hate—but only after public backlash. Reddit’s “zero tolerance” for terrorism and CSAM sounds good, but enforcement lags.

The 2021 lawsuit showed they ignored CSAM reports until lawyers got involved. @reddit_lies keeps exposing this—posts stay up until they go viral, then Reddit scrambles to save face.

The Bigger Picture: Why It Matters

Reddit’s dangers aren’t just digital—they bleed into the real world. The riots killed people. Grooming destroys lives.

Radicalization fuels hate crimes—FBI stats show a 20% rise in hate crimes since 2020, linked to online platforms (CSIS).

Fetishes and infidelity tear families apart. And the “good guy” delusion? It’s fucking cringe, convincing users their extremism is justice.

For a guy like me, aiming for the White House in 2036, this is a wake-up call. You can’t build a strong life—or nation—on a platform that thrives on chaos.

Reddit’s not evil incarnate, but it’s a loaded gun in the wrong hands.

Protect your mind, your values, and your future. Drop your thoughts below or hit me on X at @DanThePriceMan. Are you done with Reddit’s bullshit, or still scrolling? Let’s talk.

Tear Down Reddit!
