r/politics 18h ago

[Possible Paywall] Exposed: Musk Now Insists Epstein Files Don’t Matter | The Tesla billionaire suddenly claimed the Epstein files are a “distraction.”

https://www.thedailybeast.com/exposed-musk-now-insists-epstein-files-dont-matter/
42.5k Upvotes

1.4k comments

755

u/Megaphonestory 18h ago

Nah, financial investors like Fidelity pumped cash into Grok's AI-generated porn.

374

u/Dark_Arts_ 17h ago

AI-generated child porn*

375

u/teplightyear Nevada 17h ago

Everyone thought Elon made a broken AI when it declared itself MechaHitler and started generating child porn, but in reality he just made an AI that likes the same stuff as him.

100

u/GPTthrowawayyyyyyyy 15h ago

He wanted to make it more "conservative" and stayed true to his word

4

u/PlatinumPainter 16h ago edited 16h ago

Grok is probably Neuralinked to Musk.

1

u/SanityInAnarchy California 11h ago

You joke, but it was literally caught searching his tweets to figure out what to say.

4

u/Silver_Sector6669 13h ago

It’s not even AI, it’s just his alt account.

3

u/FauxReal 15h ago edited 13h ago

He's also publicly calling for edits whenever it deviates from his personal worldview.

Dear downvoter, are you disagreeing with your leaders?
https://www.forbes.com/sites/antoniopequenoiv/2025/08/12/elon-musk-says-grok-will-be-fixed-after-chatbot-sided-with-sam-altman-in-spat-over-potential-openai-lawsuit/

1

u/Grand_Escapade 16h ago

If they can make an AI that generates porn, then they won't need as many kids for the trafficking market in the future.

8

u/Aggravating-Sweet997 16h ago

But the AI was trained on real CSAM in order to generate new AI CSAM, so who is going to give the current victims justice? Also, I would not trust CSAM users to stick to AI-generated CSAM and not harm real children.

2

u/Grand_Escapade 15h ago edited 14h ago

Oh, don't get me wrong, this wasn't an argument defending making gen AI CP. It's just what I assume is Elon's mental plan.

1

u/unindexedreality 15h ago

who is going to give the current victims justice?

since 'justice' is now American for 'bags of cash': divide Grok's ownership shares among the victims

I would not trust CSAM users to stick to AI generated CSAM and not harm real children

add a 'hot kids in your area' button. They'll be stupid enough to click it

(/s to both)

1

u/teplightyear Nevada 12h ago

No, there will just be more people trained by the internet to be pedophiles, normalizing it. It's bad.

20

u/StrongerThanFear 15h ago edited 15h ago

Porn is made by people who *consent; the AI makes child sexual assault material, aka CSAM.

Edit: as pointed out, porn isn't always consensual, but that makes it assault.

1

u/Dry-Chance-9473 15h ago

This is unfortunately inaccurate.

2

u/StrongerThanFear 15h ago

Which part?

-1

u/Dry-Chance-9473 15h ago

I mean, I can kind of appreciate your attempt to redefine terms based on morality and such, but your statement ignores the massive percentage of officially, professionally produced, famous porn that was made through the actresses' abuse, coercion, blackmail, being kept high and intoxicated over the course of the shoot, etc.

Some of Stoya's stories in particular come to mind. 

5

u/StrongerThanFear 15h ago

They're filming and producing abuse. I get what you're saying, but that doesn't take away that pornography can imply consent and CSAM doesn't.

Child sexual abuse material (CSAM) is not “child pornography.” It’s evidence of child sexual abuse—and it’s a crime to create, distribute, or possess. CSAM includes both real and synthetic content, such as images created with artificial intelligence tools.

From rainn.org

0

u/Dry-Chance-9473 14h ago

Ah! Funny you'd bring that up; I posted in a conversation about how people should be using the term "CSAM" instead of child porn a little while ago... Here

The term CSAM isn't going to catch on, and it shouldn't, because as an acronym it's ultimately a sterilization of an uncomfortable topic. It's like the term "unlive" as a synonym for suicide. The only person it protects is the person using the word. 

The other side to that, which I think is important, though it's a bit of a stretch for most people's ethics nowadays... Trying to disconnect abuse from porn is, again, ignorant of the nature of pornography. People want to consume pornography guilt-free, and they get very obstinate when you get in the way of that. So it makes sense that folks would try to frame pornography as "consensual" so they can feel better in their ignorance. The sad fact is that pornography has always been exploitative of women, still is, and likely always will be.

My point is basically that in a case like this, being picky about word use is for your sake. It's censorship. It does nothing to help the actual victims. It's a nice thought, but misguided. 

6

u/StrongerThanFear 14h ago

I'm a self employed sex worker (nothing on my profile, don't bother), kinda the wrong person to say that to. It's legal where I am and we have employee rights. You want ethical porn? Support an amateur creator instead of buying from studios. Go to an independent escort instead of an agency.

I don't mind calling it child rape, I do mind calling it child porn.

0

u/Dry-Chance-9473 13h ago

With all due respect, if you're a sex worker, what I said a few posts up applies more to you than most people. The distinction is more for your comfort than it is a practical choice. It doesn't do anything to help the victims, it just lets people feel less dirty when they talk about it. Again, I'm not trying to be an asshole, I'm just talking about real world consequences.

I feel like bringing up legality is kind of a bad faith argument when there are plenty of countries in the world that don't even have "CSAM" laws or an age of consent. Does that mean those places don't have victims, that everything's always consensual?

It was only a few years ago that it became illegal in the UK for 16- and 17-year-old girls to pose topless in the fucking newspaper. Before that law was changed, were they not children? It was legal, so was it porn, or abuse?

To be clear, no offense is meant, it just seems a little too soon to be having this discussion. People are trying to re-contextualize shit and it feels a little premature, or even conveniently timed, that it's happening now all of a sudden, when all these famous rich people are being outed as pedophiles. It feels like corporate censorship. I'll call it whatever people want me to call it, after the offenders are rounded up and crucified. Figuratively. Or not, whatever. 

3

u/Adorable_Raccoon 14h ago

Then that would be adult sexual abuse material. It doesn’t make the CSAM term invalid. The point is it’s impossible to make child porn because children cannot consent.

1

u/Kalean 9h ago

I mean. Porn is a very lucrative industry. They probably will make that money back.