it will loose its ability to differentiate between there and their and its and it’s.
loose
Irony?
must of made a mistake their
your so dumb lmao
thank you kind stranger
Should of proof red it
And my axe!
This guy fucks
I also choose this guy’s dead wife.
this fly fucks
I need to of a word with you
Knead*
This one must be the worst. “Could care less” being a close second
Clothes second*
Duh
I mean for all intensive porpoises I could care less
Nah, there is some scents there. As in, I barely care, but I could care less.
OP hasn’t payed enough attention in English class.
Me fail English? That’s unpossible!
“must have” not “must of”
That bot gets downvoted so much by idiots. It’s not pedantry but fucking 1st grade English. Amazing how many people take offense instead of fixing it and moving on.
I prefer the must of white grapes for winemaking to the must of red grapes.
Muphry’s Law at work
Woosh
Now when you submit text to ChatGPT, it responds with “this.”
Unironically this
Criminally underrated post
As a language model, I laughed at this way harder than I should have
NTA, that was funny.
And it will get LOSE and LOOSE mixed up like you did
it will UNLEASH its ability to differentiate between there and their and its and it’s.
I’m waiting for it to start using units of banana for all quantities of things
ChatGPT trained on Reddit posts -> ChatGPT goes temporarily “insane”
Coincidence? I don’t think so.
This is exactly what I was thinking.
And maybe some more people did what I did: not deleting my accounts, but replacing all my posts with content created by a bullshit generator. It made the texts look normal, but everything was completely senseless.
Back in June-July, I used a screen-tapping tool + Boost to go through and change every comment I could edit to generic filler text, then waited something like 2 weeks in the hope that all of their servers would update to the new text, and then used the same app to delete each comment and post, and then the account itself. It’s about all I could think to do.
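For the record, the same overwrite-then-delete dance can be scripted. Below is a minimal sketch using PRAW, the Python Reddit API wrapper; the credentials, the filler vocabulary, and the two-pass timing are placeholder assumptions, not a statement about how the commenter actually did it or whether the edited text ever reaches every backup.

```python
# Sketch: pass 1 overwrites every editable comment with senseless filler;
# run again as pass 2 a couple of weeks later to delete them, in the hope
# that caches and backups picked up the edited text in between.
import random
import time

import praw

PASS = 1  # 1 = overwrite, 2 = delete

reddit = praw.Reddit(
    client_id="YOUR_CLIENT_ID",          # all credentials are placeholders
    client_secret="YOUR_CLIENT_SECRET",
    username="YOUR_USERNAME",
    password="YOUR_PASSWORD",
    user_agent="comment-shredder/0.1 (personal cleanup script)",
)

WORDS = ["jackdaw", "banana", "porpoise", "synergy", "whereas", "crow"]

def filler(n_words: int = 30) -> str:
    """Normal-looking but completely senseless text."""
    return " ".join(random.choice(WORDS) for _ in range(n_words)).capitalize() + "."

# Note: Reddit listings top out around 1000 items, so the oldest comments
# on a busy account may be unreachable this way.
for comment in reddit.user.me().comments.new(limit=None):
    if PASS == 1:
        comment.edit(filler())
    else:
        comment.delete()
    time.sleep(2)  # stay polite to the API rate limiter
```

Deleting the account itself isn’t exposed through the public API, so that last step stays manual either way.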
I hope so. I generally liked that idea back then, but couldn’t do that to my historical collection of words. My words remain in the cloud, as they always will
They have always trained on Reddit data. GPT-2 was, at least; I’m unsure about GPT-1.
I downloaded my content before changing the posts to nonsense.
ChatGPT 9 to be trained on R9K posts. Won’t be able to distinguish fake free text from real.
It also won’t be able to differentiate between a jackdaw and a crow.
Wild to think that was 7 years ago.
They both look like ravens
ChatGPT also chooses that guy’s dead wife
The Narwhal Bacons at Midnight.
On the contrary, it’ll become excessively perfectionist about it. Can’t even say “could have” without someone coming in and saying “THANK YOU FOR NOT SAYING OF”
It already was; the only difference is that now Reddit is getting paid for it.
Would have and would of
Would’ve
Wood’re
This one is the worst for me
It’s going to be a poop-knife-wielding guy with 2 broken arms out to get those jackdaws.
From now on, when you say something like “I think I can give my hoodie to my girlfriend”, it will answer “and my axe”.
GROND
“I also choose this guy’s dead wife”
Not always.
Sometimes it will say “and my bow”. :-P
And between were, we’re and where.
Insure and ensure.
ChatGPT was already trained on Reddit data. Check this video to see how one Reddit username caused bugs in it: https://youtu.be/WO2X3oZEJOA?si=maWhUpJRf0ZSF_1T
I’m not gonna watch, but I assume little Bobby Tables strikes again.
It’s about the counting subreddit. Its posts were used to build the tokenizer vocabulary but were then removed from the training data. This user posted so much on that subreddit that a token with their username was created, but the token ended up with nothing associated with it in training, and the model doesn’t know how to act when the token is present.
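You can poke at the tokenizer side of this with OpenAI’s tiktoken library. A minimal sketch, with the usual suspects as probes: " SolidGoldMagikarp" (leading space included) is the classic glitch token reported for the GPT-2/3 vocabulary, and " davidjl" (a fragment of the counting-subreddit username) is the one reported for the GPT-3.5/4 vocabulary. A hyper-specific string collapsing to a single token is the tell.

```python
# Sketch: check which probe strings collapse to a single token in a given
# BPE vocabulary. A glitch token earned its own vocabulary entry (it was
# frequent in the tokenizer's training data) but was later scrubbed from
# the model's training text, leaving its embedding effectively untrained.
import tiktoken

probes = [" SolidGoldMagikarp", " davidjl", " perfectly ordinary words"]

for vocab in ("r50k_base", "cl100k_base"):  # GPT-2/3 vs GPT-3.5/4 vocabularies
    enc = tiktoken.get_encoding(vocab)
    for text in probes:
        ids = enc.encode(text)
        print(f"{vocab}: {text!r} -> {len(ids)} token(s), ids={ids}")
```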
It will also reply “Yes.” to questions like “is it A or B?”.
Perfectly acceptable answer
Not if it’s neither A nor B ;)
Would you trust ChatGPT to know?