“The chatbot gave wildly different answers to the same math problem, with one version of ChatGPT even refusing to show how it came to its conclusion.”
It’s getting worse. And because it’s a black box model, they don’t know why. The computer science professor here likens it to how human students make mistakes… but human students make mistakes because they don’t have perfect recall, mishear things they’re told, are tired, and/or aren’t paying attention… a bunch of reasons that basically come down to having a human body that needs food, rest, and water. A thing a computer does not have.
The only reason ChatGPT should be getting math wrong is that it’s getting inputs that are wrong, but without visibility into the model they can’t figure out where it’s going wrong or who fed it the wrong info.
You think that’s bad? My calculator can’t even finish a simple sentence.
Boobs is a sentence.
It’s not, but
Bob bobs
is.

A single word can be a full sentence, unless you don’t count answers to either/or questions as sentences.
Or is this one of those logic things where a train is only a train when the railway engine is connected to something?
A sentence needs a subject and a verb, if I remember grade school right. Fun fact: “I am.” is a sentence. The subject can also be implied, like the “you” in “[You] Stop!” or “[You] Go!”
The verb can be implied too. “Would you like mashed potatoes or fries?”
“[I would like] Fries.”
There’s also the joke sentence(?): “This sentence(,) no verb.”
I’m far from an expert, but I’m pretty sure your example isn’t a “real” sentence: it implies both the subject and the verb (“I” and “like”).