Nemeski@lemm.ee to Technology@lemmy.world · English · 5 months ago
Tim Cook is “not 100 percent” sure Apple can stop AI hallucinations (www.theverge.com)
145 comments · cross-posted to: [email protected]
kaffiene@lemmy.world · English · 5 months ago
I’m 100% sure he can’t. Or at least, not from LLMs specifically. I’m not an expert, so feel free to ignore my opinion, but from what I’ve read, “hallucinations” are a feature of the way LLMs work.
rottingleaf@lemmy.zip · English · 5 months ago
One can have an expert system assisted by ML for classification. But that’s not an LLM.
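The architecture the last comment describes, a deterministic rule system fed by an ML classifier, can be sketched roughly as below. This is a minimal illustration, not any specific product: the names (`classify_intent`, `RULES`, `respond`) are hypothetical, and the classifier is a keyword stub standing in for a trained model. The point is that the response side is a fixed lookup table, so unlike an LLM it cannot generate an answer that was never authored.

```python
def classify_intent(text: str) -> str:
    """Stand-in for an ML classifier: maps free text to a known label.
    A real system would use a trained model; this stub keys on words."""
    text = text.lower()
    if "weather" in text:
        return "weather_query"
    if "alarm" in text:
        return "set_alarm"
    return "unknown"

# Deterministic expert-system side: every label maps to a fixed, auditable
# action/response, so output is confined to this table -- no hallucination,
# only possible misclassification upstream.
RULES = {
    "weather_query": "Fetching the forecast from the weather service.",
    "set_alarm": "Setting an alarm as requested.",
    "unknown": "Sorry, I can't help with that.",  # explicit fallback, no guessing
}

def respond(text: str) -> str:
    return RULES[classify_intent(text)]
```

The trade-off, as the comment implies, is coverage: such a system only handles intents its authors enumerated, which is exactly what an LLM is not.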