@[email protected] to Science [email protected]English • 3 months agoHuhsh.itjust.worksimagemessage-square48fedilinkarrow-up1560arrow-down111
arrow-up1549arrow-down1imageHuhsh.itjust.works@[email protected] to Science [email protected]English • 3 months agomessage-square48fedilink
minus-square@[email protected]linkfedilinkEnglish65•3 months agoI assumed they reduced capacity to save power due to the high demand
minus-square@[email protected]linkfedilinkEnglish48•3 months agoThis. They could obviously reset to original performance (what, they don’t have backups?), it’s just more cost-efficient to have crappier answers. Yay, turbo AI enshittification…
minus-square@[email protected]linkfedilinkEnglish40•3 months agoWell they probably did power down the performance a bit but censorship is known to nuke LLM’s performance as well
minus-square@[email protected]linkfedilinkEnglish11•3 months agoTrue, but it’s hard to separate, I guess.