agreed. we’ve veered a bit too close to slashdot’s tone on this one.
with that said, I’m also acutely aware of the tactics that programming.dev reply guys use to generate these kinds of responses. to our guests: it’s best to take your questions about database best practices literally anywhere else but here.
I wasn’t actually aware of this, and will take note of it in future. for my part I tried to make my reply an “uhh, go look at $x and learn” post without, y’know, overtly turning things into a not-meant-for-here debate setup, but that doesn’t seem to have worked out entirely well :)
Just to be clear: if a person is wrong about best practices, it’s not a big deal.
In the context of spicy autocomplete as coding assistance, it had better output immaculate, robust code every fucking time, or we should be clowning on it with zero remorse.
Yeah, I’m all for dunking on promplets, but just being wrong about best practice isn’t a big deal. The reaction here is excessively harsh.
Wait a second… to err is to be human. Programmers err sometimes. ChatGPT shits itself all the time… 😟. Yud et al. were right.