

Even that is a bad analogy: it’s like commissioning a painter to create something for you and then claiming you know how to paint. You told an entity that knows how to do the work what you wanted, and it gave it to you. Sure, you can ask for tweaks here and there, but in terms of artistic knowledge you didn’t need any, didn’t provide any, and didn’t really create anything directly. Taking a decent photo requires more knowledge than generating an image with ChatGPT, not to mention actually being in front of the thing you want to photograph.
That wasn’t my experience in school, but there’s a good chance you were in an introductory class or similar. That doesn’t change my argument, though. If you needed the log of something, you at least knew you had to look it up in a table to solve the problem. ChatGPT removes the need to even understand that a log can solve the problem; it just spits out an answer. Yes, people can use ChatGPT to accelerate learning, as one would a calculator, and in those instances I think it’s somewhat valuable, if you ignore the fact that it will lie to your face while claiming to tell the truth. Anecdotally, though, I know quite a few folks who are using it as a replacement for learning and thinking, which is the danger people are talking about.