In terms of hype it’s the crypto gold rush all over again, with all the same bullshit.
At least the tech is objectively useful this time around, whereas crypto adds nothing of value to the world. When the dust settles we will have spicier autocomplete, which is useful (and hundreds of useless chatbots in places they don’t belong…)
For something that is proving to be useful, there is no way it will simply fizzle out. The exact same thing was said about the whole internet, and look where we are now.
The difference between crypto and AI is that, as you said, crypto didn’t show anything tangible to the average person. AI, on the other hand, is spreading like wildfire in software and research, and is being used worldwide by people who don’t even realize it.
I’ve seen my immediate friends use chatbots to help them get through boring yearly trainings at work, write speeches for weddings, and put together rough-draft lesson plans.
deleted by creator
Why do we fall into the fallacy of assuming this tech is going to stay stagnant? At the moment it does very low-tier coding, but the idea that we’re even having a conversation about a computer possibly writing code for itself (not in a machine-learning way, at least) was mere science fiction just a year ago.
And even in its current state it is far more useful than just generating “hello world.” I’m a professional programmer, and although my workplace is currently frantically forbidding ChatGPT usage until the lawyers figure out what this all means, I’m finding it invaluable for whatever projects I’m doing at home.
Not because it’s a great programmer, but because it’ll quickly hammer out a script to do whatever menial task I happen to need done at any given moment. I could do that myself, but I’d have to go look up new APIs and type it all out, and that’s a chore. Instead I just tell ChatGPT “please write me a python script to go through every .xml file in a directory tree and do <whatever>” and boom, there it is. It may have a bug or two, but fixing those is way faster than writing it all myself.
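For context, a minimal sketch of the kind of script that prompt tends to produce; the process() function here is just a hypothetical stand-in for the “do <whatever>” step:

```python
# Rough sketch of the throwaway script described above.
# process() is a hypothetical placeholder for the "do <whatever>" part.
import sys
from pathlib import Path
import xml.etree.ElementTree as ET

def process(tree: ET.ElementTree, path: Path) -> None:
    # Placeholder action: just print the file path and its root tag.
    print(path, tree.getroot().tag)

def main(root_dir: str) -> None:
    # rglob("*.xml") walks the whole directory tree, not just the top level.
    for xml_path in Path(root_dir).rglob("*.xml"):
        try:
            process(ET.parse(xml_path), xml_path)
        except ET.ParseError as err:
            print(f"skipping {xml_path}: {err}", file=sys.stderr)

if __name__ == "__main__":
    main(sys.argv[1] if len(sys.argv) > 1 else ".")
```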
I have the same job and my company opened the floodgates on AI recently. So far it’s been assistive tools, but I can see the writing on the wall. These tools will be able to do much more given enough context.
As a thought experiment, we might consider that any function that is too complicated to explain to ChatGPT and have it produce a working result might need to be refactored for complexity. Obviously not in every case, and our own ability to translate the requirements into a useful prompt is a factor, but I think it’s worth considering.
deleted by creator
Genuine question: Based on what? GPT4 was a huge improvement on GPT3, and came out like three months ago.
I’ve gotten it to produce boilerplate for converting from one library to another for certain embedded protocols across different platforms. It generates entry-level code, but nothing that’s too hard to clean up, and it’s enough to get the gist of how a library works.
Exactly my experience as well. Seeing Copilot suggestions often feels like magic. Far from perfect, sure, but it’s essentially a very context-“aware” snippet generator: code completion++.
I have the feeling that people who laugh about this and downplay it either haven’t worked with it or are simply stubborn and don’t want to deal with new technology. Basically the same kind of people who, when IDEs with code completion came along, laughed at them and proclaimed only vim and emacs users to be true programmers.
deleted by creator