

Similar thought… If it was so revolutionary and innovative, I wouldn’t have access to it. The AI companies would be keeping it to themselves. From a software perspective, they would be releasing their own operating systems and browsers and whatnot.


It’s a form of engagement hacking.
I think the order of Java and Python makes perfect sense. The OOP C++ -> Java pipeline was massive in the early 2000s, when Python wasn’t really on the radar. The world has been slowly moving away from that, and Python is one of the most popular languages right now.
Odin mentioned!
Maybe try convincing him in terms he would understand: if it were really that good, it wouldn’t be public. They’d just use it internally to replace every proprietary piece of software in existence. They’d be shitting out their own browser, office suite, CAD, OS, etc. Microsoft would be screwing themselves by making ChatGPT public. They could replace all the Adobe products and drive them out of business tomorrow.
Edit: that was fast


Also depends how hard the AI runs them. A good chunk of the graphics cards that were used as miners came out on life support if not completely toasted. Games generally don’t run the piss out of them like that 24/7, and many games are still CPU bound.


I took his comment to mean recession from bubble popping.
I don’t think you would get much traction on C developers’ existing projects. C gives you the option to do everything your way. If the developer’s paradigm doesn’t agree with the borrow checker, it could become a rewrite anyway.
Most projects don’t use the newer C standards. The language just doesn’t change much, and C devs like that. This might get a better response from the modern C++ crowd, but then you are missing a large chunk of the world.
They are also dev friendly too,
Not saying you’re wrong, because I don’t use it, but from the outside they appear actively hostile toward developers.


We are eventually going to stop writing code and focus more on writing specifications.
I don’t think this will happen in my lifetime.
100%. In my opinion, the whole “build your program around your model of the world” mantra has caused more harm than good. Lots of “best practices” seem to be accepted without any quantitative measurement to prove it’s actually better. I want to think it’s just the growing pains of a young field.
You shouldn’t have any warnings. They can be totally benign, but when you get used to seeing warnings, you will not see the one that does matter.
And, you can have pointers to bits!
Algorithms + Data Structures = Programs
Niklaus Wirth


60k rows of anything will be pulled into the file cache and do very little work on the drive. Possibly none after the first read.
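A quick sketch of the effect, assuming a Linux-ish shell (`rows.txt` is a made-up filename for illustration):

```shell
# Generate ~60k rows of sample data, then read it twice.
# The first read may touch the drive; the repeat read is served
# from the OS page cache and does essentially no disk work.
seq 1 60000 > rows.txt
time wc -l rows.txt   # first read: possibly from disk
time wc -l rows.txt   # second read: page cache, near-instant
```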
Meh. I had a bash job for 6 years. I couldn’t forget it if I wanted to. I imagine most people don’t use it enough for it to stick. You get good enough at it, and there’s no need to reach for Python.


Heh, the Red Alert readme says it currently requires Borland for the asm and the Watcom compiler for the C/C++.
I’m on your side, dude. Comments rot. Some are useless. Don’t even get me started on Doxygen comments.
AI hype in a nutshell