uh, well, I'm running like fifty things at once on all my devices, and except for the OS, all of them were coded with this design philosophy. I can definitely tell.
on a commercial device, with everything live-snitching on me to fifty different people at once, computing actually appears to slow down over time.
That’s not because of hand-written assembly vs compilers, that’s because everyone and their dog wants abstractions up the wazoo. You have frameworks on top of frameworks, and no compiler can efficiently sift through that nonsense.
I’d really like to see a shift back toward compiled languages like Rust to cut through the bloat.
oh, no, I don’t think it’s the compilers’ fault, I think it’s the design philosophy of ‘fuck it, computers get faster, be a messy bitch, finish code fast.’ that’s fucking us.
Yup. I feel that so much at my day job. We use Python on our BE, and we have so much waste on top of that.
For example, we have some low-level code for a simulation (not Python), and when we ported it to Python we noticed the original spent a ton of its time doing bubble sort. So our Python implementation ended up being competitive just by making reasonable high-level choices. We also had a paginated sort + filter that loaded every possible record into RAM and did the logic in Python instead of SQL (fixing that dropped request time by about 80% on larger queries).
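That SQL pushdown pattern can be sketched roughly like this (table and column names are hypothetical, and sqlite3 stands in for whatever database the real backend uses):

```python
# Sketch: push a paginated filter + sort into SQL instead of doing it in Python.
# Table/column names are made up for illustration; sqlite3 is a stand-in DB.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, status TEXT, score REAL)")
conn.executemany(
    "INSERT INTO records (status, score) VALUES (?, ?)",
    [("open", i * 0.5) for i in range(1000)] + [("closed", 1.0)],
)

def page_slow(status, page, per_page=50):
    # The anti-pattern: load *all* records, then filter/sort/slice in Python.
    rows = conn.execute("SELECT id, status, score FROM records").fetchall()
    rows = [r for r in rows if r[1] == status]
    rows.sort(key=lambda r: r[2], reverse=True)
    return rows[page * per_page:(page + 1) * per_page]

def page_fast(status, page, per_page=50):
    # Filter, sort, and paginate in SQL; only one page of rows leaves the DB.
    return conn.execute(
        "SELECT id, status, score FROM records "
        "WHERE status = ? ORDER BY score DESC LIMIT ? OFFSET ?",
        (status, per_page, page * per_page),
    ).fetchall()

assert page_slow("open", 2) == page_fast("open", 2)
```

Same results either way, but the fast version never materializes the whole table in application memory, which is where the big request-time win on larger queries comes from.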
We have so much more crap like that, it’s not funny. But I’m ticking them off one by one by inflating my estimates a little to allow for refactors.
Yup. And our processors are a lot more powerful, so the tricks you’d do in assembly to eek out performance just don’t matter anymore.
I know it’s a typo but “eek out performance” has made me picture someone programming a little ghost to spook the rest of the code into running faster
I think it was a subconscious letter swap. :) I’ll keep it because ghosts.
I barely code and this hurt to read.
Yes, that’s what I was referring to.
It’s things like out-of-order execution and branch prediction that do it. The thing you’re usually waiting on the most is IO anyway.
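A toy sketch of that last point, not a real benchmark (the IO wait is simulated with a sleep, and the exact numbers are machine-dependent):

```python
# Rough illustration that IO latency usually dwarfs ordinary CPU work.
import time

def cpu_work():
    # A decent chunk of pure-Python computation.
    return sum(i * i for i in range(50_000))

def io_wait():
    # Stand-in for a disk or network round trip (~100 ms).
    time.sleep(0.1)

t0 = time.perf_counter()
cpu_work()
cpu_t = time.perf_counter() - t0

t0 = time.perf_counter()
io_wait()
io_t = time.perf_counter() - t0

print(f"cpu: {cpu_t * 1000:.1f} ms, io: {io_t * 1000:.1f} ms")
```

On typical hardware the simulated IO round trip takes an order of magnitude longer than the loop, which is why clever instruction-level tricks rarely move the needle for IO-bound workloads.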