TL;DR: IntelliSense works best if you write bottom-up (true), and it means you have to remember less stuff (also true); therefore it makes you write worse code (very doubtful).
> So I don’t think IntelliSense is helping us become better programmers. The real objective is for us to become faster programmers, which also means that it’s cheapening our labor.
This doesn’t make any sense though.
- People don’t have unlimited time. Writing high-quality code takes time, and wasting it remembering or typing stuff that IntelliSense can take care of means I have less time for refactoring etc. Also, one of the really useful things about IntelliSense is that it enables better refactoring tools!
- It doesn’t make you dumber to use tool assistance. It just means you get less practice doing the thing the tool helps you with. Does that matter? Of course not! Does it matter that I can’t remember how to do long division because I always use a calculator? Absolutely not. Similarly, it doesn’t matter that I can’t remember off the top of my head which languages use `starts_with`, `HasPrefix`, `startswith`, etc.; it doesn’t matter at all if IntelliSense can easily tell me (sketched below).
- You don’t have to use the IntelliSense suggestions. Just press Escape. It’s very easy.
- It’s very well known that making something easier to do increases demand for it.
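To make that last point concrete, here is a minimal Go sketch of the same prefix check, with the Rust and Python spellings noted in the comments. The example string is made up for illustration; the point is only that the names differ while the operation doesn’t.

```go
package main

import (
	"fmt"
	"strings"
)

func main() {
	// Go spells the prefix check strings.HasPrefix(s, prefix).
	// Rust spells it s.starts_with(prefix); Python spells it s.startswith(prefix).
	// Same operation, three different names to (not) memorize.
	fmt.Println(strings.HasPrefix("IntelliSense", "Intelli")) // prints: true
}
```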
That is an ungodly 40-minute read without much of a takeaway. Here’s the AI-generated TL;DR:
The author recounts a reluctant return to public speaking, stemming from a past incident where he discussed topics beyond his audience’s expectations. Despite his initial hesitation, he agrees to talk about “Computers in the Movies,” drawing parallels between the portrayal of computers in films and real technological advances. The author highlights the impact of movies like 2001: A Space Odyssey and WarGames in depicting computers as both helpers and threats, often reflecting societal fears about technology. He contrasts Hollywood’s dramatizations with today’s mundane reality of technology, pointing out how things like email and PowerPoint have subtly influenced our behaviors and thinking, and suggesting that our relationship with technology is akin to addiction rather than dependency. Transitioning to modern development practices, the author critiques tools like Visual Studio and features like IntelliSense for shaping, and perhaps simplifying, programming methods. Despite these tools’ capabilities, he fears they undermine coding skills by promoting faster but potentially less thoughtful programming. Finally, he reflects on the shift away from traditional coding, highlighting the value of returning to basic coding tasks to rediscover the joy of pure algorithmic programming, away from the complexity of modern integrated development environments and pre-written frameworks.
Very long shitpost?
¿
I always thought writing and debugging code in VS was fine. Where I never liked Visual Studio was building, especially cross-platform building, or just using special tools, etc. Frankly, just give me an editor and Make, but Make sucks on Windows due to slow process startup. But I do like the VS debugger.
No.
Although I think it’s a symptom of a larger problem. At the very least, consider Rider (or for non-C# code, VS Code/Codium/your terminal editor of choice).
At work we have to use VS for C# development though, because we have VS licenses and not Rider licenses. I guess we could use VS Code for C# dev, but I could also use Morse code to type, and neither of those sounds like a good time when you take our work tooling into account.
There can’t be much brain there to rot if it’s still using Microsoft products.