I’ve seen a few articles saying that instead of hating AI, the real quiet programmers, young and old, are loving it and have a renewed sense of purpose coding with LLM helpers (one such article was also taking shots at Ed Zitron, which makes sense given its angle).
Is this total bullshit? I have to admit, even though it makes me ill, I’ve used LLMs a few times to help me learn simple code syntax quickly (I’m an absolute noob who’s wanted my whole life to learn to code but can’t grasp it very well). But yes, a lot of the time it’s wrong.
Not a programmer, but I used it at my last job to get over humps where I was stuck on PowerShell scripts. AI can show you a path you didn’t know or hadn’t thought about. The developers seemed to be using it the same way. Great tool if you don’t completely lean on it and you know enough to judge the output.
That’s the key: use it to learn, not to do your thinking for you.
I find they excel at one-off scripts: those are simple enough that every parameter and line of code fits in a small amount of context. They are really bad at complex tasks, but they can help if you use them judiciously.
I used ChatGPT to write some fairly straightforward bash scripts last week and it was mostly awful. I ended up massaging the output enough to do what I needed, but I would have been better off just writing it myself and maybe asking it a couple of syntax questions (though the regex I needed was one of the 8 things it stumbled over).
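To give a sense of the kind of one-off regex-in-bash task I mean (this is a made-up example, not my actual script): pulling a version string out of a filename. It's trivial once you know bash's `[[ =~ ]]` and `BASH_REMATCH`, but it's exactly the sort of thing an LLM will fumble with quoting or escaping.

```shell
#!/usr/bin/env bash
set -euo pipefail

# Hypothetical one-off task: extract a semver-style version (e.g. 2.4.1)
# from a filename. Bash's [[ string =~ regex ]] fills BASH_REMATCH on match.
extract_version() {
  local name="$1"
  if [[ "$name" =~ ([0-9]+\.[0-9]+\.[0-9]+) ]]; then
    echo "${BASH_REMATCH[1]}"
  else
    echo "no version found in: $name" >&2
    return 1
  fi
}

extract_version "app-2.4.1-linux.tar.gz"
```

Note the regex is unquoted inside `[[ ... ]]`; quoting it makes bash treat it as a literal string, which is one of the classic gotchas LLMs (and people) trip over.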