The evolution of software development over the past decade has been very frustrating. Little of it seems to make sense, even to those of us who are right in the middle of it. My theory is fairly straightforward:
The long-term popularity of any given tool for software development is proportional to how much labour arbitrage it enables.
The more effective it is at enabling labour arbitrage, the more funding and adoption it gets from management.
Right. Because if you quote $700,000 to do a job in C/C++, and someone else quotes $70,000 to do the same job in JavaScript… no prizes for correctly guessing who wins the contract.
But that’s not the whole story. Where C really falls to shit is when you compare it to giving the JavaScript project $500,000 instead. At that point it’s still far cheaper than the C quote, but you can fund a 7x larger team: hire twice as many coders and also give them a whole bunch of support staff (planning, quality assurance, user experience design, a healthy marketing budget…).
JavaScript is absolutely a worse language than C/C++. But if you compare Visual Studio to Visual Studio Code (with a bunch of good plugins)… then there’s no comparison: VSCode is a far better IDE. And Visual Studio has been under active development since the mid-90s. VSCode has existed less than half that long and it has already eclipsed it, despite being backed by the same company, and despite that company being pretty heavily incentivised to prioritise the older product (which they sell for a handsome profit margin, while the upstart is given away for free).
I learned C 23 years ago and JavaScript 18 years ago. In my entire life, I’ve written maybe 20,000 lines of C code that I was actually paid to write, and I couldn’t possibly estimate the number of lines of JavaScript. It’d be millions.
I hate JavaScript. But it puts food on the table, so I turn to it regularly.
Yeah you’re way off the mark. Earlier today I added this comment to my code:
// remove categories that have no sales
For context… above that comment were fifty lines of relatively complex code to extract a month of sales data from several database tables and summarise it down to a simple set of figures, which can be used to generate a PDF report for archival/potential future auditing purposes. Boring business stuff that I’d rather not work on, but it has to be done.
The database has a bunch of categories which aren’t in use currently (e.g. seasonal products) and I’d been asked to remove them. I copy/pasted that comment from my issue tracker into the relevant function, hit enter, and got six lines of code. A simple map reduce function that I could’ve easily written in two minutes. The AI wrote it in a quarter of a second, and I spent one minute checking if it worked properly.
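For a sense of scale, the generated code would have been something like this sketch. The function name and the shapes of `categories` and `sales` are my assumptions for illustration, not the actual code:

```javascript
// Hypothetical shapes: categories is [{ id, name }], sales is [{ categoryId, amount }].
// A minimal sketch of the kind of reduce + filter the comment describes.
function removeEmptyCategories(categories, sales) {
  // Total up sales per category id
  const totals = sales.reduce((acc, sale) => {
    acc[sale.categoryId] = (acc[sale.categoryId] || 0) + sale.amount;
    return acc;
  }, {});
  // Keep only categories that actually sold something
  return categories.filter((c) => (totals[c.id] || 0) > 0);
}
```

Trivial to write by hand in a couple of minutes, and equally trivial to review, which is the point.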
That’s not a “potential” productivity boost, it’s a big one. Does that make me worse at my job? No - the opposite. I’m able to focus all of my attention on the advanced features of my project that separate it from the competition, without getting distracted by all the boring shit that also has to be done.
I’ve seen zero evidence of LLM-authored code being destructive. Sure, it writes buggy code sometimes… but humans do that too. And anyone with experience in the industry knows it’s easier to test code you didn’t write… well, guess what: these days I don’t write a lot of my code. So I’m better equipped to catch the bugs in it.
Why would you quote this and then immediately argue about productivity and progress?
You can’t win an argument by saying “you can argue about this enormous counterpoint all you want but I’m still right”.