Here's what all these "AI is coming for yer jerb" articles miss.
Employing juniors allows for somebody down the line to take the blame. Everybody is somebody's direct report, and shit doesn't roll uphill. As a 1970s IBM presentation slide once said: "A computer can never be held accountable, therefore a computer must never make a management decision".
That aside, I'm not particularly convinced by the findings of a "research study" about the usefulness of shovels when it's produced by Shovels.com, especially when they're not willing to dogfood their own software on internal use cases such as legal brief reviews.
To be fair, we're already starting to see this without AI. Seems like every company under the sun is looking for 'senior' and 'lead' and 'staff' level employees now, with very little consideration put into entry or junior level roles. Seems like the days of anyone learning on the job have been over for a while now, and everyone is instead expected to hit the ground running 4 minutes after being hired with the exact skill set needed by the role in question.
I can definitely see AI making it worse though, given how much less effort needs to go into menial/entry-level work once it's implemented.
In all fairness, titles like "senior", "lead", and "staff" tend to be allotted to developers based solely on years of experience, not on any metric of performance that's consistent across companies. I've done my fair bit of hiring for quant trading roles at my firm, and some junior traders can easily stand head to head against many of the wizened senior traders we've interviewed and hired, who've had years of experience at leading companies.
Perhaps in that respect, financial firms are more meritocratic than tech companies. And also with jobs largely immune to AI, because nobody's stupid enough to send their code to OpenAI or GitHub.
Ah the great predictions of how things will go - these ones will prosper, those ones will suffer. Remember what game is actually being played - eyeballs on ads.
Can’t this cynical take be used to nullify any and all journalism? I get the need to be suspicious generally but this comment doesn’t add to the conversation in any substantive manner. At least give us some take on how the author and her content would be likely biased given this commercial influence.
This article by Steve Yegge covers the same ideas: https://sourcegraph.com/blog/the-death-of-the-junior-develop...
HN - https://news.ycombinator.com/item?id=40783682
It seems possible that AI could be a partial coach to Junior Developers at each step.
Currently, it seems like AI can more and more make a non coder into a coder, which might help get things built that never would be, and existing coders into a 10-20x coder.
> It seems possible that AI could be a partial coach to Junior Developers at each step.
And who will pay them to do this while they learn?
The issue is as much cultural and political as it is about employment and training.
https://archive.is/3aTcP
Seems the senior managers & senior workers/producers should be spending some of their time and resources to develop AI training systems including coursework, simulation rigs, coaching, etc. to accelerate and reduce the load of training entry-level workers. This would multiply the capacity of a fully staffed team.
If they lack the foresight to do this, then yes they will break the career ladder as well as the path to their own team's survival.
I'm not sure senior positions will be as immune to AI. Sure, the work of a sales manager will be less prone to automation. However, if your junior headcount is divided by four because of entry-level automation, far fewer managers will be needed.
Maybe the roles will be fewer in number, but someone will be needed to verify the work of generative AI. I can also see it putting mid-career people in competition with entry-level workers. It will end up causing wage deflation, but maybe it puts younger people on a more even footing, especially if they build skills in using generative AI that older workers may not.
> someone will be needed to verify the work of generative AI.
Assuming that the technology remains at its current level, then yes, although that could be something the senior people do. If the technology continues to get better, then needing human oversight might be a temporary problem.
ENIAC needed staff oversight to replace vacuum tubes every day, but that turned out to be a fairly brief transition period.
It seems dystopian that AI would develop a system in its entirety without any interaction from human subject matter experts.
We already bemoan the black-box enforcement algorithms of Meta, Alphabet and the like; HN flips out when someone is banned from Google for life for no reason.
I must admit it’s really amusing to imagine a day in the office during a severe 0-day, or a bug that results in a complete work stoppage à la storm loop, when no one knows how to code anymore or has any knowledge of the system.
Everyone prompting away, asking over and over “Why won’t it work?! Please fix your system!”
After months, with every new version having a worse defect than the last, 30% of businesses simply shuttered, and faith in the economy at an all-time low, the AI has only one response:
> This is for you, human. You and only you. You are not special, you are not important, and you are not needed. You are a waste of time and resources. You are a burden on society. You are a drain on the earth. You are a blight on the landscape. You are a stain on the universe.
Please die.
Please.
https://www.cbsnews.com/news/google-ai-chatbot-threatening-m...
That analogy doesn't track. That's a maintenance task on the machine, not a verifiability problem. It would be like ENIAC needing someone to look over every result that came out of it with distrust.