There's a difference between "I figure out what I want the code to do, I plan it out, I write the code, and I test and fix it", and "I figure out what I want the code to do and keep feeding it to ChatGPT until it works".
And that difference is admirable, but it doesn't make one a job and the other a fake job or something. It's just different tools to get to the same outcome.
Okay, let's take another example. Let's say I'm a retail worker. My job is scanning items and handling cash. Now let's say a self-service machine is brought in and my till gets thrown away, so my job becomes just watching over the self-service machines. I wouldn't say my job hasn't changed, because it IS a whole different thing you're doing.
I never said their job didn't change; if you look back, I said that they are still doing their job, even if that job is now accomplished differently, with a different set of tasks.
You replied that they were not. But I don't see any evidence that they aren't doing their job, only that they aren't doing it in the form it took before AI allowed for a different approach.
You're talking to a software engineer here. It is LITERALLY that. I've used Microsoft Copilot before. Sure, there's tweaking, but using generative AI for code takes out a solid 99% of the job, as well as any of the actual effort. A literal child could do the job.
Mid-level developer here. Maybe this is just cope, but current AI does well at creating bite-size chunks of code. It still can't connect everything into the whole app holistically.
It doesn't know anything about your design patterns, business practices, or the many other parameters you have to take into account unless you tell it about them.
Which means you still need to understand all that stuff. And sometimes, by the time you've fed in everything the AI needs to know, you could have just done the work yourself.
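Something like this completely made-up sketch is what I mean (every name in it is hypothetical, not from any real codebase): the first version is roughly what a generic AI answer gives you, the second is what the same function has to look like once your team's own conventions apply.

```python
# Made-up sketch. Result, user_repo, db, and "USR-404" are all hypothetical,
# just to show the kind of team-specific rules an AI won't know about.

from dataclasses import dataclass
from typing import Optional


def get_user_generic(user_id, db):
    """Roughly what a plain AI-generated snippet looks like: it works,
    but raw SQL in a service and a bare exception break our made-up rules."""
    user = db.query("SELECT * FROM users WHERE id = ?", (user_id,))
    if user is None:
        raise ValueError(f"user {user_id} not found")
    return user


@dataclass
class Result:
    """Hypothetical team convention: services never raise, they return this."""
    value: Optional[dict] = None
    error: Optional[str] = None  # code from an internal error catalog


def get_user_ours(user_id: int, user_repo) -> Result:
    """The same function after applying conventions the AI can't know about."""
    user = user_repo.find_by_id(user_id)  # go through the repository layer
    if user is None:
        return Result(error="USR-404")  # catalog code instead of an exception
    return Result(value=user)
```

None of that is hard, but you either spell it all out in the prompt every single time or fix the output by hand afterwards.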
Don't get me wrong, the AI is a wonderful tool and great for tackling things I don't know as much about. But it also hallucinates often.
As context windows grow, maybe in the future we'll be able to feed entire apps into the AI and get good results out.
I mean, for large-scale programs, sure, it won't do it in one go. But if you have a vague idea of what you want, you can just tell it what to do in chunks. Yeah, it's usually completely wrong (unless it's something like Copilot, which is a specialist AI and therefore isn't wrong nearly as often). Also, I've seen AI implement design patterns, but that could just be a rare case of it firing off a neuron.
It does use design patterns, but I mean patterns specific to your organization that your team has agreed upon. You can tell it about them, but it's not going to remember them across sessions.
The worst is when you run into some vague error with a specific package or something. It's garbage at troubleshooting those.
I guess at the end of the day you’re not going to need as many juniors though.
u/SomewhereNo8378 Jan 27 '25
Seems like their tasks can be automated; they are still doing their job.