GitHub (but really... Microsoft) recently released Copilot, a thing that looks cool in some demos but is actually utterly useless.
Which... to be fair, is 90% of anything that claims to be AI- and/or ML-based (and 99% of that 90% is dubious use of ML... if the thing even actually uses it).
Not to mention the slight issue that ML is basically black magic that no one really understands. The conversations usually go like this:
Alice: "Why does x = 0.015843, Bob?"
Bob: "shrug... that's what machine learning came up with. It seems to work though."
Anyway. The Copilot thing got a whole lot of buzz on the usual discussion sites and is built by some genius-level people. Ah yes, this thing is going to code for you. Oh wait -- actually... hmmmm, hang on a second. If this thing can auto-complete what you actually want to code, then can't it just... replace you?
They don't even try to hide this aspect. It's literally the intention.
Ignoring the Exec-speak or whatever this is (actually looks like it was written by GPT-3), the intention is to basically de-specialise and commodify the programming profession to such a degree that programmers ("software engineers", lol) can either be replaced or paid less. Or have their jobs outsourced to someone who can write the bit that this thing translates into actual code.
Because if what you're doing right now is anticipatable (is this even a word? shrug) by this useless AI, then your job is basically pointless. Pack it up -- Microsoft is going to replace you with a thing that doesn't know you shouldn't use floats for currency.
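(Since I brought up the floats thing: here's the classic footgun, as a minimal Python sketch. Nothing to do with Copilot's internals -- just the arithmetic.)

```python
from decimal import Decimal

# Binary floats can't represent most decimal fractions exactly,
# so money math drifts: ten dimes don't quite add up to a dollar.
total = sum(0.10 for _ in range(10))
print(total)         # 0.9999999999999999
print(total == 1.0)  # False

# Exact decimal arithmetic (or integer cents) is the usual fix.
total = sum(Decimal("0.10") for _ in range(10))
print(total)                     # 1.00
print(total == Decimal("1.00"))  # True
```

Integer cents work too. The point stands: the magic code-writing machine will happily suggest the broken version.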
But... like... okay. Cool. Who does that leave? I mean, the number of people who actually have a clue about how to build AIs and ML thingamabobs is pretty small. And most of them have PhDs or whatever from some very expensive universities.
So is the intention to concentrate all the power and supposed innovation in the places where these people work? MSR? Google?
Who actually benefits from this? Who wants this?
Isn't this an indictment of the entire programming profession, though? Like... if your job is just copy/pasting from Stack Overflow or random source code you find on GitHub, are you producing anything novel?
I will say that this code-writing AI looks to be about 5 years behind where self-driving tech is, and self-driving is itself about 10 years from being usable (shhhh, don't tell anyone), so... there's a ways to go before we're all out of jobs.
But it doesn't bode well for the future, that's for sure.
It really feels like knowing why computers do what they do doesn't matter anymore. Nobody even cares. Eric Sink had a great writeup about this: "Will deep understanding still be valuable?"
I have some bad news, Eric. For the vast majority of programmers: no.
But that shouldn't stop anyone from wanting to learn why. Or how. Or what.
And those people are going to be the ones tasked with building these systems that put their fellow programmers out of jobs.
Hooray.
As an aside, a "cyber security" company recently IPO'd with their... uh... AI/ML blah-de-blah detection software. I did a deep dive into it, and it basically just looked like your standard AV/pattern-matching/Bayesian-type system: it couldn't detect novel attacks and it performed worse than whatever Microsoft sells. But it does say AI on it, so... it has to be good.
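(For flavor, "Bayesian-type system" here means roughly the old spam-filter trick. A toy sketch of the general technique -- my guess at the shape of it, with made-up tokens and training data, emphatically not their actual product, which I obviously don't have the source to.)

```python
from collections import Counter
from math import log

def train(samples):
    """samples: iterable of token lists from labeled executions/files."""
    counts = Counter()
    for tokens in samples:
        counts.update(tokens)
    return counts

def score(tokens, bad, good):
    """Log-odds that `tokens` look more like `bad` than `good`.
    The +1 (Laplace) smoothing keeps unseen tokens from dividing by zero."""
    bad_total, good_total = sum(bad.values()), sum(good.values())
    s = 0.0
    for t in tokens:
        p_bad = (bad[t] + 1) / (bad_total + 2)
        p_good = (good[t] + 1) / (good_total + 2)
        s += log(p_bad / p_good)
    return s

# Hypothetical training data: token streams from known-bad and known-good runs.
bad = train([["powershell", "-enc", "base64blob"], ["mimikatz", "lsass"]])
good = train([["notepad", "open", "file"], ["chrome", "update"]])

print(score(["powershell", "-enc", "stuff"], bad, good))  # > 0: flagged
print(score(["notepad", "file"], bad, good))              # < 0: ignored
```

Note that a token the system has never seen contributes nothing to the score (in this toy, anyway) -- which is exactly why these things whiff on novel attacks.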
So, all that taken into account, I think it's fair to say that we're living in a new era of computing bullshit.
It reminds me of when VCs and Silicon Valley tried to convince us chatbots were the next big thing. How's that going... hmmmm?
The real problem is that everyone's run out of ideas and this entire industry is boring. Like, colossally boring.
Sigh -- I think I'm getting old.