I don’t understand why there is such drama around using AI while developing. Everyone I know does it. Of course that doesn’t mean I copy-paste code from ChatGPT, but it is often useful and faster than Googling on Stack Overflow…

Someone even said there that developers using AI will have legal problems in the future… I really don’t think so, because it is the AI provider’s responsibility to train it on data provided freely for this purpose.

I’m against stealing data and brainlessly using AI, but at the same time AI is here and you can’t ignore it.
The drama here isn’t solely over the use of AI. In fact, your last comment about brainlessly using AI is closer to the cause of this drama. The project lead is pushing untested code straight to main, and the fact that it was AI-generated just added insult to injury.
Another point the maintainers raised is the possibility that GenAI code violates the GPL.
It’s a good concern to have, and one I feel people don’t talk about enough.
When I read the comments, it seemed to me like they were blaming him for using AI in general. 🤷🏻‍♂️
A little lesson about technical projects: You will quickly reach 95% completion and have something amazing to show off. Then, 95% of the work is completing that last 5% in order to make the prototype usable.
AI is good at making itself look ready. It is nowhere near ready.
I meant using AI for things like refactoring, or giving it small tasks or questions. Not building whole projects with AI, because that is unmaintainable.
I would like to think some of it is class consciousness. “AI” inherently means fewer jobs. But considering the world we live in, I sincerely doubt the vast majority of people speaking out against it give a shit for any reason beyond “I am just demonstrating my value so the leopards won’t eat my face” and so forth.
Most of it is performative bullshit. Influencers who DID get fucked over (or decided this is why they didn’t get the job they wanted) have talked loud and proud about how evil it is, and that is kind of where the conversation ends. “AI” is evil, and nobody cares to think about what “AI” actually means. I’ll never get tired of giving a depressed smirk when I see the same people who were championing the use of “magic” Adobe tools complaining about “AI” taking graphic design jobs and so forth. Same with the folk who have dozens of blog posts about using tools to generate their docstrings suddenly getting angry that “AI” does that too.
Personally? I very much DO care about the labor side. Not because I don’t think “AI” can do the job of an intern or even most early-career staff (spend some time mentoring early-career staff… I wish I just had to worry about six toes on a foot or code that would delete our prod tables if I don’t review it well enough). But because the only way for those dumbass kids to learn is by doing the tasks we would be getting rid of, and that is already leading to a VERY rapid brain drain: it is increasingly hard to find staff who can actually do the job of a Senior role.
Which is the other issue… “AI” can do some stuff VERY well. Other stuff it is horrible at. And for even more stuff, whether it “does well” is dictated by managers and “Prompt Engineers” and not the domain experts who can say “Yeah… this is good. THAT is complete dog shit”.
And then there is the IP theft part of things. People… are once again stupid and don’t realize that the folk posting answers on Stack Overflow were just as likely to have read that blog post where you talked about your cool algorithm. Or how much art is literally traced from others with no attribution and becomes part of major marketing campaigns. And while I think a MUCH bigger reckoning needs to happen regarding IP law and attribution… “AI” is just a symptom of the real problem.
A good rule of thumb when it comes to internet discourse that I sometimes remember to follow: Actually look at what the other person is writing and HOW they are writing it. There are very good odds you are talking to a literal child (or someone who never grew up). They are going to continue to insist that their life experiences (all five minutes of them) are the most important thing and you are never going to convince them otherwise. And… there isn’t a lot of point in trying in most of these discussions.
Because “These studies show that 99.9999999% of people don’t like sandpaper underpants so that is probably why Levi’s stopped selling it in 1972” versus “Well. I personally like sandpaper on my taint” invariably must become “Both sides have some good points and we need to agree to disagree” in the eyes of all the people who want to be enlightened centrists. The point is made and anyone who would have been convinced already is. Further discussion is just unsatisfying masturbation.
It sure is hard to ignore. You’ve got that right.