Stack Overflow article.
Posted: 31 May 2025 15:44
by Squashman
Interesting article about Stack Overflow. The author seems to think that AI is the nail in the coffin for the site. I personally don't think AI can code everything. I think some of the batch file voodoo that many of you have contributed to this site could never be produced by AI. I have seen people post AI-generated code on Stack Overflow because the person who asked the AI for it could not troubleshoot it. Let me know what your thoughts are.
https://www.infoworld.com/article/39934 ... rflow.html
Re: Stack Overflow article.
Posted: 31 May 2025 18:36
by thelowsunoverthemoon
I think there's a good deal of truth in it. Most questions on Stack Overflow are about well-established technologies (Python, Java, etc.) that these models have been trained on with years of Internet knowledge, and many of them are just syntax or didn't-read-the-documentation errors, questions that already have answers, or questions that can be pieced together from multiple answers. These types of questions can easily be answered by today's LLMs. Especially for a beginner, who is the type to ask the most questions, why would they write up a question and wait when they can get a decent answer from a model instantly?
Now, if some new technology were to come along and take the world by storm, I think Stack Overflow would see a resurgence, at least until the models are retrained with new data on that tech.
And on the point about Batch, I think it's just a lack of demand. Given that these models are trained on the Internet, and there's so much bad Batch Script code out there, the Batch Script output of today's models is far behind what they produce for something like a Python question. However, I really think that if someone had the time and expertise to carefully fine-tune one of these models on a curated Batch Script dataset from the experts on this site, along with selected documentation, its Batch Script capabilities would be quite good as well.
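To make that idea concrete, here is a minimal sketch of what such a fine-tuning run could look like, assuming the Hugging Face transformers and datasets libraries; the base model name, the data file, and the hyperparameters are placeholders for illustration, not a tested recipe.

# Hypothetical sketch: fine-tuning a small causal language model on an
# expert-curated Batch Script corpus (one script or Q&A pair per line).
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

MODEL_NAME = "gpt2"                      # stand-in for any small base model
DATA_FILE = "curated_batch_scripts.txt"  # hypothetical curated dataset

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(MODEL_NAME)

# Load the plain-text corpus and tokenize it for causal-LM training.
dataset = load_dataset("text", data_files=DATA_FILE)["train"]

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])

# mlm=False means standard next-token (causal) language modeling.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

args = TrainingArguments(
    output_dir="batch-script-model",
    num_train_epochs=3,
    per_device_train_batch_size=4,
    learning_rate=5e-5,
)

Trainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=collator,
).train()

Whether the result is any good would of course depend entirely on the quality and size of the curated dataset, which is exactly the point about needing the experts on this site.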
Re: Stack Overflow article.
Posted: 01 Jun 2025 11:16
by aGerman
Forty years ago, my father used to say, "You young people are getting dumber and dumber. I know the multiplication tables by heart. You even enter 5x5 into your pocket calculator."
My father was totally right. Sometimes I can't recall parts of the multiplication tables. Why? Because I don't frequently need to.
My father was neither wrong nor right. I don't enter 5x5 into the calculator. Instead I have it in Excel formulas and program code.
My father was totally wrong. Apparently we are not as brainless as predicted. Our education branched out in directions other than memorizing stupid tables. We let machines remember them.
It wasn't possible for my father to foretell exactly where things would go. However, he could have seen that he couldn't banish the calculator from the world just because he didn't like how it changed our behaviors. I call this a typical generational problem ...
Now I'm old and of course biased, just like my father was decades ago. For me, AI still stands for "Absence of Intelligence" because it primarily repeats or reassembles existing knowledge in a not-yet-reliable way. However, I'm aware that it will continue to exist, and I'm sure it will continue to evolve. Who knows whether AI will be able to invent new things or conduct in-depth research in the future. And when that time comes, I doubt it will still be necessary to have all program code in a human-readable form.

At least we shouldn't need voodoo any longer.
Steffen
Re: Stack Overflow article.
Posted: 02 Jun 2025 02:15
by T3RRY
I don't think the idea of the article is mistaken. AI has become the go-to for many people.
I myself don't think GPT is a reliable standalone tool for beginners - it has a tendency to blend syntax from adjacent languages, which makes it necessary to have at least a foundational knowledge to proofread and error-check its output.
Personally, I think culture has had a lot to do with people electing to use AI tools over Stack Overflow.
There are still communities where languages and their limits are discussed, though they are often not as easily accessed as a forum such as Dostips.
The opinions below relate specifically to AI based on large language models such as GPT, Grok, etc.
I do feel that dependence on these types of AI will have a profound social cost, particularly in terms of reduced collaboration and the loss of innovation that results from it.
Time will tell whether that will be offset by the way it can be used to enable self-driven learning.
The biggest problem with learning anything tends to be the fact that you don't know what you don't know.
My opinion on such AI tools is that their best use is for identifying learning goals. Outside of that, I do not trust that they have sufficiently specialized or coherent training in any given area of expertise to be relied on too heavily.