Who is responsible for the software when AI takes over programming?

  • Thread starter: symbolipoint
AI Thread Summary
The discussion centers on the accountability for software created by AI, questioning who is responsible for issues like bugs and malfunctions. It emphasizes that companies remain liable for products that incorporate AI, similar to traditional software development. Users do not need technical expertise to engage with consumer products, shifting the responsibility to the manufacturers. The conversation also touches on open-source software, where responsibility is decentralized, allowing users to choose versions and improvements. Ultimately, the accountability for AI-generated software still lies with the companies that produce and market it.
symbolipoint
Homework Helper
Education Advisor
Gold Member
Messages
7,561
Reaction score
2,004
TL;DR Summary
I found an online article saying that artificial intelligence will take over the whole cycle of developing software programs. The idea is troubling and I wonder.
I tried a web search for "the loss of programming" and found an article saying that all aspects of writing, developing, and testing software programs will one day be handled through artificial intelligence. One must wonder, then, who is responsible. WHO is responsible for any problems, bugs, deficiencies, or whatever malfunctions the programs make their users endure? Things may go wrong, however the "wrong" happens. AI would need to fix the problems for the users. Is there any way to enforce corrections and compensation for the users? Some real person must be able to take the blame, but no actual person wrote the in-the-future-or-maybe-even-now software program.

https://www.linkedin.com/pulse/deat...-programming-may-soon-shankar-munuswamy-wsllc
Also, here's another article, which I have not yet read through:
https://medium.com/@hatim.rih/the-d...n-layers-are-pushing-traditional-076356db0ed9

Looking through that second article, I may be misunderstanding some of what I 'think' about the loss of the need to know how to use or handle the code of a programming language...
 
The same people who were responsible before the AI "took over".
If someone does their homework with AI assistance, it is still their homework.
If a company sells a product that includes AI-written software, that company is still responsible for the marketability of their product.

When you buy a consumer product, you don't need to have an engineering understanding of how it works. For example, if Tesla wants to use AI in its "Autopilot" to recognize road obstructions, and it doesn't work, your beef is with Tesla, not the computer.
 
.Scott said:
The same people who were responsible before the AI "took over".
If someone does their homework with AI assistance, it is still their homework.
The only exception I can think of is if AI tools are provided by a company that makes some guarantees of the results. Then they would take on some legal risk.
 
You folks need to see Mr Whipple’s Factory on the original Twilight Zone.

Machines replace the workers and Robbie the Robot replaces Mr Whipple.
 
jedishrfu said:
You folks need to see Mr Whipple’s Factory on the original Twilight Zone.

Machines replace the workers and Robbie the Robot replaces Mr Whipple.
Watched that one recently. Binged the entire series in order.
 
symbolipoint said:
all aspects of writing, developing, and testing software programs will one day all be handled through artificial intelligence.
Yes but not all aspects of ownership, marketing, sales, distribution, revenue and legality.

Software is a product of a company; software doesn't just burst forth into the world like an infant born in an empty meadow.
 
Until it becomes open source. I know a few companies that spun off software to open source.

Sometimes they become Apache projects, with a whole community of programmers improving and enhancing them.

Small companies do it when they realize that an internal project, while useful, is not sellable and would benefit from being made open source.
 
Right, but in the case of open source software, the question of who takes responsibility answers itself. Anyone is free to make a branch and make improvements. Reviewers can bless or reject the changes; users can refuse to use it, preferring an earlier iteration.
 
No need for AI: you only need to read the EULA to know that (in common cases) nobody ever was actually responsible.
 
  • #10
IMO, current AI systems are far from being able to take over the whole of software development. One might trust a chatbot to cough up specific working bits of code to plug in, or more likely, use as a template, thus making coding go faster. Code generators are useful only for mundane, formulaic situations, and overall program design, integration, and testing have always been a highly creative process that is woefully underestimated because it is unseen and ill understood, even by the managers who hire people to do it. Of course, nothing will prevent some pointy-haired managers from trying to replace human coders altogether, and it will likely take some dramatic train wrecks to prove that unwise.

All the current chat bots I and friends have tried are both helpful and, often, glaringly and shamelessly incorrect about things. Hilarious recent example of AI "logic" at work:

Person: How fast do otters swim?
AI Chatbot: River otters 8 to 10 mph. Sea otters 5 mph
Person: Do river otters swim faster than sea otters?
AI Chatbot: I Don’t know.
 
  • #11
As for "many types of AI", I believe just about all of it involves machine learning algorithms based on Bayes' theorem: probability-based, requiring training, and subject to statistical errors. If the code being written by an AI system can afford to have errors in it, fine. But if it's a surgical robot or a laser weapon or any system affecting human life that needs to be reliable, well, being right MOST of the time is not going to cut it. And frankly, I think people would prefer all their software to be right more than most of the time.

At this point, both the US and Russia have admitted to mistakenly shooting down a passenger airliner. There was software involved in the attempt to recognize what they were shooting at. That is a classic situation in which AI is already being used in warfare.

Think of email spam filters. We can at least intervene and rescue a misclassified email. So that is a perfectly good case where machine learning is good to use. But in today's world, it is being unleashed unwisely in many areas.
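To make the spam-filter point concrete, here is a minimal sketch of the kind of Bayes-theorem-based classifier described above. It is a toy naive Bayes filter with made-up training data (the word lists, function names, and messages are all hypothetical, invented for illustration), not any real product's implementation:

```python
import math
from collections import Counter

# Tiny, made-up training corpora (hypothetical examples).
spam = ["win free money now", "free prize claim now"]
ham = ["meeting at noon", "project update attached"]

def word_counts(docs):
    """Count word occurrences across a list of documents."""
    counts = Counter()
    for doc in docs:
        counts.update(doc.split())
    return counts

spam_counts = word_counts(spam)
ham_counts = word_counts(ham)
vocab = set(spam_counts) | set(ham_counts)

def log_score(msg, counts, class_size, total_docs):
    """Log of prior * product of per-word likelihoods."""
    total_words = sum(counts.values())
    score = math.log(class_size / total_docs)  # class prior
    for word in msg.split():
        # Laplace (add-one) smoothing avoids zero probability
        # for words never seen in this class during training.
        score += math.log((counts[word] + 1) / (total_words + len(vocab)))
    return score

def classify(msg):
    n = len(spam) + len(ham)
    s = log_score(msg, spam_counts, len(spam), n)
    h = log_score(msg, ham_counts, len(ham), n)
    return "spam" if s > h else "ham"
```

The point of the example is exactly the trade-off described above: the classifier's verdict is a probability comparison, so it will sometimes misclassify, and that is tolerable only because a user can open the spam folder and rescue the mistake.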
 
  • #12
In response to my: "The same people who were responsible before the AI "took over".
If someone does their homework with AI assistance, it is still their homework.", @FactChecker replied:
FactChecker said:
The only exception I can think of is if AI tools are provided by a company that makes some guarantees of the results. Then they would take on some legal risk.

This brings up a related topic: how you choose whom you do business with, and how you do that business.
First, if it's a free service, their "guarantee" is no more than puffery.
Second, even if you are paying for the AI service, your course instructor may be more sympathetic to "the dog ate my homework" than to "my AI supplier guaranteed the answers". More generally, any time a supplier provides a "guarantee" that seems unrealistic, you should expect that what you are actually buying is, at best, the recourse provided in the guarantee, and not a functional product.
 
  • #13
harborsparrow said:
IMO, current AI systems are far from being able to take over the whole of software development. One might trust a chatbot to cough up specific working bits of code to plug in, or more likely, use as a template, thus making coding go faster.
The trouble is that, collectively, they can improve in sophistication very quickly. They don't keep secrets or hide their code or any other things that might impede cross-pollination. And they can take in vast troves of data rapidly.

So, while they may be only doing simple tasks for now, I suspect the complexity will improve at a (which one is it? geometric? exponential?) rate.


harborsparrow said:
Person: How fast do otters swim?
AI Chatbot: River otters 8 to 10 mph. Sea otters 5 mph
Person: Do river otters swim faster than sea otters?
AI Chatbot: I Don’t know.
Ah, but this is a poor example of software dev, isn't it?
 
  • #14
DaveC426913 said:
The trouble is that, collectively, they can improve in sophistication very quickly. They don't keep secrets or hide their code or any other things that might impede cross-pollination. And they can take in vast troves of data rapidly.

So, while they may be only doing simple tasks for now, I suspect the complexity will improve at a (which one is it? geometric? exponential?) rate.



Ah, but this is a poor example of software dev, isn't it?
"I suspect the complexity will improve"

That is the question the money is riding on: will it keep scaling, or is this "as good as it gets"?
Even if this is "as good as it gets", is it already good enough to change how we design and build complex systems?
The truth is, nobody knows. A Jevons-paradox future says we will just build more complex software and employ just as many people (software developers of all types) to meet the demand for more software that these systems will create.

Things like the fake animal escape videos and other brain-dead media creations are low-hanging fruit AI slop.
 
  • #15
DaveC426913 said:
The trouble is that, collectively, they can improve in sophistication very quickly. They don't keep secrets or hide their code or any other things that might impede cross-pollination. And they can take in vast troves of data rapidly.
There is money to be made, so the best will hide their trade secrets.
Possibly they will hide them simply because they are the best, or they will become the best because they make a profit and can hire many of the best workers.
 