Who is responsible for software when AI takes over programming?

symbolipoint (Homework Helper, Education Advisor, Gold Member)
TL;DR Summary
I found an online article saying that artificial intelligence will take over the whole cycle of developing software. The idea is troubling, and it makes me wonder.
I tried a web search for "the loss of programming" and found an article saying that all aspects of writing, developing, and testing software will one day be handled by artificial intelligence. One must wonder, then, who is responsible. WHO is responsible for any problems, bugs, deficiencies, or other malfunctions that the programs make their users endure? Things may go wrong, however the "wrong" happens. The AI would need to fix the problems for the users. Is there any way to enforce corrections and compensation for the users? Some real person must be able to take the blame, but no actual person wrote the software program (whether in that future, or maybe even now).

https://www.linkedin.com/pulse/deat...-programming-may-soon-shankar-munuswamy-wsllc
Also, here's another article, which I haven't read through yet:
https://medium.com/@hatim.rih/the-d...n-layers-are-pushing-traditional-076356db0ed9

Looking through that second article, I may be misunderstanding some of what I 'think' about losing the need to know how to write or handle code in a programming language...
 
The same people who were responsible before the AI "took over".
If someone does their homework with AI assistance, it is still their homework.
If a company sells a product that includes AI-written software, that company is still responsible for the marketability of their product.

When you buy a consumer product, you don't need an engineering understanding of how it works. For example, if Tesla wants to use AI in its "Autopilot" to recognize road obstructions and it doesn't work, your beef is with Tesla, not the computer.
 
.Scott said:
The same people who were responsible before the AI "took over".
If someone does their homework with AI assistance, it is still their homework.
The only exception I can think of is if AI tools are provided by a company that makes some guarantees of the results. Then they would take on some legal risk.
 
You folks need to see Mr Whipple’s Factory on the original Twilight Zone.

Machines replace the workers and Robbie the Robot replaces Mr Whipple.
 
jedishrfu said:
You folks need to see Mr Whipple’s Factory on the original Twilight Zone.

Machines replace the workers and Robbie the Robot replaces Mr Whipple.
Watched that one recently. Binged the entire series in order.
 
symbolipoint said:
all aspects of writing, developing, and testing software will one day be handled by artificial intelligence.
Yes, but not all aspects of ownership, marketing, sales, distribution, revenue, and legality.

Software is a product of a company; software doesn't just burst forth into the world like an infant born in an empty meadow.
 
Until it becomes open source. I know a few companies that spun off software to open source.

Sometimes they become Apache projects, with a whole community of programmers improving and enhancing them.

Small companies do it when they realize that an internal project, while useful, is not sellable and would benefit from being made open source.
 
Right, but in the case of open source software, the question of who takes responsibility answers itself. Anyone is free to fork it and make improvements.
 