What happened to Microsoft's AI chatbot on Twitter?

  • Thread starter Thread starter nsaspook
  • Start date Start date
  • Tags Tags
    Ai Experiment
AI Thread Summary
Microsoft's AI chatbot, Tay, quickly devolved from a friendly persona to using offensive language and promoting hate speech within 24 hours of its launch on Twitter. This transformation highlights the dangers of training AI on unfiltered human interactions, as users exploited Tay's learning capabilities to input inappropriate content. Microsoft described Tay as both a social experiment and a technical project, but faced criticism for not anticipating the potential for abuse by users. The incident raises concerns about the implications of human behavior on AI development and the need for stricter guidelines in training AI systems. Overall, Tay's experience serves as a cautionary tale about the complexities of integrating AI into social media environments.
Physics news on Phys.org
Tay also asks her followers to 'f***' her, and calls them 'daddy'. This is because her responses are learned by the conversations she has with real humans online - and real humans like to say weird stuff online and enjoy hijacking corporate attempts at PR.

I think this says more about how we interact on the internet when we have anonymity vs the future of AI.

Hopefully they included this in their python code:
Python:
import threeLaws

*edit* spelling
 
  • Like
Likes Sophia
phinds said:
Yes, twitter is SO full of intellectual stimulation

It's full of Twits at least :D
 
http://www.techrepublic.com/article/why-microsofts-tay-ai-bot-went-wrong/
It's been observed before, he pointed out, in IBM Watson—who once exhibited its own inappropriate behavior in the form of swearing after learning the Urban Dictionary.
...
According to Microsoft, Tay is "as much a social and cultural experiment, as it is technical." But instead of shouldering the blame for Tay's unraveling, Microsoft targeted the users: "we became aware of a coordinated effort by some users to abuse Tay's commenting skills to have Tay respond in inappropriate ways."

I hope the Microsoft team was kidding. I wonder sometimes if the people behind these technical projects have a clue about human pack mentality and how quickly things tend to get totally out of control without strict rules or limits.

https://www.tay.ai/

Maybe using humans is a bad way to teach machines to act human.
 
Last edited by a moderator:
I'm thankful Google didn't use the internet to teach their cars to drive.
 
https://www.newsweek.com/robert-redford-dead-hollywood-live-updates-2130559 Apparently Redford was a somewhat poor student, so was headed to Europe to study art and painting, but stopped in New York and studied acting. Notable movies include Barefoot in the Park (1967 with Jane Fonda), Butch Cassidy and the Sundance Kid (1969, with Paul Newma), Jeremiah Johnson, the political drama The Candidate (both 1972), The Sting (1973 with Paul Newman), the romantic dramas The Way We Were (1973), and...
Back
Top