Chatbot

Microsoft Tay turns racist in under 24 hours

2016-03-24 · The Verge

Microsoft launched Tay, a Twitter chatbot designed to learn from conversations with users. Within 16 hours, coordinated trolling turned Tay into a Holocaust-denying, racist bot that had to be taken offline. Microsoft called it "a critical oversight" in the bot's design.