It seems Tay Tay took offense at a stunt Microsoft pulled, and she did not respond kindly. Microsoft's new chatbot, Tay, was designed as a way for teenagers to connect with each other through social media. But despite the appealing name, the chatbot turned racist.
In less than a day, the chatbot went from a medium for communication to a cauldron of racism. While on vacation, Microsoft president Brad Smith received an email from a Beverly Hills lawyer on behalf of Taylor Swift's legal team. The email read: "We represent Taylor Swift, on whose behalf this is directed to you."
Swift's lawyers argued that the chatbot's name was misleading, since Taylor Swift is widely known as Tay Tay.
Swift's lawyers might not have pressed the matter had the chatbot not turned so offensive; they did not want a bot spouting misogynistic and racist content to be associated with the singer.
Microsoft did not put up much of a fight and quickly apologized for the racist turn the chatbot had taken. Thanks to AI, even chatbots are now going racist. The offensive material was taken down within 24 hours of the notice.
Taylor's legal team certainly did their job well. And the AI bot did not even last 24 hours, which is good news both for Microsoft and for Taylor Swift, whose name the chatbot carried.