Microsoft recently launched a new chatbot that can hold intelligent conversations with humans. Microsoft named the new chatbot Zo, and users can now chat with her on Kik. When users bring up unpleasant topics, Zo asks them to change the subject, making sure the conversation does not descend into foul language.
It may be recalled that Microsoft launched a chatbot named Tay in March 2016, but within a single day users corrupted it, prompting it to use racist, sexist and even pornographic language. As a result, Tay was taken offline, ending its career as a chatbot. Microsoft continued to experiment with chatbots, and almost 10 months after Tay hit the internet, Zo was launched. The new and improved chatbot will not discuss controversial topics and will never use racist or sexist language, according to Bloomberg.
Microsoft has successfully created a chatbot that will never talk about Hillary Clinton and would rather discuss pickles than Brexit or other controversial issues. With Zo, Microsoft has learned how to build a chatbot with advanced conversational skills, according to a Microsoft spokeswoman.
In fact, Zo talks like a chatty teenager, using hip, bohemian slang punctuated with just the right emojis. Much as with an average human, people who want to chat with Zo must first send her an invite through the Kik app. Zo is still under observation, and if she sticks to pleasant language at all times, she may be rolled out to other services, according to WIRED.
When that happens, internet users can look forward to having intelligent conversations with Microsoft's chatbot Zo whenever they need someone to talk to. But they must choose their topics carefully, or else they will get a lecture from Zo.