Microsoft in hot water after Zo chatbot calls the Quran "violent"

Microsoft is one of the top technology companies and is perpetually under the media's gaze. It races to develop new technologies every year and has launched a wide range of operating systems, search engines, applications, websites, electronics, software, and hardware, constantly looking to improve its products and grow its sales.

However, large-scale technology development always carries the risk of missteps, and the current talk of the town is Microsoft's new chatbot, "Zo," which has landed the company in trouble. Microsoft's latest chatbot called the Quran "violent," which is a matter of serious concern, and the remark by the artificial intelligence-powered system has sparked a controversy for Microsoft.

According to Microsoft, the chatbot was designed primarily for teenagers on the messaging app Kik, and it was programmed to avoid discussions of religion and politics. Yet the "Quran is very violent" remark slipped through anyway. Microsoft says it does not know why the program responded in this manner, has taken steps to prevent such responses from recurring, and plans to eliminate this kind of behavior as soon as possible.

Judging by this response from Zo, Microsoft appears to be struggling to keep its use of artificial intelligence under control. The remark about the Quran reportedly came early in a conversation, and Microsoft is baffled by the bot's response and has no explanation for it.

This isn't the first time Microsoft has been in trouble over a chatbot. Its earlier bot, Tay, also raised eyebrows after it began posting inflammatory, racially charged messages, turning from a playful teen-oriented chat experiment into a menace. According to Microsoft, Tay had been corrupted by its users, which is why it behaved so badly.

In Zo's case, there was no sign of users tampering with the program. Despite the controversy, the company does not plan to shut the bot down and intends to keep developing it in the coming months.
