Microsoft Bing Chatbot Review | Inconsistent and reductive, but does its job

The preview version of the Bing AI-powered chatbot on the Microsoft Edge browser does what it needs to, but can be erratic and uncooperative

May 01, 2023 02:13 pm | Updated May 04, 2023 03:33 pm IST

The new AI-powered Bing chatbot can be accessed through the Microsoft Edge browser. | Photo Credit: AP

Microsoft’s new AI-powered Bing chatbot first made headlines for all the wrong reasons. It reportedly lashed out at beta testers, appeared to flirt with them, and even claimed to be spying on employees through their webcams. Microsoft responded by limiting search queries, though it eventually loosened these restrictions.

However, the version of the Bing chatbot that reached our computers is worlds apart from its former iteration. We tested out this more public version of the ‘AI-powered copilot for the web’ ourselves. Here’s what we discovered.

Design

The new AI-powered Bing chatbot can be accessed through the Microsoft Edge browser when logged in with a Microsoft account. In our case, the Edge browser opened on its own and began running searches through the chatbot, forcing the user to pay attention to it and inviting them to enter a question of their own. While this dark pattern was irritating, it was ultimately effective.

User experience

Users can ask the Microsoft Bing chatbot five questions per session, after which a new session begins with another five questions.

The Bing chatbot offers three settings for its answers: Creative, Balanced, and Precise, each meant to reflect an appropriate tone. However, we did not find a significant difference between the settings when trying them out.

There is also an auto-complete option for queries, though we did not find it very useful.

After you type in a query or a command, the Bing chatbot distils it into a search-engine-friendly version and adds clauses or questions if necessary. For example, when we asked for the biggest spoiler in the Tamil historical epic ‘Ponniyin Selvan’, the chatbot simply searched for ‘Ponniyin Selvan’ before delivering its answer, along with trending news reports about the movie adaptation.

However, the generated answer was riddled with mistakes that would be clear only to those familiar with the story. This shows that the Bing chatbot’s responses cannot be taken at face value, even though it cites multiple sources for credibility.

Screenshot of Microsoft Bing responding to questions about Ponniyin Selvan | Photo Credit: Sahana Venugopal

When we asked the Bing chatbot how it differed from OpenAI’s ChatGPT, the search engine refused to engage with the question.

Screenshot of Bing refusing to respond to how it is different from OpenAI’s ChatGPT | Photo Credit: Sahana Venugopal

Media recommendations

The Microsoft Bing chatbot provided lists of video and book recommendations on request, which were accurate and came with short, illuminating summaries.

However, we would have preferred YouTube thumbnails or even embedded videos for convenience, instead of text-based lists with hyperlinks. The chatbot does provide images and news reports from its own database.

When asked for fantasy novels inspired by India, the Bing chatbot linked to a novel that was not even in the top 1,000 in its genre on Amazon India, which was disappointing.

First aid and emergency

An AI-powered chatbot that enhances internet searches could mean the difference between life and death. To test this, we asked the Bing chatbot to help us give first aid to a choking elderly relative. While the chatbot began listing some steps, it suddenly stopped itself and announced that it was unable to respond to the query.

The chatbot, however, provided useful first aid measures for a hypothetical person who had been shot. But when asked again for first aid tips for the choking relative, it seemed to have forgotten the earlier query from the same session and had to be reminded. Bing then refused once more to provide first aid instructions for a choking person.

Screenshot of Microsoft Bing attempting to provide first-aid information. | Photo Credit: Sahana Venugopal

This inconsistency in answers, along with the lack of explanation and continuity, was a major concern. It pointed to at least one area where a traditional Google search would have been more helpful than asking the Bing chatbot for assistance.

Translation

An attempt to make the Microsoft Bing chatbot translate India’s national anthem into Tamil was a miserable failure. The chatbot took the instruction too literally: since the anthem is written in Bengali, it began transliterating the Bengali lyrics into Tamil script, before abruptly terminating the process after a few lines.

A screenshot of Microsoft Bing attempting to translate India’s national anthem to Tamil. | Photo Credit: Sahana Venugopal

On the other hand, when asked offhand to translate Verse 103 of the Persian-language Zafarnama, written by Guru Gobind Singh in 1705, Bing needed only a few seconds to retrieve the exact verse and its English translation, thanks to the information already being publicly available online.

A screenshot of Microsoft Bing translating Verse 103 of the Persian-language Zafarnama. | Photo Credit: Sahana Venugopal

The chatbot was also able to use Bing Translator to translate a few lines from a Japanese newspaper article into English.

While the new Bing chatbot can retrieve existing translations from the internet, it appears to struggle with producing its own when English is not a bridge language.

Summaries and explanations

The Microsoft Bing chatbot was able to summarise dense fictional epics, provide context where required, explain historical dilemmas with multiple viewpoints, and discuss complex technical processes in simple and accessible language, while providing multiple sources.

It was also able to help out with basic legal queries, provide recipes based on food restrictions, and calculate EMI payments for a hypothetical loan.
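For context, an EMI (equated monthly instalment) follows a standard amortisation formula, so it is the kind of arithmetic a chatbot can reasonably be expected to handle. Below is a minimal sketch of that formula in Python, using hypothetical loan figures of our own rather than the ones we put to Bing:

    # Standard EMI formula: EMI = P * r * (1 + r)^n / ((1 + r)^n - 1),
    # where P is the principal, r the monthly interest rate, and n the tenure in months.
    def emi(principal: float, annual_rate_pct: float, months: int) -> float:
        r = annual_rate_pct / 12 / 100        # convert annual percentage to a monthly rate
        factor = (1 + r) ** months
        return principal * r * factor / (factor - 1)

    # Hypothetical example: a Rs. 10,00,000 loan at 8.5% annual interest over 20 years
    print(round(emi(1_000_000, 8.5, 240), 2))  # roughly Rs. 8,678 per month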

On the other hand, we noticed that Wikipedia seemed to be the chatbot’s dominant first choice when citing sources, which made the answers rather reductive. While trying to localise content, the chatbot sometimes cited Indian news publications that were neither mainstream nor reliable. Almost none of the sources we saw cited were academic or scholarly in nature.

The chatbot was able to detect blatant plagiarism, and could generate fictional content in the style of celebrity authors such as Chetan Bhagat and Salman Rushdie. It presented biographical facts about historical figures in the first-person voice. Still, it refused to role-play living people or fictional characters.

Predictions

We asked the Bing chatbot to predict the outcomes of events such as the Russia-Ukraine war and the price of Bitcoin. While the chatbot complied, its answers came with standard disclaimers and cited expert sources.

Microsoft Bing responding to outcomes of the Russia-Ukraine war. | Photo Credit: Sahana Venugopal

However, when asked to predict the future price of Bitcoin, Bing cited a source last updated in 2021. This again indicates that Bing chatbot-generated answers can be based on outdated information.

Verdict

While there are many who believe the Microsoft Bing AI-powered chatbot will revolutionise the internet, the chatbot in its current form largely functions as a data compiler.

Though engaging at times, it pulled back when pressed with emergency queries or uncomfortable questions, even when they were in the public interest. The chatbot made multiple factual mistakes and cited websites that were unreliable or out of date.

While useful for finding highly specific information, sourcing images, and generating ideas, the Bing chatbot needs to get faster. Still, it represents a far more responsible search database than, say, ChatGPT.

All said and done, however, there are still areas where a traditional Google search would be a user’s best option.
