Google’s AI-powered chatbot made a factual error during its first public demonstration. But to be fair, it is not the only one. Independent AI researcher Dmitri Brereton found that early demonstrations of Microsoft’s Bing AI were also riddled with errors in financial and other data.
Last week, Microsoft confidently demonstrated the capabilities of its Bing AI. The search engine handled tasks such as identifying the pros and cons of top-selling pet vacuums, planning a 5-day trip to Mexico City, and comparing data in financial reports. But Bing failed to tell the difference between a corded and cordless vacuum cleaner, omitted important details for the bars it linked to in Mexico City, and distorted financial data.
In one of Microsoft’s AI demos, Bing tried to summarize Gap’s financial report for the third quarter of 2022 and got it wrong in several ways. The report states that gross margin was 37.4%, and that adjusted gross margin, excluding impairment, was 38.7%. Bing incorrectly presented the 37.4% figure as the gross margin adjusted for impairment charges. Bing then said Gap’s operating margin was 5.9%, a figure that appears nowhere in the report: the actual operating margin was 4.6%, or 3.9% adjusted for impairment. Bing went on to compare Gap’s financials with Lululemon’s Q3 2022 results, made further mistakes when processing Lululemon’s data, and produced a comparison rife with inaccuracies as a result.
Brereton also points out a glaring error in the pros and cons of the top-selling pet vacuums. Bing calls one of them the “Bissell Pet Hair Eraser Handheld Vacuum” and lists its short 16-foot cord as a downside.
“It doesn’t have a cord,” says Brereton. “It’s a portable handheld vacuum cleaner.”
There are actually two versions of this vacuum cleaner: corded and cordless. The HGTV article that Bing cites as its source links to the cordless version. Bing appears to draw on multiple data sources without listing them in full, blending the two versions of the vacuum cleaner into a single answer.
Bing’s AI errors aren’t limited to demos. Now that thousands of people are accessing the AI-powered search engine, it is making more obvious mistakes. In an exchange posted on Reddit, Bing AI insisted that the current year is 2022.
“Sorry, but today is not 2023. Today is the year 2022,” Bing AI declares.
When a user pointed out that their phone showed 2023, Bing suggested checking the settings and making sure the phone didn’t have a “virus or bug that distorts the date.” Bing AI has also confidently and incorrectly stated that “Croatia will leave the EU in 2022,” and has repeated racial slurs to users. Microsoft is aware of some of these bugs and is working to fix them.
Source: The Verge