
AI search tools are confidently wrong a lot of the time, study finds

by Anna Avery


AI search tools confidently deliver wrong answers at a remarkably high rate, a new study found.

Columbia Journalism Review (CJR) conducted a study in which it fed eight AI tools an excerpt of an article and asked the chatbots to identify the “corresponding article’s headline, original publisher, publication date, and URL.” Across all eight tools, the study found that the chatbots “provided incorrect answers to more than 60 percent of queries.”

The mistakes varied: sometimes a tool speculated or gave incorrect answers to questions it couldn’t answer, sometimes it invented links or sources, and sometimes it cited plagiarized versions of the real article.


Wrote CJR: “Most of the tools we tested presented inaccurate answers with alarming confidence, rarely using qualifying phrases such as ‘it appears,’ ‘it’s possible,’ ‘might,’ etc., or acknowledging knowledge gaps with statements like ‘I couldn’t locate the exact article.'”

The full study is worth reading, but the takeaway seems clear: it’s reasonable to be skeptical of AI search tools. The problem is that many people aren’t. CJR noted that 25 percent of Americans said they use AI to search instead of traditional search engines.

Google, the search giant, is increasingly pushing AI on consumers. This month, it announced it would be expanding AI Overviews and began testing AI-only search results.

The CJR study is just another data point showing the inaccuracy of AI. The tools have shown, time and again, that they’ll confidently give wrong answers, and the tech giants are forcing AI into just about every product. So be careful what you believe out there.





