By AJ Dellinger
February 22, 2023 at 5:00 am ET
This article is part of a series on generative artificial intelligence, aimed at creating a foundational understanding of consumer attitudes on the emerging technology.
Read more of our coverage: AI in Search | Interest in Other AI Applications | Who’s Using Generative AI
The introduction earlier this month of artificial intelligence chatbots from Microsoft Corp. and Alphabet Inc.’s Google — both designed to integrate into traditional search engines and provide conversational-style responses to search queries — marked the latest step in the mainstream adoption of generative AI. Despite initial fanfare, including OpenAI’s ChatGPT reaching more than 100 million monthly users in just two months, most of the public has some concerns about AI, both in search engines and more broadly, according to a Morning Consult survey.
In the first week that Microsoft’s AI-powered Bing search engine was made available to a limited portion of the public, users reportedly gave the AI-produced answers a “thumbs up” response 71% of the time, according to the company. That reception may help quell worries about result accuracy and misinformation, which rank among adults’ top concerns regarding search engines that use AI. Still, the fact that both Bing and Google’s chatbot service Bard displayed incorrect results during demos highlights why users may be worried.
Those who carried on extended conversations with the AI-powered Bing eventually ran into strange and often disconcerting responses, including threatening remarks. These experiences are unlikely to comfort users: 63% of adults said they worry that AI will encourage harmful behavior, and nearly 2 in 3 are concerned about AI applications’ potential to learn to function independently of humans.
Experiences of conversational AI tools “hallucinating” — a term for when an AI system responds in unpredictable ways or with convincing but made-up information — have led some people to believe the chatbots are sentient, though this isn’t the case. Microsoft announced it would introduce new limits on its Bing chatbot to prevent these kinds of potentially upsetting interactions.
The Feb. 17-19, 2023, survey was conducted among a representative sample of 2,205 U.S. adults, with an unweighted margin of error of plus or minus 2 percentage points.
AJ Dellinger is a data reporter at Morning Consult covering tech. @ajdell