Survey: Majority of Users Want Transparency in AI-Generated Content

[Image: AI-generated robot and human hands]

AI has reached a level of proficiency where it can effortlessly produce images, music, and text that rival work created by skilled humans, and this has driven a major shift in the landscape of online content during the 2020s. One contributing factor is the rapid growth of the generative chatbot ChatGPT, which reached its first one million users within just five days of launch.

But as people increasingly rely on tools like ChatGPT, two questions arise: do readers trust AI-written content, and should publishers disclose when their work is made with AI tools?

Methodology

To gain more insight into this topic, Tooltester, a company whose experts review digital platform tools, surveyed over 1,900 Americans to assess their perceptions of AI-generated content online and how it affects their trust in brands. Of these respondents, 57.1% had tried AI tools at least once, 41.1% had heard of them but never used them personally, and 1.8% had never heard of generative AI tools.

The survey consisted of 25 questions, each with three answer versions: one produced by an AI (ChatGPT), one written by a human journalist, and one created by AI and then reviewed and edited by a professional copywriter. Answers were randomly assigned, so each respondent saw only one version per question.
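For the technically curious, that randomization can be pictured with a short sketch. It is purely illustrative and assumes nothing about Tooltester's actual setup; the variant names and the assign_answers helper below are invented for this example.

import random

# Illustrative only: show each respondent one randomly chosen version of the
# answer to every question, so nobody sees more than one variant per question.
VARIANTS = ["ai_only", "human_journalist", "ai_edited_by_copywriter"]

def assign_answers(num_questions=25, rng=None):
    """Pick one answer variant per question for a single respondent."""
    rng = rng or random.Random()
    return {question: rng.choice(VARIANTS) for question in range(1, num_questions + 1)}

# Example: a reproducible assignment for one respondent
print(assign_answers(rng=random.Random(42)))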

The Results

Tooltester also wanted to know how readers feel when websites, blogs, newspapers, or magazines publish AI-generated content without informing them.

A majority of Tooltester’s respondents, 80.5%, said that websites should disclose their use of AI tools so that readers are aware of it. And when asked whether they would trust an online publisher less if it used AI for content without telling readers, 71.3% said they would trust the brand less. The remaining 28.7% said it would not change their trust in the brand and that there is no need to indicate where online content comes from.

The question about the effect of disclosure itself showed a similar pattern: 67.8% of respondents said they trust a brand more when a website discloses that AI was used in its content, while 32.2% said such disclosures have neither a positive nor a negative impact on them.

Similar Reports

[Image: AI robot with a laptop looking at the camera]

OnePulse produced a similar report. Its survey on AI-generated content, conducted for TechRadar Pro, asked 1,000 participants about their exposure to such content and their perception of it. The results showed that more than half of the respondents want AI-generated content, such as news, reviews, and features, to be explicitly marked as such. Additionally, a third of the participants were unsure whether they had encountered AI-produced content at all.

The survey also found that one-third of respondents reported encountering AI-generated content daily, while a fifth encountered it monthly; only 8.7% claimed not to have come across such content recently. When asked how they define AI-generated content, most said it is content produced entirely or significantly by artificial intelligence (43.1% and 47.4%, respectively), with only 9.5% saying content with minimal AI input would qualify.

Given how much AI-generated content people are exposed to, it’s understandable that they want to know where content comes from, especially when popular AI tools like ChatGPT can write in a variety of styles that are difficult to distinguish from human-crafted work.

Final Thoughts

The Tooltester and OnePulse surveys suggest that most respondents believe websites should disclose their use of AI tools in their content, and that failing to do so can damage trust in a brand. At the same time, both surveys indicate that users are generally comfortable with AI-generated content as long as they know its source.

In an age of spreading misinformation, it’s crucial to check the legitimacy and veracity of sources. These findings should prompt publishers to disclose their use of AI tools, since readers tend to reward that transparency with greater trust.

 

Sources:
Image Sources from Pexels.com: Image 1 and Image 2
Tooltester
TechRadar Pro

AI-PRO Team

AI-PRO is your go-to source for all things AI. We're a group of tech-savvy professionals passionate about making artificial intelligence accessible to everyone. Visit our website for resources, tools, and learning guides to help you navigate the exciting world of AI.
