Introduction to live audio social networks and misinformation

What does the growth of live, audio-based social networks mean for the detection and moderation of misinformation? We reviewed the platforms and their moderation policies and offer some key findings for journalists and disinformation researchers. (Updated January 31, 2022)

Audio misinformation comes in many forms

One of the challenges of tracking audio misinformation on social media is that audio is easily recorded, remixed and transcribed. One piece of misinformation that went viral during the 2020 U.S. election was a recording of a training session for poll workers in Detroit. There was nothing evidently nefarious in the recording itself, but it was edited with ominous music, overlaid with misleading captions and dubbed #DetroitLeaks. In 2018, First Draft organized a collaborative project with Brazilian journalists to track misinformation spreading on WhatsApp ahead of the country’s presidential election. Over a 12-week period, 4,831 of the 78,462 tips received were audio files, many of which contained false allegations of election fraud. Also popular were transcriptions of misleading audio and video, one example of which was reported more than 200 times.

What all these cases had in common was that they were extremely difficult to track and verify. Live audio chats, on Clubhouse and its competitors alike, pose the same challenges. But because they are live, they also invite deliberate, real-time harassment, and they are even more ephemeral, often disappearing as soon as the conversation ends. So how can disinformation researchers track this content, and how are platforms developing policies around it? Several key themes emerge.

Live audio moderation is time-consuming

We have written about how audio and video platforms have managed to escape scrutiny over misinformation, even though they are a significant part of the problem. One reason is that audiovisual content takes longer to consume and study. It is also harder to moderate automatically: most content moderation technology relies on text, or on visual cues that have previously been flagged as problematic, and live audio provides neither.
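To make that contrast concrete, here is a minimal, purely illustrative sketch of the kind of text-matching check that automated moderation pipelines typically perform. The flagged-phrase list and function name are hypothetical, not any platform's actual system; the point is that recorded audio can be transcribed and then scanned, while a live room produces no text to scan until the conversation is over, if a recording exists at all.

```python
# Hypothetical sketch of a text-based moderation filter, and why live
# audio defeats it. The phrase list below is an illustrative assumption,
# not any platform's real blocklist.

FLAGGED_PHRASES = [
    "ballots were shredded",   # example of a previously debunked claim
    "poll workers were coached",
]

def flag_transcript(transcript: str) -> list[str]:
    """Return the previously flagged phrases found in a piece of text."""
    text = transcript.lower()
    return [phrase for phrase in FLAGGED_PHRASES if phrase in text]

# Recorded media can be transcribed and then checked against known claims...
recorded = "They claimed the ballots were shredded overnight."
print(flag_transcript(recorded))  # -> ['ballots were shredded']

# ...but a live audio room yields no transcript to check in real time.
live_room_transcript = None  # nothing for the filter to scan as people speak
```

The sketch also shows why such filters are brittle: they only catch claims that have already been flagged in text form, which is exactly what ephemeral live audio never supplies.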

It’s no surprise, then, that Clubhouse and Twitter Spaces rely on users flagging potentially offensive conversations as their primary form of moderation (see below for specific policies), shifting the burden onto the people targeted by the content. On Facebook (now known as Meta), listeners in an audio room can likewise report potential violations of the community standards.

Ethics and privacy

Part of the appeal of live audio chats is that they feel intimate. This raises questions of consent and privacy for researchers and journalists who may want to use these spaces to gather news or track misinformation. Clubhouse room creators can record chats; Facebook allows audio room owners to publish a recording after the room has ended.

Regardless of a platform’s official policy, journalists and researchers should think carefully about whether to listen in on, or use material from, conversations in these spaces. Participants may not know who is in the room or that their words could be published. Journalists should also ask whether misinformation they hear on these platforms merits coverage: Has the falsehood already spread so widely that reporting on it would not amplify it to a new audience? Audience size is often the only indicator available, since there are no shares, likes or comments to help gauge the reaction to, and reach of, a conversation.

Platforms and what we know about their moderation policy

Clubhouse

  • How it works: Launched in March 2020, Clubhouse is the app many credit with the rise of live audio on social media. Speakers broadcast live from “rooms” to listeners, who can enter and leave those rooms freely.
  • Moderation policy: Part of the reason Clubhouse attracted so much attention soon after launch was its laissez-faire attitude toward moderation. It has published community guidelines that emphasize the role of users in moderating and flagging content.

Facebook Live Audio Rooms

  • How it works: In public groups, anyone can join a Live Audio Room; in private groups, only members can. Hosts can monetize a room by allowing users to send Stars (paid tokens of appreciation) or donations.
  • Moderation policy: Room creators can remove unwanted participants.

Twitter spaces

  • How it works: The feature offers live audio chats that are publicly available to listeners. Up to 13 people (including the host and two co-hosts) can speak at once; there is no limit on the number of listeners.
  • Moderation policy: If users believe a Space violates Twitter’s rules, they can report the Space or any account in it, according to Twitter’s FAQ.

Reddit Talk

  • How it works: At the time of writing, the feature is still in beta. It lets users hold live audio conversations in Reddit communities. For now, only community moderators can start a talk, but anyone can join to listen. Hosts can grant listeners permission to speak.
  • Moderation policy: “Hosts can invite speakers, mute them, remove redditors from the talk and end the talk,” Reddit says. Community moderators have the same privileges, along with the ability to start talks and ban community members.

Discord Stage Channels

  • How it works: Discord is a chat app aimed at gamers that lets them find each other and talk while playing. It supports video calls, voice chat and text. In 2021 it introduced “Stage Channels,” a Clubhouse-like feature that lets users broadcast live conversations to a room of listeners.
  • Moderation policy: The platform’s community guidelines, last updated in May 2020, read: “If you come across a message that violates these guidelines, please let us know. We may take a number of steps, including issuing a warning, removing content, or removing the accounts and/or servers responsible.”


Spoon

  • How it works: Spoon, a platform exclusively for live audio, has been around since 2016. It lets users create and broadcast live shows that listeners can tune in to and participate in.
  • Moderation policy: Spoon details its community guidelines here. Its guidance on profanity gives an example of how violations are handled: “Some language is not appropriate for users under the age of 18. We reserve the right to take this into account when deciding whether to restrict or remove content, including ending Lives or requesting a name change.”


Quilt

  • How it works: Quilt is similar to Clubhouse, but with an emphasis on self-care.
  • Moderation policy: Its community guidelines read: “As a host, you have the ability to mute other participants or move speakers back to the listening section in the event anything occurs that detracts from the group’s experience.” Users are encouraged to report any behavior that violates the guidelines to the Quilt team, after which the platform “may remove harmful content or disable accounts or hosting privileges if we are notified.”

Stay up to date with First Draft’s work by following us on Facebook and Twitter.
