A school principal in Stoke-on-Trent told me that she must not only guarantee her students a COVID-safe return to school in September, but also reassure parents that their children will not be forcibly removed and isolated in a secret location if they start coughing in class.
The principal is repeatedly sent a Facebook post warning parents about the supposed threat posed by the UK Coronavirus Act. “Is that true? Can you take my child?” she is asked.
The Facebook post these parents saw went viral in mid-August. It is one of several similar posts in the UK and Australia, and it follows a template common to many posts related to the QAnon conspiracy theory.
This template often includes a direct appeal to parents, an invitation for readers to do their own research to “prove” the truth of the claim, and an appeal to defend the rights of individuals against big government, elites, or some undefined “them”.
Despite being quickly fact-checked and marked as false, this and related posts using the hashtag #SaveTheChildren are still circulating, and the phrase “covid act 2020 kids in school” still appears as an autofill suggestion when you search for “covid act” on Google.
The power of memes
For the past five years I have been researching how strangers talk politics with each other on Facebook. I focused on four English constituencies – Stoke-on-Trent Central, Burton and Uttoxeter, Bristol West and Brighton Pavilion – and followed conversations on public pages and posts, and the public information on people’s timelines and profiles.
Through the 2015, 2017 and 2019 UK general elections, I watched these Facebook conversations become increasingly polarized, and with that came growing incivility, partisanship and sectarianism. I was struck by the increased use of memes and by how a handful of core topics found their way from meme to belief. During the 2019 election, I noticed memes posted and disseminated by far-right US Facebook pages appearing among people in the UK constituencies I studied.
I recently decided to investigate how the upcoming US elections might affect partisan ideas on Facebook in the UK. I chose to focus on one meme and the individual Facebook users who cared enough about the topic to share it or comment publicly – and to see where it took me.
In late August, after a seven-month hiatus, I returned to Facebook and selected the meme at the top of my timeline – a post by the Migration Watch group, shared by UKIP Brighton & Hove. This has consistently been one of the most active meme seeders among the constituency Facebook groups I follow.
In the last election I had found links between the active seeding of anti-migrant and anti-immigration memes by British users and far-right organizations and individuals in the US, so I expected to find similar links through this meme. What I did not expect was that the meme would lead me to British mothers and grandmothers engaging with QAnon conspiracy theories from the US.
Of the 45 people who commented on this Migration Watch meme shared by Brighton & Hove UKIP, 27 were women, and most, as far as I could tell from their profiles, were middle-aged grandmothers. When I looked at what other content these women were sharing, I found memes about animal cruelty, protests against Black Lives Matter, opposition to the BBC Proms, and pro-Brexit content.
Some of the women were also concerned about the threat to “our” children from pedophile rings. In doing so, they demonstrated the next level of political meme sharing: freely mixing content from the UK and the US.
For one woman, that meant sharing conspiracy theories from Mama Wolf, one of the Facebook accounts that distributes QAnon content. One post, titled “Frequent flyers of Epstein Island”, made a plethora of unsubstantiated allegations linking Hillary Clinton, Oprah Winfrey, Bill Gates, Madonna, the Queen and other (mostly Black or Jewish) “elites” to the late Jeffrey Epstein and a global child trafficking network, alongside claims about blood rituals and coded messages supposedly hidden in Trump’s press conferences about his plans to save the children.
I found that one of the same Facebook users who shared the Migration Watch meme also shared a post asking people to flood the BBC’s Facebook page with the hashtag #saveourchildren on August 25. “They are not going to cover child trafficking, so we are going to bring it to them. It is time to fix this,” the meme said.
The bubble communities we live in on Facebook shield us from views that differ from our own, while at the same time making it easier to reinforce, amplify – even entrench – ever more radical positions.
Facebook encourages pools of like-minded people, whether through an architecture that promotes what activist Eli Pariser calls “filter bubbles”, or through what the psychologist Daniel Kahneman called “cognitive ease” – our tendency to believe ideas that are familiar, comfortable and easy to accept, and to avoid ideas that would take effort to accept. It is also possible to game Facebook’s algorithms to manipulate public opinion, as the investigative work of journalists such as Carole Cadwalladr and Craig Silverman has shown.
However, merely seeing a radical meme is not enough to trigger more of the same content. It is how we interact with the content that matters to Facebook. The depth of engagement required to comment on and then share a political idea triggers more of it, and can lead the user down a path of increasing radicalization.
A slightly racist grandmother can quickly be groomed into adopting more radical views. A fellow mother can become convinced by conspiracy theories about the Coronavirus Act or Epstein’s island. And this can lead to thousands of protesters marching against mask-wearing in London in late August, in defence of a “truth” only they have been shown.
It can be tempting to dismiss the anti-mask protesters, or the groups marching to Buckingham Palace to #SaveOurChildren, as a few thousand cranks in a sea of sensible people. But we do not know the size of the iceberg: beneath each visible demonstrator there may be thousands of partial believers, including an unknown number of grandmothers helping QAnon grow.
To learn more about the history of conspiracy theories, how they spread, and how dangerous they are, listen to our Expert guide to conspiracy theories, a series from The Conversation’s Anthill podcast. Listen on Apple Podcasts or Spotify, or search for The Anthill wherever you get your podcasts.