November 23, 2024

Voices: Colleen Ballinger controversy: Welcome to the internet – where everything is going wrong


YouTuber Colleen Ballinger, known on the platform as Miranda Sings (Getty Images)

I don’t want to sound like an old person but: kids today. Am I right? With their internet, and their phones, and their… fidget spinners (are they still around?).

I know it’s the duty of every generation to have a moral panic about the way the subsequent generation chooses to entertain itself. In the Eighties they said Dungeons and Dragons was causing kids to worship Satan. In the 2000s they said the same about Harry Potter. When I was growing up, the grown-ups were convinced that Call of Duty was going to turn me and my friends into serial killers. Ridiculous (I carry out my murders in the name of Crash Bandicoot).

So bear in mind that I’m fully aware of the potential hypocrisy of what I’m about to say: I think that children’s entertainment has taken a very dangerous turn in the past few years, and is only primed to get worse.

Specifically, I’m talking about the overreliance on content made for children by unvetted, unregulated self-styled “entertainers” on platforms such as YouTube. It feels like every week a new story emerges about one of these people that casts them in a deeply unflattering light.

Most recently, it’s the disturbing allegations made about YouTuber Colleen Ballinger, known on the platform as Miranda Sings. Ballinger, whose content is squarely aimed at younger viewers, has been accused of inappropriate behaviour with her fans. The accusations include creating a private Twitter chat with several underage followers, in which she allegedly made sexually charged remarks and revealed inappropriate personal details about herself and her relationships, and sending one fan her bra and underwear (a charge to which Ballinger has previously admitted).

I’m sure every generation of adults thinks that their paranoia about emergent trends and technologies is the exception to the rule, but our parents never had to deal with anything as far-reaching and largely unregulated as the internet. We as a society have become far too comfortable with the idea of letting our children run loose in an online ecosystem populated with content made by strangers whose only unifying trait is that they’ve decided to make a living by vying for the attention of minors.


As a business model it makes sense – internet revenue is largely generated via engagement, and in an online ecosystem where 99 per cent of people are anonymous, a child’s clicks are just as valuable to creators as those of an adult. It’s also a lot easier to make content for a userbase that isn’t exactly known for its discerning standards. Why write a script when you can just play Fortnite and scream into a microphone?

But online content isn’t the same as traditional forms of entertainment. Whereas watching telly is a very passive experience, online media allows – even expects – its consumers to interact with creators on a level that I still find shocking. They can leave comments in real time. They can vote on polls. Creators can even go back into their old content and see which parts of their videos did better than others, down to the second. Does your viewership spike every time you talk about ponies? Well guess what: you’re the pony channel now. Sorry, I don’t make the rules. Ten thousand anonymous eight-year-olds do.

And what happens when the content that children respond to is dangerous? I’ve written before about deliberately misleading online content, including DIY and cooking “hack” videos that encourage children to engage in dangerous behaviour involving tools, chemicals and kitchenware. Most of this content isn’t made by people with malicious intent; it’s made by people who’ve been told by an algorithm: “Hey, do you know what your viewers really love? Fire and electricity and being lied to.”

Remember a few years back, when some of the most popular channels on YouTube were the ones making baffling videos in which licensed Disney and Marvel characters would act out medical procedures and light fetish content? “Elsagate” was an online phenomenon in which adult content was pushed on kids who were left to scroll through YouTube at their leisure: creators made videos that had all the hallmarks of child-friendly content, and so could trick inattentive parents, but that dealt with themes bordering on traumatic for children. It was sinister, it was depraved, and judging by recent events we learned precisely nothing from it.

It isn’t just the kids who are at risk, though. Because of the heightened level of back-and-forth creators have with their fan communities, they put themselves in an enormous amount of potential danger as well. Take the Ballinger situation, for example: even if the allegations against her turn out to be one big misunderstanding, she isn’t going to be able to shake the “groomer” label, likely for the rest of her life. Once you get a reputation like that, even full exoneration doesn’t make a dent.

This is what happens when you don’t have an official organisation to back you up, or media training, or experienced advisers, or lawyers to tell you what’s appropriate and what isn’t when catering to children. You can’t just improvise when it comes to interacting with minors.

Discussion around government regulation is tricky enough, and it becomes borderline impossible when talking about something as large and decentralised as the internet. But when we’re talking about the safety of children, we see time and time again why we can’t just leave things up to chance.

Call it a moral panic if you want. Call me out of touch. Maybe I’m wrong, and there’s actually a very good reason we should leave the safety of our children in the hands of any stranger with an internet connection and a webcam. I’ve yet to hear a compelling one, though.
