How caregivers can protect children from the vicious trolls that infiltrate virtual games and chats.
Extremists targeted my 12-year-old son online.
He was playing a virtual game with friends over the summer when another child let in a user the group had never played with before. That account then brought in other users, and a few days later they launched a tirade of harassment, flooding the chat with anti-Semitic vitriol, swastikas and neo-Nazi propaganda.
When my son pushed back, they bombarded him with aggressive and hateful messages. As soon as we blocked and reported an abusive account, another disturbing message appeared within seconds in a seemingly coordinated attack.
My son and I had already discussed what to do if he was ever targeted online or witnessed harassment, so we were able to react quickly, but his experience is not unusual.
Hate speech and online abuse have been ubiquitous in digital spaces for many years, but the use of gaming and messaging platforms by extremists and the alt-right to target young users is increasing as more and more children play online. A 2017 Pew study found that 90 percent of teenagers now use gaming platforms, and a 2019 Common Sense Media poll found that 64 percent of tweens ages 8 to 12 play games online.
"Extremists are increasingly moving into play spaces and targeting a young audience," said Mark Potok, an expert on domestic hate groups and a former senior researcher at the Southern Poverty Law Center. "This type of access is what they've wanted for years."
Online hate speech has also increased during the pandemic as digital activity has skyrocketed, according to a report published by L1GHT, a tech company that identifies toxic speech online.
"These aren't white-hooded men on the street anymore," said Laura Guy, a clinical social worker in New York who works with children who have been targeted online. "They don't always start with overtly hateful language. Oftentimes they try to engage young people with dark, edgy humor and provocative jokes."
Caregivers can use privacy settings as a first line of defense against online harassment or recruitment, but extremists find workarounds to gain access to children. For example, they create deceptive or fake accounts to trick children and their friends into accepting friend requests or joining their games.
On Discord - a messaging platform popular with gamers, where players can chat while playing - extremists have espoused hatred and created servers glorifying Nazism. Users can organize "raids" that encourage their members to bombard another server with hateful messages.
Children may unknowingly let extremists in if they post their server link on Disboard (a site not affiliated with Discord where people can search for Discord servers). Extremists can use a publicly posted link to infiltrate a server.
Hate groups don't use video games only to recruit members; the games have also become a prime space for harassing children. "If you're not one of them, you're an enemy, and they like to try and make people miserable," Mr. Potok said.
A 2020 Anti-Defamation League survey found that 68 percent of online gamers had experienced severe harassment. Fifty-three percent of those surveyed said they had been harassed because of "race/ethnicity, religion, ability, gender or sexual orientation," and 51 percent had received threats of violence.
Mr. Potok said online abuse is a problem any child can encounter, but marginalized groups are especially at risk.
Lydia Elle, an African American business owner and writer in California, said her 11-year-old daughter started playing the online game Roblox during the pandemic to connect with school friends. Her daughter used an avatar of an African American girl on her profile and was quickly targeted, her mother said. "Vicious racists quickly picked up on her and called her horrible names," Ms. Elle said.
Some platforms say they are taking action to counter extremism, such as using artificial intelligence to detect offensive content and increasing moderation, but many users say tech companies took far too long to address harassment, and that it has not abated.
Representatives for Discord and Roblox each said their platforms have zero tolerance for hate speech and "violent extremism."
Discord uses "a mix of proactive and reactive tools to keep activity that violates our policies off the service," the company said in a statement. These include automated search tools such as PhotoDNA and ways for users to report violations. Roblox said it uses a "combination of machine learning and a team of over 3,000 people" to detect inappropriate content.
Lori Getz, an internet safety expert and author of "The Tech Savvy User's Guide to the Digital World," said that caregivers cannot control everything children are exposed to, but they can empower children to deal with difficult situations online. Here's how:
Start the conversation early.
Talk with children in age-appropriate ways about hate - including overt and hidden signs, such as words, symbols and images - and teach them to trust their instincts if something doesn't seem right. "If caregivers don't talk about these things with their children, someone else will, and that may not be a credible source," Ms. Guy said.
If children are bullied online, make sure they are supported, said Robyn Silverman, a child and adolescent development specialist. Online abuse should be taken just as seriously as other types of abuse, she said, noting that targeted children and adolescents "can suffer from anxiety, depression, sleep disturbances, upset stomach and other physical symptoms related to cyber abuse."
Maintaining an ongoing, open dialogue about online safety is crucial. Even if children are not allowed to play certain games at home, they may be exposed to them elsewhere. A British survey of 20,000 children ages 11 to 18 found that 57 percent reported having accounts that "adults don't know" about.
Children may hide information from caregivers, especially if they are targeted online, for fear of losing access to their games, Dr. Silverman said. "Tell your children that they'll be O.K. if they come to you about it," she suggested. "Tell them you are there to support them."
Check content and settings.
Review the online content and accounts your children interact with, as well as privacy settings and parental controls. Be transparent, so your children know you will be checking.
Make sure your child knows what to do if they are targeted. To start, they should talk to a trusted adult who can help. It is important to take screenshots of the comments, block offending users, leave the game or chat, and report abusive accounts.
Reporting procedures vary by platform - discuss with your child how to submit a report before a problem arises.
Reporting can feel futile when platforms are slow to respond, but experts say it is important. "Reporting helps children feel empowered, like they have done something," Ms. Getz said. Choosing not to report also means that hateful accounts are unlikely to be taken down.
Caregivers should report threats of violence to law enforcement.
Encourage your child to speak up.
Children who witness online abuse can help by making it clear that they will not be bystanders to hate, Ms. Getz said.
Ms. Elle's daughter and her friends have a plan: If one of them is attacked, they screenshot all the comments and report the account. "Opposing hatred is not just up to the targeted person," Ms. Elle said. "Being an ally can really make a difference when someone is targeted."
Speaking out against hatred is important, Ms. Getz said, but getting drawn into a prolonged, toxic exchange can be traumatic for children and give extremists more attention.
Ms. Getz recommended that when children and teens witness hate online, they respond with one clearly worded message: "Tell them what they're doing is wrong and that you won't be part of it," then disengage and report the account.