
(RightWing.org) – The advent of the Internet and social media networking has helped family and friends stay in contact with each other regardless of where in the world they might live. Unfortunately, it has also produced a dark underbelly that unscrupulous actors use to spread fake news and vile rhetoric, as well as cater to people’s sexual appetites, no matter how depraved those might be. Now one senator wants to hold these networks accountable.
Child Exploitation
Children are among the most vulnerable members of society, and federal law requires, as basic decency demands, that those in charge of these platforms, such as Mark Zuckerberg (CEO of Meta, the parent company of Facebook and Instagram) and Elon Musk (CEO of Twitter), do everything within their power to stop the spread of child pornography.
Senator Ted Cruz (R-TX) sent a letter to Zuckerberg on June 12 after the Wall Street Journal (WSJ) published an exclusive story detailing how one of his companies facilitates the exchange of child sexual abuse material (CSAM); in the letter, Cruz expressed just how “appalled” he was.
One problem from the WSJ article that Cruz wants Zuckerberg to address was the report that Instagram “has permitted users to search for terms that its own algorithms know may be associated with illegal material.” While the platform obviously cannot stop people from entering search terms, the fact that a warning box appeared offering two options, including “see results anyway,” was of grave concern to the senator. After the WSJ authors questioned the disturbing option, it was quietly removed, and Instagram refused to say why it was there in the first place.
Self-Generated Materials
The WSJ article also cited a report by the Stanford Internet Observatory (SIO), part of the university’s Cyber Policy Center, which studied CSAM with a specific focus on the subgenre of material self-generated by minors themselves (SG-CSAM).
The SIO stated that, to stay within the laws protecting minors from exposure and to safeguard the mental health of its researchers, it used several automated platforms to identify problematic images; the metadata attached to those images was then forwarded to the National Center for Missing and Exploited Children (NCMEC). According to the published report, researchers found 405 accounts on Instagram and 128 on Twitter offering SG-CSAM materials.
The report notes that in many cases, the minors in question have taken intimate pictures of themselves to share with “a romantic partner,” but the images could be shared by third-party friends or posted to a more public location after a breakup, a.k.a. revenge porn.
Another disturbing source is the emerging trend of sextortion, which happens when a teenager believes they are sharing images with someone their own age, only to discover that the other person is threatening to share the images with friends and family unless the teen sends more pictures and/or money. These cases all too often end in suicide.
Parents and other guardians of minor children need to be vigilant about which apps a child is active on because the risk can vary widely from platform to platform. The SIO study says that “Instagram appears to have a particularly severe problem with commercial SG-CSAM accounts,” and Telegram and Discord were found to have problems as well. However, when researchers searched for the relevant keywords and hashtags on TikTok, they turned up “almost no results.”
Copyright 2023, RightWing.org