The COVID-19 pandemic has contributed to the creation, dissemination, and prevalence of child sexual abuse material (CSAM). As global online activity increased, so did the distribution of CSAM. The number of reported offenses skyrocketed as people spent more time confined at home, doubling from 2019 to 2020 and increasing another 35% from 2020 to 2021. Facebook, Google, Snapchat, and TikTok together reported over 48 million incidents of CSAM in that period.

Tragically, live-streaming of this abuse over online web chats, social media, and video-meeting platforms also increased sharply due to lockdowns and social distancing. Because the evidence disappears when each stream ends, this abuse is very difficult for law enforcement to detect. Further, the emergence of “deepfakes” has made the creation of computer-generated CSAM alarmingly easy, allowing abusers to use face swaps, lip-syncing, and puppeteering to blackmail victims and inflict emotional harm. It is thought that this digitally produced CSAM in turn escalates “contact offending,” prompting viewers to perpetrate abuse themselves.

Next week, we will define “sextortion,” describe the steps U.S. and global agencies are taking to combat these terrible acts, and outline those we can take ourselves to protect our children.
Sources: The National Center for Missing & Exploited Children, the USCCB, and the Internet Watch Foundation.