(CNN)At first glance, Shelley Allwang's cubicle looks just like any other office space.
But alongside a pinboard full of tchotchkes and a photo of her dog sits a story that reminds Allwang of the importance of her job, now more than ever.
"The Boy and The Starfish" is about a boy tossing beached starfish back into the ocean, saving just one at a time.
Allwang doesn't work with starfish. But as a program manager at the National Center for Missing & Exploited Children (NCMEC), her day-to-day job involves helping children who are abused online.
The story is a reminder to Allwang of perseverance in the face of difficult odds; there will always be more starfish to pick up, just as there will always be more children who need help. The story has taken on added significance in recent weeks, as Allwang has watched the number of reports of children being abused online soar four-fold.
In April of 2019, the NCMEC received roughly 1 million reports. Last month, it saw more than 4.1 million come in via its CyberTipline, according to data provided to CNN.
"The volume is overwhelming and it can be very daunting. It can feel like a constant pressure and that it's never going to end," Allwang told CNN in a phone interview. "But every single time we send out a report, every single time there's a child rescued, we're making a difference for that child."
Officials at the center attribute the exponential increase in reports, which was seen in March as well, to a confluence of circumstances caused by the coronavirus pandemic.
"The pandemic is having an effect," John Shehan, vice president of NCMEC's Exploited Children Division, told CNN. "There are some very bad actors out there who are preying on this opportunity of more kids at home and online with parents who are having to juggle homeschooling and work."
An increasingly tech-savvy population of minors, combined with millions of students engaged in remote learning and more adults spending time online than ever before, has created more opportunities for children to be exploited, Shehan said.
He said being cooped up during stay-at-home orders has also added to the increase in reports.
"Kids can't go out and play with friends. They are online more and more."
The horrors children face online
The Dark Web, as its name suggests, is the freewheeling part of the internet that has seen increased activity from bad actors since the pandemic struck. That's where people are "seizing this opportunity," Shehan said.
The Dark Web is a collection of secret websites that require special software, such as Tor, to access. According to its website, Tor is a way to "use the internet with as much privacy as possible, and the idea was to route traffic through multiple servers and encrypt it each step of the way."
"They are talking about these stay-at-home orders and their desire to use this opportunity to entice children to produce sexually explicit content," Shehan said.
This sort of content has spilled over into the mainstream web. NCMEC works with more than 1,400 companies to help them report instances of child pornography to its CyberTipline.
Facebook, Snapchat and Imgur were among the 15 companies with the most online child abuse reports sent to NCMEC in 2019, according to data on NCMEC's website.
Facebook had 15,884,511 reports, Snapchat had 82,030, and Imgur had 73,929, according to publicly available data on NCMEC's website.
The companies told CNN they all employ technology to help identify abusive content and bad actors on their platforms.
"The size and expertise of our team, together with our sophisticated technology, have made us industry leaders in detecting, removing and reporting these images, and stopping people from sharing them," a Facebook spokesperson said in an email statement to CNN.
Imgur, which says it reaches more than 300 million people per month, has not seen an increase in child sexual abuse material in 2020, Sarah Schaaf, Imgur co-founder and vice president of community, told CNN over email.
All content uploaded to Imgur is run through databases maintained by Thorn, a non-profit organization that uses technology to find child sexual abusers.
"Imgur has a zero-tolerance policy for illegal content and activity on our platform, and our reporting represents our diligent, proactive approach to identifying, eradicating and preventing child sexual abuse material," Schaaf said. "Any Imgur upload found to be matching a known hash is immediately deleted from our servers and reported to NCMEC for further investigation."
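The hash-matching approach Schaaf describes can be sketched in general terms: each upload is reduced to a digest and compared against a database of digests of known abusive material. The snippet below is a generic illustration only, not Imgur's or Thorn's actual system; real deployments typically use perceptual-hashing tools (such as PhotoDNA-style matching) rather than the plain SHA-256 digest used here, and the hash values shown are placeholders.

```python
import hashlib

# Hypothetical database of SHA-256 digests of known prohibited files.
# (This value is the digest of the bytes b"test", used purely as a stand-in.)
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_of(data: bytes) -> str:
    """Return the hex SHA-256 digest of a file's bytes."""
    return hashlib.sha256(data).hexdigest()

def screen_upload(data: bytes) -> str:
    """Block an upload whose digest matches the known-hash database."""
    if sha256_of(data) in KNOWN_HASHES:
        # In the process the article describes, a match would be deleted
        # from the servers and reported to NCMEC for investigation.
        return "blocked"
    return "accepted"
```

The point of hash matching is that the platform never needs to re-inspect the image content itself: a match against a vetted database is sufficient to trigger removal and reporting.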
Snapchat also uses tools to allow people to report abusive content that violates its terms of service, a spokesperson for the company said in an email to CNN.
All reports of child exploitation are reviewed by its Trust and Safety team, and evidence of this activity results in account termination and reporting to NCMEC, the company said.
New users are required to provide their date of birth when they register, and if the account owner is under the age of 13, the company will terminate the user's account immediately.