What the FTC’s Crackdowns on Chegg & Edmodo Mean for Online Education Privacy
The Federal Trade Commission has been on quite a roll in 2022 and 2023, launching landmark data security and privacy enforcement actions to protect students in online education.
What’s more, the FTC’s new actions seem designed to make clear examples out of two companies—Chegg and Edmodo—in order to encourage firms throughout the online education industry to comply with the agency’s data integrity rules.
The FTC Cracks Down on Chegg
If Chegg’s name looks familiar, that’s because it’s the very same Silicon Valley education technology company that was all over the press as recently as May 2023. Chegg’s stock price plummeted by about 50 percent—resulting in a billion-dollar market capitalization loss—in only a few hours on May 2, after the firm’s CEO disclosed that the artificial intelligence chatbot ChatGPT was hurting the company’s business. This was the first known instance of a firm revealing a revenue loss caused by an AI platform, and the firm’s stock price has never recovered.
The price drop happened because some of Chegg’s 62 million college and graduate student customers had stopped subscribing to the firm’s homework help products upon realizing that ChatGPT could help them with much of their studying for free.
It turns out that only three months before Chegg’s market crash, the FTC had cracked down on the firm. The agency went after the company for its data security practices, and Chegg then settled with the FTC in January 2023.
The FTC’s complaint claimed that the company’s data security practices were so careless that they violated the Federal Trade Commission Act’s prohibitions on unfair and deceptive business practices. In fact, the scope of Chegg’s negligence was so reckless and irresponsible that, at first glance, many may find it difficult to believe.
In four separate security breaches between 2017 and 2020, private data belonging to tens of millions of the company’s customers was exposed online. The breaches disclosed personal characteristics of some customers—like their religion or sexual orientation—because Chegg’s scholarship search platform had collected and stored those details to match customers with potential awards.
The leaks appear to have occurred because Chegg granted several employees and external subcontractors root-level login credentials: essentially, free passes permitting access anywhere within Chegg’s databases. With those credentials, any of those individuals could log in to Chegg’s systems on servers operated by Amazon Web Services and review any of the data contained in millions of customer database records.
According to the FTC, in 2018, a former Chegg subcontractor downloaded the names, email addresses, and passwords from around 40 million customer records by using such root credentials obtained from the company. That subcontractor even collected sensitive information in some instances about a customer’s heritage, birth date, disabilities, sexual preference, religious denomination, and parental income.
The FTC reported that about 63 percent of those records later turned up for sale on the internet. The agency also pointed out that Chegg failed to notify many of the 40 million customers, and even some of the firm’s own employees, whose private information was compromised during the four breaches.
Attorney Samuel Levine, the director of the FTC’s Bureau of Consumer Protection, offered these observations:
Chegg took shortcuts with millions of students’ sensitive information. Today’s order requires the company to strengthen security safeguards, offer consumers an easy way to delete their data, and limit information collection on the front end. The Commission will continue to act aggressively to protect personal data.
Specifically, the FTC says that Chegg failed to implement basic, commercially reasonable security measures to prevent these breaches. These are widely used techniques like limiting access to data, conducting security audits and testing, and providing adequate security training to employees and contractors.
The agency also alleged that—believe it or not—the firm never monitored its databases or networks for security threats. Chegg also failed to require employees to use two-factor authentication to log into its databases, and allowed employees and contractors to share a single login without ever rotating access keys. Incredibly, the company even kept sensitive personal information in its AWS cloud storage databases in unencrypted plain text. And until at least 2018, Chegg relied on obsolete and inadequate cryptographic techniques to safeguard user passwords.
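The complaint’s point about password protection is worth unpacking. Storing passwords with a fast, unsalted hash lets anyone who steals the database crack them cheaply, while a salted, deliberately slow key-derivation function makes the same attack expensive. A minimal sketch of the difference, using only Python’s standard library (the function names are illustrative, not Chegg’s actual code):

```python
import hashlib
import hmac
import os

# Obsolete approach: a fast, unsalted digest. Identical passwords yield
# identical hashes, and commodity hardware can brute-force billions of
# guesses per second against a stolen table of these.
def weak_digest(password: str) -> str:
    return hashlib.md5(password.encode()).hexdigest()

# Stronger approach: a per-user random salt plus scrypt, a slow,
# memory-hard key-derivation function in the standard library.
def hash_password(password: str) -> tuple[bytes, bytes]:
    salt = os.urandom(16)
    digest = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    # Constant-time comparison avoids leaking timing information.
    return hmac.compare_digest(candidate, digest)
```

In practice, a production system would reach for a vetted library such as bcrypt or argon2-cffi rather than hand-rolling even this much.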
For remedies, the FTC’s order compels Chegg to implement a comprehensive data security program that addresses all these defects. Specifically, the order stipulates that Chegg will encrypt consumer data, require multifactor authentication, provide customers with access to the data the firm collects about them, and take steps to correct numerous additional problems that the FTC outlined.
For its part, Chegg replied in a carefully worded statement almost certainly drafted by legal counsel. The firm led off with “data privacy is a top priority” and noted that the incidents in the FTC’s complaint “related to issues that occurred more than two years ago.” It also pointed out that the FTC imposed no fines, which it believes reflects its “efforts to continuously improve our security program.”
Moreover, the company says it is complying “fully” with the Commission’s order, which it reached after working cooperatively with the FTC. The statement wraps up by claiming that “Chegg is wholly committed to safeguarding users’ data and has worked with reputable privacy organizations to improve our security measures and will continue our efforts.”
Nevertheless, what seems especially curious is that Chegg’s CEO of 13 years, Dan Rosensweig, is a technology industry veteran with a career spanning nearly four decades. Before Chegg, Rosensweig maintained a long tenure with Ziff Davis before serving for five years as chief operating officer at Yahoo.
Generally, chief executives of this caliber in Silicon Valley tend to be extraordinarily conscientious about security and privacy risk mitigation strategies and tactics. That’s because these CEOs realize that the long-term future of their enterprises depends in many ways upon data integrity.
The FTC Cracks Down on Edmodo
Another online education enforcement action from the FTC came only four months after the Chegg settlement. And unlike the fine-free Chegg settlement, this time the agency asked a federal court to approve a $6 million fine against an obscure educational technology company called Edmodo.
According to the FTC, Edmodo gathered data from young children without their parents’ approval and then used that data to target those kids with other companies’ advertisements. Practices like these are illegal because they clearly violate COPPA, the Children’s Online Privacy Protection Act, 15 U.S.C. §§ 6501–6506. Congress enacted the legislation in 1998, in response to a mid-1990s public outcry, to provide basic privacy protections for young kids at the start of the Internet Age.
Edmodo operated a free platform that enabled elementary and junior high school teachers to deliver assignments and tests online to students as young as kindergarten age. As many as 600,000 students and their parents across America could not opt out of using the platform: if students in certain school districts wanted to pass their courses, their schools required them to use Edmodo, and demanded that their parents consent to that use.
Under COPPA, companies may collect data from kids without parental consent only if they use that data exclusively for educational purposes. In other words, companies are barred from vacuuming up data from children under 13 for advertising purposes unless their parents approve. But because the schools required that consent, Edmodo used it to justify the firm’s sleazy data collection practices under COPPA.
In turn, the platform made money by displaying advertising to children that the FTC claims was personalized with sensitive information the kids had no choice but to provide—including their locations, names, ages, birthdays, and email addresses. What’s more, the FTC says that instead of asking parents directly for permission, Edmodo illegally pushed its COPPA compliance obligations onto schools and teachers.
Oddly enough, the company went out of business in September 2022, while the FTC was still investigating. That means the FTC won’t be able to collect the $6 million fine it’s asking the Northern California U.S. District Court to award, nor will it be able to directly enforce the specific performance required from Edmodo in the agency’s proposed order.
Protecting Students Against Data-Hungry Corporations
So why in the world would the FTC go after a firm that no longer exists? Gizmodo writer Thomas Germain points out that the FTC has now reached a series of similar settlements over the past several months, although these enforcement actions applied to healthcare technology companies instead of online education firms.
For example, in one of these cases, the prescription drug price monitoring platform GoodRx unlawfully shared customers’ prescription data with Google and Facebook for advertising purposes, without customer consent. Another platform, a fertility support application called Premom, similarly used information about women’s menstrual cycles for advertising purposes without consent from the firm’s customers.
An expert on online privacy and security and an ardent critic of corporate offenders, Germain says that the FTC is “making an example out of companies to show that you can’t, in fact, just ignore what few privacy laws exist in the United States.” Although he acknowledges that these healthcare company settlements only imposed small fines that won’t impact these violators’ operations, he then concludes:
What these cases do, however, is send a message to data-hungry corporations. For years, companies got away with ignoring consumers’ expectations about privacy, burying details in terms-of-service legalese or just failing to let people know about their shady data practices altogether.
These recent cases are essentially a power grab attempting to set new legal precedent to finally give consumers some privacy on this godforsaken internet. The strategy appears to be making landmark settlements that go uncontested so as to scare other companies into straightening up.
Germain’s assessment makes a lot of sense, and these principles apply to both the Chegg and Edmodo cases. The trend among such settlements suggests that online education firms need to comply with privacy and security laws and regulations if they don’t want to draw aggressive enforcement actions from the current activist FTC commissioners. The consequences of such actions could range from adverse publicity to fines large enough to drive many startups out of business, as was apparently the situation with Edmodo.
The progression of these cases under the FTC’s current leadership also suggests that online students may no longer need to worry as much about the risks of privacy and security leaks, at least not to an extent that should discourage them from registering for virtual courses and degree programs. Compared with all the vast benefits to students from online education, the relatively small privacy and security risks should now be even less likely to deter students from enrolling.