Earlier this week, court documents revealed that 18-year-old Celeste Burgess and her mother Jessica Burgess face multiple charges after police obtained Facebook messages that allegedly show the two making reference to abortion medication for Celeste and a plan to hide the remains.
While the Burgesses were charged before the Supreme Court overturned the 1973 Roe v. Wade decision that protected the right to abortion, the case highlights issues of digital privacy that could have widespread ramifications post-Roe.
The teenager, who was about 28 weeks along when her pregnancy ended, told investigators she had unexpectedly miscarried, delivering a stillborn fetus, and that she and her mother buried the remains, according to an affidavit in support of a search warrant.
But police went on to serve Meta, Facebook’s parent company, with a search warrant. After obtaining roughly 300 MB of data, including private messages between the two, authorities alleged that the 41-year-old mother helped her daughter obtain abortion medication in Nebraska, where abortion is illegal after 20 weeks.
Celeste Burgess, who was 17 at the time of the alleged incident, will be tried as an adult. Attorneys for both mother and daughter declined to comment to CNN. On Tuesday, Meta spokesperson Andy Stone issued a statement on the company’s website saying, “Court documents indicate that police were at that time investigating the alleged illegal burning and burial of a stillborn infant. The warrants were accompanied by non-disclosure orders, which prevented us from sharing information about them. The orders have now been lifted.”
As much as Meta wants to make it seem as if its hands are tied, emphasizing that “the warrants did not mention abortion at all,” the company still can and should do so much more to safeguard users’ privacy. Instead, it has been removing information about abortion pill access (Stone said in a Twitter post that Meta had “discovered some instances of incorrect enforcement and are correcting these”) and, according to a memo obtained by the New York Times, ordering employees not to discuss abortion in the workplace.
Long story short: If big tech isn’t going to protect women and girls, then we have to protect ourselves. That starts by obtaining — and sharing — knowledge about our digital rights.
Like many of us, privacy experts saw the writing on the court walls long ago, and repeatedly sounded the alarm about the unique online vulnerabilities of pregnant women, their abortion providers and anyone who assists them. With abortion bans and restrictions now the law in numerous states, prosecutors could increasingly rely on digital data to punish pregnant people for their choices. It’s a catch-22: inaccessible health care and hostile abortion laws drive pregnant women online to seek medical advice, financial support or transportation — but these searches expand the digital trail of evidence that can later be used to prosecute them.
So what are the risks? In states where abortion is illegal, personal data can be subpoenaed from tech companies, as happened to the Burgesses, or bought from a third-party data broker. Your period-tracking app can flag a pregnancy before you’re even aware of it, your texts and emails could provide incriminating evidence, and as more red states consider abortion travel bans, your location data or electronic transaction records might reveal a trip to Planned Parenthood in a neighboring state — and the government can use all of that data against you in court.
Burgess’s story is, tragically, one of many. Even before Roe was overturned, women’s data was weaponized against them.
It’s particularly alarming that women and girls are targeted online when we are already at a disadvantage to men when it comes to digital literacy. We know that American girls are most likely to abandon computer science courses between the ages of 13 and 17. As they grow up, they’re less confident in their digital skills, and less sure of their ability to find the information they are searching for online. It’s no wonder that women comprise only a quarter of STEM professionals, and earn less in those positions than their male counterparts.
This digital gender disparity is why I founded Girls Who Code 10 years ago. Today, I’m proud to say we’re on track to close the gender gap in new entry-level tech jobs by 2030. But that success doesn’t just prove women’s and girls’ appetite for STEM — it offers a blueprint for the massive education effort needed to enhance young women’s digital literacy and safety.
Part of that effort must be learning and sharing the simple tips for safer online interactions: using encrypted communications apps like Signal, making phone calls via Google Voice, creating (and promptly deleting) new email accounts to coordinate appointments and transit.
For the more complicated stuff, we should rely on and support nonprofits like the Electronic Frontier Foundation, Fight for the Future and the Digital Defense Fund, which have mobilized to help us understand both the technology itself and our rights in using it.
Over the past few months, they’ve assembled an arsenal of resources that specifically advise abortion-seekers about surveillance, providing them with best encryption practices and private internet browser recommendations. We can even consult the Department of Health and Human Services’ new guidance under the Health Insurance Portability and Accountability Act (HIPAA) for securing one’s personal health data.
We must urge our legislators to protect women’s privacy by curbing abusive data practices and instituting stronger consumer privacy protections through proposed legislation like Sen. Elizabeth Warren’s Health and Location Data Protection Act, or Rep. Sara Jacobs’ My Body, My Data Act.
And if tech companies want to regain our trust, they must become part of the solution. In addition to making end-to-end encryption the default setting in messaging apps, they can refuse to comply with requests for data that violate civil liberties, and even delete such data to protect our privacy.
In all of this work, we must ensure digital privacy education reaches those who need it most. Historically, many Asian, Black, Indigenous and Latina women, those from immigrant communities and women with low incomes have simultaneously suffered disproportionate police violence and been deprived of adequate health care, sex education and other critical resources necessary for reproductive justice. We can’t have true digital inclusion — and justice — if only some women can access this information.
In Justice William O. Douglas’ majority opinion in Griswold v. Connecticut (1965), the court established the right to privacy — and by extension, women’s bodily autonomy — within the “penumbra” of the Bill of Rights, that sliver of partial illumination between “perfect shadow” and “full light.”
For so long, women and girls have had to assert our right to privacy — in our phones, our homes and in our bodies — in the shadows of the law and technology. By empowering women with a knowledge of our digital rights, we inch our way towards the full light — where we have always deserved to be.