What FemTech Apps Get Wrong About Security After Dobbs

The regulatory landscape shifted — most apps haven't
When the Supreme Court issued its ruling in Dobbs v. Jackson Women's Health Organization in June 2022, the immediate conversation in the FemTech industry centered on what users should do: delete their apps, disable location services, avoid syncing data to the cloud. That conversation was largely correct, but it was aimed at the wrong audience.
The more important question — one that still hasn't been answered clearly by most platforms — is what FemTech founders and operators need to do differently. The data practices that were legal, or at least tolerated, before Dobbs now carry materially different risk. The regulatory framework has changed. The enforcement posture has changed. And the legal exposure for users of platforms that haven't adapted has changed significantly.
This article is written for founders, CTOs, and product leaders building in the FemTech space. It is not a user guide. It is a technical and regulatory assessment of what your architecture, your vendor integrations, and your data practices actually need to look like in a post-Dobbs environment.
One related note before we start: for platforms approaching a Series A, SOC 2 readiness is a parallel requirement worth understanding early.
Why HIPAA does not protect most FemTech apps
The most common misunderstanding in this space is that HIPAA compliance is the relevant standard for protecting reproductive health data. It is not — at least not for the majority of FemTech platforms.
HIPAA applies to covered entities: healthcare providers, health plans, and healthcare clearinghouses, along with their business associates.
A consumer-facing period tracking app, fertility app, or cycle monitoring platform that is not integrated into a clinical workflow and does not operate as a covered entity is not subject to HIPAA's privacy and security requirements. The data it collects — cycle dates, ovulation windows, sexual activity logs, pregnancy status, miscarriage records — is not protected health information under HIPAA.
This is not a grey area. The FTC has stated it explicitly, and it has enforced accordingly. GoodRx, Premom, Flo, and BetterHelp were all pursued under the FTC Act (and, in the GoodRx and Premom cases, the Health Breach Notification Rule), not under HIPAA, because HIPAA did not apply to their consumer-facing data practices.
What this means for founders: if your privacy policy implies HIPAA compliance as a reason users can trust their data is protected, and you are not actually a covered entity, the FTC considers that a misrepresentation. GoodRx displayed a HIPAA compliance seal on its platform. That seal was cited in the FTC's complaint as a deceptive practice.
The FTC Health Breach Notification Rule is now your primary federal framework
The regulatory framework that actually governs most FemTech platforms at the federal level is the FTC Health Breach Notification Rule, updated in July 2024. The updated rule explicitly covers health apps and connected devices that are not subject to HIPAA, which includes the vast majority of consumer FemTech products.
Under the Rule, a "breach of security" includes not just unauthorized access by external attackers — it also includes unauthorized disclosures. Sharing user health data with a third-party analytics provider, advertising network, or SDK vendor without user authorization is a breach under the Rule, regardless of whether a malicious actor was involved. Premom shared fertility and cycle data with AppsFlyer, Google, and two Chinese analytics firms via SDK integrations. That was a breach. GoodRx shared prescription medication data with Facebook and Google. That was a breach. Neither company was hacked.
The updated Rule requires notification to affected users, the FTC, and in some cases the media within 60 days of discovering a breach. Civil penalties apply for non-compliance. The FTC has demonstrated it will enforce — the enforcement actions against GoodRx ($1.5 million), Premom, Flo, and BetterHelp ($7.8 million) make that clear.
The practical implication: if your platform shares any health-related user data with third parties — including through analytics SDKs, advertising pixels, or data broker integrations — you need to assess whether that sharing constitutes an unauthorized disclosure under the Rule.
The SDK problem most FemTech platforms have not solved
The enforcement actions against Premom revealed a technical dimension that is worth understanding in detail, because it applies to almost every consumer health app operating today.
Premom integrated analytics and advertising SDKs from AppsFlyer, Google, Umeng (owned by Alibaba), and Jiguang into its application. These SDKs transmitted user data to their respective companies automatically, as a function of how they operate. The data transmitted included fertility and cycle information, but also — critically — non-resettable device identifiers including International Mobile Equipment Identity (IMEI) numbers.
The FTC's complaint noted this was its first case specifically establishing that non-resettable device identifiers constitute identifiable information. When a device identifier is transmitted alongside health data, it allows third parties to associate that data with a specific individual even when other identifying information has been stripped. Premom's claim that it only shared "non-identifiable analytics" was therefore false: not because the core product set out to deceive, but because the SDK integrations were transmitting those identifiers automatically.
This is the SDK trap. Most consumer health apps use third-party SDKs for analytics, attribution, crash reporting, or advertising. Most of those SDKs transmit some combination of device identifiers, behavioral signals, and contextual data back to the SDK provider. If your app is a health app and those transmissions include any data that can be associated with a user's health status — including the fact that they are using a fertility or cycle tracking app — you may be in violation of the Health Breach Notification Rule.
The technical fix is not obvious. It requires auditing every SDK integration in your application, understanding exactly what data each SDK transmits and to whom, and either removing SDKs that cannot be configured to exclude health data or implementing a consent gate that gives users meaningful control over that transmission before it occurs.
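The consent-gate half of that fix can be sketched in code. The example below is a minimal pattern, not any real SDK's API: the class and field names (`ConsentGate`, `AnalyticsEvent`, the `HEALTH_FIELDS` set) are hypothetical, and the `sent` list stands in for the real SDK transport. The two properties it demonstrates are the ones that matter under the HBNR: nothing leaves the device before an explicit opt-in, and health-associated fields are stripped even for consented users.

```python
from dataclasses import dataclass

# Illustrative list — a real audit would enumerate every field
# that could associate a user with their health status.
HEALTH_FIELDS = {"cycle_day", "ovulation_window", "pregnancy_status"}

@dataclass
class AnalyticsEvent:
    name: str
    properties: dict

class ConsentGate:
    def __init__(self):
        self._opted_in = False
        self._queue = []   # events held on-device until consent is known
        self.sent = []     # stand-in for the real SDK transport

    def record_consent(self, opted_in: bool):
        self._opted_in = opted_in
        if opted_in:
            for event in self._queue:
                self._dispatch(event)
        self._queue.clear()  # either way, stop holding raw events

    def track(self, event: AnalyticsEvent):
        if self._opted_in:
            self._dispatch(event)
        else:
            self._queue.append(event)  # nothing transmitted yet

    def _dispatch(self, event: AnalyticsEvent):
        # Strip health-associated properties even for consented users:
        # the third party needs product analytics, not health data.
        safe = {k: v for k, v in event.properties.items()
                if k not in HEALTH_FIELDS}
        self.sent.append(AnalyticsEvent(event.name, safe))
```

The same gate has to sit in front of SDK initialization itself on mobile, since many SDKs begin transmitting device identifiers the moment they are initialized.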
State laws are creating a fragmented and increasingly strict compliance landscape
At the federal level, the HBNR is your primary framework. At the state level, the picture is significantly more complex — and moving fast.
Washington state's My Health My Data Act, which took full effect in March 2024, is the most comprehensive consumer health data law in the United States. It covers data that falls entirely outside HIPAA's scope and defines consumer health data broadly to include reproductive and sexual health information, geolocation data that could indicate someone is seeking health services, and — critically — data inferred or derived from non-health information through algorithms or machine learning. If your platform infers cycle patterns from behavioral data, that inference is consumer health data under Washington law.
The Act requires opt-in consent before collecting or sharing consumer health data beyond what is necessary for the stated purpose. It prohibits geofencing within 2,000 feet of healthcare facilities for advertising purposes. It includes a private right of action, meaning individual users can sue — not just the attorney general. And it applies to any entity that collects Washington consumers' data, regardless of where the entity is based.
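The geofencing prohibition is mechanically checkable. The sketch below uses a standard haversine great-circle distance to guard ad-related processing; the 2,000-foot radius comes from the Act, while the function names and facility list are illustrative assumptions (a real implementation would also need a maintained facility dataset and would apply the check before any ad identifier is attached).

```python
from math import radians, sin, cos, asin, sqrt

GEOFENCE_RADIUS_FT = 2000          # radius prohibited by the Act
EARTH_RADIUS_FT = 20_902_231       # mean Earth radius in feet

def distance_ft(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points (haversine formula)."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    h = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * EARTH_RADIUS_FT * asin(sqrt(h))

def ad_processing_allowed(user_lat, user_lon, facilities):
    """Refuse ad-related processing if the user is inside the
    prohibited radius of any known healthcare facility."""
    return all(
        distance_ft(user_lat, user_lon, f_lat, f_lon) > GEOFENCE_RADIUS_FT
        for f_lat, f_lon in facilities
    )
```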
Washington is not alone. California, Colorado, Connecticut, Illinois, Massachusetts, New Jersey, New Mexico, New York, Rhode Island, and Vermont have all enacted laws with varying degrees of reproductive health data protection. Several include shield law provisions that protect data from being shared with out-of-state law enforcement seeking to enforce restrictive abortion laws. The patchwork means that a FemTech platform operating nationally is now effectively subject to the most protective state law wherever its users are located — and those laws are stricter than the federal baseline. Platforms expanding into behavioral health or substance use tracking face an additional compliance layer under 42 CFR Part 2.
The subpoena risk most founders underestimate
Post-Dobbs, 19 states have enacted laws that restrict or criminalize abortion access to varying degrees. Law enforcement in those states has legal authority to seek data through subpoenas and court orders, and the data held by consumer health apps is a potential target.
The critical distinction is between platforms where data is stored on centralized servers and platforms where data is stored locally on the user's device. A subpoena or court order compels a company to produce data it holds. It cannot compel a company to produce data it does not hold. If cycle, fertility, or reproductive health data is stored exclusively on the user's device and your platform does not retain it on your servers, you cannot produce it in response to legal process.
Most FemTech platforms store data in the cloud. That is a legitimate architectural choice for many reasons — backup, cross-device sync, personalization, research. But it creates a legal exposure that did not exist before Dobbs in the same way. Reproductive health data stored on your servers can be compelled through legal process in states where that data could be used to prosecute users. This is not theoretical — law enforcement in restrictive states has begun issuing subpoenas and search warrants in abortion-related cases, and digital health data has been cited in prosecutions.
If your platform stores reproductive health data centrally, you need a clear legal response policy: what you will produce in response to a subpoena, what you will contest, under what circumstances you will notify users that their data has been requested, and whether you have counsel with reproductive rights expertise advising on those decisions. Washington's My Health My Data Act requires entities to demand an attestation under penalty of perjury before complying with legal requests for health data that may relate to lawful out-of-state health services — a procedural protection that your legal team needs to understand and implement.
What a security review of a FemTech platform actually needs to cover
The security review framework most FemTech companies apply is built around standard consumer app security: authentication, encryption, API security, cloud configuration, dependency management. Those are necessary but not sufficient in the post-Dobbs environment.
A security review that is adequate for a FemTech platform in 2025 needs to address several additional areas.
SDK and third-party data transmission audit. Every SDK and third-party integration in the application needs to be mapped for what data it transmits, to whom, under what conditions, and whether user consent is obtained before transmission. The goal is to identify whether any integration is transmitting health-associated data — including device identifiers — to third parties in a way that constitutes an unauthorized disclosure under the HBNR.
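One practical form this audit can take is a machine-readable SDK inventory checked on every build. The sketch below borrows the AppsFlyer name from the Premom case discussed above; the field names, the health-associated category list, and the `consent_gated` flags are illustrative assumptions, not a real inventory.

```python
# In a health app, even identifiers and install events are
# health-associated: they reveal that this person uses a
# fertility or cycle tracking product.
HEALTH_ASSOCIATED = {
    "cycle_data", "fertility_data", "device_id_imei", "advertising_id",
}

SDK_INVENTORY = [
    {"sdk": "AppsFlyer", "recipient": "AppsFlyer Inc.",
     "fields": {"device_id_imei", "install_event"}, "consent_gated": False},
    {"sdk": "crash_reporter", "recipient": "example-vendor",
     "fields": {"stack_trace", "os_version"}, "consent_gated": False},
]

def flag_unauthorized_disclosures(inventory):
    """Return entries transmitting health-associated data without a
    consent gate — candidates for removal or reconfiguration."""
    return [
        entry for entry in inventory
        if entry["fields"] & HEALTH_ASSOCIATED
        and not entry["consent_gated"]
    ]
```

Failing the build when this list is non-empty turns the audit from a one-time exercise into a standing control.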
Data minimization and retention assessment. The FTC's guidance from its enforcement actions is explicit: collect only what you need, retain only as long as you need it, and do not use it for purposes beyond what you disclosed to users. Your data model, your retention policies, and your deletion workflows need to be assessed against this standard. Inferred health data — cycle predictions, fertility windows, pregnancy likelihood scores — is health data and needs to be treated accordingly.
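A retention policy only matters if something enforces it. The sketch below shows a scheduled purge applying per-category limits, with anything lacking a declared policy dropped by default; the category names and windows are illustrative assumptions, not legal guidance on what the correct limits are.

```python
from datetime import datetime, timedelta, timezone

# Illustrative windows — actual limits are a legal/product decision.
RETENTION = {
    "precise_location": timedelta(days=1),
    "cycle_logs": timedelta(days=365),
    "inferred_predictions": timedelta(days=30),  # inferred data is health data
}

def purge_expired(records, now=None):
    """Keep only records inside their category's retention window.
    Categories with no declared policy are dropped (deny by default)."""
    now = now or datetime.now(timezone.utc)
    kept = []
    for rec in records:
        limit = RETENTION.get(rec["category"])
        if limit is not None and now - rec["created_at"] <= limit:
            kept.append(rec)
    return kept
```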
Geolocation handling. Precise location data that could indicate a user is visiting a reproductive health facility is explicitly covered under Washington's My Health My Data Act and implicitly sensitive under the HBNR. If your platform collects or processes location data, you need to assess how that data is stored, who it is shared with, and whether your retention policy ensures it is not held long enough to create legal exposure.
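One mitigation worth assessing is coarsening coordinates before they are ever stored, so the retained value cannot place a user at a specific facility. A minimal sketch, where the two-decimal-place granularity (roughly 1.1 km of latitude) is an assumption to tune against your actual product need:

```python
def coarsen(lat, lon, places=2):
    """Round coordinates before storage so the retained location is
    neighborhood-scale at best, never facility-scale. Raw precise
    coordinates should be used transiently and never persisted."""
    return round(lat, places), round(lon, places)
```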
Legal response readiness. Your incident response plan needs a section on legal process responses: who decides whether to comply or contest a subpoena, what the legal review process looks like, what your user notification policy is, and whether you have reproductive rights legal counsel on retainer or accessible.
Encryption and deletion architecture. If your platform stores sensitive reproductive health data, encryption at rest using keys that you do not retain in a recoverable form substantially reduces the utility of that data to law enforcement even if it is compelled. End-to-end encryption between device and server, combined with a genuine deletion architecture that removes data from your systems when a user requests deletion, are security posture choices that have direct legal significance in the post-Dobbs environment.
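The key-destruction approach, often called crypto-shredding, can be sketched as follows. The toy stream cipher here (a SHA-256 counter-mode keystream) exists only to make the example self-contained; a production system would use AES-GCM from a vetted library, with per-user keys held in a KMS or HSM, or client-side where the platform cannot read them at all. The point the sketch makes is architectural: deleting the key is the deletion, and it covers ciphertext lingering in backups.

```python
import hashlib
import secrets

def _keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy SHA-256 counter-mode keystream XOR — illustration only,
    NOT a production cipher."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out.extend(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))

class UserVault:
    def __init__(self):
        self._keys = {}   # user_id -> key (in practice: a KMS/HSM)
        self._blobs = {}  # user_id -> ciphertext (your data store)

    def store(self, user_id: str, plaintext: bytes):
        key = self._keys.setdefault(user_id, secrets.token_bytes(32))
        self._blobs[user_id] = _keystream_xor(key, plaintext)

    def read(self, user_id: str) -> bytes:
        return _keystream_xor(self._keys[user_id], self._blobs[user_id])

    def delete_user(self, user_id: str):
        # Crypto-shred: destroying the key is the deletion. Any
        # ciphertext remaining in backups is now unreadable.
        self._keys.pop(user_id, None)
        self._blobs.pop(user_id, None)
```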
Where Sekurno fits
The security gap in most FemTech platforms is not traditional vulnerability exposure. It is the combination of unaudited data flows, SDK integrations that share more than founders realize, and a legal response posture that was built before Dobbs changed the stakes.
Sekurno works with consumer health and FemTech platforms to close that gap: independent penetration testing that covers the full data flow, including third-party SDK transmissions, combined with data flow audits and readiness assessments that map your architecture against the HBNR, Washington's My Health My Data Act, and applicable state law requirements.
The output is not a compliance certificate. It is the documented understanding of exactly what data your platform holds, who it shares it with, and what your exposure looks like — so you can make informed decisions before a regulator or a subpoena forces the issue.
If you are building in the FemTech or consumer health space and want to understand where your platform stands, contact us.