P.S.R. Los Angeles 2024: Vendor Audits; My Health, My Data
A quick summary of my favorite sessions at the IAPP’s 2024 Privacy, Risk, Security event.
Here’s a quick summary of my favorite sessions at the IAPP’s Privacy, Risk, Security event in Los Angeles last week. I focused almost exclusively on third-party/vendor risk management, with the sole exception of a deep-dive session on Washington State’s My Health My Data Act (which indirectly affects data processors as well). I will start with the latter and then combine all of the other sessions into a single summary.
My personal opinions are in italic.
MY HEALTH, MY DATA: KEY TAKEAWAYS
Presenters: Justine Young Gottshall (Managing Partner, InfoLawGroup), Mike Hintze (Partner, Hintze Law).
[Read Mr. Hintze’s own blog post series on this law if still unfamiliar]
Large businesses will probably try to avoid the law’s territorial scope, as its material scope may cover fitness trackers, wearable devices, certain product choices and ad preferences… any data point that, while not representing health data in itself, could lead to identifying a medical condition. The law also sweeps in a broad definition of biometric data. As Mr. Hintze pointed out, the state happens to host the two largest cloud providers, so I am guessing that avoiding AWS and Azure data centers in Washington would be a smart move (an interesting side effect).
When applicability is unavoidable (e.g., by selling to Washington residents, regardless of where the data processing actually happens), there will be a requirement of prior affirmative consent for the collection of such data, and a more burdensome need for “authorization” whenever such data is going to be used in targeted advertising (i.e., resulting in the “sale of personal data”, à la CCPA). A separate “Consumer Health Data” privacy notice will also be required. Some relief is provided for scenarios in which the information being processed is necessary for the provision of the service being requested by the consumer, in which case consent is not required. I am guessing that this could encompass all of the scenarios that we cover under either contractual necessity or legitimate interest in the GDPR.
The most worrisome part of this law is its private right of action. It does not have statutory damages like Illinois’ BIPA, so plaintiffs still need to show harm. This would have to happen under the Washington Consumer Protection Act, and it appears that existing case law has already considered damage to someone’s reputation a valid “injury to business or property” (that is, something of the sort would qualify as “real and quantifiable”, provided that there is causation between the data processing activities and said harm).
The speakers pointed out that there is a clear risk in online advertising: too narrow a segment (or targeted audience) could be correlated with a medical condition.
All in all, I very much agree with Mr. Hintze’s approach to minimizing risk: a combination of “necessity” and consent where the former takes care of less intrusive purposes (which may require all of the available data points) and the latter deals with riskier scenarios (in which a small sample of the audience — those individuals who happen to agree when asked properly — could support the underlying purpose). This would avoid the temptation of bundled purposes under a single (insufficient) consent request. In any case, the usual consent banners would fall short of a valid authorization for targeted ads, so data controllers would probably have to avoid them altogether.
THIRD PARTY AUDITS
I will combine my takeaways from three different sessions here, and I will do it from the less explored of the two angles: that of the vendor/data processor (i.e., turning some of the key pieces of advice around).
The sessions in question:
Guardians of Data: Balancing Privacy, AI and Security in Third-Party Management: Holly Gion (Director - Legal Privacy, Smartsheet), Jennifer Harkins Garone (Senior VP, Director of Privacy, Comcast), Melinda Clarke (Senior Privacy Manager Lead, Prowess Consulting).
Ensuring Privacy Program Maturity; How to Leverage Third-Party Audits: Drew Concannon (Ford), Lynn Parker Dupree (Partner at Finnegan), Sheila Pham (Legal-Privacy Director at RingCentral).
Manage Supplier Risk, Shorten Your Procurement Cycle and Be a Hero to All: Peggy Eisenhauer (Privacy & Information Management Services), Jonathan Fox (Cisco), John Gevertz (DLA Piper).
Regarding pre-contractual audits, during the procurement process
The most advanced controllers will avoid questionnaires (everyone’s nightmare) altogether, instead referring you to their own list of non-negotiable requirements.
Pre-contractual audits will often look at a processor’s actual privacy practices, regardless of its privacy-preserving and “Privacy by Design” claims: what does their website’s privacy notice look like? Do they practice what they preach? (Fully agreed; we do the same, even though vendors are collecting data about business representatives in a B2B scenario, rather than consumers.)
Regarding Data Processing Agreements governing the processing of personal data by a third party
Core instrument
Some controllers are happy with a vendor’s own DPA. They accept that this will be imposed when the processor is simply a bigger organization (e.g., Salesforce, Adobe), and they will also appreciate that working on the vendor’s own template could save everyone some valuable time. In this case, the template must be 100% compliant with both the GDPR and applicable state comprehensive privacy laws.
Secondary use
Savvy data controllers will not accept secondary use clauses hiding away in the terms of use of the platform (or primary contract). These belong in the DPA and could very well include product-related metrics and improvements whenever data is truly de-identified. The same could be applied to AI training purposes, and the concept of “de-identification” (or “anonymization”, in the EU context) should be clearly defined.
Audit rights
Audit rights can be limited to yearly access to existing certifications, backed up with an ad hoc questionnaire if any issues arise, and resorting to a third-party audit only as a last option. The latter should be paid for by the data controller when dealing with startups on low-value contracts.
Data processors should not accept an open-ended obligation to resolve, at their own cost, product-related issues that a data controller unilaterally finds concerning; sharing the costs by putting a price cap on remediation is an alternative formula.
Data processors should not accept anything as vague as an undefined “suspected breach” as a reason for the data controller to impose an audit.
Resolution of a data breach
Put a limit on remediation obligations in case of a data breach. There will also be an expectation that the processor pays for all the necessary follow-up steps (beyond the GDPR, if applicable; bear in mind call-center operations, mailings, etc., as required by state data breach laws), but controllers are happy to let vendors keep their name out of such communications (and save their reputation), with the vendor simply acting on behalf of its customer.
Do not accept data breach-related penalties embedded in a DPA. Some customers will aim for a sum of up to 7x the total contract value (up from the more common 3x or 5x), but this should be taken care of by required insurance (certainly on the vendor’s side, but most likely on both).
A controller may impose a 24-hour window for data breach notifications, but they should be willing to accept 72 hours without much concern.
(All in all, I found a major difference with the European approach to DPAs in this particular section, surely informed by an established practice of dealing with fifty state-specific data breach laws.)
Ongoing contractual relationship
Beware: a data controller will leverage renewal dates in yearly subscriptions to renegotiate DPAs it considers insufficient.
OTHER GEMS
Max Schrems was mostly predictable on the main stage (interviewed by Casey Newton and Kevin Roose, hosts of the Hard Fork podcast), but I still enjoyed this particular statement, surely because I completely agree:
“The ePrivacy Directive is a crazy law […] All those cookie banners are stupid.”