Tune in to the second episode of Ropes & Gray's podcast series The Data Day, brought to you by the firm’s data, privacy & cybersecurity practice. This series focuses on the day-to-day effects that data has on all of our lives, as well as other exciting and interesting legal and regulatory developments in the world of data, and will feature a range of guests, including clients, regulators and colleagues. On this episode, hosts Fran Faircloth, a partner in Ropes & Gray's Washington, D.C. office, and Edward Machin, a London-based associate, discuss recent developments in California, including a new round of enforcement sweeps by the California Attorney General, actions by the California Privacy Protection Agency, and the relationship between the California Consumer Privacy Act (CCPA) and the California Privacy Rights Act (CPRA).
Transcript:
Edward Machin: Welcome back, and thank you for joining us on the second installment of The Data Day from Ropes & Gray, a podcast series brought to you by the data, privacy & cybersecurity practice at Ropes. In this podcast, we’ll discuss exciting and interesting developments in the world of data. We feature attorneys at Ropes, as well as clients, regulators, and other industry leaders in conversation about what’s new in the world of data. I’m Edward Machin, an associate in Ropes & Gray’s data, privacy & cybersecurity practice based in London. I’m joined by my co-host and colleague, Fran Faircloth, who’s a partner in our Washington, D.C. office.
Fran Faircloth: First, to set the stage for the non-Americans or anyone who hasn’t been paying attention to California lately, the California Consumer Privacy Act (the CCPA) was a groundbreaking privacy law in California that went into effect in 2020. It does several things, but among them, it requires many businesses to provide notices to individuals about the use of their data. It gives individuals data subject rights, like the right to know what data is collected about them and the right to request that that data be deleted. It also requires businesses to provide individuals with the right to opt out of the sale of their data, which is defined pretty broadly. Almost immediately after that law went into effect, the privacy advocates who had been sponsoring the legislation decided that the CCPA, as it had been amended through the legislative process, hadn’t gone as far as they wanted and they needed more, so they introduced the California Privacy Rights Act (the CPRA). That law went into operation on January 1st, but it won’t be enforced until July 1st. Of course, these aren’t the only laws going into operation this year—we have also seen laws in Virginia, Colorado, Connecticut, and Utah, and lots of other states have proposed their own comprehensive privacy legislation, so we’re watching that space to see what comes next. What’s the relationship between the CCPA and the CPRA? Do we call it the CPRA now, or are we still calling it the CCPA?
Speaker: Yes, this is sometimes an area of confusion for people, including, as you said, around what to call it. The CPRA amends but does not replace the CCPA, so even though the CPRA, as you said, is not enforceable until July, the law that has been in place since 2020 still is in place. And, as we’ll discuss in a moment, the California Attorney General is actively enforcing it, so folks can’t forget about the CCPA as it existed prior to January—that law is still in operation, and there is still risk related to it. In terms of what we’re calling it, the CCPA is still the operative law. I don’t know how you’ve been referring to it, Fran—I still usually call it the CCPA, sometimes the CCPA/CPRA—you can call it a number of different things, but I think that context is important. The CPRA makes changes to the CCPA and expands on its requirements. As you also just noted, the regulations are important, too. There are existing regulations implementing the prior version of the CCPA, and the regulations that the California Privacy Protection Agency is currently considering amend and expand on those existing regulations—they don’t come out of nowhere.
Fran Faircloth: Got it—that’s very helpful. I’ll make sure I’m calling it the CCPA from here on out. What’s been going on recently in California?
Speaker: The CPRA went into operation on January 1st, as you mentioned, without its implementing regulations in place. The California Privacy Protection Agency (or CPPA) recently unanimously adopted the draft regulations that have been percolating since this past summer. They will now go through review by the Office of Administrative Law (or the OAL), a process that will likely take until about April to complete, so that’s around when the regulations should go into operation. The OAL review is not just a rubber stamp: when the CCPA regulations went into operation in 2020, the OAL did make some changes to them, so we’re going to monitor what happens. We don’t expect particularly material changes, but there could be some worth noting.
At the same meeting where it adopted the regulations, the CPPA also initiated pre-rulemaking activity on a number of important subjects, some of which could have a significant impact on clients. Among them: cybersecurity audits that might be required; risk assessments that may be required and, in some instances, may need to be submitted to the California Privacy Protection Agency for review; and, lastly, questions regarding automated decision-making and the regulations that could be put in place there. This is still pre-rulemaking activity—no rules have been proposed yet. They’re still asking questions, but there will be a number of important developments in this area.
Lastly, another important development arrived just in time for Data Privacy Day and, probably more significantly, the kickoff of this podcast: the California Attorney General announced a new set of enforcement sweeps focused on the CCPA’s do-not-sell requirements.
Fran Faircloth: Interesting—so, the sweeps are starting now, even though the CPRA isn’t enforceable until July 1st?
Speaker: Right, but as we just discussed, the preexisting CCPA requirements are still enforceable today, and so the California Attorney General is conducting these sweeps with respect to those requirements. The AG announced sweeps focused on what the office describes as “popular apps” that fail to honor consumer requests to opt out of the sale of their personal information, and the announcement says the sweeps will target the retail, travel, and food services industries. The do-not-sell aspect of the CCPA has been an area of focus for the AG for some time: the office has looked at whether businesses post the do-not-sell link and whether you can actually find and use that link, and it has cited studies in which professors and other academics examined the usability of some of these privacy features.
Fran Faircloth: It’s interesting—even though a lot of attention is being placed on the new California Privacy Protection Agency (the CPPA), it sounds like the California AG is still really active in this space. I know last year we saw the first monetary settlement, $1.2 million, reached with Sephora. Of course, there was no admission as part of that settlement by Sephora, but there were allegations related to the transfer, and even potentially the sale, of data that related back to the CCPA.
Speaker: Exactly, and that settlement related to many of the issues that the California Attorney General is now focused on through these sweeps.
Fran Faircloth: Thanks. Turning to the regulations, how are our clients treating them, given that they’re not yet enforced?
Speaker: Clients are taking different approaches, many of which are appropriate. The regulations are in near-final form, so you can treat them as very close to what will be operative beginning in April, but you still have some time to implement them if you haven’t already. As we discussed, the CPRA amendments, and these regulations in particular, are not enforceable until July.
Fran Faircloth: Assuming that the Office of Administrative Law accepts the current draft of the CPPA’s regulations, what are some of the highlights? Anything that’s come up for your clients or that you’ve been seeing recently?
Speaker: Yes, absolutely. Of course, I’m interested to hear from you, too, if there’s anything in particular that’s come up for your clients, but there are a number of practical pieces that come through in the regulations. For example, there’s a discussion of the specific data protection terms that need to be in contracts with vendors that are covered by the CCPA. There’s an ordered list of the information that’s supposed to be included in various privacy notices, including the website privacy notice and the notice at collection. An important detail involves specifics around responding to browser-enabled opt-out preference signals like Global Privacy Control (or GPC), which the regulations state should be treated as an opt-out request regarding the sale or sharing of personal information. And there are details around the circumstances where you have to make further efforts to identify the individual and opt them out in response to those GPC signals, so that’s a really important development. A lot of people aren’t focused on GPC at this time, but, going back to Sephora and some of the California AG sweeps that have been conducted so far, it’s something regulators are actively looking at.
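To make the GPC signal concrete: a browser with Global Privacy Control enabled sends the HTTP request header Sec-GPC: 1 with each request and exposes a navigator.globalPrivacyControl property to page scripts. Below is a minimal sketch of server-side detection, assuming a Node.js server using Express; the recordOptOut helper, the visitor identifier, and the port are illustrative, not anything prescribed by the regulations or discussed on the podcast.

```typescript
import express, { NextFunction, Request, Response } from "express";

const app = express();

// Per the GPC specification, participating browsers send "Sec-GPC: 1".
function hasGpcSignal(req: Request): boolean {
  return req.header("Sec-GPC") === "1";
}

// Hypothetical helper: persist an opt-out of sale/sharing for this visitor.
// A real implementation would key this to a stable visitor identifier
// rather than an IP address.
function recordOptOut(visitorId: string): void {
  console.log(`Opt-out of sale/sharing recorded for ${visitorId}`);
}

// Middleware: treat an incoming GPC signal as a request to opt out of the
// sale or sharing of personal information, which is how the draft
// regulations say the signal should be treated.
app.use((req: Request, _res: Response, next: NextFunction) => {
  if (hasGpcSignal(req)) {
    recordOptOut(req.ip ?? "unknown");
  }
  next();
});

app.listen(3000);
```

On the client side, a consent-management script can make the same check by reading navigator.globalPrivacyControl before firing any tags that would sell or share personal information.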
Fran Faircloth: Yes, I’ve had a lot of clients with questions about that, given all the attention that’s been placed on things like targeted marketing lately, similar to what we saw, as you said, in Sephora, so that’s a piece I’m watching.
Speaker: Yes, absolutely. One of the most interesting pieces to me—and honestly something clients are not quite as focused on, because it’s easy to talk about a privacy notice, draft one, and post it on your website—is an area that was much discussed in the regulations and is going to be required: the so-called “principles of processing,” such as data minimization and purpose limitation. In brief: any processing conducted by a business—and by “processing” I really mean anything you’re doing with data, whether collecting, storing, disclosing, or otherwise using it—is supposed to be reasonably necessary and proportionate to achieve either the purpose for which the information was collected and that you’ve disclosed, or another disclosed purpose that is compatible with the context in which the information was collected. It’s interesting to think about how you implement a principle, as opposed to just posting a privacy notice—what I’m seeing is clients implementing it through policies, trainings, and similar approaches that get people to apply these principles in practice. The regulations also go further and say that any processing has to be consistent with the reasonable expectations of consumers, and that’s a difficult standard to interpret: how do you determine that?
Fortunately, the regulations do give us some examples for guidance—some of them are obvious. One involves a flashlight app: it would not be consistent with consumer expectations for a flashlight app to collect geolocation data. That example comes straight out of FTC jurisprudence—there was an FTC settlement involving exactly that scenario. Some of the examples, though, are less obvious and may be more difficult for clients to comply with. For example, the regulations say that consumers giving their personal information to use one product or service might not expect the business to use that information for another product or service—and that could have real-world implications, because businesses commonly do exactly that, particularly when developing new products. That emphasizes, to me, the importance of disclosures, which the regulations also mention as part of this consumer expectations test. It’s harder to say that consumers don’t expect something if you tell them about it, so making good disclosures that cover all the different ways you may use data is, frankly, increasingly important.
Fran Faircloth: It’s interesting that they pin this to consumer expectations. I’ve had a lot of clients ask about that, so I’m glad there are some examples. It’ll be interesting to see whether consumer expectations change over time—I would think they will as Gen Z and Millennials get older and become the vast majority of the consumer population at some point down the road. I think there’s going to be more of an expectation among those groups that their data is being shared, regardless of the app.
Speaker: Exactly—and who gets to decide what consumer expectations are? Is it some 60-year-old person who might be very surprised by certain data uses, or, exactly as you said, some Gen Z individual who might have a very different set of expectations?
Edward Machin: I’m particularly interested in the pre-rulemaking activity, because a lot of these concepts seem like extensions of GDPR requirements, such as the potential rules around automated decision-making. So, first: what is that, and what would the CPRA require?
Speaker: Absolutely, Edward—these are concepts that are embedded in the GDPR as well, particularly the requirements around automated decision-making. The CPRA instructs the California Privacy Protection Agency (the CPPA) to issue regulations governing access and opt-out rights with respect to businesses’ use of automated decision-making technologies, and it also instructs the agency to address what disclosures businesses should make regarding the logic involved in that decision-making. So, again, the CPPA is starting to look into what that means practically. It’s posing questions for comment on some basic things, like “What is automated decision-making?” It’s also asking about the other laws that currently apply in this space and how these regulations should interact with them, and, like many regulators, it’s looking into discrimination in the context of automated decision-making. I think we’ve all heard about the well-known example where Amazon used an algorithm to decide where to roll out an Amazon Prime service first, and the use of that algorithm unfortunately led to discrimination against certain districts that were left out of the initial rollout. So, the CPPA is starting to look into many of these issues and how it can issue regulations that help address them.
Edward Machin: Do you think the CPPA will look to how these types of rules and technologies are being regulated in Europe to inform its own approach to automated decision-making, or will it look to carve a new path?
Speaker: Absolutely—I think it’s looking at many different areas and approaches taken by other regulators. We see explicit references to GDPR approaches, for example, in the privacy impact assessment regulations it’s considering. It’s focused on what other regulators are doing in this area, and on the gaps and the space it needs to fill.
Edward Machin: What are the other areas that the CPPA is looking into?
Speaker: This is just a start. We’ve had the initial round of draft regulations that we just talked about, and now the agency is turning to these three pre-rulemaking issues, with more to come after that. Beyond automated decision-making, there are two other areas the CPPA is asking questions about, and both could have a quite significant practical impact on businesses whose processing is deemed to present a significant risk to consumers’ privacy or security. First, the CPRA requires the agency to consider regulations requiring those businesses—again, businesses conducting processing that creates significant risk—to undergo cybersecurity audits, and the CPPA is asking what laws are currently in place in this space and where the gaps are. Second, the CPPA is asking about risk assessments covering that same processing, assessments that businesses would be required to submit to the agency. Edward, to your question before, this sounds a lot like “data protection impact assessments,” to use the GDPR term, and it’s an area that other laws in the United States, including in Virginia and Colorado, have specifically addressed. So, again, the CPPA is asking what the other laws cover and what gaps it needs to fill, it’s asking a number of questions about the content and form these risk assessments should take, and it’s asking how to determine what processing constitutes a significant risk—and here, again, it’s looking specifically at the GDPR and whether to follow the GDPR approach.
Edward Machin: It sounds like companies that are subject to existing European laws will probably not be starting from scratch if they also are subject to these new and emerging U.S. state laws. Would that be right?
Speaker: Absolutely. “Not starting from scratch” is a good way of putting it, but of course there will be new wrinkles coming from these U.S. state laws. One interesting example: the GDPR approach is to look at specific processing activities, and where a specific processing activity creates risk to individuals, the business conducts a DPIA for that activity. The CPRA’s risk assessment is more holistic, at least as it’s phrased in the statute itself—a business would be required to submit an annual risk assessment covering its activities as a whole, rather than one for each specific processing activity.
Edward Machin: You may know that we like to wrap things up by discussing the best, the worst, or the weirdest things we’ve seen or heard about privacy in the last few weeks, so why don’t we start with you?
Speaker: Sure—we were just discussing automated decision-making, and I was actually reading an article yesterday about a triumph of man over machine. You may remember, in the early days of AI, stories about humans competing with computers in various games—most famously chess, where grandmaster Garry Kasparov was ultimately defeated by IBM’s Deep Blue. That was topped in 2016, when DeepMind’s AlphaGo software defeated South Korean Go champion Lee Se-dol, which was apparently notable because Go is a more intuitive game, I guess, than chess. Edward, I don’t know if you play Go?
Edward Machin: I play Go very badly.
Speaker: Okay, so maybe you’ll understand this better than I do, but it’s apparently more intuitive than chess, and so many thought that humans would hold the advantage over computers for some time. Apparently, we should not give up hope for humanity, because Kellin Pelrine, an American player one level below the top amateur ranking, recently defeated a top AI system in 14 out of 15 games. Pelrine was able to exploit a flaw in the AI—again, Edward, you might understand this better than me, but what he did was create a circle around some of the AI’s stones while distracting it with moves in the corner of the board. A human would likely have noticed this tactic, but because the AI was trained on simulations of past games and this wasn’t a common tactic, it couldn’t keep up—it couldn’t see the big picture. Which, I think, gets to one of the main points frequently discussed in the context of automated decision-making: the importance of human intervention, especially where automated decisions have legal or other similarly significant effects, because humans may be able to see the big picture in a way that AI can’t.
Fran, what about you—has there been anything that’s piqued your interest since the last podcast?
Fran Faircloth: Actually, just a few days ago, I saw a really interesting article about a house that’s being sold in Florida. In March, a contractor is planning to put his latest creation on the market—an 11,000-square-foot mansion with seven bedrooms and a pool in Pinecrest, Miami. You might wonder why we’re talking about that on The Data Day… it’s because purchasers of the house don’t just get the home—they also get an exact replica of it. The only catch is that the replica exists in the metaverse, and it also comes with a giant, lime-green gorilla that can climb buildings and, I guess, terrorize the town if you want it to, King Kong-style. The contractor paid $10,000 for the digital parcel of land where the replica is built, in an online world called The Sandbox, and he had an architecture firm that specializes in these kinds of virtual 3D properties build the metaverse replica. The listing price is expected to be around $10 million for both the actual house and the metaverse replica—and, I guess, the lime-green King Kong. This is pretty hard to believe, given that financial transactions in the metaverse are handled in crypto, and given the implosion of FTX and projections of a crypto winter. But real estate in the metaverse seems to be going strong—some people expect that market to grow by more than $5 billion by 2026.
In The Sandbox, where this virtual house is built—also one of the most popular metaverse worlds—much of the virtual land rush has been at the hands of big global corporations like Adidas and Atari, which have bought space to create entertainment venues, sell goods, launch virtual headquarters—those types of things. But it’s not just big corporations—we also heard that Snoop Dogg purchased parcels in The Sandbox and christened them “the Snoopverse,” and someone actually paid $450,000 just to be Snoop Dogg’s neighbor there, so it’s pretty big business. Last year, the total value of land in The Sandbox—which is all managed and sold through NFTs—was estimated at about $167 million, so it’s a pretty significant market out there.
Edward, what about you? What have you been reading recently?
Edward Machin: I’ve recently finished an excellent book called Chip War, which sadly isn’t about French fries or potato chips; rather, it traces the history of semiconductors and the hidden roles they play in all of our lives—or, at least, they were hidden to me. That’s particularly the case for the billions of devices that generate, use, and store personal data, so it’s a very topical book for listeners of this podcast. The book is Chip War by Chris Miller—a strong recommendation from me.
Fran Faircloth: Fantastic—I’ll definitely add that to my Amazon cart just as soon as we finish recording. Thanks to everyone who tuned in to this episode of The Data Day from Ropes & Gray. If you would like to join us for an episode, or you know somebody we should have on the show, please reach out to Edward or to me via email or LinkedIn. And if you enjoyed the show, you can subscribe and listen to this series wherever you regularly listen to podcasts, including on Apple and Spotify.
Edward Machin: Until next time, thanks for listening.