Are You Liable for the Data Shenanigans of Others? (Part 1 – A Brief Introduction to the Legal Framework)
Re-posted from intothecyberbreach.com, originally published on August 10, 2019.
Yes. The end. Ok, it’s not quite that cut and dried, but it is something of a scary proposition. I had initially envisioned discussing vendor management in the context of “controllers” and “processors”, when it occurred to me that a lot of people don’t really know what those terms mean, or even what the GDPR is and whether they need to worry about it. The actual answer is, of course, it depends.
The question came up recently for me in a conversation with a couple of attorneys who had gone to a data privacy event to figure out what they had to do themselves to become GDPR compliant. They were shocked to learn that as a “controller” of data, they were potentially liable for the actions, or inactions, of the “processor”. This is all Greek to the solo practitioner handling personal injury or family law cases who just wants to know whether Google Analytics is going to cause them to be fined by the European Union. But I think it is an opportunity to break down what some of these concepts mean, and to say something about vendor management under the GDPR in Part 2. I guess we’ve got our first “two-part series” on our hands.
Bear in mind, these are really broad strokes, and depending on your own situation, may be an oversimplification. As always, I recommend you retain counsel for the purpose of establishing and maintaining compliance with data privacy laws.
Before we jump right into the GDPR, it is helpful to start at the beginning. I am going to assume for starters that your business is located in the United States. It may seem like, in the privacy world, all anyone ever talks about is the GDPR and the CCPA. For the uninitiated, it is not even clear what those acronyms mean.
The GDPR stands for General Data Protection Regulation. It is a regulation adopted by the European Union to update its existing data privacy laws in recognition of changing technology and social norms that have put people’s personal information at risk.
The CCPA is the California Consumer Privacy Act, a state law enacted in California to ensure that California residents have a right to know what companies are doing with their personal information, and that companies collecting that data take all reasonable steps to act responsibly with the information they gather.
The reason data privacy conversations so often refer to E.U. and California law is that these are two of the strictest rulesets in the world regarding how to handle data collected from individuals. Further, because of the nature of the internet, the relevant query here isn’t necessarily where your business is located; it is where your business is reaching others. For instance, if you are a New York-based business but you have customers on your website from Germany, the GDPR can apply to you. The query is as much about the location of the consumer as it is about the location of the business. And in an interconnected world you have far less control over who your customers really are than you would in a brick-and-mortar operation.
Today, 48 of the 50 states in the U.S. have data privacy laws. And all 50 states have some form of consumer protection and tort system. Further, there are laws and regulations covering other contexts in which personal information can arise (for instance, the Health Insurance Portability and Accountability Act, i.e., HIPAA, or the Securities and Exchange Commission’s regulations about reporting financial information). I am going to put HIPAA and SEC regulations aside for now, to avoid muddying the waters. For the sake of context, if you are handling patient medical information, you need to be HIPAA compliant, which is a separate universe of rules, and if you are a publicly traded company, you need to follow SEC regulations. The majority of issues related to data breach in the SEC context have to do with making misleading public statements about the nature of the breach. If you are dealing with data about children, that’s a different set of rules as well.
Just as importantly, you have to be aware of your local state laws to see what anomalies may apply to you. That said, as a VERY general rule of thumb, i.e., not-legal-advice and not true in all cases, if you are in compliance with the GDPR and the CCPA, you are very likely in compliance with other states’ privacy laws. However, these laws do not apply to every business.
The CCPA is set to go into effect in January 2020, although there are rumors this will be extended by several months. The law is targeted at businesses with “annual gross revenues in excess of twenty-five million dollars ($25,000,000)”, or that “annually buys, receives for the business’ commercial purposes, sells, or shares for commercial purposes, alone or in combination, the personal information of 50,000 or more consumers, households, or devices”, or “derives 50 percent or more of its annual revenues from selling consumers’ personal information”. If you don’t meet any of those criteria, the CCPA does not apply to you. However, my advice would be that even if the CCPA does not apply, you should consider the feasibility of building CCPA compliance into your business process, for several reasons. First, other states are changing their privacy laws all the time and may adopt some or all of these measures in the near future. Second, it allows you to grow your business to fit the CCPA, rather than having to take remedial (pronounced: e-x-p-e-n-s-i-v-e) measures in the future. Third, the CCPA offers a set of “best practices” that are likely to keep you out of trouble in most state jurisdictions.
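If it helps to see those thresholds laid out plainly, here is a back-of-the-envelope sketch in Python of the applicability test as quoted above. It is a mnemonic, not legal advice: the statute and its exemptions control, not a three-line boolean, and the function name and inputs are mine.

```python
# Rough mnemonic for the CCPA applicability thresholds quoted above.
# The statute (and its exemptions) controls -- this is illustration only.
def ccpa_likely_applies(annual_gross_revenue: float,
                        consumer_records_per_year: int,
                        share_of_revenue_from_selling_pi: float) -> bool:
    return (
        annual_gross_revenue > 25_000_000            # "in excess of $25,000,000"
        or consumer_records_per_year >= 50_000       # personal info of 50,000+ consumers,
                                                     # households, or devices per year
        or share_of_revenue_from_selling_pi >= 0.50  # 50%+ of revenue from selling PI
    )

# Example: a small shop with $2M in revenue that tracks 60,000 devices a year
# still crosses a threshold.
print(ccpa_likely_applies(2_000_000, 60_000, 0.0))  # True
```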
The language of the CCPA also raises the interesting question of what a business is, but I hope to address that at some point in a future post. If you are unsure whether your outfit is a “business”, go talk to a lawyer. If you can afford to hire said lawyer, chances are good that what you are doing is a business.
The GDPR casts a far more ambitious net. First, dispense with the idea that the law does not apply to you because you are a U.S.-based business. That’s so 2017! The GDPR applies even to U.S.-based businesses that never set foot in the E.U., if they find themselves handling the “personal data” of E.U. citizens, or even of people located in the E.U. (cue puzzling questions about whether we’ll see a cottage industry of “data privacy tourism” for Americans who want to fly to France, eat their fill of cheese, and claim E.U.-style privacy rights before returning home).
How “personal data” is defined must be discussed before we can decide whether the GDPR applies, and here the boldness of the law really comes into focus. “Personal data” can be any information relating to an identified or identifiable natural person, including name, ID number, location, online identifiers, physical traits, or physiological, genetic, mental, economic, cultural or social data about that person. That also covers IP addresses, cookies, social media posts, contact lists, and mobile device data. It probably also includes dessert recipes and favorite colors. So… yeah, we are talking about nearly anything.
It is very hard to collect any information about your customers or website visitors without triggering the protections of the GDPR. The crazy thing here is that it is unclear what information future technologies will be able to tie back to an identifiable person, which could also be problematic. Is asking “how are you?” over the telephone a GDPR-triggering event? Maybe…
If you are still wondering whether the GDPR applies to you, I think we can distill it down a little further. Do you have a website? Does the website have any cookies? Does the website keep a log of the IP addresses visiting your site? Do you use a third-party service to contact your customers or track website visitors (like Google Analytics or MailChimp)? If your answers tend to be yes, then the GDPR is likely to apply. Now, if you have fewer than 250 employees, not only are you my target audience for this blog, but the GDPR recognizes that you are a smaller data risk than the big corporations of the world. The rules apply to you, but the requirements are somewhat different.
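To make that concrete, here is a minimal sketch of how even a bare-bones website ends up handling “personal data” as the GDPR defines it, before anyone has consciously decided to “collect” anything. It uses Python and the Flask micro-framework purely for illustration; the cookie name and log file are invented, but the same thing happens on whatever stack you actually run.

```python
# Illustrative only: a tiny site that already touches "personal data".
# Requires Flask (pip install flask).
import logging
from flask import Flask, request, make_response

app = Flask(__name__)
logging.basicConfig(filename="access.log", level=logging.INFO)

@app.route("/")
def home():
    # The visitor's IP address is an "online identifier" -- personal data
    # under the GDPR -- and here it is, landing in a log file.
    logging.info("visit from %s (user-agent: %s)",
                 request.remote_addr,
                 request.headers.get("User-Agent", "unknown"))

    # A long-lived cookie that can single out a returning visitor is
    # likewise personal data, even though it feels anonymous.
    resp = make_response("Welcome!")
    resp.set_cookie("visitor_id", "abc123", max_age=60 * 60 * 24 * 365)
    return resp

if __name__ == "__main__":
    app.run()
```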
I am going to have to write about what these laws actually require in a separate post (I will put a link here once I’ve done that). But that last question about third-party vendors is really the issue that I wanted to try to tackle in this series. What are your responsibilities when a company that you use to track your website traffic, or to manage your contact list, experiences a data breach of your data?
To answer that question, we have to understand and discuss the concepts laid out by the GDPR of the “data controller” (the people with the website) and the “data processor” (the people given third-party access to information about that website and its visitors). As you can see, this is a big topic, and you’ll have to wait for Part 2 to really dive in (or you can discover this post months later, and by then I hope to have a link to Part 2 right here on the page).
Stay Tuned!
What’s In Your Wallet?
Re-posted from intothecyberbreach.com, originally published on July 30, 2019.
Yesterday, Capital One announced a breathtaking breach of 100 million accounts within its system, thus compromising the private data of a significant percentage of Americans in one single incident. The scope of the breach is comparable to the Equifax breach in 2017, which Equifax had acknowledged affected 143 million Americans.
The question of “how can this keep happening?” should, by now, be replaced with “when is the next big one?” Is this even a “big one?” Breaches like the one announced by Capital One yesterday are the new normal.
From the consumer side, people who think their private information may have been breached can take a few steps toward solace. First, obviously, check your credit card statement and make sure there aren’t any goofy charges on there. Second, if you want to take it a step further, you can freeze your credit reports, which would prevent anyone from opening a new credit card account with your information. Third, change your passwords.
The issue of compromised passwords is all the more alarming when you consider that most people still use the same password on all of their accounts. So, when your password is finally compromised, it is essentially compromised everywhere. Here’s a hint: chances are good that by the time you find out about a breach, it’s way too late. The name of the game nowadays is detection, not prevention. This means there is some acknowledgement from the establishment that preventing breaches is a losing battle, and many security groups are re-focusing their attention on just making sure that the breaches that do occur actually get noticed.
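On the password point, if you are curious how exposed a reused password already is, the Pwned Passwords service run by haveibeenpwned.com publishes a “range” API that lets you check a password against known breach corpora without ever sending the password itself; only the first five characters of its SHA-1 hash leave your machine. A rough sketch in Python follows; the endpoint is real and publicly documented, but the helper function is mine and the code is illustrative rather than production-grade.

```python
# Check a password against known breach data via the Pwned Passwords
# "range" API (k-anonymity: only the first 5 hex chars of the SHA-1
# hash are ever sent). Illustrative sketch, not production code.
import hashlib
import urllib.request

def times_seen_in_breaches(password: str) -> int:
    sha1 = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    prefix, suffix = sha1[:5], sha1[5:]
    with urllib.request.urlopen(f"https://api.pwnedpasswords.com/range/{prefix}") as resp:
        body = resp.read().decode("utf-8")
    # The response is one "HASH_SUFFIX:COUNT" pair per line.
    for line in body.splitlines():
        candidate, _, count = line.partition(":")
        if candidate == suffix:
            return int(count)
    return 0

if __name__ == "__main__":
    hits = times_seen_in_breaches("password123")
    print(f"Seen in breaches {hits} times -- if that's your password everywhere, change it everywhere.")
```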
So, what does the Capital One breach tell us from the perspective of a data controller? See above. One takeaway here is that if Capital One, Equifax, Marriott, Yahoo!, Myspace (when was the last time I said those two in one sentence? 2003?), Under Armour, Uber, Target, Home Depot, and countless others have been unable to thwart 100% of all data breach attempts, what makes you think you can?
One common misconception on that theme is that it’s only the big boys being targeted. That couldn’t be further from the truth. According to the Verizon 2019 Data Breach Investigations Report, 43% of breaches involved small-business victims.
The takeaway here is that if you don’t already, you need to have a plan for what happens when it happens.
New York State Of Mind.
Re-posted from intothecyberbreach.com, originally published on July 29, 2019.
This last Thursday, July 25, 2019, lawmakers in New York enacted the cleverly named “Stop Hacks and Improve Electronic Data Security Act” (the SHIELD Act), Senate Bill 5575. While Nick Fury could not be reached for comment, I was able to cobble together some details from the new law…
Following the lead of many other states, the SHIELD Act updates New York’s data breach laws by expanding the definition of private information, expanding notification requirements, and requiring that individuals and businesses handling sensitive information implement “reasonable” data security measures. Perhaps most significantly, these requirements will now apply to any person or business that owns or licenses “private information” of a New York resident.
According to the Governor’s office in New York, “[t]his legislation imposes stronger obligations on businesses handling private data of customers, regarding security and proper notification of breaches by:
Broadening the scope of information covered under the notification law to include biometric information and email addresses with their corresponding passwords or security questions and answers;
Updating the notification requirements and procedures that companies and state entities must follow when there has been a breach of private information;
Extending the notification requirement to any person or entity with private information of a New York resident, not just those who conduct business in New York State;
Expanding the definition of a data breach to include unauthorized access to private information; and
Creating reasonable data security requirements tailored to the size of a business.
This bill will take effect 240 days after becoming law.” (Source: https://www.governor.ny.gov/news/governor-cuomo-signs-legislation-protecting-new-yorkers-against-data-security-breaches)
The new law does not expand the definition of private information to include passport number, employer ID number or financial transaction devices, all of which are included in California’s new privacy regime.
While New York’s previous data breach statute, passed in 2005, required notification only when private information had been acquired without authorization, the SHIELD Act now requires such notice whenever that data has been accessed without authorization. Not surprisingly, this significantly expands the number of incidents that will require breach notification. Notification must occur within “the most expedient time possible and without unreasonable delay”, unless it can be verified that the access was “inadvertent” and that it “will not likely result in misuse.”
The Act’s requirement for “reasonable” security measures is an interesting one. It states, “[a]ny person or business that owns or licenses computerized data which includes private information of a resident of New York shall develop, implement and maintain reasonable safeguards to protect the security, confidentiality and integrity of the private information…”. The Act even offers some examples of what “reasonable” could mean: employee training, regular risk assessment exercises, regular testing of key controls and procedures, and the disposal of private information when it is no longer needed. There is some risk here that, while the list is not meant to be exhaustive, a court could apply those examples rather rigidly, as de facto requirements. I’ll be following that issue once we see some guidance from the courts.
Notably, the SHIELD Act does not create a private right of action for an entity’s failure to comply with the law. While this may warrant a sigh of relief from companies within the technology space, we will have to continue to look out for The New York Privacy Act, which is under consideration by the New York State Senate at this time. The New York Privacy Act would indeed create such a private right of action. If passed, it would represent the most aggressive data protection policy in the United States, if not the world.
It Was Just A Mission Statement…
Re-posted from intothecyberbreach.com, originally published on July 28, 2019.
Just what the world needs. Another blog.
Let me start that over. What are we doing here?
This first post will be my mission statement, if you will. My statement of intentions.
So, who is this blog for?
It’s mainly directed at entrepreneurs, technologists, business owners, executives, in-house counsel, or really anyone trying to figure out: 1) How do I prevent the data in my possession from being compromised or stolen? 2) What do I need to do if it has been compromised? and 3) How can I protect myself and my company from liability in the event of a breach? I will be covering these things from the legal angle, but there will be actionable information relevant to your approach to technology as well.
And who am I?
I have a relatively unusual background for a lawyer. (Cue Liam Neeson’s explanation of my “particular set of skills.”) I started my adult life by dropping out of college in 2000 to go join the new technology revolution. Back then, you could get a job writing code just by reading a few books and having the gumption to ask for one.
I started my first tech job in Newark, NJ, at NJIT’s business incubator in the late 90s. My best friend was working at a tech startup there, writing software for one of the world’s first online travel booking engines. For those of you born in this century, what that means is that before this project, in order to book a vacation, people would either drive to a travel agent’s office or pick up the phone and book it over the telephone. My friends were changing that. And they were making way more money doing it than I was going to make digging ditches or painting fences.
So, in an effort to land what was intended merely to be a summer job, I showed up at their office and begged for work. The boss asked me, “What do you know how to do? Can you write code? Ever use SQL? Unix? Do you know any Perl?” “No, but I can learn really fast,” I said. He wasn’t impressed and ignored me for the rest of the day. They were too small and busy to have the sense to kick me out of their single-room incubator office.
There was an energy there that can only be found in a new startup, and I absorbed everything, like a sock in a puddle. I sat in their office reading coding manuals all day. There were very few websites that taught programming back then, but they existed, and I sought them out. I started with HTML and a little JavaScript, and it didn’t take long before I was piping into grep. (Don’t ask.)
I hung around for a few days, asking for a job each day, and having a sense, deep down, that if they just hired me, I’d be great. I read and learned and waited for the job that I knew I would get.
After those first few days, as I sat around in their office, some data entry task for a client’s website came up that no one wanted to do. It involved making it so that clicking on certain parts of an image brought you to different links (i.e., travel agent locations). I eagerly volunteered. It required almost no skill, just effort. I did it for free. It took me all day (in hindsight, probably a 20 minute job). At the end of the day, my new mentor said, “well, if you are going to work here I guess we’ll have to pay you.” I was in!
I dove in, learned as much as I could and was (in my mind) on track to make my first million before 21. I dropped out of college shortly thereafter to go full time. We were doing cutting edge stuff, and I was in the middle of it. I worked long hours, and it hardly ever felt like work. Our little company with a few people grew to 10. After hours, I wrote more code at night on my own time, eventually creating a task management system that utilized some of the prototypical aspects of social media, which I sold to our company in exchange for a stake in ownership. We were on our way!
*bubble pop*
Then it was gone, in a couple of years. It all happened so fast. I went to see my doctor for a checkup one day and my insurance had lapsed. A few months later, my paycheck bounced. I felt like the wind had been knocked out of me. My lease came up, and instead of renewing, I lived in my truck for a few weeks and began to re-group. To outside observers, re-grouping looks a lot like moping during the day and partying at night. It took me a long time to understand what had happened, and even longer to come to terms with it.
One thing led to another: I wrote code freelance in my living room for a number of years to get through college, and I made the decision to go to law school to pursue my original path from before my affair with the startup world. I loved law school, and I avoided anything tech like the plague. I think part of it was that it hurt too much. Besides, anytime I told a prospective internship about my tech experience, they asked me to work on their website, while the other interns were going to be doing policy research or watching oral arguments in court. I felt like I couldn’t escape. I stopped telling people that I knew how to write code, and I graduated law school to become a trial lawyer. That was 9 years ago, and the world has changed. People don’t need me to make them a website anymore; they need me to help them keep their data secure and stay out of trouble if they get breached.
You’ve probably already gotten one of those letters explaining that your private information has been compromised by a major retailer. You might have seen even more in the news. Companies that find themselves in the position of having been breached need someone who understands the technology, understands the rules governing breach responses, and can handle any litigation that may arise out of the breach. This isn’t just about big-box retailers anymore. In many states, anyone who handles private information (or has a third-party vendor that does so) could be liable for either mishandling that information or failing to report and notify in the event of a breach.
So, that’s what this blog is about. I am a seasoned litigator and business attorney in a mid-sized law firm with offices across the country, and am admitted in New York, California and New Jersey. I live in upstate New York. I have seen the inside of a server, and I have seen the inside of a courtroom. The law is changing fast, and almost all of the states now require a complex response in the event of a company having its private data accessed inappropriately (i.e., a data breach). Not surprisingly, I offer these services (as well as other more traditional litigation and corporate law representation). You can contact me if you find yourself needing counsel regarding a data breach. But, my hope is that this blog is useful to you whether you become my client or not.
That said, let me throw this disclaimer out there, because it really needs to be made clear (to protect us both): NOTHING IN THIS BLOG IS LEGAL ADVICE. UNLESS WE HAVE A RETAINER AGREEMENT, I AM NOT YOUR LAWYER. IF YOU ARE RESPONSIBLE FOR A COMPANY WHOSE PRIVATE DATA HAS BEEN BREACHED, YOU SHOULD CONTACT A LAWYER IMMEDIATELY IN ORDER TO COMPLY WITH THE NUMEROUS STATE AND INTERNATIONAL DATA BREACH NOTIFICATION REQUIREMENTS. There are real consequences to being breached and not complying with notification laws. There are also real consequences to over-notification (and we’ll talk about that here too). Ideally, this is something that you should work out ahead of time, so you have someone to help you right away. In some cases, you really have very little time, a matter of a few days, not weeks.
Anyway, I will be writing about emerging issues in the cybersecurity world, notable data breaches, and developments in the law, yes. But more importantly, I want this blog to provide actionable information, and I intend to do it in as human a fashion as possible. This isn’t a stuffy generic presentation of “what you need to know.” I’m going to write about what’s new in the cybersecurity world, but I might also write about why movies showing hackers hacking are mostly nonsense. I might also write about why Terminator is an amazing piece of art.
So read the blog. If you have questions, or just want to riff on these issues, get in touch. If you have complaints, keep those to yourself. Good luck navigating this crazy world!