Alex Urbelis on Legal Implications of Cyber Risk
Is cross-functional collaboration a myth? Anyone who has ever participated in a boardroom meeting in which legal, compliance, and security leadership were present knows of the ongoing tension between these functions.
At nearly every organization, general counsel and heads of compliance view digital risks and security-related policies from a completely different perspective than technical leadership such as the chief information security officer (CISO) or chief security officer.
When competing priorities reach a standstill, whose guidance should inform a company’s security policies?
SecureDisruptions content director Jeremy Seth Davis spoke recently with hacker-turned-attorney Alexander Urbelis, a partner at Blackstone Law Group, to explore this dilemma. His background in the intelligence and defense industries informs his approach to cyber and legal issues.
He served as the National Football League’s chief information security officer and previously as chief compliance officer at Richemont, so he has plenty of experience on both sides of the table.
Alex created a domain name threat intelligence platform, OMNI, which detects cybersecurity threats and IP infringement. Last year, he identified state-sponsored intrusion attempts on the World Health Organization.
His unique perspectives prompted a lively discussion of cybersecurity and regulatory harmonization, approaching security as an enterprise-wide risk, and his counter-intuitive thoughts on over-compliance.
This transcript has been edited slightly for length and clarity.
Alex, how do you think senior executives can take a leadership role in managing the diverse cyber threats that we face?
Senior leadership must ensure that cybersecurity and the role of a CISO are priorities for the organization. Many organizations now need a form of cybersecurity that can be qualified—demonstrable cybersecurity. And that's driven in large part by legal and compliance requirements depending upon the industry, be it health care, finance, insurance, defense, or some other form of government contracting or, really, any organization that touches on personally identifiable information (PII).
Another driving force for cybersecurity that is affecting executives and is really worth mentioning is M&A and due diligence. Investors or any kind of acquiring entity want to see that the business into which they're putting their money has protected its intellectual property and trade secrets from misappropriation. And these parties also want to make sure that their money is not going to go out the window to pay a regulatory fine or settle a class action for a data breach or misuse of PII. Another driving force worth mentioning for better cybersecurity these days is cyber liability insurance, whereby, as a condition of coverage, the insured must maintain a certain baseline of cybersecurity readiness. Otherwise, the insured could find themselves in the tricky position of having insurance, but no coverage for an incident. Cyber liability insurance in this manner can be a real force for good when it comes to evolving the baseline cybersecurity postures of larger enterprises.
We're seeing the insurance industry driving a lot of developments in cybersecurity. The industry has certainly evolved and is much more sophisticated. What do you see as the main aspects to look for when assessing a company's cybersecurity insurance policies?
It depends on the industry and the size of the enterprise. These policies are underwritten much more rigorously than they were even five years ago. There are now different types of data that insurance companies are going to request from you—perhaps even on an annual basis. You definitely want to read the exceptions. You want to know those exceptions extraordinarily well, and you want to engage in a thought experiment about how each exception would or would not apply. So, you really need to think about these things. You need to work with good counsel. It's not something that you should approach individually. These are complicated and difficult legal issues, and a lot of times they're reliant on case law. These days there's a lot of cyber liability insurance-related case law. Those legal precedents can affect whether or not an insured is going to be in the unfortunate and strange position of having insurance with no coverage, should a major incident affect the enterprise. So, in short, it's complicated.
Cybersecurity is very much on the minds of regulators globally and there's a greater focus on cybersecurity and data privacy than ever before. What are the main concerns that you think about in terms of the legal and regulatory implications of cybersecurity threats?
That's an expansive question and I'll give you an expansive answer. The legal and regulatory risks associated with what I believe is now the practice of cybersecurity will never decrease. That's something we need to come to terms with. Things might get easier in the future because there may be more uniformity of regulatory and legal requirements, but I don't foresee a period where legal and regulatory burdens are lightened. Now, it's simple to throw out a whole bunch of acronyms like GDPR, CCPA, COPPA, HIPAA, GLBA and tell stories about how the Office for Civil Rights, the Federal Trade Commission, or an ICO in a European country brought the hammer down on a particular company for having bad cybersecurity and privacy practices.
There's always a risk of litigation, or that a legislature may create new private rights of action for cybersecurity violations that could give rise to additional lawsuits. Although this seems daunting, it's possible to work with competent counsel, enterprise leadership, and the CISO to come up with a framework that makes sense in terms of risk versus operational goals. You always want to have a risk value that is lower than the value of the organization's operational goals. That's the equation we're always striving for, and that's how you want to build your cybersecurity program. So, if you're a global company trying to comply with every single local law around the globe, it's nearly impossible and you run the risk of overcompensating with over-compliance. People think about under-compliance, but over-compliance is equally bad—it tends to negatively affect the business side of things. Going about cybersecurity or privacy regulation in this way could, quite frankly, lose you some birthday invitations, because advocating for overly complicated, burdensome compliance and security protocols is probably not the best way to win friends and influence people in the workplace.
Also, legal and compliance requirements should not be viewed as comprehensively prescriptive. They don't tell you exactly what to do. They provide certain frameworks, baselines, and principles that should guide a cybersecurity program. But there's a huge amount of filling in of specific gaps. Cybersecurity professionals and CISOs must be collaborating with privacy engineering teams and their legal departments to do that. One reason for this is that the law, and the common law in particular, can be looked at as providing proscriptive advice—that is, advice about what not to do. Because, when matters end up in court, things have been going wrong for quite a while, and a well-reasoned judicial opinion about where things went wrong can be a very good starting place. Figuring out what not to do is almost as important these days as knowing what to do.
Also, legislators and lawyers misapprehend or do not understand the effect of their language on how a particular product, industry, or platform must be specifically engineered, or how those requirements may affect something very basic, like user experience on a platform. This is because, to a great extent, legislators—most of whom have a legal background, for better or worse—and lawyers think about things differently than engineers. Legal minds are trained to live and thrive in grey areas and have no problem with ambiguities like "protected by reasonable security measures" or "deleted within a reasonable period of time." To an engineer, on the other hand, these terms sound like somebody scraping their fingernails down a chalkboard because they are far too ambiguous to design a deterministic system around. Engineers are, in a sense, the opposite of lawyers. They don't want ambiguity. They don't want breathing room built into a standard or designed into a system. What an engineer wants is to limit randomness and to produce the same output time and time again from an initial state or starting condition. That's why the practice of privacy engineering is so interesting these days and so critical to many organizations.
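The gap Urbelis describes between legal ambiguity and engineering determinism can be made concrete. As a purely illustrative sketch (the retention period, function name, and field names are my own assumptions, not anything from the interview), here is how a privacy engineering team might translate "deleted within a reasonable period of time" into a deterministic rule:

```python
from datetime import datetime, timedelta, timezone
from typing import Optional

# Hypothetical: an engineer cannot implement "a reasonable period of time,"
# so legal and engineering agree on a concrete parameter (here, 90 days).
RETENTION_PERIOD = timedelta(days=90)

def is_due_for_deletion(created_at: datetime, now: Optional[datetime] = None) -> bool:
    """Deterministic check: identical inputs always yield the same answer."""
    now = now or datetime.now(timezone.utc)
    return now - created_at >= RETENTION_PERIOD
```

The point is the translation step itself: the regulation supplies the principle, and the privacy engineering collaboration supplies the concrete, testable parameter.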
The other really important point to note is that regulatory fines associated with bad cybersecurity and privacy practices don't just appear out of nowhere. They're not like those incredibly annoying surprise camera fines that we get from momentarily crossing into the bus lane in New York.
There's always some form of process or due process. This is our tradition of justice and fairness in this country. So, there has to be some form of process and an opportunity to be heard before fines are imposed. That's why, again, it's important to work with competent lawyers: first, to design cybersecurity programs in compliance with applicable laws and regulations, and second, to mount a defense if the proverbial fecal matter does "hit the fan" and you face regulatory action that could lead to a class action lawsuit.
That leads to another consideration: understanding how cyber and legal risk overlap is crucial to avoiding blind spots. The domain name system (DNS) is a good case in point, because cybersecurity professionals may see the DNS as a legal problem with cybersecurity implications; on the other hand, the legal department may see the DNS as a cybersecurity problem with legal implications.
If both functions share the view that policing the domain name system is not really their problem, you've got a dangerous blind spot that will be used as an attack vector against your organization or against third parties with which your organization contracts. Collaborating and creating a symbiotic relationship between the legal department and the cybersecurity function is absolutely critical to getting a handle on the legal and regulatory risks associated with a cybersecurity posture. You need that type of relationship when things go wrong.
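To make the DNS blind spot tangible, here is a minimal, hypothetical sketch of the kind of lookalike-domain monitoring that closes the "someone else's problem" gap—generating simple typosquat candidates for a brand domain so that either legal or security can review suspicious registrations. This is my own illustrative code, not an implementation of OMNI or any real platform:

```python
# Hypothetical lookalike-domain generator (illustrative only).
# Common typosquat tricks: homoglyph substitutions and dropped characters.
HOMOGLYPHS = {"o": ["0"], "l": ["1"], "i": ["1"], "e": ["3"]}

def lookalike_candidates(domain: str) -> set:
    """Return a set of simple typosquat variants of a domain."""
    name, _, tld = domain.partition(".")
    candidates = set()
    # Homoglyph swaps, e.g. example.com -> examp1e.com
    for i, ch in enumerate(name):
        for sub in HOMOGLYPHS.get(ch, []):
            candidates.add(name[:i] + sub + name[i + 1:] + "." + tld)
    # Dropped-character typos, e.g. example.com -> exmple.com
    for i in range(len(name)):
        candidates.add(name[:i] + name[i + 1:] + "." + tld)
    candidates.discard(domain)  # never flag the legitimate domain itself
    return candidates
```

In practice, a list like this would be checked against new DNS registrations; the sketch only shows why the monitoring belongs to *someone*, rather than falling between the legal and security stools.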
What do you think are the important things that every manager should know about cybersecurity?
If I were to impart one piece of knowledge to non-technical managers, it would be that we don't live on the set of a CSI episode. Cybersecurity can be imprecise. It can be messy, and even though we're dealing with systems, logic, and networks, there can still be a great deal of ambiguity. And there will be grey areas, especially when it comes to things like identifying the root causes of an incident, or when an incident may have actually started. Finding that data can be very tricky, and attributing an incident to a nation-state, an APT group—an advanced persistent threat—or a rogue band of cyber criminals may not always be possible and, in many instances, may not matter. We need to come to grips with the fact that, no matter how much money, manpower, or C-level and board attention is given to cybersecurity, there will always be incidents, and there will always be more to learn and more to do. It's these fundamental truths that make this industry so appealing and dynamic.