TechPulse took over the second floor of the Saint Paul RiverCenter last Wednesday, April 19th, 2017. The theme was “A Mature Look at the Cloud,” but a more apt one would have been “Security. No, Really. Security.”
"There is one person in every organization who will click on anything." #TechPulse
— Riley Major (@RileyMajor) April 19, 2017
A recurring theme was that the weakest link in any security chain is often the user. In the opening keynote, Doug Splinter reminded us that “there is one person in every organization who will click on anything.” In a breakout session, Evan Francen, a professional social engineer, explained that “it’s easier to go through your secretary than it is your firewall.” Emily Duke relayed stories in her session of email phishing attacks leading to companies wiring millions to overseas fraudsters. And during Mark Lanterman’s lunch keynote, he recounted checking in at the largest security conference in the country and finding an unattended registration desk laptop with the username and obvious password attached with a sticker. He also echoed that “it’s easier for me to hack you than hack your technology.”
"It's easier for me to hack you than hack your technology." #TechPulse
— Riley Major (@RileyMajor) April 19, 2017
As difficult as users make the lives of security professionals, the response should not be to frustrate those users. Most breaches occur because of well-intentioned people just trying to do their jobs. And you’ve never seen such ingenuity as that employed to circumvent burdensome corporate IT restrictions. You’ll never convince users to come in from the dangerous ocean and swim in your wading pool. You have to provide a compelling alternative to the risks they will otherwise take.
All security risks are ultimately business risks, and the business should define how much risk is acceptable. Ideally, those security goals would be defined in an official security policy which describes what information is sensitive, what constitutes acceptable use of the information (at what times, by which users, and through what interfaces), who has responsibility for protections and enforcement, and what security responses should entail.
When considering those risks, Emily Duke encourages businesses to realize that it’s not a question of “if” but “when” they will be hacked. She lists the direct consequences of those attacks, such as loss of access to data (ransomware) or resources (denial of service attacks); deletion, corruption, or even subtle alteration of data; damage to systems or equipment; or even harm to people. She also warns of the consequential damages: fines from regulatory agencies (FTC, SEC), legal action from customers (breach of contract), and class action lawsuits from consumers. And of course, win or lose, those cases bring hundreds of thousands of dollars of legal bills.
On the market for insurance against cyber attacks: "it's the Wild West right now." #TechPulse
— Riley Major (@RileyMajor) April 19, 2017
While you can insure against some of these risks, insurers aren’t going to pay the ransom to get your data back, and they can’t compensate for the hit your reputation will take. The underwriting process is often lengthy and laborious, and the costs will increase depending on your security practices. So insurance is no substitute for doing the work to mitigate your risk the old-fashioned way: by making bad things less likely to happen.
Cyber attacks are all about money, and attackers will do a cost/benefit analysis of their targets. Security is all about making yourself more expensive to compromise than someone else. And defense in depth matters. We have to end the mentality that all you need are perimeter defenses. Breaches begin with beachheads; attackers compromise one system and then weasel their way into another, escalating their privileges at each step. A compromised user’s machine will act up, and a malicious keylogger will capture the credentials of the help desk tech (with elevated access) who logs in to investigate. Typically attackers gain a foothold in days and then spend months surveilling and creeping. The more difficult you make each step, the more costly the overall endeavor. And the more monitoring you have, the more likely you are to catch malicious activity early in the process.
Ultimately, if economic forces don’t drive adoption of better security practices, legislation might. Uncle Bob Martin prophesied that if software developers don’t establish quality standards themselves, those standards will be forced on them by governments. That’s already happening with security. Doug Splinter explained how California’s Attorney General has established the CIS Controls (formerly the SANS Top 20) as the baseline for “reasonable” security. And of course for some organizations, there is already HIPAA, Sarbanes–Oxley, and a patchwork of other regulations.
These rules might seem hypocritical, as governments are often prime offenders. “Do as I say, not as I do.” Consider the examples from Mark Lanterman’s keynote, like the deployment of wind turbines from a manufacturer with such lax security practices that by default they are publicly accessible on the Internet with no credentials required (unless you want to install new firmware, which requires the password in the supplied documentation). Or one of the hundreds of cities that deploy water pumps with no protection from random Internet passersby turning off water supplies. Pointing to the government’s failures, though, will serve as no defense of these obviously bad practices.
Surprisingly, despite all of the doom and gloom, the TechPulse presentations were short on specific, practical, technical advice. It’s probably beyond the scope of the high-level keynotes, but even the breakout sessions gave it only cursory coverage. There was one common refrain, though: use two-factor authentication. And not just for logins, but for human activity, too. Get a request from a senior executive for a copy of everyone’s W2? Talk to them and confirm.
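For the curious, the login side of that advice is straightforward to implement. Here is a minimal sketch of time-based one-time passwords (the TOTP codes used by most authenticator apps), written against RFC 6238 and RFC 4226; the function names and the drift window are my own illustrative choices, not anything presented at the conference.

```python
# Sketch of RFC 6238 TOTP verification using only the standard library.
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, for_time=None, step=30, digits=6):
    """Derive a time-based one-time password from a shared base32 secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((time.time() if for_time is None else for_time) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation (RFC 4226, section 5.3)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def verify(secret_b32, submitted, window=1, step=30):
    """Accept codes from adjacent time steps to tolerate clock drift."""
    now = time.time()
    return any(hmac.compare_digest(totp(secret_b32, now + i * step), submitted)
               for i in range(-window, window + 1))
```

Note the use of `hmac.compare_digest` for the comparison, which avoids leaking timing information; a real deployment would also need to remember used codes so each one is accepted only once.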
As Mark Lanterman exhorted: “just slow down”.