How ‘lowest bidder’ budgeting, open source intelligence, a few manuals and a curious mind can turn out the lights. Chris Kubecka, Woman Hacker of The Year 2020, tells Martin Cooper MBCS some of what she knows about the art of the criminally possible.

‘I grew up with a lot of family in the US government,’ explains Chris Kubecka, CEO of HypaSec. ‘NSA, Voice of America... CIA... but I was mainly raised in a NASA family. There was lots of NASA. I grew up wanting to be an astronaut.’

This, she reflects, is one of the reasons why an early line on her CV shows she helped America’s Space Command protect its telemetry data from cyber attacks. Space Command, for the record, is the US military body responsible for operations in outer space (that is 62 miles above sea level, apparently). It shouldn’t be confused with Space Force.

The hacking begins - aged ten

The second event that led to a career in cybersecurity happened when she was ten years old. ‘I got busted for breaking into the Department of Justice’s systems,’ she says.

‘I didn’t have the emotional awareness to understand what I was doing - but then, how many ten year olds do? When you’re dealing with systems that don’t have a password, or it’s “1,2,3,4” - it doesn’t seem, well, real.’

Today, of course, cybersecurity is very real for Kubecka and much of her time is spent advising, consulting, speaking and lecturing on just how real. Much of her work focuses on attacking and defending physical and critical infrastructure: gas, electricity, water, nuclear.

‘Now I’ve grown up, all this stuff - cybercrime - seems like it shouldn’t be real,’ she says. ‘But it is. And I’m still seeing the things I did when I was ten years old, only now at major companies.’

Check out YouTube and you’ll find an array of Kubecka’s presentations to hacker conferences. She also advises governments and governmental departments who have three letter acronyms for names. As such, she often deploys what could be the most dangerous yet infectious laugh in cybersecurity: a laugh which says, ‘I’ve signed lots of secrecy documents so I can’t say any more... you’ll have to imagine the rest.’ You find yourself laughing too and then thinking: this is real, it could actually happen.

Not a Tom Clancy novel

So, what are the risks our nuclear, gas, electrical and water supplies face? What are the clear and present dangers which nationally critical infrastructure needs to defend against?

‘It can start as simply as getting into your electricity smart meter. They weren’t built with either privacy or security in mind and they can be used as a surveillance device,’ she says.

With access to your smart meter, a prospective burglar could, for example, work out when you’re at home or, through observing your power consumption patterns, when you’re most likely to be away. And this could all be done without the need for sitting outside your house.
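The occupancy inference she describes needs nothing more than coarse consumption data. As a minimal illustrative sketch (the readings and idle threshold below are entirely hypothetical):

```python
# Illustrative sketch: inferring likely away-hours from hourly smart
# meter readings. The baseline threshold and data are hypothetical.

def likely_away_hours(readings, baseline_kwh=0.1):
    """Return the hours whose consumption stays at or below an idle
    baseline - i.e. the times nobody is likely to be home."""
    return [hour for hour, kwh in readings if kwh <= baseline_kwh]

# A hypothetical day of readings: (hour of day, kWh used in that hour)
day = [(7, 0.9), (8, 0.6), (9, 0.05), (12, 0.04), (17, 0.08), (19, 1.2)]
print(likely_away_hours(day))  # → [9, 12, 17]
```

A real attacker would smooth over fridges and standby loads, but the principle is the same: the pattern alone betrays the household’s routine.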

‘Elsewhere, there are issues with physical security,’ she explains. ‘There might be a substation, particularly in the US... America is a big country. The substation might be in the middle of nowhere - there are no cameras - and power can be diverted from it. We saw a case around six years ago where a particular US agency thought that a foreign entity was able to remotely [access] a small segment of California’s electric grid.’ The entity, she says, was able to reroute power around California.

Flared trousers and flawed security

Elsewhere, she says, the curious-minded can discover a great deal about how America’s electricity grid is controlled. Sections of it rely on a protocol called Modbus. Designed in the late ‘70s, Modbus became the de facto standard communication protocol for controlling industrial devices. The problem is, Modbus devices will take a command - written in hexadecimal - and execute it without question.

Again, the curious-minded can easily find lists of Modbus commands online. Which leads neatly to an excellent research tip for anyone with an enquiring mind: read the service or admin manuals for devices you’re interested in. They’re largely available online and are a likely trove of valuable information.
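Modbus’s trusting design is easy to see at the wire level. The sketch below builds a standard Modbus/TCP ‘Read Holding Registers’ (function 0x03) request frame from the published specification; note that nowhere in the frame is there any field for credentials or authentication - whoever can deliver the bytes gets the data.

```python
import struct

def modbus_read_request(transaction_id, unit_id, start_addr, count):
    """Build a Modbus/TCP 'Read Holding Registers' (function 0x03)
    request. All fields are big-endian, per the Modbus spec; there is
    no authentication field anywhere in the frame."""
    # MBAP header: transaction id, protocol id (0), remaining length, unit id
    mbap = struct.pack(">HHHB", transaction_id, 0x0000, 6, unit_id)
    # PDU: function code, starting register address, register count
    pdu = struct.pack(">BHH", 0x03, start_addr, count)
    return mbap + pdu

frame = modbus_read_request(1, 0x11, 0x006B, 3)
print(frame.hex())  # → 0001000000061103006b0003
```

Any device listening on TCP port 502 will simply parse those twelve bytes and answer - which is exactly the ‘execute it without question’ problem described above.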

Script kiddies and nation states

Of course, those ‘enquiring minds’ and ‘entities’ can be state sponsored hackers and there have been real cases where power grids have been shut down. The most infamous was a 2015 attack, which saw three energy companies in Ukraine taken down, along with electricity supplies to homes and businesses. In this case, a piece of malware called BlackEnergy was deployed, which targeted industrial control systems (ICS) and supervisory control and data acquisition (SCADA) infrastructure. Elsewhere, similar attacks have happened in Israel and in different parts of America.

Digging deeper into the mechanics of ICS and SCADA, Kubecka says: ‘You can think of a control system as a system that has been programmed for automation. In a control system environment, we’re talking about operators - they have all this automation - but they have to make small changes to ensure everything is interoperable, everything works and that there are safety protocols in place. Think about your automatic espresso machine at home, it’s almost the same as a nuclear power plant!’

Automation, Kubecka explains, is far from a new idea. ‘We’ve had it for a long time. The first Space Shuttle launched and returned; it was piloted through re-entry by an automatic piloting system... That’s because the re-entry is kind of rough.’

Security vs engineering

So, what are the common flaws in control systems? Why can such potentially critical pieces of equipment be subverted?

‘When you build a nuclear power plant, you’re expecting it to function for maybe upwards of fifty years,’ she says. ‘And, by the time it’s specified and built, there are going to be bits of kit that are out of date - it takes years to build these things.’

When the power station is built, commissioned and connected to the internet, some of its critical industry control hardware will need to be patched.

‘The problem is,’ she asks, ‘what do you do if everything else is working? Do you take the risk? Do you change [software] and risk other things stopping working? This is a big risk and, typically, a lot of designers don’t understand that if you mix a control system - which is intended to last for [10-30] years - with an IT system intended to last for three to five years, the combination is just not going to work when it comes to security.’

Building out the idea that IT systems and control systems aren’t easy and natural bedfellows, Kubecka explains that technology systems rely on the Transmission Control Protocol (TCP) - one of the sets of rules underpinning the internet’s suite of protocols. Critically, TCP provides reliability, ordering and error checking. And it is error detection, correction and control which is often missing in control systems.

‘Industrial control systems typically don’t have error control,’ she says. ‘That’s because they do just one job; they don’t have much computing power - and, on some control networks, there’s not a huge amount of encryption. That’s because the time involved in encrypting a command, sending it, decrypting it, making sure everything is correct and carrying it out - that takes so long, it can delay things in certain safety systems.’

Beyond network level technicalities there’s another reason why IT security and engineering don’t always work well together: the difference in priorities. IT security, at its heart, has three core and critical concerns: confidentiality, integrity and availability. These form the three constituents of the information security triad and they run in that priority order. Information must be confidential, protected from damage and available to the right people at the right time.

In control system engineering, this model (and its embedded order of priority) is flipped. The most important factor in a managed control system is that the devices are always available to make changes.

‘Do you really want availability to come last when you need heat in your house and it’s freezing cold?’ Kubecka asks.

Summing up, she says: ‘If you don’t think about privacy and security in the beginning and try and bolt it on later, it’s going to be more expensive and it’s not going to fit properly. It’s like building foundations: you need to get those right.’

Back to the future

Critical energy systems are, of course, going through a period of rapid and essential transformation. Driven by climate change, countries are scrambling to decarbonise and are turning their backs on coal, oil and gas. Travel around the UK and you’ll see wind farms and solar farms springing up. Are these newer pieces of infrastructure - because they’re born of the internet age - inherently more secure than their smoky and blackened precursors?

The answer, Kubecka says, is ‘no.’

‘Two things keep me awake about it,’ she explains. ‘Some wind turbines have default logins and passwords set to admin, admin. I did an exercise years ago for GCHQ to look at the [wind turbines] in the UK. I think 82% or 83% of the UK’s wind turbines had admin, admin. GCHQ wasn’t happy about that.

‘The other thing is, we all want to go green - we want to get ourselves off coal. But few people think that solar farms are open to the internet. Panels [automatically] track the sun and this can be changed so there’s no electricity.’

Remote access for all

Beyond turning off an electrical system, there are also more dangerous attacks which are theoretically possible. ‘You can try to make something operate outside of its normal range. This was shown on a very big diesel generator. Remotely, a team of people were able to change the operating parameters and they basically blew it up. Do something to solar panel systems and the fires are more difficult to put out. There are remote exploits where you can set old HP printers on fire. Imagine what you could do with a wind turbine?’

How is all this possible? Again, the manual might well hold many valuable clues. Firstly, Kubecka says, ‘find out who manufactures the hardware and see if they have published documentation. Many do - and publish it straight to the internet, too.

‘Next, there are tools such as censys.io which let researchers locate particular pieces of hardware on the internet, all while remaining beyond the reach of local computer misuse legislation.

‘To move a research project forward, you’ll need some key identifiers about the hardware you’re looking at. This might be the manufacturer’s name or some key phrases harvested from that service manual. The PDF might reveal protocols or key parameters - things you’re likely to see on the hardware’s management page. You can use censys.io to search its scraped database for these key phrases. This might reveal global locations, which can then be refined down to more local geographies.’
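The workflow she describes can be sketched as a small helper that turns phrases harvested from a manual into search clauses. The field name and banner phrases below are hypothetical placeholders for illustration, not a confirmed censys.io schema:

```python
# Hedged sketch: composing Censys-style search queries from identifiers
# harvested from a device's service manual. The field name and the
# example phrases are hypothetical placeholders.

def build_queries(identifiers, field="services.http.response.body"):
    """Turn banner phrases from a manual into quoted search clauses."""
    return [f'{field}: "{phrase}"' for phrase in identifiers]

# Hypothetical phrases lifted from a vendor PDF
phrases = ["ACME TurbineControl v2", "Modbus gateway status"]
for query in build_queries(phrases):
    print(query)
```

The point is the method, not the syntax: strings that only ever appear on one vendor’s management page make precise fingerprints, and the searching itself happens against an already-scraped database rather than the target’s own systems.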

A solution to the problem

One partial solution to this problem is, of course, good password hygiene and banishing default passwords. That, Kubecka explains, is a good opening gambit when it comes to protecting internet-facing infrastructure - but it’s only an opener.

Another big problem is that some devices have both an internal and external IP address. Think about your internet-enabled lightbulb: it will be given an address by your broadband router and insulated (to some degree) from the internet by the routing device’s firewall. Externally, however, the bulb’s manufacturer might maintain a means of communicating directly with the bulb. They might use it for hardware updates, analytics or for extra features. The point is: there’s a backdoor visible to the internet that you don’t know about. And this self-same problem occurs with much bigger and more critical pieces of equipment.

‘Let’s say you buy a new power plant. It’s brand-spanking new and just like a new car, there’s a support contract that comes with it,’ Kubecka reveals. ‘After year three, maybe you need to buy a new contract. But, because everything is digital, the manufacturer will sometimes say: “To maintain the support, we need to watch the system working and make sure there are no abnormalities. You have to do this or we’ll void your support contract. We’ll do this cheaply and we’ll turn on remote capabilities and maybe interact with the device. We won’t use encryption, we’ll use an old, insecure remote control management system and all our clients will have the same username and password to make it easy for our engineers.”’ Essentially, some manufacturers punch a hole through their clients’ security.

These problems, she says, can be traced back to procurement and commercial drivers which dictate that the cheapest option is, more often than not, the best.

‘I think it was the astronaut John Glenn,’ she says, returning to NASA. ‘A journalist asked him: “How do you feel in the capsule listening to the countdown?” He said: “As you would, sat on a million parts - all built by the lowest bidder on a government contract.”’

A final summary

It’s hard to sum up all of Kubecka’s research and advice in a sentence - it needs a book or two. But there are recurring themes and takeaways. She uses the word ‘real’ in a very particular way: cybercrime isn’t limited to the digital world. It’s impacting the real world and it will cost lives. And the point where this realness can be - and is being - felt is in electricity generation and distribution.

Open source intelligence forms a critical part of her research arsenal. And, as such, it follows that organisations should know what they are contributing to the freely available pool of information about themselves and their products.

Closing up, she says: ‘You can’t protect what you don’t know you have. It might be those devices with a public and private IP address, or that you have a tendency to recycle passwords, or that you’ve got an old email account sat out there and it uses the same password as a bunch of your other accounts... We can find that old account using open source intelligence or by checking haveibeenpwned.com.

In the infrastructure world, this might mean an orphan system somewhere. The bigger and older the organisation, the more likely it is that there will be orphan systems. Knowing what you have is critically important.’
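Checking an old password against haveibeenpwned.com need not expose the password itself: the Pwned Passwords range API uses a k-anonymity scheme where you hash locally, send only the first five hex characters of the SHA-1 digest, and compare the returned suffixes on your own machine. A minimal sketch of the client side:

```python
import hashlib

def hibp_range_query(password):
    """Prepare a k-anonymity lookup for the Pwned Passwords range API.
    Only the first five hex characters of the SHA-1 hash ever leave
    your machine; the matching suffix is checked locally against the
    API's response."""
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    prefix, suffix = digest[:5], digest[5:]
    url = f"https://api.pwnedpasswords.com/range/{prefix}"
    return url, suffix

url, suffix = hibp_range_query("password")
print(url)  # → https://api.pwnedpasswords.com/range/5BAA6
```

Fetching that URL (not shown here) returns a list of hash suffixes with breach counts; if `suffix` appears in it, the password is compromised.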

‘Curiosity,’ Kubecka says as she leaves, ‘is what gets you into the field and then you decide whether you want to do good or bad.’

About the author

Chris Kubecka is the founder and CEO of HypaSec. She was previously Group Leader for AOC, tasked with setting up digital security after the 2012 Shamoon attacks - the world’s most devastating cyber warfare attack so far. Currently the Distinguished Chair of the Middle East Institute’s Cyber Program, Chris is also a member of the Cyber Senate.

She is an advisor and subject matter expert to several governments and industries on cyber security and incident response for cyber warfare. Her digital security expertise is recognised in the financial, oil and gas, water and nuclear industries.