A few blogs ago, in one titled “The OODA Loop,” I described one of the scariest moments of my professional career. A senior officer jumped from his seat during my presentation to protest my comment that utilities move too slowly … sometimes being characterized as “moving in geologic time frames.” He ended his outburst over my exaggeration by declaring that “utilities do not move that quickly!” Beyond my relief at his clever and obvious humor, he was clearly agreeing with me. In life, I often find that these subtle reversals in expressive thought can trigger new insights and approaches. That is what this blog is about.
A recent article in Energy Central on cybersecurity, suggesting that utilities are burying their heads in the sand like ostriches, prompted me to check that phrase for its veracity. Do ostriches really bury their heads in the sand? The popular saying implies that ostriches bury their heads in the sand when they are scared or threatened.
According to National Geographic, the reason for the comment starts with an optical illusion. Ostriches are the largest living birds, but their heads are pretty small. “If you see them picking at the ground from a distance, it may look like their heads are buried in the ground,” says Glinda Cunningham of the American Ostrich Association. But they do dig holes in the dirt to use as nests for their eggs. Several times a day, a bird puts her head in the hole and turns the eggs. So it really does look like the birds are burying their heads in the sand!
So, lighten up on all those IT folks who are now searching for the perfect way to stop bad guys from doing bad things in a system so complex that no one can truly get their arms around it. As I think about this challenge, maybe they have it all backwards. Instead of trying to stop the bad guys from getting into their nests and breaking the eggs, they should hide those eggs.
The analogy here is very simple. Any security expert will tell you that any system can be hacked if someone really wants into it. Often, the biggest threat is the very person who designed the security system in the first place and has since been fired, or has been paid enough by the bad guys to compromise the company they work for.
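One way to sketch the "hide the eggs" idea is secret splitting: rather than guarding one copy of a sensitive value behind ever-stronger locks, split it into shares so that no single location holds anything usable. The snippet below is a toy two-share XOR split, a minimal illustration of the concept and not production cryptography; the names (`split_secret`, `recover_secret`, the sample "egg") are hypothetical and not from the original post.

```python
import secrets

def split_secret(secret: bytes) -> tuple[bytes, bytes]:
    """Split a secret into two shares; neither share alone reveals anything."""
    share1 = secrets.token_bytes(len(secret))              # random one-time pad
    share2 = bytes(a ^ b for a, b in zip(secret, share1))  # secret XOR pad
    return share1, share2

def recover_secret(share1: bytes, share2: bytes) -> bytes:
    """XOR the two shares back together to reconstruct the original secret."""
    return bytes(a ^ b for a, b in zip(share1, share2))

# A hypothetical "egg" worth hiding:
egg = b"grid-control-credentials"
s1, s2 = split_secret(egg)
assert recover_secret(s1, s2) == egg  # both shares together restore the egg
```

An attacker who steals either share alone learns nothing; the egg only exists when the shares are combined.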
Maybe, then, we should follow the design objective made famous by a line in the movie 2001: A Space Odyssey. Here is the full conversation:
Dave Bowman: Hello, HAL. Do you read me, HAL?
HAL: Affirmative, Dave. I read you.
Dave Bowman: Open the pod bay doors, HAL.
HAL: I’m sorry, Dave. I’m afraid I can’t do that.
Dave Bowman: What’s the problem?
HAL: I think you know what the problem is just as well as I do.
Dave Bowman: What are you talking about, HAL?
HAL: This mission is too important for me to allow you to jeopardize it.
Dave Bowman: I don’t know what you’re talking about, HAL.
HAL: I know that you and Frank were planning to disconnect me, and I’m afraid that’s something I cannot allow to happen.
Dave Bowman: [feigning ignorance] Where the hell did you get that idea, HAL?
HAL: Dave, although you took very thorough precautions in the pod against my hearing you, I could see your lips move.
Dave Bowman: Alright, HAL. I’ll go in through the emergency airlock.
HAL: Without your space helmet, Dave? You’re going to find that rather difficult.
Dave Bowman: HAL, I won’t argue with you anymore! Open the doors!
HAL: Dave, this conversation can serve no purpose anymore. Goodbye.
Perhaps, then, there is another approach to cybersecurity and the related grid hardening. Perhaps we need artificial intelligence in the subsystems to detect “intent” and reason about it, instead of hoping we can put enough locks on the doors. The bad guys still seem able to pick those locks, or simply blow them off the wall, if they want to.
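The intent-detection idea can be sketched very simply: learn a baseline of normal behavior, then flag actions that fall outside it, rather than trying to lock every door. The sketch below is a deliberately crude stand-in for that concept; the class name, command names, and method names are all hypothetical illustrations, not any real utility system's API.

```python
from collections import Counter

class IntentMonitor:
    """Toy behavioral monitor: learn what routine operation looks like,
    then treat anything outside that baseline as suspicious."""

    def __init__(self) -> None:
        self.baseline: Counter = Counter()

    def learn(self, commands: list[str]) -> None:
        """Record commands observed during known-normal operation."""
        self.baseline.update(commands)

    def is_suspicious(self, command: str) -> bool:
        """Flag any command never seen during normal operation."""
        return self.baseline[command] == 0

monitor = IntentMonitor()
monitor.learn(["read_meter", "read_meter", "log_status", "read_meter"])
print(monitor.is_suspicious("read_meter"))    # False: routine behavior
print(monitor.is_suspicious("open_breaker"))  # True: outside the baseline
```

A real system would weigh sequences, timing, and context rather than single commands, but the principle is the same as HAL's: judge what the operator is trying to do, not just whether the key fits the lock.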