On Complexity & Bureaucracy vs Security…

“Things have always been done this way.” —> Doesn’t mean they will be done that way in the future, or even that this is a good way.

“We know we need to change, but we can’t find the person who can authorize the changes we need.” —> Then who will punish you for the change? Even if punishment comes, you still win, as you’ll know who can authorize the change in the future.

“We don’t have enough time, money or skills to support those controls, even though we agree they are necessary.” —> Have you communicated this to upper management? If not, why not? How high have you gone? Go higher. Try harder.

“That’s too fast for our organization, we can’t adapt that quickly.” —> Welcome to the data age. Attackers are moving faster than ever before. You’d better adapt, or your lack of speed WILL get exploited.

Among many of my clients, complexity and bureaucracy have become self-reinforcing regimes. They lean on them as a way of life. They build even more complexity around them and then prop that up with layers and layers of bureaucracy. Every change, every control, every security enhancement, even changes to make existing tools rational and effective, is met with an intense mechanism of paperwork, meetings, “socialization” and bureaucratic approvals.

While many organizations cite “change management” and “security maturity” as being at the core of these processes, the truth, more often than not, is complexity for the sake of bureaucracy. Here’s the sad part: attackers don’t face these issues. They have a direct value proposition: steal more, get better at stealing and make more money. The loop is fast and tight. It is self-correcting, rapid and efficient.

So, go ahead and hold that meeting. Fill out that paperwork. Force your technical security people into more and more bureaucracy. Build on complexity. Feed the beast.

Just know, that out there in the world, the bad guys don’t have the same constraints.

I’m not against change controls, responsibility or accountability at all. However, what I see more and more of today are those principles gone wild. Feedback loops taken to the extreme. Layers and layers of mechanisms for “no”. All of that complexity and bureaucracy comes at a cost. I fear that in the future, even more so than today, that cost will be even more damage to our data-centric systems and processes. The bad guys know how to be agile. They WILL use that agility to their advantage. Mark my words…

Three Ways to Help Your Security Team Succeed

Over the years, I have watched several infosec teams grow from inception to maturity. I have worked with managers, board members and front-line first responders to help them succeed. During that time, I have keyed in on three items that really mean the difference between success and failure when it comes to growing a team’s capability, maturity and effectiveness. Those three items are:

  • Cooperative relationships with business units – groups that succeed form cooperative, consultative relationships with the lines of business, other groups of stakeholders and the management team. Failing teams create political infighting, rivalry and backstabbing. The other stakeholders have to be able to trust and communicate with the infosec team in order for the security team to gain wisdom, leverage and effective proactive traction to reform security postures. If the other teams can’t trust the security folks, then they won’t include them in planning, won’t enforce anything beyond the absolute minimum requirements and won’t offer them a seat at the table when it comes time to plan and execute new endeavors. Successful teams operate as partners to the entire business, while failing teams play the role of either the “net cop” or the heavy-handed bad guy – helping neither themselves, their users nor the business at large.
  • Embracing security automation and simplification – groups that succeed automate as much of the heavy lifting as possible. They continually optimize processes and reduce complex tasks to simplified ones with methodologies, written checklists or other forms of easy-to-use quality management techniques. Where they can, they replace human tasks with scripting, code, systems or shared responsibility. Failing teams burn out their members. They engage in sloppy processes and tedious workflows, use the phrase “we’ve always done it this way” quite a bit, and throw human talent and attention at problems that simple hardware and software investments could eliminate or simplify. If you have someone “reading the logs”, for example, after a few days they are likely getting less and less effective by the moment. Automate the heavy lifting and let your team members work on the output, hunt for the bad guys or do the more fun stuff of information security. Fail to do this and your team will perish under turnover, malaise and a lack of effectiveness. Failing teams find themselves on the chopping block when the business bottom line calls for reform.
  • Mentoring and peer-to-peer rotation – groups that succeed pay deep attention to skills development and work hard to avoid burnout. They have team members engage in mentoring, not just with other security team members, but with other lines of business, stakeholder groups and management. They act as both mentors and mentees. They also rotate highly complex or tedious tasks among the team members and promote cross-training and group problem solving over time. This allows for continuous knowledge transfer, fresh eyes on the problems and ongoing organic problem reduction. When innovation and mentoring are rewarded, people rise to the occasion. Failing groups don’t do any of this. Instead, they tend to lock people to tasks, especially pushing the unsexy tasks to the low person on the totem pole. This causes animosity, a general loss of knowledge transfer and a seriously bad working environment. Failing teams look like security silos with little cross-training or cooperative initiative. This creates a difficult situation for the entire team and reduces the overall effectiveness of the organization at large.
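To make the automation point concrete, here is a minimal sketch of replacing a human "reading the logs" with a script that surfaces only the events worth a look. The log format, regex, file source and alert threshold are all illustrative assumptions, not any particular product's schema; adapt them to your own environment.

```python
import re
from collections import Counter

# Illustrative pattern for sshd failed logins in a syslog-style log line;
# adjust the regex for whatever your systems actually emit.
FAILED_LOGIN = re.compile(r"Failed password for (?:invalid user )?(\S+) from (\S+)")
THRESHOLD = 5  # assumed alert threshold: flag a source IP above this many failures

def failed_login_sources(lines):
    """Count failed-login attempts per source IP across an iterable of log lines."""
    counts = Counter()
    for line in lines:
        m = FAILED_LOGIN.search(line)
        if m:
            counts[m.group(2)] += 1
    return counts

def alerts(lines, threshold=THRESHOLD):
    """Return only the source IPs that cross the alert threshold."""
    return {ip: n for ip, n in failed_login_sources(lines).items() if n > threshold}

if __name__ == "__main__":
    sample = ["sshd[1]: Failed password for root from 10.0.0.9 port 22"] * 7
    print(alerts(sample))  # {'10.0.0.9': 7}
```

A human then reviews the handful of flagged sources instead of scrolling through every line, which is exactly the "work on the output" division of labor described above.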

Where does your team fit into the picture? Are you working hard on these three key items, or have they yet to be addressed? How might you bring them into play on your security team? Give us a shout on Twitter (@microsolved or @lbhuston) and let us know about your successes or failures.

Thanks for reading, and until next time, stay safe out there! 

3 Tough Questions with Dan Houser

I recently spent some time discussing certifications, training, the future of the information security community and the “hacker conference” scene with Dan Houser. While I don’t agree with some of his views, especially about how hackers play a role in our community, I think his viewpoints are interesting and worth a discussion. I also think his keen attention to sexism in our community is both timely and important for us to resolve. Here are my 3 Tough Questions for Dan.


A Short Biography of Mr. Houser: Dan Houser (@SecWonk) is Security & Identity Architect for a global healthcare company, with 20+ years experience creating security, cryptography and eBusiness solutions. He is a frequent speaker at regional and international security conferences, a Distinguished Toastmaster, published author, and serves on the (ISC)2 Board of Directors. Dan is passionate about professional development, teaching, motorcycles, Safe and Secure Online, advancing the role of women in Information Security, ethics, certification, and, most of all, his family.

 

Question #1: I know you are involved in a lot of professional organizations focused not only on providing continuing education for Information Security Professionals, but also on teaching information security skills to adults and children in the community. When Information Security Professionals come to training courses and seminars, we see they have a wide range of skills, various areas of interest and different levels of technical capability. Why do you think information security has so many problems with level-setting knowledge? Is it simply because there is such a large body of information that must be encompassed in order to be an effective security person? Or could it be the high rate of change present in the industry, or even a particular personality trait common to information security practitioners? Why is it so hard to build an Information Security Professional?

 

Mr. Houser: There are many reasons why it’s hard to build an Information Security Professional (and there are some great clues in the awesome book “The Great Influenza” by John M. Barry – this book is definitely worth a read!). In essence, we are building a new profession from the ground up, and 50% of the job titles you now see in information security (infosec) didn’t even exist 30 years ago. For example, my own job title didn’t exist 15 years ago: Sr. Security & Identity Architect.

We can look to modern medicine as a parallel; its scientific era began roughly 100 years ago. Although medicine has been practiced since someone first noticed that bear grease on a wound seemed to help in healing, it’s only in the recent past that science was diligently applied to the practice of medicine. Law enforcement started experiencing the same thing when a scientific study of policing reversed a 4,000-year-old belief that patrolling was an effective deterrent to crime. The study showed that this practice in fact had zero impact on crime prevention. Although I hope it won’t take us 4,000 years to really move forward, we have to anticipate that there are a number of changes in our field that universities and corporations are finding difficult to track. One lesson we can learn from medicine is the advent of the “nurse practitioner”: a medical professional who has nearly the same skill in general medicine as a full M.D., but who only requires about half the investment in schooling.

At this point, the information security industry does not have an undergraduate program (at least not one I’m familiar with) that can turn out graduates who are ready to jump right into InfoSec at a meaningful level. We also lack a journeyman/apprenticeship program in the profession. By studying our craft scientifically, encouraging professionalism, and understanding “what it is that makes a great Information Security Professional”, we will be able to determine the root studies necessary for competency, and get to train people on “the right thing”.

We have to discard the notion that there is a single path to information security. We have to stop teaching InfoSec Professionals from curricula created to churn out developers, and understand the complete spectrum of pathways that lead to true information security. We need to understand what is valuable (and what is not).

I have made an impassioned plea (and continue to do so) for an investment in scientific study of the information security profession; in particular, to understand the root causes behind the lack of women in the field. Are they not finding the same on-ramps as men? Are they taking an off-ramp due to sexism, lack of opportunity, lack of fulfillment? We have no clue as an industry. We have some solid data showing Science, Technology, Engineering and Math (STEM) issues with gender split, and that STEM isn’t engaging and keeping women in associated disciplines. But that doesn’t necessarily mean that it is the root cause in the information security industry; we just pretend to believe it is so, just as police practiced patrolling and doctors used bloodletting because “everyone knows it helps”.

Our profession is at the same point as breast-cancer research (note: not being crass, I lost my Mom to cancer). We are focusing so much on walks, runs, screening and exams that we have COMPLETELY lost sight of the fact that 18,000 women in the US die each year from breast cancer, and we have NO CLUE WHY. Frankly, that ticks me off. We must focus on understanding the cause before we can make any reasonable statements about a cure.

Through an actual scientific study of the development of the Information Security Professional – and I’m talking about studies by actual PhD sociologists and psychologists, not geeks in InfoSec – we can learn the actual on-ramps and off-ramps in our profession: what creates a strong InfoSec Professional, why women don’t enter or quickly leave the InfoSec profession, and how to start repairing the actual problems with the industry instead of fighting only symptoms. That will usher in a new age for creating Information Security professionals, and truly achieve gender equity in our field.

 

Question #2: As you look to the future of information security, what do you see as the long term role of certifying bodies such as ISC2, ISACA, etc.? What about future roles of educational organizations such as OWASP, ISSA and the like?

 

Mr. Houser: I think that the future is bright for these organizations because we have a continued need for differentiating professionals from pretenders, and certification is the only mechanism I can currently see that allows us to know that an individual has attained a base level of competency in a stated area of expertise. According to Frost & Sullivan statistics, the profession is going to nearly double in the next decade, which will create TREMENDOUS market pressures. We must find InfoSec professionals somewhere, and we must have mechanisms in place that allow us to determine whether or not they have the requisite skills. I see no other viable means of determining that cross-market other than certification.

Additionally, Security and Audit professional certification authorities like (ISC)2, ASIS and ISACA provide a code of ethics that governs the membership. And that’s inherently quite valuable: to know that my peers have not only met an independent standard for competency and knowledge, but are also held to an ethical code of conduct for their behavior. With the profession doubling in the next decade, we’re going to have a lot of people entering from other professions, and certifications will grow in importance. (ISC)2, ASIS and ISACA further promote professionalism through local chapter representation, which is another key way to tie together the complete package.

Educational organizations that provide solid educational experiences, such as ISSA, OWASP and Infragard, can also provide vital networking and educational programs in communities to broaden the reach of the InfoSec community. I’d also add that there are some non-traditional avenues that should be considered — such as LockSport/TOOOL, Make and Meetup IT communities who often fill in the edges of our BoK with valuable insights.

 

Question #3: What role do “not a conference” events like BSides, DerbyCon and NotaCon play in advancing Information Security?

Mr. Houser: Our profession is grappling with the changing nature of information use, and with the exceptionally difficult challenge of protecting intellectual property from an increasingly advanced foe in the face of mobile, big data, cloud and internationalization. One challenge we have as an industry is understanding the role that non-traditional knowledge plays in moving the profession forward. There is great excitement in the industry around less-formal means of sharing information, such as DefCon, BSides, NotaCon, DerbyCon – all great stuff. Certainly, there is substantial value we gain from meeting in different ways to share knowledge with each other. What we must be cognizant of is that these should become further pathways for intellectual pursuit, not forces that hold us back – that we don’t lose sight, amid the “not-a-conference” up-the-establishment ribaldry, of the fact that we are a serious profession with serious problems, and deserve to be taken seriously. That doesn’t mean we can’t have fun, but we have to be careful that we aren’t sending the message that any rank amateur can do the work of a security professional.

Sure, there is a lot of talent in the hacker community, just like there are uber-thieves. However, at some point, the FBI agent who hangs out with organized crime becomes part of the problem, can no longer be differentiated from the criminals, and has shredded their image and reputation. Greyhat is dangerous in what it can do to your reputation and the professionalism we’ve fought very hard to achieve over the past 25 years. There is also the reputation you absorb from associating with amateurs – sure, it’s refreshing and great to feel the passion from those who do it for the love, but the unguided amateur sends the wrong message about the profession. If anyone can do it, even with the huge scarcity of Information Security folks right now, then why should they pay you a professional rate when they can get an amateur for $12 an hour?

The other big issue I see from greyhat conferences is that many provide glorification and validation of hacking, which I think is freaking stupid – this is like arming terrorists.  By glorifying hackers, you’re recruiting for them and filling their ranks with talented people that are then going to fight against you.  How stupid is that?!?!?  Hackers are roaches that should be squashed, not bred to make them stronger.  So, InfoSec professionals are advised to study from afar, and not wallow in the grey/black hat mentality.  What I see in some of the “not a conference” tracks is that the response to a hacker zero-day has undergone a subtle but important transition, from “Wow, that’s stunning”, to “Wow, you’re awesome”, to “What you do is awesome”… which is a whisker from “please hack more”.  By treating hackers like rock stars, you encourage their craft.  That’s nothing less than arming your enemy.  Even if you aren’t cheering, does your presence validate?  Lie down with dogs, get up with fleas.  Careful, colleagues, you’re playing with fire, and we all may get burned.

 

Thanks to Dan for sharing his time with us and thanks to you for reading. I look forward to doing more 3 Tough Questions articles, and if there are people in the community you think we should be talking to, point them out to me on Twitter (@lbhuston) or in the comments.

3 Tough Questions with Chris Jager

Recently, I got to spend some time interviewing Chris Jager via email on industrial control systems security. He didn’t pull any punches and neither did I. Here are 3 Tough Questions between myself (@lbhuston) and Chris.


A Short Biography of Chris Jager (@chrisjager): I have over 15 years of experience in Information Technology and have focused on the practical application of security principles throughout my career. Most recently, I was director of the NESCO Tactical Analysis Center at EnergySec, a non-profit organization formed to facilitate information sharing, situational awareness, and education outreach to the energy sector. I am active in a number of information security workgroups and have provided operational, architectural, and regulatory compliance guidance to large and small organizations in both the public and private sectors, focusing on the energy sector exclusively since 2006.


Brent: You have spent a lot of time working on Industrial Control Systems (ICS) in your career. During that time, you have been witness to the explosion of interest in IT security as a profession. Why should some of the younger folks thinking about information security as a career consider a focus on ICS and SCADA? Why should they care?

Mr. Jager: This is a fantastic question and, if I frame my response correctly, the answer will hopefully be self-evident to your readers.

ICS and SCADA are terms that are seldom understood and often misused by information security (infosec) publications. SCADA systems typically manage geographically dispersed areas and often consist of numerous functionally disparate processes.

However, because of the immense variety of different processes that can be managed by industrial control systems, ICS has become somewhat of a catchall term – including SCADA systems. For example, you’ll often find electric power generation processes such as turbine control, burner management, vibration monitoring and more lumped into the mix. Each of these processes has discrete or locally distributed control and instrumentation systems, any of which can cause catastrophic safety, reliability, and financial issues if misused.

For me, the challenge of protecting these kinds of systems is far more interesting than making sure that little Bobby can’t drop the student records table in a classroom database. Much of the actual management technology is the same as what is used in general IT, but the application is very different. Things get a little more exotic (and arcane) when you go further down the stack into digital-to-analog conversion, but it’s not overly difficult for most folks to understand once exposed to it. The negative impacts of misuse aren’t limited to convenience and financial loss. Risk to life and limb is a very real possibility in many processes that are managed by industrial control system automation that is being run out of specification.

Typically, industrial control systems are deployed in step with the physical equipment they are designed to manage. The physical equipment is often orders of magnitude more expensive than the ICS components that ship with it and may be designed for lifespans measured in decades. In short, upgrades seldom occur as they need to be engineered and tested for functionality, safety, and a myriad of other issues pertaining to the existing physical equipment.

This has led to a situation where the groups that understand control systems and processes are naturally (and often generationally) gapped from those groups who understand the current threat and vulnerability landscapes. Consequently, there are currently very few individuals that understand industrial control system security as it relates to the changing threat picture. If the challenge of doing something very few dare to try doesn’t sound good on its own, this is the sound of opportunity knocking. Answer the door!

I’d like to make one last point on this question. Take a look around your house or apartment and count the number of internet-enabled devices you have. Most people these days have far fewer traditional computers than embedded systems – devices that aren’t user-serviceable without breaking a warranty or two. And the hacking skills necessary to modify such devices to fit use cases unintended by the manufacturers seem to come naturally to the younger folk of today. Those skills are also relatively portable to the ICS/SCADA world where embedded systems are the norm. Sure, some of the protocols and hardware packages are somewhat different, but they are all relatively simple compared to what folks are tinkering with at their coffee tables. We can always use more QA/breakers – particularly at key points in the supply chain where issues can be spotted and fixed before they become permanently vulnerable installations. Again I say, “knock knock”!

 

Brent: You talk a lot about how challenging ICS/SCADA security is. Do you think protecting ICS/SCADA systems in a meaningful way is an attainable goal? Assuming yes, do you think it could be done during what’s left of our careers? Why or Why not?

Mr. Jager: If I didn’t think it was an attainable goal, I’d not be doing the kind of work I’ve done over the past number of years. There are much easier ways to make a buck than to have people who are entrenched in the old way of doing things actively work to prevent you from even introducing discussions about change – let alone actually implementing it!

There is momentum in this area, but much work still needs to be done. Devices still ship from manufacturers with easily discerned hardcoded administration credentials, firmware updates are accepted without challenge and more. Once deployed in the field, user passwords seldom change, vulnerabilities discovered post-installation go unmitigated, and so on.

Because we have all this noise around basic security failures and their associated issues, we don’t yet know what constitutes “meaningful” or “attainable” when we speak of complex industrial control systems. A prime example here is that the electric sector is still using the exact same set of controls and asset scoping for its regulated security standards as when I first started working in the sector in 2006. NERC CIP version 1 was in final draft form, and the current requirements catalog will remain largely unchanged until at least 2015 when and if version 5 becomes effective. There have been minor changes in the interim, but not one that comes remotely close to addressing change in the threat landscape.

Will we ever have a perfect system? No. We do, however, urgently need to stop being complacent about the subject and implement those security measures that we can.

 

Brent: If you had your own ICS system, let’s say you ran Chris’s power company, what would that look like? How would it be protected?

Mr. Jager: It would look very, very “dumb”. Until such time as ICS and other automation technologies are vetted by process engineers – and I’m talking about the entire ICS/automation stack – I would automate only where it was impossible to operate the business or process without it.

It seems to me that we have a major employment problem in this country and no clear path to resolution. Putting some of these people to work securing our industrial control systems is an area where the private sector can help get the country back to work without relying on government-funded stimulus packages. An added bonus is that we’ll end up with a whole cadre of workers who have been exposed to the industry, a percentage of whom will stay in the field and help to address the industry’s gray-out problem. All it takes is one or two sizable impacts from automation failure or misuse for the cost savings seen through automation to be wiped out.

Where I had no choice but to automate, Chris’ Power Company would look very much like any power company out there today, unfortunately. There simply aren’t enough vendors and manufacturers out there presently that produce secure equipment. Even then, systems integrators often further weaken the environment by adding support accounts and other remotely accessible backdoors to these systems.

Be it in the energy sector or any other, process automation installations will inevitably mature to a state of persistent vulnerability due to their long lifespans. Vulnerability discovery and exploitation techniques advance over time, vulnerabilities are introduced through regression bugs elsewhere in the software or protocol stack, or the process itself may have changed to a point where a previously innocuous vulnerability now has the ability to introduce a large impact if exploited.

Eventually, pointing out that the emperor has no clothes becomes a career limiting move – particularly when said emperor is an exhibitionist! Instead, the focus should be on identifying the more sensitive individuals in the crowd and protecting them appropriately through sound risk identification principles. We can’t make the problems go away through risk management, but we can use the techniques to identify the things that matter most and, where we can’t mitigate the risk, implement monitoring and response controls. This sort of approach also helps prioritize future efforts and dollars.

The top security controls at Chris’ Power Company would center around monitoring and response as employees would be trained to assume the environment was in a persistent state of compromise. In the environment we live in today where threats are real and expressed, and vulnerabilities aren’t able to be universally mitigated, the only real chance at controlling risk you have is to manage the impact of a successful attack. You only get that chance if you are able to detect and respond before the attack balloons to the maximum impact value.

If you failed to give my company that chance, you wouldn’t be working at Chris’ Power Company!


Thanks to Chris Jager for his insights and passion about ICS security. We appreciate his willingness to spend time with us. Thanks, as always, to you the reader, for your attention. Until next time, stay safe out there!

Evolution, Maturity and Rethinking the Problems…

I have been following a number of attacker trends and I see a potential point of convergence just over the horizon.

Most especially, I think that an intersection is likely to occur between bot development/virtual machines/rootkits and man-in-the-browser. My guess is that a hybrid juggernaut of these technologies is likely to emerge as an eventual all-in-one attack platform.

These technologies are already individually present in many attack platforms. There are already plenty of examples of bot/rootkit integration. We know that man-in-the-browser has already been combined with rootkit technologies to make it more insidious and more powerful. If we add things like the installation of illicit virtual machines, evil hypervisors and other emerging threats to the mix, the outcome is a pretty interesting crime/cyber-war tool.

If all of these capabilities were to come together in a single super tool, many organizations would quickly learn that their existing defenses and detection mechanisms are not up to the challenge. Rootkit detection, egress traffic analysis, honeypot deployments and a high level of awareness are only just beginning to be adopted, and many infosec teams lack the budgets, maturity and technical skills needed to get beyond the reactive patch/scan/patch cycle.
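Egress traffic analysis, at its simplest, just means knowing what your hosts normally talk to and noticing when that changes. Here is a toy sketch of that idea; the flow records are assumed to be plain (source host, destination IP) pairs rather than any particular flow-export format, and the host and IP names are made up for illustration:

```python
from collections import defaultdict

# Toy egress analysis: learn each internal host's usual outbound
# destinations from historical flows, then flag connections to
# never-before-seen destinations for human review.

def build_baseline(flows):
    """Map each source host to the set of destinations it normally reaches."""
    baseline = defaultdict(set)
    for src, dst in flows:
        baseline[src].add(dst)
    return baseline

def novel_egress(flows, baseline):
    """Return flows whose destination is new for that source host."""
    return [(s, d) for s, d in flows if d not in baseline.get(s, set())]

if __name__ == "__main__":
    history = [("web01", "10.1.1.5"), ("web01", "10.1.1.6")]
    today = [("web01", "10.1.1.5"), ("web01", "203.0.113.77")]
    # Only the never-before-seen destination is flagged.
    print(novel_egress(today, build_baseline(history)))
```

A real deployment would add whitelisting, baseline aging and volume thresholds, but even this crude "new destination" check catches the beaconing behavior that bots and man-in-the-browser implants depend on.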

Vendors are already picking up on these new hybrid threats, much like they did with worms – by offering their products wrapped with new marketing buzzwords and hype. We have heard everything from IPS to NAC and hardened browsers (that mysteriously resemble Lynx) to special network crypto widgets that provide mysterious checksums of web transactions with other users of the special widgets… I don’t think any of these things are going to really solve the problems that are coming, though some might be interesting as point solutions or defense-in-depth components. My guess is that more than a few of the currently hyped vendor solutions are likely to be practically useless in the near future.

The real problem is this: security team maturity needs to be addressed quickly. Attackers are nearing another evolutionary leap in their capabilities (just as worms were a leap, bots were a leap, etc.), and we are still having issues dealing with the current levels of threats. It is becoming increasingly clear that we need infosec folks to start thinking differently about the problems, learn more about their adversaries and embrace a new pragmatic approach to defending data, systems and networks.

Maybe we need less whiz bang technology and more Sun Tzu?