Don’t Blame Target’s Audit Committee For The Sins Of Technology Management

Yesterday, Institutional Shareholder Services (ISS), a third-party advisor to Target Corp. investors, recommended ousting Target’s Audit Committee because it failed to perform appropriate risk management, resulting in a breach of customer data. According to Twin Cities Business Magazine, ISS stated that “… in light of the company’s significant exposure to customer credit card information and online retailing, these committees should have been aware of, and more closely monitoring, the possibility of theft of sensitive information, especially since it involves shoppers and the communities in which the company operates, as well as the overall impact on brand reputation and brand value.” This recommendation suggests a fundamental lack of understanding of both the nature of the breach and who should be held responsible for the outcome.

First, let's understand what really happened here: Target updated its point-of-sale (POS) systems before the holiday season. There was a known vulnerability in those POS systems that let credit card data travel between the POS system and the register unencrypted, before being encrypted and sent off to the clearinghouse for approval. Target's technology team was warned of the vulnerability and DECIDED that the risk was worth accepting – not the board, not the auditors; it was the people involved in the project who accepted the risk of losing 70 million records. When departments accept that level of risk, they, in essence, end the conversation. The audit committee and board of directors would be none the wiser. When was the last time you notified your board about how you were disposing of hard drives?

Never, right?
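To make the technical gap concrete, here is a minimal sketch (in Python, and emphatically not Target's actual architecture) of the kind of control the project team declined: point-to-point encryption that scrambles card data at the moment of capture, so plaintext never sits in the register's memory where scraping malware can harvest it. The reader function is a hypothetical stand-in; the encryption uses the real `cryptography` package.

```python
# Illustrative sketch only: encrypt card track data at the moment of
# capture so plaintext never crosses the reader-to-register link.
# Requires `pip install cryptography`; read_track_data() is a
# hypothetical stand-in for the card reader hardware.
from cryptography.fernet import Fernet

# In a real point-to-point encryption (P2PE) design, the key lives in
# tamper-resistant hardware inside the read head, never on the register.
KEY = Fernet.generate_key()
cipher = Fernet(KEY)

def read_track_data() -> bytes:
    """Stand-in for the card reader; returns raw magnetic-stripe data."""
    return b"4111111111111111=2512..."  # dummy card number, not real

def capture_and_encrypt() -> bytes:
    # Encrypt inside the reader, before the data ever reaches the
    # register's memory or the store network.
    return cipher.encrypt(read_track_data())

ciphertext = capture_and_encrypt()
# The register and store network only ever see ciphertext; only the
# payment processor holding the key can decrypt for authorization.
print(ciphertext[:24])
```

The point is not the library; it is that encrypting one step earlier in the data flow was a known, available control, and the project team accepted the risk of skipping it.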

Read more

The Shuttle Challenger Anniversary Still Offers Risk Management Lessons, If We Are Willing to Learn Them

January 28th was the anniversary of the Space Shuttle Challenger disaster. The Rogers Commission detailed the official account of the disaster, laying bare all of the failures that led to the loss of a shuttle and its crew. Officially known as the Report of the Presidential Commission on the Space Shuttle Challenger Accident, the report on the loss of Mission 51-L is five volumes long and covers every possible angle, starting with how NASA chose its vendor and ending with the psychological traps that plagued the decision-making that led to that fateful morning. There are many lessons to be learned in those five volumes, and here I am going to share the ones that made the greatest impact on my approach to risk management. The first is the lesson of overconfidence.

In the late 1970s, NASA was assessing the likelihood and risk associated with the catastrophic loss of its new, reusable orbiter. NASA commissioned a study which concluded, based on NASA's prior launches, that a catastrophic failure could be expected approximately once every 24 launches. NASA, which was planning to fly several shuttles with paying payloads to help fund the program, decided that the number was too conservative. It then asked the United States Air Force (USAF) to re-perform the study. The USAF concluded that the likelihood was once every 52 launches.

In the end, NASA believed that because of the lessons learned since the moon missions and the advances in technology, the true likelihood of an event was 1 in 100,000 launches. Think about that: at roughly two launches a month, it would be over 4,100 years before there was a catastrophic event. In reality, Challenger flew 10 missions before its catastrophic event, and Columbia flew 28 missions before its catastrophic event, breaking up during reentry after damage to its heat shielding during takeoff. Over the life of a program that lasted 30 years, NASA lost two of five shuttles.
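As a back-of-the-envelope illustration (mine, not the Rogers Commission's), here is what each of those per-launch estimates implies over the 135 missions the shuttle program actually flew:

```python
# Probability of at least one catastrophic failure over the shuttle
# program's 135 flights, under each per-launch estimate cited above.
estimates = {
    "NASA-commissioned study (1 in 24)": 1 / 24,
    "USAF re-study (1 in 52)": 1 / 52,
    "NASA management (1 in 100,000)": 1 / 100_000,
}

FLIGHTS = 135  # total shuttle missions flown over the program's life

for label, p in estimates.items():
    # P(at least one failure in n independent launches) = 1 - (1 - p)^n
    p_any = 1 - (1 - p) ** FLIGHTS
    print(f"{label}: {p_any:.1%} chance of losing an orbiter")
```

Under the two commissioned studies, losing an orbiter over the life of the program was close to a certainty (about 99.7% and 92.7%, respectively); under management's 1-in-100,000 belief, it was about 0.1%. The actual record, two losses in 135 flights, sits between the two studies and nowhere near the number management chose to believe.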

Read more

Privacy Activists Are Cheering For The NSA Ruling, But It Won't Be A Lasting Victory

Privacy is on trial in the United States. Legal activist Larry Klayman asked US District Judge Richard J. Leon to require the NSA to stop collecting phone data and immediately delete the data it already has. He argued that the bulk collection violates US citizens' right to privacy under the Fourth Amendment of the Constitution, which protects citizens from unreasonable search and seizure. Monday's ruling that this practice is unconstitutional has privacy activists cheering in the streets, but it will not be a lasting victory.

In the United States, there is no single, comprehensive privacy law on the books. (You can argue that HIPAA is a privacy law, but nuances exist that lessen its impact.) What is protected has come from judgments based on the application of the Fourth Amendment regarding search and seizure. US citizens were given "privileges," thanks to Richard Nixon, which say we have an expectation of privacy when using a phone; in practice, that means the government has to get a warrant for a wiretap. (It's worth noting that in the UK, citizens don't get that privilege.)

Data is up for grabs. And everyone is grabbing.

Read more

Cold War Security: Four Phones, Two Doors, A Scrap Of Paper, And A Lighter

Outside of Tucson is a place called Sahuarita, Arizona. Sahuarita is the home of Air Force Silo #571-7, where a Titan missile with a nine-megaton warhead stood at the ready for 25 years as part of the US nuclear deterrent, should the United States need to retaliate against a Soviet nuclear attack. This missile could create a fireball two miles wide, contaminate everything within 900 square miles, and hit its target in 35 minutes; nothing in the current US nuclear arsenal comes close to its power. What kept it secure for 25 years? You guessed it: four phones, two doors, a scrap of paper, and a lighter.

Photo Credit: Renee Murphy

Technology has grown by leaps and bounds since the Cold War. When these silos went into service, they were manned by crews supplied by the Air Force, men and women responsible for ensuring the security and availability of the missile. Voice recognition, retinal scanning, biometric readers, and hard and soft tokens didn't exist yet; none of the technology we now think of as keeping our data and data centers secure had been developed. The controls in place were almost entirely physical. It is important to note that there was never a breach. Ever.

It might be an occupational hazard, but I can relate almost anything to security and risk management, and my visit to the Titan Missile Museum at AF Silo #571-7 was no exception. The lesson I took from my visit: there's room for manual controls in security and risk management. 

Read more

Scenario Modeling Is Anything But A Guess

Emergency management professionals say, “The plan is useless, but the planning is priceless.” There is a lesson in there for risk managers, and it's about the value of scenario modeling.

The Federal Emergency Management Agency (FEMA) conducted a study to determine the likelihood and impact of a hurricane hitting New Orleans. FEMA assembled paramedics, the fire department, emergency room doctors, parish officials, and other responders in a New Orleans hotel for an exercise called "Hurricane Pam". Their goal was to plan for the worst-case scenario. The group was given the following scenario:

  1. A slow-moving, category-3 hurricane would directly hit New Orleans.
  2. The storm surge would cause the levees to overtop, but not break.
  3. The National Weather Service showed how the storm would form, what track it would take, and which parishes would be affected.
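Hurricane Pam was a tabletop exercise, not a computer simulation, but the same idea translates directly to quantitative risk work. Here is a toy sketch, with entirely invented parameters, of how a planning scenario like the one above can be turned into a rough impact model that planners can argue about and refine:

```python
# Toy scenario-to-impact model. Every number here is invented for
# illustration; the real Hurricane Pam exercise was a tabletop drill.
from dataclasses import dataclass

@dataclass
class Scenario:
    name: str
    levees_breach: bool     # overtopped only, or actually breached
    evacuation_rate: float  # fraction of residents who get out in time

def stranded_population(s: Scenario, population: int = 480_000) -> int:
    """Rough count of residents needing rescue or shelter."""
    remaining = population * (1 - s.evacuation_rate)
    # Assumption: a breach floods far more of the city than overtopping.
    exposure = 0.8 if s.levees_breach else 0.3
    return int(remaining * exposure)

pam = Scenario("Hurricane Pam (levees hold)", levees_breach=False,
               evacuation_rate=0.6)
worse = Scenario("Levee-breach variant", levees_breach=True,
                 evacuation_rate=0.6)

for s in (pam, worse):
    print(f"{s.name}: ~{stranded_population(s):,} people to shelter or rescue")
```

The output numbers are meaningless; the questions the model forces you to ask (How many people will actually evacuate? What happens if the levees break instead of merely overtopping?) are exactly the priceless part of the planning.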
Read more

NASA Flunked Its Cloud Computing Audit: Are You Next?

OK, so NASA failed an audit. Don't we all? I think it is important to understand the government's cloud computing adoption timeline before passing judgment on NASA for failing to meet its cloud computing requirements. And, as someone who has read NASA's risk management program (and the 600 pages of supporting documentation), I can say that this wasn't a failure of risk management policy or procedure effectiveness. Clearly, this was a failure of third-party risk management: the monitoring and review of cloud services.

The Cloud Is Nebulous

Back in 2009, NASA pioneered cloud computing with a shipping-container-based public cloud project named Nebula, after the stellar cloud formation. (I love nerd humor, don't you?)

Photo Source: NASA

In 2009, NASA did a study to determine whether commercial cloud provider offerings had matured enough to support the Nebula environment. The study found that commercial cloud services had, in fact, become cheaper and more reliable than Nebula. As a result, NASA moved more than 140 applications to the public cloud.

In October of 2010, Congress held committee hearings on cybersecurity and the risks associated with cloud adoption. But remember, NASA had already moved its noncritical data (like www.nasa.gov or the daily video feeds from the International Space Station, which are edited together and packaged as content for the NASA website) to the public cloud in 2009, before anyone had ever considered the rules for adopting such services.

Audit Recommendations

Read more

Are You In A Decision Trap? You Decide.

Before joining Forrester, I ran my own consulting firm. No matter how ridiculous the problem or how complicated the solution, when a client asked if I could help, I would say yes. Some people might say I was helpful, but I was in an overconfidence trap. There was always a voice in the back of my mind saying, “How hard could it be?” Think of the havoc that kind of trap can wreak on a risk management program. If any part of the risk program is qualitative, and you are an overconfident person, your risk assessments will be skewed. If you are in an overconfidence trap, force yourself to estimate the extremes and imagine the scenarios where those extremes can happen. This will help you recognize when you are being overconfident and allow you to find the happy medium.

Have you ever padded the budget of a project “just to be safe”? I hate to tell you this, but you are in the prudence trap. By padding the project budget, you are anticipating an unknown, and many other managers in your company may be using the same “strategy.” The next time you do a project like this, you will pad the budget again, because the inherent uncertainty is still there. The easiest way to keep your risk management program out of the prudence trap is to never adjust your risk assessments to be “on the safe side.” There is nothing safe about using a psychological trap to predict risk.
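One discipline that gets you out of the prudence trap is to replace quiet padding with an explicit three-point estimate. The classic PERT formula is one common version; here is a minimal sketch (my illustration, with invented numbers) where the uncertainty goes on the record instead of hiding in the padding:

```python
# Three-point (PERT) estimation: state optimistic, most-likely, and
# pessimistic cases explicitly instead of silently padding one number.
def pert_estimate(optimistic: float, most_likely: float,
                  pessimistic: float) -> float:
    """Classic PERT weighted mean: (O + 4M + P) / 6."""
    return (optimistic + 4 * most_likely + pessimistic) / 6

def pert_stddev(optimistic: float, pessimistic: float) -> float:
    """Conventional PERT spread: (P - O) / 6."""
    return (pessimistic - optimistic) / 6

# A project you might be tempted to quietly pad from 90 to 120 days:
o, m, p = 75, 90, 150
print(f"estimate: {pert_estimate(o, m, p):.1f} days "
      f"(+/- {pert_stddev(o, p):.1f})")  # 97.5 days, +/- 12.5
```

The estimate still carries a buffer, but now everyone can see where it came from and challenge the extremes, which is exactly what a padded number prevents.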

Read more

Allow Me To Introduce Myself...

Allow me to introduce myself. I am Renee Murphy, and I am a new Senior Analyst here at Forrester Research. Prior to joining Forrester, I was both an internal and external auditor. My experience includes network and data center engineering and management, operations process development and implementation, and creating auditable technology environments across many different industries with diverse client needs.

I often say that trust is not a control, luck is not a strategy, and if you can't have fun in Albuquerque, you aren't a fun person. (That last one isn't really useful unless you are in Albuquerque and having a bad time.) I joined Forrester to use my audit powers for good and not evil, and I plan to assist you with your audit issues, control frameworks, regulatory requirements, risk management, and security, and to help you build stronger relationships with your auditors.

With my extensive regulatory knowledge and technical process expertise, my goal is to give you, Forrester's clients, a unique view of your regulatory and best-practice programs to ensure that you take advantage of the efficiencies that strong audit and control frameworks can provide. I will also help you navigate the security and risk ramifications of existing and upcoming regulatory requirements.

I am proud and very excited to be part of the Forrester family, and I look forward to working closely with our clients to help them achieve their GRC goals.