Cybersecurity requires a specialized skillset and a lot of manual work. We depend on the knowledge of our security analysts to recognize and stop threats. To do their work, they need information. Some of that information can be found internally in device logs, network metadata or scan results. Analysts may also look outside the organization at threat intelligence feeds, security blogs, social media sites, threat reports and other resources for information.
This takes a lot of time.
Security analysts are expensive resources. In many organizations, they are overwhelmed with work. Alerts are triaged, so that only the most serious get worked. Many alerts don’t get worked at all. That means that some security incidents are never investigated, leaving gaps in threat detection.
This is not new information for security pros. They get reminded of this every time they read an industry news article, attend a security conference or listen to a vendor presentation. We know there are not enough trained security professionals available to fill the open positions.
Since the start of the Industrial Revolution, we have strived to find technical answers to our labor problems. Much manual labor was replaced with machines, making production faster and more efficient.
Advances in artificial intelligence and robotics are now making it possible for humans and machines to work side-by-side. This is happening now on factory floors all over the world. Now, it’s coming to a new production facility, the security operations center (SOC).
Today, IBM announced a new initiative to apply its cognitive computing technology, Watson, to cybersecurity. Watson for Cyber Security promises to give security analysts a new resource for detecting, investigating and responding to security threats.
Defining your data via data discovery and classification is the foundation of a data security strategy. The idea that you must understand what data you have, where it is, and whether or not it is sensitive makes sense at a conceptual level. The challenge, as usual, is with execution. Too often, data classification is reduced to an academic exercise rather than a practical implementation. The basics aren’t necessarily simple, and the existing tools and capabilities for data classification continue to evolve. Still, there are several best practices that can help to put you on the road to success:
Keep labels simple. At a high level, stick to no more than three or four levels of classification. This reduces ambiguity about what each classification label means. Lots of classification labels increase confusion and the chance of opportunistic data classification (where users may default to classifying data at a lower level for ease of access and use).
Recognize that there are two types of data classification projects: new data and legacy data. This will help to focus the scope of your efforts. Commit to tackling new data first for maximum visibility and impact for your classification initiative.
Identify roles and responsibilities for data classification. Consider data creators, owners, users, auditors (such as privacy officers or risk and compliance managers), and champions (who’s leading the classification initiative?). Data is a living thing, and all employees have a role in classification. Classification levels may change over time as data progresses through its lifecycle or as regulatory requirements evolve.
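To make the "keep labels simple" advice concrete, here is a minimal sketch of a three-level scheme with rule-based tagging. The label names (Public/Internal/Confidential) and the detection patterns are illustrative assumptions, not a prescribed taxonomy; real rules would come from your discovery and classification tooling.

```python
import re
from enum import IntEnum

# Hypothetical three-level scheme, per the "keep labels simple" practice above.
class Classification(IntEnum):
    PUBLIC = 1
    INTERNAL = 2
    CONFIDENTIAL = 3

# Illustrative patterns only; a real deployment would source these
# from its data discovery and classification tools.
RULES = [
    (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), Classification.CONFIDENTIAL),  # SSN-like
    (re.compile(r"\b\d{13,16}\b"), Classification.CONFIDENTIAL),          # card-number-like
    (re.compile(r"(?i)\binternal use only\b"), Classification.INTERNAL),
]

def classify(text: str) -> Classification:
    """Return the highest classification level triggered by any rule; default PUBLIC."""
    level = Classification.PUBLIC
    for pattern, label in RULES:
        if pattern.search(text):
            level = max(level, label)
    return level
```

Note the design choice: when multiple rules match, the highest level wins, which counters the opportunistic under-classification problem described above.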
Forrester’s Security & Risk Analyst Spotlight - Chris Sherman
The title hasn’t yet been put to a client vote, but Chris Sherman may be the renaissance man of Forrester’s S&R team. As an analyst, Chris advises clients on data security across all endpoints, giving him a broad perspective on current security trends. His experience as a neuroscience researcher at Massachusetts General Hospital also gives him insight into the particular challenges that Forrester’s clients in the healthcare industry face. Lastly, when he isn’t writing about endpoint security strategy or studying neural synapse firings, Chris flies Cessna 172s around New England. Listen to this week’s podcast to learn about recent themes in Chris’s client inquiries as well as the troubles facing a particular endpoint security technology.
To help security pros plan their next decade of investments in data security, last year John Kindervag, Heidi Shey, and I researched and assessed 20 of the key technologies in this market using Forrester's TechRadar methodology. The resulting report, TechRadar™: Data Security, Q2 2014, became one of the team’s most-read pieces of research for the year. However, it’s been a year since we finalized and published our research, and it’s time for a fresh look.
One can argue that the entirety of the information security market - its solutions, services, and the profession itself - focuses on the security of data. While this is true, there are solutions that focus on securing the data itself, or access to it, regardless of where the data is stored or transmitted or which users want to use it. As S&R pros continue to shift from a perimeter- and device-specific security approach to a more data- and identity-centric one, it’s worthwhile to focus squarely on the technology solutions that allow you to do just that.
Last year, we included the following 20 technologies in our research:
Big data and Hadoop (Yellow Elephants) are so synonymous that you can easily overlook the vast landscape of architecture that goes into delivering on big data value. Data scientists (Pink Unicorns) are also raised to god status as the only role that can truly harness the power of big data -- making insights from big data seem as far away as a manned journey to Mars. However, this week, as I participated in the DGIQ conference in San Diego and colleagues and friends attended the Hadoop Summit in Belgium, it became apparent that organizations are waking up to the fact that there is more to big data than a "cool" playground for the privileged few.
The perspective that the insight supply chain is the driver and catalyst of actions from big data is starting to take hold. Capital One, for example, illustrated that if insights from analytics and data from Hadoop are going to influence operational decisions and actions, you need the same degree of governance that you established in traditional systems. In a conversation, Amit Satoor of SAP Global Marketing described a performance apparel company linking big data to operational and transactional systems at the edge of customer engagement, and noted that it had to be easy for application developers to implement.
Hadoop distribution, NoSQL, and analytic vendors need to step up the value proposition to be more than where the data sits and how sophisticated you can get with the analytics. In the end, if you can't govern quality, security, and privacy for the scale of edge end user and customer engagement scenarios, those efforts to migrate data to Hadoop and the investment in analytic tools cost more than dollars; they cost you your business.
The CES Tech West Expo has a number of specific areas of coverage including fitness and health, wearables, connected home, family safety, and some young innovative companies located in the startup area of the section. I spent a few hours interviewing and discussing the Internet of Things (IoT) with as many vendors as I could find. I had many good laughs and shed a few tears during the process. To describe the process, the general communication would go something like this:
Me: "Can you point me at the most technical person you have at your booth? I'd like to talk about how you secure your devices and the sensitive / personal data that it accesses and collects."
Smartest tech person at the booth: "Oh! We are secure; we [insert security-specific line here]."
Me: "Never mind . . ." (dejected look on my face).
CMOs historically focused narrowly on marketing and promotion. That’s not enough in the age of the customer. The CMO of 2015 must own the most important driver of business success -- the customer experience -- and represent the customer’s perspective in corporate strategy. Andy Childs at Paychex is a great example -- he owns not only traditional marketing but strategic planning and M&A.
We are in a golden age of data breaches - just this week, the United States Postal Service was the latest casualty - and consumer attitudes about data security and privacy are evolving accordingly. If your data security and privacy programs exist just to ensure you meet compliance, you’re going to be in trouble. Data (and the resulting insights) is power. Data can also be the downfall of an organization when improperly handled or lost.
In 2015, Forrester predicts that privacy will be a competitive differentiator. There is a maze of conflicting global privacy laws to address and business partner requirements to meet in today’s data economy. There’s also a fine line between cool and creepy, and often it’s blurred. Companies, such as Apple, are sensitive to this and adjusting their strategies and messaging accordingly. Meanwhile, customers — both consumers and businesses — vote with their wallets.
An IT mindset has dominated the way organizations view and manage their data. Even as issues of quality and consistency rear their ugly heads, the solution has often been to turn to the tool and approach data governance in a project-oriented manner. Sustainability has been a challenge, often relegated to IT managing and updating data management tools (MDM, data quality, metadata management, information lifecycle management, and security). Forrester research has shown that less than 15% of organizations have business-led data governance that is linked to business initiatives, objectives, and outcomes. But this is changing. More and more organizations are looking toward data governance as a strategic enterprise competence as they adopt a data-driven culture.
This shift from project to strategic program requires more than basic workflow, collaboration, and data profiling capabilities to institutionalize data governance policies and rules. The conversation can't start with the data management technology (MDM, data quality, information lifecycle management, security, and metadata management) that will apply the policies and rules. It has to begin with what the organization is trying to achieve with its data; this is a strategy discussion and process. The implication: governing data requires a rethink of your operating model. New roles, responsibilities, and processes emerge.
On May 5, 2014, Target announced the resignation of its CEO, Gregg Steinhafel, in large part because of the massive and embarrassing customer data breach that occurred just before the 2013 U.S. holiday season kicked into high gear. After a security breach or incident, the CISO (or whoever is in charge of security) or the CIO, or both, are usually axed. Someone’s head has to roll. But the resignation of the CEO is unusual, and I believe this marks an important turning point in the visibility, prioritization, importance, and funding of information security. It’s an indication of just how much:
Security directly affects the top and bottom line. Early estimates of the cost of Target's 2013 holiday security breach indicate a potential customer churn of 1% to 5%, representing anywhere from $30 million to $150 million in lost net income. Target's stock fell 11% after it disclosed the breach in mid-December, but investors pushed shares up nearly 7% on the news of recovering sales. In February 2014, the company reported a 46% decline in profits due to the security breach.
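The churn estimates above imply a roughly linear relationship: each point of customer churn maps to about $30 million in lost net income. A quick back-of-the-envelope sketch, using only the figures cited above:

```python
# Implied by the cited estimates: churn of 1% to 5% ≈ $30M to $150M
# in lost net income, i.e. roughly $30M per point of churn.
NET_INCOME_PER_CHURN_POINT = 30_000_000

def lost_net_income(churn_pct: float) -> float:
    """Estimated lost net income (USD) for a given churn percentage."""
    return churn_pct * NET_INCOME_PER_CHURN_POINT

# The article's low and high estimates:
# lost_net_income(1) -> 30,000,000
# lost_net_income(5) -> 150,000,000
```

This is only the churn component; it excludes remediation, legal, and technology costs such as the $100 million chip-and-PIN investment mentioned below.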
Poor security will tank your reputation. The last thing Target needed was to be a permanent fixture of the 24-hour news cycle during the holiday season. Sure, like other breached companies, Target’s reputation will likely bounce back but it will take a lot of communication, investment, and other efforts to regain customer trust. The company announced last week that it will spend $100 million to adopt chip-and-PIN technology.