Big data is one of the “big” industry trends challenging enterprises these days, especially from a data security perspective. Thanks to the explosion of big data, the Internet of Things (IoT), and global mobilization, the way companies use, collect, store, and process data has changed forever. If we look back to 2011, IT analyst firm IDC published the “Extracting Value from Chaos” report, in which it announced, “While 75% of the information in the digital universe is generated by individuals, enterprises have some liability for 80% of information in the digital universe at some point in its digital life.”
Fast forward to 2013, and Science Daily stated, “90% of the world’s data was generated over the last two years.” And that was three years ago. Each year we now generate more data than has ever been generated before.
Not surprisingly, KPMG has identified security as a key concern within the IoT ecosystem, stating that “Those tech companies and IoT solutions developers that take a disciplined approach, investing the appropriate time and resources to integrate security, privacy and trust concepts into their IoT solutions will – ultimately – win out over those that eschew discipline in order to be first to market.”
This clearly highlights the case for addressing security concerns at an early stage of an IoT project, when this type of project is generating vast amounts of data for downstream analysis and consumption. These concerns will only become more pressing as the amount of big data increases and enterprises look to leverage it to achieve business objectives: accurate predictions, new revenue streams and, where possible, revenue from selling the data itself.
So what lies ahead for those responsible for securing big data? For CISOs and security teams that need to make sure data is protected, and that it doesn’t just sit on the proverbial shelf, where it is of no use or value? Can sensitive data be protected and securely shared across an entire collaborative chain?
I believe it can.
To start with, it’s important to distinguish between data that should be protected and data that is not sensitive. For example, in a use case at a global insurance customer of ours, the insurer opened up virtually all non-sensitive data to employees and focused on protecting business critical assets: a far better and more cost effective way of using a security budget.
Once you know which data should be protected you can then begin identifying the best way to protect it – or rather the best way to share it securely. And if you’re going to do that, there is one collaborative model that makes sense. A model that only allows authorized users access to sensitive data under the right conditions.
It sounds obvious. It sounds like we should all do it. But in the past it has proven to be difficult. That’s because when it comes to authorization, application developers have traditionally embedded the security access rules directly in the application – which meant they had to do all the heavy lifting. Further, much focus has been put on “who” is accessing data, rather than “why”, “when” or “where”. Who is important. It’s a key attribute, but so too is the relationship between the user and the data, their location, the device they are using, and the time of day. True security needs to consider all these variables. Knowing and enforcing them could be the difference between big data being shared securely, not shared at all, or even worse, being shared inappropriately.
The collaborative model I’m referring to is known as Attribute Based Access Control (ABAC), and it’s unique in the sense that it only grants access to data if every attribute is aligned with a corporate policy – no matter how general or granular the policy is. For instance, a car manufacturer could have a business policy governing third-party access to data that states “During rush hour, analysts at California-based insurance companies can view reports on nationwide GPS activity of all vehicles owned by 20- to 30-year-olds, but they cannot access drivers’ PII data, such as names, ages and license plates, unless it relates to a customer of that insurance company.” Analysts could then view general driver trends, and customer-specific data to set car insurance premiums. In New York, insurance analysts would get access to the same nationwide data, but different customer data, and they would get access to the data at slightly different times of the day due to the time zone difference.
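To make the idea concrete, here is a minimal sketch of how an ABAC engine might evaluate that car-manufacturer policy. The attribute names (`role`, `company_state`, `resource`, and so on) are hypothetical illustrations, not a real product’s schema; production deployments would use a dedicated policy engine rather than hand-written checks like these.

```python
from dataclasses import dataclass
from datetime import time

# Hypothetical rush-hour windows for the example policy.
RUSH_HOURS = [(time(7, 0), time(9, 0)), (time(16, 0), time(18, 0))]


@dataclass
class Request:
    """Attributes of an access request: who, where, when, and what."""
    role: str                       # e.g. "insurance_analyst"
    company_state: str              # state of the analyst's employer
    local_time: time                # time of day in the requester's market
    resource: str                   # "gps_report" or "driver_pii"
    is_customer_of_requester: bool  # does the record belong to their customer?


def in_rush_hour(t: time) -> bool:
    return any(start <= t <= end for start, end in RUSH_HOURS)


def permit(req: Request) -> bool:
    """Evaluate the example policy: California insurance analysts may view
    nationwide GPS reports during rush hour; PII is only released when the
    driver is a customer of the requesting company."""
    if req.role != "insurance_analyst" or req.company_state != "CA":
        return False
    if not in_rush_hour(req.local_time):
        return False
    if req.resource == "gps_report":
        return True
    if req.resource == "driver_pii":
        return req.is_customer_of_requester
    return False  # default-deny anything the policy doesn't mention
```

Note that every attribute – role, state, time of day, resource type, customer relationship – must line up before access is granted, which is exactly the “every attribute aligned with policy” property described above.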
If the manufacturer were to change the policy and only allow insurance analysts access to statewide GPS activity, this change would only need to be made once, at a centrally managed ABAC policy server, rather than in every application that is impacted by the change. If you consider how many access control policy changes occur in an organization due to regulatory or corporate changes, you can imagine how many coding resources such a centralized policy management system can save. Coding resources that could, for instance, be spent developing new applications.
Visibility is also a key feature of ABAC. There is no second-guessing who can access what data. If it is written in a policy, then it is enforced. And unlike with other authorization models, conflicts of interest cannot happen – one policy will always overrule another when a conflict is possible, according to the precedence setting in the ABAC system. Put into context from the previous example, an insurance analyst from New York wouldn’t be able to view California data just because he or she is on a business trip in Los Angeles. There is always a predetermined policy hierarchy, and in this case it would be determined by the location of the market served by the insurance company, as opposed to the location of the analyst.
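That precedence behavior can be sketched as an ordered list of rules, where an earlier (higher-precedence) rule’s decision overrules any later, conflicting one. The rule names and attribute keys below are hypothetical, and the “first-applicable” combining strategy is just one of several that real ABAC policy engines support.

```python
from typing import Callable, Optional

# Each rule inspects the request context and returns "permit", "deny",
# or None when it does not apply.
Rule = Callable[[dict], Optional[str]]


def market_location_rule(ctx: dict) -> Optional[str]:
    # Higher-precedence rule: access is scoped to the market the
    # analyst's company serves, not wherever the analyst happens to be.
    if ctx["data_market"] != ctx["company_market"]:
        return "deny"
    return None


def analyst_location_rule(ctx: dict) -> Optional[str]:
    # Lower-precedence rule: permit when the analyst is physically
    # located in the market the data covers.
    if ctx["analyst_location"] == ctx["data_market"]:
        return "permit"
    return None


def evaluate(rules: list[Rule], ctx: dict) -> str:
    # First-applicable combining: rules are ordered by precedence, so the
    # first rule that reaches a decision settles any potential conflict.
    for rule in rules:
        decision = rule(ctx)
        if decision is not None:
            return decision
    return "deny"  # default-deny when no rule applies


POLICY = [market_location_rule, analyst_location_rule]

# The New York analyst on a business trip in Los Angeles: the market rule
# fires first and denies access, so the location-based permit never runs.
visiting_analyst = {
    "data_market": "CA",
    "company_market": "NY",
    "analyst_location": "CA",
}
```

Because the market rule sits above the location rule in the hierarchy, the two can never produce a contradictory outcome for the same request – which is the conflict-free property described above.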
So what is the takeaway from this? Security is a key big data concern that should be addressed at an early stage of any project if you are going to realize the full potential of IoT and other big data scenarios. There is a logical model that enables you to securely share business intelligence and other valuable nuggets of data among collaborative teams. Using ABAC will secure your data, free up resources and give you full transparency of who can access what data and for what purpose.