Trends in Security: Mitigating the risks of the data explosion

January 2012

While many organizations are still coming to grips with the brute impact of the data explosion, others are already starting to experience some of its deeper consequences. Massive volumes of data and the increasing integration of data-related systems are producing new risks of human error that require a shift in how companies approach their data administration.

Every step leaves a data trail
The story of King Midas, who turned everything he touched into gold, is in some ways a good analogy for the phenomenon of “datafication.” Nearly everything we touch during our professional and personal lives now turns into some kind of data: when we do our banking, when we book a flight, when we text our friends, when we log onto a webinar, when we download files from the company server.

Businesses create trillions of bytes of data each day. People share more than 30 billion pieces of content a month on Facebook. Passive devices like sensors in cars, computers, smartphones and energy meters log trillions of bytes more. To get a handle on all the information flying around, companies are integrating and consolidating their systems. Integration typically involves having dozens of machines work together in sophisticated ways to serve a single application. Consolidation refers to the trend of multiple users housing their IT assets together for cost efficiency.

While integration and consolidation are essential, they also create an “all eggs in one basket” scenario that organizations need to recognize and address. Perhaps surprisingly, given that we are talking about data and automated systems, one of the key areas of risk to mitigate is human behavior.

The price of a small mistake
Mistakes are costly for any business. And with the advent of datafication, where single systems are carrying more and more responsibility for the business, they’re becoming even costlier. If you’re an online retailer, your entire business depends on your e-commerce platform. If it goes down or is compromised in any way, you face revenue losses, productivity losses, reputational damage and more. In the datafied world, ‘mission critical’ is more critical than ever—meaning even the smallest mistake by an internal or external database administration team can spell catastrophe.

Other fields have extensive experience dealing with this. Healthcare, civil aviation, nuclear safety: all have studied the risk associated with human error, and how to minimize it, extensively. The THERP (Technique for Human Error Rate Prediction) model used in those disciplines needs to be applied in the data sphere as well.

THERP looks at all the potential kinds of errors a person might commit on the job. For errors of omission and sequence, a good checklist and proper training are usually a solid defense. Errors of commission, such as performing work on the wrong server, forgetting an important WHERE clause in a DELETE statement, or shutting down a server outside the appropriate maintenance window because of a time-zone conversion mistake, can be more challenging to prevent.
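To make those errors of commission concrete, here is a minimal, purely illustrative sketch in Python of the kind of guard rail a team might put in front of ad hoc database work. The host name, the GuardError class and the check_statement helper are hypothetical, invented for this example rather than drawn from any real tool; the point is simply that a script can verify the target host and insist on a WHERE clause before a destructive statement runs.

```python
import re
import socket


class GuardError(Exception):
    """Raised when a statement fails a pre-execution safety check."""


def check_statement(sql, intended_host):
    """Apply basic commission-error checks before a statement is executed."""
    normalized = " ".join(sql.strip().lower().split())

    # Wrong-server check: compare the host this script is actually running on
    # with the host the operator says the change was planned for.
    actual_host = socket.gethostname()
    if actual_host != intended_host:
        raise GuardError(
            f"Refusing to run: connected to '{actual_host}', "
            f"but the change was planned for '{intended_host}'."
        )

    # Missing-WHERE check: a DELETE or UPDATE with no WHERE clause touches
    # every row in the table, which is almost never the intent.
    if re.match(r"^(delete|update)\b", normalized) and " where " not in normalized:
        raise GuardError("Refusing to run a DELETE/UPDATE without a WHERE clause.")


if __name__ == "__main__":
    try:
        # 'db-prod-01' is a hypothetical host name used only for illustration.
        check_statement("DELETE FROM orders", intended_host="db-prod-01")
    except GuardError as exc:
        print(f"Blocked: {exc}")
```

Checks like these do not replace training or checklists, but they turn a silent error of commission into a loud, recoverable one.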

Can human error be eliminated?
Automating processes, of course, can help prevent human error. Where processes can't be automated, making them as fine-tuned and streamlined as possible minimizes the opportunities for mistakes.
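In the same illustrative spirit, and assuming a made-up host name and maintenance window, the sketch below shows one way the time-zone example above could be automated away: the window is defined once, in UTC, so no operator ever converts local time by hand before shutting a server down.

```python
from datetime import datetime, timezone

# Hypothetical maintenance window, defined once and only in UTC (02:00-04:00)
# so that no operator ever converts time zones by hand before a shutdown.
MAINTENANCE_START_HOUR_UTC = 2
MAINTENANCE_END_HOUR_UTC = 4


def within_maintenance_window(now=None):
    """Return True only if the current UTC hour falls inside the window."""
    now = now or datetime.now(timezone.utc)
    return MAINTENANCE_START_HOUR_UTC <= now.hour < MAINTENANCE_END_HOUR_UTC


def shut_down_server(host):
    """Placeholder for the real shutdown step; it only prints here."""
    print(f"Shutting down {host} for maintenance...")


if __name__ == "__main__":
    if within_maintenance_window():
        shut_down_server("db-prod-01")  # hypothetical host name
    else:
        print("Outside the maintenance window; refusing to shut anything down.")
```

Keeping the window definition in code, rather than in each operator's head, is exactly the kind of streamlining described above.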

In working on complex data installations for organizations such as Western Union, Toyota and LinkShare, Pythian has gained valuable experience in the area of human reliability. Pythian studies every incident in much the same way the Transportation Safety Board investigates a plane crash, getting down to the root cause and implementing processes to prevent the same error from being repeated. Application of the THERP approach has enabled Pythian to reduce its human error rate to one error every 5.5 person-years.

While human error can never be eliminated, it is highly reducible. As organizations seek to manage the sheer volume of data being generated today, in 2013 they should look beyond the purely technical side of things to ensure they are protecting themselves from the risks that data dependency and systems integration can bring.
