“Big Data” Challenges Our Perspective of Technology

It’s easy to hold onto the idea that IT is all about systems, networks, and software.  That has been the accepted wisdom for the past 50 years.  It’s a comfortable notion, but one that is increasingly inaccurate, and downright dangerous, as we move into an era of “big data”!  In today’s world, it’s not about systems, networks, applications, or the datacenter; it’s all about the data!

For decades, accumulated data was treated as simply a by-product of information-processing activities.  However, there is growing awareness that stored information is not just digital “raw material”, but a corporate asset containing vast innate value.  Like any other high-value asset, it can be bought, sold, traded, stolen, enhanced, or destroyed.

A good analogy for today’s large-scale storage array is a gold mine.  Data is the nuggets of gold embedded in the rock.  The storage arrays containing the data are the “mine” that houses and protects it.  Sophisticated hardware, software, and skill sets are simply the tools used to locate, manipulate, and extract the “gold” (data assets) from its surrounding environment.  The presence of high-value “nuggets” is the sole reason the mining operation exists.  If there were no “gold”, the equipment used to extract and manipulate it would be of little value.

This presents a new paradigm.  For years, storage was a secondary peripheral, considered only when new systems or applications were being deployed.  Today, storage has an identity of its own, independent of the other systems and software in the environment.

Data is no longer just a commodity, or some type of operational residue left over from the computing process.  “Big Data” forces a shift in focus from the deployment and administration of IT assets to the management of high-value data assets.  It places data assets at the center of concentric rings, ensuring that security, recoverability, accessibility, performance, data manipulation, and other aspects of data retention are each addressed as distinct requirements in their own right.  Information must now be captured, identified, valued, classified, assigned to resources, protected, managed according to policy, and ultimately purged from the system after its value to the organization has been expended.
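
To make that lifecycle concrete, the sketch below models it as an ordered set of stages.  This is purely illustrative Python; the stage names simply mirror the steps listed above, and no particular product or framework is implied.

    from enum import Enum

    # A sketch of the data lifecycle described above.  The stage names are
    # illustrative (taken from the paragraph), not from any standard or product.
    class Stage(Enum):
        CAPTURED = 1
        IDENTIFIED = 2
        VALUED = 3
        CLASSIFIED = 4
        ASSIGNED = 5
        PROTECTED = 6
        MANAGED = 7
        PURGED = 8

    def advance(stage: Stage) -> Stage:
        """Move an asset to the next lifecycle stage (PURGED is terminal)."""
        return Stage(min(stage.value + 1, Stage.PURGED.value))

    # Example: a newly captured asset moves one step along the lifecycle.
    print(advance(Stage.CAPTURED))   # Stage.IDENTIFIED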

This requires a fundamental change in corporate culture.  As we move into an era of “big data”, the entire organization must be aware of information’s value as an asset, and of the shift away from technology-centric approaches to IT management.  Just like the gold in the analogy above, users must recognize that not all data is “created equal”; it delivers different levels of value to an organization for specific periods of time.  For example, financial records typically carry a high level of inherent value, and retain that value for some defined period of time.  (The Sarbanes-Oxley Act requires publicly traded companies to maintain related audit documents for no less than seven years after the completion of an audit.  Companies in violation can face fines of up to $10 million, and executives can face prison sentences of up to 20 years.)

However, differences in value must be recognized and managed accordingly.  Last week’s memo about the cafeteria’s luncheon specials should not be retained and managed in the same fashion as an employee’s personnel record.  When entered into the system, information should be classified according to a well-defined set of guidelines.  With that classification, it can be assigned to an appropriate storage tier, backed up on a regular schedule, kept available on active storage for as long as necessary, and later written to low-cost archival media to meet regulatory and litigation-compliance needs.  Once data no longer delivers value to the organization, it can be expired by policy, freeing up expensive resources for re-use.
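
As a rough illustration of how such policy-driven classification might work, here is a hypothetical Python sketch.  The classification labels, storage tiers, and retention periods are assumptions made for the example, not prescriptions; real values would come from an organization’s own guidelines and regulatory obligations.

    from dataclasses import dataclass
    from datetime import date, timedelta

    # Hypothetical policy table: each classification maps to a storage tier
    # and a retention period.  Real values come from corporate guidelines and
    # regulations (e.g., the seven-year SOX audit-record requirement above).
    POLICY = {
        "financial-audit": {"tier": "archive",  "retain_days": 7 * 365},
        "personnel":       {"tier": "active",   "retain_days": 5 * 365},
        "routine-memo":    {"tier": "low-cost", "retain_days": 30},
    }

    @dataclass
    class DataAsset:
        name: str
        classification: str
        created: date

    def storage_tier(asset: DataAsset) -> str:
        """Assign the asset to a storage tier based on its classification."""
        return POLICY[asset.classification]["tier"]

    def is_expired(asset: DataAsset, today: date) -> bool:
        """True once the policy-defined retention period has lapsed."""
        retain = timedelta(days=POLICY[asset.classification]["retain_days"])
        return today > asset.created + retain

    # The lunch memo ages out quickly; audit records are retained for years.
    memo  = DataAsset("cafeteria-specials.doc", "routine-memo",    date(2012, 5, 1))
    audit = DataAsset("fy2011-audit.pdf",       "financial-audit", date(2012, 3, 1))
    today = date(2012, 6, 24)
    for asset in (memo, audit):
        action = "purge" if is_expired(asset, today) else "keep on " + storage_tier(asset)
        print(asset.name + ": " + action)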

This approach moves the emphasis in IT away from building systems tactically, by simply adding more of the same, toward sophisticated management tools and utilities that automate the process.  Clearly articulated processes and procedures must replace “tribal lore” and anecdotal knowledge in managing the data repositories of tomorrow.

“Big Data” ushers in an entirely new way of thinking about information as a stored, high-value asset.  It forces IT departments to re-evaluate their approach to managing data resources on a massive scale.  At a data growth rate of 35% to 50% per year, business as usual is no longer an option.  As Bob Dylan aptly sang, “the times they are a-changin’”.  We must adapt accordingly, or suffer the consequences.

About Big Data Challenges

Mr. Randy Cochran is a Senior Storage Architect at Data Center Enhancements Inc.  He has over 42 years of experience as an IT professional, with specific expertise in large and complex SAN/NAS/DAS storage architectures.  He is recognized as a Subject Matter Expert in the enterprise storage field.  For the past five years his primary focus has been on addressing the operational requirements and challenges presented by petabyte-level storage.

Posted on June 24, 2012, in Storage.
