The amount of data generated, stored, and accessed every day is absolutely mind-blowing and nearly impossible to keep up with. According to IDC, the total amount of data in the world was 4.4 zettabytes in 2013 and will jump to 44 zettabytes by 2020. For those who still think a gigabyte is a lot of data, one zettabyte is equivalent to roughly 1.1 trillion gigabytes.
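A quick back-of-the-envelope check on that conversion, using binary prefixes (1 GiB = 2^30 bytes, 1 ZiB = 2^70 bytes):

    2^70 / 2^30 = 2^40 ≈ 1.1 × 10^12

That works out to roughly 1.1 trillion gigabytes; in round decimal units (10^21 bytes per zettabyte), it’s an even trillion.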

Looking at this data tsunami incrementally, a Reduxio live ticker estimated (at the time of this posting) that 31 billion terabytes of data are generated worldwide every second. This amount of data would require more than 400 million square feet of data center space to store. Such a data center would be more than double the size of LAX airport, simply to house the data that’s being generated every second.

Data centers, and thus data center storage capacity, are growing exponentially because of this data tsunami. Globally, the data stored in data centers will grow 5.3-fold between 2015 and 2020 to reach 915 exabytes, according to Cisco. Data center storage capacity will grow to 1.8 zettabytes by 2020, nearly five times its 2015 level.
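To translate that five-year multiple into annual terms, a 5.3-fold increase over five years implies a compound annual growth rate of

    5.3^(1/5) ≈ 1.40, or roughly 40% growth per year

sustained across the entire period.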

While more than 70% of data is created by individuals, it is business enterprises that bear responsibility for storing, protecting, and managing most of it. Not all digital data is sensitive, but at least 40% of it is, from corporate financial data and medical records to personally identifiable information. Globally, less than half of this sensitive information is properly protected.

If the importance of data security wasn’t already obvious and topical, the recent Equifax data breach that exposed the sensitive personal information of 143 million Americans is a fresh reminder. According to the Ponemon Institute, the average cost of a data breach in the United States is $7.35 million. The most expensive breach of all time cost more than 500 times that: an estimated $4 billion for Epsilon. It’s far too early to tell, but this Seeking Alpha article estimates Equifax’s expenses related to this breach will be at least $1 billion.
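A back-of-the-envelope comparison puts the Epsilon figure in perspective against the Ponemon average:

    $4,000,000,000 / $7,350,000 ≈ 544

In other words, one catastrophic breach can cost what roughly 540 average breaches would.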

Granted, the Equifax data breach was the work of a hacker. However, as I argue in this blog, companies are spending tens of millions of dollars a year on cybersecurity, yet when it comes to protecting that very same data as it comes off the network, they willingly accept data security methods that merely mitigate data breach risk. IT enterprises are often so consumed with locking down the front door to their data that they leave the back door open.

Data Security in Data Center ITAD

It’s easy for businesses to get panicked into spending more money and resources on data and asset security after a breach has occurred, even if that spending isn’t very cost effective or efficient. Conversely, companies can be forced into poor data and asset security practices by cost or productivity constraints, especially if they let their guard down or don’t perceive the data security risk to be sufficiently high.

Complicating the data security decision-making process for data center operations is the fact that ITAD service providers have traditionally failed to make erasing data onsite a viable option in the data center. As a result, many companies have been forced to adopt less effective risk-mitigation practices that drive up ITAD costs, diminish value recovery from the equipment, and increase the data center’s carbon footprint. These options include shredding or otherwise physically destroying hard drives; relying on secure transport or data encryption; or simply storing data-bearing IT equipment until it can be dealt with later. However, in our experience, very few companies have accurate asset management records when data center IT hardware is decommissioned. Placing this equipment into storage, with data intact and inventory inaccuracies, is a data security disaster waiting to happen.

Some businesses have attempted to erase data onsite in the data center, and others currently do, but with one major hitch: even the so-called industry-leading data sanitization tools can take days, and sometimes weeks, to erase the server rack configurations in today’s data centers. Worse yet, these data erasure tools may not even support the high-capacity hard drives or solid-state drives (SSDs) commonly configured in data center IT hardware. Regardless, from our experience working with major cloud companies, we’ve found that conventional data erasure tools fail 40-50% of the time in the data center, for a variety of reasons. Erasure failure then forces those drives to be destroyed (along with their residual value) and increases exposure to data breach.

Now imagine a scenario where you have tens of thousands, hundreds of thousands, or even millions of active hard drives, as hyperscale data center enterprises do. Then consider that 10-15% of those drives will fail or be flagged for predictive failure and require immediate replacement. This is life in the big data center. How you handle these IT refresh and break-fix management situations can and will have a dramatic impact on your data security as well as the Total Return from your disposition project.
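To make that concrete with round numbers, consider a hypothetical fleet of one million active drives:

    1,000,000 × 10% = 100,000 drives
    1,000,000 × 15% = 150,000 drives

That’s 100,000-150,000 data-bearing devices that must be securely sanitized, tracked, and dispositioned.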

Balancing Data Security and Cost in Data Center ITAD

Total Return from an IT asset and data security perspective balances the level of security you receive with the total investment you make to achieve it. There are several critical factors to take into consideration:

  • Erasure tool reliability. The tool you use to sanitize hard drive data must be able to operate in the modern data center. These environments are commonly populated with high-capacity, SSD, and helium-filled hard drives, among other advanced storage technologies. The data erasure solution must run effectively on any brand, size, or type of hard drive, independent of the operating system, including failed hard drives that have been pulled from production. As I explain in Data Erasure Certification: Don’t Just Take My Word for It, credible certification of the data erasure tool is critical to reliability and compatibility. It’s why ITRenew went to great effort to achieve Teraware ADISA certification on the broadest array of storage technology deployed in the data center. With 17 ADISA certificates, Teraware holds more certifications than all other data erasure products combined.
  • Data erasure processing speed and scalability. The data erasure tool must run fast and process any number of hard drives in parallel rather than sequentially to maximize efficiency, without having to remove hard drives from the server rack cabinet (see the sketch following this list). Many data erasure tools are designed to process hard drives individually or up to 60 or so at a time, which just doesn’t cut it in the data center environment, especially not for hyperscale data centers. As this case study shows, our Teraware data erasure platform took a cluster wipe from 10 days down to two, reducing internal IT labor by 80%. This software also dramatically sped up the IT refresh process, which has a trickle-down effect on maintaining data center uptime and generating revenue from production.
  • Data erasure yield. This important ITAD consideration is rarely discussed, and almost never in the context of data security, but it should be. When data erasure fails, hard drives are typically destroyed. Destroying hard drives increases data security risk compared with wiping them at the sector level and issuing a Certificate of Sanitization as proof. Nor can you reuse or remarket destroyed hard drives for value recovery. If you’re only getting 50-60% data erasure yield and have millions of hard drives across your data centers, you can do the math. We’ll discuss data erasure and reusable IT hardware yield in greater depth in the next blog.
  • Data sanitization and IT asset security processes. Many process-related factors must be taken into consideration as well, such as:
    • Are you sanitizing IT assets at the time of hardware decommissioning, or is your retired data center IT equipment sitting idle for long stretches of time with sensitive data left vulnerable?
    • Are you erasing data in server racks or removing the hard drives for loose processing?
    • If your data erasure solution is too slow or incompatible, are you forced into secure transport, shredding/degaussing, or storage?
    • How much of your data sanitization and IT asset reconciliation process is automated, or otherwise at risk to undetected failure or human error?
    • Are you reconciling every serialized hard drive in every server, both before data is erased and especially before equipment leaves your data center? (The sketch following this list also shows a simple serial-number reconciliation.)
    • Does your process include both systematic and manual verification that no hard drive or other storage component in your data center IT equipment has gone undetected?

For more depth on this topic, read Challenges in Data Center ITAD: IT Asset Security.
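To make the parallel-erasure and reconciliation points above concrete, here is a minimal, purely illustrative Python sketch. It is not Teraware and not production erasure code: Drive, wipe_drive(), sanitize_rack(), and the manifest are hypothetical stand-ins (wipe_drive() for a real sector-level sanitization call, the manifest for your asset management records). The point is the shape of the process: wipe in parallel, record a per-drive result, and reconcile serial numbers before anything leaves the data center.

    # Illustrative sketch only: not Teraware, and not production erasure code.
    # Drive, wipe_drive(), and the manifest are hypothetical stand-ins.
    from concurrent.futures import ThreadPoolExecutor, as_completed
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Drive:
        serial: str
        capacity_tb: int

    def wipe_drive(drive: Drive) -> bool:
        """Stand-in for a sector-level erase-and-verify of a single drive."""
        # A real tool would overwrite or TRIM every sector, then verify it.
        return True

    def sanitize_rack(drives, manifest):
        """Wipe all discovered drives in parallel, then reconcile serial
        numbers against the asset manifest before the rack leaves the floor."""
        results = {}
        with ThreadPoolExecutor(max_workers=max(1, len(drives))) as pool:
            futures = {pool.submit(wipe_drive, d): d for d in drives}
            for future in as_completed(futures):
                drive = futures[future]
                results[drive.serial] = future.result()  # per-drive pass/fail
        discovered = {d.serial for d in drives}
        return {
            "wiped": results,
            "missing": manifest - discovered,     # on the books but not found
            "unexpected": discovered - manifest,  # found but not on the books
        }

    if __name__ == "__main__":
        manifest = {"SN-001", "SN-002", "SN-003"}
        rack = [Drive("SN-001", 16), Drive("SN-002", 16)]
        print(sanitize_rack(rack, manifest))  # SN-003 is reported missing

Wiping every drive concurrently, rather than one at a time, is what keeps rack-scale jobs to hours instead of days, and the reconciliation step is what catches the inventory inaccuracies described earlier, before equipment ships.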

Our proprietary data erasure and IT asset management platform, Teraware, drives Total Return by allowing you to sanitize every type of storage device found in the data center, with an ITAD industry-best 95+% erasure yield. The software can be integrated with your asset management infrastructure and run over a private network, providing unlimited scalability. It can also be run from a Terabot machine custom-configured for your specific data center environment, or from a USB drive. Teraware is also the only software ADISA certified to erase both SSDs and HDDs beyond forensic recovery. All told, that means an efficient, scalable, and highly secure solution for achieving consistently higher yields and broader erasure compatibility.

With the proper data sanitization tools and internal ITAD processes, your company’s financial interests no longer have to be at war with its data and IT asset security efforts, or with initiatives to reduce the carbon footprint of your data centers. For the first time, you can have your cake and eat it too!