I ran across an interesting post by a business continuity consulting firm that outlines four types of risk mitigation strategies: acceptance, avoidance, limitation, and transference. According to the article, organizations accept risk when the cost of other risk management options outweighs the risk itself. Avoiding risk is the opposite of accepting it and is typically the most expensive undertaking. Limiting risk is the most common business strategy, splitting the difference between accepting it and avoiding it. Transferring risk passes the buck to a willing third party, at a price, of course.

ITRenew’s business is helping major cloud companies and Fortune 500 businesses safely retire IT equipment from their data centers and other mass-scale IT environments. All these risk mitigation strategies are leveraged at different points in the IT asset recovery and disposition process—for varying reasons. But when it comes to protecting proprietary company information and sensitive consumer data that is stored on this equipment, is any risk management strategy short of elimination really acceptable?

A Forced Data Security Strategy

I believe most businesses would opt to have all data erased from hard drives before IT equipment ever left their data centers, versus alternative methods that accept, limit, or transfer data security risk. However, ITAD service providers that are paid to protect their clients’ IT assets and data during data center decommissioning have struggled to make erasing data onsite a viable option.

As a result, many companies have adopted policies that mandate their hard drives be removed and destroyed onsite or equipment be transported via high-security means to an ITAD vendor for offsite data erasure. These alternative methods, among others, merely serve to mitigate data security risks—not eliminate them. They also increase ITAD costs for businesses, decrease the residual value of the data center equipment at disposition, and diminish overall IT sustainability.

ITAD Data Security: A High Stakes Game of Chance

According to a study by the Ponemon Institute, the average cost of a data breach in the United States in 2016 was $7.35 million. But that's chicken feed compared to the most expensive data breach in corporate history, Epsilon's in 2011, which at $4 billion was more than 500 times that average.

Data breaches have a variety of causes, from malicious or criminal attacks to system glitches and human error, but the simple fact is this: a breach is a breach is a breach. It doesn't matter what kind it was or whose fault it was; when a data breach occurs, the damage is done. Not just to your bottom line, but to your brand and to customer confidence.

As someone who works on data security best practices for data center IT decommissioning, here's what truly amazes me: according to Forbes, businesses worldwide spent $74 billion on cybersecurity in 2016 and are expected to spend more than $101 billion by 2020, an increase of roughly 38%. So typical Fortune 500 companies will spend tens of millions of dollars a year locking down their networks against hackers, yet when it comes to protecting that very same data as it comes off the network, they are willing to accept, limit, or, presumably, transfer data security liability to a third party. All the while, a viable alternative (onsite data erasure) exists that completely eliminates the risk and provides 100% auditable certainty of the outcome.

Maybe I could understand it if there were financial advantages to be gained, but there aren't. In fact, the other methods are costlier, especially considering that they increase your exposure to a multimillion-dollar data breach.

If there were a sound sustainability or social responsibility rationale, it might make sense, but the other methods are also a detriment to the environment: they create more e-waste and allow less reuse of equipment. They are worse for society, too, as they put personal identities at risk and deprive secondary users of low-cost used technology alternatives to buying new equipment.

Furthermore, as explained in this ITRenew blog, alternative methods create a ripple effect on overall productivity once the opportunity costs are considered.

So, given everything that’s at stake, shouldn’t companies approach data center ITAD data security with the same mindset as they do cybersecurity? Do you think the companies spending tens of millions a year on cybersecurity are content to just make it a little harder for hackers to penetrate their networks? No, they want to lock them out completely.

Best Practice Data Erasure: Eliminating Security Risk at the Source

Erasing data onsite in the data center according to best practices is the only way to eliminate data security risk and ensure 100% auditability at decommission. These best practices include:

  • Erasing data center equipment in-cabinet and over a network for scalability and efficiency
  • Reconciling every serialized hard drive and data-bearing IT device onsite—before it leaves the data center (see the reconciliation sketch after this list)
  • Documenting the host-to-disk relationship to verify equipment was not tampered with after data erasure and during transit
  • Digitally verifying data erasure down to the sector level
  • Re-verifying successful erasure and serialized chain-of-custody upon receipt at the ITAD service provider facility
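
To make the reconciliation and host-to-disk documentation steps above concrete, here is a minimal sketch of what an automated chain-of-custody check might look like. The file names, column names, and status values are illustrative assumptions, not Teraware's actual data model; a production platform captures and verifies this information automatically.

    import csv

    def load_manifest(path):
        # Each row is assumed to carry: serial, host_id, erasure_status
        with open(path, newline="") as f:
            return {row["serial"]: row for row in csv.DictReader(f)}

    def reconcile(onsite_path, receiving_path):
        onsite = load_manifest(onsite_path)        # scanned onsite, before drives leave the data center
        received = load_manifest(receiving_path)   # scanned on receipt at the ITAD facility

        missing = sorted(set(onsite) - set(received))      # left the site but never arrived
        unexpected = sorted(set(received) - set(onsite))   # arrived but was never logged onsite
        mismatched = sorted(s for s in onsite.keys() & received.keys()
                            if onsite[s]["host_id"] != received[s]["host_id"])   # host-to-disk changed in transit
        unverified = sorted(s for s, row in onsite.items()
                            if row["erasure_status"].lower() != "verified")      # erasure not confirmed onsite
        return missing, unexpected, mismatched, unverified

    if __name__ == "__main__":
        issues = reconcile("onsite_manifest.csv", "receiving_manifest.csv")
        for label, serials in zip(("missing", "unexpected", "host-to-disk mismatch", "erasure unverified"), issues):
            print(f"{label}: {len(serials)}", serials if serials else "")

If all four lists come back empty, every serialized device that left the data center has been accounted for; anything else is a flag to investigate before the equipment moves further along the disposition chain.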

Following best practice data erasure allows you to prove that every fragment of data and every serialized storage device has been accounted for when your IT equipment is retired. It also helps protect you from the inevitable: human error and the loss or theft of IT assets during the recovery and disposition process. Best practice data erasure leverages automation to eliminate risk when one of these events occurs. And with proper documentation, you can avoid breach remediation because you will know with certainty that the data was already erased and can prove it in the event of a security audit.
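
As a simplified illustration of the "verifying data erasure down to the sector level" step, the sketch below reads an entire block device back after an assumed single-pass zero-fill overwrite and reports any sector that is not all zeros. The device path, sector size, and zero-fill pattern are assumptions for the example; production tools work at the firmware level and handle HDD and SSD specifics that a simple read-back cannot cover.

    import sys

    SECTOR = 512              # assumed logical sector size
    CHUNK = 4 * 1024 * 1024   # read the device 4 MiB at a time

    def verify_zero_fill(device_path):
        """Return the sector numbers that read back non-zero after a zero-fill overwrite."""
        bad_sectors = []
        offset = 0
        with open(device_path, "rb") as dev:
            while True:
                buf = dev.read(CHUNK)
                if not buf:
                    break
                if any(buf):  # at least one non-zero byte somewhere in this chunk
                    for i in range(0, len(buf), SECTOR):
                        if any(buf[i:i + SECTOR]):
                            bad_sectors.append((offset + i) // SECTOR)
                offset += len(buf)
        return bad_sectors

    if __name__ == "__main__":
        bad = verify_zero_fill(sys.argv[1])   # e.g. a raw device path; needs elevated privileges
        print("PASS: every sector verified" if not bad else f"FAIL: {len(bad)} non-zero sectors")

Evidence like this, recorded per serialized drive, is what lets you prove erasure in a security audit rather than merely assert it.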

Even though erasing data onsite in the data center according to best practices is a bulletproof way to eliminate the risk of a data breach from IT asset disposition, surprisingly few companies do it. The simple truth is that it can be very challenging and risky without the right data erasure tools, processes, and people managing the jobs.

As part of ITRenew’s blog series on data center decommissioning, next we will detail the unique security challenges that data centers present and what you need to overcome in order to implement best practice data erasure and asset accountability in your data center decommissioning program. Stay tuned.

About the Author

As the Director of Product Development, Matt Mickelson is responsible for development of ITRenew’s data sanitization product line. This includes Teraware, an enterprise-grade data sanitization and asset management software platform, and Terabot, a line of do-it-yourself data erasing machines that are powered by Teraware software. Matt also directs all aspects of the ITRenew Innovation Center, an R&D facility and data security think tank. The center’s core charter is to stay ahead of the various technologies that storage manufacturers develop and customers deploy to ensure compatibility with Teraware during data center decommissioning.