In an age where we are all becoming increasingly obsessed with data, its protection has become a necessary burden for businesses. Investment in the applications, mass storage, backup processes and training required to keep this data safe is taking its toll on IT teams, which have little time to sift through data and assess what really needs to be backed up and how. For many companies, the default approach is therefore simply to back up everything.
However, as data sources continue to grow, many are asking how to turn data protection from a run-the-business process into a change-the-business one, and whether this essential operation can be made more efficient and cost-effective. Information is a company's most important asset, but protecting this data is an expensive and complex process. We have found that for every $1 a company spends to keep customer information, it spends at least $4 to $6 to protect it. For many of today's organisations, data protection is a form of costly but essential insurance, guarding against data losses that could damage brand reputation.
Advances in technology over the last five years have produced a number of 'must have' enterprise solutions: server virtualisation, the Internet of Things, data analytics and software-defined data centres. As a result, most companies have at least three backup solutions in place to manage their data protection needs. With IT budgets remaining flat while data handling and analysis continue to skyrocket, it is unsurprising that many companies are looking to reduce the cost and optimise the processes associated with data protection. But where should they start? Here are a few things to consider:
Protection should add value: The cost of protection should be commensurate with the business value of the information. By clearly identifying what is critical and what is important, businesses can start to align processes with requirements.
Why do you protect data in the first place?
For operational recovery: limited disasters such as data corruption
For disaster recovery: large-scale disasters such as flooding or fire
For long-term recovery: to meet compliance requirements
This begins by understanding the different types of recovery and levels of data importance. For example, core banking systems which are critical to a bank’s operation should employ hardware-based snapshots for operational recovery. While this approach costs more, it is essential to allow recovery within minutes to keep the business going. This would also minimise data loss and impact to users.
At the other end of the spectrum, data such as healthcare patient records, which must be kept for the length of a patient's lifetime or longer but are rarely accessed, can be moved to lower-cost storage, in either a private or public cloud as regulations dictate, to reduce costs. By taking the time to assess where their data sits, businesses can begin to form a structured and cost-effective approach to data protection.
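The tiering exercise described above can be sketched as a simple policy table. The data classes, protection methods and recovery targets below are illustrative assumptions for demonstration, not recommendations tied to any specific product:

```python
# Illustrative sketch of a tiered data-protection policy.
# Class names, methods and recovery targets are assumptions,
# not any vendor's actual configuration.

RECOVERY_TIERS = {
    # data class        : (protection method,              target recovery time)
    "core-banking"      : ("hardware snapshots",            "minutes"),
    "general-business"  : ("disk-based backup",             "hours"),
    "patient-archive"   : ("private/public cloud archive",  "days"),
}

def protection_plan(data_class: str) -> str:
    """Return a one-line protection plan for a given data class."""
    method, recovery_target = RECOVERY_TIERS[data_class]
    return f"{data_class}: protect with {method}, recover within {recovery_target}"
```

The point of such a table is simply to make the cost/recovery trade-off explicit: the expensive snapshot tier is reserved for data whose loss would halt the business, while rarely accessed archives drop to the cheapest tier regulations allow.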
Fully leverage the cloud: The business case for using the cloud to store backup data can be compelling, and there is a lot of pressure on IT departments to adopt it. However, while the cloud may be the fastest way to reduce cost, it can also be the fastest way to introduce risk, with security, flexibility and real costs all proving common concerns.
Security concerns can be allayed through encryption and by ensuring companies ultimately 'hold the key' to their data. Understanding how much the cloud will cost based on expected usage patterns requires planning, but it ensures businesses aren't surprised by additional charges or system limitations.
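As a rough illustration of that planning exercise, a back-of-the-envelope model of monthly backup cost might look like the sketch below. The per-GB rates are placeholder assumptions, not any provider's actual pricing:

```python
def monthly_cloud_backup_cost(stored_gb: float,
                              restore_gb: float,
                              storage_rate: float = 0.02,  # $/GB-month (assumed rate)
                              egress_rate: float = 0.09):  # $/GB restored (assumed rate)
    """Estimate monthly backup cost: storage plus data-retrieval (egress) charges.

    Rates are illustrative placeholders; real providers also charge for
    API requests, early-deletion penalties and tiered egress, which this
    deliberately simple model ignores.
    """
    return stored_gb * storage_rate + restore_gb * egress_rate

# Example: 10 TB stored, 500 GB restored during the month
cost = monthly_cloud_backup_cost(10_000, 500)  # 200 storage + 45 egress = 245
```

Even a crude model like this surfaces the surprise most teams hit in practice: restores (egress) can dominate the bill in a bad month, which is exactly the kind of usage pattern worth estimating before committing.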
Ensuring flexibility is a slightly harder task. Cloud providers come and go rapidly, and IT is under pressure to adopt new systems, but regularly migrating data can be risky. Companies need a 'concierge to the cloud': a solution that can move data between clouds at will, letting them take advantage of the latest cloud innovations without the headaches.
Process transformation and consolidation: In the past, customers were forced to take a best-of-breed approach in order to support new business requirements, but this resulted in a lot of complexity. As solutions catch up with these trends, it is becoming easier to take a single-solution approach that supports multiple applications in one place.
Do you build your own or buy purpose-built solutions? Businesses today are expected to be agile enough to respond to ever-changing customer demands. As a result, IT is no longer seen as just a provider but as a partner to the business, expected to consult on best practice and oversee transformation.
It may be tempting for some to build a custom data protection solution to maintain complete control, but this comes with a costly set of storage solutions and applications, as well as a team to run them. By opting instead for purpose-built, integrated solutions, IT can free up the time previously spent maintaining, upgrading and troubleshooting custom-built systems to focus on higher-value tasks, while ensuring predictability in both cost and maintenance.
The reality is that data will continue to take up a large part of IT processes for the foreseeable future, so now is the time to examine how costs can be reduced and processes made more efficient. Rather than approaching data protection with a single 'keep everything' backup mindset, businesses now have far more flexibility to make savings and ultimately get value from the data they are required to keep. A single solution that can pull all this together is invaluable in the fight to get value from data protection.
The writer is technical director, data resilience solution, Hitachi Data Systems