
Hitachi Unpacks Big Data

By ITWeb
22 Oct 2012


Big data and its implications were a focus at the recent Hitachi Information Forum, held in Rosebank.

During the morning session of the forum, speakers from Hitachi Data Systems attempted to separate fact from folly and to bring some answers to the big data question.

“What we are seeing is that companies are spending double what they used to spend on storage, but their businesses aren’t getting bigger,” said Bob Plumridge, chief technical officer for EMEA at Hitachi Data Systems. According to Plumridge, this is because businesses are experiencing exponential growth in the volume of data they have to handle.

“Big data is essentially a data warehouse with funky data,” said Harry Zimmer, senior director of global competitive and market intelligence at Hitachi Data Systems. During his presentation on industry trends, Zimmer described big data as a 90:10 situation. “About 90% of the hype surrounding big data is realistic, with only 10% being hot air,” he said.

Plumridge used an example from the UK to illustrate the volumes of data at stake in big data conversations. In the UK, medical records are stored electronically, and the government has implemented legislation requiring that these records be kept for as long as a person is alive and for a further five years after they die. For a population of over 60 million, that is a huge volume of data that needs to be stored, said Plumridge.

With this in mind, a concern for Plumridge is that much of the data from 20 years ago is inaccessible, and even if it were accessible, it would not be usable because the file storage technology is obsolete. “The problem with keeping data tied to a certain application is that you will only be able to access that data while that file storage application is in use,” Plumridge said.

He posits object format storage as a solution to this long-term storage conundrum, an idea echoed by Lynn Collier, senior director of cloud, file and content at Hitachi Data Systems EMEA.

“Object format storage allows us to make the data independent of the media it was stored on, with metadata available describing that information,” Collier said.
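The idea Collier describes can be sketched in a few lines: the payload is stored alongside self-describing metadata and addressed by content, not by the application or device that wrote it. This is a minimal illustrative sketch; the `ObjectStore` class, its method names, and the metadata fields are assumptions chosen for the example, not any actual Hitachi product API.

```python
import hashlib
import json

class ObjectStore:
    """Toy object store: data plus metadata, independent of the source media."""

    def __init__(self):
        self._objects = {}  # object ID -> (payload bytes, metadata dict)

    def put(self, data: bytes, metadata: dict) -> str:
        # Content-addressed ID: the object can be located later without
        # knowing which application or device originally produced it.
        obj_id = hashlib.sha256(data).hexdigest()
        # Self-describing metadata travels with the payload.
        stored_meta = {"size": len(data), **metadata}
        self._objects[obj_id] = (data, stored_meta)
        return obj_id

    def get(self, obj_id: str):
        return self._objects[obj_id]

store = ObjectStore()
record = json.dumps({"patient": "anon-001", "year": 1995}).encode()
oid = store.put(record, {"content_type": "application/json",
                         "source_system": "legacy-EHR"})
data, meta = store.get(oid)
```

Because the metadata (content type, size, source system) is stored with the object itself, a future reader does not need the original application to interpret what it has retrieved, which is the long-term accessibility point Plumridge raises.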

While the market for solutions that can handle big data has grown, Collier stressed that it is still very immature, making it important for each business to choose the methods that best make data work for the organisation. “Businesses need to figure out how to convert this information into business value, and once they know what they want to do, they can choose the best partner to help them achieve those goals,” she said.

Data protection
When discussing big data and data storage, data protection is a key concern, according to Ros Schulman, data protection product line manager at Hitachi Data Systems. “Data protection is a cost-versus-risk exercise. We are not just talking about business continuity, we are talking about protecting your data so that you no longer have to worry about recovering data after a disaster,” she said. According to Schulman, 76% of companies have a recovery time objective of four hours, making it important to have the best solutions in place to get things up and running in the shortest amount of time.

Schulman advised that businesses have various levels of data protection, with the most advanced protection for the most important data. She added that any inactive data should be archived so as to reduce the volume of data that needs protecting and to make backups and recovery easier.
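The tiering Schulman describes can be expressed as a simple policy: the most important data gets the most advanced protection, and inactive data drops out of the backup set into an archive. The tier names, the one-year inactivity threshold, and the `assign_tier` function below are illustrative assumptions for this sketch, not Hitachi's actual policy.

```python
from datetime import date, timedelta

def assign_tier(importance: str, last_access: date, today: date,
                inactive_after_days: int = 365) -> str:
    """Map a dataset to a protection tier (illustrative thresholds)."""
    # Inactive data is archived first, regardless of importance, so it
    # no longer inflates the volume that needs backing up.
    if (today - last_access) > timedelta(days=inactive_after_days):
        return "archive"
    if importance == "critical":
        return "replicated"      # e.g. continuous replication for fast recovery
    if importance == "important":
        return "daily-backup"
    return "weekly-backup"

today = date(2012, 10, 22)
tier = assign_tier("critical", last_access=date(2012, 10, 1), today=today)
```

Shrinking the active set this way is what makes an aggressive recovery time objective, such as the four hours Schulman cites, easier to meet: less data in the backup window means faster backups and faster restores.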

For Zimmer, big data is in an “age of exploration”. “There is no template or guidebook of how to handle the big data situation,” he said, advising businesses to do a great deal of research before committing to a data storage strategy. “I view 2012/13 as a time when you should just be doing your homework – don't build a darn thing yet,” he said.

Editorial contacts
Melinda de Gee
083 212 9840
Melinda.degee@shoden.co.za
