What Switzerland and vendor-agnostic storage have in common
If CA Technologies Cloud Storage for System z were a country, it would be Switzerland.
What does a cloud storage solution have to do with Swiss chocolate, the Swiss Alps and the alphorn? Nothing.
What it does have in common is that it’s “Switzerland” in mindset – not the country itself, but its famously neutral position on foreign affairs.
CA Cloud Storage for System z is vendor-agnostic. A recent addition to the CA Cloud Storage for System z partner ecosystem is Microsoft Azure. Microsoft’s sophisticated and secure storage capabilities, along with its enterprise focus, make it a great fit with CA’s cloud storage partners.
The solution provides customers with multiple options for short- and long-term data retention and recall as well as improved disaster recovery posture for mid-size and public organizations.
Best of all, it costs pennies per gigabyte (GB) of storage as opposed to hundreds of dollars – now that’s something your CIO can take to the bank.
And we know it works. CA has tested and successfully pushed and pulled mainframe data to and from Azure. But don’t just take my word for it – come see it yourself at CA World ’14 – there’s still time to register.
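The CA solution handles the mainframe side of that data movement itself, so the mechanics are internal to the product. But to get a feel for what a push and pull against Azure looks like at the storage layer, here is a minimal, hypothetical sketch using Microsoft’s Python SDK for Blob Storage. The connection string, container and blob names are placeholders, and this is not how CA Cloud Storage for System z is implemented – it’s just a generic illustration of writing to and reading from Azure.

```python
# Minimal, hypothetical example: push a file to Azure Blob Storage and pull it back.
# Requires the azure-storage-blob package and a real storage account connection string.
from azure.storage.blob import BlobServiceClient

CONNECTION_STRING = "<your-storage-account-connection-string>"  # placeholder

service = BlobServiceClient.from_connection_string(CONNECTION_STRING)
container = service.get_container_client("mainframe-archive")   # hypothetical container

# Push: upload a local extract of mainframe data as a blob.
blob = container.get_blob_client("payroll-2014-q3.bin")         # hypothetical blob name
with open("payroll-2014-q3.bin", "rb") as source:
    blob.upload_blob(source, overwrite=True)

# Pull: download the same blob back for recall or verification.
restored = blob.download_blob().readall()
print(f"Recalled {len(restored)} bytes from Azure")
```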
Why move to the cloud?
Most mainframe shops have traditionally depended on hardware-based storage for their data centers. The downside of this traditional approach to enterprise storage is keeping up with the astronomical and largely unpredictable growth of data in today’s big data environments.
CA recently surveyed a large number of our customers about their storage environments. One response rang loud and clear: “Capacity requirements are far outstripping storage availability.”
Keeping up with the growth of data
The data center continues to need more and more storage to keep up with demand. “Consumerization” of data-generating technologies in the application economy is driving much of the growth of data in the world – currently around 2 to 3 zettabytes (ZB).
This is predicted to grow exponentially over the next few years. The mainframe continues to play a critical role in support of this growth. It has a phenomenal ability to support millions of transactions per second and continues to be central to the data center’s strategy for managing this growth in data.
Predicting future usage
With this continued growth in data, one question that keeps cropping up is, “How can the data center manager predict how much hardware and storage will be required on a day-to-day or quarterly basis?”
The answer is: they do their best. Some managers use historical trends to determine peak usage and other factors, then purchase enough hardware to cover those peak times so they don’t run into the critical availability issues that arise when storage reaches maximum capacity.
While storage disks continue to drop in price, I don’t think any data center manager is happy to over-allocate hardware for “just in case” scenarios. In other words, they purchase hardware that is only used during peak times and sits idle the rest of the time, taking up expensive floor space and consuming data center power and cooling.
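To make that waste concrete, here is a back-of-the-envelope sketch in Python. The monthly figures are made up for illustration – not survey data – but they show how sizing for peaks leaves capacity idle most of the time.

```python
# Hypothetical monthly storage usage in TB; real planning would use your own telemetry.
monthly_usage_tb = [310, 325, 340, 400, 355, 370, 460, 385, 395, 410, 500, 430]

peak_tb = max(monthly_usage_tb)                      # size for the worst month
average_tb = sum(monthly_usage_tb) / len(monthly_usage_tb)
headroom = 1.20                                      # 20% "just in case" safety margin

provisioned_tb = peak_tb * headroom
idle_fraction = 1 - (average_tb / provisioned_tb)

print(f"Provisioned for peak: {provisioned_tb:.0f} TB")
print(f"Capacity sitting idle in an average month: {idle_fraction:.0%}")
```

With these example numbers, roughly a third of the provisioned capacity sits unused in a typical month – exactly the kind of hardware that still takes up floor space, power and cooling.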
What you do with the data
The creation of new data is definitely one of the issues for data center managers. But there is also the ongoing handling and archiving of data.
Many industries are heavily regulated at the federal and local levels when it comes to how long they have to retain data. Some data has to be stored for seven to 10 years, while other data, such as health-related records, has to be kept for 50 years or more.
These requirements mean the volume of data in long-term storage keeps growing. Once again, the responsibility falls on data center managers to find a long-term home for this data and to keep it maintained and recoverable on demand.
It’s an interesting thought: data written in a format generated by a current storage system will need to be recoverable and readable 50 years later. What type of device lasts 50 years? Why would a data center want to keep refreshing that old data into newer formats?
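As an illustration of how those retention windows translate into day-to-day policy, here is a small, hypothetical sketch in Python. The record types and retention periods are examples only, not a statement of what any particular regulation requires.

```python
from datetime import date, timedelta

# Hypothetical retention schedule in years; real periods depend on the
# regulations that apply to your industry and jurisdiction.
RETENTION_YEARS = {"financial": 10, "health": 50, "general": 7}

def purge_eligible(record_type, created, today=None):
    """Return True once a record has aged past its required retention period."""
    today = today or date.today()
    keep_for = timedelta(days=365 * RETENTION_YEARS.get(record_type, 7))
    return today - created > keep_for

# A health record archived in 2014 still has to be recoverable decades from now.
print(purge_eligible("health", created=date(2014, 11, 1), today=date(2040, 1, 1)))     # False
print(purge_eligible("financial", created=date(2014, 11, 1), today=date(2040, 1, 1)))  # True
```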
Take the data worries off your mind
CA Cloud Storage for System z addresses these pitfalls. Let the cloud providers worry about the data. It’s capacity on demand – you purchase a much smaller footprint, and you don’t have to worry about how to rehydrate the data into a readable format 50 years later when a random audit comes along.
See me at CA World
At CA World ’14, Microsoft will join me on stage on Tuesday, November 11, to give an overview of Azure as well as a preview of CA Cloud Storage for System z featuring Azure.
The future of cloud storage will work in everyone’s favor if vendors take a page out of Switzerland’s diplomatic playbook.
Now that you’ve seen how CA Cloud Storage for System z is “Switzerland,” where do you stand?
Image credit: CandyTian