2015 Data Center Predictions
In 2014 I made some fairly granular technology predictions about the data center; I will review those predictions in a separate post. For 2015 I would like to make some higher-level trend predictions based on conversations I have had with industry peers and customers, as well as the marketing materials I read from today’s popular technology vendors. Vendors always push the market in a certain direction, but the rate at which customers adopt their ideas depends largely on the economy, the organization’s capacity for change, and how easily those ideas and the products that follow can be understood, implemented, and consumed by the targeted customers.
There is a lot of value in data, and the Big Data craze has enlightened some organizations while leaving others wondering what people are talking about and how to put Big Data to use. Tools from companies such as New Relic, Splunk, and Tableau are emerging to help collect and comprehend information from disparate data sources. These dashboards let different members of the organization capture and display information visually in order to make better-informed decisions, and more people within organizations will demand a personalized dashboard and the ability to pose questions to a system that answers from the data. Static dashboards and reports have existed for a while in certain pockets of the organization: financial reports, sales pipelines, ticket queues, service response times. But they are usually produced in a very manual, traditional way, running basic queries and information processing against similar, collocated data in a single existing system. The power people are discovering in Big Data queries over large volumes of unstructured and loosely collocated data is information discovery and realization that can inform decisions on the fly, and the query tools and computing horsepower to make this a reality exist today. One of the biggest hurdles for an organization is getting its information into a Big Data analytics system in the first place; analysts such as those at Freakonomics say it takes six months today just to find and compile the data within an organization. At RackTop we are adding capabilities to our data ecosystem to make that step easy, automatic, and instantaneous, because of the value we see in data. With the metadata about your data, we will enable our customers to use Big Data analytics easily.
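The kind of ad-hoc question described above can be sketched in a few lines. This is a minimal, hypothetical illustration (the ticket records, log lines, and service names are invented): one source is structured ticket data, the other is free-form log text, and combining the two answers a question neither source answers alone.

```python
from collections import Counter
import re

# Hypothetical samples from two disparate sources: a structured
# ticket export and unstructured application log lines.
tickets = [
    {"id": 1, "service": "web", "status": "open"},
    {"id": 2, "service": "db", "status": "closed"},
    {"id": 3, "service": "web", "status": "open"},
]
log_lines = [
    "2014-12-01 ERROR web timeout after 30s",
    "2014-12-02 INFO  db checkpoint complete",
    "2014-12-03 ERROR web connection refused",
]

def errors_by_service(lines):
    """Pull a per-service error count out of free-form log text."""
    counts = Counter()
    for line in lines:
        m = re.search(r"ERROR\s+(\w+)", line)
        if m:
            counts[m.group(1)] += 1
    return counts

def open_tickets_by_service(rows):
    """Count open tickets per service from the structured export."""
    counts = Counter()
    for row in rows:
        if row["status"] == "open":
            counts[row["service"]] += 1
    return counts

# The ad-hoc question: which services have both open tickets
# and recent errors?
errors = errors_by_service(log_lines)
open_tix = open_tickets_by_service(tickets)
hot_spots = {svc: (open_tix[svc], errors[svc])
             for svc in open_tix if errors[svc] > 0}
print(hot_spots)  # {'web': (2, 2)}
```

In practice the hard part is exactly what the paragraph above identifies: landing both sources in one analytic system so a query like this can run at all.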
Depending on your background, you will likely have a perspective that colors what you think of first when you hear “data protection.” Fifteen years ago the first thing that would have popped into my mind was something akin to data backup. Going forward, however, data protection will be about guarding against loss from system error, user error, or natural disaster, as well as loss due to theft or unauthorized access. Organizations need to protect against data becoming unusable and against someone walking away with a copy of it, a risk highlighted many times over in the news by big security breaches at Sony, Home Depot, and others. Security is always best in layers, with defense in depth, so I predict that more organizations will adopt encryption of data at different layers, more refined and advanced access controls, and intrusion detection systems. At RackTop we are doing our part to make sure our customers protect their data in accordance with their business priorities and policies. Our data replication technology ensures data is replicated to the appropriate on-premises appliances or data clouds at intervals driven by the rate of data change. Our systems support inline AES-256 encryption to ensure your data won’t “walk away,” and our read-only operating system presents the smallest possible attack surface and unmatched protection against malware. It is important to restrict access to data to the smallest set necessary to conduct business, and features such as multifactor authentication and attribute-based access controls, rather than just role-based access controls, will become more mainstream.
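The difference between role-based and attribute-based access control can be sketched briefly. This is a minimal illustration with hypothetical roles, attributes, and policy, not any particular product’s implementation: RBAC asks only whether the user holds a named role, while ABAC evaluates a policy over attributes of the user, the resource, and the context (including, here, whether multifactor authentication was completed).

```python
def rbac_allows(user_roles, required_role):
    """RBAC: access hinges solely on membership in a named role."""
    return required_role in user_roles

def abac_allows(user_attrs, resource_attrs):
    """ABAC: access is a policy over user, resource, and context
    attributes (the specific rules here are hypothetical)."""
    return (
        user_attrs["department"] == resource_attrs["owner_department"]
        and user_attrs["clearance"] >= resource_attrs["sensitivity"]
        and user_attrs["mfa_verified"]  # multifactor auth as an attribute
    )

user = {"roles": {"analyst"}, "department": "finance",
        "clearance": 2, "mfa_verified": True}
doc = {"owner_department": "finance", "sensitivity": 2}

print(rbac_allows(user["roles"], "analyst"))  # True: holds the role
print(abac_allows(user, doc))                 # True: attributes line up
doc["sensitivity"] = 3
print(abac_allows(user, doc))                 # False: clearance too low
```

Note how the ABAC decision changes when one attribute of the resource changes, with no edits to roles or group memberships — that finer granularity is what the prediction above is about.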
Most things in life are better in moderation, and 2015 will see more organizations adopt the hybrid cloud model. For a large organization with critical data, or simply lots of data, a hybrid cloud implementation often makes the most sense for several reasons. Even with declining cloud prices, a large organization will pay significantly more to run 100% of its operations out of the cloud, and it will still need an IT staff to run those operations. There are, however, several IT services that are cheaper to run out of the cloud regardless. Organizations can often run temporary, surge, test, or development workloads in the cloud and save money, because that capacity is not needed full time and the organization is spared from buying extra capacity it would only grow into later. The biggest and most effective use case for the cloud in 2015 will be disaster recovery and business continuity. Business continuity in the cloud allows the organization to run operations in a location far from its primary data centers without maintaining a distant staff or facility, yet with the on-demand capacity to run operations at a critical time for less than the cost of buying and implementing a duplicate facility.
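The cost argument for cloud-based disaster recovery comes down to simple arithmetic: you pay a small standby cost year-round and full compute costs only while failed over, instead of capitalizing and staffing a duplicate facility. The figures below are entirely hypothetical, chosen only to show the shape of the comparison.

```python
# Back-of-the-envelope comparison (all prices hypothetical) of a
# duplicate physical DR facility versus warm standby in the cloud.

duplicate_site_capex = 500_000.0   # build-out and hardware, one time
duplicate_site_opex = 120_000.0    # per year: power, space, remote staff

cloud_standby_per_year = 24_000.0  # storage + replication while idle
cloud_active_per_day = 2_000.0     # full compute, only during failover
expected_failover_days = 14        # assumed outage budget per year

cloud_per_year = (cloud_standby_per_year
                  + cloud_active_per_day * expected_failover_days)

years = 3
physical_total = duplicate_site_capex + duplicate_site_opex * years
cloud_total = cloud_per_year * years

print(f"physical DR over {years}y: ${physical_total:,.0f}")
print(f"cloud DR over {years}y:    ${cloud_total:,.0f}")
```

With these assumed numbers the cloud approach wins by a wide margin; the real decision depends on an organization’s actual capital costs, data volumes, and tolerated failover time.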
CIO/CTO seat at the table
Many analysts and surveys talk about line managers in other departments having control over the technology budget. The context of the survey or report always seems to be that today’s organization realizes technology is an enabling function of the business, so the technology department now answers to, for example, the operations or sales department. I couldn’t agree more that IT needs to be an enabling, accelerating function of the organization. In many companies the IT department is a support organization for the rest of the company and does not generate profit itself. However, the CIO needs an equal seat at the table, and internal organizations must act as equal partners with the IT organization at every level, working collaboratively and iteratively toward the best outcome. DevOps captures this in the way it stresses communication, collaboration, and integration between software developers and IT professionals. In fact, I would take it a step further to include the end user or customer organization. Rather than taking requests from the customer organization and throwing a response back over the organizational fence, IT must work with every department to create the optimal solution for the whole company, one that makes the company more efficient, more effective, and better able to deliver its products and services. The most successful companies in 2015 will force the IT department and the other departments to work together to understand the business problem and then create and implement a solution.