The holidays are a time of year when you take a step back from the day-to-day and think about the big picture.

For enterprise executives, those thoughts should be swirling around the future of the data environment. At one time, data infrastructure was a fixed, known quantity, but those days are quickly coming to an end. In a virtual, distributed and largely automated future, organizations will be forced to reevaluate their data footing, and many may realize that owning those resources outright is no longer in the best interests of the enterprise.

At the very least, the data resources that do remain under the firm control of the enterprise will be configured along radically different lines than today’s rigid architectures. Emerson Network Power has identified four data center archetypes intended to address emerging Big Data, cloud computing, IoT and other challenges more effectively: the Data Fortress, in which security is a founding design principle; the “Cloud of Many Drops,” in which excess resources are allocated on a shared-service model; the Fog Computing architecture, which places processing and intelligence at the edge; and the Compliant Data Center, which stresses energy efficiency and carbon neutrality. All of these models will center on improved productivity, cost efficiency and increased agility.


But as cloud computing and IT outsourcing evolve, will we reach the point at which the data center – at least the monstrous collection of servers and storage we know today – ceases to offer a justifiable value proposition? It may very well come to that eventually, says Romonet CEO Zahl Limbuwala. Once organizations start to recognize the cost of maintaining both cloud and in-house infrastructure, the pressure will be on to provision the lower-cost option, which will drive utilization of local resources down even further and erode their value even more. It is a bit like renting an apartment while continuing to own a house that sits half empty.
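
To make the utilization argument concrete, here is a minimal back-of-the-envelope sketch in Python. Every figure in it (the owned-capacity size, the annual operating cost, the cloud hourly rate) is a hypothetical assumption chosen for illustration, not a number from Romonet or from this article.

    # Back-of-the-envelope comparison of owned vs. cloud cost per server-hour.
    # Every figure here is an illustrative assumption, not measured data.

    OWNED_ANNUAL_COST = 1_200_000        # assumed yearly cost of running the in-house facility ($)
    OWNED_CAPACITY_HOURS = 200 * 8760    # assumed 200 servers available around the clock
    CLOUD_RATE = 0.90                    # assumed on-demand cloud price per server-hour ($)

    def effective_owned_rate(utilization):
        """Cost per server-hour actually consumed, given a utilization fraction."""
        return OWNED_ANNUAL_COST / (OWNED_CAPACITY_HOURS * utilization)

    for utilization in (0.8, 0.5, 0.2):
        rate = effective_owned_rate(utilization)
        winner = "owned" if rate < CLOUD_RATE else "cloud"
        print(f"at {utilization:.0%} utilization: owned ${rate:.2f}/hr "
              f"vs cloud ${CLOUD_RATE:.2f}/hr -> {winner} is cheaper")

The point is not the specific numbers but the shape of the curve: the fixed cost of an owned facility is spread over fewer and fewer used hours as workloads migrate, so the effective price of local capacity keeps climbing.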

Once the enterprise comes to realize the true cost disparity between traditional data centers and the cloud, it will be very difficult to justify the data center for all but the most critical workloads – and even then, these will be supported on streamlined, modular systems that utilize only a fraction of the space and power currently reserved for data infrastructure.

No matter where data resides, users will likely shun it unless it meets their increasingly strict availability, quality-of-service and performance expectations, says IBM’s Avrilia Floratou, which means today’s data architecture has to shape up if it is to play a role in the new data economy. These characteristics are best delivered on new infrastructure rather than legacy retrofits, which has led many organizations to pursue all-software architectures running on commodity hardware. The open question is whether that approach gives up the opportunity for hardware and software to work together in support of key workloads.

The best solution just might be an integrated approach, which is becoming increasingly easy to model using advanced simulation and analytics techniques. In this way, enterprise executives can take a highly systematic approach to infrastructure deployment, testing the behavioral characteristics of a wide array of solutions against an equally wide array of workloads and operating conditions, and then deploying whatever works best.
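
As a sketch of how such a scenario sweep might look, the following Python snippet scores a handful of hypothetical infrastructure options against a set of demand scenarios and picks the one with the lowest expected cost. The candidate configurations, workload profiles and cost model are all invented placeholders, not anything described in the article.

    # A minimal sketch of the scenario-sweep idea described above. The candidate
    # configurations, workload scenarios and cost model are invented placeholders.

    CLOUD_OVERFLOW_RATE = 0.60  # assumed cloud price per server-hour for demand above owned capacity

    CANDIDATES = {
        "in-house modular": {"fixed": 900_000, "per_hour": 0.05, "capacity": 1_500_000},
        "public cloud":     {"fixed": 0,       "per_hour": 0.60, "capacity": None},
        "hybrid":           {"fixed": 450_000, "per_hour": 0.30, "capacity": None},
    }

    # Annual demand in server-hours and how likely we judge each scenario to be.
    SCENARIOS = [
        {"name": "steady", "demand": 800_000,   "p": 0.5},
        {"name": "growth", "demand": 1_400_000, "p": 0.3},
        {"name": "spiky",  "demand": 2_200_000, "p": 0.2},
    ]

    def annual_cost(cfg, demand):
        """Fixed cost plus usage cost; demand beyond owned capacity spills to the cloud."""
        cap = cfg["capacity"]
        if cap is not None and demand > cap:
            return cfg["fixed"] + cap * cfg["per_hour"] + (demand - cap) * CLOUD_OVERFLOW_RATE
        return cfg["fixed"] + demand * cfg["per_hour"]

    def expected_cost(cfg):
        return sum(s["p"] * annual_cost(cfg, s["demand"]) for s in SCENARIOS)

    for name, cfg in CANDIDATES.items():
        print(f"{name}: expected annual cost ${expected_cost(cfg):,.0f}")
    best = min(CANDIDATES, key=lambda name: expected_cost(CANDIDATES[name]))
    print("lowest expected cost:", best)

A real evaluation would replace the toy cost function with measured behavior, covering performance, availability and energy as well as cost, but the pattern of sweeping candidate configurations against workload scenarios and deploying the best fit is the same.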

No matter what happens, the bulky data infrastructure of today is not long for this world. Even the hyperscale facilities of Amazon and Google will count as models of efficiency considering the volumes of data those companies handle. And whether resources reside in the next room or across the country, they will still function as an integrated, cloud-based data ecosystem.

The new data environment, therefore, could very well be wherever, whenever and whatever you need it to be.


[Source: IT Business Edge]

By Adam