The most-discussed trends hitting the enterprise next year will surely be Big Data and the Internet of Things, but whether these twin movements will end with a bang or a fizzle remains uncertain.

The run-up has already garnered its fair share of attention, so expect strong efforts in the coming year to get the infrastructure and architecture in place and then ramp up the capabilities that turn raw data into actionable intelligence.

Tableau’s Bob Middleton says to watch for Hadoop clusters to move from the lab to full production, which is when the rest of the world will find out whether the platform scales as well in disparate cloud environments as it does in the more homogeneous environments of Google and Yahoo. Middleton also expects Hadoop to quickly become a core part of the IT landscape, which means organizations will have to augment the functions surrounding analytics, security above all, to account for that status. The shift is already visible in projects like Apache Sentry, which aims to enforce fine-grained authorization over data and metadata within the Hadoop cluster, and in tools like Cloudera’s Impala and Actian’s Vector, which are working to bring legacy OLAP database functionality to Hadoop.
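To make the security piece concrete, Sentry’s model is role-based: privileges on databases, tables, or columns are granted to roles, and roles are assigned to groups. Below is a minimal sketch of what that could look like when the statements are issued from Python through the impyla client; the host, port, database, role, and group names are all hypothetical, and the exact syntax depends on the Sentry and Impala versions in play.

```python
# A hedged sketch of Sentry-style, role-based authorization for Hadoop,
# issued as SQL through Impala via the impyla client. Host, port, database,
# role, and group names are placeholders, not taken from the article.
from impala.dbapi import connect

conn = connect(host="impala-coordinator.example.com", port=21050)
cur = conn.cursor()

# Create a role, grant it read-only access to one database, and tie it to
# an OS/LDAP group -- the heart of Sentry's fine-grained authorization model.
cur.execute("CREATE ROLE sales_analyst")
cur.execute("GRANT SELECT ON DATABASE sales TO ROLE sales_analyst")
cur.execute("GRANT ROLE sales_analyst TO GROUP analysts")

cur.close()
conn.close()
```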


The final payoff, of course, will be the keen insights that business leaders stand to gain once they uncover the hidden connections buried in their massive data volumes. According to ZDNet’s Brian Hopkins, developments like machine learning will cut the delay between data generation and insight by replacing many of the manual processes that currently slow things down, a shift supplemented by techniques like streaming ingestion and distributed analytics on platforms like Kafka and Spark. This will lead to a vibrant market for algorithms and other analytics tools, although, again, the learning curve could be steep before insight is turned into meaningful action.
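As a rough illustration of that Kafka-plus-Spark pattern, the sketch below uses Spark Structured Streaming to read events from a Kafka topic and aggregate them over one-minute windows. The broker address, topic name, and aggregation are assumptions made for the example, and the job would also need the spark-sql-kafka connector package on its classpath.

```python
# A minimal sketch of streaming ingestion from Kafka with Spark Structured
# Streaming. Broker, topic, and the windowed count are illustrative only.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, count, window

spark = SparkSession.builder.appName("streaming-insight-sketch").getOrCreate()

# Subscribe to a Kafka topic; Kafka delivers key/value as binary blobs.
events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "localhost:9092")
          .option("subscribe", "sensor-events")
          .load())

# Cast the payload to a string and keep the broker-supplied timestamp.
parsed = events.selectExpr("CAST(value AS STRING) AS payload", "timestamp")

# Count events per one-minute window -- a stand-in for the "insight" step.
counts = (parsed
          .groupBy(window(col("timestamp"), "1 minute"))
          .agg(count("*").alias("events")))

# Write the running counts to the console; a real job would feed a dashboard
# or a downstream model instead of stdout.
query = (counts.writeStream
         .outputMode("complete")
         .format("console")
         .start())
query.awaitTermination()
```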

But could disillusionment set in if, as is likely, Big Data and the Internet of Things fail to produce major opportunities or redefine business models right away? Timo Ahomäki, CTO of telecom systems provider Tecnotree, says he is already seeing the signs in existing deployments, particularly when it turns out that the data being analyzed has little or no intrinsic value to begin with. Most likely, organizations will need to shift their attitude away from grand strategic initiatives and toward more limited objectives. Call it “small data” or “smart data,” but it will invariably mean capitalizing on micro-opportunities that leverage real-time data for small groups or individuals, rather than predicting individual behavior from generalized populations.

Real-time data is a key element in this endeavor, says the Wise Marketer newsletter, but other factors are at play as well, and not all of them may be to the enterprise’s liking. For one thing, streaming data will have to be made available to many more people in order to avoid information bottlenecks, which may require a rethinking of data management and access rules. Analytics will also have to leave the purview of the data scientist and trickle down to the average knowledge worker, or to an automated system empowered to make split-second decisions about products and pitches. And all of this is likely to yield myriad incremental value propositions rather than a grand, sweeping boost in activity.
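What such an automated, per-event decision could look like is sketched below: a single Python function that turns one real-time event into an offer. The event fields, thresholds, and offer names are invented for illustration; a real system would derive them from the organization’s own data and models.

```python
# An illustrative per-event decision rule of the kind an automated system
# might apply to streaming data. Field names, thresholds, and offer names
# are invented placeholders, not drawn from anything in the article.
from dataclasses import dataclass
from typing import Optional


@dataclass
class Event:
    customer_id: str
    cart_value: float        # value of the customer's current basket
    minutes_on_site: float   # time spent in the current session


def decide_offer(event: Event) -> Optional[str]:
    """Return an offer for this event, or None if no action fits."""
    if event.cart_value > 100 and event.minutes_on_site > 10:
        return "free-shipping"          # high-intent shopper: nudge to checkout
    if event.cart_value > 0 and event.minutes_on_site > 30:
        return "ten-percent-discount"   # lingering shopper: sweeten the deal
    return None                         # most events trigger no action


# One event in, one split-second decision out.
print(decide_offer(Event("c-42", cart_value=120.0, minutes_on_site=12.0)))
```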

So where will that leave the enterprise at the end of the year? Pretty much where it has been with previous, paradigm-shifting technologies at this stage of deployment: lots of new toys, but only the slightest inkling as to how to use them effectively.


[Source: ITBusinessEDGE]

By Adam