Reporting And Analytics: 4 Trends That Will Shape 2014


1. Forget Sentiment Analysis, Sensors & Software Will Change The World

Much of the Big Data hype has focused on social media and sentiment analysis, in an effort to get closer to the customer and better understand the market in which an organisation competes. While this is a valid goal, relatively few organisations possess both the skills and the useful data patterns that add up to a material top-line difference.

Instead, the focus should be on the “Internet of Things”, for the transformative power it represents. Every day, I see more powerful examples of sensors and software in action. I prefer to describe it as “sensors + software” because these terms better symbolise the grittier, more real-world value that can be delivered by measuring, monitoring and better managing vast amounts of sensor-generated data.

Why is this important in 2014? Firstly, sensor technology has become remarkably inexpensive (an RFID tag, for instance, can cost as little as 50 cents, according to this report), which means more data points. Secondly, the data storage and analytic technology needed to capture and analyse this data is low cost and widely available (often in open source editions). Lastly, sensor-based data lends itself to correlation analysis, rather than a strict search for causation, which increases the potential for finding value in this machine-generated data.

Analysts predict vast and far-reaching economic value from making “Things” smarter and connecting them to the Internet. Why limit analysis to the words and attitudes of a relatively vocal few (social media and sentiment analysis), when you can analyse the actual behaviour of a much larger population (sensor data)? So, I believe a quiet revolution is already underway. In 2014, sensors + software will change the world.
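To make the correlation-over-causation point concrete, here is a minimal sketch using invented sensor readings (the two series and their values are hypothetical, not from any real deployment). Two machine sensors that rise and fall together produce a Pearson coefficient near 1.0, flagging a relationship worth investigating even before anyone can explain why it exists:

```python
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Hypothetical hourly readings from two sensors on the same machine.
temperature = [20.1, 21.3, 22.8, 24.0, 25.2, 26.9]
vibration   = [0.11, 0.13, 0.18, 0.21, 0.24, 0.29]

# Both series climb together, so the coefficient is strongly positive.
r = pearson(temperature, vibration)
print(f"temperature/vibration correlation: {r:.2f}")
```

Nothing here proves that heat causes vibration; the correlation simply surfaces a candidate pattern from machine-generated data, which is exactly the kind of value the trend describes.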

2. Cloud-Based Delivery Will Change The Analytics Consumption Pattern

As cloud-originated data growth accelerates, the attractiveness of using additional cloud-based data management and analysis services will grow too. I’ve previously written about an entirely new generation of platform and middleware software that will emerge to satisfy the new set of cloud-based, elastic compute needs within organisations of all sizes.

At this software layer, utility-based pricing and scalable provisioning models will commonly be chosen to match fees paid more closely with actual consumption. These improved economics, in turn, will enable broader use of platform and middleware software, especially reporting and analytics, than ever before possible.
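As a back-of-the-envelope sketch of why utility pricing broadens access, consider the comparison below. Every number is invented for illustration (no vendor's actual pricing is implied): a light user pays a fraction of a flat licence fee, while only heavy consumption tips the balance back toward a flat fee.

```python
# Illustrative sketch only: both rates below are hypothetical, not real pricing.
FLAT_FEE = 5000.0    # invented annual flat-licence cost
PER_REPORT = 0.05    # invented utility rate per report executed

def utility_cost(reports_run: int) -> float:
    """Under utility pricing, the fee scales directly with consumption."""
    return reports_run * PER_REPORT

# Light users pay far less than the flat licence; heavy users may not.
for usage in (10_000, 100_000, 200_000):
    model = "utility" if utility_cost(usage) < FLAT_FEE else "flat"
    print(f"{usage:>7} reports -> cheaper model: {model}")
```

The point of the sketch is the consumption-matching itself: small organisations that could never justify a large up-front licence can start with a bill proportional to use, which is what drives the broader adoption the trend predicts.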

Additionally, cloud-based delivery portends a level of simplicity and ease-of-use (consumer-like services) that defies the earlier generation of enterprise software, ushering in deeper consumption of analytics by organisations of all sizes. In short, cloud-based delivery becomes a key component of the quest for pervasive analytics – especially when those analytics are delivered as components within the web applications we use every day.

According to Nucleus Research: “As companies learn to take full advantage of the analytics functionalities that are now available with utility and subscription-based pricing options, they will continue to become more able to take advantage of market trends and opportunities before their peers and take advantage of the average return of $10.66 for every dollar spent in analytics.” In 2014, cloud-based delivery will change the analytics consumption pattern.

3. From Schema-Constrained To Idea-Constrained, The Real Big Data Opportunity

In the past (and too often today), we collected just the data that we could afford to store and for which we had a clear, known use. In this sense, we were hard-wired to winnow the data down to its most obvious and practical subset; thus, we were (and are) schema-constrained.

By this I mean that today we must know, in advance, the uses for data as it is being captured. This mindset leaves little-to-no room for the future, latent value that may exist within a data set. In physics, we recognise that energy has an immediate form (kinetic) and a stored, future form (potential). Why should data be any different?

As costs have declined and the power of technology has increased exponentially, we now have the ability to store and use ALL of the data, not just some of the data. But, we may not always know the value of this data while it is being captured.

That’s okay. The latent value of data will become more obvious each year and the technology now exists for this to be the norm. In this sense, the real big data opportunity is based on the scale of our ideas to put data to work, finding new correlation and value where it previously was not discernible.

Unlocking this new value becomes easier as the world is increasingly digitised; that is, we now regularly put very new data types to work: geo-position/location, sensor updates, click streams, videos and pictures, documents and forms, etc. Just a few years ago, almost none of this would have been considered “data”.
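The shift away from schema constraints can be sketched as "schema on read": capture raw events in full now, and impose structure only when a question arises later. The event records below are entirely hypothetical, but they show how a field that had no known use at capture time (a temperature reading) remains available to answer a question no one had thought to ask:

```python
import json

# Hypothetical raw event log: store everything as captured, with no upfront
# schema deciding which fields are "useful" (schema on read, not on write).
raw_events = [
    '{"ts": "2014-01-02T10:00", "device": "rfid-17", "dock": "A", "temp_c": 4.1}',
    '{"ts": "2014-01-02T10:05", "device": "rfid-17", "dock": "B"}',
    '{"ts": "2014-01-02T10:09", "device": "rfid-23", "dock": "A", "temp_c": 4.9}',
]

# Months later, a new question arises: which events carry temperature data?
# Because nothing was winnowed away at capture time, the answer is recoverable.
events = [json.loads(line) for line in raw_events]
with_temp = [e for e in events if "temp_c" in e]
print(len(with_temp))  # 2
```

Had a fixed schema stripped `temp_c` at ingest, that latent value would be gone; keeping the raw record makes the limiting factor the ideas we bring to the data, not the schema we chose up front.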

Commonly using all these new data types and searching for correlations that can positively impact business will shift the primary constraint to the quality and quantity of our ideas. In 2014, we’ll move from being schema-constrained to idea-constrained, more often finding the real value in Big Data.

4. Follow The Money, Business-Led Tech Innovation Will Disrupt & Drive Growth

Putting more data to work drives innovation. Innovation can transform processes, products, services and people. Our newfound ability to cost-effectively analyse and find hidden patterns in huge swaths of data will enable a new generation of business-led technology innovation. With this trend, the IT organisation must find new ways to integrate and collaborate within the enterprise, becoming an enabler of business-led innovation.

This collaboration is more important than ever as technology now defines the new economic battleground for every industry and organisation. Even Gartner’s latest predictions abound with a Digital Industrial Revolution theme and watchwords for CIOs and their IT organisations to either lead or get out of the way. It’s a bold new world.

All companies are now technology companies. Every organisation must put technology to work in ways that create distinction and competitive advantage. Evidence of this trend can be found in any industry-leading company today, where IDC says that business units already control 61% of tech spending. Fortunately, the technological barriers to entry have never been lower.

Organisations of all sizes now have affordable access to powerful enterprise tools, which levels the playing field, allowing even small companies to compete with the big guys (sometimes even more effectively, because of their nimbleness).

Leading enterprises will overtly staff, skill and organise to maximise innovative uses of technology – creating a cascade that will impact education, training and personnel management in all corners of the organisation. Even military organisations recognise that skill and expertise in data and analytics will remain central to personal advancement. The risk for all is that a concentration of ever-deeper technology skills will create digital haves and have-nots within industries, creating a difficult spiral for laggards.

Lastly, for business-led tech innovation to really flourish, many more knowledge workers than today must have access to just the right amount of data and analysis, at the right place and the right time (not too much, not too little). This promises to make everyone a more capable analyst and decision maker, regardless of job level, title or even skill. In 2014, analytics becomes a thing that you do, not a place that you go, and the need for intelligence inside the applications and business processes we use every day becomes an agent of business-led tech innovation.

Brian Gentile

Brian Gentile brings a successful, 27-year technology track record to Jaspersoft, helping it to become the open source business intelligence market leader, measured by commercial size and growth, production deployments of its software, the size and vibrancy of its community, and product downloads. Brian joined Jaspersoft as its first independent Board member in 2005 and then as CEO in 2007. Prior to Jaspersoft, Brian was Executive Vice President and Chief Marketing Officer at Informatica, the industry-leading data integration software company, where he helped the company grow consistently and substantially. Previously, Brian served as Executive Vice President and Chief Marketing Officer for Brio Software, a leading business intelligence software provider that was acquired by Hyperion.