The Big Data Conundrum: Big Challenges And Big Opportunities

Of all the societal transformations wrought by the Internet revolution, perhaps the most significant has been the rapid and permanent shift from an environment defined by information scarcity to one defined by information overload.

The era of “Big Data” is here, and a key element of success will be an organisation’s ability to navigate and make use of its data. According to recent research, the global Big Data market was worth US$6.3 billion in 2012 and is expected to reach US$48.3 billion by 2018, a compound annual growth rate of 40.5 per cent over that period.

Big Data’s exact definition depends a great deal on who is defining it. At its core, the term “Big Data” refers to a phenomenon that should be instantly familiar to organisations of all sizes – the ability to collect and store highly relevant, mission-critical data is far outpacing the ability to effectively process, analyse and leverage it to make informed business decisions.

Twenty years ago, success in business was, as often as not, determined by who could gather the best and most relevant data (about competitors, customers, emerging markets, etc.) in the timeliest fashion. Because analysing that data was comparatively simple, and a relatively homogenous process from one organisation to another, competitive differentiation came from who could find the best data first.

The Internet changed that paradigm in three critical ways. First, it globally democratised access to data, enabling many more players to gather similar relevant data; second, it exponentially increased the amount of relevant data that is generated, and could be collected and stored; third, it gave rise to tools and technologies that make it easier to analyse large amounts of unstructured data. Success is now determined less by who can find the best data than by who can make the best sense of the massive amounts of data available.

In many ways, the term “Big Data” may be the biggest understatement in the history of business. The amount of information described by that term is staggering. IBM estimates that human beings now generate more than 2.5 quintillion bytes of data every day, and that 90 per cent of the world’s total data has been generated in the past two years. The archive of the Library of Congress currently consists of 285 terabytes of data, and is growing at a rate of five terabytes per month (or about 60 terabytes a year).

Obviously the percentage of this massive, rapidly expanding global data store that is relevant to any individual business at any given time may be comparatively tiny. But that is precisely the point – developing the capacity to quickly identify and act on relevant information is the most pressing need for companies in the Big Data Age.

Fortunately, the business challenges posed by Big Data have spurred the creation of an impressive range of technological solutions. Some of the world’s most innovative companies have turned their efforts toward creating open source tools that allow organisations to analyse and process the data that is critical to their markets and their customers. Sorting through these options can be a data challenge in itself, but information professionals now have access to the tools they need to navigate the Big Data landscape.

Big Data In The Domain Name Space

Big Data is nothing new for anyone involved in the domain name industry. With more than 252 million registered domain names generating billions of Web pages, the Domain Name System (DNS) itself presents its own unique Big Data challenge, but also offers distinctive opportunities.

Today, companies can build intelligence into their DNS servers in order to analyse the abundance of data flowing into their systems. By analysing DNS transactions, companies can glean greater insight into precisely how domain names are being used, including their functionality, connectivity and reach, and which information users access most.
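As a minimal sketch of the idea, the snippet below aggregates DNS query records to show which names are requested most often. The log format here is hypothetical and greatly simplified (timestamp, client IP, queried name, query type); real DNS analytics would work over far larger volumes and richer fields.

```python
from collections import Counter

# Hypothetical DNS query log lines: "timestamp client_ip qname qtype"
log_lines = [
    "1700000000 203.0.113.5 www.example.com A",
    "1700000001 203.0.113.7 www.example.com AAAA",
    "1700000002 198.51.100.2 mail.example.net MX",
]

def queries_per_domain(lines):
    """Count how often each queried name appears in the log."""
    counts = Counter()
    for line in lines:
        _, _, qname, _ = line.split()
        counts[qname.lower()] += 1  # normalise case before counting
    return counts

print(queries_per_domain(log_lines).most_common(1))
# prints [('www.example.com', 2)]
```

Even this simple frequency count hints at how query volume can serve as a proxy for a domain name's reach and usage.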

Such intelligence can help companies make more informed decisions regarding their future business strategy or offer better services that meet their customers’ needs. And given that nearly every Internet transaction goes through DNS servers, that data source can become a true business differentiator, when analysed correctly.

In addition, DNS data can become an important tool in securing the network. Analysing network activity and traffic through DNS queries can help network administrators determine where malicious traffic originates and block the sources of Distributed Denial of Service (DDoS) attacks and spam.
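One simple way to operationalise this is to flag clients whose query volume is abnormally high, a common early signal of DDoS participation or abuse. The sketch below assumes query records have already been parsed into (client IP, queried name) pairs; the data and the threshold are illustrative, not a production detection rule.

```python
from collections import Counter

# Hypothetical parsed DNS log: (client_ip, queried_name) pairs
queries = [
    ("203.0.113.5", "example.com"),
    ("203.0.113.5", "example.com"),
    ("203.0.113.5", "example.com"),
    ("198.51.100.2", "example.org"),
]

def suspicious_sources(pairs, threshold=3):
    """Return client IPs whose query volume meets or exceeds threshold."""
    per_ip = Counter(ip for ip, _ in pairs)
    return sorted(ip for ip, count in per_ip.items() if count >= threshold)

print(suspicious_sources(queries))  # prints ['203.0.113.5']
```

In practice the threshold would be set relative to a baseline of normal traffic, and flagged sources would be investigated rather than blocked outright.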

Today, companies must focus not on their capacity to store massive amounts of data, but rather on their ability to turn that data into meaningful and insightful information. Significant advances are happening in the way we understand and analyse the DNS environment, and important steps are being taken toward managing the addressing system’s own unique Big Data challenges. As the challenges continue to evolve, so too will the tools, as technologists work to keep decision-makers one step ahead of the Big Data deluge, turning it into a major business opportunity.

Dr. Burt Kaliski Jr., senior vice president and chief technology officer, is responsible for developing Verisign's long-term technology vision. He is the leader of Verisign Labs, which focuses on applied research, university collaboration, industry thought leadership and intellectual property strategy. He also facilitates the technical community within Verisign. Prior to joining Verisign in 2011, Kaliski served as the founding director of the EMC Innovation Network, the global collaboration among EMC’s research and advanced technology groups and its university partners. He joined EMC from RSA Security, where he served as vice president of research and chief scientist. Kaliski holds a Bachelor of Science in computer science and engineering, a Master of Science in electrical engineering and computer science, and a doctorate in electrical engineering and computer science from the Massachusetts Institute of Technology, where his research focused on cryptography.