Banking On Big Data Analytics


For a number of years, the purse strings of IT departments have been pulled tight in a bid to cut costs. The financial services industry in particular has made a concerted effort to improve its balance sheets through cutbacks and efficiency drives, putting a freeze on many IT projects deemed not ‘mission critical’.

However, 2013 should see the tide turn. The latest research from analyst house Ovum suggests that UK IT departments are planning to spend more on IT this year, with 76 per cent of IT professionals expecting their budgets to increase or remain the same in 2013, up from 57 per cent last year. Tackling the issues raised by Big Data, and the desire to take advantage of more convenient cloud technologies, are considered major drivers of this change – but how can CIOs make the most of their investment?

The fact is that the cracks are beginning to show for businesses that drastically reduced or completely froze IT investment – particularly financial institutions, which continue to rely on legacy technology. February saw Deutsche Bank and HMRC admit that legacy systems are holding them back in the quest to make the most of the data available to them, largely because of the difficulty of marrying their existing systems with more innovative and productive tools.

At the heart of this need for newer technologies is the increasing priority placed on risk management at board level. Banks are under pressure from a tightening regulatory noose – the likes of Basel III and the Vickers reforms – while IT glitches and rogue-trading scandals continue to dominate the headlines.

Despite their diverse origins, many of these challenges call for better data aggregation methods that can analyse information on the fly – something in-memory analytics makes possible. In-memory analytics is a form of operational intelligence applied to complex, time-intensive scenarios such as risk calculation, enabling these tricky computations to be completed in real time against the most accurate and up-to-date data available.
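To make this concrete, here is a minimal sketch in Python of the incremental, in-memory aggregation pattern described above. It is illustrative only: the names (ScenarioBook, add_trade, var) are invented rather than taken from any real product, and it assumes a historical-simulation model in which each trade contributes a vector of simulated P&L outcomes. The portfolio-level vector stays in memory, so VaR can be re-derived after every trade instead of in an overnight batch.

```python
# Minimal sketch of in-memory risk aggregation: each trade contributes a
# vector of simulated P&L outcomes, the portfolio vector is kept in memory,
# and VaR is re-derived on every update instead of in a batch job.
# All names here are illustrative, not a real vendor API.
import numpy as np

class ScenarioBook:
    def __init__(self, n_scenarios: int):
        # Aggregated P&L per scenario, held entirely in memory.
        self.pnl = np.zeros(n_scenarios)

    def add_trade(self, trade_pnl: np.ndarray) -> None:
        # Incremental update: one vector addition, no disk round trip.
        self.pnl += trade_pnl

    def var(self, confidence: float = 0.99) -> float:
        # Historical-simulation VaR: loss at the given percentile.
        return -np.percentile(self.pnl, (1.0 - confidence) * 100.0)

book = ScenarioBook(n_scenarios=10_000)
rng = np.random.default_rng(seed=42)
for _ in range(500):                       # trades arriving intraday
    book.add_trade(rng.normal(0.0, 1_000.0, 10_000))
print(f"99% VaR: {book.var():,.0f}")       # refreshed after every trade
```

The access pattern is the point: because the aggregate never leaves memory, each update costs a single vector addition, which is what makes intraday recalculation feasible.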

To make the most of in-memory analytics, financial institutions need to understand the new breed of cloud solutions being developed to bridge the gap between existing banking systems and modern data analysis. In particular, private clouds will provide a turning point for the market, with data hosted and managed in grids to absorb the large volumes of information being analysed at any one time.

The development of infrastructure-as-a-service (IaaS) means these grids are now well suited to cloud environments, given the sporadic nature of the workloads. Data aggregation is a natural candidate for cloud platforms: scaling out is relatively economical, and the infrastructure can grow almost without limit to accommodate intensive computation as businesses expand.

Private clouds should be considered a stepping stone to bigger cloud projects for intraday analysis. Cloud provides an ideal platform for fast in-memory analytics solutions (where data is managed in a computer's memory rather than on disk, for greater speed) and should be a top priority for institutions seeking to meet workload and cost demands. Ultimately, cloud makes the integration of real-time risk technologies less expensive, more scalable and simpler for the bank, and can consequently transform what has become a notoriously tricky integration process.

Industries outside the financial services arena are also seeing the benefits of in-memory technology. Complex processes and fast-moving data exist in many areas, such as supply chain management and e-commerce, and a variety of business cases now demand actionable intelligence delivered from multiple data sources on the fly. Examples include consolidating dense, siloed facets of the supply chain, or generating pricing intelligence on competitor products so that retailers' pricing structures entice consumers to buy.
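As a purely illustrative sketch of that pricing use case (the feeds, SKUs and prices below are invented), the core of the problem is merging several competitor price feeds in memory and flagging, on each refresh, every product where the retailer is currently being undercut:

```python
# Illustrative sketch of on-the-fly pricing intelligence: competitor price
# feeds are merged in memory and each of our SKUs is flagged the moment a
# rival undercuts it. Feed contents and SKU names are invented.
competitor_feeds = {
    "rival_a": {"SKU-1001": 19.99, "SKU-1002": 44.50},
    "rival_b": {"SKU-1001": 18.75, "SKU-1003": 9.99},
}
our_prices = {"SKU-1001": 19.49, "SKU-1002": 45.00, "SKU-1003": 9.49}

# Aggregate: cheapest competitor quote per SKU across all feeds.
best_rival = {}
for feed in competitor_feeds.values():
    for sku, price in feed.items():
        best_rival[sku] = min(price, best_rival.get(sku, float("inf")))

# Actionable intelligence: SKUs where we are currently undercut.
undercut = {sku: (ours, best_rival[sku])
            for sku, ours in our_prices.items()
            if sku in best_rival and best_rival[sku] < ours}
print(undercut)   # {'SKU-1001': (19.49, 18.75), 'SKU-1002': (45.0, 44.5)}
```

A production system would stream feed updates rather than rebuild the dictionaries, but the in-memory merge-and-compare step is the same.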

The financial services industry is well placed to be an early adopter of data analytics in the cloud. The regulatory and operational challenges that in-memory analytics can solve are certainly hastening the need for banks to understand the benefits of a cloud solution flexible enough to be adapted to a number of use cases and business scenarios. It's essential that financial institutions bank on the scalable benefits that private clouds offer, thereby lessening the impact of the issues associated with adopting new technologies.

Georges Bory

Georges Bory is Managing Director of Quartet FS and one of its four co-founders. He oversees the company's product innovation strategy and drives its diversification beyond its legacy in financial services. Prior to founding Quartet FS in 2005, Georges Bory was Managing Director for Western Europe at Summit Systems. From his background in the application world, Georges has developed a pragmatic vision of technology as a business enabler. Georges promotes Quartet FS' In-Memory Analytics solution as a universal backbone able to adapt easily to the ever-changing needs of business users. This holistic approach has already proven its value in the financial sector, where Quartet FS' flagship solution ActivePivot is able to meet needs as diverse and complex as CVA, VaR, and P&L. Georges is passionate about excellence and strives to maintain an open dialogue with his customers, his partners and his co-workers at all times.