In The News

Big Data – Big Deal?

Big Data is still a thing, despite sounding so last year. In fact, our complete connectedness and the Internet of Things keep adding to the data pile, making it an indispensable part of every economy, industry, organisation and business.

If analysed properly, Big Data can allow businesses to gain a genuine competitive advantage. Indeed, it’s being actively captured by organisations in the hope that it will help them to better understand not only themselves but their customers, suppliers and partners.

According to a report by Novonous, the global Big Data market is set to grow at a CAGR of 36.67% by 2020. The European Big Data market holds the second largest share – in terms of revenue – of the global Big Data market, and Novonous predicts it will grow at a CAGR of 36.50% until 2020, maintaining that position.

So, clearly, Big Data is a big deal – but the conceptions that surround it are often woolly, so it can be hard to pin down exactly what data is ‘Big’ and what isn’t – and companies can end up pursuing analytical strategies that lead them up the garden path.

So, while Big Data is more than the emperor’s new clothes, it is the collective headache of our time as organisations struggle to come up with a slick, easily translatable and useful way to harness this Pandora’s Box of knowledge. It’s time to let the dog see the rabbit.

The concept of data analysis is nothing new – we’ve been doing it in one form or another for years – but Big Data is different. The term is used to describe granular and varied data sources that delve deep into the chasms of everything from socio-economic data to details on the performance of your IT infrastructure – data that, even if you knew how to access it, would take so long to reach that the effort alone would deter you.

Big Data can be seen as rich and detailed – something that can’t easily be pasted into an Excel document. Indeed, the fact that it’s ‘Big’ means that the usual management and analytical tools struggle to cope with it. Think about it for a second – we’re all pumping out data every second of the day. Digital sensors, recording devices, smart phones, Internet searches and social media are just a few culprits – and then there’s the Internet of Things. Increasingly, digital devices are infiltrating our homes – yes, I’m talking about you, Alexa – as well as tech such as Hive that lets you control your lighting and heating from your smart phone. They are all data sources that someone, somewhere will be trying to make sense of – now, or in the future. The trouble is, Big Data is spewing into the ether at a rate that Mr Bolt would struggle to keep up with.

However, one thing should never be forgotten. Data – no matter how big – is not the same as information. Accessing the untapped world of “machine data” will provide masses of raw data – the skill has always been, and will continue to be, giving that data meaning and structure and applying context to it. That’s when the data is transformed into information and insight and becomes the “Holy Grail” of real-time business insight.
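To make that step concrete, here is a minimal sketch in Python. The web-server log line, field names and the “needs attention” rule are all hypothetical illustrations, not anything described in this article – the point is simply how raw machine data gains structure and context on its way to becoming information:

```python
import re
from datetime import datetime

# Hypothetical example: one line of raw "machine data" from a web-server log.
raw = '203.0.113.7 - - [12/Jun/2017:10:15:32 +0000] "GET /checkout HTTP/1.1" 500 1042 0.87'

# Structure: name each field instead of treating the line as an opaque string.
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<timestamp>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" '
    r'(?P<status>\d{3}) (?P<bytes>\d+) (?P<seconds>[\d.]+)'
)

def to_information(line: str) -> dict:
    """Turn one raw log line into a structured record with added business context."""
    match = LOG_PATTERN.match(line)
    if match is None:
        return {"raw": line, "parse_error": True}
    record = match.groupdict()
    record["timestamp"] = datetime.strptime(record["timestamp"], "%d/%b/%Y:%H:%M:%S %z")
    record["status"] = int(record["status"])
    record["seconds"] = float(record["seconds"])
    # Context is what turns data into insight: flag failed or slow checkout requests.
    record["needs_attention"] = record["status"] >= 500 or record["seconds"] > 0.5
    return record

print(to_information(raw))
```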

Naturally, Big Data isn’t the only theme of the digital transformation we are all going through – the move to the cloud, as we all look for greater agility, responsiveness and cost-reduction opportunities, goes hand in glove with Big Data.

The rise of cloud computing and cloud data stores such as Google Cloud and iCloud has facilitated the emergence of Big Data. Cloud computing is the commodification of computing time and data storage by means of standardised technologies.

It has significant advantages over traditional, physical storage such as on-premises servers. However, cloud platforms come in several forms and sometimes have to be integrated with traditional architectures, which can pose problems if the right questions aren’t asked.

This can be tiresome for decision makers in charge of Big Data projects. Which cloud computing system is the optimal choice for their computing needs – and how should it be deployed – especially if it is a Big Data project? These projects regularly exhibit unpredictable or huge computing power and storage needs. At the same time, business stakeholders expect swift, inexpensive and dependable products and project outcomes. Added to this, you need the right team of scientists and architects in place to oversee a smooth transition from physical to cloud storage, to make sense of the reams of Big Data collected within it, and to present it in an easy-to-understand way to key internal and external stakeholders, who might not be au fait with the language or technology.

Ten years ago, a start-up that needed reliable, Internet-connected computing resources had to rent or place physical hardware in one or several data centres. Fast-forward to 2017, and anyone can rent computing time and storage on any scale. Most cloud services offer a pay-as-you-go model, allowing even the smallest businesses to use a supercomputer as and when they need to. Even better, cloud services and resources are globally distributed and accessible, meaning they are available to everyone…

Next time, we’ll be looking at why Big Data doesn’t take into account all the other data out there, including manually entered and unstructured data.

Up next

Gibbs Hybrid Joins the Atrium Family of Companies!
