CIO East Africa (Nairobi)

13 December 2012

East Africa: Big Data, Huge Opportunity and a Threat Too

For any company to be competitive in both local and global economies today, it needs to factor big data into its future operations. In other words, it needs to look at what today's sales data look like, not last month's, and at the kind of feedback its sales people bring back to the organisation.

Dannie Stelyn, Intel Corporation General Manager for East and Sub-Saharan Africa, says big data is today's single biggest opportunity, and a threat too, for any company. "It is an opportunity because it increases your top line today; it helps you understand what is happening and make more informed decisions, as well as making you much more agile," says Stelyn.

He notes that big data will give a company an advantage over competitors who are not using it, adding that if you ignore it, it will become a big challenge for you, since other companies will adopt it.

Meet the new boss:

In September 2012 The Wall Street Journal published an article, "Meet the New Boss: Big Data", and it is recognised as a big deal for companies: they will no longer have to rely on executives' intuition about how successful or not a company will be.

Big data helps you decide which direction to take your business. Big data is the new boss, and it is completely changing how business is done.

What are companies doing about it?

"We were used to being unstructured; we then moved to IT becoming the business, and extremely dependent on what kind of product companies deliver. Therefore you can no longer be for example in a manufacturing industry without your IT environment being in place", says Stelyn. This is the case because the competitiveness of a company's IT environment has a direct effect to its income.

Take what is happening at BMW as an example: the company has some 100,000 employees worldwide, so its IT environment had to support those 100,000 people. This year, however, it has 1 million vehicles that it is adding as customers to its IT environment.

This is because all these vehicles are connected to various IT platforms. By 2015 there will be 10 million connected vehicles which, among other data mining features, will be able to upload and pull large amounts of data to and from the cloud, and by 2018 the company foresees that they will generate 1 terabyte of data a day. So BMW is looking at how it can make use of this data.

What to watch:

Companies therefore need to watch out for four things in this IT transformation era: self-service, customers, on-demand services and resilience.

In the 90s, the IT environment was a mess; it was not nicely integrated. Between 2000 and 2010, IT shifted into the virtualisation era, and for six years it was all talk; companies only implemented virtualisation at the end of this period.

There was, however, a substantial improvement in IT utilisation over that period. IT is now becoming a service environment: the network, storage and processing facilities become a complete unit as opposed to separate systems that do computing, storage and so on. Even network switching is no longer traditional; it is more like a server. Companies therefore need to converge towards new capabilities, competitiveness and working capabilities in big data.

These three things are imperatives for big data and for any company to be competitive. According to Stelyn, companies can no longer divorce their IT environment from doing business. A company's IT cost has implications for its product; that is why a number of people are moving away from proprietary systems to open-standard solutions, and this is what is driving efficiency.

Another aspect is to look at a company's competitiveness versus the cloud environment. For example, how competitive is a company investing in its own data centre compared with doing the same in the cloud?

"Cloud services in Kenya are a reality especially now that the country is cabled. Having your own data center rather than the cloud is no longer a CAPEX question but an efficiency question. So the current existing benchmark is a baseline of 10 cents per Vmh. This has come down a lot in the past year and Intel foresees that it will come down further", says Stelyn.

The cloud or in-house virtual hosting:

Now the big question is: if companies have their own data centre, can they measure the cost per VM? CIOs need to look at how they quantify the cost of running their VMs, at what point they should decide what to keep in the cloud, and which applications they should host in-house.
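
To make that comparison concrete, here is a rough Python sketch of how a CIO might estimate an in-house cost per VM-hour and set it against the 10-cents-per-VM-hour cloud baseline Stelyn cites. Every input figure below (server price, lifetime, consolidation ratio, running costs, utilisation) is an assumed example, not a number from Intel or from this article.

# Illustrative sketch (figures assumed, not from the article): compare an
# in-house cost per VM-hour with the 10-cents-per-VM-hour cloud baseline.

def in_house_cost_per_vm_hour(
    server_capex,            # purchase price of one server (USD)
    server_lifetime_yrs,     # depreciation period in years
    vms_per_server,          # average VMs consolidated onto one server
    yearly_opex_per_server,  # power, cooling, space, admin (USD per year)
    utilisation=0.7,         # fraction of the year the VMs are actually used
):
    hours_per_year = 365 * 24
    yearly_capex = server_capex / server_lifetime_yrs
    yearly_total = yearly_capex + yearly_opex_per_server
    effective_vm_hours = vms_per_server * hours_per_year * utilisation
    return yearly_total / effective_vm_hours

CLOUD_BASELINE = 0.10  # USD per VM-hour, the benchmark quoted in the article

# Hypothetical data-centre figures, purely for illustration.
in_house = in_house_cost_per_vm_hour(
    server_capex=6000,
    server_lifetime_yrs=4,
    vms_per_server=20,
    yearly_opex_per_server=2500,
)

print(f"In-house: {in_house:.3f} USD per VM-hour vs cloud baseline {CLOUD_BASELINE:.2f}")
print("Cheaper to stay in-house" if in_house < CLOUD_BASELINE else "Cheaper to go to the cloud")

With these made-up inputs the in-house figure works out to roughly 3 cents per VM-hour, but the point is the method, not the numbers: each CIO would plug in their own costs.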

Regulations that restrict which data can be kept online or offline will change; they will no longer be a problem in future. The CIO will then decide what the cost-effective solution is, and whether to build their own solution in the data centre to make sure they remain competitive.

With supercomputing, companies will be able to decide when to switch their applications to the cloud and when to keep them in their own data centre, depending on time and cost. This is the competitive side of it.

Traditionally we had storage, servers and networks: the three big areas CIOs considered when they looked at data centre environments. However, the lines between these areas have completely blurred, in the sense that you very often can no longer tell the difference between your data centre's object store, distributed analytics, databases and actual servers. They are blurring because of open standards, which have allowed companies to do things more cost-effectively.

The other reason this is the case is that companies have started to acquire devices that do more than one of these things. Switches and routers are becoming smart, so the bigger question is how companies are going to handle the data these devices generate. There are a few ways in which storage is being addressed so that big data can be handled efficiently. One of them is intelligent tiering, where important data is kept where it can be crunched rather than being stored in the backend and brought forward into a new environment every time.
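
As a toy illustration of the tiering idea (the promotion policy, threshold and tier names below are assumptions, not details of any Intel product), data starts on the cheap backend tier and is promoted to a fast tier once it proves to be frequently accessed:

# Toy sketch of intelligent tiering: frequently accessed ("hot") data is
# promoted to a fast tier; rarely accessed data stays on the cheap backend.

HOT_THRESHOLD = 3   # promote after this many accesses; an assumed policy

fast_tier = {}      # e.g. SSD or in-memory cache
backend_tier = {}   # e.g. bulk disk or archive storage
access_counts = {}

def put(key, value):
    backend_tier[key] = value  # everything lands on the cheap tier first

def get(key):
    access_counts[key] = access_counts.get(key, 0) + 1
    if key in fast_tier:
        return fast_tier[key]
    value = backend_tier[key]
    if access_counts[key] >= HOT_THRESHOLD:
        fast_tier[key] = value  # promote hot data so it can be crunched in place
    return value

put("daily_sales", [120, 340, 560])
for _ in range(4):
    get("daily_sales")
print("hot keys:", list(fast_tier))  # -> ['daily_sales']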

Thin provisioning is another important concept, where a small amount of storage is allocated initially and then scaled on demand as the application requires. Real-time compression is another aspect of dealing with big data, where one needs to decompress lots of data in real time off storage; and because of the capacities involved, the solutions have become more powerful, for example de-duplication.
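
De-duplication in particular is easy to sketch: identical chunks of data are stored once and referenced by a content hash, so the physical footprint can be far smaller than the logical one. The chunk size and hashing scheme below are illustrative choices, not taken from the article.

# Minimal sketch of block-level de-duplication: identical chunks are stored
# once and referenced by their content hash.
import hashlib

CHUNK_SIZE = 4096  # bytes; real systems choose this carefully

chunk_store = {}   # hash -> chunk bytes (each unique chunk stored once)

def write_deduplicated(data: bytes):
    """Split data into fixed-size chunks, store each unique chunk once,
    and return the list of chunk hashes needed to reassemble the data."""
    recipe = []
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        chunk_store.setdefault(digest, chunk)  # only new chunks consume space
        recipe.append(digest)
    return recipe

def read_deduplicated(recipe):
    """Reassemble the original data from its chunk hashes."""
    return b"".join(chunk_store[d] for d in recipe)

# Example: two "files" sharing most of their content.
file_a = b"A" * 8192 + b"unique tail A"
file_b = b"A" * 8192 + b"unique tail B"
recipe_a, recipe_b = write_deduplicated(file_a), write_deduplicated(file_b)

logical = len(file_a) + len(file_b)
physical = sum(len(c) for c in chunk_store.values())
print(f"logical {logical} bytes, physical {physical} bytes after de-duplication")
assert read_deduplicated(recipe_a) == file_a and read_deduplicated(recipe_b) == file_b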

Front end:

"Now that you have your data at the backend what are you doing in the frontend? This year (2012) we are shipping 10 gigabyte ports than 1 gigabyte ports, what this will allow companies to consolidate different environment the same way virtualisation does." says Stelyn.

High bandwidth will also help them consolidate their data centre capacity, which means companies can do more virtualisation over the infrastructure and achieve much better efficiency in that space. Much better throughput between the storage system, the switching system and the data environment is what addresses convergence.

Why now:

Two things. First, the cost of a server came down to 40% of what it was between 2000 and 2010, and that includes traditional proprietary RISC-architecture solutions. Second, storage cost has come down by 90% over the last 10 years. When the two are combined, companies' affordability in the data centre environment is substantially higher.
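
Reading Stelyn's figures as servers now costing roughly 40% of their 2000 price and storage roughly a tenth of its price a decade ago, a small worked example shows how quickly the two compound. The cluster make-up and 2000-era prices below are invented for illustration.

# Worked example: combined effect of the quoted cost declines on a small
# analysis cluster. All prices are hypothetical.

SERVER_FACTOR = 0.40   # servers assumed to cost 40% of their 2000 price
STORAGE_FACTOR = 0.10  # storage assumed to cost 10% of its price 10 years ago

# Hypothetical 2000-era outlay for a 10-node cluster with 50 TB of storage.
old_server_cost = 10 * 25_000    # USD
old_storage_cost = 50 * 10_000   # USD, at an assumed 10,000 USD per TB

old_total = old_server_cost + old_storage_cost
new_total = old_server_cost * SERVER_FACTOR + old_storage_cost * STORAGE_FACTOR

print(f"Then: {old_total:,} USD  Now: {new_total:,.0f} USD "
      f"({new_total / old_total:.0%} of the original outlay)")

On these assumptions the same cluster costs about a fifth of what it would have a decade earlier, which is the sense in which big data analysis has moved from prohibitively expensive to affordable.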

Companies can now analyse big data. Five years ago a big data analysis would have been really costly; now it is cost-effective. At the same time, according to IDC, spending on innovation in this specific space will reach USD 17 billion by 2015, almost double the amount being spent this year. By combining innovation and affordability, companies are able to achieve most of these things with additional improvements.

One such innovation from Intel in conjunction with Oracle, for example, is the Big Iron solution, an eight-processor system that can do five million tpm; one million tpm was only achieved five years ago, which shows the speed at which things are moving. On the same benchmark, the second-placed device is a 32-processor solution.

"With such solutions in partnership with Oracle, Microsoft, IBM among other companies, open standards allows us to do huge data capacity crunching." Among these solutions in partnership with Oracle we have in memory computing capabilities, this means in that system, one has 1 terabyte of memory capacity out of the hard drive

You can pour all your big data in there, crunch it and have real time views of what is happening in the market.
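
As a minimal sketch of what in-memory computing buys you, assume the whole working set has been loaded into RAM once; every "real-time view" is then just another aggregation over data already in memory, with no trip back to storage. The dataset, columns and figures below are invented.

# Minimal sketch of the in-memory idea: hold the working set in RAM and
# answer each query by scanning memory only, with no per-query disk I/O.
import random
from collections import defaultdict

# Pretend this was loaded once from storage into memory.
sales = [
    {"region": random.choice(["Nairobi", "Mombasa", "Kisumu"]),
     "amount": random.uniform(10, 500)}
    for _ in range(1_000_000)
]

def revenue_by_region(rows):
    totals = defaultdict(float)
    for row in rows:          # scans memory only
        totals[row["region"]] += row["amount"]
    return dict(totals)

# Each "real-time view" simply re-runs the aggregation over the in-memory data.
print(revenue_by_region(sales))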

Copyright © 2012 CIO East Africa. All rights reserved. Distributed by AllAfrica Global Media (allAfrica.com).
