Sept 21, 2013
I just created a cloud at home. Suppose I start on a project aiming to create a computer game. I purchase 4 servers and some software. After a couple of weeks I realize that in order to complete the project I’ll need 6 more servers, but I have run out of money. I decide to write an operating system that connects the 4 servers and creates a single virtual platform, a simplified version of virtualization software such as VMware. I won’t get into the nuts and bolts; suffice it to say that I can now create as many as 12 virtual servers, be they Windows, Linux, or UNIX. Note that the underlying hardware hasn’t changed; rather, I am making more efficient use of it. The cloud masks the infrastructure beneath it. I am going to proceed with my example on the premise that I’ll remain on a private cloud (not sharing resources with other systems in a data center).
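The core trick of packing 12 virtual servers onto 4 physical ones can be sketched in a few lines. This is a toy illustration of capacity slicing, not how VMware or any real hypervisor actually works; the host names, core counts, and first-fit rule are all assumptions for the example.

```python
# Toy sketch of server virtualization: pack many virtual servers onto a
# few physical hosts by slicing up their capacity. All numbers and names
# are illustrative, not a real hypervisor's behavior.

def place_vms(hosts, vms):
    """First-fit placement: assign each VM to the first host with room."""
    placement = {}
    free = dict(hosts)  # remaining CPU cores per physical host
    for vm, cores in vms:
        for host in free:
            if free[host] >= cores:
                free[host] -= cores
                placement[vm] = host
                break
        else:
            raise RuntimeError(f"no capacity left for {vm}")
    return placement

# 4 physical servers with 8 cores each comfortably hold
# 12 small virtual servers of 2 cores each (24 of 32 cores used).
hosts = {f"server{i}": 8 for i in range(1, 5)}
vms = [(f"vm{i}", 2) for i in range(1, 13)]
print(place_vms(hosts, vms))
```

The point of the sketch is simply that the 4 physical boxes never change; only the bookkeeping layer on top of them decides how many "servers" the applications see.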
I complete my video game project and subsequently form a small business offering the game online. A year has gone by and my business is booming but I no longer have the time or the bandwidth to do all the work myself. I weigh the pros and cons, and I decide to move my application to Amazon and let them run it (Software as a Service). Although I have to pay for the service, it is less expensive than running it in-house. Besides, since I don’t have to worry about my application, I can focus on marketing. I’ll be able to cast a wide net and get more subscribers. My investment might pay dividends later.
A couple of years down the road my online game is phenomenally successful. I have over 1000 new subscribers each day and the daily data generated by my application is over 100 TB. This confirms that I made the right decision by choosing cloud services, because my internal IT infrastructure couldn’t have scaled to handle the volume of data. Moreover, the costs could have spiraled out of control, given the need for rapid expansion and the expenses that accompany the chaos and confusion inherent in attempting to grow IT in a short period of time. Accommodating 100 TB of data each day requires steady and sustainable resource allocation as well as an elastic infrastructure. Cloud is a suitable solution.
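A quick back-of-the-envelope calculation shows why elasticity matters at this scale. The per-node capacity and headroom figures below are assumptions I picked for illustration, not real hardware specs; only the 100 TB/day figure comes from the example above.

```python
# Back-of-the-envelope: how many storage nodes does 100 TB/day demand
# over time? Node capacity and headroom are illustrative assumptions.

import math

DAILY_TB = 100          # new data per day (from the example)
NODE_CAPACITY_TB = 500  # assumed usable capacity per storage node
HEADROOM = 0.8          # keep nodes at most 80% full

def nodes_needed(days):
    total_tb = DAILY_TB * days
    return math.ceil(total_tb / (NODE_CAPACITY_TB * HEADROOM))

for days in (7, 30, 90, 365):
    print(f"day {days:3d}: ~{nodes_needed(days)} storage nodes")
```

Even under these generous assumptions the footprint grows into dozens of nodes within months, which is exactly the kind of continuous expansion that is painful to provision in-house and routine for a cloud provider.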
Fast forward to the present time. Bad news! My subscribers are leaving and my business is taking a nosedive. I hire a game guru who knows all the tricks of the trade to find out which way the wind is blowing. He tells me that my subscribers are switching to a competitor. The competitor has borrowed ideas from my game and produced a smash hit. I go online and take a look at their game. I am at a loss and cannot figure out the underlying cause.
Shortly thereafter, I am told that the competitor has a way of predicting when players are getting too flustered with the game and are about to call it quits. Their application is designed to utilize nuggets of intelligence to automatically ease the player’s frustration by subtly providing clues at the right time, when the player is in a tight corner. Although these cases may be few and far between within each session, providing more opportunities to make headway helps turn the corner. That prevents the user from throwing in the towel.
How is that possible? Well, that’s the job of Data Analytics. The competitor is running SAP HANA’s in-memory Predictive Analytics software on IBM SmartCloud. HANA invokes a function in the game’s application before the player’s frustration barometer reaches its saturation point. We know that, all else being equal, the longer a game can keep a player engaged, the less likely it is that he/she will switch to a different game. Engagement is the key. HANA communicates with the game by sending it signals which modify the game’s behavior in real time. In short, it strikes the right balance between hardship and sail-through.
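The feedback loop described above can be sketched in miniature: an analytics component watches per-player events, scores frustration, and signals the game to surface a hint before the player quits. The event names, weights, and threshold below are made up for illustration; a real predictive model such as HANA’s would be far more sophisticated.

```python
# Minimal sketch of a frustration-prediction loop. The scoring rule,
# event names, and threshold are illustrative assumptions, not any
# real analytics product's API.

def frustration_score(events):
    """Score a session, weighting recent failures more heavily."""
    score = 0.0
    for i, event in enumerate(events):
        weight = (i + 1) / len(events)  # recent events count more
        if event == "death":
            score += 2.0 * weight
        elif event == "retry":
            score += 1.0 * weight
    return score

def should_send_hint(events, threshold=3.0):
    """Signal the game engine to surface a clue when frustration is high."""
    return frustration_score(events) >= threshold

# A session ending in a string of deaths crosses the threshold,
# so the game would quietly drop a clue before the player gives up.
session = ["retry", "death", "retry", "death", "death", "death"]
print(should_send_hint(session))
```

The design choice worth noting is that the analytics layer never plays the game; it only emits signals, and the game decides how to translate a signal into a subtle in-game clue.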
Here is a list of the companies that, in my view, will in time dominate the Big Data Cloud Analytics market:
Amazon, Google, Microsoft, IBM, Oracle, Pivotal (VMware/EMC)