The Curious Case Of Big Data Definition


15 December 2012 — By Fari Payandeh

Most technical people I have talked to think that Big Data is nothing new. They seem to be proceeding on the premise that Big Data's sole purpose in life is to serve business intelligence. As someone said to me the other day, "Walmart has been enjoying the fruit of their investment in data warehousing and business intelligence for years, way before there was a Hadoop or NoSQL in existence." True, but Big Data is not about "what"; it's about "how". How long do Walmart's nightly jobs run to transform raw data into meaningful business data that its BI tools can use? Moreover, is Walmart currently processing its unstructured data to add value to its BI strategy?

I watched Werner Vogels, the CTO of Amazon, elaborate on what is today called "Big Data" back in 2006. He described how Amazon had made a radical shift from relational databases to flat files to store its customer data, because relational databases could not meet Amazon's requirements. What is interesting is that Vogels was referring to the difficulties Amazon faced in processing the OLTP portion of its business, not DSS. Today, however, Big Data encompasses OLTP, DSS, and real-time BI.
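The appeal of the shift Vogels described can be sketched in a few lines: when access is always by a single key, a flat key-value file answers in one lookup what a normalized relational design would answer with joins. The sketch below uses Python's standard-library `shelve` module as a stand-in key-value file store; the customer records and keys are invented for illustration, and this is not Amazon's actual implementation.

```python
import os
import shelve
import tempfile

# Hypothetical customer records keyed by customer ID. In a normalized
# relational design, the name and orders would live in separate tables
# and be reassembled with a join on every read.
customers = {
    "c-1001": {"name": "Ada", "orders": ["o-1", "o-2"]},
    "c-1002": {"name": "Grace", "orders": ["o-3"]},
}

path = os.path.join(tempfile.mkdtemp(), "customers.db")

with shelve.open(path) as store:
    for key, doc in customers.items():
        store[key] = doc            # one write per record, no schema

with shelve.open(path) as store:
    record = store["c-1001"]        # one key lookup replaces the join
```

The trade-off, of course, is that you give up ad hoc queries across records; access patterns must be known in advance, which is exactly the OLTP-style workload Vogels was describing.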

Let's balance the myth against the facts: what is Big Data not? Big Data is not attached to one set of technologies, nor is it applicable to every company that sits on top of huge amounts of data. It is true that the IT industry has made great strides in data caching, I/O throughput, scalability, availability, consistency, real-time data processing, and working with unstructured data. However, those enhancements could have come to life organically, by the invisible hand of market dynamics, to support the evolution of business intelligence. Where fact and myth diverge is that the myth fails to account for the likelihood that we would be where we are today even if there were no likes of Amazon around.

In conclusion, the term "Big Data", although legitimate in that it refers to new ways of processing large amounts of data, is misleading because "size" is part of the name, yet size categories (small, medium, large) are not constants; they change over time. What was considered a large data set twenty years ago may fall into the small category today. I personally would rather call it "Net Data", alluding to the way data is spread across many servers in disk files as opposed to databases.
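The "Net Data" idea, data spread across many servers in disk files, usually comes down to some deterministic rule that maps each record key to one machine. A minimal sketch, assuming a hypothetical cluster whose node names are invented here, is hash-based sharding:

```python
import hashlib

# Hypothetical cluster of file servers; the names are illustrative only.
SERVERS = ["node-a", "node-b", "node-c", "node-d"]

def shard_for(key: str) -> str:
    """Deterministically map a record key to one server's disk files."""
    digest = hashlib.md5(key.encode("utf-8")).hexdigest()
    return SERVERS[int(digest, 16) % len(SERVERS)]

# Every client computes the same placement without a central catalog.
placement = {k: shard_for(k) for k in ["user:1", "user:2", "user:3"]}
```

Real systems refine this with consistent hashing and replication so that adding or losing a node does not reshuffle every key, but the core idea is the same: the data's home is a function of its key, not a row in a central database.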

From DBA to BDA: The revenge of the developers

December 06, 2012 by Fari Payandeh

The people who claim that the days of DBAs are numbered should have their heads examined. Nonetheless, the winds of change are sweeping across the data landscape, and it won't be long before companies run hybrid information systems comprising relational databases and NoSQL databases. For a traditional DBA, learning NoSQL is no easy task; it's not like attending a five-day class and picking up a few new topics to augment existing knowledge. NoSQL, as it stands today, requires a different set of skills, one that has tipped the balance in favor of developers, the future Big Data Administrators (BDAs). Developers working with system administrators can manage NoSQL databases without having to rely on DBAs. For all the developers who have spent years waiting for DBAs to open the gates for them, this is a good opportunity to level the playing field: learn NoSQL. Cloudera (a Hadoop vendor and trainer) raised $65 million in new funding, and MongoDB launched a new training program.
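Part of why the balance tips toward developers is that the NoSQL data model is the developer's data model: documents are just nested structures, and there is no DDL gate to pass through before storing a new field. The toy in-memory "collection" below sketches that schema-less, MongoDB-style working model; the class and method names are invented for illustration and are not a real driver API.

```python
# A toy in-memory document collection, sketching the schema-less model
# a developer works with in document-oriented NoSQL stores.
class Collection:
    def __init__(self):
        self.docs = []

    def insert(self, doc: dict) -> None:
        # No CREATE TABLE, no ALTER TABLE: any dict is a valid document,
        # and documents in one collection may carry different fields.
        self.docs.append(doc)

    def find(self, **criteria):
        # Return every document whose fields match all given criteria.
        return [d for d in self.docs
                if all(d.get(k) == v for k, v in criteria.items())]

events = Collection()
events.insert({"user": "dev1", "action": "deploy"})
events.insert({"user": "dev2", "action": "deploy", "region": "us-east"})
matches = events.find(action="deploy")
```

Note that the second document adds a `region` field without any schema change, which is exactly the kind of flexibility that shifts day-to-day data modeling from the DBA to the developer.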