Recently, I read Why Big Data Projects Fail by Stephen Brobst at http://data-informed.com/why-big-data-projects-fail. I could not agree more with his views, which expose a problem I have long been worried about. In this article, I will discuss the topic further, to remind enterprises to beware of falling into this pitfall of failure.
Let's start with a positive example: Google, an enterprise that has leveraged big data successfully. How does Google make use of big data?
1. Collect the raw data: capture the content of each website, e-mail, or cookie, and extract the key information.
2. Build complex composite indexes over this information, including, of course, advertisement-related indexes.
3. Store these indexes and the corresponding content on distributed servers.
4. When a user browses the web, runs a search, or reads e-mail, translate the request through a complex query-translation procedure that locates the relevant index entries.
5. Retrieve the data from the servers according to the index, and return the search results or advertisements.
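To make the pipeline concrete, here is a toy sketch of steps 2 through 5 in Python: building an inverted index over a few documents and answering a query against it. The function names and sample data are my own illustrations, not Google's actual design; a real system would shard the index across many distributed servers and support far richer queries.

```python
# Toy sketch of steps 2-5: build an inverted index, then answer a query
# by looking up index entries and retrieving the matching documents.
# Hypothetical and simplified; real systems distribute the index (step 3).
from collections import defaultdict

def build_index(docs):
    """Step 2: map each term to the set of document ids containing it."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for term in text.lower().split():
            index[term].add(doc_id)
    return index

def search(index, docs, query):
    """Step 4: translate the query into index lookups; step 5: retrieve."""
    terms = query.lower().split()
    if not terms:
        return {}
    hits = set.intersection(*(index.get(t, set()) for t in terms))
    return {doc_id: docs[doc_id] for doc_id in hits}

docs = {
    1: "big data projects fail",
    2: "big data storage with Hadoop",
    3: "business analysis algorithms",
}
index = build_index(docs)
print(search(index, docs, "big data"))  # returns documents 1 and 2
```

Note that nothing in this sketch decides *what* to index or *how* to rank the results; that is exactly the business-analysis part discussed below.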
Which of the steps above relate to the Hadoop architecture? Items 3 and 5, that is, storing data and retrieving data.
Can items 3 and 5 be implemented easily? Yes. Hadoop-like solutions scale well and are cheap to acquire.
Once items 3 and 5 are implemented, can I operate like Google? No, because the key items, 2 and 4, are still missing.
What are items 2 and 4? They are the business analysis algorithms: algorithms meticulously designed by business experts on the basis of data, business knowledge, and market trends. For many enterprises, they are a core competency and the heart of the decision-making process. This is the "Value" component of the 4V theory.
Why do big data projects fall into the pitfall of failure? Because current big data technology only provides a solution for data storage and query. It lacks a good solution for business analysis, the part that actually enhances competitiveness and matters most, and there is a great gap in between. In fact, current big data platforms are tools for IT experts: they can implement MapReduce functions in C++ or Java, but they cannot reach the ultimate goal of providing valuable business algorithms.
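As an aside on what "implementing MapReduce functions" involves, here is a minimal in-process word-count sketch of the MapReduce programming model, written in Python for brevity (a real Hadoop job would typically be written in Java against the Hadoop API, and the names here are purely illustrative). It shows that the model is good at mechanical grouping and counting, while the business-level analysis logic still has to come from elsewhere.

```python
# Minimal word-count sketch of the MapReduce pattern:
# map emits (key, value) pairs, shuffle groups them by key,
# reduce aggregates each group. This illustrates only the programming
# model, not a distributed runtime.
from collections import defaultdict

def map_phase(records):
    """Emit (word, 1) for every word in every input line."""
    for line in records:
        for word in line.split():
            yield (word, 1)

def shuffle(pairs):
    """Group emitted values by key, as the framework would between phases."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Sum the values for each key to get per-word counts."""
    return {key: sum(values) for key, values in groups.items()}

lines = ["big data big value", "data value"]
counts = reduce_phase(shuffle(map_phase(lines)))
print(counts)  # {'big': 2, 'data': 2, 'value': 2}
```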
To avoid this pitfall, enterprises must use advanced analysis tools that are business-expert-oriented, usable regardless of the user's technical background, and capable of converting business logic into business algorithms rapidly, intuitively, and conveniently. NoSQL or SQL? Neither is ideal. Both are for IT personnel only, owing to the strong technical background they require, their complex operations, and their comparatively weak computational capability.
What are the ideal tools for business experts? From the TCO perspective, I would rather choose the lightweight R language and esProc than pin my hopes on the heavyweight Teradata Aster and SAP Visual Intelligence. Let's hope that R and Raqsoft esProc will put on a good show.
For more original ideas, please visit my personal blog: http://datakeyword.blogspot.com/