Archive for September 23rd, 2013
Use of Big Data with Hadoop 2.0
According to Merv Adrian, an analyst at the research firm Gartner, Hadoop 2.0 is "an important step", making it a technology with "a much more versatile operating environment for data", one that now also works with traditional SQL tools.
Hadoop is one of several projects of the Apache Software Foundation and is among the most widely used server software. It is built in Java, uses HDFS (Hadoop Distributed File System) for distributed storage, and builds on the concepts of MapReduce and the Google File System (GFS), key ideas for handling big data.
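To make the MapReduce idea mentioned above more concrete, here is a minimal sketch of the classic word-count job written against Hadoop's Java MapReduce API. The class names and input/output paths are illustrative assumptions, not something taken from the article.

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Map phase: emit (word, 1) for every token in the input split read from HDFS.
  public static class TokenizerMapper
       extends Mapper<Object, Text, Text, IntWritable> {
    private final static IntWritable one = new IntWritable(1);
    private Text word = new Text();

    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, one);
      }
    }
  }

  // Reduce phase: sum the counts collected for each word across all mappers.
  public static class IntSumReducer
       extends Reducer<Text, IntWritable, Text, IntWritable> {
    private IntWritable result = new IntWritable();

    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));   // input directory in HDFS
    FileOutputFormat.setOutputPath(job, new Path(args[1])); // output directory in HDFS
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}

On a running cluster, such a job would typically be packaged as a jar and launched with a command of the form "hadoop jar wordcount.jar WordCount /input /output", with the HDFS paths given here purely as an example.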
The high-level design was built at Apache by a developer community that includes Yahoo, among others; see the contributors wiki at Apache.
Until now, Hadoop has been used mainly to reduce huge data sets for analysis, but only in batches; with Hadoop 2.0, processing flows (workflows) beyond batch jobs is now possible.
According to a Gartner survey of 720 companies conducted in June, 64% were investing in big data, up from 58% the year before, and Hadoop 2.0 promises simplified workflow handling for small and medium-sized enterprises.