How to Analyze Video Data Using Hadoop

Hadoop is an open-source framework maintained by the Apache Software Foundation. It’s designed for distributed storage and parallel processing of very large data sets. In other words, it’s a technology built for dealing with huge amounts of data. At its core is a programming model called MapReduce, which splits a job into many small tasks that run on hundreds or thousands of servers at the same time and then combines their results. This lets you solve in hours problems that would take days or even weeks on a single computer.
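To make the MapReduce idea concrete, here is a minimal sketch in plain Python, with no Hadoop involved: a toy word count, the classic MapReduce example. The map step emits (key, value) pairs, a shuffle step groups values by key, and the reduce step combines each group. The input lines and function names are illustrative, not part of any Hadoop API.

```python
from collections import defaultdict

# Toy input standing in for one split of a much larger data set.
lines = [
    "big data needs big tools",
    "hadoop processes big data",
]

def map_phase(line):
    # Map: emit a (key, value) pair for every word in the line.
    return [(word, 1) for word in line.split()]

def shuffle(pairs):
    # Shuffle: group all values by key, as Hadoop does between map and reduce.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    # Reduce: combine the values for each key into one final result.
    return {key: sum(values) for key, values in grouped.items()}

mapped = [pair for line in lines for pair in map_phase(line)]
counts = reduce_phase(shuffle(mapped))
print(counts)  # {'big': 3, 'data': 2, ...}
```

In a real Hadoop job the map and reduce functions look much the same; the framework’s contribution is running thousands of map tasks in parallel across the cluster and handling the shuffle between them.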

Hadoop runs on a ‘cluster’: many computers or servers, built from relatively cheap commodity hardware, that work together to solve very large processing problems.
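The division of labour across a cluster can be sketched on a single machine: below, a pool of local worker threads stands in for the cluster’s nodes, each processing its own chunk of the data independently before the partial results are merged. The data, chunk size, and `process_chunk` function are all hypothetical.

```python
from concurrent.futures import ThreadPoolExecutor

def process_chunk(chunk):
    # Each "node" handles one slice of the data on its own.
    return sum(chunk)

data = list(range(1_000))
chunk_size = 250
chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]

# A thread pool stands in for the machines in a cluster; Hadoop does the
# same splitting and merging, but across separate physical servers.
with ThreadPoolExecutor(max_workers=4) as pool:
    partials = list(pool.map(process_chunk, chunks))

total = sum(partials)
print(total)  # 499500
```

The key property is that no chunk depends on any other, so adding more workers (or, in Hadoop’s case, more commodity servers) speeds the job up almost linearly.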
