Definition
An open-source framework for distributed storage and processing of large data sets.
Detailed Explanation
Hadoop provides a distributed file system (HDFS) and a processing framework (MapReduce) for handling big data. It enables parallel processing across clusters of computers, providing scalability and fault tolerance for large-scale data processing.
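As a minimal sketch of the MapReduce model described above, the canonical word-count job below tokenizes input lines into (word, 1) pairs in the map phase and sums the counts per word in the reduce phase; the class name and the HDFS input/output paths passed as arguments are illustrative, not fixed by Hadoop.

```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

    // Map phase: runs in parallel on blocks of the input stored in HDFS,
    // emitting a (word, 1) pair for every token in a line.
    public static class TokenizerMapper
            extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            StringTokenizer tokens = new StringTokenizer(value.toString());
            while (tokens.hasMoreTokens()) {
                word.set(tokens.nextToken());
                context.write(word, ONE);
            }
        }
    }

    // Reduce phase: all counts for the same word are shuffled to one reducer,
    // which sums them into the final total for that word.
    public static class SumReducer
            extends Reducer<Text, IntWritable, Text, IntWritable> {
        private final IntWritable result = new IntWritable();

        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) {
                sum += v.get();
            }
            result.set(sum);
            context.write(key, result);
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(SumReducer.class);
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        // Illustrative HDFS paths supplied on the command line.
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```

Packaged into a jar, such a job would typically be submitted to the cluster with a command of the form hadoop jar wordcount.jar WordCount <input path> <output path>, where the paths refer to locations in HDFS.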
Use Cases
1. Log analysis
2. Data lake implementation
3. Batch processing
4. Data archiving