Apache Hadoop: An open-source framework for distributed storage and processing of large datasets across clusters of computers using simple programming models.
Core Components:
HDFS (Hadoop Distributed File System): A distributed file system that provides high-throughput access to application data.
MapReduce: A programming model for processing large datasets in parallel across a distributed cluster.
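The map/shuffle/reduce flow described above can be illustrated with a word-count sketch in plain Python, with no Hadoop installation required. The function names (map_phase, shuffle, reduce_phase) are illustrative only, not Hadoop APIs; in a real job, the shuffle step is performed by the framework between the Mapper and Reducer classes.

```python
from collections import defaultdict

def map_phase(documents):
    # Mapper: emit a (word, 1) pair for every word, analogous to
    # a Hadoop Mapper emitting intermediate key/value pairs.
    for doc in documents:
        for word in doc.split():
            yield (word.lower(), 1)

def shuffle(pairs):
    # Shuffle/sort: group all values by key. In Hadoop this step is
    # handled by the framework between the map and reduce phases.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    # Reducer: combine the grouped values for each key,
    # here by summing the counts per word.
    return {word: sum(counts) for word, counts in grouped.items()}

docs = ["the quick brown fox", "the lazy dog", "the fox"]
counts = reduce_phase(shuffle(map_phase(docs)))
print(counts["the"])  # 3
print(counts["fox"])  # 2
```

Because each map call and each reduce call touches only its own slice of the data, Hadoop can run many of them in parallel across a cluster, which is what makes the model scale to large datasets.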