BIG DATA SYSTEM DEVELOPMENT: DESIGN AND IMPLEMENTATION OF AN INFRASTRUCTURE USING THE HADOOP FRAMEWORK
Abstract
Hadoop is a distributed data storage platform that provides a parallel processing framework. HDFS (Hadoop Distributed File System) and MapReduce are its two core components: HDFS is a distributed storage system written in Java, while MapReduce is a programming model for processing large datasets in a parallel, distributed manner. This research focused on measuring data transfer speed into HDFS, using three data types, namely video, ISO image, and text, at total sizes of 512 MB, 1 GB, and 2 GB. Testing was carried out by loading the data into HDFS with the Hadoop command-line tools while varying the HDFS block size across the parameters 128 MB, 256 MB, and 384 MB. Hadoop transferred data faster with the 384 MB block size than with 128 MB or 256 MB, because the data is split into fewer, larger 384 MB blocks, so fewer blocks have to be mapped than under the 128 MB and 256 MB settings.
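As a minimal sketch of the loading step described above, the snippet below writes a local file into HDFS with an explicit 384 MB block size and times the transfer, using the Hadoop Java FileSystem API rather than the command line the paper used. The NameNode address hdfs://namenode:9000, the file paths, and the single-replica setting are illustrative assumptions, not details from the paper.

import java.io.BufferedInputStream;
import java.io.FileInputStream;
import java.io.InputStream;
import java.io.OutputStream;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class HdfsBlockSizePut {
    public static void main(String[] args) throws Exception {
        // Hypothetical NameNode address; replace with the cluster's actual fs.defaultFS.
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://namenode:9000");

        FileSystem fs = FileSystem.get(conf);

        long blockSize = 384L * 1024 * 1024;  // 384 MB, one of the tested block sizes
        short replication = 1;                // single-replica test setup (assumption)
        int bufferSize = 4096;

        long start = System.currentTimeMillis();
        // FileSystem.create accepts a per-file block size, overriding dfs.blocksize.
        try (InputStream in = new BufferedInputStream(new FileInputStream("data/sample.iso"));
             OutputStream out = fs.create(new Path("/bench/sample.iso"),
                                          true, bufferSize, replication, blockSize)) {
            IOUtils.copyBytes(in, out, bufferSize);
        }
        long elapsed = System.currentTimeMillis() - start;
        System.out.println("Transfer took " + elapsed + " ms");
    }
}

The same per-file block size can also be set from the shell, e.g. hdfs dfs -D dfs.blocksize=402653184 -put data/sample.iso /bench/, where 402653184 bytes equals 384 MB.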
Published: 2024-06-22 (updated 2024-08-24)