
Consider this scenario: in a MapReduce system, the HDFS block size is 256 MB and we have 3 files of sizes 256 KB, 266 MB and 500 MB. How many input splits will the Hadoop framework make?

Hadoop will make 5 splits, as follows (the arithmetic is sketched in code after the list):
– 1 split for the 256 KB file
– 2 splits for the 266 MB file (one split of 256 MB and another of 10 MB)
– 2 splits for the 500 MB file (one split of 256 MB and another of 244 MB)
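For illustration, here is a minimal Python sketch of the same arithmetic. It assumes the split size equals the HDFS block size and that each file is split independently; `count_input_splits` is a hypothetical helper for this example, not part of any Hadoop API.

```python
import math

# Assumption: split size == HDFS block size, files are split independently.
def count_input_splits(file_sizes_bytes, block_size_bytes):
    """Return the total number of input splits for the given files."""
    total = 0
    for size in file_sizes_bytes:
        # Every non-empty file gets at least one split,
        # plus one more split for each additional full or partial block.
        total += max(1, math.ceil(size / block_size_bytes))
    return total

KB = 1024
MB = 1024 * KB
block_size = 256 * MB

files = [256 * KB, 266 * MB, 500 * MB]
print(count_input_splits(files, block_size))  # prints 5
```

Running the sketch with the three file sizes from the question yields 5, matching the breakdown above: 1 + 2 + 2 splits.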