The files that need to be distributed are specified as a comma-separated list of URIs in the argument to the -files option of the hadoop jar command. The files can be on the local filesystem or on HDFS.
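For instance, a command along these lines (the file paths are placeholders) would ship one local file and one HDFS file to every task node, assuming ExampleProgram uses ToolRunner/GenericOptionsParser so that the generic options are honored:

$ hadoop jar example.jar ExampleProgram -files conf/lookup.txt,hdfs://namenode/data/stations.txt input/filename /output/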
Archive files (ZIP files, tar files, and gzipped tar files) can also be copied to task nodes by the distributed cache using the -archives option; they are unarchived on the task node.
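A minimal sketch of -archives, assuming a hypothetical gzipped tar file of lookup data; on the task node the archive is unpacked into a directory named after the archive file:

$ hadoop jar example.jar ExampleProgram -archives lookup-data.tar.gz input/filename /output/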
The -libjars option will add JAR files to the classpath of the mapper and reducer tasks.
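A sketch with -libjars, using hypothetical JAR names; the comma-separated JARs are placed on the classpath of the map and reduce tasks:

$ hadoop jar example.jar ExampleProgram -libjars thirdparty-lib.jar,json-parser.jar input/filename /output/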
The hadoop jar command with the distributed cache:
$ hadoop jar example.jar ExampleProgram -files Inputpath/example.txt input/filename /output/
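In this command, Inputpath/example.txt is placed in the distributed cache and made available in each task's working directory under the name example.txt, while input/filename and /output/ are passed through to ExampleProgram as its own arguments (presumably the job's input and output paths).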