Hadoop – copyFromLocal Command


The Hadoop copyFromLocal command copies a file from your local file system to HDFS (Hadoop Distributed File System). It has an optional -f switch that replaces a file already present at the destination, so it can be used to update that file. The -f switch is equivalent to first deleting the existing file and then copying the new one. Without it, copying a file that already exists in the destination folder throws an error.
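The delete-then-copy behaviour of -f can be sketched with plain local file operations. The snippet below only illustrates the semantics, not HDFS itself; the paths and file contents are made up for the demonstration:

```shell
# Sketch of copyFromLocal's -f semantics using ordinary local files.
work=$(mktemp -d)
printf 'old' > "$work/dest.csv"   # a file already present at the destination
printf 'new' > "$work/src.csv"    # the updated local copy we want to push

# Without -f: refuse to overwrite, like the "File exists" error.
if [ -e "$work/dest.csv" ]; then
  echo "copyFromLocal: \`dest.csv': File exists"
else
  cp "$work/src.csv" "$work/dest.csv"
fi

# With -f: behave like delete-then-copy, replacing the old content.
rm -f "$work/dest.csv"
cp "$work/src.csv" "$work/dest.csv"
cat "$work/dest.csv"
```

After the forced copy, the destination file holds the new content, which is exactly what -f does for a file in HDFS.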

Syntax to copy a file from your local file system to HDFS is given below: 


hdfs dfs -copyFromLocal /path1 /path2 ... /pathn /destination

The copyFromLocal command is similar to the -put command used in HDFS. You can also use hadoop fs as a synonym for hdfs dfs. The command can take multiple arguments: every path except the last is a source from which to copy, and the last is the destination where the files are copied. When copying more than one file, make sure the destination is a directory.
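The same sources-first, destination-last argument order is used by the ordinary Unix cp command, which can make it easier to remember. A quick local sketch (the demo paths below are made up for illustration):

```shell
# Like copyFromLocal, cp accepts multiple source files; the final
# argument must be an existing directory when copying more than one.
demo=$(mktemp -d)
mkdir "$demo/dest"
printf 'one' > "$demo/a.txt"
printf 'two' > "$demo/b.txt"
cp "$demo/a.txt" "$demo/b.txt" "$demo/dest"/
ls "$demo/dest"
```

Both files end up inside the destination directory, just as multiple local files listed before the HDFS destination would.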

Our objective is to copy a file from the local file system to HDFS. In this example, we copy the file named Salaries.csv, located in the /home/dikshant/Documents/hadoop_file directory.




Steps to execute copyFromLocal Command

First, let's look at the current contents of the root directory in HDFS.

Step 1: Make a directory in HDFS where you want to copy the file, using the command below.


hdfs dfs -mkdir /Hadoop_File


[Image: making a directory in HDFS]


[Image: showing the directory in HDFS]

Step 2: Use the copyFromLocal command as shown below to copy the file to the HDFS /Hadoop_File directory.


hdfs dfs -copyFromLocal /home/dikshant/Documents/hadoop_file/Salaries.csv /Hadoop_File


[Image: using the copyFromLocal command in Hadoop]

Step 3: Check whether the file was copied successfully by listing the destination directory with the command below.


hdfs dfs -ls /Hadoop_File


[Image: checking whether the file was copied - 1]

[Image: checking whether the file was copied - 2]


Overwriting or Updating the File In HDFS with -f switch

From the image below, you can observe that copyFromLocal by itself does not copy a file of the same name to the same location; it reports that the file already exists.


[Image: Overwriting or updating the file in HDFS with the -f switch - 1]

To update the content of the file, or to overwrite it, use the -f switch as shown below.


hdfs dfs -copyFromLocal -f /home/dikshant/Documents/hadoop_file/Salaries.csv /Hadoop_File


[Image: Overwriting or updating the file in HDFS with the -f switch - 2]

Now you can observe that copyFromLocal with the -f switch produces no error; it simply updates (overwrites) the file in HDFS.

Last Updated : 27 Dec, 2021