Asked in Big Data Hadoop & Spark by (11.9k points)

I have set up a single-node Hadoop environment on CentOS using the Cloudera CDH repository. To copy a local file to HDFS, I used the command:

sudo -u hdfs hadoop fs -put /root/MyHadoop/file1.txt /

But the result disappointed me:

put: '/root/MyHadoop/file1.txt': No such file or directory


I'm sure this file exists. Please help me. Thanks!

1 Answer

Answered by (32.1k points)

As the hdfs user, do you have read access to /root/ on your local disk? Usually you don't: /root is typically readable only by root. You must copy file1.txt to a place where the hdfs user has read permission.
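
You can check this quickly by trying to list the file as the hdfs user, using the same sudo pattern as your -put attempt:

sudo -u hdfs ls -l /root/MyHadoop/file1.txt

If that prints "Permission denied", the Hadoop client cannot stat the file either, and it reports this as "No such file or directory".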

Try:

cp /root/MyHadoop/file1.txt /tmp   # copy to a directory the hdfs user can traverse

chown hdfs:hdfs /tmp/file1.txt     # give the hdfs user ownership, so it can read the file

sudo -u hdfs hadoop fs -put /tmp/file1.txt /   # the upload now succeeds
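
Then verify the upload:

sudo -u hdfs hadoop fs -ls /

Alternatively, you can skip the intermediate copy by streaming the file through standard input; -put accepts - as the source. A sketch, assuming you run it from a shell that can read /root (e.g. as root):

cat /root/MyHadoop/file1.txt | sudo -u hdfs hadoop fs -put - /file1.txt

Here cat runs with your own privileges, so the hdfs user never touches /root. Note that the destination must name the target file explicitly (/file1.txt), since there is no local filename for HDFS to reuse.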
