

Top 10 Hadoop Shell Commands to Manage HDFS

by Saurabh Chhajed · Jun. 30, 2014 · Big Data Zone · Analysis




So you already know what Hadoop is? Why it is used? What problems you can solve with it? And you
want to know how to deal with files on HDFS? Don't worry, you are in the right place.

In this article I will present the top 10 basic Hadoop HDFS operations, managed through shell commands,
which are useful for managing files on an HDFS cluster. For testing purposes, you can invoke these commands
using one of the sandbox VMs from Cloudera, Hortonworks, etc., or your own pseudo-distributed cluster.

Let’s get started.

1. Create a directory in HDFS at the given path(s).


Usage:
hadoop fs -mkdir <paths>
Example:
hadoop fs -mkdir /user/saurzcode/dir1 /user/saurzcode/dir2
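On Hadoop 2.x and later, -mkdir also accepts a -p flag (much like mkdir -p in Unix) that creates any missing parent directories along the path. A quick illustration, using a made-up subdirectory name:

hadoop fs -mkdir -p /user/saurzcode/dir1/subdir1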

2. List the contents of a directory.


Usage:
hadoop fs -ls <args>
Example:
hadoop fs -ls /user/saurzcode
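Each entry in the listing shows permissions, replication factor, owner, group, size, modification time, and path. If you are on Hadoop 2.x or later, you can also add -R to list a directory tree recursively:

hadoop fs -ls -R /user/saurzcode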

3. Upload and download a file in HDFS.


Upload:

hadoop fs -put:

Copies a single source file, or multiple source files, from the local file system to HDFS.

Usage:
hadoop fs -put <localsrc> ... <HDFS_dest_Path>
Example:
hadoop fs -put /home/saurzcode/Samplefile.txt /user/saurzcode/dir3/
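As the usage above suggests, -put accepts several local sources at once, so you can upload a batch of files into an HDFS directory in a single call (the file names below are just placeholders):

hadoop fs -put /home/saurzcode/file1.txt /home/saurzcode/file2.txt /user/saurzcode/dir3/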

Download:

hadoop fs -get:

Copies/downloads files from HDFS to the local file system.

Usage:
hadoop fs -get <hdfs_src> <localdst>
Example:
hadoop fs -get /user/saurzcode/dir3/Samplefile.txt /home/
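You can also give an explicit local destination path to rename the file as it is downloaded (the destination name here is just an example):

hadoop fs -get /user/saurzcode/dir3/Samplefile.txt /home/saurzcode/Samplefile-copy.txt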

4. See the contents of a file.


Same as the Unix cat command:

Usage:
hadoop fs -cat <path[filename]>
Example:
hadoop fs -cat /user/saurzcode/dir1/abc.txt
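Since -cat streams the whole file to stdout, it combines naturally with ordinary Unix pipes when you only want a quick peek at a large file:

hadoop fs -cat /user/saurzcode/dir1/abc.txt | head -n 20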

5. Copy a file from source to destination.


This command also allows multiple sources, in which case the destination must be a directory.

Usage:
hadoop fs -cp <source> <dest>
Example:
hadoop fs -cp /user/saurzcode/dir1/abc.txt /user/saurzcode/dir2
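With multiple sources, the last argument must be an existing HDFS directory (the second file name below is hypothetical):

hadoop fs -cp /user/saurzcode/dir1/abc.txt /user/saurzcode/dir1/xyz.txt /user/saurzcode/dir2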

6. Copy a file between the local file system and HDFS.



copyFromLocal

Usage:
hadoop fs -copyFromLocal <localsrc> URI
Example:
hadoop fs -copyFromLocal /home/saurzcode/abc.txt /user/saurzcode/abc.txt

Similar to the put command, except that the source is restricted to a local file reference.

copyToLocal

Usage:
hadoop fs -copyToLocal [-ignorecrc] [-crc] URI <localdst>

Similar to the get command, except that the destination is restricted to a local file reference.
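A quick example, mirroring the copyFromLocal call above (the paths are illustrative):

hadoop fs -copyToLocal /user/saurzcode/abc.txt /home/saurzcode/abc.txt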
7. Move a file from source to destination.

Note: Moving files across file systems is not permitted.

Usage:
hadoop fs -mv <src> <dest>
Example:
hadoop fs -mv /user/saurzcode/dir1/abc.txt /user/saurzcode/dir2
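Like -cp, -mv should accept several sources when the destination is a directory (the second file name below is made up):

hadoop fs -mv /user/saurzcode/dir1/abc.txt /user/saurzcode/dir1/def.txt /user/saurzcode/dir2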

8. Remove a file or directory in HDFS.


Removes the files specified as arguments. Deletes a directory only when it is empty.

Usage:
hadoop fs -rm <arg>
Example:
hadoop fs -rm /user/saurzcode/dir1/abc.txt

Recursive version of delete.

Usage:
hadoop fs -rmr <arg>
Example:
hadoop fs -rmr /user/saurzcode/
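On Hadoop 2.x and later, -rmr is deprecated in favour of -rm -r, and -skipTrash bypasses the trash directory if trash is enabled on your cluster:

hadoop fs -rm -r -skipTrash /user/saurzcode/dir1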

9. Display the last few lines of a file.


Similar to the tail command in Unix.

Usage:
hadoop fs -tail <path[filename]>
Example:
hadoop fs -tail /user/saurzcode/dir1/abc.txt
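As with Unix tail, a -f option is available to keep following the file as it grows, which is handy for logs being written to HDFS:

hadoop fs -tail -f /user/saurzcode/dir1/abc.txt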

10. Display the aggregate length of a file.


Usage:
hadoop fs -du <path>
Example:
hadoop fs -du /user/saurzcode/dir1/abc.txt
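On newer releases, -du also takes -s to print a single summarized total and -h to show sizes in human-readable units:

hadoop fs -du -s -h /user/saurzcode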

Please comment on which of these commands you found most useful while working with Hadoop/HDFS.


Topics: BIG DATA, HADOOP, HDFS


Published at DZone with permission of Saurabh Chhajed, DZone MVB. See the original article here.
Opinions expressed by DZone contributors are their own.
