
What is the Hadoop Distributed File System?

Definition of Hadoop Distributed File System

Hadoop Distributed File System (HDFS) is the distributed file system of the Apache Hadoop project, designed to run reliably on standard, commodity hardware.


Explanation

Hadoop Distributed File System (HDFS) is used to store data in a distributed fashion. Files are split into blocks that are spread across a large number of machines connected over a network. By default, HDFS keeps three copies (replicas) of each block, placed across two racks, i.e. two copies on one rack and one copy on the other. These replicas act as a backup, so the data remains available if an individual machine fails. Because HDFS is itself written in Java, it offers native support for Java applications through its Java APIs (Application Programming Interfaces), and the cluster can also be inspected through a web interface in any browser.
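To illustrate the Java API mentioned above, here is a minimal sketch that writes a file to HDFS, reads it back, and checks how many replicas the cluster keeps for it. The NameNode address (hdfs://namenode:9000) and the file path are placeholder values chosen for this example, not fixed parts of HDFS.

import java.nio.charset.StandardCharsets;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;

public class HdfsExample {
    public static void main(String[] args) throws Exception {
        // Point the client at the cluster's NameNode (placeholder address).
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://namenode:9000");

        FileSystem fs = FileSystem.get(conf);
        Path file = new Path("/user/example/hello.txt");

        // Write a small file; HDFS replicates its blocks automatically
        // according to the configured replication factor (3 by default).
        try (FSDataOutputStream out = fs.create(file, true)) {
            out.write("Hello from HDFS".getBytes(StandardCharsets.UTF_8));
        }

        // Read the file back through the same API.
        try (FSDataInputStream in = fs.open(file)) {
            IOUtils.copyBytes(in, System.out, 4096, false);
        }

        // Inspect how many replicas HDFS keeps for this file's blocks.
        short replication = fs.getFileStatus(file).getReplication();
        System.out.println("\nReplication factor: " + replication);

        fs.close();
    }
}

Running this requires the hadoop-client dependency on the classpath and a reachable HDFS cluster; the default number of replicas is controlled by the dfs.replication property in hdfs-site.xml.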
