What are the default configuration files that are used in Hadoop?
As of the 0.20 release, Hadoop supported the following read-only default configuration files:
- src/core/core-default.xml
- src/hdfs/hdfs-default.xml
- src/mapred/mapred-default.xml
How will you make changes to the default configuration files?
Hadoop does not recommend changing the default configuration files; instead, it recommends making all site-specific changes in the following files:
- conf/core-site.xml
- conf/hdfs-site.xml
- conf/mapred-site.xml
Unless explicitly turned off, Hadoop by default specifies two resources, loaded in order from the classpath:
- core-default.xml: read-only defaults for Hadoop.
- core-site.xml: site-specific configuration for a given Hadoop installation.
Hence, if the same configuration property is defined in both core-default.xml and core-site.xml, the value in core-site.xml (and the same is true for the other two file pairs) is the one that is used.
Tuesday, December 4, 2012
Hadoop Interview Questions
Monday, December 3, 2012
Hadoop Interview Question
Here are some Hadoop administration questions you may expect; the answers you need to find yourself :) I could give them, but I won't. If you find good answers, share them with me too :) If you are not able to find them, let me know through the comments and I will post the answers.
- What is Hadoop? Briefly describe the components of Hadoop.
- What are the Hadoop daemon processes? Describe each daemon and its functionality.
- What are the steps for configuring Hadoop?
- What is the architecture of HDFS, and how does data flow through it?
- Can we have more than one configuration for a Hadoop cluster? How can you switch between these configurations?
- What will be your troubleshooting approach in Hadoop?
- What exceptions have you come across while working on Hadoop, and what was your approach to resolving those exceptions or errors?