List:       hadoop-user
Subject:    Re: Hadoop Installation Path problem
From:       Hamza Zafar <11bscshzafar () seecs ! edu ! pk>
Date:       2014-11-26 10:57:55
Message-ID: CAE3RYH8Bamj27aeKZbvoUHsPEDtDcvbFQBn2Y2U6GXg21DQTPQ () mail ! gmail ! com

Run the following commands from $HADOOP_HOME/sbin to start the HDFS and YARN
services on a single machine:

hadoop-daemon.sh start namenode        # start the namenode service

hadoop-daemon.sh start datanode        # start the datanode service

yarn-daemon.sh start resourcemanager   # start the resourcemanager

yarn-daemon.sh start nodemanager       # start the nodemanager service

Report any errors from these commands
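If the daemons start without complaint, a quick way to confirm they are really
up (a minimal sketch: jps ships with the JDK, and the log directory below is
the Hadoop 2.x default, so adjust it if you set HADOOP_LOG_DIR elsewhere):

jps                                  # should list NameNode, DataNode, ResourceManager, NodeManager
ls $HADOOP_HOME/logs                 # hadoop-*/yarn-*-<daemon>-<host>.log files land here
tail -n 50 $HADOOP_HOME/logs/*namenode*.log   # inspect the namenode log for errors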
 On Nov 26, 2014 1:38 PM, "Anand Murali" <anand_vihar@yahoo.com> wrote:

> Dear Zafar:
>
>
> I am not running distributed mode. I want only standalone or pseudo-
> distributed mode. By default the slaves file contains localhost. The
> errors I am getting are all path related, and I am unable to fix them. I am
> following the directions given by Apache, and I am setting only the Java
> path and the Hadoop home and Hadoop install variables and appending them to
> the $PATH variable. I just want a learning environment. Please advise.
>
> Thanks,
>
> Regards,
>
> Anand Murali
> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
> Chennai - 600 004, India
> Ph: (044)- 28474593/ 43526162 (voicemail)
>
>
>   On Wednesday, November 26, 2014 1:11 PM, Hamza Zafar <
> 11bscshzafar@seecs.edu.pk> wrote:
>
>
> Please set the compute nodes in slaves file at
> $HADOOP_HOME/etc/hadoop/slaves
>
> Run the following commands from $HADOOP_HOME/sbin to start the HDFS and YARN
> services:
>
> hadoop-daemon.sh start namenode       # start the namenode service
> hadoop-daemons.sh start datanode      # start the datanode service on all nodes listed in the slaves file
>
> yarn-daemon.sh start resourcemanager  # start the resourcemanager
> yarn-daemons.sh start nodemanager     # start the nodemanager service on all nodes listed in the slaves file
>
>
>
> On Tue, Nov 25, 2014 at 2:22 PM, Anand Murali <anand_vihar@yahoo.com>
> wrote:
>
> Dear Alex:
>
> I am trying to install Hadoop-2.5.2 on Suse Enterprise Desktop 11 ONLY in
> standalone/pseudo-distributed mode. Ambari needs a server. Now these are
> the changes I have made in hadoop-env.sh, based on Tom White's textbook
> "Hadoop: The Definitive Guide".
>
> export JAVA_HOME=/usr/lib64/jdk1.7.0_71/jdk7u71
> export HADOOP_HOME=/home/anand_vihar/hadoop
> export PATH=:$PATH:$JAVA_HOME:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
>
> All other variables are left untouched, as they are supposed to pick up the
> right defaults. Having done this,
>
> $ hadoop version
>
> runs and shows the version, so the first step is successful. Then
>
> $ hadoop namenode -format
>
> is successful except for some warnings. I have set defaults in
> core-site.xml, hdfs-site.xml and yarn-site.xml.
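> For reference, the minimal pseudo-distributed values from the Apache
> single-node guide look like the sketch below (hdfs://localhost:9000 and a
> replication factor of 1 are the documented examples, not necessarily what
> is in my files):
>
> <!-- core-site.xml -->
> <configuration>
>   <property>
>     <name>fs.defaultFS</name>
>     <value>hdfs://localhost:9000</value>
>   </property>
> </configuration>
>
> <!-- hdfs-site.xml -->
> <configuration>
>   <property>
>     <name>dfs.replication</name>
>     <value>1</value>
>   </property>
> </configuration>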
>
> then
>
> $start-dfs.sh
>
> I get plenty of errors. I am wondering if there is a clear-cut install
> procedure, or do you think Suse Enterprise Desktop 11 does not support
> Hadoop? Replies welcome.
>
> Thanks
>
> Regards,
>
> Anand Murali.
>
> Anand Murali
> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
> Chennai - 600 004, India
> Ph: (044)- 28474593/ 43526162 (voicemail)
>
>
>   On Tuesday, November 25, 2014 2:22 PM, AlexWang <wangxin.dt@gmail.com>
> wrote:
>
>
> Normally we only need to configure the environment variables in the ~/.bashrc
> or /etc/profile file; you can also configure the hadoop-env.sh file, and they
> are not in conflict.
> I think hadoop-env.sh variables will override the .bashrc variables.
> For your question, you can try setting the HDFS_CONF_DIR variable, then try
> again.
> For a Cloudera Hadoop installation you can use the Cloudera Manager tool:
>
> http://www.cloudera.com/content/cloudera/en/documentation/core/latest/topics/cm_ig_install_path_a.html
> To install Apache Hadoop, unzip the tar.gz file and configure the
> Hadoop-related configuration files and environment variables.
> Apache Hadoop installation tool: http://ambari.apache.org/
>
>
> On Nov 25, 2014, at 16:12, Anand Murali <anand_vihar@yahoo.com> wrote:
>
> Dear Alex:
>
> If I make changes to .bashrc with the above variables, will it not conflict
> with hadoop-env.sh? And I was advised that other than JAVA_HOME, no other
> environment variables should be set. Please advise.
>
> Thanks
>
> Anand Murali
> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
> Chennai - 600 004, India
> Ph: (044)- 28474593/ 43526162 (voicemail)
>
>
>   On Tuesday, November 25, 2014 1:23 PM, AlexWang <wangxin.dt@gmail.com>
> wrote:
>
>
> Hadoop environment variables, for example:
>
> echo  "
> export HADOOP_HOME=/usr/lib/hadoop
> export HADOOP_HDFS_HOME=/usr/lib/hadoop-hdfs
> export HADOOP_MAPRED_HOME=/usr/lib/hadoop-mapreduce
> #export HADOOP_MAPRED_HOME=/usr/lib/hadoop-0.20-mapreduce
> export HADOOP_COMMON_HOME=\${HADOOP_HOME}
> export HADOOP_LIBEXEC_DIR=\${HADOOP_HOME}/libexec
> export HADOOP_CONF_DIR=\${HADOOP_HOME}/etc/hadoop
> export HDFS_CONF_DIR=\${HADOOP_HOME}/etc/hadoop
> export HADOOP_YARN_HOME=/usr/lib/hadoop-yarn
> export YARN_CONF_DIR=\${HADOOP_HOME}/etc/hadoop
> export HADOOP_COMMON_LIB_NATIVE_DIR=\${HADOOP_HOME}/lib/native
> export LD_LIBRARY_PATH=\${HADOOP_HOME}/lib/native
> export HADOOP_OPTS=\"\${HADOOP_OPTS}
> -Djava.library.path=\${HADOOP_HOME}/lib:\${LD_LIBRARY_PATH}\"
> export PATH=\${HADOOP_HOME}/bin:\${HADOOP_HOME}/sbin:\$PATH
>
> ">> ~/.bashrc
>
> . ~/.bashrc
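> After sourcing, a quick sanity check (a minimal sketch, assuming the paths in
> the block above actually exist on your machine):
>
> echo $HADOOP_HOME   # should print /usr/lib/hadoop
> which hadoop        # should resolve to $HADOOP_HOME/bin/hadoop
> hadoop version      # should print the installed Hadoop version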
>
>
>
>
> On Nov 24, 2014, at 21:25, Anand Murali <anand_vihar@yahoo.com> wrote:
>
> Dear All:
>
> After hadoop namenode -format I do the following with errors.
>
> anand_vihar@linux-v4vm:~/hadoop/etc/hadoop> hadoop start-dfs.sh
> Error: Could not find or load main class start-dfs.sh
> anand_vihar@linux-v4vm:~/hadoop/etc/hadoop> start-dfs.sh
> Incorrect configuration: namenode address dfs.namenode.servicerpc-address
> or dfs.namenode.rpc-address is not configured.
> Starting namenodes on [2014-11-24 18:47:27,717 WARN  [main]
> util.NativeCodeLoader (NativeCodeLoader.java:<clinit>(62)) - Unable to load
> native-hadoop library for your platform... using builtin-java classes where
> applicable]
> Error: Cannot find configuration directory: /etc/hadoop
> Error: Cannot find configuration directory: /etc/hadoop
> Starting secondary namenodes [2014-11-24 18:47:28,457 WARN  [main]
> util.NativeCodeLoader (NativeCodeLoader.java:<clinit>(62)) - Unable to load
> native-hadoop library for your platform... using builtin-java classes where
> applicable
> 0.0.0.0]
> Error: Cannot find configuration directory: /etc/hadoop
>
> But in my hadoop-env.sh I have set
>
> export JAVA_HOME=/usr/lib64/jdk1.7.1_71/jdk7u71
> export HADOOP_HOME=/anand_vihar/hadoop
> export PATH=:PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$HADOOP_HOME/share
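> (A corrected sketch of those lines, assuming the JDK and Hadoop paths from my
> earlier mails are the real ones; note the $ before PATH, and the explicit
> HADOOP_CONF_DIR that the "Cannot find configuration directory: /etc/hadoop"
> errors above suggest is missing:)
>
> export JAVA_HOME=/usr/lib64/jdk1.7.0_71/jdk7u71
> export HADOOP_HOME=/home/anand_vihar/hadoop
> export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
> export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin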
>
> Would anyone know how to fix this problem?
>
> Thanks
>
> Regards,
>
>
> Anand Murali
> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
> Chennai - 600 004, India
> Ph: (044)- 28474593/ 43526162 (voicemail)
>
>
> On Monday, November 24, 2014 6:30 PM, Anand Murali <anand_vihar@yahoo.com>
> wrote:
>
>
> It works, thanks.
>
> Anand Murali
> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
> Chennai - 600 004, India
> Ph: (044)- 28474593/ 43526162 (voicemail)
>
>
> On Monday, November 24, 2014 6:19 PM, Anand Murali <anand_vihar@yahoo.com>
> wrote:
>
>
> Ok. Many thanks, I shall try.
>
> Anand Murali
> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
> Chennai - 600 004, India
> Ph: (044)- 28474593/ 43526162 (voicemail)
>
>
> On Monday, November 24, 2014 6:13 PM, Rohith Sharma K S <
> rohithsharmaks@huawei.com> wrote:
>
>
> The problem is with the setting of JAVA_HOME. There is a . (dot) before /usr,
> which causes the current directory to be prepended:
> export JAVA_HOME=./usr/lib64/jdk1.7.0_71/jdk7u71
>
> Do not use a . (dot) before /usr.
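> Once the dot is removed, a quick check that the fix took (assuming the JDK
> really lives under /usr/lib64/jdk1.7.0_71/jdk7u71):
>
> ls $JAVA_HOME/bin/java         # the path should now resolve
> $JAVA_HOME/bin/java -version   # should print the JDK version
> hadoop version                 # should no longer fail with "No such file or directory"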
>
> Thanks & Regards
> Rohith Sharma K S
>
>
> From: Anand Murali [mailto:anand_vihar@yahoo.com]
> Sent: 24 November 2014 17:44
> To: user@hadoop.apache.org
> Subject: Hadoop Installation Path problem
>
> Hi All:
>
>
> I have done the following in hadoop-env.sh:
>
> export JAVA_HOME=./usr/lib64/jdk1.7.0_71/jdk7u71
> export HADOOP_HOME=/home/anand_vihar/hadoop
> export PATH=:$PATH:$JAVA_HOME:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
>
> Now when I run hadoop-env.sh and type hadoop version, I get this error.
>
> /home/anand_vihar/hadoop/bin/hadoop: line 133:
> /home/anand_vihar/hadoop/etc/hadoop/usr/lib64/jdk1.7.0_71/jdk7u71/bin/java:
> No such file or directory
> /home/anand_vihar/hadoop/bin/hadoop: line 133: exec:
> /home/anand_vihar/hadoop/etc/hadoop/usr/lib64/jdk1.7.0_71/jdk7u71/bin/java:
> cannot execute: No such file or directory
>
>
> Can somebody advise? I have asked many people about this; they all say it is
> the obvious path problem, but I cannot work out where to debug. This has
> become a show stopper for me. Help most welcome.
>
> Thanks
>
> Regards
>
>
> Anand Murali
> 11/7, 'Anand Vihar', Kandasamy St, Mylapore
> Chennai - 600 004, India
> Ph: (044)- 28474593/ 43526162 (voicemail)
>
>
