
List:       hadoop-user
Subject:    Re: lzo error while running mr job
From:       Kiru Pakkirisamy <kirupakkirisamy () yahoo ! com>
Date:       2015-10-28 20:33:07
Message-ID: 210149576.346328.1446064387598.JavaMail.yahoo () mail ! yahoo ! com

Harsh,

Thank you very much for your valuable/assertive suggestion :-) I was able to
identify the problem and fix it. Elsewhere in the code, we were setting a
different mapred-site.xml in the configuration. I still do not know why it is
using the DefaultCodec for compression (instead of the one I set, SnappyCodec),
but I am hopeful I will get there. Thanks again.

Regards,
- kiru

From: Harsh J <harsh@cloudera.com>
To: "user@hadoop.apache.org" <user@hadoop.apache.org>
Sent: Tuesday, October 27, 2015 8:34 AM
Subject: Re: lzo error while running mr job

The stack trace is pretty certain you do, as it clearly tries to load a class
not belonging within Apache Hadoop. Try looking at the XML files the
application uses? Perhaps you've missed all the spots.

If I had to guess, given the JobSubmitter entry in the trace, it'd be in the
submitting host's /etc/hadoop/conf/* files, or in the dir pointed to by
$HADOOP_CONF_DIR (if that's specifically set). Alternatively, it'd be in the
code. If you have control over the code, you can also make it dump the XML
before submit via: job.getConfiguration().writeXml(System.out);. The XML dump
will carry the source of all properties along with their value.
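[Following that advice, one quick way to hunt for a stray LZO entry is to grep
the conf locations mentioned above. This is only a sketch: the directory list
(/etc/hadoop/conf plus $HADOOP_CONF_DIR) is an assumption about a typical
install, not exhaustive.]

```shell
# Sketch: look for any Lzo codec reference in the conf dirs a submitting
# host typically reads. The directory list is an assumption.
for d in /etc/hadoop/conf "${HADOOP_CONF_DIR:-}"; do
  [ -d "$d" ] || continue
  # -l prints only the names of files that mention an Lzo codec class
  grep -l 'LzoCodec' "$d"/*.xml 2>/dev/null || true
done
```

Any file it prints is a candidate source of the unwanted codec entry; remember
the job's own code can still override these files at submit time.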


On Tue, Oct 27, 2015 at 8:52 PM Kiru Pakkirisamy <kirupakkirisamy@yahoo.com> wrote:


> Harsh,
> We don't have lzo in the io.compression.codecs list. That is what is
> puzzling me.
> Regards,
> Kiru
>
> From: "Harsh J" <harsh@cloudera.com>
> Date: Mon, Oct 26, 2015 at 11:39 PM
> Subject: Re: lzo error while running mr job
>
> Every codec in the io.compression.codecs list of classes will be
> initialised, regardless of actual further use. Since the Lzo*Codec classes
> require the native library to initialise, the failure is therefore expected.
On Tue, Oct 27, 2015 at 11:42 AM Kiru Pakkirisamy <kirupakkirisamy@yahoo.com> wrote:

I am seeing a weird error after we moved to the new hadoop mapreduce java
packages in 2.4. We are not using lzo (as in io.compression.codecs) but we
still get this error. Does it mean we have to have lzo installed even though
we are not using it? Thanks.

Regards,
- kiru

2015-10-27 00:18:57,994 ERROR com.hadoop.compression.lzo.GPLNativeCodeLoader | Could not load native gpl library
java.lang.UnsatisfiedLinkError: no gplcompression in java.library.path
	at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1886) ~[?:1.7.0_85]
	at java.lang.Runtime.loadLibrary0(Runtime.java:849) ~[?:1.7.0_85]
	at java.lang.System.loadLibrary(System.java:1088) ~[?:1.7.0_85]
	at com.hadoop.compression.lzo.GPLNativeCodeLoader.<clinit>(GPLNativeCodeLoader.java:31) [flow-trunk.242-470787.jar:?]
	at com.hadoop.compression.lzo.LzoCodec.<clinit>(LzoCodec.java:60) [flow-trunk.242-470787.jar:?]
	at java.lang.Class.forName0(Native Method) [?:1.7.0_85]
	at java.lang.Class.forName(Class.java:278) [?:1.7.0_85]
	at org.apache.hadoop.conf.Configuration.getClassByNameOrNull(Configuration.java:1834) [flow-trunk.242-470787.jar:?]
	at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1799) [flow-trunk.242-470787.jar:?]
	at org.apache.hadoop.io.compress.CompressionCodecFactory.getCodecClasses(CompressionCodecFactory.java:128) [flow-trunk.242-470787.jar:?]
	at org.apache.hadoop.io.compress.CompressionCodecFactory.<init>(CompressionCodecFactory.java:175) [flow-trunk.242-470787.jar:?]
	at org.apache.hadoop.mapreduce.lib.input.CombineFileInputFormat.isSplitable(CombineFileInputFormat.java:159) [flow-trunk.242-470787.jar:?]
	at org.apache.hadoop.mapreduce.lib.input.CombineFileInputFormat.getMoreSplits(CombineFileInputFormat.java:283) [flow-trunk.242-470787.jar:?]
	at org.apache.hadoop.mapreduce.lib.input.CombineFileInputFormat.getSplits(CombineFileInputFormat.java:243) [flow-trunk.242-470787.jar:?]
	at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:493) [flow-trunk.242-470787.jar:?]

Regards,
- kiru


