List: hadoop-dev
Subject: [jira] Created: (HADOOP-964) Hadoop Shell Script causes ClassNotFoundException for Nutch processes
From: "Dennis Kubes (JIRA)" <jira () apache ! org>
Date: 2007-01-31 23:49:05
Message-ID: 23361462.1170287345512.JavaMail.jira () brutus
Hadoop Shell Script causes ClassNotFoundException for Nutch processes
---------------------------------------------------------------------
Key: HADOOP-964
URL: https://issues.apache.org/jira/browse/HADOOP-964
Project: Hadoop
Issue Type: Bug
Components: scripts
Environment: Windows XP and Fedora Core 6 Linux, Java 1.5.10; should affect all systems
Reporter: Dennis Kubes
Priority: Critical
Fix For: 0.11.0
In the ReduceTaskRunner constructor, line 339, a sorter is created that attempts to get the map output key and value classes from the configuration object. This happens before the TaskTracker$Child process is spawned off into its own separate JVM, so the classpath used for the configuration here is the classpath that started the TaskTracker. The current hadoop script includes the Hadoop jars, meaning that any Hadoop writable type will be found, but it does not include Nutch jars, so any Nutch writable type (or any other external writable type) will not be found and a ClassNotFoundException will be thrown.
I don't think it is a good idea for the Hadoop script to depend on specific Nutch jars, but it is a good idea to allow jars to be included if they are in specific locations, such as HADOOP_HOME, where the nutch jar resides. I have attached a patch that adds any jars in the HADOOP_HOME directory to the hadoop classpath. This fixes the ClassNotFoundExceptions inside Nutch processes.
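For illustration, a minimal sketch of the kind of change described (the actual attached patch may differ; the HADOOP_HOME default path below is hypothetical): append every jar found directly under HADOOP_HOME to the classpath the hadoop script builds, so a nutch jar dropped into that directory becomes visible to the TaskTracker JVM.

```shell
# Sketch: add any jars sitting in HADOOP_HOME itself to CLASSPATH.
HADOOP_HOME="${HADOOP_HOME:-/opt/hadoop}"   # hypothetical default location
CLASSPATH="${CLASSPATH:-}"
for f in "$HADOOP_HOME"/*.jar; do
  [ -e "$f" ] || continue   # skip the literal glob pattern when no jars exist
  CLASSPATH="$CLASSPATH:$f"
done
echo "$CLASSPATH"
```

With this in place, any writable classes packaged in jars under HADOOP_HOME would be resolvable by the classpath that starts the TaskTracker, without the script naming Nutch explicitly.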
--
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.