List: hadoop-commits
Subject: svn commit: r1598451 - in /hadoop/common/trunk/hadoop-common-project/hadoop-common: CHANGES.txt
From: atm@apache.org
Date: 2014-05-30 1:52:17
Message-ID: 20140530015217.941652388868@eris.apache.org
Author: atm
Date: Fri May 30 01:52:17 2014
New Revision: 1598451
URL: http://svn.apache.org/r1598451
Log:
HADOOP-10638. Updating hadoop-daemon.sh to work as expected when nfs is started as a privileged user. Contributed by Manikandan Narayanaswamy.
Modified:
hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/bin/hadoop-daemon.sh
Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt
URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt?rev=1598451&r1=1598450&r2=1598451&view=diff
==============================================================================
--- hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt (original)
+++ hadoop/common/trunk/hadoop-common-project/hadoop-common/CHANGES.txt Fri May 30 01:52:17 2014
@@ -519,6 +519,9 @@ Release 2.5.0 - UNRELEASED
HADOOP-10639. FileBasedKeyStoresFactory initialization is not using default
for SSL_REQUIRE_CLIENT_CERT_KEY. (tucu)
+ HADOOP-10638. Updating hadoop-daemon.sh to work as expected when nfs is
+ started as a privileged user. (Manikandan Narayanaswamy via atm)
+
Release 2.4.1 - UNRELEASED
INCOMPATIBLE CHANGES
Modified: hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/bin/hadoop-daemon.sh
URL: http://svn.apache.org/viewvc/hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/bin/hadoop-daemon.sh?rev=1598451&r1=1598450&r2=1598451&view=diff
==============================================================================
--- hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/bin/hadoop-daemon.sh (original)
+++ hadoop/common/trunk/hadoop-common-project/hadoop-common/src/main/bin/hadoop-daemon.sh Fri May 30 01:52:17 2014
@@ -87,6 +87,14 @@ if [ "$command" == "datanode" ] && [ "$E
starting_secure_dn="true"
fi
+# Determine if we're starting a privileged NFS, if so, redefine the appropriate variables
+if [ "$command" == "nfs3" ] && [ "$EUID" -eq 0 ] && [ -n "$HADOOP_PRIVILEGED_NFS_USER" ]; then
+  export HADOOP_PID_DIR=$HADOOP_PRIVILEGED_NFS_PID_DIR
+  export HADOOP_LOG_DIR=$HADOOP_PRIVILEGED_NFS_LOG_DIR
+  export HADOOP_IDENT_STRING=$HADOOP_PRIVILEGED_NFS_USER
+  starting_privileged_nfs="true"
+fi
+
if [ "$HADOOP_IDENT_STRING" = "" ]; then
export HADOOP_IDENT_STRING="$USER"
fi
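The hunk above can be exercised in isolation. Below is a minimal sketch of the same redefinition logic with hypothetical example values (the user name `nfsserver` and the two directories are assumptions for illustration; the real script takes them from the environment or hadoop-env.sh, and uses bash's read-only $EUID rather than a plain variable):

```shell
#!/usr/bin/env bash
# Sketch of the privileged-NFS redefinition added by HADOOP-10638,
# with stand-in values so it runs without root.
command="nfs3"
euid_example=0                                   # stand-in for $EUID (0 = root)
HADOOP_PRIVILEGED_NFS_USER="nfsserver"           # hypothetical target user
HADOOP_PRIVILEGED_NFS_PID_DIR="/var/run/hadoop-nfs"
HADOOP_PRIVILEGED_NFS_LOG_DIR="/var/log/hadoop-nfs"

# Same three-way guard as the patch: nfs3 command, running as root,
# and a privileged NFS user configured.
if [ "$command" == "nfs3" ] && [ "$euid_example" -eq 0 ] && [ -n "$HADOOP_PRIVILEGED_NFS_USER" ]; then
  export HADOOP_PID_DIR=$HADOOP_PRIVILEGED_NFS_PID_DIR
  export HADOOP_LOG_DIR=$HADOOP_PRIVILEGED_NFS_LOG_DIR
  export HADOOP_IDENT_STRING=$HADOOP_PRIVILEGED_NFS_USER
  starting_privileged_nfs="true"
fi
```

The effect is that PID files, logs, and the identity string are attributed to the privileged NFS user instead of root for the remainder of the daemon script.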
@@ -162,6 +170,9 @@ case $startStop in
echo "ulimit -a for secure datanode user $HADOOP_SECURE_DN_USER" >> $log
# capture the ulimit info for the appropriate user
su --shell=/bin/bash $HADOOP_SECURE_DN_USER -c 'ulimit -a' >> $log 2>&1
+ elif [ "true" = "$starting_privileged_nfs" ]; then
+ echo "ulimit -a for privileged nfs user $HADOOP_PRIVILEGED_NFS_USER" >> $log
+ su --shell=/bin/bash $HADOOP_PRIVILEGED_NFS_USER -c 'ulimit -a' >> $log 2>&1
else
echo "ulimit -a for user $USER" >> $log
ulimit -a >> $log 2>&1
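The second hunk follows the same pattern the script already uses for the secure datanode: capture `ulimit -a` for whichever user the daemon will actually run as. A minimal sketch of that branch structure, using a temporary log file (the `su` branch needs root, so it is shown only as a commented command shape):

```shell
#!/usr/bin/env bash
# Sketch of the ulimit-capture branching, without requiring root.
log=$(mktemp)

# Privileged-NFS branch as added by the patch (root only; shape shown here):
#   su --shell=/bin/bash "$HADOOP_PRIVILEGED_NFS_USER" -c 'ulimit -a' >> "$log" 2>&1

# Fallback branch: record limits for the current user.
echo "ulimit -a for user $USER" >> "$log"
ulimit -a >> "$log" 2>&1
```

Recording the limits of the effective daemon user, rather than root's, makes the log useful when diagnosing file-descriptor or process-count exhaustion later.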