Hadoop on OSX "Unable to load realm info from SCDynamicStore"

Tags: Macos, Hadoop, Osx Lion

Macos Problem Overview


I am getting this error on startup of Hadoop on OSX 10.7:

> Unable to load realm info from SCDynamicStore put: org.apache.hadoop.hdfs.server.namenode.SafeModeException: Cannot create directory /user/travis/input/conf. Name node is in safe mode.

It doesn't appear to be causing any issues with the functionality of Hadoop.
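As an aside, the SafeModeException in that message is separate from the SCDynamicStore warning: the name node simply had not left safe mode yet. If it does not leave on its own, it can be told to manually (Hadoop 1.x syntax shown; newer releases use hdfs dfsadmin instead):

hadoop dfsadmin -safemode leave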

Macos Solutions


Solution 1 - Macos

Matthew Buckett's suggestion in HADOOP-7489 worked for me. (The Oxford realm and KDC values are just placeholders; as Solution 4 shows, even empty values work.) Add the following to your hadoop-env.sh file:

export HADOOP_OPTS="-Djava.security.krb5.realm=OX.AC.UK -Djava.security.krb5.kdc=kdc0.ox.ac.uk:kdc1.ox.ac.uk"

Solution 2 - Macos

As an update to this (and to address David Williams' point about Java 1.7): I found that setting only the .realm and .kdc properties was insufficient to stop the offending message.

However, by examining the source file that emits the message, I was able to determine that setting the java.security.krb5.conf property to /dev/null was enough to suppress it. Obviously, if you actually have a krb5 configuration, it is better to specify the actual path to it.

Altogether, my hadoop-env.sh snippet is as follows:

HADOOP_OPTS="${HADOOP_OPTS} -Djava.security.krb5.realm= -Djava.security.krb5.kdc="
HADOOP_OPTS="${HADOOP_OPTS} -Djava.security.krb5.conf=/dev/null"

Solution 3 - Macos

I'm having the same issue on OS X 10.8.2, Java version 1.7.0_21. Unfortunately, the above solution does not fix the problem with this version :(

Edit: I found the solution to this, based on a hint I saw here. In the hadoop-env.sh file, change the JAVA_HOME setting to:

export JAVA_HOME=`/usr/libexec/java_home -v 1.6`

(Note the backticks here: they run the command and substitute its output.)
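
Running the command by itself shows which JDK it resolves to; with Apple's Java 6 installed, the output typically looks like the path below (the exact path can vary by machine):

/usr/libexec/java_home -v 1.6
# typical output with Apple's Java 6 (may vary):
# /System/Library/Java/JavaVirtualMachines/1.6.0.jdk/Contents/Home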

Solution 4 - Macos

FYI, you can simplify this further by specifying only the following:

export HADOOP_OPTS="-Djava.security.krb5.realm= -Djava.security.krb5.kdc="

This is mentioned in HADOOP-7489 as well.

Solution 5 - Macos

I had a similar problem on macOS, and after trying different combinations this is what worked for me universally (on both Hadoop 1.2 and 2.2):

In $HADOOP_HOME/conf/hadoop-env.sh (on Hadoop 2.x the file lives in $HADOOP_HOME/etc/hadoop/ instead), set the following lines:

# Set Hadoop-specific environment variables here.
export HADOOP_OPTS="-Djava.security.krb5.realm= -Djava.security.krb5.kdc="

# The java implementation to use.
export JAVA_HOME=`/usr/libexec/java_home -v 1.6`

Hope this helps.

Solution 6 - Macos

In addition to the above, add

YARN_OPTS="$YARN_OPTS -Djava.security.krb5.realm=OX.AC.UK -Djava.security.krb5.kdc=kdc0.ox.ac.uk:kdc1.ox.ac.uk"

before executing start-yarn.sh (or start-all.sh) on CDH 4.1.3.
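
To make this persistent rather than a per-session export, the equivalent line can go in yarn-env.sh (a sketch; the path below assumes a Hadoop 2-style layout, and CDH packaging may place the file elsewhere):

# in $HADOOP_HOME/etc/hadoop/yarn-env.sh (path assumed; adjust for your install)
export YARN_OPTS="$YARN_OPTS -Djava.security.krb5.realm=OX.AC.UK -Djava.security.krb5.kdc=kdc0.ox.ac.uk:kdc1.ox.ac.uk"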

Solution 7 - Macos

I had this error when debugging MapReduce from Eclipse, but it was a red herring. The real problem was that I should have been remote debugging, by adding the following debugging parameters to JAVA_OPTS:

-Xdebug -Xrunjdwp:transport=dt_socket,server=y,suspend=y,address=1044

And then creating a new "Remote Java Application" profile in the debug configuration that pointed to port 1044.
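
For reference, a sketch of one way to pass those parameters through (assuming the job is launched via the hadoop script, which hands HADOOP_OPTS to the JVM; with suspend=y the JVM waits until the debugger attaches):

export HADOOP_OPTS="$HADOOP_OPTS -Xdebug -Xrunjdwp:transport=dt_socket,server=y,suspend=y,address=1044"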

This article has some more in-depth information about the debugging side of things. It's about Solr, but Hadoop works much the same way. If you have trouble, leave a message below and I'll try to help.

Attributions

All content for this solution is sourced from the original question on Stack Overflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

Question: Travis Nelson
Solution 1 - Macos: Jeromy Carriere
Solution 2 - Macos: mdaniel
Solution 3 - Macos: user411279
Solution 4 - Macos: btiernay
Solution 5 - Macos: Vladimir Kroz
Solution 6 - Macos: KaKa
Solution 7 - Macos: JnBrymn