Mac spark-shell Error initializing SparkContext

Apache Spark

Apache Spark Problem Overview


I tried to start Spark 1.6.0 (spark-1.6.0-bin-hadoop2.4) on Mac OS X Yosemite 10.10.5 using

"./bin/spark-shell". 

It fails with the error below. I also tried installing different versions of Spark, but they all show the same error. This is the second time I'm running Spark; my previous run worked fine.

log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Using Spark's repl log4j profile: org/apache/spark/log4j-defaults-repl.properties
To adjust logging level use sc.setLogLevel("INFO")
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.6.0
      /_/

Using Scala version 2.10.5 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_79)
Type in expressions to have them evaluated.
Type :help for more information.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 ERROR SparkContext: Error initializing SparkContext.
java.net.BindException: Can't assign requested address: Service 'sparkDriver' failed after 16 retries!
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:125)
	at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:485)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1089)
	at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:430)
	at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:415)
	at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:903)
	at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:198)
	at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:348)
	at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
	at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
	at java.lang.Thread.run(Thread.java:745)
java.net.BindException: Can't assign requested address: Service 'sparkDriver' failed after 16 retries!
	at sun.nio.ch.Net.bind0(Native Method)
	at sun.nio.ch.Net.bind(Net.java:444)
	at sun.nio.ch.Net.bind(Net.java:436)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
	at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:125)
	at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:485)
	at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1089)
	at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:430)
	at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:415)
	at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:903)
	at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:198)
	at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:348)
	at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
	at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
	at java.lang.Thread.run(Thread.java:745)

java.lang.NullPointerException
	at org.apache.spark.sql.SQLContext$.createListenerAndUI(SQLContext.scala:1367)
	at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:101)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
	at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1028)
	at $iwC$$iwC.<init>(<console>:15)
	at $iwC.<init>(<console>:24)
	at <init>(<console>:26)
	at .<init>(<console>:30)
	at .<clinit>(<console>)
	at .<init>(<console>:7)
	at .<clinit>(<console>)
	at $print(<console>)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
	at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
	at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
	at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
	at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
	at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
	at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
	at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
	at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:132)
	at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
	at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
	at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
	at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
	at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
	at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
	at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
	at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
	at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
	at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
	at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
	at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
	at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
	at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
	at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
	at org.apache.spark.repl.Main$.main(Main.scala:31)
	at org.apache.spark.repl.Main.main(Main.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

<console>:16: error: not found: value sqlContext
         import sqlContext.implicits._
                ^
<console>:16: error: not found: value sqlContext
         import sqlContext.sql

Then I added

export SPARK_LOCAL_IP="127.0.0.1"

to spark-env.sh, and the error changed to:

 ERROR : No route to host
    java.net.ConnectException: No route to host
    	at java.net.Inet6AddressImpl.isReachable0(Native Method)
    	at java.net.Inet6AddressImpl.isReachable(Inet6AddressImpl.java:77)
    	at java.net.InetAddress.isReachable(InetAddress.java:475)
...
<console>:10: error: not found: value sqlContext
       import sqlContext.implicits._
              ^
<console>:10: error: not found: value sqlContext
       import sqlContext.sql

Apache Spark Solutions


Solution 1 - Apache Spark

The following steps might help:

  1. Get your hostname by using the "hostname" command.

  2. Make an entry in the /etc/hosts file for your hostname, if it is not already present, as follows (a quick verification sketch follows this list):

     127.0.0.1      your_hostname
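
To check that the entry is being picked up, you can run a quick lookup from the Scala REPL or spark-shell (a minimal sketch; java.net.InetAddress is standard JDK, nothing Spark-specific is assumed):

import java.net.InetAddress

// Should print your hostname and an address it resolves to (e.g. 127.0.0.1)
// instead of throwing UnknownHostException
println(InetAddress.getLocalHost)
println(InetAddress.getLocalHost.getHostAddress)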
    

Hope this helps!!

Solution 2 - Apache Spark

I always get that when switching between networks. This solves it:

$ sudo hostname -s 127.0.0.1

Solution 3 - Apache Spark

I built it from the current master branch with version 2.0.0-SNAPSHOT. After adding export SPARK_LOCAL_IP="127.0.0.1" to load-spark-env.sh, it worked for me. I'm using Mac OS X 10.10.5, so it could be a version issue?

Solution 4 - Apache Spark

Just set spark.driver.host to localhost if you use an IDE:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
SparkConf conf = new SparkConf().setMaster("local[2]").setAppName("AnyName").set("spark.driver.host", "localhost");
JavaSparkContext sc = new JavaSparkContext(conf);
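
If you are working in Scala rather than Java, a roughly equivalent sketch (assuming the same SparkConf/SparkContext API; the app name is just a placeholder) would be:

import org.apache.spark.{SparkConf, SparkContext}

// Bind the driver to localhost so it does not depend on the machine's hostname resolving
val conf = new SparkConf()
  .setMaster("local[2]")
  .setAppName("AnyName")
  .set("spark.driver.host", "localhost")
val sc = new SparkContext(conf)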

Solution 5 - Apache Spark

I think there are two errors:

  1. Your Spark local IP was not correct and needs to be changed to 127.0.0.1.
  2. You didn't define sqlContext properly.

For 1. I tried:

    1. exported SPARK_LOCAL_IP="127.0.0.1" in ~/.bash_profile
    2. added export SPARK_LOCAL_IP="127.0.0.1" in load-spark-env.sh under $SPARK_HOME

But neither worked. Then I tried the following and it worked:

val conf = new SparkConf().
    setAppName("SparkExample").
    setMaster("local[*]").
    set("spark.driver.bindAddress","127.0.0.1")
val sc = new SparkContext(conf)

For 2. you can try:

val sqlContext = SparkSession.builder.config("spark.master","local[*]").getOrCreate()

and then import sqlContext.implicits._

The SparkSession builder will automatically reuse the SparkContext if it already exists; otherwise it will create one. You can create both explicitly if necessary.
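
Putting the two parts together, a minimal sketch (assuming Spark 2.x, where SparkSession is available; the app name and sample data are just placeholders) might look like:

import org.apache.spark.sql.SparkSession

val sqlContext = SparkSession.builder
  .appName("SparkExample")
  .master("local[*]")
  .config("spark.driver.bindAddress", "127.0.0.1")
  .getOrCreate()

// Bring the implicits (e.g. toDF) into scope and do a trivial sanity check
import sqlContext.implicits._
val df = Seq(1, 2, 3).toDF("value")
df.show()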

Solution 6 - Apache Spark

If you don't want to change the hostname of your Mac, you can do the following:

  1. Find the template file spark-env.sh.template on your machine (it is probably in /usr/local/Cellar/apache-spark/2.1.0/libexec/conf/).
  2. cp spark-env.sh.template spark-env.sh
  3. Add export SPARK_LOCAL_IP=127.0.0.1 under the comment for local IP.

Start spark-shell and enjoy it.

Solution 7 - Apache Spark

If you are using Scala to run the code in an IDE and you face the same issue, note that .set() only works on SparkConf(). If you are using SparkSession() rather than SparkConf() as pointed out above, bind the localhost address with .config(), as shown below:

    val spark = SparkSession
      .builder()
      .appName("CSE512-Phase1")
      .master("local[*]")
      .config("spark.driver.bindAddress", "localhost")
      .getOrCreate()

Solution 8 - Apache Spark

On a Mac, add the following to your .bash_profile:

export SPARK_LOCAL_IP=127.0.0.1

Solution 9 - Apache Spark

This happens when you switch between different networks (VPNs for PROD, CI, or other company environments).

I had the same issue whenever I switched VPNs.

Update /etc/hosts (with sudo) with the hostname value of your Mac.

Solution 10 - Apache Spark

Sometimes a firewall prevents creating and binding a socket. Make sure your firewall is not enabled, check the IP of your machine in /etc/hosts and make sure it's OK, then try again:

sudo ufw disable

Solution 11 - Apache Spark

JavaSparkContext sparkContext = new JavaSparkContext("local[4]", "Appname");

export SPARK_LOCAL_IP=127.0.0.1

Just doing the above worked for me.

Solution 12 - Apache Spark

On a Mac, check the IP in System Preferences -> Network -> click the Wi-Fi network you are connected to (it should show a green icon) -> check the IP just above your network name.

Make the following entries in ../conf/spark-env.sh:

SPARK_MASTER_HOST=<<your-ip>>
SPARK_LOCAL_IP=<<your-ip>>

and then try spark-shell. Making the above changes worked for me.

Solution 13 - Apache Spark

This happens due to the network switch whenever you change networks while working with Spark. As an instant fix, set "spark.driver.bindAddress" to "localhost" or "127.0.0.1" in your Spark application:

// Create a SparkContext using every core of the local machine
val confSpark = new SparkConf().set("spark.driver.bindAddress", "localhost")
val sc = new SparkContext("local[*]", "appname", conf = confSpark)
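
To confirm the context actually came up after the fix, a quick check (just an illustration, not part of the original answer) is to run a small job:

// Should print 55.0 once the driver binds successfully
println(sc.parallelize(1 to 10).sum())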

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

Content Type               | Original Author    | Original Content on Stackoverflow
Question                   | Jia                | View Question on Stackoverflow
Solution 1 - Apache Spark  | Gaurav Sharma      | View Answer on Stackoverflow
Solution 2 - Apache Spark  | Ardavan            | View Answer on Stackoverflow
Solution 3 - Apache Spark  | meltac             | View Answer on Stackoverflow
Solution 4 - Apache Spark  | Mohamed Ahmed      | View Answer on Stackoverflow
Solution 5 - Apache Spark  | Rong Du            | View Answer on Stackoverflow
Solution 6 - Apache Spark  | Dvin               | View Answer on Stackoverflow
Solution 7 - Apache Spark  | Sidharth Panicker  | View Answer on Stackoverflow
Solution 8 - Apache Spark  | alturium           | View Answer on Stackoverflow
Solution 9 - Apache Spark  | Yoga Gowda         | View Answer on Stackoverflow
Solution 10 - Apache Spark | Mahdi Esmailoghli  | View Answer on Stackoverflow
Solution 11 - Apache Spark | geekgirlspu        | View Answer on Stackoverflow
Solution 12 - Apache Spark | Jyoti Valeja       | View Answer on Stackoverflow
Solution 13 - Apache Spark | Devbrat Shukla     | View Answer on Stackoverflow