Fix - ERROR SparkUI: Failed to bind SparkUI

Raymond Tang · 12/13/2020

When starting Spark shell on a Windows 10 machine, I encountered the error "ERROR SparkUI: Failed to bind SparkUI". The detailed error message looks like the following:

20/12/13 20:47:34 ERROR SparkUI: Failed to bind SparkUI
java.net.BindException: Failed to bind to /0.0.0.0:4056: Service 'SparkUI' failed after 16 retries (starting from 4040)! Consider explicitly setting the appropriate port for the service 'SparkUI' (for example spark.ui.port for SparkUI) to an available port or increasing spark.port.maxRetries.
        at org.sparkproject.jetty.server.ServerConnector.openAcceptChannel(ServerConnector.java:346)
        at org.sparkproject.jetty.server.ServerConnector.open(ServerConnector.java:308)
        at org.sparkproject.jetty.server.AbstractNetworkConnector.doStart(AbstractNetworkConnector.java:80)
        at org.sparkproject.jetty.server.ServerConnector.doStart(ServerConnector.java:236)
        at org.sparkproject.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
        at org.apache.spark.ui.JettyUtils$.newConnector$1(JettyUtils.scala:301)
        at org.apache.spark.ui.JettyUtils$.httpConnect$1(JettyUtils.scala:332)
        at org.apache.spark.ui.JettyUtils$.$anonfun$startJettyServer$5(JettyUtils.scala:335)
        at org.apache.spark.ui.JettyUtils$.$anonfun$startJettyServer$5$adapted(JettyUtils.scala:335)
        at org.apache.spark.util.Utils$.$anonfun$startServiceOnPort$2(Utils.scala:2256)
        at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:158)
        at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:2248)
        at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:336)
        at org.apache.spark.ui.WebUI.bind(WebUI.scala:146)
        at org.apache.spark.SparkContext.$anonfun$new$10(SparkContext.scala:470)
        at org.apache.spark.SparkContext.$anonfun$new$10$adapted(SparkContext.scala:470)
        at scala.Option.foreach(Option.scala:407)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:470)
        at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2555)
        at org.apache.spark.sql.SparkSession$Builder.$anonfun$getOrCreate$1(SparkSession.scala:930)
        at scala.Option.getOrElse(Option.scala:189)
        at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:921)
        at org.apache.spark.repl.Main$.createSparkSession(Main.scala:112)
        at $line3.$read$$iw$$iw.<init>(<console>:15)
        at $line3.$read$$iw.<init>(<console>:42)
        at $line3.$read.<init>(<console>:44)
        at $line3.$read$.<init>(<console>:48)
        at $line3.$read$.<clinit>(<console>)
        at $line3.$eval$.$print$lzycompute(<console>:7)
        at $line3.$eval$.$print(<console>:6)
        at $line3.$eval.$print(<console>)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:745)
        at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1021)
        at scala.tools.nsc.interpreter.IMain.$anonfun$interpret$1(IMain.scala:574)
        at scala.reflect.internal.util.ScalaClassLoader.asContext(ScalaClassLoader.scala:41)
        at scala.reflect.internal.util.ScalaClassLoader.asContext$(ScalaClassLoader.scala:37)
        at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:41)
        at scala.tools.nsc.interpreter.IMain.loadAndRunReq$1(IMain.scala:573)
        at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:600)
        at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:570)
        at scala.tools.nsc.interpreter.IMain.$anonfun$quietRun$1(IMain.scala:224)
        at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
        at scala.tools.nsc.interpreter.IMain.quietRun(IMain.scala:224)
        at org.apache.spark.repl.SparkILoop.$anonfun$initializeSpark$2(SparkILoop.scala:83)
        at scala.collection.immutable.List.foreach(List.scala:392)
        at org.apache.spark.repl.SparkILoop.$anonfun$initializeSpark$1(SparkILoop.scala:83)
        at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
        at scala.tools.nsc.interpreter.ILoop.savingReplayStack(ILoop.scala:99)
        at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:83)
        at org.apache.spark.repl.SparkILoop.$anonfun$process$4(SparkILoop.scala:165)
        at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
        at scala.tools.nsc.interpreter.ILoop.$anonfun$mumly$1(ILoop.scala:168)
        at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
        at scala.tools.nsc.interpreter.ILoop.mumly(ILoop.scala:165)
        at org.apache.spark.repl.SparkILoop.loopPostInit$1(SparkILoop.scala:153)
        at org.apache.spark.repl.SparkILoop.$anonfun$process$10(SparkILoop.scala:221)
        at org.apache.spark.repl.SparkILoop.withSuppressedSettings$1(SparkILoop.scala:189)
        at org.apache.spark.repl.SparkILoop.startup$1(SparkILoop.scala:201)
        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:236)
        at org.apache.spark.repl.Main$.doMain(Main.scala:78)
        at org.apache.spark.repl.Main$.main(Main.scala:58)
        at org.apache.spark.repl.Main.main(Main.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:928)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1007)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1016)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Fix the error

The error message itself is self-explanatory: Spark tried ports 4040 through 4055 (16 attempts) and none of them was available, so the SparkUI service could not bind.
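Before changing any Spark settings, it can help to find out which processes are occupying those ports. On Windows, one quick way to check (4040 here is just the default base port; substitute whichever port you care about, and replace the PID 1234 with the one reported by netstat) looks like this:

```shell
REM List listening sockets on port 4040 along with the owning process ID (PID).
netstat -ano | findstr :4040

REM Look up which program owns that PID (replace 1234 with the PID from above).
tasklist /FI "PID eq 1234"
```

If the occupying program can be safely stopped, freeing the port resolves the error without touching any Spark configuration.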

About Spark config spark.ui.port

The Spark configuration property spark.ui.port can be used to specify the port of the Spark UI. By default it is 4040. If that port is occupied by another program, Spark increments the port number and retries, up to spark.port.maxRetries times (16 by default).
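This gives you two knobs: move the UI to a different base port, or widen the retry window. A sketch combining both via the standard --conf flag (the values 11111 and 32 are just examples; pick a base port that is free on your system):

```shell
# Move the Spark UI base port to 11111 and allow up to 32 retries from there,
# so Spark will try ports 11111 through 11142 before giving up.
spark-shell --conf spark.ui.port=11111 --conf spark.port.maxRetries=32
```

The same --conf pairs work with spark-submit and pyspark, since port binding is handled by Spark core, not the shell.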

Change default Spark UI port

The following command sets the default Spark UI port to 11111 when starting a Spark session (-c is shorthand for --conf):

spark-shell -c spark.ui.port=11111

Note: please choose a port number that is available on your system.

Once the Spark session has started successfully, the console prints the Spark UI web URL. For example, the UI is at http://localhost:11111 on my computer, as the following screenshot shows:

[Screenshot: Spark shell startup output showing the Spark UI web URL]

The UI looks like the following:

[Screenshot: Spark UI home page at http://localhost:11111]

Install Spark 3.0.0

If you don't have a Spark cluster to work with, you can try configuring one on your Windows or Linux machine by following the Spark installation guides on this site.
