Hi -
Today I've been trying to use the "GATK 4 CNV Proportional Coverage for WGS" (version 4) task in Firehose, copied from Algorithms Commons. After 16 warnings of the form
WARN Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
there is an error:
16/09/16 14:14:39 ERROR SparkUI: Failed to bind SparkUI
java.net.BindException: Address already in use: Service 'SparkUI' failed after 16 retries!
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:433)
at sun.nio.ch.Net.bind(Net.java:425)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at org.spark-project.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
at org.spark-project.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
at org.spark-project.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.spark-project.jetty.server.Server.doStart(Server.java:293)
at org.spark-project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:252)
at org.apache.spark.ui.JettyUtils$$anonfun$5.apply(JettyUtils.scala:262)
at org.apache.spark.ui.JettyUtils$$anonfun$5.apply(JettyUtils.scala:262)
at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1988)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1979)
at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:262)
at org.apache.spark.ui.WebUI.bind(WebUI.scala:136)
at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:481)
at org.apache.spark.SparkContext$$anonfun$13.apply(SparkContext.scala:481)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:481)
at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:59)
at org.broadinstitute.hellbender.engine.spark.SparkContextFactory.createSparkContext(SparkContextFactory.java:152)
at org.broadinstitute.hellbender.engine.spark.SparkContextFactory.getSparkContext(SparkContextFactory.java:84)
at org.broadinstitute.hellbender.engine.spark.SparkCommandLineProgram.doWork(SparkCommandLineProgram.java:36)
at org.broadinstitute.hellbender.cmdline.CommandLineProgram.runTool(CommandLineProgram.java:102)
at org.broadinstitute.hellbender.cmdline.CommandLineProgram.instanceMainPostParseArgs(CommandLineProgram.java:155)
at org.broadinstitute.hellbender.cmdline.CommandLineProgram.instanceMain(CommandLineProgram.java:174)
at org.broadinstitute.hellbender.Main.instanceMain(Main.java:69)
at org.broadinstitute.hellbender.Main.main(Main.java:84)
16/09/16 14:14:39 INFO DiskBlockManager: Shutdown hook called
16/09/16 14:14:39 INFO ShutdownHookManager: Shutdown hook called
16/09/16 14:14:39 INFO ShutdownHookManager: Deleting directory /tmp/cgaadm/spark-921e83a7-f0a2-4d14-b52e-1a8a9733e16c
16/09/16 14:14:39 INFO ShutdownHookManager: Deleting directory /tmp/cgaadm/spark-921e83a7-f0a2-4d14-b52e-1a8a9733e16c/userFiles-a080c53e-7413-48ed-a3dc-1d4afcf7875b
This task ran OK in the An_REBC_dedicated workspace at the beginning of August, and it's not clear why it should fail now.
Could this be an environment problem on the nodes the task is running on?
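For what it's worth, here is a minimal standalone sketch of how I'd check that on a node, assuming this really is plain Spark port contention and nothing Firehose-specific. It probes the same port range the driver walks (4040 plus the default spark.port.maxRetries of 16, which matches the "failed after 16 retries" message above), then brings up a context with the standard Spark settings that sidestep the contention. The class and app names are hypothetical, and I haven't confirmed whether the Firehose task wrapper exposes a way to pass these properties through.

import java.io.IOException;
import java.net.ServerSocket;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class SparkUiBindCheck {
    public static void main(String[] args) {
        // 1. See which of the ports the driver walks (4040 plus 16 retries)
        //    are already held on this node.
        for (int port = 4040; port <= 4056; port++) {
            try (ServerSocket probe = new ServerSocket(port)) {
                System.out.println(port + " free");
            } catch (IOException e) {
                System.out.println(port + " in use");
            }
        }

        // 2. Standard Spark settings that avoid the contention: a batch task
        //    has no real need for the web UI, and spark.port.maxRetries can be
        //    raised above the default of 16 if the UI is wanted anyway.
        SparkConf conf = new SparkConf()
                .setAppName("pcov-bind-check")   // hypothetical app name
                .setMaster("local[1]")
                .set("spark.ui.enabled", "false")
                .set("spark.port.maxRetries", "64");
        try (JavaSparkContext sc = new JavaSparkContext(conf)) {
            // Trivial job to confirm the context comes up without a BindException.
            long n = sc.parallelize(java.util.Arrays.asList(1, 2, 3)).count();
            System.out.println("count = " + n);
        }
    }
}

If several of these tasks land on the same node at once, each driver starting its own SparkUI would explain why a job that ran fine in August fails now.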
Thanks
Chip
P.S. Here are the top lines of the error log:
Picked up JAVA_TOOL_OPTIONS: -Xmx1g -DR_HOME=/broad/software/free/Linux/redhat_6_x86_64/pkgs/r_2.10.1
[September 16, 2016 2:14:31 PM EDT] org.broadinstitute.hellbender.tools.genome.SparkGenomeReadCounts --keepXYMT false --binsize 3000 --outputFile THCA-TCGA-DJ-A2Q8-Tumor-SM-2BWKC.pcov --reference /seq/references/Homo_sapiens_assembly19/v1/Homo_sapiens_assembly19.fasta --input /seq/picard_aggregation/G32528/TCGA-DJ-A2Q8-01A-11D-A18F-08/v5/TCGA-DJ-A2Q8-01A-11D-A18F-08.bam --sparkMaster local[1] --readValidationStringency SILENT --interval_set_rule UNION --interval_padding 0 --bamPartitionSize 0 --disableSequenceDictionaryValidation false --shardedOutput false --numReducers 0 --help false --version false --verbosity INFO --QUIET false
[September 16, 2016 2:14:31 PM EDT] Executing as cgaadm@rebc-c001.broadinstitute.org on Linux 2.6.32-642.el6.x86_64 amd64; Java HotSpot(TM) 64-Bit Server VM 1.8.0_92-b14; Version: Version:version-unknown-SNAPSHOT
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
16/09/16 14:14:32 INFO SparkContext: Running Spark version 1.6.1
16/09/16 14:14:32 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/09/16 14:14:33 INFO SecurityManager: Changing view acls to: cgaadm
16/09/16 14:14:33 INFO SecurityManager: Changing modify acls to: cgaadm
16/09/16 14:14:33 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(cgaadm); users with modify permissions: Set(cgaadm)
16/09/16 14:14:36 WARN ThreadLocalRandom: Failed to generate a seed from SecureRandom within 3 seconds. Not enough entrophy?
16/09/16 14:14:36 INFO Utils: Successfully started service 'sparkDriver' on port 38238.
16/09/16 14:14:37 INFO Slf4jLogger: Slf4jLogger started
16/09/16 14:14:37 INFO Remoting: Starting remoting
16/09/16 14:14:37 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem@10.200.103.92:46814]
16/09/16 14:14:37 INFO Utils: Successfully started service 'sparkDriverActorSystem' on port 46814.
16/09/16 14:14:37 INFO SparkEnv: Registering MapOutputTracker
16/09/16 14:14:37 INFO SparkEnv: Registering BlockManagerMaster
16/09/16 14:14:37 INFO DiskBlockManager: Created local directory at /tmp/cgaadm/blockmgr-7b3e574a-a01d-46ac-bed4-3b85e5458a0a
16/09/16 14:14:37 INFO MemoryStore: MemoryStore started with capacity 3.8 GB
16/09/16 14:14:38 INFO SparkEnv: Registering OutputCommitCoordinator