Channel: Recent Discussions — GATK-Forum

CNNScoreVariants: too many threads


Hi,

In the Best Practices workflows you advise running HaplotypeCaller with the "-XX:GCTimeLimit=50" and "-XX:GCHeapFreeLimit=10" Java options.

Is there something similar for CNNScoreVariants? I have tried several Java options with different values to limit the number of threads, but it seems nearly impossible. Without any options a single command spawns 116 threads; with five Java options I can get that down to 95, which is still too many. What should I limit here?

Many thanks
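
A minimal sketch of one way to cap the thread count, assuming the thread explosion comes from the tool's TensorFlow/MKL backend rather than the JVM itself; the --inter-op-threads and --intra-op-threads arguments and the OMP_NUM_THREADS variable are assumptions to verify against the CNNScoreVariants documentation for your GATK version:

 export OMP_NUM_THREADS=4    # cap OpenMP/MKL worker threads in the Python backend
 gatk --java-options "-Xmx8g -XX:ActiveProcessorCount=4" CNNScoreVariants \
   -R reference.fasta \
   -V input.vcf.gz \
   -O cnn_scored.vcf.gz \
   --inter-op-threads 4 \
   --intra-op-threads 4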



Plus and minus strands removed from computation by HaplotypeCaller


Hi, is there a way to know how many plus- or minus-strand reads are removed from the computation by HaplotypeCaller, when comparing its AD values with the read counts shown in IGV? In the caller output I can only see QUAL, GQ, QD, AD, DP, PL, MQ, FS, etc. I would like to know this to get maximum concordance with Sanger sequencing.
Thanks a lot, any help appreciated -Anne
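
A minimal sketch of two options that may help here, assuming current GATK4 behavior (file names are placeholders): -bamout writes the locally reassembled reads HaplotypeCaller actually used, so they can be compared with the raw pileup in IGV, and the StrandBiasBySample annotation adds an SB field with per-sample forward/reverse counts for the REF and ALT alleles:

 gatk HaplotypeCaller \
   -R reference.fasta \
   -I sample.bam \
   -O sample.vcf.gz \
   -bamout sample.bamout.bam \
   -A StrandBiasBySample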


How to run CalculateGenotypePosteriors on a case-control cohort


One usage example is to refine genotypes based on the discovered allele frequency in an input VCF containing many samples:

 gatk --java-options "-Xmx4g" CalculateGenotypePosteriors \
   -V multisample_input.vcf.gz \
   -O output.vcf.gz

Now, if I have a case-control cohort for an autoimmune disease, I wonder whether I should run the above step separately for cases and controls, or put them into one VCF file?

Thanks for any insight!
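
A hedged sketch of an alternative, assuming the --supporting-callsets argument is available in your GATK version and that a population allele-frequency resource exists for your organism (the file names below are placeholders); with an external prior the case/control split matters less, because the frequencies are not derived from the potentially case-enriched cohort itself:

 gatk --java-options "-Xmx4g" CalculateGenotypePosteriors \
   -V combined_case_control.vcf.gz \
   --supporting-callsets population_af_resource.vcf.gz \
   -O output.posteriors.vcf.gz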


GATK Spark Logging

Hello,

I've been trying to decrease the verbosity of the Spark runs for GATK tools, e.g. MarkDuplicatesSpark.

My call is as follows:

python ${gatkDir}/gatk MarkDuplicatesSpark --spark-master local[$threads] -R ${GRC}.fa --input ${TU}.bam --output ${TU}.dd.bam --tmp-dir temp --verbosity ERROR

I thought --verbosity ERROR would write only ERROR-level output from the tools, but I'm still getting a lot of INFO output.

Is there another way to get only ERROR level output?

Thanks!
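
One possible workaround (a sketch, assuming the remaining INFO lines come from Spark's own log4j defaults rather than the GATK logger, and that the tool passes --conf properties through to the Spark context): point the Spark driver at a local log4j.properties that raises the root level to ERROR.

 # log4j.properties (in the working directory):
 #   log4j.rootCategory=ERROR, console
 #   log4j.appender.console=org.apache.log4j.ConsoleAppender
 #   log4j.appender.console.layout=org.apache.log4j.PatternLayout
 #   log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

 gatk MarkDuplicatesSpark --spark-master local[$threads] \
   -R ${GRC}.fa --input ${TU}.bam --output ${TU}.dd.bam --tmp-dir temp --verbosity ERROR \
   --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=file:log4j.properties"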

conda env create fails: Invalid requirement: '$tensorFlowDependency'

Hi,

I get an error while trying to create the conda environment for GATK on CentOS 7. GATK itself installed successfully, conda is installed, and the standard Python supplied with the system has been updated to the latest version. From what I can see, Anaconda has its own Python 3. The $tensorFlowDependency line and the others like it are puzzling to me.

Below is the entire output of the command:

------------------------------------------------
# conda env create -n gatk -f gatkcondaenv.yml
Solving environment: done


==> WARNING: A newer version of conda exists. <==
current version: 4.5.12
latest version: 4.6.8

Please update conda by running

$ conda update -n base -c defaults conda



Downloading and Extracting Packages
intel-openmp-2018.0. | 620 KB | ############################################################################################################## | 100%
pip-9.0.1 | 1.7 MB | ############################################################################################################## | 100%
zlib-1.2.11 | 109 KB | ############################################################################################################## | 100%
readline-6.2 | 606 KB | ############################################################################################################## | 100%
openssl-1.0.2l | 3.2 MB | ############################################################################################################## | 100%
tk-8.5.18 | 1.9 MB | ############################################################################################################## | 100%
certifi-2016.2.28 | 216 KB | ############################################################################################################## | 100%
xz-5.2.3 | 667 KB | ############################################################################################################## | 100%
python-3.6.2 | 16.5 MB | ############################################################################################################## | 100%
sqlite-3.13.0 | 4.0 MB | ############################################################################################################## | 100%
setuptools-36.4.0 | 563 KB | ############################################################################################################## | 100%
mkl-2018.0.1 | 184.7 MB | ############################################################################################################## | 100%
wheel-0.29.0 | 88 KB | ############################################################################################################## | 100%
mkl-service-1.1.2 | 11 KB | ############################################################################################################## | 100%
Preparing transaction: done
Verifying transaction: done
Executing transaction: done
Invalid requirement: '$tensorFlowDependency'
Traceback (most recent call last):
File "/usr/share/anaconda2/envs/gatk/lib/python3.6/site-packages/pip/_vendor/packaging/requirements.py", line 92, in __init__
req = REQUIREMENT.parseString(requirement_string)
File "/usr/share/anaconda2/envs/gatk/lib/python3.6/site-packages/pip/_vendor/pyparsing.py", line 1617, in parseString
raise exc
File "/usr/share/anaconda2/envs/gatk/lib/python3.6/site-packages/pip/_vendor/pyparsing.py", line 1607, in parseString
loc, tokens = self._parse( instring, 0 )
File "/usr/share/anaconda2/envs/gatk/lib/python3.6/site-packages/pip/_vendor/pyparsing.py", line 1379, in _parseNoCache
loc,tokens = self.parseImpl( instring, preloc, doActions )
File "/usr/share/anaconda2/envs/gatk/lib/python3.6/site-packages/pip/_vendor/pyparsing.py", line 3376, in parseImpl
loc, exprtokens = e._parse( instring, loc, doActions )
File "/usr/share/anaconda2/envs/gatk/lib/python3.6/site-packages/pip/_vendor/pyparsing.py", line 1379, in _parseNoCache
loc,tokens = self.parseImpl( instring, preloc, doActions )
File "/usr/share/anaconda2/envs/gatk/lib/python3.6/site-packages/pip/_vendor/pyparsing.py", line 3698, in parseImpl
return self.expr._parse( instring, loc, doActions, callPreParse=False )
File "/usr/share/anaconda2/envs/gatk/lib/python3.6/site-packages/pip/_vendor/pyparsing.py", line 1379, in _parseNoCache
loc,tokens = self.parseImpl( instring, preloc, doActions )
File "/usr/share/anaconda2/envs/gatk/lib/python3.6/site-packages/pip/_vendor/pyparsing.py", line 3359, in parseImpl
loc, resultlist = self.exprs[0]._parse( instring, loc, doActions, callPreParse=False )
File "/usr/share/anaconda2/envs/gatk/lib/python3.6/site-packages/pip/_vendor/pyparsing.py", line 1383, in _parseNoCache
loc,tokens = self.parseImpl( instring, preloc, doActions )
File "/usr/share/anaconda2/envs/gatk/lib/python3.6/site-packages/pip/_vendor/pyparsing.py", line 2670, in parseImpl
raise ParseException(instring, loc, self.errmsg, self)
pip._vendor.pyparsing.ParseException: Expected W:(abcd...) (at char 0), (line:1, col:1)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/usr/share/anaconda2/envs/gatk/lib/python3.6/site-packages/pip/req/req_install.py", line 82, in __init__
req = Requirement(req)
File "/usr/share/anaconda2/envs/gatk/lib/python3.6/site-packages/pip/_vendor/packaging/requirements.py", line 96, in __init__
requirement_string[e.loc:e.loc + 8]))
pip._vendor.packaging.requirements.InvalidRequirement: Invalid requirement, parse error at "'$tensorF'"


CondaValueError: pip returned an error

-----------------------------------------------

Can you please point me in the right direction as to what is missing (apart from an updated conda)?


Thanks

Best Regards
Maciej
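
A small diagnostic sketch, on the assumption that the literal $tensorFlowDependency means the file in use is the unsubstituted gatkcondaenv.yml.template from the source tree rather than the already-substituted gatkcondaenv.yml that ships in the release zip (or that the Gradle build generates):

 # Does the yml still contain build-time placeholders?
 grep -n 'tensorFlowDependency' gatkcondaenv.yml

 # If it does, point conda at the gatkcondaenv.yml from the release zip instead,
 # or, when working from a git checkout, let Gradle generate/substitute it first
 # (task name assumed; check build.gradle for your version):
 ./gradlew localDevCondaEnv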

Java related error encountered while running gatk PathSeqPipelineSpark


Hi,

I am trying to run the PathSeq tutorial, following the instructions at this link.

I ran the following commands:

bioinfo@bioinfo$ conda activate gatk
(gatk) bioinfo@bioinfo$ gatk PathSeqPipelineSpark \
>     --input test_sample.bam \
>     --filter-bwa-image hg19mini.fasta.img \
>     --kmer-file hg19mini.hss \
>     --min-clipped-read-length 70 \
>     --microbe-fasta e_coli_k12.fasta \
>     --microbe-bwa-image e_coli_k12.fasta.img \
>     --taxonomy-file e_coli_k12.db \
>     --output output.pathseq.bam \
>     --scores-output output.pathseq.txt

And encountered the error below:

Using GATK jar /home/bioinfo/Installers/gatk4/gatk-4.1.0.0/gatk-package-4.1.0.0-local.jar
Running:
    java -Dsamjdk.use_async_io_read_samtools=false -Dsamjdk.use_async_io_write_samtools=true -Dsamjdk.use_async_io_write_tribble=false -Dsamjdk.compression_level=2 -jar /home/bioinfo/Installers/gatk4/gatk-4.1.0.0/gatk-package-4.1.0.0-local.jar PathSeqPipelineSpark --input test_sample.bam --filter-bwa-image hg19mini.fasta.img --kmer-file hg19mini.hss --min-clipped-read-length 70 --microbe-fasta e_coli_k12.fasta --microbe-bwa-image e_coli_k12.fasta.img --taxonomy-file e_coli_k12.db --output output.pathseq.bam --scores-output output.pathseq.txt
18:57:39.629 WARN  SparkContextFactory - Environment variables HELLBENDER_TEST_PROJECT and HELLBENDER_JSON_SERVICE_ACCOUNT_KEY must be set or the GCS hadoop connector will not be configured properly
18:57:39.729 INFO  NativeLibraryLoader - Loading libgkl_compression.so from jar:file:/home/bioinfo/Installers/gatk4/gatk-4.1.0.0/gatk-package-4.1.0.0-local.jar!/com/intel/gkl/native/libgkl_compression.so
18:57:41.594 INFO  PathSeqPipelineSpark - ------------------------------------------------------------
18:57:41.594 INFO  PathSeqPipelineSpark - The Genome Analysis Toolkit (GATK) v4.1.0.0
18:57:41.594 INFO  PathSeqPipelineSpark - For support and documentation go to https://software.broadinstitute.org/gatk/
18:57:41.739 INFO  PathSeqPipelineSpark - Initializing engine
18:57:41.739 INFO  PathSeqPipelineSpark - Done initializing engine
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
19/03/05 18:57:41 INFO SparkContext: Running Spark version 2.2.0
18:57:41.968 WARN  NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
18:57:42.155 INFO  PathSeqPipelineSpark - Shutting down engine
[5 March, 2019 6:57:42 PM IST] org.broadinstitute.hellbender.tools.spark.pathseq.PathSeqPipelineSpark done. Elapsed time: 0.04 minutes.
Runtime.totalMemory()=645922816
Exception in thread "main" java.lang.ExceptionInInitializerError
    at org.apache.spark.SparkConf.validateSettings(SparkConf.scala:546)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:373)
    at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
    at org.broadinstitute.hellbender.engine.spark.SparkContextFactory.createSparkContext(SparkContextFactory.java:178)
    at org.broadinstitute.hellbender.engine.spark.SparkContextFactory.getSparkContext(SparkContextFactory.java:110)
    at org.broadinstitute.hellbender.engine.spark.SparkCommandLineProgram.doWork(SparkCommandLineProgram.java:28)
    at org.broadinstitute.hellbender.cmdline.CommandLineProgram.runTool(CommandLineProgram.java:138)
    at org.broadinstitute.hellbender.cmdline.CommandLineProgram.instanceMainPostParseArgs(CommandLineProgram.java:191)
    at org.broadinstitute.hellbender.cmdline.CommandLineProgram.instanceMain(CommandLineProgram.java:210)
    at org.broadinstitute.hellbender.Main.runCommandLineProgram(Main.java:162)
    at org.broadinstitute.hellbender.Main.mainEntry(Main.java:205)
    at org.broadinstitute.hellbender.Main.main(Main.java:291)
Caused by: java.net.UnknownHostException: bioinfo: bioinfo: unknown error
    at java.net.InetAddress.getLocalHost(InetAddress.java:1505)
    at org.apache.spark.util.Utils$.findLocalInetAddress(Utils.scala:891)
    at org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$localIpAddress$lzycompute(Utils.scala:884)
    at org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$localIpAddress(Utils.scala:884)
    at org.apache.spark.util.Utils$$anonfun$localHostName$1.apply(Utils.scala:941)
    at org.apache.spark.util.Utils$$anonfun$localHostName$1.apply(Utils.scala:941)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.util.Utils$.localHostName(Utils.scala:941)
    at org.apache.spark.internal.config.package$.<init>(package.scala:204)
    at org.apache.spark.internal.config.package$.<clinit>(package.scala)
    ... 12 more
Caused by: java.net.UnknownHostException: bioinfo: unknown error
    at java.net.Inet6AddressImpl.lookupAllHostAddr(Native Method)
    at java.net.InetAddress$2.lookupAllHostAddr(InetAddress.java:928)
    at java.net.InetAddress.getAddressesFromNameService(InetAddress.java:1323)
    at java.net.InetAddress.getLocalHost(InetAddress.java:1500)
    ... 21 more
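
The root cause in the stack trace is java.net.UnknownHostException: bioinfo, i.e. Spark cannot resolve the machine's own hostname. A common workaround (a sketch, not an official fix) is to map the hostname locally or tell Spark which address to bind to:

 echo "127.0.0.1   bioinfo" | sudo tee -a /etc/hosts   # map the unresolvable hostname
 # or, without touching /etc/hosts:
 export SPARK_LOCAL_IP=127.0.0.1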


How to make somatic variant calls from RNA-Seq data (tumor) and whole exome data (matched normal)


I'm in a situation where I need to make somatic variant calls for a mouse tumor cell line. We have RNA-Seq data from this tumor cell line, but for the matched normal what we have is whole-exome sequencing data. Are there any tools/workflows I can use in this case? Thanks!
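
One possible, unsupported sketch: preprocess the RNA-Seq tumor BAM as in the RNA-seq short-variant workflow (splice-aware alignment, duplicate marking, SplitNCigarReads, BQSR), then run Mutect2 in tumor-normal mode against the exome normal. File and sample names below are placeholders, and the mismatch in data types means extra filtering for coverage differences and splice artifacts would be needed:

 # split reads spanning splice junctions in the RNA-Seq tumor BAM
 gatk SplitNCigarReads -R mouse_ref.fasta -I tumor_rnaseq.bam -O tumor_rnaseq.split.bam

 # somatic calling, RNA tumor vs. exome normal
 gatk Mutect2 \
   -R mouse_ref.fasta \
   -I tumor_rnaseq.split.bam \
   -I normal_wes.bam \
   -normal normal_sample_name \
   -O somatic.vcf.gz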
