[BioC] DESeq error thrown during estimateDispersions w/ coxReid method
Steve Lianoglou
mailinglist.honeypot at gmail.com
Thu Apr 26 00:41:26 CEST 2012
Hi,
(maybe this is more appropriate for bioc-devel, but ...)
Using R-2.15-patched, DESeq_1.9.4
DESeq isn't liking 1 row in my count data, and throws an error in the
`estimateAndFitDispersionsWithCoxReid` function. Specifically this
error:
Error in glm.fit(mm, y, family = MASS::negative.binomial(initialGuess), :
NA/NaN/Inf in 'x'
The count data looks like this, where w1,w2,w3 are replicates of experiment w:
 w1 w2 w3 x1 x2 x3 y1 y2 z1 z2
  0  0 18  0 52  0  0  0  1  1
Ok -- it's weird, I'll grant you that. Still, instead of killing the
entire run (which is a bit time-consuming), I was curious whether
something could be done about such troublesome count rows.
For instance, in the `apply` loop we could wrap glm.fit in a
tryCatch() and just set the dispersion for such a row to NA. When all
is said and done, perhaps emit a warning like "Cannot estimate
dispersions for XX rows" and set their dispersion to `max(disps)`. You
could even attach the indices of the "bad" rows as an attribute of the
returned object, so the user could remove them afterwards.
Would that be a reasonable thing to do?
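To make the suggestion concrete, here is a minimal toy sketch of the
tryCatch idea -- this is not the actual DESeq internals, just an
illustration of a per-row glm.fit that degrades to NA and a max(disps)
fallback with a single warning (the counts, design, and `fitRow` helper
are all made up for the example):

```r
library(MASS)

# Toy data: one well-behaved row plus the troublesome row from this post.
counts <- rbind(
  c(5, 7, 9, 12, 30, 21),
  c(0, 0, 18, 0, 52, 0)
)
mm <- model.matrix(~ gl(2, 3))  # two-condition design, 3 replicates each

# Hypothetical per-row fit: return NA instead of aborting when glm.fit fails.
fitRow <- function(y) {
  tryCatch(
    glm.fit(mm, y, family = negative.binomial(1))$deviance,
    error = function(e) NA_real_
  )
}

disps <- apply(counts, 1, fitRow)

# One warning for all failed rows, then fall back to max(disps).
bad <- which(is.na(disps))
if (length(bad) > 0) {
  warning(sprintf("Cannot estimate dispersions for %d rows", length(bad)))
  disps[bad] <- max(disps, na.rm = TRUE)
}
attr(disps, "badRows") <- bad  # so the user can drop them afterwards
```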
Also, would you accept a patch to
`estimateAndFitDispersionsWithCoxReid` that parallelizes it in a
similar way to how DEXSeq parallelizes some of its CPU-intensive bits?
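The parallelization I have in mind is nothing fancy -- just replacing
the serial per-row loop with mclapply() from the parallel package,
which is the multicore machinery DEXSeq leans on for its heavy steps.
A self-contained sketch (the `fitOne` stand-in is hypothetical, not
the real per-row Cox-Reid fit):

```r
library(parallel)

# Toy count matrix: 100 rows of Poisson noise across 6 samples.
set.seed(1)
counts <- matrix(rpois(600, lambda = 10), nrow = 100)

# Stand-in for the per-row dispersion fit (the real one is the
# expensive glm.fit call inside estimateAndFitDispersionsWithCoxReid).
fitOne <- function(y) mean(y)

# Rows are independent, so they can be farmed out across cores.
# (Use mc.cores = 1 on Windows, where forking is unavailable.)
disps <- unlist(mclapply(
  seq_len(nrow(counts)),
  function(i) fitOne(counts[i, ]),
  mc.cores = 2
))
```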
Thanks,
-steve
--
Steve Lianoglou
Graduate Student: Computational Systems Biology
| Memorial Sloan-Kettering Cancer Center
| Weill Medical College of Cornell University
Contact Info: http://cbio.mskcc.org/~lianos/contact