[R] Logistic Regression

Mike Marchywka marchywka at hotmail.com
Tue Jun 7 12:20:05 CEST 2011





----------------------------------------
> Date: Tue, 7 Jun 2011 01:38:32 -0700
> From: farah.farid.n09 at student.aku.edu
> To: r-help at r-project.org
> Subject: [R] Logistic Regression
>
> I am working on my thesis, in which I have a couple of independent variables
> that are categorical in nature and the dependent variable is dichotomous.
> Initially I ran univariate analyses and added the variables with significant
> p-values (p < 0.25) to my full model.
> I have three confusions. Firstly, I am looking for confounding variables by

I'm not sure what your thesis is about: some system that you are studying
with statistics, or maybe the thesis is about statistics itself. But
according to this (disputed) Wikipedia entry,
http://en.wikipedia.org/wiki/Confounding

whether a variable is confounding or merely extraneous is determined by the
reality of your system, not by the statistics alone. It may help to think
about the factors at work in that system and use the statistics to avoid
fooling yourself. Look at the pictures (a non-pompous way of saying: look at
graphs and scatter plots for ideas worth testing) and then test those ideas.
You see bad cause/effect inferences all the time in many fields, from econ
to biotech (although anecdotes suggest these mistakes usually favour the
sponsors, LOL). Consider some mundane "known" examples of what your data
would look like and see whether that matches what you actually have. If you
were naively measuring car velocity at a single point in front of a traffic
light, together with the colour of the light, what might you observe? (Much
like an earlier example on iron levels in patients, there are a number of
more precisely defined measurements you could take on a given "thing.")
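
If it helps, here is a minimal sketch of "looking at the pictures" in R for
a dichotomous outcome and categorical predictors. The data and all variable
names below are invented purely for illustration, not taken from your thesis:

## simulated stand-in data (all names here are made up)
set.seed(1)
n <- 200
exposure  <- factor(sample(c("low", "high"), n, replace = TRUE))
covariate <- factor(sample(c("A", "B", "C"), n, replace = TRUE))
p <- plogis(-1 + 1.0 * (exposure == "high") + 0.5 * (covariate == "C"))
outcome <- factor(rbinom(n, 1, p), levels = 0:1, labels = c("no", "yes"))

## cross-tabulations and pictures: spine and mosaic plots show how the
## proportion of "yes" outcomes shifts across levels of each variable
table(exposure, outcome)
spineplot(outcome ~ exposure)
mosaicplot(table(covariate, exposure, outcome),
           main = "covariate x exposure x outcome")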

If your concern is "I ran test A and it said B, but test C said D, and D
seems inconsistent with B", it generally helps to look at the assumptions and
the detailed equations behind each model and work out what they mean for your
data. With continuous variables at least, non-monotonic relationships can
easily destroy a correlation even when the causal link is strong. (Your
percentage-change formula, quoted below, is worked through in a short sketch
at the bottom of this reply.)
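
For the likelihood ratio question specifically, here is a minimal sketch
using the infert data set that ships with R; the choice of spontaneous as
the exposure and of induced and age as adjustment variables is mine, just
to have something concrete to fit:

## "crude" model with one exposure, and an "adjusted" model with extra terms
crude    <- glm(case ~ spontaneous,
                data = infert, family = binomial)
adjusted <- glm(case ~ spontaneous + induced + age,
                data = infert, family = binomial)

## Wald tests for the individual coefficients in the larger model
summary(adjusted)$coefficients

## likelihood ratio test comparing the nested models: the drop in deviance
## is chi-squared with df equal to the number of added parameters, and a
## small p-value means the extra terms improve the fit beyond chance
anova(crude, adjusted, test = "Chisq")

Note which pair of models your test compares: if you are testing the reduced
model against the full one, failing to reject usually argues for keeping the
reduced model, so it is not in itself a contradiction of significant Wald
p-values inside that reduced model.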


> using the formula "(crude beta-coefficient - adjusted beta-coefficient) / crude
> beta-coefficient x 100"; as per the rule, if the percentage for any variable is >10%
> then I have considered it a confounder. I wanted to know: from the
> initial model I removed one variable with an insignificant p-value to
> form the adjusted model. Now how will I know whether the variable that I removed
> from the initial model was a confounder or not?
> Secondly, I wanted to know, if the percentage comes out negative, like
> (-17.84%), then will it be considered a confounder or not? I also wanted to
> know whether confounders should be removed from the model or kept in the
> model?
> Lastly, I wanted to know: I am running a likelihood ratio test to identify
> whether the value falls in the critical region or not. So if the value does not
> fall in the critical region, then what does that show? What should I do in this
> case? In my final reduced model all p-values are significant, but still the
> value identified via the likelihood ratio test is not falling in the critical
> region. So what does that show?
>
>
> --
> View this message in context: http://r.789695.n4.nabble.com/Logistic-Regression-tp3578962p3578962.html
> Sent from the R help mailing list archive at Nabble.com.
>
> ______________________________________________
> R-help at r-project.org mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
> and provide commented, minimal, self-contained, reproducible code.
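
Finally, the short sketch promised above for the percentage-change formula
you quote, again with the built-in infert data and my own choice of
variables; the 10% cut-off and the use of the absolute value are conventions
from the epidemiology literature, not something the code decides for you:

## crude and adjusted coefficients for the exposure of interest
crude    <- glm(case ~ spontaneous,                 data = infert, family = binomial)
adjusted <- glm(case ~ spontaneous + induced + age, data = infert, family = binomial)

b_crude <- coef(crude)["spontaneous"]
b_adj   <- coef(adjusted)["spontaneous"]

## (crude - adjusted) / crude * 100, as in your formula; a negative value
## just means the adjusted estimate moved in the other direction, so the
## usual rule of thumb compares abs(pct_change) against the 10% cut-off
pct_change <- (b_crude - b_adj) / b_crude * 100
pct_change
abs(pct_change) > 10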

