[R] p-value for the fitted parameters in linear models

Li SUN vraifreud.test at gmail.com
Sun Jun 24 21:04:43 CEST 2012


2012/6/24 Uwe Ligges <ligges at statistik.tu-dortmund.de>:
>
>
> On 24.06.2012 20:35, Li SUN wrote:
>>
>> Thanks David and Brian.
>>
>> But what if x is exact while y has some uncertainty Δy, in the
>> relation y = k * x + b?
>>
>> Now I need to fit some data like
>> x       = 1,          2,          3,          4,          5
>> y±Δy = 1.1±0.1, 2.0±0.2, 3.1±0.2, 4.1±0.1, 5.0±0.2
>>
>> Is there any mechanism to pass x, y and Δy to lm() so that I can find
>> k, b as well as their uncertainties Δk, Δb?
>
>
> Again, no: this is not a linear model. The assumption in a linear model is
> that the errors are identically distributed.

Thanks, Uwe. Can the nonlinear model nls() handle this situation?
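
If the Δy above can be treated as known standard deviations (with x exact),
one approach that is sometimes used is weighted least squares; both lm() and
nls() accept a weights argument. A minimal sketch with the data from this
thread, assuming weights proportional to 1/Δy^2 (note that lm() treats these
as relative weights and still estimates an overall residual scale, so the
reported standard errors are not simply the supplied Δy propagated through):

## data from this thread: x exact, y with stated uncertainties dy
x  <- c(1, 2, 3, 4, 5)
y  <- c(1.1, 2.0, 3.1, 4.1, 5.0)
dy <- c(0.1, 0.2, 0.2, 0.1, 0.2)

## weighted least squares: weights proportional to 1/dy^2
fit <- lm(y ~ x, weights = 1 / dy^2)
summary(fit)$coefficients    # estimates and standard errors for b (intercept) and k (slope)

## the same weights can be passed to nls() for a nonlinear formulation
fit2 <- nls(y ~ b + k * x, start = list(b = 0, k = 1), weights = 1 / dy^2)
summary(fit2)$coefficients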


>
> Uwe Ligges
>
>
>
>
>>
>>
>> Li Sun
>>
>>
>> 2012/6/24 Prof Brian Ripley <ripley at stats.ox.ac.uk>:
>>>
>>> On 24/06/2012 18:39, David Winsemius wrote:
>>>>
>>>>
>>>>
>>>> On Jun 24, 2012, at 1:21 PM, Li SUN wrote:
>>>>
>>>>> Sorry for the confusion.
>>>>>
>>>>> Let me state the question again. I missed something in my original
>>>>> statement.
>>>>>
>>>>> I am using the linear model lm() to fit data of the form y = k * x +
>>>>> b, where k and b are the coefficients to be found, and x is the
>>>>> variable, with an associated vector of error bars (uncertainties) Δx
>>>>> of the same length. Is it possible to pass Δx to lm(), and to find
>>>>> the uncertainties Δk for k and Δb for b from the output as well?
>>>>
>>>>
>>>>
>>>> In one sense this could be done if you interpreted "Δx" as the vector
>>>> of individual residuals of a model, but I'm guessing that is not what
>>>> you meant. Assuming you knew the X values, you could recover the
>>>> original data by calculating the Y values as the sum of the predictions
>>>> and the residuals. But I'm guessing you want to supply a small number of
>>>> parameters from an analysis you are reading about, and you are hoping to
>>>> get further information from lm() to answer some question. That's not
>>>> the direction of the flow of information. The flow is data INTO lm(),
>>>> estimates of parameters OUT.
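
(A small illustration of the fitted-plus-residuals point above, as a sketch
only, for any lm fit with response vector y:)

fit <- lm(y ~ x)                                  # some fitted linear model
all.equal(unname(fitted(fit) + resid(fit)), y)    # TRUE: predictions plus residuals reproduce y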
>>>>
>>>> Show us a sample dataset constructed with R code or show us the console
>>>> output of dput() applied to your dataset, and you may get better answers
>>>> to what is still an unclear question.
>>>>
>>>
>>> This is not linear regression if 'x' is not known exactly.  There are
>>> various formulations of the problem, but that is off-topic here. However,
>>> consulting
>>>
>>> @Book{Fuller.87,
>>>  author       = "Fuller, Wayne A.",
>>>  title        = "Measurement Error Models",
>>>  publisher    = "John Wiley and Sons",
>>>  address      = "New York",
>>>  year         = "1987",
>>>  ISBN         = "0-471-86187-1",
>>> }
>>>
>>> would be a good start.
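
One of the simpler errors-in-variables formulations treated in that
literature is Deming regression, which assumes the ratio of the
measurement-error variances in y and x is known. Below is a minimal base-R
sketch of the closed-form estimates; deming_fit() is just an illustrative
helper (not an existing function), and delta is the assumed, user-supplied
variance ratio:

## Deming regression: slope and intercept when both x and y carry measurement error.
## delta = var(measurement error in y) / var(measurement error in x), assumed known;
## delta = 1 corresponds to orthogonal regression.
deming_fit <- function(x, y, delta = 1) {
  sxx <- var(x); syy <- var(y); sxy <- cov(x, y)
  slope <- (syy - delta * sxx +
            sqrt((syy - delta * sxx)^2 + 4 * delta * sxy^2)) / (2 * sxy)
  c(intercept = mean(y) - slope * mean(x), slope = slope)
}

## example with the data from this thread, pretending delta is known to be 1
deming_fit(c(1, 2, 3, 4, 5), c(1.1, 2.0, 3.1, 4.1, 5.0), delta = 1)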
>>>
>>> --
>>> Brian D. Ripley,                  ripley at stats.ox.ac.uk
>>> Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
>>> University of Oxford,             Tel:  +44 1865 272861 (self)
>>> 1 South Parks Road,                     +44 1865 272866 (PA)
>>> Oxford OX1 3TG, UK                Fax:  +44 1865 272595
>>>
>>>
>>> ______________________________________________
>>> R-help at r-project.org mailing list
>>> https://stat.ethz.ch/mailman/listinfo/r-help
>>> PLEASE do read the posting guide
>>> http://www.R-project.org/posting-guide.html
>>> and provide commented, minimal, self-contained, reproducible code.
>>
>>
>>
>


