[R] Extracting elements from a nested list
Gregory Ryslik
rsaber at comcast.net
Mon Oct 18 20:40:09 CEST 2010
Hi Everyone,
This is closer to what I need, but it returns a matrix in which each element is a factor. Instead, I want a list of lists: the first entry of the list should equal the first column of the matrix that mapply makes, the second entry the second column, and so on.
I've attached the two files that have all.predicted.values and max.growth from dput to make for easy testing. Thanks again!
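By default mapply() simplifies its result into a matrix; passing SIMPLIFY = FALSE keeps the result as a list. A minimal sketch with small stand-in lists (the real attached data is not reproduced here):

```r
## Small stand-in lists (hypothetical; the attached data is larger)
all.predicted.values <- list(
  list(factor(c(0, 0)), factor(c(1, 1)), factor(c(2, 2))),
  list(factor(c(2, 2)), factor(c(0, 1)))
)
max.growth <- list(3, 2)

## SIMPLIFY = FALSE stops mapply() from collapsing the result into a
## matrix, so element i is all.predicted.values[[i]][[max.growth[[i]]]]
predicted.values <- mapply(function(x1, x2) x1[[x2]],
                           all.predicted.values, max.growth,
                           SIMPLIFY = FALSE)
```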
Kind regards,
Greg
On Oct 18, 2010, at 1:33 PM, Erich Neuwirth wrote:
> You probably need mapply, since you have two lists of arguments that you want to use "in sync".
>
> mapply(function(x1, x2) x1[[x2]], all.predicted.values, max.growth)
>
> might be what you want.
>
>
>
> On Oct 18, 2010, at 5:17 PM, Gregory Ryslik wrote:
>
>> Unfortunately, that gives me NULL everywhere. Here's the data I have for all.predicted.values and max.growth; perhaps this will help. Thus I want all.predicted.values[[1]][[4]], then all.predicted.values[[2]][[3]], and then all.predicted.values[[3]][[4]].
>>
>> I've attached what your statement outputs at the end.
>>
>> Thanks again!
>>
>> Browse[2]> max.growth
>> [[1]]
>> [1] 4
>>
>> [[2]]
>> [1] 3
>>
>> [[3]]
>> [1] 4
>>
>> Browse[2]> all.predicted.values
>> [[1]]
>> [[1]][[1]]
>> [1] 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
>> [55] 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
>> Levels: 0 1 2
>>
>> [[1]][[2]]
>> [1] 2 2 2 0 2 0 2 2 2 2 2 2 2 2 0 2 2 2 2 2 2 2 2 2 2 0 0 2 2 2 2 0 0 0 2 2 0 0 2 2 0 2 2 2 2 2 0 2 2 2 0 2 2 0
>> [55] 0 0 2 0 2 0 0 0 0 2 2 2 2 0 2 2 2 0 2 2 0 0 2 2 2 2 2 2 2 0 0 0 2 0 2 2 2 2 0 2 2 2 0 2 0 0
>> Levels: 0 1 2
>>
>> [[1]][[3]]
>> [1] 0 0 0 0 2 0 0 0 0 0 0 0 0 2 0 0 0 0 0 2 0 2 2 2 0 0 0 2 0 0 2 0 0 0 0 0 0 0 2 0 0 0 0 0 2 2 0 0 0 2 0 0 0 0
>> [55] 0 0 2 0 2 0 0 0 0 2 2 0 0 0 0 0 2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 2 0 0 0 0 0 0 2 0 0
>> Levels: 0 1 2
>>
>> [[1]][[4]]
>> [1] 0 0 0 0 2 0 0 0 0 0 0 0 0 2 0 0 0 0 0 2 0 2 2 2 0 0 0 2 0 0 2 0 0 0 0 0 0 0 2 0 0 0 0 0 2 2 0 0 0 2 0 0 0 0
>> [55] 0 0 2 0 2 0 0 0 0 2 2 0 0 0 0 0 2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 2 2 0 0 0 0 0 0 2 0 0
>> Levels: 0 1 2
>>
>>
>> [[2]]
>> [[2]][[1]]
>> [1] 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2
>> [55] 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2
>> Levels: 0 1 2
>>
>> [[2]][[2]]
>> [1] 2 2 2 2 1 2 2 2 2 2 1 2 2 2 2 1 2 1 2 2 2 2 2 2 2 2 2 2 1 2 2 2 2 2 1 2 2 2 1 2 2 1 1 2 2 2 2 2 2 2 2 1 2 2
>> [55] 2 2 2 2 1 2 2 2 2 1 2 2 1 1 1 2 2 2 1 2 1 2 1 2 1 2 2 2 1 1 2 2 1 2 2 1 1 2 1 1 1 2 2 1 2 2
>> Levels: 0 1 2
>>
>> [[2]][[3]]
>> [1] 2 2 2 0 1 2 2 2 2 2 1 2 2 2 0 1 2 1 2 2 2 2 2 2 2 0 0 2 1 2 2 2 0 0 1 2 0 0 1 2 0 1 1 2 2 2 0 2 2 2 0 1 2 2
>> [55] 0 2 2 2 1 0 0 0 0 1 2 2 1 1 1 2 2 0 1 2 1 0 1 2 1 2 2 2 1 1 2 2 1 2 2 1 1 2 1 1 1 2 2 1 0 2
>> Levels: 0 1 2
>>
>>
>> [[3]]
>> [[3]][[1]]
>> [1] 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
>> [55] 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
>> Levels: 0 1 2
>>
>> [[3]][[2]]
>> [1] 2 2 2 0 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 0 0 2 2 2 2 2 0 0 2 2 2 0 2 2 0 2 2 2 2 2 0 2 2 2 0 2 2 2
>> [55] 0 2 2 2 2 2 0 0 2 2 2 2 2 2 2 2 2 0 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2
>> Levels: 0 1 2
>>
>> [[3]][[3]]
>> [1] 0 0 0 0 1 0 0 0 0 0 1 0 0 0 0 1 1 1 1 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 1 0 0 0 1 0 0 1 1 0 0 0 0 0 0 0 0 1 0 0
>> [55] 0 0 0 0 1 0 0 0 0 1 0 0 1 1 1 0 0 0 1 0 1 1 1 0 1 0 0 0 1 1 0 0 1 0 0 1 1 0 1 1 1 0 0 1 1 0
>> Levels: 0 1 2
>>
>> [[3]][[4]]
>> [1] 2 2 2 0 1 0 2 2 0 2 1 2 2 0 0 1 1 1 1 0 2 0 0 0 2 0 0 0 1 2 0 0 0 0 1 2 0 0 1 2 0 1 1 2 0 0 0 2 2 0 0 1 2 0
>> [55] 0 0 0 0 1 0 0 0 0 1 0 2 1 1 1 2 0 0 1 2 1 1 1 2 1 2 2 2 1 1 0 0 1 0 2 1 1 2 1 1 1 2 0 1 1 0
>> Levels: 0 1 2
>>
>>
>> Browse[2]> predicted.values.for.max.growth<-diag(sapply(all.predicted.values,'[[','max.growth'))
>> Browse[2]> predicted.values.for.max.growth
>> [[1]]
>> NULL
>>
>> [[2]]
>> [1] 0
>>
>> [[3]]
>> [1] 0
>>
>> [[4]]
>> [1] 0
>>
>> [[5]]
>> NULL
>>
>> [[6]]
>> [1] 0
>>
>> [[7]]
>> [1] 0
>>
>> [[8]]
>> [1] 0
>>
>> [[9]]
>> NULL
>>
>>
>>
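The NULLs above are what `[[` does when given a character index on an unnamed list: 'max.growth' is looked up as an element *name*, and since these sublists have no names, every lookup returns NULL. A minimal sketch:

```r
## An unnamed list, like each sublist of all.predicted.values above
x <- list(factor(0), factor(1))

## A character index looks up an element by name; nothing here is
## named "max.growth", so list extraction returns NULL.
by.name <- x[["max.growth"]]

## A numeric index extracts by position, as intended.
by.position <- x[[2]]
```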
>> On Oct 18, 2010, at 11:08 AM, Henrique Dallazuanna wrote:
>>
>>> Try this:
>>>
>>> diag(sapply(all.predicted.values, '[[', 'max.growth'))
>>>
>>>
>>> On Mon, Oct 18, 2010 at 12:59 PM, Gregory Ryslik <rsaber at comcast.net> wrote:
>>> Hi,
>>>
>>> I have a list of n items and the ith element has m_i elements within it.
>>>
>>> I want to do something like:
>>>
>>> predicted.values <- lapply(all.predicted.values, '[[', max.growth[[i]])
>>>
>>> where max.growth[[i]] is the index of the element I want to extract from all.predicted.values[[i]]. Thus, for example, I want to extract element max.growth[[1]] from all.predicted.values[[1]] (which is itself a list), and then element max.growth[[2]] from all.predicted.values[[2]].
>>>
>>> I realize I can do this with a for loop, but if I can do it in one line, that would be preferable.
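For reference, the for-loop version and a one-line equivalent, sketched on small hypothetical lists:

```r
## Hypothetical small inputs for illustration
all.predicted.values <- list(list("a", "b"), list("c", "d", "e"))
max.growth <- list(2, 3)

## The explicit for loop
out <- vector("list", length(all.predicted.values))
for (i in seq_along(all.predicted.values)) {
  out[[i]] <- all.predicted.values[[i]][[max.growth[[i]]]]
}

## The same extraction in one line; Map() never simplifies its result
out2 <- Map(function(x, k) x[[k]], all.predicted.values, max.growth)
```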
>>>
>>> Thanks!
>>>
>>> Greg
>>>
>>> ______________________________________________
>>> R-help at r-project.org mailing list
>>> https://stat.ethz.ch/mailman/listinfo/r-help
>>> PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
>>> and provide commented, minimal, self-contained, reproducible code.
>>>
>>>
>>>
>>> --
>>> Henrique Dallazuanna
>>> Curitiba-Paraná-Brasil
>>> 25° 25' 40" S 49° 16' 22" W
>>
>>
>>
>
> --
> Erich Neuwirth
> Didactic Center for Computer Science and Institute for Scientific Computing
> University of Vienna
>
>
>
>
More information about the R-help mailing list