[R] reading daily snow depth data
Alemu Tadesse
alemu.tadesse at gmail.com
Tue Jun 16 22:47:26 CEST 2015
Thank you, Jim and Bob. This is a really big help for me. Jim, this is the
second time you have helped me out.
Best
Alemu
On Tue, Jun 16, 2015 at 1:50 PM, boB Rudis <bob at rudis.net> wrote:
> This looks similar to snow data I used last year:
> https://github.com/hrbrmstr/snowfirst/blob/master/R/snowfirst.R
>
> All the data worked pretty well.
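>
> If you need more than the one month, the file names look like they follow
> an MM-YYYY-dlysndpth.txt pattern, so a small helper along these lines
> (untested; the naming pattern is my assumption from the one URL above)
> could fetch any month:
>
> fetch_snow <- function(month, year) {
>   # build the URL from the apparent MM-YYYY naming pattern (an assumption)
>   u <- sprintf("http://www1.ncdc.noaa.gov/pub/data/snowmonitoring/fema/%02d-%d-dlysndpth.txt",
>                month, year)
>   readLines(u)
> }
> # e.g. x <- fetch_snow(6, 2015)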
>
> On Tue, Jun 16, 2015 at 3:21 PM, jim holtman <jholtman at gmail.com> wrote:
> > Here is an example of reading in the data. After that it is a data frame
> > and you should be able to process it with dplyr/data.table without much
> > trouble (see the reshaping and per-state sketches after the output below):
> >
> >> x <- readLines("http://www1.ncdc.noaa.gov/pub/data/snowmonitoring/fema/06-2015-dlysndpth.txt")
> >> writeLines(x, '/temp/snow.txt')  # save a local copy for testing
> >> head(x)
> > [1] ""
> > [2] "State: AL"
> > [3] "   Lat     Lon   COOP#  StnID  State  City/Station Name               County               Elev  Jun 1  Jun 2  Jun 3  Jun 4  Jun 5  Jun 6  Jun 7  Jun 8  Jun 9  Jun10  Jun11  Jun12  Jun13  Jun14  Jun15  Jun16"
> > [4] " 33.59  -85.86  010272         AL  ANNISTON ARPT ASOS              CALHOUN               594  0.000  0.000  0.000  0.000  0.000  0.000  0.000  0.000  0.000  0.000  0.000  0.000  0.000  0.000  0.000  -9999.000"
> > [5] " 33.83  -85.78  014209         AL  JACKSONVILLE                    CALHOUN               608  -9999.000  -9999.000  -9999.000  -9999.000  -9999.000  0.000  0.000  -9999.000  -9999.000  -9999.000  -9999.000  -9999.000  -9999.000  -9999.000  -9999.000  -9999.000"
> > [6] " 34.74  -87.60  015749         AL  MUSCLE SHOALS AP                COLBERT               540  0.000  0.000  0.000  0.000  0.000  0.000  0.000  0.000  0.000  0.000  0.000  0.000  0.000  0.000  0.000  -9999.000"
> >> z <- grepl("(^$)|(^State)|(^ Lat)", x) # get lines to discard
> >> xm <- x[!z] # remove info lines
> >> head(xm)
> > [1] " 33.59 -85.86 010272 AL ANNISTON ARPT ASOS
> > CALHOUN 594 0.000 0.000 0.000
> > 0.000 0.000 0.000 0.000 0.000 0.000
> > 0.000 0.000 0.000 0.000 0.000 0.000 -9999.000"
> > [2] " 33.83 -85.78 014209 AL JACKSONVILLE
> > CALHOUN 608 -9999.000 -9999.000 -9999.000
> > -9999.000 -9999.000 0.000 0.000 -9999.000 -9999.000
> > -9999.000 -9999.000 -9999.000 -9999.000 -9999.000 -9999.000
> -9999.000"
> > [3] " 34.74 -87.60 015749 AL MUSCLE SHOALS AP
> > COLBERT 540 0.000 0.000 0.000
> > 0.000 0.000 0.000 0.000 0.000 0.000
> > 0.000 0.000 0.000 0.000 0.000 0.000 -9999.000"
> > [4] " 31.32 -85.45 012372 AL DOTHAN FAA AIRPORT
> > DALE 374 0.000 0.000 0.000
> > 0.000 0.000 0.000 0.000 0.000 0.000
> > 0.000 0.000 0.000 0.000 0.000 0.000 -9999.000"
> > [5] " 32.70 -87.58 013511 AL GREENSBORO
> > HALE 220 0.000 0.000 0.000
> > 0.000 0.000 0.000 0.000 0.000 0.000
> > 0.000 0.000 0.000 0.000 0.000 0.000 -9999.000"
> > [6] " 33.57 -86.74 010831 AL BIRMINGHAM AP ASOS
> > JEFFERSON 615 0.000 0.000 0.000
> > 0.000 0.000 0.000 0.000 0.000 0.000
> > 0.000 0.000 0.000 0.000 0.000 0.000 -9999.000"
> >>
> >> # read in the data
> >> xf <- textConnection(xm)
> >> snow <- read.fwf(xf
> > + , width = c(6,8,7,10,3,32,26,6,rep(11,16))
> > + , comment.char = ''
> > + , as.is = TRUE
> > + )
> >> str(snow)
> > 'data.frame': 3067 obs. of 24 variables:
> > $ V1 : num 33.6 33.8 34.7 31.3 32.7 ...
> > $ V2 : num -85.9 -85.8 -87.6 -85.5 -87.6 ...
> > $ V3 : int 10272 14209 15749 12372 13511 10831 11225 14064 12245 15478 ...
> > $ V4 : chr "          " "          " "          " "          " ...
> > $ V5 : chr "AL " "AL " "AL " "AL " ...
> > $ V6 : chr "ANNISTON ARPT ASOS              " "JACKSONVILLE                    " "MUSCLE SHOALS AP                " "DOTHAN FAA AIRPORT              " ...
> > $ V7 : chr "CALHOUN                   " "CALHOUN                   " "COLBERT                   " "DALE                      " ...
> > $ V8 : int 594 608 540 374 220 615 461 624 100 215 ...
> > $ V9 : num 0 -9999 0 0 0 ...
> > $ V10: num 0 -9999 0 0 0 ...
> > $ V11: num 0 -9999 0 0 0 ...
> > $ V12: num 0 -9999 0 0 0 ...
> > $ V13: num 0 -9999 0 0 0 ...
> > $ V14: num 0 0 0 0 0 ...
> > $ V15: num 0 0 0 0 0 ...
> > $ V16: num 0 -9999 0 0 0 ...
> > $ V17: num 0 -9999 0 0 0 ...
> > $ V18: num 0 -9999 0 0 0 ...
> > $ V19: num 0 -9999 0 0 0 ...
> > $ V20: num 0 -9999 0 0 0 ...
> > $ V21: num 0 -9999 0 0 0 ...
> > $ V22: num 0 -9999 0 0 0 ...
> > $ V23: num 0 -9999 0 0 0 ...
> > $ V24: num -9999 -9999 -9999 -9999 -9999 ...
> >> table(snow$V5)  # tally up the states
> >
> >  AK  AL  AR  AZ  CA  CO  CT  DE  FL  GA  HI  IA  ID  IL  IN  KS  KY  LA  MA  MD  ME  MI  MN  MO  MS  MT
> >  72  18  65  55  99 128  10   1  30  33   6 112  57 103  85  90  49  29  35  14  40  86  90 124  27 113
> >  NC  ND  NE  NH  NJ  NM  NV  NY  OH  OK  OR  PA  RI  SC  SD  TN  TX  UT  VA  VT  WA  WI  WV  WY
> >  45  19 136  22  13  53  65  76  31 106  51  84   2  30  79  64 185  68  70  18  56 103  36  84
> >>
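> >
> > As a follow-on, one way to recode the -9999 missing-value flags as NA and
> > reshape the 16 day columns to long form with data.table might be (a sketch,
> > not run against the full file; the column names are ones I made up):
> >
> > library(data.table)
> > snowdt <- as.data.table(snow)
> > # name the 8 station columns and the 16 day columns (June 2015, as above)
> > setnames(snowdt, c("lat", "lon", "coop", "stnid", "state", "station",
> >                    "county", "elev", paste0("jun", 1:16)))
> > # melt the daily columns into a single depth column
> > long <- melt(snowdt, id.vars = 1:8,
> >              variable.name = "day", value.name = "depth")
> > long[depth == -9999, depth := NA_real_]  # -9999 is the missing-data code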
> >
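> > And for the "separate a given state's data and save it to file" part of
> > the question below, a minimal sketch (untested; /temp is just the directory
> > used for the test file above) would be:
> >
> > # write one CSV per state; trimws() drops the fixed-width padding in V5
> > for (st in unique(trimws(snow$V5))) {
> >   write.csv(snow[trimws(snow$V5) == st, ],
> >             file.path("/temp", paste0("snow_", st, ".csv")),
> >             row.names = FALSE)
> > }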
> >
> > Jim Holtman
> > Data Munger Guru
> >
> > What is the problem that you are trying to solve?
> > Tell me what you want to do, not how you want to do it.
> >
> > On Tue, Jun 16, 2015 at 11:38 AM, Alemu Tadesse <alemu.tadesse at gmail.com> wrote:
> >
> >> Dear All,
> >>
> >> I was trying to read the daily snow data for each state and station/city
> >> from the following link. I was not able to separate a given state's data
> >> from the rest of the file's contents, read the data into a data frame,
> >> and save it to a file.
> >>
> >>
> >>
> >> http://www1.ncdc.noaa.gov/pub/data/snowmonitoring/fema/06-2015-dlysndpth.txt
> >>
> >> I really appreciate your time and help, and would also appreciate any
> >> information about an alternative data source.
> >>
> >> Best,
> >>
> >> Alemu