[R-sig-DB] R and PostgreSQL - Writing data?

Sean Davis sdavis2 at mail.nih.gov
Fri Sep 28 17:43:22 CEST 2012


On Fri, Sep 28, 2012 at 10:36 AM, James David Smith
<james.david.smith using gmail.com> wrote:
> Hi Sean,
>
> Thanks for the reply. I'm familiar with UPDATE queries when working in
> PostgreSQL, but not from within R. Would it look something like this?
>
> dbWriteTable(con, " UPDATE table SET ucam_no2 = 'ucam_no2$interpolated_data' ")
>
> My problem is how to get the R data 'within' my SQL statement I think.

To do an update, you'll need to loop through your data.frame and issue
a dbSendQuery().  To create the SQL string, I often use something
like:

sprintf("UPDATE originalTable SET ucam_no2=%f WHERE originalTable.id = %d",....)

You can't do this in one step, unfortunately.  This is how UPDATE
works and has nothing to do with R.
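For concreteness, the loop could look something like the sketch below. It assumes `con` is the open connection from earlier in the thread, and that the table has an integer key column `id` to match rows on; `originalTable` and `id` are placeholder names, not part of the original schema.

```r
library(RPostgreSQL)  # provides dbSendQuery() through the DBI interface

# One UPDATE per row of the data.frame, matching on the key column.
for (i in seq_len(nrow(ucam_no2))) {
  sql <- sprintf("UPDATE originalTable SET ucam_no2 = %f WHERE originalTable.id = %d",
                 ucam_no2$interpolated_data[i],
                 ucam_no2$id[i])
  dbSendQuery(con, sql)
}
```

With ~10000 rows it is usually worth wrapping the loop in a transaction (dbSendQuery(con, "BEGIN") before the loop and dbSendQuery(con, "COMMIT") after), so PostgreSQL commits once instead of once per row.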

Sean


>
> On 28 September 2012 15:19, Sean Davis <sdavis2 using mail.nih.gov> wrote:
>> On Fri, Sep 28, 2012 at 10:14 AM, James David Smith
>> <james.david.smith using gmail.com> wrote:
>>> Dear all,
>>>
>>> Sorry if this isn't quite the right place, but it's the first time
>>> I've posted here. My issue is to do with writing to a PostgreSQL
>>> database from within R. My situation is best explained by some R code
>>> to start:
>>>
>>> #Connect to the database
>>> con <- dbConnect(PostgreSQL(), user="postgres", password="password",
>>> dbname="database")
>>>
>>> #Get some data out of the database.
>>> ucam_no2$original_data <- dbGetQuery(con, "select ucam_no2 FROM table")
>>>
>>> This returns say 10000 rows of data, but there is only data in about
>>> half of those rows. What I want to do is interpolate the missing data
>>> so I do this:
>>>
>>> #Generate some data
>>> ucam_no2$interpolated_data <- na.approx(ucam_no2$original_data, na.rm = FALSE)
>>>
>>> This works well and I now have 10000 rows of data with no empty cells.
>>> I now want to write this back into my PostgreSQL database. Into the
>>> same row that I took the data from in the first place. But I don't
>>> know how. I can write to a new table with something like the below,
>>> but what I'd really like to do is put the data back into the table I
>>> got it from.
>>>
>>> # Try to write the data back
>>> dbWriteTable(con, "new_data", ucam_no2$interpolated_data)
>>
>> Hi, James.
>>
>> You'll need to look into doing a SQL UPDATE.  That is the standard way
>> to "put data back into the table I got it from".
>>
>> Sean




More information about the R-sig-DB mailing list