[R-SIG-Mac] ODBC driver with postgresql

Tim Coote tim+r-project.org at coote.org
Wed Dec 2 23:49:23 CET 2009


For some reason I've got a problem with double precision numbers in
Postgres coming across into R truncated to whole numbers (the fractional
part is simply lost).

I'm using:
R 2.10.0 GUI 1.30 Leopard build 32-bit (5511)
R-RODBC version 1.3
I've tried both the packaged PostgreSQL ODBC driver for the Mac and one
rebuilt from source, with the same results (psqlodbc-08.04.0100),
against Postgres 8.4 on my Mac and against 8.3 on Fedora Linux.

The ODBC driver works as expected with Python + pyodbc, and with R 2.10
and RODBC on Fedora (R-RODBC-1.2-2.fc10.i386, R-2.10.0-2.fc10.i386).

Here are some noddy results (I created a table called x with one double
precision column; a sketch of the setup follows). I'm ignoring the
warning in the transcript, which seems to be a packaging issue in fc10.
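
Roughly how the table was set up; this is a sketch rather than a
verbatim transcript, and the DSN name and credentials are placeholders:

  library(RODBC)

  # Placeholder DSN and credentials; substitute real ones.
  c <- odbcConnect("PostgreSQL", uid = "yy", pwd = "xxx")

  # One double precision column and two test rows (the same values that
  # show up in the query results below).
  sqlQuery(c, "create table x (x double precision)")
  sqlQuery(c, "insert into x values (1.234), (43.989)")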

From R on Linux:
 > library (RODBC)
Warning message:
package 'RODBC' was built under R version 2.8.1 and help may not work correctly
 > c=odbcConnect ("PostgreSQL",  uid="yy", pwd="xxx")
 > sqlQuery (c, "select * from x")
        x
1  1.234
2 43.989
 > sqlColumns(c, "x")
   TABLE_QUALIFIER TABLE_OWNER TABLE_NAME COLUMN_NAME DATA_TYPE TYPE_NAME
1           cases      public          x           x         6    float8
   PRECISION LENGTH SCALE RADIX NULLABLE REMARKS COLUMN_DEF SQL_DATA_TYPE
1        15      8     0    10        1               <NA>             6
   SQL_DATETIME_SUB CHAR_OCTET_LENGTH ORDINAL_POSITION IS_NULLABLE DISPLAY_SIZE
1               NA                NA                1        <NA>           22
   FIELD_TYPE AUTO_INCREMENT PHYSICAL NUMBER TABLE OID
1        701              0               1     16517


However, on the Mac:
 > c=odbcConnect ("Hg", uid="yy", pwd="xxx")
 > sqlQuery(c, "select * from x")
    x
1  1
2 43
 > sqlColumns (c, "x")
   TABLE_QUALIFIER TABLE_OWNER TABLE_NAME COLUMN_NAME DATA_TYPE TYPE_NAME
1           cases      public          x           x         6    float8
   PRECISION LENGTH SCALE RADIX NULLABLE REMARKS COLUMN_DEF SQL_DATA_TYPE
1        15      8     0    10        1               <NA>             6
   SQL_DATETIME_SUB CHAR_OCTET_LENGTH ORDINAL_POSITION IS_NULLABLE DISPLAY_SIZE
1               NA                NA                1        <NA>           22
   FIELD_TYPE AUTO_INCREMENT PHYSICAL NUMBER TABLE OID BASE TYPEID
1        701              0               1     16517           0

Uid and pwd obfuscated. As far as I can see, the driver on the Mac is
correctly identifying the type of the column 'x' as float8, but the
value is then being truncated at the decimal point. I have seen
somewhere that Oracle drivers used to suffer from a similar issue,
truncating type 6 (SQL_FLOAT) columns.
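
One check I still want to run, to see whether the fraction is being lost
in the driver itself or in RODBC's numeric conversion, is something
along these lines (a sketch, reusing the connection c from above):

  # Fetch the column as character: if the full value comes back, the loss
  # happens in RODBC's conversion to numeric rather than in the driver.
  sqlQuery(c, "select * from x", as.is = TRUE)

  # Cast to text on the server side as a cross-check.
  sqlQuery(c, "select x::text from x")

  # See how R is storing the result (integer vs a truncated numeric).
  str(sqlQuery(c, "select * from x"))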

Python looks OK on the Mac:
 >>> import pyodbc
 >>> c=pyodbc.connect (dsn="Hg")
 >>> cu=c.cursor().execute ("select * from x")
 >>> print cu.fetchall()
[(1.234, ), (43.988999999999997, )]

I cannot find a pyodbc equivalent to sqlColumns. As far as I can see, R
and Python on my Mac are using a common ODBC stack, and R seems to work
OK with the Fedora ODBC drivers. Is it likely that I've still got bugs
in my ODBC set-up, or is there an issue with R + ODBC? I'm not sure
where else to look for what's going wrong.
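
In case it helps to narrow that down, comparing what RODBC reports about
the driver on the two machines seems like a reasonable next step;
something like (again a sketch against the open connection c):

  # Returns the DBMS name/version, driver name/version and ODBC version
  # for the open connection; run on both the Mac and the Fedora box and
  # compare.
  odbcGetInfo(c)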

Tim


