This repository has been archived by the owner on Jan 13, 2022. It is now read-only.
I am writing data that I am retrieving from a remote PostgreSQL server into MonetDBLite. I got the error message: Error: cannot allocate vector of size 712.8 Mb. I thought that as long as the data is smaller than my disk, MonetDBLite would be able to handle it. Is there a limit on the size of a single table that can fit into MonetDBLite?
Some more details, please. What script are you using? Can you provide a reproducible example? And yes, as long as the data is smaller than your disk, it should be fine. Did you monitor disk use while running your export?
My script is below; I am sorry, but I cannot post my username and password here.
library(tidyverse)
library(zoo)
library(data.table)
library(lubridate)

### WRDS CONNECTION ###
library(RPostgres)
wrds <- dbConnect(Postgres(),
                  host = 'wrds-pgdata.wharton.upenn.edu',
                  port = 9737,
                  user = 'user',
                  password = 'password',
                  dbname = 'wrds',
                  sslmode = 'require')

res <- dbSendQuery(wrds, "select DATE, PERMNO, PERMCO, SHROUT, PRC, RET, VOL
                          from CRSP.DSF")
                          # where PRC is not null")

crsp.dsf <- dbFetch(res, n = -1) # too large to fit in memory

library(DBI)
library(MonetDBLite)
dbdir <- tempdir() # creates a temporary directory
con <- dbConnect(MonetDBLite(), dbdir)
dbWriteTable(con, "crsp.dsf", dbFetch(res, n = -1)) # fetches the entire result into memory again before writing
Is this on Windows? Either way, I would probably fetch and append the data in chunks of 10,000 rows or so instead of in one go. If on Windows, consider increasing the memory limit using memory.limit().
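A chunked transfer along those lines might look like the sketch below. It streams the result set with repeated dbFetch(n = 10000) calls and appends each chunk to the MonetDBLite table via dbWriteTable(..., append = TRUE), so only one chunk is ever in memory. The credentials are placeholders, the 10,000-row chunk size is arbitrary, and the table is named crsp_dsf (underscore rather than dot) to avoid the dot being read as a schema separator:

```r
library(DBI)
library(RPostgres)
library(MonetDBLite)

# Placeholder credentials -- substitute your own WRDS login.
wrds <- dbConnect(Postgres(),
                  host = 'wrds-pgdata.wharton.upenn.edu',
                  port = 9737,
                  user = 'user',
                  password = 'password',
                  dbname = 'wrds',
                  sslmode = 'require')

# Local MonetDBLite database in a temporary directory.
con <- dbConnect(MonetDBLite(), tempdir())

res <- dbSendQuery(wrds, "select DATE, PERMNO, PERMCO, SHROUT, PRC, RET, VOL
                          from CRSP.DSF")

# Stream the result set: fetch ~10,000 rows at a time and append each
# chunk to the local table, so the full table never sits in memory.
while (!dbHasCompleted(res)) {
  chunk <- dbFetch(res, n = 10000)
  dbWriteTable(con, "crsp_dsf", chunk, append = TRUE)
}

dbClearResult(res)
dbDisconnect(wrds)
```

This cannot be run without live WRDS credentials, but the pattern (dbSendQuery, a dbHasCompleted loop over dbFetch, and append = TRUE) is standard DBI usage and should work with any backend that supports incremental fetching.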