Very large .csv file
I have a large amount of data: 20 million rows and 6 columns, and I am trying to extract data from a large .csv file. I tried R, but I get an error message; I am using a MacBook with 4 GB of RAM and an i5 processor. Is there a way I can extract the information? I tried Excel, but it can only take 1 million rows. Any advice would be useful.

The file is more than 1.3 GB, and I want to divide the data into sets of 2000-3000 rows based on a parameter. I tried R, but when I use read.csv it tries for a moment, and after 10 minutes or so R is not responding.

I want to separate the data based on the 3rd column.
The columns are: SHA, PCT, PRACTICE, BNF CODE, BNF NAME.
First of all, you have to tell us what you mean by "extract data". If it is some sort of aggregation function, or if the data can be divided, I think the easiest way is to split the huge CSV file into many small ones.
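Since the whole file does not fit in 4 GB of RAM, the split can be done in a streaming fashion, one row at a time, so memory use stays constant. Here is a minimal sketch in Python as an alternative while read.csv keeps hanging; the file name `data.csv`, the output directory `parts`, and the assumption that the 3rd column is the grouping key are all placeholders for your actual data:

```python
import csv
import os

def split_by_column(src, out_dir, col_index=2):
    """Stream src row by row and append each row to a file named
    after the value in column col_index (0-based), so only one row
    is held in memory at a time regardless of file size."""
    os.makedirs(out_dir, exist_ok=True)
    writers = {}  # csv.writer objects, keyed by column value
    handles = {}  # underlying file handles, so we can close them
    with open(src, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)  # repeat the header in every output file
        for row in reader:
            key = row[col_index]
            if key not in writers:
                fh = open(os.path.join(out_dir, f"{key}.csv"),
                          "w", newline="")
                handles[key] = fh
                writers[key] = csv.writer(fh)
                writers[key].writerow(header)
            writers[key].writerow(row)
    for fh in handles.values():
        fh.close()

# Tiny demonstration with made-up rows (not your real data):
with open("data.csv", "w", newline="") as f:
    csv.writer(f).writerows([
        ["SHA", "PCT", "PRACTICE"],
        ["Q30", "5D7", "A81001"],
        ["Q30", "5D7", "A81002"],
        ["Q31", "5E1", "B82005"],
    ])
split_by_column("data.csv", "parts", col_index=2)
```

One caveat: this keeps one file handle open per distinct value in the grouping column, which is fine for hundreds of groups but not for millions.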
If you need anything else, have a look here:
- a package for storing big data on disk (not in RAM): http://ff.r-forge.r-project.org/
- a package that allocates objects to unused memory or to a swap file: https://r-forge.r-project.org/r/?group_id=556
- parallelizing big data: http://www.r-bloggers.com/taking-r-to-the-limit-parallelism-and-big-data/
- a few discussions here: http://www.mathfinance.cn/handling-large-csv-files-in-r/ and http://r.789695.n4.nabble.com/how-to-read-a-large-csv-into-a-database-with-r-td3043209.html
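The database route from that last link can also be sketched without R: load the CSV into SQLite a chunk at a time, then query only the slice you need. This is a hedged illustration, not the method from the linked thread; the names `sample.csv`, `rows.db`, and the table name `big` are made up:

```python
import csv
import sqlite3

def csv_to_sqlite(src, db_path, table="big", chunk=50000):
    """Insert the CSV into SQLite `chunk` rows at a time, so at most
    `chunk` rows are ever held in memory."""
    with open(src, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)
        cols = ", ".join('"{}"'.format(c) for c in header)
        marks = ", ".join("?" for _ in header)
        con = sqlite3.connect(db_path)
        con.execute("CREATE TABLE IF NOT EXISTS {} ({})".format(table, cols))
        buf = []
        for row in reader:
            buf.append(row)
            if len(buf) >= chunk:  # flush a full chunk to the database
                con.executemany(
                    "INSERT INTO {} VALUES ({})".format(table, marks), buf)
                buf.clear()
        if buf:  # flush the final partial chunk
            con.executemany(
                "INSERT INTO {} VALUES ({})".format(table, marks), buf)
        con.commit()
        con.close()

# Tiny demonstration with made-up data:
with open("sample.csv", "w", newline="") as f:
    csv.writer(f).writerows([["a", "b"], ["1", "x"], ["2", "y"]])
csv_to_sqlite("sample.csv", "rows.db")
con = sqlite3.connect("rows.db")
print(con.execute("SELECT COUNT(*) FROM big").fetchone()[0])  # prints 2
```

Once the data is in SQLite, a WHERE clause on the 3rd column gives you each 2000-3000 row subset without ever loading the whole 1.3 GB file.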