Reading rather big data into R

Reading big data into R can take some time, since R loads the data directly into RAM. If the data is big enough, R may even crash. Things have become better, but this is still a problem. I have 16 GB of RAM and seldom work with data so big that it does not fit. But when using R with rather big data, it is not a good idea to run other memory-heavy programs at the same time. For example, it is not recommended to run other computers as virtual machines, if you happen to do that (I run Windows as a virtual machine).
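To get a feeling for how much memory your data actually takes up, you can check the size of an object once it is in R. A minimal sketch using base R; the data frame here is just made-up example data:

# Made-up example data: one million rows, two numeric columns
df <- data.frame(x = rnorm(1e6), y = rnorm(1e6))

# How much memory this one object uses
print(object.size(df), units = "MB")

# How much memory R is using in total
gc()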

But even though the data may fit, it can still take some time to read it into R. A trick around this is to use the data.table package. Its fread function reads the data into a data.table instead of a data.frame, which takes a lot less time than read.csv or read.table.

Install the package


install.packages("read.table")

Read data


library(data.table)
wd <- fread("dataname.csv")
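
To see the difference for yourself, you can time both readers on the same file. A rough sketch, assuming a large file dataname.csv exists in the working directory:

library(data.table)

# Base R reader, typically slow on files of several hundred MB
system.time(df1 <- read.csv("dataname.csv"))

# fread from data.table, usually several times faster on the same file
system.time(df2 <- fread("dataname.csv"))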
