R on macOS Error: vector memory exhausted (limit reached?)

Tags: R, macOS, Bioconductor

R Problem Overview


I'm trying to run an R script (specifically, the "getLineages" function from the Bioconductor package Slingshot).

I'm wondering why the error "vector memory exhausted (limit reached?)" shows up when I use this function, as it doesn't seem to be the most memory-intensive function in this package for the data I am analyzing.

I do understand that there are other questions like this on StackOverflow, but they all suggest switching to the 64-bit version of R. However, I am already using that version. Since there seem to be no other answers to this issue so far, I was wondering if anyone might know?

The data is only ~120 MB in size, which is far less than my computer's 8 GB of RAM.


R Solutions


Solution 1 - R

For those using RStudio, I've found that setting Sys.setenv('R_MAX_VSIZE' = 32000000000), as has been suggested in multiple StackOverflow posts, only works when R is run from the command line; setting that parameter inside RStudio does not prevent this error:

Error: vector memory exhausted (limit reached?)

After doing some more reading, I found this thread, which clarifies the problem with RStudio and identifies a solution, shown below:

Step 1: Open terminal,

Step 2:

cd ~
touch .Renviron
open .Renviron

Step 3: Save the following as the first line of .Renviron:

R_MAX_VSIZE=100Gb 

Note: This limit includes both physical and virtual memory, so setting R_MAX_VSIZE=16Gb on a machine with 16 GB of physical memory may not prevent this error. You may have to play with this parameter, depending on the specs of your machine.
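The three steps above can be collapsed into one shell snippet. As a precaution, this sketch writes to a temporary file by default; point RENVIRON at "$HOME/.Renviron" to change the real startup file that R reads (100Gb is just the example value from this answer):

```shell
# Sketch: add R_MAX_VSIZE to an .Renviron file, but only if it isn't already set.
# For safety this defaults to a temp file; set RENVIRON="$HOME/.Renviron"
# before running it to modify your actual R startup configuration.
RENVIRON="${RENVIRON:-$(mktemp)}"
grep -q '^R_MAX_VSIZE=' "$RENVIRON" || echo 'R_MAX_VSIZE=100Gb' >> "$RENVIRON"
grep '^R_MAX_VSIZE=' "$RENVIRON"   # prints the active setting, e.g. R_MAX_VSIZE=100Gb
```

Because .Renviron is read only at startup, restart R (not just the script) for the new limit to take effect.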

Solution 2 - R

I had the same problem; increasing "R_MAX_VSIZE" did not help in my case. Instead, removing variables that were no longer needed solved the problem. Hope this helps those who are struggling here.

rm(large_df, large_list, large_vector, temp_variables)  # drop objects you no longer need
gc()  # then trigger garbage collection so R can release the freed memory

Solution 3 - R

This can be done through RStudio as well.

library(usethis) 
usethis::edit_r_environ()

When the tab opens up in RStudio, add this as the first line: R_MAX_VSIZE=100Gb (or however much memory you wish to allow).

Restart R (and/or restart your computer), then re-run the R command that gave you the memory error.

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

Content Type | Original Author | Original Content on Stackoverflow
Question | Anjan Bharadwaj | View Question on Stackoverflow
Solution 1 - R | Graeme Frost | View Answer on Stackoverflow
Solution 2 - R | Ömer An | View Answer on Stackoverflow
Solution 3 - R | Purrsia | View Answer on Stackoverflow