Lately I have been in the habit of using RStudio Server hosted on the Amazon Web Services (AWS) Elastic Compute Cloud (EC2). It's convenient to be able to pick up my work right where I left off, whether I'm in the office or at home. At work it sidesteps the corporate firewall. I can run large jobs and still have my laptop free for other tasks. Most importantly, it prepares me to scale up when a big project comes along. If you are interested in getting started with EC2, I suggest Louis Aslett's excellent site.
The natural choice when starting out is the free micro instance, which includes 615 MB of memory. Depending on how you use R, that may not be enough. Amazon's own instructions suggest that long-running jobs in particular are not well suited to a micro instance.
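If you want to see how close you are to the ceiling before trouble hits, base R can report its own footprint. Here is a minimal sketch using only built-in functions; mem_check is just a hypothetical helper name, and it assumes the objects in your global environment are the main memory consumers:

# Rough memory report in base R: total megabytes in use according to gc(),
# plus the n largest objects in the global environment.
mem_check <- function(n = 5) {
  mb_used <- sum(gc()[, 2])   # column 2 of gc()'s summary is MB currently used
  message(sprintf("R is using roughly %.0f MB", mb_used))
  objs <- ls(envir = globalenv())
  if (length(objs) == 0) return(invisible(NULL))
  sizes <- sapply(objs, function(x) object.size(get(x, envir = globalenv())))
  head(sort(sizes, decreasing = TRUE), n)   # largest first, sizes in bytes
}

mem_check()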
Today, for the first time, RStudio Server showed me an error warning that I was out of memory.
Error in system(paste(which, shQuote(names[i])), intern = TRUE, ignore.stderr = TRUE) :
  cannot popen '/usr/bin/which 'svn' 2>/dev/null', probable reason 'Cannot allocate memory'
The system() call here is RStudio shelling out to locate svn; popen has to fork a child process, and the fork fails when no memory can be allocated. Opening a shell window produced a similar out-of-memory popup.
Logging out did not free up the memory. I ended up killing the R session process from the Linux shell and jumping back on. After an hour of typical R work I issued the shell command to check memory:
~$ free -m
It reported 65 MB free. I restarted R from within RStudio Server and checked again; this time 298 MB were free.
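If you'd rather not leave RStudio to run that check, the same command can be issued from the R console (Linux only; this assumes the free utility is on the PATH, as it is on standard Amazon Linux and Ubuntu AMIs):

# Same check without leaving the R console: system() hands the command
# to the shell, which prints free and used memory in megabytes.
system("free -m")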
So does this mean I always have to restart R to clear my memory? According to this answer on StackExchange, yes.
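Before restarting, it is still worth clearing what you can from inside R. A minimal sketch, assuming your workspace objects are what's eating the memory; note that gc() frees memory within R, but the process may not hand it back to the operating system, which is why a restart remains the sure remedy:

# Drop everything in the workspace and trigger garbage collection.
# This frees memory *within* R; the R process itself may keep holding
# it, which is why restarting the session is the reliable fix.
rm(list = ls(envir = globalenv()), envir = globalenv())
invisible(gc())

# In newer versions of RStudio the restart is also a menu item
# (Session > Restart R), with .rs.restartR() as the console equivalent.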