
WGET to Keep New Rails Site in Memory

A Ruby on Rails web application can be really fast in production. However, for a site that receives little traffic, the application can be very, very slow on the first request after it has gone dormant.

One way to make sure that your site is always fast and never goes dormant is to set up an automatic process that periodically fetches the home page.

On a Linux, FreeBSD, or Mac OS X system, it is easy to set up an automatic fetch using cron and wget.

Simply type “crontab -e” at your terminal and enter a line like this for each Rails website you want to keep alive by pinging it every 15 minutes.

*/15 * * * * wget -O /dev/null -q http://test.rietta.com/ > /dev/null 2>&1
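For reference, here is the same entry annotated (the URL is just the example from above; substitute your own site’s address). The comments use crontab’s own # syntax, so you can paste them into your crontab if you like:

# */15 * * * *      run at minutes 0, 15, 30, and 45 of every hour
# -O /dev/null      write the fetched page to /dev/null instead of a file
# -q                suppress wget's normal progress messages
# > /dev/null 2>&1  discard anything cron would otherwise email to you
*/15 * * * * wget -O /dev/null -q http://test.rietta.com/ > /dev/null 2>&1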

For those not familiar, /dev/null is a special file on Unix that discards whatever is written to it. By sending both the downloaded file and wget’s messages there, this command runs completely silently.
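Before adding the silent version to your crontab, it can help to run the fetch once by hand with the quieting removed, so you can confirm the request actually succeeds:

wget -O /dev/null http://test.rietta.com/

If the page downloads without errors, the fully silenced crontab entry above will behave the same way.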

This also works to keep Heroku dynos alive. You simply need to run the crontab on a Unix-based machine of yours that is on all the time. The requesting box doesn’t have to be a full-blown server; any desktop will do. A laptop is probably not the best choice, since it can’t keep your sites alive while it is offline.
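If you have several sites to keep warm, one crontab line per site works fine, or you can point a single cron entry at a small shell script. Here is a minimal sketch; the script path and the second URL are placeholders of my own choosing:

#!/bin/sh
# keepalive.sh - quietly fetch each site's home page to keep it warm
for url in \
    http://test.rietta.com/ \
    http://your-app.herokuapp.com/
do
    wget -O /dev/null -q "$url"
done

And the matching crontab entry, assuming the script is saved as /usr/local/bin/keepalive.sh and marked executable:

*/15 * * * * /usr/local/bin/keepalive.sh > /dev/null 2>&1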