Rietta: Web Apps Where Security Matters
You are reading The Rietta Blog, a publication about the web since 2005.

WGET to Keep New Rails Site in Memory

A Ruby on Rails web application can be very fast in production. However, on a site that receives less traffic, the application can be very, very slow on the first request after it has gone dormant.

One way to make sure that your site is always fast and never goes dormant is to set up an automated process that periodically fetches the home page.

On a Linux, FreeBSD, or Mac OS X system it is easy to set up an automatic fetch using cron and wget.

Simply type “crontab -e” at your terminal and enter a line like this for each Rails website you want to keep alive by pinging it every 15 minutes:

*/15 * * * * wget -O /dev/null -q http://test.rietta.com/ > /dev/null 2>&1

For those not familiar, /dev/null is a special file on Unix systems that discards anything written to it. By sending both the downloaded file and any status messages there, this command runs completely silently.
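You can see this silencing behavior for yourself with a quick shell experiment (these commands are just an illustration, not part of the cron setup):

```shell
# Anything written to /dev/null simply disappears:
echo "this output vanishes" > /dev/null

# Capture what survives the redirect -- nothing does:
out=$(echo "hello" > /dev/null)
[ -z "$out" ] && echo "silent"   # prints "silent"
```

The same principle silences the crontab entry above: the fetched page goes to /dev/null via -O, and any remaining messages are redirected there as well.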

This also works to keep Heroku dynos alive. You simply need to run the crontab on a Unix-based machine of yours that is on all of the time. The requesting box doesn’t have to be a full-blown server; any desktop will do. A laptop is probably not the best choice, since it can’t keep your sites alive while it is offline.
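If you have several sites to keep warm, one crontab entry per site works fine, but you could also wrap the wget call in a small helper. This is only a sketch; the function name keep_alive and the second URL are hypothetical, and the wget flags mirror the crontab entry above:

```shell
# keep_alive: silently fetch every URL given, warning on failure.
# (Hypothetical helper; uses the same wget flags as the crontab line.)
keep_alive() {
  for url in "$@"; do
    wget -O /dev/null -q "$url" > /dev/null 2>&1 \
      || echo "warning: could not reach $url" >&2
  done
}

# Example usage (substitute your own Rails sites):
# keep_alive http://test.rietta.com/ http://other-app.example.com/
```

You could then call the helper from a single */15 crontab entry instead of maintaining one line per site.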

About Frank Rietta


Frank Rietta specializes in working with startups and new Internet businesses, developing on the Ruby on Rails platform to build scalable businesses. He is a computer scientist with a Masters in Information Security from the College of Computing at the Georgia Institute of Technology. He teaches about security topics and is a contributor to the security chapter of the 7th edition of the "Fundamentals of Database Systems" textbook published by Addison-Wesley.