Quickly monitor availability of web services with curl

Created: 06 Dec 2014

Some of my recent work has been about reducing the interruption to users when we deploy changes.

We have comprehensive scenarios that we run against our complete test environments with a performance testing tool to get a thorough answer. Those tests take quite a while to give feedback, so I needed a faster way to measure the continued availability of a web service in isolation, usually on a Vagrant box.

This is the kind of thing that I unfortunately forget, so I am documenting it here mostly for my own benefit.

Useful curl options

The basic curl command I ended up using:

curl -sS --max-time 5 -w "%{http_code}\n" -o /dev/null http://www.example.com/

Why I picked these options:

-sS 

Be silent unless errors occur. The default output from curl is too verbose and not particularly amenable to parsing into structured data. But if errors happen, knowing what they were will help debug problems.

--max-time 5

Time out after 5 seconds. By default curl has no overall timeout, which is far more lenient than either a user or a client calling an API will be.

-w "%{http_code}\n"

Print the HTTP response code followed by a newline, with '000' used if curl hits a problem before it sees an HTTP response (for example a connection failure or a timeout). Run in a loop, this gives a newline-delimited list of codes, which makes it easy to count error responses.

-o /dev/null

Discard the response body. As long as the service being tested sets its HTTP response codes correctly, there is no need to see the content of the response.
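
Before looping, it can be worth checking the two kinds of output a single invocation produces: a bare status code on stdout when a response arrives, and, because of -S, an error message on stderr plus '000' on stdout when one does not. For example, pointing it at a local port that is very unlikely to have anything listening (just an illustration; any unreachable URL will do):

curl -sS --max-time 5 -w "%{http_code}\n" -o /dev/null http://localhost:1/

This should print a connection error from curl (on stderr) and 000 (on stdout).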

Repeated tests and other embellishments

This runs requests in a loop as fast as possible. If anything goes wrong, output will be printed to the console. I’ve used example.com, as it should work if you have Internet access. Please cancel this after a short time, to avoid needless requests to example.com:

while true; do curl -sS --max-time 5 -w "%{http_code}\n" -o /dev/null http://www.example.com/ | egrep -v '^200$'; done
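
If you would rather end up with a summary than watch for stray lines, one option (a sketch using standard shell tools; adjust the request count to taste) is to run a fixed number of requests and tally the status codes:

for i in {1..50}; do curl -sS --max-time 5 -w "%{http_code}\n" -o /dev/null http://www.example.com/; done | sort | uniq -c

A healthy run prints a single line along the lines of '50 200'; any other code shows up as its own line with its own count.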

Some examples that should fail:

while true; do curl -sS --max-time 5 -w "%{http_code}\n" -o /dev/null http://www.example/ | egrep -v '^200$'; done
while true; do curl -sS --max-time 5 -w "%{http_code}\n" -o /dev/null http://blackhole.webpagetest.org/ | egrep -v '^200$'; done

A much friendlier 100 requests, pausing for half a second between requests:

for i in {1..100}; do curl -sS --max-time 5 -w "%{http_code}\n" -o /dev/null http://www.example.com/ | egrep -v '^200$'; sleep 0.5; done

Add the date, and write that and the status code to a comma-separated file for further analysis:

rm -f output.csv; for i in {1..100}; do curl -sS --max-time 5 -w "$(date --rfc-3339=seconds),%{http_code}\n" -o /dev/null http://www.example.com/ >>output.csv; sleep 0.5; done
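
For a quick summary of that file afterwards, one option (a small sketch; the field number assumes the date,status-code layout written above) is to count how many times each status code appears:

cut -d, -f2 output.csv | sort | uniq -c

Note that date --rfc-3339=seconds is GNU date syntax; on other systems you may need an equivalent format string.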