Simple load testing with Siege fed with a URL list


So, I wanted a simple way of load testing my caching in Nginx. I went with Siege, and it’s brilliant for quick and easy load testing of a web page. Instead of writing down all the URLs I wanted to feed Siege with, I created a small curl script. This small bash script fetches the front page you specify with SITE and parses out all href attributes it finds. It then fetches each of those URLs in turn and collects the href attributes on those pages as well. Finally, it writes all the unique URLs to a file, which you can then feed to Siege to load test your web page.

#!/bin/bash
#
# Curl an url, and parse href-tags to feed Siege with!
#
# Created: 2015-04-23 09:09 kim@myrveln.se
# Changed: 2015-05-22 21:22 kim@myrveln.se
#

SCHEME="http"
SITE="example.com"
TMPFILE="/tmp/tempurls"
TMPFILE2="/tmp/tempurls2"

# Grab the front page silently and keep only unique on-site links
curl -sk "${SCHEME}://${SITE}" | grep -oP '(?<=href=")[^"]*(?=")' | grep "${SCHEME}://${SITE}" | sort -u > "${TMPFILE}"

# Fetch each first-level URL and collect its on-site links as well
while read -r URL; do
    curl -sk "${URL}" | grep -oP '(?<=href=")[^"]*(?=")' | grep "${SCHEME}://${SITE}" | sort -u >> "${TMPFILE2}"
done < "${TMPFILE}"

# Merge both levels into one deduplicated list for Siege
sort -u "${TMPFILE}" "${TMPFILE2}" > "${SITE}.txt"

rm "${TMPFILE}" "${TMPFILE2}"
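To see what the href-parsing pipeline actually extracts, here is a small self-contained sketch run against a made-up snippet of HTML (the markup and URLs are hypothetical, not fetched from a real site); it keeps only the links matching the target site and deduplicates them:

```shell
#!/bin/bash
# Hypothetical HTML with two on-site links and one off-site link
html='<a href="http://example.com/a">A</a> <a href="http://example.com/b">B</a> <a href="http://other.org/c">C</a>'

# Same pipeline as the script: extract href values, keep on-site ones, dedupe
echo "$html" | grep -oP '(?<=href=")[^"]*' | grep "http://example.com" | sort -u
```

This prints the two example.com URLs, one per line, which is exactly the format the generated txt file ends up in.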

Example command for Siege:
siege -c200 -d5 -i -f example.com.txt

With the above command you send 200 concurrent users to battle your web page. The -d flag delays each request by a random interval of up to 5 seconds, -i makes each user hit the URLs in random order, and with -f you specify the txt file you generated with the bash script.
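Before unleashing Siege it can be worth sanity-checking the generated URL list. A quick sketch (the file contents here are fabricated stand-ins to illustrate the one-URL-per-line format Siege expects):

```shell
#!/bin/bash
# Fabricated stand-in for the file the script above would produce
printf 'http://example.com/\nhttp://example.com/about\n' > example.com.txt

# How many URLs will Siege cycle through?
wc -l < example.com.txt

# Peek at the first few entries
head -n 2 example.com.txt

rm example.com.txt
```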
