Shortfin vs Nginx vs Python SimpleHTTPServer: an introduction

Shortfin vs Nginx vs SimpleHTTPServer

Shortfin is a lightweight, high-performance HTTP server that claims to be fast at serving static content. It is an open-source web server released under the GPL v2 license. Shortfin was inspired by the C10k problem and aims to be the fastest web server on UNIX environments.

Get Started

Installing the Shortfin server is simple:

# wget //shortfin.io/install.sh && sh install.sh

Running the script above downloads, compiles, and installs Shortfin. Because Shortfin is lightweight, installation is very fast!

The config file is located at:

# vim /etc/shortfin/shortfin.conf

The config file is well documented and annotated, so it is easy to get started. Let's look at a simple server configuration in Shortfin.

## shortfin.io, no cache.
server = {

    doc-root = /var/www/shortfin/
    cache-files = no
    hostname = shortfin.io www.shortfin.io

    # not yet implemented
    access-log = /var/logs/shortfin/access.log
    error-log = /var/logs/shortfin/error.log

    # not yet implemented
    # proxy all .php and .cgi
    ## new-rule = <type> <arg>
    ## cache = 1/0
    #proxy = {
    #       new-rule = ext .php
    #       new-rule = ext .cgi

    #       host = 192.168.0.12
    #       port = 80

    #       cache = 0
    #}

    # proxy all images to an image server
    #proxy = {
    #       new-rule = dir   /img
    #       new-rule = file  /logo.png
    #       new-rule = ip    192.168.0.100

    #       host = 192.168.0.100
    #       port = 8080

    #       cache = 1
    #}
}

Caching is a built-in feature of Shortfin. The proxy is not yet implemented, so Shortfin currently cannot forward requests to programs written in other languages (such as PHP or CGI scripts) running behind it.
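
Since caching is the one notable feature that is already implemented, here is a hedged example of turning it on. It simply flips the cache-files directive from the annotated sample above (assuming the directive takes yes/no values, as the default suggests):

## shortfin.io, with the in-memory file cache enabled
server = {

    doc-root = /var/www/shortfin/
    cache-files = yes
    hostname = shortfin.io www.shortfin.io
}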

Benchmark

We ran a simple test against Shortfin, Nginx, and Python's SimpleHTTPServer, using wrk as the benchmarking tool.

wrk is a modern HTTP benchmarking tool capable of generating significant
load when run on a single multi-core CPU. It combines a multithreaded
design with scalable event notification systems such as epoll and kqueue.
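
wrk is usually built from source; a minimal sketch, assuming git and a C build toolchain are already available on the box:

# build wrk from source and put the binary on the PATH
git clone https://github.com/wg/wrk.git
cd wrk
make
sudo cp wrk /usr/local/bin/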

Both the Nginx and Shortfin configs were left at their defaults.

The testing environment is Ubuntu 12.04 in a Vagrant box on a MacBook Pro (i5 2.3 GHz, 8 GB RAM); 1 GB of RAM was allocated to the Vagrant VirtualBox VM.
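
For anyone wanting to reproduce the setup, the guest can be brought up with Vagrant's stock Ubuntu 12.04 (precise64) box; a minimal sketch, assuming the standard box name and URL of that era (the exact box used here is not recorded in the post, and the 1 GB memory allocation is set afterwards in the generated Vagrantfile):

# bring up an Ubuntu 12.04 guest with Vagrant + VirtualBox
vagrant box add precise64 http://files.vagrantup.com/precise64.box
vagrant init precise64
vagrant up
vagrant ssh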

Testing command

wrk -t12 -c400 -d30s //dev.codersgrid.com

Result

Shortfin

Running 30s test @ //dev.codersgrid.com
  12 threads and 400 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   150.00ms  125.31ms 731.22ms   54.73%
    Req/Sec     0.00      0.00     0.00    100.00%
  29094 requests in 30.00s, 4.05MB read
  Socket errors: connect 156, read 0, write 0, timeout 2339
Requests/sec:    969.80
Transfer/sec:    138.27KB

Nginx

 Running 30s test @ //dev.codersgrid.com
  12 threads and 400 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   835.65ms    1.56s   18.52s    95.34%
    Req/Sec     0.00      0.00     0.00    100.00%
  2530 requests in 30.02s, 6.15MB read
  Socket errors: connect 156, read 0, write 0, timeout 3749
Requests/sec:     84.29
Transfer/sec:    209.66KB

Python SimpleHTTPServer

Running 30s test @ //dev.codersgrid.com:8000
  12 threads and 400 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency    14.12ms  236.11ms   8.26s    99.50%
    Req/Sec     0.00      0.00     0.00    100.00%
  2967 requests in 30.02s, 701.19KB read
  Socket errors: connect 156, read 454, write 0, timeout 5129
Requests/sec:     98.84
Transfer/sec:     23.36KB
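
The post does not show how the Python server was started; on Ubuntu 12.04 (Python 2.x) the usual way to serve a directory on port 8000, matching the :8000 target above, is the built-in module. The document root below is an assumption:

# serve the document root over HTTP on port 8000 (Python 2.x)
cd /var/www/shortfin/
python -m SimpleHTTPServer 8000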

Wow… in our simple test, Shortfin performed far better than the other two servers.

Check it out


Update 10th Aug, 2013:

As @Jay Fenton pointed out, we should test these environments fairly, in a zero-error state.

Testing Document



	Hello World!
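
The original files are not included in the post, but an equivalent test document can be recreated with something like the following (the Shortfin doc-root comes from the sample config above; the Nginx path is the stock Ubuntu 12.04 default and is an assumption):

# hypothetical recreation of the "Hello World!" test page
echo "Hello World!" > /var/www/shortfin/index.html
echo "Hello World!" > /usr/share/nginx/www/index.html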

Shortfin

Running 30s test @ //dev.codersgrid.com/
  10 threads and 240 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   242.78ms   18.94ms 277.60ms   97.33%
    Req/Sec     0.00      0.00     0.00    100.00%
  29445 requests in 30.02s, 4.10MB read
Requests/sec:    980.96
Transfer/sec:    139.86KB

Nginx

Running 30s test @ //dev.codersgrid.com/index.html
  10 threads and 140 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency    62.47ms   96.25ms   2.55s    93.34%
    Req/Sec     0.00      0.00     0.00    100.00%
  47866 requests in 30.01s, 18.35MB read
  Socket errors: connect 0, read 2, write 0, timeout 3
Requests/sec:   1595.09
Transfer/sec:    626.15KB

Comments

  • Jay Fenton (http://www.linkedin.com/in/jfenton)

    Can you re-do your benchmark and tune the values to get to a zero-error situation?

    • cauliturtle

      @aksyn: updated. Um… maybe Nginx running in VirtualBox cannot reach its best performance?