# Pausing traffic and retrying in Caddy

*Source: [caddy/pause-retry-traffic.md](https://github.com/simonw/til/blob/main/caddy/pause-retry-traffic.md)*

A pattern I really like for zero-downtime deploys is the ability to "pause" HTTP traffic at the load balancer, such that incoming requests from browsers appear to take a few extra seconds to return, but under the hood they've actually been held in a queue while a backend server is swapped out or upgraded in some way.

I first heard about this pattern [from Braintree](https://simonwillison.net/2011/Jun/30/braintree/), and a [conversation on Twitter](https://twitter.com/simonw/status/1463652411365494791) today brought up a few more examples, including [this NGINX Lua config](https://github.com/basecamp/intermission) from Basecamp.

[Caddy](https://caddyserver.com/) creator Matt Holt [pointed me](https://twitter.com/mholt6/status/1463656086360051714) to [`lb_try_duration` and `lb_try_interval`](https://caddyserver.com/docs/caddyfile/directives/reverse_proxy#lb_try_duration) in Caddy, which can hold a request for up to a specified number of seconds, periodically retrying the backend to see if it has become available again.

I decided to try this out. This was my first time using Caddy, and I'm really impressed with both the design of the software and the quality of the [getting started documentation](https://caddyserver.com/docs/getting-started).

I installed Caddy using Homebrew:

```
brew install caddy
```

## The Caddyfile

You can configure Caddy in a bunch of different ways - the two main options are JSON via the Caddy API or their own custom Caddyfile format.

Here's the `Caddyfile` I created:

```
{
    auto_https off
}

:80 {
    reverse_proxy localhost:8003 {
        lb_try_duration 30s
        lb_try_interval 1s
    }
}
```

Caddy defaults to `https`, even on `localhost`, which is very cool but not what I wanted for this demo - hence the first block.
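Since Caddy's other configuration option is JSON via its API, here's what I believe the equivalent JSON configuration looks like - this is a sketch based on my reading of Caddy v2's JSON structure, which I haven't tested myself:

```json
{
  "apps": {
    "http": {
      "servers": {
        "srv0": {
          "listen": [":80"],
          "automatic_https": {"disable": true},
          "routes": [
            {
              "handle": [
                {
                  "handler": "reverse_proxy",
                  "upstreams": [{"dial": "localhost:8003"}],
                  "load_balancing": {
                    "try_duration": "30s",
                    "try_interval": "1s"
                  }
                }
              ]
            }
          ]
        }
      }
    }
  }
}
```

Caddy can generate this kind of JSON from a Caddyfile itself using `caddy adapt --config Caddyfile`, which is a good way to check the exact field names.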
The next block listens on port 80 and proxies to `localhost:8003`, with a 30s window during which incoming requests will "pause" if the backend is not available, and a polling interval of 1s.

## Running Caddy

I started Caddy in the same directory as my `Caddyfile` using:

```
caddy run
```

Then I hit `http://localhost/` in my browser. The browser hung, waiting for the response.

Next I started [Datasette](https://datasette.io/) on port 8003 like this:

```
datasette -p 8003
```

About a second later my browser returned a response showing the Datasette homepage!

Quitting and restarting Datasette while issuing new requests confirmed that traffic was being paused while the backend was unavailable.

## Trying to reconfigure the proxy

My second experiment didn't work, sadly. I wanted to see if I could reconfigure the backend to use `localhost:8004` instead, then reload Caddy such that paused traffic would resume against the new backend.

I edited the `Caddyfile` to use port `8004` and ran this to hot-reload the configuration:

```
caddy reload
```

New requests did indeed get served from the new backend, but sadly requests I had already started (and which were paused awaiting the backend) did not automatically get served from the new backend - I had to hit refresh in my browser instead.

I [filed a Caddy issue](https://github.com/caddyserver/caddy/issues/4442) about this.
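The hold-and-retry behavior those two directives describe can be sketched in a few lines of Python - this is a hypothetical illustration of the semantics, not Caddy's actual implementation:

```python
import time


def proxy_with_retry(send_to_backend, try_duration=30.0, try_interval=1.0):
    """Hold a request, retrying the backend until it answers or the window expires.

    send_to_backend() should return a response, or raise ConnectionError
    while the backend is down - mirroring lb_try_duration / lb_try_interval.
    """
    deadline = time.monotonic() + try_duration
    while True:
        try:
            # Backend is up: the "paused" request completes normally.
            return send_to_backend()
        except ConnectionError:
            if time.monotonic() >= deadline:
                # Window exhausted: surface the failure to the client.
                raise
            # Backend still down: keep the request queued, poll again shortly.
            time.sleep(try_interval)
```

From the client's point of view, the request just takes a little longer - exactly the behavior I saw in the browser while restarting Datasette.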
Created 2021-11-24 · Updated 2021-11-24