OpenResty Performance – ngx.ctx vs ngx.shared.dict

The Nginx Lua module provides two structures for maintaining Lua-land data: per-request context tables and shared memory zones. Each has its pros and cons. ngx.ctx can store arbitrarily complex Lua structures, but its contents live only within a single request's lifecycle. Conversely, shared dictionaries store key/value pairs as Lua primitives (complex Lua structures must be serialized before being stored in a shared dictionary), and the lifetime of a shared dictionary's contents is the lifetime of the master nginx process (dictionary contents survive HUP signals). Additionally, the Lua module documentation notes that ngx.ctx performance is slow:
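As a rough illustration (the zone name my_cache, the keys, and the handler below are invented for this example, not taken from the post), the two APIs differ roughly as follows, assuming "lua_shared_dict my_cache 10m;" has been declared in the http block:

-- inside a content_by_lua_block or similar request handler

-- ngx.ctx: a plain per-request Lua table; it accepts arbitrary values
-- and is discarded when the current request finishes
ngx.ctx.user = { id = 42, roles = { "admin", "editor" } }

-- ngx.shared.DICT: primitives only (strings, numbers, booleans, nil),
-- visible to every worker and retained across HUP reloads
local dict = ngx.shared.my_cache
local ok, err = dict:set("user:42:last_seen", ngx.now())
if not ok then
    ngx.log(ngx.ERR, "shared dict set failed: ", err)
end

-- a Lua table has to be serialized first, e.g. with cjson
local cjson = require "cjson"
dict:set("user:42", cjson.encode(ngx.ctx.user))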

The ngx.ctx lookup requires relatively expensive metamethod calls and it is much slower than explicitly passing per-request data along by your own function arguments. So do not abuse this API for saving your own function arguments because it usually has quite some performance impact.

A similar note appears in the documentation under the ngx.var API call; no such note lives in the shared dictionary documentation. Given that shared dictionaries can be leveraged to store per-request data in a manner similar to the context table (for primitive values, at least), I wanted to see whether there is a noticeable difference between the two methods:

We set up a simple test harness to run through getting and setting data from each structure independently, and compare the timing results. We also run the same number of iterations of setting a plain local primitive, as a baseline control. The results are immediately noticeable:
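The original harness and its timing output are not reproduced here, but a minimal sketch of the kind of loop described might look like the following; the dictionary name bench, the iteration count, and the output format are assumptions, not the original code:

-- assumes "lua_shared_dict bench 1m;" in the http {} block and that this
-- runs inside a content_by_lua_block
local iterations = 1e6
local dict = ngx.shared.bench
local sink  -- throwaway variable to receive each get

-- baseline control: set/get a plain local variable
ngx.update_time()
local start = ngx.now()
for i = 1, iterations do
    sink = i
end
ngx.update_time()
ngx.say("local variable: ", ngx.now() - start)

-- per-request context table
ngx.update_time()
start = ngx.now()
for i = 1, iterations do
    ngx.ctx.value = i
    sink = ngx.ctx.value
end
ngx.update_time()
ngx.say("ngx.ctx:        ", ngx.now() - start)

-- shared dictionary
ngx.update_time()
start = ngx.now()
for i = 1, iterations do
    dict:set("value", i)
    sink = dict:get("value")
end
ngx.update_time()
ngx.say("ngx.shared:     ", ngx.now() - start)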

So we can see an obvious performance advantage in using shared dictionaries over per-request context tables (30-50% in this environment, which, granted, is not realistic, but it's a start). So why bother keeping the ngx.ctx API in the module? Even Cloudflare developers stopped using ngx.ctx and noticed significant performance improvements. It would make more sense to store multi-phase data in a shared dictionary (and asynchronously remove cached elements if space is an issue), or to leverage the resty LRU cache if serialization is a concern, rather than continue to use an expensive API method that seems largely geared toward convenience.
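For reference, a minimal sketch of the lua-resty-lrucache alternative mentioned above (the cache size, keys, and TTL are arbitrary choices for the example): it keeps values as real Lua objects inside each worker, so nothing needs to be serialized.

-- module top level or init_by_lua_block: create one cache per worker and reuse it
local lrucache = require "resty.lrucache"
local cache, err = lrucache.new(200)  -- hold up to 200 items
if not cache then
    error("failed to create lru cache: " .. (err or "unknown"))
end

-- later, in a request handler: store and fetch Lua tables directly
cache:set("user:42", { id = 42, roles = { "admin" } }, 300)  -- 300s TTL
local user = cache:get("user:42")
if user then
    ngx.say("roles: ", table.concat(user.roles, ","))
end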

Update: I’ve re-examined the proper usage of ngx.ctx here.
