lua-users home
lua-l archive



On Sat, Mar 8, 2014 at 6:27 AM, Aapo Talvensaari
<aapo.talvensaari@gmail.com> wrote:
> Hi,
>
> If by context you mean the user-supplied context table, e.g.
> template.render("test.html", { context_value = "test" }), you are correct
> that the result of this call is not cached, but the actual function
> generated from test.html is cached (by default). So the cached function can
> be called with many different context tables. If you also want to cache the
> results, you can do it like this: local res = template.compile("test.html"){
> context_value = "test" }. Then just store the res string somewhere (e.g.
> in memory, in a file, or in Redis).
>
> But if by context you mean the actual Lua process, the cache is not shared
> between processes. And I do not think that can be easily done, as the cache
> actually holds Lua functions. In an Nginx / OpenResty context this means that
> each Nginx worker has its own Lua VM, so the cache is local to the Nginx
> worker. If you have two Nginx worker processes, each template function is
> cached twice (if the workers run the same code). You may also disable
> caching with template.caching(false). I'm also going to add support for set
> $template_caching 0|1; in Nginx, so that you can easily disable caching in
> the Nginx configuration (many may want to have caching disabled in
> development environments).
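The caching scheme described above can be sketched in plain Lua. This is an illustrative stand-in, not the actual lua-resty-template code: the `compile` body here is a fake that skips template parsing entirely, and the cache-key format is made up for the example.

```lua
-- Sketch of the described behaviour: the function generated from a
-- template is cached by name, while each call renders its own context
-- table, so rendered output is NOT cached.
local cache = {}

local function compile(name)
  local fn = cache[name]
  if not fn then
    -- Stand-in for the expensive step: parsing test.html and load()-ing
    -- the generated Lua code into a function.
    fn = function(context) return "value: " .. context.context_value end
    cache[name] = fn
  end
  return fn
end

local view = compile("test.html")           -- compiled function, now cached
local res = view{ context_value = "test" }  -- rendered string, not cached

-- To cache results too, store the string yourself (memory, file, Redis);
-- the key format below is just an example:
local result_cache = {}
result_cache["test.html?context_value=test"] = res
```

Calling `compile("test.html")` again returns the same cached function, which is the reuse the benchmark exercises.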

Thanks for clarifying this; that is what I meant. I looked up the
ngx_lua module, and it only vaguely describes how the Lua state is
maintained across the worker processes. But if you are right, would it
be possible to use the init_worker_by_lua directive to initialize global
variables (i.e. the caching table), so that the cache would at least be
shared per-process? This is probably what you are already doing...
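A hedged sketch of what such a setup might look like in the Nginx configuration. The template name is hypothetical, and note the limitation the thread already establishes: init_worker_by_lua runs once in every worker, so this only prewarms each worker's own cache; it cannot share compiled functions across workers.

```nginx
# Hypothetical sketch: prewarm the template cache at worker startup.
# Each worker runs this in its own Lua VM, so the cache stays per-worker.
init_worker_by_lua '
    local template = require "resty.template"
    template.compile("test.html")  -- compiled function now cached in this worker
';
```

Only plain data, such as rendered output strings, could be shared across workers (e.g. via an ngx.shared dictionary); Lua functions cannot cross the VM boundary.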

I guess this mailing list is probably not the right place to discuss
OpenResty in detail, though...

>
> Regards
> Aapo
>
> On 8.3.2014 at 8.43, "Hao Wu" <wuhao.wise@gmail.com> wrote:
>
>> Hi Aapo,
>>
>> The benchmark seems to be running without switching contexts; in a
>> real environment, if there are hundreds of requests, the contexts
>> won't be shared, so the cache will actually not be used? Or am I
>> missing something?
>>
>> ~Hao
>>
>>
>> On Fri, Mar 7, 2014 at 8:56 AM, Aapo Talvensaari
>> <aapo.talvensaari@gmail.com> wrote:
>> > My (HTML) templating engine for Lua with special support for Nginx /
>> > OpenResty (HttpLuaModule):
>> >
>> > https://github.com/bungle/lua-resty-template/
>> >
>> > I started this for my own needs. I briefly looked at the other
>> > alternatives in templating
>> > (https://github.com/bungle/lua-resty-template/#alternatives), but I
>> > thought there was room for another implementation. The syntax should be
>> > familiar (though not 1:1) to anyone who has worked with things like
>> > Mustache (http://mustache.github.io/) or, say, PHP.
>> >
>> > I feel that the codebase is quite stable, and the performance is good
>> > (especially the cached performance). I also think that the API is clean,
>> > while still allowing some extensions / overrides for flexibility. I have
>> > not yet written automated unit tests or integrated this with LuaRocks.
>> > At this point I wish to receive feedback. The code is fully Lua 5.2 and
>> > LuaJIT 2.x compatible. It doesn't work with Lua 5.1, as the 5.1 version
>> > of the load function is different. It could be made to work with 5.1
>> > using setfenv, but that is not on my agenda (unless I get enough
>> > feedback to warrant adding support for it). Feel free to fork it if you
>> > need 5.1 support, or send me a pull request if you have managed to make
>> > things work on 5.1 (or even earlier versions).
>> >
>> > So, please take a look at it on Github (it's fairly well documented
>> > there).
>> >
>> >
>> > Regards
>> > Aapo
>>
>
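The 5.1/5.2 incompatibility mentioned above comes down to how a loaded chunk gets its environment. A minimal sketch (the chunk source and values here are made up for illustration):

```lua
-- In Lua 5.2 / LuaJIT with 5.2 extensions, load() accepts the
-- environment table directly as its fourth argument:
local env = { x = 21 }
local fn = assert(load("return x * 2", "=chunk", "t", env))
assert(fn() == 42)  -- x is resolved in env, not in _G

-- In Lua 5.1 there is no such argument; the same effect needs
-- loadstring() followed by setfenv():
--   local fn = assert(loadstring("return x * 2"))
--   setfenv(fn, env)
```

This is why a template engine that sandboxes generated code via the load environment needs the setfenv shim to run on 5.1.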