- Subject: Trying Pallene
- From: "Pierre Chapuis" <lua@...>
- Date: Tue, 11 Oct 2022 10:47:56 +0200
Hello everyone,
Roberto's talk at the workshop made me want to try the current state of Pallene [1].
In case others want to try it too, I have added support for its development branch to localua.sh [2].
This morning I ported a "real" piece of code to it. I picked the back-propagation algorithm from a toy Recurrent Neural Network I wrote in Teal [3] and deliberately did not optimize anything - i.e. matrices are implemented as {{number}}.
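To give an idea of the kind of code involved, a typical helper in the Teal version looks roughly like this (an illustrative sketch, not the actual code, and "madd" is an invented name):

    -- sketch of a Teal matrix helper, with matrices as {{number}}
    local type Matrix = {{number}}

    local function madd(a: Matrix, b: Matrix): Matrix
       local out: Matrix = {}
       for i = 1, #a do
          local row: {number} = {}
          for j = 1, #a[i] do
             row[j] = a[i][j] + b[i][j]
          end
          out[i] = row
       end
       return out
    end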
I won't publish this code (at least not now), but porting from Teal was straightforward - basically replacing instances of "number" with "float" and tweaking a few things. I just had to add a small Teal type signature file (.d.tl) so that I can call the Pallene code from Teal. This straightforward port gave me a 4x speedup (compared to the Teal version, which compiles down to what is basically plain Lua).
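The Pallene version of such a function is nearly identical, just with "float" instead of "number" (again only a sketch, not my actual code, and I am not certain the module declaration below matches the development branch exactly):

    -- mat.pln -- sketch of a Pallene module; the "local m: module = {}"
    -- declaration follows the manual but may differ on the dev branch
    local m: module = {}

    typealias Matrix = {{float}}

    function m.madd(a: Matrix, b: Matrix): Matrix
        local out: Matrix = {}
        for i = 1, #a do
            local row: {float} = {}
            for j = 1, #a[i] do
                row[j] = a[i][j] + b[i][j]
            end
            out[i] = row
        end
        return out
    end

    return m

The .d.tl file only has to declare the exported functions so that Teal can type-check calls into the compiled module, roughly:

    -- mat.d.tl -- sketch of the Teal signature file for the module above
    local record mat
       madd: function({{number}}, {{number}}): {{number}}
    end
    return mat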
[1] https://github.com/pallene-lang/pallene
[2] https://blog.separateconcerns.com/2022-10-10-localua-pallene.html
[3] https://github.com/teal-language/tl
--
Pierre Chapuis