On Fri, Apr 2, 2021 at 7:34 PM Bogdan Marinescu wrote:
> eLua uses Lua 5.1 only. I did run Lua 5.3 on a couple of Cortex MCUs, but didn't attempt to port the int-only patches, as those MCUs had enough resources to run the full version.
I was a big fan of eLua, although at the time I had no opportunity to use it.
Is the project dead/asleep nowadays?
It's in (low) maintenance-only mode, not actively developed anymore. I am working with Lua (5.3) in a different embedded project, but one which is not backward compatible with eLua. It targets MCUs with more resources, but it's better in every other way I can think of. I might be able to open source it one day.
A target I am interested in is the new Raspberry Pi Pico. The MCU is
a Cortex M0+ with no FP support, and with 260 KB on-chip RAM.
260 KB of on-chip RAM is good enough for a lot of use cases. The project I mentioned above starts vanilla Lua 5.3 with some supporting Lua code (including a coroutine scheduler) in ~70 KB of RAM, IIRC. You should have enough room to play with Lua in the remaining RAM. The lack of an FPU will impact your speed, but should have little to no impact on RAM usage. And running vanilla Lua has obvious benefits, although I do wish the Lua team would look more closely at MCU-specific use cases. Maintaining patches across various Lua versions can be a very frustrating experience. At the same time, I understand that MCU-specific use cases are sometimes at odds with Lua's excellent cross-platform portability.
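For readers unfamiliar with the idea, a cooperative coroutine scheduler like the one mentioned above can be tiny in plain Lua. This is only an illustrative sketch (the names `spawn` and `run` and the round-robin policy are invented here, not taken from the project being described):

```lua
-- Minimal round-robin cooperative scheduler using stock Lua coroutines.
-- Each task is a coroutine that calls coroutine.yield() to give up the CPU.
local tasks = {}

-- Register a function as a new task.
local function spawn(fn)
  table.insert(tasks, coroutine.create(fn))
end

-- Resume every live task in turn until all of them have finished.
local function run()
  while #tasks > 0 do
    for i = #tasks, 1, -1 do
      local co = tasks[i]
      local ok, err = coroutine.resume(co)
      if coroutine.status(co) == "dead" then
        table.remove(tasks, i)       -- task finished (or errored): drop it
        if not ok then error(err) end
      end
    end
  end
end

-- Usage: two tasks whose iterations interleave because each yields per step.
spawn(function()
  for i = 1, 3 do print("task A", i); coroutine.yield() end
end)
spawn(function()
  for i = 1, 3 do print("task B", i); coroutine.yield() end
end)
run()
```

A real embedded scheduler would add blocking on timers or I/O events instead of busy round-robin, but the core mechanism (resume/yield over a task table) is the same.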
The Raspberry foundation proposes MicroPython as an on-board scripting
option. I am certain that Lua aficionados would love to see Lua as an
alternative option for this nice board!
A recent Lua (5.3+) might be a bit big or RAM-hungry for many
Arduino-like devices, but I think it would be the best fit for the Pico.
I think so too. While targeting smaller MCUs has been my goal for a long time, that goal doesn't seem that useful anymore. More and more MCUs have more than 200K of on-chip SRAM; the one I'm using right now has 512K, and I've seen parts with 1M (I think they use one of those on the Arduino Portenta). The transition to parts with larger RAM isn't very fast, but it's happening, and it will likely be accelerated by the current "Edge AI on MCUs" trend. Things are looking good.