lua-users home
lua-l archive



I'm trying to concatenate huge strings using table.concat()

local n = 7
local a = { ("a"):rep(2^30) }   -- one string of 2^30 bytes (1 GByte)
for j = 2, n do
   a[j] = a[1]                  -- n references to the same 1 GByte string
end
local b = table.concat(a)
print("GBytes used:", #b/2^30, collectgarbage("count")/2^20)

I expect table.concat to act wisely:
it should compute the length of the result, allocate that much memory once, and fill it with the concatenated content.
So concatenating n copies of a 1 GByte string should need only (n+1) GBytes of memory:
1 GByte for the source string and n GBytes for the result string.
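At a smaller scale (megabytes instead of gigabytes, so it runs anywhere) the setup and the arithmetic can be sketched like this; the sizes are my own choice for illustration:

```lua
-- Sketch of the expected (n+1)x accounting: a[2..n] are just references
-- to the same string object as a[1], so the sources cost 1x "size" total,
-- and only the result should add another n x "size".
local n = 7
local size = 2^20                 -- 1 MByte instead of 1 GByte
local a = { ("a"):rep(size) }
for j = 2, n do
   a[j] = a[1]                    -- no extra memory: same string object
end
local b = table.concat(a)
assert(#b == n * size)            -- result is n times the source length
print("MBytes used:", #b/2^20, collectgarbage("count")/2^10)
```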

Contrary to my expectation, on a Linux machine with 16 GB of RAM, n=7 works fine, but n=8 starts eating swap space.
It looks like table.concat temporarily uses twice as much memory as such a clever concatenation would actually need.

Unlike table.concat, the usual Lua concatenation operator (..) does not have this problem, so it can be used as a workaround, but it is limited by the number of VM registers.
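For a fixed, small n the workaround looks like this (again shrunk to megabytes; as far as I can tell, a single chained .. expression is performed in one pass, but all operands must occupy VM registers at once, so the chain length per expression is bounded):

```lua
-- Workaround sketch: one chained concatenation expression instead of
-- table.concat.  Each operand needs its own VM register, so this does
-- not scale to arbitrarily many pieces.
local size = 2^20                 -- 1 MByte per piece for illustration
local a1 = ("a"):rep(size)
local b = a1 .. a1 .. a1 .. a1 .. a1 .. a1 .. a1   -- n = 7 copies
assert(#b == 7 * size)
```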