- Subject: Re: Load large amount of data fast
- From: Luis Carvalho <lexcarvalho@...>
- Date: Sun, 17 Oct 2010 10:35:12 -0400
> > I'm almost sure I'm missing something here, but since each line is a table, is
> > there anything wrong with this?
>
> > local load_huge_table_list = function (filename)
> >   local f = assert(io.open(filename))
> >   local s = "return {" .. f:read"*a" .. "}"
> >   f:close()
> >   return assert(loadstring(s))()
> > end
>
> There is a limit on the number of constants per chunk; you'll hit it.
Ok, you'd get a constant table overflow, since every literal in the file
would land in a single chunk's constant table. Simple modification:
compile each line as its own chunk (a sketch that reproduces the
overflow follows the code):
local chunker = function (s)
  return assert(loadstring("return " .. s))()
end

local load_huge_table_list = function (filename)
  local result, n = {}, 1
  for line in io.lines(filename) do
    result[n] = chunker(line)  -- compile and run each line as its own chunk
    n = n + 1
  end
  return result
end
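For reference, here's a rough way to reproduce the overflow. It's just a
sketch assuming Lua 5.1, where a function's constant table is capped at
2^18 - 1 entries (the exact limit varies across versions):

local parts = {}
for i = 1, 300000 do
  parts[i] = tostring(i)  -- each distinct number is a distinct constant
end
local s = "return {" .. table.concat(parts, ",") .. "}"
print(loadstring(s))  --> nil  [string "..."]: constant table overflow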
And here's a kickstart if you really want to parse the file:
require "lpeg"
local P = ("{" * (1 - lpeg.S"{}") ^ 0 * "}")
local T = (1 - P) ^ 0 * (P / chunker)
local load_huge_table_list = function (filename)
local f = assert(io.open(filename))
local s = f:read"*a"
f:close()
return lpeg.match(lpeg.Ct(T ^ 0), s)
end
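One caveat: P only matches flat tables, so braces nested inside a line
(or inside a string) would need a real grammar via lpeg.V. A quick
check, with a made-up file name:

-- hypothetical usage: suppose data.txt holds, one table per line,
--   {1, 2, "foo"}
--   {x = 3, y = 4}
local list = load_huge_table_list("data.txt")
print(#list, list[1][3], list[2].x)  --> 2  foo  3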
Cheers,
Luis
--
Computers are useless. They can only give you answers.
-- Pablo Picasso
--
Luis Carvalho (Kozure)
lua -e 'print((("lexcarvalho@NO.gmail.SPAM.com"):gsub("(%u+%.)","")))'