- Subject: Re: Load large amount of data fast
- From: Luis Carvalho <lexcarvalho@...>
- Date: Sat, 16 Oct 2010 19:47:17 -0400
> Apologies for a lazy question; I have not done my own homework.
>
> I've got a large file (3M entries, 250 MB) with data.
> Each entry is one line with a small Lua table:
>
> { foo = 1; bar = 2; baz = 'text' };
>
> (Actually, there are two different entry formats, but that does not matter.)
>
> I need to load this data fast enough. (Faster, that is, than the several
> hours my original loader has been running on LJ2; it still has not finished.)
>
> So, if you know a faster implementation than the ad-hoc, unoptimized one
> below, please share.
<snip>
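(For contrast only: I don't know what the snipped loader does, but a
hypothetical line-by-line version, with illustrative names, would pay one
loadstring() compilation per entry, i.e. 3M compilations:)
-- Hypothetical per-line loader, not the snipped original:
-- one loadstring() call, and hence one compilation, per line.
local load_line_by_line = function (filename)
  local t = {}
  for line in io.lines(filename) do
    -- the trailing ';' after each entry is legal after 'return'
    t[#t + 1] = assert(loadstring("return " .. line))()
  end
  return t
end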
I'm almost sure I'm missing something here, but since each line is a table
constructor ending in ';' (a valid field separator inside an enclosing table
constructor), is there anything wrong with this?
local load_huge_table_list = function (filename)
  local f = assert(io.open(filename))
  -- wrap the whole file in one table constructor; each line's
  -- trailing ';' then acts as a field separator
  local s = "return {" .. f:read"*a" .. "}"
  f:close()
  -- compile and run the file as a single chunk (loadstring is Lua 5.1)
  return assert(loadstring(s))()
end
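For example (filename is a placeholder):
-- Hypothetical usage; 'entries.lua' holds one '{ ... };' record per line.
local entries = load_huge_table_list("entries.lua")
print(#entries)        --> number of records loaded
print(entries[1].foo)  --> a field of the first record
The whole file is compiled as a single chunk, so loadstring() runs once
instead of once per line.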
Cheers,
Luis
--
Computers are useless. They can only give you answers.
-- Pablo Picasso
--
Luis Carvalho (Kozure)
lua -e 'print((("lexcarvalho@NO.gmail.SPAM.com"):gsub("(%u+%.)","")))'