lua-users home
lua-l archive


2016-09-28 8:19 GMT+02:00 Dane Springmeyer <>:

> The program I'm seeing this with is osrm-backend, a tool used to process
> the 35 GB of data into a graph structure for routing.
> Testing is being done on r3.8xlarge amazon instances running ubuntu trusty
> with 32 vCPU and 244 GB ram.

This is a seriously large amount of data and a seriously large machine.
Factors O(log(N)) in algorithm times can no longer be pooh-poohed.

The only place I can think of is garbage collection. There were a few
threads on that topic in the past six months or so. While you wait for the
New World's day to start, when the real experts can chip in, maybe you
could experiment with the settings available in collectgarbage().
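As a starting point, a minimal sketch of the kind of experiment meant here,
using the standard collectgarbage() options ("setpause", "setstepmul",
"stop", "collect", "count"). The numeric values are illustrative guesses,
not recommendations tuned for this workload:

```lua
-- Defaults are pause=200, stepmul=200; larger values make the
-- collector lazier / more aggressive per step respectively.

-- Wait longer before starting a new cycle (percentage of heap
-- growth since the last collection: 400 = start after it doubles twice).
collectgarbage("setpause", 400)

-- Do more collection work per allocation unit in each incremental step.
collectgarbage("setstepmul", 300)

-- "count" returns the Lua heap size in kilobytes.
print(("Lua heap in use: %.1f MB"):format(collectgarbage("count") / 1024))

-- For a batch job like this, it may be cheaper to disable automatic
-- collection during an allocation-heavy phase and force full cycles
-- at known quiet points:
collectgarbage("stop")
-- ... allocation-heavy processing ...
collectgarbage("collect")   -- one full cycle
collectgarbage("restart")   -- resume automatic collection
```

Timing a fixed workload under a few pause/stepmul combinations (and with
collection stopped entirely, if memory permits) should show quickly whether
the collector is the bottleneck.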