- Subject: Re: Hash Table Collisions (n.runs-SA-2011.004)
- From: Leo Razoumov <slonik.az@...>
- Date: Tue, 3 Jan 2012 09:53:38 -0500
On Mon, Jan 2, 2012 at 09:54, David Kolf <kolf@gmx.de> wrote:
> Leo Razoumov wrote:
>>
>> On Sun, Jan 1, 2012 at 12:48, Vladimir Protasov<eoranged@ya.ru> wrote:
>>>
>>> To avoid the problem, we should just generate a random salt at Lua
>>> startup, then use it during hash generation. That would prevent an
>>> attacker from guessing which values will be placed in the same bucket.
>>
>>
>> Salt would not help if one keeps ignoring characters the way Lua does.
>> Two strings that differ only in those characters that are ignored by
>> the hash function still hash to the same value.
>
>
> I considered picking at random the 16 to 32 characters that aren't
> ignored, too. But then I noticed that an attacker only needs to find 2 or 3
> characters that aren't used for the hash in order to generate enough
> collisions to cause trouble -- the probability of hitting ignored chars at
> random is way too high.
>
> And Lua has to skip chars, or else an operation like file:read("*a") would
> be a problem for big files -- every new string is hashed when it is
> interned.
>
> So I am afraid that Lua still needs to raise an error when there is an
> extreme number of collisions (as Mark suggested).
>
> -- David
>
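
For concreteness, here is a sketch of the sampling hash under discussion,
modeled on Lua 5.1's string hash in lstring.c (which mixes in at most ~32
characters of a long string). The `seed` parameter stands in for the
proposed startup salt and is my addition for illustration; Lua 5.1 itself
seeds with the string length:

#include <cstddef>
#include <cstdio>
#include <string>

// Sampling hash modeled on Lua 5.1's lstring.c: long strings are
// hashed with a stride, so most character positions never feed `h`.
unsigned int sample_hash(const std::string &s, unsigned int seed) {
    unsigned int h = seed ^ (unsigned int)s.size();
    std::size_t step = (s.size() >> 5) + 1;  // stride > 1 for strings > 31 chars
    for (std::size_t l1 = s.size(); l1 >= step; l1 -= step)
        h ^= (h << 5) + (h >> 2) + (unsigned char)s[l1 - 1];
    return h;
}

int main() {
    // 64-char strings => step == 3: only indices 3, 6, ..., 63 are
    // sampled, so indices 0 and 1 can be varied freely.
    std::string a(64, 'x'), b(64, 'x');
    b[0] = 'y'; b[1] = 'z';
    std::printf("%u %u\n", sample_hash(a, 0x9e3779b9u),
                           sample_hash(b, 0x9e3779b9u));  // same value twice
    return 0;
}

The two strings collide for every value of `seed`, which is exactly why a
salt alone cannot fix the skipped-character problem.
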
Alternatively, as I suggested in an earlier post to this thread, you
can resolve collisions with a self-balancing tree such as an RB-tree
instead of a list once a chain grows too large. Most of the time the
current Lua hash-table implementation is in effect, but it switches to
RB-tree buckets when list performance gets into trouble (e.g. when
under attack). An RB-tree gives you O(log(N)) lookups, which is
sufficient to defeat the attack. A sketch of the idea follows.
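
Here is a minimal sketch of such a hybrid bucket in C++ (std::multimap is
typically implemented as a red-black tree). The names, the threshold
value, and the string-to-int payload are all illustrative, not taken from
any actual Lua patch:

#include <cstddef>
#include <list>
#include <map>
#include <string>
#include <utility>

// Hybrid bucket: starts as a plain chain, converts to a balanced tree
// once the chain grows past a threshold (e.g. under collision attack).
struct Bucket {
    static constexpr std::size_t kTreeThreshold = 8;  // illustrative cutoff
    std::list<std::pair<std::string, int>> chain;
    std::multimap<std::string, int> tree;
    bool treed = false;

    void insert(const std::string &key, int value) {
        if (treed) { tree.emplace(key, value); return; }
        chain.emplace_back(key, value);
        if (chain.size() > kTreeThreshold) {     // chain too long: switch
            tree.insert(chain.begin(), chain.end());
            chain.clear();
            treed = true;                        // O(log(N)) from here on
        }
    }

    const int *find(const std::string &key) const {
        if (treed) {
            auto it = tree.find(key);            // O(log(N)) even when attacked
            return it == tree.end() ? nullptr : &it->second;
        }
        for (const auto &kv : chain)             // short chains: linear is fine
            if (kv.first == key) return &kv.second;
        return nullptr;
    }
};

The threshold means honest workloads almost never pay the tree's constant
factor, while an attacker who stuffs one bucket only buys O(log(N))
lookups instead of O(N).
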
--Leo--