- Subject: Re: A stable sort
- From: Dirk Laurie <dirk.laurie@...>
- Date: Fri, 10 May 2013 18:45:45 +0200
> There are two possible strategies:
> 1. Top-down, bisecting each time, with recursion.
> 2. Bottom-up, doubling each time, with iteration.
> (chunk size 12 for Strategy 1, 16 for Strategy 2 — but there
> is no reason why it must be a power of 2, actually).
> There is still more tuning possible, e.g. size-dependent minimum
> chunk length a la Timsort, or implementing the supervising routine
> in C too.
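For concreteness, here is a minimal sketch of the two supervising strategies
(illustrative only, not the code under discussion: the insertion-sorted base
chunks and the chunk-size tuning are left out, and merge is just a plain
buffered stable merge over 1-based inclusive ranges):

local function merge(t, lo, mid, hi)
  -- stably merge the sorted ranges t[lo..mid] and t[mid+1..hi] in place
  local buf = {}
  for i = lo, mid do buf[#buf+1] = t[i] end
  local m = #buf
  local i, j, k = 1, mid + 1, lo
  while i <= m and j <= hi do
    if t[j] < buf[i] then            -- strict <, so equal keys keep their order
      t[k] = t[j]; j = j + 1
    else
      t[k] = buf[i]; i = i + 1
    end
    k = k + 1
  end
  while i <= m do t[k] = buf[i]; i, k = i + 1, k + 1 end
end

-- Strategy 1: top-down, bisecting each time, with recursion
local function sort1(t, lo, hi)
  if hi - lo < 1 then return end
  local mid = math.floor((lo + hi) / 2)
  sort1(t, lo, mid)
  sort1(t, mid + 1, hi)
  merge(t, lo, mid, hi)
end

-- Strategy 2: bottom-up, doubling each time, with iteration
local function sort2(t, n)
  local width = 1
  while width < n do
    for lo = 1, n - width, 2 * width do
      merge(t, lo, lo + width - 1, math.min(lo + 2 * width - 1, n))
    end
    width = 2 * width
  end
end

Either one can be called on a whole table, e.g. sort2(t, #t). Note how the last
merge in each pass of sort2 may get a short right-hand piece, which is exactly
the unevenness discussed below.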
A size-dependent chunk length makes a difference to Strategy 2, but not to
Strategy 1. It's not hard to understand why: Strategy 1 merges almost
equal-sized chunks all the time, while Strategy 2 might have small pieces
left over.
-- halve, rounding up, until the chunk size is no larger than too_big
chunk_size = n
while chunk_size > too_big do chunk_size = math.ceil(chunk_size/2) end
This way, the number of chunks in Strategy 2 is always a power of 2.
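A worked example (using the too_big = 16 threshold quoted above for Strategy 2):

local n, too_big = 100, 16
local chunk_size = n
while chunk_size > too_big do chunk_size = math.ceil(chunk_size/2) end
print(chunk_size, math.ceil(n/chunk_size))   --> 13      8

Here chunk_size halves as 100 -> 50 -> 25 -> 13, i.e. three halvings, and the
100 elements fall into ceil(100/13) = 8 = 2^3 chunks.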