On 2010-03-13, Isaac Gouy <igouy2@yahoo.com> wrote:
> Leo Razoumov <slonik.az <at> gmail.com> writes:
>  >
>  > I just noticed that the default language shootout homepage
>  > http://shootout.alioth.debian.org/
>  > links directly only to x64 quad-core benchmarks that cover only a
>  > small subset of languages and excludes Lua. To see LuaJIT results one
>  > has to click "Help" and from there choose single-core platforms and
>  > then select "all languages" in a pull-down menu before clicking "show"
>  > button. Most likely, people will miss Lua benchmarks altogether. Is it
>  > intentional?
>
> What you seem not to have noticed on the homepage is the links "LuaJIT" and
>  "Lua" - you'll find them in the Z to A list.
>
>  You'll find that they are always shown on the homepage.
>  You'll find that they always link to measurements made on either x64 one core or x86 one core.
>
>  They do that because no one's contributed programs for Lua that make use of
>  multicore.
>

Isaac,
I very much appreciate your efforts in putting the shootout page together.
As a first-time user of your resource, I can assure you that I was not
aware of which platforms besides x64 quad-core were part of your
benchmark. A typical first-time user such as myself clicks on the
"compare performance" link and gets a self-contained page with results
for a subset of languages on a single platform. There is no clear
indication of what else to expect. After poking around for half an hour
I finally found what I was looking for, mostly because I knew from
earlier postings to this list what I was after.

I am afraid that many first-time users would not show such patience
and would leave your resource after just skimming its surface. I admire
your effort, but the web interface is not designed for the
"uninitiated".

BTW, I love your box plots, quantiles, and ranking by the median. Those
are substantially more accurate statistics than the typical mean-value
comparisons in other benchmarks.
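
To illustrate what I mean, here is a toy sketch of my own in Lua (not
code from the shootout, and the timings are made up): one slow outlier
run drags the mean way up while the median barely moves.

-- Toy comparison of mean vs. median over hypothetical run times.
local function mean(t)
  local sum = 0
  for _, v in ipairs(t) do sum = sum + v end
  return sum / #t
end

local function median(t)
  local s = {}
  for i, v in ipairs(t) do s[i] = v end
  table.sort(s)
  local n = #s
  if n % 2 == 1 then
    return s[(n + 1) / 2]
  end
  return (s[n / 2] + s[n / 2 + 1]) / 2
end

-- Five hypothetical timings in seconds; the last run hit a slow outlier.
local runs = { 1.02, 1.05, 1.01, 1.03, 4.90 }
print(string.format("mean   = %.2f", mean(runs)))    -- 1.80, skewed by the outlier
print(string.format("median = %.2f", median(runs)))  -- 1.03, barely affected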

--Leo--