lua-l archive


I have released version 2.0 of my JSON module.


dkjson aims to be a strict implementation of the JSON standard, using
the original JavaScript implementation as a reference.
(<>, json2.js)

It is written in pure Lua with no dependencies on external modules,
although it can optionally be accelerated using LPeg.

Changes since version 1.0 are:

- Optional LPeg support.

- Invalid input data for encoding raises an error instead of returning
  nil and an error message. (Invalid data for encoding is usually a
  programming error; raising an error removes the need to explicitly
  check the result.)

- The metatable field __jsontype can control whether a Lua table is
  encoded as a JSON array or object. (Mainly useful for empty tables).

- When decoding, two metatables are created: one is used to mark
  arrays, the other to mark objects. (The metatables are created once
  per decoding operation to make sandboxing possible. However, you can
  specify your own metatables as arguments.)

- Encoding no longer adds spaces to the output.

- It is possible to explicitly sort keys for encoding by providing an
  array of key names via the option "keyorder" or the corresponding
  metatable field.
- The values NaN/+Inf/-Inf are recognised and encoded as "null", as in
  the original JavaScript implementation.
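A plain Lua table cannot distinguish an empty array from an empty
object on its own, which is the ambiguity the __jsontype field
resolves. The following is only an illustrative sketch of the idea, not
dkjson's actual code:

```lua
-- Sketch: decide which JSON container type a Lua table should become.
-- A __jsontype metatable field, when present, overrides the guess.
local function jsontype(t)
  local mt = getmetatable(t)
  if mt and mt.__jsontype then
    return mt.__jsontype
  end
  -- Without the hint, a non-empty sequence looks like an array and
  -- anything else like an object; an empty table is ambiguous.
  return #t > 0 and "array" or "object"
end

-- An empty table explicitly tagged as an array:
local empty_array = setmetatable({}, { __jsontype = "array" })

print(jsontype(empty_array))  --> array
print(jsontype({}))           --> object
print(jsontype({1, 2, 3}))    --> array
```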
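The effect of the "keyorder" option can be sketched in plain Lua like
this (ordered_keys is a hypothetical helper for illustration, not
dkjson's internals): keys named in the order array come first, in that
order, and any remaining keys follow.

```lua
-- Sketch: produce the key sequence an ordered encoding would use.
local function ordered_keys(t, keyorder)
  local keys, listed = {}, {}
  for _, k in ipairs(keyorder or {}) do
    if t[k] ~= nil then
      keys[#keys + 1] = k
      listed[k] = true
    end
  end
  -- Collect and sort the remaining keys for a deterministic result.
  local rest = {}
  for k in pairs(t) do
    if not listed[k] then rest[#rest + 1] = k end
  end
  table.sort(rest)
  for _, k in ipairs(rest) do keys[#keys + 1] = k end
  return keys
end

local t = { city = "Oslo", name = "Ada", id = 1 }
local keys = ordered_keys(t, { "id", "name" })
print(table.concat(keys, ","))  --> id,name,city
```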


From a security point of view I'm a bit sceptical of duplicating the
same functionality once in pure Lua and once in LPeg, as there are now
two implementations that have to be checked for bugs. The significant
performance increase for decoding convinced me to keep the LPeg version
included, though. (And I still want my module to be independent of LPeg.)

I had to cheat a bit to allow deep recursion in LPeg and to avoid the
"too many pending calls/choices" error. I am using match-time captures
with Lua functions that do the recursion by calling lpeg.match again.
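The trick can be sketched with a toy grammar for nested parentheses.
This is only an illustration of the technique, not dkjson's actual
grammar, and note the trade-off: recursion now consumes the Lua/C call
stack instead of LPeg's internal call/choice stack.

```lua
local lpeg = require "lpeg"

local value  -- forward declaration, assigned below

-- Match-time capture: consume one character (so the loop below can
-- never match the empty string), then restart lpeg.match on the full
-- pattern from that character. The recursion happens in Lua functions,
-- not inside the LPeg grammar.
local inner = lpeg.Cmt(lpeg.P(1), function(subject, pos)
  return lpeg.match(value, subject, pos - 1)
end)

-- A "value" is a lowercase letter or a parenthesised list of values.
value = lpeg.R("az") + lpeg.P"(" * inner^0 * lpeg.P")"

-- Deeply nested input of the kind that can overflow a conventional
-- recursive LPeg grammar:
local depth = 100
local subject = string.rep("(", depth) .. "x" .. string.rep(")", depth)
print(lpeg.match(value * -1, subject))  -- position after a full match
```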

I was tempted to encode Infinity (available in Lua as 'math.huge') as
'1e9999'. If such a number were parsed again as an IEEE 754-2008
floating-point number it would overflow and produce Infinity again. But
this solution could create problems when some other JSON module parses
numbers using an arbitrary-precision library, so I switched to what the
json2.js implementation does ("null"). The lack of handling for these
special numbers was the only bug I found in version 1.0 of my module.
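The overflow behaviour mentioned above is easy to check in plain Lua,
since tonumber goes through the same double conversion:

```lua
-- "1e9999" is far outside the double range, so the conversion
-- overflows to Infinity (math.huge):
print(tonumber("1e9999") == math.huge)    --> true
print(tonumber("-1e9999") == -math.huge)  --> true

-- NaN, the other special value, is the only value unequal to itself:
local nan = 0 / 0
print(nan ~= nan)                         --> true
```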

I hope I didn't miss any bugs for version 2.0 even though it got a bit
more complex. ;-)

Best regards
-- David Kolf