- Subject: Re: Could Lua itself become UTF8-aware?
- From: Jay Carlson <nop@...>
- Date: Sun, 7 May 2017 21:54:28 -0400
On May 2, 2017, at 8:43 AM, Paul Merrell <email@example.com> wrote:
> The lion's share of the TTF fonts out there have extremely
> limited UTF-8 support.
> A few free fonts look good and have pretty broad UTF-8 support
> (both are available in most (all?) Linux package management systems).
> * Arial (one of the Microsoft core fonts); 
Sadly, Arial Unicode MS is not in the core fonts. :-(
Google’s Noto Sans (and now Noto Serif) are font families with the intent of “no missing glyphs, ever.”
> * Deja Vu font family (open source) 
"U+4e00 CJK Unified Ideographs (0/0) (0/0) (0/0)"
There are 0 out of 0 CJK Unified Ideographs? Meaning they didn’t want all the missing Han glyphs to count against their coverage ratio, so they declared the Han glyphs out of scope. Adding insult to injury :-) , the precomposed Korean is right out too:
"U+ac00 Hangul Syllables (0/0) (0/0) (0/0)"
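To make the "(0/0)" notation concrete, here is a minimal sketch (my own hypothetical model, not DejaVu's actual tooling) of how an out-of-scope block produces a 0/0 ratio while an in-scope block with no glyphs would report 0 out of the block's full size:

```python
# Hypothetical model of per-block glyph coverage reporting.
# Block ranges are the standard Unicode ranges for the two blocks
# quoted in the status lines above.
BLOCKS = {
    "CJK Unified Ideographs": range(0x4E00, 0x9FFF + 1),
    "Hangul Syllables": range(0xAC00, 0xD7A3 + 1),
}

def coverage(supported, block, in_scope=True):
    """Return (glyphs present, glyphs desired) for one Unicode block.

    `supported` is the set of codepoints the font actually has glyphs for.
    Declaring a block out of scope zeroes the denominator, so missing
    glyphs there never count against the coverage ratio.
    """
    if not in_scope:
        return (0, 0)
    present = sum(1 for cp in block if cp in supported)
    return (present, len(block))

# A font with no Han glyphs, Han declared out of scope: reports (0, 0).
print(coverage(set(), BLOCKS["CJK Unified Ideographs"], in_scope=False))
# The same font with Han in scope would report 0 of 20992.
print(coverage(set(), BLOCKS["CJK Unified Ideographs"], in_scope=True))
```

The design choice being criticized is exactly the `in_scope=False` branch: the denominator vanishes along with the missing glyphs.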
But these days you don’t need to have a single font file with coverage of everything (like Arial Unicode). Font substitution really does seem to work.
Perhaps we should gauge progress by how many of the scripts at the bottom of the English Wikipedia home page are missing.