Fri 12 May 2017 10:07:50 PM UTC, comment #1:
Sorry for the confusion, but this patch is wrong for two reasons:
1. The idea to "consistently render as U+002D across all output devices" is ill-defined. There is no such thing as a U+002D (or ASCII HYPHEN-MINUS) output glyph on some devices, namely on -Tps and -Tpdf. The PostScript TR font has glyphs for "hyphen", "minus", and "endash", but not for "hyphen-minus" (this, like the mappings in point 2, can be checked against an installed groff; see the sketch after point 2).
2. Unicode is not a superset of everything, in the following sense: if two glyphs represent the same Unicode character, that doesn't imply that they are the same glyph. So it may well make sense to have two input characters that map to the same Unicode character, but to different output glyphs. In particular, Doug McIlroy pointed out the following long-established definition:
input : Unicode : output
+     : U+002B  : plus  (normal font, e.g. for running text)
\(pl  : U+002B  : plus  (special font, e.g. for mathematics)
\-    : U+2212  : minus (normal font, e.g. for running text)
\(mi  : U+2212  : minus (special font, e.g. for mathematics)
So redefining
\- : U+002D : hyphen-minus # NO!
is sadly not an option because it would break lots of existing non-manual-page documents.
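Both points can be checked against an installed groff. The commands below are only a sanity-check sketch: the font path shown is a typical location but varies by installation, and the exact output depends on the groff version. The first command lists the hyphen ("hy"), endash ("en"), and minus ("\-") entries of the PostScript TR font description (there is no hyphen-minus entry to list); the second renders the four inputs from the table on the UTF-8 device, with no macro package loaded, so no remapping is in effect:

awk '$1 == "hy" || $1 == "en" || $1 == "\\-"' /usr/share/groff/current/font/devps/TR

groff -Tutf8 <<'EOF' | grep x | od -c
x+x  x\(plx  x\-x  x\(mix
EOF

In the od output, \- and \(mi should appear as the three UTF-8 bytes of U+2212 (342 210 222 in octal) rather than as a single 055 (0x2D) byte.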
So the problem cannot be solved, at least not without defining a completely new input character. But defining
\(hm : U+002D : hyphen-minus # NO!
would be so ugly that it couldn't reasonably be recommended for manual pages. So the problem has no complete solution, and the current workaround of using \- in manual pages and rendering it as U+002D only in manual-page UTF-8 output is already the best possible compromise.
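For what it's worth, that workaround amounts to a line or two of roff. The following is only a sketch of the idea, not the actual code in groff's macro packages, and where such a line belongs (an.tmac, a man.local site file, or elsewhere) is a separate packaging question:

.\" Only on the utf8 output device: make \- produce the glyph at
.\" output code 45, i.e. ASCII hyphen-minus, so that copy-and-pasted
.\" options and file names keep working.  Putting this in the man
.\" macros is what limits the remapping to manual pages.
.if '\*(.T'utf8' .char \- \N'45'

On -Tps and -Tpdf the condition is false, so \- keeps producing a real minus glyph and existing non-manual-page documents are unaffected.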
So please close this bug report as invalid.