I was primarily thinking of readability and comprehension, not representation. If you are receiving a support request or error report, and all the supporting documentation uses characters that make no sense to you, you may have great difficulty interpreting that report.
Eddy Vluggen wrote: "We are, since we're no longer limited to ASCII."
And: the alternative to UTF-16 (which is hardly used at all in files) is UTF-8, not ASCII. In the Windows world you may still see some 8859-x (with x given by the language version of 16-bit Windows), but to see 7-bit ASCII you must go to legacy *nix applications. Some old *nix-based software and old compilers may still be limited to ASCII - I have had .ini files that did not even allow 8859-1 in comments! But you must of course be prepared for 8859 when you read plain text files from an arbitrary source (and ASCII is the lower half of 8859).
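To illustrate that last point (a quick sketch in Python; the sample strings are made up): every 7-bit ASCII byte decodes identically under ASCII, 8859-1, and UTF-8, but an 8859-1 byte above 0x7F is not, on its own, valid UTF-8:

```python
# Pure ASCII is the common subset: same bytes, same meaning in all three.
ascii_bytes = b"plain ASCII text"
assert (ascii_bytes.decode("ascii")
        == ascii_bytes.decode("latin-1")   # "latin-1" is ISO 8859-1
        == ascii_bytes.decode("utf-8"))

# Bytes above 0x7F are where the encodings diverge.
latin1_bytes = "caf\u00e9".encode("latin-1")   # b'caf\xe9'
assert latin1_bytes.decode("latin-1") == "caf\u00e9"
try:
    latin1_bytes.decode("utf-8")   # a lone 0xE9 is malformed UTF-8
except UnicodeDecodeError:
    print("not valid UTF-8")
```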
"you should save in ISO, but display nicely in the format that the user has set as his preference in Windows"

Then we are talking about not reading a text representation as a text file, but about using an interpreter program to present the information - just as you would with a binary-format file.
"Ehr.. no. You could have ASCII in binary, with a completely useless date format."

I am not getting this "ASCII in binary". Lots of *nix files with binary data use the Unix epoch to store date and time. If your data is primarily intended for the Windows market, you might choose to store it as 100 ns ticks since 1601-01-01T00:00:00Z - then you can use standard Windows functions to present it in any format. Conversion to the Unix epoch is one subtraction and one division. If you insist on ISO 8601 character format, you may store it in any encoding you want, all the way down to 5-bit Baudot code.
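The conversion really is that cheap. A minimal sketch in Python, assuming whole-second tick values (11644473600 is the number of seconds between 1601-01-01 and 1970-01-01):

```python
# Seconds between the Windows FILETIME epoch (1601-01-01T00:00:00Z)
# and the Unix epoch (1970-01-01T00:00:00Z).
EPOCH_DIFF_SECONDS = 11_644_473_600
TICKS_PER_SECOND = 10_000_000          # FILETIME counts 100 ns ticks

def filetime_to_unix(ticks: int) -> int:
    """One subtraction, one division."""
    return (ticks - EPOCH_DIFF_SECONDS * TICKS_PER_SECOND) // TICKS_PER_SECOND

# 2000-01-01T00:00:00Z as FILETIME ticks:
print(filetime_to_unix(125_911_584_000_000_000))   # -> 946684800 (Unix time)
```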
"You started with a wheel, now you're also including a dashboard and brakes."

Did you ever roll snowballs to make a snowman when you were a kid?
"I have no idea what you are trying to say"

One major point is that binary data file formats, as opposed to character representations, are underestimated; most programmers are stuck in the *nix style of representing all sorts of data in a character format where a binary format would be more suitable. (The same goes for network protocols!) I am surprised that you haven't discovered that point.
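As a concrete illustration of the trade-off (the record layout here is hypothetical): a fixed-layout binary record is read and written with no lexical scanning at all, while the equivalent text line must be split and converted field by field on every read. A sketch using Python's struct module:

```python
import struct

# Hypothetical record: 32-bit id, 64-bit Unix timestamp, little-endian.
RECORD = struct.Struct("<Iq")

packed = RECORD.pack(42, 946684800)
print(len(packed))                 # always 12 bytes, fixed width

ident, timestamp = RECORD.unpack(packed)   # no parsing, no validation pass

# The text equivalent is variable-width and must be parsed on read.
text = f"{ident},{timestamp}\n"
fields = text.rstrip().split(",")
assert (int(fields[0]), int(fields[1])) == (ident, timestamp)
```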