jgharston wrote: Even if INF files don't make sense, either now or then, it's been around for more than 25 years and any system that doesn't support them will have the failing that it doesn't support them.
Agreed. As I said:
crj wrote: I mean yes, if it's become a de facto standard it has to be supported for interface purposes
It looks like supporting them is a bit of a headache, unfortunately. Coming to the party late, things I'm unclear on include:
- How do you transform the base filename into the INF filename?
  - Do you replace any existing extension, or append another?
  - If replacing, what happens to files with multiple extensions?
  - If appending, what happens if the name becomes too long, e.g. on an 8.3 filesystem?
- On a VFAT or similar filesystem, do you transform the 8.3 name or the long name?
- If the base filename is 8.3 but the .INF has a long filename, do you derive the case of characters in the long filename from the base filename or the Acorn filename?
- How do you handle collisions when either you or some other software has already created a file with that name?
- What do you do if you have no permission to create the .INF file? (This is a particular problem if you want to provide Acorn attributes for a directory but are not allowed to create files in its parent.)
- How do you transform an INF filename into the corresponding base filename?
- If presenting a directory to an Acorn system, what are you expected to do with files that have no corresponding .INF? What about .INFs with no corresponding base file?
- Are we guaranteed that all characters in the host filename for anything with a .INF will be 7-bit ASCII?
- Do we pair up files with their .INF case-sensitively or insensitively? Or does this vary according to the host filesystem?
- How are top-bit-set characters represented in the filename field of a .INF file? (As-is? UTF-8 encoded? UTF-16 encoded? Transcoded? Prohibited?)
- Am I correct that load/exec addresses should be sign-extended from 24 bits to 32 bits if and only if they are represented by six hex characters in the .INF?
- What heuristic should be used to determine whether a .INF file is a regular Windows INF or Acorn attributes? (Unfortunately, square brackets are legal in Acorn filenames...)
- What heuristic should be used to determine whether each successive field in the .INF is usable or a non-"standard" variant?
- Does anything require, expect, or even use, CRCs in .INF files? If I modify the file, do I ignore the CRC, strip it, or update it? (If a CRC is present, is it guaranteed to be the final entry on the first line of the .INF? Is it guaranteed to be denoted by exactly four hex characters even if leading nibbles happen to be zero?)
- If you modify the Acorn attributes, do you remove any tail of line 1 you don't understand, or do you preserve it verbatim?
  - What about subsequent lines of a multi-line file?
- Are line-end characters prohibited, optional or mandatory? If editing the attributes should this be canonicalised or left as-is?
- Is a DOS EOF character (ASCII 26) prohibited, optional or mandatory? If editing the attributes should this be canonicalised or left as-is?
- How do you decide when to modify an existing .INF file v. when to replace it?
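To make the sign-extension question concrete, here's a sketch (in Python, purely illustrative; the function name and the "six hex digits implies sign-extend" rule are my reading of the convention, not a confirmed spec):

```python
def parse_inf_address(field: str) -> int:
    """Parse a load/exec address field from a .INF first line.

    Assumption: a six-hex-digit value is sign-extended from 24 to
    32 bits, so "FF1900" becomes 0xFFFF1900, whereas an eight-digit
    value such as "FFFF1900" or "00001900" is taken literally.
    """
    value = int(field, 16)
    if len(field) == 6 and value & 0x800000:
        value |= 0xFF000000  # propagate bit 23 into the top byte
    return value
```

If that rule is right, `parse_inf_address("FF1900")` and `parse_inf_address("FFFF1900")` agree, while `"001900"` stays as 0x1900. Whether four- or five-digit fields ever occur in the wild is another open question.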
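And on the Windows-INF-versus-Acorn-attributes question, the best I can come up with is a heuristic like the one below (again Python, again only a sketch; since square brackets are legal in Acorn filenames, it falls back on checking whether the second field parses as a hex address, which can still misfire):

```python
import re

def looks_like_acorn_inf(first_line: str) -> bool:
    """Guess whether a .INF first line carries Acorn attributes.

    Heuristic only: an Acorn attribute line is a filename token
    followed by at least one hex load address, whereas a Windows INF
    typically opens with a ';' comment or a [Section] header. We
    can't reject leading '[' outright (legal in Acorn names), so the
    hex-field check acts as the tiebreaker.
    """
    fields = first_line.split()
    if len(fields) < 2:
        return False
    return re.fullmatch(r"[0-9A-Fa-f]{4,8}", fields[1]) is not None
```

That classifies `$.ELITE FF1900 FF8023` as Acorn and `[Version]` as Windows, but a pathological Acorn filename could still defeat it, which is rather my point.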
(For background, one reason I'm so careful about this kind of thing, especially as pertains to Murphy's Law, is that in a past life I worked on software that built a database for a directory hierarchy full of music files. Making sure that you could prepare that database on any of VFAT, NTFS, HFS or ext3, under any of Windows, MacOS or Linux, then use it seamlessly under any of the others was... a headache.)