
There is a great deal of both misunderstanding and ignorant Microsoft bashing in this comment.

First of all, you are mixing up two completely different concepts.

For character encoding on Windows: many functions in the Windows API come in two versions, one with an A (for ANSI) suffix and one with a W (for wide). This was added to make it easier to support Win32 on both Windows 95, which used 8-bit characters and codepages, and Windows NT, which was natively UTF-16 Unicode. At the time, UTF-16 was considered the best and most standard choice for supporting Unicode. In most cases the W function contains the real implementation and the A function is little more than a wrapper around it.
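
To make that concrete, here is a rough sketch of the pattern (not actual Windows source; SetMyLabelA/SetMyLabelW are made-up names): the A wrapper converts its string argument from the current ANSI codepage to UTF-16 and delegates to the W version.

    #include <windows.h>

    /* The W version holds the real implementation. */
    BOOL WINAPI SetMyLabelW(HWND hwnd, const WCHAR *label);

    /* The A version converts from the current ANSI codepage (CP_ACP)
       to UTF-16, then delegates to the W version. */
    BOOL WINAPI SetMyLabelA(HWND hwnd, const char *label)
    {
        WCHAR wide[256];
        if (!MultiByteToWideChar(CP_ACP, 0, label, -1, wide, 256))
            return FALSE;
        return SetMyLabelW(hwnd, wide);
    }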

This has nothing to do with what Raymond is describing.

For the 64/32-bit stuff: they ensured that all the code would compile and work correctly as both 32-bit and 64-bit, and built two versions of the OS, one for ia32 and one for amd64. The kernel had to be modified to support the amd64 architecture. This is exactly what Linux, OS X, and other operating systems that support multiple architectures do. On top of that, because amd64 is backwards compatible, they also included an ia32 environment, but it is optional, so nothing that ships with the OS can depend on it. I assume this is what OS X does too; the only difference is that the two Windows versions ship as different SKUs, while Mac OS X ships with both versions and installs the one that the computer originally shipped with.
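
"Compiles and works correctly as both 32-bit and 64-bit" mostly means hunting down code that bakes in a pointer size. A toy illustration (my own, not Microsoft's code):

    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        int x = 42;
        /* Broken on amd64: int stays 32 bits while pointers grow
           to 64, so the cast truncates.
           int addr = (int)&x; */
        uintptr_t addr = (uintptr_t)&x;  /* pointer-sized on both ia32 and amd64 */
        printf("%u-bit pointers\n", (unsigned)(sizeof(addr) * 8));
        return 0;
    }

On Windows the same fix is usually spelled with the INT_PTR/UINT_PTR/DWORD_PTR types instead of uintptr_t.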

Second, the number of system calls has nothing to do with any of this at all.
