The "metric" is always relevant. Your confusion comes from the fact that you're thinking about whether those errors are reflected in a user's experience. If you think about the concept of RAM coupled with the fact that desktops are most likely running every little piece of software available (OS, a browser with several windows/tabs open, IM programs, a game, etc.) you'll see why this is a big deal. And let's not even mention servers and machines where actually significant work is being carried out..
The author touches on the "how much memory is in use" question: all major OSes use unallocated RAM as a file cache (or equivalent), so no matter where the error happens, it is almost certain to hit something. Whether that "something" is actually relevant is another matter.
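To make the "unallocated RAM is still in use" point concrete, here is a minimal sketch of how you might check the file-cache size on a Linux box by parsing `/proc/meminfo`-style output. The sample text and the `cached_kib` helper are hypothetical illustrations, not anything from the post; on a real system you would read the actual `/proc/meminfo` instead of a hard-coded string.

```python
# Parse the "Cached" figure from /proc/meminfo-style output to see how much
# otherwise-unallocated RAM the kernel is using as file cache.

def cached_kib(meminfo_text: str) -> int:
    """Return the file-cache size in KiB from /proc/meminfo-style text."""
    for line in meminfo_text.splitlines():
        if line.startswith("Cached:"):
            # Lines look like "Cached:   8123456 kB"
            return int(line.split()[1])
    raise ValueError("no Cached: line found")

# Hypothetical sample; a real run would use open("/proc/meminfo").read()
sample = """\
MemTotal:       16303412 kB
MemFree:          512340 kB
Cached:          8123456 kB
"""

print(cached_kib(sample))  # prints 8123456 -- roughly half the RAM is cache
```

The takeaway is that "free" memory reported by naive tools is rarely idle: the kernel fills it with cached file data, so a bit flip in "unused" RAM can still corrupt something that matters.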
Those are not directly related questions; the answer to each depends on the setup of the machine, in both hardware and software, and the practical cases are far too many to enumerate in a single blog post. You are welcome to run some tests, or even do some theorycrafting, on your own systems.
On an unrelated note, I did not mean to demean desktops, but the reality is that there are orders of magnitude more devices carrying out tasks more critical than image processing or development. Embedded devices are one example.