A lesson I once learned from a mentor: attention to detail in the little things often demonstrates attention to detail in the big things.
No one really cares if you misspell a word, use the wrong tense, or make some other grammatical error. But sometimes, that's all they've got to go on to assess your attention to detail. I've seen vendors proposing multi-million dollar purchases rejected because of carelessness in their writing. And the explanation was always the same, "If he's that careless with his writing, how careless will he be with my business?"
When the division manager who has made us $90 million misspells a word in an email, no one bats an eyelash; he must have been in a hurry. But when a cover letter has errors, it really makes people wonder.
I don't care if you misspell a word or make an error. But errors require extra time to process on the reader's part. Writing that's filled with spelling and grammar errors wastes the reader's time.
I don't like having my time wasted, and so I do care about spelling, grammar, and punctuation. Similarly, I assume other people care about their time, and I try my best to write well.
Of course, the amount of care needed varies based on what you're writing. I spend more time proofreading my blog posts than this comment, and much more time proofreading the books I've worked on.
And the more permanent and public the context, the more relevant that attention to detail is as an indicator.
A few days ago, I was stopped at a red light behind a landscaping company's truck. They'd had one of those vehicle wraps applied to the body of the truck, turning the whole thing into a billboard for their company. The graphics and copy on the truck wrap contained several spelling errors, uncapitalized sentences, a bulleted list in which the bullets were not vertically aligned, and visible JPEG artifacts.
Now imagine what your lawn might look like if you hired this company.
In more white-collar contexts, poor grammar, spelling, or writing style might also work as a broader indicator of intelligence and education: I'd presume that being able to write effectively is fairly well correlated with being well-read.
That's a terrible example to draw from. Your lawn might look great when they get done with it, because I doubt those gardeners were the people who designed their truck's advertising. They probably hired some guy to do it for $300.
The example is quite valid. If I hire someone to make a sign, I will make sure it's done right before I pay them. Moreover, putting a crappy, poorly spelled ad on my vehicle reflects on me and my attention to detail since that's all the reader has to go on.
Well, you're right: if you hire someone to make a sign, you should at least verify yourself that there are no spelling mistakes and that it won't reflect poorly on your company. I concede.
I feel the same way, and it's not just about spelling. When I've been reviewing research proposals, I always get an instant bad impression if it's clear that the authors haven't followed the proposal preparation instructions: e.g., missing the proper sections, a font shrunk to squeeze in more text despite explicit instructions not to, etc.
It's a matter of credibility and reputation. When someone who has a track record of delivering value to a given organization makes a little error, it's not a big deal, because it's balanced against "karma" that they have already banked.
On the other hand, a cover letter and resume (aka "sales letter" for your services) is your first introduction to the organization. Same thing with a vendor proposal. Such a document probably arrives in a pile of several (perhaps even tens or hundreds) of others. In such a situation, and in the absence of personal connections to the decision makers, your credibility is pretty much zero. Your written word is your reputation.
In such a case, people are looking for reasons to filter, and being perfectionistic about grammar is a reasonably inoffensive axis to use for a preliminary cut.
Is anyone else baffled by how the article title talks about "failure to communicate" and then goes on to talk about basic things such as spelling and grammar?
In my opinion, communicating something is a far deeper skill than just writing correct grammar (and following your local style guide regarding other conventions). It's about structuring your thoughts into a logical argument that others can follow, one that is specific enough that people can find out on which points they disagree with you, if they do.
Most of the Graham/Spolsky/whoever crowd can do this awesomely well, but in all likelihood, not nearly enough people learn this in high school or college. And it's something that overworked school teachers or TAs cannot ever hope to address if they have 15 minutes to spend on a 10-page essay. (Why make anyone write a 10-page essay if there's no time to provide adequate feedback?)
At the risk of being snarky, I read this article as: Boring exposition, boring transition, finally some data and then a hedged conclusion with no suggested solution.
This article had impeccable grammar I'm sure, but I hardly cared because it had a weak, watered-down message.
For a tech analogy, it's a bit like web service uptime.
- No one will care unless something's wrong
- It doesn't provide a net gain, it only prevents a net loss
- It only matters if it's worth reading anyway
Grammar matters a great deal when writing technical specifications, because they can easily become ambiguous or turn into plain nonsense.
Example: "The users save the files on the server with the internet connection."
Do the users use the internet connection to save files on the server? Or does this server have an internet connection (and the other servers don't), and that is how to know which server they should use?
Such ambiguous statements may result in the customer getting a very different system than they wanted.
Another example of "failure to communicate": I received the following bug report today: "At the moment all umlauts (ä, ö, ü) will advertised in the wrong format."
The problem? All accented letters were replaced by a question mark.
I would have had no idea what the problem was if it weren't for the screenshot.
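For what it's worth, question marks in place of umlauts are the classic symptom of text being forced through an encoding that can't represent those characters, with lossy replacement. Here's a minimal Python sketch of one plausible failure mode (the bug report itself doesn't say what actually happened, so this is only an illustration):

```python
# Hypothetical illustration: encoding German text to ASCII with
# lossy replacement turns every non-ASCII character into '?',
# which matches the symptom described in the bug report.
text = "Grüße"  # contains ü and ß
mangled = text.encode("ascii", errors="replace").decode("ascii")
print(mangled)  # → Gr??e
```

The same symptom can arise anywhere a charset mismatch occurs along the pipeline (database connection, HTTP headers, file I/O), which is part of why such bugs are so hard to describe precisely in words.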
Some companies simply cannot outsource their IT department because they cannot communicate in writing.
I've always avoided being the grammar Nazi, but it still bugs me to see lazy writing, especially from people I know. Far too often, family members ignore glaring rules of grammar without a care in the world. I fully realize I'm not a paragon of English grammar myself, but I learn when corrected.
The problem is that for many people, it doesn't matter as much. If you have bad grammar or write poorly, it won't matter, because so many other people have bad grammar and write poorly as well. You don't see anything wrong with it.
It's usually simple things, too: capitalization, punctuation, proper tense, and spelling. I understand typos occur, and sometimes people just don't know better. Fine. The bigger problem is the use of chat abbreviations, like "u" for "you". People use these in a casual atmosphere, and I feel it becomes normal for them. They come to see proper writing as work.
There's another benefit of using proper grammar (and punctuation): It helps people whose first language isn't English.
I grew up in a mixed-language household, and I struggled a bit with English (when I was very little). I relied on the rules of the language to understand complex sentences.
Deviation from these rules may seem like no big deal to people who have internalized the language on a deeper level; but for people who are not as comfortable with English, the rules are very important.
I was just waiting for the first grammar or usage mistake to be made and pointed out in this thread. :-)
It's ironic that now, when anyone can publish their writing to the world in a blog, so many people don't have the writing skills to communicate effectively.
Also, it's always been true that many people don't have the writing skills to communicate effectively. What's different today is that for the first time, those people have the ability to publish their writing.
I don't think it stems from really increasing the author pool so much as reducing the amount of effort put into editing the material. A good editor can dramatically improve legibility, but few bloggers are paying for it.
You raise an interesting point, but the purpose of an editor is to filter content before publication in a scarce medium so that resources aren't wasted publishing crap no one wants to read. The editor serves the owner of the means of publication and acts as a quality filter so that only content worth buying and reading will make it into print.
Unlike a printing press or a broadcast frequency, the internet is unbounded and access is dirt cheap. There's effectively no cost to publish crap, so there's no reason not to wait until after publication to filter for quality.
Editors do both. However, if you look at the standard quality of a small-town newspaper from the 1920s, the overall quality is not all that high. Realistically, papers like the New York Times are an outgrowth of having a huge audience that can support a large staff, but there used to be a lot of small players out there who had much less content to choose from.
I wonder if, over time, this trend will lead to a shift in what is considered acceptable writing. Once the "texting" teenagers of today take positions of leadership in government and corporations, they may be more accepting of this writing style.
Mark Twain complained about this long ago. As a previous commenter said: it's not so much that people are not able to write now, it's that they can publish their bad writing. We have metrics in schools to track how well our kids can write. We can blog about how poorly our co-workers write. It's more visible than it ever has been.
At the end of the day, every communication with your employer is an opportunity. If you can't communicate well, you will be less likely to get what you want out of those interactions. On the other hand, if you're able to write well and communicate clearly, you're going to get what you want far more often.
My theory is that texting abbreviations are currently in vogue mostly because of the limitations of early texting technology. As technologies like smart auto-completion make it easier and easier to text complete words, we should see a reversion back to a more conventional style.
I'm clinging to this theory even though I have pretty much zero evidence backing it up, because the alternative, that "dat was hawt, amirite?" will eventually become acceptable writing, makes me depressed beyond words.
Even if texting abbreviations become widely acceptable in business writing, the issues of lack of clarity and disorganized flow of ideas will still distinguish good writing from bad writing. And I wouldn't bet on typical texting abbreviations becoming widely acceptable for business-to-business writing among the most profitable businesses.
As someone who has been a TA for the past three years at a top-tier university, I have had the same experience as the author. It's shocking how many students are admitted that do not know how to write an essay. I find that I spend at least half an hour per 10-page essay and almost always end up drowning them (even the good ones) in red ink.
Perhaps this is due to growing up with a mother who worked as an editor, but I tend to think that I'm not being overly harsh when I have to repeat the same comments over and over again: proofread for typos, needs a strong introduction, put your thesis up front, number your pages, use consistent formatting, punctuation inside quote marks, footnotes outside quote marks, don't use big fonts / extra spacing to pad the paper's length, use a (correctly formatted) bibliography, don't cite Wikipedia (or any encyclopedia), give the reader a roadmap up front, etc.
I was a math TA for a while and I can tell you that people are just as bad when it comes to mathematical writing and reasoning. We don't expect everyone to have perfect pitch so why is it more reasonable to expect everyone to be a good writer? Sure, everything you mentioned seems obvious to you but you've had plenty of practice and expecting the same from college students who are just barely beginning to figure things out is a bit much.
It has been a long time since college for me, but I vaguely remember this one.
Just like Wikipedia, encyclopedias are not generally original source research. Citing more specific original or closer-to-original sources gives the evaluator more tools in considering the relevance or validity of an argument.
This is pretty close to the reasoning. You can broadly categorize sources into three tiers: primary, secondary, and tertiary.
Primary sources are the actual documents or individual recollections involved in an event—a treaty, a memoir, a transcript, etc. These are the archival documents that historians love. A programming-language analogy might be writing something in a close-to-the-metal language like assembly or C.
Secondary sources are usually synthetic works that comment on primary sources and other secondary sources, or strongly advance/test a certain hypothesis. These would be the works produced by historians, political scientists, sociologists, and so on analyzing the historical data and evaluating the theories of other social scientists. Most books fall into this category. The analogy would be higher level languages like Java and Ruby.
Tertiary works are produced for the general reader. Rather than providing new scholarship or advancing a hypothesis, they generally try to summarize the existing consensus in a concise manner. Because encyclopedia articles aim to give the generalist reader an overview, they are usually a good starting point for someone with no background in a particular area. But the idea behind writing analytical or research papers is to get the student beyond just a general familiarity with a subject, and to get them to engage some part of the debate. Citing encyclopedias, while not wrong in and of itself, is usually evidence that the student hasn't really engaged the topic. Presumably, if they've read 3-6 secondary sources, they would have all the citations they needed and the encyclopedia would be extraneous. The programming analogy here would be using very little code to paste together a bunch of libraries that you don't understand.
Wikipedia has an additional epistemic problem that other encyclopedias usually don't have. Other encyclopedia entries are usually written by experts in their field. There is no such guarantee for Wikipedia articles: they could be written by experts, they could be written by 14-year-old Singaporean communists writing from their mother's basement (I speak from experience as an admin who's had to deal with this scenario). While studies have shown Wikipedia to be comparable in terms of quality to mainstream encyclopedias, there's no way of telling at any given moment which revision you're viewing and who it was last edited by and what their level of expertise is. For that reason, Wikipedia is usually especially discouraged by academia, which as an institution places emphasis on linking the authority of a piece with the person who wrote it.
1. You use the authority of the source you cite. (If a well-respected researcher has made the point, I may be more likely to assume it's sensible than if it just came to your mind). This is why citing Wikipedia or single newspaper articles or popular science books may be a less-than-good idea.
2. You point the reader to someone who has invented/first described a certain phenomenon (i.e., recognition of achievement). It's debatable whether that is relevant at an undergraduate level, but it's probably not a bad idea. (Encyclopedias and textbooks are clearly bad on this criterion).
3. You point the reader to a place where he finds more extensive material on the topic. Many encyclopedias do not cite their sources, which makes them less-than-useful on this account.
Note the conflict between 2. and 3.: If Someone Else published a cryptic paper in the 50s about something, yet no one understands that one until they've read the excellent overview paper by A. Random Researcher, 2. would say you should cite the Else paper from the 50s, whereas for 3. it may be better to cite Researcher's more recent paper.
Regarding 1, that's a huge problem I had with every college writing class I took. Either the argument being made is correct, or it's not. The fact that Joe Bloe, PhD agrees with me doesn't change things one way or the other.
It's almost as if my professors wanted to instill in me the belief that I should trust credentialed experts rather than trying to figure things out for myself.
> It's almost as if my professors wanted to instill in me the belief that I should trust credentialed experts rather than trying to figure things out for myself.
Well, yeah. That's exactly what they're trying to do. They have a vested interest in it!
I spent an entire class arguing with my Eng. Comp. professor that his requirement that x% of your work be cited was a logical fallacy ("appeal to authority").
He finally did agree, but declined to change the requirement. I dropped the class (and the college!).
And yet with all their errors, the majority of his students still pass and presumably go on to be leaders or at least high earners. So even though it's a "crisis" no one will do anything meaningful to fix it.
I agree with the importance of good writing, but I'm curious about the headline. As I read it, the author gives two examples of how poor writing is costly: in terms of time spent correcting papers, and a reference to an estimate from the "National Commission on Writing" that "remedying deficiencies" costs as much as "$3.1 billion annually". There is no explanation of how this estimate was calculated, nor does the article explain how big the US economy is, for purposes of comparison.
Edit: I'm the local grammar Nazi and have always been a fan of engineers/programmers that could write their asses off, e.g. Paul Graham, Joel Spolsky, _why the lucky stiff, etc., etc. Gives us all something to aspire to is all.
At my college, the "gatekeeper" CS course was notorious, even among non-CS majors. Your code could compile, pass all the unit tests, have optimal algorithmic efficiency and highly optimized runtime... and you could still get a C- because you used inconsistent capitalization for your variable names, or had misspellings in your comments[1]. May the ghost of Dijkstra have mercy on you if you did anything "clever" at the high level.
The goal, of course, is not that later on you will spend half your coding time thinking about variable capitalization. The goal is that when you're writing your code, you pick good variable names automatically because you've built up the reflex, by having the idea pounded into your head with the wrath of an angry god. Anecdotally, the class was largely successful in its goal of setting up good habits before the bad ones settled in. After taking that class, people wrote code that was significantly more readable and maintainable than they had written before, and the habits tended to stick.
I suspect that at least part of the problem with effective communication is that we develop our early communication skills largely ad hoc, and tend to build up a lot of bad habits in the process. Without concentrated effort to correct these habits, they remain and reinforce themselves through repetition. And of course, concentrated effort doesn't scale well.
[1] Granted, part of this was because the class was graded on a negative curve. If everyone got 145/150, then 145 is the average, which means a C.
In high school, my senior AP Eng. Comp class was similar. We had to write a 2-3 page paper, in class, every day. The papers were all literary analysis (ugh!), but it definitely sharpened the 'writing muscle'.
I can write much faster, with more clarity, than most people I know today - all because of that one year of English.
First, many seem to have received little writing instruction in high school. I initially noticed this as an undergraduate English major at Yale, where I helped peers revise their papers. I saw it again in graduate school at Tufts, where I taught freshman writing classes. And it has also struck me at Babson, where, for the past two years, I have instructed first-year students.
Maybe in informal academic writing, there's a special kind of topic sentence whose purpose is to introduce an opportunity for promiscuous name-dropping. E.g., in an essay on insomnia, one might write: "I usually wake up late. It's because I go to parties at Noam Chomsky's house, or on Richard Dawkins' yacht, or..."
It is name-dropping to an extent, but it is somewhat relevant. A lot of people would expect freshman students at prestigious schools to have a decent grasp of the English language.
Communicating well is just as hard as any other worthwhile activity, and it doesn't come naturally to most people. It took me forever to figure out that part of the reason I was always so frustrated with people was that I wasn't communicating well enough with them. Oftentimes I'd think I had expressed one thing, and later I would find out that the other person had completely misinterpreted my meaning. After the n-th time, I figured I was partly to blame and decided to do something about it; picking up a grammar book was the first step. It's not hard, it just takes practice like anything else, and the benefits of being understood better are definitely worth it.