A post of whose kind there are far too many on the web

I’m back after a week away from the web, an absence precipitated by an exploding power adapter whose replacement I kept putting off buying because my evenings were suddenly filled with wonderful books. And Swedish television. (I do have web access at work, but my employer will be glad to know that blogging from work is not my forte.)

One of the books I finally read this week was Five Points, purchased in an initial flurry of enthusiasm when Gangs of New York came out. Author Tyler Anbinder frequently laments that there are so few narratives by Five Pointers themselves — most of his primary sources were outside accounts by journalists, reformers, police, and travellers.

Five Pointers certainly never thought their neighborhood was worth documenting, nor that they would be of interest to future generations. If only they had. Today, the neighborhoods of lower Manhattan boast a surfeit of personal narratives, all pre-sorted and indexed for future generations of historians to peruse in excruciating detail.

I’ve been wondering if our writing blogs changes how we perceive ourselves in the eyes of the future. I don’t think we exactly pitch our stories to the future, but it certainly has crossed my mind that what I write will probably survive in some searchable archive somewhere, and it is therefore imperative either (1) to be right and/or accurate, or (2) to be self-deprecating about matters that most likely betray assumptions the future will be disabused of. My own strategy has been to aim for (1) but settle for (2).

Will future historians bother with blogs at all? They might dismiss the whole medium on account of it containing far too many self-referential sentences such as this one. Or this one. Blogs have to compete, after all, with TV, newspapers, books, academic studies, movies, and oodles of statistics and records. Do blogs add anything to the future historian’s perspective on us?

I venture yes. Perhaps the most-loved primary source of contemporary historians is the personal journal. It gives the kind of color that official histories and bank records cannot capture. There is no reason why that should change in the future.

* * *

Not blogging for a week does not mean that the urge to blog was quieted. Watching the Nobel prizes being handed out, and then the banquet, live on Swedish TV offered ample opportunity for color commentary that I did not, alas, indulge in. The highlights only, then: Princess Madeleine; Prime Minister Persson showing off his newest wife; and JM Coetzee’s odd but riveting banquet speech, delivered in clear staccato fragments that had you focusing on every syllable.

The other day, suddenly, out of the blue, while we were talking about something completely different, my partner Dorothy burst out as follows: “On the other hand,” she said, “on the other hand, how proud your mother would have been! What a pity she isn’t still alive! And your father too! How proud they would have been of you!”

It certainly had me wondering what had been said on the one hand.

15 thoughts on “A post of whose kind there are far too many on the web”

  1. And as a knowingly annoying aside, self-deprecating is a neologism. You really mean self-depreciating. Currencies and reputations depreciate the same way.

  2. Get a better dictionary. Self-deprecating is so commonly used as to be an accepted synonym for self-depreciating, but it’s still incorrect. I read it somewhere definitive. Will go hunt and gather.

  3. Didn’t think I was making this up. This from a recent WSJ style edict, which you’ll forgive me for trusting more than dictionary.reference.com:
    “Misusage, repeated often enough, can usurp proper usage entirely. Belittling oneself is almost universally referred to as self-deprecation these days, through constant misapplication, though it is a corruption of self-depreciation. Deprecate, which had meant plead against, now primarily means depreciate, or belittle.”

  4. Well, Matthew, you might not have been making it up, but you were wrong. See where you said that self-deprecating was “still incorrect”? Yes, it’s right there, in comment #4. Well, reading your WSJ edict, it seems clear to me that the usage might have been incorrect at some point in the past, but that at this point it is no longer incorrect. The primary meaning of deprecate, according to the WSJ edict you quote, is precisely what Stefan used it to mean. No conflict, no incorrectness. What used to be “proper usage” has now been usurped entirely, and what used to be misusage is now the primary meaning of the term. Language changes and evolves, Matthew: do try to keep up.

  5. Remember, Matthew, meaning is use. That’s a fundamental axiom of what language is: if everybody else thinks that a word means one thing, and you, alone, think it means something else, then by definition they’re right and you’re wrong. Moreover, words change their meanings over time. How do you know when a usage moves from being mistaken to being the new meaning of a word? Well, when that usage becomes the primary meaning, for one thing.
    The fact is that “self-deprecating” is a perfectly good piece of English. It does the job: both the people using it and their interlocutors know exactly what it means. The fact that such usage might have been incorrect in the past is neither here nor there. Today, there really is no ambiguity: no disconnect between what is meant and what is said.
    I’m trying to think of a common word usage which is still incorrect, to instantiate your hypothesis. The best I can come up with is “hopefully”, and I think that only the most pedantic among us are still waging that battle. It can still be used in the old-fashioned sense (“he walked hopefully into the examination hall”) but that doesn’t stop the newer meaning (“Hopefully, I’ll pass this exam, get a job, and start being able to repay my loans”) from being understood.
    Or did you have something else in mind?

  6. That’s just poor grammar. (See Irony). I’m talking about incorrect variants of words being substituted for the correct term and becoming standard through constant misuse. Don’t see why we should acquiesce in that.
    One example: I read your comment and felt nauseous. You may think this is fine, but you would be wrong, although many dictionaries agree with you. You should say: I read your comment and felt nauseated. Alternatively (not alternately): I read your nauseous comment and felt nauseated. (This is also WSJ style; the grammar, not the Felix-induced nausea.) Fine to argue this stance is archaic or pedantic or irrelevant, but that still doesn’t render incorrect word usage correct.

  7. OK, Matthew, if use is not the judge of meaning, then what is? How are we to know what a word means if we can’t go out and see how it’s used? The standard way for a dictionary editor (say) to find out what “self-deprecating” or “nauseous” means is to look at how they’re used and work it out from that. But you would object to such a methodology, since by your standards, that would wind up with an incorrect definition. How would you determine what the meaning of a word is?
    Also, if that’s really the meaning of nauseous, then what’s the difference between nauseous and nauseating?

  8. 1. Not clear to me why you can’t accept a distinction between traditional meaning and modern usage. There are multiple sources to determine meaning. They fight. They compete. I prefer the former, although I understand the appeal of the latter.
    2. Not entirely sure, but I would guess nauseating is a redundant synonym for nauseous.

  9. 1. I do accept the distinction between traditional meaning (what a word used to mean, in the past) and modern usage (what it means, now). It seems to me that you’re the one who’s having difficulties here. Why should a word mean what it was used to mean in the past, rather than what it means in the present?
    2. Other than usage, what sources are used to determine meaning? Your “multiple sources” are what, exactly?

  10. As a comment just left on an earlier linguistic post deftly illustrates, words, even nonsense words, gain meaning over time if they show up in Google often enough.
    In fact, I often try variant spellings in Google to get a “popular vote” on which to use.

  11. I’m not an expert in linguistics or dictionaries, but I think no one else in this conversation is either. You bring up good issues:
    1. How do dictionary composers determine the meaning of words? I think you’ll find that it isn’t simply “how they’re used.”
    2. Correct definitions or usage aren’t the same as common usage. What is correct isn’t found in one place (unless you live in France, where the government defines what is proper French), but dictionaries and grammars should, I think, have higher authority than common usage. A paper submitted to a scholarly journal should consult the journal’s standards for what is correct, for instance.
    3. Language, pronunciations and meanings do change, significantly, over time. And it’s true that common (incorrect) meanings can become correct meanings. Deprecate originally meant “to pray against (as an evil)”, came primarily to mean “to express disapproval of”, and finally began becoming synonymous with depreciate, “to belittle”. It has gathered a new meaning in computing, where it’s used to indicate a term that shouldn’t be used and may be rendered obsolete. (By the way, I deprecate this usage of “deprecated”.)
    4. I don’t think it’s true that a meaning becomes the correct meaning just because the majority agree on it. Nor do I believe this is an axiom.
    5. I believe it was William S. Burroughs who said “language is a virus from outer space.”
    In case you’re wondering, I came across this page while writing a rant against “deprecate” (see item 3). Thanks for giving me more to consider.
