Thursday, August 20, 2009

Album Rating System

I mentioned this in a footnote here, but sure, I'll clarify my reviewing system.

I used to have a numerical scale from 0-100 for rating albums that was skewed downward relative to most scales. After all, I only review albums that I have (and therefore probably like to some degree), not everything that gets released, so it made no sense to use a standard A/B/C/D/F-type scale and end up rating everything 70 or above. So the scale looked like this:

100: Perfect, and probably nothing should get this.
90: Excellent
80: Great
70: Very Good
60: Good
50: Solid
40: Fair
30: Okay
20: Just passing.
0-19: Varying degrees of failure

In other words, an album that got a 55, failing by most scales, was still somewhere between a solid and a good album on my scale. And an album could go as low as 35 or so and still be a pretty decent disc. A slew of albums that I actively like and have affection for got 65s. You get the idea.

At some point I realized that this level of detail induced a sort of OCD in me, and I would obsess over whether albums belonged in the 65 or 70 category (I had to make it in five-point steps to avoid complete insanity; another lesson learned along the way). I would try to do this by comparing the albums very superficially (do I like album X as much as album Y that got a 65 or album Z that got a 70?), and the whole thing turned out to be pretty unreliable. Whether due to my slightly changing tastes over the course of a year or just different moods on different sittings, I got really confused as to why certain albums had been placed in certain slots. I mean, I was never going to confuse a 30 with a 70, but sometimes on a particular day it would seem that a 55 should have been a 65, or vice versa.

Eventually I also realized that this focus on the numerical score took my attention away from the albums - that I started trying to digitize the album rather than think about it as an experience. Plus, all the getting hung up over numbers was silly: when push comes to shove, these are works of art, and the ordinal scale by which you could compare them is really an abstract fabrication that doesn't make a whole lot of sense. Really it only serves a later effort of putting the albums into some sort of top X list (100, 500, whatever), and that's an artificial construct, too*. I've come to think the only things that matter are 1, would I recommend this to someone, and 2, is it one of my favorite albums?

* - I do, though, really appreciate the artificial constructs that come out at the end of every year or decade from Pitchfork and the like. There's just too much music out there for me to parse without such aids, so while it's a little silly to assert that such and such was the best album of 200x, it at least gives me a good starting point.

So these reviews are intended to be more about description - about recording the feelings and associations I have with a particular album. The recommendation is a much broader placeholder in my mind. "Not recommended" means there's something about the disc that makes it not really worth the time, money or effort to get to know, relative to other albums. "Recommended" means it's an unqualified recommendation, a sort of "yes, you should definitely check this out." "Recommended (solid)" means there's some sort of qualification to the recommendation, like it's for a particular mood or toward a particular purpose, or it just generally doesn't make me viscerally say "YES, that is a good/great/awesome album." So it's a kind of catch-all category for albums that I like, that are good, but that don't elicit a ringing endorsement from me. The last category, "Desert Island Recommended," should be pretty self-evident.

If you really pressed me, I'd say 0-30 = Not Recommended, 30-65 = Recommended (solid), 65-90 = Recommended, and 90-100 = Desert Island Rec. But those are fuzzy borders and I don't know if they'll really hold. E.g., Lies was a qualified recommendation because of the trainwreck song that ends it, but it originally got a 70 from me as an (otherwise) very good album; The B-52s originally got a 65 from me, but it's such a seminal dance album from that period that, despite its lackluster back half (relative to the front), I'd still recommend everyone have it, so it gets the unqualified Recommended. There you have it.
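
If it helps to see those fuzzy borders written out, here's a tiny, purely illustrative sketch (in Python); the cutoffs and tier names are just my rough mapping above, nothing rigorous:

    def recommendation(score):
        """Map an old 0-100 album score to a recommendation tier.
        The cutoffs mirror the fuzzy borders described above."""
        if score < 30:
            return "Not Recommended"
        elif score < 65:
            return "Recommended (solid)"
        elif score < 90:
            return "Recommended"
        else:
            return "Desert Island Recommended"

    # e.g., recommendation(70) -> "Recommended"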

Again, I wouldn't pay too much attention to that - the whole effort to categorize the album by number sometimes caused me to rush through albums in an effort to quickly get a number out, whereas the point of these reviews is 1, to take some time to really sit down with a disc and let it sink in (if it hasn't done so in the past), and 2, to let people know what the listening experience is like, so they can evaluate whether they'd like to check it out. Part of that is the little recommendation tag at the end.

Just for posterity, here's the old system and some of my more general thoughts on what makes for a good album experience (from the old website):

Muzak: A Ratings Explanation

SOME would say that it's pointless and a gigantic waste of time to attempt to catalog and rate all of your music. They clearly do not get it. I completely agree that this is pointless, but just as pointless as, say, eating and sleeping. Either way it successfully passes/wastes the time, and I don't often find myself dancing to a plate of spaghetti (non-audio spaghetti, anyway). Keeps me off the streets, etc. It occurs to me that defending my heroin-esque music addiction to an unanswering audience is somewhat neurotic, so I'll just skip ahead to the explanation. Just know that this qualifies as a hobby/passion, and at least I don't leave for weekends at a time to play paintball. Frisbee, yeah, but not paintball. So I've got that going for me.

It's a fairly simple system. Songs are 1-5 stars, confined by the reality that is iTunes. But I don't buy that Tim McGraw stance (like it, love it, want some more of it) on the various star totals, so I have my own take:

1 - bad to okay, or something that is hyper-ambient and/or spoken
2 - decent to solid, but wouldn't be on a mix
3 - good song, this and above could make it onto a mix
4 - great song
5 - transcendent song

Note that a 5-star rating has more to do with evoking a feeling (for me, and me only) than with any objective quality of the song, and also does not entirely depend upon how much I like the song. For example, Nirvana's "Smells Like Teen Spirit" is a 5-star song because of, well, its awesomeness, but also because of all its connotations: ushering in the grunge era, the great atmosphere of depression / alienation / angst, the fact that it came out when I was 13. But the 4-star song "In Bloom" is actually my favorite song off the album Nevermind, and I can't really explain that very well - "SLTS" is a better song, evokes that je ne sais quoi, whatever, and I fully realize it's better, but I just like "In Bloom" better. If I were, say, making a top 1000 list of songs, "In Bloom" would not come above "SLTS." So, there's basically no rationality going on here whatsoever.

Albums are a little more involved: (See above)

All of that is based on numerous factors:

Raw Numbers: The number of four and five star songs. The Beatles album Revolver clocks in with 7, which you'll just have to trust is a high number.

Album Graph: Woe unto the album that dips badly. I vastly prefer consistent albums that are solid throughout to albums with incredible parts and terrible parts. Singles do not make an album in my book. That's just me. A classic example is The Police album Synchronicity, which features gems like "Every Breath You Take" but then comes crashing to a halt with that god-awful song "Mother."

Flow / Gestalt: This refers to the cohesiveness of an album: is sitting down and listening to it a collective experience, or does it feel like a collection of singles that were lumped together? The Who album Tommy and the Pink Floyd album Dark Side of the Moon set the standard here.

Transcendent Patchwork: Maybe the opposite of F/G, sometimes a collection of songs in widely varied styles comes together with an anti-flow flow, like a great narrative. The Beatles album known as The White Album serves as a great example: songs written independently, styles ranging from folk to proto-metal to surf music to experimental tape loop crap, and yet it comes together. Weird. Don't question it.

Intimacy: Sometimes albums have a feel like you stumbled into the studio and witnessed a moment. The Pixies album Surfer Rosa, complete with studio banter, pulls this off very well, as does The Rolling Stones album Exile on Main Street.

Mystique / Intangibles: I just really need a category like this so I can fudge things if I want to.

Opener / Closer: Most great albums have an incredible intro, and if you want to maintain great status with me you have to leave the album with some kind of grand conclusion and/or encore. There are scores of examples, but the Pearl Jam album Ten pulls it off and then some - the album creeps through some ambient, almost machine-like rain sounds and then explodes into "Once" as the opener, and its closer "Release" goes out on a theatrical high note before fading into a mesmerizing ambient drum-and-bass piece that devolves into... the same ambient rain sound from the album's opener. Bookends are not exactly the most original framework for a work of art, but this has a nice effect - if the album is on repeat, the last track fades into the first and the album starts anew. It gives the album a very circular vibe, and when pulled off, it's pretty great. Similar effects: the Pink Floyd album Animals and the Phish album Story of the Ghost.

Rock Peaked in 1974: Or in my case, the early 1990s. Or the late '60s. Basically, if I have some kind of nostalgic connection to a record, it gets a big boost.

So there it is. My ratings are purely based on my experience of the album, so you and I invariably will disagree. That's cool.
