Star Trek back at the iTunes Store, featuring original and remastered first-season episodes

remastered trek on itunes

Yes, after nearly two months offline, Star Trek is back on the iTunes Store. The store has separated the newly remastered episodes from the original broadcast versions. Still, only episodes from the first season are available.

iTunes remains the only source for buying and downloading the remastered original series in its uncut versions.

The first season of Enterprise has also returned.

City on the Edge of Forever (remastered)
City on the Edge of Forever (original)

Japanophilia: Four Japanese albums you should have in your collection

This article is also a guest post for Webomatica, who asked me to fill in for a day while he’s in Japan. Appropriately, I think, I dove through my library and pulled out some of my favorite Japanese albums. Enjoy…

::

susumu yokota - symbol

Susumu Yokota – Symbol (2005)

Yokota is a musician of the sonic contortionist variety, meticulously sculpting sounds and bending them to his will. Symbol features some delicately constructed mashups of classical music, with passages that are both instantly recognizable and relatively obscure. Lightweight and easy on the ears, this album is sonic bliss that samples predominantly from the western musical heritage. It’s an engagingly mellow aural experience. Read my full review.

Listen to Traveller In The Wonderland:
[audio:070123TravellerInTheWonderland.mp3]

Get it on iTunes | Get it at Amazon

::

cornelius - point

Cornelius – Point (2002)

Cornelius, who takes his pseudonym from Roddy McDowall’s character in Planet of the Apes, is likewise a meticulous sculptor of sound. But high art isn’t his game; his territory is clearly catchy pop numbers and urban culture. In the early 90s, he came to fame in Japan as part of a mostly straight-ahead pop outfit called Flipper’s Guitar. Since then, he’s embraced a kind of whiz-bang indie-electronic eclecticism, which reaches its peak on this, his magnum opus. This record is the reason I’ve called him Japan’s greatest natural resource.

Listen to Another View Point:
[audio:070327AnotherViewPoint.mp3]

Another View Point on iTunes | Point at Amazon

::

yoshinori sunahara - sound of the 70s

Yoshinori Sunahara – PAN AM: Sound of the 70s (1999)

This album may have been released in 1999, but as the title suggests, it might as well have been set much earlier. As for the particular sound of the 70s, this isn’t disco, or funk, or classic rock. It’s smooth and jazzy with a retro lounge feel. Sunahara, who is positively obsessed with TWA-era airline travel, pulls out a soulful downtempo groove that will make you feel like you’re waiting to jet off to London from the terminal at JFK.

Listen to Theme from Take-off (Magic Sunset):
[audio:070323ThemeFromTake-Off.mp3]

Theme from Take-off on iTunes | Sound of the 70s at Amazon

::

pizzicato 5 - Happy End of the World

Pizzicato 5 – Happy End of the World (1997)

Released at the peak of Tokyo’s so-called Shibuya-kei scene (whose emergence had parallels with that of American grunge, but that’s another story), P5’s Happy End of the World is filled to the brim with an ultra-cute, ultra-stylish, ultra-smooth vibe, with a little tongue-in-cheek mixed in, that makes the world created by this music so inviting to American hipsters and hipster wannabes. It also doesn’t hurt that the album is expertly crafted, with wide-ranging musical influences layered on top of some very infectious beats. Yet for all the sophistication this album exudes, there’s a certain childlike giddiness to the whole affair. It ranks among my all-time favorites.

Listen to Love’s Theme:
[audio:070323LoveTheme.mp3]

Love's Theme on iTunes | Happy End of the World at Amazon

::

Extra credit. For the past 15 years or so, Japan has been cranking out some excellent music. Check out the stylings of Yoko Kanno (Cowboy Bebop), Nobukazu Takemura, Cibo Matto and Fantastic Plastic Machine. Explore them at your leisure.

Yeah, What They Said 3/23

Yeah, What They Said is a new feature on tunequest. Some people call it “link sharing.” These links won’t necessarily be music or iTunes related, but I’ll try not to stray too far from the topics on this site. Mostly though, it’ll be stuff that’s cool, but that I don’t have time to write about.

Behind the Mario Maestro’s Music:
Koji Kondo was in his mid-20s when he wrote the iconic music for The Legend of Zelda and Super Mario Bros. for the NES, but he doesn’t compose much these days. Wired takes a look at him.

10 Albums in 10 Minutes: Classic albums cut up and squeezed into 60 seconds of playing time.

Millions Dream of Megamillions: The Compete blog charts the rise in Internet activity as the recent Megamillions jackpot grew.

If You Can Read This, You’re Hired: How would you find a really good typographer? Would you use dingbats? If you can decipher these ads by 3/25, you could get a one-year subscription to InDesign Magazine.

::

Bonus video: here’s Koji Kondo playing Super Mario Bros on the piano:

This recording is 105 years old

wax cylinder

When I was growing up, my parents had (and still have, actually) an old Victrola record player. It was entirely mechanical; no electronics whatsoever. To use it, you had to wind a handle, which tightened a spring. Flipping a switch unwound the spring and started the disc spinning. A needle, of course, translated the record into sound. Volume was controlled by opening and closing two doors on the front.

Along with the Victrola itself, my parents had a nice collection of records for it. I always enjoyed exploring the various old pop, jazz and orchestral standards, using those recordings as a window to the past. Plus, there was a subtle aural appeal to the tinny, lo-fi sound quality of the music.

As much as I appreciated it, the machine was a bear to use. The records were heavy but delicate. The handle needed constant turning. And most records had only one song per side. As enjoyable as the time spent with it was, the effort kept my listening sessions rather short.

Since the mp3/digital music revolution hit full throttle, I’ve had a dream to start digitizing some of those old records before they deteriorate beyond recognition. Being able to drop them on an iPod would greatly enhance my ability to explore those recordings.

Unfortunately, I’ve traditionally lacked a suitable recording environment. Also, that Victrola now lives more than 850 miles away from me. So for the time being, it will remain a dream.

Good news on a related front though! The University of California, Santa Barbara has been digitizing the recordings in its wax cylinder collection. Some of those recordings are even older than the ones I listened to growing up. Some of the oldest in the collection date to the 1890s while the most recent is dated 1928. The project has been ongoing since 2002 and, as of this writing, the digital collection turns up 6824 individual recordings.

The collection isn’t limited to music. It includes sermons, speeches, vaudeville and other spoken word (try the Humorous Recitations).

Each recording’s entry includes detailed information about the performer, the release title and the date (if known). Audio is downloadable as both mp3 and unedited .WAV files.

Explore the catalogue, catch the streaming audio of Cylinder Radio or subscribe to the site’s RSS feed.

Here is a taste to get you started. It’s Johann Strauss’s Blue Danube waltz, performed by the Edison Symphony Orchestra in 1902, when the piece was only 35 years old. You’ll recognize the tune.

[audio:070320BlueDanube1902.mp3]

There is something awe-inspiring about listening to music that was probably recorded before my great-grandparents were born.

What’s in a star rating?

Yesterday, I wrote a detailed article about the new formula I’m using to quantify the overall quality of albums in my iTunes library. It’s been working for me, but I realized that everyone rates their music differently. Webomatica, for example, explains in the comments that his song ratings are relative to other songs by the same artist.

So I’d like to explain the thought process that goes into my rating system. I’ve been using the same star rating criteria for years and that system has gone a long way toward helping me maintain control over my sprawling library. It allows me to quickly construct playlists of quality music, which is the single largest goal I have when managing and utilizing my library.

When thinking about a song’s rating, I basically need it to answer one question: how likely am I to want to hear this song again? Ratings are not designed to attribute greater cultural value to a song, though a song’s general artistic worth plays a large role in the rating it receives. I’m more likely to enjoy a high-quality song and thus want to listen to it more often.

The rating is essentially a weighted vote for helping me determine how often a particular song gets played in the future. The breakdown looks like this:

  • Rating: ★★★★★ 5 stars: This song is excellent. It shows poise and craftsmanship and I’m pretty much guaranteed to enjoy this one the next time.
  • Rating: ★★★★☆ 4 stars: This song is very good. Well done and not off-putting, I’ll most likely enjoy this again, but it’s not brilliant enough to be a 5. The majority of songs in my library fall into this rating.
  • Rating: ★★★☆☆ 3 stars: This song is good. I’m not going to go out of my way to hear this one, but if I’m listening to an album beginning-to-end, I won’t skip it.
  • Rating: ★★☆☆☆ 2 stars: This song isn’t very good. I’m fairly certain I’ll never want to hear it again. These songs are candidates for deletion. If any song stays at 2 stars for long enough, it is either upgraded to 3 stars or removed from the library.
  • Rating: ★☆☆☆☆ 1 star: Not used for rating purposes. Instead, songs marked with 1 star are taken out of circulation, usually because of encoding problems or bad ID3 tags. A song’s normal rating is restored once the problem is fixed. Additionally, special audio such as comedy or spoken word is automatically given 1 star to keep it from mingling with music.
  • It is also worth noting that my ratings are not static. As my tastes fluctuate, I’ve been known to change them. It doesn’t happen often, but sometimes a 4 star song might become a 5. Or it could fall to a 3 if whatever aspect of the song I found appealing the last time I heard it is missing. In one extreme example, a song went from 5 to 2 stars and was subsequently deleted.
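
Purely as an illustration of how those rules shake out in practice (this isn’t how iTunes stores anything; I’m just pretending each song is a simple dict with a rating field), the logic boils down to something like this:

def playlist_eligible(song):
    # song is assumed to be a dict like {"title": "...", "rating": 4}.
    # 3 stars and up are fair game for quality playlists.
    return song["rating"] >= 3

def needs_attention(song):
    # 2 stars: candidate for deletion (or an eventual upgrade to 3).
    # 1 star: pulled from circulation until encoding/tag problems are fixed.
    return song["rating"] <= 2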

There you have it. That’s where I’m coming from when I discuss song and album ratings on this site. I’d be interested to know: how do other people handle ratings in their iTunes libraries?

In search of a definitive album rating formula

When it comes to my iTunes library, I’m a regular statistics nut. Sure, my library exists primarily for my own enjoyment, but it contains so much organically compiled data about my habits and tastes that I can’t help but want to take a look at it and find out what the data says about my interests.

But for a while now, I’ve struggled to quantify, tabulate and analyze the overall sense of my library. Which of my albums are truly the greatest? Which artists, when the sum of their parts is combined, are really my favorites? And by how much? I want numbers.

None of the iTunes stats options available at the moment gives me the type of results I want. The Album Ranking AppleScript provides a simple average that skews toward albums with fewer tracks. SuperAnalyzer provides a top 10 list that is skewed toward albums with more tracks.

Most iTunes stats tools simply provide averages or totals of play counts and/or star ratings. Averages, while somewhat useful, can be misleading. An album could have a handful of awesome songs and a bunch of filler and still rank as well as an album that’s consistently good but without much breakout material.

And that frustrates me, because when it comes to an album’s or artist’s worth, I tend to value consistent performance.

Take, for example, my recent run-down of Air’s discography, specifically the albums 10000 Hz Legend and The Virgin Suicides. After many years of listening, my artistic impression is that The Virgin Suicides is ever so slightly the better of the two. The songs on Legend vary from excellent to clunkers. Suicides is pretty good overall, with only one exceptional track. However, averaging my ratings shows that Suicides is a 3.85 while Legend rates as an even 4.

So, to reward albums that don’t veer wildly around the quality wheel, I’ve developed my own album rating formula that takes into account the consistency of all the star ratings on a given album.

The Formula

album rating = (mean of all songs + median of all songs) - standard deviation of the set

The mean sums up the whole of the album. The median shows the state of the album at its core. The standard deviation indicates the variety of the individual ratings. The result is a number on a scale of roughly 1 to 10. (Alternatively, divide that number by 2 to bring the result back to a 5-star scale.)
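For anyone who’d rather compute this in code than in a spreadsheet, here’s a minimal Python sketch of the same formula (my own translation, not the Excel file linked at the end of this post). It uses the sample standard deviation, which is what Excel’s STDEV function calculates:

from statistics import mean, median, stdev

def album_score(ratings):
    # ratings is a list of per-song star ratings (1-5).
    # The result lands on roughly a 10-point scale; divide by 2
    # to get back to a 5-star scale. Note that stdev() needs at
    # least two ratings (more on that below).
    return mean(ratings) + median(ratings) - stdev(ratings)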

Let’s take a look at the formula in action. Suppose we have two albums with twelve songs each. The first is generally excellent, but varies in quality. The second is good stuff throughout.

Song ratings    Ex. 1   Ex. 2
                5       4
                4       4
                5       4
                2       4
                4       4
                5       4
                5       4
                2       4
                5       4
                3       4
                5       4
                3       4
Mean            4       4
Median          4.5     4
Mean + Median   8.5     8
STDEV           1.21    0
Score           7.29    8

This table shows the individual star ratings for the two theoretical albums, along with the statistical breakdown as calculated by Excel. As you can see, both albums’ average score is the same (4), and Ex 1 even has a higher median than Ex 2. But because the quality of Ex 1’s songs varies a great deal, its standard deviation is substantial, so much so that its album rating becomes 7.29 (or 3.645 on a 5-star scale) when my formula is applied. Ex 2’s score suffers no penalty and remains 8 (4). In effect, the standard deviation rewards Ex 2 for its uniform quality.
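Plugging those two hypothetical albums into the Python sketch above reproduces the spreadsheet’s numbers:

ex1 = [5, 4, 5, 2, 4, 5, 5, 2, 5, 3, 5, 3]  # excellent but uneven
ex2 = [4] * 12                              # uniformly good

print(round(album_score(ex1), 2))  # 7.29
print(round(album_score(ex2), 2))  # 8.0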

Let’s take a real-world example: the two Air albums I mentioned above.

Song ratings    10000 Hz Legend   The Virgin Suicides
                4                 4
                5                 4
                4                 4
                5                 3
                5                 3
                4                 4
                3                 5
                4                 4
                3                 4
                3                 4
                4                 4
                                  4
                                  3
Mean            4                 3.84
Median          4                 4
Mean + Median   8                 7.84
STDEV           0.77              0.55
Score           7.23              7.29

When the formula is applied to my ratings for each, the scores for 10000 Hz Legend and The Virgin Suicides become 7.23 (3.62) and 7.29 (3.65), respectively. So factoring in the standard deviation results in scores that more closely reflect my feelings about those two albums.

So what does this mean? I’m not sure exactly. In practice, I could whip up some listy goodness and see which albums are truly my favorites. A comprehensive analysis would be cool. I’d love to see the distribution of my album ratings. However, that would require more programming skills than I have. Though that could be a good project to help me learn.

Out of curiosity though, I have picked 11 albums, just to see how they rate. One provision, of course, is that every song on an album must have a rating before the album score can be calculated. These scores are on a 5-star scale.

Album                                Average rating   Formula score
Radiohead – OK Computer              4.5              4.41
Air [french band] – Moon Safari      4.5              4.39
Nirvana – Nevermind                  4.5              4.24
Mouse on Mars – Radical Connector    4.33             4.23
Ratatat – Ratatat                    4.45             3.97
Nine Inch Nails – With Teeth         4.31             3.77
The Strokes – Is this it?            4.09             3.7
LCD Soundsystem – LCD Soundsystem    4                3.68
Basement Jaxx – Remedy               3.73             3.51
Prefuse 73 – One Word Extinguisher   3.82             3.47
Weezer – Make Believe                3.58             3.21

This is by no means a top 10 list, but it is interesting to see where things ended up. It’s also interesting to see how minor fluctuations in star ratings can change the final score. For instance, if that Ratatat album had one more 5-star song in place of a 4-star song, its median would become 5 and its album score would jump to 4.51. Lower a 5-star song to a 4-star and the score only drops slightly, to 3.93. I don’t know if this is a flaw in the formula or a reward for albums that have a lot of good songs.

Problems and issues

Small data sets. These are troublesome in all statistical circumstances, and this formula is no different. An album with only one rated song technically still has a mean and a median, but it has no spread to measure: the sample standard deviation is undefined for a single value, which kills the formula with a divide-by-zero error in Excel. Also, because the formula uses the average rating as a component, albums with a low number of songs will tend to skew one way or the other.
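In the Python sketch above, the same limitation shows up as an exception, since stdev() refuses to work on fewer than two values. A simple guard (just one possible way to sidestep the problem, not something the formula itself prescribes) keeps single-track albums from blowing up:

def safe_album_score(ratings):
    # The sample standard deviation needs at least two data points
    # (Excel's STDEV hits the same wall), so skip these albums entirely.
    if len(ratings) < 2:
        return None
    return album_score(ratings)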

In my library, Boards of Canada’s EP In A Beautiful Place Out In The Country has four fantastic songs and ranks at 4.63, higher than anything on that list above. As a release, I’d say that’s accurate, but I’m sure it doesn’t surpass OK Computer. I would be interested to see a chart of how the album score changes as the number of tracks on an album increases.

Additionally, I haven’t figured out a way to rank partial albums, i.e. albums where I don’t own all the songs or where I’ve deleted songs I didn’t like. For now, I’m just excluding them altogether.

Still, I’m fairly pleased with the results I’ve been getting as I run various albums through the formula. It’s working for me and my own song rating system, but I’m curious to see how it works with someone else’s.

Fortunately, Webomatica has posted his song-by-song ratings for The Beatles’ Sgt. Pepper’s Lonely Hearts Club Band. Using his numbers, the average for the album is 4.38, while my formula renders a 4.28. I’d say that’s a consistently good album.

::

Here’s a Microsoft Excel file you can download. Plug in your star ratings to find the album score. AlbumScore.zip