Having written last month about Pandora apparently opening up, and having drawn comparisons with Last.fm, I see that two music services have now licensed some of Last.fm's data to add recommendations to their sites.
Download store and magazine site TuneTribe.com is perhaps the less interesting example. Their home page now has a search facility "powered by Last.fm". Provided your search gets an 'exact match', you get a link to recommendations for similar artists. Thus TuneTribe's similar artists for Brian Eno are effectively the same as the Last.fm listings of similar artists for Brian Eno — though interestingly the rankings are slightly different, suggesting that TuneTribe does not have a 'live' data feed. The Last.fm-TuneTribe arrangement is reciprocal in that the Last.fm web site includes links to download tracks from TuneTribe.
More intriguing to me is MusicStrands' use of Last.fm data. MusicStrands already have their own client software for tracking people's listening and generating recommendations. But as well as providing their own interface for customised recommendations, they are now offering a means to generate track recommendations based on your (or any user's) Last.fm profile. (Last.fm itself doesn't let users generate lists of recommended tracks; it just plays them those tracks as part of its 'personalised radio station'.) You can also identify MusicStrands users with similar tastes to you, based on your Last.fm profile.
This last feature holds the promise of people being able to 'port' their profiles, to some degree, between different services and thus avoid being locked into any one service. Some might say that's a tactical business error by Last.fm, but I think they understand the different ethos of web 2.0. Interestingly, MusicStrands has also recently recruited the business intelligence manager from Apple's iTunes Music Store.
In the longer term, I wonder if there is the prospect of the kind of recommendation-generation data that Last.fm have developed becoming much more widely available, albeit in different versions. At the moment this data is profiling gold dust for the music industry, even at aggregate level, without it being linked to individuals. However, as well as Last.fm and MusicStrands collecting their own data, the latest version of iTunes has the facility to build the same kind of profiles (even though this was controversial when first introduced).
Once each of these companies has a dataset of a million users (Last.fm currently has 1.5 million), each of whom has played hundreds of tracks, then I'd imagine that, statistically, those datasets will be hard to tell apart, and will generate very similar recommendations. No-one expects or demands 100% 'accuracy' in recommendations. Any differentiator will be in how well tuned the datasets are to spotting the emerging hits from among the very latest releases, and recommending these tracks to others who might like them.
Posted by David Jennings in section(s) Future of Music, Music and Multimedia, Radio, Social Software on 5 April 02006
I'm not sure I agree with your point that it will be hard to tell the different recommenders apart. These social recommenders have funny feedback loops that tend to attract and repel different people. For instance, look at the 'top artists' charts at last.fm. They are quite different from what you see on Billboard. That's because last.fm attracts a certain kind of music listener. Any recommender is going to have some biases, and these biases are seen in the recommendations they make. So if I go to last.fm and ask for a recommendation based on my music listening history, I may like the recommendation or I may hate it. If I like the recommendation, I am likely to stay at last.fm, and since last.fm bases its recommendations on its user base, by staying I am reinforcing the recommendation. But if I don't like the recommendation I get, I am likely to leave last.fm, so my tastes never get incorporated into the last.fm model. This means little perturbations early on in a social recommender can have a large effect on the system. I think that is why last.fm is so tilted toward the alternative/geek rock. The geeks got there first and now the recommendations are geared toward them. I don't think CF systems naturally converge.
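To make the feedback loop concrete, here's a toy simulation in Python (the taste labels, numbers and the crude 'recommend the majority taste' rule are all invented; this is nothing like last.fm's actual algorithm). The only thing it shows is how much the earliest users can shape where a community ends up:

```python
import random
from collections import Counter

def simulate_social_recommender(initial_users, arrivals=10_000, seed=42):
    """Toy model of the feedback loop: the service recommends whatever its
    current majority likes, and a newcomer only stays (and so only shapes
    future recommendations) if that happens to match their own taste."""
    random.seed(seed)
    counts = Counter(initial_users)                      # e.g. {"geek_rock": 3}
    for _ in range(arrivals):
        newcomer = random.choice(["geek_rock", "pop"])   # the wider population is 50/50
        majority = counts.most_common(1)[0][0]
        if newcomer == majority:                         # liked the recommendation, so stays
            counts[newcomer] += 1
        # otherwise they leave, and their taste never enters the model
    return counts

# Two runs that differ only in the handful of earliest users diverge completely.
print(simulate_social_recommender(["geek_rock"] * 3))
print(simulate_social_recommender(["pop"] * 3))
```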
Posted by: Paul on 6 April 02006 at 2:04 AM
Thanks for the thoughtful comments, Paul. You could be right, and I could be wrong (wouldn't be the first time!). Let me just see if I can defend my position credibly.
The main strand of my defence would be that recommendations are, I assume, based on relative links between artists, rather than where they are in the charts. Example: perhaps there aren't many Bruce Springsteen fans on Last.fm, and he's only a top 200 artist there, compared with a top 10 artist on another service. Does that matter as long as there are enough Bruce fans on Last.fm to be representative of the wider population? Surely Last.fm will still recommend relevant music to Bruce fans (Tom Petty? John Mellencamp? whoever they are - I'm not a big Bruce fan myself...). I can't imagine that Last.fm would recommend Aphex Twin to Bruce fans just because there are a large number of Aphex Twin fans using the service.
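To put that assumption in code form: it amounts to something like a similarity score that is normalised for each artist's overall popularity. Here's a minimal sketch (hypothetical listening data and a plain cosine similarity; I don't know what Last.fm actually computes):

```python
from math import sqrt

# Hypothetical listening histories: user -> {artist: play count}
histories = {
    "u1": {"Bruce Springsteen": 40, "Tom Petty": 25},
    "u2": {"Bruce Springsteen": 10, "John Mellencamp": 30},
    "u3": {"Aphex Twin": 90, "Autechre": 60},
    "u4": {"Aphex Twin": 70, "Boards of Canada": 50},
}

def similar_artists(target, histories):
    """Rank artists by cosine similarity between their listener vectors.
    The score is normalised by each artist's total plays, so a hugely
    popular artist is not recommended to everyone by default."""
    vectors = {}                                  # artist -> {user: plays}
    for user, plays in histories.items():
        for artist, n in plays.items():
            vectors.setdefault(artist, {})[user] = n
    t = vectors[target]
    t_norm = sqrt(sum(v * v for v in t.values()))
    scores = {}
    for artist, vec in vectors.items():
        if artist == target:
            continue
        dot = sum(t.get(u, 0) * v for u, v in vec.items())
        if dot:
            scores[artist] = dot / (t_norm * sqrt(sum(v * v for v in vec.values())))
    return sorted(scores.items(), key=lambda kv: -kv[1])

# Tom Petty and John Mellencamp surface; Aphex Twin never does, however popular he is.
print(similar_artists("Bruce Springsteen", histories))
```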
Secondly, are listeners always as impatient as you describe? Every service is going to throw up a recommendation you don't like every now and again (perhaps as often as every 30 minutes), but all you have to do on Last.fm (or Pandora) is click the button to say "don't play that again" and it's gone. Sometimes, if I'm feeling adventurous, I increase the 'randomness' setting on Last.fm. I know this will increase the number of recommendations I don't like, but it also has a greater chance of exposing me to something that is outside my normal range. I may be unusual in that, and perhaps more people are put off by duff recommendations in the way you describe — but we don't have any detailed studies of the range of ways people use these services, do we?
Time will tell which of us is closer to being right. Perhaps in a couple of years it will be clear whether the kind of 'herding' you describe (where people with similar tastes cluster together) is a big factor, or whether, over time, the stats underlying the recommendations converge.
Posted by: David Jennings on 6 April 02006 at 9:39 AM
With regard to your bruce springsteen example, I think that different communities have different 'links' to an artist. For example, think of Johnny Cash - a staple of country playlists for years, but then he covered the song 'hurt' by Nine Inch Nails ... and all of a sudden he started appearing on alternative, metal playlists. Now if I look at the All Music Guide for similar artists to Johnny Cash, I won't see NiN or Trent Reznor, there's no link there at all, but in this alternative, metal community there is a link. A good recommender that is recommending songs for someone with NiN and Johnny Cash should probably not recommend June Carter (like the AMG would). The point is, the relationship between artists is context dependent, driven by the community of interest, and people will be attracted to communities that mirror their own preferences.
Thanks for the interesting discussion btw.
Posted by: Paul on 6 April 02006 at 1:24 PM
Well, thank you. Yes, I think you are definitely onto something when you say, "the relationship between artists is context dependent, driven by the community of interest, and people will be attracted to communities that mirror their own preferences".
Of course, you'll get some services (like iTunes Music Store, AOL etc) where the community of interest is very broad and popular. And then you may get more niche communities that cater for metal, electronica, or country etc.
I think it's too early to say whether Last.fm and MusicStrands, for example, will be (or remain) niche communities, or whether they will grow and cross over to a wider, more general audience.
MySpace is interesting in this respect — though I don't think it's directly comparable to Last.fm — because it's clearly popular with millions of users and attracts major artists, but still (for the moment) has a feel of supporting unsigned acts as well.
And Pandora and All Music Guide are interesting in a different way, because their recommendations are not driven by their community of users, but by their resident 'experts'. So they may find it harder to evolve to follow the needs of any particular community of interest?
Posted by: David Jennings on 6 April 02006 at 4:40 PM
Great commentary guys. Here are my two cents:
The example of Johnny Cash covering "Hurt" gets really interesting, because while it was written and originally recorded as an Alternative Metal song, Johnny's version is at its core a sad, sad country song. Acoustic guitars in a minor key, discordant piano, plaintive earthy vocals...not a drum machine or sequencer in sight.
So the question there becomes: What would appropriate companions to this song be? Alt-Metal songs by Nine Inch Nails, Ministry and Front 242? (Wouldn't that be kinda like recommending William Shatner to Beatles fans? They did both do versions of "Lucy in the Sky with Diamonds" after all.) How about country songs about dogs and pickup trucks and the USA? Neither one really works. In my opinion the ideal "Similar Songs" list would be slightly edgy alternative country songs dealing with grim dark subjects. Maybe some slow Steve Earle, a divorce song from Richard Buckner, one of Gillian Welch's contemporary murder ballads, and maybe a sweetly depressing Willie Nelson or Merle Haggard song like "Funny How Time Slips Away" thrown in for good measure.
A really great playlist solution will take into account a combination of the sonic qualities of the music (style, tempo, dynamics, musical elements) and the song's contextual information (subject matter, thematic topics, lyrical bullet points) to really get to the root of what the song is.
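As a very rough sketch of what that combination might look like (the features, weights and songs below are all made up for illustration; this isn't AllMusic's actual method), you could blend a distance over sonic attributes with an overlap over thematic tags:

```python
def song_similarity(a, b, sonic_weight=0.6):
    """Blend sonic similarity (tempo, energy, darkness) with contextual
    similarity (overlap of thematic tags). Every feature, weight and
    number here is an illustrative guess."""
    sonic_keys = ["tempo", "energy", "darkness"]            # each scaled 0..1
    sonic = 1 - sum(abs(a[k] - b[k]) for k in sonic_keys) / len(sonic_keys)
    tags_a, tags_b = set(a["themes"]), set(b["themes"])
    context = len(tags_a & tags_b) / len(tags_a | tags_b)   # Jaccard overlap
    return sonic_weight * sonic + (1 - sonic_weight) * context

hurt_cash = {"tempo": 0.30, "energy": 0.20, "darkness": 0.90,
             "themes": {"regret", "mortality", "loss"}}
hurt_nin = {"tempo": 0.40, "energy": 0.70, "darkness": 0.90,
            "themes": {"regret", "self-destruction", "alienation"}}
welch_ballad = {"tempo": 0.25, "energy": 0.20, "darkness": 0.85,
                "themes": {"mortality", "loss", "murder"}}

print(song_similarity(hurt_cash, welch_ballad))  # higher: close in sound and subject
print(song_similarity(hurt_cash, hurt_nin))      # lower: shared themes, very different sound
```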
Ween is a goofy Alternative Pop band, but I'll be damned if they didn't put out a country album that sounds more honky-tonk than anything coming out of Nashville now. A playlist of Beck, Cake, and They Might Be Giants featuring one of Ween's "12 Country Golden Greats" would sound totally out of place, despite the fact that Ween and Cake's music may have been purchased from the same "Alternative" section of the record store.
(Again, in my opinion) these recommendation lists work best when the determining factors come from the song itself, with very little influence of the performer's past history, or where we would categorize the artist on the shelves of the local record shop. A lot of these "Genre" distinctions are purely for navigation reasons. If I want to find a Bob Dylan CD, I need to go to the Rock section of the store, even if the album I want is "Nashville Skyline." It wouldn't make sense to have two Ween sections, most CDs in the Alternative rack and one lonely disc in the Country section. It doesn't make sense to recommend individual songs based on the Genre of the artist (or even the Genre of the album for that matter). As consumers (or more accurately "Navigators") we need some sort of order in which to find the things we're looking for (side note: when I worked at Tower Records, we always joked about putting the entire store in Alphabetical order by artist), but as listeners, the categorization doesn't really matter, the actual essence of the song is what counts.
There is obviously real value in the social recommendation model, and I think the best results will ultimately come from incorporating user/community listening habits tempered with more granular "expert"-based descriptive data.
[Full disclosure: I have been working on AllMusic's intelligent playlisting solution for a few years now, so I may have strong opinions].
Posted by: Zac Johnson on 6 April 02006 at 6:20 PM
Thanks, Zac, for more insight and another perspective. Please excuse me if this response doesn't do full justice to your contribution — it's nearly 1am at the end of a long day, and I'm not feeling that alert.
This discussion about what counts as a recommendation for Cash's 'Hurt' reminds me of Todd Beaupré's comparison of recommendation/personalisation services on the Yahoo Music Blog, which you probably know. As I said there (though possibly not very clearly), I'm not sure whether what you might call the "one song test" is a particularly meaningful test. It's a bit like telling a career advisor one bit of information (e.g. that you like working with numbers) and expecting them to recommend a job that's ideal for you. A better test would be to say I like "Hurt" by Johnny Cash, but not by NIN; I like Blue Monday by New Order, but I'm not so keen on Joy Division; I like Jonathan Richman, but I hate They Might Be Giants; I like Neil Young's acoustic and country music, but I'm not sure about his work with Crazy Horse, and so on, and so on. Now recommend me a playlist of ten songs you think I should hear, and if I like eight or nine of them, I'll be happy.
If recommendation is a statistical process, don't you need multiple data points for the stats computation to work effectively?
On a different tack, possibly slightly tangential, I was struck by your comment, "A playlist of Beck, Cake, and They Might Be Giants featuring one of Ween's '12 Country Golden Greats' would sound totally out of place". I thought, "Would it really sound that bad?" This connects back with what Paul said about "the relationship between artists is context dependent, driven by the community of interest". My context and community of interest may be different... For over 20 years, I listened to John Peel's radio shows, and in the space of half an hour he might play a traditional English folk song sung by Martin Carthy, some reggae by Misty in Roots, a live track by the Clash, and a 78rpm record by Artie Shaw. This mixed-up approach is not uncommon on British radio (see the playlist for the radio show I listened to last night for another example). So, Beck and country music in the same playlist sounds OK to me. I'm not picking a fight with you on this, Zac; I'm just saying that, if we sometimes see things differently, then that reinforces Paul's point.
Posted by: David Jennings on 7 April 02006 at 1:22 AM
Re: The "One Song Test" -- Good point. There are obviously multiple ways of determining user interest and patterns of listening habits. An assumed "best" outcome of one of these recommendation services would be to act like a really knowledgeable best friend who knows that you like everything by The Police except side 1 of Synchronicity, so please dear friend, tell me what other British reggae-influenced new wave post-punk acts I should check out. This would be ideally accomplished by a pretty involved analysis of a user's collection and recent habits (as last.FM is doing), some kind of "Thumbs Up/Thumbs Down" functionality (Pandora, Tivo, Bose's uMusic), and some kind of in-depth record of listening habits based on environment or time of day (like Yahoo's Launchcast purports to do).
But it can't just be an analysis of what you like and what you've already listened to. If that were the case, it could just keep recommending things you already own, or more albums by the same artists you already like (as the iTunes Mini-Store seems to do). There still needs to be some connection between the songs you've indicated as your "favorites" and the universe of music out there. That's where the descriptive song-level data comes into play. "I like 'Hurt' by Johnny Cash, but not by NIN; I like Blue Monday by New Order, but I'm not so keen on Joy Division; I like Jonathan Richman, but I hate They Might Be Giants; I like Neil Young's acoustic and country music, but I'm not sure about his work with Crazy Horse" is terrific information, and the next step is to look at what makes up those songs (and also an analysis of what makes up the songs you don't like) to make recommendations.
Jonathan Richman is a College Rock Proto-punk Singer/Songwriter whose songs are quirky and witty, honest and bittersweet; he deals with the topics of heartache, love and childlike wonder, usually in a traditional rock setting with electric guitars. They Might Be Giants are Alternative Pop/Rock College Rockers whose songs are witty and quirky, but more ironic and occasionally cynical. TMBG's songs deal with elements of pop culture and science, with unusual musical devices involving accordions and drum machines. From this data, it appears that you might like quirky College Rock, but it needs to be honest, deal with real human experiences, and you prefer it to be in the traditional rock mode (electric guitar, bass, drums, vocals). If this analysis is close, you'd probably dig Green on Red and Elvis Costello and the Attractions, but not want to be recommended any songs by Camper Van Beethoven, the Dead Milkmen or (shudder) Barenaked Ladies.
My point (if I have one!) is that community data alone isn't enough to really fuel a great recommendation engine, since the majority of New Order fans may also be Joy Division fans, and a lot of Jonathan Richman fans have some TMBG in their collection. The really helpful data is the stuff that's associated with the individual song. I can't imagine that one of the guys in They Might Be Giants killed your dog or stole your girlfriend, so it isn't the band itself that you can't stand, it's that you don't like their sound, or their usual subject matter, or the overall feel of their music. That's what I meant by getting down to the sound of the individual song, not looking at the history of the band itself.
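A crude way to sketch that song-level reasoning (the attribute tags, songs and scoring rule below are invented for illustration, not AllMusic's real data or method): tally which descriptive attributes appear in the songs you like, subtract those in the songs you dislike, and rank candidates by how well their attributes fit.

```python
from collections import Counter

def attribute_preferences(liked, disliked):
    """Score each descriptive attribute: +1 every time it appears in a
    liked song, -1 every time it appears in a disliked one."""
    prefs = Counter()
    for song in liked:
        prefs.update(song["attributes"])
    for song in disliked:
        prefs.subtract(song["attributes"])
    return prefs

def recommend(candidates, prefs, n=3):
    """Rank candidate songs by the summed preference of their attributes."""
    scored = [(sum(prefs[a] for a in s["attributes"]), s["title"]) for s in candidates]
    return [title for score, title in sorted(scored, reverse=True)[:n]]

liked = [{"title": "Roadrunner", "attributes": {"quirky", "honest", "guitar-driven"}},
         {"title": "Hurt (Johnny Cash)", "attributes": {"honest", "dark", "acoustic"}}]
disliked = [{"title": "Birdhouse in Your Soul",
             "attributes": {"quirky", "ironic", "drum-machine"}}]
candidates = [{"title": "Alison", "attributes": {"honest", "guitar-driven"}},
              {"title": "Punk Rock Girl", "attributes": {"quirky", "ironic"}}]

prefs = attribute_preferences(liked, disliked)
print(prefs)                          # 'honest' scores high, 'ironic' goes negative
print(recommend(candidates, prefs))   # the honest, guitar-driven candidate wins
```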
Posted by: Zac Johnson on 7 April 02006 at 2:43 PM
Re: The "mixed-up approach" -- Too true. As an American, I've always been jealous of your ability to flip on the radio and tune in to John Peel's show (R.I.P.). The increasing homogenization of radio in the U.S. is really depressing. I totally agree that a good DJ can throw two seemingly disparate songs back to back and somehow make it work. Look at Quentin Tarantino's soundtracks: There is absolutely no way to programmatically determine how Maria McKee's "If Love is a Red Dress" and Urge Overkill's cover of "Girl You'll Be a Woman Soon" could possibly work together. Ditto for Wes Anderson's soundtracks. I honestly don't believe that music recommendation technology will ever take the place of a really good DJ...there's just too much human element involved in making really great segues and selections to be handled by cold plastic machinery.
That being said, I don't have a personal DJ at my beck and call, so I'll need something that will make good recommendations to me in a consistent manner.
Steve Krause has a great post where he talks about recommendation services being akin to "Bowling vs. Batting" -- "Should we expect spot-on recommendations like a pro bowler expects a strike every time? Or is this more like the baseball batter, who is happy to get a hit one in three times?" Who knows which one is more correct...maybe in the perfect recommendation service there should be a slider that says "Limit my results to only the most exact matches" on one end, and "Throw me a curveball" on the other end, allowing for the "John Peel-Style" list with more freedom to throw in something like a country song by Ween.
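That slider could be as simple as a single exploration parameter. A toy sketch of just the knob idea (not how any real service implements it):

```python
import random

def pick_recommendations(ranked_songs, curveball=0.0, n=5, seed=None):
    """ranked_songs is best-match-first. curveball=0 always takes the
    safest remaining pick; curveball=1 jumps anywhere in the list."""
    rng = random.Random(seed)
    picks, pool = [], list(ranked_songs)
    for _ in range(min(n, len(pool))):
        # With probability `curveball`, grab something from deeper in the list.
        idx = rng.randrange(len(pool)) if rng.random() < curveball else 0
        picks.append(pool.pop(idx))
    return picks

ranked = ["safe match %d" % i for i in range(1, 21)] + ["Ween - Japanese Cowboy"]
print(pick_recommendations(ranked, curveball=0.0))          # strikes only
print(pick_recommendations(ranked, curveball=0.7, seed=1))  # leaves room for a curveball
```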
(All of) Paul's comments obviously have a lot of merit, and by no means should we abandon the context of the community as an influencer, but at the same time, if everybody on the planet has both Bob Marley's "Legend" and a James Brown greatest hits album, that doesn't mean that they sound alike. Just because everybody and their mother has a Beatles album and Nirvana's "Nevermind" doesn't mean I want to hear them back-to-back.
I just played a Beatles radio station in Last.FM, and it played a Rolling Stones song, a U2 song, a Radiohead song, Beatles, Bowie, Simon & Garfunkel, Coldplay, Red Hot Chili Peppers, Beck, The Doors, The White Stripes, and finally "On a Plain" by Nirvana. In all honesty, every single one of these songs is a song that I can get into, so as a "Radio Station" it did a great job, but using that technology as a recommender it gets a little scattered.
Starting with Bob Marley, I got Outkast, Buju Banton, Dave Matthews, Bob Marley, Jimi Hendrix, Jimmy Cliff, 2Pac, The Dead, Phish, 50 Cent and Sublime. I felt like I was at a Fraternity party.
So I dunno. I think there is a real place in using a community for the long tail (that's where really specific connections become a gold mine), but for the artists that are in everybody's collection it gets a little washed out. That is why I echo the concern in your initial post (about 10,000 words ago) that once every band in Last.FM gets to the point of "Beatles Saturation" the value of the connections will decrease.
Posted by: Zac Johnson on 7 April 02006 at 3:37 PM
Wow, Launchcast claims to map your listening habits at different times of the day and in different environments? I didn't know that. Do you have a link to where they say this?
Coming back to you on the "one song test" thread... This discussion is very enjoyable, though it may be getting slightly academic (but who cares if our audience is just each other?). I sense we may be in different camps on the 'music genome question'. Let me try and describe these two camps.
The 'music genome' camp believes that what makes people like music are a set of qualities or characteristics that inhere in the music itself, and some people like a particular mix of qualities, while others like a different mix (college rock, authentic lyrics).
On the other hand, the 'environmental interaction' camp believes that what makes people like music is what goes on in the interaction between the music and the listener's expectations, personal history etc. When I blogged about 'Does music have a genome' last year, I quoted Brian Eno saying "music is actually a contingent combination of sounds whose emotional resonances are entirely dependent on the audience's personal and shared histories as listeners." Which I think is another way of expressing the 'environmental interaction' position.
Anyway, I think your analysis suggests you tend to the music genome camp, and I think I tend to the environmental interaction camp. Does that sound fair?
(I wouldn't want to push this analogy too far, but it's almost like the music genome camp is 'nature' and the environmental interaction camp is 'nurture'. If there's anything in that comparison, then it suggests we need both camps, just as we need nature and nurture.)
As for you reading my palm with your analysis... well, not everything I said before was true of me (e.g. I like Joy Division and love Crazy Horse), but it is true that I like Jonathan Richman and hate They Might Be Giants. Yes, one of the things I like about Jonathan is the songs of 'heartache, love and childlike wonder' (though I think I'm right in saying that for over a decade now he's mostly played acoustic guitar with just a tiny drumkit for accompaniment — certainly that's how it was the last two times I saw him). I agree that TMBG place more emphasis on being clever and witty — the main thing I don't like about them is that, to my ears, they fail to be either clever or witty.
Up to this point, I'm following your analysis. However, I've never seen myself as a fan of College Rock. That may be a US/UK terminology thing, but even if it is, then it starts to undermine the scope to use College Rock as part of any universal analysis of music. That's why I'm more in the 'environmental interaction' camp: when you (Zac) hear a Jonathan Richman song, you hear College Rock and a bunch of associations that go with that; when I hear the same thing, I hear something different.
As it happens, I've always had a blindspot for Elvis Costello. I don't hate him (though I've never much liked his voice), but I just can't get excited by him, despite the fact that many people whose taste I trust rate him very highly. Green on Red didn't do it for me, either — sorry!
Finally, like you, I can't imagine one of the guys in They Might Be Giants stealing my girlfriend — they're not her style. If Jonathan Richman took a fancy to her, though, I reckon he'd be in with a chance, since she was completely charmed by seeing him live.
Posted by: David Jennings on 7 April 02006 at 4:00 PM
Briefly:
Nature vs Nurture = Very apt.
College Rock = Probably an American phrase. Again, a semantic "Navigation" term, not really what the music sounds like.
Elvis/Green on Red = Just off the top of my head, no real analysis involved, (but I'll bet you hate the Barenaked Ladies, right?!?)
Posted by: Zac Johnson on 7 April 02006 at 5:27 PM
The Barenaked Ladies: well I've heard of them, but I don't think any radio show I've listened to has played them, and none of my friendly recommenders (human or machine) has ever pushed them at me. So they've been kind of filtered out of my life. I just listened to some 30 second samples of their stuff online. I quite like the bits of bluegrass I could hear occasionally in their sound. Obviously I couldn't get a proper handle on the lyrics, and maybe those would annoy me. But I haven't got enough evidence to hate them.
Posted by: David Jennings on 7 April 02006 at 5:48 PM
Yes, Jonathan Richman might be in with a chance — especially if he does that hip-wiggling dance.
Posted by: David's Girlfriend on 7 April 02006 at 10:19 PM
Thanks Zac and DJ for a great discussion. Back to social recommenders and convergence. Another thing to keep in mind is that even if all of the social recommenders (last.fm, yahoo, musicstrands etc.) had exactly the same data they would still generate very different recommendations. The algorithms these systems use for generating recommendations from the data vary significantly. They work hard to deal with the popularity biases, the feedback loops and the inertia that is inherent in these systems. These systems have lots of 'knobs' that adjust the kind of recommendations that will be generated. Each recommender has the knobs set differently.
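One concrete example of such a knob is a popularity discount. With identical co-occurrence data (all the numbers below are invented), turning that single dial produces two quite different rankings:

```python
def rank_with_knob(cooccurrence, popularity, popularity_penalty=0.0):
    """Divide each artist's raw co-occurrence score by its overall
    popularity raised to the chosen power: 0 ignores popularity,
    1 discounts it completely."""
    scored = {artist: score / (popularity[artist] ** popularity_penalty)
              for artist, score in cooccurrence.items()}
    return sorted(scored, key=scored.get, reverse=True)

# Invented numbers: co-occurrence with some seed artist, and overall popularity.
cooccurrence = {"Coldplay": 900, "The Go-Betweens": 45, "Radiohead": 600}
popularity = {"Coldplay": 100_000, "The Go-Betweens": 800, "Radiohead": 60_000}

print(rank_with_knob(cooccurrence, popularity, popularity_penalty=0.0))
print(rank_with_knob(cooccurrence, popularity, popularity_penalty=1.0))
```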
Posted by: Paul on 9 April 02006 at 12:38 PM
Good point, Paul. Another factor may be how much the systems allow their users to twiddle the 'knobs' to get different kinds of recommendations. Last.fm allows you to choose whether you want 'personal' radio (i.e. just the stuff that fits your individual profile), 'neighbour' radio (the stuff that people who share some of your tastes like) or 'random' radio.
Similarly, MusicStrands' customised recommendations allow you to choose whether the recommendations you get come from a pool of 'all music', 'popular music', 'somewhat known' music, or 'unfamiliar music'. Bizarrely, when I had the knob set to 'unfamiliar music', I got recommendations including Oasis and Gorillaz… (I've asked MusicStrands about that, but not had a reply yet).
Posted by: David Jennings on 9 April 02006 at 1:47 PM
MusicMobs has an excellent label for their equivalent slider. You can slide it anywhere in between 'main stream' and 'hipster' ... of course any recommender system should be able to look at your music listening habits and determine whether you are a Coldplay fan (main stream) or a fan of 'the go-betweens'.
Posted by: Paul on 10 April 02006 at 8:06 PM