I spoke at the BigThink Search event, where the Google-Bing controversy came to a head. This was an aside with O’Reilly Media about how search can influence news presentation, and how FanFeedr uses social signals, and the people who generate them, to refine the prominence of a particular story, team or player.
Feedback welcome and appreciated, as always.
What is of greater interest is how the product chooses what to cover, and how that delivers value to users.
There are four ways to assign the news, and five ways to present it. We are going to focus on the former, saving the latter for another post.
How to assign the news, or how to decide what to cover . . .
Human beings (I am one of them!) provide great news judgement, but can often lag behind the information zeitgeist or misjudge it. There is no reason, in 2011, to make editorial and assignment decisions in a vacuum.
A purely editorial lens uses an editor, obviously, to make decisions about what to cover. Such an editor may not care what is trending on Twitter, the page views that yesterday’s stories received, what the competition is covering, or the attention that various subject areas have received.
I am not accusing the Daily of being purely editorial. However, in the absence of an editorial mechanism that is actively mining the data of who is looking at what, when, in what volumes, and with what authority, one is left thinking that just using Bob or Sue Editor is not enough.
Using a curatorial lens instead of making purely editorial decisions about what news to cover is not just a sidelight, or an auxiliary plan for coverage. It is the most fruitful means to cover the largest number of events in the absence of infinite staff.
While the Daily’s 100 journalists are 100 more than we have at FanFeedr, the application isn’t going to be able to deliver articles that hit the sweet spot of trending topics garnered through social media, which is something that Business Insider, BuzzFeed, the Huffington Post, SB Nation and Bleacher Report seem to have pinned down quite nicely.
UPDATE at 1.54p ET: These sites use analysis of trending topics via Twitter and/or Google Trends to drive news coverage. Vin Diesel in a car crash? Let’s write an article about Vin Diesel, and a piece on celebrities who get into car crashes. Tiger Woods exposed? Let’s get salacious pictures of his special friends through their social media profiles. The technique looks at hot topics and then uses those topics to drive coverage. A curator also looks at a slightly longer timeframe, like the last 24 hours or the last week, to tease out the topics that are driving continued interest. You don’t need to do this for a subject like politics, but covering specific political figures will have greater resonance based on their frequency of mention, or prominence.
The curatorial lens has a downside: it can devolve into subject areas that some may find less tasteful. Still, it allows any publication, including the Daily, both to react well to areas of potential user interest and to determine areas of coverage for future focus.
You can get a sense of how the Huffington Post does this here.
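The curatorial technique described above — blending a short "hot right now" window with a longer window that captures sustained interest — can be sketched roughly as follows. The weighting and the topic strings are purely illustrative assumptions, not any publication's actual formula.

```python
from collections import Counter

def curatorial_scores(mentions_last_hour, mentions_last_day, hot_weight=0.7):
    """Blend short-window and long-window mention counts into one score.

    mentions_last_hour / mentions_last_day: iterables of topic strings,
    e.g. scraped from Twitter or Google Trends (hypothetical inputs).
    hot_weight favors what is spiking right now over sustained interest.
    """
    hot = Counter(mentions_last_hour)
    sustained = Counter(mentions_last_day)
    topics = set(hot) | set(sustained)
    return {
        t: hot_weight * hot[t] + (1 - hot_weight) * sustained[t]
        for t in topics
    }
```

A topic that spikes briefly can still lose out to one that has driven interest all day, which is the point of looking at the longer window.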
The third way of making editorial decisions about coverage is using a purely algorithmic framework to suss out what people are talking about, and then present the “hot” news to users based on real-time interactions across a large body of content.
One might suspect that Google News is crafted this way, but, in fact, they have humans sitting on top of the algorithms.
Many of the SEO content farms, as they are somewhat disparagingly labelled, pursue this avenue for assignment purposes. Simply put: they exploit opportunities to attract search traffic. These opportunities are gleaned through data analysis and then assigned to writers at low cost.
They use a single signal: search queries.
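That single-signal approach can be sketched in a few lines: rank uncovered topics purely by search volume and hand the top ones to writers. The query strings, counts and threshold below are made-up illustrations of the pattern, not any real farm's pipeline.

```python
def assignments_from_queries(query_counts, covered, min_volume=1000):
    """Content-farm style assignment: pick any high-volume search query
    that has not already been covered, highest volume first.

    query_counts: {query: search volume} (hypothetical numbers).
    covered: set of topics already written about.
    """
    return sorted(
        (q for q, n in query_counts.items()
         if n >= min_volume and q not in covered),
        key=lambda q: -query_counts[q],
    )
```

Because the only input is query volume, the output is exactly as gameable and as taste-free as the critics of content farms suggest.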
Algorithmic plus on site interactions
The fourth way of making editorial decisions is using trending topics from social channels, and then refining those topics based on attention signals from one’s own site. Put simply, this is a more refined version of the purely algorithmic take, and it requires data mining user interactions at scale, but in a way that competitors cannot.
Two concrete examples:
- Bleacher Report has an email newsletter with over 1MM subscribers (they have stated this publicly). They use the interactions of their users with those newsletters to determine areas of interest for coverage, in addition to their other data mining techniques. No one else has access to this data.
- Gawker Media promotes a single story on the front of its new websites, based, again, on a set of interactions that only Gawker can measure, in addition to its normal story coverage techniques, and it also has the scale that makes its private user feedback a valuable tool for refining its coverage.
Human beings don’t play a role here, but this process is highly dependent on having enough signal to understand both what is happening on social channels and the interrelationship of internal (site and application) data with the “signals” at large.
The joint venture between bitly and the New York Times, News.me, pursues this logic. The JV can use Twitter + Bitly to do this type of news presentation at scale, and accurately.
They have access to bitly’s click-stream data for discrete URLs in addition to perhaps the most robust analysis of social media information pursuits through the use of shortened bitly URLs across social media.
FanFeedr, our company, does something similar, but just for sports.
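A minimal sketch of this fourth approach, under stated assumptions: take an external trend count (e.g. Twitter mentions) and an internal interaction count (clicks, shares, newsletter opens), normalize each channel so neither can drown out the other, and blend. The weighting and the topic data are illustrative only, not FanFeedr's, News.me's or anyone else's actual model.

```python
def blended_score(external, internal, internal_weight=0.5):
    """Blend public trend signals with proprietary on-site interactions.

    external / internal: {topic: raw count}. Each channel is scaled to
    [0, 1] by its own maximum before blending, so a noisy high-volume
    channel cannot drown out a quieter but proprietary one.
    """
    def norm(counts):
        top = max(counts.values(), default=0) or 1
        return {t: c / top for t, c in counts.items()}
    e, i = norm(external), norm(internal)
    w = internal_weight
    return {t: (1 - w) * e.get(t, 0.0) + w * i.get(t, 0.0)
            for t in set(e) | set(i)}
```

The internal half of the blend is the part competitors cannot replicate, which is the whole argument for this technique.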
- Of the four, the purely editorial lens will show a wider distribution of page views and time spent across its individual news products, as it will tend to be hit or miss.
- The curatorial product will show a more tightly clustered set of viewers, but greater volatility over time, depending on whether scandal is in the air or not.
- Curatorial products love controversy because it adds kerosene to the fire, in a good way. Purely curatorial products also introduce latency into news coverage, in that they can’t cover the news as it is happening, because they require signals (both social and sharing based) that necessarily trail the actual event. With this technique, you can’t assign a story about a high-ranking Senator caught in flagrante at the local Applebees until there is a social media reaction to that event.
- The purely algorithmic product can be victimized by bans (as the search engine Blekko has imposed, and Google may), by gaming (companies or individuals affecting topic popularity through unethical or unnatural means), or by an insanely devout fan base (tip of the hat to Justin Bieber).
- The blend of algorithmic plus internal signals can also get gamed, but gaming is less likely because of the diversity of signals used in the analysis, assignment and feature presentation. It is also less latent than a purely curatorial model, because humans aren’t involved.
We aren’t advocating one technique at the expense of the others, but suggesting that two or three of the techniques, used in a complementary fashion, will yield the highest user value. HuffPo uses a blend, as do Gawker, SB Nation and Bleacher Report.
Right now, there aren’t enough “hit” stories on the Daily that personally appeal to me or that I couldn’t get elsewhere. The Daily can solve this by adopting a larger umbrella for linking to third-party stories, and choosing those stories based on user interest on their site and elsewhere.
Essentially, they need a more diverse story selection model to deliver greater individual value to individual users.
Last but not least, to really refine their news presentation model, they should actively promote sharing for every content item, including excerpts from articles and videos, so that they have a proper internal feedback loop for what is working and what is not. They currently enable posting to Twitter and Facebook, but it isn’t granular (that is, at the sub-story level), and it leaves few UX affordances for color commentary.
The last feature that they could add to deliver greater user value, and something that we do at FanFeedr (self-promotion alert), is giving users the ability to personalize based on topics of interest. If they used Facebook authentication (which they do not), they would be able to pull in user interests directly from each profile and then customize the news delivery based on those explicit preferences.
First the really good news: we hit over 160,000 Twitter followers across our accounts last month. Yay for us!
Most of the following site upgrades are technology fixes, so you are duly forewarned.
- We paginated our Leaderboard, so the pages load faster and you can do a deeper dive on your favorite sports
- We are now aggregating all of the player and team tweets from Major League Soccer, as you can see here via our list of MLS Tweets
- Fixed a weird glitch with the SIU Edwardsville Cougars’ basketball schedule, because we were besieged with requests and prioritized it
- We added monitoring to all of our server processes to ensure better uptime
- We upgraded SQLAlchemy and Mako to versions 0.6.5 and 0.3.6, respectively
- We are publishing some additional material directly out of Solr
- We upgraded and fixed a number of FanFeedr API calls, which are noted in detail on the developer portal
More good stuff is in the pipeline, and thank you, as always, for using the service.
Over the last two weeks, we have had a slate of good news:
- FanFeedr was selected as one of the PepsiCo10 top emerging startups
- We launched our FanFeedr Pick’Em game directly on Facebook
- We released our Official Redskins Feedr application in partnership with the Washington Redskins
- Verizon is actively promoting our FanFeedr Android application for personalized sports news and information
Additionally, the developers have worked hard (and over weekends) to improve the user experience on FanFeedr, Facebook and Twitter for you, our users.
Improvements to FanFeedr.com
- The Associated Press accidentally cut off our photo feed, and that has been restored. Top photos on the site in the last hour are here.
- European football and US football scores, schedules and rosters have been updated
- We made it easier to log in and sign up for the site by creating a new first-time user experience
- We added navigation to the sports in the search bar at the top of every page to make it easier to go to scores, photos and the Hot News
On our Facebook Pick’Em application
- We made it easier and clearer to invite friends to the game
- Fixed some bugs when we published the results of scores
- Added in options with our partner, gWallet, to allow you to get points for free
As always, we appreciate your business, and your feedback, and please let us know if you are missing anything important in the FanFeedr Feedback area.
The numbers are pretty startling: our website visitors stay for 1m 34s per session, while our iPad users stay for just over 14m 20s each time they come to FanFeedr. I was recently asked why this was the case, and the usual answers came out:
- We offer a personalized service, so viewing it on a personal device should increase the time spent per session
- Using Facebook and Twitter, which we require for login, is easier on an iPad than on a work computer; our peak usage is around 3p ET (GMT-5), and some workplaces restrict access to the social networks
- The iPad application is an easier way to consume information than our web application
That explains part of the nearly 10-fold increase in time per session, but it doesn’t seem like enough.
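For reference, the gap works out as follows; this is simple arithmetic on the two session lengths quoted above, with no other assumptions.

```python
def session_ratio(web_seconds, ipad_seconds):
    """Ratio of iPad session length to web session length."""
    return ipad_seconds / web_seconds

# 1m 34s on the web versus 14m 20s on the iPad
ratio = session_ratio(1 * 60 + 34, 14 * 60 + 20)  # about 9.1x
```

So the increase is a bit over nine-fold, which rounds to the 10-fold figure used here.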
Perhaps it was timing?
- We launched the FanFeedr iPad application on April 3rd, the day that the device launched.
- We were a featured application for the first week in the iTunes app store
- The 2010 World Cup began two months later, right after the 3G version of the device debuted in the United States.
That still doesn’t seem to fully account for the difference.
The single answer that makes the most sense is that the iPad interface can’t multitask, and that single-threaded application behavior forces users to focus solely on our application.
Which sort of underscores that multitasking doesn’t make you smarter, and greatly impedes time spent on web sites, to boot.
Our friend and advisor Scott Rafer has a short post on why startups shouldn’t focus on SEO or SMO (social media optimization on Twitter and Facebook), but should instead develop their own independent channels to create customer demand, so that they aren’t beholden to a big company that couldn’t care less.
This is a post about traffic that I paid for, and the results, and traffic that I didn’t pay for, and the results.
The former: Early on at FanFeedr, I had big-company-itis, and I paid $2,000 for 10 radio spots, or $200 per spot. This was ostensibly for sports talk radio, but my bookers placed me on shows that had little to do with sports, so I cancelled the contract at the halfway point.
When I did appear on a demographically-accurate radio show, I could see the number of users on the site in real time, as we use a betaworks portfolio company, Chartbeat, to monitor our real-time traffic flows. This was shortly after our alpha launch, last summer, so any customer was a good customer. Here is what we got, across five interviews that averaged eight minutes apiece in markets like Los Angeles, New York, Providence and Hartford: 10 extra visitors to the site. The CPA (cost-per-acquisition) was a highly absurd $200, regardless of the number of shows, as my $2,000 partial deposit was not refundable.
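The CPA arithmetic above is worth spelling out, since the non-refundable deposit is what makes the per-spot price irrelevant:

```python
def cost_per_acquisition(total_spend, acquired_visitors):
    """CPA = total spend divided by visitors acquired.

    The spend is the full non-refundable amount, not the per-spot price,
    because cancelling the contract did not return any money.
    """
    return total_spend / acquired_visitors

# $2,000 non-refundable deposit, 10 extra visitors across five interviews
cpa = cost_per_acquisition(2000, 10)  # 200.0 dollars per visitor
```

Against a free 10-second TV mention that produced far more visitors, $200 per visitor is the "highly absurd" figure quoted.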
There is one hurdle in mentioning our site name on radio: it is missing the trailing “e” of FanFeeder, as you would say it naturally, and my efforts to stress that missing “e” did nothing to encourage intrepid radio fans across the country to come to the site. Since they clearly had no previous exposure to the site, as we had launched a week earlier, this entire strategy was a waste of money.
Then, this Monday, 1 March 2010, I got a message from a producer of ESPN2’s SportsNation that we would be mentioned as one of their “Sites We Like,” as you can see below:
Based on the radio experience, I didn’t expect anything traffic-wise based on 10 seconds of exposure before cutting to a commercial, but they actually showed the site name in a fashion that is clear enough to read. And here are the numbers, based on our appearance at the 33m mark, in the middle of a one-hour show:
1 March 2010 | 04.33p ET (initial broadcast)
- 93 extra site visitors
1 March 2010 | 11.33p ET (repeat)
- 34 extra site visitors
2 March 2010 | 01.33a ET (repeat)
- 37 extra site visitors
Simply put: those incremental 164 visitors came from 10 seconds of television exposure on ESPN2 at odd hours. Surprisingly, those visitors have been more highly engaged on the site than traffic that we get from social networks, which tends to be flightier (less time spent per visit) than traffic that we acquire through organic search. Specifically, these new visitors spent 27 seconds longer on the site than traffic from social networks.
Additionally, they engaged with our Pick’Em game at a higher rate than traffic from social networks or search engines. The latter makes a lot of sense, as search engine traffic is much more directed and less oriented around general sports topics of interest than traffic from social networks. The Pick’Em game (“Who do you think will win, Arsenal or Portsmouth?” Only the dimmest of bulbs would choose Portsmouth, BTW) reflects picks, results and badges back on Facebook, and thus is one of the most pandemic (a hyper-form of virality that we just made up) channels that we have for acquiring new customers. This channel is based on prestige, which we have discussed before.
Scott is right in his advocacy for finding other channels for audience development outside of Facebook, Twitter and the search engines, but I cannot begin to suggest that happenstance mentions on television networks are a marketing channel.
But when it does happen, it can provide a nice, small bump for a web site that is seeking a much larger audience. Connecting through video with people who are fanatical about sports has potential as a marketing channel, and the good thing for us is that even if we don’t get the direct traffic, we can power other sites with sports-related goals through our API (which has aggregated news and information as well as the Pick’Em game).