Onward: My new job at Publish2

I am extremely excited to let you know that I’m starting a new job on Monday, as Director of News Innovation at Publish2.  I’ll be working for Scott Karp, who I’ve been following since I started blogging back in 2005, and with a team of top-notch online news thinkers, evangelists, and developers.

What does a Director of News Innovation do?

I’m expecting to work with newsrooms and journalists across the media world to get them the tools they need to bring the best of the Web to their readers, and maybe even to bring the best of their readers to the wider Web.  Sound good?

Well, help me out. Let me know what you think of Publish2, how you’ve used it, and what you’d like to see in the P2 toolkit that isn’t there yet.

Here’s my favorite recent Publish2 story, about how a group of disparate news organizations in Washington state used the service as a tool for collaborative curation during floods this winter.

I can’t wait to get started.  Matter of fact, if you’re at BCNI Philly this weekend, feel free to throw your ideas about Publish2 at me in person.

+++

To answer an obvious question, yes, I’ve left my job at GateHouse Media, effective today.

I had a great 19-month run with GateHouse, doing my best to give journalists at more than 125 newspapers the tools and training they needed to serve their communities.

Any and every success that I had there belongs to the incredible team of developers, the awesome revenue team, and the online news innovators I worked with, including Howard Owens — who hired me and has since left GateHouse to put his money where his mouth is at The Batavian — and Bill Blevins, the VP who Howard reported to, whose door was always wide open to new ideas and possibilities.  Thank you.

+++

Onward. I’ll be spending a great deal of my time over the coming days and weeks wrapping my head around how Publish2 has been used so far and where it’s going.  Let me know what you think of it, here, on Twitter, or wherever you see me.  I’m easy to find.

Sometimes, robots just aren’t enough

TechMeme adds a human editor to make adjustments when the algorithm fails:

“Any competent developer who tries to automate the selection of news headlines will inevitably discover that this approach always comes up a bit short. Automation does indeed bring a lot to the table — humans can’t possibly discover and organize news as fast as computers can. But too often the lack of real intelligence leads to really unintelligent results. Only an algorithm would feature news about Anna Nicole Smith’s hospitalization after she’s already been declared dead, as our automated celeb news site WeSmirch did last year.”

Would Google News add humans to the mix to craft a more up-to-date, relevant news site?  I doubt it.

But I’d be interested to see further variations of the algorithms that run Google News, TechMeme, and perhaps to a lesser extent, Digg or Reddit, to see what else is possible when it comes to translating the logic of linking behavior into actual prioritization of “importance,” if that’s still a relevant metric.

via @jayrosen_nyu

Sunday morning links: Data, DocumentCloud, and the Obama Bounce for news

A few things I haven’t had time yet to dig deeper on, but maybe you will:

“4. Go off the reservation: No matter how good your IT department is, their priorities are unlikely to be in sync with yours. They’re thinking big-picture product roadmaps with lots of moving pieces. Good luck fitting your database of dog names (oh yes, we did one of those) into their pipeline. Early on, database producer Ben Welsh set up a Django box at projects.latimes.com, where many of the Times’ interactive projects live. There are other great solutions besides Django, including Ruby on Rails (the framework that powers the Times’ articles and topics pages and many of the great data projects produced by The New York Times) and PHP (an inline scripting language so simple even I managed to learn it). Some people (including the L.A. Times, occasionally) are using Caspio to create and host data apps, sans programming. I am not a fan, for reasons Derek Willis sums up much better than I could, but if you have no other options, it’s better than sitting on your hands.”

“To get a sense of DocumentCloud’s potential, take a look at the database of Guantánamo Bay detainees that the Times made public on Nov. 3, when it was accompanied by a 1,500-word story. Each record is linked to relevant government documents that have been made public since ‘enemy combatants’ were first held there in 2002. Pilhofer said the database isn’t using a full-featured version of DocViewer, but it certainly demonstrates the benefit of browsing documents grouped by subject rather than, say, the order in which the Defense Department happened to release them. What’s remarkable about the Gitmo collection, aside from its massive scope, is that the Times has offered up this information at all. As Pilhofer said, ‘It’s not usually in a newsroom’s DNA to release something like that to the public — and not just the public, the competition, too.'”

Packaging national election headlines for local news sites with Publish2

Happy Election Day-After!

I’m still up to my neck in post-election analytics, gathering stats and data from hundreds of news sites I work with to do a little postmortem on what worked, who learned some new tricks, and what the readers thought of it.

One of the things we put together here at GateHouse for Election Day coverage was a national election news widget.

Because I work with small-town and rural community newspapers, national news is the last thing I want reporters and editors to spend time on while local election results are coming in.  Copying and pasting AP stories?  No.

But, wouldn’t it be nice to have a constant stream of headlines available for readers obsessively pounding the refresh button looking for updates on local races?  Without depending on any one source, like the AP, that everyone else has on their sites?

Yeah, it would:

Neosho Daily News on November 5

This is a screenshot of a chunk of the content well at NeoshoDailyNews.com, a small (think: sub-10,000 print circulation) news site in Southwestern Missouri.

The top part of it is a list of headlines that we’re putting together with Publish2.

I’ve been working with editors from the GateHouse News Service to use Publish2 as a bookmarking engine to route headlines from our browsers to that widget on many GateHouse sites.  In fact, the news service has been using it for months now to feed links to their Elections page.  You can find notes on that use and our use of Publish2 to feed Hurricane Gustav headlines to sites in Louisiana here.

Here’s why we’re getting so much use out of this:

  1. It’s simple. Use a bookmarklet, surf the Web, hit the button when you find a story to share.
  2. It’s diverse. You choose the sources, you push the buttons, you curate the content.
  3. It’s timely. I sat in front of my laptop last night, scanning politics.alltop.com, Google News, Yahoo News, the network’s sites, big national papers, Memeorandum, Twitter, and Google Reader, plucking the results off the pages and bookmarking the link with Publish2, giving local sites an instant feed of national headlines.
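Under the hood, the whole pipeline amounts to an RSS feed of bookmarked links rendered as an HTML list on the news site.  As a rough sketch of the consuming side (the feed contents below are made up for illustration, not an actual Publish2 feed), it might look something like this:

```python
# Minimal sketch of a curated-links widget: parse an RSS feed of
# bookmarked headlines and render it as an HTML list ready to drop
# into a site's content well. A real widget would fetch the feed
# from the curation service with urllib instead of using a string.
import xml.etree.ElementTree as ET
from html import escape

# Stand-in for a fetched feed of curated national headlines.
SAMPLE_FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>National Election Headlines</title>
    <item>
      <title>Turnout breaks records nationwide</title>
      <link>http://example.com/turnout</link>
    </item>
    <item>
      <title>Senate races still too close to call</title>
      <link>http://example.com/senate</link>
    </item>
  </channel>
</rss>"""

def render_headline_list(feed_xml: str, limit: int = 10) -> str:
    """Turn an RSS 2.0 feed into an HTML <ul> of linked headlines."""
    root = ET.fromstring(feed_xml)
    items = root.findall("./channel/item")[:limit]
    lines = ["<ul class='curated-headlines'>"]
    for item in items:
        title = escape(item.findtext("title", default=""))
        link = escape(item.findtext("link", default=""), quote=True)
        lines.append(f"  <li><a href='{link}'>{title}</a></li>")
    lines.append("</ul>")
    return "\n".join(lines)

print(render_headline_list(SAMPLE_FEED))
```

The point of the sketch is that the editor’s side stays as simple as a bookmarklet; all the rendering work happens once, in the widget, and every site that embeds it gets the fresh headlines for free.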

Scott Karp started talking about his plans for Publish2 more than a year ago, and it quickly inspired me to start thinking about ReportingOn, based on what Publish2 didn’t do.  But he keeps developing the idea, and I think he makes a strong case for using it as a tool for curating the Web in a way that makes sense for news organizations.

Yes, yes, I know that you could do something similar with Delicious or Google Reader or FriendFeed or even a Twitter account, but Publish2 has a by-journalists-for-journalists feel that I like.  You’re not repurposing some other app to do what you want; this is designed to do what you want, and it even allows you some editorial control, such as grouping users with access to a set of links.

Check it out.  Curate the Web that your community cares about.

A short manifesto on local linkblogging

Brittney Gilbert blogs for a TV station in the Bay Area.  I’ve mentioned her before, and even though I’ve moved geographically far from her coverage area, I keep up with her tweets and various postings.

Today she writes: I am not a journalist.

While I disagree (the curator is a journalist and the journalist is a curator), she lays out the logic and opportunity for the local linkblogger:

“The Bay Area is crawling with people passionate about their communities. They have their feelers out, covering the legislature, watching their streets and otherwise covering the San Francisco-area like a blanket. In fact, there are so many awesome local bloggers out there breaking and reporting news that you need a human to point you to the best and most important stuff. This, my friends, is my job.”

Does your news organization have a dedicated linkblogger, or does your staff contribute to the task of curating the local Web?  If not, why not?

The good stuff is over there →

If you prefer my tweets and shared reader bits and delicious links to the infrequent and sometimes long-winded content here at what passes for a blog, click on through from that reader of yours and take a look at the right side of your screen. (Actually, let me check that in a few browsers first… OK, we’re cool.)

You’ll find a stream of links, and some other stuff that was buried a bit lower until a few minutes ago.

Enjoy.

(Thanks to the folks at SimplePie, especially for the WordPress plugin. It. Is. Rather. Simple.)

Watch out for secondary characters with more interesting stories than your protagonist

That’s good advice up there in the title of this post. I got it from a screenwriting teacher, and it’s been a running joke around our house for the last week based on a couple movies we’ve watched lately.

And it’s also good advice for narrative journalists.

But that’s not what this post is about.

This post is about the Long Bet between Dave Winer and Martin Nisenholtz.

Just in case anyone is keeping score, I’ll add my name to the list of unofficial judges who think Wikipedia was the winner.

Here’s the kicker from Rogers Cadenhead’s post on the topic:

“Winer predicted a news environment ‘changed so thoroughly that informed people will look to amateurs they trust for the information they want.’ Nisenholtz expected the professional media to remain the authoritative source for ‘unbiased, accurate, and coherent’ information. Instead, our most trusted source on the biggest news stories of 2007 is a horde of nameless, faceless amateurs who are not required to prove expertise in the subjects they cover.”

He’s exaggerating (‘nameless, faceless’) to harp on the contrast between the interesting secondary character in this story and the protagonist/antagonist pair, but the point is clear:

The crowd beats the individual and the organization when it comes to… well, SEO is a factor, but the reason the crowd’s version of events floats to the top of search results has more to do with individuals linking to the crowd’s record than with a header tag matching a title tag.

There’s plenty of intriguing thought about this being thrown around, including these bits:

  • Dave Winer: “The world that I hoped would come about did not. While blogs have broken many stories, they have not, in general, turned into the authoritative sources I hoped they would in 2002. When the blogosphere resembles journalism it’s often the tabloid kind.”
  • Paul Boutin: “Cadenhead has exposed the flaw in my genius idea: I presumed there were only two sides. That’s journalist math. Any real techie knows there are never only two values to anything in real life.”

Where’s Martin Nisenholtz’s blog, anyway?  I’m eager to hear his take on this.

And the last open-ended question: Who’s the third player in the scene you’re writing? For example, is there a third element in the Newspapers vs. craigslist equation?