March 13, 2010

Users vs. Companies: Conflicts over the Real-Time Web?

If 2009 was the year of the real-time Web, with practically every major service finding ways to bring content to its users instantly, 2010 is about optimizing the new real-time world, expanding interoperability between sites, finding more ways for users' content to be discovered, and taking the potential of real-time out of the status world and into the real world. Today, at the South by Southwest Interactive event in Austin, Texas, one panel asked whether we are making serious progress toward this vision, and whether companies, feeling increased competitive pressure, are short-changing users in the process.

Marshall Kirkpatrick of ReadWriteWeb, who moderated the panel featuring representatives from Collecta, Google, Gowalla and Microsoft, said "the real-time Web is a big, complex and multi-headed beast," adding, "almost every person you talk to on the subject will give a different perspective."

For most, the real-time Web means reducing the latency between when updates are published and when they are experienced to practically zero. This can be anything from updates flowing from blogs to downstream aggregators and RSS feed readers, status updates moving from social networks to other points in the ecosystem, or instant alerts from the Web at large that a saved search you requested has found a positive match.

But a persistent problem with the real-time Web is that, despite many services focusing on the same challenge, most have done so without delivering true data interoperability - and other services are trying to solve for real-time without having full access to users' public data.

"Back in the day, you couldn't send e-mail from AOL to CompuServe, and today, you can't send data from Google Buzz to Facebook," said Brett Slatkin of Google's App Engine team, and co-author of Pubsubhubbub. "Part of what we are trying to work on is breaking down these barriers between different sites. If I am on Buzz and Marshall is on Identica and Jack is on Twitter, we should all be able to communicate."

Standards have evolved in the real-time Web space, from OAuth to Pubsubhubbub, WebFinger and Salmon (as documented here), but that's not to say there aren't still heated debates over these standards, or even over which version of a standard should be supported. (See this article for a discussion of OAuth 2.0.)

"I try to be a practical person, and when I hear about a family of specifications, it sounds like a family of work," said Dare Obasanjo of Microsoft. "There is clearly a place where we have a common pain that we can work on. There is a bunch of shared pain, and the way you have to get real-time service is to work on APIs, and that is a clear starting point for standards. Pubsubhubbub can help solve that problem, but I get concerned when you have to implement certain specs to solve that problem."

"These specifications we agree on should be useful on their own," answered Slatkin. "When you implement a specification like HTML, you are not buying into an ideology."

As the real-time Web's protocols are debated and deployed, so too are the applications built on them. Google Buzz and Facebook have drawn scrutiny for aggressively converting data users assumed was private into public data, and Netflix recently canceled an algorithm development contest over concerns about privacy violations.

"When talking about privacy, right now, unfortunately, the social networking market is failing, and they have little incentive to encourage user privacy," said Obasanjo. "I am waiting to see when people find what they thought were private updates as part of trending topics on Google and Bing. Users and companies are in conflict."

Obasanjo gave the example of Twitter needing its users to be public in order to drive value into the system. After all, if users were all private, there would be no trending topics, and thus it is in Twitter's best interests for updates to be public. "There is a factor that if a user wants to be private, it subtracts value from the system," he said.

Beyond these concerns, it was argued, the known benefits of the real-time Web only scratch the surface of what could be done with expanded access to real-time data from other sources. Slatkin forecast a time when you could query supply chains for inventory and purchase locally instead, turning economies of scale on their head. Scott Raymond of Gowalla talked about intersecting real-time Web technologies with geodata to show trending locations and the hot parties of the moment, by decaying the relevance of checkins over time. Jack Moffitt, CTO of Collecta, said a development environment for new tools and applications that leveraged zero latency was becoming "very interesting".
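Gowalla didn't disclose its scoring function, but one common way to implement "decaying the relevance of checkins over time" is an exponential half-life weight, where each checkin counts for less the older it gets. A minimal sketch, with a hypothetical six-hour half-life (the half-life parameter and location names here are illustrative, not Gowalla's):

```python
import math
import time

def decayed_score(checkin_times, now=None, half_life_hours=6.0):
    """Score a location by its checkins, where each checkin's weight
    halves every `half_life_hours` (a hypothetical parameter; the
    real service's decay curve was not disclosed)."""
    if now is None:
        now = time.time()
    decay_rate = math.log(2) / (half_life_hours * 3600.0)
    return sum(math.exp(-decay_rate * (now - t)) for t in checkin_times)

# Rank locations by decayed checkin activity: a recent burst of
# activity beats a larger pile of stale checkins.
now = time.time()
locations = {
    "coffee_shop": [now - 600, now - 1200, now - 1800],  # 3 checkins in the last half hour
    "stadium": [now - 86400 * 2] * 10,                   # 10 checkins, all two days old
}
trending = sorted(locations,
                  key=lambda loc: decayed_score(locations[loc], now),
                  reverse=True)
```

Here the coffee shop's three fresh checkins outrank the stadium's ten stale ones, which is exactly what lets a "hot party of the moment" surface in a trending list.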

"All these guys are working on realizing the potential right now, working on real-time data," Kirkpatrick said. "Brett Slatkin said it was important to focus on the unforeseen future - that the systems we work on should support undiscovered use cases. Things are going to get real crazy real soon."

Web-wide adoption of the RSS and Atom standards has eliminated the problem of publishers making their data available, and tools like Pubsubhubbub are working to get data from one site to another faster. "Polling doesn't scale, and you need push notifications to deliver updates. It's possible we will have multiple winners, and we have to consider privacy - people won't want their data available to everyone," said Moffitt.
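The push model Moffitt describes is what Pubsubhubbub standardizes: instead of every reader polling a feed, a subscriber registers a callback with a hub once, and the hub POSTs new entries to that callback as they arrive. A minimal sketch of the subscription request, using the parameter names from the Pubsubhubbub 0.3 spec (the hub, feed, and callback URLs below are placeholders, not real endpoints):

```python
import urllib.parse
import urllib.request

HUB_URL = "https://hub.example.com/"  # placeholder hub endpoint

def build_subscription_body(topic_url, callback_url, mode="subscribe"):
    """Build the form body for a Pubsubhubbub 0.3 subscription request.
    After verifying the callback, the hub POSTs new feed entries to it,
    so the subscriber never has to poll the feed."""
    return urllib.parse.urlencode({
        "hub.mode": mode,              # "subscribe" or "unsubscribe"
        "hub.topic": topic_url,        # the Atom/RSS feed to receive pushes for
        "hub.callback": callback_url,  # our HTTP endpoint that will receive updates
        "hub.verify": "sync",          # hub confirms our intent with a GET challenge
    })

body = build_subscription_body(
    "http://blog.example.com/feed.atom",       # hypothetical feed
    "http://subscriber.example.com/callback",  # hypothetical callback
)
# To actually subscribe, POST the body to the hub:
#   urllib.request.urlopen(HUB_URL, data=body.encode())
```

This is why "polling doesn't scale" has an answer: one notification from the publisher to the hub fans out to every subscriber, rather than every subscriber re-fetching the feed on a timer.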

The element of real-time is being layered across the Web, and it is happening even though developers aren't in complete agreement on the tools needed to optimize the experience, and even while the debates over privacy versus public data remain unsettled. There's also a lot of room for real-time to grow outside the statusphere and into more traditional markets. The question is whether developers can provide solutions that don't have users running to the FTC.
