Stanford University held a workshop last Friday - The Policy Implications of End-to-End - covering some of the emerging policy questions that threaten the end-to-end paradigm that serves today's Internet so well. It was attended by representatives from the FCC, along with technologists, economists, lawyers and others. Here are my notes from the workshop. I'm going to try to skip describing each individual's background and resume, instead substituting a link to a biography page whenever I can. (Part two of two - part one ran yesterday.)
The final segment of the morning covered caching. The main issue centered on transparent caching, where users ask for certain content but their request is silently fulfilled by a caching proxy server instead, generally without the user having any way to detect the substitution. The standard concept of caching has the user being presented with the same content she would otherwise have gotten from the requested site, but that need not be true - Singapore, China and Australia have all used transparent caches to censor their citizens. This can also be a security violation (are you really talking to the secure server on stupidpettoys.com, or to a proxy in between? Most users won't notice the difference). Ann Brick noted a subsidiary issue - big commercial players have the ability to pay for their sites to be cached, while individuals do not. Similar to the QoS issue, this might be used to discriminate between paying, fast, commercial sites and sites owned by individuals or even competitors.
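To make the mechanism concrete, here's a minimal sketch (in Python, with purely illustrative names and data - no real proxy, site, or censorship system is being modeled) of why a transparent cache is invisible to the user, and why that invisibility is the problem:

```python
# Toy model of a transparent cache. All names and content are
# illustrative assumptions, not any real proxy implementation.

ORIGIN = {  # what the real servers would return
    "http://stupidpettoys.com/": "welcome to stupid pet toys",
}

cache = {}  # the proxy's store; the operator can edit entries at will

def origin_fetch(url):
    """Simulate contacting the real server directly."""
    return ORIGIN[url]

def transparent_fetch(url):
    """What the user's request actually hits: the proxy decides
    whether to answer from its cache, and the user cannot tell."""
    if url not in cache:
        cache[url] = origin_fetch(url)  # first request populates the cache
    return cache[url]

# Normal case: the cached copy matches the origin, so no one notices.
assert transparent_fetch("http://stupidpettoys.com/") == \
       origin_fetch("http://stupidpettoys.com/")

# The censorship case: the operator rewrites the cached entry, and
# every later user silently receives the altered copy instead.
cache["http://stupidpettoys.com/"] = "This page is blocked."
print(transparent_fetch("http://stupidpettoys.com/"))  # not what the origin serves
```

The point of the sketch is that nothing in `transparent_fetch`'s interface reveals whether the bytes came from the origin or from the middle of the network - which is exactly the end-to-end violation the panel was worried about.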
David Clark made the insightful observation that dollars spent on caching don't go to general network improvements -- one small piece of the network is improved by caches, but the same money spent improving the whole network could improve it for everyone. Timothy Denton concluded this segment with the characterization of transparent caching as the difference between "form follows function" and "function follows form": the mere presence of caching and the ability to interfere with content delivery in the middle of the network destroys end-to-end and creates opportunities for mischief.
In the afternoon, there were two larger sessions covering broadband and wireless Internet access. In both areas, the companies controlling these access methods have strong motivations to violate end-to-end principles.
Jerry Duvall led the broadband discussion. He presented a rather fascinating economist's view of the situation -- an economist's world being solely concerned with customers, producers and markets. Laws are necessary to enable markets -- contract law, commercial law, fraud law, and so on are needed in order for markets to function. He summoned up the ghost of Adam Smith with a brief review of capitalism: producers always conspire against the public to extract more profit, and only competition keeps them in check. Marketing, lock-in, monopolization, and predatory pricing are always used by producers. He denied that end-to-end represented any sort of perfectly competitive market, however, suggesting that customer wants complicate the picture -- in some cases, customers actually want bundles from a single provider, and may actually prefer non-end-to-end Internet access. From an economist's point of view, end-to-end is only a means to an end. The end in this case is creating value for the customer. If that involves end-to-end Internet access, fine. If it doesn't, still fine. The value to the customer is paramount; engineering elegance is secondary.
Duvall also suggested that many observers have a naive view of regulation. With regard to the debate over open access to cable systems, he stated that there was no easy way for regulators to "come in and fix it." Regulation implies overcoming the resistance of entrenched players, and in the case of open access to cable systems, AT&T and other cable giants have proven adept at fighting lawsuits in support of their ability to keep their systems closed.
As we've seen previously, there was discussion of the reasons why end-to-end can be violated: sometimes customers want it, but (probably more often) the wants of companies are the driving force. Duvall suggested that the external value of end-to-end in fostering competition and democratic values isn't adequately accounted for in most considerations of the economics of broadband. That is, the cost of violating end-to-end is spread out among many users of the network, while the benefits from that action accrue mainly to individual companies -- in economic parlance, this is called externalizing costs.
Another panelist emphasized the democratic value of open systems, a recurring topic in Lessig's writings. There was a bit more discussion of bundling-as-an-aid-for-novice-users vs. bundling-as-a-way-to-lock-in-customers. Jerome Saltzer reiterated the time-tested solution for monopoly problems: separate the content from the content-carriers. Deborah Lathen, acting perhaps as devil's advocate, asked why the builder of the pipe shouldn't be allowed to monopolize it. Duvall noted that no matter what the FCC might do to regulate cable carriers, economic theory doesn't hold out much hope for relief -- any time there's a monopoly (over the cable pipe), the monopolist is going to be able to extract monopoly rents, one way or another. If regulation affects certain aspects of the business, the monopolist will find some other way to leverage the monopoly for greater profits. The only sure remedy is eliminating the monopoly.
Further audience discussion raised the idea that the concept of "an ISP" is an odd sort of legacy brought about by the necessity to have an intermediary between the telephone network and the TCP/IP network. In the future, the concept of an ISP may change radically. A question was asked: what benefit does the public get by allowing the cable companies to monopolize access? There were no good answers.
Mark Laubach gave a good overview of the architecture of cable Internet access, referring to the DOCSIS standard, which wasn't designed with open access in mind. Laubach stated that "basic IP dialtone" -- that is, a simple TCP/IP Internet connection without frills or bundled services -- should be a consumer right, which should apply to every broadband service regardless of delivery method: cable, DSL, wireless or satellite services.
Peter Huber summarized the open-access debate as it affected phone companies. The phone companies had a 1 MHz twisted pair of copper strands that they swore up and down couldn't be shared. They were ordered to share it, and now are doing so: local and long-distance competition, shared data/voice over that tiny line, co-location at central offices, and so on. Now the cable companies have a 750 MHz coaxial cable that they claim is "impossible" to share. Huber emphasized that whatever the regulations, cable and phone companies should be treated equally. Currently there are disjointed regulations, which (depending on your viewpoint) either unduly hamper phone companies or leave cable companies unfairly unrestricted.
Further discussion brought out the case of Stockholm, Sweden. Stockholm and certain other cities have taken on the job of laying fiber-optic cable as a municipal service, similar to sewer service or water or roads. Since the municipality built the pipe to the home, there is no issue of a company attempting to monopolize the pipe, and any company that wants to offer Internet service over the pipe may do so. As a result, Stockholm residents are getting extremely fast access speeds at prices lower than those U.S. residents pay for cable Internet access, and customers don't have to worry about the cable monopoly steadily reducing their upstream speeds, or banning servers, or whatever other crackdown U.S. cable providers have thought of most recently. The panel then debated whether (and how) it would make sense to move the U.S. to that sort of municipal model. A panelist threw out the figure that true open access to cable pipes might require a choice of 400 ISPs. An audience member suggested that as things are currently going in the U.S., there might be a choice of five ISPs at most, hand-picked by the cable provider.
David Clark added that whatever solution is proposed, it must be an ongoing process -- since cable Internet access is certainly not going to be the final stage of bandwidth development. Finally the broadband session closed with a pithy statement that, despite claims to the contrary, content is not king -- there is, and always has been, more money in individuals talking to each other than in one-way content distribution. The question that remains is how to convince broadband providers that there is more money to be made in selling large quantities of low-profit services than in selling small quantities of more profitable ones.
The day concluded with a session about wireless Internet access. Unsurprisingly, WAP was the first topic to come up: a closed, end-to-end-unfriendly, expensive protocol that is all but deceased in the market, yet still actively promoted by companies that hope to benefit from controlling wireless Internet access.
Karl Auerbach had an insightful comment about why to use plain vanilla TCP/IP instead of a bespoke wireless protocol. Similar to the argument raised by Bruce Schneier and others that using a proven crypto algorithm makes sense because there are a lot of bad protocol writers in the world, Auerbach posited that freely available TCP/IP stacks have had the bugs beaten out of them, but the average proprietary protocol hasn't. The topic shifted to the location information that is now required to be built into mobile phones. The panel discussed the control issues inherent in different network architectures: location information could be built into the phone, and controlled by the user, or it could be built into the cell towers, and controlled by the phone company (or law enforcement, or advertisers). It looks like the second architecture will be the one that is deployed.
Yochai Benkler brought up the issue of spread spectrum changing the rules for FCC frequency allocation -- more communications may shift to frequencies where the FCC does not require licenses to broadcast. Dewayne Hendricks gave a lengthy and interesting description of how amateur radio is currently being used in a manner similar to the venerable Fidonet to pass packet data over the short-wave frequencies via a store-and-forward system. The interesting part is that Amateur Packet Radio has been around for 15 years or so. Hendricks' concept was that the first truly free network would be one composed of independent wireless spread-spectrum devices creating an ad hoc network which could not be censored or controlled by any entity whatsoever. One audience member quipped that disruptive technologies always appear to incumbents as toys.
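The store-and-forward scheme Hendricks described -- the same idea Fidonet used -- is simple to sketch. In this toy model (the node names and route are my own illustration, not anything presented at the workshop), each station holds a message until a link to the next hop opens up, so no continuous end-to-end path ever needs to exist:

```python
# Toy store-and-forward relay, in the spirit of Fidonet and amateur
# packet radio. Names and topology are illustrative assumptions.

from collections import deque

class Node:
    def __init__(self, name):
        self.name = name
        self.outbox = deque()   # stored messages awaiting a contact window
        self.delivered = []     # messages for which this node is the destination

    def accept(self, msg, route):
        if route:
            self.outbox.append((msg, route))   # store until we can forward
        else:
            self.delivered.append(msg)         # route exhausted: we're home

def forward(node, nodes):
    """When a contact window opens, flush stored messages one hop onward."""
    while node.outbox:
        msg, route = node.outbox.popleft()
        nodes[route[0]].accept(msg, route[1:])

# Three stations; B only reaches A and C during separate contact windows.
nodes = {name: Node(name) for name in "ABC"}
nodes["A"].accept("hello via radio", ["B", "C"])
forward(nodes["A"], nodes)   # A -> B during one contact
forward(nodes["B"], nodes)   # B -> C during a later contact
print(nodes["C"].delivered)  # ['hello via radio']
```

Because every node is an independent relay, there is no central point where the traffic can be blocked -- which is exactly the property Hendricks was counting on for an uncensorable ad hoc network.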
Hendricks noted some other wireless WANs, such as one in the San Francisco Bay area using Breezecom wireless cards and antennae. (Coincidentally, Salon did a story on wireless WANs just a few days ago.) Dale Hatfield noted that Hendricks' network could be created today using licensed spectrum, and warned that the greatest danger is incumbent spectrum-holders pushing regulations which protect their investments by making it difficult for the FCC to open up sections of the spectrum for these innovative uses.
Towards the end, one member of the audience (and I do apologize for not catching who it was) pulled everything together by noting the convergence between end-to-end as a technological issue, open access as an economic issue, and democracy and public debate as a political issue. The idea of eliminating "gatekeepers" on the Internet is important for a great many reasons, whether you look at it as a technological issue of promoting progress and innovation, or as an economic issue of fostering competition and preventing monopolies from abusing their power, or as an issue of promoting free and unrestrained speech on the communications media of the 21st century. This is certainly one of the most important issues facing the country today, but relatively few people know anything -- even a smidgen -- about it, or at most they've read a few news reports about the AOL/Time Warner merger. I'm glad to see such a diverse and intelligent group working on the issues, and if they don't yet have all the answers, it's only because they want to get it right.