After studying traffic growth predictions and plans to increase capacity, Nemertes predicts that the Internet will start to seem pokey as early as 2010, as use of interactive and video-intensive services overwhelms local cable, phone and wireless Internet providers. The findings were embraced by the Internet Innovation Alliance (IIA), a tech industry and public interest coalition that advocates tax and spending policies that favor investments in Web capacity. “We’re not trying to play Paul Revere and say that the Internet’s going to fall,” says IIA co-Chairman Larry Irving. “If we make the investments we need, then people will have the Internet experience that they want and deserve.”

Nemertes says that the bottleneck will be where Internet traffic goes to the home from cable companies’ coaxial cable lines and the copper wires that phone companies use for DSL. Cable and phone companies provide broadband to 60.2 million homes, accounting for about 94% of the market, according to Leichtman Research Group. To avoid a slowdown, these companies, and increasingly, wireless service providers in North America, must invest up to $55 billion, Nemertes says. That’s almost 70% more than planned. Much of that is needed to run costly new high-capacity lines.

Verizon is replacing copper lines with fiber optic for its FiOS service, which has 1.3 million Internet subscribers. Johnson says that cable operators, with 32.6 million broadband customers, also must upgrade. Most of their Internet resources now are devoted to sending data to users, not to users sending data. They’ll need more capacity for the latter as more people transmit homemade music, photos and videos.

[SOURCE: USAToday, AUTHOR: David Lieberman]

About joly

ISOC member since 1995


  1. CircleID provides this excerpt from the report:

    It’s important to stress that failing to make that investment will not cause the Internet to collapse. Instead, the primary impact of the lack of investment will be to throttle innovation: both the technical innovation that leads to increasingly newer and better applications, and the business innovation that relies on those technical innovations and applications to generate value. The next Google, YouTube, or Amazon might not arise, not because of a lack of demand, but due to an inability to fulfill that demand. Rather like osteoporosis, the underinvestment in infrastructure will painlessly and invisibly leach competitiveness out of the economy.

    The report is freely available for download (registration required).

  2. Comment by Dave Burstein

    Nemertes’ primary conclusion is sound. People will probably want more local access – faster DSL and cable. This is almost certainly true – many people today want more than the maximum 1 meg up, 3-6 meg down that the telcos offer for DSL. Cable upstream is typically even slower. That’s a small fraction of the bandwidth needed to send grandma the DV video from a $400 camcorder. As more people move to HD and multiple TVs, as well as other bandwidth-hungry apps, many will want higher speeds.

    The PR people stretched this, possibly out of ignorance, including a headline “User Demand for the Internet Could Outpace Network Capacity by 2010.” “The Internet” is the network of networks, the part that “will scale nicely.” The PR (partially paid for by AT&T) added an uninformed comment from Larry Irving: “a critical issue facing the Internet …potentially face Internet gridlock that could wreak havoc on Internet services.” That’s directly opposite the actual conclusion of the report: that the Internet does fine, and the local loop will have speeds higher than today’s (except possibly with obsolete shared cable modems). The press release, but not the report, had the howler “it may take more than one attempt to confirm an online purchase.” Confirming an online purchase requires a small number of bytes and is not very sensitive to latency or even a significant slowdown.

    Some careless reporters then took it much further, including transforming a “could” that was unlikely into a “will,” as if it were almost inevitable. If his paper were not in the midst of a round of layoffs, I’d single out one particularly bad story.

    Author Johna Till Johnson and sponsor Larry Irving are acquaintances who have done work I respect. Johna should have been far more careful about material attributed to her organization, and about the mistaken comments by her partners and by reporters. We also have honest disagreements. I believe some of her numbers, like typical Internet usage today, are much too high. Because the carriers hide the data, neither of us can be sure who has it right. I also believe she’s reading far too much into a simple model, clever as it may be.

    Former government officials Larry Irving and Bruce Mehlman are partners in the “Internet Innovation Alliance”, which bills itself as a non-profit with broad support. I believe that, in fact, most or almost all of their money comes from the carriers and their suppliers. Mehlman is a well-paid registered lobbyist for AT&T, and they consistently advocate positions useful to the Bells. IIA may have found a loophole that allows them to skirt the law on registering as lobbyists, but it’s disingenuous to suggest they are independent.

    All of which makes me wonder why I, Karl of DSL Reports, and others have spent hours trying to get the correct facts out. We should be able to bypass foolish mistakes like this. But folks like Senators and FCC Commissioners far too often believe what they read in the papers. That’s one reason even honest policymakers (Mike Powell) make profoundly wrong decisions. No one has calculated how much the telcos are spending “creating a climate of opinion.” If the number came out, it would be shocking. My best guess is that upwards of $700M is spent each year to influence the FCC and policy, but I can’t come close to proving that.
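
Burstein’s bandwidth arithmetic is easy to sanity-check. Below is a minimal sketch; the DV bitrate, the 1 Mbit/s upstream cap, the five-minute clip length, and the 2 KB confirmation size are illustrative assumptions of mine, not figures from the report:

```python
# Two back-of-the-envelope checks: uploading camcorder DV video over a
# typical DSL upstream, and "confirming an online purchase".

DV_BITRATE_BPS = 25_000_000   # ~25 Mbit/s for standard DV video (assumed)
UPSTREAM_BPS = 1_000_000      # 1 Mbit/s DSL upstream cap (assumed)
CLIP_SECONDS = 5 * 60         # a hypothetical five-minute camcorder clip

clip_bits = DV_BITRATE_BPS * CLIP_SECONDS
upload_seconds = clip_bits / UPSTREAM_BPS
print(f"5-minute DV clip: {clip_bits / 8 / 1e6:.0f} MB, "
      f"about {upload_seconds / 60:.0f} minutes to upload at 1 Mbit/s")

CONFIRMATION_BYTES = 2_000    # generous guess for a purchase confirmation
confirm_seconds = CONFIRMATION_BYTES * 8 / UPSTREAM_BPS
print(f"Purchase confirmation: about {confirm_seconds * 1000:.0f} ms at 1 Mbit/s")
```

The two orders-of-magnitude gap is the point: a single home movie saturates the upstream for hours, while a purchase confirmation is over in milliseconds even on a slow link.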

  3. Comment by Bob Frankston

    The title of the report is “The Internet Singularity, Delayed: Why Limits in Internet Capacity Will Stifle Innovation on the Web”. You can argue that the conclusions (section 9) are not that sensational, but the tone throughout the report is, and the authors assume that high capacity is the limiting factor, when I argue strongly that it’s the lack of wire-indifferent ubiquity, and that their recommendations actually exacerbate that problem.

    As I said, I didn’t want to respond to the report point by point. It presumes a broadband model with high-bandwidth (i.e., video) applications driving the dynamic. Even if you accept the presumption that more broadband is the solution, the report fails to look at what the real bottlenecks are in today’s broadband distribution model and at the implications of Moore’s-law improvements in edge technology and architecture. Translating a demand for more connectivity into more DSL and cable is far too narrow a claim. This is akin to the modem crisis, in the sense that the solution was not more modems but removing dial-up as the constraint.

    The real limit on innovation is not a lack of high-speed connectivity so much as broadband itself – DSL and cable – which confines innovation to the billable paths owned by the carriers, who set the price and configuration hurdles. The inability to presume (wireless) connectivity is a far more important limit, and the proposal for more carrier-provided capacity will continue to thwart innovation. The report also assumes wireless connectivity in the carrier 3G/4G and WiMax model, with no mention of pervasive Wi-Fi connectivity. We see this with the Kindle, which uses Sprint’s EVDO network, thus creating a tax on every book; the user is unable to extend connectivity, and forget traveling outside the US.

    There is a fundamental problem with a methodology that projects from an architecture which is still repurposed telephony rather than one that is composited from local connectivity. More to the point, the Internet is about opportunity and discovering what you can do with it. We have a broadband system explicitly designed for video distribution, so it should be no surprise that we use it for video distribution, but it’s wrong to say that means it’s all about more video distribution. Telephony is about planning and guarantees, so you have to build capacity for the applications you can anticipate and which generate revenue. But even if you accept that the future is more HDTV, that doesn’t mean more HDTV channels in broadcast mode. It just means HD bits delivered, and we’re distributing prime-time content over the 1% of the broadband path we get for “Internet”. A number of communities around here have three broadband paths with redundant content. That’s a huge amount of latent capacity, even before we change the architecture and assuming it’s still about TV.

    If you look at the analysis of wireless the report actually underestimates the potential capacity by orders of magnitude by limiting the analysis to the Telco-centric model and ignoring all those access points with megabits per access point. That cuts both ways – does that add “access” capacity or does it increase the traffic and strain the capacity as it would if you assume a broadband distribution model?

    The history of the Internet has shown that demand creates capacity as we become more adept at using the infrastructure and understanding how to take more advantage of it. While I argue we’ve barely begun to take advantage of the available edge capacity (the report seems to assume that backbone capacity is not a problem), what is the “investment model”? Given the orders-of-magnitude uncertainty in the report, it’s surprising that they are so willing to put down numbers – it’s like converting 1000 km to 621.371192237 miles. The bigger issue is where the investment is to be made, and how. If we presume an investment in local infrastructure in response to perceived needs, then we have a model that adapts nicely to growth – those who have a problem invest in capacity. If we accept a Moore’s-law model for adding capacity, the cost is already very low and will become too trivial to notice – it’s already far less than just about any other municipal infrastructure we have. Just look at the cost of connecting your house to the electric grid – I’m upgrading my service now and it’s a lot more expensive than a fiber connection would be using current technologies. It’s a 1/8th trench vs. a huge pipe with thick wires and all sorts of other challenges.

    But the report says: “The only hope to close the gap permanently is that technology will provide access solutions that are less capital-investment intensive. Improvements in DSL technology, the rapid build-out of optical access and the emergence of wireless alternatives for the last mile will help, but ultimately, access investment by the carriers will be required in order to address the big crunch from a supply side.”

    That is not just wrong but dangerously wrong – it’s like investing in the foxes to increase the number of hens. And what is “supply side” for the Internet?

    They do have a disclaimer explaining their sensitivity analysis, and given that their model is flawed, the results should be viewed modulo that model. Digital systems make modeling especially difficult – the report says that small changes in utilization result in major increases in demand – but it cuts both ways. But, as I said, Moore’s law is about supply, not demand.

    The report itself is long and full of qualifiers, and I presume the authors are open to counter-arguments. The danger, however, is that these kinds of reports are not benign – they feed lobbyists’ need for credibility. All they need is a conclusion that says the poor carriers need more money and incentives to save the Internet. That can do real harm, which is why I feel obliged to respond.
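
Frankston’s km-to-miles jab is about spurious precision: when the inputs are only known to an order of magnitude, most of the reported digits are noise. A minimal sketch; the choice of one significant figure is my own assumption, standing in for the report’s order-of-magnitude uncertainty:

```python
# Demonstrate false precision: a conversion carried to nine decimal places
# versus an answer rounded to what the input uncertainty actually supports.
from math import floor, log10

KM_PER_MILE = 1.609344  # international mile, exact by definition

def round_sig(x, sig):
    """Round x to `sig` significant figures."""
    return round(x, sig - 1 - floor(log10(abs(x))))

miles = 1000 / KM_PER_MILE
print(f"Naive conversion: {miles:.9f} miles")          # absurdly precise
print(f"Honest answer:    about {round_sig(miles, 1):g} miles")
```

If the underlying estimate could be off by a factor of ten, only the leading digit of any derived number carries information; everything after it is an artifact of the arithmetic.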
