The Federal Communications Commission started off the last of its first round of broadband workshops with some big-picture pondering of ideas and outside-the-box developments that could shape the future of broadband.
They included the bifurcation of the Internet into a number of nets, ensuring the plan was sufficiently flexible to deal with the fluidity of change, and the need for more research and development.
Moderator Jon Peha, the FCC's chief technology officer, said the goal was to look at cutting-edge broadband ideas, then figure out how to use the broadband plan to hone that edge.
Dick Green, former head of CableLabs, put in a plug for cable's last-mile architecture, which he said was built to be adaptable and flexible. He pointed out that the industry was on its third generation of DOCSIS, growing from 1 Mbps at the outset to 10 Mbps now to 100-plus Mbps, and looking at over 300 Mbps in the next iteration.
He said that there was enormous capacity left in the hybrid fiber-coax model, pointing out that the actual drop to the home was capable of handling 5 gigabits.
But he said one of the keys to handling the growing capacity from video and whatever else comes down the pike, and pipe, is transitioning from bandwidth-hogging analog to digital.
David Clark, professor and senior research scientist at MIT's computer science lab, dominated much of the conversation in the first half of the workshop, which focused on those big ideas. He was even asked by impressed FCC staffers to sit in on the following panel, on Internet TV. In fact, all of the first panel's participants got high marks from staffers looking for help in drawing up the national broadband rollout plan, due to Congress by Feb. 17, 2010.
One staffer suggested they would bolt the panelists to the floor and make them write the broadband plan to assuage the concerns of broadband advisor Blair Levin about hitting the Feb. 17 deadline. "Make yourself comfortable," he joked. "Food will be brought in."
Clark argued that wireless would not become a substitute for wired broadband, saying it would wind up being some combination of both. Fluidity was the concept he drove home, about both the state of broadband and how the national plan should be structured.
He said the goal was not a one-time objective but a continuing process, which meant sustainability, a concept cable operators have long said was key to any government proposals.
"One-time money infusions don't work," Clark said, especially the investment the government has made to date, which he called "miserable." He said it begged the question of whether the U.S. government cared about leading on the broadband front.
Clark said that both the 'net and the stakeholders would be changing, warning the FCC that "by the time you shape them, they aren't there."
Existing broadband networks have done a pretty good job of keeping up with Moore's law of exponential change, he said, essentially milking the fiber base built out during the dotcom bubble. One key will be the last mile and how wireless fits into that equation.
Clark also argued for a "dynamic definition" of broadband given the changes that will be driven by security, mobility and management issues.
If the Internet becomes lots of nets for business and private use, following the path of virtual private networks now used by many businesses, it could challenge the definition of an ISP or of what neutrality means, or lead to a kind of de facto unbundling, he said. All those would have implications for how the FCC structures its plan.
Clark wasn't expecting the FCC to predict the future, but said its plan needed to deal with uncertainty.
Another big issue was research and development. Most panelists agreed that the U.S. needed to do more of it.
Rob Atkinson, president of the Information Technology and Innovation Foundation, said he was troubled by the lack of investment in R&D, saying there was no institutional system to bring together all the key players in one place and focus on generic tech development.
He said Bell Labs was a shell of its former self, and that U.S. investment lagged behind many Asian and European countries.
Green agreed that more research was needed, perhaps through the sponsored research centers proposed by another panelist, but he also said that real-world test beds, rather than theoretical ones, might be more useful. He said it was hard, on a purely theoretical basis, to understand how all the stakeholders fit together, and that there was a need for experimental platforms, like the TV Everywhere online video model being tested by cable operators and telcos.
He said that without such real-world testing, combined with more R&D, he was concerned that the resulting broadband developments would be incremental approaches rather than the "big leaps" that would be necessary.
Clark said he didn't think there would ever be a return to the centralized Bell Labs approach, calling it a top-down, structured management model that would not fit a space as exciting, innovative and unregulated as broadband. He said that while the chip market may have moved overseas, "we are still pretty good at getting big ideas into the marketplace."
Van Jacobson, research fellow at the Palo Alto Research Center, said one problem with research is that the dotcom bubble encouraged academics to think short-term, because there were fortunes to be made by turning research into a business model as quickly as possible. He said you can still get innovation out of that model, but not something like the Internet, which was designed with a longer horizon in mind.
He also blamed Congress somewhat for a directive in 1995 that research needed to be more "commercially relevant." That, too, he said, encouraged long-term thinkers to think short term.
Atkinson sought a middle ground, saying it was a false dichotomy between Bell Labs and the current state of R&D. He said the answer was to incorporate more organization with the same entrepreneurial spirit.