Regulatory, Technical Issues Bedevil Efforts to Unleash White Spaces for Cognitive Technologies
BRUSSELS -- Spectrum regulators and users must rethink spectrum allocation in order to make white spaces and other shared uses possible, speakers said Wednesday at a Forum Europe conference on an EU policy for dynamic spectrum access. Cognitive technologies will squeeze more out of radio spectrum, but in practice “we're a long way” from having them, said moderator and Aetha Consulting partner Amit Nagpal. Despite successful white spaces trials in the U.S. and U.K., the regulatory issues for dynamic shared access are far from resolved, regulators and industry representatives said.
With demand for wireless data capacity soaring and revenue from voice and texting declining, mobile operators are trying to cut network operating costs and are shifting rapidly to picocells and other small cells, and to fixed backhaul services for wireless traffic, said Pearse O'Donohue, European Commission head of unit for radio spectrum policy. Policymakers must ensure that spectrum use is as efficient as possible for the common good, he said. Europe needs rules that enable shared use, he said.
The EC is examining the issues for a statement due in July, O'Donohue said. The communication will address: (1) How to incentivize use of migration technologies to ensure quality of service and enable more shared use. (2) How to reassure spectrum holders they can better exploit their spectrum by introducing new technologies. (3) How to give sharers legal certainty and predictability about rules and conditions. (4) Whether all regulatory tools are in place to permit economies of scale in the internal market.
The EC now has an “all or nothing approach” to spectrum allocation, O'Donohue said. Either the bands are fully licensed or there’s collective use in unlicensed spectrum, he said. But that leaves out the entire middle part -- licensed shared access, he said. All those tools are needed, he said.
There’s fairly broad support in the U.S. for moving toward dynamic spectrum access, said Ira Keltz, deputy chief of the FCC Office of Engineering and Technology. Several efforts in that direction came to a head in a series of orders on the TV white spaces, he said. The FCC decided to rely on the use of geolocation databases rather than requiring sensing, which has problems, he said. OET has certified one database manager, Spectrum Bridge, and is vetting nine others, he said. One device has been deployed, in Wilmington, N.C., with others under development, he said. There are plans for similar nationwide device deployments that are being held up by the FCC as it seeks to protect wireless microphone users, he said.
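Under the FCC's approach, a white space device determines its own location and asks an authorized geolocation database which TV channels are vacant there before it transmits, rather than trying to sense occupancy itself. As a rough illustration only -- the endpoint, request fields and response format below are hypothetical, not any particular database operator's API -- such a query might look like this in Python:

# Minimal sketch of a TV white space device consulting a geolocation database
# before transmitting. The URL, field names and response format are assumptions
# for illustration; real databases define their own certified interfaces.
import json
import urllib.request

def available_channels(lat, lon, device_id):
    """Ask the database which TV channels are free at this location."""
    query = json.dumps({
        "deviceId": device_id,   # hypothetical field for the certified device ID
        "latitude": lat,
        "longitude": lon,
    }).encode("utf-8")
    req = urllib.request.Request(
        "https://wsdb.example/query",   # placeholder database URL
        data=query,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        answer = json.load(resp)
    # Assume the reply lists vacant channel numbers and the power allowed on each.
    return answer.get("channels", [])

# A compliant device transmits only on channels the database has returned.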
An FCC consultation on dynamic spectrum access sparked a lot of comments but many were “thin on information,” Keltz said. The agency is mulling dynamic trading within all bands, but there are many technical and legal issues to work through, he said.
"We have to change our way of thinking” about spectrum allocation, said Microsoft Senior Technology Policy Director Paul Mitchell. That means thinking not in terms of “open or closed,” or “carrier-driven or consumer-driven,” but about how to combine spectrum-sharing and a balance between licensed and unlicensed spectrum, he said. The solution will involve a wide range of technologies and better receiver tolerance for software radio and other devices, he said.
Microsoft has been trialing TV white spaces technologies in various locations around the world, Mitchell said. It’s trying to prove that broadband that uses white space won’t interfere with broadcast frequencies, he said. The biggest challenge with innovation is that it disrupts business models, particularly when incumbent players are involved, he said. Continued use of the same traditional spectrum allocation model is hampering creativity, he said.
Policymakers must move from viewing sharing as exceptional to where it’s considered the norm, said Google Telecom Policy Counsel Aparna Sridhar. This is an area where the FCC and EC can be setting a philosophical policy that individual businesses can build on, she said. In addition, spectrum inventories are an important step to unlock the potential of cognitive technologies, she said. But industry has to move ahead and not wait for all the uncertainties surrounding such technologies to be resolved, she said.
Getting TV and wireless networks to coexist poses problems, said Simon Mason, head of technical development for Arqiva, a U.K.-based infrastructure provider. The two networks have different technical parameters, he said. TV networks are sparse and the receivers are extremely sensitive, he said. In addition, when TV receivers were developed there was nothing else in the band, so the devices are not selective, he said.
Because they're new, white space devices can be designed to meet very high technical requirements, but because digital terrestrial TV (DTT) receivers in the network are so sensitive and unselective, the power of white space networks must be lowered to coexist with TV, Mason said. “My frustration” as a radio frequency planner is how to mitigate that, he said.
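Mason's point can be made concrete with a back-of-the-envelope link budget. The figures below are illustrative assumptions, not regulatory values, but they show why a sensitive, unselective DTT receiver forces white space power down:

# Illustrative co-channel protection calculation (all numbers are assumed,
# not taken from any regulator's rules).
DTT_WANTED_SIGNAL_DBM = -70.0   # assumed wanted DTT signal level at the TV receiver
PROTECTION_RATIO_DB = 20.0      # assumed carrier-to-interference ratio the receiver needs
PATH_LOSS_DB = 110.0            # assumed path loss from white space device to TV antenna

# Interference at the TV receiver must stay this far below the wanted signal.
max_interference_dbm = DTT_WANTED_SIGNAL_DBM - PROTECTION_RATIO_DB   # -90 dBm
# Work back through the path loss to the power the white space device may radiate.
max_ws_eirp_dbm = max_interference_dbm + PATH_LOSS_DB                # 20 dBm, i.e. 100 mW

print(f"White space EIRP limited to about {max_ws_eirp_dbm:.0f} dBm")
# The more sensitive or less selective the TV receiver, the larger the protection
# ratio, and the lower the power left for the white space network.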
That underlying frustration has regulatory implications, O'Donohue said. There are many legacy users, and even though everyone may think use of the white spaces is a great idea, it may not function in places where there’s interference with DTT networks, he said. Some decisions are so important for economic and social benefits that the policy should be to force legacy systems to change, he said. That’s a political decision, he said. With all due respect to white spaces tests, such as the one in Cambridge, U.K., regulators also have to look to other bands where the transition costs aren’t as steep, he said.
Some global coordination of how the white spaces are used is a good idea, several speakers said. The more we achieve scale, the easier it becomes to apply dynamic spectrum access technologies, said Sridhar. Many of those technologies may be as useful to users in Asia and Africa as to those in Europe and the U.S., or more so, she said. The more that’s crammed into the band, the harder it is to develop consumer devices that will work across bands, creating a need for pan-global cooperation, said Mason. Harmonization will drive down equipment costs, said Keltz. But the EC doesn’t want to be in the position of picking winners, said O'Donohue. It wants to move toward technology neutrality to allow innovators to work, he said.
No one knows what’s coming down the road, Keltz said. Regulators must figure out how to craft a reasonable set of rules that enable innovation without forcing things down a single path, while hoping incumbents adapt as well, he said. So one question is how to motivate incumbent spectrum holders to try something new, he said. The concept of encouraging them to develop new technologies offers a “chink of light” that may help things move forward, Mason said.
The concept of cognitive radio was introduced a decade ago as a “golden promise” to counter spectrum scarcity, said A.H. van den Ende, senior telecom project manager at TNO, a Dutch independent research and consultancy organization. At the time, the need for a great deal of research and development was recognized, he said in an interview. Now, after years of work by universities and industry, and the rollout of some commercial systems, people are starting to see the practical complexities of cognitive radio and dynamic spectrum access in general, he said.
Development of dynamic spectrum access via cognitive technologies must take into account lessons being learned in the TV white spaces, van den Ende said. Among those is that how such access works depends on geography, he said. Technologies that work fine in the U.S. don’t necessarily do so in Europe, where white space is harder to come by and spectrum arrangements fragmented, he said. Another lesson is that two potential cognitive radio technologies, sensing and cognitive pilot channels, to date have proved too ambitious and insufficiently reliable, while database geolocation, though less “sexy,” is a more effective way to let devices know what’s happening around them, van den Ende said. But even that approach is difficult because it must be reliable and consistent in order to protect legacy systems, he said.
Use of white spaces in the TV band is a key exercise to see how dynamic spectrum access could also work in other bands, van den Ende said. Lessons have already been learned from the 5 GHz case, where wireless local area networks were to share spectrum with radar systems, he said. That raised complex questions about how to design wireless LAN systems that can detect and avoid radars while still providing adequate throughput, he said. The 5 GHz radar band was the precursor, with the same issues of sensitive equipment and lack of selectivity in incumbent systems that we are facing now in the TV band, he said.
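The 5 GHz precedent van den Ende cited is dynamic frequency selection: a wireless LAN device listens for radar before using a channel and vacates it for a non-occupancy period if radar appears. The sketch below shows the spirit of that detect-and-avoid loop; the timings and the radar_detected() check are placeholders, not the actual regulatory parameters:

# Simplified detect-and-avoid loop in the spirit of 5 GHz dynamic frequency
# selection. radar_detected() stands in for real sensing hardware; the timings
# are illustrative assumptions, not regulatory values.
import time

CHECK_BEFORE_USE_S = 60        # listen on a channel before using it (assumed)
NON_OCCUPANCY_S = 30 * 60      # stay off a channel after radar is seen (assumed)

def radar_detected(channel):
    """Placeholder for the device's radar-detection function."""
    return False

def pick_clear_channel(candidates, blocked_until, now):
    """Return the first channel not under a non-occupancy timer, or None."""
    for ch in candidates:
        if blocked_until.get(ch, 0.0) <= now:
            return ch
    return None

def acquire_channel(candidates):
    """Keep trying channels until one passes the pre-use radar check."""
    blocked_until = {}
    while True:
        now = time.monotonic()
        ch = pick_clear_channel(candidates, blocked_until, now)
        if ch is None:
            time.sleep(60)                   # nothing usable yet; wait and retry
            continue
        time.sleep(CHECK_BEFORE_USE_S)       # availability check before transmitting
        if radar_detected(ch):
            blocked_until[ch] = now + NON_OCCUPANCY_S
            continue
        return ch   # cleared for use; a real device keeps monitoring while it transmits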
One “exciting” question that remains is what kinds of applications will truly benefit from dynamic spectrum access, van den Ende said. There are still questions about whether quality of service can be maintained, which some categories of applications need, he said. Moreover, not all applications can carry the additional costs of device development, he said. One can conclude that where dynamic spectrum access is concerned, “it is promising, but we're definitely not there yet,” he said. One critical aspect is that the U.S., Asia and Europe may end up with different solutions, so much more coordination is needed to reach a global approach, he said.