Monday, February 11, 2013

Is FCC Cooking the Data In Spectrum Grab?

The FCC, quite naturally, wants to find ways to cram new services into an already over-crowded radio spectrum.  Two of the biggest pushes involve repurposing the spectrum currently reserved for local broadcast television.
There are two different sets of actions at work. First, the FCC wants to "repack" TV station allocations into a much smaller band - which, it argues, can free up around 40% of broadcast spectrum for other services. Second, it wants to allow localized, WiFi-style unlicensed data services to use what's called "white space" in the TV broadcast spectrum. Broadcast signals need to be separated by distance and bandwidth to avoid interference, and "white space" refers to those gaps - the areas where existing signals don't reach.
What the FCC proposes is to shift the TV bands from being exclusively reserved for TV to a designation where TV is the primary service and the unlicensed data services are permitted as secondary services (allowed to operate as long as they don't interfere with the primary service). All of this depends on technical issues involving signal coverage and interference protections, and on the computer models used to predict them.
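For the curious: in practice the white-space scheme works through geolocation - an unlicensed device reports where it is and is told which TV channels are free at that spot, based on its separation from protected stations. Here is a minimal Python sketch of that separation logic; the distance thresholds and station data are invented placeholders, not the FCC's actual values, which vary with device type and antenna height.

import math

# Placeholder separation distances (km). The FCC's real rules set
# specific co-channel and adjacent-channel values; these numbers are
# invented for illustration only.
CO_CHANNEL_KM = 14.4
ADJACENT_CHANNEL_KM = 0.74

def distance_km(lat1, lon1, lat2, lon2):
    """Great-circle (haversine) distance between two points, in km."""
    r = 6371.0  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def channel_available(lat, lon, channel, stations):
    """True if a secondary device at (lat, lon) may use `channel`:
    it must sit outside the protection distance of every station on
    that channel and on the adjacent channels."""
    for st in stations:  # each station: {'lat', 'lon', 'channel'}
        d = distance_km(lat, lon, st["lat"], st["lon"])
        if st["channel"] == channel and d < CO_CHANNEL_KM:
            return False
        if abs(st["channel"] - channel) == 1 and d < ADJACENT_CHANNEL_KM:
            return False
    return True

stations = [{"lat": 40.75, "lon": -74.00, "channel": 30}]
print(channel_available(40.80, -74.00, 30, stations))  # False - inside the protected zone
print(channel_available(42.00, -74.00, 30, stations))  # True - in the "white space"

The real system accounts for plenty the sketch omits - antenna heights, terrain, actual contour shapes - which is exactly why the fidelity of the FCC's models matters.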
The NAB (National Association of Broadcasters) is essentially accusing the FCC of trying to rig the computer models it will use in these critical actions - by changing the data sources used in the modeling, and by not allowing independent review of the computer models themselves - possibly in violation of Federal law.
The "violating law" issue comes from the legislation authorizing the incentive auctions that is supposed to generate the allocation and license give-backs at the heart of the "repack", and the models used to determine if the channel reassignments in the "repack" are viable.  The models would also be relied on to estimate the impact of reallocations and other technical shifts, such as whether to allow secondary use of "white space." The NAB's focusing on legislative language the FCC to make its best effort to preserve "the coverage area and population served of each broadcast television licensee, as determined using the methodology described in OET Bulletin 69 of the Office of Engineering and Technology of the Commission."  That is, to try to replicate signal coverage and interference protections as they existed at that time.. 
Rather than the population distribution dataset used since the 1990s, the FCC has started using a newer one - a dataset that was not available when the authorizing legislation passed, and that reflects both population gains and shifting population distributions. The computer models used to evaluate possible "repackings" and determine station reallocations are new, as are the models being used to consider "interference protections." The FCC has resisted industry calls to release the software so that the industry can evaluate it. As a result, coverage maps and impact estimates are likely to vary from those the FCC and the industry have used for decades. Without access to the new data and models, stations face heightened uncertainty in their planning and decision-making.
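The "population served" figure at the heart of the dispute is, at bottom, a simple overlay: predict the field strength in each small grid cell of a station's service area (OET-69 does this with the Longley-Rice propagation model), then add up the population of every cell where the signal clears the service threshold and isn't masked by interference. A toy Python sketch - all numbers invented - shows why swapping the population dataset changes the answer even when the predicted coverage stays identical:

# Each cell: predicted field strength (dBuV/m), an interference flag,
# and its population under two hypothetical datasets. The 41 dBuV/m
# threshold and every value below are invented for illustration.
SERVICE_THRESHOLD_DBU = 41.0

cells = [
    # (field_strength, interfered, pop_old_dataset, pop_new_dataset)
    (55.0, False, 1200, 1500),
    (44.0, False,  800,  300),   # people moved out of this cell
    (47.0, True,   600,  900),   # covered, but lost to interference
    (38.0, False,  400, 1100),   # growth just outside the contour
]

def population_served(cells, pop_index):
    """Sum population of cells with adequate, interference-free signal."""
    return sum(c[pop_index] for c in cells
               if c[0] >= SERVICE_THRESHOLD_DBU and not c[1])

print(population_served(cells, 2))  # old dataset: 2000
print(population_served(cells, 3))  # new dataset: 1800

Same signal, different head count. That is exactly the broadcasters' worry: change the inputs, keep the model sealed, and nobody outside the Commission can verify what "preserved" coverage and population actually mean.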
For an arm of an administration constantly touting transparency, the FCC is behaving in a way that is troubling and ultimately unscientific, and that puts the agency at risk of falling short of its publicly stated goals.

Source: NAB: FCC's Proposed Changes to TV Station Coverage Model May Violate Law, Broadcasting & Cable

