SpectrumTalk

The independent blog on spectrum policy issues
that welcomes your input on the key policy issues of the day.

Our focus is the relationship between spectrum policy
and technical innovation.

Ofcom Chief on Spectrum in an Age of Innovation

Ofcom is the FCC’s counterpart in the UK with roughly parallel jurisdiction. Yesterday, Ed Richards, Chief Executive of Ofcom, gave an interesting talk with the title “Spectrum in an age of innovation”. Here are some interesting statements from it. He starts with

I’m going to begin with a quote, and I’d like to see if you can place it:

“Wireless communication is plagued by a shortage of space for new services. As new regions of the radio spectrum have been opened to practical operation, commerce and industry have found more than enough uses to crowd them.”
Any guesses?

It is in fact a direct quote from a report of the United States Joint Technical Advisory Committee of 1952.


So we see that spectrum shortages are not a new issue, although the FCC of 1952 might have been better structured to deal with them than today’s FCC.

I want to suggest today that the rapidly increasing rate of change in spectrum use – the dynamism in innovation, technology and in market demand – has injected a new urgency into the need to manage spectrum effectively. I’ll argue that, while in general we need greater use of market mechanisms to produce more efficient use, we also need to be both strategic and sometimes straightforwardly pragmatic in our approach.


I was impressed with the parallel treatment of “innovation, technology and in market demand” as well as the point that economics is important, but pragmatism is also key.

The way that spectrum has been allocated and assigned reflected the view that the overall range of use could be relatively easily foreseen and was likely to be fairly settled, with new uses eased in and conflicts managed away over many decades.


Yes, the central planning era of spectrum has left a large impact. Remember that the Soviet Union was also big into central planning. The fall of that government was not due to a modern-day Thomas Jefferson and a band of revolutionaries; rather it was due to total economic failure in a planned economy even though it had the resources of a large nation!

But public spectrum holdings were until recently largely exempt from any such (economic) pressures, and that imbalance – where one sector faces increasingly clear economic incentives and the other does not – means that the balance between public and private holdings is likely to be far short of optimal. Even with an increasing and very welcome focus from the UK government in addressing this, the incentives for public bodies, agencies and government departments to rationalise their spectrum holdings remain variable.

While the US focuses on more spectrum for public safety, this blunt talk admits that the public sector in the UK probably has too much spectrum - a statement we are unlikely to hear from the FCC Chairman or the NTIA Administrator.

It has been very disappointing to witness the extent to which the incumbent mobile operators have chosen to entangle ... (realignment of mobile bands) in litigation or threats of litigation. We recognise, of course, the need for companies to defend their commercial interests and to have recourse to the law in order to do so.

If a regulator or any other public authority makes a decision that is either procedurally or substantively flawed, the right of appeal is there to ensure good decisions replace bad ones. But when litigation becomes essentially strategic rather than based on objective grounds, and when it has the effect of holding back innovation and hampering growth, it is legitimate to ask whether the overall legislative framework fully supports the public interest in this increasingly vital area.


A democracy with the rule of law allows litigation to review regulatory decisions. But sometimes this also has an adverse impact on innovation.

The need for change also extends to those who use spectrum adjacent to the new services. In the (U.K.) case of 800MHz that includes Digital Terrestrial Television. For 2.6GHz, it is air traffic control radars. In the past, the response to the huge planning and mitigation issues that arise from this might have simply prompted the conclusion that it was all too hard to fix.

We no longer have that luxury. It would not be in consumers’ interests or in the interests of the wider economy. And it would not reflect the new realities of a world in which even very important and long established spectrum users may have to adapt to the arrival of new and different neighbours.


Is the “public interest” always absolute protection of incumbents? A thought one rarely hears in Washington from public officials.

Finally, Mr. Richards ends with

In the past, spectrum was assigned to users in the expectation that it would be effectively held in perpetuity. This has led consumers to a commensurate expectation about the lifecycle of their equipment. Now, in this new phase, more frequent change may well be necessary to promote more efficient outcomes for everyone.

In spectrum matters, straightforward long range planning will be replaced by adjustment and adaptation to the dynamism of technology and markets, combined with clear strategic coordination and pragmatism in delivery. As we look ahead, these will be the defining characteristics of successful spectrum management.


RF Globalnet Article on Cell Base Station Antennas and Their Environment


Last January RF Globalnet, a newsletter for wireless technology types, published an article entitled “Antenna Location Is Not An Architectural Decision: Antenna system design and placement is critical to proper system performance”, by Alfred T. Yerger II of Bird Technologies Group, a frequent contributor to the publication. Mr. Yerger argued that technical issues are the key ones in antenna design and showed little interest in architectural issues. He states, “As you can imagine, much of this does not fit well with a landlord’s or building owner’s idea of where the antennas should be placed, or a tower owner’s desire to fill a particular space on the tower.”

This week RF Globalnet published my response to the Yerger article, “Consistency With The Architectural Environment Is A Key Issue In Practical And Pragmatic Cellular Antenna Engineering”. The key issue of the response is stated as

“The point of this essay is to emphasize that good engineering is more than about making systems that work in a nominal sense, it is about making systems work in the real world with real constraints such as cost, size, weight, battery life, and compatibility with their environment in the case of systems that are intrusive into their locations as many cellular base stations are. It is this engineering to meet practical constraints which is a key difference between engineers and physicists.”

My article ends with

“Mr. Yerger ends his article with, "Remember, antenna location is not an architectural decision!" Perhaps not, but in the real world, antennas and other engineered systems have to work in the environment in which they are placed. Being compatible with the local environment and acceptable to neighbors is part of that. The cellular operators and suppliers should start making some real efforts to examine base station design for suburbia from a fresh piece of paper and stop focusing on using products in today's catalogs unless forced to do a very expensive custom design. Alcatel-Lucent deserves praise for ‘thinking outside the box’ with their new concept (lightRadio™), whether or not it ever has a significant role. The rest of industry should rethink its current approaches, also.”


Readers are invited to read both articles and post here with their views of the proper role of architecture/city planning issues in antenna system design. All responses will be posted here subject to constraints similar to present broadcasting FCC limits.

Best wishes to all for a Happy Thanksgiving!

2011 Best Places to Work in the Federal Government Report



The annual Best Places to Work in the Federal Government report has been a recurring theme here. The Partnership for Public Service uses data from the Office of Personnel Management's Federal Employee Viewpoint Survey to rank agencies and their subcomponents according to a Best Places to Work index score. Agencies and subcomponents are not only measured on overall employee satisfaction, but are scored in 10 workplace categories, such as effective leadership, employee skills/mission match, pay and work/life balance.

FCC avoided the survey in 2003-2007, but has participated in 2009-2011. This year FCC ranked 17th out of 32 small agencies. In the 2009 survey, based on data collected in 2008 under FCC Chairman Martin, FCC ranked 28th out of 32 “small agencies” and scored lower than all but 25 of the 278 organizations surveyed. But in 2010, FCC was justifiably proud of being the “most improved agency in the Federal Government”.

(The small agency category is a little arbitrary, as the U.S. Nuclear Regulatory Commission, slightly larger than FCC, is not a small agency.) FCC’s absolute score is down slightly from last year, but The Washington Post’s Federal Diary column points out that in the current political stalemate, most agencies’ scores are down.

As shown in the chart at left, FCC is still in the average zone for small agencies. I suspect one factor is the lack of stability and continuity for top managers at FCC compared to other agencies. For decades FCC has lacked a top tier of senior civil servants who survived presidential and chairman transitions. (The Mark Fowler to Al Sikes transition was one of the “bloodiest” for senior managers even though Reagan remained President.)

The reasons for the senior management changes are complex, and there is fault both at the 8th Floor political level and among the many career civil servants who do not act like nonpartisan “British civil servants” as they rise in their careers, but rather tend to pander to the political crowds. Bob Pepper was a notable exception, serving in high positions at FCC under 6 chairmen of both parties. That is truly rare now. But this type of continuity is essential to bridge the gap between the 8th Floor and the staff and make FCC a more effective agency.


FCC Starts Transparent ex parte Rule Enforcement


FCC has had ex parte rules on its books since the late 1970s. The Commission is unique among administrative agencies in the federal government in requiring outside parties to make filings about their ex parte meetings with Commission officials - as far as I can tell, every other administrative agency in the federal government has its staff write up memos about such meetings and insert them into the public record. Only FCC depends on filings by external parties who have mixed incentives with respect to transparency.

On June 1, 2011 major changes to the ex parte rules went into effect along with a promise of greater enforcement. As far as I can tell, the only enforcement in the previous 30+ years was one letter sent from the former Cable Services Bureau. There was never any action from the Office of General Counsel, which is responsible for ex parte enforcement. A new FCC page on ex parte enforcement reveals a variety of actions in the past year.

In particular, the document shown at the top of this post is the first actual finding of a violation EVER! There are 2 other admonitions as well as a referral to the Enforcement Bureau “to determine whether a forfeiture is warranted”.

Never having attended law school, your blogger is still convinced that the threat in the revised rules of fining ex parte violators was a bluff, as the terms of Title V of the Communications Act dealing with fines do not permit them in such cases. I note that the R&O adopting the new rules gave no references to the authority for fines.

But, congratulations to OGC for taking the first steps in enforcement. Maybe after they try enforcing the rules they will learn how impractical they are and recommend using the process that the rest of the federal government has used for 30 years.


EAS Test



One of the big news items this week was the first national Emergency Alert System test ever. Press coverage was mixed. Here is one report that said

“The much-hyped national test of the Emergency Alert System was held Wednesday afternoon and now federal officials are saying the test didn't go exactly as planned. Right after 2 p.m. the Federal Emergency Management Agency initiated the nationwide Emergency Alert System test and now that test is being called a complete failure.”


Broadcasting & Cable took a more nuanced view,

The fallout, as it were, from the FCC's first national test Wednesday appears to be that it worked in most areas of the country, but not in others. Anecdotal reports had the 2 p.m. alert airing on some stations, but not on others, and of varying lengths.


Actually, the issue of what worked and what didn’t really isn’t that important for this initial test. As the Commission said prior to the test,

Although local and state components of the EAS are tested on a weekly and monthly basis, there has never been an end-to-end nationwide test of the system. We need to know that the system will work as intended should public safety officials ever need to send an alert or warning to a large region of the United States. Only a complete, top-down test of the EAS can provide an appropriate diagnosis of the system’s performance.


However, this test brought back memories. In the first stage of what became my 7 year exile from the spectrum policy parts of FCC, which began in 1985 soon after the Commission’s approval of the Docket 81-413 rules that set the framework for Wi-Fi and Bluetooth but were opposed by key industry players, I was given by then FCC Managing Director Ed Minkel the odd job of a crash review of the former Emergency Broadcast System (EBS), predecessor of EAS. EBS replaced the previous CONELRAD system in 1963. (Those of us of a “certain age” can recall the little CONELRAD “CD” symbols on all AM radio dials at 640 and 1240 kHz. Younger readers can see this on antique radios.)

A major problem with EBS was that the wrong lessons were learned from the 1971 false activation, which was caused by a teletype operator inserting the wrong tape for a test. FCC was very embarrassed by this incident, although it was entirely the military’s fault, and implemented a number of changes in EBS. In my review it became clear that the main impact of these changes was to prevent future false activations even if it greatly reduced the likelihood of EBS working during a real Cold War crisis.

A contrived EBS testing program made sure everything looked OK. For example, it is hard to say whether I was more shocked or amused to find out during a 1985 visit to ABC’s Washington Bureau that it was staffed only 2 shifts/day even though it had a major role in the distribution of presidential EBS messages. This role had been decided years before, but the third shift was later eliminated as a cost saving measure. While the role could have been performed by ABC’s New York Bureau, which was still a 24/7 operation, the problem was never noticed in the contrived EBS testing of that era, which always took place during the main shift and had the active participation of high ABC management since the tests were always announced in advance.

I have had nothing to do with EBS or EAS since this 1986 study, but I suspect my report’s section on testing is relevant today also:

POSSIBLE NEAR TERM IMPROVEMENTS



The purpose of this section is to present possible near term improvements to EBS for further consideration. Only improvements which can be implemented within a year with little or no out of pocket costs have been considered. ...

In my opinion, the most critical area for improvement deals with the need for realistic exercises. By realistic exercises, I mean nationwide live over the air broadcasts of White House originated test messages (which could be the same text as today's station weekly test message) at random times, without advance industry notification, at least once a year. The only required testing for an individual station is a weekly test at a time of its own choosing during which it broadcasts the EBS alerting signal and a brief test announcement. Participation in quarterly closed circuit tests is voluntary. The week of these tests is announced in advance and the tests always take place during the business day. The television networks have not chosen to participate in these tests although the radio networks do. (However, PBS did participate in one such test.) ...

As was said previously, I do not believe that it is possible to have a high level of confidence in EBS achieving its requirements with the current level of exercises. (For example, the current testing program has failed to identify the fact that at least one of the radio network facilities in Washington is not staffed 24 hours/day and has no planned method for transmitting the audio EBS feed through to its headquarters in New York. Since the present closed circuit tests are only done during the normal working day, this problem was never observed.)


So the EAS test this week was a long overdue step forward. The actual results are not as important as the lessons learned if the Commission commits to a realistic testing program to assure that the system actually works and continues to undo the wrong lessons learned from the 1971 incident.

Doug Sicker:
FCC -> NTIA

The NTIA website now has this announcement:

Douglas Sicker, Chief Technology Officer and Senior Advisor for Spectrum

Douglas Sicker is NTIA’s Chief Technology Officer and Senior Advisor for Spectrum. He is also an endowed professor in the Department of Computer Science at the University of Colorado at Boulder, with a joint appointment in the Interdisciplinary Telecommunications Program.

Dr. Sicker has held various positions in academia, industry, and government. Before joining NTIA, he was the Federal Communications Commission’s Chief Technologist. Previously, he served as a senior advisor on the FCC National Broadband Plan and, before that, as Director of Global Architecture at Level 3 Communications, Inc.

Earlier still, Dr. Sicker served as Chief of the FCC’s Network Technology Division. After leaving this agency, he served as Chair of the Network Reliability and Interoperability Council steering committee, an FCC federal advisory committee that focuses on network reliability, wireline spectral integrity and Internet peering and interconnection. He also served on the Technical Advisory Council of the FCC. In addition, he has also held faculty and industry positions in the field of medical sciences.

Dr. Sicker is a senior member of the IEEE, as well as a member of the ACM and the Internet Society. He has chaired and served on the program committees of numerous technical conferences including IEEE, DySPAN, ISART and TPRC. His research interests include network and wireless systems, network security, and telecommunications policy. He has research funded through the NSF, DARPA, the Internet Society and the Federal Aviation Administration. He holds a Ph.D. from the University of Pittsburgh. A full CV is available on his faculty page at the University of Colorado at Boulder.


In July 2010 we announced here his arrival at FCC. His move to NTIA was rumored for months and then announced in such a low key way that the search engine on the NTIA website can’t find it! We congratulate Doug on this surprising personnel change and wish him the best of success in his new position.


FCC Forum on Indoor Deployments of Small Cell Sites


On October 28, 2011 FCC held a Public Forum on Indoor Deployments of Small Cell Sites in the Commission Meeting Room with live video online. The forum was organized by the Wireless Telecommunications Bureau, in conjunction with the FCC's Technical Advisory Committee (TAC) Small Cell working group and Spectrum Task Force. I have been somewhat critical of how little the TAC has been asked to do and how little it has accomplished since its formation, but this was certainly a positive move, and the FCC staff and the TAC members deserve credit for bringing public attention to this important issue.

The FCC PN on the event said,

Recent developments in technology offer an increasingly wide array of products to provide wireless coverage and capacity in limited or confined areas. Together, they offer potentially useful solutions to addressing the exploding demand for spectrum that is being driven by the exponential growth in wireless data services. The forum will provide an overview of small cell technologies currently available or soon to come on line, including software defined radios and enhanced Wi-Fi in both licensed and unlicensed spectrum. In addition, panelists will explore the business opportunities and challenges involved in expanding wireless data coverage. Finally, the forum will assess the potential economic impact of small cell deployments, particularly with respect to job creation, and explore possible policy approaches.


Cisco/AT&T femtocell


That is basically what was discussed: the forum dealt both with femtocells, which act as tiny base stations for cell phones indoors and then connect to the network over the Internet - effectively limiting CMRS spectrum use to very short distances - and with Wi-Fi base stations that extend the network to phones and smartphones that have both Wi-Fi and the more usual CMRS modulations. The cellular industry’s love/hate relationship with unlicensed spectrum in general and Wi-Fi in particular will be the subject of a future post here - stay tuned!

Also interesting to your blogger was what was not discussed. This fell into two basic categories. First, will the growth of indoor small cells impact the voracious demand for spectrum of the CMRS gang? Although the official CTIA party line is that spectrum is the only way to solve the “shortfall”, and this is dutifully echoed in the FCC’s recent “infographic”, it would appear that the technologies discussed at the forum would also have an impact on spectrum needs. As we have stated before here, wireless capacity is a function of 3 factors: spectrum, technology, and infrastructure. The speakers successfully evaded this issue.

A second issue is whether femtocell technology should only be used for connections to the public switched network or the Internet using standard CMRS modulations. Today’s cellular industry is focused on “killer apps” like the iPhone that can affect their bottom line in massive ways, and has little or no interest in niche markets. For several years your blogger has been trying to interest cellular carriers and manufacturers in using variants of femtocell technology to meet niche applications that otherwise would require dedicated bands.

Independently, Qualcomm began an R&D project exploring such a complementary use of cellular spectrum and earlier this year announced it as FlashLinq™, which “operates in licensed spectrum as a managed service”. This is an example of using short distance links in CMRS spectrum to meet needs not served by traditional CMRS services. While unlicensed spectrum might be used for such purposes, CMRS spectrum under carrier control offers the potential of much higher reliability. It would both fill the “white space” that is inevitable in operational cellular spectrum and create new revenue for the carriers. Services that might be provided include wireless microphone service where high density is required, such as in theaters and concerts, and medical information in hospitals where great reliability is needed. But such niche services are not “killer apps” in the eyes of the CMRS top leaders.

Well, here is another way to look at it: niche markets like wireless microphones and medical uses all have dedicated exclusive spectrum as their preferred solution. We have seen more medical allocations in the US in recent years, and other countries have dedicated wireless mic bands. Maybe the leaders of the CMRS community should look at these applications and innovative technology such as FlashLinq not as a modest revenue source but as a way to eliminate spectrum competitors who are competing at FCC’s trough for the same spectrum the CMRS community seeks. It may not matter that such applications are not “killer apps” if serving them in existing CMRS spectrum eliminates another roadblock to increased CMRS spectrum.
