Friday, September 30, 2011

Where do people go for news?

Where do people go for news?  It's a question surveys ask regularly, and one that has shown some interesting shifts over time.  However, the question normally asks only generically about "news," or at best asks separately about local news and national news, recognizing the different sets of uses and gratifications bundled into the generic "news" label.
The Pew Center's Project for Excellence in Journalism and the Pew Internet & American Life Project have just released a study that starts to unbundle the concept of news, showing how an active audience brings more nuance and complexity to its news consumption habits.  For this study, the Pew centers identified 16 different topic areas in news and asked participants about their preferences and use.  The goal was to develop a more nuanced understanding of an increasingly complex and competitive media ecosystem.  An example: while local TV news remains the most popular source for "news information," it turns out that adults "rely" on local TV news only for weather, traffic, and breaking news.
Also, not only has the proportion of people identifying the Internet as their preferred news source been growing, but Web-only news outlets are becoming the most-relied-on source for key areas of local news and information (listed as the key source for information on education, local business, and restaurants, and tied with newspapers as the top source for news and information on jobs, schools, and housing).
Opinion about newspapers as a news source produced some interesting results.  Newspapers were listed as the most-relied-on source (first, or tied for first) for 11 of the 16 categories.  On the other hand, more than two-thirds of respondents felt that if their local newspaper disappeared, it would not affect their ability to stay abreast of news and information about their community.  In other words, while people felt they could rely on newspapers for many topics, newspapers were not seen as necessary or critical news outlets.
Part of that may be a result of the expansion of new types of digital and mobile media.  Almost half of respondents use mobile devices for local news and information, and 17% get local news from social media at least once a month.  And don't forget that more than half of the sample report regularly getting local news and information from friends and neighbors (via word-of-mouth).
The good news for news and journalism generally is that almost two-thirds of the sample said they regularly used and relied on at least three different media sources, and 45% chose not to indicate a "favorite" outlet - suggesting the rise of a smarter, more active news consumer, one who assembles and uses a range of media to meet their news and information needs.  The study also found that 41% of the sample qualified as what Pew calls "local news participators" - people who share links, leave comments, and even contribute articles and opinion pieces about their community to local online sites.

Sources - Internet, Mobile Compete With TV as Key Local News Sources, Online Media Daily
How people learn about their local community, Pew Research Center report

Amazon's Kindle Revamp & the Future of Digital Media

As predicted, Amazon announced its first entry into the tablet market on Wednesday, the Kindle Fire. On the technology front, it's not going to directly challenge Apple's iPad.  The Fire has a color touchscreen, but it's only 7 inches (slightly larger than the Kindle reader at 6") compared to the iPad's stunning 9.7-inch display.  The Fire has no camera or microphone, cannot connect to a 3G network (it does have WiFi), and is limited to 8 GB of internal storage (the iPad comes with up to 64 GB).  Its stated battery life of 8 hours is several hours less than the iPad 2's expected time on a charge.  Running the Android OS, the Fire pulls from an Android app base that still has a way to go to match the range of apps available for Apple's iOS system (approaching half a million).

On the other hand, the Fire isn't being marketed as an iPad-killing high-end tablet.  It seems to be envisioned more as an extension of the Kindle branding approach - a device for accessing and using digital media content - only this time accommodating audio and video content (especially content acquired through Amazon).  As the lead feature on the Amazon Kindle Fire product page states, "Movies, apps, games, music, reading and more," with "18 million movies, TV shows, songs, magazines, and books" available through Amazon.  Screen size and battery life are sufficient for regular personal use, the Fire offers stereo speakers (for stereo on iPads you need headphones), and the limited onboard storage is offset by free Cloud-based storage for all content acquired through Amazon.  Further, the Fire comes with a one-month trial of Amazon's Prime membership, and Prime members get free streaming access to more than 10,000 movies and TV shows.  Amazon's MP3 store regularly offers free songs and samplers, and the Kindle bookstore continually offers free promotional titles, access to hundreds of thousands of older public domain titles, and connection with a growing network of public libraries offering eBook loan services.  The Fire is aimed at heavy media consumers rather than Internet and computer users.  But it's not limited to just media: the Fire includes a customized web browser, and the Android OS means it can run games and other Android apps.  And it's priced at $199, while the iPad 2 starts at $499.

John Gruber, blogging at Daring Fireball, gives a good summary of the contrast between Apple and Amazon in the tablet market -
"The iPad takes it on from the high end. It's the best possible device in that price range from the world's best maker of devices. The Kindle Fire takes it on from the low end. The iPad is a credible laptop replacement for many people—and with iCloud and another year or two of hardware improvements that's going to be true for more and more people. The Kindle Fire is a laptop replacement for almost no one. It's a peripheral, not a second computer—and it's priced accordingly."
Analysts expect Amazon to sell 2.5 million Kindle Fires in the first two months (expected to start shipping Nov. 15), and 13-15 million in 2012 (in the U.S. only, for now).  On the other hand, projections are for Apple to sell more than 50 million iPads worldwide in 2012.

While most of the hype has centered on the Fire as Amazon's first tablet, Wednesday's product launch went well beyond that, introducing a range of new models and price cuts.  Analysts were anticipating a $249 tablet offering, and perhaps the first sub-$100 basic Kindle.  Amazon's big surprise was not only pricing the tablet at $199, but offering three different models that break the hundred-dollar price point - if you're willing to go with the "Special Offers" service, which lets Amazon put ads or offers on the device's screen saver (without the ads, prices run $30-$40 higher).  There's a basic reader at $79, a touchscreen e-Ink version at $99, and a version with Amazon's traditional keypad, also at $99.  Both the Touch ($149) and Keyboard ($139) models also come in 3G versions that allow free downloads outside of WiFi areas.  The larger-screen DX model remains available as well.

The new price points could well prompt another huge expansion of the eBook market this holiday season, rapidly growing the ownership base.  A Pew Internet report in June indicated that eReader ownership doubled between November 2010 and May 2011, rising from 6% to 12% of U.S. adults.  A Harris Interactive report released last month suggests ownership and usage will double again in the next six months - and that was prior to Amazon's new models and price points.  With the new prices and models, I anticipate greater adoption and use among younger readers and media consumers, further expanding the eBook market.  If Amazon can also follow up with a subscription model for eBooks, this could well push up interest in reading and demand for books.
In a similar vein, the Fire price point is also likely to significantly expand adoption of tablets and their use for media content.  This should further hasten the shift to digital media and on-demand usage.  It will be interesting to watch the coming transformation.

Sources - Amazon Kindle Fire No True iPad Rival: Munster, eWeek.com
E-reader ownership doubles in six months, Pew Internet (full report available at this site)
One in Ten Americans Use an eReader; One in Ten Likely to Get One in Next Six Months, Harris Interactive

Wednesday, September 28, 2011

Where's the Scarcity in Wireless Spectrum?

edited to add title

Various groups, including the FCC, have been arguing that there is a serious shortage of viable spectrum available for 4G and mobile broadband services in the U.S.  The arguments have been used to support two otherwise questionable spectrum transfer proposals (see posts on TV spectrum auction and LightSquared for more info).
An analysis of current spectrum allocations and use by Citigroup Global Markets suggests that while wireless providers face some challenges, lack of spectrum is not one of them.  Rather, they suggest that too much spectrum that's been allocated remains largely unused, held by firms who are currently not able or willing to make use of their allocations.  Here's how they summarized their results -
* Mixed Signals — Robust Smartphone and tablet sales suggest wireless demand is growing rapidly. And, the FCC suggests the US faces a long-term spectrum shortage. However, several new wireless carriers - Clearwire, LightSquared and Dish - have been slow to light-up their spectrum suggesting excess supply. So, who's right? Is there a spectrum shortage or not?
* Spectrum Availability High, Use Low — Today, US carriers have 538MHz of spectrum. And, an additional 300MHz of spectrum is waiting in the wings. But, only 192MHz is in use today. And 90% of this in-use spectrum is allocated to 2G, 3G and 3.5G services. As such, we estimate carriers can only offer average wireless speeds of 0.5-1 mbps during the peak busy hour.
* But, Spectrum Is in the Wrong Hands — Too much spectrum is controlled by companies that are not planning on rolling out services or face business and financial challenges. And, larger carriers cannot readily convert a substantial portion of their spectrum to 4G services, because most existing spectrum provides 2G-3.5G services to current users.
* Full 4G Can Deliver 5Mbps — 100% conversion of 538MHz allows carriers to offer 5Mbps with 10% simultaneous usage during peak busy-hour. This speed allows for very robust mobile use and limited home use. However, it is not sufficient for in-home replacement capable of large screen HD video streaming.
* Bottom Lines — We do not believe the US faces a spectrum shortage. However, unless incumbent carriers accelerate their 4G migration plans, or acquire more underutilized spectrum, upstart networks – like Clearwire, LightSquared and Dish – could have a material speed advantage over incumbent carriers provided that they can clear meaningful hurdles for funding and distribution.
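A quick back-of-envelope calculation shows how Citigroup's two speed estimates hang together. Note that the assumption that 4G delivers roughly 3x the spectral efficiency of the legacy 2G-3.5G mix is mine, used purely for illustration - it is not a figure stated in the report:

```python
# Sanity check: does Citigroup's 0.5-1 Mbps estimate for today follow from
# their 5 Mbps full-4G figure, given how little spectrum is actually in use?

FULL_4G_SPECTRUM_MHZ = 538       # all spectrum held by US carriers
IN_USE_SPECTRUM_MHZ = 192        # spectrum actually lit up today
FULL_4G_SPEED_MBPS = 5.0         # Citigroup's peak-hour speed at 100% 4G conversion
LEGACY_EFFICIENCY_RATIO = 1 / 3  # assumed 2G-3.5G efficiency relative to 4G (my assumption)

# If peak-hour speed scales with (spectrum in use) x (spectral efficiency),
# today's implied speed follows from the full-4G endpoint:
implied_today_mbps = (FULL_4G_SPEED_MBPS
                      * (IN_USE_SPECTRUM_MHZ / FULL_4G_SPECTRUM_MHZ)
                      * LEGACY_EFFICIENCY_RATIO)

print(f"Implied current peak-hour speed: {implied_today_mbps:.2f} Mbps")
```

Under that assumed efficiency ratio, the result lands at roughly 0.6 Mbps - inside the 0.5-1 Mbps range Citigroup estimates for current networks, which suggests the report's numbers are internally consistent.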
At a minimum, the report would challenge the argument that the need is so dire that proposed allocation shifts need to be made without analysis of the consequences.

Source - Spectrum Shortage? What Spectrum Shortage?, RBR.com

Lots of new content deals for video streaming

Lots of action from various corners, as competition among online video streaming services heats up.

Sources -
Rumor: Facebook Launching Streaming Media Service
Netflix Inks Streaming Agreement with Discovery
Blockbuster Poised to Launch Streaming Service
Amazon Signs Streaming Agreement with Fox
Disney Cuts Streaming and DVD Deal with Brazil's NetMovies
edited to add:
Netflix Secures DreamWorks Deal

Hulu bidding stalemate

The first round of bidding for Hulu and Hulu+ is over, with no apparent winner.  Google is said to have bid $4 billion, but with a number of conditions, mostly having to do with getting long-term commitments from Hulu partners to continue providing programming.  The highest unconditional offer was reported to be $1.9 billion, from Dish Network, below Hulu's expected minimum of $2 billion.  It's unclear at this point whether Hulu will accept any of the bids, seek another round (with or without additional programming commitments), or pull itself back off the market.

Source - Hulu Has High Bidder,  Around the Net in Online Media

TV Shifts While Retaining Dominance

People are watching more video programming, but over an increasingly diverse set of devices and technologies.  Television has long dominated entertainment media use, but technological change has shifted the locus of control across what seems an ever-expanding variety of media forms and viewing devices.  Traditional TV first dominated through local broadcasters and national networks, then through cable and satellite (with the expansion of networks on offer).  VCRs, and then DVDs, provided an additional source of content for viewing, and video recorders gave viewers greater control over when to watch programs.  The expansion of video-on-demand and streaming services has continued that shift in control to users, as well as expanding their viewing options.  IDC research manager Greg Ireland noted that "now you are seeing remarkable levels of online consumption through so many Internet-connected devices in the living room... Media and tech industries are clearly looking forward."
And even more competitors are jumping into the fray, with major moves by Netflix, Apple, Google, and Amazon to expand the range of content available on demand or through streaming.  Movie studios and other content producers are also exploring ways to make direct sales of digital video content.  On the device side, Amazon's latest contribution is the Kindle Fire, a $199 tablet powered by Android and using a high-resolution color touch screen.  The Fire is positioned as more than a snazzy reader - it's also optimized for audio and video playback and lets users access all of their purchased or rented media content (books, music, videos) from Amazon's Cloud service.  (There are other new Kindle entries across a range of price points, including a base model at $79.)
The appetite for content and new media devices continues to look strong, even in tough economic times.  A new consumer electronics report suggests that the typical home of 2.6 people has an average of 24 media gadgets, including at least one smartphone.  And the decline in household consumer electronics spending, from an average of $1500 in 2005 to $1179 in 2011, seems to be more a reflection of declining costs than of reduced demand.
So while there's more viewing, there's also been strong shifts in how, when and where viewing occurs.  Genevieve Bell, an anthropologist working for Intel and researching how people use and relate to technology, described what's happening -
“Television is now viewed over more devices than ever, and viewers are in control over what they watch more than ever. ... What’s been remarkable is the extraordinary power of television, because at the heart of this is that people just want a good story.”

Source - As more tech-titans provide TV, at-home gadgets multiply, Washington Post 

Youth Drives Radio Audience Growth

The latest Arbitron RADAR National Radio Listening report indicates that 93% of people aged 12 or over listen to radio at least once a week.  The overall growth was driven by increases in the younger demographics (12-17 and 18-34) while listening fell among older adults.  The report also suggested that ethnic audiences were increasing, with a slight rise in Black audience numbers and radio use widest among Hispanic listeners (weekly reach for Hispanics topped 95%).
The report also suggested that commercial network radio reaches affluent, educated audiences.  Arbitron reported weekly reach of 95% for households with incomes greater than $75,000 a year, and 96% for adults 18-49 with college degrees and incomes above $75K.
Radio's continued ability to attract listeners in the growing audience segments is a good sign of the value of radio for audiences, and thus for the continued viability of the industry.

Source:  Arbitron: Youth Drive Up Network Radio Listening,  Radio World

Tuesday, September 27, 2011

Fast Fade for DVDs?

"Across Hollywood, a quiet revolution is brewing that's about to transform living rooms around the world."

  So begins an LA Times story on how movie studios are re-envisioning their business models through an embrace of Internet delivery as a way to boost revenues.  The movie industry learned the lesson of adapting to new technology and the disruption of traditional market structures earlier and more quickly than other media.  Faced first with television, and then cable, videotapes, DVD, Blu-ray, online streaming, etc., the industry initially fought innovation; but as it discovered the revenue potential of the new markets created by these innovations, it turned to embrace new media and new distribution systems for its content.  In many cases, the new markets not only increased demand for its products, but the revenue potential at times eclipsed that of theater ticket sales - particularly if you consider the costs of film distribution, and the layers of distributors and theaters taking significant chunks of the revenues generated by ticket sales.
  In fact, revenues from the home entertainment market now generate more than half of all movie studio revenues.  In the last few years, however, home entertainment revenues have dropped by as much as 40%, spurring a search for new revenue streams.  Several factors arguably contributed to the decline in DVD sales.  First is the development and adoption of improved formats - visible in the chart in the transition from VHS to DVD to Blu-ray.  Historically, with the introduction of improved storage media, the improved format fails to capture the full value of the previous format.  In the case of DVD vs. Blu-ray, the apparent quality difference between up-converted DVD and Blu-ray is not significant enough for many users to justify the cost of replacing older DVDs; thus Blu-ray sales come primarily with newer releases.  A related factor is that many people did major purchasing of back catalogs (i.e., favorite earlier movies) during the DVD period, inflating DVD sales numbers in the early to middle stage of the technology.
  The rise of digital distribution, though, is arguably a stronger factor in the decline, for three reasons.  First, as the industry will repeatedly argue, digital media and networks facilitate piracy, which does impact sales.  Estimates of the size of that impact vary widely, but the fact of piracy, and that it cuts into home sales, is fairly clear.  Second, the development and diffusion of broadband digital networks has made it possible to market DVD and Blu-ray content as digital files rather than as physical copies.  As online storage costs fall, and users seek to view movie content on multiple playback devices, digital sales are expected to replace some portion of physical disc sales.  The slow growth of that component can be seen in the chart.  The third factor, and perhaps the one with the greatest potential for long-term impact, is the emergence of real-time digital streaming services.  These offer an alternative to the movie fan: instead of purchasing physical discs or digital files for each desired movie, the user can access a video-on-demand rental (fee per movie) or purchase a subscription giving access to a large catalog of movie or program offerings, again to view on demand.  In those cases, you're tapping a different set of behaviors and related values, and shifting perceptions of how people prefer to access movies for private viewing.  And subscription revenues and rights payments will not show up in traditional models of media sales.  While the rights fees that subscription streaming services pay for content may offset some of the revenues lost in sales, it is not clear that they come close to replacing them.  (The good news is that this shift reduces demand for pirated copies.)

  If total revenues for subscription and VOD service rights don't replace the revenues from physical copy sales, then movie studio revenues, and arguably profits, will suffer.
  There are several ways movie studios can react to slumping revenues.  First, they can seek new revenue streams, and some movie companies are realizing good revenues from cross-licensing with videogaming.  Second, they can look for places to reduce costs.  Traditional distribution chains for movies are quite expensive, and the significant cost differential between film and digital distribution is already driving the conversion of theaters to high-quality digital projection systems, just as it is driving increasing proportions of movie productions to digital.  The savings from conversion to digital can offset revenue losses in other areas, at least for a time.
  The other major distribution costs are in the home entertainment portion of the business - not so much in physical reproduction (costs for either DVD or Blu-ray reproduction are not that high, although pure digital distribution is cheaper) - but in the fact that movie studios have to push their product through a chain of distributors and retailers, each of whom extracts a significant chunk of retail value to cover their costs.  And while that chain is fairly necessary for distribution of physical copies of movies, it is less essential for digital distribution.  Already some large retailers and subscription services are dealing directly with studios, rather than acquiring content through distributors.  If movie studios are willing to take on the costs of digital distribution and transaction processing, and the risk of uncertainty in value, they could capture more of the retail value for themselves.
  And the studios are starting to experiment along those lines. Paramount's Rob Moore commented that "what you have now is a lot of people pursuing a lot of different paths to figure out how to reverse the (revenue) trends we've been seeing."  Sony's David Bishop concurred: "It's now critical that we experiment as much as possible and determine how to build a vibrant market for collecting digital movies."
  Four studios are experimenting with what they're calling premium video-on-demand, where consumers pay in the neighborhood of $30 for a high-quality video-on-demand rental of top movies, playable on their home theaters before the title is released for retail distribution.  Others are experimenting with offering movie rentals directly through Facebook, YouTube, or other non-retail-focused digital outlets.  Another initiative is the UltraViolet program, which will combine purchase of physical discs with access to online digital versions that permit viewing on multiple devices.  To encourage this concept, studios are even considering offering access to digital copies of previously purchased DVDs (for a nominal fee).  Warner Bros. is experimenting with developing apps as a way of exploiting additional content and product tie-ins for blockbuster movies.
  Another advantage of digital distribution is the ability to create and exploit a variety of versions, based on factors like recency, quality, accessibility, inclusion of added content/features, etc.  This can allow movie studios to continue to exploit different markets, as is currently done with distribution windows - starting with the high-demand/high-value market and working down.  Universal's Craig Kornblau put it this way -
"I see movies going down a path over time from premium sell-through all the way to the lowest-price rentals, ... If we get digital right, consumers are going to get what they're willing to pay for."
And the more they can do it themselves, the more of what consumers are willing to pay they can keep.

Sources:
Hollywood downloads a post-DVD future, L. A. Times

Britain's Labour Party proposal to license journalists

At this week's Labour Party convention in Britain, party leaders are putting forth a proposal for licensing journalists through a professional body.  Shadow Culture Minister Ivan Lewis will propose a "system of independent regulation including proper like-for-like redress which means mistakes and falsehoods on the front page receive apologies and retraction on the front page".
Taking the decidedly illiberal approach even further, the proposal is said to call for the licensing body to have the authority to prohibit unlicensed journalists, and "licensed" journalists found in breach of an unspecified code of conduct, from practicing "journalism." Cory Doctorow's story on the proposal gives some examples of "journalism" being discussed -
Given that "journalism" presently encompasses "publishing accounts of things you've seen using the Internet" and "taking pictures of stuff and tweeting them" and "blogging" and "commenting on news stories," this proposal is even more insane than the traditional "journalist licenses" practiced in totalitarian nations.
The proposal from what used to be the more liberal of Britain's major parties is framed as a "message for Mr Murdoch"; it specifically mentions the phone hacking scandal at Murdoch's News of the World (the British equivalent of the US's National Enquirer) and decries alleged attempts to exert political influence.

Regrettably, that seems to be much of what passes for proposals for speech regulation these days - finding ways to silence the speech rights of critics.  (The gall of people - wanting to contribute to and possibly influence political discourse).  Defenders of free speech always hope that such proposals generate laughter and derision rather than cheers of support.  But it seems that the more we open channels for discussion, the more those in power regret having to listen.

Sources:  UK Labour Party wants journalism licenses, will prohibit "journalism" by people who are "struck off" the register of licensed journalists, boingboing
Phone hacking fallout: Labour plans tighter media regulation,  Guardian.co.uk

Monday, September 26, 2011

Pop Ups Just Became Useful

Post submitted by Molly McCurdy - posted with minor additions -


AOL recently announced that after a year in development, its Project Devil ad service is complete. The new service allows users to engage in and complete e-commerce transactions from within the original pop-up or display ad.  The advertisement pulls an HTML overlay directly from the company's e-commerce website, allowing users to expand the advertisement to full screen and purchase an item without having to leave the screen hosting the pop-up.

What does this mean for the average consumer? While online ads can be annoying, they can also make online consumers aware of products and specials they might be interested in.  However, to find out more, or to make a purchase, you had to click through the pop-up or display ad, which took your browser to a different location. Concern over leaving the primary site can limit users' willingness to pursue online ad offerings.  Project Devil's use of the new HTML overlay feature can reduce that concern and, AOL hopes, boost the use and effectiveness of pop-up advertising.  The system has yet to be used in current online advertising campaigns, but AOL hopes it will be widely adopted during the coming holiday season, especially among clothing retailers.

Source -  Online Shopping: Now, Inside Display Ads!  AdWeek

Amazon works on adding eBook subscription to Prime

According to stories on SlashGear and the Wall Street Journal, Amazon is negotiating with major publishers for the right to offer a subscription-based eBook service, possibly as part of its current Prime membership benefits.  Similar to the free streaming access to a subset of its digital movie and TV show store that Amazon extended to Prime members, the planned service would give members the right to download and read (or perhaps access and read through the Kindle Cloud app) selections from a selected library of content.  The move can be seen both as a way of adding value to the Kindle brand and as competition for emerging "loaner" programs being launched by public libraries and Barnes & Noble (whose plan allows full streaming access to its digital offerings, but only over its stores' WiFi feeds).

Reactions from publishers have reportedly been mixed.  Publishers fear that blanket access to titles might limit the value of books.  The huge boom in eReader and eBook sales this year already seems to have the publishing industry in panic mode, and now concerns about libraries loaning newly published eBooks and the potential addition of an Amazon "loaning" pool are fanning the flames.  On the other hand, unlike the library loan programs, Amazon seems willing to pay what are quoted as "substantial" fees for its planned service, and that's something publishers should consider - along with the promotional potential, as readers may develop interests in authors or series that could create added demand for related titles.

The stories also suggest that Amazon hopes to have some kind of Prime access plan in place for the coming launch of a Kindle tablet.  The new tablet is expected to run the Android OS, optimized to allow users to benefit from the added value of Amazon's digital content offerings (video, music, text), all obtained through Amazon and stored on the Amazon Cloud service for access anywhere, anytime.

Sources - Amazon Prime ebook subscription in talks for Kindle tablet, SlashGear
Amazon in Talks to Launch Digital-Book Library, Wall Street Journal - Technology
See the ebooks tag for other posts in this area

New Sony eReader touts Library links

Post suggested by and largely excerpted from submission by Betsy Poore -

Sony was one of the early innovators in, and suppliers of, eReaders, but sales of early models were handicapped by Sony's reliance on a proprietary eBook standard, the need to link to a PC to load books, and relatively high prices compared to Amazon's Kindle.  Sony seems to have learned from its plethora of DRM and proprietary-standard problems, and is now focused on enhancing usability rather than controlling content use.

Sony's new eReader is promoted as the lightest eReader yet; it includes WiFi and is equipped to link to an emerging public library eBook system.  Member libraries offer not only WiFi connectivity to patrons, but also allow those with valid library cards to "borrow" eBooks from their growing collections.  The idea of library eBook access is fairly well tested, with several smaller national libraries in other countries permitting their citizens access to digital copies of books and materials in the public domain, or where licensing permits.  The partnership with US and Canadian public libraries should prove a boon to both libraries and readers.
The new eReader also takes advantage of Sony's scale, coming with a coupon to visit the new Pottermore website and download the first Harry Potter eBook.  Sony is a sponsor of the Pottermore site, and the exclusive retailer of the Harry Potter eBooks.
This new trend in eReaders - collaboration with authors, libraries, and newspapers - will impact the way people read.  The interactive aspect of eBooks - the ability to highlight, bookmark, and write notes in the margins - has already started to catch on and is only going to become more popular.

Pottermore website: http://www.pottermore.com/

 

Design online

Post submitted by Britney Dougherty -

We are all very familiar with e-commerce, and almost every website we visit contains at least some advertising. One of the challenges media companies and advertisers currently face is how to make the transition from the web on a computer to the web on a tablet. There are questions like: how much advertising should there be, where will it fit, and how do we make it jump out at users? Stefanie Botelho of FOLIO magazine spoke with Alex Schmelkin, president of Alexander Interactive, a company that provides e-commerce solutions. He has plans for the new problems that come with the move to the tablet.

“It’s important to minimize the navigation, and the clutter…with media sites like NYT or TIME magazine, there could be literally hundreds of links above the fold. This doesn’t work when it comes to the tablet. We need to prioritize to keep the truly most important info on top.”

He notes that there is no easy way to excite consumers with ads on tablets, since one of the most commonly used techniques is now ineffective.

“Over the years, web designers like us have overused the concept of when you hover over a certain part of the page, something big happens,” Schmelkin says.

During the next few years, we will surely see new ideas and methods of e-commerce on tablets, or t-commerce.

Source: “E-Commerce Solutions for The Tablet Transition,” FOLIOmag.com

edited to add headline, address some formatting issues.

Saturday, September 24, 2011

White House Spectrum Plans Would Silence 210 TV Stations, LPTV and translator services

Buried in the President's American Jobs Act is language that would force the FCC to auction off to other services some 40% of the current TV broadcasting spectrum.  To free up that much of the current spectrum assignment, the NAB estimates that at least 672 currently operating full-power broadcast stations would have to move to other frequencies, at least 210 full-power stations would be forced off the air, and thousands of low-power and translator stations would also be forced off the air.  Many of the low power stations and translators target and serve ethnic and underserved audiences, services that would largely disappear under the proposed plan.
The White House listed $28 billion in expected "revenue enhancement" to come from auctioning off much of the recovered spectrum.  Another portion of the spectrum would be used to create a broadband wireless "public safety" network favored by several Senators.
The CBO analysis cast some doubts as to the viability of the plan and the revenue estimates. The CBO analysis came up with significantly lower numbers, and said that it would be “difficult to predict how much spectrum would be auctioned by 2021 because of the time and cost involved in moving existing users." 

“For example, the amounts auctioned as a result of incentive auctions would depend on the willingness of two satellite licensees and dozens of television broadcasters to sell their existing spectrum rights at a price that is below the market value of their licenses... Past experience suggests that relocating federal and commercial users can be very costly and take many years to complete.” 
In addition, the FCC has yet to release its analysis of how TV broadcast service and reception by viewers would be affected by the mandated changes.  As a point of comparison, during the 2009 digital transition, only 174 stations moved to new channels, and those moves were voluntary.

While the FCC has always planned on recovering some of the TV bands with the digital transition, it looks like this is too much, too fast.  The FCC, Congress, and the White House need to consider the potential impacts of such a massive reconfiguration of existing allocations, and not see it as little more than a potential source of revenues.

Scripps combines digital efforts for TV stations, newspapers

Scripps has reorganized all of its digital and mobile operations into a single unit.  Previously, the 13 Scripps daily newspapers and ten television stations were primarily responsible for their online, digital efforts.  The digital operations were often considered secondary to the primary outlet, and with 23 individual efforts, it was often difficult to maintain a consistently high quality service to online users and advertisers.  Scripps CEO Rich Boehne stated that:
This new structure will result in better products, faster development, more efficiencies and improved financial performance while staying true to the Scripps mission of building value through enterprise journalism and public service.
Boehne added that integrating digital and mobile operations would allow Scripps to "continually improve our current news sites and apps while also pursuing a few initiatives that aren't tied to our core businesses." 

Source - Scripps Consolidates Digital Operations,  Broadcasting & Cable

CBS anticipates big "reverse compensation" payday

Until fairly recently, one stable revenue stream for local TV broadcasters was the compensation that networks paid them for carrying network programming.  In the recent competitive environment, the networks began reducing compensation levels, or dropped compensation fees totally, as network-affiliate contracts came up for renegotiation.  Networks have talked about the next step - requiring local stations to pay networks for the rights to carry programming, just as cable networks get paid by cable and DBS operators.  As stations started to collect retransmission fees from cable MSOs and DBS operators, the networks started claiming that the value came from their programming, and thus should be passed through to networks.  Local stations, particularly in smaller markets, are in a bad spot - television programming is expensive and networks supply a lot of what stations air.  They are likely to acquiesce to network demands, as long as the networks are smart and keep initial fees on the low side.  The stage is set for the next round of affiliation contracts to have carriage fees flowing from stations to networks.
While most of the big CBS affiliate group contracts don't come up for renewal for a couple of years, the network's approach will be to ask for "reverse compensation."  CBS has reached an agreement with a group of small market stations that reportedly includes stations paying for network programming, and is said to be near a deal with a larger station group that will also include "reverse compensation."  CBS CEO Les Moonves has indicated that station fees for carrying network programming could generate as much as $450 million in added revenues a year, as it gets phased in.

Source - CBS's Moonves Sees Gold In Reverse Comp,  TV Newscheck

'Reality' and Sports Top Viewing

Nielsen recently released some summary numbers from last year's TV prime-time viewing.
'Reality' shows accounted for more than half of all viewing, gathering 56.4%, up from 49.4% the previous year (but down from a high of 77% for the 2007-2008 season).  Scripted dramas accounted for 23%.  Sports programming accounted for the next highest proportion, at 20% of prime-time viewing.  Nielsen reported that total primetime audience numbers remained high, just short of 200 million.

Source - Reality TV Grabs Most Viewers, Sports Nabs 20%,  MediaDailyNews

CBS adds paywall to "48 Hours Mystery" app

CBS seems to be testing the idea of charging for programming by shifting its "48 Hours Mystery" app to a subscription basis.  The app will be restricted, allowing all users access to a promo clip for the next episode; to access the full program and other materials, users will have to pay for an annual subscription ($4.99 a year, for now).  The "60 Minutes" app charges $4.99 to download the app, which also gives access to its archive of old programs and clips, but the rest of CBS's program-focused apps do not charge for access.
Steve Smith observed that decades of online experience have shown consumer willingness to pay for "their passions."  Dedicated fans of programs have long shown an interest in consuming more information and content related to the series.
"Crafting reasonably priced deep dives into these niches is one way that TV programmers can deal with the platform fragmentation currently challenging the old models. Apps may be one way of experimenting with a model that leverages fragmentation. It zeroes in on the niches that declare themselves when watching certain programs and then gives the network the opportunity to super-serve them in ways the niche gladly pays to access."
Seems to be an idea worth pursuing.

Source - CBS Goes To The Paywall On '48 Hours Mystery' iPad App,  Mobile Insider

The Great Unbundling - Video Edition

The nature of information and media goods has always been a combination of content and channel, or distribution form. Traditionally, the distribution system has been tied to and identified with the content, but one of the consequences of the digital revolution was the emergence of alternative distribution channels for all kinds of content.  Welcome to the Great Unbundling.
The unbundling of print was easiest, but the growth of broadband has now enabled the growth of digital media distribution alternatives.  Historically, and still, video programming is primarily delivered through local broadcast stations, cable, and satellite services. On the other hand, the last few years have seen an explosion of Internet-based IPTV services, the emergence of IP-based cable alternatives like AT&T's U-verse, and many high-end cable systems converting from a video-IP mix to a fully IP-video system.
In a post on The Future of Media blog, Jim Louderback offers some predictions:

  • Within five years, most of the videos watched will be delivered over an open-IP network (i.e. the Internet)
  • Five super-premium video-channel bundles will emerge, largely from the current major players (the big networks and someone else).  The networks will develop direct-payment relationships with viewers to supplement advertising.
  • A few premium independent channels will offer "super-high quality programming" on demand, likely led by Netflix
  • A few demographically-targeted bundles of channels will emerge
  • Branding and prominence will be critical to success as channels and sources for video content continue to expand.
Louderback concludes:
In the end it's all about shelf space.  All of us are racing to build a session-shifting experience that lives as an icon across everything from the smallest smart-phone to the biggest smart TV.  Because in the next five years, if it's a glowing rectangle, then it is a video consumption device - or what we used to call a TV.
Source -  Get Ready for the Great Video Unbundling,  The Future of Media blog


edited - had to fix a major formatting problem. (twice)



Google TV expands offerings

The GoogleTV service has announced the addition of new channels from CNN, TBS, and TNT.  Content from those and a number of other channels is available only to users who can authenticate that they pay for the channels through other multichannel TV services.  The offerings from Turner (CNN, TBS, TNT) appear to be similar to what is made available through their websites and iPhone and iPad apps.

Source -  TV Everywhere is Coming to GoogleTV,  GigaOM

Hulu passes million subs

Added information from post submission by Chelsea Taylor -

Video streaming service Hulu recently announced that it has grown to over a million paid subscribers for its Hulu Plus service.  That still puts Hulu well behind Netflix's 20+ million paid subscribers, but it is positive news as bidding for the service continues.
Hulu CEO Jason Kilar had earlier predicted this number would be reached by the end of 2011. Hulu continues to provide free video content, but keeps more content behind a pay wall. With public opinion of Netflix in decline due to its recent price increases and service split, more people are flocking to the Hulu Plus subscription service (at $7.99/month).
Hulu’s owners have recently been working on selling the company with potential prospects from companies like Amazon and Google. Hulu recently launched its first international site in Japan.
Sources -  Hulu has more than one million paying subscribers: CEO,  Reuters
Hulu Plus Hits One Million Subscriber Mark Well Ahead of Schedule, DailyTech

FCC Publishes new Network Neutrality rules

After its first set of Net Neutrality rules was voided by Federal Courts, the FCC announced its intent to try again.  Its effort to revive the rules continued even after majorities in both the U.S. House and Senate sent a letter advising the FCC not to reinstate the Network Neutrality rules without addressing the concerns that led to the court's previous finding that the FCC had no legal basis to regulate key areas addressed in the earlier rules.  The FCC has finally published the new rules in the Federal Register, and the rules, for now, are set to take effect after November 20, 2011.

The new rules do not appear to substantially differ from those originally posted last December.  As such, they incorporate an assertion of authority that Federal Courts have consistently ruled (since 2005) that the FCC does not have.  Several telecommunication firms have already challenged the new proposed rules, but were told that they would have to wait for the new rules to be formally published before the Courts would consider challenges.  Expect challenges to be quickly filed, and enforcement of the new rules to be stayed.  And I wouldn't be surprised if these new rules are eventually also voided by the Courts.

Sources - Net Neutrality Rules to Take Effect Nov. 20,  Broadcasting & Cable

Friday, September 23, 2011

Lightsquared - The Technologies, Issues, and Problems

You may have heard of the growing political scandal linked to the promotion of LightSquared, a start-up hoping to use a combination of terrestrial towers interconnected by satellites to create a new high-speed (4G) wireless network.
While the system may be conceptually interesting and viable, the history of LightSquared, the specifics of the proposed terrestrial-satellite system, and the process by which it gained initial approval and license are more problematic.  Most recently, there have been claims that the White House attempted to interfere with, and influence, the testimony of government and military figures before Congress on the results of early testing of the LightSquared system, which showed a strong potential to interfere with military, government, and private GPS systems.
LightSquared, or at least its precursor, was one of three companies given initial approval to test the viability of delivering broadband internet connectivity through a satellite-based system.  Apparently, it soon found that a satellite-only system cost too much to generate the level of demand it felt it needed to be successful.  It asked the FCC to expand the satellite frequency grant to cover terrestrial uses as well.  Apparently, instead of going through the normal petition process, it approached the office of the FCC Chairman for expedited approval.  The office opened a shortened petition process on the evening before Thanksgiving, and closed it before there was any formally published announcement of the petition (and allegedly, without even informing other FCC Commissioners of the action).  The petition was granted to LightSquared (again, apparently without the knowledge or approval of other FCC Commissioners or the Office responsible for that aspect of spectrum usage).  Further, it's claimed that when the other two satellite service firms petitioned for similar treatment, they were told no, that they had missed the petition window.  The extension of the bandwidth grant to terrestrial use boosted the asset value of LightSquared's satellite bandwidth by about $10 billion, which sent the value of its stock rising.  It's reported that a number of friends and political supporters of the FCC Chairman, including President Obama, bought stock in LightSquared around that time.  The President has since sold his holdings, but a number of friends and supporters remain stockholders and investors, and LightSquared ownership and management continue to give heavily to various Democratic and Presidential campaigns.
While this may be somewhat sleazy, the real problem is what LightSquared proposed to do with the terrestrial-use extension.  The bandwidth that LightSquared and the other firms were assigned is near the frequencies used for GPS (global positioning systems).  For a satellite system, because the satellites would be in different positions and the signals of relatively low power, there was little potential for interference between GPS and the proposed satellite-based broadband services.  What LightSquared now planned to do with its permission to use its frequencies for terrestrial services, however, was to build a nationwide system of 40,000 towers to serve as the primary wireless access points, and use the satellite system to connect the towers with Internet backbone access points.  The terrestrial system would need to operate at a much higher power level than the ground-level power of satellite-based systems (including GPS).  Between the higher power and the closer proximity to ground or even air-based GPS receivers, there was, with the proposed terrestrial tower system, a significant potential for interference - that the LightSquared signals would drown out GPS signals.  Results from early tests of prototype systems showed that LightSquared signals did create interference with GPS signals, leaving "dead spots" miles in diameter near their terrestrial towers.
That's where politics arguably intrudes again.  When Congress asked the military and civilian officials charged with oversight of the GPS system to testify as to the impact of the proposed LightSquared system, the officials were apparently approached by people in the White House and asked to change their testimony, minimizing any concern about interference, and to specifically include a paragraph indicating that they felt any potential problem would likely be resolved within 90 days.  It's been charged that this was to expedite FCC action on final approval for the LightSquared system.  However, both officials not only declined to change their testimony and include the paragraph, but testified about the attempt to influence them, and that the concern about potential interference was real and significant, and could significantly disrupt both civilian and military GPS use.  They also characterized the idea that problems could be resolved so quickly, or without significant cost to GPS users, as, frankly, ridiculous.  Congress has now advised the FCC Chairman not to proceed with any further action on LightSquared until a number of issues (including potential securities fraud and influence-peddling charges) have been resolved.
LightSquared has responded with a claim that it has a "simple filter" that will limit interference on "99.5%" of GPS receivers.  Going on the offensive, it argued that it "can't believe that such a terrific national benefit as a new national wireless network" could be held up by a problem with a few hundred thousand unfixable GPS devices.  Its announcement also blames the GPS industry for using the frequencies they were assigned by the FCC, claiming the industry failed to install filters to correct for problems it didn't know about (and for which filters weren't available), and suggesting that the commercial GPS industry unfairly benefits from the government's $18 billion GPS network.  Apparently, it's ok for LightSquared to benefit from a $10 billion subsidy it got from the FCC's extension of terrestrial rights (which kept it from having to bid for, and pay the government for, the terrestrial usage rights for the bandwidth it wants to use), but it's not reasonable that hundreds of millions of individuals can benefit from using a federally-provided service.  They also hope that no one correctly makes the connection that LightSquared's solution would require replacing that "99.5%" of GPS devices with ones using their filter, and that the replacement costs, or the costs of finding ways to add the filter externally, would be borne by GPS users.
But the response that's likely to be most problematic is from the FCC technical working groups that studied the proposed system and "fix" and found conclusive evidence that "LightSquared's proposed operations defy the laws of physics, and therefore simply will not work," that going ahead with the various shifting variations of LightSquared's plans "would cause such widespread harmful interference that it would severely cripple GPS," and that the only viable solution was to move the wireless broadband use "out of the MSS band altogether" (the MSS band includes both GPS and LightSquared's current frequency assignments).
Physics and reason should, at a minimum, put a hold on approval for the proposed system.  You have the technical experts in the FCC, the military, and in the executive branch office responsible for maintaining the GPS system all testifying under oath that there are problems with the proposed LightSquared system - problems that are likely to create significant service disruptions in a widely-used public system, and that the disruptions could have serious consequences that could threaten public safety and military uses.
On the other hand, you have a company claiming in a press release that it has a miraculous quick fix that has been rebuked by technical experts and would place the cost of the fix fully upon the users of GPS devices, and that anyway, it was all the commercial GPS industry's fault for operating in their assigned frequencies and not anticipating, years ago, the need to install "fixes" for a newly proposed system - one that would be solely responsible for creating the problem, and whose technical details have only recently been made public.
It's pretty clear that LightSquared thinks we're all rubes - at least if it thought any of their arguments were reasonable or rational.  I guess the real question is whether whoever in the White House and the FCC Chairman's office is pushing this epitome of crony capitalism agrees, and continues to support this farce.

Sources: GPS industry rages: LightSquared 4G network would "defy" laws of physics, ars technica
myriad other sites

Nielsen Discloses Major Ratings "Glitch"

Nielsen has admitted that much of the data it has been reporting for most of the last year has been incorrect.  Nielsen attributed the errors to a "glitch" that has been generating incorrect estimates for a range of measures, including GRPs (Gross Ratings Points), and the reach and frequency estimates used as a basis for most TV advertising plans.  Nielsen thinks the problem began when it implemented a new method for calculating average audience measures, resulting in incorrect data being sent out beginning on Feb. 1, 2011.  However, an investigation is still ongoing and no specific problem has been identified or fixed.
From the industry perspective, this is certainly embarrassing for Nielsen, particularly as they are trying to sell an already skeptical industry on a new Internet ratings system while touting their "high standards of accuracy and reliability" in TV ratings.  It is all the more embarrassing following last year's major glitch, when the ratings calculations failed to include a significant portion of viewing data because of a failure to link to the location where the data was being collected and stored.
But the potential impact is likely to go well beyond embarrassment.  The measures that Nielsen is acknowledging as probably incorrect are precisely those used to make buys and set prices in the national TV advertising marketplace.  Those measures are widely used to set advertising rates.  Further, with the historic volatility of TV viewing, ad buy deals are increasingly incorporating guarantees of audience reach.  As Nielsen is suggesting that the glitch "overestimated" audience, it means that advertisers paid too much for many buys, and that networks are likely to have to make good on many more audience guarantees.  Because the size of the error is unknown, it's not clear at this moment what the eventual cost of the glitch will be.  One thing that seems clear, though, is that neither the advertising industry nor the TV networks are likely to want to cover the cost of Nielsen's errors themselves.  It should make for some interesting and volatile transactions for a while.

Source -  Nielsen Discloses Major TV Ratings Glitch, Could Impact Millions in TV Ad Buys,  MediaDailyNews

Distrust of U.S. news media continues

The various news media in the U.S. have been taking a beating lately.  On the business side, revenues are dropping due to increased competition.  It also took many news organizations much too long to recognize and adapt to the emerging digital networked environment that captured entire segments of advertising, expanded competition, and bypassed traditional gate-keeping.  The push for speed in being the first to report has encouraged a decline in editing and fact-checking.  But perhaps the most significant problem, for the long-term viability of journalism, has been how publishers, editors, and journalists have responded to changing conditions and increased competition.

The stress for speed and the need to cut costs have contributed to a culture of "reporting" as rewriting press releases, with minimal effort given to investigation.  Large-scale investigative reporting has all but disappeared (if you look, most "investigative" pieces today result from reporters being given information by partisans, rather than from independent pursuit of stories). A bigger consequence is that there is often not even minimal investigation of details, prior coverage, or historical context.  Between the lack of historical perspective, sloppiness in verifying information, and the presence of an Internet that not only allows outsiders to engage in the kinds of verification abandoned by journalism, but also provides a platform for bringing errors to the attention of editors and the wider public, journalists, editors, publishers, and news organizations are increasingly embarrassed by errors in the news content they produce.

Another potential long-term problem lies in how many news outlets have responded to increased competition: by trying to cater to perceptions of public tastes.  Placing more emphasis on what the public wants, rather than the traditional creed of journalism to provide the information the public needs, need not be problematic in itself.  The problem emerges when there is a growing disconnect between journalists and the public.  When publishers and editors react to competition by emphasizing what they think drives media consumption (conflict, gossip, celebrity, scandal, etc.), they start to move from informing to entertaining.  Modern political coverage, in its shift to "horse-race" coverage, epitomizes this - coverage of issues, and investigative coverage of claims and past performance, have given way to discussions of the latest poll results, speeches, press releases, or arguments and allegations handily provided by candidates' opposition research efforts.  The fact that merely discussing what other people are saying is not very expensive, and does not require much commitment of time, effort, or financial support, provides an added incentive to shift emphasis in that direction.

The problem is that the public increasingly recognizes that much of what is labeled news is not really news, but opinion and entertainment.  The public is increasingly exposed to the errors encouraged by over-hasty reporting, as well as the growing discrepancy in how reporters and news media treat information coming from "friendly" sources as opposed to the other side.  And all of this raises concerns about the "objectivity," "accuracy," and, most importantly, the credibility of news and news organizations.  The insistence of many news organizations that they are objective, and the arrogance of their reactions when "non-journalists" point out errors, when the evidence of their own content shows otherwise, only increase the disconnect and the distrust of journalists and news organizations.

Trust in American news media has been falling for decades.  The latest in a series of Gallup tracking polls found that 55% had little or no trust in news media, while 44% indicated having a great deal or a fair amount of trust.
The bigger problem is that, if you look at the full results, only 11% of the sample indicated that they had a great deal of trust that media report the news "fully, accurately, and fairly" (one third indicated having a fair amount of trust).  But the value of news, as news, derives from the credibility of its source - and most major US news organizations claim that all of their news is reported "fully, accurately, and fairly."  The vast majority of the news-consuming public disagrees.

The Gallup report is supported by a recent Pew Research Center for the People & the Press study tracking public perception of twelve core press performance measures.  The latest results showed that negative perception levels reached or tied all-time highs in nine of the twelve measures.  The Pew report tried for a bit of positive spin by noting that other groups are held in even less esteem.

Now, when you ask about specific media, rather than news media generally, trust and impressions improve.  Still, when asked about the news source they use the most, a full 30% responded that they felt that stories were often inaccurate - hardly a ringing endorsement.  A look at the recent responses on the core performance measures shows a press facing the potential loss of the public trust that is the main determinant of its perceived value.  They show a growing perception of the press as sloppy (inaccurate), dishonest (covering up its mistakes), unconcerned about the people it reports on, and immoral; as influenced by powerful people and organizations rather than by a concern for truth or the public; and as letting favoritism and politics shape coverage and reporting.  As for the press's historically claimed role as an independent fourth estate protecting democracy, as many people believe that current press behavior harms democracy as feel that it helps democracy.

The Gallup study also showed that most Americans see the press as showing a political bias, despite a slight gain indicating they felt news coverage was "just about right." Moreover, despite all of the claims from Democrats and mainstream news organizations about the emergence and dominance of "right-wing" and conservative media, the public perception is that if there is bias, it is overwhelmingly seen as a liberal bias.
One emerging criticism of the public is that bias and favoritism in news content are driven by people wanting, and shifting their news consumption to, news sources that reflect and support their view of the world.  However, while this is true to some degree, the poll showed an overwhelming preference (62%) for news presented with no particular point of view.  When looking at online sources (the focus of much of that criticism), the revealed preference for sources with no particular view was even stronger, at 74%.  (Another indication of the growing disconnect between journalists and the public.)

From the perspective (bias) of this blog, the problem revealed by these studies is not the type of bias, nor concern about the political or social impact of inaccurate, biased reporting, nor concern about outside influence.  From the business perspective, and for the future of journalism, the concern is, and needs to be, the growing public perception that journalists and news organizations are failing in their self-identified duties of objectivity and accuracy and, to borrow the NY Times' slogan, of delivering "all the news" necessary to fulfill their role as the Fourth Estate and provide the information a democracy needs. News media, and perhaps journalism, are seen as not delivering on their promises of value. As a result, the perceived value of their product as news is declining.
From a business perspective, organizations may be able to survive by replacing lost news value with value from serving entertainment or other information needs, but as the mix shifts further towards entertainment, organizations are likely to enter a death-spiral from a journalism perspective.  News, and journalism, is losing credibility.  And as it loses credibility, the value of news diminishes, and organizations become even more desperate to find replacements for that lost value.  Journalism needs to stop the death spiral - either by rededicating itself to the historic emphases it still largely claims, or by developing alternative arguments (if it can) for the value (to the news consumer, the public, or society generally) of the kind of news product it currently provides, or wants to provide.

Sources - Majority in U.S. Continues to Distrust the Media, Perceive Bias,  Gallup
Full Gallup report 
Press Widely Criticized, But Trusted More than Other Information Sources, Pew Research Report 
Complete Pew Research Report 

Edited - removed extraneous line feeds, cleaned up some typos and language.