Wednesday, August 29, 2012

Magazine Use of Mobile Codes and Tags Widespread

The use of QR and related codes in print magazine advertisements has more than doubled in the last year - to the point where 10% of all print ads carry at least one code, at least in the top consumer mags.
  The latest quarterly survey by digital and mobile services firm Nellymoser found more than 2,200 QR codes, digital watermarks, and/or Microsoft Tags in the advertisements in the 100 largest-circulation, generally available consumer magazines.  Breaking a milestone of sorts, Nellymoser researchers found at least one code or tag in each magazine examined, and 90% of the titles had more than 10 codes.  QR codes remained the most widely used technique, with embedded image recognition and augmented reality tags growing rapidly.  Around 40% of codes led to video experiences, while almost one-fifth linked to sweepstakes entries or opting in to content requiring subscription or registration.  Another 18% linked to social media, and 14% led to online purchasing opportunities.
  It wasn't all good news, though.  While mobile codes are intended to link the user to mobile experiences, the scourge of the mobile misfire remains with us.  According to Nellymoser's analysis of the post-click experiences, a quarter of the landings were not optimized for smartphones, and the problem of broken links and codes that simply do not scan properly persists.
Sources -  Magazine Mobile Codes Up 107%, Now On 10% Of Ads, MobileMarketing Daily
Nellymoser press release (with graphics)

Milestone: Flipboard passes 20 Million, Adds Video

Flipboard is a tablet must-have app - a combination content aggregator and specialized display system that sets the standard for online magazine/print content display on touch-screen tablets.  Flipboard lets you choose and integrate content from multiple content platforms and social networks to create an individualized online "social magazine."
Flipboard first launched on Apple's iPad in July 2010. Two years later, the app now boasts 20 million total users and 1.5 million daily users, adding a new user every second of the day. In addition, Flipboard averages 86 minutes per user per month, generating 3 billion page flips every 30 days.
Flipboard currently can access content from more than 2000 content partners, including a number of major news outlets (The New York Times, the Associated Press, ABC News, CBS News, and NPR).  It's also expanded to Android-based mobile platforms.
  And it's now adding video pages powered by YouTube channels, including channels focusing on music, fashion, cooking, science, news, and one featuring "Influencers."

If you don't have Flipboard on your mobile device, you're missing out on something special.

Source -  Flipboard unveils curated mobile video features, tops 20M users, Fierce Mobile Content

Licensing Photos for/from Social Media

  The copyright for a photograph belongs to the person who took it, and for most photo-sharing sites, you retain the copyright while licensing the site to post and share it.  However, without a definitive indication of downstream licensing, the consequent use of the photo may be problematic.  (And no, the fact that a photo is shared online does not mean it's licensed for use by everyone under all conditions.)
  If you're a professional, or want to be one, you'll need to explore the various options for licensing.  The American Society of Media Photographers has a good online Licensing Guide that outlines copyrights for photos and the various licensing options.
  On the other hand, if you just want to share your photos with others without having to develop specific licensing language, there's Creative Commons licensing.  Creative Commons (CC) is a nonprofit organization that has developed a set of standard licensing statements indicating what uses you want to allow others to make of your content, while confirming your ownership of the copyright.  They basically outline three aspects of downstream use to license - whether you want the work to be attributed to you; whether you want others to be able to manipulate (change) the work; and whether you want to allow your work to be used for commercial purposes.
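Those three choices combine into the six standard CC licenses, every one of which requires attribution.  A minimal Python sketch of how the combinations work - the helper function here is my own illustration, not an official Creative Commons tool:

```python
def cc_license(allow_derivatives: bool, allow_commercial: bool,
               share_alike: bool = False) -> str:
    """Map the three licensing choices to a standard CC license code."""
    parts = ["BY"]                  # attribution is part of all six licenses
    if not allow_commercial:
        parts.append("NC")          # NonCommercial
    if not allow_derivatives:
        parts.append("ND")          # NoDerivatives
    elif share_alike:
        parts.append("SA")          # ShareAlike (only meaningful if remixing is allowed)
    return "CC " + "-".join(parts)

print(cc_license(allow_derivatives=True, allow_commercial=True))    # CC BY
print(cc_license(allow_derivatives=False, allow_commercial=False))  # CC BY-NC-ND
```

Note that ShareAlike only comes into play when derivatives are allowed - a "no derivatives" choice makes the remix question moot.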
  The photo-sharing site Flickr allows you to set a basic (default) Creative Commons license for all your photos uploaded to Flickr.  It also allows you to change the license for specific photos through the "Owner Settings" option.  More than 200 million photos on Flickr bear Creative Commons licenses.
  There's also a way to easily attach Creative Commons licenses to photos uploaded to Instagram. I am CC is an add-on service that allows Instagram users to sign up under their Instagram account, select the CC license they prefer, and then every photo uploaded to Instagram for the next three months will carry that license.  In a nice move, I am CC asks you to renew the service every three months as a reminder of the license choice.

  If you're looking for photos to use, there are several sites that aggregate images by Creative Commons license type.  There's also a number of useful online guides for how to properly use and reference those images.
  Regardless of whether or not you're a professional, it's smart to get into the habit of thinking about whether you want others to be able to use your work, and under what conditions.  So why not start with the vacation photos you're sharing with friends and family?
  And if you see a great photo online you want to use, save yourself from the possibility of a copyright violation take-down notice - pay attention to an image's licensing terms and follow them.

Sources & Resources
How to License Your Instagram Photos on Creative Commons, Wired
I-Am-CC website
Creative Commons images and you: a quick guide for image users, Ars Technica
Using Creative Commons Images from Flickr, SquidooHQ
American Society of Media Photographers Licensing Guide
Creative Commons website

Tuesday, August 28, 2012

Faking Twitter Numbers - and Exposing Them

It's been a dirty little secret for a while - a lot of Twitter followers are "fake."  Offers to generate a thousand "followers" for as little as $5 have been floating around the Net, and a number of websites advertise generating Twitter followers for pay.  And some news reports have been quick to question large jumps in followers - often asserting (without evidence) that they must have been bought and paid for.
  I should note that jumps in the numbers of followers can be fairly common for politicians, celebrities, and others in the public eye.  A big story or new release can focus attention on public figures, bringing their names (and Twitter accounts) to the attention of thousands or millions of other Twitter users, sparking an interest in starting to "follow" them.  I'd be much more suspicious if the jump in followers occurred before a big event (especially if, during that event, the number of Twitter followers is touted) than if it follows something that triggers public attention.
  There are two basic methods for generating "followers" for the "target" Twitter account.  In the first, software looks for existing accounts of people with similar interests.  The software then uses the "target" account to "follow" the generated accounts, hoping that many will reciprocate.  The other approach is to use bots to generate new Twitter accounts just to "follow" sites that pay for them.
  Now, a UK firm, StatusPeople, has developed a means of identifying fake followers.  The tool tracks back the accounts of a site's followers and looks at several measures of account activity; a follower is considered fake if there is little activity beyond following.  As a company spokesman described it:
 "A fake account is set up to follow people or send out spam. They normally have no followers, but follow large numbers of people. An inactive account is one in which there has been no activity for a while. They could be real people, but we would describe them as consumers of information rather than sharers of information. A good account is everything that remains."
The company tested its process on a number of Twitter accounts for politicians and celebrities - among the highest reported levels of "fakes" were the Twitter accounts of soccer star Cristiano Ronaldo (79% of his 12.6 million followers were fakes), Lady Gaga (71% of her 29 million followers were fakes), President Obama (70% of his 18.8 million followers were "fake"), and major professional sports leagues (MLB had 75% fake followers, the NBA a 73% fake rate, and the NFL 68%).
  Unfortunately, the measure doesn't indicate what level of followers were bought and paid for, and by whom.  Also, let me note two methodological caveats - first, the metric differentiates between "fake," "inactive," and "good" accounts, but then combines "fake" and "inactive" when reporting the total "fake" level; second, for accounts with large numbers of followers, they don't actually test and validate each follower - they examine a sizable number of the most recent followers and extrapolate that to the whole population of that account's followers.
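The general shape of such a classifier is easy to sketch.  The Python below uses made-up thresholds (zero followers, 500+ follows, zero tweets) - StatusPeople hasn't published its actual rules - but it captures the fake/inactive/good split and the sample-and-extrapolate step described above:

```python
def classify(followers, following, tweets):
    """Label one follower account "fake", "inactive", or "good".
    Thresholds are illustrative assumptions, not StatusPeople's actual rules."""
    if followers == 0 and following > 500:
        return "fake"        # follows many, followed by none: classic spam profile
    if tweets == 0:
        return "inactive"    # possibly a real person, but a pure consumer of tweets
    return "good"

def reported_fake_share(sample):
    """The reported "fake" level lumps fake and inactive together,
    extrapolated from a sample of the most recent followers."""
    flagged = [a for a in sample if classify(*a) != "good"]
    return len(flagged) / len(sample)

# (followers, following, tweets) for a small sample of follower accounts
sample = [(0, 900, 0), (120, 300, 4500), (5, 40, 0), (0, 2000, 1)]
print(reported_fake_share(sample))  # 0.75 - three of the four are fake or inactive
```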

  You can check the "fake level" of Twitter accounts through Fake Follower Check.

Sources - How To Expose Fake Twitter Accounts Of Celebrities, International Business Times
Buying Their Way To Twitter Fame, New York Times
10 Sports Twitter Accounts With a Shocking Number of Fake Followers, Mashable Social Media


The Evolution of Online TV Viewing

Silicon Alley Insider uses the tenth anniversary of MLB.TV to illustrate the recent changes in how people watch online video.
The share of people who watched games only on their desktop fell by almost half year over year, while the share watching games across desktop, smartphone, and connected devices like Xboxes nearly doubled.
While the numbers using only wireless (9%) or only connected devices (4%) are still small, the chart also shows that 45% of subscribers watch at least some programming through connected devices (OTT boxes or gaming consoles that output to your TV set), and 60% watch at least some programming via wireless (mobile) devices.  The big jump is in mobile, up from 36% to 60% in one year.

Source  -  The Evolution of How We Watch Online Video, Silicon Alley Insider

Stat of the Week - 6300% More ...

A recent study by the Parents Television Council came up with this claim -

There was a 6300% increase in full-frontal nudity in network prime time television last year.

Now before we all rush off to watch it, a few caveats.  First, the actual number of instances was 64 for the 2011-12 season compared to 1 case in the previous year.  As for the full extent of that "full-frontal nudity" - in 74% of the cases, shows used blurring or pixelation to cover the offending body parts, and in another 5% of cases a black bar or object covered the nudity.  And when asked, a PTC representative admitted that placement of other objects obscured the naughty bits in the remaining cases - there was no actual visibly bare full-frontal nudity.
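For the record, the headline figure is just percent-change arithmetic on a base of one:

```python
# The eye-popping percentage comes from simple arithmetic on a tiny base:
prev_season, this_season = 1, 64
increase = (this_season - prev_season) / prev_season * 100
print(f"{increase:.0f}%")  # 6300% - going from a single incident to 64
```

Which is why any percent-change claim deserves a look at the raw counts behind it.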

The PTC listed NBC as the biggest "offender," with 35 cases.  ABC had 17, CBS 5 (including Ashton Kutcher's infamous debut on 'Two and a Half Men'), the CW had 4, and Fox was cited for three (non-animated) instances.

Source -  Study Says Full-Frontal Nudity Up 6,300% in Primetime ... But It's Not (Video), The Box blog on The Wrap

NY Times Dumps About.com

The New York Times is selling off About.com, one of the first significant online content farms, to Barry Diller's IAC for a reported $300 million in much-needed cash.  The deal was described by analysts as part of the NYT strategy to sell off "non-core" assets and refocus on its major newspapers and digital publishing extensions.  The Times originally acquired About.com in 2005 for $410 million, and NYT reported that digital revenues for About.com fell 25% in 2011, while profits declined 67%.

  Analysts seem to think that divesting itself of this particular profit center was a good move by the Times company - improvements in search engines and a glut of content and portals are making content farms less and less valuable.  The original goal of About.com was to
cull the most meaningful content from expert editors and writers on the most important subject matter, quite frequently how-to tips and general reference information.
By refocusing on the core news product, the New York Times company is following
a vision that context, not content, may actually be king in a digital publishing environment where content has become ubiquitous and seemingly undifferentiated, especially as the lines between professional journalism, user- and brand-generated content seem to be tipping the scales.
One aspect of this is the idea that the value of fairly generic content is likely to fall in a "marketplace of information glut," while information that adds value (through style, context, or form) can differentiate itself and become "premium" content and more highly valued. 
  And the information glut is coming - research released by Facebook found that a decade ago, the Internet was generating and hosting one petabyte of new content annually.  By the end of this year, the report suggests the Internet will generate one petabyte of new information and content every two to three days.
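Taking 2.5 days as the midpoint of "every two to three days," the implied acceleration is easy to estimate (a back-of-the-envelope sketch, not a figure from the report):

```python
# Petabytes of new content per day, then vs. now (2.5 days is my midpoint
# assumption for "every two to three days")
old_rate = 1 / 365   # one petabyte per year, a decade ago
new_rate = 1 / 2.5   # one petabyte every 2-3 days, by the end of 2012
print(round(new_rate / old_rate))  # 146 - roughly 150x more new content per day
```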

Source  -  About Time: Times Sells It To IAC For $300 Million, Online Media Daily

Monday, August 27, 2012

Winning News on the Apple Front

First off, congratulations to Apple for becoming, sort of, the richest company on Earth.  When Apple set a new high stock price on the morning of Aug. 20, it surpassed the market cap Microsoft reached before the dot-com bubble burst - making it the most valuable company ever (using that metric).  However, if you adjust Microsoft's previous high valuation from 1999 for inflation, Microsoft still comes out on top.
  And if Apple's not yet on top, many analysts think it will soon be.  There are predictions that the iPhone 5 release will be the "biggest handset launch in history" (see below), a new iPad Mini is rumored to be in production, and there are predictions of iTV, a smart TV with gesture controls and content deals with AT&T and Verizon.  And then there's the Apple-Samsung patent fight over smartphone features and services.

The other big win of the day was in US Court, where Apple won its patent infringement suit against Samsung over various "look & feel" patents for smartphones.  The jury found that Samsung had infringed on four design patents and three utility ("use") patents held by Apple, and rejected Samsung's counterclaim that Apple had infringed on five patents owned by Samsung.  Apple was awarded $1.05 billion in damages - but had been seeking between $2.5 and $2.75 billion.
"This is a resounding victory, not only for Apple and its intellectual property portfolio, but also designers and design rights in general. This verdict strengthens Apple's design identity, which arguably has been watered down as each new Apple-like device hit the market," said attorney Christopher Carani, chairman of the American Bar Association's design rights committee.
Rather than face the prospect of another Apple suit, rival phone makers will give the iPhone design a wide berth--yielding more space around its design elements than perhaps necessary, Carani predicted.
The case considered a total of 24 smartphone models from Samsung - and only three models escaped some or all of the infringement claims (the Galaxy Ace, the Intercept, and the Replenish).  The hot-selling Galaxy S models were found to have infringed the most, and Apple is said to be asking the court for an injunction that would prevent Samsung from selling infringing models in the U.S.  With Samsung a leader in the booming Android-OS smartphone marketplace, the case is expected to significantly impact smartphone markets in the near term.

In the meantime, the IT rumor mills have been flourishing with talk of the iPhone 5 (expected to be released next month) and how it would stack up against the Samsung Galaxy S III.  With the patent case settled, the point may be moot - that is, there may be no Galaxy S III to compete with in the U.S.
   Anyway, various analysts expect the next iPhone to have: a thinner, larger screen - reportedly a 4-inch screen instead of the current 3.5-inch; a different docking pin (thinner and narrower); support for LTE (4G variant) networks on AT&T, and later Sprint (which is just now building out its LTE network); possible extension to T-Mobile networks; and NFC (near field communications), which allows for services like digital wallets and sharing files between nearby iPhones or other devices.

Sources -   Apple becomes most valuable company in history - sort of, SNL Marketweek
Apple Wins $1.05 Billion in Samsung Patent Case, Information Week Mobile
The top 5 iPhone 5 rumours: From plausible to outlandish,  FierceWireless

Money for Nothing From YouTube

OK, first let's get past the latest YouTube OMG metric - people around the world upload an average of 72 hours of video every minute to YouTube.  And many are finding ways to make money off it.

  Christine Erickson has a post on the Mashable Social Media blog that talks to some of the bleeding edge profiteers about what it takes to be a profitable YouTube Channel.  The post includes a number of videos, and the main pieces of advice are variations on the standard approach for media success.
  • Content is King - the obverse of GIGO (Garbage In, Garbage Out).  Be innovative, interesting, and care (a bit anyway) about production quality.  Also, to get a channel on YouTube you have to select and identify a content category.
  • Be Original - coming up with original ideas is good, but getting a copyright is critical. To be eligible for monetization, YouTube states that “you must own all the necessary rights to commercially use all visuals and audio, whether they belong to you or a third party.”
  • Patience and Consistency - One hit wonders rarely make much.  And when your video is competing with 4 billion others, it takes a while to get noticed and build an audience base.  Speaking of which...
  • Promote Yourself - YouTube rankings use a wide range of metrics to decide where your site or video comes up on a search.  Connect your channel with social media outlets - and use social media to connect with audiences and engage them - build a relationship that will keep them interested and coming back for more.
  • Equipment -  You don't need film-studio or broadcast-standard professional equipment.  Lots of cameras and smartphones can shoot HD-quality video.  But look into cameras with options for connecting microphones, and invest in some good microphones, tripods, and lights.  Going for editing software a level beyond what comes bundled can also be a good idea.
But read (and watch) the whole thing.

Source  -  How to Make Money From YouTube, Mashable Social Media

Local TV News Tops Cable for Voters

A TVB analysis of Nielsen Media Research data from last spring suggests that local early and late news audiences in the top 10 US markets are a closer match for "American Voters" than are the audiences for any of the major cable news networks.  As such, they suggest that advertising on local TV news may be a better deal for political advertisers.
"It's long been established that television remains the most effective and influential medium for political campaigns to reach voters, and hard data shows that local broadcast news viewership is far more in line with today's voting population than the national cable news networks," said Steve Lanzano, TVB president.
I'm not really surprised by the argument - local news audiences tend to reflect, more or less, market demographics, while the cable news channels target different audiences.  Fox, CNBC and MSNBC are fairly honest about it, but CNN tilts older and a bit liberal as well.  Besides, the demographic mix of actual voters has fluctuated in the last two major election cycles (and I suspect further shifts in 2012) - so it's a bit like trying to match a moving target.
  Then again, a lot of political advertising is targeted, and for those ad buys it may be better to find an outlet that matches up with the target population rather than going for a broad reach.  Placing local ads on highly targeted cable nets (news or other) is one of the better advertising buys for highly targeted campaigns.

Local TV news will get a lot of needed revenue from political advertising this year, in either case.

Source -  Local TV News Tops Cable In Reaching Voters,  TVNewsCheck
TVB Report available here

Saturday, August 25, 2012

Biggest Blunders in Mobile Broadcast TV

Fierce Wireless has an interesting series of posts on the worst wireless blunders.  I'll focus on one that's right on the money - the failure of U.S. mobile broadcast TV to find or develop a market.  The post focuses on the various efforts to launch a nationwide network dedicated to mobile TV, as distinct from using existing mobile networks to stream TV, whether through dedicated channels (MobiTV) or streaming apps for smartphones or tablets.
  There have already been three major failures in mobile TV - the most widely known is probably Qualcomm's MediaFLO, which actually had some buzz at NAB conventions and had several test markets going for a while.
  In addition, Crown Castle (known for building and operating broadcast towers) tried to push its Modeo service beginning in 2004, using the international Digital Video Broadcasting - Handheld (DVB-H) technology.  It offered the network to operators on a wholesale basis, but never secured major carrier cooperation.  It even ran a beta test in New York, starting in January 2007.  By July, Crown Castle had pulled the plug and leased its nationwide wireless spectrum to other telecom operators.
  A second company, Aloha Partners, also tried to develop a nationwide service using the DVB-H technology, beginning in 2006.  Aloha managed to at least interest a major wireless carrier (T-Mobile USA) in testing their system in 2007, which they claimed could offer up to 24 channels of programming.  The test evidently didn't go so well, as Aloha sold its wireless spectrum to AT&T in October 2007.
  Qualcomm's MediaFLO service was launched across the U.S. in mid-2009, with deals with both Verizon and AT&T to sell phones capable of receiving the approximately 12 channels of content that FLO offered.  Subscriptions to the FLO service through the carriers ran about $15 a month.  In October 2009, Qualcomm supplemented the carrier service by marketing a personal FLO device for $249 and offering subscriptions to channels starting at $8.99 a month.  Even after several price cuts, the number of subscribers never approached the number needed for long-term viability, and Qualcomm closed down the FLO service in the spring of 2011, selling the wireless bandwidth to AT&T.
  Part of the problem was that these systems offered only a fraction of the channels available, and often mixed content from multiple channels (often time-delayed and time-shifted) to the point where consumers couldn't rely on finding the programming they wanted.  Another problem was that these early systems relied heavily on cellphone technology - which, at the time, meant very small screens and severely shortened battery life (processing and displaying video uses much more power than voice calls).  But the biggest problem was that there wasn't much (if any) demand for mobile TV, and the various systems certainly didn't do much to try to build up demand.

  That's not to say that mobile TV will never succeed - handset and device improvements have kindled new interest in watching video content on phones and tablets; wireless broadband and "TV Everywhere" are helping consumers determine the value of mobile TV; and a new transmission technology using local TV broadcasters' excess bandwidth may provide more channels - and the potential for simulcasting the local station's programming gives consumers more insight into the possible value of mobile TV.  There's a new proposal for mobile TV (Dyle) being developed as a joint-venture of 12 major broadcast groups (including Fox, NBC, and Ion station groups) that seems promising. The first handsets are entering the market, so there should be some indications in the next year or so as to whether this iteration of mobile TV will have any better luck attracting consumers.

Source  -  The failure of mobile broadcast TV to capture consumer attention - worst wireless blunders,  Fierce Wireless

Sony's Crackle Joins Video Streamers

Sony has announced that its video player app, Crackle, has been downloaded more than 11 million times.  With plans to launch tailored apps for Windows Phone, Nook Tablets, and Kindle Fire, Sony intends to have native Crackle apps for virtually every major platform and device.
  Crackle differs from Hulu+, Netflix, and Amazon's Prime streaming service in several ways.  First, the content catalog on Crackle has fewer offerings than its competitors'.  Second, the content is heavy on B-grade titles and (naturally) Sony-owned content from Columbia and Sony studios.  Third, it doesn't offer any content in HD - at least for now.
  The real difference from a business perspective is that Crackle is 100% ad-supported - and free to users.  While Hulu has some content that's ad-supported and "free," Hulu is also subsidized by the subscription Hulu+ service.  Crackle will provide a real test of the viability of a stand-alone "free" (ad-supported) commercial online video distribution service.  While Sony's been a leading innovator in new media hardware and content distribution (they had an e-reader before Amazon and a personal digital music player before Apple), they've not always been smart about it, or successful.  Their tendency in the past has been to load up with proprietary hardware and software, and limit flexibility in how people use and access devices and content.
  Crackle could turn out to be a good test case in two ways - the viability of ad-supported content in a digital world, and whether Sony's learned that digital users really like flexibility.

Sources  -  Netflix, Hulu, HBO Go and Sony? Huh?,  MoBlog (a MediaPost Blog)
Crackle home page -

Friday, August 24, 2012

Growth in Global Online Video Use Continues

A new research report from NPD Group shows that almost 1 in 5 consumers worldwide access online video through their TV sets on a daily basis, and 25% report watching online videos on their TVs at least several times a week.  Those numbers are based on a survey of 14,000 consumers across 14 countries that looked at online video content and how it was accessed and used.
  There were significant differences across countries in terms of the level of online video use, and in the devices used to access and display it.  In urban China, almost 40% reported watching online videos on their TVs daily.  Urban China was, in fact, the heaviest user of online video - "consuming more online video content than any other country across every device."
  The increase in using TV sets to watch online video is very likely to be related to a shift in content preferences - films are now the most popular online video content on TVs, overtaking TV programs, and other content forms.
  Using mobile devices to watch online videos continues to grow, but still trails the use of laptops (52%) and desktop computers (73%).

Source  -  Report: 18 Percent of Consumers Watch Online Video Content on TVs Worldwide,  Online Video Daily

Germany Ponders New Copyright for News

There's a new intellectual property right being considered by the German legislature.  While precise language is still being developed, proponents have described the new type of copyright as meaning that the use of any published content online, no matter how small the snippet, would require payments to publishers, at least if the use was commercial or generated value associated with the use.  One scenario discussed was online news aggregators who carry advertising on their sites.  Another would seem to extend to any use where someone along the line obtained some value.
"The example that was given at the hearing was: a bank employee reads his morning newspaper online and sees something about the steel industry, and then advises his clients to invest in certain markets," Mathias Schindler, who helped found Wikimedia Deutschland, told Al Jazeera.
"The publishers argued that the bank consultant was only able to advise his clients because of the journalistic work in the published article," said Schindler, who's been attending recent government hearings into the proposed copyright amendments. "So that means the publisher deserves a fair share of any money made from that scenario. This was the proposal from the start."
The Federation of German Newspaper Publishers understandably applauded the idea -
"In the digital age, such a right is essential to protect the joint efforts of journalists and publishers," it said in a statement, noting that such revenues were "an essential measure for the maintenance of an independent, privately financed news media."
Like other recent proposals to find new revenue streams for traditional media, this one looks good at first (at least to the rights holders).  However, if you start considering the downstream implications of the proposal, a lot of problems emerge.  A researcher at the Bureau for Information Law Expertise in Germany argued that this kind of copyright expansion being pushed by publishers was likely to lead to significant "collateral damage to fundamental freedoms like the freedom of the press, the freedom of expression, the freedom of science and education as well as the communication and publication practices on the Web."  At the very least, it revokes any kind of fair use/fair dealing principle.

  Another problem mentioned by critics is the inherent difficulty of determining and tracking whether information passing through multiple intermediaries leads to some eventual commercial value that would trigger the licensing payments, or (using the above example) determining which specific publisher was the source of the information whose consequent use created commercial value.  And following that logic a bit further: if reading a local German newspaper online becomes more costly (because of the licensing fees), wouldn't the online user go to a non-local news source in a country without this enhanced copyright?  Would information sites and sources flee German jurisdiction to avoid the costs and enforcement requirements of such an extension of copyright?  And then there are the enforcement issues related to dealing with all the small blogs, social media, emails, and other digital sources that might post a snippet of news - particularly those located outside Germany.  Turning the simple idea of news creators sharing in the rewards from other people's use of their information into a viable and functional system for identifying uses, determining the values and appropriate "share," and tracking, collecting, and distributing licensing fees equitably could easily become an operational nightmare, if not an outright disaster.

  But the real problem may turn out to be that such moves aren't likely to generate much added revenue in the first place.  Newspaper publishers think: sure, I get paid for people's use of my online content - that's great.  But they tend to forget that newspapers are news aggregators themselves - studies show that only 5-10% of a newspaper's news content is produced entirely in-house (where they would be the sole owner of copyright).  So if these kinds of copyright extensions are enacted, a newspaper might well receive licensing fees for 10% of its content, but would be required to pay licensing fees for the remaining 90% of its total output.  On average, and in the long run, these new licensing schemes are likely to cost news organizations dearly, rather than be their salvation.
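A toy illustration of that arithmetic, with all figures hypothetical and assuming fees flow symmetrically under the scheme:

```python
# Hypothetical figures: a paper publishes 100 units of content, produces 10
# of them entirely in-house, and licensing fees are symmetric at 1 per unit.
fee_per_unit = 1
total_units = 100
owned_units = 10

collected = fee_per_unit * owned_units                 # fees earned on in-house content
paid = fee_per_unit * (total_units - owned_units)      # fees owed on aggregated content
print(collected - paid)  # -80: a net cost, not a new revenue stream
```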

Source -  Germany Wants To Charge Google For News Snippets,  Information Week

Online piracy - and enforcement - hits apps

It was bound to happen at some point, given the rapid diffusion of smartphones and tablets, and the popularity of apps.
  The U.S. Justice Department's IP Task Force, in conjunction with French and Dutch authorities, has seized three website domains for allegedly trafficking in pirated Android apps.  In a domain seizure, U.S. authorities work with Internet domain name servers to block access to the websites (replacing the original content with a page announcing the seizure); where they can, they also seize website content on the servers that host it and try to arrest the site operators.  However, due to the nature of the Internet, site owners and operators are often outside U.S. jurisdiction, or are hidden behind layers of fake names or companies.  Domain seizures have had a mixed record in terms of actually slowing down piracy or trafficking in counterfeit goods, and have at times seized websites in error.
  Still, this seems to be the first anti-piracy action directed at the unauthorized copying and sale of apps.  App developers have been worried for some time about the potential piracy of Android apps.
A 2011 survey of 75 Android developers by the Yankee Group and Skyhook Wireless found that more than half believed Google wasn't doing enough to prevent app piracy. Last month, developer Matt Gemmell also laid the blame at Google's feet: "People pirate Android apps because it's easy."
Google has implemented some additional security and encryption measures in its newer Android licensing system and for apps sold through Google Play, but developers are concerned that these efforts don't extend to all app developers and purchase platforms.  There is also some concern over whether Google's measures will sufficiently hamper piracy and unauthorized copying of apps.

Source -  Android App Piracy Leads Feds To Seize Websites,  Information Week

Thursday, August 23, 2012

AEJMC Panel on University Online News Start-Ups

At the recent AEJMC (Association for Education in Journalism and Mass Communication) conference, several interest groups and J-Lab hosted a discussion on "New News Labs."  The panel discussed the variety of models that universities are developing for their online student news sites.  The discussion was videotaped and is available at the J-Lab site - direct link to video.

Source -  2012 AEJMC Luncheon - New News Labs: The Rise of University Entrepreneurial News Startups,  J-Lab

Wednesday, August 22, 2012

Legislative Moves by Broadcasters

American broadcasters and their trade group, the National Association of Broadcasters (NAB) are actively lobbying Congress and the FCC on two current issues with possible economic impact.
  In one case, radio broadcasters are lobbying against possible changes in intellectual property law.  Recorded music embodies several separate intellectual property rights: copyright, which covers the authors/composers of the music; performance rights, which cover the artist's performance of that music; and mechanical reproduction rights, which address the right to make and sell copies of performances.  The last, mechanical reproduction rights, are held by the record companies.  Broadcasting has always had to pay for copyright permissions.  However, in the early days of radio there was an informal quid pro quo on performance rights that later became formalized in copyright law - since radio was (and still remains) the pre-eminent promotional tool for music, artists and labels waived performance rights fees to maximize radio's capability to play new artists.  The logic behind this is solid - adding performance rights fees on top of copyright fees raises the cost of music to stations.  This may be less of a problem for music with an established value, as long as that value is higher than the rights costs - but it will discourage the playing of lower-valued (special-interest or limited-interest) recordings, whose value is less than the cost of rights fees, and of new music whose value is uncertain.  Positive rights-fee costs will discourage broadcasters from serving minority tastes, and reduce stations' interest in airing new music from new artists (where the value of that music is unknown and uncertain).
   Most other media channels, by contrast - where music is often more peripheral to the main service value - have been paying performance rights.
  The music industry, which short-sightedly sees this as a new revenue stream for existing music catalogs, has been pushing Congress to overturn what amounts to radio's waiver from paying performance rights.  It may be "fair" to treat all media the same, but not all media have served as a primary (and free) promotional tool for the industry.  Adding rights fees may generate more revenue from the performances that artists have already recorded.  However, the vast bulk of revenues for music companies and artists comes from reproduction rights, not copyright or performance rights.  And the added cost of performance rights will likely have a negative impact on the demand for new performances and recordings - as radio outlets reduce the amount of new music played, and as artists and labels would likely have to pay more to get new music in front of potential audiences and consumers.
  There are two competing bills circulating in the House.  A draft of the proposed "Interim FIRST Act," authored by Democrat Jerrold Nadler, was released this week; it would force cable and satellite radio stations to pay performance rights, and would require online streams from broadcast radio stations and other Internet radio services to pay performance rights at an even higher rate.  While proclaiming that it would level the playing field and treat all players equitably, it would embody three different sets of rates: broadcast radio, which would still be exempt; cable and satellite radio, which would pay one rate for performance rights; and online and Internet radio (including streams of broadcast stations), which would pay a still higher rate.  Republican Jason Chaffetz is working on a bill that would impose the same performance rights fees on all digital music sources - cable and satellite radio, Internet radio stations, and music streaming services.  It would also call for the rates to be negotiated between music outlets and the music industry, rather than being set by a Federal tribunal.
  While broadcasters would prefer to avoid paying performance rights fees altogether, they are more supportive of the Chaffetz approach than of the Interim FIRST Act - and of the latter's attempt to make broadcast radio stations pay more for their digital streams than other digital music outlets pay.

  The NAB and broadcasters are simultaneously continuing to lobby for legislation that would require smartphones to be capable of receiving and playing FM radio broadcasts.  Most smartphones already have a chip built in that could do this, but the capability is turned off by virtually all U.S. wireless operators.  The initial arguments were phrased as a trade-off for reintroducing performance rights fees.  Broadcasters are now adding a new argument - public safety.  They point to the role that radio plays as the primary means of distributing information to the public during natural disasters or other emergencies, when normal information services are interrupted.  In filings to the FCC, the NAB argued that most broadcasters have built-in redundancy in case of emergency, and that the broadcast nature of their service and the widespread availability of receivers make radio particularly well-suited for emergency communications.  Besides, radio already serves as the backbone of the government's designated Emergency Alert System.
"It is time to seriously consider steps needed to improve consumer access to free, over-the-air radio via smartphones and other mobile devices,"  (the) NAB said.
While analysts generally doubt that activating FM chips in smartphones will dramatically change radio and music listening behavior, it at least expands the options for consumers.  And it's likely to prove useful when traditional communication channels (and wireless service in particular) are interrupted.

Sources -  Nadler circulates draft legislation on music royalties,  The Hill
NAB: Broadcasters Are Answering Call for Reliable Emergency Info,  Broadcasting & Cable

Newsfail - From Watchdog to Lapdog

There are a lot of reasons behind the continuing decline of the big media outlets in the U.S.
  You can point to structural issues in the evolving news marketplace - the explosion of competition and substitutes that has splintered audiences; the rise of a superior medium for classified advertising (online); the shrinking news cycle and the shift in focus from accuracy to primacy (that is, being the first with a story has become more important than being accurate).
   You can point to shifting audience preferences - newspaper readership has been declining with each new generation of readers; the rise of TV news, then cable, for immediacy and impact; the rise of online news and alternative information sources that allowed news consumers to find the news and framing that suited their preferences (challenging and dismantling the gatekeeping role of major news outlets, directly challenging their orthodoxy, and calling them to task for their errors).
   You can point to a plethora of bad management decisions on the part of once-dominant monopolistic media firms - such as selecting a morning show host with limited experience to anchor a major TV news program; emphasizing entertainment values over news values (the "gotcha" moment, promoting controversy and contrived, opinionated conflict over reasonable discussion, hiring for name recognition outside of journalism - or directly from political staffs, etc.); and reducing the size of the staffs actually producing the product (news) while enlarging management and trying to maintain monopolistic profit levels.  And one glaring recent example - the New York Times hiring as its CEO someone whose media experience is with a European public service broadcaster (the BBC), and offering him a package that could amount to $8+ million in salary and benefits for his first 16 months on the job - when the business he will lead is on track to lose more than $600 million in 2012.
   And you can point to changes in the journalistic process - with shrinking staffs and emerging alternative media forms pushing reporters to cover more, with multiple media, and produce multiple versions of stories; with the emphasis on primacy rushing story production; with the near-disappearance of mid-level copy editors and fact-checkers from newsroom staffs; and with a decline in beat reporting that has contributed to the emergence of a cohort of generalist reporters who often lack the background knowledge and experience to inform their coverage and challenge inaccuracies when they confront them.
   To us old guys, the "profession" of journalism seems to have become less investigative and analytical than mere "reporting" - and in a bad sense.  Too much of modern journalism seems to consist of reporters taking information fed to them by one source, finding one or more other sources to fill out the story's frame, and regurgitating it while adding their own opinions.  And it seems increasingly biased.
   Now, "bias" claims can be controversial, particularly for a profession that has historically embraced standards of accuracy and objectivity.  But any process of selection, including news gatekeeping, embodies bias of one form or another.  News frames and formats, standards and norms - all embody some degree of bias.  Some kinds of bias are good and welcome - such as standards of accuracy, or news values that suggest major corruption in local government is a more important story for local news outlets than a report of a car accident halfway around the world.  But as major news outlets increasingly seek and embrace entertainment values as ways to differentiate their content and remain competitive, and as opinion becomes more widely integrated into mainstream news, major news media damage their value as "news" outlets, and are increasingly seen as less accurate, and less trustworthy.  It's no wonder that trust in news outlets continues to fall.
   Bias doesn't have to be limited to full-blown cheerleading for one side or another - it can happen in the selection of what news to cover, it is evident in what claims are accepted and which are rejected or treated with suspicion, and it can be seen in terms of what sources a reporter goes to, or in whether a reporter honestly seeks sources or information that challenge positions and critically evaluates the positions, or just looks for one source to "confirm" them.
But the most problematic aspect is that people tend to judge bias not on some objective standard, but relative to their own views - and if those views are widely shared among their peers it is an easy jump to the conclusion that it must be right and reasonable.
   The shift to opinion and the increased perception of bias (even dishonesty) in major U.S. news media have become increasingly obvious over the last decade - particularly in national political coverage.  And the rise of blogging has contributed to the lack of trust - by giving true experts an outlet to publicly comment on, and challenge where needed, stories written by generalists lacking critical background knowledge and the experience to recognize problematic aspects of stories.  There has also been an increase in self-criticism and recognition of failings by journalists (or at least, those "mea culpas" are more widely reported within the blogosphere).
  For example, within the last week we have Jake Tapper of ABC criticizing recent coverage -
"We are spending a lot of time in the last few weeks, those of us in the political world, political journalists and also politicians, talking about things other than the economy... [A] lot of people are hurting out there. I’d like to see more action taken and more emphasis given to this issue.”
Mark Halperin, senior political analyst for Time, put it more succinctly when discussing the Romney tax return "story":
"I think the press still likes this story a lot, the media is very susceptible to doing what the Obama campaign wants, which is to focus on this..."
“Do voters care about it? I don’t think so,” he added, concluding that “The economy is still top, front, and center. The Medicare debate’s important. I think [Romney's tax returns is] mostly something that the press and insiders care about. It’s an important issue about transparency, but there are other bigger issues in the campaign, I think.”
And the concern goes back to the 2008 Presidential campaign, which saw the rise of "JournoList," an online listserv that pushed certain storylines with a strong liberal perspective.  A study by the Pew Research Center at the time found a strong slant - then-Senator Obama received much more coverage than the other candidates, and coverage of Obama had 1.4 positive stories for each negative one, whereas coverage of McCain had 3.5 negative stories for each positive one.
"Sometimes I saw with story selection, magazine covers, photos picked, [the] campaign narrative, that it wasn't always the fairest coverage... I have said before... [that I] thought the media helped tip the scales. I didn't think the coverage in 2008 was especially fair to either Hillary Clinton or John McCain," Tapper said.
After the 2008 election, the Washington Post's ombudsman admitted that the Post's coverage had been slanted, probably worth a 10-15% shift in voting toward the Democrats. It was a perception shared by voters - one post-election poll found that
"a majority of voters (51%) say most reporters have tried to help Barack Obama win the presidency. Just seven percent (7%) think they tried to help John McCain.”
 And it's the perception of voters and news consumers that matters - as it forms the basis for determining how much trust people place in media, and their evaluation of the value of the news being presented.

Sources -  Tapper: Media Failing Country,  The Weekly Standard
Mark Halperin: The Media Is Very Susceptible to Doing What the Obama Campaign Wants,  Mediaite
Media Bias in the 2008 U.S. Presidential Election and the Effect of the Blogosphere,  Red State

Tuesday, August 21, 2012

The Growing Importance of Search for TV

As media markets continue to expand and become more competitive, and as the flood of content availability and options grows, helping people find the information and content they want becomes more and more important - and valuable - to potential consumers.
  John R. Osborn offers some thoughts on the importance of search for TV firms in the Online Video Insider blog.  I'll offer my own insight that search will remain important, and become increasingly valued, for all media forms and formats.
  Osborn starts by reminiscing about TV's Golden Age of the 1950s, when there were limited channels and programming schedules were fairly static - so that most everyone knew where and when to find the programs they were interested in viewing.  Today, he notes, more than half of US TV homes have DVRs, about 90% use multichannel video providers (cable, DBS, etc.) to access TV content from some or all of the 500+ networks available in the U.S.  And then you also need to consider a number of other content sources that Osborn doesn't list - the huge backlist of movies and TV programs available through home video (DVDs, BluRay); the rise of multiple streaming services offering access to TV content and movies (Netflix, Hulu+, etc.) - including many current programs; and the explosion of online video.  A quick stat from YouTube can give you an idea of the wealth of online video content available - on average, users upload to YouTube more video content each hour than the major U.S. networks have produced in their 60+ years of operation.
  Today, it's not enough to produce good TV programming to successfully attract an audience - potential viewers have to first learn that the content is available, and then find it.  And to become successful, the content has to be engaging enough to get them not only to view the program, but to come back for new content as it becomes available.  Search can be incredibly useful in meeting these goals.
  Osborn's post outlines the importance and value that good search options can provide to viewers, advertisers, content producers, and cable/telco/satellite distributors (PayTV), many of which currently offer a variety of search options - and the points apply to local stations and outlets as well.  There are some good exemplars and templates out there - TiVo's search and recommendation system, Microsoft's new Xbox Live technology, Amazon's recommendation platform, and Netflix's recommendation system.
  The full post is worth a read.

Source -  The Importance Of Search In Next-Gen TV Business Model,  Online Video Insider

Monday, August 20, 2012

NY Times Gets New CEO - from BBC

Last week, the New York Times announced that it had finally hired a new CEO to fill the position that had been empty since the previous CEO left last December after a series of embarrassing gaffes.  The new CEO will be Mark Thompson, formerly the Director General of the BBC.

  A number of media critics have questioned the hire, noting that -

  • Mark Thompson has spent all of his media career at the BBC - a government-funded public service broadcaster.  He seems to have no newspaper experience.
  • While he was Director General during a period when the BBC made considerable, generally successful, efforts to move into the online/mobile universe - most insiders saw Thompson as more of a "custodian and curator" than an innovator.
  • Is Thompson's salary of $1 million a year reasonable for an organization that is still bleeding revenue profusely ($140+ million in losses in the last quarter)?  Actually, it's worse than that - Thompson's deal calls for a reported $8+ million in salary, incentives, and prospective bonuses that he could receive by the end of next year.
On the business side, the latest financial reports show that ad revenues continue to decline at the Times, with print ad revenues down a reported 7% in the last quarter (online ad revenues were also down).  In the last quarter, the Times reported earning more revenue from subscriptions than from advertising.
The bottom line is that a business-as-usual or custodial approach is not going to cut it at the NYT, not when revenues are declining as rapidly as they have been. And shedding some staff or tweaking the product with a few digital bells and whistles isn’t likely to accomplish much either. The legendary paper doesn’t need someone to manage its business; it needs someone to reinvent it on a fairly fundamental level. Whether Mark Thompson is the man for that particular job remains to be seen.

Sources - Is Mark Thompson what the NYT really needs right now?, Tech News and Analysis. 
NY Times boss Thompson eyes $8m payday,  AFP/France 24

ICANN Antitrust Case Moves Forward

A Federal antitrust lawsuit filed against ICANN (the Internet Corporation for Assigned Names and Numbers) has passed its first hurdle, with a Federal judge ruling that ICANN, while a nonprofit, may still be engaged in the kinds of commercial activities covered by U.S. antitrust laws.  The judge handling the preliminary hearing ruled that ICANN's argument about its charitable purpose is "irrelevant to an analysis of whether ICANN's activities are commercial."
  A publishing company filed the suit
against ICANN last November, shortly before the rollout of a new ".xxx" top-level domain. ICANN said that companies or individuals could pay the registry ICM -- tapped to manage the .xxx domain -- to prevent their names from being registered with an .xxx at the end, but that doing so would cost $150.
(The suit argued) that companies or individuals who wanted to prevent their names being used by others in a .xxx domain should not have to pay a fee of $150. The company said the fee was artificially high and reflected price gouging, monopolistic conduct and other anti-competitive practices.
The ".xxx" domain was established expressly for porn and other "adult" sites, in part to facilitate the ability of filters to distinguish adult from more general-interest sites.
  When the case was filed, the Association of National Advertisers said the suit illustrated the issues likely to emerge from ICANN's plans to allow companies to purchase the rights to use their brand names or other words as top-level domain names (the text-based URL).

Source -  Judge Allows Antitrust Lawsuit Against ICANN,  Online Media Daily

Apple's TV interest in Cable Rebuffed

A couple of weeks ago, it was reported that Apple was approaching several large cable MSOs about the possibility of providing set-top boxes that would integrate Apple TV interfaces and services.  It's become fairly clear that cable operators have not exactly embraced the idea.  The Wall Street Journal has a good story on the cable industry's reaction, suggesting that Apple seems to be focused more on delivering its software and services than on providing hardware.
  Moreover, a new report from Bernstein Research suggests that there are meaningful differences between the mobile world, where Apple has been quite successful, and the market for multichannel TV services.
Not only does the cable TV market, unlike mobile, feature network distributors that also own content, it does not subsidize end user devices, and it is heavily involved in determining the capabilities of those end user set-top box. That final point, in particular, is something that service providers in the mobile market pretty much ceded to Apple.
  Bernstein's report suggests that Apple might find these differences difficult to overcome.  More critically, the primary focus of multichannel TV has been to provide the widest range of choices to subscribers, while Apple has insisted on providing what's called a "walled garden" - a limited set of services that conform to Apple's standards.  I think that's going to stand in the way of Apple's move into TV if it continues to insist on that approach.

Sources - Apple: High Hurdles to Working with Cable Guys,  Bernstein Says, Barron's
Apple's New Front in Battle for TV, Wall Street Journal

SideNote - Top Apps for Back to School

Fierce Mobile Content has a post on top "Back-to-School" apps for both iOS and Android mobile devices.  It's aimed at primary and secondary education, but includes a couple of apps of interest to university educators and even media professionals.

  • Dropbox - A cloud storage service for sharing your files across devices, or with other users.  And it's free for low-volume users.
  • AT&T Translator - a free real-time translator for your mobile devices.  For now, it covers the following languages: English, Spanish, German, French, Japanese, and Chinese.
    AT&T Translator marks a new milestone in how quickly speech technology is progressing in mobile. While there are hundreds of apps that translate text from one language to another, they can be time-consuming and tedious. AT&T Translator, developed through AT&T Labs on AT&T's Watson platform, identifies languages automatically, saving the user from having to manually input this information. AT&T has stated it is working to add additional languages to the app.
Source -  Best Back to School Apps for iOS, Android,  Fierce Mobile Content

Tuesday, August 14, 2012

Two Posts on Big Data & Social Media

The Internet allows for the collection of immense amounts of information - on users, on outlets, and on content and its flows.  Aside from continuing concerns about privacy issues, there's been a rise in attempts to make sense of, and use, all this constantly-generated "Big Data."  Furthermore, the rise of social media, and its transparency, is creating huge amounts of comments, thoughts, and recommendations from millions of users - as well as their follies and foibles - all of which can be mined for data.
  We've clearly passed the first stage - the explosion of information.  That's been happening for decades, and the amount of information generated around the world continues to grow exponentially.  We're also well into the second stage - developing tools and techniques for sorting, managing, and even filtering information.  And we're in the early stages of developing analytics - the means to measure and analyze all that information (although measurement compounds the information explosion, as it continuously creates new data about information and data).  That leaves us with the continuing problem of making sense and finding value.  We're making inroads, but have yet to fully step into the third wave - using all those tools to find value in the information haystack.

Dion Hinchcliffe, posting at The Brainyard, argues that
the social world, by dint of a billion people engaging with each other around the clock, is now the richest source of open innovation, product ideas, marketing and sales opportunities, customer care capacity, and much more. One thing we've learned in the last eight years of the mass collaboration era is that, whatever an organization cares about, crowds can help us conceive of it, build it, test it, market it, support it, and fix it--and do all of that at scale.
The problem has been to find the gems, or spot the trends, in this morass of information and data.  Thankfully, a lot of people and companies have been working to develop analytics and techniques to sift and sort Big Data.  This has opened a Pandora's Box - the potential of finding value for Big Data users, as well as the potential for harming social media and Internet users.  Hinchcliffe focuses on the positive, positing that social media's Big Data can generate positive returns on organizations' investments in utilizing and analyzing social media.  He suggests that we're well past the first wave - organizations are already finding and making use of social media.
With the continuing rapid growth of social media, Hinchcliffe suggests that some organizations will transition into social businesses.  In an information economy, knowledge workers recognize the benefits of the opportunities the Internet and social media provide - greater access to information, enhanced opportunities for collaboration beyond one's own "silo" of expertise, and the ability to focus on project-related tasks - particularly if enterprises can transcend internal barriers, such as embedded legacy enterprise-specific applications.  The fairly rapid adoption of social media provides hope that we'll make the transition.

Still, it's not all blooming roses out there in the world of Big Data.  At a recent Kontagent Konnect user conference, Josh Williams talked about the "Seven Deadly Sins of Data Science."  As with any analysis, you can do it well or poorly (particularly if you don't understand the limits inherent in any analytic technique).  Here are some of the possible ways to mess up.
  1. Sloth - Lazy Data Collection:  Also known as GIGO (garbage in, garbage out), the first limit on analysis is the quality of the data.  It's easy to grab and use numbers that are there, rather than the numbers that you need.
  2. Negligence - Misapplied Analysis: It's easy to use an inappropriate technique - one that doesn't provide relevant results. (An example from one of my first stats classes - "The average American is 47% male and three days pregnant."  Think about it.)
  3. Gluttony - Too Many Reports: Too much information and too many tools can result in too much analysis - and the key results can get lost in the mix.
  4. Polemy - Data Definition, User Disagreements:  One of the problems with "Big Data" at this stage of development is that there are few widely-accepted metrics or analytics, which contributes to arguments over the definition and utility of data measures and procedures.
  5. Imprudence - Jumping to Conclusions: When you have a lot of data and a lot of results, it's easy for something to jump out and seem significant.  Stats people will remind you that even at a 95% confidence level, there's a 1 in 20 chance of a false positive (i.e., what you see isn't really there).  If you place too much importance on a single finding, without examining the broader context, definitions, or limits of measures, you could be making a big mistake.
  6. Pride - Decision-Driven Data Making:  If you look at Big Data to confirm your beliefs, it's easy to construe or manipulate things, even subconsciously.  You might define things a certain way, pick supportive data, manipulate datasets - all of which might bias the analysis to foster confirmation.  The true scientist looks for answers rather than confirmation, and is more likely to get to the truth (or reality or whatever).
  7. Torpor - Learning and Acting Slowly:  Historically, data and analysis have been delayed - collecting, publishing, and analyzing data took time (in academia, delays of 1-2 years in getting results out and being able to apply them are common).  However, the Internet, social media, and Big Data all race along in real time.  Being successful there requires collecting and examining data in real time, and being able to react quickly to what you see happening.  A delay can put you on the wrong end of the trend.
What all of these suggest is that when seeking to use Big Data or social media, you need to learn the lessons of data definition, data collection, and data analysis that decades of research and practice in other areas have yielded.  But you also have to be aware of the unique aspects of Big Data and social media - particularly the speed at which things happen online.  Being aware of the 7 Deadly Sins, and of how they might lead to poor results, can help.
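The "Imprudence" sin is worth quantifying: a 5% false-positive rate per test sounds small, but the chance of at least one spurious "finding" grows quickly as you run more tests against the same data. A minimal sketch (assuming independent tests, which real analyses rarely achieve, so treat it as a lower-bound intuition):

```python
# Probability of at least one false positive when running n independent
# significance tests, each at a 5% level (95% confidence).

def p_any_false_positive(n_tests, alpha=0.05):
    """1 minus the probability that all n tests correctly come up negative."""
    return 1 - (1 - alpha) ** n_tests

print(round(p_any_false_positive(1), 3))    # 0.05  - one test: the familiar 1-in-20
print(round(p_any_false_positive(20), 3))   # 0.642 - twenty tests: better than even odds
print(round(p_any_false_positive(100), 3))  # 0.994 - a spurious result is near-certain
```

This is why dashboards full of automatically-generated metrics will almost always contain something that "jumps out" - and why a single striking result should be checked against context before anyone acts on it.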

Sources - Why Big Data Will Deliver ROI for Social Business, The Brainyard
The Three Waves of Enterprise 2.0: Climbing the Social Computing Maturity Curve, ebizQ
Social business holds steady gap behind consumer social media, ZDNet
7 Deadly Sins of Big Data Users,  Information Week

Knight: The Skills Journalists Need and Want

The Knight Foundation recently released a report on what skills and training journalists think are valuable and needed in the new digital news environment.  The survey of 660 journalists, the majority from outside the U.S., found a strong interest in learning new skills.
  Digital skills topped the list, with more than three-quarters of respondents indicating that additional training in multimedia, new technologies, and data skills would be very beneficial.  Some 70% felt that leadership skills would be helpful, and 62% felt that additional training in the topics they cover would be beneficial.  It was somewhat troubling that roughly half felt a need for additional training in ethics (51%), basic reporting skills (46%), and traditional journalism skills (46%).  The report adds that non-U.S. journalists were much more likely to find training in traditional skills valuable (65%) than were U.S. journalists (19%).
  The report found that more than half of the respondents had taken an online or virtual class, a significant jump from the 5% reported in 2002.  Moreover, many felt that online training was effective -
84 percent of international journalists thought distance learning was better or about the same as classroom training; only 34 percent of U.S. journalists felt that way.
   The report also asked respondents about their level of job satisfaction.  Reporters overwhelmingly felt that their work contributed to society, even though they were much less satisfied with opportunities within their newsrooms. Combining the "Somewhat satisfied" responses with the dissatisfied ones, more than half weren't satisfied with training opportunities, chances for promotion, or their pay and benefits; some 45% expressed concern over job security; and a third weren't satisfied with how much influence they had over work decisions.

  The study not only asked about general satisfaction with training opportunities; it also asked respondents to grade their news organization's efforts in meeting training needs.
Fewer than four in 10 of the journalists who work in newsrooms give their organizations an A or B when it comes to meeting training needs. The majority, about six in 10, rank their news organizations as C or worse. Grades have gone steadily downward in the three Knight surveys, with a greater proportion of Cs, Ds and Fs this year than ever before.
  While the study didn't use a random sample, and heavily over-represented journalists from Latin America and participants in various Knight Foundation funded projects, the report argues that - while the exact percentages aren't generalizable - the basic results can provide insights for both the journalism profession and those interested in providing journalism education and training.
Professional development will play a key role in the transformation of the news landscape.
Not all news organizations will survive the transition to the digital age. The ones that make it will be nimble, adaptable. They’ll have learning cultures, where training is built into the daily routine.
 Source - Knight report on training shows journalists want technology, multimedia, data skills, Poynter
The full Knight Foundation report -  Digital Training Comes of Age

Monday, August 13, 2012

NBC leaves Olympics viewers outraged but satisfied

NBC's coverage of the London Olympics Closing Ceremony left at least two groups of viewers outraged.
  The first group ran afoul of NBC's online efforts - NBC streamed the Closing Ceremony live (using a generic English-language feed), but decided to delay its television coverage to US prime time, filling the afternoon broadcast feed with Olympic replays and commentary.  Purists wondered why NBC didn't put the live Ceremony on the main television feed, and then run a replay (or an Olympics highlights show) in prime time.
  It was a business decision, of course, but one that led directly to the second, much more significant problem.  NBC apparently thought that its Olympics audience would be a great place to preview a new Fall sitcom about veterinarians.  Originally set to come on after the closing ceremonies, NBC pulled a "Heidi" by deciding to interrupt the closing ceremonies to keep "Animal Practice" in its scheduled time slot.  In doing so, they cut out a number of performances.
The Twitter-sphere exploded, with “#NBCfail” and “#closingceremonies” trending worldwide, after NBC cut out performances by Ray Davies, Kate Bush, The Who, and Muse in favor of a commercial-free airing of “Animal Practice.”

“I still don’t understand, it’s a tape delay, so can’t you do the math in advance? Why do you need to cut off the closing ceremony? #nbcfail,” wrote Raj Sarkar on Twitter.
NBC Sports Group chairman Mark Lazarus, acknowledging earlier social media complaints, said that NBC was paying attention - while still managing to blame viewers.
“Some of the criticism I think was fair and we took note of it and learned from it,” Lazarus said, “but I think in general a lot of it came from people who weren’t fully aware of all of the things we were doing.”
While there were some issues - the tape delays, the at-times interminably chatty hosts and feel-good stories, and the choices of which events to cover on which channel - NBC's ratings, online and on TV, surpassed those of the 2008 Beijing Olympics, and a Pew study reported that about three-quarters of Americans watching the Olympics this year were generally satisfied with the TV coverage.

In any case, 2016 should be better - Rio's time zone is nearly aligned with the US East Coast, so far more events can air live.  And if NBC continues this year's trend of streaming all events, more of us will be able to control what we end up watching by choosing which streaming feeds to send to our screens, rather than relying on NBC's sometimes problematic choices.
 Sources -  NBC enrages Olympics viewers one last time,  Poynter
Viewers outraged after NBC cuts away from Olympics closing ceremony, World Sport News (from CNN)

Following A Multiscreen Olympics

Some Big Data metrics from Google illustrate the complementary nature of TV viewing and use of online and mobile devices.

A graph from Google tracks the spikes in searches for Paul McCartney, who concluded the Opening Ceremony with a rendition of "Hey Jude."
"One of the significant things to note about the spikes Google saw in online searches for the former Beatle is their timing. Spikes were detected concurring with the performance by those searching in Europe. In the United States, where NBC delayed the ceremony spikes in searches for the former Fab Four member synced up with the East Coast and Pacific replay of the performance."
Google also reported that at times, searches from mobile devices outpaced those from laptops or desktops.   In Japan, 55% of searches were from mobile devices; Great Britain (46%), Australia (45%) and Korea (36%) were the other countries with high mobile shares.

 Meanwhile, a Pew study suggests that Americans are increasingly using "second screens" to supplement their TV viewing.  When it comes to the Olympics, some 80% of respondents indicated they had watched some of the Games on TV (23% watched live coverage), while 17% reported following the Games online, and 12% said they used social media to follow the Games.  Respondents also rated the quality of coverage from the various sources positively, and at about the same level.

Sources -  Olympics-related search spikes reveal close tie between first, second screens, Broadcast Engineering
Americans augment Olympics TV viewing with social networks, online video, finds Pew, Broadcast Engineering

Going Off-Air with Local TV News

The latest RTNDA/Hofstra TV & Radio Newsroom study looked at the growing tendency for local TV newsrooms to look for other ways to distribute (and profit from) their content.  More than three-quarters of stations reported providing content from their newsroom to one or more other media (beyond their station’s newscasts or websites).
  The study found that, over all, there was no significant change in the proportion of newsrooms distributing content externally.  There were some small changes in the mix of channels/distribution preferences.
  Looking forward, more than 80% of local TV newsrooms were now (or would soon be) embracing the 3-Screen strategy of providing content on air, online, and to mobile.  This was higher than the proportion of stations reporting that they were broadcasting their local news in high definition (60%).

[Chart: percentage of newsrooms providing content to other outlets - another local TV station, a TV station in another market, a cable TV channel, local radio, a website (not their own), and mobile devices - shown for all TV stations and for Big 4 affiliates]

Interestingly, stations in the Top 50 markets were slightly less likely to seek additional distribution channels for their news than were outlets in smaller markets. 

The study also looked at newsrooms’ cooperative/collaborative ventures.  One recent trend is for some newsrooms to also produce the local news programs for another station in the market.  A different type of arrangement is to collaborate with other local media in news gathering and coverage.  The study found a small decline in the number of such agreements – 21.2% had agreements with another local TV station; 25.2% had arrangements with local newspapers; 23.8% were with local radio stations; and 4.6% had arrangements with other local news outlets.  Almost half (47.5%) had no collaborative or cooperative agreements with other local media outlets.
  Some of the agreements called for sharing pool video (33.6%) or use of a helicopter (15.1%) - but most were for sharing information (81.6%). Another 14.5% reported sharing other things, such as video, VO/SOTs, packages, live shots, and web content.  In these economic times, it was not surprising that almost half of Top 25 market stations had arrangements for sharing helicopters.

The report found that almost half of stations were using their newsroom content on their stations' extra digital channels.  TV newsrooms in 5.1% of stations were also programming and producing for all-news digital channels; 19.9% for weather channels (a decline from last year); and 26.3% provided content for digital channels affiliated with a network or carrying a mix of content.

Source  - RTNDA/Hofstra 2012 TV & Radio Newsroom Staffing and Profitability Study, Part III - TV News Business Isn't Limited to Just TV Anymore