P2P-Zone  

Old 01-01-14, 09:47 AM   #1
JackSpratts
 
 
Join Date: May 2001
Location: New England
Posts: 10,013
Peer-To-Peer News - The Week In Review - January 4th, 2014

Since 2002


"Rentrak, a firm that compiles box-office data, projected North American ticket sales would total $10.9 billion, a 1 percent increase from 2012." – Brooks Barnes and Michael Cieply


"Given that file sharing thus advances both of copyright's purported objectives, there would seem to be only one reasonable answer as to whether file sharing, at least with respect to music, should be legal under copyright law." – Glynn Lunney






































January 4th, 2014




Research: Illegal File Sharing Leads to More Hit Music
Andre Yoskowitz

According to a new research report published by Tulane University Law Professor Glynn Lunney, illegal file sharing has helped lead to more hit music for record labels and artists.

This stands in sharp contrast to the position of the RIAA and other trade groups, which claim that piracy leads to billions of dollars in losses every year.

Lunney says file-sharing actually encourages the distribution of existing music, and is in line with current copyright law. His latest research paper, "A Case Study of File Sharing and Music Output," looks into how file sharing leads to more hits for artists.

According to TorrentFreak, the paper "shows that the output from existing artists increased, while new artists appeared less frequently in the hit charts. However, since the new material from existing artists was greater than the loss from new artists, the 'creation' of new music increased overall."

Piracy did lead to lower revenue overall, but led to increased output from existing artists, which in turn led to more new hit songs. "Because the second marginal effect outweighed the first, file sharing, even assuming that it caused the decline in record sales, led to the creation, on balance, of more new hit songs," read the report.

Professor Lunney concludes that unauthorized file sharing should be legal under copyright law: "Given that file sharing thus advances both of copyright's purported objectives, there would seem to be only one reasonable answer as to whether file sharing, at least with respect to music, should be legal under copyright law."
http://www.afterdawn.com/news/articl...more_hit_music





HBO Still Doesn’t Get It: Game of Thrones Again the Most Torrented Show
Rachel Edidin

Three seasons in, Game of Thrones continues to set records — both legitimate and otherwise. After hitting ratings milestones earlier this year, it now has yet another accomplishment to boast about: the most pirated show of 2013.

This latest honor comes via TorrentFreak, which found that the Season 3 finale of the show had 5.9 million downloads via BitTorrent, beating other shows like Breaking Bad and The Walking Dead by large margins. However, the fact that millions of people are pirating Game of Thrones really isn’t the story here — or, if it is, it’s not a new one. What bears examining is the extent to which that piracy is a direct product of HBO’s policies — and the network’s staunch refusal to budge in the face of mounting evidence that their policy of avoiding third-party distribution to reinforce the value of their product is accomplishing just the opposite.

It’s not that HBO doesn’t acknowledge the volume of torrenting taking place. In fact, in August Jeff Bewkes, CEO of HBO parent Time Warner, called out piracy as “better than an Emmy” for stimulating interest in Game of Thrones and increasing the number of legitimate HBO subscriptions. Yet statements like that demonstrate that they don’t really seem to get how — or why — piracy is happening, or their role in that.

For the last decade evidence has mounted that, while some measure of media piracy is inevitable, it thrives on inconvenience: given the means, viewers will generally pay for a legitimate source, but only if they can get to it more easily than an illegal one. That’s the backbone of services like iTunes and Netflix: making those transactions easy and, critically, fast.

PiracyData.com, which tracks piracy statistics for movies, has found that the list of most-pirated movies in any given week has one factor overwhelmingly in common: There’s no way to legally stream them. That’s a more complicated conversation when applied to the established theatrical-release-home-release structure of movies. Television, on the other hand, is designed to be watched in the convenience of your home, and the success it’s found by way of streaming services reflects that. They give us a more convenient, selective way to interact with media in much the same way we would anyway.

With a show like Game of Thrones, the impetus to piracy is twofold. First, access: as Matthew Inman recently illustrated at The Oatmeal, unless you subscribe to HBO, there’s no legal way to get to Game of Thrones before it’s out on DVD; and in an age where à la carte viewership is becoming more of a norm, the idea of subscribing to an entire channel for one show seems ludicrous. Second, timeliness: conversations around entertainment have moved from the water cooler to the internet. Given the global conversation around each new episode of Game of Thrones, the week-long lag between domestic and international release is an inexcusable delay. (This lag could also explain why TorrentFreak found more than half of those season finale downloads happened in the first week after the show aired.) Even notwithstanding spoilers, viewership in the age of social media has become a social experience; it’s access to that experience as much as access to the show itself that spurs downloading.

This is the reality of the river that HBO is riding. The question isn’t whether viewers will continue to download Game of Thrones, it’s whether HBO will ever be smart enough to stop throwing away money fighting the tide.
http://www.wired.com/underwire/2013/...es-piracy-hbo/





Pirate Bay Uploads Spike 50 Percent, Thwarting Anti-Piracy Groups
Matt Smith

Piracy never changes. For years it has made copyright owners furious, and for years the efforts to stop it have fallen short. 2013 was no different.

The Pirate Bay has weathered nationwide ISP blockades, domain changes and the continuing imprisonment of co-founder Gottfrid Svartholm, posting a 50 percent increase in uploads over the last year. This raises the number of torrent files available on the site to 74,195 as of this writing, an all-time high.

The number of torrents indexed has reached a staggering 2.8 million, which are shared by over 18 million people if both seeds and leechers are included. About half of the sharing volume is devoted to video, followed by audio (at 17 percent) and porn (at 13 percent). Surprisingly, games and software make up only 5 percent of share volume each.

While The Pirate Bay remains healthy in spite of efforts by copyright holders, corporations and even nations to stop it, the war against piracy continues. The site spent much of the last month fighting a running battle against copyright holders, which resulted in several domain name changes as previous domains were seized, forcing the site to move.

The Pirate Bay hopes to thwart future domain seizures with a peer-to-peer browser (predictably named PirateBrowser) which will act as the site’s main hub and circumvent domain name seizures. Of course, the site’s enemies will no doubt try to find a way to limit the browser’s distribution. While uploads have grown, the drama surrounding the world’s most well-known source of torrents is certainly far from over.
http://www.thingamatech.com/computin...-piracy-groups





Google To Close Bump And Flock, Its Recently Acquired File Sharing Apps
Catherine Shu

Bump and Flock, the file sharing apps Google acquired last fall, will be shut down by the end of this month. Both apps will stop working and be removed from Google Play and the App Store on January 31, Bump confirmed on its blog today.

Google bought Bump Technologies, which made both apps, back in September, and Android Police reports that work on the app appeared to stop shortly after the acquisition.

Bump, which let users tap phones together to share contacts and other files, raised nearly $20 million and enjoyed high download rates, but failed to monetize successfully as other easy, mobile-friendly ways to share information were developed, most notably Apple’s AirDrop for iOS 7. Flock is a collaborative photo-sharing app Bump Technologies released in 2012.

As TechCrunch’s Josh Constine wrote in September, the sale wasn’t an acquihire, but Google might plan to turn Flock into part of Google+ in order to compete with Facebook’s photo sharing and Dropbox’s photo saving services, especially since Google+’s Party Mode, a photo sharing service based around events, failed to gain real traction. The acquisition of Bump Technologies also gave Google access to several mobile communication patents that could help it improve Android and create better alternatives to near-field communication (NFC).

When the startup announced its acquisition by Google, co-founder David Lieb said in a statement that “We strive to create experiences that feel like magic, enabled behind the scenes with innovations in math, data processing, and algorithms. So we couldn’t be more thrilled to join Google.” The acquisition price was undisclosed but sources told TechCrunch it was around $35 million, a relatively low amount considering how much funding Bump had raised. Bump’s investors included Y Combinator, Sequoia Capital, Felicis Ventures, SV Angel, Andreessen Horowitz, and many angels.
http://techcrunch.com/2013/12/31/goo...-sharing-apps/





Hypedmusic Shuts Down after Receiving Cease and Desist Notice from RIAA
Janko Roettgers

Free music streaming app Hypedmusic shut down this month after it got targeted by the RIAA for copyright infringement. Is this the start of a wider crackdown on music startups without licensing agreements?

Free music streaming app Hypedmusic shut down this month after receiving a cease and desist notice from the Recording Industry Association of America (RIAA). Luke Li, the developer of the app, explained in a blog post Tuesday that he always thought Hypedmusic was legal and protected under the Digital Millennium Copyright Act (DMCA), but that he simply didn’t have the resources to defend his point of view:

“I’m 18 years old, and I definitely do not want to get sued… Again, just to reiterate: I did not make HypedMusic with the intention of infringing copyright, I thought I was operating in a legal area after seeing different, large companies do similar things. Once I saw the RIAA’s email, I complied immediately.”

Hypedmusic was a web, Android and iOS app that allowed users to find and stream music hosted on services like Soundcloud, YouTube and Tumblr through a simple UI. Users could also generate playlists, which were synced across devices — think of it a bit like Spotify, but without any subscription costs.
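
For a sense of how thin the client layer of such an aggregator can be, here is a minimal, illustrative sketch in Python of the kind of third-party lookup an app like Hypedmusic performed: querying SoundCloud's public track-search API as it worked circa 2013. The endpoint, the stream_url field, and the placeholder CLIENT_ID are assumptions for illustration, not Hypedmusic's actual code.

```python
# Illustrative sketch only: how a circa-2013 aggregator could search
# SoundCloud-hosted tracks via the then-public API. CLIENT_ID is a
# placeholder for a registered API key, not a real credential.
import json
import urllib.parse
import urllib.request

CLIENT_ID = "YOUR_CLIENT_ID"  # hypothetical registered API key

def search_tracks(query, limit=5):
    """Return SoundCloud track metadata matching a free-text query."""
    params = urllib.parse.urlencode(
        {"q": query, "limit": limit, "client_id": CLIENT_ID})
    url = "https://api.soundcloud.com/tracks.json?" + params
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

for track in search_tracks("daft punk"):
    # The audio itself stays on SoundCloud's servers; the app merely
    # hands the stream_url (plus client_id) to its own audio player.
    print(track["title"], "->", track.get("stream_url"))
```

This is what made Li's DMCA argument plausible to him: the app held no recordings, only links to files hosted on services that themselves enjoy safe-harbor protection.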

Li assumed that the service itself was legal because the sites that were actually hosting the music are protected by the DMCA, and he offered copyright owners a way to remove their recordings from the Hypedmusic catalog. However, the RIAA disagreed, writing in an email to Li:

“By indexing, linking to, transmitting, retransmitting, providing access to, and/or otherwise assisting users in streaming and downloading infringing copies of sound recordings emanating from various unauthorized sources on the Internet, these applications are violating U.S. copyright law.”

The timing of the takedown is curious: Just last week, Ex.fm announced that it was shutting down its free music service as well, citing various “takedowns and legal emails” as one reason that made it challenging to keep the service up and running. At this point, it’s unclear whether Ex.fm was the target of a RIAA-issued cease and desist letter as well. An RIAA spokesperson wasn’t immediately available to comment on the matter when contacted for this story, and I have yet to hear back from the Ex.fm team.

However, even the possibility of a more concerted effort by the major labels to take down services that tap into music hosted on third-party platforms could spell trouble for the music startup scene, as this is how many companies build their first products.
http://gigaom.com/2013/12/31/hypedmu...ice-from-riaa/





Iron Maiden Story on File Sharing and Touring Turns Out To Be Bogus
Chad Bowar

Last week, a story made the Internet rounds (including on Loudwire) about how Iron Maiden used data on what countries were pirating their music the most to plan a lucrative South American concert tour. Turns out the story was not accurate.

Original source CITEworld has retracted their story, issuing this statement: “Due to writer error, an original version of this article stated that Iron Maiden used MusicMetric’s analysis to plan its South American tours. MusicMetric did not work directly with Iron Maiden. The analysis described in this article was carried out without the band’s participation or knowledge, and we have no confirmation that the band ever saw or used it. CITEworld deeply regrets this error, and we apologize to our readers.”

In a Nov. 29 article published in The Guardian, a MusicMetric official mentioned Iron Maiden’s BitTorrent data and how Brazil was one of the biggest file sharing nations. CEO Greg Mead was quoted in the article, saying the following:

“With their constant touring, [the] report suggests Maiden have been rather successful in turning free file-sharing into fee-paying fans. This is clear proof that taking a global approach to live touring can pay off, and that having the data to track where your fan bases lie will become ever more vital.”

That story morphed into Iron Maiden using the data to plan a South American tour, which was not the case. Numerous outlets (including Loudwire) picked up on the erroneous story, which quickly spread across the Web. Using piracy data to help plan tours may well become a common practice in the future, but there’s no confirmation that Iron Maiden used this method.
http://loudwire.com/iron-maiden-stor...g-turns-bogus/




Pirate Party and Anti-Piracy Outfits Get Permission to Spy on File-Sharers
Andy

Despite new legislation in Norway, not one site has been blocked nor a single file-sharer fined. However, behind the scenes there is a hive of activity, with more than a dozen entities now officially registered to spy on file-sharing networks. In addition to the usual anti-piracy groups, more unusual applications include those from the Pirate Party and a hip hop artist who wants to track down pirates and buy them coffee and cakes.

On July 1 a new anti-piracy law was passed in Norway that allows file-sharing sites to be blocked by local ISPs at the domain level.

The legislation also allows any rightsholder or group to spy on file-sharers providing they inform the country’s data inspectorate in advance. Over the past few months various outfits have been signing up, each with their own agenda for monitoring the Internet.

The MPA/MPAA, for example, have the infamous pirate-hunting law firm Simonsen scouring BitTorrent and other networks looking for people downloading and sharing Hollywood movies without permission.

As expected, the movie companies aren’t initially intending to use the information to launch a wave of lawsuits against individuals. Instead, the data will be used to justify site blockades, with The Pirate Bay front and center. Willy Johansen, Secretary General of the Norwegian Videograms Association, hopes that lawsuits against ISPs won’t be necessary.

“We want dialog, but if it does not succeed, we must consider other measures,” Johansen says.

Ragnar Bjerkreimselva, chairman of the Norwegian Society for Composers and Lyricists, also confirms that the public isn’t a target. “We are looking for the illegal services, we are not looking to go after our own audience,” he says.

A surprise addition to the list of Internet snoopers is the Pirate Party. They put in their notification to the data inspectorate in the same manner as the anti-piracy outfits but their agenda is somewhat different.

“We plan to monitor the IP addresses associated with the Prime Minister’s office to see if the Pirate Party’s program is copied,” the Party reveals.

The full list of organizations registered so far totals 13, the majority of which are anti-piracy groups. However, there is another interesting entry from Aslak Borgersrud, former member of hip-hop group Gatas Parlament.

“I would like to know who the pirates are that are downloading our records, so I can invite them for coffee and cakes,” he said.

Although Aslak doesn’t reveal how he will be going about that act of friendliness, at least one of the anti-piracy groups has spoken about their techniques. Surprisingly, Rights Alliance suggest that rather than participating in swarms themselves, they intend to scrape information from BitTorrent trackers.

“The tracker reveals who is breaking the law,” says Rune Ljøstad of the Simonsen law firm.

For the purposes of general data collection, tracker scraping is probably accurate enough, but if the group wished to progress to chasing down individuals the technique is flawed. There are various techniques for injecting fake IP addresses into tracker reports, which can cause all sorts of difficulties (and provide defenses for those accused), something to consider if the studios carry out their veiled threats.
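
To make the mechanics concrete, here is a minimal sketch in Python of a tracker query under the standard BitTorrent HTTP announce protocol (BEP 3, with the compact peer format of BEP 23). The tracker URL, info-hash, and peer ID are placeholders, and this illustrates the protocol rather than any outfit's actual tooling; note that the protocol also accepts an optional, client-supplied ip field, which is exactly the loophole that lets fake addresses pollute tracker data.

```python
# Illustrative sketch: fetching a swarm's peer list from a BitTorrent
# tracker's HTTP announce endpoint. All identifiers below are placeholders.
import socket
import struct
import urllib.parse
import urllib.request

def bdecode(data, i=0):
    """Minimal bencode decoder; returns (value, next_index)."""
    c = data[i:i+1]
    if c == b"i":                        # integer: i<digits>e
        end = data.index(b"e", i)
        return int(data[i+1:end]), end + 1
    if c in (b"l", b"d"):                # list or dict
        i += 1
        items = []
        while data[i:i+1] != b"e":
            v, i = bdecode(data, i)
            items.append(v)
        if c == b"l":
            return items, i + 1
        return dict(zip(items[0::2], items[1::2])), i + 1
    colon = data.index(b":", i)          # byte string: <length>:<bytes>
    n = int(data[i:colon])
    return data[colon+1:colon+1+n], colon + 1 + n

TRACKER = "http://tracker.example.org:6969/announce"   # placeholder URL
INFO_HASH = bytes.fromhex("00" * 20)                   # placeholder hash
PEER_ID = b"-XX0001-abcdefghijkl"                      # arbitrary 20 bytes

# This is the same request every ordinary client sends when joining a
# swarm. The spec also allows an optional "ip" field, which clients can
# set to any address -- the root of the fake-IP pollution problem.
query = urllib.parse.urlencode({
    "info_hash": INFO_HASH, "peer_id": PEER_ID, "port": 6881,
    "uploaded": 0, "downloaded": 0, "left": 0, "compact": 1,
})
with urllib.request.urlopen(TRACKER + "?" + query) as resp:
    reply, _ = bdecode(resp.read())

# In compact form, "peers" packs one peer per 6 bytes:
# a 4-byte IPv4 address followed by a 2-byte big-endian port.
peers = reply.get(b"peers", b"")
for off in range(0, len(peers), 6):
    ip = socket.inet_ntoa(peers[off:off+4])
    (port,) = struct.unpack("!H", peers[off+4:off+6])
    print(ip, port)
```

Because the tracker simply records whatever clients announce, nothing in this exchange verifies that a reported address actually participated in the swarm, which is why tracker-derived lists make weak evidence against individuals.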

“We have already begun efforts to collect the IP addresses of people who use pirate sites,” says Willy Johansen.

“We collect only the information, and so far we have not gone to court to demand to know the identity of those involved with this. But it may be appropriate to do that later.”
http://torrentfreak.com/pirate-party...harers-131228/





The Year We Broke the Internet

An explanation. An apology. A plea.
Luke O'Neil

As winter storms were buffeting parts of the country last week, our collective attention was drawn halfway around the world to Egypt. Images of the pyramids and the Sphinx covered in snow had emerged, and were being shared tens of thousands of times on Facebook and Twitter. It wasn’t hard to see why. For some, sharing the photos was a statement on global warming. For others, sharing was about the triumph of discovery, making them proud housecats dropping a half-chewed mouse of news on the Internet’s doorstep. For most, however, the photos were just another thoughtlessly processed and soon-forgotten item that represented our now-instinctual response to the unrelenting stream of information we’re subjected to every waking hour: Share first, ask questions later. Better yet: Let someone else ask the questions. Better still: What was the question again?

Needless to say, the photos were bullshit.

It’s hard not to note the tidy symbolism here. The Internet, like the Sphinx, is a ravenous beast that eats alive anyone who can’t answer its hoary riddle. We in the media have been struggling for twenty years to solve that riddle, and this year, the answer arrived: Big Viral, a Lovecraftian nightmare that has tightened its thousand-tentacled grip on our browsing habits with its traffic-at-all-costs mentality—veracity, newsworthiness, and relevance be damned. We solved the riddle, and then we got eaten anyway.

The Egypt photos weren’t the only viral hoax to hijack the social media conversation in the past month. Of the others, the most infamous was reality-TV producer Elan Gale’s in-flight pissing match with a fellow passenger, which he documented on Twitter, and which was shepherded along by BuzzFeed to the delight of hundreds of thousands of onlookers. That it was actually a prank rankled some, but even that turned out to be a boon for the sites that shared it: They got the clicks coming and going, both on the ramp-up and in the reveal. The story may well have been, in the words of Slate’s Dave Weigel, “the sort of shoddy reporting that would get a reporter at a small newspaper fired,” but it was also a perfect microcosm of the way the Internet works now.

“We’re not in the business of publishing hoaxes,” BuzzFeed’s news editor wrote in response to Weigel’s piece, “and we feel an enormous responsibility here to provide our readers with accurate, up-to-date information”—which sounds a bit like Altria’s health inspector saying they’re sorry they gave you cancer.

The fact is, that sort of double-dipping is what most of us who produce Internet content do, myself included. Give me the viral pictures, and I’ll give you the truth. And then, after an appropriate waiting period, I’ll give you the other truth, and capitalize on that traffic too. It’s almost a perfect callback to William Randolph Hearst’s infamous declaration on the eve of the Spanish-American War, “You furnish the pictures and I’ll furnish the war.” Even more fitting, historians don’t think he ever said anything like that. Then as now, it’s the myth that plays, not the reality. Today it just plays on an exponentially larger stage.

The media has long had its struggles with the truth—that’s nothing new. What is new is that we’re barely even apologizing for increasingly considering the truth optional. In fact, the mistakes, and the falsehoods, and the hoaxes are a big part of a business plan driven by the belief that big traffic absolves all sins, that success is a primary virtue. Haste and confusion aren’t bugs in the coding anymore, they’re features. Consider what Ryan Grim, Washington bureau chief for the Huffington Post, told The New York Times in its recent piece on a raft of hoaxes, including Gale’s kerfuffle, a child’s letter to Santa that included a handwritten Amazon URL, and a woman who wrote about her fictitious poverty so effectively that she pulled in some $60,000 in online donations. “The faster metabolism puts people who fact-check at a disadvantage,” Grim said. “If you throw something up without fact-checking it, and you’re the first one to put it up, and you get millions and millions of views, and later it’s proved false, you still got those views. That’s a problem. The incentives are all wrong.”

In other words, press “Publish” or perish.

And so, to our year of bungles: the New Jersey waitress who received a homophobic comment on the receipt from a party she had served; comedian Kyle Kinane’s Twitter beef with Pace Salsa; the Chinese husband who sued his wife for birthing ugly children after he learned she’d had plastic surgery; Samsung paying Apple $1 billion in nickels; former NSA chief Michael Hayden’s assassination; #CutForBieber; the exquisite, otherworldly weirdness of the @Horse_ebooks Twitter account; Nelson Mandela’s death pic; that eagle snatching a child off the ground on YouTube; Jimmy Kimmel’s “twerk fail” video; Sarah Palin taking a job with Al-Jazeera America (an obviously satirical story that even suckered in The Washington Post)…

These all had one thing in common: They seemed too tidily packaged, too neat, “too good to check,” as they used to say, to actually be true. Any number of reporters or editors at any of the hundreds of sites that posted these Platonic ideals of shareability could’ve told you that they smelled, but in the ongoing decimation of the publishing industry, fact-checking has been outsourced to the readers. Not surprisingly—as we saw with the erroneous Reddit-spawned witch-hunt around the Boston Marathon bombing—readers are terrible at fact-checking. And this, as it happens, is good for business because it means more shares, more clicks.

This is not a glitch in the system. It is the system. Readers are gullible, the media is feckless, garbage is circulated around, and everyone goes to bed happy and fed. BuzzFeed’s Jonah Peretti admitted as much when explaining that, when he’s hiring, he looks for “people who really understand how information is shared on Twitter and Facebook and Instagram and other emerging platforms, because that is in some cases as important as, you know, having traditional reporting talent.” Upworthy editorial director Sara Critchfield seconded the notion. “We reject the idea that the media elite or people who have been trained in a certain way somehow have the monopoly on editorial judgment.”

That mastery of social media platforms is imperative, because this was the year someone isolated the DNA of the viral story, and the world—hearts full of wonder, hope, and maybe a little fear—saw for the first time the awesome potential of viral content. Earlier this month, NewsWhip posted statistics about the most frequently shared sites online. Among the more revealing graphs was one tabulating Facebook shares per article by domain. The leader, Upworthy, had about seventy-five thousand likes per article, twelve times more than fourth-place BuzzFeed. Among them were posts like the one whose headline misleadingly claimed doctors were injecting HIV into a dying girl to treat her cancer.

It worked. Upworthy is now “so much more dominant than other news sites on Facebook,” wrote The Atlantic’s Derek Thompson, “that when you graph its Facebook-shares-per-article, it looks like a skyscraper dropped into a desert.”

To be more specific, it looks like the original skyscraper in the original desert. Like our forebears, we too speak a common tongue in erecting our Tower of Babel, but ours is a language of shares and clicks and uniques. Gaze upon our collected works and despair.

On second thought, don’t despair. It’s bad for shareability.

My hands are certainly not clean in all this. While I’ve primarily made my living as a (mostly) upstanding freelancer for dailies, weeklies, and glossies for a decade or so, like many others, I’ve also had to moonlight as a content generator for a wide array of websites—some high-minded and journalistic, others less so. I typically publish about twenty-five pieces a week, from reported features to tossed-off reaction blogs, and the churn-and-burn pace of daily writing has led to my passing along some pretty sketchy nonsense this past year—from the pastor who stiffed an Applebee’s waitress, to Lars von Trier’s Nymphomaniac trailer being accidentally shown to a roomful of kids waiting for a Disney movie, to the comedian who live-tweeted a breakup on an apartment roof, to any number of speculative gossip pieces about Amanda Bynes, Kanye West, Miley Cyrus, et al., which I often didn’t bother to double-check before firing off. Why did I do it? Because I knew everyone else was going to, and I wanted to siphon off the traffic. Like buying into an insurance plan, it’s a pooled-risk version of Internet writing: The sheer ubiquity of suspect stories provides cover for all of us.

Perhaps worse, I’m ashamed to say, I sometimes like it. Watching the social media shares multiply exponentially on something I posted triggers the pleasure center of the brain like a stiff rip of cocaine. The opposite is true as well. When I file a piece I like and no one shares it, I sink into a serotonin-deficit that’s hard to climb out of.

Media malpractice like this didn’t trigger the collapse of traditional revenue models, but it’s hastening the job. Everyone wants everything for free now—news, music, movies, etc.—which means the companies don’t have any money to pay people to produce original work. None of this is anything you haven’t heard before, but it bears repeating. In order to make a living, those of us who had the bad sense to shackle ourselves to a career in media before that world ended have to churn out more content faster than ever to make up for the drastically reduced pay scale. We’re left with the choice of spending a week reporting a story we’re actually proud of (as I do just frequently enough to ensure a somewhat restful sleep every other night), reaping a grand sum of somewhere in the ballpark of two hundred to five hundred dollars if we’re lucky, or we can grind out ten blog posts at twenty-five to fifty bucks a pop that take fifteen minutes each. That means the work across the board ends up being significantly more disposable, which in turn makes the readers value it less, which means they want to pay less for it, and so on. It’s an ouroboros of shit.

Among all the things I’ve written this year, the ones that took the least amount of time and effort usually did the most traffic. The more in-depth, reported pieces didn’t stand a chance against riffs on things predestined to go viral. That’s the secret that Upworthy, BuzzFeed, MailOnline, Viral Nova, and their dozens of knockoffs have figured out: You don’t need to write anymore—just write a good headline and point. If what you’re pointing at turns out to be a steaming turd, well, then repackage the steam and sell it back to us.

This conflation of newsiness with news, share-worthiness with importance, has wreaked havoc on the media’s skepticism immune systems. It didn’t happen out of nowhere; it’s a process that’s been midwifed by the willful blurring of the lines between fact and fiction on the part of a key group of influential sites that have, unfortunately, established a viable financial model amid the wreckage of traditional media. It’s why companies are so eager to shuffle native ads—content produced to appear as if it were a site’s regular content—into the regular mix. They’re hoping we won’t know the difference. They’re right, we often don’t. That’s part of the reason native advertising revenues are up 77 percent this year, according to a new study by BIA/Kelsey. There are practically no consequences anymore.

As Big Viral gets bigger, traditional media organizations are scrambling to keep pace. We’re seeing the BuzzFeedification of the entire spectrum of the media—even The New York Times isn’t immune anymore, having recently inked a content-sharing deal with BuzzFeed (it was reportedly the Times’ idea). You won’t find a major publication without a section on its site devoted to sharing questionable viral content. It’s been particularly disheartening to watch Gawker, a site I’ve read every day for years (and, disclosure: submitted a piece to recently), home to many fine, skeptical cranks, slouch into the viral morass. This magazine’s website has posted some goofy stuff too. I’m sorry, but it has. They all have.

“This isn’t a new model in journalism,” the Wall Street Journal’s Farhad Manjoo noted in his profile of Gawker’s Neetzan Zimmerman, reigning king of viral content. “Bundling the cheap, revenue-generating content with expensive, high-minded content is how newspapers made money for decades—but it has now become the touchstone model of the Web, in use at Gawker, BuzzFeed, the Huffington Post, and dozens of smaller sites.”

This is a common refrain—Yeah, but they do some good long-form journalism. And it’s true, I’ve read some good reporting out there—but on the other hand, after the Army blows up a village they come back around with a couple of sacks of rice to smooth over the damage. The fact is, you cannot justify quality reporting produced from the spoils of the opposite. Journalism does not provide for such leeway. It’s better for a hundred quality stories to go unposted than to let one knowingly false one see the light of day. At the risk of sounding like the boy who cried click-bait, I’m warning you: One of these days a viral hoax is going to come along that we really should pay attention to, and our guards will be down because we’ve become conditioned to lump all information together into the LOL and #feelings files. And one of these days a fake news story is going to have some serious real world consequences too, something like the San Francisco elementary school that was widely attacked by people who’d mistaken for real news a satirical National Report article about a student supposedly suspended for wishing a Merry Christmas to an atheist teacher.

As much as media companies might want to erect barriers between their entertainment and news sides (as Fox News often claims about their editorializing hosts), the average reader or viewer doesn’t register any meaningful distinction between BuzzFeed News and BuzzFeed WTF, or whatever the fck it is, or the Huffington Post’s front page and its “30 Racist Side Boobs” slideshow. This becomes even more convoluted when a story goes viral and it’s received with not only the imprimatur of the site of origin, but also the thousands and thousands of implicit endorsements by the people who deigned to share it. When it finds its way to us, we think, There it is in my feed, my newsfeed—next to ostensibly reliable accounts from The New York Times, the BBC, and others—and we consume.

Yes, newspapers have long printed lifestyle puff pieces next to hard news, but the analogy between that practice and the current model doesn’t hold. As someone who’s written hundreds of newspaper entertainment pieces in my day, I can tell you they still, thankfully, do not take inaccuracies lightly, even minor ones. And as someone who’s written hundreds of hacky blog posts, I can tell you that it’s a practice that rots your guts from inside. Trust me.

Actually, don’t trust me—that’s the entire point. We the media have betrayed your trust, and the general public has taken our self-sanctioned lowering of standards as tacit permission to lower their own.

That may sound fatalistic, but I say this because I love the Internet, not because I hate it. No one is suggesting we need to drain all of the fun out of everything—diversions are a huge part of what makes the vast teeming wonder of the web such a joy to behold. But there’s an infinite expanse of information about things that actually exist out there just waiting for us to share them. Why would we take that wealth for granted and resort to passing along things we know—or can easily find—to be false?

Like Odysseus, we’ve plowed our field with salt (to use another example of over-reported fiction). If you remember the myth, he was pretending to be insane to get out of an oath he’d made, now that it was inconvenient and not in his interest. It wasn’t until Palamedes placed his beloved son in front of the plow that Odysseus came to his senses and remembered he had something to fight for.
http://www.esquire.com/blogs/news/we-broke-the-internet





We’re About to Lose Net Neutrality — And the Internet as We Know It
Marvin Ammori

Net neutrality is a dead man walking. The execution date isn’t set, but it could be days, or months (at best). And since net neutrality is the principle forbidding huge telecommunications companies from treating users, websites, or apps differently — say, by letting some work better than others over their pipes — the dead man walking isn’t some abstract or far-removed principle just for wonks: It affects the internet as we all know it.

Once upon a time, companies like AT&T, Comcast, Verizon, and others declared a war on the internet’s foundational principle: that its networks should be “neutral” and users don’t need anyone’s permission to invent, create, communicate, broadcast, or share online. The neutral and level playing field provided by permissionless innovation has empowered all of us with the freedom to express ourselves and innovate online without having to seek the permission of a remote telecom executive.

But today, that freedom won’t survive much longer: a federal court — the second most powerful court in the nation behind the Supreme Court, the DC Circuit — appears set to strike down the nation’s net neutrality law, a rule adopted by the Federal Communications Commission in 2010. Some will claim the new solution “splits the baby” in a way that somehow doesn’t kill net neutrality and so we should be grateful. But make no mistake: Despite eight years of public and political activism by multitudes fighting for freedom on the internet, a court decision may soon take it away.

Game of Loopholes and Rules

How did we get here?

The CEO of AT&T told an interviewer back in 2005 that he wanted to introduce a new business model to the internet: charging companies like Google and Yahoo! to reliably reach internet users on the AT&T network. Keep in mind that users already pay to access the internet and that Google and Yahoo! already pay other telecom companies — often called backbone providers — to connect to these internet users. [Disclosure: I have done legal work for several companies supporting network neutrality, including Google.]

But AT&T wanted to add an additional toll, beyond what it already made from the internet. Shortly after that, a Verizon executive voiced agreement, hoping to end what he called tech companies’ “free lunch”. It turns out that around the same time, Comcast had begun secretly trialing services to block some of the web’s most popular applications that could pose a competitive threat to Comcast, such as BitTorrent.

Yet the phone and cable companies tried to dress up their plans as a false compromise. Counterintuitively, they supported telecommunications legislation in 2006 that would authorize the FCC to stop phone and cable companies from blocking websites.

There was a catch, however. The bills included an exception that swallowed the rule: the FCC would be unable to stop cable and phone companies from taxing innovators or providing worse service to some sites and better service to others. Since we know internet users tend to quit using a website or application if it loads even just a few seconds slower than a competitor’s version, this no-blocking rule would essentially have enabled the phone and cable companies to discriminate by picking website/app/platform winners and losers. (Congress would merely enact the loophole. Think of it as a safe harbor for discriminating online.)

Luckily, consumer groups, technology companies, political leaders, and American citizens saw through the nonsense and rallied around a principle to preserve the internet’s openness. They advocated for one simple, necessary rule — a nondiscrimination principle that became known as “network neutrality”. This principle would forbid phone and cable companies not only from blocking — but also from discriminating between or entering in special business deals to the benefit of — some sites over others.

Both sides battled out the issues before Congress, federal agencies, and in several senate and presidential campaigns over the next five years. These fights culminated in the 2010 FCC decision that included the nondiscrimination rule.

Unfortunately, the rule still had major loopholes — especially when it came to mobile networks. It also was built, to some extent, on a shaky political foundation because the then-FCC chairman repeatedly folded when facing pressure. Still, the adopted rule was better than nothing, and it was a major advance over AT&T’s opening bid in 2005 of a no-blocking rule.

As a result, Verizon took the FCC to court to void the 2010 rule, attacking the part that forbids carriers from discriminating among websites and applications; from setting up, on what we once called the information superhighway, the equivalents of tollbooths, fast lanes, and dirt roads.

There and Back Again

So that’s where we are today — waiting for the second most powerful court in the nation, the DC Circuit, to rule in Verizon’s case. During the case’s oral argument, back in early September, corporate lobbyists, lawyers, financial analysts, and consumer advocates packed into the courtroom: some sitting, some standing, some relegated to an overflow room.

Since then, everyone interested in internet freedom has been waiting for an opinion — including everyday folks who search the web or share their thoughts in 140 characters; and including me, who argued the first (losing) network neutrality case before the DC Circuit in 2010.

But, in their questions and statements during oral argument, the judges made clear how they plan to rule — for the phone and cable companies, not for those who use the internet. While the FCC has the power to impose the toothless “no-blocking” rule (originally proposed by AT&T above), it does not (the court will say) have the power to impose the essential “nondiscrimination” rule.

It looks like we’ll end up where AT&T initially began: a false compromise.

The implications of such a decision would be profound. Web and mobile companies will live or die not on the merits of their technology and design, but on the deals they can strike with AT&T, Verizon, Comcast, and others. This means large phone and cable companies will be able to shake down startups and established companies in every sector, requiring payment for reliable service. In fact, during the oral argument in the current case, Verizon’s lawyer said, “I’m authorized to state from my client today that but for these [FCC] rules we would be exploring those types of arrangements.”

Wait, it gets even worse. Pricing isn’t even a necessary forcing factor. Once the court voids the nondiscrimination rule, AT&T, Verizon, and Comcast will be able to deliver some sites and services more quickly and reliably than others for any reason. Whim. Envy. Ignorance. Competition. Vengeance.

Whatever. Or, no reason at all.

So what if you’ve got a great new company, an amazing group of founders, a seat in a reputable accelerator program, great investors and mentors? Under the permission-based innovation over “our pipes” desired by the likes of Comcast, Verizon and AT&T… there’s no meritocracy here.

Of course, despite everything the judges suggested during the two-hour argument, it’s possible that they offer net neutrality a reprieve. Given how sticky this morass is, there’s one simple way for you to judge the opinion: If the court throws out the nondiscrimination rule, permissionless innovation on the internet as we know it is done. If the nondiscrimination rule miraculously survives, then, for now at least, so too will freedom on the internet.
http://www.wired.com/opinion/2013/11...net-neutrality





U.S. Struggles to Keep Pace in Delivering Broadband Service
Edward Wyatt

San Antonio is the seventh-largest city in the United States, a progressive and economically vibrant metropolis of 1.4 million people sprawled across south-central Texas. But the speed of its Internet service is no match for the Latvian capital, Riga, a city of 700,000 on the Baltic Sea.

Riga’s average Internet speed is at least two-and-a-half times that of San Antonio’s, according to Ookla, a research firm that measures broadband speeds around the globe. In other words, downloading a two-hour high-definition movie takes, on average, 35 minutes in San Antonio — and 13 in Riga.

And the cost of Riga’s service is about one-fourth that of San Antonio.
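
Those times are consistent with the average speeds reported later in the piece, roughly 16 megabits a second in San Antonio versus 42 in Riga, if we assume a two-hour HD movie of around 4.2 GB (the article doesn't give a file size, so that figure is an assumption). A quick back-of-the-envelope check in Python:

```python
# Back-of-the-envelope check of the article's download times.
# The ~4.2 GB movie size is an assumption; the speeds are the city
# averages cited in the article.
MOVIE_GB = 4.2
movie_megabits = MOVIE_GB * 8000  # 1 GB ~= 8,000 megabits

for city, mbps in [("San Antonio", 16), ("Riga", 42)]:
    minutes = movie_megabits / mbps / 60
    print(f"{city}: about {minutes:.0f} minutes")
# -> San Antonio: about 35 minutes; Riga: about 13 minutes
```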

The United States, the country that invented the Internet, is falling dangerously behind in offering high-speed, affordable broadband service to businesses and consumers, according to technology experts and an array of recent studies.

In terms of Internet speed and cost, “ours seems completely out of whack with what we see in the rest of the world,” said Susan Crawford, a law professor at Yeshiva University in Manhattan, a former Obama administration technology adviser and a leading critic of American broadband.

The Obama administration effectively agrees. “While this country has made tremendous progress investing in and delivering high-speed broadband to an unprecedented number of Americans, significant areas for improvement remain,” said Tom Power, deputy chief technology officer for telecommunications at the White House.

The disagreement comes over how far behind the United States really is in what many people consider as basic a utility as water and electricity — and how much it will affect the nation’s technological competitiveness over the next decade. “There aren’t any countries ahead of us that have a comparable population distribution,” said Richard Bennett, a visiting fellow at the American Enterprise Institute, who said that the United States was closing the gap.

But as the Obama administration warned in a report this year: “To create jobs and grow wages at home, and to compete in the global information economy, the delivery of fast, affordable and reliable broadband service to all corners of the United States must be a national imperative.”

The World Economic Forum ranked the United States 35th out of 148 countries in Internet bandwidth, a measure of available capacity in a country. Other studies rank the United States anywhere from 14th to 31st in average connection speed.

Generally, fast broadband is considered anything above 10 megabits a second.

In Riga, speeds average 42 megabits a second, but many users had service of 100 to 500 megabits as of mid-December, according to Ookla. In San Antonio, broadband speeds average about 16 megabits a second. While higher speeds are available through cable television or phone companies, the expense is such that many households in the city cannot afford a connection.

Those faster speeds can mean the difference between thriving and surviving. For Kosmodroms Ltd., a web design and video production studio in Riga, that high-speed connection lets it transfer huge files of video or photos in minutes.

With broadband of only a few megabits a second, it would take so long to transmit the files that the company would be better off delivering them physically, on a disk or thumb-drive, said Agnese Krievkalne, a company director.

Nils Usakovs, the mayor of Riga, said that when private investors started to build Internet infrastructure in the city, no systems were in place, so the builders were able to install the latest, fastest communications technology. “We’re the capital of a European Union member country, bordering with Russia,” Mr. Usakovs said. “The technology makes this an even more attractive place to invest.”

Leticia Ozuna, a former San Antonio councilwoman who worked on the municipal broadband effort, said that in her former district in South San Antonio, some 70 percent of households had no Internet service. Often, she added, students gather at night in the parking lot of the Mission Branch Public Library to do homework using the library’s free Wi-Fi connection, long after the library itself has closed.

San Antonio’s power company has a largely unused fiber-optic network that local government offices have been using for high-speed Internet service for years, but a Texas law prevents the city from using the network to give low-cost service to consumers.

Fast broadband, said Ron Nirenberg, a San Antonio city councilman, “should be inherent in a 21st-century city.”

There is ample evidence that faster broadband spurs economic growth. The White House cites a study of 33 of the largest national economies worldwide, which found that from 2008 to 2010, doubling a country’s broadband speed increased gross domestic product by 0.3 percent. In its report, “Four Years of Broadband Growth,” the Obama administration says that since 2002, Internet access has contributed an average of $34 billion a year to the economy, or 0.26 percent of G.D.P. growth.
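
Those last two figures line up if the 0.26 percent is read against total GDP rather than against the annual growth increment: $34 billion is roughly 0.26 percent of a US economy in the $13 trillion range. A quick sanity check, with the GDP denominator as an assumption:

```python
# Sanity check: $34 billion as a share of total US GDP.
# The ~$13 trillion GDP figure is a rough assumption for 2002-2013.
contribution = 34e9   # dollars per year, per the White House report
gdp = 13e12           # approximate US GDP over the period (assumption)
print(f"{contribution / gdp:.2%}")  # -> 0.26%
```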

There is some doubt, however, about how much of that benefit flows to average citizens. The Public Policy Institute of California reported in 2010 that broadband expansion did not appear to affect average wages or the employment rate.

Ms. Crawford, who is also a co-director of the Berkman Center for Internet and Society at Harvard, said that American cities should take on some of the responsibility for building fiber-optic networks and providing broadband service. It is a necessity similar to electricity, she said, “something that no neighborhood or private company would have an incentive to provide on its own to everyone at reasonable prices.”

In the United States, speeds vary widely between cities and regions. The fastest speeds are in the Northeastern corridor between Boston and the Washington, D.C., metropolitan region. The three fastest areas — D.C., Massachusetts and Virginia — have average speeds greater than every country except Japan and South Korea.

Some American cities have such superfast broadband that if they were ranked against foreign countries, several, like Bristol, Va., Chattanooga, Tenn., and Lafayette, La., would rank in the top 10.

Those three cities built municipal fiber-optic networks, and those networks can operate just as fast as the swiftest connections in Hong Kong, Seoul and Tokyo. But those speeds can come at a very high price. In Chattanooga, Internet service of 1 gigabit a second costs a consumer $70. But in Lafayette, the same speed costs nearly $1,000 a month. In Seoul, it’s about $31 — a result of government subsidies to encourage Internet use.

Even if the United States is improving its worldwide standing, some analysts question the logic of focusing on what country ranks where. “Some people like to look at it as a horse race,” said Harold Furchtgott-Roth, a senior fellow at the Hudson Institute, “but I’m not sure that’s the right way to look at it.” He added, “We’re not at the starting gate, we’re not at the finish line. We’re somewhere in the middle of the race.”
http://www.nytimes.com/2013/12/30/te...d-service.html





U.S. to China: We Hacked Your Internet Gear We Told You Not to Hack
Cade Metz

The headline news is that the NSA has surreptitiously “burrowed its way into nearly all the security architecture” sold by the world’s largest computer networking companies, including everyone from U.S. mainstays Cisco and Juniper to Chinese giant Huawei. But beneath this bombshell of a story from Der Spiegel, you’ll find a rather healthy bit of irony.

After all, the United States government has spent years complaining that Chinese intelligence operations could find ways of poking holes in Huawei networking gear, urging both American businesses and foreign allies to sidestep the company’s hardware. The complaints grew so loud that, at one point, Huawei indicated it may abandon the U.S. networking market altogether. And, yet, Der Spiegel now tells us that U.S. intelligence operations have been poking holes in Huawei networking gear — not to mention hardware sold by countless other vendors both in the States and abroad.

“We read the media reports, and we’ve noted the references to Huawei and our peers,” says William Plummer, a Huawei vice president and the company’s point person in Washington, D.C. “As we have said, over and over again — and as now seems to be validated — threats to networks and data integrity can come from any and many sources.”

Plummer and Huawei have long complained that when the U.S. House Intelligence Committee released a report in October 2012 condemning the use of Huawei gear in telephone and data networks, it failed to provide any evidence that the Chinese government had compromised the company’s hardware. Adam Segal, a senior fellow for China Studies at the Council on Foreign Relations, makes the same point. And now we have evidence — Der Spiegel cites leaked NSA documents — that the U.S. government has compromised gear on a massive scale.

“Do I see the irony? Certainly the Chinese will,” Segal says, noting that the Chinese government and the Chinese press have complained of U.S. hypocrisy ever since former government contractor Edward Snowden first started to reveal NSA surveillance practices last summer. “The Chinese government has been hammering home what they call the U.S.’s ulterior motives for criticizing China, and there’s been a steady drumbeat of stories in the Chinese press about backdoors in the products of U.S. companies. They’ve been going after Cisco in particular.”

To be sure, the exploits discussed by Der Spiegel are a little different from the sort of attacks Congress envisioned during its long campaign against Huawei and ZTE, another Chinese manufacturer. As Segal and others note, Congress mostly complained that the Chinese government could collaborate with people inside the two companies to plant backdoors in their gear, with lawmakers pointing out that Huawei’s CEO was once an officer in China’s People’s Liberation Army, or PLA, the military arm of the country’s Communist party. Der Spiegel, by contrast, says the NSA is exploiting hardware without help from anyone inside the Ciscos and the Huaweis, focusing instead on compromising network gear with clever hacks or intercepting the hardware as it’s shipped to customers.

“For the most part, the article discusses typical malware exploits used by hackers everywhere,” says JR Rivers, an engineer who has built networking hardware for Cisco as well as Google and now runs the networking startup Cumulus Networks. “It’s just pointing out that the NSA is engaged in the practice and has resources that are not available to most people.”

But in the end, the two types of attack have the same result: Networking gear controlled by government spies. And over the last six months, Snowden’s revelations have indicated that the NSA is not only hacking into networks but also collaborating with large American companies in its hunt for data.

Jim Lewis, a director and senior fellow with the Center for Strategic and International Studies, adds that the Chinese view state-sponsored espionage a little differently than the U.S. does. Both countries believe in espionage for national security purposes, but the Chinese argue that such spying might include the theft of commercial secrets.

“The Chinese will tell you that stealing technology and business secrets is a way of building their economy, and that this is important for national security,” says Lewis, who has helped oversee meetings between the U.S. and the Chinese, including officers in the PLA. “I’ve been in the room when they’ve said that. The last time was when a PLA colonel said: ‘In the U.S., military espionage is heroic and economic espionage is a crime. In China, the line is not that clear.’”

But here in the United States, we now know, the NSA may blur other lines in the name of national security. Segal says that although he, as an American, believes the U.S. government is on stronger ethical ground than the Chinese, other nations are beginning to question its motives.

“The U.S. has to convince other countries that our type of intelligence gathering is different,” he says. “I don’t think that the Brazils and the Indias and the Indonesias and the South Africas are convinced. That’s a big problem for us.”

The thing to realize, as the revelations of NSA snooping continue to pour out, is that everyone deserves scrutiny — the U.S government and its allies, as well as the Chinese and others you may be more likely to view with skepticism. “All big countries,” Lewis says, “are going to try and do this.”
http://www.wired.com/wiredenterprise...-huawei-china/





Shopping for Spy Gear: Catalog Advertises NSA Toolbox
Jacob Appelbaum, Judith Horchert and Christian Stöcker

After years of speculation that electronics can be accessed by intelligence agencies through a back door, an internal NSA catalog reveals that such methods already exist for numerous end-user devices.

When it comes to modern firewalls for corporate computer networks, the world's second largest network equipment manufacturer doesn't skimp on praising its own work. According to Juniper Networks' online PR copy, the company's products are "ideal" for protecting large companies and computing centers from unwanted access from outside. They claim the performance of the company's special computers is "unmatched" and their firewalls are the "best-in-class." Despite these assurances, though, there is one attacker none of these products can fend off -- the United States' National Security Agency.

Specialists at the intelligence organization succeeded years ago in penetrating the company's digital firewalls. A document viewed by SPIEGEL resembling a product catalog reveals that an NSA division called ANT has burrowed its way into nearly all the security architecture made by the major players in the industry -- including American global market leader Cisco and its Chinese competitor Huawei, but also producers of mass-market goods, such as US computer-maker Dell.

A 50-Page Catalog

These NSA agents, who specialize in secret back doors, are able to keep an eye on all levels of our digital lives -- from computing centers to individual computers, from laptops to mobile phones. For nearly every lock, ANT seems to have a key in its toolbox. And no matter what walls companies erect, the NSA's specialists seem already to have gotten past them.

This, at least, is the impression gained from flipping through the 50-page document. The list reads like a mail-order catalog, one from which other NSA employees can order technologies from the ANT division for tapping their targets' data. The catalog even lists the prices for these electronic break-in tools, with costs ranging from free to $250,000.

In the case of Juniper, the name of this particular digital lock pick is "FEEDTROUGH." This malware burrows into Juniper firewalls and makes it possible to smuggle other NSA programs into mainframe computers. Thanks to FEEDTROUGH, these implants can, by design, even survive "across reboots and software upgrades." In this way, US government spies can secure themselves a permanent presence in computer networks. The catalog states that FEEDTROUGH "has been deployed on many target platforms."

Master Carpenters

The specialists at ANT, which presumably stands for Advanced or Access Network Technology, could be described as master carpenters for the NSA's department for Tailored Access Operations (TAO). In cases where TAO's usual hacking and data-skimming methods don't suffice, ANT workers step in with their special tools, penetrating networking equipment, monitoring mobile phones and computers and diverting or even modifying data. Such "implants," as they are referred to in NSA parlance, have played a considerable role in the intelligence agency's ability to establish a global covert network that operates alongside the Internet.

Some of the equipment available is quite inexpensive. A rigged monitor cable that allows "TAO personnel to see what is displayed on the targeted monitor," for example, is available for just $30. But an "active GSM base station" -- a tool that makes it possible to mimic a mobile phone tower and thus monitor cell phones -- costs a full $40,000. Computer bugging devices disguised as normal USB plugs, capable of sending and receiving data via radio undetected, are available in packs of 50 for over $1 million.

'Persistence'

The ANT division doesn't just manufacture surveillance hardware. It also develops software for special tasks. The ANT developers have a clear preference for planting their malicious code in so-called BIOS, software located on a computer's motherboard that is the first thing to load when a computer is turned on.

This has a number of valuable advantages: an infected PC or server appears to be functioning normally, so the infection remains invisible to virus protection and other security programs. And even if the hard drive of an infected computer has been completely erased and a new operating system is installed, the ANT malware can continue to function and ensures that new spyware can once again be loaded onto what is presumed to be a clean computer. The ANT developers call this "Persistence" and believe this approach has provided them with the possibility of permanent access.

Another program attacks the firmware in hard drives manufactured by Western Digital, Seagate, Maxtor and Samsung, all of which, with the exception of the latter, are American companies. Here, too, it appears the US intelligence agency is compromising the technology and products of American companies.

Other ANT programs target Internet routers meant for professional use or hardware firewalls intended to protect company networks from online attacks. Many digital attack weapons are "remotely installable" -- in other words, over the Internet. Others require a direct attack on an end-user device -- an "interdiction," as it is known in NSA jargon -- in order to install malware or bugging equipment.

There is no information in the documents seen by SPIEGEL to suggest that the companies whose products are mentioned in the catalog provided any support to the NSA or even had any knowledge of the intelligence solutions. "Cisco does not work with any government to modify our equipment, nor to implement any so-called security 'back doors' in our products," the company said in a statement. Contacted by SPIEGEL reporters, officials at Western Digital, Juniper Networks and Huawei also said they had no knowledge of any such modifications. Meanwhile, Dell officials said the company "respects and complies with the laws of all countries in which it operates."

Many of the items in the software solutions catalog date from 2008, and some of the target server systems that are listed are no longer on the market today. At the same time, it's not as if the hackers within the ANT division have been sleeping on the job. They have continued to develop their arsenal. Some pages in the 2008 catalog, for example, list new systems for which no tools yet exist. However, the authors promise they are already hard at work developing new tools and that they will be "pursued for a future release".
http://www.spiegel.de/international/...-a-940994.html





French Contractors Jump Into Market for Secure Communications

Firms Offer Re-Engineered Phones, Encryption After NSA Revelations
Sam Schechner

A crypto war is coming to your pocket.

In the Paris area, two security contractors are jumping into a burgeoning market for secure mobile phones and encrypted communications as revelations of widespread U.S. government surveillance accelerate a security race among businesses, government agencies and hackers.

This month, Bull SA—a French maker of cybersecurity and intelligence gear—is starting to ship a new €2,000 ($2,760) smartphone for businesses called the Hoox m2. Based on Google Inc.'s Android software, it has been re-engineered to resist hacking and encrypt calls. "Unnecessary to speak in 'coded language,'" brags a marketing brochure.

Just a few miles away, partly state-owned defense contractor Thales SA is selling an enterprise-software system dubbed Teopad. Priced in the "hundreds of euros" per license, the software will split any Android phone or tablet "in two," according to Thales, with one side for personal use and the other encrypted for sensitive business applications—and secure phone calls.

The dueling French companies are part of a growing niche—spanning tiny firms and defense giants—that is banking on demand for high-end encryption amid rising threats from organized hackers and mounting fears of ubiquitous surveillance.

Some are secure messaging apps like U.S.-based Wickr or mobile-device security platforms like California-based Good Technology Corp.

Others offer actual hardware, like Germany's GSMK, which has sold tens of thousands of its CryptoPhone models, including an Android-based smartphone, in the last decade.

The market for security software used in mobile devices was expected to grow 38% in 2013 to $1.33 billion, and it should hit $3 billion by 2017, according to market-research firm Infonetics Inc.

Interest in security and encryption has risen since the U.S. National Security Agency was accused in the fall of having hacked into the cellphone of German Chancellor Angela Merkel, among dozens of world leaders. Though the NSA says it doesn't conduct industrial espionage, security experts say businesses have woken up to the need for tighter security.

"Five years ago, businesses were asking me why I was so paranoid," says Björn Rupp, GSMK's founder. "Now they're all nodding when you give the presentation."

In Europe, cloud-computing services have tried to cash in on espionage fears, arguing that by hosting their data on European soil they can avoid the prying eyes of U.S. spies. But many U.S. companies that offer communications services say that well-implemented cryptography can remain secure.

U.S.-based Silent Circle, for instance, preemptively shut down its encrypted email service because it feared the government could request data that remained on central servers. But the company continues to offer its mobile calling and messaging services because only subscribers have the encryption keys.

Despite the growth of the sector, it is difficult to verify how secure many of the new products actually are, security experts say. While several well-known cryptography schemes aren't thought to be breakable for the moment, experts say they can be difficult to implement properly, possibly allowing hackers to access data on a phone before it is encrypted.

To guard against that, independent security consultants often recommend open-source solutions, because outsiders can more easily spot security holes and notify the public. But many companies are reluctant to share their methods with competitors. And open-source approaches still require a leap of faith—both that the published code is the real code, and that the open-source community has properly vetted it.

"Using privacy applications and specialized hardware is a positive step, but it isn't a silver bullet," says Mark Dowd, a director of Azimuth Security, an Australia-based information-security consultancy. "It is possible to be secure, but it is difficult because phones do so much stuff."

To back up its security claims, Thales says its new Teopad software is based partly on technology from its military-grade phone, called Teorem. It has supplied 14,000 of those phones to the French armed forces and top civilian leaders. The gray, clamshell device doesn't have the features of a smartphone, but it can encrypt calls and documents at France's second-highest classification level, "Secret Défense." It is also thought to be used by French president François Hollande, though Thales and French officials decline to confirm that.

"Having people who work with the highest levels of cryptography is obviously a strong point, even in the civilian world," says Marc Darmon, head of defense and security systems at Thales.

Thales and Bull disagree on which of their approaches is more secure—Bull's dedicated phone or Thales's software, which works on any Android phone or tablet.

"If you're not worried about sophisticated attacks, maybe a software solution is good enough for you," says Franck Greverie, head of Bull Security Solutions, arguing that advanced hackers can sidestep software encryption by attacking the phone itself.

"Our product is just as secure, but it is infinitely more flexible," responds Thales's Mr. Darmon. He says hardware-based cryptophones become outdated too quickly for users' tastes.

"If you force people to use something obsolete, they won't," he adds.
http://online.wsj.com/news/article_e...MDAwMTEwNDEyWj





NSA Seeks to Build Quantum Computer that Could Crack Most Types of Encryption
Steven Rich and Barton Gellman

In room-size metal boxes secure against electromagnetic leaks, the National Security Agency is racing to build a computer that could break nearly every kind of encryption used to protect banking, medical, business and government records around the world.

According to documents provided by former NSA contractor Edward Snowden, the effort to build “a cryptologically useful quantum computer” — a machine exponentially faster than classical computers — is part of a $79.7 million research program titled “Penetrating Hard Targets.” Much of the work is hosted under classified contracts at a laboratory in College Park, Md.

The development of a quantum computer has long been a goal of many in the scientific community, with revolutionary implications for fields such as medicine as well as for the NSA’s code-breaking mission. With such technology, all current forms of public key encryption would be broken, including those used on many secure Web sites as well as the type used to protect state secrets.

Physicists and computer scientists have long speculated about whether the NSA’s efforts are more advanced than those of the best civilian labs. Although the full extent of the agency’s research remains unknown, the documents provided by Snowden suggest that the NSA is no closer to success than others in the scientific community.

“It seems improbable that the NSA could be that far ahead of the open world without anybody knowing it,” said Scott Aaronson, an associate professor of electrical engineering and computer science at the Massachusetts Institute of Technology.

The NSA appears to regard itself as running neck and neck with quantum computing labs sponsored by the European Union and the Swiss government, with steady progress but little prospect of an immediate breakthrough.

“The geographic scope has narrowed from a global effort to a discrete focus on the European Union and Switzerland,” one NSA document states.

Seth Lloyd, an MIT professor of quantum mechanical engineering, said the NSA’s focus is not misplaced. “The E.U. and Switzerland have made significant advances over the last decade and have caught up to the U.S. in quantum computing technology,” he said.

The NSA declined to comment for this article.

The documents, however, indicate that the agency carries out some of its research in large, shielded rooms known as Faraday cages, which are designed to prevent electromagnetic energy from coming in or out. Those, according to one brief description, are required “to keep delicate quantum computing experiments running.”

The basic principle underlying quantum computing is known as “quantum superposition,” the idea that an object simultaneously exists in all states. A classical computer uses binary bits, which are either zeroes or ones. A quantum computer uses quantum bits, or qubits, which are simultaneously zero and one.
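
To pin down what “simultaneously zero and one” means, here is the standard textbook formalism (a gloss added here, not from the Post’s reporting): a single qubit’s state is |ψ⟩ = α|0⟩ + β|1⟩ with |α|² + |β|² = 1, and measuring it yields 0 with probability |α|² and 1 with probability |β|². A register of n qubits occupies a superposition over all 2^n classical bit strings at once, which is where the hoped-for speedup comes from.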

This seeming impossibility is part of the mystery that lies at the heart of quantum theory, which even theoretical physicists say no one completely understands.

“If you think you understand quantum mechanics, you don’t understand quantum mechanics,” said the late Nobel laureate Richard Feynman, who is widely regarded as the pioneer in quantum computing.

Here’s how it works, in theory: While a classical computer, however fast, must do one calculation at a time, a quantum computer can sometimes avoid having to make calculations that are unnecessary to solving a problem. That allows it to home in on the correct answer much more quickly and efficiently.

Quantum computing is difficult to attain because of the fragile nature of such computers. In theory, the building blocks of such a computer might include individual atoms, photons or electrons. To maintain the quantum nature of the computer, these particles would need to be carefully isolated from their external environments.

“Quantum computers are extremely delicate, so if you don’t protect them from their environment, then the computation will be useless,” said Daniel Lidar, a professor of electrical engineering and the director of the Center for Quantum Information Science and Technology at the University of Southern California.

A working quantum computer would open the door to easily breaking the strongest encryption tools in use today, including a standard known as RSA, named for the initials of its creators. RSA scrambles communications, making them unreadable to anyone but the intended recipient, without requiring the use of a shared password. It is commonly used in Web browsers to secure financial transactions and in encrypted e-mails. RSA is used because of the difficulty of factoring the product of two large prime numbers. Breaking the encryption involves finding those two numbers. This cannot be done in a reasonable amount of time on a classical computer.
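
To make the factoring connection concrete, here is a toy sketch of the RSA mechanics in Python, using a classic textbook example rather than anything from the article; real keys use primes hundreds of digits long, which is what makes the final factoring step infeasible.

# Toy RSA -- illustration only, NOT real cryptography.
p, q = 61, 53            # the two secret primes
n = p * q                # 3233: the public modulus
phi = (p - 1) * (q - 1)  # 3120
e = 17                   # public exponent, coprime to phi
d = pow(e, -1, phi)      # private exponent (modular inverse; Python 3.8+)

msg = 65
cipher = pow(msg, e, n)   # encrypt: c = m^e mod n
plain = pow(cipher, d, n) # decrypt: m = c^d mod n
assert plain == msg

# "Breaking" RSA means recovering p and q from n alone. Trivial here,
# infeasible on classical hardware for a 1,024- or 2,048-bit n.
f = next(i for i in range(2, n) if n % i == 0)
print(f, n // f)          # -> 53 61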

In 2009, computer scientists using classical methods were able to discover the primes within a 768-bit number, but it took almost two years and hundreds of computers to factor it. The scientists estimated that it would take 1,000 times longer to break a 1,024-bit encryption key, which is commonly used for online transactions.

A large-scale quantum computer, however, could theoretically break a 1,024-bit encryption much faster. Some leading Internet companies are moving to 2,048-bit keys, but even those are thought to be vulnerable to rapid decryption with a quantum computer.

Quantum computers have many applications for today’s scientific community, including the creation of artificial intelligence. But the NSA fears the implications for national security.

“The application of quantum technologies to encryption algorithms threatens to dramatically impact the US government’s ability to both protect its communications and eavesdrop on the communications of foreign governments,” according to an internal document provided by Snowden.

Experts are not sure how soon a quantum computer would be feasible. A decade ago, some experts said that developing a large quantum computer was likely 10 to 100 years in the future. Five years ago, Lloyd said the goal was at least 10 years away.

Last year, Jeff Forshaw, a professor at the University of Manchester, told Britain’s Guardian newspaper, “It is probably too soon to speculate on when the first full-scale quantum computer will be built but recent progress indicates that there is every reason to be optimistic.”

“I don’t think we’re likely to have the type of quantum computer the NSA wants within at least five years, in the absence of a significant breakthrough maybe much longer,” Lloyd told The Washington Post in a recent interview.

Some companies, however, claim to already be producing small quantum computers. A Canadian firm, D-Wave Systems, says it has been making quantum computers since 2009. In 2012, it sold a $10 million version to Google, NASA and the Universities Space Research Association, according to news reports.

That quantum computer, however, would never be useful for breaking public key encryption like RSA.

“Even if everything they’re claiming is correct, that computer, by its design, cannot run Shor’s algorithm,” said Matthew Green, a research professor at the Johns Hopkins University Information Security Institute, referring to the algorithm that could be used to break encryption like RSA.
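
Mr. Green’s point can be made concrete. Shor’s algorithm needs quantum hardware for only one subroutine, finding the period r of a^x mod N; everything else is ordinary arithmetic. The Python sketch below shows that classical wrap-around, with brute force standing in for the quantum period-finding step (illustrative only; the brute force is exactly the part that does not scale).

from math import gcd

def find_period(a, N):
    # Stand-in for the quantum subroutine: smallest r with a^r = 1 (mod N).
    r, x = 1, a % N
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def factor_via_period(N, a=2):
    # Classical post-processing: an even period r with a^(r/2) != -1 (mod N)
    # yields the factors gcd(a^(r/2) - 1, N) and gcd(a^(r/2) + 1, N).
    g = gcd(a, N)
    if g != 1:
        return g, N // g  # lucky: the base already shares a factor with N
    r = find_period(a, N)
    if r % 2 == 1 or pow(a, r // 2, N) == N - 1:
        return None       # unlucky base; retry with another a
    h = pow(a, r // 2)
    return gcd(h - 1, N), gcd(h + 1, N)

print(factor_via_period(15, a=7))  # -> (3, 5)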

Experts think that one of the largest hurdles to breaking encryption with a quantum computer is building a computer with enough qubits, which is difficult given the very fragile state of quantum computers. By the end of September, the NSA expected to be able to have some building blocks, which it described in a document as “dynamical decoupling and complete quantum control on two semiconductor qubits.”

“That’s a great step, but it’s a pretty small step on the road to building a large-scale quantum computer,” Lloyd said.

A quantum computer capable of breaking cryptography would need hundreds or thousands more qubits than that.

The budget for the National Intelligence Program, commonly referred to as the “black budget,” details the “Penetrating Hard Targets” project and notes that this step “will enable initial scaling towards large systems in related and follow-on efforts.”

Another project, called “Owning the Net,” is using quantum research to support the creation of quantum-based attacks on encryption schemes like RSA, documents show.

“The irony of quantum computing is that if you can imagine someone building a quantum computer that can break encryption a few decades into the future, then you need to be worried right now,” Lidar said.
http://www.washingtonpost.com/world/...df2_story.html





Senator Presses NSA to Reveal Whether it Spies on Members of Congress

• Vermont's Bernie Sanders poses question to spy agency

• NSA entering political minefield as it fights to keep programs

Spencer Ackerman

A US senator has bluntly asked the National Security Agency if it spies on Congress, raising the stakes for the surveillance agency’s legislative fight to preserve its broad surveillance powers.

Bernie Sanders, a Vermont independent and socialist, asked army general Keith Alexander, the NSA’s outgoing director, if the NSA “has spied, or is the NSA currently spying, on members of Congress or other American elected officials”.

Sanders, in a letter dated 3 January, defined “spying” as “gathering metadata on calls made from official or personal phones, content from websites visited or emails sent, or collecting any other data from a third party not made available to the general public in the regular course of business”.

The NSA collects the records of every phone call made and received inside the United States on an ongoing, daily basis, a revelation first published in the Guardian in June based on leaks from whistleblower Edward Snowden. Until 2011, the NSA collected the email and internet records of all Americans as well.

In response, the NSA has argued that surveillance does not occur when it acquires the voluminous amount of phone data, but rather when its analysts examine those phone records, which they must only do, pursuant to the secret court orders justifying the collection, when they have “reasonable articulable suspicion” of a connection to specific terrorist groups. Declassified rulings of the secret surveillance court known as the Fisa court documented “systemic” violations of those restrictions over the years.

Sanders’ office suggested the senator, who called the collection “clearly unconstitutional” in his letter, did not consider the distinction salient.

Asked if Sanders meant the collection of legislators’ and officials’ phone data alongside every other American’s or the deliberate targeting of those officials by the powerful intelligence agency, spokesman Jeff Frank said: “He’s referring to either one.”

The NSA did not immediately return a request for comment. Hours after Sanders sent his letter, the office of the director of national intelligence announced that the Fisa court on Friday renewed the domestic phone records bulk collection for another 90 days.

Sanders’ question is a political minefield for the NSA, and one laid as Congress is about to reconvene for the new year. Among its agenda items is a bipartisan, bicameral bill that seeks to abolish the NSA’s ability to collect data in bulk on Americans or inside the United States without suspicion of a crime or a threat to national security. Acknowledgement that it has collected the communications records of American lawmakers and other officials is likely to make it harder for the NSA to argue that it needs such broad collection powers to defend against terrorism.

Civil liberties and tech groups are planning a renewed lobbying push to pass the bill, called the USA Freedom Act, as they hope to capitalize on a White House review panel that last month recommended the NSA no longer collect so-called metadata, but rely on phone companies to store customer data for up to two years, which is longer than they currently store it.

On Friday, Shawn Turner, the spokesman for the director of national intelligence, said in a statement that the intelligence community "continues to be open to modifications to this program that would provide additional privacy and civil liberty protections while still maintaining its operational benefits," such as having the data "held by telecommunications companies or a third party".

Advocates want an end to the metadata bulk collection as well as no expansion of phone company data record storage.

The Senate judiciary committee, whose chairman Patrick Leahy is an architect of the USA Freedom Act, announced Friday that it will hold a hearing with the review panel’s membership on 14 January.

Additionally, the Justice Department announced a formal appeal of a 16 December federal court loss over the legality and constitutionality of the NSA’s bulk phone records collection effort. The appeal follows one by the ACLU, which sought redress in a different federal court after a judge ruled 27 December that the NSA bulk collection passes constitutional muster.

The NSA has yet to directly address whether elected officials are getting caught in its broad data trawls. While senator Jeff Merkley of Oregon dramatically waved his phone at Alexander during a June hearing – “What authorized investigation gave you the grounds for acquiring my cellphone data?” Merkley asked – the NSA has typically spoken in generic terms about needing the “haystack” of information from Americans it considers necessary to suss out terrorist connections.

The NSA and its allies have been under fire for months about their public presentation of the scope of domestic surveillance. House judiciary committee Republicans in December wrote to attorney general Eric Holder calling for an investigation of director of national intelligence James Clapper, who has acknowledged untruthfully testifying that the NSA does “not wittingly” collect data on millions of Americans.

“We must be vigilant and aggressive in protecting the American people from the very real danger of terrorist attacks,” Sanders wrote to Alexander on Friday. “I believe, however, that we can do that effectively without undermining the constitutional rights that make us a free country.”
http://www.theguardian.com/world/201...bernie-sanders





Whistleblower Edward Snowden is Tech Person of Year
John Shinal

"They who would give up essential Liberty, to purchase a little temporary Safety, deserve neither Liberty nor Safety."

— Benjamin Franklin, for the Pennsylvania Assembly, in its reply to the governor, 1755.

In the wake of the 9/11 terrorist attacks, the American people, through their elected representatives in Washington, chose to exchange a significant amount of freedom for safety.

But until a lone information-technology contractor named Edward Snowden leaked a trove of National Security Agency documents to the media this summer, we didn't know just how much we'd surrendered.

Now that we do, our nation can have a healthy debate — out in the open, as a democracy should debate — about how good a bargain we got in that exchange.

For facilitating that debate, at great risk to his own personal liberty, Snowden is this column's technology person of the year for 2013.

While a long line of so-called leaders of the tech industry were repeating the smug mantra that "there is no privacy" — all while secretly cooperating with the NSA's surveillance program — Snowden risked prosecution and jail to give Americans the chance to choose for themselves whether it still matters in the digital age.

Secrecy has long been a favorite tool of totalitarian regimes that want to stifle internal political debate.

Secret courts were a staple of Joseph Stalin's Soviet Union, used to exile dissidents to Siberian gulags.

They are still used today by China's communist government to silence its critics.

The U.S. also has secret courts, first created under the Foreign Intelligence Surveillance Act of 1978.

The FISA court was set up to allow American intelligence agencies to track foreign agents without allowing those agents to know about it, as they would if the warrant for monitoring their communications had to be approved in a public court.

On its surface, that rationale sounds reasonable.

Yet, like all secret courts hidden from oversight, it was allowed to run amok.

After 9/11, it became a rubber stamp that approved massive surveillance programs that swept up the phone and Internet communications of ordinary Americans.

Whether that surveillance captured any actual communication or so-called metadata about phone calls and e-mails is a new argument that members of Congress and the NSA are now trying to use to confuse the debate.

But it misses the key point: If the government wants to know about the routine communication of law-abiding Americans, it should have to prove to a judge in an open courtroom that that surveillance is in the best interests of public safety.

This privacy safeguard is enshrined in the Fourth Amendment, which protects us all against "unreasonable searches and seizures."

Thanks to Snowden, this country can now argue and debate — in the federal courts and in the halls of Congress — whether the NSA's surveillance programs are constitutional.

Snowden also showed just how inept the NSA was in protecting even its own information.

Let's not forget the 9/11 terrorist attacks represented the worst intelligence failure in the nation's history.

Various government security agencies knew the names of many of the 9/11 terrorists before the attack. They also knew some were taking flying lessons, but failed to connect the dots.

Such a failure should have led to more oversight, not less.

But critics of that failure were soon bullied into silence by hysterical talk from those who needed to focus Americans' attention on our enemies, as when former secretary of Defense Donald Rumsfeld famously said people "need to be very careful about what they say."

Not since Sen. Joseph McCarthy's Communist witch hunt of the 1950s had fear become so palpable in the land of the free and the home of the brave.

That was enough to cow most of the tech industry's largest companies into cooperating with the NSA.

Fortunately, some Americans are not so frightened of our enemies as Rumsfeld that they would trade liberty for safety.

Some believe that taxpayer-funded entities such as the NSA and the FISA court should have some measure of public oversight, to ensure they are helping to protect the U.S. Constitution, rather than undermining it.

Now that the scope of NSA spying has been exposed, let's have a debate about all of it.

And let's thank Edward Snowden for moving that debate into the public arena.
http://www.usatoday.com/story/tech/c...onomy/4213953/





'Military-Style' Raid on California Power Station Spooks U.S.
Shane Harris

When U.S. officials warn about "attacks" on electric power facilities these days, the first thing that comes to mind is probably a computer hacker trying to shut the lights off in a city with malware. But a more traditional attack on a power station in California has U.S. officials puzzled and worried about the physical security of the electrical grid--from attackers who come in with guns blazing.

Around 1:00 AM on April 16, at least one individual (possibly two) entered two different manholes at the PG&E Metcalf power substation, southeast of San Jose, and cut fiber cables in the area around the substation. That knocked out some local 911 services, landline service to the substation, and cell phone service in the area, a senior U.S. intelligence official told Foreign Policy. The intruder(s) then fired more than 100 rounds from what two officials described as a high-powered rifle at several transformers in the facility. Ten transformers were damaged in one area of the facility, and three transformer banks -- or groups of transformers -- were hit in another, according to a PG&E spokesman.

Cooling oil then leaked from a transformer bank, causing the transformers to overheat and shut down. State regulators urged customers in the area to conserve energy over the following days, but there was no long-term damage reported at the facility and there were no major power outages. There were no injuries reported. That was the good news. The bad news is that officials don't know who the shooter(s) were, and most importantly, whether further attacks are planned.

"Initially, the attack was being treated as vandalism and handled by local law enforcement," the senior intelligence official said. "However, investigators have been quoted in the press expressing opinions that there are indications that the timing of the attacks and target selection indicate a higher level of planning and sophistication."

The FBI has taken over the case. There appears to have been some initial concern, or at least interest, in the fact that the shooting happened one day after the Boston Marathon bombing. But the FBI has no evidence that the attack is related to terrorism, and it appears to be an isolated incident, said Peter Lee, a spokesman for the FBI field office in San Francisco, which is leading the investigation. Lee said the FBI has "a couple of leads we're still following up on," which he wouldn't discuss in detail. There has not been any published motive or intent for the attack, the intelligence official said, and no one has claimed credit.

Local investigators seemed to hit a dead end in June, so they released surveillance footage of the shooting. But that apparently produced no new information. The FBI says there have been no tips from the public about who the shooter might be and what he was doing there.

The incident might have stayed a local news story, but this month, Rep. Henry Waxman, the California Democrat and ranking member of the Energy and Commerce Committee, mentioned it at a hearing on regulatory issues. "It is clear that the electric grid is not adequately protected from physical or cyber attacks," Waxman said. He called the shooting at the San Jose facility "an unprecedented and sophisticated attack on an electric grid substation with military-style weapons. Communications were disrupted. The attack inflicted substantial damage. It took weeks to replace the damaged parts. Under slightly different conditions, there could have been serious power outages or worse."

The U.S. official said the incident "did not involve a cyber attack," but that's about all investigators seem to know right now. AT&T, which operates the phone network that was affected, has offered a $250,000 reward for information leading to the arrest and conviction of the perpetrator or perpetrators.

"These were not amateurs taking potshots," Mark Johnson, a former vice president for transmission operations at PG&E, said last month at a conference on grid security held in Philadelphia. "My personal view is that this was a dress rehearsal" for future attacks.

At the very least, the attack points to an arguably overlooked physical threat to power facilities at a time when much of the U.S. intelligence community, Congress, and the electrical power industry is focused on the risk of cyber attacks. There has never been a confirmed power outage caused by a cyber attack in the United States. But the Obama administration has sought to promulgate cyber security standards that power facilities could use to minimize the risk of one.

At least one senior official thinks the government is focusing too heavily on cyber attacks. Jon Wellinghoff, the chairman of the Federal Energy Regulatory Commission, said last month that an attack by intruders with guns and rifles could be just as devastating as a cyber attack.

A shooter "could get 200 yards away with a .22 rifle and take the whole thing out," Wellinghoff said last month at a conference sponsored by Bloomberg. His proposed defense: A metal sheet that would block the transformer from view. "If you can't see through the fence, you can't figure out where to shoot anymore," Wellinghoff said. Price tag? A "couple hundred bucks." A lot cheaper than the billions the administration has spent in the past four years beefing up cyber security of critical infrastructure in the United States and on government computer networks.

"There are ways that a very few number of actors with very rudimentary equipment could take down large portions of our grid," Wellinghoff said. "I don't think we have the level of physical security we need."
http://complex.foreignpolicy.com/pos....XsBOhQsu.dpuf





The Movies With Pasts Ruled the Year
Brooks Barnes and Michael Cieply

This was the year that Hollywood hit ticket-selling heights by stranding Sandra Bullock in space, educating monsters, bringing back Superman (again) and teaching Brad Pitt to outwit zombies. By the time the Hobbit was unleashed (again), the box office was on fire, even if half of North America was “Frozen.”

Movie studios sustained some devastating flops in 2013, among them the samurai epic “47 Ronin,” which limped into theaters on Christmas. But it was a solid 12 months over all: Rentrak, a firm that compiles box-office data, projected on Sunday that North American ticket sales for the year would total $10.9 billion, a 1 percent increase from 2012. Analysts predict similar attendance numbers to last year’s, about 1.36 billion people.

Hollywood did it largely by serving more of the same. The five leading films at the global box office were all sequels. “Iron Man 3” was the top-selling movie of the year, taking in $409 million in North America, for a global total of more than $1.2 billion. “Despicable Me 2” was second with nearly $920 million in sales, followed by “Fast & Furious 6,” “The Hunger Games: Catching Fire” and “Monsters University” (actually, a prequel). And get ready for more: Crucially for their future business prospects, movie studios managed to introduce an unusually large number of new franchises. Sequels are already in the works for at least eight of the nonsequel films released in 2013, including “The Conjuring,” “The Croods,” “We’re the Millers” and “Man of Steel.”

“It bodes very well for our future,” said Greg Foster, chief executive of Imax Filmed Entertainment, which set a series of box-office records over the year.

Despite the celebration, some studio executives are doing a little soul searching. Again and again, audiences showed that they were starving for originality.

“Gravity,” the 3-D space picture with essentially a cast of two, Ms. Bullock and George Clooney, became a phenomenon. It took in $254.6 million in North America, for a global total of $653.3 million. “Now You See Me,” the kind of middle-budget movie that most big studios left for dead a few years ago, sold $117.7 million in tickets, for a worldwide total of $351.7 million.

“This Is the End,” a raunchy apocalyptic comedy starring James Franco and Jonah Hill as themselves, took in $101.5 million domestically — only a smidgen less than the 2013 movie that may have epitomized more of the same, “The Hangover Part III.”

“People do seem to want different,” said Richie Fay, president for domestic distribution at Lionsgate. “You’ve certainly now got to give people movies that they will come away talking about.”

Moviedom is also ending the year, as it has the past few, ruminating about ceding cultural ground to television. Combined, three presumed best picture contenders — “Nebraska,” “Her” and “Inside Llewyn Davis” — have been seen by roughly one-tenth of the more than 10 million viewers who tuned in to the last episode of “Breaking Bad.”

The serious side of feature filmmaking kept slipping toward a future in which theatrical release is just a seal of approval for pictures that are intended to be seen elsewhere. Among the year’s documentary success stories was “Blackfish,” about SeaWorld’s treatment of orcas and their trainers. The film stayed in theaters long enough to be certified as a “movie” by reviewers and awards voters, but the payoff came when it moved to video-on-demand services a month later, and CNN Films, its backer, quickly landed it in front of the real audience: television viewers.

“Moviegoers want compelling ideas,” Mr. Foster said, noting “Gravity” as one film that delivered (particularly in his theaters). “I take my hat off to television for it. Television has been incredibly compelling and thoughtful.”

It was a tough year for the good-behavior watchdogs of popular culture. Oprah Winfrey smoked in “Lee Daniels’ The Butler,” a late-summer hit, as did Meryl Streep in “August: Osage County.” Cigarettes glammed up the cafe scenes in “Inside Llewyn Davis.” The monitoring group SceneSmoking.org even gave a black lung rating for excessive tobacco use to “The Hobbit: The Desolation of Smaug.” (Smog?)

By June, every major studio was peddling guns in its marketing for films like Universal’s “R.I.P.D.” and “2 Guns,” and a study backed by the Annenberg Public Policy Center at the University of Pennsylvania found that gun violence in the top-selling PG-13 movies had surpassed that in best sellers rated R.

In a wearying year, Americans sidestepped wearying movies. They rejected “The Fifth Estate,” an Oscar hopeful centered on the WikiLeaks organization; it cost DreamWorks Studios about $26 million to make and took in a total of $8.6 million, roughly half of which goes to theater owners. Even returning to the Old West seemed too much, as “The Lone Ranger” became one of the biggest flops in memory, requiring a write-down of about $160 million.

When consumers did leave their sofas, whether to watch “Star Trek Into Darkness” or “Thor: The Dark World,” it was to get away from their problems, not to work them out. “People are jittery in the country,” said Brad Grey, chief executive at Paramount, which found hits in films like “Jackass Presents: Bad Grandpa.” “They’re jittery over Obamacare, a whole list of issues.” The best response for a movie studio, he suggested, is “to think very, very long term.”

No serious documentary made a deep impression at the box office. The only chart topper was Morgan Spurlock’s boy-band concert film “One Direction: This Is Us,” which had about $28.9 million in domestic sales. It was a far cry from 2004, when “Fahrenheit 9/11” and Mr. Spurlock’s own “Super Size Me” filled seats.

Even when moviegoers watched nonfiction, or at least a lightly fictionalized version of it, they seemed to be showing up less for a history lesson or pieties than for a good time, as was offered by Sony’s “American Hustle,” promoted with hairdos and cleavage, or Martin Scorsese’s “The Wolf of Wall Street,” with its hookers and drug use.

“The Wolf of Wall Street” (Paramount) was the No. 1 new movie over the weekend, taking in an estimated $18.5 million, for a total since opening on Wednesday of $34.3 million. Still, the movie’s hefty cost — about $100 million, independently financed by Red Granite Pictures — and lackluster C score from audiences in exit polls make profitability an extremely steep climb.

Ben Stiller’s “The Secret Life of Walter Mitty” (20th Century Fox) was the second-most-watched new movie over the weekend, selling a soft $13 million in tickets, for a total since opening on Wednesday of $25.6 million. It cost about $90 million to produce and will need to make up significant ground to become a financial success.

The lightly marketed “47 Ronin” (Universal) fizzled as expected, taking in $9.9 million, for a five-day opening total of $20.6 million. Championed at Universal by the studio’s former chairman — who was fired in the fall — “47 Ronin” cost $175 million to make. Analysts estimate the movie will lose roughly $100 million, putting it in “The Lone Ranger” territory. Universal declined to discuss specific losses but confirmed in a statement that it had already taken a write-down on the film, saying, “We adjusted film costs in previous quarters, and as a result our financial performance will not be negatively impacted this quarter.”

Although Universal also experienced a big flop in the summer, “R.I.P.D.,” the studio had both “Despicable Me 2” (which has yet to open in China) and “Fast & Furious 6.” It also had one of the year’s best-performing comedies in “Identity Thief,” starring Melissa McCarthy, who had a very good year; she also co-starred in “The Heat,” which took in more than $134.5 million for Fox.

But it wasn’t just the big movies that made a difference for Hollywood. Solidly performing little films also offered a lift. TheWrap.com, an entertainment trade news site, counted seven movies that started out in very limited release — pictures like Woody Allen’s “Blue Jasmine” and the independently produced “The Way, Way Back” — and managed to take in more than $20 million.

“Instructions Not Included,” from Pantelion, part owned by Lionsgate, became the highest-grossing Spanish-language film ever in the United States, with more than $44 million. Taken with the success of several movies with predominantly African-American casts, including “The Butler” and “The Best Man Holiday,” “Instructions Not Included” demonstrated an audience hunger for diversity in pictures.

“Those are niche types of films,” Lionsgate’s Mr. Fay said, “that went beyond their borders to become solid hits.”
http://www.nytimes.com/2013/12/30/mo...-the-year.html





What Could Have Entered the Public Domain on January 1, 2014?

Under the law that existed until 1978 . . . Works from 1957

Congress Shrugged

Current US law extends copyright for 70 years after the date of the author’s death, and corporate “works-for-hire” are copyrighted for 95 years after publication. But prior to the 1976 Copyright Act (which became effective in 1978), the maximum copyright term was 56 years – an initial term of 28 years, renewable for another 28 years. Under those laws, works published in 1957 would enter the public domain on January 1, 2014, where they would be “free as the air to common use.” Under current copyright law, we’ll have to wait until 2053. And no published works will enter our public domain until 2019. The laws in Canada and the EU are different – thousands of works are entering their public domains on January 1.
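
The date arithmetic behind those years is simple enough to sketch in a few lines of Python (a simplification of the statute: terms run through December 31 of their final year, hence the January 1 entry date).

def public_domain_year(pub_year, term_years):
    # A term runs through Dec. 31 of its final year, so the work
    # enters the public domain on Jan. 1 of the following year.
    return pub_year + term_years + 1

# Pre-1978 law: 28-year initial term + 28-year renewal = 56 years.
print(public_domain_year(1957, 28 + 28))  # -> 2014
# Current law for works of that vintage: 95 years from publication.
print(public_domain_year(1957, 95))       # -> 2053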

Curious George Gets a Term Extension

What books and plays would be entering the public domain if we had the pre-1978 copyright laws? You might recognize some of the titles below.

• Samuel Beckett, Endgame (“Fin de partie”, the original French version)
• Jack Kerouac, On the Road (completed 1951, published 1957)
• Ayn Rand, Atlas Shrugged
• Margret Rey and H.A. Rey, Curious George Gets a Medal
• Dr. Seuss (Theodor Geisel), How the Grinch Stole Christmas and The Cat in the Hat
• Eliot Ness and Oscar Fraley, The Untouchables
• Northrop Frye, Anatomy of Criticism: Four Essays
• Walter Lord, Day of Infamy
• Studs Terkel, Giants of Jazz
• Corbett H. Thigpen and Hervey M. Cleckley, The Three Faces of Eve
• Ian Fleming, From Russia, with Love
• Ann Weldy (as Ann Bannon), Odd Girl Out
• A.E. Van Vogt, Empire of the Atom

You would be free to translate these books into other languages, create Braille or audio versions for visually impaired readers (if you think that publishers wouldn’t object to this, you would be wrong), or adapt them for film. You could read them online or buy cheaper print editions, because others were free to republish them. (Empirical studies have shown that public domain books are less expensive, available in more editions and formats, and more likely to be in print.) Imagine a digital Library of Alexandria containing all of the world’s books from 1957 and earlier, where, thanks to technology, you can search, link, index, annotate, copy and paste. (Google Books has brought us closer to this reality, but for copyrighted books where there is no separate agreement with the copyright holder, it only shows three short snippets, not the whole book.) Instead of seeing these literary works enter the public domain in 2014, we will have to wait until 2053.

Endgame – “The end is in the beginning and yet you go on. . .”

Think about the movies and television shows from 1957 that would have become available this year. Fans could share clips with friends or incorporate them into fantastic homages. (There are certainly some good candidates.) Local theaters could show the full features. Libraries and archivists would be free to digitize and preserve them. Here are a few of the works that we won’t see in the public domain for another 39 years.

• The Incredible Shrinking Man (Based on Richard Matheson’s 1956 book The Shrinking Man)
• The Bridge on the River Kwai (Best Picture, Best Director (David Lean), Best Actor (Alec Guinness); also starring William Holden, Jack Hawkins and Sessue Hayakawa)
• A Farewell to Arms (Rock Hudson and Jennifer Jones)
• Gunfight at the O.K. Corral (Burt Lancaster and Kirk Douglas)
• 3:10 to Yuma (1957 original starring Glenn Ford and Van Heflin)
• Island in the Sun (James Mason, Joan Fontaine, Dorothy Dandridge, and introducing Harry Belafonte)
• Witness for the Prosecution (Tyrone Power, Marlene Dietrich, Charles Laughton, Elsa Lanchester)
• 12 Angry Men (Henry Fonda, Lee J. Cobb, Jack Klugman, Ed Begley, and more)
• Sweet Smell of Success (Burt Lancaster and Tony Curtis)
• Jailhouse Rock (Elvis Presley)
• The Prince and the Showgirl (Laurence Olivier and Marilyn Monroe)
• Funny Face (Audrey Hepburn and Fred Astaire . . . and Paris as only Hollywood can imagine it)
• An Affair to Remember (Cary Grant and Deborah Kerr . . . and the Empire State Building)
• Nights of Cabiria (written and directed by Federico Fellini and starring Giulietta Masina)
• The Seventh Seal (written and directed by Ingmar Bergman and starring Max von Sydow and Bengt Ekerot)
• What’s Opera, Doc? (Bugs Bunny and Elmer Fudd do Wagner)
• The first episodes of Leave It to Beaver and Perry Mason
• Elvis Presley’s third and final appearance on The Ed Sullivan Show on January 6, 1957 (CBS refused to show his gyrating hips)

These works are famous, so we’re not likely to lose them entirely – the true tragedy is that of forgotten films that are literally disintegrating while preservationists wait for their copyright terms to expire.

“That’ll Be the Day”. . . in 2053

What 1957 music could you have used without fear of a lawsuit? If you wanted to find guitar tabs or sheet music and freely record your own version of some of the influential music of the 1950s, January 1, 2014, might have been a booming day for you under earlier copyright laws – “That’ll Be the Day” and “Peggy Sue” (Buddy Holly, Jerry Allison, and Norman Petty), “Great Balls of Fire” (Otis Blackwell and Jack Hammer), and “Wake Up, Little Susie” (Felice and Boudleaux Bryant) would all be available. You could score a short film with Dmitri Shostakovich’s Symphony No. 11 in G minor (Opus 103; subtitled The Year 1905). Or you could stage your own performances of some of Elvis Presley’s hits: “All Shook Up” (Otis Blackwell and Elvis Presley) and “Jailhouse Rock” (Jerry Leiber and Mike Stoller). Today, these musical works remain copyrighted until 2053.

The musical “West Side Story” (music by Leonard Bernstein, lyrics by Stephen Sondheim, and book by Arthur Laurents) made its Broadway debut in 1957. Would “West Side Story” have been legal if Shakespeare’s Romeo and Juliet were under copyright at the time? Probably not. And, of course, if copyright existed in Shakespeare's time, as Judge Richard Posner observed, “Romeo and Juliet itself would have infringed Arthur Brooke’s The Tragicall Historye of Romeo and Juliet . . . which in turn would have infringed several earlier Romeo and Juliets, all of which probably would have infringed Ovid’s story of Pyramus and Thisbe.” Artists build upon the past. Creativity depends upon a healthy public domain.

For lovers of fine art, 1957 also featured a wealth of material, including Dali’s “Celestial Ride” and “Music: the Red Orchestra,” Edward Hopper’s “Western Motel,” and Picasso’s “Las Meninas” set of paintings. This remarkable series of works consists of reinterpretations – remixes, if you will – of Diego Velázquez’s famous painting “Las Meninas” (usually translated as “The Maids of Honor”). Velázquez’s single canvas became, in Picasso’s hands, a series of 58 new works. Picasso did not have to track down Velázquez’s heirs and negotiate licensing fees in order to create this oeuvre. He was free to “copy Las Meninas, entirely in good faith” in a way “that would be a detestable Meninas for a traditional painter, but would be my Meninas.” One masterpiece inspired another. This is what the public domain allows.

Science from 1957 – copyrighted research, still behind paywalls

1957 was a noteworthy year for science: the USSR launched Sputnik 1 and Sputnik 2, IBM released the first FORTRAN compiler, and the UK’s Medical Research Council published an early report linking smoking and lung cancer. There were groundbreaking publications in the fields of superconductivity and astrophysics such as “Theory of Superconductivity” by John Bardeen, L.N. Cooper, and J.R. Schrieffer and “Synthesis of the Elements in Stars (‘B²FH’)” by Geoffrey Burbidge, Margaret Burbidge, William Fowler, and Fred Hoyle.

Both of the articles above are copyrighted, but thankfully their publishers have made them available in full online, so that you can read them, even though it may still be illegal to copy and distribute them. Many articles from 1957 remain behind paywalls, including those in major scientific journals such as Science, Nature, and JAMA. Are you interested in a historical perspective on, for example, “Soviet and U.S. Professional and Technical Manpower” or the “Breeding Behavior of Cichlids”? You can’t read those articles unless you pay or subscribe (the first costs US$20 for one day of access; you can purchase the second for US$32).

It’s remarkable to find scientific research from 1957 hidden behind publisher paywalls. True, some older articles – especially those with enduring impact – have been made available on third party websites, though it is often unclear whether this is being done with the consent (or temporary forbearance) of the copyright holder, or simply being provided by enthusiasts who cannot imagine that access to these works is still legally restricted. But this is not a stable solution for providing reliable access to science. Third party postings can be difficult to find or taken down, links can get broken, and would-be posters may be deterred by the risk of a lawsuit. Under the pre-1978 copyright term, all of this history would be free to scholars, students, and enthusiasts. Now, to get these articles from the publisher, you need a credit card or institutional subscription. And the institutional access that many top scientists enjoy is itself not a stable solution – even institutions such as Harvard have considered canceling their subscriptions because they can no longer afford the escalating prices of major journal subscriptions.

Not all scientific publishers work under this kind of copyright scheme. “Open Access” scientific publications, like those of the Public Library of Science, are under Creative Commons licenses, meaning that they can be copied freely from the day they are published.

Works from 1985!

Most of the works highlighted here are famous – that is why we included them. And if that fame meant that the work was still being exploited commercially 28 years after its publication, the rights holders would probably renew the copyright. (This is true for many of the works featured on this page, though even the shorter copyright term exceeds the commercial lifespan of a surprising percentage of successful works.) But we know from empirical studies that 85% of authors did not renew their copyrights (for books, the number is even higher – 93% did not renew), since most works exhaust their commercial value very quickly.

Under the law that existed until 1978 . . . Up to 85% of all copyrighted works from 1985 might have been entering the public domain on January 1, 2014.

That means that all of these examples from 1957 are only the tip of the iceberg. If the pre-1978 laws were still in effect, we could have seen 85% of the works published in 1985 enter the public domain on January 1, 2014. Imagine what that would mean to our archives, our libraries, our schools and our culture. Such works could be digitized, preserved, and made available for education, for research, for future creators. Instead, they will remain under copyright for decades to come, perhaps even into the next century.

Perhaps the most troubling aspect of the current copyright term is that in most cases, the cultural harm is not offset by any benefit to an author or rights holder. Unlike the famous works highlighted here, the vast majority of works from 1957 do not retain commercial value, but they are presumably off limits to users who do not want to risk a copyright lawsuit. This means that no one is benefiting from continued copyright, while the works remain both commercially unavailable and culturally off limits. The public loses the possibility of meaningful access for no good reason.

You can read more about the current costs associated with orphan works – works that are still presumably under copyright, but with no identifiable or locatable copyright holder. Importantly, the US Copyright Office has renewed its efforts to find solutions to the orphan works problem.
http://web.law.duke.edu/cspd/publicd.../2014/pre-1976
















Until next week,

- js.



















Current Week In Review





Recent WiRs -

December 28th, December 21st, December 14th, December 7th

Jack Spratts' Week In Review is published every Friday. Submit letters, articles, press releases, comments, questions etc. in plain text English to jackspratts (at) lycos (dot) com. Submission deadlines are Thursdays @ 1400 UTC. Please include contact info. The right to publish all remarks is reserved.


"The First Amendment rests on the assumption that the widest possible dissemination of information from diverse and antagonistic sources is essential to the welfare of the public."
- Hugo Black