P2P-Zone  

25-09-13, 07:10 AM
JackSpratts
 
Join Date: May 2001
Location: New England
Posts: 10,013
Peer-To-Peer News - The Week In Review - September 28th, '13

Since 2002

"The report seems to be quite a car crash, ignoring evidence when it doesn't suit, and relying on anecdotes when it does. It is tremendously one-sided." – Jim Killock


"It was way easier than expected." – Starbug


September 28th, 2013




The Hole in Our Collective Memory: How Copyright Made Mid-Century Books Vanish

A book published during the presidency of Chester A. Arthur has a greater chance of being in print today than one published during the time of Reagan.
Rebecca J. Rosen

Last year I wrote about some very interesting research being done by Paul J. Heald at the University of Illinois, based on software that crawled Amazon for a random selection of books. At the time, his results were only preliminary, but they were nevertheless startling: There were as many books available from the 1910s as there were from the 2000s. The number of books from the 1850s was double the number available from the 1950s. Why? Copyright protections (which cover titles published in 1923 and after) had squashed the market for books from the middle of the 20th century, keeping those titles off shelves and out of the hands of the reading public.

Heald has now finalized his research and the picture, though more detailed, is largely the same: "Copyright correlates significantly with the disappearance of works rather than with their availability," Heald writes. "Shortly after works are created and proprietized, they tend to disappear from public view only to reappear in significantly increased numbers when they fall into the public domain and lose their owners."

The graph above shows the simplest interpretation of the data. It reveals, shockingly, that there are substantially more new editions available of books from the 1910s than from the 2000s. Editions of books that fall under copyright are available in about the same quantities as those from the first half of the 19th century. Publishers are simply not publishing copyrighted titles unless they are very recent.

But this isn't a totally honest portrait of how many different books are available, because many public-domain books exist in multiple editions, and the random sample is likely to overrepresent them. "After all," Heald explains, "if one feeds a random ISBN number [into] Amazon, one is more likely to retrieve Milton's Paradise Lost (with 401 editions and 401 ISBN numbers) than Lorimer's A Wife out of Egypt (1 edition and 1 ISBN)." He found that public-domain titles had a median of four editions each. (The mean was 16, but it was badly distorted by a handful of books with hundreds of editions, which is why the statisticians Heald consulted recommended the median.) Heald therefore divided the number of public-domain editions by four, yielding a graph that compares the number of distinct titles available.
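To make the correction concrete, here is a minimal Python sketch. The titles, counts, and cutoff year are invented stand-ins; the study's actual input was a random crawl of Amazon ISBNs.

```python
from collections import Counter
from statistics import median

# Hypothetical random-ISBN sample: (title, decade, in_public_domain).
# These rows are invented for illustration only.
sample = [
    ("Paradise Lost", 1660, True), ("Paradise Lost", 1660, True),
    ("A Wife out of Egypt", 1910, True),
    ("Some 1980s Novel", 1980, False),
]

# Raw edition counts per decade overweight public-domain books,
# because a popular old title can carry hundreds of ISBNs.
editions_per_decade = Counter(decade for _, decade, _ in sample)

# Median editions per public-domain title (Heald measured 4; the mean
# of 16 was skewed by a few titles with hundreds of editions).
editions_per_pd_title = Counter(t for t, _, pd in sample if pd)
m = median(editions_per_pd_title.values())

# Divide pre-1923 (public-domain) edition counts by the median to
# estimate distinct titles rather than editions.
titles_per_decade = {
    decade: count / m if decade < 1923 else count
    for decade, count in editions_per_decade.items()
}
print(titles_per_decade)
```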

Heald says the picture is still "quite dramatic." The most recent decade looks better by comparison, but the depression of the mid-20th century is still notable, followed by a boom for the oldest decades, whose works have fallen into the public domain. Presumably, as Heald writes, in a market with no copyright distortion these graphs would show "a fairly smoothly downward sloping curve from the decade 2000-2010 to the decade of 1800-1810 based on the assumption that works generally become less popular as they age (and therefore are less desirable to market)." But that's not at all what we see. "Instead," he continues, "the curve declines sharply and quickly, and then rebounds significantly for books currently in the public domain initially published before 1923." Heald's conclusion? Copyright "makes books disappear"; its expiration brings them back to life.

The books that are the worst affected by this are those from pretty recent decades, such as the 80s and 90s, for which there is presumably the largest gap between what would satisfy some abstract notion of people's interest and what is actually available. As Heald writes:

This is not a gently sloping downward curve! Publishers seem unwilling to sell their books on Amazon for more than a few years after their initial publication. The data suggest that publishing business models make books disappear fairly shortly after their publication and long before they are scheduled to fall into the public domain. Copyright law then deters their reappearance as long as they are owned. On the left side of the graph before 1920, the decline presents a more gentle time-sensitive downward sloping curve.

But even this chart may understate the effects of copyright, since the comparison assumes that the same quantity of books has been published every decade. This is of course not the case: increasing literacy coupled with technological efficiencies means that far more titles are published per year in the 21st century than in the 19th. The exact number per year for the last 200 years is unknown, but Heald and his assistants were able to arrive at a good approximation by relying on the number of titles available for each year in WorldCat, a library catalog that contains the complete listings of 72,000 libraries around the world. He then normalized his graph to the decade of the 1990s, which saw the greatest number of titles published.
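The normalization itself is simple arithmetic. A sketch of the idea, with invented counts standing in for Heald's Amazon sample and the WorldCat publication figures:

```python
# Invented counts, for illustration only: roughly matching the article's
# claim that the 1880s and 1980s look similar on Amazon even though
# about eight times as many books were published in the 1980s.
amazon_titles   = {1880: 90, 1950: 40, 1980: 95, 1990: 300}
worldcat_titles = {1880: 1_000, 1950: 3_000, 1980: 8_000, 1990: 10_000}

# Availability as a share of what was actually published...
share = {d: amazon_titles[d] / worldcat_titles[d] for d in amazon_titles}

# ...then rescaled so the 1990s, the biggest publishing decade, equal 1.0.
normalized = {d: s / share[1990] for d, s in share.items()}
print(normalized)  # the 1880s come out several times "more available"
```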

By this calculation, the effect of copyright appears extreme. Heald says that the WorldCat research showed, for example, that there were eight times as many books published in the 1980s as in the 1880s, but there are roughly as many titles available on Amazon for the two decades. A book published during the presidency of Chester A. Arthur has a greater chance of being in print today than one published during the time of Reagan.

Copyright advocates have long (and successfully) argued that keeping books copyrighted assures that owners can make a profit off their intellectual property, and that that profit incentive will "assure [the books'] availability and adequate distribution." The evidence, it appears, says otherwise.
http://www.theatlantic.com/technolog...vanish/278209/





Downloading Is Mean! Content Industry Drafts Anti-Piracy Curriculum for Elementary Schools
David Kravets

Listen up children: Cheating on your homework or cribbing notes from another student is bad, but not as bad as sharing a music track with a friend, or otherwise depriving the content industry of its well-earned profits.

That’s one of the messages in a new school curriculum being developed with the Motion Picture Association of America, the Recording Industry Association of America and the nation’s top ISPs, in a pilot project to be tested in California elementary schools later this year.

A near-final draft of the curriculum, obtained by WIRED, shows that it comes in different flavors for every grade from kindergarten through sixth, to keep pace with your developing child’s ability to understand that copying is theft, period.

“This thinly disguised corporate propaganda is inaccurate and inappropriate,” says Mitch Stoltz, an intellectual property attorney with the Electronic Frontier Foundation, who reviewed the material at WIRED’s request.

“It suggests, falsely, that ideas are property and that building on others’ ideas always requires permission,” Stoltz says. “The overriding message of this curriculum is that students’ time should be consumed not in creating but in worrying about their impact on corporate profits.”

The material was prepared by the California School Library Association and the Internet Keep Safe Coalition in conjunction with the Center for Copyright Information, whose board members include executives from the MPAA, RIAA, Verizon, Comcast and AT&T.

Each grade’s material includes a short video, and comes with a worksheet for teachers to use that’s packed with talking points to share with students.

In one video, an entrepreneurial schoolyard artist finds her business selling dragon drawings ruined after a fellow third-grader takes a photo with her cell phone.

In the sixth-grade version, (.pdf) teachers are asked to engage students with the question: “In school, if we copy a friend’s answers on a test or homework assignment, what happens?”

The answer is, you can be suspended from school or flunk the test. The teachers are directed to tell their students that there are worse consequences if they commit a copyright violation.

“In the digital world, it’s harder to see the effects of copying, even though the effects can be more serious,” the teacher worksheet says.

The material is silent on the concept of fair use, a legal doctrine that allows for the reproduction of copyrighted works without the rights holder’s permission. Instead, students are told that using works without permission is “stealing.”

“Justin Bieber got started singing other people’s songs, without permission, on YouTube. If he had been subjected to this curriculum, he would have been told that what he did was ‘bad,’ ‘stealing,’ and could have landed him in jail,” says Stoltz.

“We’ve got some editing to do,” concedes Glen Warren, vice president of the California School Library Association, the non-profit that helped produce the material with the Internet Keep Safe Coalition and industry.

The Internet Keep Safe Coalition is a non-profit partnering with various governments and some of the nation’s biggest corporate names like Google, Microsoft, Facebook, Target, Xerox, HP and others.

Its president, Marsali Hancock, says fair use is not a part of the teaching material because students in kindergarten through sixth grade don’t have the ability to grasp it.

The curriculum, she said in a telephone interview, “is developmentally consistent with what children can learn at specific ages.”

She said the group will later develop material for older kids that will discuss fair use.

A 45-second video for second graders shows a boy snapping pictures and deciding whether to sell, keep or share them.

“You’re not old enough yet to be selling your pictures online, but pretty soon you will be,” reads the accompanying text in the teacher’s lesson plan. “And you’ll appreciate if the rest of us respect your work by not copying it and doing whatever we want with it.”

Hancock said the lessons were developed with “literacy experts,” and that some of the wording and kinks may still need to be ironed out.

She said the material has not yet been approved by the Center For Copyright Information, the group that commissioned the curriculum.

The Center for Copyright Information is best known for working with the White House and rights holders to forge an internet monitoring program with some of the nation’s biggest ISPs. That program provides for extrajudicial punishment of internet users who download copyrighted works without permission. Commenced earlier this year, the program’s punishment for repeat violators includes temporary internet termination and throttling connection speeds.

Hancock said the center is expected to be briefed on the proposed curriculum — dubbed “Be a Creator” — perhaps as early as this week.

The center’s executive director, Jill Lesser, told a House subcommittee Wednesday that she hoped the program would be integrated into “schools across the country.”

She testified that it’s best to attack piracy through youth education.

“Based on our research, we believe one of the most important audiences for our educational efforts is young people. As a result, we have developed a new copyright curriculum that is being piloted during this academic year in California,” according to her testimony.

“The curriculum introduces concepts about creative content in innovative and age-appropriate ways. The curriculum is designed to help children understand that they can be both creators and consumers of artistic content, and that concepts of copyright protection are important in both cases,” Lesser testified.

She said the CCI’s board is expected to sign off on the program soon, although she cautioned that it currently is in “draft” form.

“We are just about to post those materials in the next week or two on our web site,” Lesser said in a telephone interview.

Gigi Sohn, the president of Public Knowledge and an adviser to the CCI, declined to comment because she said she hasn’t seen the curriculum.

Overall, the curriculum’s message is anything but “sharing is caring.”

“We all love to create new things—art, music, movies, paper creations, structures, even buildings! It’s great to create — as long as we aren’t stealing other people’s work. We show respect for other artists and their work when we get permission before we use their work,” according to the message to first graders. (.pdf) “This is an important part of copyright. Sharing can be exciting and helpful and nice. But taking something without asking is mean.”

The fifth-grade lesson introduces the Creative Commons license, under which rights holders grant limited permission for re-use. But even in explaining Creative Commons, the lesson says that it’s illegal to make any copies of copyrighted works. That’s a message that essentially says it’s unlawful even to rip CDs to your iPod.

“If a song or movie is copyrighted, you can’t copy it, download it, or use it in your own work without permission,” according to the fifth-grade worksheet. “However, Creative Commons allows artists to tell users how and if their work can be used by others. For example, if a musician is okay with their music being downloaded for free — they will offer it on their website as a ‘Free download.’ An artist can also let you know how you can use their work by using a Creative Commons license.”

Warren, of the library association, agreed that it’s incorrect to tell students they can never use copyrighted works without permission, as the fifth-grade worksheet says. He said some of the package’s language has been influenced by the rights holders on the Center for Copyright Information.

“We’re moving along trying to get things a little closer to sanity,” Warren said in a telephone interview. “That tone and language, that came from that side of the fence, so to speak.”
http://www.wired.com/threatlevel/201...ol-propaganda/





Record Label Picks Copyright Fight — With The Wrong Guy
Laura Sydell

An Australian record label may have picked a fight with the wrong guy. The label sent a standard takedown notice threatening to sue after YouTube computers spotted its music in a video.

It turns out that video was posted by one of the most famous copyright attorneys in the world, and Lawrence Lessig is suing back.

Lessig, a Harvard Law School professor, has lectured around the world about how copyright law needs to adapt to the Internet age. In his lecture, he shows examples of people who have used the Internet to "share their culture and remix other people's creations."

One of the examples he likes to show is a series of remixes that use the song "Lisztomania" by the French band Phoenix. Someone remixed that song with clips from the iconic '80s movie The Breakfast Club. The remix went viral and inspired other videos in which people pretended to be Breakfast Club actors dancing to the song.

Copyright Vs. Fair Use

Lessig posted his lecture on YouTube, which uses a technology that scans videos to find copyrighted songs.

Many labels and artists have agreed to let songs stay up in return for a cut of the money that YouTube gets from ads it runs with the videos — but some labels, like Melbourne-based Liberation Music, which owns the rights to "Lisztomania," just want them taken down.

One day, "the computer bots finally got around to noticing that I had used a clip from this song," he says. "Liberation Music then fired off threats of a lawsuit to me if I didn't take it down."

At first, YouTube took it down. But being a copyright attorney, Lessig knew his rights. He was entitled to use these clips in a lecture under a legal doctrine known as fair use.

"If I'm using it for purposes of critique, then I can use if even if I don't have permission of the original copyright owner," he says.

Liberation Music eventually backed down. But Lessig decided to invoke another part of the copyright law, "which basically polices bad-faith lawsuits," he says — threats made fraudulently or without proper basis.

Lessig is suing Liberation Music because he wants labels to stop relying on automated systems to send out takedown notices, he says.

Afraid To Fight Back

Liberation Music did not respond to NPR's numerous requests for comment, but this kind of takedown notice is fairly common, says Corynne McSherry, an attorney with the Electronic Frontier Foundation, a nonprofit digital rights group, who is representing Lessig.

"I get contacted all the time by folks who have had their material taken down," she says. "And often I'll go and I'll take a look at what's taken down, and it's clearly ridiculous."

The problem is that a lot of those people are afraid to fight back. If they lose, they might have to pay up to $150,000 a song, McSherry says. "And for most regular people that's a pretty scary possibility."

It certainly was to Bob Cronin, a DJ living in Atlanta who creates mashups of songs and occasionally posts them online. His mashups mix together clips from different bands — like the Beastie Boys and the Beatles and Jay Z — and sometimes he adds electronic drumbeats.

He says he's "really just trying to make music that's fun and surprising" for people, where they recognize tracks they like.

It was fun until Cronin got a notice — take it down or be sued.

"Basically, I was scared," he says. "When you're a guy like me you have everything to lose, and you really are not going to get a lot back from fighting Warner Bros. or something."

While it's somewhat debatable as to whether Cronin's mashups are protected speech, there isn't much doubt that Lessig's lecture is a fair use.

"What we've got is this computerized system threatening people about content that's on the Web, much of it legally on the Web," Lessig says.

The problem, he says, is the impact: "what we think of as a very significant chilling of completely legitimate and protected speech."

Lessig hopes his suit will set a precedent that will persuade copyright holders to put human beings who know the law back into the equation.
http://www.npr.org/blogs/alltechcons...-the-wrong-guy





Google Removes ‘BitTorrent’ From Piracy Search Filter
Ernesto

To reduce online piracy Google has implemented several changes to its search engine in recent years. Among other things, the search engine blacklisted dozens of piracy-related terms from appearing in its Autocomplete and Instant services. Both ‘BitTorrent’ and ‘uTorrent’ were included from the start, but TorrentFreak has learned that Google recently unbanned these keywords, resulting in a sharp increase in search traffic.

For two years Google has been filtering “piracy-related” terms from its ‘Autocomplete’ and ‘Instant’ services.

Google users searching for terms like “The Pirate Bay”, “RapidShare” and “isoHunt” will notice that no suggestions or search results appear before they type in the full word. While no webpages are removed from Google’s index, there is a sharp decrease in searches for these terms.

What triggers a keyword to be included in the blacklist is not clear. A Google spokesperson told TorrentFreak two months ago that they remove terms that are “closely associated with piracy” without providing further details.

The full list of banned words also remains secret, but we do know that the search terms BitTorrent and uTorrent were included from the start. Both words are trademarks of San Francisco-based BitTorrent Inc. and the company was rather disappointed that Google labeled them as “piracy related.”

Over the past several months BitTorrent Inc. has continuously emphasized that BitTorrent does not equal piracy, and a recent upgrade to Google’s search filter shows that this effort has paid off. Both BitTorrent and uTorrent are now absent from Google’s piracy filter, and as a result searches for both terms have spiked, resulting in an increase in visitors to the respective sites.

“This is almost certainly a result of that improving understanding helped by products like BitTorrent Bundle and BitTorrent Sync. They help those who are confused about BitTorrent understand that it is not a piracy website,” a BitTorrent Inc. spokesperson told TorrentFreak.

As far as we’re aware this is the first time that Google has removed terms from its search filter. Interestingly, Megaupload still remains blocked even though the site has been offline for nearly two years.

Unfortunately, the reasons for including or removing particular terms remain a mystery. Recently Google added the name of the popular music streaming service Grooveshark, which has had its fair share of legal troubles in recent years but is currently licensed by several of the major labels.

While some people worry about possible over-blocking, the copyright holders have been arguing the opposite. Just last week the MPAA released a report claiming that Google and other search engines are major piracy facilitators, and that they should step up their anti-piracy efforts.

It’s now up to Google to find a balance between these two forces, which may prove to be quite a challenge.
http://torrentfreak.com/google-remov...filter-130924/





MPs: Google Blocks Child Abuse Images, it Should Block Piracy Too
Nicole Kobie

If Google can block child abuse images, it can also block piracy sites, according to a report from MPs.

MPs said they were "unimpressed" by Google's "derisorily ineffective" efforts to battle online piracy, according to a Commons Select Committee report looking into protecting creative industries.

The report said that proposals in the Hargreaves review to introduce copyright exceptions, and the failure to roll out the Digital Economy Act were risking the "livelihoods of the individuals and industries" that create content.

However, the strongest condemnation was for Google, which was criticised for not doing a better job of filtering out piracy sites from search results.

"We strongly condemn the failure of Google, notable among technology companies, to provide an adequate response to creative industry requests to prevent its search engine directing consumers to copyright-infringing websites," the report said. "We are unimpressed by their evident reluctance to block infringing websites on the flimsy grounds that some operate under the cover of hosting some legal content."

"The continuing promotion by search engines of illegal content on the internet is unacceptable," it added. "So far, their attempts to remedy this have been derisorily ineffective."

The report said it wasn't "beyond the wit of the engineers employed by Google and others to demote and, ideally, remove copyright infringing material from search engine results".

Indeed, John Whittingdale MP, the chair of the Committee - and also a non-executive director at Audio Network, an online music catalogue - noted that Google manages to remove other illegal content. "Google and others already work with international law enforcement to block for example child porn from search results and it has provided no coherent, responsible reason why it can't do the same for illegal, pirated content," he said.

Google disagreed that it wasn't doing enough. "We removed more than 20 million links to pirated content from our search results in the last month alone," a spokesperson said. "But search is not the problem - according to Ofcom just 8% of infringers in the UK use Google to find unlicensed film and 13% to find unlicensed music. Google works harder than anyone to help the film and music industry protect their content online."

Call for changes

The MPs' report called for a "powerful champion of IP" within government, suggesting that should be the role of the Intellectual Property Office, but that body is "too often seen as wishing to dilute copyright rather than defend and enforce it".

As well, the MPs want the IPO to include more research into online piracy in its annual report and to examine how search engines "facilitate" it, and for the maximum penalty for serious online copyright theft to be increased to ten years.

The roll out of the controversial and delayed Digital Economy Act should be accelerated, the MPs said, especially the sections requiring ISPs to send warning letters to customers seen to be downloading content illegally.

The report also disagreed with the generally well-received Hargreaves review into intellectual property - even disagreeing with its advice to allow "private copying", such as ripping from CDs. The Committee report said such activity is not "factored into the purchase either of music or devices that store, play or copy it".

Other options

The MPs' report also takes a swipe at digital activists the Open Rights Group (ORG). "While we share the Open Rights Group’s attachment to freedom of expression via the internet, we firmly repudiate their laissez-faire attitudes towards copyright infringement," the report said.

However, ORG director Jim Killock said the report was flawed. "The report seems to be quite a car crash, ignoring evidence when it doesn't suit, and relying on anecdotes when it does," he told PC Pro. "It is tremendously one-sided, given the breadth and importance of these issues to the whole of industry and society, well beyond the perspective of music and film lobby groups, objecting to simple things like legalising transfer of MP3s to iPods."

Indeed, the report quotes industry figures suggesting online piracy of film and music costs £400 million annually, and that 35% of movies watched online are downloaded illegally.

"These industry figures were questioned by the Open Rights Group, and Viscount Younger of Leckie stated they were not based on exact science," the report admitted. "Such quibbles in our view, however, should not detract from the existential threat that online piracy clearly poses to the creative economy."

Killock suggested the MPs had themselves missed the wider picture. "They seem to have missed the real story, which is that industry has succeeded with new services like Netflix and Lovefilm by making more flexible, consumer friendly deals, like showing Breaking Bad at nearly the same time as in the USA," he said.
http://www.pcpro.co.uk/news/384421/m...ock-piracy-too





Study: 99.7 Percent of Files Shared On Torrent Networks Are Illegal
Andre Yoskowitz

The RIAA and MPAA have begun distributing new research about the damage piracy does to the entertainment industry, trying to get Congress to take another look at potential legislation on the matter.

Additionally, the trade groups are once again accusing search engines of not doing enough to direct traffic away from well-known pirate file sharing sites.

"We invite Google and the other major search engines to sit down with us to formulate a plan that goes beyond promises of action and actually serves its intended purpose of deterring piracy and giving the legitimate marketplace an environment to thrive," RIAA Chairman Cary Sherman told a House panel (via 3N).

MPAA boss Christopher Dodd added, "as the internet's gatekeepers, search engines share a responsibility to play a constructive role in not directing audiences to illegitimate content".

Legislators have shied away from sweeping legislation following last year's try at SOPA, which was met with massive criticism from the general public and from major corporations. Smaller measures have been taken, such as credit card processors like Visa and Mastercard blocking access to businesses that thrive off piracy, and ad networks have also pulled business from warez sites. Google also tweaked its algorithms to give 'less visibility' to certain sites.

One of the newly cited surveys is from Comcast, which used the digital brand monitoring company NetNames. That study, which came from a review of the top 12,500 files out of 3.5 million on a public BitTorrent tracker, says 99.7 percent were illegal/unauthorized. Of course, that sample size is small and skewed, but it should get some heads to turn.
http://www.afterdawn.com/news/articl..._are_illegal





It’s Easier to Share Files Between Phones
Hiawatha Bray

How do you get two smartphones to talk to each other? That’s not a trick question. Try sharing your vacation photos with your best friend, or handing off an important business document. The clumsy process of moving files between phones ought to be child’s play, but isn’t.

Now it may soon be, thanks to some recent moves by Google Inc. and archrival Apple Inc.

Most of us do phone-to-phone sharing the hard way. IPhones, as well as smartphones running Google’s Android operating system, give you the option to mail files to your friends, or share them through social media networks like Facebook. You can also join cloud storage services like Dropbox and Microsoft Corp.’s SkyDrive. Then you can upload your files to one of these services and send a link to your friends, so they can download a copy.

But Google just spent something like $30 million to acquire Bump, a start-up company with a software app that sharply simplifies the process. Load Bump onto your phone and you can move files to another phone by simply tapping the two devices together.

Meanwhile Apple is rolling out iOS 7, the latest update for the software that drives iPhones and iPads. One new feature, AirDrop, is the mobile version of a wireless file transfer system that came to Apple’s Mac computers in 2011. AirDrop identifies nearby Apple devices, so users can easily send copies of photos or other documents.

AirDrop works by creating a temporary network between iOS devices, using either Wi-Fi or Bluetooth wireless networking. The “share” button in the new software lets you single out a particular nearby iGadget and transmit your data with a simple tap. It’s a big step forward for iPhones; up to now you have needed to purchase a separate app for this feature, which isn’t built into earlier versions of iOS.

But it’s old news for Android users. Their phones have long been able to swap files via Bluetooth. You just pair up your phone with that of your friend. Then open the file, touch the “share” icon, select the Bluetooth option, and send the file to your buddy.

Microsoft’s Windows Phone 8 devices and BlackBerry phones also support file swapping via Bluetooth.

If that’s too much work, many Android phones also have NFC or “near-field communications.” That’s a chip that comes to life when it’s in range of another NFC device. Phones with NFC and newer versions of Android software can swap files using a feature called Android Beam. Just touch one phone to the other, back to back. Then the sender taps his screen to send the file. Android Beam creates a temporary Bluetooth connection between the phones, and swaps the file.

Bluetooth was never designed for moving large files. So Samsung Corp.’s Galaxy S III and S 4 employ an even slicker system called S Beam, which uses Wi-Fi instead of Bluetooth. If both phones are Samsungs with S Beam, wireless file transfers will happen much faster.

None of these gimmicks work with iPhones because they don’t use NFC chips. But every iPhone does contain an accelerometer. That’s the motion-detection chip that rotates the image on the screen when you hold the phone sideways. That’s what Bump uses to transfer files between iPhones, and Androids, for that matter.

With the free Bump app loaded on each device, one user can transfer files to another simply by tapping the two phones together; the data moves painlessly and almost instantly. And Bump is excellent for sharing between iPhones and Android; the software doesn’t care.

Bump also doesn’t care whether the other device is a PC or laptop. Just direct the computer’s browser to a website called bu.mp and tap your Bump-equipped phone against the computer’s spacebar. Your files are uploaded from the phone to Bump’s remote server; click a download link and they will be compressed and transferred to the PC.

According to the Pew Internet and American Life Project, 82 percent of cellphone owners take photos with their phones, but only 34 percent share those photos with others, maybe because it’s too much work. But not any more.
http://www.bostonglobe.com/business/...1HN/story.html





Pirate Bay Swede's Hacking, Fraud Sentence Reduced
AP

A Swedish court has dismissed part of the hacking and fraud charges against the founder of the popular file-sharing website Pirate Bay and reduced his prison sentence from two years to one.

The Svea Court of Appeal on Wednesday dismissed the cases against Gottfrid Svartholm Warg relating to the hacking of Nordea Bank AB, saying it could not be ruled out that others might have remotely accessed his computer, as he has claimed. It upheld his convictions for hacking into the servers of two other companies that handle sensitive information for Sweden's police force and tax authority.

The 29-year-old Svartholm Warg was detained in Cambodia in September 2012 and deported to Sweden.

In 2009, a Swedish court gave him and three Pirate Bay colleagues one-year sentences for copyright violation.
http://www.nytimes.com/aponline/2013...n-hacking.html





Bypassing TouchID Was “No Challenge At All,” Hacker Tells Ars

German hacker Starbug tells Ars how he bypassed the fingerprint lock on new iPhones.
Dan Goodin

Ars expressed surprise on Monday that a hacker was able to bypass fingerprint protection less than 48 hours after its debut in Apple's newest iPhone, but not everyone felt the same way. The hack, carried out by well-known German hacker Starbug, required too much expertise and pricey equipment to make it practical, according to critics.

Marc Rogers, a security expert at smartphone security firm Lookout, was among the skeptics. After independently devising his own bypass of Apple's Touch ID, he concluded that it was anything but easy. "Hacking Touch ID relies upon a combination of skills, existing academic research, and the patience of a Crime Scene Technician," he wrote. Rogers went on to say that no one would know just how feasible Starbug's hack was until he released a step-by-step video and we learned more technical details.

We now have both. Heise Online has posted the video, and it was enough to satisfy Rob Graham, a security expert who donated $500 to the first person to hack Touch ID. Ars has also heard directly from Starbug, who (like us and several security experts) was surprised by how little time and effort his bypass required.

It "was way easier than expected," he wrote in an e-mail. "I thought it would take at least a week and some fancy chip/bus hacking." It didn't require either.

What follows are his answers to questions Ars sent shortly after news of his hack broke Sunday night. The last question is a follow-up inquiry that came later. Because Starbug's first language is German and not English, some of his answers have been lightly edited for grammar and usage.

Was there something you wanted to prove by going after Touch ID? If yes, what was it, and how exactly does the hack go about proving it?

Like for the last 10 years, what I wanted to show is that there are no fingerprint systems that could not be fooled. But mostly I did it for the fun. Or in other words, because I can.

In the past, you've been critical of the way many people attempt to use fingerprints and other biometrics. Is that still the case? Why would you be critical of Apple? Touch ID isn't mandatory, and the fingerprint is just a substitute for a four-digit PIN.

I am not critical of Apple. The only thing you can [criticize] them [for] is that they advertised Touch ID as safe, even though they knew that it would be hacked over [the] short or long [term]. Compared to using no PIN at all, fingerprint [scanning] is already a [benefit]. I think in general, the use of biometrics for the automatic recognition of people [is] problematic, especially when, for example, face recognition is performed without the person[-'s involvement].

How long did it take for you to bypass Touch ID? Was there anything that you found hard or challenging about the hack? Was there anything about Touch ID that you think was well engineered or well implemented?

It took me nearly 30 hours from unpacking the iPhone to a [bypass] that worked reliably. With better preparation it would have taken approximately half an hour. I spent significantly more time trying to find out information on the technical specification of the sensor than I actually spent bypassing it.

I was very disappointed, as I hoped to hack on it for a week or two. There was no challenge at all; the attack was very straightforward and trivial.

The Touch ID is nevertheless a very reliable fingerprint system. However, users should only consider it an increase in convenience and not security.

How feasible is the hack that you came up with? Is it something anyone can do, or is it something that only talented hackers with a fair amount of skill and expensive equipment can pull off?

It's very easy. You basically can do it at home with inexpensive office equipment like an image scanner, a laser printer, and a kit for etching PCBs. And it will only take you a couple of hours. The techniques are actually several years old and are readily available on the Internet.

Many people said the sensor on Touch ID scanned fingers at a sub-epidermal level and that this would prevent fingerprint films like the one you used from working. That appears to have been wrong, correct? If so, why? What allowed your technique to work?

I wasn't actually able to find sufficient details on how the sensor works. I do assume they use sub-epidermal scanning. However, the scanned tissue is too similar to the upper layers of the skin. The most likely issue is the arbitrary threshold that Apple chose. They had to ensure that their setting works reliably, i.e. it shouldn't need to scan [a user's] finger twice because the sensor rejected the first attempt. Put simply, they chose usability and convenience over security. Hence, the fingerprint sensor can always be defeated as long as the materials used for the fake are sufficiently close to the characteristics of human tissue, and as long as a high-resolution scan of the fingerprint is available.

It is also important to have in mind that personal devices like the iPhone are covered in fingerprints that can be used to produce a fake. Other everyday objects, such as glasses, fall into this category as well. The problem with your fingerprints is that you leave them everywhere. It's akin to writing your password on a post-it note and leaving it everywhere you go.

It seems like authentication in general is becoming more and more vulnerable. We see passwords and PINs becoming increasingly weak. Many people don't trust RSA's SecurID. Is there a form of authentication that you think is better than passwords, physical tokens, or biometrics? What is it? What needs to happen for it to become something people use to unlock their iPhones or log in to Gmail or other online services?

Passwords are no problem at all as long as they are long enough and someone had a look into the algorithms [used to store them] and their implementation. In fact, long, complex passwords, which can also be configured on iOS devices, offer a sufficient level of security. The problem is finding the right balance between convenience for the user and security. No normal person wants to be confronted with a 20-character password every single time they want to do something on their phone. On the other hand, today's smartphones contain a great amount of personal data where many would say that even a four-digit [PIN] is also insufficient.
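Starbug's point about length is easy to quantify with the standard entropy formula (this is textbook arithmetic, not anything from the interview): a password chosen uniformly at random from an alphabet of A characters, with length L, carries about L·log2(A) bits.

```python
import math

def entropy_bits(length: int, alphabet_size: int) -> float:
    """Bits of entropy for a password chosen uniformly at random."""
    return length * math.log2(alphabet_size)

print(entropy_bits(4, 10))   # 4-digit PIN: ~13.3 bits
print(entropy_bits(20, 94))  # 20 printable-ASCII characters: ~131 bits
```

The gulf between those two numbers is exactly the convenience-versus-security trade-off he describes.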

Do you agree with what [Lookout security expert Marc Rogers] is saying in his blog post?

It's much easier. I guess the lifting is much less trouble than described. Best have a look at the video. I used just a scanner to lift the print of the thumb that enters the PIN. So the thumb left prints on the screen, and they could be easily lifted.
http://arstechnica.com/security/2013...ker-tells-ars/





News Corp Reveals Phone-Hacking Legal Costs of £238m
Gavriel Hollander

Rupert Murdoch’s media empire has spent $382m (£238m) over the past two years on legal fees dealing with the aftermath of the News of the World phone-hacking scandal.

In its annual report, released over the weekend, News Corp revealed that its total cost of legal and other professional fees relating to civil and criminal proceedings concerning its UK newspaper arm in the year to June 2013 had been $183m. This is in addition to the approximately $199m it spent in the year to June 2012.

The new figure suggests the cost of proceedings to Murdoch is continuing to rise, after it revealed earlier this year that the total spent to December 2012 was $340m.

News Corp further estimated that it could be liable for another $66m in costs this year as cases continue to be brought.

Overall, News Corp’s figures showed that the media giant had returned to profit in 2012/13. It reported a profit of $506m, compared to a $2.1bn loss last year (on a like-for-like basis). Total revenue was also up, from $8.7bn to $8.9bn.

The result was boosted by the $1.3bn the company gained, in a one-off accountancy treatment, following its acquisition of Fox Sport Australia owner Consolidated Media.

However, advertising revenue for its newspapers around the world continued to fall. It was down nearly 10 per cent to $3.9bn while its Australian newspaper arm experienced a 15 per cent drop in advertising income.

These are the first figures for News Corp since it was split from renamed TV and film entertainment division 21st Century Fox.

The accounts also reveal total remuneration for News Corp chief executive Robert Thomson of $2,661,463. This figure includes $992,308 salary for the former Times editor, a $1m bonus and an increase of $616,476 in the value of his pension.
http://www.pressgazette.co.uk/news-c...al-costs-£238m





Surface 2 Declassified: How Microsoft Made Surface Into the Tablet the World Said it Wanted

Microsoft's first Surface tablets left room for improvement. We sit down with Panos Panay and other members of the Surface team at Microsoft HQ to find out how the company's group of designers and engineers listened and regrouped to create the next generation.
Tim Stevens

Panos Panay and I stand on the third floor of Microsoft's Studio B, an unassuming building easily missed amid the company's sprawling Redmond campus. Panay, the Microsoft VP in charge of the Surface, looks a bit tired as he leans on a handrail and looks out across the inner courtyard of the building, the hub of Microsoft's ever-expanding and increasingly impressive hardware efforts. Still, he's visibly enthusiastic about the work he's about to show me.

A series of balconies frame a central meeting area below. To our left, a giant display ticks off the days until the Xbox One launch: 60-some and counting. "The Xbox One design work happened on the fourth floor," Panay tells me. Other than the odd mouse and keyboard, the rest of the building is dominated by the company's tablet efforts, highlighted by large "Surface" stickers plastered onto most of the inner-facing windows. These are proud territory markers.

The Surface project, which started in a small, windowless room on the ground floor, has now spread to take over nearly the entire, heavily secured building. The beeps of ID badge scanners and the clunks of heavy doors punctuate the daily routines of a team that, once a dozen employees, now numbers some 500 in Redmond alone and more than 1,000 internationally.

Panay gives me a lay of the land: design, engineering, supply chain -- all disciplines comingle in Studio B to foster open communication. The large, central area on the first floor is meant for quick meetings. Offices and cubicles fill the sides, and in the corners lie what Panay calls "vaults" -- secure rooms segmented internally with areas requiring even more-privileged access. The vaults are where the critical stuff happens.

It's into one of these vaults that Panay takes me, then farther into a secure lab within. It's clear the lab has been dressed up a bit for the visit. On white platforms around the edges lie Surface tablets and a phalanx of accessories -- rainbows of Touch and Type covers, a new dock, a new battery cover, a few exposed motherboards and battery packs, and, over in the corner connected to a JamBox, the DJ-friendly Surface Remix Project.

Panay briefly explains the lineup, detailing how the new $899 Surface Pro has improved internals for more performance and battery life, and that the $449 Surface 2 (nee RT) is thinner and lighter and faster and has a higher-res display. He highlights the new covers and other accessories and then, politely, excuses himself.

"Have fun. This is your lab now," he tells me on the way out the door, leaving me to play with the new toys his team has been developing in secrecy for over a year. He's obviously confident, and he has good reason to be.

History

It was Julie Larson-Green, executive vice-president of the Devices and Studios group at Microsoft and a potential heir to the Microsoft throne, who brought Panay onto the brand-new Surface team after she discovered his work on Microsoft's first Surface product, which he'd been a part of since 2008. Back then, the Surface was a big, smart table-shaped computer controlled by touch -- not today's thin laptop replacement. That Surface wowed nearly everyone who tried it, but its consumer applications were limited, to say the least.

When it came time to build the flagship, touch-based platform for Windows 8, Larson-Green knew who to call. Her pitch was simple and ultimately effective: "We're going to reinvent Windows and we need a showcase for the product. We need to do our own hardware to do that. You should come to Windows and build the Surface PC."

Even in the Microsoft tablet's infancy, the team called the project "Surface," though the group would cycle through many other options before finally deciding to keep the name and rebrand the smart tables as "PixelSense." (The leading Surface alternate, by the way, was "One." That title would go on to make a comfortable living at HTC.)

During the birth of the Surface, Steven Sinofsky was boss of Windows and a major Surface proponent. (He was also the man who brilliantly suggested putting a USB port on the Surface's AC adapter.) Sinofsky left Microsoft in November 2012, almost immediately after the tablet's debut. The public's initially cool response to the tablets and the sudden nature of his departure gave rise to no shortage of speculation, but the official word was that of an amicable parting of ways. "Steven did an awesome job of setting up our team," a Surface designer told us. "He's responsible for a lot of what we look like and how we work here. His presence is still felt."

Larson-Green now heads up the group that owns the Surface, also overseeing the launch of the Xbox One and every other piece of hardware that Microsoft makes -- and will make. If there's one thing the Surface team loves to tease but hates to discuss, it's future devices. "Surface 2 was being developed before the launch of Surface RT. That's something to frame," Panay told me with a hint of pride.

Those first 2012 Surfaces were impressive, the premier efforts of a fledgling hardware team, designed under the watchful, steel-rimmed eyes of Microsoft's Wolfsburg-raised and Bauhaus-minded creative director Ralf Groene. No one could find many faults in the design of those slates, nor in the spot-on concept of a productivity-focused tablet.

But the execution suffered shortcomings. Battery life on the Surface Pro disappointed. Many wanted more screen pixels and more apps for the RT. And then there was the kickstand and its critical flaw: you could hardly call the Surface a laptop replacement if you couldn't comfortably use it in your lap.

To prove that the new Surface 2 is far more lap-friendly, Groene sits me down in a black Eames lounge chair, itself one of the most iconic designs of all time. The subtly curvaceous wooden seat is comfortable and steeply reclined. The new Surface on my lap can be more steeply reclined, too, its kickstand offering an additional angle that is far more stable than before. However, it's still not exactly comfortable, with the thin kickstand cutting into my legs.

Feedback and revision

Panay can't suppress a big smile when I inquire about his two Reddit AMAs, in which he invited that vast yet remarkably civil and thoughtful community to ask him anything. That they did, and he and his team answered nearly every question -- "unless it was about our road map, for competitive reasons."

Each of Panay's posts garnered roughly 3,500 upvotes, a healthy number in what turned out to be an equally healthy discussion for Panay, who was inundated with many recurring demands. One of the clearest, he says, was, "We need to be able to attach a keyboard with a battery." This rather pointed request was music to his ears, as the team was already testing such a keyboard in the lab. Instantly, he knew their instincts were on-point.

Panay mentions requests for a better camera and a more adaptable kickstand as additional missions delivered by Redditors of the world, missions successfully completed by the Surface team with the introduction of the new Surface Pro 2 and Surface 2.

Of course, some less polite complaints about the tablets' shortcomings bubbled up among the helpful suggestions and polite requests for functionality. Battery life issues headlined and echoed the problems highlighted in most reviews, including my own and CNET's. These reactions were not a surprise to the Surface team. The designer I spoke with told me, "Our goal was just to make the best hardware out there. That's a pretty hefty goal....Coming into this market that's so saturated, I think we expected mixed reviews."

Battery life? Improved by up to 100 percent on the new Pro. Screen resolution? 1080p on the Surface 2. Lap-friendly kickstand? Fingerprint-resistant exterior? Better cameras? Check, check, and double-check.

And that's why Panos, Groene, Larson-Green, and everyone else I spoke with at Microsoft were so eager to point out all the many improvements made to the two new Surface tablets -- and the myriad accessories coming along for the ride.

Few, though, were more eager to talk innovative specifics than Steven Bathiche, the wild-haired Director of Research at Microsoft and a man who spends much of his time tinkering in a windowless laboratory he named after Thomas Edison.

In the corners of the Edison lab, in various states of disassembly, I found prototypes and production versions of the PixelSense table, a device covered by numerous patents in which Bathiche is named. A big, truly different product like that is easy to get excited about, but he's just as happy to talk about the many improvements he and his team contributed to the next-gen Surfaces.

First among Bathiche's fixes is the new Touch Cover, barely distinguishable from the previous version externally, yet vastly different on the inside. What was basically one sensor per key, about 80 total, is now an array of 1,100 discrete sensors that can detect exactly how hard your finger is pressing and where it landed -- even if it landed between keys. This enables gestures and a new level of accuracy that the original Surface lacked. Along the way, his team added backlit keys and increased the rigidity of the typing surface. "We went from 80 sensors to 1,100, we added a light guide, and it's thinner. And it's stiffer. That's cool," Bathiche says.
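Microsoft hasn't published how the cover's firmware interprets that array, but the basic idea of resolving a press that lands between keys is straightforward. A hypothetical sketch, not Microsoft's algorithm: treat the sensors as a pressure grid and take the pressure-weighted centroid.

```python
import numpy as np

# Hypothetical 20 x 55 grid (~1,100 sensors); the layout is invented.
pressure = np.zeros((20, 55))
pressure[9:12, 30:33] = [[0.2, 0.5, 0.2],   # one finger press straddling
                         [0.4, 1.0, 0.4],   # the gap between two keys
                         [0.1, 0.3, 0.1]]

total = pressure.sum()                      # proxy for how hard the press was
rows, cols = np.indices(pressure.shape)
centroid = ((rows * pressure).sum() / total,
            (cols * pressure).sum() / total)
print(centroid, total)                      # where it landed, and how firmly
```

With a single sensor per key, a press between two keys is ambiguous; with a dense grid, the centroid still resolves to one point, and force comes along for free.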

That is cool, and indeed many of the most interesting innovations in this new line of Surface tablets lie not in the devices themselves but in their accessories. But just as with the first Surface, these innovations run the risk of receiving a giant collective shrug from the public. People just don't get excited about accessories, regardless of how innovative. Microsoft doesn't include any of the keyboards in the price of either tablet. This lets users choose whether and which keyboard cover to purchase, but it also has the side-effect of relegating these devices to footnote status.

But the new Surface Pro's biggest improvement, battery life that lasts twice as long, also risks going unacknowledged. Double the battery life in a new machine would be a stunning development under normal circumstances, but Intel's Haswell CPUs have essentially made such improvements mandatory for x86 systems launched in 2013. We've come to expect it.

The Surface team insists it owes only half the new Surface's battery life improvement to the latest silicon from Intel. According to the team, Microsoft engineers toiled for months to optimize every driver and every internal component, measuring current to the closest microamp, reducing the number of low-power DDR3 chips, and making countless other tweaks. One thing they didn't change: the size of the batteries. They remain the same as before.

The critical question

One by one, the Microsoft team has checked off almost every upgrade dropped from the initial version of Surface because of cost or complexity. If you had a complaint about the original Surface hardware, chances are your concern has now been addressed. The new Surface devices are world-class; the Surface 2 sparks with great performance and a bright, 1080p, calibrated display. It looks and feels fantastic in the hand, and, at $449 for 32GB, is priced quite competitively with the $599 32GB iPad. The new Pro, meanwhile, has all that battery life and more performance to boot. That Microsoft pulled all this off in a relatively short period of time certainly is an accomplishment.

However, there's another, vastly important aspect of the Surface success equation: the software. Some of the most critical problems with the original slates were core aspects of Windows 8. The operating system is far and away the most finger-friendly Windows yet, but the need to frequently drop into desktop mode on the Pro raised a host of troublesome scaling issues. Those issues were less of a problem on the RT, but only thanks to the incompatibility with legacy apps. New apps have since marched steadily into the Windows app store, climbing to over 100,000 choices, but major gaps (Pandora, Rdio, Firefox, Chrome, YouTube, HBO Go, Facebook, to name a few) remain.

Windows 8.1 helps the scaling issue somewhat by adding features like discrete settings for external displays, and it finally allows developers to better tailor their desktop apps for tablet use. However, there continues to be a huge difference between the new Microsoft Design Language apps (the tiled interface formerly known as "Metro") and the traditional desktop that's been around since Windows 95. Moving between the two will still feel clumsy and disjointed. It's clear that these devices, which were meant to be a showcase for Windows, are evolving more rapidly and more progressively than the operating systems they run.

So there it lingers -- the critical question: Will the quality of the new Surface hardware pave over the kinks and gaps in the software? The world will need more time to experience both the new device and new revision of Windows together at length to make that call, but I feel comfortable putting a related doubt to rest.

If you feared Microsoft might drop the Surface RT after its initial sales struggles, every indication points in the opposite direction. After all, Microsoft calls the tablet simply "Surface 2" this time around. Larson-Green told us the Surface program is "incredibly important to the business." And, of course, Panay can't resist teasing about things to come:

"The team is over-excited. We have such a long road map ahead of us, and we know we're in this marathon. The team knows that. You start with your first generation of products, you put them out there, you know they're good. There are ways they can get better. Now the second generation comes, they only get more motivated and when you look at our road map to come."

A Surface designer confirmed that sentiment: "We're excited. Launch time is always a good time."

Still, this designer couldn't resist asking, multiple times and with a hint of unease, what I thought -- whether I was impressed by what I'd seen. I told her I was indeed impressed, impressed by the quality of the hardware and impressed by the dedication of the team. But in the end, of course, it's not whether I'm impressed. It's whether you are, dear reader.
http://news.cnet.com/8301-10805_3-57...aid-it-wanted/





Facebook, Other Banned Sites to be Open in China Free Trade Zone: Report

Facebook, Twitter and other websites deemed sensitive and blocked by the Chinese government will be accessible in a planned free-trade zone (FTZ) in Shanghai, the South China Morning Post reported on Tuesday.

Citing unidentified government sources, the Hong Kong newspaper also said authorities would welcome bids from foreign telecoms firms for licenses to provide Internet services in the zone.

China's ruling Communist Party aggressively censors the Internet, routinely deleting online postings and blocking access to websites it deems inappropriate or politically sensitive.

Facebook and Twitter were blocked by Beijing in mid-2009 following deadly riots in the western province of Xinjiang that authorities say were abetted by the social networking sites. The New York Times has been blocked since reporting last year that the family of then-Premier Wen Jiabao had amassed a huge fortune.

TEST BED

The recently approved Shanghai FTZ is slated to be a test bed for convertibility of China's yuan currency and further liberalization of interest rates, as well as reforms of foreign direct investment and taxation, the State Council, or cabinet, has said. The zone will be formally launched on September 29, the Securities Times reported earlier this month.

The idea of unblocking websites in the FTZ was to make foreigners "feel like at home", the South China Morning Post quoted a government source as saying. "If they can't get onto Facebook or read The New York Times, they may naturally wonder how special the free-trade zone is compared with the rest of China," the source said.

A spokesman for Facebook said the company had no comment on the newspaper report. No one at Twitter or the New York Times was immediately available to comment.

China's three biggest telecoms companies - China Mobile, China Unicom and China Telecom - have been informed of the decision to allow foreign competition in the FTZ, the sources told the newspaper.

The three state-owned companies had not raised complaints because they knew the decision had been endorsed by Chinese leadership including Premier Li Keqiang, who has backed the Shanghai FTZ, the sources added.

(Reporting by John Ruwitch, with additional reporting by Alexei Oreskovic; Editing by Ian Geoghegan)
http://www.reuters.com/article/2013/...98N04020130924





Ex-FBI Agent to Plead Guilty to Leak to Media -U.S. Justice Dept

A former FBI agent has agreed to plead guilty to leaking secret government information about a bomb plot to a news agency, a leak that Attorney General Eric Holder called one of the most serious in U.S. history, the Justice Department said on Monday.

As part of a plea agreement filed in U.S. District Court in Indiana, Donald John Sachtleben agreed to a prison sentence of three years and seven months for the leak in addition to a separate sentence for unrelated child pornography charges, the department said.

A lawyer for Sachtleben, 55, of Carmel, Indiana, did not immediately respond to a request for comment.

A story by the Associated Press in May 2012 described a U.S. operation in Yemen to foil a plot to bomb an airliner. The AP said it delayed publishing the story at the request of government officials until security concerns were allayed, but U.S. officials said the leak compromised a U.S. agent working to undermine the Islamic militant group al Qaeda in the Arabian Peninsula.

Two months later, Attorney General Eric Holder appointed a senior prosecutor to lead an investigation.

Sachtleben retired from the FBI in 2008, after about 25 years, according to the Justice Department. He continued to work on contract as a bomb analyst.

According to a copy of a plea agreement dated Sept. 6 and released on Monday, Sachtleben agreed to plead guilty to one count of unauthorized disclosure of national defense information and one count of unauthorized possession and retention of national defense information.

If accepted by a judge, the prison sentence would be the longest ever handed down in a civilian court for a leak of classified information to a reporter. (Reporting by David Ingram; Editing by Howard Goller, Sandra Maler and Eric Walsh)
http://www.trust.org/item/20130923202416-9ggug





NSA Employee Spied on Nine Women Without Detection, Internal File Shows

Twelve cases of unauthorised surveillance documented in letter from NSA's inspector general to senator Chuck Grassley
Paul Lewis

A National Security Agency employee was able to secretly intercept the phone calls of nine foreign women for six years without ever being detected by his managers, the agency's internal watchdog has revealed.

The unauthorised abuse of the NSA's surveillance tools only came to light after one of the women, who happened to be a US government employee, told a colleague that she suspected the man – with whom she was having a sexual relationship – was listening to her calls.

The case is among 12 documented in a letter from the NSA's inspector general to a leading member of Congress, who asked for a breakdown of cases in which the agency's powerful surveillance apparatus was deliberately abused by staff. One relates to a member of the US military who, on the first day he gained access to the surveillance system, used it to spy on six email addresses belonging to former girlfriends.

The letter, from Dr George Ellard, only lists cases that were investigated and later "substantiated" by his office. But it raises the possibility that there are many more cases that go undetected. In a quarter of the cases, the NSA only found out about the misconduct after the employee confessed.

It also reveals limited disciplinary action taken against NSA staff found to have abused the system. In seven cases, individuals guilty of abusing their powers resigned or retired before disciplinary action could be taken. Two civilian employees kept their jobs – and, it appears, their security clearance – and escaped with only a written warning after they were found to have conducted unauthorised interceptions.

The abuses – technically breaches of the law – did not result in a single prosecution, even though more than half of the cases were referred to the Department of Justice. The DoJ did not respond to a request for information about why no charges were brought.

The NSA's director, Gen Keith Alexander, referred to the 12 cases in testimony to a congressional hearing on Thursday. He told senators on the intelligence committee that abuse of the NSA's powerful monitoring tools were "with very rare exception" unintentional mistakes.

"The press claimed evidence of thousands of privacy violations. This is false and misleading," he said.

"According to NSA's independent inspector general, there have been only 12 substantiated case of willful violation over 10 years. Essentially, one per year."

He added: "Today, NSA has a privacy compliance program any leader of a large, complex organization would be proud of."

However, the small number of cases depicted in the inspector general's letter, which was published by Republican senator Chuck Grassley, could belie a far larger number that NSA managers never uncovered.

One of the cases emerged in 2011, when an NSA employee based abroad admitted during a lie-detector test that he had obtained details about his girlfriend's telephone calls "out of curiosity". He retired last year.

In a similar case, from 2005, an NSA employee admitted to obtaining his partner's phone data to determine whether she was "involved" with any foreign government officials. In a third, a female NSA employee said she listened to calls on an unknown foreign telephone number she discovered stored on her husband's cell phone, suspecting he "had been unfaithful".

In another case, from two years ago, which was only discovered during an investigation into another matter, a female employee of the agency confessed that she had obtained information about the phone of "her foreign-national boyfriend and other foreign nationals". She later told investigators she often used the NSA's surveillance tools to investigate the phone numbers of people she met socially, to ensure they were "not shady characters".

The case of the male NSA employee who spied on nine women occurred between 1998 and 2003. The letter states that the member of staff twice collected communications of an American, and "tasked nine telephone numbers of female foreign nationals, without a valid foreign intelligence purpose, and listened to collected phone conversations".
http://www.theguardian.com/world/201...-internal-memo





N.S.A. Gathers Data on Social Connections of U.S. Citizens
James Risen and Laura Poitras

Since 2010, the National Security Agency has been exploiting its huge collections of data to create sophisticated graphs of some Americans’ social connections that can identify their associates, their locations at certain times, their traveling companions and other personal information, according to newly disclosed documents and interviews with officials.

The spy agency began allowing the analysis of phone call and e-mail logs in November 2010 to examine Americans’ networks of associations for foreign intelligence purposes after N.S.A. officials lifted restrictions on the practice, according to documents provided by Edward J. Snowden, the former N.S.A. contractor.

The policy shift was intended to help the agency “discover and track” connections between intelligence targets overseas and people in the United States, according to an N.S.A. memorandum from January 2011. The agency was authorized to conduct “large-scale graph analysis on very large sets of communications metadata without having to check foreignness” of every e-mail address, phone number or other identifier, the document said. Because of concerns about infringing on the privacy of American citizens, the computer analysis of such data had previously been permitted only for foreigners.

The agency can augment the communications data with material from public, commercial and other sources, including bank codes, insurance information, Facebook profiles, passenger manifests, voter registration rolls and GPS location information, as well as property records and unspecified tax data, according to the documents. They do not indicate any restrictions on the use of such “enrichment” data, and several former senior Obama administration officials said the agency drew on it for both Americans and foreigners.

N.S.A. officials declined to say how many Americans have been caught up in the effort, including people involved in no wrongdoing. The documents do not describe what has resulted from the scrutiny, which links phone numbers and e-mails in a “contact chain” tied directly or indirectly to a person or organization overseas that is of foreign intelligence interest.

The new disclosures add to the growing body of knowledge in recent months about the N.S.A.’s access to and use of private information concerning Americans, prompting lawmakers in Washington to call for reining in the agency and President Obama to order an examination of its surveillance policies. Almost everything about the agency’s operations is hidden, and the decision to revise the limits concerning Americans was made in secret, without review by the nation’s intelligence court or any public debate. As far back as 2006, a Justice Department memo warned of the potential for the “misuse” of such information without adequate safeguards.

An agency spokeswoman, asked about the analyses of Americans’ data, said, “All data queries must include a foreign intelligence justification, period.”

“All of N.S.A.’s work has a foreign intelligence purpose,” the spokeswoman added. “Our activities are centered on counterterrorism, counterproliferation and cybersecurity.”

The legal underpinning of the policy change, she said, was a 1979 Supreme Court ruling that Americans could have no expectation of privacy about what numbers they had called. Based on that ruling, the Justice Department and the Pentagon decided that it was permissible to create contact chains using Americans’ “metadata,” which includes the timing, location and other details of calls and e-mails, but not their content. The agency is not required to seek warrants for the analyses from the Foreign Intelligence Surveillance Court.

N.S.A. officials declined to identify which phone and e-mail databases are used to create the social network diagrams, and the documents provided by Mr. Snowden do not specify them. The agency did say that the large database of Americans’ domestic phone call records, which was revealed by Mr. Snowden in June and caused bipartisan alarm in Washington, was excluded. (N.S.A. officials have previously acknowledged that the agency has done limited analysis in that database, collected under provisions of the Patriot Act, exclusively for people who might be linked to terrorism suspects.)

But the agency has multiple collection programs and databases, the former officials said, adding that the social networking analyses relied on both domestic and international metadata. They spoke only on the condition of anonymity because the information was classified.

The concerns in the United States since Mr. Snowden’s revelations have largely focused on the scope of the agency’s collection of the private data of Americans and the potential for abuse. But the new documents provide a rare window into what the N.S.A. actually does with the information it gathers.

A series of agency PowerPoint presentations and memos describe how the N.S.A. has been able to develop software and other tools — one document cited a new generation of programs that “revolutionize” data collection and analysis — to unlock as many secrets about individuals as possible.

The spy agency, led by Gen. Keith B. Alexander, an unabashed advocate for more weapons in the hunt for information about the nation’s adversaries, clearly views its collections of metadata as one of its most powerful resources. N.S.A. analysts can exploit that information to develop a portrait of an individual, one that is perhaps more complete and predictive of behavior than could be obtained by listening to phone conversations or reading e-mails, experts say.

Phone and e-mail logs, for example, allow analysts to identify people’s friends and associates, detect where they were at a certain time, acquire clues to religious or political affiliations, and pick up sensitive information like regular calls to a psychiatrist’s office, late-night messages to an extramarital partner or exchanges with a fellow plotter.

“Metadata can be very revealing,” said Orin S. Kerr, a law professor at George Washington University. “Knowing things like the number someone just dialed or the location of the person’s cellphone is going to allow [them] to assemble a picture of what someone is up to. It’s the digital equivalent of tailing a suspect.”

The N.S.A. had been pushing for more than a decade to obtain the rule change allowing the analysis of Americans’ phone and e-mail data. Intelligence officials had been frustrated that they had to stop when a contact chain hit a telephone number or e-mail address believed to be used by an American, even though it might yield valuable intelligence primarily concerning a foreigner who was overseas, according to documents previously disclosed by Mr. Snowden. N.S.A. officials also wanted to employ the agency’s advanced computer analysis tools to sift through its huge databases with much greater efficiency.

The agency had asked for the new power as early as 1999, the documents show, but had been initially rebuffed because it was not permitted under rules of the Foreign Intelligence Surveillance Court that were intended to protect the privacy of Americans.

A 2009 draft of an N.S.A. inspector general’s report suggests that contact chaining and analysis may have been done on Americans’ communications data under the Bush administration’s program of wiretapping without warrants, which began after the Sept. 11 attacks to detect terrorist activities and skirted the existing laws governing electronic surveillance.

In 2006, months after the wiretapping program was disclosed by The New York Times, the N.S.A.’s acting general counsel wrote a letter to a senior Justice Department official, which was also leaked by Mr. Snowden, formally asking for permission to perform the analysis on American phone and e-mail data. A Justice Department memo to the attorney general noted that the “misuse” of such information “could raise serious concerns,” and said the N.S.A. promised to impose safeguards, including regular audits, on the metadata program. In 2008, the Bush administration gave its approval.

A new policy that year, detailed in “Defense Supplemental Procedures Governing Communications Metadata Analysis,” authorized by Defense Secretary Robert M. Gates and Attorney General Michael B. Mukasey, said that since the Supreme Court had ruled that metadata was not constitutionally protected, N.S.A. analysts could use such information “without regard to the nationality or location of the communicants,” according to an internal N.S.A. description of the policy.

After that decision, which was previously reported by The Guardian, the N.S.A. performed the social network graphing in a pilot project for 1 ½ years “to great benefit,” according to the 2011 memo. It was put in place in November 2010 in “Sigint Management Directive 424” (sigint refers to signals intelligence).

In the 2011 memo explaining the shift, N.S.A. analysts were told that they could trace the contacts of Americans as long as they cited a foreign intelligence justification. That could include anything from ties to terrorism, weapons proliferation or international drug smuggling to spying on conversations of foreign politicians, business figures or activists.

Analysts were warned to follow existing “minimization rules,” which prohibit the N.S.A. from sharing with other agencies names and other details of Americans whose communications are collected, unless they are necessary to understand foreign intelligence reports or there is evidence of a crime. The agency is required to obtain a warrant from the intelligence court to target a “U.S. person” — a citizen or legal resident — for actual eavesdropping.

The N.S.A. documents show that one of the main tools used for chaining phone numbers and e-mail addresses has the code name Mainway. It is a repository into which vast amounts of data flow daily from the agency’s fiber-optic cables, corporate partners and foreign computer networks that have been hacked.

The documents show that significant amounts of information from the United States go into Mainway. An internal N.S.A. bulletin, for example, noted that in 2011 Mainway was taking in 700 million phone records per day. In August 2011, it began receiving an additional 1.1 billion cellphone records daily from an unnamed American service provider under Section 702 of the 2008 FISA Amendments Act, which allows for the collection of the data of Americans if at least one end of the communication is believed to be foreign.

The overall volume of metadata collected by the N.S.A. is reflected in the agency’s secret 2013 budget request to Congress. The budget document, disclosed by Mr. Snowden, shows that the agency is pouring money and manpower into creating a metadata repository capable of taking in 20 billion “record events” daily and making them available to N.S.A. analysts within 60 minutes.

The spending includes support for the “Enterprise Knowledge System,” which has a $394 million multiyear budget and is designed to “rapidly discover and correlate complex relationships and patterns across diverse data sources on a massive scale,” according to a 2008 document. The data is automatically computed to speed queries and discover new targets for surveillance.

A top-secret document titled “Better Person Centric Analysis” describes how the agency looks for 94 “entity types,” including phone numbers, e-mail addresses and IP addresses. In addition, the N.S.A. correlates 164 “relationship types” to build social networks and what the agency calls “community of interest” profiles, using queries like “travelsWith, hasFather, sentForumMessage, employs.”
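To make the mechanics concrete, here is a minimal, purely illustrative sketch of hop-limited contact chaining over a toy metadata graph. Every record, identifier and helper below is invented for the example; nothing here describes the N.S.A.'s actual tools, data or query language.

    from collections import defaultdict

    # Toy communications metadata: (caller, callee) pairs.
    # All identifiers are made up for illustration.
    records = [
        ("+1-555-0100", "+44-20-0000"),
        ("+44-20-0000", "+44-20-1111"),
        ("+44-20-1111", "+1-555-0199"),
        ("+1-555-0100", "+1-555-0142"),
    ]

    # Build an undirected contact graph from the call records.
    graph = defaultdict(set)
    for a, b in records:
        graph[a].add(b)
        graph[b].add(a)

    def chain(seed, hops):
        """Breadth-first search: every identifier reachable from
        `seed` within `hops` hops of the contact graph."""
        frontier, seen = {seed}, {seed}
        for _ in range(hops):
            frontier = {n for node in frontier for n in graph[node]} - seen
            seen |= frontier
        return seen - {seed}

    # A two-hop chain from a single foreign seed number sweeps in
    # contacts of contacts -- numbers that never touched the seed.
    print(chain("+44-20-0000", hops=2))

Even on this four-record toy graph, a two-hop chain from one seed returns every other number in the dataset, which is why hop limits and "foreignness" checks loom so large in the documents' account.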

A 2009 PowerPoint presentation provided more examples of data sources available in the “enrichment” process, including location-based services like GPS and TomTom, online social networks, billing records and bank codes for transactions in the United States and overseas.

At a Senate Intelligence Committee hearing on Thursday, General Alexander was asked if the agency ever collected or planned to collect bulk records about Americans’ locations based on cellphone tower data. He replied that it was not doing so as part of the call log program authorized by the Patriot Act, but said a fuller response would be classified.

If the N.S.A. does not immediately use the phone and e-mail logging data of an American, it can be stored for later use, at least under certain circumstances, according to several documents.

One 2011 memo, for example, said that after a court ruling narrowed the scope of the agency’s collection, the data in question was “being buffered for possible ingest” later. A year earlier, an internal briefing paper from the N.S.A. Office of Legal Counsel showed that the agency was allowed to collect and retain raw traffic, which includes both metadata and content, about “U.S. persons” for up to five years online and for an additional 10 years offline for “historical searches.”

James Risen reported from Washington and New York. Laura Poitras, a freelance journalist, reported from Berlin.
http://www.nytimes.com/2013/09/29/us...-citizens.html





Metadata Equals Surveillance
Bruce Schneier

Back in June, when the contents of Edward Snowden's cache of NSA documents were just starting to be revealed and we learned about the NSA collecting phone metadata of every American, many people -- including President Obama -- discounted the seriousness of the NSA's actions by saying that it's just metadata.

Lots and lots of people effectively demolished that trivialization, but the arguments are generally subtle and hard to convey quickly and simply. I have a more compact argument: metadata equals surveillance.

Imagine you hired a detective to eavesdrop on someone. He might plant a bug in their office. He might tap their phone. He might open their mail. The result would be the details of that person's communications. That's the "data."

Now imagine you hired that same detective to surveil that person. The result would be details of what he did: where he went, who he talked to, what he looked at, what he purchased -- how he spent his day. That's all metadata.

When the government collects metadata on people, the government puts them under surveillance. When the government collects metadata on the entire country, they put everyone under surveillance. When Google does it, they do the same thing. Metadata equals surveillance; it's that simple.
https://www.schneier.com/blog/archiv...ta_equals.html





NSA Surveillance Goes Beyond Orwell's Imagination – Alan Rusbridger

Guardian editor says the depth of NSA surveillance programs greatly exceeds anything the 1984 author could have imagined
Dominic Rushe

The potential of the surveillance state goes way beyond anything in George Orwell's 1984, Alan Rusbridger, the Guardian's editor-in-chief, told an audience in New York on Monday.

Speaking in the wake of a series of revelations in the Guardian about the extent of the National Security Agency's surveillance operations, Rusbridger said: "Orwell could never have imagined anything as complete as this, this concept of scooping up everything all the time.

"This is something potentially astonishing about how life could be lived and the limitations on human freedom," he said.

Rusbridger said the NSA stories were "clearly" not about totalitarianism, but that an infrastructure had been created that could be dangerous if it fell into the wrong hands.

"Obama is a nice guy. David Cameron is a nice social Democrat. About three hours from London in Greece there are some very nasty political parties. What there is is the infrastructure for total surveillance. In history, all the precedents are unhappy," said Rusbridger, speaking at the Advertising Week conference.

He said that whistleblower Edward Snowden, who leaked the documents, had been saying: "Look, wake up. You are building something that is potentially quite alarming."

Rusbridger said that people bring their own perspectives to the NSA revelations. People who have read Kafka or Orwell found the level of surveillance scary, he said, and that those who had lived or worked in the communist eastern bloc were also concerned.

"If you are Mark Zuckerberg and you are trying to build an international business, this is dismaying to you," Rusbridger said.

Zuckerberg recently criticised the Obama administration's surveillance apparatus. "Frankly I think the government blew it," he told TechCrunch Disrupt conference in San Francisco.

The Facebook founder was particularly damning of government claims that they were only spying on "foreigners".

"Oh, wonderful: that's really helpful to companies trying to serve people around the world, and that's really going to inspire confidence in American internet companies," said Zuckerberg.

"All sorts of people around the world are questioning what America is doing," said Rusbridger. "The president keeps saying: well we don't spy on our people. [But] that's not much comfort if you are German."

Rusbridger said the world of spying had changed incomparably in the last 15 years. "The ability of these big agencies, on an international basis, to keep entire populations under some form of surveillance, and their ability to use engineering and algorithms to erect a system of monitoring and surveillance, is astonishing," he said.

He said that as the NSA revelations had gone on, the "integrity of the internet" had been questioned. "These are big, big issues about balancing various rights in society. About how business is done. And about how safe individuals are, living their digital lives."

The Guardian editor rebuffed criticism from the Obama administration that the newspaper was drip-feeding the stories in order to get the most from them. "Well, the president has never worked in a newsroom," he said.

"If there are people out there who think we have digested all this material, and [that] we have all these stories that we are going to feed out in dribs and drabs, then I think that misunderstands the nature of news. What is happening is there is a lot of material. It's very complex material.

"These are not stories that sit up and beg to be told."

Rusbridger said the Guardian and its partners at the New York Times and ProPublica were working through the material. "It's a slow and patient business. If I were the president, I would welcome that."
http://www.theguardian.com/world/201...lan-rusbridger





Sen. Patrick Leahy Calls for End to NSA Bulk Phone Records Program
Ellen Nakashima

A senior U.S. senator on Tuesday called for an end to the National Security Agency’s phone records collection program, arguing that it treads too heavily on Americans’ privacy rights without having proved its value as a counterterrorism tool.

In a speech at Georgetown Law’s Center on National Security and the Law, Senate Judiciary Committee Chairman Patrick J. Leahy (D-Vt.) said he has introduced bipartisan legislation that he says would stop the controversial program, which allows the NSA to amass a database of Americans’ call logs. He said he also is working on legislation to address concerns about a separate program that collects the e-mails and phone calls of foreigners overseas, including their communications with Americans.

The bill he introduced, he said, would allow a more limited form of phone records collection under the Foreign Intelligence Surveillance Act, which is the subject of intense public debate in light of revelations by former NSA contractor Edward Snowden.

“Congress did not enact FISA to give [the government] dragnet surveillance powers to sweep in the data of countless innocent Americans,” said Leahy, who noted that his first vote as a senator in 1975 was in favor of creating the Church Committee, which investigated intelligence agency abuses, including spying on civil rights leaders.

Leahy’s bill is among a number of proposals to reform FISA following Snowden’s disclosures, which together with lawsuits by privacy groups sparked the release of formerly classified court opinions and other documents.

The Senate Intelligence Committee will hold a public hearing Thursday to examine some of the proposals. Leahy’s panel will hold a public hearing next Wednesday to do the same.

The committees will hear from Director of National Intelligence James R. Clapper Jr. and NSA Director Keith Alexander. Along with other U.S. officials, they have insisted that the programs have been repeatedly found to be constitutional and that Congress was briefed on the programs before reauthorizing the statutes underlying them.

Some lawmakers have said that they were unaware that they were approving the bulk records collection when they voted in 2011 to reauthorize the underlying FISA provision. Others have said they support the NSA programs in their current form. The phone records program began shortly after the attacks of Sept. 11, 2001, under executive branch authority. It was put under FISA court oversight in 2006.

Senate Intelligence Committee Chairman Dianne Feinstein (D-Calif.) does not support ending the phone records collection. “It would remove an important and effective intelligence tool and one that has been repeatedly determined to be lawful,” a committee aide said.

Feinstein, however, is working on legislation that would increase transparency and privacy protections. She has outlined some changes she would consider, including reducing the length of time the NSA can keep the phone records from five years to two or three.

Other proposals raised by lawmakers or policy experts include having the phone companies or a third party retain the data instead of the NSA. Companies are averse to that idea, fearing that they will be subject to an avalanche of requests from civil litigants, local law enforcement and others seeking access to the trove.

Other proposals focus on the FISA court, which meets in secret and hears only the government’s case for surveillance. Some lawmakers have proposed having a public advocate who could tell the judge how the government’s proposal would affect Americans’ privacy rights.

The House in July nearly approved a measure to end the bulk-records collection. Some analysts say it is unclear whether Leahy’s bill would end bulk collection, given the FISA court’s apparent willingness, as shown in a recently released opinion, to interpret the underlying law expansively.

Leahy, Sen. Charles E. Grassley (R-Iowa) and seven other Judiciary Committee members called on the intelligence community inspector general on Monday to review the government’s use of FISA and USA Patriot Act surveillance authorities, and to make his findings public.
http://www.washingtonpost.com/world/...cd6_story.html





US Intelligence Chiefs Urge Congress to Preserve Surveillance Programs

Officials refuse to say in Senate testimony whether cell site data had ever been used to pinpoint an individual's location
Paul Lewis and Dan Roberts

Senator Dianne Feinstein speaks with director of national intelligence James Clapper, NSA director general Keith Alexander and deputy attorney general James Cole. Photograph: James Reed/Reuters

US intelligence chiefs used an appearance before Congress on Thursday to urge lawmakers not to allow public anger over the extent of government surveillance to result in changes to the law that would impede them from preventing terrorist attacks.

General Keith Alexander, the director of the National Security Agency, conceded that disclosures by the whistleblower Edward Snowden "will change how we operate". But he urged senators, who are weighing a raft of reforms, to preserve the foundational attributes of a program that allows officials to collect the phone data of millions of American citizens.

In testy exchanges at the Senate intelligence committee, Alexander and the director of national intelligence, James Clapper, refused to say on the record whether the NSA had ever sought to trawl cell site data, which pinpoints the location of individuals via their phones.

They were challenged by Democratic senator Ron Wyden who, as a member of the committee, has for years been privy to classified briefings that he cannot discuss in public. "You talk about the damage that has been done by disclosures, but any government official who thought this would never be disclosed was ignoring history. The truth always manages to come out," he said.

"The NSA leadership built an intelligence data collection system that repeatedly deceived the American people. Time and time again the American people were told one thing in a public forum, while intelligence agencies did something else in private."

Wyden and his fellow Democrat Mark Udall used the public hearing to press the intelligence chiefs on aspects of the top-secret surveillance infrastructure.

Asked by Udall whether it was the NSA's aim to collect the records of all Americans, Alexander replied: "I believe it is in the nation's best interest to put all the phone records into a lockbox – yes."

He would not be drawn on any past attempts or plans to store cell site data, citing security reasons. The NSA director evaded repeated questions from Wyden over whether the NSA had ever collected cell site phone data, or planned to do so. Alexander eventually replied: "What I don't want to do senator is put out in an unclassified form anything that is classified."

Alexander and Clapper also strongly criticised the media over its publication of Snowden's disclosures, which they suggested had been misleading. Neither of the intelligence chiefs, nor any of the senators who criticised media reporting, indicated which news organisations or particular reports were misleading, or in what way.

Alexander said that while recent disclosures were likely to impact public perceptions of the NSA and "change how we operate", any diminution of the intelligence community's capabilities risked terrorist attacks on US territory.

He told the committee that over one seven-day period this month, 972 people had been killed in terrorist attacks in Kenya, Pakistan, Afghanistan, Syria, Yemen and Iraq. "We need these programs to ensure we don't have those same statistics here," he said.

Alexander said that violations of the rules governing surveillance powers were not common and "with very rare exceptions, are unintentional". Clapper also admitted to violations, saying "on occasion, we've made mistakes, some quite significant", but stressed those were inadvertent and the result of human or technical errors.

In a joint written submission with James Cole, the deputy attorney general, who also gave evidence to the committee, they said they were "open to a number of ideas that have been proposed in various forms" relating to the routine trawl of millions of Americans' phone records under section 215 of the Patriot Act.

The trio said they would consider statutory restrictions on their ability to query the data they gather and disclosing publicly how often they use the system. However, there was no suggestion in the written submission that they would contemplate any curbs on the bulk collection and storage of the phone records, a proposal contained in bills being put forward in the House of Representatives and Senate.

"To be clear, we believe the manner in which the bulk telephony metadata collection program has been carried out is lawful, and existing oversight mechanisms protect both privacy and security," they stated.

The trio said they were also open to discussing legislation under which the foreign intelligence surveillance (Fisa) court would at its discretion solicit the views of some kind of independent figure in cases that raise broader civil liberties issues.

This falls short of the draft legislation calling for the appointment of a "constitutional advocate", which Wyden, Udall and other senators are pushing for in a bipartisan bill unveiled on Wednesday night.

At the start of the hearing, the Democratic chair of the committee, Dianne Feinstein, outlined a separate bill she is introducing with Republican vice-chairman Saxby Chambliss.

Their proposed legislation broadly echoes the small tweaks the intelligence establishment says it will consider, but does not go further. Feinstein said their bill would change but preserve the program of collecting and storing phone records of Americans under section 215 of the Patriot Act.

She echoed criticisms of the media reporting of the Snowden disclosures, and said she was confident NSA surveillance programs were "lawful, effective and they are conducted under careful oversight". She asserted that the program by which intelligence officials secretly collect millions of pieces of phone metadata, which can be used to provide a detailed breakdown of an individual's movements and life, was not a form of covert monitoring. "Much of the press has called this a surveillance program," she said. "It is not."

Chambliss said that "while we are here in large part because of the Snowden leaks", they had caused huge damage to the US and its interests and "would ultimately claim lives", something he said Snowden should be held to account for.

Feinstein and Chambliss are the two members of Congress who arguably have the biggest mandate to hold the intelligence establishment to account.

Their bill would not limit the collection of phone records, but rather introduce some restrictions on when intelligence officials are permitted to search the data, and requirements for the intelligence agencies to disclose how often they use the program.

It would also somewhat widen the powers of the NSA, allowing surveillance authorised against foreign targets to continue for a period of time after those targets enter US territory.

Mid-hearing on Thursday, Feinstein read out excerpts from an email she said she had received on her BlackBerry from the Obama administration, pertaining to the wording that might be used to describe a special advocate lawyer in the Fisa court.

Other senators on the committee criticised media reporting and argued the essence of the surveillance apparatus should be left in place. Republican senator Dan Coats said journalists were throwing "raw meat out there", suggesting the reporting was misleading the public. He cautioned against overreacting "for fear of the public saying, 'Oh, that headline makes me nervous.'"

Democrat Jay Rockefeller said that public misunderstandings risked dismantling a system of surveillance that has taken a decade to construct in the aftermath of the September 11 attacks. "You don't build a Roman fort and then build another one next door because you've made a mistake," he said.

Clapper responded that the agencies were finding ways to "counter the popular narrative".
http://www.theguardian.com/world/201...nate-committee





Senators Push to Preserve N.S.A. Phone Surveillance
Charlie Savage

The Senate Intelligence Committee appears to be moving toward swift passage of a bill that would “change but preserve” the once-secret National Security Agency program that is keeping logs of every American’s phone calls, Senator Dianne Feinstein, the California Democrat who leads the panel, said Thursday.

Ms. Feinstein, speaking at a rare public hearing of the committee, said she and the top Republican on the panel, Senator Saxby Chambliss of Georgia, are drafting a bill that would be marked up — meaning that lawmakers could propose amendments to it before voting it out of committee — as early as next week.

After the existence of the program became public through leaks from the former N.S.A. contractor Edward J. Snowden, critics called for it to be dismantled. Ms. Feinstein said her bill would be aimed at increasing public confidence in the program, which she said she believed was lawful.

The measure would require public reports of how often the N.S.A. had used the calling log database, she said. It would also reduce the number of years — currently five — that the domestic calling log data is kept before it is deleted. It would also require the N.S.A. to send lists of the phone numbers it searches, and its rationale for doing so, to the Foreign Intelligence Surveillance Court for review.

By contrast, a rival bill drafted by skeptics of government surveillance, including two members of the committee, Senators Ron Wyden of Oregon and Mark Udall of Colorado, would ban the mass call log collection program.

That more extensive step is unlikely to pass the committee. Ms. Feinstein contended that “a majority of the committee” believed that the call log program was “necessary for our nation’s security.”

Ms. Feinstein said her bill with Mr. Chambliss would also require Senate confirmation of the N.S.A.’s director. At the same time, it would expand the N.S.A.’s powers to wiretap without warrants in the United States in one respect: when it is eavesdropping on a foreigner’s cellphone, and that person travels to the United States, the N.S.A. would be allowed to keep wiretapping for up to a week while it seeks court permission.

That step would remove the largest number of incidents in which the N.S.A. has deemed itself to have broken rules about surveillance in the United States. Those incidents were identified in a May 2012 audit leaked by Mr. Snowden.

The rival proposal pushed by Mr. Wyden and Mr. Udall would also ban the N.S.A. from warrantless searches of Americans’ information in the vast databases of communications it collects by targeting noncitizens abroad. And it would prohibit, when terrorism is not suspected, systematic searches of the contents of Americans’ international e-mails and text messages that are “about” a target rather than to or from that person.

Still, most of the senators on the Intelligence Committee, which had received briefings about the call log program and other surveillance even before Mr. Snowden’s leaks, used the hearing on Thursday to largely defend the programs and criticize the disclosures.

Mr. Chambliss suggested that people could die because of Mr. Snowden’s disclosures, and he pressed Gen. Keith Alexander, the N.S.A. director, to describe the program’s value.

“In my opinion,” General Alexander said, “if we had had that prior to 9/11, we would have known about the plot.”

Officials have struggled to identify terrorist attacks that would have been prevented by the call log program, which has existed in its current form since 2006. The clearest breakthrough attributed to the program was a case involving several San Diego men who were prosecuted for donating several thousand dollars to a terrorist group in Somalia.

Mr. Wyden pressed General Alexander about whether the N.S.A. had ever collected, or made plans to collect, bulk records about Americans’ locations based on cellphone tower data.

General Alexander replied that the N.S.A. is not doing so as part of the call log program, but that information pertinent to Mr. Wyden’s question was classified.
http://www.nytimes.com/2013/09/27/us...veillance.html





Dianne Feinstein Accidentally Confirms That NSA Tapped The Internet Backbone
Mike Masnick

It's widely known that the NSA has taps connected to the various telco networks, thanks in large part to AT&T employee Mark Klein who blew the whistle on AT&T's secret NSA room in San Francisco. What was unclear was exactly what kind of access the NSA had. Various groups like the EFF and CDT have been asking the administration to finally come clean, in the name of transparency, about whether it's tapping backbone networks to snarf up internet communications like email. So far, the administration has declined to elaborate. Back in August, when the FISA court declassified its ruling about NSA violations, the third footnote, though heavily redacted, did briefly discuss this "upstream" capability.

In short, "upstream" capabilities are tapping the backbone itself, via the willing assistance of the telcos (who still have remained mostly silent on all of this) as opposed to "downstream" collection, which requires going to the internet companies directly. The internet companies have been much more resistant to government attempts to get access to their accounts. And thus, it's a big question as to what exactly the NSA can collect via its taps on the internet backbone, and the NSA and its defenders have tried to remain silent on this point, as you can see from the redactions above.

However, as Kevin Bankston notes, during Thursday's Senate Intelligence Committee hearing, Dianne Feinstein more or less admitted that they get emails via "upstream" collection methods. Feinstein interrupted a discussion to read a prepared "rebuttal" to a point being made, and in doing so clearly said that the NSA can get emails via upstream collection:

Upstream collection... occurs when NSA obtains internet communications, such as e-mails, from certain US companies that operate the Internet background, i.e., the companies that own and operate the domestic telecommunications lines over which internet traffic flows.

She clearly means "backbone" rather than "background." She's discussing this in an attempt to defend the NSA's "accidental" collection of information it shouldn't have had. But that point is not that important. Instead, the important point is that she's now admitted what most people suspected, but which the administration has totally avoided admitting for many, many years since the revelations made by Mark Klein.

So, despite years of trying to deny that the NSA can collect email and other communications directly from the backbone (rather than from the internet companies themselves), Feinstein appears to have finally let the cat out of the bag, perhaps without realizing it.
http://www.techdirt.com/articles/201...backbone.shtml





How a Crypto ‘Backdoor’ Pitted the Tech World Against the NSA
Kim Zetter

In August 2007, a young programmer in Microsoft’s Windows security group stood up to give a five-minute turbo talk at the annual Crypto conference in Santa Barbara.

It was a Tuesday evening, part of the conference’s traditional rump session, when a hodge-podge of short talks are presented outside of the conference’s main lineup. To draw attendees away from the wine and beer that competed for their attention at that hour, presenters sometimes tried to sex up their talks with provocative titles like “Does Bob Go to Prison?” or “How to Steal Cars – A Practical Attack on KeeLoq” or “The Only Rump Session Talk With Pamela Anderson.”

Dan Shumow and his Microsoft colleague Niels Ferguson titled theirs, provocatively, “On the Possibility of a Back Door in the NIST SP800-90 Dual Ec Prng.” It was a title only a crypto geek would love or get.

The talk was only nine slides long. But those nine slides were potentially dynamite. They laid out a case showing that a new encryption standard, given a stamp of approval by the U.S. government, possessed a glaring weakness that made an algorithm in it susceptible to cracking. But the weakness they described wasn’t just an average vulnerability, it had the kind of properties one would want if one were intentionally inserting a backdoor to make the algorithm susceptible to cracking by design.

For such a dramatic presentation — by mathematicians’ standards — the reaction to it was surprisingly muted. “I think folks thought, ‘Well that’s interesting,’ and, ‘Wow, it looks like maybe there was a flaw in the design,’” says a senior Microsoft manager who was at the talk. “But there wasn’t a huge reaction.”

Six years later, that’s all changed.

Early this month the New York Times drew a connection between their talk and memos leaked by Edward Snowden, classified Top Secret, that apparently confirm that the weakness in the standard and so-called Dual_EC_DRBG algorithm was indeed a backdoor. The Times story implies that the backdoor was intentionally put there by the NSA as part of a $250-million, decade-long covert operation by the agency to weaken and undermine the integrity of a number of encryption systems used by millions of people around the world.

The Times story has kindled a firestorm over the integrity of the byzantine process that produces security standards. The National Institute of Standards and Technology, which approved Dual_EC_DRBG and the standard, is now facing a crisis of confidence, having been forced to re-open the standard for public discussion, while security and crypto firms scramble to unravel how deeply the suspect algorithm infiltrated their code, if at all. On Thursday, corporate giant RSA Security publicly renounced Dual_EC_DRBG, while also conceding that its commercial suite of cryptographic libraries had been using the bad algorithm as its default algorithm for years.

But beneath the flames, a surprising uncertainty is still smoldering over whether Dual_EC_DRBG really is backdoored. The Times, crypto experts note, hasn’t released the memos that purport to prove the existence of a backdoor, and the paper’s direct quotes from the classified documents don’t mention any backdoor in the algorithm or efforts by the NSA to weaken it or the standard. They only discuss efforts to push the standard through committees for approval.

Jon Callas, the CTO of Silent Circle, whose company offers encrypted phone communication, delivered a different rump session talk at the Crypto conference in 2007 and saw the presentation by Shumow. He says he wasn’t alarmed by it at the time and still has doubts that what was exposed was actually a backdoor, in part because the algorithm is so badly done.

“If [NSA] spent $250 million weakening the standard and this is the best that they could do, then we have nothing to fear from them,” he says. “Because this was really ham-fisted. When you put on your conspiratorial hat about what the NSA would be doing, you would expect something more devious, Machiavellian … and this thing is just laughably bad. This is Boris and Natasha sort of stuff.”

Indeed, the Microsoft presenters themselves — who declined to comment for this article — didn’t press the backdoor theory in their talk. They didn’t mention NSA at all, and went out of their way to avoid accusing NIST of anything. “WE ARE NOT SAYING: NIST intentionally put a back door in this PRNG,” read the last slide of their deck.

The Microsoft manager who spoke with WIRED on condition of anonymity thinks the provocative title of the 2007 presentation overstates the issue with the algorithm and is being misinterpreted — that perhaps reporters at the Times read something in a classified document showing that the NSA worked on the algorithm and pushed it through the standards process, and quickly took it as proof that the title of the 2007 talk had been right to call the weakness in the standard and algorithm a backdoor.

But Paul Kocher, president and chief scientist of Cryptography Research, says that regardless of the lack of evidence in the Times story, he discounts the “bad cryptography” explanation for the weakness, in favor of the backdoor one.

“Bad cryptography happens through laziness and ignorance,” he says. “But in this case, a great deal of effort went into creating this and choosing a structure that happens to be amenable to attack.

“What’s mathematically creative [with this algorithm] is that when you look at it, you can’t even prove whether there is a backdoor or not, which is very bizarre in cryptography,” he says. “Usually the presence of a backdoor is something you can prove is there, because you can see it and exploit it…. In my entire career in cryptography, I’ve never seen a vulnerability like this.”

It’s not the first time the NSA has been accused of installing backdoors. Crypto trapdoors, real and imagined, have been part of NSA lore for decades. In some ways the current controversy echoes the long-ago debate over the first U.S. Data Encryption Standard in the 1970s. The NSA was widely suspected of weakening DES to make it easier for the agency to crack, by tinkering with a table of numeric constants called an S-Box and shortening the algorithm’s key length. In 1994, though, the NSA was exonerated when it turned out that the agency had actually changed the S-Box numbers to harden DES against a code-breaking technique that had been known only within NSA at the time.

In 1995, another case came up that seemed to confirm suspicions about the NSA. The Baltimore Sun reported that year that the NSA had inserted a backdoor into cryptographic machines made by the respected Swiss company Crypto AG, apparently substantiating longstanding rumors to that effect.

Then in 1999, Microsoft inadvertently kicked off another controversy when it leaked its internal name for a cryptographic signing key built into Windows NT. The key was called _NSAKEY, spawning speculation that Microsoft had secretly given the agency the power to write and sign its own updates to Windows NT’s crypto engine. Microsoft said this was incorrect, that the key was an internal Microsoft key only and that it was called “_NSAKEY” because the NSA was the technical reviewing authority for U.S. export controls. The key was part of Microsoft’s compliance with U.S. export laws.

Suspicions about the NSA and backdoors were lingering in 2006 when Shumow and Ferguson began looking at Dual_EC_DRBG after NIST approved it for inclusion in a standard. The standard discussed four federally sanctioned random number generators approved for use in encrypting government classified and unclassified-but-sensitive communication.

Each of the four algorithms was based on a different cryptographic design family. One was based on hash functions, one on so-called HMAC (hash-based message authentication code), one on block ciphers and the fourth one was based on elliptic curves. The NSA had been pushing elliptic curve cryptography for a number of years, and it publicly championed the last one — Dual_EC_DRBG — to be included in the standard.

Elliptic curve algorithms are based on slightly different mathematics than the more common RSA algorithm, and the NSA believes they’re the future of cryptography, asserting that elliptic curve algorithms are smaller, faster and offer better security.

But as Shumow and Ferguson examined the properties of the elliptic curve random number generator in the standard, to determine how to incorporate it into the Windows operating system, a couple of strange things stood out. First, the random number generator was very slow – two to three orders of magnitude slower than another algorithm in the standard.

Second, it didn’t seem to be very secure.

“There was a property [in it] that seemed to make the prediction-resistance of the algorithm not what you would necessarily want it to be,” the Microsoft manager says. In non-geek speak, there was a weakness that made the random number generator not so random.

Good random number generation is at the core of encryption, and a weak RNG can undo the entire encryption system. Random number generators play a role in creating cryptographic keys, in opening secure communications between users and web sites and in resetting passwords for email accounts. Without assured randomness, an attacker can predict what the system will generate and undermine the algorithm.
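As a toy illustration of that point (deliberately not the algorithm at issue here), consider a linear congruential generator, whose entire state is exposed by a single output. The parameters below are the classic C-library textbook constants; a real system would use a cryptographically secure generator.

    # Toy linear congruential generator: next = (a*state + c) % m.
    # Illustrative only -- never use an LCG where security matters.
    a, c, m = 1103515245, 12345, 2**31

    def lcg(state):
        while True:
            state = (a * state + c) % m
            yield state

    victim = lcg(state=20130928)        # secret seed
    observed = next(victim)             # attacker sees one output...

    attacker = lcg(state=observed)      # ...which IS the full state
    assert [next(victim) for _ in range(5)] == \
           [next(attacker) for _ in range(5)]
    print("attacker predicted the next five outputs")

A generator whose output reveals its internal state hands an observer everything it will ever produce, which is the failure mode Shumow and Ferguson were probing for.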

Shumow and Ferguson found that the obstacles to predicting what the random number generator would generate were low. It wasn’t a catastrophic problem, but it seemed strange for a security system being promulgated by the government.

Then they noticed something else.

The standard, which contained guidelines for implementing the algorithm, included a list of constants – static numbers – that were used in the elliptic curve on which the random number generator was based. Whoever generated the constants, which served as a kind of public key for the algorithm, could have generated a second set of numbers at the same time – a private key.

Anyone possessing that second set of numbers would have what’s known in the cryptography community as “trapdoor information” – that is, they would be able to essentially unlock the encryption algorithm by predicting what the random number generator generated. And, Shumow and Ferguson realized, they could predict this after seeing as few as 32 bytes of the generator’s output. With that small a sample, they could break any encryption that relied on the generator to secure it.
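
The arithmetic behind that trapdoor is compact enough to sketch. The generator uses two public curve points, call them P and Q: the internal state advances by multiplying with P, and outputs are derived by multiplying with Q. If whoever chose the constants also holds a secret number d with d*Q = P, a single output can be lifted back onto the curve and multiplied by d to reveal the generator's next internal state. The toy Python below is our illustration only: it uses a tiny textbook curve (y^2 = x^3 + 2x + 2 over F_17) and made-up numbers, and it omits the 16-bit output truncation the real standard performs.

# Toy sketch of the trapdoor structure, NOT the real NIST parameters.
p, A, B = 17, 2, 2            # curve y^2 = x^3 + 2x + 2 over F_17
n = 19                        # order of the base point P

def ec_add(P1, P2):
    # Affine point addition; None represents the point at infinity.
    if P1 is None: return P2
    if P2 is None: return P1
    (x1, y1), (x2, y2) = P1, P2
    if x1 == x2 and (y1 + y2) % p == 0:
        return None
    if P1 == P2:
        lam = (3 * x1 * x1 + A) * pow(2 * y1, -1, p) % p
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p
    x3 = (lam * lam - x1 - x2) % p
    return (x3, (lam * (x1 - x3) - y1) % p)

def ec_mul(k, pt):
    # Double-and-add scalar multiplication k*pt.
    acc = None
    while k:
        if k & 1:
            acc = ec_add(acc, pt)
        pt = ec_add(pt, pt)
        k >>= 1
    return acc

P = (5, 1)                    # public point no. 1
d = 7                         # hypothetical trapdoor secret, with d*Q = P
Q = ec_mul(pow(d, -1, n), P)  # public point no. 2, built so that d*Q = P

def drbg_round(s):
    # One round: advance the state with P, emit an output with Q.
    s = ec_mul(s, P)[0]            # new internal state (an x-coordinate)
    return s, ec_mul(s, Q)[0]      # output (untruncated, unlike the standard)

state = 6                          # the user's secret seed
state, out1 = drbg_round(state)
state, out2 = drbg_round(state)

# The attacker sees only out1 but knows d. Lift out1 back to a curve point
# R (= +/- s*Q); then x(d*R) = x(s*P) is the generator's NEXT internal state.
y = next(y for y in range(p) if (y*y - (out1**3 + A*out1 + B)) % p == 0)
s_next = ec_mul(d, (out1, y))[0]
assert ec_mul(s_next, Q)[0] == out2    # future output predicted exactly

In the real standard each output has 16 bits chopped off, so an attacker has to test on the order of 2^16 candidate points per output block; that modest extra work is consistent with the roughly 32 bytes of output Shumow and Ferguson said an attacker would need.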

“Even if no one knows the secret numbers, the fact that the backdoor is present makes Dual_EC_DRBG very fragile,” cryptographer Bruce Schneier wrote at the time, in a piece for WIRED. “If someone were to solve just one instance of the algorithm’s elliptic-curve problem, he would effectively have the keys to the kingdom. He could then use it for whatever nefarious purpose he wanted. Or he could publish his result, and render every implementation of the random-number generator completely insecure.”

No one knew who had produced the constants, but it was assumed that because the NSA had pushed the algorithm into the standard, the agency had generated the numbers. The spy agency might also, then, have generated a secret key.

Schneier called it “scary stuff indeed,” but he also said at the time that it made no sense as a backdoor, since the flaw was obvious to anyone who examined the algorithm and standard. As a result, he said, developers of web sites and software applications wouldn’t use it to help secure their products and systems.

But in fact, many developers did use it.

The U.S. government has enormous purchasing power, and vendors soon were forced to implement the suspect standard as a condition of selling their products to federal agencies under so-called FIPS certification requirements. Microsoft added support for the standard, including the elliptic curve random-number generator, in a Vista update in February 2008, though it did not make the problematic generator the default algorithm.

Asked why Microsoft supported the algorithm when two of its own employees had shown it to be weak, a second Microsoft senior manager who spoke with WIRED said that while the weakness in the algorithm and standard was “weird” it “wasn’t a smoking gun.” It was more of an “odd property.”

Microsoft decided to include the algorithm in its operating system because a major customer was asking for it, because it had been sanctioned by NIST, and because it wasn’t going to be enabled as the default algorithm in the system, thus having no impact on other customers.

“In fact it is nearly impossible for any user to implement or to get this particular random number generator instantiating on their machines without going into the guts of the machine and reconfiguring it,” he says.

Other major companies, like Cisco and RSA, added it as well. NIST in fact provides a lengthy list of companies that have included it in their libraries, though the list doesn’t say which companies made it the default algorithm in their library or which products have been developed that invoke the algorithm.

A Cisco spokesman told WIRED that the algorithm was implemented in its standard crypto library around mid-2012, a library that is used in more than 120 product lines, but the algorithm is not the default, and the default algorithm cannot be changed by users. The company is currently completing an internal audit of all of its products that leverage the NIST standard.

RSA, however, made the algorithm the default in its BSafe toolkit for Java and C developers until this week, when it told WIRED that it was changing the default following the renewed controversy. The company sent an advisory to developer customers “strongly” urging them to change the default to one of a number of other random number generator algorithms RSA supports. RSA also changed the default on its own end in BSafe and in an RSA key management system. The company is currently reviewing all of its products to see where the algorithm gets invoked so that it can change those defaults.

RSA actually added the algorithm to its libraries in 2004 or 2005, before NIST approved it for the standard in 2006 and before the government made it a requirement for FIPS certification, says Sam Curry, the company’s chief technology officer. The company then made it the default algorithm in BSafe and in its key management system after the algorithm was added to the standard. Curry said that elliptic curve algorithms were all the rage at the time and RSA chose it as the default because it provided certain advantages over the other random number generators, including what he says was better security.

“Cryptography is a changing field. Some algorithms go up and some come down and we make the best decisions we can in any point in time,” he says. “A lot of the hash-based algorithms were getting struck down by some weaknesses in how they chose numbers and in fact what kind of sample set they chose for initial seeding. From our perspective it looked like elliptic curve would be immune to those things.”

Curry says the fact that the algorithm is slower actually provides it with better security in at least one respect.

“The length of time that you have to gather samples will determine the strength of your random number generation. So the fact that it’s slower sometimes gives it a wider sample set to do initial seeding,” he says. “Precisely because it takes a little longer, it actually winds up giving you more randomness in your initial seeding, and that can be an advantage.”

Despite the renewed controversy over the algorithm and standard, Microsoft managers say they still don’t think the weaknesses constitute an intentional backdoor.

Callas agrees. He thinks it is simply bad cryptography that was included in the standard to round out the selection, so that there would be at least one elliptic curve algorithm in it.

But having the algorithm supported in products like Vista offers one advantage, and it may be the reason the NSA pushed it into the standard: even if it’s not the default algorithm for encryption on a system, as long as it’s an option, an intruder like the NSA can get into the system and change the registry to make it the default, theoretically making it easy for the agency to undermine the encryption and spy on users of the machine.

Schneier says this is a much more efficient and stealthy way of undermining the encryption than installing a keystroke logger or other Trojan malware, which could be detected.

“A Trojan is really, really big. You can’t say that was a mistake. It’s a massive piece of code collecting keystrokes,” he said. “But changing a bit-one to a bit-two [in the registry to change the default random number generator on the machine] is probably going to be undetected. It is a low conspiracy, highly deniable way of getting a backdoor. So there’s a benefit to getting it into the library and into the product.”
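
To underline how small such a change is, a purely hypothetical sketch follows: the registry path and value name are invented placeholders, not Windows' real cryptographic configuration keys, which the article does not specify. The point is only that swapping a default provider amounts to a single value write rather than a resident program.

# Hypothetical sketch only: the registry path and the "DefaultRng" value
# below are invented placeholders, not real Windows configuration keys.
# It illustrates that the change is one small registry write.
import winreg

KEY = r"SOFTWARE\ExampleVendor\Crypto\Defaults"
with winreg.CreateKey(winreg.HKEY_LOCAL_MACHINE, KEY) as k:
    winreg.SetValueEx(k, "DefaultRng", 0, winreg.REG_SZ, "DUAL_EC_DRBG")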

To date, the only confirmation that the algorithm has a backdoor comes in the Times story, based on NSA documents leaked by Edward Snowden, which the Times and two other media outlets saw.

“[i]nternal memos leaked by a former NSA contractor, Edward Snowden, suggest that the NSA generated one of the random number generators used in a 2006 NIST standard — called the Dual EC DRBG standard — which contains a back door for the NSA,” the Times wrote.

An editorial published by the Times this weekend re-asserted the claim: “Unbeknown to the many users of the system, a different government arm, the National Security Agency, secretly inserted a ‘back door’ into the system that allowed federal spies to crack open any data that was encoded using its technology.”

But all of the quotes that the Times published from the memos refer to the NSA getting the standard passed by an international standards body; they do not say the NSA intentionally weakened the algorithm and standard, though the Times implies that this is what the memos mean by tying them to the 2007 presentation by Shumow and Ferguson.

NIST has denied any knowledge of a backdoor and has also denied that the NSA authored its standard. The institute has, however, re-opened the standard for public comment as a result of the controversy and “strongly” urged against using the algorithm in question until the matter is resolved. The public comment period closes Nov. 6.

Even without more explicit confirmation that the weaknesses in the algorithm and standard constitute a backdoor, Kocher and Schneier believe they do.

“It is extraordinarily bad cryptography,” says Kocher. “If you look at the NSA’s role in creating standards [over the years] and its general cryptographic sophistication, none of it makes sense if there isn’t a backdoor in this.”

Schneier agrees and says the NSA has done too many other things for him to think, when he sees government-mandated crypto that’s weak, that it’s just by accident.

“If we were living in a kinder world, that would be a plausible explanation,” he says. “But we’re living in a very malicious world, it turns out.”

He adds that the uncertainty around the algorithm and standard is the worst part of the whole matter.

“This is the worst problem that the NSA has done,” Schneier says. “They have so undermined the fundamental trust in the internet, that we don’t know what to trust. We have to suspect everything. We’re never sure. That’s the greatest damage.”
http://www.wired.com/threatlevel/201...-backdoor/all/





Google's Gmail Scanning Unclear to Users, Judge Finds

Judge Lucy H. Koh will allow a class action suit that alleges Google violated wiretapping laws to proceed
Jeremy Kirk

A U.S. federal judge allowed a class-action suit against Google to proceed, saying the company's terms of service are unclear when describing how it scans Gmail content in order to deliver advertisements.

Google had filed a motion to dismiss the suit, which alleges that since 2008 the company has intercepted and read email in transit in order to deliver advertisements and to create user profiles and models. The plaintiffs allege the company violated federal and state wiretapping laws.

The suit, which is being heard in U.S. District Court for the Northern District of California, further contends non-Gmail users who sent email to Gmail users were also subject to illegal interception.

In her ruling Thursday, U.S. District Judge Lucy H. Koh wrote that Google's terms of service and privacy policies do not explicitly say that the company intercepts users' email to create user profiles or deliver targeted advertising.

Although Google revised its terms of service and privacy policy in 2012, Koh wrote "that a reasonable Gmail user who read the Privacy Policies would not have necessarily understood that her emails were being intercepted to create user profiles or to provide targeted advertisements."

Google said in a statement it was disappointed with the decision and is considering its options. "Automated scanning lets us provide Gmail users with security and spam protection, as well as great features like Priority Inbox."

Google maintains that the automated scanning is fully disclosed to Gmail users and that features such as search and filtering would not be possible without it.

Past court rulings have also held that all email users imply consent to automated processing since email couldn't be sent otherwise, Google argued in the motion.

Koh also rejected Google's contention that non-Gmail users gave their implied consent to scanning of their communications.

"Google has cited no case that stands for the proposition that users who send emails impliedly consent to interceptions and use of their communications by third parties other than the intended recipient of the email," Koh wrote.

Consumer Watchdog, a nonprofit consumer advocacy group based in Washington, D.C., called Koh's ruling a "tremendous victory for online privacy."

"The court rightly rejected Google's tortured logic that you have to accept intrusions of privacy if you want to send email," said John M. Simpson, Consumer Watchdog's privacy project director, in a news release. "Companies like Google can't simply do whatever they want with our data and emails."
http://www.itworld.com/it-management...rs-judge-finds





Seymour Hersh on Obama, NSA and the 'Pathetic' American Media

Pulitzer Prize winner explains how to fix journalism, saying press should 'fire 90% of editors and promote ones you can't control'
Lisa O'Carroll

Seymour Hersh has got some extreme ideas on how to fix journalism – close down the news bureaus of NBC and ABC, sack 90% of editors in publishing and get back to the fundamental job of journalists which, he says, is to be an outsider.

It doesn't take much to fire up Hersh, the investigative journalist who has been the nemesis of US presidents since the 1960s and who was once described by the Republican party as "the closest thing American journalism has to a terrorist".

He is angry about the timidity of journalists in America, their failure to challenge the White House and be an unpopular messenger of truth.

Don't even get him started on the New York Times which, he says, spends "so much more time carrying water for Obama than I ever thought they would" – or the death of Osama bin Laden. "Nothing's been done about that story, it's one big lie, not one word of it is true," he says of the dramatic US Navy Seals raid in 2011.

Hersh is writing a book about national security and has devoted a chapter to the bin Laden killing. He says a recent report put out by an "independent" Pakistani commission about life in the Abbottabad compound in which Bin Laden was holed up would not stand up to scrutiny. "The Pakistanis put out a report, don't get me going on it. Let's put it this way, it was done with considerable American input. It's a bullshit report," he says, hinting at revelations to come in his book.

The Obama administration lies systematically, he claims, yet none of the leviathans of American media, the TV networks or big print titles, challenge him.

"It's pathetic, they are more than obsequious, they are afraid to pick on this guy [Obama]," he declares in an interview with the Guardian.

"It used to be when you were in a situation when something very dramatic happened, the president and the minions around the president had control of the narrative, you would pretty much know they would do the best they could to tell the story straight. Now that doesn't happen any more. Now they take advantage of something like that and they work out how to re-elect the president.

He isn't even sure if the recent revelations about the depth and breadth of surveillance by the National Security Agency will have a lasting effect.

Snowden changed the debate on surveillance

He is certain that NSA whistleblower Edward Snowden "changed the whole nature of the debate" about surveillance. Hersh says he and other journalists had written about surveillance, but Snowden was significant because he provided documentary evidence – although he is sceptical about whether the revelations will change the US government's policy.

"Duncan Campbell [the British investigative journalist who broke the Zircon cover-up story], James Bamford [US journalist] and Julian Assange and me and the New Yorker, we've all written the notion there's constant surveillance, but he [Snowden] produced a document and that changed the whole nature of the debate, it's real now," Hersh says.

"Editors love documents. Chicken-shit editors who wouldn't touch stories like that, they love documents, so he changed the whole ball game," he adds, before qualifying his remarks.

"But I don't know if it's going to mean anything in the long [run] because the polls I see in America – the president can still say to voters 'al-Qaida, al-Qaida' and the public will vote two to one for this kind of surveillance, which is so idiotic," he says.

Holding court before a packed audience at City University London's summer school on investigative journalism, 76-year-old Hersh is at full throttle, a whirlwind of amazing stories of how journalism used to be: how he exposed the My Lai massacre in Vietnam, how he got the Abu Ghraib pictures of American soldiers brutalising Iraqi prisoners, and what he thinks of Edward Snowden.

Hope of redemption

Despite his concern about the timidity of journalism, he believes the trade still offers hope of redemption.

"I have this sort of heuristic view that journalism, we possibly offer hope because the world is clearly run by total nincompoops more than ever … Not that journalism is always wonderful, it's not, but at least we offer some way out, some integrity."

His story of how he uncovered the My Lai atrocity is one of old-fashioned shoe-leather journalism and doggedness. Back in 1969, he got a tip about a 26-year-old platoon leader, William Calley, who had been charged by the army with alleged mass murder.

Instead of picking up the phone to a press officer, he got into his car and started looking for him in the army camp of Fort Benning in Georgia, where he heard he had been detained. From door to door he searched the vast compound, sometimes blagging his way, marching up to the reception, slamming his fist on the table and shouting: "Sergeant, I want Calley out now."

His efforts eventually paid off, with his first story appearing in the St Louis Post-Dispatch; it was syndicated across America and went on to earn him the Pulitzer Prize. "I did five stories. I charged $100 for the first, by the end the [New York] Times were paying $5,000."

He was hired by the New York Times to follow up the Watergate scandal and ended up hounding Nixon over Cambodia. Almost 30 years later, Hersh made global headlines all over again with his exposure of the abuse of Iraqi prisoners at Abu Ghraib.

Put in the hours

For students of journalism, his message is: put in the miles and the hours. He knew about Abu Ghraib five months before he could write about it, having been tipped off by a senior Iraqi army officer who risked his own life by coming out of Baghdad to Damascus to tell him how prisoners had been writing to their families asking them to come and kill them because they had been "despoiled".

"I went five months looking for a document, because without a document, there's nothing there, it doesn't go anywhere."

Hersh returns to US president Barack Obama. He has said before that the confidence of the US press to challenge the US government collapsed post 9/11, but he is adamant that Obama is worse than Bush.

"Do you think Obama's been judged by any rational standards? Has Guantanamo closed? Is a war over? Is anyone paying any attention to Iraq? Is he seriously talking about going into Syria? We are not doing so well in the 80 wars we are in right now, what the hell does he want to go into another one for. What's going on [with journalists]?" he asks.

He says investigative journalism in the US is being killed by the crisis of confidence, lack of resources and a misguided notion of what the job entails.

"Too much of it seems to me is looking for prizes. It's journalism looking for the Pulitzer Prize," he adds. "It's a packaged journalism, so you pick a target like – I don't mean to diminish because anyone who does it works hard – but are railway crossings safe and stuff like that, that's a serious issue but there are other issues too.

"Like killing people, how does [Obama] get away with the drone programme, why aren't we doing more? How does he justify it? What's the intelligence? Why don't we find out how good or bad this policy is? Why do newspapers constantly cite the two or three groups that monitor drone killings. Why don't we do our own work?

"Our job is to find out ourselves, our job is not just to say – here's a debate' our job is to go beyond the debate and find out who's right and who's wrong about issues. That doesn't happen enough. It costs money, it costs time, it jeopardises, it raises risks. There are some people – the New York Times still has investigative journalists but they do much more of carrying water for the president than I ever thought they would … it's like you don't dare be an outsider any more."

He says in some ways President George Bush's administration was easier to write about. "The Bush era, I felt it was much easier to be critical than it is [of] Obama. Much more difficult in the Obama era," he said.

Asked what the solution is, Hersh warms to his theme that most editors are pusillanimous and should be fired.

"I'll tell you the solution, get rid of 90% of the editors that now exist and start promoting editors that you can't control," he says. I saw it in the New York Times, I see people who get promoted are the ones on the desk who are more amenable to the publisher and what the senior editors want and the trouble makers don't get promoted. Start promoting better people who look you in the eye and say 'I don't care what you say'.

Nor does he understand why the Washington Post held back on the Snowden files until it learned the Guardian was about to publish.

If Hersh was in charge of US Media Inc, his scorched earth policy wouldn't stop with newspapers.

"I would close down the news bureaus of the networks and let's start all over, tabula rasa. The majors, NBCs, ABCs, they won't like this – just do something different, do something that gets people mad at you, that's what we're supposed to be doing," he says.

Hersh is currently on a break from reporting, working on a book which undoubtedly will make for uncomfortable reading for both Bush and Obama.

"The republic's in trouble, we lie about everything, lying has become the staple." And he implores journalists to do something about it.
http://www.theguardian.com/media/med...american-media





Why We're Shutting Off Our Comments

Starting today, PopularScience.com will no longer accept comments on new articles. Here's why.
Suzanne LaBarre

Comments can be bad for science. That's why, here at PopularScience.com, we're shutting them off.

It wasn't a decision we made lightly. As the news arm of a 141-year-old science and technology magazine, we are as committed to fostering lively, intellectual debate as we are to spreading the word of science far and wide. The problem is when trolls and spambots overwhelm the former, diminishing our ability to do the latter.

That is not to suggest that we are the only website in the world that attracts vexing commenters. Far from it. Nor is it to suggest that all, or even close to all, of our commenters are shrill, boorish specimens of the lower internet phyla. We have many delightful, thought-provoking commenters.

But even a fractious minority wields enough power to skew a reader's perception of a story, recent research suggests. In one study led by University of Wisconsin-Madison professor Dominique Brossard, 1,183 Americans read a fake blog post on nanotechnology and revealed in survey questions how they felt about the subject (are they wary of the benefits or supportive?). Then, through a randomly assigned condition, they read either epithet- and insult-laden comments ("If you don't see the benefits of using nanotechnology in these kinds of products, you're an idiot") or civil comments. The results, as Brossard and coauthor Dietram A. Scheufele wrote in a New York Times op-ed:

Uncivil comments not only polarized readers, but they often changed a participant's interpretation of the news story itself.
In the civil group, those who initially did or did not support the technology — whom we identified with preliminary survey questions — continued to feel the same way after reading the comments. Those exposed to rude comments, however, ended up with a much more polarized understanding of the risks connected with the technology.

Simply including an ad hominem attack in a reader comment was enough to make study participants think the downside of the reported technology was greater than they'd previously thought.

Another, similarly designed study found that just firmly worded (but not uncivil) disagreements between commenters impacted readers' perception of science.

If you carry out those results to their logical end – commenters shape public opinion; public opinion shapes public policy; public policy shapes how and whether and what research gets funded – you start to see why we feel compelled to hit the "off" switch.

A politically motivated, decades-long war on expertise has eroded the popular consensus on a wide variety of scientifically validated topics. Everything, from evolution to the origins of climate change, is mistakenly up for grabs again. Scientific certainty is just another thing for two people to "debate" on television. And because comments sections tend to be a grotesque reflection of the media culture surrounding them, the cynical work of undermining bedrock scientific doctrine is now being done beneath our own stories, within a website devoted to championing science.

There are plenty of other ways to talk back to us, and to each other: through Twitter, Facebook, Google+, Pinterest, livechats, email, and more. We also plan to open the comments section on select articles that lend themselves to vigorous and intelligent discussion. We hope you'll chime in with your brightest thoughts. Don't do it for us. Do it for science.
http://www.popsci.com/science/articl...g-our-comments
















Until next week,

- js.



















Current Week In Review





Recent WiRs -

September 21st, September 14th, September 7th, August 31st

Jack Spratts' Week In Review is published every Friday. Submit letters, articles, press releases, comments, questions etc. in plain text English to jackspratts (at) lycos (dot) com. Submission deadlines are Thursdays @ 1400 UTC. Please include contact info. The right to publish all remarks is reserved.


"The First Amendment rests on the assumption that the widest possible dissemination of information from diverse and antagonistic sources is essential to the welfare of the public."
- Hugo Black
__________________
Thanks For Sharing