P2P-Zone  

23-03-11, 08:37 AM   #1

JackSpratts
Join Date: May 2001
Location: New England
Posts: 10,017

Peer-To-Peer News - The Week In Review - March 26th, '11

Since 2002

"Using indices collectively covering the period since 1960, we document that the annual number of new albums passing various quality thresholds has remained roughly constant since Napster, is statistically indistinguishable from pre-Napster trends, and has not diverged from song supply since iTunes' revival of the single format in 2003. We also document that the role of new artists in new recorded music products has not diminished since Napster." – Professor Waldfogel

March 26th, 2011




Illegal File Sharing 'Hasn't Hurt Music Industry'

Contrary to industry claims, illegal file sharing has not hurt the quality or quantity of music available, new research shows.
David McLennan

University of Minnesota academic Joel Waldfogel, in a US National Bureau of Economic Research report, "Bye, Bye Miss American Pie? The Supply of New Recorded Music Since Napster", compared the quantity and quality of new albums before and after the introduction of Napster, one of the first widespread computer programs that allowed users to share music for free.

"We find no evidence that changes since Napster have affected the quantity of new recorded music or artists coming to market. We reconcile stable quantities in the face of decreased demand with reduced costs of bringing works to market and a growing role of independent labels,'' he said.

Revenue from sales of physical recorded music fell from US$37 billion in 1999 to US$25 billion in 2007, and Professor Waldfogel said most observers blamed file sharing for much, if not most, of the fall.

"Organisations representing the recording industry have argued vigorously that piracy will have serious consequences for whether new works will be brought to market and made available to consumers,'' he said.

But Professor Waldfogel said technological change had not only made sharing music easier, it had also reduced the cost of creating, distributing, and promoting it.

"While much existing research focuses on the effect of file sharing on demand for legal recorded music, the more important question for the well-being of consumers is whether the overall effect of recent technological changes has reduced quantity of consequential music brought to market,'' he said.

It was hard to compare the quality of music across time, but Professor Waldfogel used 88 critics' retrospective lists of best works, like Rolling Stone's 500 best albums.

"Using indices collectively covering the period since 1960, we document that the annual number of new albums passing various quality thresholds has remained roughly constant since Napster, is statistically indistinguishable from pre-Napster trends, and has not diverged from song supply since iTunes' revival of the single format in 2003. We also document that the role of new artists in new recorded music products has not diminished since Napster," he said.

"How do we reconcile the demand reduction wrought by piracy since Napster with the continued creation of new recorded music documented above? One possibility is that the supply curve is vertical, ie, that creative activity is invariant to reward."

Professor Waldfogel's research also found that while the major labels were not producing as much music, independents were making up the difference.

"The new era may not stem the supply of music, but it may change the industrial organization of the music industry," he said.

"And while major labels facing financial distress may be able to invest less in new artists, it is also possible that these artists' music can nevertheless find its way to market via the less costly distribution channel that independent labels provide."

Professor Waldfogel said this was a very difficult area to study so "more than the usual number of caveats is in order".

"We lack a compelling counterfactual for the post-Napter musical world, so it is hard to measure Napster's effect on new music with great confidence. It is possible that, absent the demand contraction, the cost reductions would have ushered in a substantial increase in new high-quality works,'' he said.

"Second, even if it is true that music supply remains forthcoming in the face of weakened effective copyright protection, it is not clear what relevance these results have for other media [like films] that differ in their creative processes."
http://www.canberratimes.com.au/news...y/2111923.aspx





Only 9% (and Falling) of US Internet Users are P2P Pirates
Nate Anderson

In its 2010 annual report (PDF), recorded music's global trade body said that the industry would "struggle to survive unless we address the fundamental problem of piracy." Just how "fundamental" a problem is that piracy? Not very, as new research suggests that only 9 percent of US Internet users even use peer-to-peer networks at all, down substantially from 2007.

Market research firm NPD Group, which tracks music acquisition, said today that P2P use has dropped from 16 percent of all US Internet users to 9 percent over the last three years. The latest data comes from the fourth quarter of 2010, when a federal judge shut down LimeWire; that may have depressed the numbers a bit, though NPD notes that other P2P programs saw more usage as a result.

As for the average number of downloads per person, that also fell from 35 per quarter in 2007 to 18 per quarter by the end of 2010. Those averages obscure people who swap thousands of files, of course, but they also suggest that many P2P users only pick up a few tracks.

The numbers fit roughly with similar data collected by Warner Music and shown to the FCC in early 2010. Warner suggested that 13 percent of consumers were avowed pirates, but noted that even the pirates spent (some) money on recorded music.

The NPD numbers are not a complete view of the piracy problem. The company surveys only P2P use and so would not include one-click download sites and illegal online streaming services that have become more important for the movie business, in particular.

Still, the NPD data suggests that, in the US at least, piracy isn't the "fundamental" problem it's perceived to be. (Even Warner admits that pirates "tend to drive high discovery for others" and to spend some of their own money on music.) In developing economies, where piracy rates can reach north of 90 percent for music, movies, and software, it's a much more fundamental issue for content companies.

How should they address it? A major three-year academic research project has just concluded that piracy in developing economies is a "global pricing problem" that enforcement alone cannot fix.
http://arstechnica.com/tech-policy/n...2p-pirates.ars





Study: LimeWire Demise Slows Music Piracy
Greg Sandoval

In what will surely be music to the ears of the major labels, research firm NPD Group says that illegal file sharing of songs via peer-to-peer services has dropped off dramatically since LimeWire shut down.

Lime Wire, the company that operated the popular peer-to-peer network LimeWire, was forced to shut down in October following a federal court decision that found the company liable for copyright infringement. The Recording Industry Association of America had filed a copyright suit against Lime Wire and CEO Mark Gorton in 2007, claiming the company encouraged the pirating of billions of songs.

In the wake of those events, NPD said today that "the percentage of Internet users who download music via peer-to-peer services was at 9 percent in the fourth quarter of 2010, compared to 16 percent in the same period in 2007." Here's more of what NPD found:

Quote:
The average number of music files downloaded from P2P networks also declined from 35 tracks per person in Q4 2007 to just 18 tracks in Q4 2010, although some downloaded just one or two tracks, while others took hundreds. NPD estimates there were 16 million P2P users downloading music in Q4 2010, which is 12 million fewer than in Q4 2007.
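
A quick sanity check on those figures, ours rather than NPD's: 16 million users at 9 percent of US Internet users, and 28 million at 16 percent, both imply a base of roughly 175 million US Internet users, so the two surveys at least hang together. In Python:

Code:
# Illustrative arithmetic only, using the NPD figures quoted above.
p2p_2010 = 16_000_000             # NPD's Q4 2010 estimate
p2p_2007 = p2p_2010 + 12_000_000  # "12 million fewer" implies 28M in Q4 2007

print(p2p_2007 / 0.16)  # implied US Internet users, 2007: ~175 million
print(p2p_2010 / 0.09)  # implied US Internet users, 2010: ~178 million
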
NPD's research showed that LimeWire was the overwhelming leader when it came to downloading music through P2P. But the research also noted that former LimeWire users appear to be moving to similar networks.

• Of those using P2P to download music, 56 percent chose to do it with LimeWire during the third quarter of last year, NPD said. The number fell to 32 percent in the fourth quarter, the period in which the company shuttered operations.

• FrostWire was used by just 10 percent of those sharing music files via P2P in the third quarter of 2010. That number rose to 21 percent in the final quarter of 2010. "BitTorrent client uTorrent increased from 8 percent to 12 percent in the same time period," NPD reported.

"Limewire was so popular for music file trading, and for so long, that its closure has had a powerful and immediate effect on the number of people downloading music files," said Russ Crupnick, entertainment industry analyst for NPD in a statement.

"In the past," Crupnick continued, "we've noted that hard-core peer-to-peer users would quickly move to other Web sites that offered illegal music file sharing. It will be interesting to see if services like Frostwire and Bittorrent take up the slack left by Limewire."
http://news.cnet.com/8301-31001_3-20046136-261.html





LSE Experts Question Music Industry Claims on File-Sharing

Two days before the opening of a Judicial Review on the Digital Economy Act (DEA), a new report from the London School of Economics and Political Science casts doubt on the proportionality and likely effectiveness of measures to protect intellectual property, due to be implemented by the DEA.

The report, "Creative Destruction and Copyright Protection", by Bart Cammaerts and Bingchun Meng of the London School of Economics, was commissioned by the LSE Media Policy Project.

The LSE Media Policy Project research finds that:

• The DEA gets the balance between copyright enforcement and innovation wrong. The use of peer-to-peer technology should be encouraged to promote innovative applications. Focusing on efforts to suppress the use of technological advances and to protect out-of-date business models will stifle innovation in this industry.

• Providing user-friendly, hassle-free solutions that enable users to download music legally at a reasonable price is a much more effective strategy for enforcing copyright than a heavy-handed legislative and regulatory regime.

• The decline in sales of physical copies of recorded music cannot be attributed solely to file-sharing, but should be explained by a combination of factors such as changing patterns in music consumption, decreasing disposable household incomes for leisure products and increasing sales of digital content through online platforms.

According to report author, Bart Cammaerts, “The music industry and artists should innovate and actively reconnect with their sharing fans rather than treat them as criminals. They should acknowledge that there are also other reasons for its relative decline beyond the sharing of copyright protected content, not least the rising costs of live performances and other leisure services to the detriment of leisure goods. Alternative sources of income generation for artists should be considered instead of actively monitoring the online behaviour of UK citizens.”

LSE expert, Bingchun Meng, argues that "the DEA has given too much consideration to the interests of copyright holders, while ignoring other stakeholders such as users, ISPs, and new players in the creative industry. I hope the Judicial Review will make the government reconsider its approach toward file-sharing."
http://finchannel.com/Main_News/B_Sc..._file-sharing/





Eminem's Side Wins $40M to $50M Victory as Supreme Court Lets Royalties Ruling Stand
Brian McCollum

The Supreme Court has quietly handed a major victory to Eminem in a much-watched case involving online royalties.

In a triumph for the Detroit star and his former production company, the court today declined to hear an appeal filed by Universal Music Group in a dispute over payments for downloaded tracks and ringtones.

The court let stand an appeals-court ruling against the record company — the world’s biggest — and set the stage for a potentially massive payout to the rapper and Ferndale’s F.B.T. Productions.

“For us, this is probably a $40 million to $50 million issue,” said Joel Martin of F.B.T., which filed the lawsuit in 2007. “Every artist who has this sort of language in their contract is now going to go back to their record company and say, ‘OK, so what do you want to do about (download royalties)?’”

F.B.T. administered Eminem’s deal with Universal in 1998, on the cusp of the MP3 revolution, and still has royalty rights in his work. The firm, which includes the well-known brother team Jeff and Mark Bass, has shared the rapper’s recording contract for some of the best-selling releases of the modern era, including albums such as “The Marshall Mathers LP” and last year’s “Recovery.”

Eminem was not a direct party in the suit, and has not publicly commented on it.

The Supreme Court has sent the case back to a trial court to determine damages. If Universal and F.B.T. cannot settle on a figure, a judge or a jury would decide what is owed.

At issue was the royalty rate for tracks distributed via online services such as iTunes. Universal claimed it owed F.B.T. the same royalty it paid for physical sales: 18% of the suggested retail price.

Eminem’s team argued that because Universal’s online agreements are actually licensing situations, not unit sales, a different type of calculation should kick in: 50% of net revenue. The 9th Circuit Court of Appeals agreed in September, reversing an earlier jury decision.

There has been sharp disagreement about the ruling’s broader impact.

Universal and some industry observers have said it will have few ramifications beyond Eminem. Most current hit artists already have contracts that explicitly spell out download royalties, and many other deals have been reworked in recent years.

In a statement today, Universal spokesman Peter Lofrumento said: “The case has always been about one agreement with very unique language. As it has been made clear during this case, the ruling has no bearing on any other recording agreement and does not create any legal precedent.”

But other experts have said the ruling will have profound effects on an industry that has already seen massive upheaval in the past decade. With the Supreme Court’s tacit endorsement in hand, artists with older deals may feel empowered to attain higher payments for download sales.

That might include Motown performers seeking to address their royalty rates with Universal, parent company of Motown Records, guided by the same arguments that worked for Eminem.

“We’ve been waiting for this from the Supreme Court, but we’ve been talking with the artists about it, and they’ve been very interested,” said Billy Wilson of the Motown Alumni Association, which had filed an amicus brief in the case. “And it’s not just Motown artists, but many others in the same situation, who now have the same opportunity to renegotiate their contracts — and renegotiate the structure of the music industry, really.”
http://www.freep.com/article/2011032...s-ruling-stand





Judge Orders Jobs to Answer iTunes Questions

A federal judge has ordered Apple Chief Executive Steve Jobs to answer questions relating to an antitrust lawsuit that says the company's iTunes software maintained a monopoly in portable digital media players and music downloads.

Judge Howard Lloyd of U.S. District Court for Northern California on Monday ordered that lawyers representing the plaintiffs may question Jobs for a total of two hours.

Lloyd said the questioning of Jobs should be limited to changes Apple made to its software in October 2004 that prevented iTunes rival RealNetworks' music files from being played on Apple's iPod music players.

Jobs has been out on medical leave since late January, and the court ruling comes amid intense media scrutiny of his health condition.

"The court finds that Jobs has unique, non-repetitive, first hand knowledge about Apple's software updates in October 2004 that rendered the RealNetworks's digital music files once again inoperable with iPods," Judge Lloyd wrote in his ruling.

The case is in re Apple iPod iTunes antitrust litigation, Case No. 05-00037, U.S. District Court, Northern District of California.

(Reporting by Sakthi Prasad in Bangalore; Editing by David Holmes)
http://www.reuters.com/article/2011/...72L26720110322





Apple Sues Amazon.com Over APP STORE Trademark

Apple Inc has sued Amazon.com Inc in a bid to stop the online retailer from improperly using Apple's APP STORE trademark, according to a court filing.

The lawsuit, filed in a California federal court late last week, said Amazon has improperly used Apple's APP STORE mark to solicit software developers throughout the United States.

Apple has applied to register the APP STORE trademark in the United States, a bid which Microsoft opposes. The matter is currently before the Trademark Trial and Appeal Board, according to the lawsuit.

"We've asked Amazon not to copy the APP STORE name because it will confuse and mislead customers," Apple spokeswoman Kristin Huguet said on Monday.

Amazon.com did not immediately respond to a request for comment.

According to the lawsuit, Amazon is unlawfully using the APP STORE trademark in connection with what Amazon calls the "Amazon Appstore Developer Portal," along with other instances like ads for a version of Angry Birds, the popular mobile game.

Apple has also asserted a claim of unfair competition, and is seeking to enjoin Amazon from using the APP STORE mark.

The case in U.S. District Court, Northern District of California is Apple Inc v. Amazon.com Inc, 11-1327.

(Reporting by Dan Levine; Editing by Bernard Orr)
http://www.reuters.com/article/2011/...72L07I20110322





ISPs Urged to Block Filesharing Sites

Music and film groups in talks with broadband providers over code that would bar access to sites such as The Pirate Bay
Josh Halliday

Photo: The Pirate Bay was one of the filesharing sites that rights holders told ISPs they wanted blocked.

Rights holders from across the music and film industries have identified about 100 websites – including The Pirate Bay and "cyberlocker" sites – that they want internet service providers such as BT to block under new measures to tackle illegal filesharing.

Under a voluntary code that is under discussion, content owners would pass evidence of illegal filesharing sites to ISPs, which would then take action against those sites.

However, the proposals are fraught with complications. ISPs are understood to be open to the idea of cutting off access to some infringing sites, but argue that an impartial judge should decide which get blocked. It is also unclear whether content owners or ISPs would be liable to pay compensation to a site that argues that it has been unfairly censored.

The communications minister, Ed Vaizey, is leading a series of talks with rights holders and ISPs, including BT and TalkTalk, aimed at developing a voluntary code on internet policy, including site blocking.

The proposal is part of a contentious range of plans to curb illegal filesharing in the UK. Rights holders and ISPs have been at loggerheads over legislation due to be introduced under the Digital Economy Act, which faces a high court challenge by BT and TalkTalk on Wednesday.

BT and TalkTalk – which together have 8.4 million UK subscribers – have already spent close to £1m in legal fees on challenging the act, the Guardian understands. The government, meanwhile, is keen to push through voluntary agreements on controversial issues such as site blocking, as the act faces a delay of at least 12 months.

Issues such as how to give accused sites a fair hearing, indemnity and costs, as well as the governance structure of the code are yet to be ironed out.

"Cheaper than notice sending would be site blocking," said one rights holder present at the government meetings. "We're more interested in site blocking [than mass notification letters]. We don't want to target end users, [the mass notification system] is long winded – we want something now."

Another source at the meeting told the Guardian: "Site blocking is an interesting concept which we're open to, but there are issues on how to make it work, how to give sites a fair hearing, its governance structure and indemnity. But get a judge to tell us to do it and we'll do it."

The culture secretary, Jeremy Hunt, has convened a government-led working group, comprising ISPs and search engines, to find a "plan B" to avoid potential litigation arising from the blocking of websites accused of illegal filesharing.

The Motion Picture Association (MPA), the trade body representing Hollywood studios including Paramount Pictures, 20th Century Fox and Disney, argues ISPs should block access to filesharing portals such as Newzbin2, The Pirate Bay, Movieberry and Free Movies Online 4 You.

In December, the MPA sought an injunction forcing BT, the UK's largest broadband provider, to throttle users' access to Newzbin2 under the UK Copyright, Designs and Patents Act. Although a voluntary set of principles is preferred to the legal route, the high court is expected to rule in June on whether BT should block access to the site.
http://www.guardian.co.uk/technology...esharing-sites





Microsoft Presses State to Tackle Software Piracy

Microsoft is pushing Washington legislators to pass a law making it illegal for manufacturers that use pirated software to sell goods here.
Sharon Pian Chan

Microsoft is pushing Washington legislators to pass a law making it illegal for manufacturers that use pirated software to sell goods in the state.

The company is on a state-by-state campaign to force foreign companies to start paying for software.

While this would cover companies everywhere, the bill appears targeted at Chinese manufacturers. China is expected to become the largest PC market in the world this year.

The proposed legislation would create a legal cause of action by making manufacturing companies liable for damages, and it would give the state attorney general and companies the right to pursue injunctions in civil court to stop the manufacturers' goods from being sold.

For example, if a large Washington store sold T-shirts made by a company in China, and the Chinese company used pirated copies of Excel at an office in Shenzhen, Microsoft could seek an injunction to prevent the manufacturer from supplying T-shirts to be sold in Washington state.

"We have a problem internationally with stolen and counterfeited software," said state Rep. Deb Eddy, D-Kirkland, one of the bill's sponsors.

The measure appears likely to win final approval. The state House and Senate have approved versions of the bill by large margins. The House passed HB 1495 in a 90-4 vote Feb. 22, and the Senate passed SB 5449 in a 39-7 vote March 4.

Lawmakers are negotiating to bring the two versions together, and the Senate will hold a hearing on the matter Monday.

The bill would affect retailers that make $50 million or more in annual sales and that have a direct contract with the manufacturer. Retailers would have 18 months to change manufacturers or persuade their manufacturers to pay for software.

Microsoft General Counsel Brad Smith said the issue is about creating jobs in Washington state.

"Right now, theft of American technology is robbing the economy of tens of thousands of jobs in Washington state," Smith said. "This is far and away the most important step the Legislature can take to help us create more jobs in Washington state over the next three years."

Boeing and Weyerhaeuser support the bill, according to Microsoft.

On Friday, several large computer-hardware corporations sent a joint letter urging legislators to vote no on the bill. Many are Microsoft's partners — Dell, IBM, Intel and Hewlett-Packard.

"These bills would create a new and unjustified cause of action against many American employers, fueling business uncertainty, disrupting our supply chains and undermining the competitiveness of U.S. firms," the companies wrote in the letter.

The Washington Retail Association also says it has concerns about how the bill would affect store owners.

"The bottom line is, it draws us into an action we are not responsible for," said Jan Teague, president and chief executive of the association, which represents 2,400 storefronts. The association would prefer a federal resolution from Congress over state action.

Smith said companies in countries with lax intellectual-property protections laugh at Microsoft when it asks them to pay for software. "They tell us they have no intention of paying for something they can steal with impunity," he said.

Louisiana passed a similar law a year ago, and Microsoft hopes other states will follow.
http://seattletimes.nwsource.com/htm...tpiracy14.html





Court Strikes Out File Sharing Actions

The Patents County Court has struck out 27 actions alleging unlawful file sharing, brought by London firm ACS:Law on behalf of its client Media CAT.

A Law Gazette report notes that Judge Birss is now considering how much ACS:Law and Media CAT should pay in wasted costs, after accepting the submission made on behalf of the defendants that the two organisations had wasted the court's time in starting actions that they had no intention of pursuing to trial. The cases were brought after ACS:Law wrote letters to thousands of individuals, alleging that they had unlawfully downloaded copyrighted media on 'peer to peer' networks over the Internet. The letters threatened court action if the recipients did not make payments of around £500 to settle the claims of alleged copyright infringement, according to the report.
http://www.legalbrief.co.za/article....10322075752527





Anti-Piracy Firms 'Delete Data' in Filesharing Row

BT steps up fight against legal firms as 'speculative invoicing' unravels

BT will on Wednesday take to the high court to fight government plans to curb illegal filesharing. A coalition of beleaguered rights owners will line up in opposition. An apposite time, then, for this industry's most controversial anti-filesharing cabal to be back in the news.

In a post on its customer forum on Tuesday evening, BT said that it had ordered DigiProtect and MediaCAT, two (very different) media companies used by notorious solicitors' firm ACS:Law, to delete thousands of its customers' details they received last year.

The two companies had obtained a court order (known as a Norwich Pharmacal Order) for the information – which includes telephone numbers and addresses of customers – with the apparent intention of suing them for illegal filesharing, which we now know was probably unlikely.

BT said in the statement:

"With regard to Media CAT, we have been under a court order since July last year to supply them with details belonging to thousands more customers. We refused to do so and have now secured a further order to set aside the July order, meaning that the customer details will not be disclosed. Media CAT has also confirmed that all customer data that we sent to them in the past has now been deleted."

The UK's largest broadband provider has also escaped having to hand over thousands of customer details to Ministry of Sound, which applied for them back in May 2010 – months before the whole ACS:Law row blew up. BT adds:

"Digiprotect was another client of ACS:Law. We have already disclosed some customer details to them under a court order in early 2010. Since even before the revelations about ACS:Law, we had been challenging Digiprotect on their use of that data but did not get satisfactory answers.

"We have now taken the matter back to court and secured an order requiring Digiprotect either to issue proceedings or delete the data. The time for issuing proceedings has now expired and the data should be deleted."

We rang DigiProtect on Tuesday afternoon to see whether the data had been deleted. We were told that Dr. Frederik Gerckens, the company's managing director, was "in a meeting" and that we should ring back on Wednesday. That we shall. In the meantime it would be interesting to know what power BT, based in the UK, will have over DigiProtect, based in Germany, to order the deletion of user data that it handed over – and how it will be able to confirm that that deletion has been done. After all, how do you prove something isn't there?

All of this is bad news for so-called speculative invoicing schemes. While they were unashamedly lucrative for the solicitors behind them – reaping money from those accused of illegal filesharing who didn't want to face the embarrassment (many were accused of downloading adult films) of fighting the charges in court – the long-term damage to these companies will likely outweigh the short-term spoils.

And that's before you ask big rights owners – Paramount Pictures, Disney, 20th Century Fox etc. – whether the row has sullied their cause. Back to BT:

"As a business we must facilitate genuine rights holders who wish to enforce their copyright in a proportionate way. With that in mind we have been working on a new framework policy to deal with future applications, in a bid to protect our customers.

"We continue to develop that policy, particularly in light of the comments of HHJ Birss QC in the recent Media CAT cases."
http://www.guardian.co.uk/technology...aw-filesharing





New UK File Sharing Law Challenged by ISPs in Court

Two major ISPs are at the UK high court to challenge elements of the new counter-piracy laws that relate to illegal file-sharers.
Zack Whittaker

ISPs in England are challenging parts of the Digital Economy Act, which targets illegal downloaders and file-sharers.

BT and TalkTalk, both leaders in the UK broadband market, are at the High Court today, arguing that elements of the legislation were rushed through parliament last year without adequate scrutiny.

Under the current law, film and music providers can monitor illegal activity on peer-to-peer networks and collate IP addresses of those who infringe copyright. They can then apply to a court to force an ISP to hand over the name and billing address of the person alleged to have downloaded illegal content.
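
To make the collation step concrete: on BitTorrent-style networks a monitoring firm typically joins the swarm for a file and records the (IP address, port) pairs the tracker hands out, which are later matched to subscribers through the ISP. A minimal sketch, assuming the standard 6-bytes-per-peer "compact" tracker response; the sample peers below are invented for illustration:

Code:
import socket
import struct

def decode_compact_peers(blob):
    # Each peer is 6 bytes: a 4-byte IPv4 address followed by a
    # 2-byte big-endian port (the "compact" tracker response format).
    if len(blob) % 6:
        raise ValueError("compact peer list must be a multiple of 6 bytes")
    for i in range(0, len(blob), 6):
        ip = socket.inet_ntoa(blob[i:i + 4])
        (port,) = struct.unpack(">H", blob[i + 4:i + 6])
        yield ip, port

# Two invented peers, encoded the way a tracker would return them.
sample = (socket.inet_aton("203.0.113.7") + struct.pack(">H", 51413) +
          socket.inet_aton("198.51.100.42") + struct.pack(">H", 6881))

for ip, port in decode_compact_peers(sample):
    print("observed peer %s:%d" % (ip, port))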

But there have been several cases where people were wrongly accused of illegally downloading copyrighted material.

In the United States, however, recent rulings have stated that an 'IP address does not equal a person', referring to co-habitants and student households in particular.

The ISPs hope a similar principle will be applied in the UK, and are asking the court to clear up the act's definitions, arguing that an IP address identifies not the particular user but only the location where the alleged infringement occurred.

Yet more and more people are resorting to upload sites and checking links on blogs to download music and films, instead of using peer-to-peer networks where users are becoming more aware of the risks involved.

The bill was passed shortly before the new government was formed, during what is known as the ‘wash up’ period, where outstanding bills and legislation are cleared up and voted on before Parliament is dissolved.
http://www.zdnet.com/blog/igeneratio...-in-court/9051





ISPs Propose Pro-Piracy Website Blacklist

Molly-coddling on the high seas
Andrea Petrou

UK rights holders and internet service providers (ISPs) are debating an alternative to the Digital Economy Act which would see top file sharing websites blacklisted.

However, Big Brother Watch has spoken out against these plans claiming that internet users should not be "molly-coddled" by their ISPs.

The two sides are planning a controversial Voluntary Code of Practice. That'd be the freedom to block any website that pushes and promotes copyright infringement online, with rights holders having already drawn up a list of targets including The Pirate Bay, NewzBin2 and Rapidshare.

ISPs are not pleased with a preliminary review from Ofcom. As part of the Digital Economy Act the watchdog proposed putting in place "technical measures" such as speed and access restrictions imposed through ISPs, as well as sending warning notices to those using offending sites to download material.

ISPs are worried that this would damage the way they operate.

One rights holder told ISP Review that it believed blocking sites would be a much cheaper method. Others have said that they don't feel comfortable cutting off websites without the oversight of a judge.

Jonathan Leggett at Top10 told TechEye: "The proposed amendment to legislation isn’t entirely without merit. Not least in the fact that it’s a positive step away from targeting end users, which is not only counterproductive but nigh on unenforceable.

"Focusing on blocking a smaller number of the best-known sites, which account for the majority of illegal downloads, on paper at least seems much more workable. What’s more, contrary to popular belief, while web-savvy types know how to get around site blocking, the less tech-minded punter won’t have that kind of knowledge.

"However, the new approach has some significant holes in it. Voluntary codes very rarely get results. More pertinent, though, is that the opaque process for blocking sites appears fraught with potential problems. To that extent it could be more costly and more likely to become mired in lengthy legal battles than what was being proposed before."

Daniel Hamilton, director at Big Brother Watch, believes copyright is important but that control shouldn't lie in the hands of ISPs.

"Leaving aside arguments about the rights and wrongs of websites such as Pirate Bay, Internet Service Providers shouldn't seek to impose blanket bans on online file-sharing services," he tells TechEye.

"While illegal copyright infringement is a serious problem, internet users ought to be allowed to access these services at their own risk rather than being molly-coddled by their ISPs".
http://www.techeye.net/internet/isps...site-blacklist





New Law Will Shut Down TorrentFreak, Music Industry Expert Says
Ernesto

TorrentFreak will soon cease to exist because of new legislation being considered by the Obama administration, a prominent music industry expert has announced. But we’re in good company. Music streaming service Grooveshark and the RIAA-approved iMesh will have to go too, and news sites like Wired, Techdirt and Slashdot will have to change their tune drastically so as not to upset the battered music industry.

Last week the White House published a white paper with several recommendations on how to make copyright law compliant with the digital age. Among other things, it suggests classifying unauthorized streaming of copyrighted material as a felony and allowing wiretaps in copyright-related cases.

The white paper, along with its potential impact, has since been widely discussed in the media, but apparently only a select few have the capacity to properly assess the consequences of an eventual change in copyright law. Music industry expert, book author and Grammy winner Moses Avalon is one of them.

“Here’s one story you won’t see going viral on a geek blog near you: the Obama administration is going to make torrent streaming, also known as P2P sharing of music, a felony,” Avalon wrote – four days after we covered the news.

Being the music industry and copyright expert he is, Avalon carefully explains how the White House recommendations will change the Internet as we know it. Not only will unauthorized streaming of copyrighted material become a felony, new legislation will also shutter legal music services that rely on P2P technology, and news sites that dare to mention the P word in public.

Although the White House white paper isn’t really about P2P at all, but about streaming, Avalon foresees a major change in the use of P2P technology on the internet, legitimate or not. In his list of services that will have to close, Avalon mentions the licensed streaming service Grooveshark and the RIAA-approved P2P service iMesh.

Despite the fact that Grooveshark and iMesh pay the music labels, they will have to go since the mere use of P2P and online streaming will soon be against the law, Avalon claims. And then there’s TorrentFreak, a site that has never encouraged readers to commit copyright infringement, but recognizes the benefits of P2P while rebutting entertainment industry propaganda.

TorrentFreak will have to change too, or be gone, Avalon says.

“You’ll start seeing less and less positive spin on P2P almost immediately,” says Avalon as he muses on the aftermath of the new legislation.

“Blogs who play fast and loose with copyright ‘facts’ and assert that P2P is OK because soon the music biz will be dead anyway, are going to get strangely quiet on the subject,” he writes.

Again, the above has very little to do with the White House announcement, which said nothing about P2P. In fact, encouraging people to commit copyright infringement through P2P services is already against the law. However, Avalon takes it up a notch claiming that writing about infringement and P2P will soon be a no go.

“What will they write about next? Who knows and frankly who cares. These guys are no different in my view than racist blogs inciting gay-bashing, and Antisemitism or ‘Freedom’ blogs that are vestibules for home-grown terrorism,” he notes while pondering the future of TorrentFreak.

And we're not the only news site that will be forced to change its tune, according to the expert. We're in good company. Fine outfits such as Wired.com, Techdirt, Slashdot, Silicon Alley Insider and the blog of copyright lawyer Ray Beckerman will be affected too.

Let’s take a deep breath.

We honestly believe that Avalon's writings are too absurd to respond to, especially coming from someone who previously said that Napster was the scapegoat of the music industry. And yes, Mr. Avalon was also the one who fiercely defended Eminem for rapping about wanting to see the president dead. Freedom of speech, he said at the time, only to now argue that writing about P2P technology is a crime.

But Avalon’s words do have impact, he thinks. He features all his TV appearances on his own YouTube channel and claims that his blog is read by 100,000 people, something he takes extreme pride in. When lawyer Ray Beckerman commented on his absurd writing, Avalon told him that he should be happy to be mentioned because it would get him some traffic. When responding to other commenters he simply ignores what’s being said, and changes the topic to himself and his outstanding writings.

You don’t have to be a psychologist to see that Moses Avalon shows signs of having a narcissistic personality disorder, to say the least. Should Mr. Avalon read a bible, he’d honestly believe himself to be the Moses who is so often referenced.

As for his writings with regard to TorrentFreak, the recommendations put forward by the White House of course have no impact on sites that discuss P2P technology. And no, streaming and P2P services that distribute licensed content will not disappear either. It's just the rambling of a pitiful person who hit the narcissist jackpot with this article. Congrats!
http://torrentfreak.com/new-law-will...t-says-110322/





Supreme Court Ruling Makes Chasing File-Sharers Hugely Expensive
enigmax

A court ruling has not only sharply reduced the amount of compensation rightsholders can expect from Danish file-sharing cases, but has also drawn a line on evidential standards. To accurately claim their losses in future, rightsholders will have to gain physical access to an infringer's computer. A leading lawyer in the field says the costs will prove prohibitive.

In 2005, anti-piracy group Antipiratgruppen (APG) and the music industry group IFPI behind it tracked a man who they say was sharing 13,000 music tracks via a Direct Connect network. The case moved through the legal system and went all the way to the Supreme Court.

The six-year-old case has now been concluded and although the rightsholder plaintiffs won their battle – albeit in a much smaller way than anticipated – the Court's ruling is set to prove a huge setback to their overall war.

The case against the now 57-year-old man was brought by APG on behalf of many IFPI-linked record labels and artists. As is so often true in these cases, they had hoped for a punishing outcome in order to deter others. The rightsholders had originally demanded 440,000 kroner ($83,400) in compensation, but that amount was ultimately reduced to 200,000 kroner ($37,900).

However, yesterday the Supreme Court decided that the defendant should pay only 10,000 kroner ($1,900), a major setback for the rightsholders who had hoped for a much higher precedent-setting amount on which to model future cases.

The limiting factor proved to be the reach of the evidence relied on by Antipiratgruppen. APG used techniques which scraped the index of files said to be made available by the defendant and then linked them back to his IP address, a method which has been acceptable in the past. But while the Court accepted that some sharing had occurred, given the defendant's confession, it wasn't satisfied that the index was an accurate representation of the files physically present on the defendant's computer.
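
In practical terms the Court wants the gap closed between a remotely scraped file list and the disk itself. A minimal sketch of what such on-machine verification might look like, assuming investigators have seized the computer and hold a list of claimed filenames with content hashes; the filenames, hash and directory are hypothetical, and SHA-1 stands in for whatever fingerprint the evidence actually recorded (Direct Connect clients typically advertise Tiger Tree hashes):

Code:
import hashlib
from pathlib import Path

# Hypothetical scraped index: filename -> hex digest claimed in evidence.
claimed_index = {
    "track01.mp3": "2ef7bde608ce5404e97d5f042f95f89f1c232871",
}

def sha1_of(path, chunk_size=65536):
    # Hash in chunks so large media files don't exhaust memory.
    digest = hashlib.sha1()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk_size), b""):
            digest.update(block)
    return digest.hexdigest()

def verify_share_dir(share_dir):
    # Keep only the claimed files actually present on disk with matching
    # content, i.e. the stricter standard the ruling points toward.
    confirmed = []
    for name, claimed in claimed_index.items():
        candidate = Path(share_dir) / name
        if candidate.is_file() and sha1_of(candidate) == claimed:
            confirmed.append(name)
    return confirmed

print(verify_share_dir("./seized_share"))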

Per Overbeck, lawyer for the defendant, said that the lowered compensation award shows that it’s worth fighting back.

“The ruling demonstrates that it pays to be critical of Antipiratgruppen’s claims,” he said.

Speaking with Politiken, IFPI lawyer Johan Schlüter said that the Supreme Court decision to tighten the standard of proof in these cases could mean that Antipiratgruppen has to seize and investigate the defendant’s computer in any forthcoming cases, an expensive process that would require a bailiff, IT experts, and in some cases a locksmith.

"I will not directly say that we cannot afford it, but it could be so expensive that it could mean we cannot pursue such matters," said Schlüter. "We cannot accept that we have become completely neutered, so we'll now sit down with some IT people and think through what we can do to provide better documentation."

Schlüter commented that the industry is in somewhat of a “cultural battle” with illegal copying and he could have a point. A recent moral standards study in Denmark found that a high percentage of the public found illicit downloading socially acceptable.
http://torrentfreak.com/supreme-cour...ensive-110325/





P2P Lawyers Score a Victory; Mass Subpoenas Can Proceed
Nate Anderson

Judges across the country have been hammering mass file-sharing lawsuits in recent months, with one in West Virginia even going so far as to “sever” every such lawsuit filed in that district. But it's not all bad news for the attorneys bringing these suits, as they managed to score a victory this week. A federal judge in Washington, DC has decided that three such cases can continue, and the ISPs involved need to turn over names in a timely fashion.

Lawyers and public interest groups have taken part in such lawsuits all over the US, commonly arguing that suing hundreds or thousands of alleged BitTorrent users in a single lawsuit should not be allowed. Three reasons are usually given. First, the defendants in such cases are “improperly joined,” since they did not act together to break the law in the way that, for instance, an organized criminal syndicate might do. Each case should instead be filed individually, for $350 apiece in filing fees.

Second, most of the people sued in most of these cases live outside the district of the federal court in which they are being sued. Copyright holders should have known this, goes the argument, because geolocation tools make it simple to approximate someone's location from just an IP address (a minimal lookup sketch follows this list of arguments). All of these people should be dismissed from the case.

Finally, defendants have a First Amendment right to anonymous speech. This right can obviously be breached when the law has been broken, but groups like the American Civil Liberties Union don't believe that turning in a list of IP addresses to a court is good enough evidence to invade someone's privacy, especially when cases concern pornographic films (as many do).
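
The geolocation step behind the second argument is unglamorous in practice: a lookup against a vendor database. A minimal sketch using MaxMind's geoip2 Python library with its GeoLite2 City database (pip install geoip2); the database file must be downloaded separately, and the path and example address are assumptions for illustration:

Code:
import geoip2.database

def approximate_location(ip, db_path="GeoLite2-City.mmdb"):
    # Returns a coarse (country, region, city) guess. As the courts have
    # noted, this says only where a defendant is "likely" to be, so it
    # can suggest, but not settle, personal jurisdiction.
    with geoip2.database.Reader(db_path) as reader:
        r = reader.city(ip)
        return (r.country.iso_code,
                r.subdivisions.most_specific.name,
                r.city.name)

# Example (needs a real GeoLite2-City.mmdb and a routable address):
# print(approximate_location("8.8.8.8"))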

On Tuesday, Judge Beryl Howell rejected all three claims—at least for now.

For the purposes of this ruling, the judge combined three such mass lawsuits currently pending before the court. As for improper joinder, the judge ruled that suing hundreds or thousands of anonymous defendants at once was permissible, since the claims against them are all “logically related.” The judge also decided that forcing copyright holders to file a separate suit against each individual would cost too much money, something that would “further limit their ability to protect their legal rights.” Later in the case, defendants might well each present different defenses and different factual scenarios that would require them to be separated, but the judge said that they could be joined in the initial discovery phase.

As for personal jurisdiction, the judge said that jurisdiction remains unclear. Geolocation only reveals “where a defendant is ‘likely’ to be located. Given that these lookup tools are not completely accurate, this does not resolve the question of whether personal jurisdiction would be proper.” Once defendants are actually identified by their ISP, "they will then have the opportunity to file appropriate motions" if they are outside the court's jurisdiction.

On First Amendment concerns, the judge did agree that file-sharers are “engaged in expressive activity, on some level,” but concluded that this was fairly minimal and that the copyright holders had compelling reasons to seek out the defendants' identities.

The decision is a victory for the US Copyright Group, which brought the lawsuits. Though actually prosecuting all of the lawsuits to completion might still require thousands of separate cases, large sums of money, and years of effort, one more judge is at least open to mass subpoenas at an early stage of litigation. That should allow rightsholders to continue their preferred approach of filing large cases, mailing out settlement letters, and ending the lawsuit for a few thousand dollars.

Still, with other judges issuing nearly opposite rulings, no clear consensus has yet emerged in the federal judiciary about the proper approach to take.
http://arstechnica.com/tech-policy/n...an-proceed.ars





Righthaven Lawsuits Backfire, Reduce Protections for Newspapers
Steve Green

One year ago, U.S. newspapers and broadcasters could feel confident they controlled the news content they created.

It was understood that competing and special-interest websites couldn't appropriate that content and post it without authorization.

When such infringements occurred, they were dealt with swiftly and effectively with a simple phone call or email.

Infringing websites typically had re-posted material out of ignorance that they were violating the Copyright Act, and agreed to remove the material or replace it with a link to the source newspaper or broadcaster.

Then along came Righthaven LLC of Las Vegas, the self-appointed protector of the newspaper industry from such news sharers.

Some 250 Righthaven lawsuits later, Righthaven's startling achievement is that newspapers now have less -- not more -- protection from copyright infringers.

Righthaven may argue its lawsuits have deterred rampant online infringements of newspaper material -- but there's no proof that the infringements it usually targets, involving bloggers and special-interest websites, ever affected newspaper revenue in the first place.

Keep in mind Righthaven doesn't sue local news competitors of the Review-Journal and the Denver Post and it doesn't sue big news aggregators like Yahoo and Google -- likely because it can't find infringements by these sites.

Back to the lawsuits: Just two of Righthaven's lawsuits have been closed by judges on the merits -- both now resulting in fair use losses for Righthaven and its partners at the Las Vegas Review-Journal.

While these aren't binding precedents upon other judges, these rulings can now be used by special-interest websites to justify their postings of what used to be copyright-infringing content. These, clearly, are setbacks for all newspapers interested in protecting their copyrights.

***

There's little doubt that many of Righthaven's lawsuit targets in fact infringed on copyrights for material that originally appeared in the Review-Journal or the Denver Post.

In these cases, the misappropriated material ran alongside advertising on the infringing websites and sometimes the material wasn't even credited to the Review-Journal or the Post. These sites generally settled, or are settling, the Righthaven lawsuits against them.

Still, if these defendants had fought the suits, Righthaven likely would have won only minimal damages. That's because most of these websites are so obscure that no judge or jury would find their use of the material from the Review-Journal or the Post materially harmed either newspaper -- both of which still offer the material for free on their websites.

There have been some big exceptions like the Drudge Report and Citadel Broadcasting, but most of these lawsuits are against websites few had ever heard of -- cat blogger Allegra Wong's site, for instance.

Another strike against Righthaven is that judges are likely to find copyrights obtained exclusively for the purpose of filing lawsuits are afforded less protection than copyrights held for the usual purpose of delivering the news.

Why aren't more of these defendants fighting Righthaven? Faced with tens of thousands of dollars in legal defense costs, potential damage awards of $150,000 and seizure of their domain names, attorneys usually say it's smarter to settle for a few thousand dollars. This use of the courts as an ATM by Righthaven hasn't gone unnoticed by the federal judges presiding over these cases.

***

U.S. District Court for Nevada Judge James Mahan, in striking the latest fair use blow against Righthaven on Friday, announced a decision that to me would have been unthinkable one year ago: A nonprofit was protected by the fair use doctrine in posting an entire Review-Journal story without authorization.

I wasn't the only one thinking that way. In initially responding to the lawsuit at issue, the defendant's attorneys didn't even argue fair use. It was Mahan who put that issue on the table.

If this decision is adopted by other judges and upheld on appeal, it would mean any nonprofit could post without authorization entire stories from the Las Vegas Sun or any other newspaper -- and presumably television and radio reports as well.

Keep in mind the story at issue wasn't a four-paragraph rant about the TSA or a five-paragraph report on a shooting.

For her 33-paragraph June 2010 story on Las Vegas police arresting illegal immigrants on misdemeanor charges, and authorities later deporting them, R-J reporter Lynnette Curtis interviewed multiple sources and clearly spent a good deal of time researching and writing the piece.

Mahan commented Friday that, "No disrespect to the reporter," but Curtis's story was essentially an information piece and didn't involve a level of creativity that would have afforded it greater copyright protection.

If the case had gone to a jury, though, jurors might have heard about the amount of planning and work involved in executing such a story. They might also have heard about the substantial costs newspapers face every day to maintain buildings and equipment and to pay their staff to produce these types of stories for both their print and Internet audiences.

Including the editing work, in this case you have a substantial piece of journalism that's clearly of value to the R-J and its readers.

But now, according to Mahan, any nonprofit can appropriate the story for its own use and there's nothing Righthaven or the Review-Journal can do about it.

Mahan, during hearings on Friday and in December on the lawsuit against the nonprofit that posted the story, made it clear a big problem he had with the lawsuit was that it was filed without warning or a takedown notice by Righthaven and that Righthaven's copyright for the story is of dubious value since it only uses its copyrights for lawsuits.

If the Review-Journal had filed the lawsuit, rather than Righthaven, the R-J may have received a more sympathetic hearing since the R-J uses its copyright-protected material for the traditional purpose of delivering the news.

But then, if the R-J had called or emailed the nonprofit, the Center for Intercultural Organizing (CIO) in Portland, Ore., there would have been no lawsuit as the CIO appears to be a responsible organization that would have removed the story.

Mahan made three key points Friday that no doubt have occurred to other judges handling these cases: The lack of a takedown request or order hurt Righthaven's cause, the nonprofit status of the defendant weakened the lawsuit and -- most importantly for pending and future Righthaven cases -- the CIO's use of the story did not harm the market for the R-J story.

That's because, like the vast majority of Righthaven defendants, the CIO operates a special-interest website that in no way competes with or diverts business from the R-J website. In fact, it can be argued that most of these special-interest sites help the R-J and the Denver Post by stimulating interest in the R-J, the Post and their coverage of specialized topics.

Further hurting the cause of the R-J and Righthaven, and probably the Post down the road, is that Righthaven critics are pointing out there can be no market harm proven in these lawsuits since there is no market -- that is, Righthaven owns the copyrights and doesn't use or license them except for lawsuits.

This was a point made to Mahan during Friday's hearing by professor Jason Schultz, co-director of the Samuelson Law, Technology & Public Policy Clinic at the University of California-Berkeley.

Schultz had filed a friend of the court brief listing reasons the suit could be dismissed on fair use grounds and also arguing the Review-Journal had encouraged the online posting by the Oregon center by suggesting that readers share its news online.

Schultz's testimony against Righthaven was sponsored by the online free speech group the Electronic Frontier Foundation (EFF), which is fighting Righthaven and Review-Journal owner Stephens Media LLC in a few other copyright cases.

Kurt Opsahl, an EFF senior staff attorney who attended Friday's hearing, said the two fair use rulings against Righthaven have not weakened the newspaper industry's ability to stop legitimate copyright infringements involving content where there's a true market for the content at issue.

Schultz said the rulings weakening Righthaven's copyright claims should serve as a warning to the newspaper industry about doing business with Righthaven.

For newspapers trying to protect copyrights, "You don't want to give wins to the other guys," he said.

"The newspaper industry has to be careful about supporting the Righthaven business model," Schultz said. "If the newspaper industry is depending on copyright lawsuits, it's in a bad way. It's not going to be a good model. It should not rely on lawsuits."

***

Shawn Mangano, the attorney representing Righthaven during the hearing, said the company is hopeful Friday's fair use ruling against it will be struck down on appeal by the 9th U.S. Circuit Court of Appeals because of what Righthaven calls factual and procedural errors by Mahan.

Righthaven is also hopeful the 9th Circuit will reverse or at least modify Righthaven's first adverse fair use ruling, issued by U.S. District Judge Larry Hicks in a lawsuit over the partial posting of a Review-Journal story by a Las Vegas real estate agent.

Righthaven is likely to point out in its appeals that prior to ruling, neither Mahan nor Hicks allowed the parties to gather evidence about the alleged infringements through discovery. This lack of a factual record about the use of the R-J material at issue may undermine those rulings.

***

Friday's ruling against Righthaven was just strike two in what could become an expensive losing streak for the Las Vegas company.

Righthaven and its investors -- Las Vegas attorney Steven Gibson and an affiliate of Stephens Media -- now face seven counterclaims and mounting legal costs in the litigation campaign.

There's a real chance these counterclaims could yield further adverse rulings and awards of hundreds of thousands of dollars in attorneys' fees against Righthaven on four points not covered by Mahan on Friday:

• That Righthaven's copyright claims are compromised by the fact that the Review-Journal and the Post encourage the online sharing of their material.

• That Righthaven's standard demand for seizure of defendants' domain names will be struck down, with judges holding it against Righthaven as an unfair tactic aimed at coercing settlements.

• That Righthaven's copyright assignments from Stephens Media are flawed in that Stephens Media maintains an economic interest in the content covered by the copyrights. This obscure legal point is likely to explode in the coming months as EFF attorneys representing the Democratic Underground and Righthaven attorneys fight over whether the law allows lawsuits over copyrights obtained for the sole purpose of litigation.

• That Righthaven's copyright claims over a Denver Post TSA pat-down photo could be compromised by the fact that the photo went viral and the alleged infringers found it on websites other than the Post's -- meaning they had no idea they were infringing on material initially published by the Denver Post. Complicating these cases further, The Associated Press has reported that it distributed the photo at issue to news outlets, muddying the waters about how anyone was supposed to know it was a Denver Post/Righthaven photo.

***

Elsewhere in the Righthaven litigation campaign, the company continues to run into trouble with its lawsuits because of its policy of suing first and asking questions later.

At least four of its lawsuits in U.S. District Court for Colorado over a Denver Post TSA pat-down photo are giving Righthaven trouble:

• A suit involving North Carolina blogger Brian D. Hill is a case Righthaven would like to see go away. Only after suing Hill did Righthaven learn that Hill has diabetes, attention deficit hyperactivity disorder and mild autism -- facts Hill has been communicating to the world on his websites and in an online petition urging U.S. District Judge John L. Kane in Denver to dismiss the suit against him. His attorney, in the meantime, is friendly with the EFF and is drafting a lengthy response to Righthaven's lawsuit that Righthaven will have to deal with if the case isn't settled.

• An attorney for Glenn Church, who was sued in Colorado over the Denver Post photo on Jan. 27, has informed Righthaven that Church had filed for bankruptcy on Dec. 30 in San Jose, Calif. This may mean extra work for Righthaven as it will have to ask the bankruptcy court for permission to continue the litigation -- assuming Church has money or assets that Righthaven wants to go after. Righthaven's lawsuit against Church alleges the photo at issue showed up on Church's website, foolocracy.com.

• Righthaven's lawsuit against Pajamas Media Inc. remains on hold after the company said it was wrongly sued, as it is a suspended California corporation that is not operating, has no assets and has no connection to the Pajamas Media website pajamasmedia.com.

• After suing Baltic Enterprises LLC and StrangeCosmos.com, Righthaven was informed by their attorney that "another individual and/or entity has been improperly operating under the name Baltic Enterprises L.L.C." and it's that party Righthaven will have to track down and sue.

***

In other Righthaven developments, U.S. District Judge Gloria Navarro in Las Vegas on Friday granted defendant and counterclaimant Thomas Neveu's motion that the case be put on hold for six months.

Neveu asked for the stay and that his case be sealed because of health reasons. Navarro declined to seal the case.

In a filing of nonopposition to the request for a stay, Righthaven attorneys denied assertions by Neveu that Righthaven had publicly disclosed information about his health situation.

Also, Righthaven dropped its motion for a clerk's default against New Hampshire blogger Christopher Malley and his website EMTCity.com, serving the emergency medical technician community.

Righthaven attorneys disputed charges by Malley's attorneys, who said the default motion violated a rule requiring Righthaven to confer with defendants' counsel before filing such motions. Righthaven attorneys said they had called Malley's counsel at the law firm Lewis and Roca LLP the day they filed the default motion but received no response.

With the default motion out of the way, Lewis and Roca filed a response to the lawsuit -- which has already been heavily litigated through motions for dismissal that went against Malley -- with the usual denials and defenses against Righthaven.

These include fair use, that Righthaven's lawsuit is barred by the First Amendment, that the claim is too trivial to pursue, implied license, copyright misuse, alleged fraud upon the Copyright Office, barratry, champerty and that Righthaven lacks standing to sue.

Righthaven has not yet replied to that filing.
http://www.lasvegassun.com/blogs/bus...rotections-ne/





Google Books Deal Blocked

Judge's ruling: Amended Settlement Agreement "not fair, adequate, and reasonable"
Doug Caverly

Barring more legal maneuvering, the proposed Google Books Settlement – which would have cleared the way for Google to scan, digitize, and distribute millions of works – will not stand as is. This afternoon, Judge Denny Chin rejected the settlement, agreeing with opponents that it would give Google an unfair advantage.

U.S. Circuit Judge Chin expressed his opinions in a court document refreshingly light on legalese. He wrote, “While the digitization of books and the creation of a universal digital library would benefit many, the ASA [Amended Settlement Agreement] would simply go too far. It would permit this class action . . . to implement a forward-looking business arrangement that would grant Google significant rights to exploit entire books, without permission of the copyright owners.”

Then Chin continued, “Indeed, the ASA would give Google a significant advantage over competitors, rewarding it for engaging in wholesale copying of copyrighted works without permission, while releasing claims well beyond those presented in the case.”

Finally, the judge finished, “I conclude that the ASA is not fair, adequate, and reasonable. As the United States and other objectors have noted, many of the concerns raised in the objections would be ameliorated if the ASA were converted from an ‘opt-out’ settlement to an ‘opt-in’ settlement.”

Google hasn’t yet said much in response. It’s possible the search giant will follow Chin’s opt-in suggestion; it would almost certainly be the quickest and easiest way to resolve the matter, which has been in lawyers’ hands for years.

Or perhaps, given that Google is in no apparent hurry and has plenty of cash, the company will appeal Chin’s decision.

As always, we’ll be sure to continue following the situation.
http://www.webpronews.com/google-boo...locked-2011-03





Once in the Public’s Hands, Now Back in Picasso’s
Adam Liptak

Supreme Court arguments often concern not just the narrow issue in the case but also the implications of a ruling. You sometimes catch the justices squinting, trying to see over the legal horizon.

Nine years ago, for instance, the court heard arguments in a case about whether Congress was free to add 20 years of copyright protection for works that had not yet entered the public domain.

Several justices asked about a different and even tougher question: Was Congress also free to restore copyright protection to works that had entered the public domain and become public property?

“If Congress tomorrow wants to give a copyright to a publisher solely for the purpose of publishing and disseminating Ben Jonson, Shakespeare, it can do it?” Justice Stephen G. Breyer asked a lawyer for the government.

“It may,” said the lawyer, Theodore B. Olson, who was United States solicitor general at the time. But he did not sound too sure.

A little later, Justice David H. Souter pressed Mr. Olson on the same point and elicited the concession that restoring a copyright presented a much harder case.

“There is a bright line there” for “something that has already gone into the public domain,” Mr. Olson said.

Justice Souter seemed satisfied. “If you don’t throw out a line there,” he said, “then Ben Jonson certainly gets recopyrighted.”

The court ended up ruling, by a 7-to-2 vote in 2003 in Eldred v. Ashcroft, that extensions for works still under copyright are allowed.

This month, the court agreed to hear a case on the question Justices Breyer and Souter anticipated, one that will test whether there is indeed a constitutional line Congress may not cross when it comes to the public domain.

The new case asks whether Congress acted constitutionally in 1994 by restoring copyrights in foreign works that had belonged to the public, including films by Alfred Hitchcock and Federico Fellini, books by C. S. Lewis and Virginia Woolf, symphonies by Prokofiev and Stravinsky and paintings by Picasso, including “Guernica.”

“The works that qualify for copyright restoration probably number in the millions,” Marybeth Peters, the United States register of copyrights, said in 1996.

The plaintiffs in the new case, Golan v. Holder, are orchestra conductors, teachers and film archivists who say they had relied for years on the free availability of works in the public domain that they had performed, adapted and distributed.

The 1994 law, they told the justices, “did something unprecedented in the history of American intellectual property law and constitutionally profound.”

Lawrence Golan, the lead plaintiff, teaches conducting at the University of Denver and is the music director and conductor of the Yakima Symphony Orchestra in Washington State. He said the 1994 law made it very difficult for smaller orchestras to play some seminal 20th-century works that had once been a standard part of their repertories.

“Once you own a Beethoven symphony, you own it till it falls apart,” he said. “That used to be the case with Stravinsky, Shostakovich and Prokofiev. Now an orchestra that wants to play, say, Shostakovich’s Fifth has to rent it for $800 for one performance.”

He said he had no quarrel with providing financial incentives to people who create art. “Obviously, current composers need to be encouraged to create their works, and they should be getting royalties,” Mr. Golan said.

But he said withdrawing works from the public domain did great harm to the cultural life of small communities for no good reason.

That analysis, Mr. Golan’s lawyers say, is consistent with the constitutional balance between property and speech. The Constitution authorizes Congress “to promote the progress of science and useful arts, by securing for limited times to authors and inventors the exclusive right to their respective writings and discoveries.”

In other words, said Anthony T. Falzone of the Stanford Law School Center for Internet and Society, which represents the plaintiffs, the Constitution meant to create incentives, not monopolies. “The whole point wasn’t to protect stuff,” he said. “It was to encourage people to make stuff, and everybody’s lost sight of that.”

The government counters that nothing in the 1994 law did damage to the constitutional structure or to free speech rights.

The government adds that the 1994 law applies to foreign works “previously ineligible for protection or whose authors were unfamiliar with the technicalities of United States law.” Every work brought back into copyright protection, the government says, “expires on the same day as if the work had been protected since its creation.”

The federal appeals court in Denver, in upholding the law, said there were important First Amendment interests at stake on both sides. It concluded that there was reason to think that American authors and artists would be better off abroad if foreign authors and artists received expanded copyright protection here.

That economic calculation rankled Mr. Falzone. “You’re selling public property,” he said. “Congress literally took the public’s property and handed it over to foreign copyright owners.”
https://www.nytimes.com/2011/03/22/us/22bar.html





Best Selling Author Turns Down Half A Million Dollar Publishing Contract To Self-Publish
Mike Masnick

Joe Konrath, who we've written about numerous times, and Barry Eisler (who we haven't...), contacted me late last week to pass on the fascinating news that Eisler, who has been a NY Times Best Selling author of a variety of thrillers, has turned down a $500,000 publishing deal from a mainstream publisher, in order to self-publish his next book. That's a lot of money to give up. The link is to a (long, but fascinating) dialog between Konrath and Eisler, discussing the thinking behind passing up that kind of money to go the self-publishing route. The key takeaway: the $500,000 comes with strings (as does any publishing deal), and in this case, Eisler feels he's likely to be better off on his own.

Konrath, of course, has spent a lot of time sharing real-world data on why the math works for people to self-publish digitally, pricing the book cheaply but making much, much higher royalties per book sold. I'm still not convinced this move is right for everyone, but if you can handle the key functions a publisher provides (things like editing, marketing, etc.) it can work out quite well. Publishers also used to be key for distribution, but that matters less and less now that physical book stores are fading and online/digital is where the action is -- you don't need a publisher for that. Marketing is still the big issue for many, so this depends on how well you can market yourself, or work with someone else (perhaps the person who used to be your "agent") to market the work. And, of course, it's entirely possible that even if you went with a publisher, they'd do an awful job of marketing your book anyway (that happens more often than you'd like to believe).

And, of course, the money in that deal is really an advance, that needs to be earned back. As we've discussed with RIAA accounting, earning that back is a lot more complex than it may sound -- and the gatekeeper (the publisher or the label) gets to reach a level of profits way, way, way before the content creator ever does (if they ever do). This part of Eisler and Konrath's discussion is instructive:

Joe: What was the ultimate basis for your decision? Did it come down to pure dollars and cents?

Barry: Financial considerations were a big part of it, yes. You and I have discussed various models to understand what a publisher's advance represents: a loan, an insurance policy, a bet. On the loan model, the first place I heard the concept articulated was in an extremely ballsy and persuasive blog post by Terrill Lee Lankford.

Joe: I like that analogy. I also believe signing with a big publisher is like signing a life insurance policy, where the payments keep getting larger while the payoff gets smaller as time goes on.

Barry: Yes. Now, of course there are numbers where the loan, the insurance, or the bet would make sense. If the loan is so big that you don't think you'd ever be able to make that much on your own, plus you won't have to pay it back, then sure, take it. If the insurance payout is so big that it eclipses the event it's supposed to protect against, okay. And if you find a publisher willing to put down so much money upfront that you feel they must be stoned because no one could ever earn that much back, then by all means, take the bet.

But short of that, you have to wonder if the person you're betting against isn't yourself.

Anyway, yes, much of this was financial. A lot of people don't realize--and I probably wouldn't have realized myself if you hadn't pointed it out--that the appropriate measure for determining how much your books can earn you in digital is forever. In paper, with rare exceptions, there's a big upfront sales push, followed by either total evaporation or by years of low backlist sales. Digital isn't like that.

Joe: Time is the ultimate long tail. Even with a big wad of money upfront, if something sells forever, the back end is what ultimately counts.

Barry: Right. So if you think you're going to die on Tuesday, for sure take the advance on Monday. If you think you're going to stick around for a while, though, and you have resources to draw on such that you don't need that expensive loan, don't take it. You'll be better off without.

Joe: Or to put it another way, getting half a million bucks and 14.9% royalties, forever, isn't as lucrative as no money up front and 70% royalties, forever.

Barry: Yes. Especially because you first have to earn out the half million at 14.9% per book. That could take a while. After which, as you note, you're still only earning 14.9% rather than 70%. You need to move five times the volume at 14.9%.
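
To put rough numbers on that trade-off, here's a back-of-the-envelope sketch in JavaScript. The $9.99 ebook price is an assumed figure for illustration, not one from the conversation:

// Back-of-the-envelope comparison of the two deals discussed above.
// The $9.99 ebook price is an assumption, not a figure from the dialog.
var advance = 500000;     // publisher's up-front advance
var price = 9.99;         // assumed digital list price
var legacyShare = 0.149;  // author's per-sale share under the publisher deal
var indieShare = 0.70;    // author's per-sale share self-publishing

// Copies that must sell before the advance "earns out" and the
// publisher deal pays the author anything further:
var earnOut = advance / (price * legacyShare);
console.log(Math.round(earnOut));  // ~335,906 copies

// Per-copy earnings after earn-out: ~$1.49 vs. ~$6.99 self-published,
// so the publisher deal needs roughly 4.7x the volume to keep pace:
console.log((indieShare / legacyShare).toFixed(1));  // "4.7"

In other words, at an assumed $9.99 price the advance doesn't earn out until well past 300,000 copies sold, and even then each publisher-channel sale pays less than a quarter of a self-published one -- which is the "five times the volume" Eisler is gesturing at.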

Of course, the conversation doesn't just focus on Eisler and his decision to turn down half a million dollars up-front. It talks about the publishing industry as a whole, and how it -- like so many other industries -- is struggling to recognize that it's moving away from being a gatekeeper business, and needs to start becoming an enabler business. Instead, it's trying to hang onto the gatekeeper side of things for as long as possible (again, like some other industries we're familiar with).

Joe: I also love print books. I have 5000 of them. But print is just a delivery system. It gets a story from the writer to the reader. For centuries, publishers controlled this system, because they did the printing, and they were plugged into distribution. But with retailers like Amazon, B&N, and Smashwords, the story can get to the reader in a faster, cheaper way. And publishers aren't needed.

Do you think publishers are aware of that?

Barry: I think they’re extremely aware of it, but they don't understand what it really means.

Joe: I believe they've gotten their business model mixed-up. They should be connecting readers with the written word. Instead, they're insisting on selling paper.

Barry: Yes. There's a saying about the railroads: they thought they were in the railroad business, when in fact they were in the transportation business. So when the interstate highway system was built and trucking became an alternative, they were hit hard. Likewise, publishers have naturally conflated the specifics of their business model with the generalities of the industry they're in. As you say, they're not in the business of delivering books by paper--they're in the business of delivering books. And if someone can do the latter faster and cheaper than they can, they're in trouble.

Joe: You say they're aware of it, and some evidence points to that being true. The agency model is an attempt to slow the transition from paper to digital. Windowing titles is another one. So are insanely high ebook prices.

Barry: All signs that publishers are aware of the potential for digital disintermediation, but that they don't understand what it really means.

Joe: Because they still believe they're essential to the process.

Barry: I would phrase it a little differently. They recognize they're becoming non-essential, and are trying to keep themselves essential--but are going about it in the wrong way.

Joe: You and I and our peers are essential. We're the writers. We provide the content that is printed and distributed. For hundreds of years, writers couldn't reach readers without publishers. We needed them.

Now, suddenly, we don't. But publishers don't seem to be taking this Very Important Fact into account.

Barry: Well, again, I think they're taking it into account, but they're drawing the wrong conclusions. The wrong conclusion is: I'm in the paper business, paper keeps me essential, therefore I must do all I can to retard the transition from paper to digital. The right conclusion would be: digital offers huge cost, time-to-market, and other advantages over paper. How can I leverage those advantages to make my business even stronger?

Now, I know we have some publishing folks among the readership here, and I'm sure they'll disagree, but there's clearly some truth to this. And, in fact, there may be many individuals within the various publishing companies who do get this. But institutionally, they seem to be reacting to hold back the tide rather than embrace it. This is a pretty standard reaction, and we've seen it in other industries before. One typical response we hear when pointing this out is that these publishers don't want to make that "leap" to really embrace the new until they know it's sustainable and can work. But the key lesson we've learned over and over again in other industries is that if you wait for that certainty, it's too late. In ceding that leadership position, you give up on being the enabler, and what's left for you is often... not much.
http://www.techdirt.com/blog/casestu...-publish.shtml





Receivd: Real Time File Sharing

Want early access?

We were sick and tired of the endless hoops we had to jump through just to send photos, videos and other files to the people we care about without compromising on quality. Receivd is a powerful solution we've built to solve that problem.

Have photos you need to send to multiple people? No problem.

Receivd lets you create lists of people to securely share files with. You then drag & drop sets of files once, and everybody on your list starts securely downloading them instantly, without having to manually approve the downloads. Simpler than email.

Need to send that essay to your classmate? Piece of cake.

Receivd also lets you send files securely to just one person. If they're not in front of their computer, the files you sent will be waiting for them when they return, along with email notifications linking to the files.

Using different platforms? We've got you covered.

Receivd's beautiful and powerful desktop, web, and mobile apps let you browse through the photos, videos and other files people have sent you, and send files effortlessly to multiple people at once. Receivd also keeps everything organized so you never have to search for files someone has sent you.

Send originals, not copies. They get just what you send.

Unlike with email, photos and videos sent via Receivd are delivered at the full resolution your camera supports. No compromises.
http://receivd.com/early-access





Times’s Online Pay Model Was Years in the Making
Jeremy W. Peters

The discussions played out over most of 2009 amid the hum of the third floor newsroom and in the executive suites high above Times Square, consuming what seemed like countless meetings and consultants’ recommendations.

At issue was the biggest strategic leap in a generation for the 159-year-old New York Times: would readers be willing to pay to read its journalism online?

The Times announced its new subscription plan last week to widespread debate. Many readers and bloggers said they were happy to be able to finally pay for their frequent use of the Web sites, while many others — joined by some industry analysts and pundits — said that The Times was dangerously out of step with the digital age and that the approach was doomed to fail.

The same debate raged inside The Times, with executives and senior editors sometimes heatedly taking sides. In the middle was Arthur Sulzberger Jr., chairman of the company, who grew to embrace the idea of a pay model. But he was opposed by several senior executives, especially those who had worked to build NYTimes.com into the most visited newspaper site in the world.

The risks were manifold. The company might jeopardize its huge online reach, and no one could predict what would happen to digital advertising, which had gone from being a drop in the bucket to more than a quarter of The New York Times Company’s overall advertising revenue.

Given the size of its online audience and its historic position as the country’s paper of record, its pay model, which takes effect on March 28, may be the most watched experiment in American journalism. The announcement last week might have laid out the company’s plan, but it did not end the debate, inside the building or out.

“On the one hand, I think there is some anxiety around it,” said Martin A. Nisenholtz, the senior vice president for digital operations, who initially wanted NYTimes.com to remain open and free. “On the other hand, I think the model we have chosen mitigates 90 percent of it.”

The Times had experimented with a pay model before: TimesSelect, which ran from 2005 to 2007, charged for access to popular opinion columnists like Frank Rich and Maureen Dowd and for The Times’s archives. That program brought in 227,000 subscribers at $49.95 a year, generating about $10 million in revenue.

But after they commissioned a study to examine how TimesSelect was working, company executives became convinced that restricting access to the site was constricting its potential for more readers and more advertising.

When that program was ended, traffic to the site almost doubled. It now stands at more than 30 million unique domestic visitors a month.

“It made more business sense to end TimesSelect,” said Vivian Schiller, who was senior vice president and general manager of NYTimes.com and championed ending the pay model. “At that time the game was about math.”

When the depths of the recession arrived in 2008, revenue from digital efforts had started to decline. In 2009, the Times Company borrowed $250 million from the Mexican billionaire Carlos Slim Helú at the extremely high interest rate of 14 percent, and for the first time in its history the paper resorted to layoffs in its newsroom. While online advertising has since recovered, it has not rebounded at a pace sufficient to make up for the continuing drop-off in print advertising.

These developments added fresh urgency to discussions about an online subscription model. There was a sense that online readers might be willing to pay, as print subscribers always had, according to Denise Warren, the chief advertising officer of The New York Times Media Group, which includes The International Herald Tribune.

“Some of them even send us checks unsolicited,” she said. “I have this woman in Canada who’s sent me two $50 checks because she doesn’t understand why she can get our journalism for free. Each time I have to tell her I can’t accept the check.”

Executives studied a variety of online business models including those used by Weight Watchers, which charges $17.95 a month plus a $29.95 initiation fee for weight loss guidance, and Apple’s iTunes service, which popularized the micropayment with the 99-cent song download. They even looked at a donation model and at creating a digital newsstand where people could buy The Times as part of a bundle with subscriptions to local papers and national papers like The Wall Street Journal and The Washington Post.

Mr. Sulzberger wanted a flexible system, one that would allow the company to adjust the limit on the number of free articles as needed — in the case of a big breaking news event, for example.

“Let’s imagine there’s a horrifying story like 9/11 again,” he said in an interview. “We can — with one hit of a button — turn that meter to zero to allow everyone to read everything they want,” he said. “We’re going to learn. We built a system that is flexible.”

The issue had long split the newsroom, as well. Many reporters and editors embraced the reach of the free site while others worried about further cuts if The Times could not tap a new source of revenue.

“I believe that our journalism is very worth paying for,” said Jill Abramson, The Times’s managing editor for news. “In terms of ensuring our future success, it was important to put that to the test.”

In the end, executives decided on a tiered plan, one that would allow visitors to read 20 articles a month at no charge before being asked to select one of three subscription models: $15 every four weeks for access to the Web site and a mobile phone app (or $195 for a full year); $20 for Web access and an iPad app ($260 a year); or $35 for an all-access plan ($455 a year).

“I think people across the board would find it unacceptably uncomfortable to have a hard wall up,” Mr. Nisenholtz said. “But I think everybody is much more comfortable in an environment where content can still be shared, tweeted, blogged.” Articles that readers gain access to through social networks will not count toward the monthly limit.

But there were two looming issues in the model. The first was technical: computer errors had marred TimesSelect. Home delivery subscribers, who were given free access, often had difficulty linking to their accounts online.

“That is one of the things we learned with TimesSelect,” said Janet L. Robinson, chief executive of the company. “We had to have the user experience down cold.”

For that reason, the new system took more than a year to build and was rolled out several months behind schedule. The Times has not publicly said how much the system cost to build.

The other issue was price. Executives considered more than 100 product combinations priced as low as $5 and as high as $40 a month, and they studied more than 20,000 consumer survey responses. Executives say they were surprised how many readers were willing to pay.

“It was high,” said Paul F. Smurl, vice president for paid products for NYTimes.com. “So high I said we needed to do the testing again.”

The company also wanted to get in early on the paid-app market, which the iPad has demonstrated can be a growing business. “This should be a huge marketplace for The New York Times,” Mr. Nisenholtz said.

Despite the consumer testing, the plan has been met with considerable skepticism. Critics point out that paying for general news online, as opposed to business news, is something that few without expense accounts have demonstrated a willingness to do.

“At the moment, I can’t see any evidence of a general interest newspaper making a success of this,” said Alan Rusbridger, editor of The Guardian, which does not charge for its Web site. “I think financial journalism is the one exception to the rule because the information is of value, and it’s time-critical. If I can know something five minutes before you do because I have a subscription to The Financial Times or The Wall Street Journal and you don’t, that’s of value.”

The Times will not say publicly how many online subscribers it hopes to get. But company executives have said privately that the goal for the first year is 300,000. And Mr. Sulzberger and Ms. Robinson insist that the plan is not intended for short-term gain.

“This is not a bet on this year,” Mr. Sulzberger said. The question that remains to be answered is whether that bet pays off in 2015, 2020 or ever.

“There’s no mystery here about what the risk is. This is essentially a bet that you can reconstitute to some degree the print economics online,” said Jonathan Landman, who was deputy managing editor for digital journalism until 2009, when he was named culture editor.

In his digital role, Mr. Landman argued against charging readers. But he says he believes the model the paper has settled on is sophisticated and designed to minimize traffic loss as much as possible.

“I’m not saying that anybody who believes in this is some kind of dinosaur,” he said, “but it says that as a business we can be a modernized form of what we do. And that may be true.”
https://www.nytimes.com/2011/03/21/b...a/21times.html





That Was Quick: Four Lines of Code is All it Takes for The New York Times’ Paywall to Come Tumbling Down
Joshua Benton

The New York Times paywall is costing the newspaper $40-$50 million to design and construct, Bloomberg has reported.

And it can be defeated through four lines of Javascript.

That fact is both the problem and the opportunity of a leaky paywall. There is no one consistent, workable price for online news content. For the vast majority of people who read a news site, the price they’re willing to pay is zero; for a few, it’s something more. The key question of the Times paywall — and of any paywall, really — is how to maximize the revenue generated from those two extremes and the various gradations in between.

The Times’ approach is to create a relatively high price point — $15 to $35 a month, depending on the package — for those willing to pay. For those who are very casual fly-by readers — those who read fewer than 20 articles a month — the site remains free, and the Times makes money from advertising. And for those in the middle — readers who lack the brand loyalty to want to pay, but nonetheless like to see Times stories pop up in their Twitter feed — the social media “leak” in the paywall will keep letting them in for ads.

That kind of nuance makes for a much more precise instrument than a blunt-force paywall. But it also puts the onus on you to get all that nuance right. Get it wrong and you risk angering readers — or letting would-be paying customers in for free.

The Times paywall doesn’t launch in the United States for another week; the paper has plenty of time to plug this particular Javascript vulnerability, which goes by the name NYTClean, if it wants to. But the real question is: Is this a hole they really want closed? Or is this one of the intentional leaks in the wall?

The parable of NYTClean

<nerdy interlude>

In my piece Thursday looking at the paywall — currently only live in Canada — I noted that, when you reach your 20-article limit and try to read more, the contraband article actually loads just fine in your browser — it’s just quickly covered by an overlay obscuring the article and reminding you to pay up:

The full text of the article is still visible in the page source. And as I mentioned in responding to a commenter — and as is evident to anyone who can right-click on a page and choose “Inspect Element” — the overlay is nothing more than a little CSS and Javascript.

Unfortunately for the Times, there are plenty of popular (or popular-among-nerds) tools that tactically remove little bits of CSS and Javascript. There’s Greasemonkey, there’s Stylish — not to mention the ease with which a browser extension in Firefox, Chrome, or Safari can be built to strip out code. As I wrote:

Quote:
…not to get too far into it (although many bearded people will in the coming days, I can assure you), but yeah, as far as I can tell it’s just a set of divs generated by some javascript. Although I couldn’t quickly find that script in any of the linked .js files, certainly someone nerdier than me will.

So an attempt at a set of Firefox/Chrome/Safari extensions named FreeNYT can’t be too far off. Although I’m sure the Times has already thought of some creative things to counter that too.
Well, consider the first shot in the NYT paywall battle fired. Canadian coder David Hayes has just released NYTClean, a bookmarklet that, in one click, tears down the Times’ paywall.

“Released” is probably even a little strong — it makes it sound like there was an extended development process. All NYTClean does is call four measly lines of Javascript that hide a couple <div>s and turn page scrolling back on. It barely even qualifies as a hack. But it allows you access to any New York Times story, even when you’re past the monthly limit. (I just tested it out with a Canadian proxy server — works just like it says.)
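
For the curious, the general shape of such a bookmarklet is easy to reconstruct. The sketch below is hypothetical -- the element IDs are invented for illustration and are not NYTClean's actual code -- but it shows the technique: delete the overlay <div>s and turn page scrolling back on.

// Hypothetical sketch of an overlay-removing bookmarklet.
// The element IDs here are invented; the real paywall divs are named differently.
javascript:(function () {
  var ids = ['overlay', 'gatewayCreative'];  // assumed overlay <div> IDs
  for (var i = 0; i < ids.length; i++) {
    var el = document.getElementById(ids[i]);
    if (el) el.parentNode.removeChild(el);   // tear down the covering <div>
  }
  // The paywall also disables page scrolling via CSS; turn it back on.
  document.documentElement.style.overflow = 'auto';
  document.body.style.overflow = 'auto';
})();

(Installed as a bookmark, the whole thing collapses onto a single javascript: line.) Because the full article text was sent to the browser all along, nothing needs to be fetched or decrypted -- the wall is purely cosmetic.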

</nerdy interlude>

(Obligatory note: I think the Times is right to ask regular readers to pay, and I think their paywall is basically well designed. Me, I just became a print subscriber last week, using the Frank Rich Discount. Support your local journalist!)

Leakiness: a bug and a feature

Now, the Times paywall is, to a certain extent, defined by its leakiness. The various holes — external links from social media and search biggest among them — are no accident; they’re the result of some (correct, I say) thinking about hitting the right balance between fly-by and dedicated readers, between those who come in the front door and others who arrive from the side.

But the tradeoff for those holes is that they’re designed to be a pain to use if you’re a dedicated NYT reader. Click an occasional Times link when it comes up in your Twitter stream? No problem. But if you’re the kind of person who goes to nytimes.com every morning and clicks on four or five articles, you’ll quickly find it’s a big pain to go search for a headline in Google or Twitter every time you want to read another David Carr piece. (A similar workaround has existed for Wall Street Journal stories behind its paywall for years, but it’s doubtful anyone other than the most desperate reader has ever used it much.)

This CSS-and-Javascript hole, however, isn’t difficult to use at all. One drag into your bookmark bar, then one click whenever you hit a blocked article.

And yet this workaround is so blindingly obvious to anyone who’s ever worked with code that it’s difficult to imagine it didn’t come up in the paywall planning process. The other major news paywalls — WSJ, FT, The Economist — don’t actually send the entire forbidden article to your browser and then try to cover it up with a couple lines of easily reversible code. They just hit you with a message saying, in effect, “Sorry, pay up here” whenever you stray past the free zone.

And that leakiness is actually a defensible choice, I think, on the Times’ part. Imagine a Venn diagram with two circles. One represents all the people on the Internet who might be convinced to pay for nytimes.com. The other represents all the people on the Internet who (a) know how to install a bookmarklet or (b) have read a Cory Doctorow novel. Do you really see a big overlap between the two? If someone is absolutely certain to never pay for the NYT, then it makes sense to squeeze a little extra advertising revenue out of them on the rare occasions when a link sends them to nytimes.com.

The problem with that model, though, is that it assumes inefficiency. It assumes that the happy-to-pay crowd (or the grudgingly-will-pay crowd) never finds out about the workarounds — or at least that the workarounds remain complicated enough that they won’t want to bother. One click, though, ain’t all that complicated.

And that nudge-nudge approach to security through obscurity also assumes that the Times will be, at some level, okay with people using workarounds. It’s a tough balance: tolerating them so long as they boost advertising revenue and continue to give people the impression nytimes.com is available to them; breaking them when they prove to be too popular among people who might otherwise pay.

To get an idea what that balance looks like, check out statements from two top Times officials in the past few days. First, Eileen Murphy, NYT vice president of corporate communications, talking to the Canadian Press:

She said the paper will be watching for attempts to circumvent the digital subscription system and the limits in place, like if Twitter users tweeted links to the entire paper.

“If it was something blatant…that is likely something that we would make an effort to go after,” Murphy said.

“If there was some real attempt to game the system in some way that was not appropriate it’s something we would certainly look at.”

Psst…if you’re looking for someone who tweets a whole bunch of links to NYT content, I know a guy.

Or Martin Nisenholtz, in his interview with Peter Kafka:

…we want to make sure that we’re not being gamed, to the extent that we can be…We’re obviously going to be vigilant over the next couple of months, in looking at the ways that people are doing that…

I don’t think we’re going to spend enormous resources to go tracking people down. But at the same time, we’re going to obviously work to see where the source of these workarounds are, and work to close them off, if they become substantive enough.

But in looking at the research that we did, we expect [paywall jumpers] to be a very significant minority, a small, small number of people. When you look at your Twitter feed, based on the people you follow, it probably seems like it’s looming very large. But in the scheme of things, among people who don’t live in Silicon Valley or don’t cover it, the vast majority of people do not have this on their minds.

That last bit gets at the issue: You can afford to let nerds game your system. You probably want them to game your system, because they (a) are unlikely to pay, (b) generate ad revenue, and (c) are more likely to share your content than most.

The danger is when it becomes easy for non-nerds to do it. And that’s the risk of any leaky paywall — the risk that you might calibrate the holes incorrectly and let too many of your would-be subscribers through. Something like NYTClean — or the many tools that will soon follow it — could be the kind of thing that tips the balance in a way that hurts the Times.
http://www.niemanlab.org/2011/03/tha...mbling-down-2/





NY Times Asks Twitter to Shut Down Paywall Dodgers
Jeff Bercovici

There are plenty of ways to tunnel under or through The New York Times’s new online paywall, but if you’re not careful about how you do it, the Times will shut you down.

That’s what’s about to happen to FreeNYTimes, a Twitter feed started to take advantage of one of the holes in the wall: While users are allowed only 20 pageviews a month before the wall kicks in, visits to the site via social media links are unlimited. As I predicted, this was an inducement for someone to simply tweet a link to every single story on the site. In this case, the perpetrator seems to be a web developer who used the Times’s own API (application programming interface) to automate the process.

It’s clever, but it’s not kosher. “We have asked Twitter to disable this feed as it is in violation of our trademark,” says a Times spokeswoman. She adds that the paper has been monitoring and has already blown the whistle on other violations. Guess this won’t be my ticket to easy paywall beating.

I also asked her about NYTClean, a bookmark that defeats the paywall with the aid of four lines of code. The response: “As we have said previously, as with any paid product, we expect that there will be some percentage of people who will find ways around our digital subscriptions. We will continue to monitor the situation but plan no changes to the programming or paywall structure in advance of our global launch on March 28th.”
http://blogs.forbes.com/jeffbercovic...ywall-dodgers/





New York Times Announces Paywall Launch Time, Drops Case Against @FreeNYTimes
Lauren Indvik

The New York Times announced that its widely unpopular paywall will go into effect in the U.S. and worldwide beginning 2 p.m. Monday, March 28.

The paywall will be implemented in stages throughout the hour, a Times spokesperson said in an emailed statement. Readers who wish to continue reading more than 20 articles per month and who want to have access to full Times content on their mobile and/or tablet devices will then be able to choose between three different plans at NYTimes.com/access:

$15 for four weeks of access to NYTimes.com and a mobile phone app.
$20 for four weeks of access to NYTimes.com and its iPad app.
$35 for four weeks of access to all of the above.

Print subscribers to the Times and the International Herald Tribune will continue to have access to all of the Times’ digital offerings at no additional charge. Those who decline to subscribe will still be able to read all NYTimes.com front page content and up to 20 additional NYTimes.com articles per month, as well as the “Top News” sections of the Times’ smartphone and tablet applications, without paying. In addition, non-subscribers will have access to articles found through search (limited to five per day from major search engines), blogs and social networks like Facebook and Twitter, even if they have exceeded their 20-article reading limit.

The paywall has been in place in Canada since it was first announced March 17.

In addition, the Times confirmed a report that it had dropped its case against @freeNYTimes, a Twitter feed designed to help readers circumvent the forthcoming paywall.

The publisher asked Twitter earlier this week to disable the account because it was in violation of the Times’ trademark. The account has since removed the Times’ logo from its profile photo, and the Times has withdrawn its request.
http://mashable.com/2011/03/25/new-y...-paywall-date/





Sholes Signing Key Leak Explained
Nenolod

The Motorola Sholes platform uses a trusted bootloader environment. Signatures are stored as part of the CDT (Codegroup Descriptor Table) on the NAND flash. mbmloader verifies the signature on mbm before passing control; mbm verifies all other signatures before allowing the device to boot.

There is a vulnerability in the way that Motorola generated the signatures on the sections stored in the CDT. This vulnerability is very simple. Like on the PlayStation 3, Motorola forgot to add a random value to the signature in order to mask the private key. This allowed the private signing keys of firmware for vulnerable phones to be cracked given a signature, public key and data SHA1 checksum.

The keys can be cracked using Mathematica. Read up on how the Elgamal signature scheme works.
TL;DR: k = s - sha1sum(data)

The above formula will yield the signing keys on vulnerable phones, due to Motorola botching its signature generation.
What went wrong?

Signature and public key nonce values were zero, which pretty much wiped out the equation, exposing the private key.
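
For readers who want the underlying math, here is a sketch using the textbook Elgamal signature scheme; the variant Motorola used may differ in detail, and note that the advisory's k above denotes the recovered key, while below k is the per-signature nonce and x the private key. A signature (r, s) on message m satisfies

\[
  r \equiv g^{k} \pmod{p},
  \qquad
  s \equiv k^{-1}\,\bigl(H(m) - x\,r\bigr) \pmod{p-1}
\]

The nonce k must be secret and fresh for every signature. If it is fixed or known -- here, effectively zeroed out -- a single signature gives away the private key (provided r is invertible mod p-1):

\[
  x \equiv \bigl(H(m) - k\,s\bigr)\, r^{-1} \pmod{p-1}
\]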
Keys

Not placed here due to Motorola legal.

Ok, what does this mean?

Please refer to the following table:

Boot chain component    Status
OMAP secure bootrom     secure
Secure keystore         replaceable (this CG must be signed by Motorola's key)
mbmloader               secure, but irrelevant
mbm                     secure, but irrelevant; replaceable but unnecessary
recovery                replaceable (signable by anything in keystore)
system                  replaceable (signable by anything in keystore)
bootimage               replaceable (signable by anything in keystore)

I do not plan on doing any more work on this. But all information has been handed over to people who are working on this. Follow the FreeMyMoto people for their progress.

In theory, creating a packed SBF to update the keystore and replace recovery should work without bricking your phone. My advice: do not replace mbmloader, as that is dangerous. An earlier version of this advisory marked it as replaceable; I have removed this claim as I cannot presently think of a way to do it safely.
Notes to recovery authors

Your recovery must update the signatures on the Codegroup Descriptor Table (CDT). If it does not, your recovery will brick the phone if you attempt to flash a custom ROM.
http://nenolod.net/~nenolod/sholes-k...explained.html





U.S. Senators Ask Apple to Pull Police-Evasion Apps
Josh Lowensohn

A group of U.S. senators is calling on Apple to remove applications that alert users to the presence of police and other law enforcement checkpoints that have been set up to combat drunk driving.

U.S. Senators Harry Reid (D-Nev.), Charles E. Schumer (D-N.Y.), Frank Lautenberg (D-N.J.), and Tom Udall (D-N.M.) are named as senders in the letter, which is addressed to Apple's senior vice president of iPhone software, Scott Forstall. No specific applications are named, but the letter highlights apps that "contain a database of DUI [driving under the influence] checkpoints updated in real-time" as well as one that sends out real-time alerts about the existence of these checkpoints.

"With more than 10,000 Americans dying in drunk-driving crashes every year, providing access to iPhone and iPad applications that alert users to DUI checkpoints is harmful to public safety," the group wrote. "We know that your company shares our desire to end the scourge of drunk driving and we therefore would ask you to remove these applications from your store."

A quick search on the App Store shows several such apps, some with suggestive names such as Tipsy and Fuzz Alert Pro, some that cost money and some that are free. Alongside these more specialized applications are crowd-sourced, social-network-style apps that can alert users to general police presence on local roads and highways.

Apple did not immediately respond to a request for comment.

Along with drunk driving apps, Apple is currently under fire for approving an iPhone application from a religious ministry that takes a stance on homosexuality, encouraging users to "cure" themselves of it. That particular app has been up on the store since mid-February, and continues to be made available.

To combat any confusion or ambiguities on its rules and regulations for application approval, Apple released a set of App Store guidelines back in September that spells out what apps are and are not allowed to do. Included on that list of "don'ts" are "apps that encourage excessive consumption of alcohol or illegal substances, or encourage minors to consume alcohol or smoke cigarettes."

Here's a full copy of the senators' letter:

Quote:
Mr. Scott Forstall
Senior Vice President, iPhone Software
Apple, Inc.
1 Infinite Loop
Cupertino, CA 95014

Dear Mr. Forstall,

We write today with grave concern regarding the ease with which downloadable applications for the iPhone, iPad, and other Apple products allow customers to identify where local police officers have set up DUI checkpoints. With more than 10,000 Americans dying in drunk-driving crashes every year, providing access to iPhone and iPad applications that alert users to DUI checkpoints is harmful to public safety.

We know that your company shares our desire to end the scourge of drunk driving and we therefore would ask you to remove these applications from your store.

One application, your company acknowledges in the product description, contains a database of DUI checkpoints updated in real-time. Another application, with more than 10 million users, also allows users to alert each other to DUI checkpoints in real time.

Police officers from across the country have voiced concern about these products, with one police captain saying, "If people are going to use those, what other purpose are they going to use them for except to drink and drive?" With a person dying every 50 minutes in a drunk-driving crash, this technology should not be promoted to your customers--in fact, it shouldn't even be available.

We appreciate the technology that has allowed millions of Americans to have information at their fingertips, but giving drunk drivers a free tool to evade checkpoints, putting innocent families and children at risk, is a matter of public concern. We hope that you will give our request to remove these applications from your store immediate consideration.

Thank you for your prompt and careful consideration of this matter. Should you have additional questions, please do not hesitate to contact our offices.

Sincerely,

Senator Reid
Senator Schumer
Senator Lautenberg
Senator Udall
http://news.cnet.com/8301-13579_3-20045942-37.html





RIM Bows to Pressure, Yanks BlackBerry DUI Checkpoint App
Gregg Keizer

Research in Motion said Wednesday that it would comply with a request made by four U.S. senators, and will pull BlackBerry apps that alert drivers of police drunk-driving checkpoints.

"RIM's decision to remove these apps from their online store proves that when it comes to drunk driving, there should not be an app for that," said Sen. Charles Schumer (D-NY), one of the four lawmakers, in a statement Wednesday.

At least one app has disappeared from the BlackBerry App World.

On Tuesday, Sens. Harry Reid (D-Nev.), Schumer, Frank Lautenberg (D-NJ) and Tom Udall (D-NM) asked Apple, Google and RIM to pull an unspecified number of apps from their mobile app markets.

RIM is the first of the three smartphone operating system makers to confirm that it's removing applications from its online mart.

The senators had problems with apps that include alerts of upcoming sobriety checkpoints, a feature in some programs that also warns drivers of user-reported speed traps, roving radar-equipped patrol cars and recent accidents. Many of the apps integrate a smartphone's built-in GPS to display police and accident locations.

"Giving drunk drivers a free tool to evade checkpoints, putting innocent families and children at risk, is a matter of public concern," the senators said in their letter to executives at Apple , Google and RIM.

Joe Scott, the CEO of PhantomALERT, the only app cited by name in the senators' letter, confirmed that RIM had pulled his program.

"It was recently brought to RIM's attention that the PhantomALERT application for BlackBerry raises public safety concerns, specifically around the functionality that allows an end user to avoid police checkpoints set up to catch drivers under the influence," read an e-mail from RIM that Scott shared with Computerworld. "In response to this concern, RIM has removed the application from BlackBerry App World."

Apple and Google have not answered Computerworld's requests for comment.

RIM did not respond to further questions today, including how many apps it has yanked, and their titles.

PhantomALERT is produced by a Harrisburg, Penn.-based company of the same name. As of 5 p.m. ET Wednesday, a search for the app on the BlackBerry App World site came up empty. However, the program was still available on Google's Android Market and Apple's App Store.

On Wednesday, Scott again argued that PhantomALERT is "100% legal" and said the senators' concerns were unwarranted. "I think we are misjudged. If they really understood what we are doing and aim to achieve they would actually support us," Scott said in an e-mail.

One trade group agreed with Scott.

"The suggestion that the government should compel Apple, RIM, or other mobile application stores to block programs that simply allow users to report information based on location is misguided at best," said the Association for Competitive Technology (ACT), a Washington-based group that claims to represent more than 3,000 small- and mid-sized IT companies. "Taken to its conclusion, that would require blocking apps like Foursquare and Loopt. Having the government act as arbiter of which products should be sold in stores is a slippery slope that few would welcome."

ACT also counts Microsoft, Oracle and eBay among its members.

Not all states conduct DUI (driving under the influence) or DWI (driving while intoxicated) checkpoints. According to the Governors Highway Safety Association, 38 states as well as the District of Columbia allow police to run such checkpoints.

Twelve states, including Michigan, Minnesota, Texas and Oregon, do not.
http://www.pcworld.com/article/223128/





Google Says It Won’t Pull DUI Checkpoint Evasion App
Gabriel Perna

Don't expect Google to remove apps that help users avoid DUI checkpoints -- the company says it is leaving the controversial apps on its Android Marketplace.

A source said the company only removes apps that violate its Android content policies, and the apps in question do not appear to violate these policies.

Four senators, Harry Reid (D-NV), Charles E. Schumer (D-NY), Frank R. Lautenberg (D-NJ), and Tom Udall (D-NM), sent letters out to Apple, Google and Research in Motion asking them to remove apps that help people avoid DUI checkpoints, saying they are dangerous. RIM has agreed to pull the apps and Apple has yet to respond.

On its Android app policy site, the only rule the DUI checkpoint evasion apps come close to breaking is the "illegal activities" policy: Google says Android apps must "Keep it legal. Don't engage in unlawful activities on this product."

The main source of concern is an app called PhantomAlert, which shows the locations of the DUI checkpoints, school zones, red light cameras and speed traps. It can also be uploaded to a user's GPS system and costs $9.99 per month. Other apps of this nature include Buzzed and Trapster.

The senators say having these kinds of apps available is dangerous. One person dies every 50 minutes in a drunk-driving accident, and more than 10,000 Americans die in drunk-driving crashes each year.

"We appreciate the technology that has allowed millions of Americans to have information at their fingertips, but giving drunk drivers a free tool to evade checkpoints, putting innocent families and children at risk, is a matter of public concern. We hope that you will give our request to make these applications unavailable immediate consideration," the senators said in their letters to Google, RIM and Apple.

Apple has not responded to a request for comment.
http://www.ibtimes.com/articles/1264...ui-evasion.htm





It’s Tracking Your Every Move and You May Not Even Know
Noam Cohen

A favorite pastime of Internet users is to share their location: services like Google Latitude can inform friends when you are nearby; another, Foursquare, has turned reporting these updates into a game.

But as a German Green party politician, Malte Spitz, recently learned, we are already continually being tracked whether we volunteer to be or not. Cellphone companies do not typically divulge how much information they collect, so Mr. Spitz went to court to find out exactly what his cellphone company, Deutsche Telekom, knew about his whereabouts.

The results were astounding. In a six-month period — from Aug. 31, 2009, to Feb. 28, 2010 — Deutsche Telekom recorded and saved his longitude and latitude coordinates more than 35,000 times. It traced him from a train on the way to Erlangen at the start through to that last night, when he was home in Berlin.

Mr. Spitz has provided a rare glimpse — an unprecedented one, privacy experts say — of what is being collected as we walk around with our phones. Unlike many online services and Web sites that must send “cookies” to a user’s computer to try to link its traffic to a specific person, cellphone companies simply have to sit back and hit “record.”

“We are all walking around with little tags, and our tag has a phone number associated with it, who we called and what we do with the phone,” said Sarah E. Williams, an expert on graphic information at Columbia University’s architecture school. “We don’t even know we are giving up that data.”

Tracking a customer’s whereabouts is part and parcel of what phone companies do for a living. Every seven seconds or so, the phone company of someone with a working cellphone is determining the nearest tower, so as to most efficiently route calls. And for billing reasons, it tracks where the call is coming from and how long it has lasted.

“At any given instant, a cell company has to know where you are; it is constantly registering with the tower with the strongest signal,” said Matthew Blaze, a professor of computer and information science at the University of Pennsylvania who has testified before Congress on the issue.

Mr. Spitz’s information, Mr. Blaze pointed out, was not based on those frequent updates, but on how often Mr. Spitz checked his e-mail.

Mr. Spitz, a privacy advocate, decided to be extremely open with his personal information. Late last month, he released all the location information in a publicly accessible Google Document, and worked with a prominent German newspaper, Die Zeit, to map those coordinates over time.

“This is really the most compelling visualization in a public forum I have ever seen,” said Mr. Blaze, adding that it “shows how strong a picture even a fairly low-resolution location can give.”
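
For a sense of what can be recovered from such a dump, here is a minimal sketch that totals the distance between successive fixes using the haversine formula. It assumes a CSV of timestamp, latitude and longitude rows; the filename, and the exact format of the released file, are assumptions:

    import csv
    import math

    def haversine_km(lat1, lon1, lat2, lon2):
        # Great-circle distance between two coordinates, in kilometers.
        r = 6371.0  # mean Earth radius, km
        dp = math.radians(lat2 - lat1)
        dl = math.radians(lon2 - lon1)
        a = (math.sin(dp / 2) ** 2
             + math.cos(math.radians(lat1)) * math.cos(math.radians(lat2))
             * math.sin(dl / 2) ** 2)
        return 2 * r * math.asin(math.sqrt(a))

    points = []
    with open("spitz_locations.csv") as f:  # hypothetical filename
        for row in csv.DictReader(f):
            points.append((float(row["latitude"]), float(row["longitude"])))

    total = sum(haversine_km(a[0], a[1], b[0], b[1])
                for a, b in zip(points, points[1:]))
    print(len(points), "fixes, roughly", round(total), "km between them")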

In an interview from Berlin, Mr. Spitz explained his reasons: “It was an important point to show this is not some kind of a game. I thought about it, if it is a good idea to publish all the data — I also could say, O.K., I will only publish it for five, 10 days maybe. But then I said no, I really want to publish the whole six months.”

In the United States, telecommunication companies do not have to report precisely what material they collect, said Kevin Bankston, a lawyer at the Electronic Frontier Foundation, who specializes in privacy. He added that based on court cases he could say that “they store more of it and it is becoming more precise.”

“Phones have become a necessary part of modern life,” he said, objecting to the idea that “you have to hand over your personal privacy to be part of the 21st century.”

In the United States, cellphone companies are encouraged to keep track of their customers for law enforcement and safety reasons. Both the F.B.I. and the Drug Enforcement Administration have used cellphone records to identify suspects and make arrests.

If the information is valuable to law enforcement, it could be lucrative for marketers. The major American cellphone providers declined to explain what exactly they collect and what they use it for.

Verizon, for example, declined to elaborate beyond pointing to its privacy policy, which says in part that “information such as call records, service usage, traffic data” may be used for “marketing to you based on your use of the products and services you already have, subject to any restrictions required by law.”

AT&T works with Sense Networks, a company that uses anonymous location information “to better understand aggregate human activity.” One product, CitySense, makes recommendations about local nightlife to customers who choose to participate based on their cellphone usage. (Many smartphone apps already on the market are based on location, but that’s with the consent of the user and through GPS, not the cellphone company’s records.)

Because of Germany’s history, courts place a greater emphasis on personal privacy. Mr. Spitz first went to court to get his entire file in 2009 but Deutsche Telekom objected.

For six months, he said, there was a “Ping Pong game” of lawyers’ letters back and forth until, separately, the Constitutional Court there decided that the existing rules governing data retention, beyond those required for billing and logistics, were illegal. Soon thereafter, the two sides reached a settlement: “I only get the information that is related to me, and I don’t get all the information like who am I calling, who sent me a SMS and so on,” Mr. Spitz said, referring to text messages.

Even so, 35,831 pieces of information were sent to him by Deutsche Telekom as an encrypted file, to protect his privacy during its transmission.

Deutsche Telekom, which owns T-Mobile, Mr. Spitz’s carrier, wrote in an e-mail that it stored six months of data, as required by law, and that after the court ruling it “immediately ceased” storing data.

And a year after the court ruling outlawing this kind of data retention, there is a movement to try to get a new, more limited law passed. Mr. Spitz, at 26 a member of the Green Party’s executive board, says he released that material to influence that debate.

“I want to show the political message that this kind of data retention is really, really big and you can really look into the life of people for six months and see what they are doing where they are.”

While the potential for abuse is easy to imagine, in Mr. Spitz’s case not much was revealed.

“I really spend most of the time in my own neighborhood, which was quite funny for me,” he said. “I am not really walking that much around.”

Any embarrassing details? “The data shows that I am flying sometimes,” he said, rather than taking a more fuel-efficient train. “Something not that popular for a Green politician.”
https://www.nytimes.com/2011/03/26/b...26privacy.html





NSA Report Renews Data Mining Concerns

If phone land lines are being monitored, what about the Internet?
AP

If the National Security Agency is indeed amassing a colossal database of Americans’ phone records, one way to use all that information is in “social network analysis,” a data-mining method that aims to expose previously invisible connections among people.

Social network analysis has gained prominence in business and intelligence circles under the belief that it can yield extraordinary insights, such as the fact that people in disparate organizations have common acquaintances. Companies can buy social networking software to help determine who has the best connections for a particular sales pitch.
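
In rough terms, the technique treats call records as edges in a graph and then asks structural questions of that graph. A minimal sketch, with made-up numbers standing in for call-detail records:

    from collections import defaultdict

    # Hypothetical call-detail records: (caller, callee) pairs.
    calls = [
        ("555-0101", "555-0199"), ("555-0102", "555-0199"),
        ("555-0101", "555-0150"), ("555-0103", "555-0150"),
    ]

    # Build an undirected "who talked to whom" graph.
    graph = defaultdict(set)
    for a, b in calls:
        graph[a].add(b)
        graph[b].add(a)

    def common_contacts(x, y):
        # Numbers that have spoken with both x and y -- the "previously
        # invisible connection" such analysis is meant to expose.
        return graph[x] & graph[y]

    print(common_contacts("555-0101", "555-0102"))  # {'555-0199'}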

So it did not surprise many security analysts to learn Thursday from USA Today that the NSA is applying the technology to billions of phone records.

“Who you’re talking to often matters much more than what you’re saying,” said Bruce Schneier, a computer security expert and author of “Beyond Fear: Thinking Sensibly About Security in an Uncertain World.”

The NSA declined to comment. But several experts said it seemed likely the agency would want to assemble a picture from more than just landline phone records. Other forms of communication, including cell phone calls, e-mails and instant messages, likely are trackable targets as well, at least on international networks if not inside the U.S.

To be sure, monitoring newer communications services is probably harder than getting billing records from landline phones. USA Today reported that the NSA has collected call logs from the three largest U.S. phone companies, BellSouth Corp., AT&T Inc. and Verizon Communications Inc.

That level of cooperation confirmed the fears of many privacy analysts, who pointed out that AT&T is already being sued in federal court in San Francisco for allegedly giving the NSA access to contents of its phone and Internet networks. The charges are based on documents from a former AT&T technician.

It remains unclear whether other communications providers have been asked for their call logs or billing records.

Verizon Wireless spokesman Jeffrey Nelson definitively said his company was “not involved in this situation.” His counterparts at Cingular — an AT&T/BellSouth joint venture — and Sprint Nextel Corp. were less explicit and did not deny any participation.

In a statement e-mailed to The Associated Press Thursday, T-Mobile USA Inc. said it does not participate “in any NSA program for warrant-less surveillance and acquisition of call records, and T-Mobile has not provided any such access to communications or customer records.”

Even without cell phone carriers’ help, of course, calls between wireless subscribers and Verizon, AT&T and BellSouth landlines presumably would be captured.

Internet poses challenge

Among Internet service providers, representatives for AOL LLC said the company complies with individual government subpoenas and court orders but does not have a blanket program for broader sharing of customer data. Microsoft Corp. had “never engaged in the type of activity referenced in these articles,” according to a statement from Scott Charney, its vice president for trustworthy computing. Google Inc. spokesman Steve Langdon said his company does not participate, either. Yahoo Inc. officials said they comply with subpoenas but refused to elaborate, saying they cannot comment on specific government interactions.

Even without full inside help, the NSA has proven itself adept at capturing communications or at least analyzing traffic information. The Echelon program, for example, is known to have tapped into satellite, microwave and fiber-optic phone links and even undersea cables in order to gain insights into what the rest of the world was talking about.

The Internet does present new challenges for snoops, which has led federal authorities to seek an expansion of a key surveillance law so that it applies to new kinds of Web services.

But even now authorities can tap into data feeds. There is a relatively small number of major Internet backbones and data junctions where networks hand information off to each other.

And while e-mail, Internet calls and other data packets splinter and take varying routes across networks, each packet has a header identifying its source and destination. It’s not obvious what the packet is part of — whether an e-mail, a Web page or an Internet phone call — but it still contains the equivalent of a phone billing record: who’s talking to whom.
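
To make the billing-record comparison concrete, here is a short sketch that pulls exactly those fields — source and destination — out of a raw IPv4 header; the sample bytes are illustrative rather than a captured packet:

    import socket
    import struct

    def addresses(ip_header: bytes):
        # In a standard 20-byte IPv4 header, the source address sits at
        # bytes 12-15 and the destination at bytes 16-19.
        src, dst = struct.unpack("!12x4s4s", ip_header[:20])
        return socket.inet_ntoa(src), socket.inet_ntoa(dst)

    sample = bytes.fromhex("4500003c1c4640004006b1e6c0a80068d83ad6a4")
    print(addresses(sample))  # ('192.168.0.104', '216.58.214.164')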

“It’s not trivial to analyze all the material, but it’s trivial to get to the material,” said Barry Steinhardt, director of the technology and liberty program at the American Civil Liberties Union.

Even Skype, the popular Internet phone service that encrypts its calls — which presumably prevents sweeping monitoring of their content — is believed to be vulnerable to who’s-calling-whom traffic analysis.

Still, while the government clearly can parlay industry cooperation and technical firepower to grab lots of communications, there’s bound to be a limit.

For example, tiny, free voice-over-Internet services likely don’t bother to maintain the kinds of call logs that Verizon, BellSouth and AT&T apparently handed over, said Jeff Pulver, an authority on the technology.

Also, social network analysis would appear to be powerless against criminals and terrorists who rely on a multitude of cell phones, payphones, calling cards and Internet cafes.

And then there are more creative ways of getting off the grid. The Madrid train bombings case has revealed that the plotters communicated by sharing one e-mail account and saving messages to each other as drafts that didn’t traverse the Internet like regular mail messages would.

Privacy activists worry that the government is likely to try to overcome these surveillance gaps by making more use of the information it does have — by cross-referencing phone or other records with commercially harvested data.

One effort in that direction, the Pentagon’s infamous Total Information Awareness program, was technically shuttered by Congress, but the government still can access copious data from the private sector.

Even if the NSA’s surveillance went no further than access to phone billing records, it clearly would raise hackles.

The time and destination of dialed phone calls have long been available to authorities through “pen registers” and “trap and trace” devices — but only with a court order. USA Today noted that concerns about the legality of the NSA’s phone-call database led Qwest Communications International Inc. to refuse to participate.

“A court order couldn’t be obtained to just wholesale surveil,” said Kurt Opsahl, staff attorney for the Electronic Frontier Foundation, which is suing AT&T in San Francisco. “The legal standard requires something more specific. You can’t get everybody’s data unless you have some suspicion.”
http://www.msnbc.msn.com/id/12745304...ence-security/





Court Seals Unclassified Docs in Drake “Leak” Case
Steven Aftergood

Prosecutors in the case of the former National Security Agency official Thomas A. Drake, who is suspected of leaking classified information to a reporter, last week asked the court to block public access to two letters that were introduced as exhibits by the defense earlier this month. Late Friday, the court agreed to seal the two exhibits. But they remain publicly accessible anyway.

The exhibits describe the classification status of several NSA records that were found in the home of Mr. Drake, explaining why in each case the prosecution considers the records classified. The defense disputes their classification and denies that Mr. Drake ever retained any classified records at his home.

Mr. Drake’s defense said that it intends to introduce testimony at trial “which will include a discussion of the appropriate assignment of classification controls under the Executive Order and the consequences and pervasiveness of inappropriately assigning classification controls.”

To document the classification judgments that it disputes, the defense also filed the two letters from the Justice Department as exhibits on March 11.

On March 16, prosecutors asked the court to seal those two records. “As grounds [for sealing the records], the information contained within the exhibits derives from NSA. As the holder of the privilege for this information, NSA has classified the documents as ‘FOUO’, which means ‘For Official Use Only.’ This means that the information is not for public dissemination. Until such time as NSA downgrades the information to ‘Unclassified,’ the exhibits should not be publicly filed,” prosecutors wrote.

Ironically, this prosecution argument illustrates the confusion about classification policy that prevails at NSA, in the Justice Department and in much of the government.

The NSA could not “classify” the records as FOUO and cannot “downgrade” them to “unclassified” because they are already unclassified. “Information cannot be classified and FOUO at the same time,” according to the governing DoD regulation 5200.1-R. “By definition, information must be unclassified in order to be designated FOUO.”

Without waiting for a response from the defense or from other interested parties, Judge Richard D. Bennett of the Maryland District Court granted the prosecution motion and sealed the records. His March 18 decision on the matter, which was first reported by Politico, was also sealed.

The newly-sealed records remain available, however, on the Federation of American Scientists web site here. Besides being unclassified, these records do not prejudice either the prosecution or the defense, to whom they were originally written.
http://www.fas.org/blog/secrecy/2011...cs_sealed.html





China Tightens Censorship of Electronic Communications
Sharon LaFraniere and David Barboza

If anyone wonders whether the Chinese government has tightened its grip on electronic communications since protests began engulfing the Arab world, Shakespeare may prove instructive.

A Beijing entrepreneur, discussing restaurant choices with his fiancée over their cellphones last week, quoted Queen Gertrude’s response to Hamlet: “The lady doth protest too much, methinks.” The second time he said the word “protest,” her phone cut off.

He spoke English, but another caller, repeating the same phrase on Monday in Chinese over a different phone, was also cut off in midsentence.

A host of evidence over the past several weeks shows that Chinese authorities are more determined than ever to police cellphone calls, electronic messages, e-mail and access to the Internet in order to smother any hint of antigovernment sentiment. In the cat-and-mouse game that characterizes electronic communications here, analysts suggest that the cat is getting bigger, especially since revolts began to ricochet through the Middle East and North Africa, and homegrown efforts to organize protests in China began to circulate on the Internet about a month ago.

“The hard-liners have won the field, and now we are seeing exactly how they want to run the place,” said Russell Leigh Moses, a Beijing analyst of China’s leadership. “I think the gloves are coming off.”

On Sunday, Google accused the Chinese government of disrupting its Gmail service in the country and making it appear as if technical problems at Google — not government intervention — were to blame.

Several popular virtual private-network services, or V.P.N.’s, designed to evade the government’s computerized censors, have been crippled. This has prompted an outcry from users as young as ninth graders with school research projects and sent them on a frustrating search for replacements that can pierce the so-called Great Firewall, a menu of direct censorship and “opinion guidance” that restricts what Internet users can read or write online. V.P.N.’s are popular with China’s huge expatriate community and Chinese entrepreneurs, researchers and scholars who expect to use the Internet freely.

In an apology to customers in China for interrupted service, WiTopia, a V.P.N. provider, cited “increased blocking attempts.” No perpetrator was identified.

Beyond these problems, anecdotal evidence suggests that the government’s computers, which intercept incoming data and compare it with an ever-changing list of banned keywords or Web sites, are shutting out more information. The motive is often obvious: For six months or more, the censors have prevented Google searches of the English word “freedom.”
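
The mechanics behind that kind of blocking are not mysterious. A crude sketch of the matching step, with an illustrative two-word banned list standing in for the real, much larger and secret one:

    BANNED = {"protest", "freedom"}  # illustrative; real lists are far larger

    def should_block(payload: str) -> bool:
        # Crude substring matching of the kind keyword censors perform.
        text = payload.lower()
        return any(term in text for term in BANNED)

    print(should_block("The lady doth protest too much, methinks."))  # True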

But other terms or Web sites are suddenly or sporadically blocked for reasons no ordinary user can fathom. One Beijing technology consultant, who asked not to be identified for fear of retribution against his company, said that for several days last week he could not visit the Web site for the Hong Kong Stock Exchange without a proxy. LinkedIn, a networking platform, was blocked for a day during the height of government concerns over Internet-based calls for protests in Chinese cities a few weeks ago, he said.

Hu Yong, a media professor at Peking University, said government censors were constantly spotting and reacting to new perceived threats. “The technology is improving and the range of sensitive terms is expanding because the depth and breadth of things they must manage just keeps on growing,” Mr. Hu said.

China’s censorship machine has been operating ever more efficiently since mid-2008, and restrictions once viewed as temporary — like bans on Facebook, YouTube and Twitter — are now considered permanent. Government-friendly alternatives have sprung up and developed a following.

Few analysts believe that the government will loosen controls any time soon, with events it considers politically sensitive swamping the calendar, including a turnover in the Communist Party’s top leadership next year.

“It has been double the guard, and double the guard, and you never hear proclamations about things being relaxed,” said Duncan Clark, chairman of BDA China, an investment and strategy consultancy based in Beijing, and a 17-year resident of China. “We have never seen this level of control in the time I have been here, and I have been here since the beginning of the Internet.”

How far China will clamp down on electronic communications is unclear. “There’s a lot more they can do, but they’ve been holding back,” said Bill Bishop, an Internet expert based in Beijing. Some analysts suggest that officials are exploring just how much inconvenience the Chinese are willing to tolerate. While sentiment is hard to gauge, a certain segment of society rejects censorship.

For many users, an inoperable V.P.N. is an inconvenience, not a crisis. But Internet consultants said interfering with an e-mail service on which people depend every day is more serious. “How people respond is going to be more intense, more visceral,” one consultant said.

Google began receiving complaints from Gmail users and its own employees in China about a month ago, around the time anonymous Internet posts urged people unhappy with the government to gather every Sunday. Some Gmail users found their service disconnected when they tried to send or save messages.

Engineers determined that there were no technical difficulties on Google’s end, Google said; rather, the hand of the Chinese government was at work. China’s Foreign Ministry did not respond Monday to calls or faxed questions about Google’s statement.

Disrupting Web sites and Internet connections is a standard tactic in dealing with companies that fall out of government favor. Mark Seiden, an Internet consultant, said Chinese officials typically left the companies and users to guess the reason.

In the Google case, an article on the Web site of People’s Daily, the Communist Party’s official publication, offered a strong hint. The March 4 article, attributed to a netizen, called Google a tool of the United States government. Like Facebook and Twitter, the article said, Google has “played a role in manufacturing social disorder” and sought to involve itself in other nations’ politics.

China has treated Google as a threat for some time. Last year, Google closed its search service and redirected Chinese users to Google’s Hong Kong site after the company said China was behind a cyberattack aimed partly at Gmail accounts.

Mr. Moses, the Beijing analyst, said the latest moves further expand government control of electronic communications. “The model for this government is that every day is a new challenge and a new opportunity to show the strength of the state here,” he said. “There is clear confidence in the capability of the political authorities to maintain order.”

Jonathan Ansfield contributed reporting from Beijing, and Claire Cain Miller from San Francisco. Jonathan Kaiman and Li Bibo contributed research from Beijing.
https://www.nytimes.com/2011/03/22/w...a/22china.html





Facebook Traffic Mysteriously Passes Through Chinese ISP

Routing cockup most likely explanation
Dan Goodin

For a short time on Tuesday, internet traffic sent between Facebook and subscribers to AT&T's internet service passed through hardware belonging to the state-owned China Telecom before reaching its final destination, a security researcher said.

An innocent routing error is the most likely explanation for the highly circuitous route, but it's troubling nonetheless, said Barrett Lyon, the independent researcher who helped discover the anomaly and later blogged about it. Human rights groups have long accused China's government of snooping on the internet communications of dissidents, and last year Google claimed it and dozens of other companies were on the receiving end of a sophisticated hacking campaign carried out by the Chinese.

During a window that lasted 30 minutes to an hour Tuesday morning, all unencrypted traffic passing between AT&T customers and Facebook might have been open to similar monitoring. Lyon said he has no evidence any data was in fact snarfed, but he said the potential for that is certainly there because the hardware belonged to China Telecom, which in turn is owned by the Chinese government.

“This kind of thing happens all the time, sometimes on accident and sometimes on purpose,” he told The Reg. “I think people should talk about it at the very least.”

It's not the first time traffic has been diverted through Chinese networks under mysterious circumstances. In March and April of last year, traffic to as much as 15 percent of the world's internet destinations was briefly diverted through China. Networks used by Dell, Apple, CNN, and Starbucks were all affected. At least one of those incidents was the result of erroneous BGP, or Border Gateway Protocol, routes that were quickly corrected.

Unlike those incidents, Tuesday's diversion appeared to affect only traffic traveling between AT&T users and Facebook. Lyon discovered the anomaly by telnetting into AT&T's IP Services Route Monitor (telnet://route-server.ip.att.net) and typing various commands, such as “show ip bgp 69.171.224.20/20.”

Traceroute commands executed during the brief window Tuesday morning on machines connected to AT&T's network also verified that Facebook-bound traffic was traveling over AS4134, the Autonomous System belonging to China Telecom, Lyon said.
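
Anyone can run a rougher version of the same origin check. One way, sketched below, queries Team Cymru's public IP-to-ASN whois service over port 43; the address is the one from the route-server query above, and a reply naming AS4134 rather than Facebook's own network would be the anomaly in question:

    import socket

    def origin_asn(ip: str) -> str:
        # Team Cymru's whois service maps an IP to the AS announcing it.
        with socket.create_connection(("whois.cymru.com", 43), timeout=10) as s:
            s.sendall(f" -v {ip}\r\n".encode())  # "-v" asks for verbose output
            reply = b""
            while chunk := s.recv(4096):
                reply += chunk
        # Last line looks like: "AS | IP | BGP Prefix | CC | ... | AS Name"
        return reply.decode().splitlines()[-1]

    print(origin_asn("69.171.224.20"))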

Facebook issued a statement that read:

Quote:
We are investigating a situation today that resulted in a small amount of a single carrier's traffic to Facebook being misdirected. We are working with the carrier to determine the cause of this error.

Our initial checks of the latency of the requests indicate that no traffic passed through China.
The statement left open the possibility that Facebook traffic passed through China Telecom hardware located in Europe or elsewhere.

The incident comes two months after Facebook started offering its users the option of using always-on SSL to encrypt their sessions from beginning to end. Previously, only logins and other select transactions were protected, leaving online chats, photo uploads and other activities wide open to anyone who had the ability to monitor the networks between the user and Facebook.

Facebook has said it hopes to turn on SSL by default in the future, but don't count on that happening anytime soon. In the meantime, users must activate it manually, by going to Account Settings > Account Security and checking the box that says “Browse Facebook on a secure connection (https) whenever possible.”
http://www.theregister.co.uk/2011/03...china_telecom/





Spotify Splattered with Malware-Tainted Ads
John Leyden

Updated: Users of the ad-supported version of Spotify were hit by a malware-based attack on Thursday.

The assault takes advantage of a Java-based exploit to deposit Trojan horse malware or exploit kits on vulnerable Windows machines. Only users of the free version of the music streaming service seem to be affected.

In response, Spotify pulled its ad feed on Friday while it investigated the problem. In a statement, the company said:

Quote:
We're currently investigating and have pulled all third party display ads that could have caused the problem until we locate the specific advert.

El Reg became aware of the problem on Thursday, following a tip-off from a reader in UK academia. JANET (the Joint Academic Network) is reportedly looking into incidents of virus warnings linked to Spotify. "We're not investigating any specific infections at this moment, but our community is asking for more info," it said.

The malware was served up via malicious third-party adverts, a factor that means the threat is not persistent and may be region specific. This makes it harder for anti-virus firms to pin down the outbreak.

But the problem was far from isolated, with several Twitter users reporting the same issue over the last day or so. Both Avast and AVG are detecting the exploit.

Netcraft has a precis of the attack, which is still under investigation, here.
http://www.theregister.co.uk/2011/03...sement_attack/





Microsoft Shuts off HTTPS in Hotmail for Over a Dozen Countries
Eva Galperin

Microsoft appears to have turned off the always-use-HTTPS option in Hotmail for users in more than a dozen countries, including Bahrain, Morocco, Algeria, Syria, Sudan, Iran, Lebanon, Jordan, Congo, Myanmar, Nigeria, Kazakhstan, Uzbekistan, Turkmenistan, Tajikistan, and Kyrgyzstan. Hotmail users who have set their location to any of these countries receive the following error message when they attempt to turn on the always-use-HTTPS feature in order to read their mail securely:

“Your Windows Live ID can't use HTTPS automatically because this feature is not available for your account type.”

Microsoft debuted the always-use-HTTPS feature for Hotmail in December 2010, in order to give users the option of always encrypting their webmail traffic and protecting their sensitive communications from malicious hackers using tools such as Firesheep, and from hostile governments eavesdropping on journalists and activists. For Microsoft to take such an enormous step backwards — undermining the security of Hotmail users in countries where freedom of expression is under attack and secure communication is especially important — is deeply disturbing. We hope that this counterproductive and potentially dangerous move is merely an error that Microsoft will swiftly correct.

The good news is that the fix is very easy. Hotmail users in the affected countries can turn the always-use-HTTPS feature back on by changing the country in their profile to any of the countries in which this feature has not been disabled, such as the United States, Germany, France, Israel, or Turkey. Hotmail users who browse the web with Firefox may force the use of HTTPS by default—while using any Hotmail location setting—by installing the HTTPS Everywhere Firefox plug-in.
https://www.eff.org/deeplinks/2011/0...ozen-countries





Iranian Hackers Obtain Fraudulent HTTPS Certificates: How Close to a Web Security Meltdown Did We Get?
Peter Eckersley

On March 15th, an HTTPS/TLS Certificate Authority (CA) was tricked into issuing fraudulent certificates that posed a dire risk to Internet security. Based on currently available information, the incident got close to — but was not quite — an Internet-wide security meltdown. As this post will explain, these events show why we urgently need to start reinforcing the system that is currently used to authenticate and identify secure websites and email systems.

There is a post up on the Tor Project's blog by Jacob Appelbaum, analyzing the revocation of a number of HTTPS certificates last week. Patches to the major web browsers blacklisted a number of TLS certificates that were issued after hackers broke into a Certificate Authority. Appelbaum and others were able to cross-reference the blacklisted certificates' serial numbers against a comprehensive collection of Certificate Revocation Lists (these CRL URLs were obtained by querying EFF's SSL Observatory databases) to learn which CA had been affected.
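
That cross-referencing step is easy to picture in code. A minimal sketch, assuming a recent version of the pyca/cryptography package; the CRL URL and serial numbers below are placeholders, not the real ones:

    import urllib.request
    from cryptography import x509  # pyca/cryptography package

    CRL_URLS = ["http://crl.example-ca.net/hardware.crl"]  # placeholder URL
    BLACKLISTED = {0x047ECBE9, 0x00F5C86A}  # placeholder serial numbers

    for url in CRL_URLS:
        der = urllib.request.urlopen(url).read()
        crl = x509.load_der_x509_crl(der)  # CRLs are typically DER-encoded
        hits = BLACKLISTED & {entry.serial_number for entry in crl}
        if hits:
            # The CRL that lists the blacklisted serials identifies the CA.
            print(url, "issuer:", crl.issuer.rfc4514_string(),
                  "lists:", [hex(s) for s in hits])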

The answer was the UserTrust "UTN-USERFirst-Hardware" certificate owned by Comodo, one of the largest CAs on the web. Comodo has now published a statement about the improperly issued certs, which were for extremely high-value domains including google.com, login.yahoo.com and addons.mozilla.org (this last domain could be used to trojan any system that was installing a new Firefox extension, though updates to previously installed extensions have a second layer of protection from XPI signatures). One cert was for "global trustee" — not a domain name. That was probably a malicious CA certificate that could be used to flawlessly impersonate any domain on the Web.

Comodo also said that the attack came primarily from Iranian IP addresses, and that one of the fraudulent login.yahoo.com certs was briefly deployed on a webserver in Iran.

What should we do about these attacks?

Discussing problems with the revocation mechanisms that should (but don't) protect users who don't instantly get browser updates, Appelbaum makes the following assertion:

Quote:
If the CA cannot provide even a basic level of revocation, it's clearly irresponsible to ship that CA root in a browser. Browsers should give insecure CA keys an Internet Death Sentence rather than expose the users of the browsers to known problems.
Before discussing whether or not such a dramatic conclusion is at all warranted, it is worth considering what the consequences of blacklisting Comodo's UserTrust CA certificate would have been. We used the SSL Observatory datasets to determine what had been signed by that CA certificate. The answer was that, as of August 2010, 85,440 public HTTPS certificates were signed directly by UTN-USERFirst-Hardware. Indirectly, the certificate had delegated authority to a further 50 Certificate Authorities, collectively responsible for another 120,000 domains. In the event of a revocation, at least 85,000 websites would have to scramble to obtain new SSL certificates.

The situation of the 120,000 other domains is more complicated -- some of these are cross-certified by other root CAs or might be able to obtain such cross-certifications. In most -- but not all -- cases, these domains could continue to function without updating their webserver configurations or obtaining new certs.

The short answer, however, is that Comodo's UTN-USERFirst-Hardware certificate is too big to fail. If the private key for such a CA were hacked, by the Iranians or by anybody else, browsers would face a horrible choice: either blacklist the CA quickly, causing outages at tens or hundreds of thousands of secure websites and email servers; or leave all of the world's HTTPS, POP and IMAP deployments vulnerable to the hackers for an extended period of time.

Fortunately, Comodo has said that the master CA private keys in its Hardware Security Modules (HSMs) were not compromised, so we did not experience that kind of Internet-wide catastrophic security failure last week. But it's time for us to start thinking about what can be done to mitigate that risk.
Cross-checking the work of CAs

Most Certificate Authorities do good work. Some make mistakes occasionally, but that is normal in computer security. The real problem is a structural one: there are 1,500 CA certificates controlled by around 650 organizations, and every time you connect to an HTTPS webserver, or exchange email (POP/IMAP/SMTP) encrypted by TLS, you implicitly trust all of those certificate authorities!

What we need is a robust way to cross-check the good work that CAs currently do, to provide defense in depth and ensure (1) that a private key-compromise failure at a major CA does not lead to an Internet-wide cryptography meltdown and (2) that our software does not need to trust all of the CAs, for everything, all of the time.

For the time being, we will make just one remark about this. Many people have been touting DNSSEC PKI as a solution to the problem. While DNSSEC could be an improvement, we do not believe it is the right solution to the TLS security problem. One reason is that the DNS hierarchy is not trustworthy. Countries like the UAE and Tunisia control certificate authorities, and have a history of compromising their citizens' computer security. But these countries also control top-level DNS domains, and could control the DNSSEC entries for those ccTLDs. And the emergence of DNS manipulation by the US government also raises many concerns about whether DNSSEC will be reliable in the future.

We don't think this is an unsolvable problem. There are ways to reinforce our existing cryptographic infrastructure. And building and deploying them may not be that hard. Look for a blog post from us shortly about how we should go about doing that.
https://www.eff.org/deeplinks/2011/0...audulent-https





Google Spends $1 Million on Censorship and Throttling Detection
Nate Anderson

Google has awarded $1 million to Georgia Tech researchers so that they can develop simple tools to detect Internet throttling, government censorship, and other "transparency" problems.

That money will cover two years of work at Georgia Tech, with an additional $500,000 extension possible if Google wants an extra year of development. At the end of the project, the Georgia Tech team hopes to provide "a suite of Web-based, Internet-scale measurement tools that any user around the world could access for free. With the help of these tools, users could determine whether their ISPs are providing the kind of service customers are paying for, and whether the data they send and receive over their network connections is being tampered with by governments and/or ISPs."

Wenke Lee, a computer science professor at the school and one of the grant's principal investigators (along with the grant's author, computer science professor Nick Feamster), says that the work will create a "transparency ecosystem" on the 'Net.

"For example," he said, "say something happens again like what happened in Egypt recently, when the Internet was essentially shut down. If we have a community of Internet user-participants in that country, we will know instantly when a government or ISP starts to block traffic, tamper with search results, even alter Web-based information in order to spread propaganda." (The Tunisian government early this year added bits of code to Facebook login pages in order to capture user credentials, for instance.)

The team cares about more than computers, too; with the surge in mobile data connections, it plans to build tools for smartphone and tablet owners as well.
http://arstechnica.com/tech-policy/n...-detection.ars





Judge Rejects Google’s Deal to Digitize Books
Miguel Helft

Google’s ambition to create the world’s largest digital library and bookstore has run into the reality of a 300-year-old legal concept: copyright.

The company’s plan to digitize every book ever published and make them widely available was derailed on Tuesday when a federal judge in New York rejected a sweeping $125 million legal settlement the company had worked out with groups representing authors and publishers.

The decision throws into legal limbo one of the most ambitious undertakings in Google’s history, and it brings into sharp focus concerns about the company’s growing power over information. While the profit potential of the book project is not clear, the effort is one of the pet projects of Larry Page, the Google co-founder who is set to become its chief executive next month. And the project has wide support inside the company, whose corporate mission is to organize all of the world’s information.

“It was very much consistent with Larry’s idealism that all of the world’s information should be made available freely,” said Ken Auletta, the author of “Googled: The End of the World as We Know It.”

But citing copyright, antitrust and other concerns, Judge Denny Chin said that the settlement went too far. He said it would have granted Google a “de facto monopoly” and the right to profit from books without the permission of copyright owners.

Judge Chin acknowledged that “the creation of a universal digital library would benefit many,” but said that the proposed agreement was “not fair, adequate and reasonable.” He left open the possibility that a substantially revised agreement could pass legal muster. Judge Chin was recently elevated to the United States Court of Appeals for the Second Circuit, but handled the case as a district court judge.

The decision is also a setback for the Authors Guild and the Association of American Publishers, which sued Google in 2005 over its book-scanning project. After two years of painstaking negotiations, the authors, publishers and Google signed a sweeping settlement that would have brought millions of printed works into the digital age.

The deal turned Google, the authors and the publishers into allies instead of opponents. Together, they mounted a defense of the agreement against an increasingly vocal chorus of opponents that included Google rivals like Amazon and Microsoft, as well as academics, some authors, copyright experts, the Justice Department and foreign governments.

Now the author and publisher groups have to decide whether to resume their copyright case against Google, drop it or try to negotiate a new settlement.

Paul Aiken, executive director of the Authors Guild, said in an interview that it was too early to tell what the next step would be. “The judge did expressly leave the door open for a revised settlement,” he said.

Hilary Ware, managing counsel at Google, said in a statement that the decision was “clearly disappointing,” adding: “Like many others, we believe this agreement has the potential to open up access to millions of books that are currently hard to find in the U.S. today.” The company would not comment further.

Google has already scanned some 15 million books. The entire text of books whose copyrights have expired is available through Google’s Book Search service. It shows up to 20 percent of copyrighted titles that it has licensed from publishers, and only snippets of copyrighted titles for which it has no license.

The settlement would have allowed it to go much further, making millions of out-of-print books broadly available online and selling access to them. It would have given authors and publishers new ways to earn money from digital copies of their works.

Yet the deal faced strong opposition. Among the most persistent objections, raised by the Justice Department and others, were concerns that it would have given Google exclusive rights to profit from millions of so-called orphan works, books whose rights holders are unknown or cannot be found. They also said no other company would be able to build a comparable library, leaving Google free to charge high prices for its collection. And some critics said the exclusive access to millions of books would help cement Google’s grip on the Internet search market.

Judge Chin largely agreed with the critics on those points. But he suggested that substantial objections would be eliminated if the settlement applied only to books whose authors or copyright owners would explicitly “opt in” to its terms.

When the Justice Department suggested as much last year during a court hearing, Google rejected the idea as unworkable. It would leave millions of orphan works out of the agreement and out of Google’s digital library, greatly diminishing its value to Google and to the public.

“Opt-in doesn’t look all that different from ordinary licensing deals that publishers do all the time,” said James Grimmelmann, a professor at New York Law School who has studied the legal aspects of the agreement. “That’s why this has been such a big deal — the settlement could have meant orphan books being made available again. This is basically going back to status quo, and orphan books won’t be available.”

Some longtime opponents of the settlement hailed the decision, saying that they hoped it would prompt Congress to tackle legislation that would make orphan works accessible.

“Even though it is efficient for Google to make all the books available, the orphan works and unclaimed books problem should be addressed by Congress, not by the private settlement of a lawsuit,” said Pamela Samuelson, a copyright expert at the University of California, Berkeley who helped organize efforts to block the agreement.

Gina Talamona, a Justice Department spokeswoman, said in a statement that the court had reached the “right result.”

A group of publishers said they were disappointed by the decision, but believed that it provided “clear guidance” on the changes necessary for the settlement to be approved.

John Sargent, the chief executive of Macmillan, spoke on behalf of the publishers, which included Penguin Group USA, McGraw-Hill, Pearson Education, Simon & Schuster and John Wiley & Sons.

“The publisher plaintiffs are prepared to enter into a narrower settlement along those lines to take advantage of its groundbreaking opportunities,” Mr. Sargent said in a statement. “We hope the other parties will do so as well.”

He added: “The publisher plaintiffs are prepared to modify the settlement agreement to gain approval. We plan to work together with Google, the Authors Guild and others to overcome the objections raised by the court and promote the fundamental principle behind our lawsuit, that copyrighted content cannot be used without the permission of the owner, or outside the law.”

Julie Bosman and Claire Cain Miller contributed reporting.
https://www.nytimes.com/2011/03/23/t.../23google.html





E-Textbooks Get a Lift From Publishers
Verne G. Kopytoff

Over the years, publishers have tried a variety of strategies to sell digital textbooks but with limited success. Most students continue to buy print books despite the inconvenience of having to lug them around in their backpacks.

Two major publishers are trying a new tactic. They have invested in Inkling, a company that makes interactive textbooks available on the iPad. They have also agreed to make dozens of their titles available on Inkling’s service.

The investment by Pearson and McGraw-Hill, announced on Wednesday, is a major step for Inkling, a company founded in 2009. Inkling sells interactive textbooks that incorporate audio, video and interactive quizzes.

The amount invested by Pearson and McGraw-Hill, among the biggest textbook publishers, was not disclosed. Inkling’s total investment to date, including money invested previously by several venture capital firms, is just under $10 million, according to a source who requested anonymity because of the confidential nature of the deals.

Inkling currently has 14 textbooks available from publishers like John Wiley & Sons and W.W. Norton. Pearson and McGraw-Hill have committed to add to that number. Pearson plans to make two dozen of its M.B.A. textbooks available along with a number of undergraduate arts and sciences books, marking the first time the company will sell books through Inkling. McGraw-Hill, which has a handful of books available on the service, will add its Top 100 college titles plus some medical and reference books.

In all, Inkling expects to have nearly 100 textbooks available by the fall.

“This is not some pilot program on the part of the publishers, but a real commitment to build their business forward,” said Matt MacInnis, Inkling’s chief executive.

Inkling’s business is based on collecting commissions from publishers for each sale. Students can buy an entire textbook, at up to a 35 percent discount over print, or a single chapter. Inkling’s books are customized for the iPad, and are not just an online copy of what is available in print. A book about music on Inkling, for instance, includes audio of a Mozart concerto along with explanations that scroll on the screen while the music is playing.

In the past, publishers have sold digital textbooks that were essentially copies of what was available in print. Success has been mixed. Publishers are now hoping that more interactive versions will generate more sales.

“One of the things that’s interesting about Inkling is that they can take the content to a slightly higher level,” said Gary June, chief marketing officer for Pearson in North America.

Vineet Madan, vice president of learning for McGraw-Hill, said that he was pleased with the experiments his company had run so far with Inkling. Books on Inkling take advantage of the iPad’s interactivity, which could potentially encourage more sales than books that simply recreate the design of print.

However, both publishing executives stressed that Inkling was just one of several outlets they were using to sell digital textbooks. They remain participants in CourseSmart, a publishing industry cooperative, and continue with their own proprietary formats along with textbooks for other tablet devices besides the iPad.

“We have a lot of irons in the fire,” Mr. June said.
http://bits.blogs.nytimes.com/2011/0...om-publishers/





Sharp Scrutiny for Merger of AT&T and T-Mobile
Edward Wyatt

Mega-mergers may be celebrated on Wall Street on the theory that bigger is better. But the proposed merger of AT&T and T-Mobile is likely to face intense scrutiny by regulators, lawmakers and consumer advocates.

The review of the merger, one of the largest deals since the 2008 financial crisis, will also be a test for the White House. During his campaign, President Obama criticized the Bush administration’s record on antitrust review, and promised to increase scrutiny of merger proposals.

Some analysts say it is too early to see if the merger will pass muster. Part of the difficulty stems from the fact that the two primary agencies that will oversee any merger — the Federal Communications Commission and the Justice Department’s antitrust division — look at it with different goals in mind.

The Justice Department will chiefly examine whether competition among wireless mobile phone providers would remain sufficient after a merger. The department gave some hint of its thinking when it told the F.C.C. last year that the agency needed to use its “policy levers” to encourage more competition among wireless companies, particularly in wireless broadband access.

The F.C.C., on the other hand, has a goal of protecting the public interest in allocating use of the public airwaves, which it does in part by promoting competition. For example, one objective involves pushing the big wireless companies to allow smaller competitors to use their networks for data roaming services. Such policy initiatives give the F.C.C. more flexibility to consider conditions that it could apply to a merger to make it more palatable.

“Normally, competition and the public interest go hand in hand,” said Bert Foer, president of the American Antitrust Institute, a nonprofit agency that generally argues for more competition. But federal courts have recently sided with regulatory agencies instead of antitrust enforcers when conflicts occur, something that perhaps will give the F.C.C. the advantage in setting the conditions under which to approve a merger.

Congress, too, is likely to play a part in scrutinizing the proposed merger. Several Congressional committees have already announced plans to review the deal.

Senator John D. Rockefeller IV, a West Virginia Democrat who is chairman of the Senate Commerce Committee, said it was “absolutely essential” for both the Justice Department and the F.C.C. to “leave no stone unturned in determining what the impact of this combination is on the American people.”

Lawmakers and the White House recently faced stiff opposition from advocacy groups over the merger of Comcast and NBC Universal. That deal was approved by the Justice Department and the F.C.C. with several conditions that subject the expanded company to additional oversight, but many consumer groups and liberal advocacy organizations were unsatisfied with those terms.

“I’m not a huge fan of weighing down a merger with a dozen or more conditions,” said Gigi B. Sohn, president of Public Knowledge, a consumer advocacy group that sharply criticized the proposed merger of AT&T and T-Mobile. “As a regulator, you have to ask yourself how a merger could be in the public interest if you have to do all those things to it to get it done.”

AT&T is already pre-empting concerns from lawmakers. Its publicity materials announcing the merger (including a dedicated Web site, mobilizeeverything.com) pulled language directly from the president’s State of the Union address, with references to providing wireless high-speed Internet to nearly all Americans in the next five years. And to help Senator Rockefeller with his deliberations, AT&T prominently displayed in its online materials a map of how the merger would expand service in West Virginia.

AT&T contends that the wireless phone market is highly competitive, with 18 of the 20 largest United States markets each having five or more wireless competitors. That may help AT&T, as the Justice Department has traditionally looked at local competition when considering whether a proposed merger will substantially reduce consumer options, said Michael L. Weiner, a partner and co-chairman of the antitrust practice at Dechert, an international law firm.

But national competition also plays an important role in the wireless market. Many mobile phone customers base their buying decisions on whether they will have to pay expensive roaming charges when they travel out of their home area. And after the merger, AT&T and Verizon will between them control nearly 80 percent of the wireless market, with the third-largest competitor, Sprint, lagging far behind.

When AT&T Wireless merged with Cingular in 2004, it emphasized the national benefits of the joined companies, Mr. Weiner said. Now, it is making the opposite argument.

“We’re very confident that we can achieve a successful regulatory review,” Wayne Watts, a senior executive vice president and general counsel at AT&T, said Monday. “The facts show significant, unique public interest benefits from this transaction.”

AT&T also said that it expected to discuss some divestitures with regulators to get approval for the deal. Most likely, those would take the form of the company giving up some wireless airwaves in certain cities where the merger would leave too few competitors.

The F.C.C. could use its leverage over the parties in discussing those possible conditions, under the theory that a company trying to get approval for a merger is unlikely to strongly oppose its chief regulator’s policy goals.

For example, the agency has been trying to get the large wireless companies to allow customers of other providers to use their networks for data roaming, much as they allow for roaming for voice services. The big providers have been reluctant to do so, because data services use much more bandwidth and, as AT&T customers with iPhones know, that can greatly slow a network.

The F.C.C. also has voiced its desire to expand its policy of net neutrality, which requires that Internet service providers not slow or block specific content, to apply to wireless broadband services as well as to wired services.

Though the F.C.C.’s position on net neutrality for wired Internet service was opposed by many telecommunications companies, AT&T supported it. This month, amid its talks with T-Mobile, AT&T’s chief lobbyist, James W. Cicconi, testified to the company’s support of the F.C.C. policy before a hostile House subcommittee.
https://www.nytimes.com/2011/03/22/t...2regulate.html





In AT&T & T-Mobile Merger, Everybody Loses
Om Malik

The lull of my lazy, rainy weekend was broken by the news that AT&T plans to acquire T-Mobile USA for a whopping $39 billion in cash and stock. Who wins and who loses in this deal? It’s hard to find winners, apart from AT&T and T-Mobile shareholders. Here is a list of who loses, in my opinion, in this deal:

Consumers. The biggest losers of this deal are going to be the consumers. While AT&T and T-Mobile are going to try to spin it as a good deal to combine wireless spectrum assets, the fact is, T-Mobile USA is now out of the market.

T-Mobile USA has been fairly aggressive in offering cheaper voice and data plans as it has tried to compete with its larger brethren. The competition has kept the prices in the market low enough. This has worked well for U.S. consumers. With the merger of AT&T and T-Mobile, the market is now reduced to three national players: AT&T, Verizon and Sprint. Net-net, U.S. consumers are going to lose.

Phone Handset Makers. Before the merger was announced, handset makers such as HTC and Motorola had two major carriers that could buy their GSM-based phones. They just lost any ability to control price and profits on handsets, because now there is a single buyer that can dictate what GSM phones come to market. Even with LTE becoming the standard for the 4G world, it would essentially be a market dominated by three buyers (should Sprint go with LTE), which would place handset makers at the mercy of the giants.

Sprint. The nation’s third-largest carrier was in talks to buy T-Mobile, according to Bloomberg, but AT&T’s offer has now pushed Sprint to the bottom of the pile in terms of size, and potentially spectrum assets, if the deal goes through. If it doesn’t go through, Sprint now has a price it must match in order to get its hands on T-Mobile. Plus, Sprint and T-Mobile often stood together against AT&T and Verizon on a variety of regulatory issues, so if AT&T succeeds, Sprint will stand alone on special access and other issues.

Network Equipment Suppliers. Carrier consolidation has proved to be a living hell for companies that make network infrastructure equipment. Alcatel-Lucent, Ericsson and Nokia Siemens all supply gear to both AT&T and T-Mobile USA. With the two combined into a single customer, they will lose the ability to control their own fate and are going to see their profits suffer as a result.

Google. I think the biggest loser in this could be Google. In T-Mobile, it had a great partner for its Android OS-based devices. Now the company will be beholden to two massive phone companies — Verizon and AT&T — which are going to try to hijack Android to serve their own ends.

Don’t be surprised if you see AT&T impose its own will on what apps and services are put on its Android smartphones. I wouldn’t be surprised to see the worst phone company in the U.S. (according to Consumer Reports) try to create its own app store and force everyone to buy apps through it.

It doesn’t matter how you look at it; this is just bad for wireless innovation, which means bad news for consumers. T-Mobile has been pretty experimental and innovative: it has embraced newer technologies such as UMA, built its own handsets and generally been a more consumer-centric company. AT&T, on the other hand, has the innovation of a lead pencil and a mentality more suited to a monopoly: a position it wants to regain.
http://gigaom.com/2011/03/20/in-att-...erybody-loses/





Verizon Wireless CEO Says No Interest in Sprint Deal
Sinead Carew

The chief executive of Verizon Wireless said he has no interest in buying Sprint Nextel Corp (S.N) even as the company stands to lose its top position in the U.S. wireless market because of a merger between AT&T Inc (T.N) and T-Mobile USA.

Verizon Wireless CEO Daniel Mead also said he would not oppose AT&T's plans to buy Deutsche Telekom's (DTEGn.DE) T-Mobile USA for $39 billion.

The CEO said the company did not want to be distracted from its goal of being the most profitable U.S. wireless operator. Verizon Wireless is a joint venture of Verizon Communications (VZ.N) and Vodafone Group (VOD.L).

"We're not interested in Sprint. We don't need them," said Mead, speaking to Reuters ahead of the CTIA Wireless Conference.

AT&T announced plans on Sunday to buy T-Mobile USA in a massive deal to create a new U.S. mobile market leader.

Mead said U.S. regulators would likely approve the AT&T/T-Mobile deal if the companies agreed to certain conditions. AT&T is expected to have to sell some assets in order to get regulators to approve the deal.

"Anything can go through if you make enough concessions," Mead said.

(Reporting by Sinead Carew, writing by Lewis Krauskopf in New York; Editing by Vinu Pilakkott)
http://www.reuters.com/article/2011/...72L0K820110322





Trade Group Sets Off Debate Over Spectrum 'Hoarding'

The National Association of Broadcasters questions whether carriers are using the spectrum they already have
Grant Gross

The National Association of Broadcasters, asked by the U.S. Federal Communications Commission and some lawmakers to give up television spectrum for mobile data uses, has fired back by accusing several other companies of hoarding the spectrum they hold.

In recent weeks, the NAB has taken a new approach to its concern over a year-old FCC proposal that urges TV stations to voluntarily give up unused spectrum in exchange for a piece of the proceeds in a so-called incentive auction of that spectrum. The NAB, an influential trade group, has gone on the offensive by suggesting that several spectrum holders, including Verizon Communications, AT&T and Time Warner Cable, have not developed the spectrum they already hold.

"Maybe you should develop that spectrum before you come to broadcasters asking for 40 percent more of their spectrum," said Dennis Wharton, NAB's executive vice president for media relations. "Why is it taking so long, if there really is a national spectrum crisis?"

NAB members believe they've given up enough spectrum already in the transition to digital TV, which culminated in the 700MHz auctions that ended in early 2008, Wharton said. "We gave at the office," he added.

NAB doesn't oppose voluntary auctions, but the spectrum crunch seems likely to be a problem only in major cities, Wharton said. TV stations have their own uses for the targeted spectrum, he added.

"We are using this spectrum to deliver the primary video signal that broadcasters delivered in the analog era, along with digital multicast channels that offer niche programming like weather channels, foreign-language channels, religious programming, kids' TV shows and even high school sports in some markets -- all for free," he said. "Spectrum will also be used to deliver live and local mobile digital TV to smartphones, laptops and the back seats of cars."

FCC Chairman Julius Genachowski, mobile trade group CTIA and individual mobile carriers have long argued there's a coming spectrum shortage, with data use on mobile devices skyrocketing. In its national broadband plan released a year ago, the FCC proposed to make 500MHz of spectrum available for mobile uses over the next decade, with 120MHz coming from the TV bands.

NAB's position smacks of "incongruity," given that broadcasters are sitting on spectrum that's used to deliver over-the-air TV signals, a service that's "dropping like a stone" in popularity, Jim Cicconi, AT&T's senior executive vice president for external and legislative affairs, wrote in a blog post Friday.

"NAB ... insinuated the problem isn't their own massive warehousing and underuse of precious spectrum resources," he wrote. "Instead, the problem is everyone else."

Several other groups, including CTIA and the Consumer Electronics Association, have denounced the NAB's accusations as an effort to sidetrack the debate over spectrum needs.

The spectrum-hoarding complaints are a "desperate attempt by the broadcast industry to deflect attention from the looming national spectrum crisis," the two trade groups said in a letter to congressional leaders Thursday. "NAB has once again endeavored to search for any hint of outlier instances where spectrum allegedly is not being put to productive use -- a point that has been consistently refuted."

The letter from CTIA and CEA came a day after the NAB criticized the FCC for not completing a detailed spectrum inventory.

On Wednesday, the FCC's Genachowski, in a speech at the Mobile Future Forum, said the agency has completed a "baseline" spectrum inventory. Genachowski also refuted several points being made by the NAB, although he didn't call out the NAB by name.

"The spectrum crunch will not be solved by the build-out of already allocated spectrum," Genachowski said. "That spectrum was already built into the FCC's analysis of the spectrum shortage and does not detract from the desirability and necessity of adding the incentive auction tool to the FCC's arsenal."

The baseline inventory the FCC has completed tells the agency "more than enough" to conclude incentive auctions are needed, Genachowski said.

"Our inventory confirms that there are no hidden vacant lots of commercial airwaves, but that there are a few areas well-suited to mobile broadband, such as the TV and [mobile satellite services] bands," he added.

The FCC has not released the results of its baseline spectrum inventory, and spectrum holders are generally reluctant to talk about how much idle spectrum they own. That's led the NAB -- and other groups -- to call for a more complete spectrum inventory.

"The question is not whether the FCC can identify locations and licenses on the spectrum dashboard that have been set aside for specific services," Wharton said. "The real issue is whether specific companies that bought or were given spectrum worth billions have actually deployed it."

Several spectrum owners have denied the NAB claims, saying they're moving forward with plans to develop spectrum they've won at auction in recent years. AT&T has spent nearly US$8 billion in recent 700MHz and AWS auctions, and that spectrum is the foundation of the company's 4G LTE network, launching in mid-2011, Joan Marsh, the carrier's vice president of federal regulatory affairs, wrote in a February blog post.

NAB's claim that mobile carriers are spectrum hoarders is an "astonishing display of denial and false accusation," Marsh wrote.

Still, the NAB points to instances where spectrum holders have no immediate deployment plans. DISH Network, the satellite TV company, paid $711 million for a near-nationwide swath of spectrum in the 2008 700MHz auctions.

During a November conference call on DISH's third-quarter earnings, a participant asked DISH President and CEO Charles Ergen about the company's plans for its 70MHz holdings. Ergen called the spectrum an "investment."

"It's a building block, potentially strategically, for things we might want to do in the future," he said, according to a transcript on SeekingAlpha.com. "It is, as it turns out, a pretty good inflation hedge, and they're not making any more of that spectrum. If we're not able to strategically do something with that spectrum, then there's probably other people who are able to do that."

DISH has been "very conservative" about building out the spectrum without knowing what it wants to do with it yet, Ergen added. "I don't know whether our timing's right or not on 700MHz," he said. "At some point, that will be a valuable spectrum to somebody. And if we can figure out a way to use it, that's good. If we can't, then somebody else will own it."

DISH has used past spectrum allocations to deliver service to its customers, said Marc Lumpkin, the company's director of corporate communications. In the 700MHz spectrum, "we're exploring the best use of it," he said.

Some mobile users question whether the carriers are efficiently using the spectrum they now have, said Robb Topolski, a veteran networking engineer and frequent critic of the large carriers. Topolski cited poor coverage by Verizon on parts of Cape Cod, Massachusetts.

"Our wireless spectrum is a public asset and we lease it like we lease other public assets for the public good," he said. "When it is sold, it is sold with the intention that it will be used -- it does someone good. Hoarding it provides none of those public benefits, and the government ought to reclaim any unused bandwidth and put it to work for the people."
http://www.cio.com.au/article/380294...rum_hoarding_/





AT&T Says They're Working On Meter Accuracy

Insists They'll Work With Users On Problems
Karl Bode

Earlier this week we noted that AT&T's usage meters for their upcoming metered billing push aren't accurate when compared to user firewall or router logs. That's been par for the course with a lot of efforts to bill by the byte, particularly in Canada, where companies like Cogeco and Bell have had significant trouble tracking usage accurately. AT&T has responded to our request for comment on the meter issue, and the company tells us they're working to improve meter accuracy before the system goes live in May -- and will work with anybody who is having problems.

"We're happy to work one-on-one with any of your readers to walk through the measurement tool and address any questions," AT&T spokesman Seth Bloom tells Broadband Reports. "We're already addressing ways we can make the labels and information on the online tool more clear for customers between now and May...I can also assure you our team is performing checks everyday to ensure accuracy."

As we noted earlier in the week, AT&T's decision to measure from the DSLAM could be increasing usage estimates because that would include ATM and PPPoE overhead. AT&T didn't offer us a comment on that.
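For a rough sense of how much difference that framing could make, here's a back-of-envelope sketch in Python. It assumes a typical ADSL stack of PPPoE over Ethernet over AAL5/ATM -- the header sizes below are standard values for that stack, not figures AT&T has confirmed -- and shows how the ATM "cell tax" inflates what a DSLAM-side meter sees relative to the IP bytes a router logs:

import math

# Rough estimate of DSLAM-measured bytes vs. IP bytes for one packet.
# Assumes PPPoE over Ethernet over AAL5/ATM (common for ADSL); header
# sizes are standard values, not AT&T-confirmed configuration details.
def atm_pppoe_ratio(ip_packet_bytes):
    PPP_HEADER = 2      # PPP protocol field
    PPPOE_HEADER = 6    # PPPoE session header
    ETH_HEADER = 14     # Ethernet header (RFC 2684 bridged mode)
    AAL5_TRAILER = 8    # AAL5 CPCS trailer
    ATM_CELL = 53       # bytes on the wire per ATM cell
    ATM_PAYLOAD = 48    # payload bytes per cell; the last cell is padded

    payload = ip_packet_bytes + PPP_HEADER + PPPOE_HEADER + ETH_HEADER + AAL5_TRAILER
    cells = math.ceil(payload / ATM_PAYLOAD)
    return (cells * ATM_CELL) / ip_packet_bytes

for size in (40, 576, 1500):
    print("%5d-byte IP packet -> meter reads %.2fx" % (size, atm_pppoe_ratio(size)))

Under those assumptions, a full 1,500-byte packet gets counted at roughly 1.13 times its IP size, and small packets (ACKs, VoIP) at well over twice -- enough to make a DSLAM-side meter read noticeably higher than a customer's router log.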

The company did suggest that the time window measured by some tools could be to blame. "Other tools may measure at different 24-hour periods than we do, and most likely do not take into account the standard network protocols (e.g. Ethernet, IP) that are used to provide applications and content to our customers via the Internet," says Bloom. "As you know, this is fairly standard to incorporate when measuring broadband traffic and is applied by other ISPs who measure usage." Looking at router logs submitted to us, we're fairly sure this isn't the problem most are seeing.

As we move closer to AT&T's cap and overage launch in May, we're interested in hearing from users as to whether or not their meters are reflecting usage accurately, and if not -- how responsive AT&T is to your concerns. Again, keep in mind that as ISPs push for meters on your broadband spigot, there is absolutely nobody (aside from yourself) working to ensure they are metering you honestly.
http://www.broadbandreports.com/show...ccuracy-113370





Internet or Splinternet?
Chris

Some of you may remember that not so long ago there was a time when all things were simpler. In order to play a console game, you inserted a cartridge and pressed the start button, without having to endure long loading times, installations, connections to some poor online service, or endless firmware upgrades. The TV guide was a handy brochure, not a laggy and painfully slow screen. Pink Floyd, Nirvana and Guns’n’Roses were considered “popular music”. TV was not only about terrible talent shows. Vampire novels meant books by Anne Rice, not “Twilight”. Cars were not smarter than their drivers. And yes, movies were still silly. But it was a good kind of silly.

Back then, the Internet was one – a global web, similar regardless of whether you were accessing it from Birmingham, Berlin, Bangladesh or Kickapoo. All of this has changed.

I don’t want to be that scruffy guy with the “The end is nigh” sign and some really bad dental problems, but most industry analysts have already noticed that the global Internet is coming apart, changing into a cluster of smaller and more closed webs. They have even created a catchy name for this Web 3.0 – the Splinternet. How is this happening?

The first reason is hardware. In the beginning, most users browsed the Internet from similar desktop machines. Even if the operating system was different, standardized web protocols and languages made the final experience similar, whether you were using a Windows 3.1 machine or your trusty classic Mac. But now the pool of devices capable of using the Internet is growing rapidly. In fact, various proprietary gadgets will soon overtake desktops as the most common way of accessing the web. Some of them support Flash, some of them don’t. Some of them will adopt HTML5, others don’t plan to. Many access altered, ‘mobile’ versions of sites and apps. Some have very limited processing power, which effectively blocks them from certain web activities. And their manufacturers sometimes block certain parts of the Internet entirely, like Apple fighting porn, or AT&T blocking Skype on its smartphones. Today, the Internet on one device might be different from the Internet on another. Between mobiles, tablets, desktops, netbooks, Internet-enabled TVs and fridges, the hardware gap is widening.

An even bigger change came with the rise of social networks and various web apps. Every day more content is hidden in the walled gardens of the web, like Facebook or Twitter, behind the fence of a login and password. Just think about it: how much interesting content have you discovered in your friends’ updates, notes and tweets? This content is invisible to Google and other search engines; it isn’t backed up by the Wayback Machine or proxy servers. The number of people seeing only the things recommended by their social circle is growing.

But that’s not everything. There is also the idea of the adaptive web – an Internet that changes depending on your preferences and habits. It started with location-sensitive websites, which force you onto a localized version if they detect you’re in a certain country. Then some sites (like Amazon) learned to keep track of user history – and adapt. Now many portals are trying to push this one level further: the whole site’s content is supposed to change based on your preferences. What’s the problem with that? A simple example: imagine you’ve seen a great article on a certain site and you tell a friend about it, but when he goes to the same site, he won’t see it. The site remembered that he’s interested in music and film, not popular science, and is feeding him only the content he is supposed to like. The adaptive web might close people off in small bubbles of content, blind to the outside world.
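To make that mechanism concrete, here is a toy sketch in Python (ours, not StormDriver’s or any real site’s): the same “site” filters its front page by each visitor’s tracked interests, so an article that exists for one user simply never surfaces for another.

# Toy illustration of the adaptive-web bubble: one site, two users,
# two different front pages based on each user's tracked interests.
articles = [
    ("New Planet Discovered", "science"),
    ("Indie Band Breaks Out", "music"),
    ("Festival Film Roundup", "film"),
]

def front_page(interests):
    # Show a visitor only the stories matching their interest profile.
    return [title for title, topic in articles if topic in interests]

print(front_page({"science"}))        # you: ['New Planet Discovered']
print(front_page({"music", "film"}))  # your friend: the science piece never appears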

The same goes for ISP-side filtering. I wrote about it recently in my series on net neutrality, but to give you a quick recap: major telecoms are lobbying for the right to filter the Internet traffic coming to their clients. They want to block certain sites, and they want to force you to use their own services (e-mail clients, auction houses, shops) instead of the ones you use right now. Should they succeed, the Internet will be torn apart by gaps much wider than anything I mentioned in the previous paragraphs.

Like it or not, the Splinternet age has begun. We have a growing hardware chasm, walled gardens rising left and right, websites that have become shape-shifting adapters, ISPs that filter content, and users gathering in closed social recommendation circles. The web is much different from what it was years ago, and many analysts agree that the golden age of the Internet is finished.

So how does our StormDriver tie in with all that? Are we a knight in shining armor, on a quest to defend the old ways? Or are we a part of Web 3.0? It’s complicated (as usual). On one hand, we want to bring interaction back to the common web and break down the walls of the walled gardens. We want you to be able to interact everywhere, not only in places where admins allow you to. On the other hand, we’re also an adaptive and robust social recommendation circle: StormDriver will let you see the web as recommended by other users. It will be much easier to avoid the really bad sites and content, but it is a garden, even if the walls are knee-high and you can step over them without a login or password.

Because in the end, no one can fight the Splinternet. It’s a paradox: users want the web to become more intelligent and adaptive, but in getting there the single homogeneous Internet will shatter. Soon, everyone will have an Internet of their own.
http://www.stormdriver.com/blog/inte...r-splinternet/





Pirate Bay User Database Compromised and Exploited, Again
Ernesto

In recent weeks many Pirate Bay users have received an email, allegedly sent by The Pirate Bay team, encouraging them to download a course on how to make money from the site. The email is clearly sent by spammers, but since this is not the first time the Pirate Bay user database has been exploited, users are starting to worry how it’s possible that their personal info is leaking out again.

Last summer a group of Argentinian hackers gained access to The Pirate Bay’s admin panel through a security breach. At the time, the hackers stated that they didn’t want to exploit the vulnerability, and merely wanted to show that the system was vulnerable.

The Pirate Bay team informed TorrentFreak that they were doing all they could to patch the vulnerability, and later said that the site was fully secure again. Two months later, however, it became apparent that The Pirate Bay backend had been exploited, this time by spammers.

At the time a large number of The Pirate Bay users received an email, allegedly from the site’s operators, inviting them to join the private BitTorrent tracker DemUnoid. The emails were sent out using a unique combination of real Pirate Bay user names and the email addresses those people signed up with, indicating that the sender had exploited the user database.

How this happened, and whether there was a connection to the earlier hack attempt remained a mystery, but it has now become apparent that this spam attempt was not an isolated incident.

In mid-February TorrentFreak started receiving reports of another spam attempt. This time Pirate Bay users are being encouraged to visit a website where they can allegedly download instructions on how to make money from torrent sites like The Pirate Bay.

Below is a copy of one of the original emails. A slightly edited version was sent out as recently as yesterday.



Subject: Attention to all PirateBay Users

Dear *Username*

A course has been put together to show you how to use The PirateBay to make some serious money. This seriously works.

Please visit http://www.sams101.com/ccount/click.php?XXX

and download the course instructions. Because you are a torrent user and you use TPB you can do this.

Pirate Team




The staff at The Pirate Bay are definitely not sending out these emails, so where do they originate? As far as we can see, it appears to be another exploit of a vulnerability in The Pirate Bay user database, this time used for malicious purposes. Another possibility is that the same people are reusing the previously obtained data.

The emails that TorrentFreak has seen all follow the same structure and link to the same page. They are sent from various addresses such as super.affilates002@gmail.com, the.pirates.teams@gmail.com and the.pirate.teams@thepiratesteam.com and all use the unique combination of a Pirate Bay username and email address of the user in question.

One Pirate Bay user who received the spam email told TorrentFreak that he only used the email the spam was sent to once, to sign up at The Pirate Bay, which is a clear sign that the spam results from a compromised user database. How this info was collected is unclear at this point, and from the information we have it appears that only a subset of users is affected.

During recent weeks users have mentioned the spam mails on the Pirate Bay forums, but no official explanation has been given thus far. With nearly 5 million users, The Pirate Bay database is a lucrative target for spammers, so new users should be wary of this and, if possible, use a throwaway or per-site email address when signing up (one way to do that is sketched below).
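As an illustration of that advice, here is a minimal sketch assuming a mail provider that supports plus-addressing (Gmail, among others, delivers user+tag@example.com to user@example.com); the helper function is hypothetical, not a TorrentFreak or Pirate Bay tool. If spam later arrives at one of these aliases, you know exactly which site's database leaked:

import re

# Generate a unique, traceable signup address per site using
# plus-addressing. Assumes your provider delivers user+anything@domain
# to user@domain (Gmail and many others do).
def signup_alias(base_email, site):
    local, domain = base_email.split("@", 1)
    tag = re.sub(r"[^a-z0-9]", "", site.lower())  # keep the tag address-safe
    return local + "+" + tag + "@" + domain

print(signup_alias("alice@gmail.com", "The Pirate Bay"))
# -> alice+thepiratebay@gmail.com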

A Pirate Bay moderator told TorrentFreak that users who want to change their email address can ask for it on IRC or at the forums. An option to let users change their email addresses on the site is being considered.
http://torrentfreak.com/pirate-bay-u...-again-110320/

















Until next week,

- js.



















Current Week In Review





Recent WiRs -

March 19th, March 12th, March 5th, February 26th

Jack Spratts' Week In Review is published every Friday. Submit letters, articles, press releases, comments, questions etc. in plain text English to jackspratts (at) lycos (dot) com. Submission deadlines are Thursdays @ 1400 UTC. Please include contact info. The right to publish all remarks is reserved.


"The First Amendment rests on the assumption that the widest possible dissemination of information from diverse and antagonistic sources is essential to the welfare of the public."
- Hugo Black