Peer-To-Peer News - The Week In Review - April 12th, '14
Posted by JackSpratts (New England) on 09-04-14, 07:47 AM

Since 2002

"The evidence on societal costs points in conflicting directions and our study shows that the impact of illegal downloading and file sharing remains unclear." – Daniel Zizzo

"When people commit crimes, we have the ability and obligation to ensure that they do not stand to account for those crimes in forums in which they performed no 'essential conduct element' of the crimes charged." – Circuit Judge Michael Chagares

"There’s nothing users can do until the web services have made their sites secure." – Mark Seiden

April 12th, 2014

Studios File New Lawsuit Against Megaupload and Its Founder
Michael Cieply

Hollywood’s major film studios on Monday added a new civil lawsuit on copyright infringement to the legal challenges facing the owners of the now-closed entertainment website Megaupload.com.

The suit was filed in United States District Court for the Eastern District of Virginia, in Alexandria, by 20th Century Fox Film, Disney Enterprises, Paramount Pictures, Universal City Studios Productions, Columbia Pictures Industries and Warner Bros. Entertainment.

It names Megaupload and its founder, Kim Dotcom, among others, as defendants, and lists about 30 feature films — including “Avatar,” “Transformers” and “Cars 2” — as having been among what it said were thousands of copyrighted works illegally downloaded through the site. The suit made various demands for damages that it said could include a maximum statutory award of $150,000 for each infringement.

Mr. Dotcom, who was born Kim Schmitz, has been fighting extradition from New Zealand to face criminal charges of copyright infringement and racketeering in the United States.

Ira Rothken, a lawyer for Megaupload and the other defendants, said he believed the suit was without merit. “We plan on vigorously defending the case,” Mr. Rothken said.

Dropbox Clarifies its Policy on Reviewing Shared Files for DMCA Issues

Service matches publicly shared content against hashes of previously blocked files.
Kyle Orland

For years now, Internet users have accepted the risk that files and content they share through various online services may be subject to takedown requests based on the Digital Millennium Copyright Act (DMCA) and/or content-matching algorithms. But users have also gotten used to treating services like Dropbox as their own private, cloud-based file storage and sharing systems, facilitating direct person-to-person file transfer without having to worry about copyright enforcement.

This weekend, though, a small corner of the Internet exploded with concern that Dropbox was going too far, actually scanning users' private and directly peer-shared files for potential copyright issues. What's actually going on is a little more complicated than that, but it shows that sharing a file on Dropbox isn't always the same as sharing that file directly from your hard drive over something like e-mail or instant messenger.

The whole kerfuffle started yesterday evening, when one Darrell Whitelaw tweeted a picture of an error he received when trying to share a link to a Dropbox file via IM. The Dropbox webpage warned him and his friend that "certain files in this folder can't be shared due to a takedown request in accordance with the DMCA."

Whitelaw freely admits that the content he was sharing was a copyrighted video, but he still expressed surprise that Dropbox was apparently watching what he shared for copyright issues. "I treat [Dropbox] like my hard drive," he tweeted. "This shows it's not private, nor mine, even though I pay for it."

In response to follow-up questions from Ars, Whitelaw said the link he sent to his friend via IM was technically a public link and theoretically could have been shared more widely than the simple IM between friends. That said, he noted that the DMCA notice appeared on the Dropbox webpage "immediately" after the link was generated, suggesting that Dropbox was automatically checking shared files somehow to see if they were copyrighted material rather than waiting for a specific DMCA takedown request.

Dropbox did confirm to Ars that it checks publicly shared file links against hashes of other files that have been previously subject to successful DMCA requests. "We sometimes receive DMCA notices to remove links on copyright grounds," the company said in a statement provided to Ars. "When we receive these, we process them according to the law and disable the identified link. We have an automated system that then prevents other users from sharing the identical material using another Dropbox link. This is done by comparing file hashes."

Dropbox added that this comparison happens when a public link to your file is created and that "we don't look at the files in your private folders and are committed to keeping your stuff safe." The company wouldn't comment publicly on whether the same content-matching algorithm was run on files shared directly with other Dropbox users via the service's account-to-account sharing functions, but the wording of the statement suggests that this system only applies to publicly shared links.
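The mechanism Dropbox describes can be sketched in a few lines. This is a hypothetical illustration, not Dropbox's actual code: the company has not disclosed which hash function it uses, so SHA-256, the function names, and the in-memory blocklist are all assumptions.

```python
import hashlib

# Hashes of files previously named in successful DMCA takedowns.
BLOCKED_HASHES = set()

def file_hash(data: bytes) -> str:
    """Return a hex digest identifying the file's exact contents."""
    return hashlib.sha256(data).hexdigest()

def register_takedown(data: bytes) -> None:
    """Record a file's hash after a valid DMCA notice is processed."""
    BLOCKED_HASHES.add(file_hash(data))

def can_create_public_link(data: bytes) -> bool:
    """Checked only when a *public* link is generated, per Dropbox's
    statement; files sitting in a private folder are never scanned."""
    return file_hash(data) not in BLOCKED_HASHES

register_takedown(b"previously noticed video bytes")
print(can_create_public_link(b"previously noticed video bytes"))  # False
print(can_create_public_link(b"my own home movie"))               # True
```

Note that because the comparison is an exact-content hash, even a one-byte change to the file would produce a different digest and slip past this check, which is one way a hash blocklist differs from YouTube-style content fingerprinting.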

We should be clear here that Dropbox hasn't removed the file from Whitelaw's account; they just closed off the option for him to share that file with others. In a tweeted response to Whitelaw, Dropbox Support said that "content removed under DMCA only affects share-links." Dropbox explains its copyright policy on a Help Center page that lays out the boilerplate: "you do not have the right to share files unless you own the copyright in them or have been given permission by the copyright owner to share them." The Help Center then directs users to its DMCA policy page.

Dropbox has also been making use of file hashing algorithms for a while now as a means of de-duplicating identical files stored across different users' accounts. That means that if I try to upload an identical copy of a 20GB movie file that has already been stored in someone else's Dropbox account, the service will simply give my account access to a version of that same file rather than allowing me to upload an identical version. This not only saves bandwidth on the user's end but significant storage space on Dropbox's end as well.
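De-duplication of this kind is typically built as content-addressed storage: the file's hash is the storage key, so a second identical upload just adds a pointer to the existing blob. The sketch below is a generic illustration of that idea under assumed data structures, not Dropbox's implementation.

```python
import hashlib

blob_store: dict[str, bytes] = {}        # content hash -> stored bytes
user_files: dict[tuple, str] = {}        # (user, filename) -> content hash

def upload(user: str, name: str, data: bytes) -> bool:
    """Store a file; return True only if bytes were actually transferred."""
    digest = hashlib.sha256(data).hexdigest()
    is_new = digest not in blob_store
    if is_new:
        blob_store[digest] = data        # first copy: store the blob
    user_files[(user, name)] = digest    # every copy: just a reference
    return is_new

upload("alice", "movie.mkv", b"20GB of movie data")
stored_new = upload("bob", "film.mkv", b"20GB of movie data")  # deduplicated
print(len(blob_store), stored_new)  # 1 False
```

This is also why the Dropship trick described below was possible: if the server trusts a client-supplied hash, claiming to "have" a file reduces to knowing its digest.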

Some researchers have warned of security and privacy concerns based on these de-duplication efforts in the past, but the open source Dropship project attempted to bend the feature to users' advantage. By making use of the file hashing system, Dropship effectively tried to trick Dropbox into granting access to files on Dropbox's servers that the user didn't actually have access to. Dropbox has taken pains to stop this kind of "fake" file sharing through its service.

In any case, it seems a similar hashing effort is in place to make it easier for Dropbox to proactively check files shared through its servers for similarity to content previously blocked by a DMCA request. In this it's not too different from services like YouTube, which uses a robust ContentID system to automatically identify copyrighted material as soon as it's uploaded.

In this, both Dropbox and YouTube are simply responding to the legal environment they find themselves in. The DMCA requires companies that run sharing services to take reasonable measures to make sure that re-posting of copyrighted content doesn't occur after a legitimate DMCA notice has been issued. Whitelaw himself doesn't blame the service for taking these proactive steps, in fact. "This isn't a Dropbox problem," he told Ars via tweet. "They're just following the laws laid out for them. Was just surprised to see it."

Still, we feel this is important information for Dropbox users to know. There are certain limitations on how accounts can be used. Any Dropbox file shared via a "public link," even if it's a link that you only intend to share with a single person, is being compared against a database of previous material subject to the DMCA. It could be blocked on those grounds.

Why the Digital Privacy Act Undermines Our Privacy: Bill S-4 Risks Widespread Warrantless Disclosure
Michael Geist

Earlier this week, the government introduced the Digital Privacy Act (Bill S-4), the latest attempt to update Canada's private sector privacy law. The bill is the third try at privacy reform stemming from the 2006 PIPEDA review, with the prior two bills languishing for months before dying due to elections or prorogation.

The initial focus has unsurprisingly centered on the new security breach disclosure requirements that would require organizations to disclose breaches that put Canadians at risk of identity theft. Security breach disclosure rules are well-established in other countries and long overdue for Canada. The bill fixes an obvious shortcoming from the earlier bills by adding some teeth to the disclosure requirements with the addition of penalties for violations of the law. While Bill S-4 stops short of granting the Privacy Commissioner full order-making power as is found at the provincial level, the creation of compliance orders holds some promise of holding organizations to account where violations occur.

Despite those positive proposed changes to Canadian privacy law, the bill also includes a provision that could massively expand warrantless disclosure of personal information.

The government is already working to expand warrantless disclosure of subscriber information to law enforcement with Bill C-13 (the "cyber-bullying bill") including an immunity provision from any criminal or civil liability (including class action lawsuits) for companies that preserve personal information or disclose it without a warrant. The law currently entrusts companies with a gatekeeper role since it permits them to either voluntarily disclose personal information as part of a lawful investigation or demand that law enforcement first obtain a court order. The immunity provision makes it more likely that disclosures will occur without a warrant since the legal risks associated with such disclosures are removed.

In light of revelations that telecom companies and Internet companies already disclose subscriber information tens of thousands of times every year without a court order, the immunity provision is enormously problematic. Yet it pales in comparison to the Digital Privacy Act, which would expand the possibility of warrantless disclosure to anyone, not just law enforcement. Bill S-4 proposes that:

"an organization may disclose personal information without the knowledge or consent of the individual... if the disclosure is made to another organization and is reasonable for the purposes of investigating a breach of an agreement or a contravention of the laws of Canada or a province that has been, is being or is about to be committed and it is reasonable to expect that disclosure with the knowledge or consent of the individual would compromise the investigation;

Unpack the legalese and you find that organizations will be permitted to disclose personal information without consent (and without a court order) to any organization that is investigating a contractual breach or possible violation of any law. This applies both to past breaches or violations and to potential future ones. Moreover, the disclosure occurs in secret, without the knowledge of the affected person (who therefore cannot challenge the disclosure, since they are not aware it is happening).

When might this apply?

Consider the recent copyright case in which Voltage Pictures sought an order requiring TekSavvy to disclose the names and addresses of thousands of subscribers. The federal court established numerous safeguards to protect privacy and discourage copyright trolling by requiring court approval for any demand letters being sent to subscribers. If Bill S-4 were the law, the court might never become involved in the case. Instead, Voltage could simply ask TekSavvy for the subscriber information, which could be legally disclosed (including details that go far beyond just name and address) without any court order and without informing their affected customer.

In fact, the potential use of this provision extends far beyond copyright cases. Defamation claims, commercial battles, and even consumer disputes may all involve alleged breaches of agreements or the law. While the organization with the personal information (telecom companies, social media sites, local businesses) might resist disclosing information without a court order, the law would not require them to do so.

The resulting framework from C-13 and S-4 is stunning from an anti-privacy perspective:

• organizations could disclose subscriber or customer personal information without a court order to law enforcement with full legal immunity from liability
• organizations could disclose subscriber or customer personal information without a court order to any other organization claiming investigation of an actual or potential contractual breach or legal violation
• the disclosures would be kept secret from the affected individuals
• the disclosing organizations would be under no obligation to report on their practices or past disclosures

The government claims the Digital Privacy Act "will provide new protections for Canadians when they surf the web and shop online". What it does not say is that the same bill will open the door to massive warrantless disclosure of their personal information.

No, You Can’t Save This File, It’ll Automatically Self-Destruct Instead
Chris Smith

DSTRUX is a brand new service that wants to make files even more secure when you share them on the Internet by offering total control over who gets to see them and by making it more difficult for recipients to duplicate a file without permission. This new cloud-based file-sharing product will also allow users to set up a self-destruct timer on shared files, so they're automatically destroyed at a certain point in time.

“DSTRUX was created for the millions of everyday people and companies who lose control of their images, sensitive documents, files and intellectual property,” DSTRUX CEO Nathan Hecht said.

“We have been programmed to accept a complete loss of control online and the consequences that follow. When files leave a computer for another, there is no telling where they will wind up or how they may be altered. The idea that society has become complacent is mind-boggling and disturbing. Too often sensitive documents and pictures come back to haunt people and companies for simple errors. We want DSTRUX to be a platform that impacts society in both the virtual and real worlds, and I am truly proud of what we’ve accomplished,” the exec added.

Files uploaded with DSTRUX will be protected against altering, printing, copying, forwarding or screenshots without the sender's permission, but they can be shared in a variety of ways, including via social networks or regular email. The company also says that the sender can control whom the file can be forwarded to by the users who receive it initially.

DSTRUX will be available free of charge for the first three months, although a pricing structure for the following months has not been announced. DSTRUX apps for iOS and Android will be available in May and later this summer, respectively.

FCC Sides with Local Cable Commissions Against Comcast

It upholds ruling telling company to stop lumping “service fees” with equipment.
Libor Jany

The Federal Communications Commission has upheld a ruling ordering Comcast to stop charging its customers for cable equipment under the guise of service fees.

The FCC denied an appeal by Comcast, which argued that its practice of charging customers separately for a DTA (digital terminal adapter) — a converter box that allows cable subscribers with older televisions to receive digital channels, which the company said would be provided at no charge — is not subject to rate regulation, because it is a service fee. The ruling was issued on March 19.

After subscribers complained, four Twin Cities suburban cable franchising authorities issued rate orders “requiring Comcast to unbundle the equipment from three different so-called ‘service fees,’ ” said Woodbury attorney Michael Bradley, whose law firm, Bradley & Guzzetta, represented the cable commissions.

The commissions are: North Metro Telecommunications, North Suburban Communications, Ramsey/Washington Counties Suburban Cable and South Washington County Telecommunications.

Municipal cable officials last year called on Comcast, the Twin Cities’ largest cable provider, to be more transparent in its billing practices, Bradley said.

The FCC agreed, ruling that the cable giant is “required to establish separate equipment and programming rates when setting initial regulated rates, by first unbundling (or separating) equipment costs from total regulated revenues.”

In an e-mail last week to the Star Tribune, Comcast vice president of corporate affairs Mary Beth Schubert said the case “involved a relatively minor dispute about the way certain items are presented on the rate card but has no effect on overall pricing.”
But Bradley argued that the FCC's decision sets a strong precedent for transparency within the cable industry.

“They were lumping the equipment and the so-called service fees all together,” he said. “And what’s important about the ruling is that now the FCC has very clearly told Comcast and all cable companies across the country that it can no longer do that.”

In Scrutiny of Cable Merger, Internet Choice Will Be Crucial Battlefield
Edward Wyatt

Since announcing plans to take over Time Warner Cable two months ago, Comcast has steadily beat the drum with one big message: The merger will not limit consumers’ choice in picking a cable television or high-speed Internet service provider.

Comcast is expected to repeat this message twice this week — on Wednesday during the first Senate hearings on the $45 billion deal, and again in legal filings it is expected to give to the two government agencies reviewing the merger.

But in highlighting how the two companies do not compete with each other in any metropolitan market, Comcast has exposed a potential weakness in its argument, legal experts say. The lack of overlap in cable TV is the legacy of government-granted local monopolies. But the government never granted monopolies in the unregulated, highly lucrative business of high-speed Internet service — an area where the two companies face little to no competition.

As a result, regulators are likely to focus as much on how the merger will affect the market for high-speed Internet, also known as broadband, as how it will affect cable TV service.

“The economic argument over broadband is going to be crucial in this case,” said Allen P. Grunes, a partner at the law firm GeyerGorey in Washington and the co-author of two recent papers examining the merger.

The proposed takeover will come under scrutiny on Wednesday by the Senate Judiciary Committee, which oversees antitrust agencies. Senator Patrick J. Leahy, the Vermont Democrat who is chairman of the committee, has indicated that broadband will play a crucial role in its investigation.

“Millions of Americans rely on cable connectivity to receive the programs they love and to access the Internet at the fast speed needed as we conduct more of our lives online,” Mr. Leahy said in announcing the hearing. “The pending merger is an important opportunity to examine how Americans access these valuable services as the video and online marketplace continue to evolve.”

The questions about broadband competition also will loom large this week if Comcast, as expected, lays out its defense of the merger in filings with the Justice Department and the Federal Communications Commission. The merger requires approval of both agencies. The Justice Department looks solely at whether the merger violates antitrust laws, and it faces the burden of proof.

But the F.C.C. has a broader mandate in examining whether the merger serves “the public interest.” In the F.C.C. inquiry, the burden is on Comcast to prove its deal serves the public interest. The company is expected to file its public interest statement on Tuesday.

Comcast has said it will lessen the merger’s impact on cable television competition by keeping its share of the national cable market at no more than 30 percent. That will require it to divest itself of some of Time Warner Cable’s customers.

But Comcast’s share of the market for high-speed Internet is bigger than its share of the cable market — at least 40 percent, the company says. Critics of the merger say that if only truly high-speed Internet service is considered, meaning data transmission rates of 25 megabits per second or more, its share tops 50 percent.

As consumers steadily drop their cable subscriptions in favor of services like Netflix, Amazon and YouTube to stream movies and TV shows through the Internet, high-speed Internet service is expected to become the more important business.

“Broadband is the faster-growing competitor to traditional video services,” said Gene Kimmelman, chief executive of Public Knowledge, a nonprofit group that opposes the merger.

Comcast has gotten plenty of support in its bid for Time Warner Cable. In Comcast’s home state, Pennsylvania, the governor and the state’s two senators have spoken out in favor of the deal, as have industry experts.

“We should consider the benefits to consumers and the overall economy, as well as the potential drawbacks, instead of assuming big cable companies are necessarily bad,” wrote Doug Brake, a telecommunications policy analyst at the Information Technology and Innovation Foundation, when the deal was announced. “With a little analysis, the deal appears a win for consumers and the economy over all.”

Comcast, however, might have provided evidence that it faces little competition in high-speed Internet in dozens of F.C.C. petitions it filed over the last few years seeking to get out from under local cable rate regulation.

Cable rates cannot be regulated if a company proves there is “effective competition” in a market. A market is judged to be competitive if at least two companies each offer service to 50 percent of the households in the area, and if the market share of all providers, except the largest, is at least 15 percent.
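The two-prong test described above is mechanical enough to express directly. The sketch below is an illustration of the test as summarized in this article, with assumed input shapes (fractions of households), not an actual FCC data format.

```python
# Prong 1: at least two providers each offer service to 50% of households.
# Prong 2: all providers except the largest hold at least a 15% share combined.

def effectively_competitive(coverage: dict, shares: dict) -> bool:
    """coverage: provider -> fraction of households offered service.
    shares:   provider -> fraction of households subscribing."""
    reach_50 = [p for p, frac in coverage.items() if frac >= 0.50]
    if len(reach_50) < 2:
        return False
    largest = max(shares, key=shares.get)
    others = sum(s for p, s in shares.items() if p != largest)
    return others >= 0.15

# Satellite TV reaches essentially every household, so prong 1 passes;
# two satellite rivals totaling 18% satisfy prong 2 (hypothetical numbers).
coverage = {"Comcast": 1.00, "DirecTV": 1.00, "Dish": 1.00}
shares = {"Comcast": 0.70, "DirecTV": 0.10, "Dish": 0.08}
print(effectively_competitive(coverage, shares))  # True
```

This illustrates the article's point: because satellite coverage is near-universal, citing DirecTV and Dish lets a cable operator pass the test for television even in markets where it is the only wired broadband option.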

In the petitions, Comcast argued that the nation’s two satellite television companies, DirecTV and Dish Network, meet those requirements in many markets by accounting for at least 15 percent of television service. While a few markets also have telecoms, usually AT&T or Verizon, competing to provide television service, most of the petitions cite only the satellite companies as rivals.

But satellite companies do not offer high-speed Internet service, as the technology prevents it. The time required for a television signal to travel to a satellite and back to Earth is significant enough to create untenable delays.

That means that in those markets, Comcast is usually the only provider of high-speed broadband using a cable modem — the fastest service going, next to fiber optic cable, which is not widely available in the United States.

Comcast said those F.C.C. filings did not refer to broadband speeds directly and said it did face competition in most markets from other broadband providers. “In most of those areas, DSL is available” from the local phone company, and wireless broadband is available through mobile phone companies, said Sena Fitzmaurice, a spokeswoman for Comcast.

But DSL service, which is delivered over traditional copper phone lines, does not measure up to the speeds of cable Internet service. The most recent F.C.C. figures available, from mid-2012, show that only 8 percent of DSL connections in the United States transmit at a speed of at least 10 megabits per second. Seventy percent of cable modem service travels that fast.

All of that brings into focus another debate that is expected during consideration of the merger: Just what qualifies as high-speed Internet, or broadband, service?

The F.C.C. has a formal definition: broadband that moves at 3 megabits per second or faster. But that speed is woefully inadequate for many sites that stream video. Many of the newest generation of mobile broadband services can transmit data at up to 8 megabits per second, but that is still below the average cable modem speed.

In that respect, Comcast says it is a victim of its success. It has spent heavily to build its capacity and offer customers better Internet service. “People who are critics of us,” Ms. Fitzmaurice said, “can’t decide that now that we’re offering higher speeds, competition from DSL doesn’t exist.”

Congratulations To Comcast, Your 2014 Worst Company In America!

Four years after Comcast won its first Worst Company tournament, some doubted that the Kabletown Krusher could ever regain that 2010 form. But after a few years of letting others hold the title, Comcast was fiercely intent on bringing a second Golden Poo to its Philadelphia lair. And in one of the narrowest Final Death Matches in the centuries-long history of WCIA battle, Comcast managed to hold off the genetically modified body blows of Monsanto.

From the outset of the day-long bout, lawsuit-lovin', herbicide-makin' Monsanto was within striking distance of the Philly Kid, but Comcast gained a hair-thin edge early on and never ceded the lead.

Comcast’s road to the Poo started out without a speedbump, as the company powered through the first three rounds without ever giving up more than 30% of the vote. And with two-time reigning champ EA eliminated in Round One by Comcast’s merger partner Time Warner Cable, followed by three-time consecutive runner-up Bank of America’s surprise defeat at the hands of Walmart, Comcast seemed destined for the Final Death Match.

But the nation’s largest cable and Internet provider (which is trying to become even larger) almost got stopped in its tracks by first-time contender SeaWorld, riding high on waves of negative publicity tied to the documentary Blackfish. Comcast pulled off a buzzer-beater to hold off SeaWorld and earn its place in the Final Death Match.

Comcast’s win makes it only the second company to claim multiple Poos. Last year, video game biggie EA became the first two-time winner and the tournament’s first repeat champ.

And so that’s it for WCIA 2014. See you again next year!

Franken’s Campaign Against Comcast Is No Joke
Ashley Parker

For Senator Al Franken, the political became personal at a “Saturday Night Live” cast party, of all places.

It was there in New York two years ago that Mr. Franken, a Minnesota Democrat, ran into Lorne Michaels, the creator of the NBC show and his former boss when he was a writer and performer there. Mr. Michaels was chatting with Brian L. Roberts, the chief executive of Comcast, which had recently acquired NBCUniversal in a deal that Mr. Franken opposed.

“I fought to prevent this!” Mr. Franken blurted out to the two men.

It was a potentially awkward moment that Mr. Franken defused with the kind of blustery humor that delighted audiences during his years as an entertainer. “We all had a laugh, fun was had by all, and I went on,” he said in an interview.

But for Mr. Franken, antitrust issues involving big companies are no joking matter. The man who created such famous “Saturday Night Live” characters as the self-help guru Stuart Smalley is now a serious policy wonk and a self-made expert in antitrust matters like price-fixing and monopolization.

After a failed attempt to block the Comcast-NBC Universal merger, Mr. Franken again finds himself playing a trustbusting role in Washington — against the same adversary. He has emerged as the leading congressional opponent of Comcast’s $45 billion bid to take over Time Warner Cable, a merger that would unite the nation’s two biggest cable companies.

In a three-hour Senate Judiciary hearing on Wednesday, Mr. Franken adopted a prosecutorial stance as he interrogated executives from both companies, asking pointed questions, often repeatedly, like a dog with a particularly tasty bone. He was the only lawmaker to explicitly say he wanted the merger blocked.

“We’ve got the biggest cable provider and biggest Internet provider, in Comcast, buying the second-biggest cable provider and third-largest Internet provider, and I’m very worried that will create a company that’s too big,” Mr. Franken said in the interview. “They’re going to use their position to leverage higher cable prices and to dictate a lot of things that will make for fewer choices, and their service will be even worse.”

Mr. Franken, for his part, should have a good sense of Comcast — he said the company was his provider in both Minnesota and Washington, and added with a laugh: “It’s great. The service is wonderful.” Moments later, he doubled back to explain his tone. His chuckle, he said, “was more ironic than sarcastic.”

Mr. Franken, who also opposed the unsuccessful merger of AT&T and T-Mobile, said his interest in the issue was about “trying to protect consumers in Minnesota, trying to protect people whose experience with Comcast has not been good.” He added that when he asked his constituents to weigh in, he received over 100,000 replies, overwhelmingly opposed to the deal and talking about “how lousy the customer service was.”

But the issue is also one Mr. Franken knows intimately from his time in the entertainment industry. He recalled working in television when the Financial Interest and Syndication Rules, which reined in the power of the networks, were relaxed and ultimately overturned in the 1990s. That allowed networks to own the television shows they broadcast in prime time, which Mr. Franken said “killed independent production.”

“The networks swore up and down that they would not favor their own shows because they said, ‘We want the best ratings,’ ” he said in a phone interview. “That was completely false.”

“LateLine,” a sitcom he helped create, was produced by Paramount, a company that NBC did not own, and as a result NBC did not give it a choice slot, in Mr. Franken’s view. “Our time slot was not conducive to getting many eyeballs to our show,” he said.

During his performing career, Mr. Franken acquired a reputation for tilting heavily against the establishment — and that included the government, the network he worked for and even his own show.

In 1980 he took on Fred Silverman, then the NBC chief executive, in a commentary on the Weekend Update segment, calling him “a lame-o” for wrecking the network. Mr. Silverman was reportedly enraged by the sketch.

In another well-remembered segment from 1980, he mocked the show’s producers and essentially argued for “S.N.L.” to be canceled — but not until a week later, after he had a chance to be the host. (Because of a writers’ strike, he never was.)

He was known for leading the effort to make sure the writers were paid for every type of replay of “S.N.L.” or the sketches they had written. A colleague from the early years of the show, who asked not to be identified because he had become estranged from Mr. Franken, said that Mr. Franken had been originally hired as a writer at a less-than-subsistence salary, “and he never forgot that.”

As a senator, Mr. Franken has followed the workhorse model of previous senators who came in surrounded by hype, such as Hillary Rodham Clinton and Barack Obama, and focused on the minutiae of legislation. He usually speaks only to the Minnesota press, and even his more whimsical pursuits are local in nature, like the Annual Minnesota Congressional Delegation Hotdish Off, a casserole competition among the state’s lawmakers that Mr. Franken organized.

“If you had to pick a word for Al Franken as a senator, it’s ‘studious,’ ” said Senator Charles E. Schumer, Democrat of New York. “He really studies the issues hard, he’s very serious about them, and he’s effective.”

Far from cracking wise, he has earned a reputation as a student of the fine details of policy and legislation. He spent his 62nd birthday last year immersed late into the night at a Senate Judiciary Committee session, marking up a broad bipartisan immigration bill; at one point, as the hours ticked by, forcing Mr. Franken to miss a family birthday dinner, the Democratic senator Chris Coons of Delaware presented him with a vanilla buttercream cupcake.

“That’s Al Franken the senator,” Mr. Coons said. “He is engaged, he is diligent, he is thorough, he is thoughtful.”

Senator Patrick J. Leahy, Democrat of Vermont and chairman of the Judiciary Committee, echoed the praise. “He’s been one of the best-prepared people there, and is very valuable in committee,” Mr. Leahy said. “He certainly knows a lot about the business, far more than most of us would on a personal basis.”

Mr. Franken’s deep understanding of — and near obsession with — telecommunications mergers has impressed even those involved in antitrust debates. Albert A. Foer, president of the American Antitrust Institute, which opposes mergers, still recalls a speech Mr. Franken, who is not a lawyer, gave on the subject to the American Bar Association two years ago, calling it “beautiful.”

“As a guy coming from the media, he understands the need for diversity, and as a politician, he understands the need for decentralized power,” Mr. Foer said. “He puts these together in a kind of nonpedantic way.”

Mr. Franken’s colleagues on the Judiciary Committee, many of whom have law degrees, say that he likes to joke, “I’m not a lawyer, but I played one in a sketch.”

It is a punch line that is adaptable to different situations.

“When I speak to prosecutors, I say, ‘I played a defense lawyer,’ ” Mr. Franken said, adding with a chuckle and evident satisfaction: “There, I made myself laugh.”

Bill Carter contributed reporting from New York.

The Backlash to the Comcast Merger is Now Bipartisan
Brian Fung

Ever since Comcast unveiled its plan to take over the nation's second biggest cable company, liberals have been pretty upset about the idea. Among the most vocal is Sen. Al Franken (D-Minn.), who argued recently in blunt messages to federal regulators that "the Internet belongs to the people, not huge corporations." On Tuesday, dozens of left-leaning organizations, such as Moveon.org and SumofUs, sent a letter to the Justice Department and the Federal Communications Commission expressing their displeasure.

Conservatives, by contrast, have mostly kept mum or praised the looming merger. But that may be starting to change as Republicans detect a political opportunity in the proposal — not to mention some burgeoning problems with the merger itself. The result is bipartisan objection to a buyout that critics say would be harmful to competition.

Comcast has argued that when stacked up against wireless companies and DSL providers, the cable company is surrounded by plenty of potential rivals. But as my colleague Cecilia Kang points out, the variation in broadband quality when it comes to these other services means that Comcast isn't really going head-to-head with these guys. Cellular data plans top out at speeds of roughly 10 Mbps, while cable can outpace that easily. Cable remains the most common way that Americans connect to the Internet, with more than half of all U.S. households getting online through a cable provider. And if the Time Warner Cable deal gets approved, almost 60 percent of all U.S. cable subscribers would belong to Comcast.

Republican groups, including the American Family Association, Tea Party Nation and the U.S. Business and Industry Council, see that consolidation as a reason to worry.

"Competition isn’t just about the number of competitors," they and nearly a dozen other organizations wrote to Congress Tuesday. "It is also about the market forces that stimulate the process of bringing new and innovative products to the market that consumers want."

Others see the merger as a chance to score points against the Obama administration, which has close ties with top Comcast execs Brian Roberts and David Cohen. The right-leaning Washington Free Beacon published a 1,200-word column on Friday excoriating Comcast's political contributions to Democratic politicians. That was soon followed by columns on Breitbart.com and a number of other outlets.

"If Republicans had any sense, they would wage war against Comcast and its Democratic enablers and turn the merger into a live issue," wrote the Free Beacon's Matthew Continetti.

Just because conservatives are beginning to line up alongside their liberal counterparts doesn't mean they'll be able to torpedo the deal. Neither does the White House's relationship with Comcast necessarily imply that the Justice Department and the FCC are in Big Cable's pocket. But the right's growing skepticism is a surprising turn for a party that tends to side with business as a matter of instinct.

Comcast PAC Gave Money to Every Senator Examining Time Warner Cable Merger

Even anti-Comcast Al Franken got one Comcast donation 5 years ago.
Jon Brodkin

It's no surprise that Comcast donates money to members of Congress. Political connections come in handy for a company seeking government approval of mergers, like Comcast's 2011 purchase of NBCUniversal and its proposed acquisition of Time Warner Cable (TWC).

But just how many politicians have accepted money from Comcast's political arm? In the case of the Senate Judiciary Committee, which held the first congressional hearing on the Comcast/TWC merger yesterday, the answer is all of them.

Sen. Chuck Schumer (D-NY) led the way with $35,000 from the Comcast federal political action committee (PAC) between 2009 and 2014, Sen. Patrick Leahy (D-VT) received $32,500, and Sen. Orrin Hatch (R-UT) received $30,000. These figures are the combined contributions from Comcast to the senators' campaign and leadership committees. (Schumer has recused himself from the merger hearings because his brother, a lawyer, worked on the deal.)

Out of 18 committee members, 10 Democrats and eight Republicans, 17 got money from Comcast's federal PAC, according to the database at OpenSecrets.org.

Just a one-time thing

Anti-Comcast Sen. Al Franken (D-MN) isn't listed as having received anything from Comcast's PAC, but that's apparently because the database doesn't include money collected by the recount fund Franken set up when his Senate election went to a recount. Franken and Comcast spokespeople confirmed to Ars that a Comcast PAC gave $5,000 to Franken's recount fund in 2009. A Los Angeles Times story from 2010 also mentions the donation.

The LA Times story notes that Franken was using his opposition to the Comcast/NBC deal to raise more campaign funds. A Franken spokesperson told the newspaper at the time that there were no plans to return the donation, saying, "He campaigned pretty clearly that he was going to stand up to special interests."

The Comcast donation came through on Oct. 30, 2009, after the election recount was settled but while Franken was still paying off campaign debt. Comcast had previously given to Franken's opponent, Republican Norm Coleman.

Separately from Comcast's PAC, Franken has received $15,050 from Comcast employees since 2009. His popularity with Comcast's overlords has obviously gone downhill since the recount fund donation, though. He argued yesterday that the Comcast/TWC merger would stifle competition and lead to higher prices and worse service for consumers.

“There’s no doubt that Comcast is a huge, influential corporation, and I understand that there are over 100 lobbyists making the case for this deal to members of Congress and our staffs,” Franken said during the hearing. "But I’ve also heard from over 100,000 consumers who oppose this deal, and I think their voices need to be heard, too.”

Exercising its political rights

Comcast has a website describing its political activity, saying, "Comcast exercises its fundamental right and responsibility to participate in the political process... Political contributions are made from employee-funded political action committees ('PAC') that are sponsored by Comcast. The Comcast PACs are operated by a board of directors, chaired by the Executive Vice President. When permitted by law, political contributions are also made out of corporate funds."

Donations are given through PACs because "corporations are barred from direct contributions at the federal level by law," a Comcast spokesperson told Ars.

With data from OpenSecrets, here is a look at Comcast PAC donations to Senate Judiciary Committee members between 2009 and 2014 (Cruz, Hirono, and Flake were first elected to the Senate in 2012). These numbers are the sum of donations to each senator's campaign committee and leadership PAC (Franken's recount fund is not included).

Comcast PAC donations to Senate Judiciary Committee Democrats

• Chuck Schumer, New York: $35,000
• Patrick Leahy, Vermont, Chairman: $32,500
• Sheldon Whitehouse, Rhode Island: $26,500
• Chris Coons, Delaware: $25,000
• Dick Durbin, Illinois: $23,000
• Amy Klobuchar, Minnesota: $22,500
• Dianne Feinstein, California: $18,500
• Richard Blumenthal, Connecticut: $11,500
• Mazie Hirono, Hawaii: $5,000
• Al Franken, Minnesota: $0

Comcast PAC donations to Republicans

• Orrin Hatch, Utah: $30,000
• Chuck Grassley, Iowa, Ranking Member: $28,500
• John Cornyn, Texas: $21,000
• Lindsey Graham, South Carolina: $13,500
• Jeff Sessions, Alabama: $10,000
• Mike Lee, Utah: $8,500
• Ted Cruz, Texas: $2,500
• Jeff Flake, Arizona: $1,000

Win some, lose some

Comcast's money has been well spent in some cases. Hatch argued yesterday that antitrust laws shouldn't advantage or disadvantage specific competitors.

"Some of my friends here today have never met a merger they've liked," Hatch said. "The markets for both video services and broadband Internet are both dynamic and innovative with new entrants and evolving technologies. Government regulators must be especially careful not to intervene unwisely in such technologically dynamic markets... Five years ago many believed that no one could compete effectively against the Bells. Today some suggest that no one will be able to compete effectively against cable in providing broadband."

Sen. Grassley's opening statement and questions to witnesses suggest that he isn't reflexively for or against the merger.

"Comcast and Time Warner control a significant amount of the cable infrastructure that Americans use to access high-speed Internet," Grassley said. "They control the cable lines that go directly into people’s homes. So there’s a lot of interest in what will happen if the two companies merge. Consumers want to know whether a combined Comcast-Time Warner will be in a better position to expand high-speed Internet access. What will Comcast-Time Warner do to their cable bills? Are prices going to increase? Will they have more content choices? People want to know what this will do to the industry. Will the merger inhibit growth and deployment of broadband services? Will it enhance competition with companies like Dish Network and Google Fiber? What are the downstream effects of the merger? What are the implications of the merger for open access and peering? Consumers care about their options, the quality of their cable access, and the price that they pay."

Grassley also asked Comcast whether it would charge smaller content providers more to be carried on Comcast's TV service after a merger. Comcast Executive VP David Cohen said not to worry. "Over the last decade our programming costs have gone up 98 percent while our cable rates have gone up at basically half that rate," he said. "It shows you the balance of power in the market, where programmers have more power at the negotiating table."

Cohen personally contributed to two Senate Judiciary Committee members last year, sending $1,000 to Cruz and $2,600 to Coons.

Judiciary Committee Chairman Patrick Leahy, one of the top recipients of Comcast cash, criticized the state of competition in the broadband market.

"In 1996, I voted against the Telecommunications Act in part because of concerns I had about the lack of competition in the cable TV market," he said. "Along with many consumers, I continue to be concerned. Similar questions are now being raised about the broadband industry, where consumers feel like they face large bills and inadequate choices."

Leahy also questioned TWC about the golden parachutes that will give tens of millions of dollars to executives if the merger is approved.

Klobuchar, who got $22,500 from Comcast's PAC, questioned the company about data caps, usage-based pricing, and its recent deal to charge Netflix for a connection to its network. "Why charge both Netflix and your consumers for this service?" Klobuchar asked Cohen.

Lee, who received $8,500 from Comcast's PAC, was skeptical of the acquisition, saying the broadband market doesn't have enough competition. Comcast and TWC together would have nearly half of the country's high-speed Internet access customers, he said. (A New York Times story quotes merger critics saying the companies' share of 25Mbps and up service would top 50 percent.) Comcast argues that it would control less than 30 percent of the multi-channel video market after the merger.

Blumenthal, who got $11,500 from Comcast's PAC, said he's worried a merged Comcast and TWC will overcharge rivals for sports programming, given that the two companies own 16 regional sports networks combined.

Comcast employees open their wallets, too

In addition to money from Comcast PACs, OpenSecrets tracks donations from Comcast employees. Here's a look:

Comcast employee donations to Senate Judiciary Committee Democrats:

• Chris Coons, Delaware: $44,200
• Chuck Schumer, New York: $31,600
• Dick Durbin, Illinois: $30,400
• Amy Klobuchar, Minnesota: $23,123
• Sheldon Whitehouse, Rhode Island: $22,331
• Patrick Leahy, Vermont, Chairman: $15,250
• Al Franken, Minnesota: $15,050
• Richard Blumenthal, Connecticut: $14,000
• Dianne Feinstein, California: $6,025
• Mazie Hirono, Hawaii: $4,500

Comcast employee donations to Republicans:

• Orrin Hatch, Utah: $23,750
• Chuck Grassley, Iowa, Ranking Member: $3,000
• Lindsey Graham, South Carolina: $2,000
• Jeff Sessions, Alabama: $0
• John Cornyn, Texas: $0
• Mike Lee, Utah: $0
• Ted Cruz, Texas: $0
• Jeff Flake, Arizona: $0

Employee donations can vary a lot depending on the state, although senators on average receive 51 percent of their donations from outside their states, according to Maplight.org, which also tracks campaign finance data.

Maplight analyzed Comcast's contributions to Congress between 2001 and 2012, finding that the company is one of the biggest corporate donors to members of Congress.

"House members of the Subcommittee on Communications and Technology received $853,525 from Comcast" in that time period, while "Contributions from Comcast to House members serving in the 109th, 110th, 111th and 112th Congresses total $6,678,446," Maplight reported. That subcommittee, which oversees the FCC, held a hearing on the Comcast/NBCUniversal merger in 2010 but hasn't announced plans for a hearing on the TWC deal yet.

The House Judiciary Committee scheduled a hearing on the merger for May 8.

Comcast also spent $18.8 million on lobbying last year.

Congress won't actually decide whether Comcast and TWC get to merge. That will be up to the FCC and Justice Department. Still, "Congressional reaction will tend to influence how the agencies react," wrote Harold Feld, senior VP of consumer advocacy group Public Knowledge.

How the Internet Is Taking Away America’s Religion

Using the Internet can destroy your faith. That’s the conclusion of a study showing that the dramatic drop in religious affiliation in the U.S. since 1990 is closely mirrored by the increase in Internet use.

Back in 1990, about 8 percent of the U.S. population had no religious preference. By 2010, this percentage had more than doubled to 18 percent. That’s a difference of about 25 million people, all of whom have somehow lost their religion.

That raises an obvious question: how come? Why are Americans losing their faith?

Today, we get a possible answer thanks to the work of Allen Downey, a computer scientist at the Olin College of Engineering in Massachusetts, who has analyzed the data in detail. He says the decline is the result of several factors, the most controversial of which is the rise of the Internet. He concludes that the increase in Internet use in the last two decades has caused a significant drop in religious affiliation.

Downey’s data comes from the General Social Survey, a widely respected sociological survey carried out by the University of Chicago that has regularly measured people’s attitudes and demographics since 1972.

In that time, the General Social Survey has asked people questions such as: “what is your religious preference?” and “in what religion were you raised?” It also collects data on each respondent’s age, level of education, socioeconomic group, and so on. And in the Internet era, it has asked how long each person spends online. The total data set that Downey used consists of responses from almost 9,000 people.

Downey’s approach is to determine how the drop in religious affiliation correlates with other elements of the survey such as religious upbringing, socioeconomic status, education, and so on.

He finds that the biggest influence on religious affiliation is religious upbringing—people who are brought up in a religion are more likely to be affiliated to that religion later.

However, the number of people with a religious upbringing has dropped since 1990, and it’s easy to imagine how this leads to a fall in the number who are religious later in life. Downey’s analysis confirms that this is an important factor, but it cannot account for all of the fall or anywhere near it: the data indicates it explains only about 25 percent of the drop.

He goes on to show that college-level education also correlates with the drop. Once again, it’s easy to imagine how contact with a wider group of people at college might contribute to a loss of religion.

The fraction of people receiving a college-level education has increased from 17.4 percent in the 1980s to 27.2 percent in the 2000s, so it’s not surprising that this is reflected in the drop in religious affiliation today. But although the correlation is statistically significant, it can only account for about 5 percent of the drop, so some other factor must also be involved.

That’s where the Internet comes in. In the 1980s, Internet use was essentially zero, but in 2010, 53 percent of the population spent two hours per week online and 25 percent surfed for more than 7 hours.

This increase closely matches the decrease in religious affiliation. In fact, Downey calculates that it can account for about 25 percent of the drop.

That’s a fascinating result. It implies that since 1990, the increase in Internet use has had as powerful an influence on religious affiliation as the drop in religious upbringing.

At this point, it’s worth spending a little time talking about the nature of these conclusions. What Downey has found is correlations and any statistician will tell you that correlations do not imply causation. If A is correlated with B, there can be several possible explanations. A might cause B, B might cause A, or some other factor might cause both A and B.
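The third possibility — a hidden factor driving both A and B — is easy to demonstrate with a small simulation. The sketch below is purely illustrative and is not taken from Downey's study; the variables and numbers are invented. It generates two variables that are both driven by an unobserved third factor: they correlate strongly even though neither causes the other, and the correlation all but disappears once the confounder is controlled for.

```python
import random
import statistics

def pearson(xs, ys):
    """Sample Pearson correlation coefficient."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def residuals(ys, xs):
    """What's left of ys after removing its linear dependence on xs."""
    beta = pearson(xs, ys) * statistics.stdev(ys) / statistics.stdev(xs)
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    return [y - (my + beta * (x - mx)) for x, y in zip(xs, ys)]

random.seed(0)
n = 10_000
# Hidden confounder z drives both a and b; a has no direct effect on b.
z = [random.gauss(0, 1) for _ in range(n)]
a = [zi + random.gauss(0, 0.5) for zi in z]
b = [zi + random.gauss(0, 0.5) for zi in z]

# Strong, but entirely spurious, correlation between a and b.
print(f"corr(a, b)   = {pearson(a, b):.2f}")

# "Controlling" for z: correlate the residuals of a and b after
# stripping out each variable's linear dependence on z.
partial = pearson(residuals(a, z), residuals(b, z))
print(f"partial corr = {partial:.2f}")  # near zero
```

This is the logic behind Downey's controls: if a candidate confounder (income, education, and so on) were responsible for the Internet/disaffiliation link, the association should shrink toward zero once that variable is held fixed — and in his data, it doesn't.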

But that does not mean that it is impossible to draw conclusions from correlations, only that they must be properly guarded. “Correlation does provide evidence in favor of causation, especially when we can eliminate alternative explanations or have reason to believe that they are less likely,” says Downey.

For example, it’s easy to imagine that a religious upbringing causes religious affiliation later in life. However, it’s impossible for the correlation to work the other way round. Religious affiliation later in life cannot cause a religious upbringing (although it may color a person’s view of their upbringing).

It’s also straightforward to imagine how spending time on the Internet can lead to religious disaffiliation. “For people living in homogeneous communities, the Internet provides opportunities to find information about people of other religions (and none), and to interact with them personally,” says Downey. “Conversely, it is harder (but not impossible) to imagine plausible reasons why disaffiliation might cause increased Internet use.”

There is another possibility, of course: that a third unidentified factor causes both increased Internet use and religious disaffiliation. But Downey discounts this possibility. “We have controlled for most of the obvious candidates, including income, education, socioeconomic status, and rural/urban environments,” he says.

If this third factor exists, it must have specific characteristics. It would have to be something new that was increasing in prevalence during the 1990s and 2000s, just like the Internet. “It is hard to imagine what that factor might be,” says Downey.

That leaves him in little doubt that his conclusion is reasonable. “Internet use decreases the chance of religious affiliation,” he says.

But there is something else going on here too. Downey has found three factors—the drop in religious upbringing, the increase in college-level education and the increase in Internet use—that together explain about 50 percent of the drop in religious affiliation.

But what of the other 50 percent? In the data, the only factor that correlates with this is date of birth—people born later are less likely to have a religious affiliation. But as Downey points out, year of birth cannot be a causal factor. “So about half of the observed change remains unexplained,” he says.

So that leaves us with a mystery. The drop in religious upbringing and the increase in Internet use seem to be causing people to lose their faith. But something else about modern life that is not captured in this data is having an even bigger impact.

Aftermarketfailure: Windows XP's End of Support

Andrew Tutt
Yale University - Information Society Project; Yale University - Law School

April 6, 2014

112 Mich. L. Rev. First Impressions 109 (2014)

After 12 years, support for Windows XP will end on April 8, 2014. Microsoft Windows XP’s end of support, combined with a collective action failure stemming from individual users’ failure to realize or internalize the costs of failing to migrate or upgrade their operating systems, could be catastrophic. The essay briefly sketches out the argument for why software monopolists should be legally required to help other companies provide ongoing support for their products. First, it describes the conceptual and economic theories that would support such a requirement. Second, it describes the conflicting law governing the intersection between intellectual property and antitrust. Third, it exhorts Microsoft to extend the support clock, release its source code, or make clear to the world that should anyone else wish to take on the task of providing future security support for Windows XP, Microsoft will help them do so.

Keywords: intellectual property, property, competition law, antitrust, software, computers, copyright, patent

Windows XP Holdouts: Meet the Diehard Faithful Who Refuse to Move On
Ian Paul

After more than 12 years, Windows XP breathes its last gasp on Tuesday, April 8, when Microsoft will issue the final security update for the aging OS.

Nearly every longtime Windows user looks back on Windows XP with a certain fondness, but the party’s over, at least according to Microsoft. “It’s time to move on,” says Tom Murphy, Microsoft’s director of communications for Windows. “XP was designed for a different era.”

Despite Microsoft’s urgings, which began in earnest nearly two-and-a-half years ago, a sizable portion of the world’s PC users are still actively using Windows XP. During March 2014, close to 30 percent of all Internet-connected PCs worldwide were running XP, according to Net Market Share. Only Windows 7 surpassed XP in PC usage.

There's no doubt about it: Many, many people around the world refuse to give up on XP anytime soon. But why? What’s so great about an operating system that was invented before the age of Dropbox and Facebook, an OS that's almost as old as the original Google search engine?

"All the 'experts' say I am crazy. Thing is, I stopped the security updates in XP years ago and yet I have never been infected... So, crazy though I be, I am sticking with XP."

After talking to a number of current XP users, we've reached one major conclusion: For many of them, PCs aren’t snazzy tech gadgets, but home appliances that still work just fine. Beyond that there’s suspicion toward Windows 8, migration hassles and costs, personal preference, and a heavy dose of skepticism about the fundamental insecurity of Windows XP.

Who's afraid of the big, bad malware?

When you ask Microsoft why it's urging users to give up Windows XP for a Windows 7 or Windows 8.1 PC, it all comes down to security. “XP was launched in 2001, which means the design and engineering started in the 90s,” Murphy says. “At that time, the types of threats and risks you found online were really different and a lot less sophisticated than what we see now. Windows 7 and 8.1 start with security in mind, they are created and designed to be inherently more secure. XP predates all that work, because these threats simply didn't exist.”

But many XP users aren’t buying it.

“They built an awful lot of bomb shelters back in the 50's with the same kind of mindset,” says Dallas-based Pix Smith, a puppeteer and magician who uses an XP PC for online research and small-business bookkeeping. Smith acknowledges that an attack against his PC is possible, but he argues that when it comes to malware, the odds are in his favor.

“As with most of those things, the number of people affected versus the total number of users is a really, really, low percentage, if you are relatively prudent,” Smith said. “And I like to think that I'm a relatively prudent user. I don't do a lot of things that would open me up to malware.”

Windows XP users can rely on smart online behavior and tools such as Mozilla's plugin check page to ensure they stay safe beyond April 8.

Others offered similar sentiments. Bob Appel, a retiree based in Toronto, says he uses 12 PCs in a personal Dropbox-like network—10 of which are running XP.

“I use a third-party firewall, a free virus checker, and run Housecall periodically,” Appel told PCWorld via email. “My Firefox browser uses Keyscrambler, HTTPS Anywhere, Ghostery, and Disconnect. I also have a VPN account (PIA) when traveling. For suspicious email attachments, I deploy private proprietary bioware (me!) to analyze before opening. All the 'experts' say I am crazy. Thing is, I stopped the security updates in XP years ago after a bad update trashed my system, and yet I have never been infected, although online for hours each day. So, crazy though I be, I am sticking with XP.”

“So tell me about the people who have never used Microsoft Update and are still running a virgin copy of any Windows OS and have never been infected,” writes Mike Merritt, who uses an XP PC to run his online business in rural Ontario. “Tell me about the number of times that your antivirus program honestly finds a virus trying to get in...Get real! Fear mongering.”

Nevertheless, Microsoft isn’t the only company warning about the inherent dangers of XP after April 8. Security firm Avast says XP is already under attack six times more often than Windows 7. F-Secure, meanwhile, argues that an attack against an as yet unknown flaw in Windows XP is inevitable.

Migration pains

While security is top of mind for Microsoft, users have other concerns, such as the pain of moving their data to a new PC, configuring applications, and, in many cases, getting used to new programs.

For some users, saying goodbye to Windows XP would also mean saying goodbye to a beloved program.

Merritt cites Outlook Express as one of his major reasons for sticking with XP. The once-popular email client isn’t available with Windows 7 or 8.1, and for Merritt, alternatives such as Thunderbird or webmail clients like Outlook.com are a non-starter.

“I live and work in a remote area and am limited to dial-up Internet connection,” Merritt said. “Webmails have a slower load time than a desktop app like Outlook Express and they would have their own learning curve and modification to my current workflow.”

"The upgrade path for me would require replacing a bunch of things that work just fine as far as I'm concerned,” says Smith, who runs a number of older programs that still do everything he needs, such as WordPerfect Office X3 for document creation and editing.

Works just fine for me

In fact, the mantra that “XP does everything I need” was a common refrain during our discussions with users. Juan Barbosa, a salesman based in Puerto Rico, recently advised his parents to stick with XP. “My parents just use Facebook, YouTube, Hotmail [now Outlook.com], and read online versions of BBC, CNN, and local newspapers. No need to upgrade,” Barbosa told us via email.

Like others we spoke to, Barbosa also isn’t concerned about security. “[My parents’] favorites tab holds the 14 sites that they visit most. They have antivirus and anti-malware software, and I always advise them to be careful online. I know about the security Vista, Win 7, and Win 8.1 afford, but they are happily accustomed to Win XP and I feel as long as they don’t stray from the secure path I laid there is not much to worry about.”

Microsoft’s Murphy disagrees with that argument, however. “Even if you're only doing email or using social networks, that's personal data on your machine that's potentially exposed,” Murphy told us. “If you're surfing the web and doing email, you're also probably buying things online. Given the risk and threats there are online today, I don't think it's very wise to continue using your Windows XP machine.”

The elephants in the room: Money and Windows 8

For some, sticking with XP simply comes down to the cost. Sam Allen, a student based in Lincoln, UK, says he doesn’t want to move to Windows 8.1, and the cost of a Windows 7 PC is just too high. “They do not do student packages for Windows 7 anymore,” Allen told PCWorld via email. “I am also reluctant to use Windows 8 as I have heard many negative reviews of it thus far.”

Windows 8's drastically overhauled interface worries some Windows XP users.

When asked about his opinion of Windows 8, Allen points to the Start screen and how it affects the traditional desktop. “It feels like they have tried to fix something that did not need to be fixed,” Allen said.

“I don't have Windows 8 on any computer and hope that I never will,” Merritt said. “Much of my efficiency in work and play requires that I be able to switch between apps promptly and cut-paste-copy as needed. I'm used to my Taskbar and the order of things on it.”

Murphy acknowledges that there is a learning curve when moving from an OS like XP to Windows 8.1; however, he said, it’s not as dramatic as, say, switching from a 2001-era cell phone to a modern smartphone. “If you don't have a touch device, then spending a couple of minutes where you learn how to boot to desktop and how to use the Windows key [in Windows 8.1], you end up way more productive [than in XP].”

Microsoft is also taking steps to make Windows 8.1 more appealing to users reluctant to try out the dual-interface OS.

Microsoft plans to bring the Start menu back to Windows 8.1, as well as the ability to run Metro apps in desktop windows, though the update will likely roll out far after Windows XP's expiration date.

The same day that Windows XP reaches its end of support on April 8, Microsoft will roll out a major update to Windows 8.1 that will make it easier for traditional desktop users to interact with touch-friendly modern UI apps. The company also recently announced that the Start menu will return to Windows sometime in the coming months.

Once the Windows 8.1 Start menu returns, Microsoft may be able to convince some more XP users to switch. But for many, the decision to stick with XP clearly goes deeper than the presence or absence of a Start menu. And that may not change, regardless of Microsoft's efforts.

Still using XP? Be sure to read PCWorld's guide to keeping your Windows XP PC secure in a world without security patches.

Microsoft Partners Lenovo, Tencent to Offer XP Tech Support in China

Microsoft Corp has partnered Lenovo Group Ltd and Tencent Holdings Ltd to provide software security services for Windows XP users in China, after the U.S. tech firm stopped updating the operating system.

Microsoft wants users to move to later, more secure versions of Windows and so stopped servicing the 13-year-old XP this week, potentially leaving users vulnerable to viruses and hacking.

XP has 200 million users in China, or 70 percent of the market, according to Zhongguancun Online, cited by state news agency Xinhua. Upgrading could be expensive as computers running XP might not be powerful enough for newer versions of Windows.

To continue support for XP users, Microsoft has partnered Lenovo, Tencent and several other Chinese computing companies to offer services such as information protection, post-virus repairs and upgrades to the newer Windows 7 or 8.

"For domestic users who continue to use Windows XP before upgrading to a new operating system, we have made it a priority to provide safety protections," Microsoft said in an email to Reuters.

Tencent, in a statement to Reuters, said it will provide permanent XP support free of charge, and that it has set up two 24-hour hotlines. Lenovo declined to comment.

Among other partners, Qihoo 360 Technology Co will offer security support and, for 299 yuan ($48.25), help users transition to newer versions.

"Qihoo 360 will continue to provide Windows XP support to Chinese users as long as there are still XP users in China," Alex Xu, Qihoo 360 co-chief financial officer, told Reuters.

Encouraging users to upgrade could also reduce the number of computers running pirated Windows XP software. Former CEO Steve Ballmer reportedly told employees in 2011 that, because of piracy, Microsoft earned less revenue in China than in the Netherlands even though Chinese computer sales matched those of the U.S.

But the cost of upgrading has irked some users. Added to any transition fee is the cost of the software - at least 888 yuan ($140) for Windows 8 - plus the cost of any hardware upgrade.

"Many of my clients only use their computers for email," said a computer shop owner who identified himself by the surname Niu. "There is no use for them to buy a new computer."

($1 = 6.1968 Chinese Yuan)

(Reporting by Matthew Miller; Additional reporting by Beijing Newsroom; Editing by Christopher Cushing)

Critical Crypto Bug in OpenSSL Opens Two-Thirds of the Web to Eavesdropping

Exploits allow attackers to obtain private keys used to decrypt sensitive data.
Dan Goodin

Researchers have discovered an extremely critical defect in the cryptographic software library an estimated two-thirds of Web servers use to identify themselves to end users and prevent the eavesdropping of passwords, banking credentials, and other sensitive data.

The warning about the bug in OpenSSL coincided with the release of version 1.0.1g of the open-source program, which is the default cryptographic library used in the Apache and nginx Web server applications, as well as a wide variety of operating systems and e-mail and instant-messaging clients. The bug, which has resided in production versions of OpenSSL for more than two years, could make it possible for people to recover the private encryption key at the heart of the digital certificates used to authenticate Internet servers and to encrypt data traveling between them and end users. Attacks leave no traces in server logs, so there's no way of knowing if the bug has been actively exploited. Still, the risk is extraordinary, given the ability to disclose keys, passwords, and other credentials that could be used in future compromises.

"Bugs in single software or library come and go and are fixed by new versions," the researchers who discovered the vulnerability wrote in a blog post published Monday. "However this bug has left a large amount of private keys and other secrets exposed to the Internet. Considering the long exposure, ease of exploitations and attacks leaving no trace this exposure should be taken seriously."

The researchers, who work at Google and software security firm Codenomicon, said even after vulnerable websites install the OpenSSL patch, they may still remain vulnerable to attacks. The risk stems from the possibility that attackers already exploited the vulnerability to recover the private key of the digital certificate, passwords used to administer the sites, or authentication cookies and similar credentials used to validate users to restricted parts of a website. Fully recovering from the two-year-long vulnerability may also require revoking any exposed keys, reissuing new keys, and invalidating all session keys and session cookies. Members of the Tor anonymity project have a brief write-up of the bug here, and this analysis provides useful technical details.

OpenSSL is by far the Internet's most popular open-source cryptographic library and TLS implementation. It is the default encryption engine for Apache and nginx, which together, according to Netcraft, run 66 percent of websites. OpenSSL also ships in a wide variety of operating systems and applications, including the Debian Wheezy, Ubuntu, CentOS, Fedora, and OpenSUSE Linux distributions, as well as OpenBSD and FreeBSD. The missing bounds check in the handling of the Transport Layer Security (TLS) heartbeat extension affects OpenSSL 1.0.1 through 1.0.1f.
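
The affected range named above (1.0.1 through 1.0.1f) can be checked mechanically. A small illustrative Python helper, sketched under the assumption that the version string follows OpenSSL's usual "1.0.1" plus letter-suffix scheme (compare against the output of `openssl version`):

```python
def openssl_is_heartbleed_vulnerable(version: str) -> bool:
    """Return True for OpenSSL 1.0.1 through 1.0.1f (CVE-2014-0160).

    1.0.1g shipped the fix; the 1.0.0 and 0.9.8 branches never had
    the heartbeat bug at all.
    """
    if not version.startswith("1.0.1"):
        return False
    suffix = version[len("1.0.1"):]   # "", "a", "b", ... "f", "g", ...
    return suffix == "" or suffix <= "f"

# openssl_is_heartbleed_vulnerable("1.0.1f") -> True
# openssl_is_heartbleed_vulnerable("1.0.1g") -> False
```

Note that a patched build may still report an old version string if a distribution backported the fix, so the version number alone is a first approximation, not proof either way.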

The bug, which is officially referenced as CVE-2014-0160, makes it possible for attackers to recover up to 64 kilobytes of memory from the server or client computer running a vulnerable OpenSSL version. Nick Sullivan, a systems engineer at CloudFlare, a content delivery network that patched the OpenSSL vulnerability last week, said his company is still evaluating the likelihood that private keys appeared in memory and were recovered by attackers who knew how to exploit the flaw before the disclosure. Based on the results of the assessment, the company may decide to replace its underlying TLS certificate or take other actions, he said.
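
At its core the bug is a missing bounds check: the server echoes a heartbeat payload back using the length the client *claims* rather than the length it actually sent. The real code is C inside OpenSSL; the toy Python model below only illustrates that logic, and the simulated memory layout and names are invented for illustration:

```python
# Simulated process memory: the 5-byte heartbeat payload "HELLO"
# happens to sit right next to sensitive data, much as the real
# payload sat near other allocations in OpenSSL's heap.
memory = b"HELLO" + b"PRIVATE-KEY-AND-PASSWORDS..."
payload_len_actual = 5

def heartbeat_vulnerable(claimed_len: int) -> bytes:
    # Bug: echo back `claimed_len` bytes, trusting the attacker's
    # length field and spilling adjacent memory past the payload.
    return memory[:claimed_len]

def heartbeat_patched(claimed_len: int) -> bytes:
    # Fix (as in OpenSSL 1.0.1g): silently discard heartbeats whose
    # claimed length exceeds the payload actually received.
    if claimed_len > payload_len_actual:
        return b""
    return memory[:claimed_len]

# An attacker claims a 32-byte payload but sends only 5 bytes:
leak = heartbeat_vulnerable(32)   # response includes the "secrets"
safe = heartbeat_patched(32)      # response is empty
```

The real flaw allowed up to 64 KB per request, and a request could be repeated indefinitely, letting an attacker sweep through server memory a chunk at a time.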

Attacking from the outside

The researchers who discovered the vulnerability, however, were less optimistic about the risks, saying the bug makes it possible for attackers to surreptitiously bypass virtually all TLS protections and to retrieve sensitive data residing in the memory of computers or servers running OpenSSL-powered software.

"We attacked ourselves from outside, without leaving a trace," they wrote. "Without using any privileged information or credentials we were able steal from ourselves the secret keys used for our X.509 certificates, user names and passwords, instant messages, emails and business critical documents and communication."

They called on white-hat hackers to set up "honeypots" of vulnerable TLS servers designed to entrap attackers in an attempt to see if the bug is being actively exploited in the wild. The researchers have dubbed the vulnerability Heartbleed because the underlying bug resides in the OpenSSL implementation of the TLS heartbeat extension as described in RFC 6520 of the Internet Engineering Task Force.

The OpenSSL vulnerability is the latest to threaten the HTTPS scheme that's the default and often only method for cryptographically protecting websites, e-mail, and other Internet communications from attacks that allow hackers to eavesdrop on end users or impersonate trusted websites. Last month, developers of the GnuTLS library disclosed an equally catastrophic bug that left hundreds of open-source applications open to similar attacks. And in February, Apple fixed an extremely critical vulnerability in the iOS and OS X operating systems that also made it possible for hackers to bypass HTTPS protections.

Heartbleed Security Flaw Emphasizes the Need to Change Passwords
Steve Lohr

Wait a day or so. Then change the passwords on the web services you use.

That is probably the best advice for web users unnerved by reports of a potential vulnerability for email and other online accounts because of the security flaw called Heartbleed.

Immediately changing passwords could feed a new password into a website that has not fixed the flaw, according to Mark Seiden, an independent computer security consultant. For websites, the fix-it process involves installing software patches on the computers in their data centers, then swapping out the confidential software key used to secure messages and transactions. The private key essentially shakes hands, digitally, with a public key. When they make an authenticated handshake — the signal of trust — the encrypted information is sent on its way. Swapping out the old private key for a new one is an extra step of caution, just in case the software flaw allowed cyberthieves to pilfer the private key.

“There’s nothing users can do until the web services have made their sites secure,” Mr. Seiden said.

Users will largely need to depend on individual sites to notify them about whether the flaw has been addressed. Many major web services, like Yahoo, have already released such notices.

The Heartbleed scare, even if it doesn’t turn out to hurt many consumers, is a reminder of the importance of password hygiene. Changing passwords occasionally is a good idea, as is using a different password for each site. If passwords are lost because of a security breach at a company, identity thieves have a far greater opportunity for mischief.

To vary passwords, Mr. Seiden suggests choosing a formula that is a variation on a theme. Pick out a core password of a mixture of six letters and numbers that are not a word.

Then, your passwords become variations on that core, which is reused. For example, Mr. Seiden said, you pick the second and third letter of a service, to avoid being obvious. If the service is Yahoo, the letters are “a” and “h.” Those are added at the front or back of your core password, or one letter at the front and the other at the back. Next time, perhaps, you choose the letter below those two on a computer keyboard.

“This is a good time to review your password practices in general,” Mr. Seiden said. “Any kind of formula can help. It’s what I use.”
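
Seiden's formula is mechanical enough to sketch in a few lines of Python. The core value and the front/back placement below are illustrative assumptions; the point is that each site's password is derived from the reused core rather than memorized separately:

```python
def site_password(core: str, service: str) -> str:
    # Per Seiden's example, take the second and third letters of the
    # service name (for "Yahoo", that's "a" and "h") to avoid the
    # obvious choice of the first letters.
    second, third = service[1].lower(), service[2].lower()
    # One possible placement: one letter in front, one at the back.
    return second + core + third

# With a (hypothetical) core of "x7Kq2m":
# site_password("x7Kq2m", "Yahoo") -> "ax7Kq2mh"
```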

NSA Said to Exploit Heartbleed Bug for Intelligence for Years
Michael Riley

The U.S. National Security Agency knew for at least two years about a flaw in the way that many websites send sensitive information, now dubbed the Heartbleed bug, and regularly used it to gather critical intelligence, two people familiar with the matter said.

The NSA’s decision to keep the bug secret in pursuit of national security interests threatens to renew the rancorous debate over the role of the government’s top computer experts.

Heartbleed appears to be one of the biggest glitches in the Internet’s history, a flaw in the basic security of as many as two-thirds of the world’s websites. Its discovery and the creation of a fix by researchers five days ago prompted consumers to change their passwords, the Canadian government to suspend electronic tax filing and computer companies from Cisco Systems Inc. to Juniper Networks Inc. to provide patches for their systems.

Heartbleed Fallout Continues: Canada Orders Shutdown of More Gov't Sites
Lorenzo Franceschi-Bicchierai

After shutting down its online tax-filing services on Tuesday, Canada is taking further precautionary measures against the Heartbleed Internet security bug. The country is ordering various government agencies to temporarily shut down affected sites.

The chief information officer for the Canadian government sent a directive to all government departments late on Thursday to "immediately disable public websites that are running unpatched OpenSSL software," according to a statement by the President of Canada's Treasury Board.

"This action is being taken as a precautionary measure until the appropriate security patches are in place and tested," the statement read.

It's unclear which websites will be disabled, as the statement only refers to "certain Government of Canada websites." The Treasury Board said it was a necessary step to ensure Canadians' privacy.

"We understand that this will be disruptive, but, under the circumstances, this is the best course of action to protect the privacy of Canadians," the statement read.

Heartbleed is a bug in the popular open-source encryption software OpenSSL, used to secure data flowing from users' computers to hundreds of thousands of websites. According to estimates, almost two-thirds of all sites use OpenSSL, making this bug one of the most dangerous the Internet has ever seen.

OpenSSL has contained this bug for two years, but security researchers uncovered it only last week. Details were first published on Monday, prompting many sites to rush to fix their servers. The bug potentially allows hackers to access all kinds of sensitive data, such as passwords or credit card numbers on websites that are supposedly secure. However, there is no proof that hackers discovered the flaw and exploited it before the researchers uncovered it.

Many websites patched the problem quickly, while others are still fixing their systems. Many, including Facebook, Instagram and Pinterest have asked users to change their passwords. Mashable is continuously updating a list of popular social media, banking, commerce and other sites that were vulnerable.

The full statement from Canada's Treasury Board is below:

Earlier this week, an Internet security vulnerability named the Heartbleed bug, caused by a flaw in OpenSSL software was detected. OpenSSL is a commonly used software used on the Internet to provide security and privacy.

The Heartbleed bug is affecting many global IT systems in both private and public sector organizations and has the potential to expose private data.

This evening the Chief Information Officer for the Government of Canada issued a directive to all federal government departments to immediately disable public websites that are running unpatched OpenSSL software. This action is being taken as a precautionary measure until the appropriate security patches are in place and tested.

As a result, Canadians will be unable to access certain Government of Canada websites while measures are being applied.

We understand that this will be disruptive, but, under the circumstances, this is the best course of action to protect the privacy of Canadians.


Hackers Lurking in Vents and Soda Machines
Nicole Perlroth

They came in through the Chinese takeout menu.

Unable to breach the computer network at a big oil company, hackers infected with malware the online menu of a Chinese restaurant that was popular with employees. When the workers browsed the menu, they inadvertently downloaded code that gave the attackers a foothold in the business’s vast computer network.

Security experts summoned to fix the problem were not allowed to disclose the details of the breach, but the lesson from the incident was clear: Companies scrambling to seal up their systems from hackers and government snoops are having to look in the unlikeliest of places for vulnerabilities.

Hackers in the recent Target payment card breach gained access to the retailer’s records through its heating and cooling system. In other cases, hackers have used printers, thermostats and videoconferencing equipment.

Companies have always needed to be diligent in keeping ahead of hackers — email and leaky employee devices are an old problem — but the situation has grown increasingly complex and urgent as countless third parties are granted remote access to corporate systems. This access comes through software controlling all kinds of services a company needs: heating, ventilation and air-conditioning; billing, expense and human-resources management systems; graphics and data analytics functions; health insurance providers; and even vending machines.

Break into one system, and you have a chance to break into them all.

“We constantly run into situations where outside service providers connected remotely have the keys to the castle,” said Vincent Berk, chief executive of FlowTraq, a network security firm.

Data on the percentage of cyberattacks that can be tied to a leaky third party is difficult to come by, in large part because victims’ lawyers will find any reason not to disclose a breach. But a survey of more than 3,500 global I.T. and cybersecurity practitioners conducted by a security research firm, the Ponemon Institute, last year found that roughly a quarter — 23 percent — of breaches were attributable to third-party negligence.

Security experts say that figure is low. Arabella Hallawell, vice president of strategy at Arbor Networks, a network security firm in Burlington, Mass., estimated that third-party suppliers were involved in some 70 percent of breaches her company reviewed.

“It’s generally suppliers you would never suspect,” Ms. Hallawell said.

The breach through the Chinese menu — known as a watering hole attack, the online equivalent of a predator lurking by a watering hole and pouncing on its thirsty prey — was extreme. But security researchers say that in most cases, attackers hardly need to go to such lengths when the management software of all sorts of devices connects directly to corporate networks. Heating and cooling providers can now monitor and adjust office temperatures remotely, and vending machine suppliers can see when their clients are out of Diet Cokes and Cheetos. Those vendors often don’t have the same security standards as their clients, but for business reasons they are allowed behind the firewall that protects a network.

Security experts say vendors are tempting targets for hackers because they tend to run older systems, like Microsoft’s Windows XP software. Also, security experts say these seemingly innocuous devices — videoconference equipment, thermostats, vending machines and printers — often are delivered with the security settings switched off by default. Once hackers have found a way in, the devices offer them a place to hide in plain sight.

“The beauty is no one is looking there,” said George Kurtz, the chief executive of Crowdstrike, a security firm. “So it’s very easy for the adversary to hide in these places.”

Last year, security researchers found a way into Google’s headquarters in Sydney, Australia, and Sydney’s North Shore Private hospital — and its ventilation, lighting, elevators and even video cameras — through their building management vendor. More recently, the same researchers found they could breach the circuit breakers of one Sochi Olympic arena through its heating and cooling supplier.

Fortunately, the researchers were merely testing for flaws that could have been exploited by real hackers.

Billy Rios, director of threat intelligence at Qualys, a security firm, was one of those researchers. He said it was increasingly common for corporations to set up their networks sloppily, with their air-conditioning systems connected to the same network that leads to databases containing sensitive material like proprietary source code or customer credit cards.

“Your air-conditioning system should never talk to your H.R. database, but nobody ever talks about that for some reason,” Mr. Rios said.

The Ponemon survey last year found that in 28 percent of malicious attacks, respondents could not find the source of the breach. Ms. Hallawell compared the process of finding the source of a breach to “finding a needle in a haystack.”

Ideally, security experts say, corporations should set up their networks so that access to sensitive data is sealed off from third-party systems and remotely monitored with advanced passwords and technology that can identify anomalous traffic — like someone with access to an air-conditioning monitoring system trying to get into an employee database.
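
The monitoring rule described above boils down to an allowlist over (source, destination) pairs, with any connection outside the policy flagged for review. A toy Python sketch, with invented host names and an invented policy:

```python
# Hypothetical segmentation policy: which internal systems each
# third-party device is permitted to reach. Anything else is anomalous.
ALLOWED = {
    "hvac-monitor": {"hvac-controller"},
    "vending-gateway": {"inventory-db"},
}

def flag_anomalies(flows):
    """Given (source, destination) flow records, return the
    connections that violate the segmentation policy."""
    return [(src, dst) for src, dst in flows
            if dst not in ALLOWED.get(src, set())]

flows = [
    ("hvac-monitor", "hvac-controller"),   # expected traffic
    ("hvac-monitor", "hr-database"),       # the Target-style red flag
]
# flag_anomalies(flows) -> [("hvac-monitor", "hr-database")]
```

Real network monitoring works on IP flows rather than friendly host names, but the principle is the same: the alert fires on the pairing, not on the volume of traffic.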

But even then, companies require security personnel with experience in detecting such attacks. Even though Target used security technology supplied by FireEye, a company that sounds alerts when it identifies such anomalous activity, its I.T. personnel ignored the red flags, according to several people who confirmed the findings of a Bloomberg Businessweek investigation last month but could not speak publicly about Target’s continuing internal investigation.

Like all else, security experts say, it’s simply a matter of priorities. One Arbor Networks study found that unlike banks, which spend up to 12 percent of their information technology budgets on security, retailers spend, on average, less than 5 percent of their budget on security. The bulk of that I.T. spending goes to customer marketing and data analytics.

“When you know you’re the target and you don’t know when, where or how an attack will take place, it’s wartime all the time,” Ms. Hallawell said. “And most organizations aren’t prepared for wartime.”

"Unbreakable" Encryption Almost Certainly Isn't

This headline is provocative: "Human biology inspires 'unbreakable' encryption."

The article is similarly nonsensical:

Researchers at Lancaster University, UK have taken a hint from the way the human lungs and heart constantly communicate with each other, to devise an innovative, highly flexible encryption algorithm that they claim can't be broken using the traditional methods of cyberattack.

Information can be encrypted with an array of different algorithms, but the question of which method is the most secure is far from trivial. Such algorithms need a "key" to encrypt and decrypt information; the algorithms typically generate their keys using a well-known set of rules that can only admit a very large, but nonetheless finite number of possible keys. This means that in principle, given enough time and computing power, prying eyes can always break the code eventually.

The researchers, led by Dr. Tomislav Stankovski, created an encryption mechanism that can generate a truly unlimited number of keys, which they say vastly increases the security of the communication. To do so, they took inspiration from the anatomy of the human body.

Regularly, someone from outside cryptography -- who has no idea how crypto works -- pops up and says "hey, I can solve their problems." Invariably, they make some trivial encryption scheme because they don't know better.

Remember: anyone can create a cryptosystem that he himself cannot break. And this advice from 15 years ago is still relevant.

In a Prying World, News Organizations Are Struggling to Encrypt Their Online Products
Craig Timberg

The old-fashioned newspaper, long maligned for its stodginess and sagging profits, has one advantage over high-tech alternatives: You read it. It never reads you.

The digital sources that increasingly dominate our news consumption, by contrast, transmit information across the fundamentally public sphere of the Internet, leaving trails visible to anyone with the right monitoring tools — be it your employer, your Internet provider, your government or even the scruffy hacker sitting next to you at the coffee shop, sharing the WiFi signal.

This is why privacy advocates have begun pushing news organizations, including The Washington Post, the New York Times and the Guardian, to encrypt their Web sites, as many technology companies increasingly do for e-mails, video chats and search queries.

The growing use of encryption — signaled by the little lock icon in your browser’s address box — has emerged as perhaps the most concrete response to Edward Snowden’s revelations about the ability of the National Security Agency to collect almost anything that exists in digital form, including the locations, communications and online activities of people worldwide.

It’s only fair, say privacy advocates, that The Post and other news organizations that broke these stories heed their key lesson: Online surveillance is pervasive and voracious, especially when data is unprotected.

Among the issues potentially illuminated by what you choose to read, advocates say, are your health concerns, financial anxieties, sexual orientation and political leanings. A single article might mean little, but Big Data companies constantly collect and crunch a broad range of personal information to produce profiles of each of us.

“You could paint a pretty detailed picture of a person — their likes and dislikes — if you could see the articles they’re reading,” said Trevor Timm, executive director of the Freedom of the Press Foundation, one of several groups pushing for wider use of encryption.

Encryption may seem a stretch as a press freedom issue, far from what concerned the Founding Fathers when they enshrined the First Amendment in the Bill of Rights. Yet a free press operates best when the public can make reading decisions without fear that their government — or anyone capable of doing them harm — is looking over their shoulder.

Encrypting something as complex as a news site is enormously difficult, according to technical experts within the industry. Several major news organizations offered encryption for some elements of their sites in recent years but largely stopped when problems arose in displaying content quickly and cleanly to readers, said Peter Eckersley, technology projects director for the Electronic Frontier Foundation, which tracks the use of the technology.

In an era when news zings across the globe at the speed of light, making encryption work properly across an entire site is a challenge worth undertaking, advocates say. “No one has done it for real,” Eckersley said.

When a Chinese Internet user, for example, tries to follow international coverage of the looming 25th anniversary of the Tiananmen Square protests, that government’s Internet surveillance and censorship system, known as the Great Firewall, will know. Closer to home, British intelligence reportedly has monitored visits to a Web site for WikiLeaks, which while not a traditional news site shares enough similarities to illustrate risks to reader privacy.

Our stuff didn’t always spy on us. But much of it now can: phones, computers, cable boxes, Internet-ready cars and, soon enough, glasses, watches and even household appliances that continually track use over a “smart” electrical grid.

Whenever that information is transmitted over the Internet without encryption, it’s possible for others to see it, collect it and analyze it. The monitoring tools used by employers and universities can see every Web address accessed by a user. Hackers, using free software, can see the sites viewed by anyone sharing an unsecured WiFi signal. Government intelligence agencies such as the NSA monitor Web traffic on a massive scale using equipment wired directly into the fiber-optic cables that form the essential arteries of the Internet.

Journalists have been slow to understand the role we have been playing in the surveillance rising all around us. But the moment newspapers put their work online — as this paper first announced plans to do in 1993, under the now-quaint headline, “Post to Launch Computerized Version of Paper; Service Will Send Information, Ads and Sound Effects to PCs Beginning Next Summer” — readers’ choices became exposed to potential collection and analysis.

It’s clear now that anything that’s potentially useful to anyone is vulnerable on the Internet. And while encryption is not perfect, routine deployment of this technology makes it far more difficult to conduct mass surveillance.

It also complicates the work of censors in China, Vietnam, Iran, Saudi Arabia and elsewhere because requests for articles, when encrypted, appear to anyone monitoring the Internet as a jumble of numbers and letters. Governments can block all the content flowing from encrypted Web sites but can’t choose to allow stories, for example, about President Obama’s latest political drama but not the travels of the Dalai Lama.

“All news Web sites should encrypt their content,” said Martin Johnson, a founder of GreatFire.org, which tracks China’s Great Firewall. (Like others with the group, he uses a pseudonym to evade detection by the government there.) “Not encrypting your content is like saying, ‘We are happy to allow censors around the world to selectively filter our content.’ ”

The prospect of outright censorship is not a small concern for news organizations. The Chinese government has blocked, to varying degrees and for various lengths of time, some of the largest Western news organizations after the publication of unflattering stories about that nation’s leaders and their families. The New York Times and Bloomberg News have been unavailable to online readers in China since 2012 and the Guardian since January, according to GreatFire.org. The Wall Street Journal’s Chinese-language site has been intermittently blocked there as well.

The Times, the Journal, Bloomberg and the Guardian all declined to comment about Web-site encryption.

It’s not just China that seeks to control the spigot of digital news and information. Several nations blocked YouTube in 2012 to keep a controversial film, “Innocence of Muslims,” from being downloaded. Turkey last month blocked YouTube and Twitter to damp the spread of an embarrassing audio recording that seemed to capture leaders discussing possible war with Syria.

There is no way to predict how such nations would respond to a major new move toward encryption. Yet as news organizations battle censorship, privacy issues run on a separate, parallel track. One concerns the ability of people — potential readers — to gain access to journalism. The other concerns the rights of those with access — actual readers — to enjoy whatever they please, privately.

The Intercept, a nonprofit news organization started this year by eBay founder Pierre Omidyar, former Guardian reporter Glenn Greenwald and several other journalists with experience reporting on surveillance, launched its Web site using encryption, providing readers with articles that are much safer from prying eyes. ProPublica, another nonprofit group, offers encryption as an option for readers who know to activate it. Neither relies on ad revenue.

The Post is considering a similar move and has in recent weeks begun experimenting with encryption, said Shailesh Prakash, the company’s chief information officer and vice president for digital product development. If the tests underway are successful and encryption is made easily available throughout the site, The Post could become the first major traditional news organization to protect its users’ privacy in this way.

“I fundamentally believe that this is good for our readers,” Prakash said. “If we can get the experience right for our readers, I feel this is the right thing to do.”

(Disclosure: I advocated this move internally at The Post, with Prakash and others.)

Encryption technology has become less expensive and technically demanding in recent years, in part because computers increasingly have the power to encode and decode Internet traffic rapidly without slowing transmission to a noticeable degree.

Modern browsers can instantly determine, by checking a digital security certificate, the authenticity of a site that offers to make an encrypted connection, typically using a Web address that starts with “https” rather than the more familiar “http.” The “s” is for “secure,” and the Internet traffic that flows on such connections can be read only by the sender and its intended recipient.
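The two rules that certificate check enforces are visible in Python’s standard ssl module, whose default client context behaves like a browser in this respect (a minimal sketch; no network connection is made):

```python
import ssl

# A browser-style TLS client context. Before any page data flows, the
# server must present a certificate that chains to a trusted authority
# and matches the hostname in the "https://" address.
ctx = ssl.create_default_context()

print(ctx.check_hostname)                    # the name must match the certificate
print(ctx.verify_mode == ssl.CERT_REQUIRED)  # servers without valid certificates are rejected
```

Both checks are on by default; a connection that fails either one is refused before any content is exchanged.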

Hackers can still break into individual computers, snatching digital communications before the content is encoded or after it’s decoded. But simply reading everything that flows across the Internet becomes very difficult — even for intelligence agencies, criminals and censors.

Encrypting a Web site as elaborate as The Post’s would require substantial resources, as well as time to work out whatever issues arise.

Most Web sites consist of many different elements — for The Post, articles, ads, videos and much more — that your computer’s browser assembles seamlessly and almost instantaneously on the screen. If some of those elements are encrypted and others are not, browsers will balk.
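The mixed-content condition that makes browsers balk can be sketched with a naive scanner (all URLs below are invented for illustration):

```python
import re

def find_mixed_content(page_url, html):
    """Return insecure http:// subresources embedded in an https:// page."""
    if not page_url.startswith("https://"):
        return []  # mixed content only matters on encrypted pages
    # Check src/href attributes of embedded elements (scripts, images, frames).
    return re.findall(r'(?:src|href)="(http://[^"]+)"', html)

html = ('<img src="https://cdn.example.com/logo.png">'
        '<script src="http://ads.example.com/track.js"></script>')
print(find_mixed_content("https://news.example.com/story", html))
```

A real browser performs this kind of check on the fully assembled page; the insecure script found here is exactly the sort of element that triggers warnings or gets blocked outright.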

Companies that help sites deliver their Web content often charge substantially higher rates to use encryption, and many ad networks don’t support it at all. Prakash said The Post, in its initial testing, has run into snags with ads not displaying properly when technicians try to encrypt Web pages.

Ars Technica, a site for technology news, has been trying to encrypt its traffic for a year but keeps hitting unexpected problems, such as how to handle links to videos and other outside content that is not encrypted.

The site wants to push ahead but may have to accept that it’s not practical to encrypt everything embedded in its pages. It may resort to erecting signposts warning readers when they are moving to an unprotected page, said Jason Marlin, the director of technology for Ars Technica.

“We’d love to lead the way and solve it, and show others how to do it,” Marlin said.

But shifting from an encrypted page to an unencrypted one causes some browsers to issue stark warnings that can alarm readers. And some browsers have begun refusing to display pages that have a mix of encrypted and unencrypted elements — a development that has chilled experimentation by some news sites.

“This is a technically very hard thing to do, and a lot of sites struggle to do it,” said Matthew Prince, chief executive of CloudFlare, a company that delivers encrypted content for Web sites securely and without major new costs. He said that fewer than 3 million of the Web’s 650 million sites use SSL, the most popular form of encryption.

The list of sites that encrypt successfully is weighted toward online retailers and financial institutions, but privacy groups such as the ACLU and the Electronic Frontier Foundation have also encrypted their sites. The EFF has also developed a tool called HTTPS Everywhere — a reference to the encrypted “https” version of Web addresses — that activates encryption for any site that offers it as an option.

Privacy advocates, however, favor encryption by default, meaning that it’s used automatically whenever a browser capable of supporting it logs on. Only the most antiquated browsers, such as Microsoft’s Internet Explorer 6, struggle with encryption.

Web sites that encrypt their traffic may still track their users to gather valuable information. Google is among the world’s leading companies in adopting encryption by default, but it still analyzes its users’ e-mails, search queries and Web browsing for clues to what ads they might respond to. That’s how Google makes its multibillion-dollar profits.

Pretty much any company that gives away content for free, including The Post, analyzes readers’ choices to target ads more effectively and make more money. But for anyone else tempted to look at what you’re reading, encryption provides a rare measure of privacy in our relentlessly wired, surveilled world.

The Feds Cut a Deal With In-Flight Wi-Fi Providers, and Privacy Groups Are Worried
Kim Zetter

Gogo, the inflight Wi-Fi provider, is used by millions of airline passengers each year to stay connected while flying the friendly skies. But if you think the long arm of government surveillance doesn’t have a vertical reach, think again.

Gogo and others that provide Wi-Fi aboard aircraft must follow the same wiretap provisions that require telecoms and terrestrial ISPs to assist U.S. law enforcement and the NSA in tracking users when so ordered. But they may be doing more than the law requires.

According to a letter Gogo submitted to the Federal Communications Commission, the company voluntarily exceeded the requirements of the Communications Assistance for Law Enforcement Act, or CALEA, by adding capabilities to its service at the request of law enforcement. The revelation alarms civil liberties groups, which say companies should not be cutting deals with the government that may enhance the ability to monitor or track users.

“CALEA itself is a massive infringement on users’ rights,” says Peter Eckersley of the Electronic Frontier Foundation. “Having ISPs [now] that say that CALEA isn’t enough, we’re going to be even more intrusive in what we collect on people is, honestly, scandalous.”

Gogo provides inflight Wi-Fi and digital entertainment to Delta, American Airlines, Alaska Airlines, Virgin America, US Airways and others using a dedicated air-to-ground network that Gogo says it designed in consultation with law enforcement.

The disclosure that Gogo voluntarily exceeded the requirements of CALEA appears in a letter to the FCC (.pdf) the company wrote in 2012. “In designing its existing network, Gogo worked closely with law enforcement to incorporate functionalities and protections that would serve public safety and national security interests,” Gogo attorney Karis Hastings wrote.

Although FCC rules “do not require licensees to implement capabilities to support law enforcement beyond those outlined in CALEA…,” Hastings noted, “[n]evertheless, Gogo worked with federal agencies to reach agreement regarding a set of additional capabilities to accommodate law enforcement interests. Gogo then implemented those functionalities into its system design.”

When CALEA became law in 1994, it applied only to telecoms and required them to provide wiretap capabilities for phone calls. But in 2007 the FCC ordered CALEA compliance from broadband and VoIP providers as well, amid pressure from the Justice Department and the FBI. Under CALEA, these communications providers must be able to isolate all wire and electronic communications to and from any account targeted by law enforcement and identify the numbers or accounts with which the target has communicated.

The FCC has considered applying special rules to in-flight Wi-Fi providers. Gogo’s 2012 letter to the FCC was an effort to convince the commission that special mandated rules were unnecessary for in-flight Wi-Fi providers because the companies were willing to work with law enforcement agencies to give them what they want.

“Gogo believes that its experience demonstrates that a flexible approach based on direct negotiation can best ensure that … operators deploy capabilities designed to protect public safety and national security, and that adoption of a specific list of capabilities … is unwarranted,” Hastings wrote.

A Gogo spokesman insists that, despite the letter’s reference to multiple capabilities added by Gogo, the company only added a single capability beyond CALEA, and it has nothing to do with monitoring traffic.

But it apparently is not the only company cutting deals with law enforcement. An FCC notice of proposed rule making (.pdf) published in December notes that Panasonic Avionics negotiated with law enforcement “regarding lawful interception … and network security functionality to be deployed” in the company’s eXConnect system, which provides Wi-Fi to American Airlines and United.

According to the document, Panasonic engaged a CALEA-compliant equipment vendor to implement its intercept capability but was also “implementing additional functionality subject to final agreement with U.S. law enforcement.” The document notes operators “have uniformly engaged in direct consultations with law enforcement to develop appropriate capabilities consistent with their system characteristics and service offerings.”

Chris Soghoian of the American Civil Liberties Union, who first spotted the reference to expanded capabilities in the FCC documents, says law enforcement often leverages FCC threats of added rules to pressure companies into making concessions.

“I don’t think people understand the extent to which the FCC acts as the enforcer for the surveillance community,” he says. “The Gogo document and Panasonic documents really reflect this process of these companies sitting down with the government and making deals so the FCC wouldn’t get on their back. These are not agreements that are taking place in the sunlight. These are secret deals that are definitely not being made in the best interest of the public.”

Panasonic Avionics did not respond to a call for comment. A Gogo spokesman, when initially asked about the FCC documents by Pando Daily, declined to identify what additional capabilities Gogo implemented.

“What we are prepared to say is: Gogo does what all airborne connectivity companies have been asked to do from a security perspective, and it has nothing to do with monitoring traffic. Beyond that, we can’t comment beyond what’s in our public comments with the FCC,” spokesman Steve Nolan told Pando Daily.

But in a phone call with WIRED, Nolan said the company made just one concession to law enforcement beyond its CALEA requirements: adding a CAPTCHA feature to “prevent people from remotely accessing the system.” That would seem to contradict the FCC letter, which specifically says that Gogo implemented “a set of additional capabilities” beyond CALEA. In a follow-up email, Nolan suggested there was more than one concession.

“Beyond adhering to CALEA, our primary concession to law enforcement is the use of CAPTCHA to access the system,” he wrote. Asked to clarify the disparity in his statements, he wrote that the “secondary concessions are all the CALEA requirements we adhere to.”

CAPTCHA displays a string of numbers or a word that users must enter to use the service. It is generally used to prevent automated bots from using online services, but Nolan said Gogo added it as a security feature to keep remote users out of the network. Soghoian doesn’t buy that.

“That doesn’t make any sense,” he says. “You can only access [the network] from the airplane. The Wi-Fi only works when you’re above a certain number of feet…. If that’s all the government wanted, why not be up front with that in the beginning? Initially they said there were things that were done, but they couldn’t describe them. [The new statement] suggests there’s more there.”

The answers may lie in a 2009 statement made by the director of business development and strategy for Aircell, a Gogo subsidiary that provides Wi-Fi for the business aviation sector.

The Aircell executive told Flight Global that the company had a “Super CALEA” arrangement with the FBI whereby it could immediately shut off service to select individuals or an entire airplane, without cutting service to U.S. air marshals, if authorities determined there was a security threat to the plane.

But the executive also described surveillance capabilities that go beyond what CALEA generally provides. “CALEA,” he said, “allows the feds to collect information about who is using the system, on which devices, and what the traffic looks like. Aircell can give [law enforcement] any information they need in real time.”

Nolan, asked about those statements, said, “Despite what the person said in 2009, what I can tell you today and what the truth is today is that we adhere to CALEA and we do everything in conjunction with what law enforcement has asked us to do.” He added that, “There is no ‘super CALEA’ capability. Our capabilities and what we adhere to are exactly what any communications provider, including on the ground networks, adhere to when they abide by CALEA. Nothing more and nothing less.”

Gogo notes in its terms of service that it may be required by law “to record some or all of your communications” and that it may “disclose your Personal Information (including your Account Information) and your communications through the Services, if required by law … or if we believe in good faith that such disclosure is necessary to: (a) comply with relevant laws or to respond to subpoenas or warrants served on us; or (b) protect or defend the rights, property, or safety of Gogo, you, other users, or third parties (especially in emergency situations).”

If Gogo is making additional concessions to law enforcement aside from the CALEA requirements and the CAPTCHA feature, Soghoian and others say it’s not hard to imagine what those might include.

“There are a number of things that are still in the surveillance arena that don’t involve monitoring traffic,” he says, such as watching “the MAC addresses of known bad guys.”

A recent CBC News story, based on documents obtained from Edward Snowden, described how Canada’s electronic spy agency, the Communications Security Establishment Canada, collected “metadata” from devices used to access Wi-Fi at a major Canadian airport. Authorities then used the metadata to track the movement of these devices for days as the devices connected to Wi-Fi hotspots across Canada and in U.S. airports.

The Canadian article doesn’t specify which device metadata the spy agency collected, but it most likely refers to the Media Access Control (MAC) address, a unique hardware identifier assigned to a device’s network interface.
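The tracking the CBC story describes amounts to grouping association records by that fixed identifier. A minimal sketch, using invented log entries:

```python
from collections import defaultdict

# Invented Wi-Fi association records: (MAC address, hotspot, timestamp).
# The MAC address is burned into the network hardware, so the same value
# reappears wherever the device connects.
log = [
    ("aa:bb:cc:dd:ee:ff", "YYZ-airport-wifi", "2014-01-10T08:00"),
    ("11:22:33:44:55:66", "YYZ-airport-wifi", "2014-01-10T08:05"),
    ("aa:bb:cc:dd:ee:ff", "downtown-cafe",    "2014-01-11T13:20"),
    ("aa:bb:cc:dd:ee:ff", "SEA-airport-wifi", "2014-01-13T17:45"),
]

sightings = defaultdict(list)
for mac, hotspot, when in log:
    sightings[mac].append((when, hotspot))

# One device's movements over several days, recovered from metadata alone:
print(sightings["aa:bb:cc:dd:ee:ff"])
```

No content of any communication is needed; the identifier and the location of each hotspot are enough to reconstruct a travel itinerary.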

“If you’re watching [MAC addresses] in the airport, why not watch them in the air?,” says Soghoian.

Authorities may also want the ability to trace online activity to a specific passenger. “That is surveillance. It’s just not about [monitoring traffic]. It’s about making sure they can finger you down the line.”

EU Court Rules Against Requirement to Keep Data of Telecom Users

The European Union's highest court ruled on Tuesday that an EU directive requiring telecoms companies to store the communications data of EU citizens for up to two years was invalid.

"The Court of Justice declares the Data Retention Directive to be invalid," the court said in a statement.

The data-retention directive was introduced in March 2006 after bombings on public transport in Madrid and London. The aim was to give the authorities better tools to investigate and prosecute organized crime and terrorism.

It required telecoms service providers to keep traffic and location data as well as other information needed to identify the user, but not the content of the communication. The records were to be kept from six to 24 months.

Austrian and Irish courts asked the European Court of Justice to rule if the law was in line with the Charter of Fundamental Rights of the EU. The law also caused a public outcry in Germany.

"It entails a wide-ranging and particularly serious interference with the fundamental rights to respect for private life and to the protection of personal data, without that interference being limited to what is strictly necessary," the court said.

"The Court takes the view that, by requiring the retention of those data and by allowing the competent national authorities to access those data, the directive interferes in a particularly serious manner with the fundamental rights to respect for private life and to the protection of personal data," it said.

"Furthermore, the fact that data are retained and subsequently used without the subscriber or registered user being informed is likely to generate in the persons concerned a feeling that their private lives are the subject of constant surveillance," it said.

(Reporting By Jan Strupczewski; Editing by Philip Blenkinsop, Larry King)

Swedish ISP Deletes All Retained Customer Data in Wake of EU Court Ruling
Loek Essers

A Swedish ISP has deleted all retained customer data after European Union laws that require communications providers to retain metadata were invalidated by the EU’s supreme court earlier this week. The ISP on Thursday called on other providers to do the same.

Under EU rules, telecommunications and Internet providers are required to retain data necessary to identify the subscriber, as well as traffic and location data, in order to help investigations of serious crimes and terrorism. However, the EU’s Data Retention Directive was invalidated on Tuesday by the Court of Justice of the EU (CJEU), which ruled that the directive seriously interferes with fundamental privacy rights.

The Swedish law implementing the directive is still in place, but Jon Karlung, CEO of Swedish ISP Bahnhof, said that after consulting his lawyers he nevertheless deleted all retained records and stopped collecting customer information on Wednesday.

The verdict is clear, and means that since the directive came into force in 2006, data was wrongly collected and should be deleted, Karlung said.

“We have followed this verdict and from our point of view it is more important to protect the privacy and integrity of our customers,” Karlung said. “I strongly suggest that other ISPs and service providers would follow our example,” he said, adding that he thought the verdict was clear enough to do this.

The Swedish Post and Telecom Authority (PTS), which monitors the electronic communications and postal sectors in Sweden, has no plans to act against Bahnhof, a spokesman said. The authority is analyzing the ruling to see if national legislation is still applicable, he said.

The Swedish Prosecution Authority also has no plans yet to start an investigation, although it has the power to do so even in the absence of a police report, a spokeswoman said.

Karlung said he is confident that he would ultimately win any legal dispute with the authorities, because the case would eventually end up at the Court of Justice of the EU and European law takes precedence over Swedish law.

“For the first time in a very long time there is some common sense in the European Union regarding these matters,” Karlung said. “There is finally hope again.”

Sweden was one of the last EU member states to transpose the Data Retention Directive into national law. In May 2013, the CJEU fined Sweden €3 million (around US$4 million) for the delay.

Asked Tuesday whether the Commission would consider paying back that fine, EU Home Affairs Commissioner Cecilia Malmström said that was a possibility she would look into.

How Advertising Cookies Let Observers Follow You Across the Web
Russell Brandom

Back in December, documents revealed the NSA had been using Google's ad-tracking cookies to follow browsers across the web, effectively coopting ad networks into surveillance networks. A new paper from computer scientists at Princeton breaks down exactly how easy it is, even without the resources and access of the NSA. The researchers were able to reconstruct as much as 90% of a user's web activity just from monitoring traffic to ad-trackers like Google's DoubleClick. Crucially, the researchers didn't need any special access to the ad data. They just sat back and watched public traffic across the network.

Tor was the only tool that escaped the researchers' dragnet

As it turns out, trackers are displaying a surprising amount of information in public. Each ad system gives a user a unique ID number, but by following the same browser session from system to system, the researchers were able to tie together the vast majority of a given user's web requests. By following those same cookies to identity-based services like Facebook and Google+, the researchers were able to give a name to each user.

The result is that, for a given pageview, it's surprisingly easy to trace back to a person's name and the other pages he or she has visited. Security measures like HTTPS threw the researchers off the trail somewhat, but the density of ad cookies made such measures easy to get around. The only solid protection was the Tor routing network, which scrambled IP addresses thoroughly enough to escape the researchers' impromptu dragnet.
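The linking step the researchers describe can be sketched in a few lines of Python (the cookie IDs and URLs below are invented for illustration):

```python
from collections import defaultdict

# An eavesdropper's view of unencrypted tracker requests: each one leaks
# the tracker's pseudonymous cookie ID and, via the Referer header, the
# page being read.
observed = [
    ("tracker-id=abc123", "http://news.example.com/politics"),
    ("tracker-id=abc123", "http://health.example.com/symptoms"),
    ("tracker-id=xyz789", "http://other.example.com/"),
    ("tracker-id=abc123", "http://social.example.com/profile/jdoe"),
]

histories = defaultdict(list)
for cookie_id, page in observed:
    histories[cookie_id].append(page)

# The shared cookie ID stitches one user's pageviews together; an
# identity-bearing URL (the profile page) then attaches a real name
# to the pseudonymous ID.
print(histories["tracker-id=abc123"])
```

The observer never needs the tracker's cooperation: because the same ID rides along on every request, grouping traffic by that ID reconstructs the browsing history.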

Google Chrome Bug Could Allow Websites To Snoop On Conversations

A security flaw in the world's most popular Web browser allows any site to read transcripts of nearby speech
Thomas Halleck

A security flaw in Google Chrome, the world's most popular Web browser, could allow a hacker to turn on a user's computer microphone and secretly obtain a Chrome-generated transcript of the user's conversations, according to an Israel-based software developer who highlighted the flaw in a blog post this week.

The developer, Guy Aharonovsky, told International Business Times he found the defect in Chrome while experimenting with a voice recognition feature in the browser. He said he reported the problem to Google through its Chromium bug tracker, but the company’s developers designated it “low-severity,” which meant they didn't view it as a top priority and offered no immediate fix. Google did investigate the problem, he said, but only after he submitted a blog post about it to Reddit, a popular socially driven news site.

“Google, [like] all developers, [has] a tendency to dismiss the not-so-obvious security bugs,” Aharonovsky said.

A Google spokesperson confirmed the existence of the vulnerability on Wednesday. “Our security team is actively investigating this issue,” Google said in an email to IBTimes.

The security flaw in the Chrome browser emerges just as the world is confronting the frightening prospect of another long-existing but previously undetected bug, known as Heartbleed, which makes millions of passwords across the Web vulnerable to theft.

The spokesperson refused to comment on Aharonovsky’s claim that it downplayed the security flaw until his post gained attention online. "As of right now, we have no further comment other than the one we provided in our first email," Google said.

While there is no evidence that any Chrome user has been harmed by the vulnerability, the security flaw's potential for damage is significant. Chrome serves more than half of the world's Web traffic, and with just one click on a malicious Web page, a user could unwittingly allow that website to obtain a text transcript of any conversation near the computer, via the user's computer microphone.

While most hackers target victims by enticing them to download a virus or malware file, this bug only requires a Chrome user to visit a Website that's designed to exploit this vulnerability.

Google told IBTimes that a software feature in the browser generates the text from a user’s voice, which is recorded by a computer microphone. Google said recorded text files contain “much less information” than audio files, and if no sound is detected for eight seconds after the last mouse click, the feature turns itself off.

Aharonovsky created a simple demonstration to show how the bug could work. In it, computer users are asked to use a mouse to drag and drop “seeds” onto the ground to grow a tree, increasing the likelihood that a voice recording is activated every eight seconds. He said the exploit works even if users block access to the computer microphone in Chrome’s security settings.

The spokesperson said Google could not say when an update with a fix for the security flaw will be available.

“I do not believe [the vulnerability] will be dismissed at this point,” Aharonovsky said. “It seems like they started to look for a way to quickly mitigate this flaw.”

The U.S. Thinks it Can Invent a Cuban Spring? Puh-leez, Say Internet Freedom Scholars.
Brian Fung

An arm of the U.S. government seeking to destabilize the Castro regime in Cuba has taken covert actions to overthrow the government there.

No, that's not a line out of a Cold War history book. The U.S. Agency for International Development created a fake version of Twitter for Cubans to use in hopes of fomenting what the Associated Press calls a "Cuban Spring" that could, according to one of 1,000 pages of USAID documents, "renegotiate the balance of power between the state and society."

The plan was to bring Cubans onto the service, called ZunZuneo, then begin feeding them political content designed to incite huge protests reminiscent of 2011's democratic uprisings in the Middle East, according to the AP. It all sounds rather implausible at a time when the United States is trying to avoid embroiling itself in places such as Syria and Ukraine. But the program actually exists. In an e-mail, USAID said the plan was reviewed by the Government Accountability Office last year.

"USAID is a development agency, and we work all over the world to help people exercise their universal rights and freedoms," USAID spokesman Matt Herrick said.

But just because Washington says ZunZuneo is above board doesn't mean it has an effective grasp on technology's role in social movements, Internet scholars say.

Beginning with 2009's Green Revolution in Iran, an entire school of literature has cropped up to analyze social media and its political implications. It wasn't until 2011, though, that social scientists got their hands on enough case studies to be able to draw meaningful conclusions about Twitter and Facebook as weapons in the hands of revolutionaries. The early days of the Arab uprising in Tunisia and Egypt were marked by breathless reports about social media overwhelming dictatorial regimes. Government attempts to shut the services down only fueled the perception that social media helped cause the uprisings. That perception was quickly dashed by academics and activists with direct ties to the region.

"One of the big points of consensus in our community following the uprisings in the Arab region was that, yes, the [social media] tools were a catalyst, but there were really smart, powerful and long-lasting activist networks that put all of that into motion," said Ellery Biddle, the editor of Global Voices Advocacy and a close observer of Cuban affairs. "Most of my friends in Egypt who were very, very involved in the first phase of revolution there had been organizing and talking and working together for 10 years before any of this happened. It wasn't as if someone gave them a cellphone one day, and somebody figured out they could have a revolution."

Technology can facilitate organizing. But it can't overthrow governments, much less on a timetable of your choosing.

That USAID failed to appreciate such an important lesson — from the most up-to-date evidence available — seems surprising for an agency that specializes in international development. It also raises questions about why the agency still (reportedly) forged ahead with the program, and why it kept the effort a secret. Some still doubt the report itself.

Jonathan Zittrain is the director of Harvard's Berkman Center for Internet and Society. "It's hard to believe there's somebody in the government rubbing their hands together, Mr. Burns-style, thinking you can orchestrate a revolution this way," Zittrain said, referring to the classically devious "Simpsons" character. "It may be, the truth is a little less interesting and a little more reasonable."

Cold War overtones aside, it's understandable why USAID might want to shield its subversive activity from public view. It was likely not for the benefit of the Cuban government, nor to give the agency itself plausible deniability — but perhaps to protect vulnerable Cubans who would want to participate. Collaborating with the United States is considered a crime by Havana, according to Biddle. If ZunZuneo operated openly as a U.S. government program, the Cuban government could track down and punish potential dissidents.

"It is no secret that in hostile environments, governments take steps to protect the partners we are working with on the ground," said Herrick.

Even so, going to the trouble of setting up a special social media platform makes no sense, said Biddle — particularly when there's already a thriving network of cellphone users who text each other news and information in secret. Around 25 percent of the population in Cuba now carries a cellphone, thanks to a recent liberalization in ownership policies. In 2008, it became legal to acquire a mobile device without going through an extensive permit process. Since then, informal news networks involving as many as 30 to 40 people — using just their cellphones — have become common.

"To be honest, I think a lot of work the U.S. government is doing there is probably having exactly the wrong effect," she added. "It divides civil society. There are people who choose to accept U.S. money or associate with U.S. government programs. They usually lose friends, family, their job. It suddenly becomes their only source of income. So they get stuck in it, and there's a big separation between people who want change and are willing to go down that road, and people who want change but want it on their own terms."

Others believe there's a much simpler way to achieve regime change: Just keep liberalizing the United States's trade policy with respect to Cuba. According to the Electronic Frontier Foundation's Jillian York, there's a lot of technology that could help Cubans circumvent censorship that isn't making it to the country because of the United States's own ideologically motivated export restrictions.

Lifting those barriers might do far more to promote American values in Cuba than directly stirring up unrest with a covert communications program, said York.

If that's true, USAID might have more luck trying its hand at soft power — not subversion.

U.S. Court Voids Man's Conviction for Hacking Celebrity iPads
Jonathan Stempel

A federal appeals court on Friday unanimously threw out the conviction of an Arkansas man for stealing the personal data of about 120,000 Apple iPad users, including big-city mayors, a TV news anchor and a Hollywood movie mogul.

The reason: Prosecutors brought the case in the wrong state.

The 3rd U.S. Circuit Court of Appeals said the prosecution of Andrew Auernheimer did not belong in New Jersey, hundreds of miles from his alleged crimes, and as a result, his November 2012 conviction and 41-month prison sentence could not stand.

Writing for a three-judge panel, Circuit Judge Michael Chagares also admonished prosecutors that the Internet's "ever-increasing ubiquity" did not give the government carte blanche to prosecute cybercrime wherever it wishes.

"Venue in criminal cases is more than a technicality; it involves matters that touch closely the fair administration of criminal justice and public confidence in it," Chagares wrote.

"Cybercrimes do not happen in some metaphysical location that justifies disregarding constitutional limits on venue."

Auernheimer, who went by the names Weev, Weelos and Escher, had been convicted by a Newark jury of one count of conspiracy to violate the federal Computer Fraud and Abuse Act by accessing AT&T Inc servers, and one count of identity theft.

"The court merely determined that the Department of Justice brought this case in the wrong state, and we are reviewing our options," said Matthew Reilly, a spokesman for U.S. Attorney Paul Fishman in New Jersey, who handled the prosecution.

Auernheimer, 28, has been serving his sentence at a prison in Allenwood, Pennsylvania, federal records show.

Orin Kerr, a law professor at George Washington University who argued Auernheimer's appeal, did not immediately respond to requests for comment.


The alleged conspiracy took place in 2010, and involved a co-defendant, Daniel Spitler, and the group Goatse Security.

Prosecutors said Auernheimer used an "account slurper" to extract data about iPad users from AT&T servers, and then shared information with a reporter for the website Gawker who wrote an article naming some well-known people with compromised emails.

Among those affected were ABC News anchor Diane Sawyer, former New York City Mayor Michael Bloomberg, Chicago Mayor Rahm Emanuel and Hollywood movie producer Harvey Weinstein.

But the 3rd Circuit said that at all relevant times, Auernheimer was in Fayetteville, Arkansas; Spitler in San Francisco, and the servers in Atlanta and Dallas. It was also "undisputed" that the Gawker reporter was not in New Jersey.

Chagares said this meant that by being hauled more than 1,000 miles to the Newark courtroom of U.S. District Judge Susan Wigenton, Auernheimer was denied his "substantial right" to be tried where he allegedly committed his crimes.

"When people commit crimes, we have the ability and obligation to ensure that they do not stand to account for those crimes in forums in which they performed no 'essential conduct element' of the crimes charged," Chagares wrote.

Spitler pleaded guilty in June 2011 and was sentenced in January to three years' probation. Susan Cassell, his lawyer, did not immediately respond to requests for comment.

The appeal is Auernheimer v. U.S., 3rd U.S. Circuit Court of Appeals, No. 13-1816.

(Reporting by Jonathan Stempel in New York; Editing by Jonathan Oatis and Bernadette Baum)

Eight (No, Nine!) Problems With Big Data
Gary Marcus and Ernest Davis

BIG data is suddenly everywhere. Everyone seems to be collecting it, analyzing it, making money from it and celebrating (or fearing) its powers. Whether we’re talking about analyzing zillions of Google search queries to predict flu outbreaks, or zillions of phone records to detect signs of terrorist activity, or zillions of airline stats to find the best time to buy plane tickets, big data is on the case. By combining the power of modern computing with the plentiful data of the digital era, it promises to solve virtually any problem — crime, public health, the evolution of grammar, the perils of dating — just by crunching the numbers.

Or so its champions allege. “In the next two decades,” the journalist Patrick Tucker writes in the latest big data manifesto, “The Naked Future,” “we will be able to predict huge areas of the future with far greater accuracy than ever before in human history, including events long thought to be beyond the realm of human inference.” Statistical correlations have never sounded so good.

Is big data really all it’s cracked up to be? There is no doubt that big data is a valuable tool that has already had a critical impact in certain areas. For instance, almost every successful artificial intelligence computer program in the last 20 years, from Google’s search engine to the I.B.M. “Jeopardy!” champion Watson, has involved the substantial crunching of large bodies of data. But precisely because of its newfound popularity and growing use, we need to be levelheaded about what big data can — and can’t — do.

The first thing to note is that although big data is very good at detecting correlations, especially subtle correlations that an analysis of smaller data sets might miss, it never tells us which correlations are meaningful. A big data analysis might reveal, for instance, that from 2006 to 2011 the United States murder rate was well correlated with the market share of Internet Explorer: Both went down sharply. But it’s hard to imagine there is any causal relationship between the two. Likewise, from 1998 to 2007 the number of new cases of autism diagnosed was extremely well correlated with sales of organic food (both went up sharply), but identifying the correlation won’t by itself tell us whether diet has anything to do with autism.

Second, big data can work well as an adjunct to scientific inquiry but rarely succeeds as a wholesale replacement. Molecular biologists, for example, would very much like to be able to infer the three-dimensional structure of proteins from their underlying DNA sequence, and scientists working on the problem use big data as one tool among many. But no scientist thinks you can solve this problem by crunching data alone, no matter how powerful the statistical analysis; you will always need to start with an analysis that relies on an understanding of physics and biochemistry.

Third, many tools that are based on big data can be easily gamed. For example, big data programs for grading student essays often rely on measures like sentence length and word sophistication, which are found to correlate well with the scores given by human graders. But once students figure out how such a program works, they start writing long sentences and using obscure words, rather than learning how to actually formulate and write clear, coherent text. Even Google’s celebrated search engine, rightly seen as a big data success story, is not immune to “Google bombing” and “spamdexing,” wily techniques for artificially elevating website search placement.

Fourth, even when the results of a big data analysis aren’t intentionally gamed, they often turn out to be less robust than they initially seem. Consider Google Flu Trends, once the poster child for big data. In 2009, Google reported — to considerable fanfare — that by analyzing flu-related search queries, it had been able to detect the spread of the flu as accurately as, and more quickly than, the Centers for Disease Control and Prevention. A few years later, though, Google Flu Trends began to falter; for the last two years it has made more bad predictions than good ones.

As a recent article in the journal Science explained, one major contributing cause of the failures of Google Flu Trends may have been that the Google search engine itself constantly changes, such that patterns in data collected at one time do not necessarily apply to data collected at another time. As the statistician Kaiser Fung has noted, collections of big data that rely on web hits often merge data that was collected in different ways and with different purposes — sometimes to ill effect. It can be risky to draw conclusions from data sets of this kind.

A fifth concern might be called the echo-chamber effect, which also stems from the fact that much of big data comes from the web. Whenever the source of information for a big data analysis is itself a product of big data, opportunities for vicious cycles abound. Consider translation programs like Google Translate, which draw on many pairs of parallel texts from different languages — for example, the same Wikipedia entry in two different languages — to discern the patterns of translation between those languages. This is a perfectly reasonable strategy, except for the fact that with some of the less common languages, many of the Wikipedia articles themselves may have been written using Google Translate. In those cases, any initial errors in Google Translate infect Wikipedia, which is fed back into Google Translate, reinforcing the error.

A sixth worry is the risk of too many correlations. If you look 100 times for correlations between two variables, you risk finding, purely by chance, about five bogus correlations that appear statistically significant — even though there is no actual meaningful connection between the variables. Absent careful supervision, the magnitudes of big data can greatly amplify such errors.
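The "about five bogus correlations in 100" arithmetic is easy to check with a quick simulation (our own sketch, not from the article): generate 100 pairs of completely unrelated random variables, test each pair for correlation at the conventional p < 0.05 level, and count how many come up "significant" by pure chance.

```python
import random
import statistics

def pearson_r(x, y):
    """Sample Pearson correlation coefficient."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def perm_p_value(x, y, n_perm=499):
    """Two-sided permutation p-value for the correlation of x and y."""
    observed = abs(pearson_r(x, y))
    y = list(y)  # work on a copy; shuffling must not touch the caller's data
    hits = 0
    for _ in range(n_perm):
        random.shuffle(y)
        if abs(pearson_r(x, y)) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)

random.seed(42)
n_tests, n_points = 100, 30
false_positives = 0
for _ in range(n_tests):
    x = [random.gauss(0, 1) for _ in range(n_points)]
    y = [random.gauss(0, 1) for _ in range(n_points)]  # unrelated to x by construction
    if perm_p_value(x, y) < 0.05:
        false_positives += 1

print(false_positives)  # typically somewhere around 5 of 100 pure-noise pairs
```

Every "significant" pair found here is spurious by construction, which is exactly the trap the authors describe: screen enough variable pairs and chance alone will hand you publishable-looking correlations.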

Seventh, big data is prone to giving scientific-sounding solutions to hopelessly imprecise questions. In the past few months, for instance, there have been two separate attempts to rank people in terms of their “historical importance” or “cultural contributions,” based on data drawn from Wikipedia. One is the book “Who’s Bigger? Where Historical Figures Really Rank,” by the computer scientist Steven Skiena and the engineer Charles Ward. The other is an M.I.T. Media Lab project called Pantheon.

Both efforts get many things right — Jesus, Lincoln and Shakespeare were surely important people — but both also make some egregious errors. “Who’s Bigger?” claims that Francis Scott Key was the 19th most important poet in history; Pantheon has claimed that Nostradamus was the 20th most important writer in history, well ahead of Jane Austen (78th) and George Eliot (380th). Worse, both projects suggest a misleading degree of scientific precision with evaluations that are inherently vague, or even meaningless. Big data can reduce anything to a single number, but you shouldn’t be fooled by the appearance of exactitude.

FINALLY, big data is at its best when analyzing things that are extremely common, but often falls short when analyzing things that are less common. For instance, programs that use big data to deal with text, such as search engines and translation programs, often rely heavily on something called trigrams: sequences of three words in a row (like “in a row”). Reliable statistical information can be compiled about common trigrams, precisely because they appear frequently. But no existing body of data will ever be large enough to include all the trigrams that people might use, because of the continuing inventiveness of language.

To select an example more or less at random, a book review that the actor Rob Lowe recently wrote for this newspaper contained nine trigrams such as “dumbed-down escapist fare” that had never before appeared anywhere in all the petabytes of text indexed by Google. To witness the limitations that big data can have with novelty, Google-translate “dumbed-down escapist fare” into German and then back into English: out comes the incoherent “scaled-flight fare.” That is a long way from what Mr. Lowe intended — and from big data’s aspirations for translation.
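A minimal sketch of how trigram counting works (our illustration, not the internals of any real search or translation engine) shows why novel phrases defeat it: a model can only assign statistics to word triples it has actually seen, so a fresh coinage gets a count of zero no matter how large the corpus.

```python
from collections import Counter

def trigrams(text):
    """Return the sequence of word trigrams in a text."""
    words = text.lower().split()
    return [tuple(words[i:i + 3]) for i in range(len(words) - 2)]

# A toy stand-in for the petabytes of indexed text.
corpus = ("reliable statistical information can be compiled about common "
          "trigrams precisely because they appear frequently in a row")
counts = Counter(trigrams(corpus))

# A common phrase shows up in the counts...
print(counts[("in", "a", "row")])                    # 1
# ...but a never-before-seen coinage does not, so the model has
# no statistics to lean on when it encounters one.
print(counts[("dumbed-down", "escapist", "fare")])   # 0
```

Scaling the corpus up shrinks the set of unseen trigrams but can never eliminate it, which is the authors' point about the continuing inventiveness of language.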

Wait, we almost forgot one last problem: the hype. Champions of big data promote it as a revolutionary advance. But even the examples that people give of the successes of big data, like Google Flu Trends, though useful, are small potatoes in the larger scheme of things. They are far less important than the great innovations of the 19th and 20th centuries, like antibiotics, automobiles and the airplane.

Big data is here to stay, as it should be. But let’s be realistic: It’s an important resource for anyone analyzing data, not a silver bullet.

Seagate Brings Out 6TB HDD, Did Not Need NO STEENKIN' SHINGLES

Or helium filling either, according to reports
Chris Mellor

Seagate subsidiary LaCie has pre-announced a 6TB nearline disk drive from its parent, and it doesn't appear to use slow-write shingled magnetic recording (SMR) technology.

LaCie's news coincided with the NAB event in Las Vegas, which opened on Friday. There are details of the drive on Seagate's website, although the drive has not been formally announced and is not yet available.

The Enterprise Capacity drive has these features:

• 7,200rpm spin speed,
• 2, 3, 4, 5 and 6TB capacity points,
• 128MB cache,
• encryption option with Secure Instant Erase or SED FIPS 140-2 option,
• SAS 12Gbit/s and SATA 6Gbit/s interface options and
• a 25 per cent increase in areal density.

Seagate says the drive uses its eighth-generation perpendicular magnetic recording (PMR) technology. It claims the drive "provides a 25 per cent nearline performance boost over competitive offerings," with "best-in-class random and sequential read/write performance".

The only competitive 6TB offering is Hitachi GST's 6TB He6 helium-filled spinner, which we believe rotates at 7,200rpm.

Seagate 6TB Enterprise Capacity Drive

We're told the 6TB Seagate drive, rated for 24 x 7 operation, "is built to support enterprise-class nearline workloads of up to 550TB per year, which is up to 10× the rated workload of desktop HDDs," and offers "50 per cent more capacity over last generation," which maxed out at 4TB.

A 25 per cent increase in areal density over Seagate's 4-platter 4TB Surveillance hard drive would mean 1.25TB platters, and five of these would be needed to reach a 6TB capacity level.

How many platters does the 6TB drive have? It has the same 26.1mm height as the 4TB Surveillance HDD, but that drive weighs 610g whereas the 6TB Enterprise Capacity drive weighs 780g – we believe this points to an extra platter – making five in all.
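The platter-count reasoning above can be written out as a quick back-of-the-envelope (our own arithmetic following the article's figures, not a Seagate spec sheet):

```python
import math

# Seagate's 4TB Surveillance HDD: four platters, so 1TB per platter.
base_platter_tb = 4 / 4                    # 1.0 TB per platter

# A 25 per cent areal-density increase bumps each platter to...
new_platter_tb = base_platter_tb * 1.25    # 1.25 TB per platter

# ...so hitting 6TB takes ceil(6 / 1.25) platters.
platters_needed = math.ceil(6 / new_platter_tb)

print(new_platter_tb, platters_needed)     # 1.25 5
```

Five 1.25TB platters give 6.25TB raw, comfortably clearing the 6TB capacity point, which squares with the extra ~170g of weight over the four-platter drive.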

Get a datasheet here (PDF).

Seagate says its neat new drive can be used for:

• hyperscale applications,
• high-capacity RAID storage,
• mainstream enterprise external storage arrays (SAN, NAS, DAS),
• cloud data centres—replicated bulk data storage,
• enterprise backup and restore—D2D, virtual tape and
• centralised surveillance.

A 50 per cent jump from 4TB to 6TB is a lovely increase in capacity. LaCie is offering a 2-drive, 12TB LaCie 2big with Thunderbolt 2 technology, hardware RAID, and speeds of up to 420MB/s. Oh yes please. I want one.

And, dreaming on some more, 2.5-inch drives should surely be able to reach 3TB capacities with this technology. Come on Seagate, Tosh and WD/HGST, I want a 3TB MacBook or Windows notebook now. At once! This instant! Are you listening?

What about storage array capacity? A rackful of 3.5-inch 4TB drives could get its capacity increased by 50 per cent. That's massive. Think about the 2.8PB BOSS rack we wrote about recently, the one Scality is using for its RING object storage using 4TB Kinetic drives. It could go up to 4.2PB capacity if Seagate brings out 6TB Kinetic drives - as it is surely bound to do.

And if Seagate can bring out 6TB PMR drives without using helium-filled enclosures or shingling, then surely WD and Tosh can do so too. We can expect a wave of 6TB drive upgrades to work its way through the ranks of storage array suppliers in the next few months as WD and Tosh announce their 6TB spinners and OEM qualifications get under way.

Downsides? RAID rebuilds will take even longer, giving faster-rebuild self-healing object storage technologies a fillip.

Back to the desktop: it's a shame the LaCie "2big" won't be available until later this quarter and the pricing isn't yet available either. 2big? Not at all.

Party's Over for the Dutch: Pirated Downloads Now Prohibited in the Netherlands
Loek Essers

The Dutch Ministry of Security and Justice has banned downloading pirated content, finally making this illegal for people in the Netherlands.

The government’s decision follows a ruling of the Court of Justice of the European Union (CJEU) on Thursday.

Until now, people in the Netherlands had been allowed to download copyrighted material from illegal sources and to make private copies of content they own.

To compensate copyright owners for the resulting lost revenue, the country placed a levy on sales of devices like smartphones, MP3 players and tablets.

However, in its judgement, the CJEU said that national legislation that makes no distinction between private copies made from lawful sources and those made from counterfeited or pirated sources cannot be tolerated.

If member states were free to adopt legislation that permits reproductions from an unlawful source, that would be clearly detrimental to the proper functioning of the internal market, the court said in its verdict.

“This means that, as of today, downloading from an illegal source is no longer permitted,” said Ministry of Security and Justice spokesman Wiebe Alkema. The ban is based on civil law, which means that Dutch law enforcement authorities won’t be in charge of enforcing it, he added.

Next up: lawsuits

It is up to organizations like the Dutch antipiracy foundation Brein to tackle downloads from illegal sources by filing civil lawsuits, said Alkema.

Brein, which has stated before that it wouldn’t target individual downloaders, said in a news release on Thursday that it will go after sites and services that facilitate access to illegal material.

The Dutch government will now have to modify the private copying levy rules, Alkema said.

SONT, the organization that sets the levies and represents copyright holders and device makers and resellers, was asked by the Ministry to determine which levies are still appropriate, given the verdict, Alkema said. The Ministry expects the response by the summer.

As of January 2013, smartphones, tablets and MP3 players in the Netherlands were subject to a copyright levy of up to €5 (US$7). Importers and manufacturers of such devices are required to pay that private copying levy to the Dutch Home Copying Foundation (Stichting de Thuiskopie), which is also a member of SONT.

However, some manufacturers and importers sued the Home Copying Foundation alleging that the levies should be smaller because the impact of illegal downloads shouldn’t be considered, the CJEU said.

The case got to the Supreme Court of the Netherlands which decided to seek a preliminary ruling from the CJEU.

The Dutch Home Copying Foundation said in a news release that it is confident that the ruling will not affect the proceeds coming from the levy.

Record Labels Lose Big as Court Declares File-Sharing Tools Legal

An epic battle between the world's largest record companies and the creator of software known as the "Spanish Napster" has concluded with defeat for the labels. Veteran developer Pablo Soto informs TorrentFreak that the decision clears the way for even more development, starting today with a brand new, BitTorrent-powered release of his software.

In 2008, Universal, Sony, EMI, Warner and “Spanish RIAA” Promusicae (Productores de Música de España) joined forces to sue MP2P Technologies, a company created by Pablo Soto, the brains behind Blubster, the “Spanish Napster” file-sharing software.

The record companies said that Soto had designed his Blubster, Piolet and Manolito software with the intent of providing a platform for users to pirate music while he generated profit. This, the labels said, amounted to unfair competition in the market. Soto should pay them 13 million euros ($18m) in damages, the labels argued.

Following years of litigation, in 2011 a Madrid court handed defeat to the labels by declaring Soto’s technology neutral. While his users may have infringed copyright, Soto was not responsible for that, the court said. Furthermore, since Soto wasn’t in the record business and the labels weren’t in the file-sharing business, the unfair competition claim was also dismissed.

After investing so much time in the case, the labels weren’t prepared to concede defeat. The case went to the Madrid Court of Appeals which has just made its decision public. It’s a decisive win for Soto and a big loss for the labels.

“[Soto's] activity is not only neutral, and perfectly legal, moreover it is protected by article 38 of our Constitution,” the Court wrote in its ruling.

Speaking with TorrentFreak, Soto says that the Court saw no problem with sharing technology and discovered no plan “to sink or unbalance the recording industry” or obstruct the development of its business.

“The court affirmed — yet again — that [the creation of sharing technologies] is not an act of looting, unfair competition or unfair benefit from others’ effort,” Soto informs TF.

The Spaniard, who has been developing software since he was 16 years old, adds that the win is not only good news for him, but also for others seeking to innovate.

“This clears the path for more opportunities to bring leading edge technologies to the marketplace and no longer be distracted by misguided legal tactics from the copyright conglomerates. We really appreciate and thank our loyal following, especially among the readers at TorrentFreak.”

Soto’s lawyer, David Bravo, who described the ruling as having a “very strong foundation”, said developers will now be able to go about their business free from “inventive legal interpretations” that define the very creator of a file-sharing tool as the party responsible for copyright infringement.

In celebration of the victory, Soto has released a brand new version of his Blubster software, for the first time powered by BitTorrent.

“While we have continued innovating with Torrents.fm, we can now also focus once again on further creating and offering advanced P2P technology across our other networks with this new version of Blubster just launched today,” Soto told TF.

Traditionally Windows only, Blubster will soon debut on both Linux and Mac.

Sharing = Stealing: Busting a Copyright Myth

Consumers copy and share digital files. This has been blamed for a potentially catastrophic decline in certain markets. But why do consumers copy? And is it as economically harmful as often thought?

CREATe, the UK research centre for copyright, has put a decade of evidence to the test by reviewing studies published between 2003 and 2013. Applying techniques normally used in the medical sciences, articles on unlawful file sharing for digital media were methodically searched in academic databases, while non-academic literature was sought from key stakeholders and research centres. Over 54,000 sources were initially found and these were narrowed down to 206 articles which examined human behaviour, intentions or attitudes.

Professor Daniel Zizzo, an economist at the University of East Anglia, is co-author of the study, launched today. He said the research revealed that "current knowledge of file sharing is dramatically skewed by sector and method".

He adds: "Most evidence is based on the unlawful file sharing of music, which has been subjected to far more research than movies and software, which themselves have been studied far more than videogames, books or TV. This means there is a real risk of designing policy which meets the needs of a specific industry, possibly at the expense of other creative industries which are less well represented in the literature. Also, the economic effects found in one medium may not apply to another and current knowledge of file sharing is dramatically skewed by method.

"The evidence on societal costs points in conflicting directions and our study shows that the impact of illegal downloading and file sharing remains unclear. Focussing on 'lost sales', and examining people's hypothetical willingness to pay with and without the possibility of unlawful file sharing is insufficient. Regarding determinants of unlawful file sharing, there are many studies on self-reported attitudes, but few studies that observe behaviour. This is a problem, particularly as there is often a gap in findings between studies that use behaviour and studies that do not."

An important contribution of the new study is the identification of five testable reasons (or "Utilities") why consumers copy:

• Financial and Legal Utility (this is where the enforcement debate has traditionally focussed: "you can't compete with free");
• Experiential Utility (unlawful file sharing may be influenced by a desire to sample new content, to access niche content, or to build a collection);
• Technical Utility (content is easier to access unlawfully);
• Social Utility (it appears to matter what our peers do: a kind of herding effect);
• Moral Utility (this perhaps motivates policy makers' emphasis on the education of consumers).

The academics devised a cube graphic that illustrates the key findings of the study in relation to the determinants of unlawful file sharing.

A commonly held belief is that unlawful file sharing costs the creative economy billions of pounds every year. But according to Professor Martin Kretschmer, Professor of Intellectual Property Law at the University of Glasgow, and Director of CREATe, "legislating without understanding behaviour produces lop-sided policies. The most useful evidence increases our understanding of how to turn infringers into customers".

Until next week,

- js.

Current Week In Review

Recent WiRs -

April 5th, March 29th, March 22nd, March 15th

Jack Spratts' Week In Review is published every Friday. Submit letters, articles, press releases, comments, questions etc. in plain text English to jackspratts (at) lycos (dot) com. Submission deadlines are Thursdays @ 1400 UTC. Please include contact info. The right to publish all remarks is reserved.

"The First Amendment rests on the assumption that the widest possible dissemination of information from diverse and antagonistic sources is essential to the welfare of the public."
- Hugo Black