Peer-To-Peer News - The Week In Review - March 22nd, '14
Posted by JackSpratts, 19-03-14, 07:43 AM

Since 2002

"What blogger will use that service now?" – Jennifer Granick

March 22nd, 2014




Warner Bros Fights Looming Exposé of Anti-Piracy Secrets
Ernesto

Hollywood studio Warner Bros. is fighting a request from the Electronic Frontier Foundation to a Florida federal court to unseal details of the movie studio's anti-piracy practices. The sealed documents are part of Warner's DMCA-abuse case against Hotfile, and the movie studio says that pirates could "infringe without fear of detection" should enforcement tactics be exposed.

To deal with the ongoing threat of online piracy, major Hollywood studios have entire divisions dedicated to tracking down copyright infringers. Exactly what goes on behind the scenes is a mystery, but if the Electronic Frontier Foundation (EFF) has its way, part of this veil will soon be lifted.

Last month the digital rights group asked a Florida federal court to unseal the filings Warner submitted in its now-settled DMCA abuse case against Hotfile.

EFF argued that the public has the right to know what mistakes Warner made. Knowing how Warner Bros’ anti-piracy system works will be instrumental in discussing the effectiveness of the DMCA takedown procedure and similar measures.

This week Warner opposed the EFF’s request. The movie studio fears that by exposing the sealed documents pirates will obtain an unfair advantage. According to David Kaplan, Warner’s Senior Vice President of Anti-Piracy operations, the information “would give pirates multiple routes for evading detection and copyright enforcement.”

“Persons familiar with Warner’s methods and strategies for identifying unauthorized Warner content online could infringe without fear of detection if they knew how the detection worked,” Kaplan informed the court.

The above is intriguing, as it suggests that there are ways to bypass Warner’s anti-piracy systems. While this may be as simple as using anonymizer tools, the studio clearly doesn’t want the public to know. The opposition filings themselves are heavily redacted, but Warner warns the court that exposing their secrets could allow more “criminals” to avoid justice.

The movie studio asks the court to keep the documents under seal, and accuses EFF of having a secret agenda. Warner believes that the digital rights group is not so much interested in serving the public good, and suggests that the EFF mostly wants to use the information to their own advantage.

“Although EFF claims that this unsealing would serve the ‘public interest,’ EFF’s motion is a thinly-veiled effort to gain access to Warner’s confidential information for EFF’s own tactical advantage in private litigation that EFF regularly brings against copyright owners to challenge their use of takedown systems,” Warner writes.

In EFF’s case, the public interest may of course be aligned with the interests of the group itself. However, the Hollywood studios believe that EFF is mainly interested in scandalizing.

“Plaintiffs’ concern that EFF’s true intentions are to exploit the sealed information in order to ‘promote scandal’ regarding Warner and other copyright owners is fully justified, and tips the balance even further toward continued sealing of the designated information,” Warner informs the court.

According to Warner, the EFF’s reasoning doesn’t trump their right to protect their anti-piracy secrets. This is not to avoid “embarrassment” as EFF suggests, but to prevent pirates from outsmarting them. If the sealed documents were exposed, this could severely damage Warner’s operation, they claim.

“As Plaintiffs have explained, this detailed information could be used by infringers to evade Warner’s copyright enforcement efforts. That such disclosure would cause significant harm to Warner’s copyright enforcement efforts is beyond serious dispute,” Warner stresses.

It’s now up to the court to decide whose interests weigh more heavily. If Judge Kathleen Williams decides to unseal the documents, it will be interesting to see what Warner is so afraid of.
http://torrentfreak.com/warner-bros-...ecrets-140319/





Megaupload's Dotcom Loses Case to Access Extradition Evidence
Naomi Tajitsu

Internet entrepreneur Kim Dotcom on Friday suffered another blow to his fight against extradition to the United States to face online piracy charges after New Zealand's highest court rejected his appeal to access evidence to be presented at the hearing.

The Supreme Court ruled that U.S. prosecutors were not required to disclose evidence at a hearing set for July on extraditing Dotcom, the founder of online file-sharing site Megaupload, and his three colleagues to the United States, where they face charges of mass copyright infringement, money laundering and racketeering.

Washington charges that the Megaupload website, which was shut down in 2012, cost film studios and record companies more than $500 million and generated more than $175 million in criminal proceeds by letting users store and share copyrighted material, such as movies and TV shows.

If Dotcom, a German national with New Zealand residency, is extradited, the ensuing copyright case could set a precedent for internet liability laws, potentially tightening regulations on disseminating copyrighted material on the Internet.

A successful fight against the charges could force entertainment companies to rethink online distribution methods.

Friday's ruling, which culminates a series of appeals by both parties, stated that a lower court was wrong to order disclosure of evidence in the first place.

Justice John McGrath said in the Supreme Court's decision that a summary of the evidence had been provided and that was sufficient. He said Dotcom had not indicated why he could not fight the extradition charge without full access to the evidence.

The evidence in question consists of documents from Dotcom's belongings, including laptops and hard drives, which were seized in 2012 when New Zealand police arrested the internet tycoon at his mansion near Auckland in a SWAT team-style raid requested by U.S. authorities.

Friday's ruling deals another knock to Dotcom's defense, coming just a month after the High Court ruled that the search warrant used in the arrest of the entrepreneur and his colleagues was legal. Dotcom is appealing that decision.

U.S. attorney Ira Rothken, a member of Dotcom's legal team, said that the ruling was "quite robust," adding it could put the defense at a disadvantage at the extradition hearing.

"We have a much higher hurdle because of today's ruling in getting disclosures, and that will impact the fairness of the hearing," Rothken told Reuters.

Dotcom, who also goes by the name of Kim Schmitz, says Megaupload was merely an online warehouse and should not be held accountable if stored content was obtained illegally. The site housed everything from family photos to Hollywood blockbusters and was one of the most visited sites on the Internet in its heyday.

The U.S. Justice Department counters that Megaupload encouraged piracy by paying users who uploaded popular content and by deleting content that was not regularly downloaded.

(Editing by Stephen Coates)
http://www.reuters.com/article/2014/...A2K00220140321





MP3Tunes Founder Michael Robertson Found Liable for Copyright Infringement

A jury finds that the once-popular website willfully blinded itself to thousands of songs being stolen.
Eriq Gardner

On Wednesday, a jury in a New York federal courtroom found MP3Tunes founder Michael Robertson liable for infringing the works of Capitol Records, EMI and other record labels and music publishers.

Robertson operated two websites, MP3Tunes.com and Sideload.com, that once boasted a catalog of more than 400,000 recordings by 40,000 artists. In some respects an early version of cloud services, Robertson's sites allowed their users to upload music, listen to music, and transfer or "sideload" music from third-party websites to storage lockers.

The plaintiffs brought the copyright case in 2007, the same year that Viacom sued YouTube. In its nearly seven years on the court docket, Capitol Records v. MP3Tunes became incredibly complex. Thanks in large part to the lawsuit, MP3Tunes filed for bankruptcy in 2012 with Robertson later challenging the court's jurisdiction over him. Further, Robertson attempted to raise all sorts of issues relating to whether the record labels had properly registered copyrights on their songs.

The judge in the case determined on summary judgment that MP3Tunes and Robertson were liable for direct infringement for personally uploading some of the songs in question, but the main trial drama revolved around issues of "willful blindness" and "red flag knowledge" with regards to thousands of other hit songs.

As explored in the recently settled Viacom v. YouTube case, the Digital Millennium Copyright Act provides safe harbor from copyright liability under certain conditions. Most famously, Internet service providers that respond expeditiously upon the actual knowledge that comes from takedown notices can claim a safe harbor defense. But there are points where the immunity can be attacked, including when a service provider is aware of a high probability of infringement and consciously avoids confirming that fact, or when a service provider has the right and ability to control infringing activity.

In 2012, the 2nd Circuit Court of Appeals addressed this in the YouTube case and hinted that something less than a DMCA takedown notice could trigger legal obligations.

Afterward, the judge in the MP3Tunes case withdrew his prior grant of summary judgment in favor of the defendants on the possibility they exhibited willful blindness and had red flag knowledge.

Robertson countered this triable issue with evidence that the record companies had distributed free songs online as part of marketing campaigns, which he believed might exonerate him as an innocent infringer. After all, perhaps his infringements didn't meet the standard of being "objectively obvious to a reasonable person."

Unfortunately for him, the jury found that MP3Tunes was willfully blind -- a determination that will be cheered by the entertainment industry's fiercest copyright advocates. It's pretty much the exact outcome that Viacom had hoped to get against YouTube before abandoning the case.

Robertson, who was also the founder of MP3.com, successfully defended a few claims including ones over whether he should have removed files from users' lockers, but the trial now moves on to a determination of damages. Already, his lawyers are preparing for this next phase and are attempting to get the judge to preclude emails from other MP3Tunes employees from being introduced to the jury. Robertson hopes to establish that his "state of mind" was innocent with respect to the acts of infringement that occurred at his company. His latest legal strategy appears to be distancing himself as far as he can from the liability of his bankrupt company.
http://www.hollywoodreporter.com/thr...n-found-689785





The Price of Music
David Pakman

Will the recorded music industry ever grow again? Since 1999, the industry has been in rapid decline as CDs became unbundled into downloaded singles. The digital download market never came close to the size of the physical music market. Now we are in the midst of another format transition, this time from downloaded singles to streaming.

The question many people ask — like the thoughtful Marc Geiger — is how big will the streaming market be? I think the answer lies not in consumers’ appetite for streaming songs, but in the price services charge consumers for streaming.

At the 1999 peak of the recorded music market, about $40 billion of recorded music was sold. How much did the average consumer spend per year on recorded music? Hundreds of dollars? Nope. At the time, according to the music trade group the International Federation of the Phonographic Industry, across the total 18-and-over population (whether aggregated across many countries or taken within a single one), the average amount spent came to $28 per consumer.

But that includes people who did not buy any music that year. If we look at just the consumers who bought music, they spent $64 on average that year. And that was at a time when one had to buy a bundle of 12 songs in the form of a CD in order to get access to just one or two. What has happened since?

Once the bundle broke, the average spending per consumer decreased. This is predictable, since bundles artificially raise the amount of total dollars a consumer spends. [Chart: average per-capita music spending in various countries, according to IFPI, in U.K. pounds.]

Another study by NPD Group in 2011 found similar spending, about $55 per music buyer per year on all forms of recorded music (they note that this spending is slightly higher among P2P music service users).

But the one retailer on the planet that would really know what consumers are willing to spend on recorded digital music today is Apple. As the largest music retailer in the world, its data is very consistent: about $12 per iTunes account per quarter is spent on music, or about $48 per year.

Note that this figure declines year by year as iTunes users are confronted with many more choices on which to spend their disposable income, like apps and videos. Average spending per account also keeps decreasing as iTunes gets bigger and bigger: as a service becomes truly mass market, each new wave of customers is willing to spend less than earlier adopters.

So, the data tells us that consumers are willing to spend somewhere around $45–$65 per year on music, and that the larger a service gets, the lower in that range the number becomes. And these numbers have remained consistent regardless of music format, from CD to download.

Curiously, the on-demand subscription music services like Spotify, Deezer, Rdio and Beats Music are all priced the same, at more than twice what consumers historically spend on music. They largely land at $120 per year (although Beats has a family-member option for AT&T users at $15 per month).

This is because the three major record labels, as part of their music licenses, have mandated a minimum price these services must charge. While it may seem strange that suppliers can dictate to retailers the price they must charge end users for their service, this is common practice in digital music. The services are not able to charge a price they believe will result in maximum adoption by consumers.

The data shows that $120 per year is far beyond what the overwhelming majority of consumers will pay for music, and instead shows that a price closer to $48 per year is likely much closer to a sweet spot to attract a large number of subscribers.

For this reason, I believe the market size for these services is limited to a subset of music buyers, which in turn is a subset of the population. This means that there will be fewer subscribers to these services than there are purchasers of digital downloads unless one of two things happens:

(a) Consumers decide to spend more than two times their historical spend on recorded music, or
(b) major record labels allow the price of subscription music services to fall to $3–$4 per month.
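
To make the arithmetic concrete, here is a quick back-of-envelope check in Python. The dollar figures come from the article itself; the script is purely illustrative.

```python
# Back-of-envelope check of the pricing argument; dollar figures are
# taken from the article, the rest is simple arithmetic.

itunes_quarterly = 12                     # $ per iTunes account per quarter
historical_annual = itunes_quarterly * 4  # about $48/year per music buyer

subscription_annual = 120                 # Spotify/Deezer/Rdio/Beats pricing
ratio = subscription_annual / historical_annual

print(f"historical spend: ${historical_annual}/yr")
print(f"subscription:     ${subscription_annual}/yr ({ratio:.1f}x historical)")

# The implied sweet spot: $48/yr works out to $4/month, matching the
# author's suggested $3-4/month range.
print(f"sweet spot: ${historical_annual / 12:.0f}/month")
```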

I think the former is highly unlikely, given the overwhelming number of choices competing for consumers’ disposable income combined with the amount of free music available from YouTube, Vevo, Pandora and many others. The data shows consumer spending per category decreases in the face of many disparate entertainment choices.

The latter is the big question. My experience with the major labels when I was CEO of eMusic was that they largely did not believe that music was an elastic good. They were unwilling to lower unit economics, especially for hit music, to see if more people would buy. Our experience at eMusic taught us that music is, in fact, elastic, and that lower prices lead to increased sales. If the major labels want to see the recorded music business grow again, I believe the price of music must fall.
http://recode.net/2014/03/18/the-price-of-music/





Streaming Subscriptions Are Now a Billion-Dollar Business, but Music Sales Stall
Peter Kafka

Music subscription services have been around for a long time. Now they’re finally a real business: Companies like Spotify, Deezer and even Google generated more than $1.1 billion in music subscription revenue last year.

But the global music industry, which had finally perked up last year after a decade-plus slide, drooped again.

Worldwide wholesale revenue declined 3.9 percent, to $15 billion, in 2013, according to the music trade group the International Federation of the Phonographic Industry. Much of that drop comes from Japan, where sales plummeted nearly 17 percent. If you strip out Japan’s results, sales would have been down 0.1 percent.

Flat — or, at least, a slower rate of decline — has been the new up for many years for the music business, so this won’t be terribly discouraging for the industry. And the fact that the IFPI estimates there are now 28 million people paying a monthly fee for digital music subscriptions is legitimately good news.

It does look like the industry may hover around this level for a while, though. The industry still gets a majority of its revenue from the sale of CDs, so those numbers will be shrinking for a long time. And even if subscription services, which generally charge around $10 a month in markets like the U.S. for all-you-can-eat access, find more customers, other digital revenue streams are sputtering.

The growth of download sales, primarily from Apple’s iTunes, has been slowing for a while, and last year the market declined by 2.1 percent — the first time the IFPI has recorded a drop. Downloads still account for 67 percent of digital sales, though.

Meanwhile, the ringtone industry — which generated a surprising amount of money for the music business for a surprising amount of time — is on its last legs. In 2008, “mobile” sales of ringtones and other digital novelties accounted for 26 percent of music’s digital revenue. Last year that number had shrunk to five percent.

Separately, the Recording Industry Association of America, the U.S.-based music trade group, reported similar statistics for 2013: Download sales were down one percent, while subscription services grew at a blistering 57 percent.

The RIAA says revenue for all streaming services — including ad-supported offerings like Pandora and YouTube — hit $1.4 billion last year, while subscription-only on-demand services like Spotify hit $628 million. Bear in mind that those numbers don’t sync up with the IFPI numbers, primarily because the RIAA reports retail instead of wholesale revenue.
http://recode.net/2014/03/18/streami...c-sales-stall/





Pandora Raises One Price to $4.99 Per Month for New Subscribers and Scraps Annual Option, Blames Royalty Rates
Emil Protalinski

Pandora today announced changes to its Pandora One subscription plan, which charges users to listen without advertising. The company is keeping the same pricing for existing monthly subscribers, but new subscribers and annual subscribers have to pay more.

Here is how the changes are broken down:

• Existing Pandora One monthly subscribers that remain active will not experience a price increase “at this time” and will continue to pay $3.99 per month.

• For new subscribers, the subscription price will change to $4.99 per month starting in May.

• Existing annual subscribers that remain active will migrate to a discounted loyalty price of $3.99 per month at their next renewal period.

• Pandora is ending the annual subscription option. The company plans to notify annual subscribers who are approaching their renewal date of the new pricing this week.

Since monthly pricing for new subscribers only takes effect in May, you can take advantage of discounted loyalty pricing if you subscribe in the next month or so. That being said, this doesn’t mean that the old pricing structure will be around forever: Pandora is merely keeping it around for now.

The One subscription debuted in 2009 and hadn’t changed in price until now. It was offered for $36 per year as well as $3.99 per month (which comes out to about $48 per year). The new pricing means new subscribers will end up paying about $60 per year.

Pandora says that over the last five years, the costs of delivering its service have grown “considerably.” Royalty rates the company pays to performers via SoundExchange, for example, have increased by 53 percent. Between 2014 and 2015, they will jump another 9 percent.
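
As a quick sanity check on those numbers, a minimal sketch; the inputs come from the article, while the assumption that the 9 percent royalty jump compounds on top of the earlier 53 percent increase is ours:

```python
# Annual cost of each Pandora One plan, using the article's prices.
old_annual = 36.00             # original annual option
old_monthly = 3.99 * 12        # about $47.88/yr on the old monthly plan
new_monthly = 4.99 * 12        # about $59.88/yr for new subscribers from May

print(f"old annual plan:  ${old_annual:.2f}/yr")
print(f"old monthly plan: ${old_monthly:.2f}/yr")
print(f"new monthly plan: ${new_monthly:.2f}/yr")

# Royalty growth: up 53% over five years, plus another 9% for 2014-2015
# (assumed here to compound on top of the earlier increase).
print(f"royalty growth: {1.53 * 1.09:.2f}x")
```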

Pandora notes that only a small percentage of its listeners (3.3 million subscribers of more than 250 million registered users total) are affected by these changes. Still, it’s telling that the most popular music streaming service in the US has to increase its prices to keep up with its expenses.
http://thenextweb.com/insider/2014/0...-rates/#!AxA64





New Tribune Mobile App Creates 'Playlists' of Streaming Audio News
Alexei Oreskovic

A new smartphone app developed by the Tribune Co will read aloud a personalized "playlist" of news articles along with weather and traffic updates, as the media organization looks for new ways to reach consumers.

The free Newsbeat app released on Thursday is the first product developed by the year-old Tribune Digital Ventures group headed by former Yahoo Inc executive Shashi Seth.

The app uses text-to-speech technology and recordings of humans reading news articles to produce a daily catalog of roughly 7,000 articles about everything from world news to sports. Similar to the way digital music streaming services such as Pandora Media Inc operate, the Newsbeat app determines the stories most relevant for different users based on their preferences and habits.

Seth said the app, which is available for iPhones and Android smartphones, is good for commuters. The streaming news playlist is customized for the length of each user's commute, based on the addresses they enter into the app. "There is a very large amount of people's undivided attention that you can get," said Seth.
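
Tribune hasn't published how Newsbeat assembles these playlists, but a commute-length fill is easy to picture. Here is a hypothetical sketch: greedily pack the highest-scoring stories into the available minutes. All names and scores below are invented for illustration, not Tribune's implementation.

```python
from dataclasses import dataclass

@dataclass
class Story:
    title: str
    minutes: float    # audio length
    relevance: float  # hypothetical personalization score, higher is better

def build_playlist(stories: list[Story], commute_minutes: float) -> list[Story]:
    """Greedily fill the commute with the most relevant stories that fit."""
    playlist, remaining = [], commute_minutes
    for story in sorted(stories, key=lambda s: s.relevance, reverse=True):
        if story.minutes <= remaining:
            playlist.append(story)
            remaining -= story.minutes
    return playlist

catalog = [Story("World news brief", 3.0, 0.9),
           Story("Local traffic update", 1.5, 0.8),
           Story("Sports recap", 4.0, 0.4)]
print([s.title for s in build_playlist(catalog, commute_minutes=5.0)])
# -> ['World news brief', 'Local traffic update']
```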

The news articles include stories produced by Tribune's newspapers, which include the Los Angeles Times and the Chicago Tribune, as well as content from websites and publishers that Tribune has partnered with. The customized playlists of news stories include audio ads about every 10 minutes.

Tribune Co is preparing to separate its slow-growth newspaper assets from its lucrative broadcast TV properties later this year. Newspapers across the United States face challenges including shrinking advertising revenue and readers who prefer to get their news on mobile devices.

(Reporting by Alexei Oreskovic; Editing by Cynthia Osterman)
http://www.reuters.com/article/2014/...A2J10Z20140320





Is The Supreme Court About To Rule That Software Is Ineligible For Patent Protection?
Tim Holbrook

Anyone with an iPhone loves how easy it is to use: pinching a photo with two fingers or double-tapping the screen to zoom in on a picture or page. At present, these tools are protected by patents. But things could change dramatically. On March 31, 2014, the Supreme Court will hear Alice Corp. v. CLS Bank to determine when, if ever, computer software is eligible for patent protection. If the Court decides that software is not eligible, it will destroy numerous patents in the software field. Given the Supreme Court’s recent patent activity, many wonder whether, whatever its intentions, the Court is actually hurting innovation with its decisions in patent law.

Something about patent law has grabbed the Court’s attention. Since 2000, the Supreme Court has taken thirty patent-related cases (six in its October 2013 term alone). In contrast, the Court has taken nine copyright and nine trademark/false advertising cases (two of each in its 2013 term). To some, the dominance of patent cases may not seem surprising, given the importance of technology and innovation in our current economy.

But in fact, all of this activity is extremely odd. The United States has an expert “patent court,” the United States Court of Appeals for the Federal Circuit, which hears every appeal from around the country in cases arising under the nation’s patent laws. From California to Florida, any patent appeal goes to the Federal Circuit. Unlike in other areas of the law, the Federal Circuit creates nationally uniform legal standards for patent law. One of the key reasons the Supreme Court takes a case is that lower courts disagree on a legal issue, but with patent law’s single appellate court, no such splits arise.

Yet, such a disproportionate number of patent cases suggests something is going on. The Supreme Court may fear that the Federal Circuit has developed a pro-patent bias. Before his retirement, Supreme Court Justice John Paul Stevens noted, “occasional decisions by courts with broader jurisdiction will provide an antidote to the risk that the specialized court may develop an institutional bias.” The Supreme Court might think that patents have become too powerful, harming rather than helping innovation.

But many believe the Court’s activity itself is truly harming innovation by creating legal uncertainty. One area of particular concern is subject matter eligibility: what types of technology are eligible for patent protection. Since 2009, the Supreme Court has visited this issue three times and will address it again this term. In 2010, the Court found a method of hedging risk, a business method, was not eligible for patent protection because it was an abstract idea. Then, in 2012, the Court concluded that a method of optimizing drug dosage was ineligible because it was a law of nature. Last year, the Court found that patents covering naturally occurring DNA are not eligible. Significantly, the Court did find that synthesized, complementary DNA is eligible, the only time in these cases where the Court concluded the technology should be patentable.

Some view these decisions as problematic because they have withdrawn patent protection from certain areas. But the real problem is the opaqueness of these decisions. The Supreme Court has offered no clear rules, just vague generalities. Clarity in legal rules is important to innovation: companies and innovators must decide whether, and in what areas, they should invest their finite time and resources in research and development. Part of that decision-making process is determining whether an invention is patentable. The Supreme Court’s decisions on eligible subject matter have made this area an unpredictable mess, making it difficult for companies to make these evaluations.

Confirming this untoward state of affairs is the Federal Circuit’s decision in Alice. Trying to apply the Supreme Court’s precedent, the Federal Circuit was hopelessly divided, generating over 125 pages and a myriad of decisions with inconsistent reasoning. If the “expert” court can’t figure out what the Supreme Court decisions mean, it is virtually impossible for business persons, scientists, engineers, and other innovators to do so.

This is not to say that the Supreme Court should stay away from patent law completely. I myself have advocated for the Court to review the Federal Circuit’s decisions, and I am no fan of gene patents. And, to be fair, the Supreme Court is not always anti-patent. It has confirmed that invalidating a patent in litigation is difficult, requiring a heightened level of proof. The Supreme Court also made it easier for patent owners to prove patent infringement.

But, in the area of patentable subject matter, the Supreme Court’s decisions have been a disaster. The Court has created mass confusion, making it almost impossible to discern whether certain innovations, particularly as to software, are patentable. Alice provides the Supreme Court an opportunity to provide guidance in the law, particularly in the software industry. Let’s hope the Court takes it. It is time for some clarity – innovation depends on it.
http://www.forbes.com/sites/realspin...nt-protection/





EU Net Neutrality Vote Would Let ISPs Charge for Internet “Fast Lane”

Ban on roaming charges helps push through controversial net neutrality package.
Jon Brodkin

A European telecom law approved by a committee today is intended to prevent Internet service providers from blocking or slowing down Web applications, but lets ISPs charge content providers for higher quality of service.

Critics say this allowance will create an Internet "fast lane" and undermine the principle of net neutrality: that Internet service providers should treat all traffic equally. The European Parliament's Industry Committee announced its vote in favor of the "Connected Continent" legislation, saying that "Internet providers should no longer be able to block or slow down Internet services provided by their competitors."

Under the heading, "Net neutrality," the committee announcement said it "inserted strict rules to prevent telecoms companies from degrading or blocking Internet connections to their competitors’ services and applications. In 2012, for example, EU telecoms regulator BEREC reported that several internet providers were blocking or slowing down services like 'Skype.'"

The European Telecommunications Network Operators' Association (ETNO), a telco lobby group, criticizes the restrictions as too severe, saying, "This would make an effective management of the network almost unworkable."

On the other side of the aisle, consumer advocates are worried about an exception in the legislation for "specialized services."

"Companies would still able to offer specialized services of higher quality, such as video on demand and business-critical data-intensive cloud applications, provided that this does not interfere with the Internet speeds promised to other customers," the committee announcement said. "Measures to block or slow down the Internet would be allowed only in exceptional cases, e.g., where specifically ordered by a court."

Charging content providers "will enable telecom operators to generate additional revenue streams from OTT [over the top] actors, content providers as well as from consumers who are willing to pay for better or faster services," a bill description states. "These revenues in turn, will enable operators to finance investments into network upgrades and expansion."

Some European parliament members objected. Marietje Schaake of the Netherlands said the benefits of a stronger rule have been demonstrated in her home country, which passed a net neutrality law in 2012. "For Dutch companies a level playing-field is important; being the frontrunner has its disadvantages as long as not all European companies have to abide by the same rules. Without legal guarantees for net neutrality, Internet service providers were able to throttle competitors. This could push players without deep pockets, such as start-ups, hospitals or universities, out of the market. Today’s vote risks allowing just that," she said.

The net neutrality regulation was proposed by another Dutch politician, Neelie Kroes, the European Commissioner for Digital Agenda. Kroes touted her proposal's "new safeguards to ensure access to the open Internet. Today, millions of Europeans find services like Skype blocked, or their Internet access degraded: my proposal will end those discriminatory practices. Extra new 'specialized services' (like for IPTV, e-Health, or cloud computing) would be allowed only if they don't cause general impairment of regular Internet access."

The full parliament is scheduled to vote on the proposal April 3.

Consumer-friendly ban on roaming charges

In addition to net neutrality, the legislation would ban most mobile roaming charges. "A broad majority of the committee members backed plans to ban 'roaming' charges within the EU as of 15 December 2015," the committee announcement said. "However, to protect telecoms companies against 'anomalous or abusive usage of retail roaming services,' MEPs [Members of the European Parliament] ask the European Commission to lay down guidelines for exceptional cases in which companies would be allowed to apply charges. These charges would, however, have to be below the caps laid down in current roaming rules."

Schaake said the popularity of the roaming charge restriction helped get the legislation passed, despite its warts. "The whole package has been rushed through by Parliament because abolishing roaming costs is a nice message to campaign on," she said.

Non-government organizations that oppose the legislation set up a website titled, Save the Internet. "Right now big companies like Microsoft and Facebook are on the same level as our Blogs or Podcasts," the site says. "But if the definition of 'specialized services' isn't fixed, these companies will be in the fast lane on the data highway, leaving start-ups and non-commercial websites like Wikipedia in the slow lane."

Save the Internet claimed that the regulation will let ISPs "block content without any judicial oversight." The committee's announcement disputes that, but a Save The Internet spokesperson argued that "Although the proposal explicitly prohibits blocking and throttling, it authorizes new forms of discrimination that would have the same effect, by allowing exclusive and restrictive commercial deals between Internet access providers and service providers."

The group also believes the rules are vaguely written and could be interpreted in harmful ways, and pointed to legislation text that says:

Within the limits of any contractually agreed data volumes or speeds for internet access services, and subject to the general quality characteristics of the service, providers of internet access services shall not restrict the freedoms provided for in paragraph 1 by blocking, slowing down, altering or degrading specific content, applications or services, or specific classes thereof, except in cases where it is necessary to apply (deletion) traffic management measures. Traffic management measures shall not be applied in such a way as to discriminate against services competing with those offered by the provider of Internet access. Traffic management measures shall be transparent, non-discriminatory, proportionate and necessary in particular to:

a) implement a court order;
b) preserve the integrity and security of the network, services provided via this network, and the end-users' terminals;
c) prevent the transmission of unsolicited commercial communications to end-users;
d) prevent network congestion or mitigate the effects of temporary or exceptional network congestion provided that equivalent types of traffic are treated equally.

European Digital Rights Executive Director Joe McNamee, a Save The Internet member, told Ars that the line preventing discrimination "against services competing with those offered by the provider of Internet access" could mean that "traffic management can be applied against other services." He also pointed to the "contractually agreed data volumes or speeds" section, saying, "if there is a low contractually agreed data volume and discrimination is only explicitly forbidden within any contractually agreed limits, then surely this means that outside a low or non-existing limit, the discrimination is allowed. Or does it? Is this the worst drafting ever?"

In a blog post reacting to today's vote, Save The Internet also criticized the legislation's "broad definition of 'specialized services' that does not provide clear legal guidance to regulators and companies. 'Specialized services' should be limited to services provided by ISPs, such as IPTV, and should not be confused with services on the open internet, like YouTube or Spotify."

In the US, the Federal Communications Commission issued net neutrality rules in 2010 that prevented fixed broadband providers (though not cellular carriers) from blocking and degrading traffic or charging for special access. The rules were overturned by a court decision this year.
http://arstechnica.com/tech-policy/2...net-fast-lane/





Comcast and Time Warner Cable Lost 1.1 Million Video Customers in 2013

Pay-TV isn't going away, though, as AT&T and Verizon boost subscriber numbers.
Jon Brodkin

Comcast, Time Warner Cable (TWC), and all other top cable companies lost pay-TV subscribers in 2013, but the companies were able to boost their total broadband Internet subscribers, according to research by Leichtman Research Group.

Comcast and TWC, the two biggest cable companies in the US, combined for 1.1 million lost video subscribers. Comcast finished 2013 with 21.7 million multi-channel video subscribers, down 305,000 according to Leichtman's research. TWC lost 825,000 video subscribers, dropping to 11.4 million.

"The top nine cable companies lost about 1,735,000 video subscribers in 2013—compared to a loss of about 1,410,000 subscribers in 2012," the research said.

At the same time, Comcast added 1.3 million broadband Internet subscribers, to hit a total of 20.7 million. TWC gained 211,000 broadband subscribers, to bring its total to 11.6 million. Comcast's 1.3 million broadband subscriber gain accounted for "49 percent of the total net additions for the top providers in the year," the research said.

A Comcast spokesperson noted that the video subscriber loss of 305,000 in 2013 was an improvement over the loss of 336,000 in 2012. Comcast turned things around in Q4 2013 with a net gain of 43,000 video subscribers, which came after 26 consecutive quarters of subscriber losses.

A TWC spokesperson explained its video subscriber losses in a statement to Ars. "We began to move away from too aggressively priced new customer promotions that alienated some subscribers when their promotional price period ended," the statement said. Additionally, "we experienced a larger than usual video subscriber loss last summer from both the CBS/Showtime blackouts as well as from Journal Broadcasting related blackouts in the Midwest."

TWC has "begun significant upgrades in our largest markets this year which will introduce a more dynamic new TV navigation experience and much more content via On Demand," the spokesperson said.

Cutting the cord, but slowly

While some people are "cutting the cord," opting to get video from online streaming sites instead of traditional video packages, the cable industry's losses aren't replicated in the rest of the multi-channel video provider market. AT&T's U-verse gained 924,000 video subscribers in 2013 to reach a total of 5.5 million, while Verizon FiOS gained 536,000 to reach a total of 5.3 million. DirecTV gained 169,000 subscribers to move up to 20.3 million, while Dish stayed level at 14.1 million.

While cable companies as a whole lost 1.7 million video subscribers, moving down to 49.6 million, the entire multi-channel video market lost only 104,521 subscribers, bringing the total down to 94.6 million. In 2012, the whole multi-channel video market added 175,000 subscribers.
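
Those figures roughly reconcile, as a quick check shows; all inputs are from the article, and the remaining gap is presumably accounted for by smaller providers not itemized here:

```python
cable_loss   = -1_735_000   # top nine cable companies, 2013
att_uverse   =    924_000
verizon_fios =    536_000
directv      =    169_000
dish         =          0   # "stayed level"

net = cable_loss + att_uverse + verizon_fios + directv + dish
print(f"{net:+,}")  # -106,000, close to the reported market-wide -104,521
```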

Total broadband subscribers in the US grew by 2.6 million in 2013, hitting 84.3 million. Cable accounts for 49.3 million, and "telephone companies" like AT&T, Verizon, and CenturyLink account for the rest. Most of the phone companies saw small gains in broadband subscribers last year.

"[T]he 17 largest cable and telephone providers in the US—representing about 93 percent of the market—acquired over 2.6 million net additional high-speed Internet subscribers in 2013," Leichtman Research Group said. "Annual net broadband additions in 2013 were 95 percent of the total in 2012."

The fact that "broadband is still growing, while video is flat, is largely a function of where the two are in the adoption curves, [rather] than [being] directly related to each other," researcher Bruce Leichtman told Ars in an e-mail. "Much of cable broadband growth is coming from people upgrading from dial-up and switching from DSL, as well as from new household formation."

The cable industry's video subscriber losses provide further incentive to ISPs and content providers to restrict the amount of video content people can get on the Internet. This strategy has been particularly successful with live sports programming—sports nuts who want to cut the cord would have to survive without live broadcasts of their favorite teams' games. The latest example is this month's NCAA basketball tournament, which is only available for online streaming if you also have a pay-TV subscription.

Comcast is attempting to buy TWC for $45.2 billion, a merger expected to get a thorough antitrust review. While the two companies don't compete against each other in individual cities and towns, and Comcast has pledged to divest itself of three million subscribers to appease regulators, the combined buying power of a merged Comcast and TWC would provide further clout in negotiating contracts with content providers.

Comcast and Time Warner have taken somewhat different approaches in marketing video and broadband services, Leichtman told Ars. "Over the past year+ Comcast’s maintained its strategy of focusing on the bundle of video, Internet, and phone, while TWC de-emphasized the bundle and became more willing to get a broadband-only customer," he said. "In the past year, Comcast also added 768,000 phone subscribers, while TWC lost 218,000 residential phone subscribers."

According to Comcast, 79 percent of its video customers at the end of 2013 subscribed to at least two services, while 44 percent subscribed to all three.

Unlike Comcast, TWC has more Internet than video subscribers. However, that changes when business customers are subtracted from the total. "Note that Time Warner’s totals include business accounts—for residential subscribers only, there are still slightly more video customers than broadband (11,197,000 video vs. 11,089,000)," Leichtman said.
http://arstechnica.com/business/2014...omers-in-2013/





What if Netflix Switched to P2P for Video Streaming?
Janko Roettgers

Could Netflix change its video streaming service to use a P2P architecture, in order to save money on content delivery and sidestep peering conflicts with ISPs like Comcast?

That’s a possibility raised by Netflix CEO Reed Hastings in a blog post Thursday, which urged the FCC to make peering part of new net neutrality regulations. ISPs want Netflix to pay for delivering traffic to their customers because the company doesn’t consume as much traffic as it delivers — to which Hastings replied:

“Interestingly, there is one special case where no-fee interconnection is embraced by the big ISPs — when they are connecting among themselves. They argue this is because roughly the same amount of data comes and goes between their networks. But when we ask them if we too would qualify for no-fee interconnect if we changed our service to upload as much data as we download (in other words, moving to peer-to-peer content delivery) — thus filling their upstream networks and nearly doubling our total traffic — there is an uncomfortable silence.”

This brings up an interesting question: Could Netflix actually do that?

Could a service like Netflix stream videos via P2P?

P2P is best known for file sharing — think Napster’s MP3 swapping and movie downloads from the Pirate Bay, or even licensed torrent downloads, courtesy of BitTorrent Inc. At its core, it just means that users don’t access data from a central server, but instead exchange it between one another — and that same technology can easily be used for video streaming as well.

In fact, Chinese video services used P2P as their primary distribution mechanism for video streams for years. The Chinese internet was traditionally fragmented, with infrastructure being centered around a few major state-owned telecommunications companies. Reaching consumers with adequate speeds to stream video would have required significant investment from video service providers, which is why many of them decided to distribute P2P streaming clients instead.

Services like PPStream, PPLive and Xunlei all used their own P2P software, and even major broadcasters like CCTV used P2P to reach millions of viewers during major sporting events with higher reliability and lower costs than a server-based architecture could have afforded them. Only in recent years has there been a trend toward central architectures for some of these offerings.

In the U.S., P2P was also used for some time to power video streaming for CNN and others, but falling bandwidth costs and the unwillingness of consumers to install plugins or clients for streaming led most services to switch to a central architecture. Most recently, BitTorrent shut down its efforts to bring P2P live streaming to desktop PCs, and decided to focus on mobile devices instead.

Would P2P really double Netflix’s traffic?

Hastings suggested Thursday that P2P would “nearly double” Netflix’s traffic. That assessment was obviously meant to put pressure on ISPs, and a closer look shows that the math isn’t all that clear.

When a Netflix subscriber watches an episode of House of Cards in HD, he consumes about 3GB of data. If that same subscriber were also to upload that very same data to someone else to distribute it in a P2P fashion, it would lead to a total consumption of 6GB. Right?

Well, not so fast. First of all, by getting the data from the first user, the second subscriber wouldn’t access House of Cards from Netflix’s servers, which would mean that in total, about the same amount of data would change hands. And in reality, there wouldn’t just be two people watching the same content, but likely thousands, ideally leading to only incremental data consumption increases for each consumer. With a slightly larger overhead, there would be some traffic increase, but it’s very unlikely that this number would approach 100 percent.
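
A minimal back-of-envelope model makes this point concrete. The swarm size and the 5 percent protocol overhead below are assumptions, not Netflix figures:

```python
EPISODE_GB = 3      # one HD episode, per the article
VIEWERS = 1_000     # assumed number of viewers of that episode
OVERHEAD = 1.05     # assumed P2P protocol overhead

# Central delivery: Netflix's servers upload one copy per viewer.
cdn_gb = EPISODE_GB * VIEWERS

# P2P delivery: each viewer still downloads one copy, but the uploads
# now come from other viewers instead of Netflix's servers.
p2p_gb = EPISODE_GB * VIEWERS * OVERHEAD

print(f"CDN: {cdn_gb:,.0f} GB cross the network, all server-to-user")
print(f"P2P: {p2p_gb:,.0f} GB cross the network, nearly all peer-to-peer")
# Total volume grows only by the overhead factor; what changes is that
# subscribers' upstream links now carry it.
```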

Peering and the last mile: So close, yet so far

The real question here isn’t whether the total amount of bits caused by Netflix viewing would increase, but what the impact on peering as well as the last mile would be. Hastings suggested that switching to P2P could essentially lead to a world in which Netflix viewers would send as much traffic from an ISP’s network to other networks as they would consume. The real impact on peering would largely depend on the P2P architecture used.

Back when BitTorrent and other file-sharing technologies had a larger impact on ISP networks, some P2P developers banded together to propose a technical solution for this very problem. Dubbed P4P, it gave ISPs a way to steer the flow of file sharing traffic to make sure that users connected to geographically closer peers, or peers on networks that allowed them settlement-free peering. So if Netflix and ISPs cooperated, they could make P2P work — but given the current situation, that’s a big if.
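
In code, the P4P idea boils down to ranking candidate peers by how cheap they are for the ISP to reach. The sketch below is hypothetical (the AS numbers and cost classes are invented); real P4P relied on network maps published by the ISP:

```python
from dataclasses import dataclass

@dataclass
class Peer:
    address: str
    asn: int          # peer's autonomous system
    same_metro: bool  # geographically close?

LOCAL_ASN = 64500                  # hypothetical: the ISP's own AS
SETTLEMENT_FREE = {64500, 64501}   # hypothetical settlement-free neighbors

def p4p_cost(peer: Peer) -> int:
    if peer.asn == LOCAL_ASN:
        return 0   # traffic stays on the ISP's own network
    if peer.asn in SETTLEMENT_FREE:
        return 1   # crosses a settlement-free peering link
    return 2       # paid transit: most expensive for the ISP

def pick_peers(candidates: list[Peer], n: int) -> list[Peer]:
    # Cheapest cost class first; within a class, prefer nearby peers.
    return sorted(candidates, key=lambda p: (p4p_cost(p), not p.same_metro))[:n]

swarm = [Peer("10.0.0.2", 64500, True),
         Peer("198.51.100.7", 64499, False),
         Peer("203.0.113.9", 64501, True)]
print([p.address for p in pick_peers(swarm, 2)])
# -> ['10.0.0.2', '203.0.113.9']
```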

The other pain point is the last mile. Back in 2008, Comcast admitted to throttling BitTorrent. It argued that file sharers were consuming too much bandwidth on the local level, causing network congestion for their neighbors. Comcast eventually moved away from these measures and towards data caps, and BitTorrent changed the protocol of its clients to be more aware of the state of the network and yield to other traffic. But if Netflix flipped the switch on P2P tomorrow, it could put lots of stress on the last mile, which could be the real choke points for ISPs.

What about mobile and TVs?

One of the challenges for Netflix would be that more than 80 percent of its traffic comes from mobile and connected devices. Distributing a P2P plugin to PCs is relatively simple, but making it work on the Xbox One could be significantly more challenging. P2P has been done on mobile devices, and adding a P2P component to Netflix’s mobile apps should be possible, even though issues like data caps on mobile plans as well as battery life would have to be addressed.

But the real issue could be making this work in the living room, where Wi-Fi could become another choke point. Consumers frequently use older networking equipment they got from their ISPs, and getting adequate bandwidth for HD video streaming is already a challenge for many. Now imagine that their smart TVs were also uploading bits and pieces as they streamed Orange is the New Black, and you can see that they’d frequently end up with congestion in the home. Some consumers might go out and finally buy a new router, but many would just blame Netflix if their streaming looked worse.

In the end, Netflix switching to P2P is nothing more than an academic exercise. Yes, it would be possible, and yes, it would save the company some money. But with the large number of Netflix users and the wide variety of devices they use to watch Netflix, P2P would also bring up a whole range of new problems.
http://finance.yahoo.com/news/netfli...180229987.html





How a Laser Beam Could Quadruple the Speed of the Internet
Brian Fung

We've heard a lot about how Netflix wants to improve download speeds for viewers by partnering with Comcast and other Internet providers. The central issue is about how to carry large video streams efficiently from one part of the Internet to another. But someday, the technology behind that infrastructure could make those pipes much, much bigger, helping to alleviate those concerns.

Researchers from the California Institute of Technology say they've come up with a new kind of laser that's capable of quadrupling the bandwidth on today's fastest fiber optic networks. These networks make up what's known as the Internet "backbone," the behind-the-scenes network that delivers content to ISPs like Verizon — who in turn make that content available to you.

What do lasers have to do with the Internet? In today's most advanced networks, which rely on fiber optic technology, data is transmitted as light rather than electrical signals. On traditional copper-wire networks, those signals don't travel as fast and tend to degrade more easily over long distances. So light offers an inherent advantage.

Today's best backbone technology is capable of staggering bandwidth — in some cases up to 400 Gbps. For perspective, that's more than 40,000 times the speed of the average American's home connection. (Take that comparison with a grain of salt: Most Americans will never need the capacity of a backbone connection. Even the fastest consumer plans top out at 1 Gbps these days.)

But the new laser technology, developed in part by National Medal of Science winner Amnon Yariv, promises to quadruple bandwidth in the existing Internet backbone, if not more.

"Our first run lasers, fabricated at Caltech, are capable of of a 4x increase in the number of bytes-per-second carried by each channel," Yariv, whose research was published recently in the Proceedings of the National Academy of Sciences, said in an e-mail. "This number will increase with our continuing work, but even at this level, the economic advantages are very big."

The more efficient laser is a marked improvement over existing ones in that it operates closer to a single frequency than any other yet created. The purity of the beam allows it to carry more data.

Yariv compares the laser to a highway. The highway has a set number of lanes, and carries a certain number of trucks every day. With the Caltech upgrade, the trucks will now be able to carry four times the tonnage on the same highway.

"A single channel carrying today, say, 40 Gbps, will go to 160 Gbps," said Yariv. Applied to top-of-the-line networks, that might mean eventual backbone speeds of 1,600 Gbps or more. (Just for fun, that's 164,000 times faster than the 10 Mbps connection serving the average American home today.)

The discovery isn't likely to benefit individual Internet users like you and me in a huge way — at least, not directly. Internet subscribers are limited largely by the plan they've purchased from their ISPs. If you're paying for a 15 Mbps connection, for instance, you aren't suddenly going to be upgraded to a 60 Mbps plan.

Still, dramatically expanding the rate at which data can be routed through the Internet to your ISP could have downstream implications for companies like Netflix. It'll also make a big difference as our homes become smarter, more connected and more automated, and as phone companies transition away from copper networks to carrying calls over fiber optic networks, as well.
http://www.washingtonpost.com/blogs/...-the-internet/





UK Gov Wants 'Unsavoury' Web Content Censored
Liat Clark

The UK minister for immigration and security has called for the government to do more to deal with "unsavoury", rather than illegal, material online.

James Brokenshire made the comments to the Financial Times in an interview related to the government's alleged ability to automatically request YouTube videos be taken down under "super flagger" status.

A flagger is anyone who uses YouTube's reporting system to highlight videos that breach guidelines. The Home Office explained to Wired.co.uk that the Metropolitan Police's Counter Terrorism Internet Referral Unit (CTIRU), responsible for removing illegal terrorist propaganda, does not have "super flagger" status, but has simply attained the platform's Trusted Flagger accreditation -- a status for users who regularly correctly flag questionable content.

The FT published its article in the context of growing concerns about the radicalisation of Britons travelling to partake in the ongoing conflict in Syria, and the Home Office told Wired.co.uk any videos flagged by the CTIRU for review were ones found to be in breach of counter-terrorism laws (29,000 have been removed across the web since February 2010).

This seems to be the impetus for the kinds of extended controls Brokenshire told the FT the government should be looking into, namely, dealing with material "that may not be illegal but certainly is unsavoury and may not be the sort of material that people would want to see or receive".

"Terrorist propaganda online has a direct impact on the radicalisation of individuals and we work closely with the internet industry to remove terrorist material hosted in the UK or overseas," Brokenshire told Wired.co.uk in a statement.

YouTube already has a flagging system in place for just these purposes, and will review every complaint. However, with 100 hours of video being uploaded to the site every minute, the concern is that there is no feasible way of playing whack-a-mole fast enough. This is one issue. How a member of government could propose that the authorities do more to deal with material that is simply "unsavoury", though, is another matter entirely. And it's hard to see how any suggestion of this kind is not censorship.

"It is [censorship]," Jaani Riordan, a barrister specialising in technology litigation, told Wired.co.uk. "Removal of lawful material by government simply because it offends governmental or public policy is without justification. Conversely, a private enterprise, such as YouTube, would always remain free to remove content which offends its Terms of Use or other policies, and there is very limited if any recourse against it for doing so."

If the government were to force YouTube to remove content, it would be breaching Article 10(2) of the European Convention of Human Rights, related to freedom of expression.

This is why, as with self-harm content or even explicit content, the government prefers to put pressure on private companies to self-censor. In the situation in which we find ourselves today, it is of course impossible to know where these lines will eventually be drawn.

In his interview with the FT, Brokenshire says the government is considering a "code of conduct" for internet service providers and companies, and a potential system whereby search engines and social media platforms actually alter their algorithms so that said "unsavoury" content is less likely to appear.

"Google has already modified its algorithms to accommodate government and rights-holder requests," says Riordan. "For example, penalising sites with a high number of takedown requests and removing auto-complete suggestions for pirate content. These changes would likely be couched in terms of helping consumers find relevant content. It's a dangerous precedent."

Furthermore, the government has already piled pressure on service providers to offer internet filters. It is the steady expansion of those filtering systems -- introduced to block child abuse content, then extended to pornography, and now to everything from nudity to alcohol-related content -- that is gravely concerning.

"Through proposals from the Extremism Taskforce announced by the Prime Minister in November, we will look to further restrict access to material which is hosted overseas -- but illegal under UK law," Brockenshire told Wired.co.uk in a statement. But, there was more: "...and help identify other harmful content to be included in family-friendly filters."

The Home Office told Wired.co.uk the government has no interest in preventing access to legitimate and legal material (though we're not sure why the word "legitimate" was necessary -- the government need only be concerned with legality). But it went on to say that some extremist content, even though legal, can be distressing and harmful. As such, it is working with industry to support its efforts in identifying material that can be included in "family-friendly filters", which can be turned off if the user wishes.
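
Mechanically, an opt-out "family-friendly" filter of this kind is just a per-account category blocklist with the default set to on. A minimal sketch, in which the category list and names are assumptions for illustration:

    # Categories reportedly swept into ISP filters; this list is illustrative.
    FILTERED_CATEGORIES = {"pornography", "self-harm", "extremism", "alcohol"}

    def is_blocked(url_category, account_opted_out=False):
        """Default-on filtering: blocks unless the subscriber has opted out."""
        return not account_opted_out and url_category in FILTERED_CATEGORIES

    print(is_blocked("extremism"))                          # True (the default)
    print(is_blocked("extremism", account_opted_out=True))  # False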

The government had already admitted in January, mere weeks after ISP filters had been implemented, that they were inadvertently blocking content.

"The government has a history of conjuring the spectre of 'regulation,'" said Riordan. "It didn't work in 2008 and 2009, when the government sought to encourage ISPs to agree a code of conduct on repeat copyright infringers; this culminated in the Digital Economy Act. CleanFeed blocking of the [Internet Watch Foundation] watchlist is voluntary, but clearly encouraged. One could easily imagine similar threats being made in relation to filtering of extremist materials. The potential for mission creep is extremely concerning."

Referring to Brokenshire's comment that the government will help industry "identify other harmful content", the barrister added: "The passage is certainly suggestive of mission creep -- and potentially of great impact, since opt-out filtering now affects the vast majority of British residential internet connections. But I think blocking (assuming no false positives) is probably less harmful than outright removal at source... Of course, the lack of clarity and coherence in content policy is itself deeply concerning."

The Home Office told Wired.co.uk that a large part of the new effort by the CTIRU is to be centred on taking down "terrorist" content overseas, where much of it is being posted. If the police have a good relationship with industry, it's possible that such material can still be taken down; the Home Office said it has such relationships and now wants to use them to tackle these new developments, i.e. the threat of radicalisation.

The Home Office in fact compared the situation to the restriction of child abuse images, which industry (including Google, most prominently, as mentioned earlier) has already conceded to. But the issues the Home Office refers to as "new developments" cover legal material that may be considered "harmful".

"I don't think content should be restricted simply because someone thinks it is 'unsavoury' -- we need to know what criteria would be applied," Jacob Rowbottom from the University of Oxford's law faculty, told Wired.co.uk. The author of " Leveson, Press Freedom and the Watchdogs" said we would need to know if there would be an accountability or legal supervision system for decisions made by the government to flag material or request certain sites receive lower search engine rankings. "There is a danger that informal arrangements between government and private companies can lack sufficient safeguards."

"If there's one thing that remains constant, politicians have proved to be terrible arbiters of taste. If you don't think much of their suits and haircuts, you're not going to think much of what they think acceptable or unsavoury for public consumption," Danny O'Brien, International Director of the Electronic Frontier Foundation, told Wired.co.uk. "We have free speech because there's no one person, no one organisation, and certainly no single political party that can determine what's true, acceptable, unsavoury or revolutionary. The internet is about letting us all speak, and then leaving it to posterity to see who was right and who was wrong. A system that could let us make that judgement in advance wouldn't be an algorithm -- it would be a time machine."

Emma Carr, deputy director of Big Brother Watch, added: "Governments shouldn't be deciding what we can see online. Google must be fully transparent about how the British Government uses this system to reassure people freedom of speech is not being chilled."

For any "code of conduct" -- which the Home Office confirmed it is still looking into -- to be even remotely acceptable, the conversation about its formula has to happen in public. We need to know, points Riordan, how that code will be enforced, who the signatories will be and whether there will be any kind of penalty for non-compliance. "Co-regulation of this kind is nothing new, but might potentially amount to an interference with the freedom to conduct a business under article 16 of the EU Charter of Fundamental Rights if the duties it imposes are onerous and disproportionate."

For O'Brien, no amount of explanation will justify the end result. "In China, they call it 'self-discipline'," he said. "If politicians are terrible censors, companies are even worse at implementing such censorship. They don't have the resources to snoop on every video and picture when dealing with billions of users, and they shouldn't be given that excuse."

One particular comment made to Wired.co.uk by the Home Office did nothing to assuage concerns. It said that although the Met uses the Trusted Flagger scheme to quickly flag illegal terrorist propaganda, YouTube may also choose to remove legal extremist content if it breaches its terms and conditions. The moment the Home Office is speaking on behalf of industry, speculating about what it may or may not do in the context of a proposed code of conduct (which would amount to a new set of terms and conditions for that industry) and more intrusive filtering, a fair few flags of our own pop up.
http://www.wired.co.uk/news/archive/...web-censorship





Brazil to Drop Local Data Storage Rule in Internet Bill
Anthony Boadle

Brazil will drop a controversial provision that would have forced global Internet companies to store data on Brazilian users inside the country to shield them from U.S. spying, a government minister said on Tuesday.

The rule was added last year to proposed Internet governance legislation after revelations that the U.S. National Security Agency had spied on the digital communications of Brazilians, including those of President Dilma Rousseff and the country's biggest company, Petroleo Brasileiro SA.

Instead, the legislation will say that companies such as Google Inc and Facebook Inc are subject to Brazilian laws in cases involving information on Brazilians even if the data is stored abroad, congressional relations minister Ideli Salvatti told reporters.

She said the bill, which is opposed by Rousseff allies in the lower chamber of Congress, has enough support to be put to the vote on Wednesday.

Salvatti said the government will not negotiate a key provision in the bill on net neutrality, which has faced strong opposition from telecom companies in Brazil because it would bar them from introducing differential pricing according to Internet usage and speeds, such as higher rates for downloading videos.

Regulation of the business aspects of the new legislation can be done later by executive decree, she said.

The legislation dubbed Brazil's "Internet Constitution" protects freedom of expression, safeguards privacy and sets limits to the gathering and use of metadata on Internet users.

It ran into opposition from government allies in the PMDB party, Brazil's largest, who opposed the net neutrality provision, while the requirement for in-country data storage had the Internet companies up in arms. They complained it would increase their costs and erect unnecessary barriers in one of the world's largest Internet markets.

However, following the spying revelations based on documents leaked by former NSA contractor Edward Snowden, requiring Internet companies to store data on Brazilians inside the country so that it could be subject to Brazilian laws became a priority for Rousseff.

Documents leaked by Snowden last year included revelations that the NSA secretly collected data stored on servers by Internet companies such as Google and Yahoo Inc.

Facebook has some 70 million users in Brazil, its third biggest market after the United States and India, and Google has a big slice of the local digital advertising market.

The reported espionage using powerful Internet surveillance programs upset relations between the United States and Brazil and led Rousseff to cancel a state visit to Washington in October and denounce massive electronic surveillance of the Internet in a speech to the U.N. General Assembly.

Rousseff and German Chancellor Angela Merkel, another leader allegedly spied on by the NSA, have led international efforts to limit mass electronic surveillance and Brazil will host a global conference on the future of Internet governance next month.

(Additional reporting by Maria Carolina Marcello; Editing by Lisa Shumaker)
http://www.reuters.com/article/2014/...A2I03O20140319





Microsoft Software Leak Inquiry Raises Privacy Issues
Nick Wingfield and Nick Bilton

Technology companies have spent months denying they know anything about broad government spying on people who use their Internet services.

But a legal case filed this week against a former Microsoft employee shows the power these companies themselves have to snoop on their customers whenever they want to.

Microsoft accused the former employee of stealing company trade secrets in the form of software code for the Windows operating system and leaking it to a blogger. In its investigation, the company figured out who revealed the information by reading the blogger's emails and instant messages on his Microsoft-operated Hotmail and instant-messaging accounts.

While Microsoft’s actions appear to have been legal and within the scope of its own policies, its reading of the private online accounts of a customer without a court order was highly unusual and raises questions about its protections for customer data, privacy lawyers say.

“What blogger will use that service now?” said Jennifer Granick, an attorney and director of civil liberties at the Stanford Center for Internet and Society.

Ms. Granick said it appeared that Microsoft’s actions were within the boundaries of the Electronic Communications Privacy Act, which allows service providers to read and disclose customers’ communications if it is necessary to protect the rights or property of the service provider. Still, she called the move by Microsoft “stupid” and said it should raise concerns among bloggers and journalists about using Microsoft Internet services to communicate with their sources.

Microsoft said it had done nothing wrong, but seemed to acknowledge the unusual nature of its actions — as well as a brewing outcry over its methods of investigating the former employee — by saying it would take several new steps to reassure customers that their communications will be private.

In a statement, John E. Frank, a Microsoft vice president and deputy general counsel, said that in the future, if the company had evidence of a crime against Microsoft, it would submit that evidence to an outside lawyer who is a former judge, and that it would conduct a search of private communications only if the judge concluded there was enough evidence to meet the standards for a court order. Companies do not need to obtain court orders to search their own services, Mr. Frank said.

He said the company would also report the number of searches of customer accounts it conducted itself as part of the broader transparency report it publishes periodically on government and court orders.

“The privacy of our customers is incredibly important to us, and while we believe our actions in this particular case were appropriate given the specific circumstances, we want to be clear about how we will handle similar situations going forward,” Mr. Frank said.

Criticism of Microsoft began mounting on Thursday after a report on a news site, Seattlepi.com, about the case. The details were especially troubling for people who saw potential implications for journalists.

“I have never seen a case like this,” said Edward Wasserman, the dean of the Graduate School of Journalism at the University of California, Berkeley. “Microsoft essentially decided that whatever privacy expectation that its own customers supposedly had was basically a dead letter. It simply decided that in its own corporate interest, it can intrude on a person’s email.”

In the case, filed in federal court earlier this week in Seattle, Microsoft accused Alex Kibkalo, the former employee, of leaking the Windows code to a French blogger in 2012. As part of an internal Microsoft investigation at the time, Microsoft examined the blogger’s private Microsoft email and instant messaging accounts and discovered evidence that the blogger had received confidential information and Windows software code from Mr. Kibkalo, who worked as a software architect for Microsoft in Lebanon.

The blogger is not identified in the complaint, but the person is described as specializing in publishing leaked screenshots and other information about Microsoft products on the web. Many of the details in the complaint, which was filed by an F.B.I. agent, were based on Microsoft’s own inquiry.

Mr. Kibkalo, a Russian national, was arrested in Seattle this week. Russell Leonard, a public defender assigned to represent him, did not respond to requests for comment.

The picture that Microsoft’s investigation of the blogger paints is not a flattering one.

The complaint says a Microsoft investigative team interviewed the blogger, during which the person admitted to receiving confidential information and software from Mr. Kibkalo. The blogger also admitted to selling activation keys for Windows Server software on eBay, the complaint said.

Microsoft reached a confidentiality agreement with the blogger, which is why he is not identified in the complaint, according to a person briefed on the matter, who spoke only on condition of anonymity.

Nate Cardozo, a staff lawyer with the Electronic Frontier Foundation, a privacy rights group, said that a number of companies had broad terms of service, but that it was extremely rare for any to actually follow through and sift through a customer’s personal email.

“To see Microsoft using this right to essentially look through a blogger’s email account for evidence of wrongdoing and then turn it over on a silver platter for law enforcement, it is extremely undesirable,” Mr. Cardozo said.
http://www.nytimes.com/2014/03/21/te...cy-issues.html





Attorney General's New War On Encrypted Web Services

How you might be forced to unlock seized packets.

Australia's Attorney-General's department wants new laws to force users and providers of encrypted internet communications services to decode any data intercepted by authorities.

The proposal is buried in a submission (pdf) by the department to a Senate inquiry on revision of the Telecommunications Interception Act.

The Attorney General's submission makes it clear that its proposal is a "preliminary view" that may not align with that of the broader Australian Government, which it says has made "no decision" on any TIA-related revision.

The department argues that the rise of over-the-top (OTT) communications services makes it more difficult to guarantee that intercepted communications will be in an "intelligible" format. The rising adoption of encryption to thwart mass surveillance is irking authorities.

"Sophisticated criminals and terrorists are exploiting encryption and related counter-interception techniques to frustrate law enforcement and security investigations, either by taking advantage of default-encrypted communications services or by adopting advanced encryption solutions," the submission noted.

Though it does not name its key targets, Yahoo!, Google and Microsoft already enable encryption by default for their respective web-based email services. BlackBerry's messaging encryption has also previously been raised as a law enforcement issue.
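
"Encryption by default" here means the webmail session is carried over TLS, something anyone can verify from outside using only Python's standard library. The hostname below is just an example target:

    import socket
    import ssl

    def tls_details(host, port=443):
        """Open a TLS connection and report the negotiated protocol version."""
        ctx = ssl.create_default_context()
        with socket.create_connection((host, port), timeout=10) as sock:
            with ctx.wrap_socket(sock, server_hostname=host) as tls:
                return tls.version(), tls.getpeercert()["subject"]

    version, subject = tls_details("mail.google.com")
    print(version, subject)  # e.g. TLSv1.3, plus the certificate subject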

Under the department's plan, "law enforcement, anti-corruption and national security agencies … [would be able] to apply to an independent issuing authority for a warrant authorising the agency to issue 'intelligibility assistance notices' to service providers and other persons".

The department argues the obligation on service providers would merely "formalise" existing arrangements. However, forcing individual suspects to unlock encrypted messages would be a new power for authorities.

The department sees the scheme acting in a similar way to section 3LA of the Crimes Act, under which authorities can get a warrant that compels an individual to turn over passwords to seized hard drives.

Under 3LA, the individual is compelled to "'provide any information or assistance that is reasonable and necessary’ to allow information held on the device to be converted into an intelligible form", the department said.

The department isn't specific about what it believes individual users could provide authorities that would aid in making sense of encrypted data from internet communication services.

It appears to acknowledge that it could not "compel a person to do something which they are not reasonably capable of doing". Users would also not simply be told to turn over unencrypted content to authorities.

However, the department wants failure to comply with a notice to "constitute a criminal offence, consistent with the Crimes Act." It does not suggest what types of penalties it would seek if users did not help unlock their encrypted communications.

Encryption has been high on the agenda since revelations that the US National Security Agency (NSA) and its British counterparts were surreptitiously targeting encrypted communications on the internet.

Even before those revelations, agencies were known to be hitting up providers of web services to obtain master encryption keys in order to aid interception.
http://www.itnews.com.au/News/375286...-services.aspx





Secretary Johnson Highlights Results of Operation That Dismantled Underground Child Exploitation Enterprise on Tor Network
Release Date:
March 18, 2014

For Immediate Release
DHS Press Office
Contact: 202-282-8010
Photos and b-roll available on http://www.dvidshub.net/unit/ICE

WASHINGTON – Department of Homeland Security (DHS) Secretary Jeh Johnson, joined by representatives from U.S. Immigration and Customs Enforcement (ICE), U.S. Postal Inspection Service (USPIS) and the U.S. Attorney for the Eastern District of Louisiana today highlighted the complete results of one of the largest online child exploitation investigations in the history of ICE, involving victims in 39 states and five countries.

Fourteen men operating a child pornography website on the Darknet’s Onion Router, also known as Tor, have been arrested and charged as part of a conspiracy to operate a child exploitation enterprise, following an extensive international investigation by ICE’s Homeland Security Investigations (HSI) and USPIS. Eleven have been federally charged in the Eastern District of Louisiana and three in other districts. All are in federal custody.

“Every day the men and women of the Department of Homeland Security work to keep our nation safe and a major part of that effort is the work of ICE Homeland Security Investigations, one of the largest investigative agencies with jurisdiction over a wide range of crimes spanning the U.S. and the entire globe,” said Secretary Johnson. “Today’s announcement underscores that work, with one of the largest online child exploitation investigations in the history of ICE, involving victims in 39 states and 5 countries.”

So far, investigators have identified 251 minor victims in 39 states and five foreign countries: 228 in the United States and 23 in the United Kingdom, Canada, New Zealand, Australia and Belgium. Eight of the victims were female and 243 were male. The majority of the victims, 159, were 13 to 15 years old. Fifty-nine victims were 16 or 17; 26 were 10 to 12; four were 7 to 9; one was 4 to 6; and two were 3 years old or younger. All victims have been contacted by law enforcement, and U.S. victims have been offered support services from HSI victim assistance specialists.

“These indictments represent a strong coordinated strike – by Homeland Security, the U.S. Postal Inspection Service, and several U.S. Attorney’s Offices around the country – against child pornography and those who allegedly seek to harm our most vulnerable citizens, our young children,” stated U.S. Attorney Kenneth Allen Polite, Jr.

The website’s primary administrator, Jonathan Johnson, 27, of Abita Springs, La., has been charged with operating a child exploitation enterprise. He admitted to creating multiple fake female personas on popular social networks to target and sexually exploit children and to coaching other child predators in his inner circle to do the same. Jonathan Johnson has been in federal custody since his arrest June 13, 2013, and faces 20 years to life in prison.

“Never before in the history of this agency have we identified and located this many minor victims in the course of a single child exploitation investigation,” said ICE Deputy Director Daniel Ragsdale. “Our agency is seeing a growing trend where children are being enticed, tricked and coerced online by adults to produce sexually explicit material of themselves. While we will continue to prioritize the arrest of child predators, we cannot arrest our way out of this problem: education is the key to prevention.”

“Protecting children from crimes of sexual abuse and exploitation is a priority for the U.S. Postal Inspection Service,” stated Gerald O’Farrell, inspector in charge of Criminal Investigations, National Headquarters. “I’m proud of the work of the Postal Inspection Service and our investigative partners to bring child predators to justice. U.S. Postal inspectors have investigated these crimes for more than a century. While the predators’ use of sophisticated technology has evolved, the core harm has not changed: a child’s lost innocence. We will not lose sight of this, and remain steadfast in our efforts to investigate, apprehend, and assist in the prosecution of those who seek to exploit children via the U.S. mail.”

The underground website was a hidden service board on the Tor network and operated from about June 2012 until June 2013, at which time the site contained more than 2,000 videos and had more than 27,000 members. The website shared webcam-captured videos of mostly juvenile boys enticed by the operators of the site to produce sexually explicit material. Tor enables online anonymity, directing Internet traffic through a volunteer network consisting of thousands of relays to conceal a user’s location.
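
From an application's point of view, Tor is simply a local SOCKS5 proxy: traffic handed to it is bounced through three volunteer relays before leaving the network, so the destination sees the exit relay's address rather than the user's. A sketch using the third-party requests and PySocks packages, assuming a Tor client is already listening on its default port 9050:

    # pip install requests[socks] -- assumes Tor is running locally on port 9050.
    import requests

    proxies = {
        "http": "socks5h://127.0.0.1:9050",   # socks5h: resolve DNS inside Tor too
        "https": "socks5h://127.0.0.1:9050",
    }

    # The Tor Project's check service reports whether the request arrived via Tor.
    resp = requests.get("https://check.torproject.org/api/ip",
                        proxies=proxies, timeout=30)
    print(resp.json())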

The investigation was dubbed ‘Operation Round Table’ and began with the arrest of Jonathan Johnson by USPIS and HSI. Further computer forensic analysis by HSI revealed Jonathan Johnson to be the creator and administrator of the underground website. Ten additional individuals have been arrested and charged in the Eastern District of Louisiana as the primary operators, contributors and producers of material for the child exploitation enterprise:

• Daniel Nolan Devor, 39, of Brunswick, Ga., charged with conspiracy to produce child pornography, distribution of child pornography and receipt of materials involving the sexual exploitation of minors

• John C. Foster, 44, of Tipp City, Ohio, charged with conspiracy to produce child pornography, distribution of child pornography, and receipt of materials involving the sexual exploitation of minors

• Aung Gaw aka Michael Gaw, 25, of Fremont, Calif., charged with receipt of child pornography

• Vittorio Francesco Gonzalez-Castillo, 26, of Tucson, Ariz., charged with conspiracy to produce child pornography

• Sean Jabbar, 32, of Minneapolis, Minn., charged with receipt of child pornography

• Christopher Jamieson, 30, of Douglassville, Ga., charged with receipt of child pornography

• Andrew Korpal, 29, of Granger, Ind., charged with receipt of child pornography

• Nicholas Saine, 27, of Seattle, Wash., charged with receipt of child pornography

• Christopher Schwab, 25, of New Orleans, charged with production of child pornography, distribution of child pornography, and receipt of child pornography

• Stanley Zdon, III, 27, of Tuckerton, N.J., charged with conspiracy to produce child pornography

Roy Naim, 30, of Brooklyn, N.Y., was charged in the Eastern District of New York with conspiracy to produce child pornography, attempted sexual exploitation of a child, receipt of child pornography, and possession of child pornography. Minh Vi Thong, 30, of Denver, Colo., was charged in the District of Colorado with production of child pornography, distribution of child pornography, and possession of child pornography. Michael Eales, 24, of Westby, Wis., was charged in the Western District of Wisconsin with production of child pornography. He was sentenced Oct. 29, 2013, to serve two concurrent 30-year terms in federal prison, followed by a lifetime of supervised release, for manufacturing child pornography.

More than 300 investigations have been opened into potential subscribers of the website: 150 in the United States and 150 overseas. Investigators anticipate ongoing arrests and additional identification of victims as they continue to examine and analyze the more than 40 terabytes of data seized.

The prosecution of this case in the Eastern District of Louisiana is being handled by Assistant U.S. Attorney Brian M. Klebba, Fraud Unit Chief and Project Safe Childhood Coordinator.

Substantial assistance in this ongoing case is being provided by the National Center for Missing & Exploited Children, the U.S. Department of Justice’s Child Exploitation and Obscenity Section, and the Royal Canadian Mounted Police.

Operation Round Table was conducted as part of HSI’s Operation Predator to identify and rescue victims of online sexual exploitation and to arrest their abusers as well as others who own, trade and produce images of child pornography.

Last fiscal year, 2,099 child predators were arrested by HSI on criminal charges related to the online sexual exploitation of children. In 2012, 1,655 child predators were arrested, 1,335 were arrested in 2011, and 912 were arrested in 2010. Since 2003, HSI has initiated more than 29,000 cases and arrested more than 10,000 individuals for these types of crimes. HSI encourages the public to report suspected child predators and any suspicious activity through its toll-free hotline at 1-866-DHS-2-ICE or by completing its online tip form. Both are staffed around the clock by investigators. Tips can also be made through HSI’s Operation Predator smartphone app, which can be downloaded at http://bit.ly/1eixbIM. Suspected child sexual exploitation or missing children may be reported to the National Center for Missing & Exploited Children, an Operation Predator partner, via its toll-free 24-hour hotline, 1-800-THE-LOST.

For more information, visit www.ICE.gov.
http://www.dhs.gov/news/2014/03/18/s...erground-child





NSA Surveillance Program Reaches ‘Into the Past’ to Retrieve, Replay Phone Calls
Barton Gellman and Ashkan Soltani

The National Security Agency has built a surveillance system capable of recording “100 percent” of a foreign country’s telephone calls, enabling the agency to rewind and review conversations as long as a month after they take place, according to people with direct knowledge of the effort and documents supplied by former contractor Edward Snowden.

A senior manager for the program compares it to a time machine — one that can replay the voices from any call without requiring that a person be identified in advance for surveillance.

The voice interception program, called MYSTIC, began in 2009. Its RETRO tool, short for “retrospective retrieval,” and related projects reached full capacity against the first target nation in 2011. Planning documents two years later anticipated similar operations elsewhere.

In the initial deployment, collection systems are recording “every single” conversation nationwide, storing billions of them in a 30-day rolling buffer that clears the oldest calls as new ones arrive, according to a classified summary.

The call buffer opens a door “into the past,” the summary says, enabling users to “retrieve audio of interest that was not tasked at the time of the original call.” Analysts listen to only a fraction of 1 percent of the calls, but the absolute numbers are high. Each month, they send millions of voice clippings, or “cuts,” for processing and long-term storage.
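
A 30-day rolling buffer of this kind is an ordinary data structure: append new items, evict anything older than the window, and answer queries by time range. The sketch below is purely illustrative and implies nothing about the NSA's actual implementation:

    import time
    from collections import deque

    WINDOW_SECONDS = 30 * 24 * 3600  # 30-day retention window

    class RollingBuffer:
        def __init__(self, window=WINDOW_SECONDS):
            self.window = window
            self._items = deque()  # (timestamp, payload), oldest first

        def add(self, payload, now=None):
            now = time.time() if now is None else now
            self._items.append((now, payload))
            cutoff = now - self.window
            while self._items and self._items[0][0] < cutoff:
                self._items.popleft()  # oldest entries age out as new ones arrive

        def retrieve(self, since, until):
            """Retrospective retrieval: pull payloads by time range."""
            return [p for t, p in self._items if since <= t <= until]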

At the request of U.S. officials, The Washington Post is withholding details that could be used to identify the country where the system is being employed or other countries where its use was envisioned.

No other NSA program disclosed to date has swallowed a nation’s telephone network whole. Outside experts have sometimes described that prospect as disquieting but remote, with notable implications for a growing debate over the NSA’s practice of “bulk collection” abroad.

Bulk methods capture massive data flows “without the use of discriminants,” as President Obama put it in January. By design, they vacuum up all the data they touch — meaning that most of the conversations collected by RETRO would be irrelevant to U.S. national security interests.

In the view of U.S. officials, however, the capability is highly valuable.

In a statement, Caitlin Hayden, spokeswoman for the National Security Council, declined to comment on “specific alleged intelligence activities.” Speaking generally, she said that “new or emerging threats” are “often hidden within the large and complex system of modern global communications, and the United States must consequently collect signals intelligence in bulk in certain circumstances in order to identify these threats.”

NSA spokeswoman Vanee Vines, in an e-mailed statement, said that “continuous and selective reporting of specific techniques and tools used for legitimate U.S. foreign intelligence activities is highly detrimental to the national security of the United States and of our allies, and places at risk those we are sworn to protect.”

Some of the documents provided by Snowden suggest that high-volume eavesdropping may soon be extended to other countries, if it has not been already. The RETRO tool was built three years ago as a “unique one-off capability,” but last year’s secret intelligence budget named five more countries for which the MYSTIC program provides “comprehensive metadata access and content,” with a sixth expected to be in place by last October.

The budget did not say whether the NSA now records calls in quantity in those countries or expects to do so. A separate document placed a high priority on planning “for MYSTIC accesses against projected new mission requirements,” including “voice.”

Ubiquitous voice surveillance, even overseas, pulls in a great deal of content from Americans who telephone, visit and work in the target country. It may also be seen as inconsistent with Obama’s Jan. 17 pledge “that the United States is not spying on ordinary people who don’t threaten our national security,” regardless of nationality, “and that we take their privacy concerns into account.”

In a presidential policy directive, Obama instructed the NSA and other agencies that bulk acquisition may be used only to gather intelligence related to one of six specified threats, including nuclear proliferation and terrorism. The directive, however, also noted that limits on bulk collection “do not apply to signals intelligence data that is temporarily acquired to facilitate targeted collection.”

The emblem of the MYSTIC program depicts a cartoon wizard with a telephone-headed staff. Among the agency’s bulk collection programs disclosed over the past year, its focus on the spoken word is unique. Most of the programs have involved the bulk collection of metadata — which does not include call content — or text, such as e-mail address books.

Telephone calls are often thought to be more ephemeral and less suited than text for processing, storage and search. And there are indications that the call-recording program has been hindered by the NSA’s limited capacity to store and transmit bulky voice files.

In the first year of its deployment, a program officer wrote that the project “has long since reached the point where it was collecting and sending home far more than the bandwidth could handle.”
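
The bandwidth complaint is plausible on the arithmetic alone. Every input below is an assumption chosen only to show the order of magnitude, not a figure from the documents:

    # Rough order-of-magnitude estimate; all inputs are assumptions.
    CALLS_PER_DAY = 100_000_000   # assumed national call volume
    AVG_CALL_SECONDS = 120        # assumed average call length
    CODEC_KBPS = 8                # assumed compressed-voice bitrate

    bytes_per_day = CALLS_PER_DAY * AVG_CALL_SECONDS * (CODEC_KBPS * 1000 / 8)
    terabytes_per_day = bytes_per_day / 1e12

    print(f"~{terabytes_per_day:,.0f} TB of audio per day")           # ~12 TB
    print(f"~{terabytes_per_day * 30:,.0f} TB in the 30-day buffer")  # ~360 TB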

Because of similar capacity limits across a range of collection programs, the NSA is leaping forward with cloud-based collection systems and a gargantuan new “mission data repository” in Utah. According to its overview briefing, the Utah facility is designed “to cope with the vast increases in digital data that have accompanied the rise of the global network.”

Christopher Soghoian, the principal technologist for the American Civil Liberties Union, said history suggests that “over the next couple of years they will expand to more countries, retain data longer and expand the secondary uses.”

Spokesmen for the NSA and the office of Director of National Intelligence James R. Clapper Jr. declined to confirm or deny expansion plans or discuss the criteria for any change.

Based on RETRO’s internal reviews, the NSA has a strong motive to deploy it elsewhere. In the documents and in interviews, U.S. officials said RETRO is uniquely valuable when an analyst uncovers a new name or telephone number of interest.

With up to 30 days of recorded conversations in hand, the NSA can pull an instant history of the subject’s movements, associates and plans. Some other U.S. intelligence agencies also have access to RETRO.

Highly classified briefings cite examples in which the tool offered high-stakes intelligence that would not have existed under traditional surveillance programs in which subjects are identified for targeting in advance. In contrast with most of the government’s public claims about the value of controversial programs, the briefings supply names, dates, locations and fragments of intercepted calls in convincing detail.

Present and former U.S. officials, speaking on the condition of anonymity to provide context for a classified program, acknowledged that large numbers of conversations involving Americans would be gathered from the country where RETRO operates.

The NSA does not attempt to filter out their calls, defining them as communications “acquired incidentally as a result of collection directed against appropriate foreign intelligence targets.”

Until about 20 years ago, such incidental collection was unusual unless an American was communicating directly with a foreign intelligence target. In bulk collection systems, which are exponentially more capable than the ones in use throughout the Cold War, calls and other data from U.S. citizens and permanent residents are regularly ingested by the millions.

Under the NSA’s internal “minimization rules,” those intercepted communications “may be retained and processed” and included in intelligence reports. The agency generally removes the names of U.S. callers, but there are several broadly worded exceptions.

An independent group tasked by the White House to review U.S. surveillance policies recommended that incidentally collected U.S. calls and e-mails — including those obtained overseas — should nearly always “be purged upon detection.” Obama did not accept that recommendation.

Vines, in her statement, said the NSA’s work is “strictly conducted under the rule of law.”

RETRO and MYSTIC are carried out under Executive Order 12333, the traditional grant of presidential authority to intelligence agencies for operations outside the United States.

Since August, Sen. Dianne Feinstein (D-Calif.), the chairman of the Senate Intelligence Committee, and others on that panel have been working on plans to assert a greater oversight role for intelligence-gathering abroad. Some legislators are considering whether Congress should also draft new laws to govern those operations.

Experts say there is not much legislation that governs overseas intelligence work.

“Much of the U.S. government’s intelligence collection is not regulated by any statute passed by Congress,” said Timothy H. Edgar, the former director of privacy and civil liberties on Obama’s national security staff. “There’s a lot of focus on the Foreign Intelligence Surveillance Act, which is understandable, but that’s only a slice of what the intelligence community does.”

All surveillance must be properly authorized for a legitimate intelligence purpose, he said, but that “still leaves a gap for activities that otherwise basically aren’t regulated by law, because they’re not covered by FISA.”

Beginning in 2007, Congress loosened 40-year-old restrictions on domestic surveillance because so much foreign data crossed U.S. territory. There were no comparable changes to protect the privacy of U.S. citizens and residents whose calls and e-mails now routinely cross international borders.

Vines noted that the NSA’s job is to “identify threats within the large and complex system of modern global communications,” in which ordinary people share fiber-optic cables with legitimate intelligence targets.

For Peter Swire, a member of the president’s review group, the fact that Americans and foreigners use the same devices, software and networks calls for greater care to safeguard Americans’ privacy.

“It’s important to have institutional protections so that advanced capabilities used overseas don’t get turned against our democracy at home,” he said.


Soltani is an independent security researcher and consultant. Julie Tate contributed to this report.
http://www.washingtonpost.com/world/...f19_story.html

















Until next week,

- js.



















Current Week In Review





Recent WiRs -

March 15th, March 8th, March 1st, February 22nd

Jack Spratts' Week In Review is published every Friday. Submit letters, articles, press releases, comments, questions etc. in plain text English to jackspratts (at) lycos (dot) com. Submission deadlines are Thursdays @ 1400 UTC. Please include contact info. The right to publish all remarks is reserved.


"The First Amendment rests on the assumption that the widest possible dissemination of information from diverse and antagonistic sources is essential to the welfare of the public."
- Hugo Black