P2P-Zone  

Old 30-07-14, 08:08 AM   #1
JackSpratts
 
 
Join Date: May 2001
Location: New England
Posts: 10,013
Default Peer-To-Peer News - The Week In Review - August 2nd, '14

Since 2002

"We are using secure encryption protocols such as curve25519, ed25519 , salsa20, poly1305, and others. Links between nodes are encrypted. All communication is end to end encrypted. This should be the new normal in the post-Snowden era." – Farid Fadaie, Bittorrent


"Politically, the UK is fast becoming Europe's pariah when it comes to digital rights." – Julia Powles


"Atheist TV is live and it’s going to stay live, 24/7, until the sun burns out." – David Silverman






































August 2nd, 2014




The Pirate Bay Founder Files Complaint Over Religious Rights
Tarun Mazumdar

Peter Sunde, The Pirate Bay founder, has complained that he is not able to exercise his religious rights in prison. He was denied permission to meet a representative from the Church of Kopimism.

The Pirate Bay founder was arrested in May 2014 and is currently held in Västervik, serving an eight-month sentence for copyright infringement.

In June, Peter complained that he had not been provided vegan food in prison and had lost 11 pounds in weight as a result.

Recently, The Pirate Bay founder complained to the authorities that he is not able to practice his chosen religion freely in the prison.

"The board of spiritual care (NAV) doesn't have any representative for the Kopimist faith with whom they cooperate and therefore the Prison and Probation Service should provide permission for electronic contact with representatives from the Kopimist faith to believers," stated Peter in a letter to authority via The Local.

In 2010, Isak Gerson, a 19-year-old Young Pirate member, founded the Missionary Church of Kopimism. The religion was officially recognised by the Swedish government in December 2011.

According to Kopimism, the keyboard shortcuts CTRL+C and CTRL+V are sacred symbols. The church says it has no direct connection with The Pirate Bay; however, the religion promotes the practice of copying and sharing information.

TorrentFreak noted that since Kopimism is a recognised religion in Sweden, the authorities might have to consider Peter's request.

In 2012, Peter Sunde wrote about Kopimism as a religion.

"This is probably the thing that I love the most with Kopimism as a religion - we can have yet another form of P2P communication - Priest2Priest. With no legal right for anyone to listen in to the conversation perhaps."

However, it remains to be seen whether Peter will be allowed to have an electronic chat with a Kopimist priest.
http://au.ibtimes.com/articles/56063...r-kopimism.htm





Police Placing Anti-Piracy Warning Ads On Illegal Sites
Dave Lee

The City of London police has started placing banner advertisements on websites believed to be offering pirated content illegally.

The messages, which will appear instead of paid-for ads, will ask users to close their web browsers.

The move comes as part of a continuing effort to stop piracy sites from earning money through advertising.

Police said the ads would make it harder for piracy site owners to make their pages look authentic.

"When adverts from well known brands appear on illegal websites, they lend them a look of legitimacy and inadvertently fool consumers into thinking the site is authentic," said Detective Chief Inspector Andy Fyfe from the City of London Police Intellectual Property Crime Unit (Pipcu).

"This new initiative is another step forward for the unit in tackling IP crime and disrupting criminal profits.

"Copyright infringing websites are making huge sums of money though advert placement, therefore disrupting advertising on these sites is crucial and this is why it is an integral part of Operation Creative."

Sunblock
The initiative will make use of technology provided by Project Sunblock - a firm used by major brands to stop adverts appearing alongside questionable content such as pirated material or pornography.

Many websites - including those offering pirated content - will use syndication networks to place advertisements on their pages.

Brands use the syndication networks like a wholesaler, and so may not know which sites their advert will eventually appear on.

Project Sunblock detects the content of websites to prevent brands' ads appearing where they do not want them.

When a website on Pipcu's Infringing Websites List (IWL) tries to display an advert, Project Sunblock will instead serve the police warning.

Neither the police nor Project Sunblock is paying the website in question to display the police message.

Piracy battle
In the past, some have raised concerns about Pipcu's process in adding a website to the IWL.

Ernesto Van Der Sar is the editor of TorrentFreak, a news site that covers issues around online piracy. When Pipcu announced its intentions in March this year, Mr Van Der Sar said he worried about the implications.

"As with all blocklists there is a serious risk of overblocking," he said.

"Without proper oversight, perfectly legal sites may end up losing good advertising opportunities if they are wrongfully included."

The battle against online piracy has seen content creators attempt many different strategies in order to stem the flow of illegal downloading.

In the UK, the courts have ordered internet service providers to block almost 50 different websites offering pirated content, either by direct download or through peer-to-peer sharing.

While effective in lowering the traffic of these sites, filtering is a flawed prevention method - many internet users are adept at using different technologies to circumvent the court-imposed restrictions.

This latest attempt looks to hit the owners of these websites in a more painful way - by stopping advertising revenues from coming in.
http://www.bbc.com/news/technology-28523738





Ford and GM Sued for Millions Over CD-Ripping Tech in Cars

The lawsuit calls for punitive damages equal to $2,500 per CD-R player installed
Lucas Mearian

The copyright protection arm of the U.S. music industry is suing Ford and GM because the companies sold cars with CD players that can rip music to the vehicle's hard drive.

The Alliance of Artists and Recording Companies (AARC), a non-profit group representing more than 300,000 artists, filed the suit against the car companies and their infotainment system tech suppliers, Denso and Clarion.

The lawsuit calls out a feature in Ford vehicles called Jukebox, which records songs from CDs to the infotainment system's hard drive. The Jukebox function has been available on Ford vehicles since at least the 2011 model year.

For example, the owner's manual explains, "Your mobile media navigation system has a Jukebox which allows you to save desired tracks or CDs to the hard drive for later access. The hard drive can store up to 10GB (164 hours; approximately 2,472 tracks) of music."

The lawsuit also cites GM's Hard Drive Device, made by Denso, which can rip music and has been available on numerous models since at least 2011.

The suit seeks millions of dollars to be paid by each of the companies for violating the Audio Home Recording Act of 1992. The Act restricts the distribution of digital audio recording devices whose primary purpose is to copy copyrighted material.

For example, a CD-R player in a personal computer is not considered a violation under the Act because the computer was not marketed as a musical recording device.

The "defendants designed these devices for the express purpose of copying music CDs and other digital recordings to a hard drive on the devices, and they market these devices emphasizing that copying function," the AARC argues.

In its filing, the AARC claims the "multi-billion dollar companies" would be protected by the Act as long as they incorporate certain copying control technology and pay "a modest royalty per device."

The automakers, however, have yet to pay royalties to the music industry in conjunction with marketing the CD-R technology in their vehicles, the AARC said.

News of the lawsuit, first reported by TorrentFreak, focused on the fact that Ford and GM have for years sold a variety of models with CD players that rip songs to the vehicle's hard drive. The models include the Lincoln MKS, Ford Taurus, Ford Explorer, Buick LaCrosse, Cadillac SRX, Chevrolet Volt, and GMC Terrain.

The lawsuit, filed in the U.S. District Court in Washington D.C. on Friday, seeks injunctive relief and damages equal to $2,500 for each digital audio recording device installed in a vehicle.

The damages were calculated as the unpaid royalties over the past three years, as well as an additional amount equal to 50% of the actual damages to be paid to the artists who registered their copyrights.

Neither automaker responded to requests for comment.
http://www.computerworld.com/s/artic...g_tech_in_cars





Sims 2 Ultimate Collection and SecuROM
Lisa Pham

As you may know, EA released a full collection of our beloved Sims 2 and decided to give it away free to anyone who had Sims 2 registered in their Origin account... then because of the high demand, they decided to give it to anyone who had an Origin account. What they didn't tell anyone was that they had included SecuROM in the mix.

So, EA did their usual and didn't inform us that they were doing this, and here I was worried about having to have Origin on my PC... I didn't even think to check for SecuROM.

When Maryh mentioned to me that some simmers were asking if there was SecuROM in the new Sims2 version I decided to take a look under the hood. I then found EA had put SecuROM Version 07.40.0009 in their nice Free Sims2 Ultimate Collection which they plan to sell in their store soon.

We've taken screenshots of a few files and registry keys that people may want to know about for troubleshooting purposes. As far as we know EA has decided to provide everyone a free, but unscrubbed copy of Sims 2 Ultimate Collection, which is no different to the previous versions of Sims 2 that EA have supplied / sold in the past.

Please let me know if people are having issues with SecuROM / Sims 2. So far with my initial playthrough, everything was okay. The issues I came across were with accessing some files and registry keys to which SecuROM had denied me (admin) access.

I'll be uninstalling Origin, SecuROM and the Sims 2 Ultimate Collection today, and if anyone else wants to do the same but needs help with removing SecuROM, you can use this as your guide.

UPDATE...
Trying to remove Sims2 UC, Origin and SecuROM... all is successful except that a SecuROM License key in the registry won't delete at all. I will have to work on this further and hopefully I can find a way to delete it.

UPDATE - 2...
Thanks to Nalia over at SimCave for reminding me to try TrashReg, I was able to delete the SecuROM License key from the registry. Although it didn't work years ago, this newer version does do the trick. So, I've updated the SecuROM removal instructions at the bottom to incorporate using TrashReg to delete those stubborn registry keys. :)
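
For readers comfortable with a scripting approach, this is a minimal sketch of the recursive registry delete that a tool like TrashReg automates, using Python's standard winreg module (Windows only; run as administrator). The key path below is illustrative rather than the exact one SecuROM writes, and keys whose names contain embedded special characters, one of SecuROM's known tricks, may still resist this and need a dedicated tool.

# Minimal sketch of a recursive registry-key delete (Windows only).
# The SecuROM path below is illustrative; inspect your own registry first.
import winreg

def delete_key_tree(root, subkey):
    """Recursively delete a registry key and all of its subkeys."""
    try:
        with winreg.OpenKey(root, subkey, 0, winreg.KEY_ALL_ACCESS) as key:
            # Delete children first: DeleteKey() fails on non-empty keys.
            while True:
                try:
                    child = winreg.EnumKey(key, 0)
                except OSError:
                    break  # no more subkeys
                delete_key_tree(root, subkey + "\\" + child)
    except FileNotFoundError:
        return  # key is already gone
    winreg.DeleteKey(root, subkey)

# Hypothetical location; the real key name varies by installation.
delete_key_tree(winreg.HKEY_CURRENT_USER, r"Software\SecuROM")
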
http://www.reclaimyourgame.com/conte...on-and-SecuROM





Tor Security Advisory: "Relay Early" Traffic Confirmation Attack

This advisory was posted on the tor-announce mailing list.

SUMMARY:
On July 4 2014 we found a group of relays that we assume were trying to deanonymize users. They appear to have been targeting people who operate or access Tor hidden services. The attack involved modifying Tor protocol headers to do traffic confirmation attacks.

The attacking relays joined the network on January 30 2014, and we removed them from the network on July 4. While we don't know when they started doing the attack, users who operated or accessed hidden services from early February through July 4 should assume they were affected.

Unfortunately, it's still unclear what "affected" includes. We know the attack looked for users who fetched hidden service descriptors, but the attackers likely were not able to see any application-level traffic (e.g. what pages were loaded or even whether users visited the hidden service they looked up). The attack probably also tried to learn who published hidden service descriptors, which would allow the attackers to learn the location of that hidden service. In theory the attack could also be used to link users to their destinations on normal Tor circuits too, but we found no evidence that the attackers operated any exit relays, making this attack less likely. And finally, we don't know how much data the attackers kept, and due to the way the attack was deployed (more details below), their protocol header modifications might have aided other attackers in deanonymizing users too.

Relays should upgrade to a recent Tor release (0.2.4.23 or 0.2.5.6-alpha), to close the particular protocol vulnerability the attackers used — but remember that preventing traffic confirmation in general remains an open research problem. Clients that upgrade (once new Tor Browser releases are ready) will take another step towards limiting the number of entry guards that are in a position to see their traffic, thus reducing the damage from future attacks like this one. Hidden service operators should consider changing the location of their hidden service.

THE TECHNICAL DETAILS:
We believe they used a combination of two classes of attacks: a traffic confirmation attack and a Sybil attack.

A traffic confirmation attack is possible when the attacker controls or observes the relays on both ends of a Tor circuit and then compares traffic timing, volume, or other characteristics to conclude that the two relays are indeed on the same circuit. If the first relay in the circuit (called the "entry guard") knows the IP address of the user, and the last relay in the circuit knows the resource or destination she is accessing, then together they can deanonymize her. You can read more about traffic confirmation attacks, including pointers to many research papers, at this blog post from 2009:
https://blog.torproject.org/blog/one-cell-enough
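
To make the idea concrete, here is a toy sketch (synthetic data, nothing to do with Tor's actual code) of the passive version: an observer holding per-second traffic traces from both ends of a circuit checks how strongly they co-vary.

import numpy as np

rng = np.random.default_rng(0)

# A bursty traffic pattern: in any given second the client is either idle
# or transferring, which is what makes timing/volume correlation so strong.
bursts = rng.choice([0.0, 5000.0], size=60)          # bytes/sec, 60 seconds
entry_trace = bursts + rng.normal(0, 100, 60)        # seen at the entry guard
exit_trace = bursts + rng.normal(0, 100, 60)         # seen at the far end
other_flow = rng.choice([0.0, 5000.0], size=60) + rng.normal(0, 100, 60)

def correlation(a, b):
    """Pearson correlation between two traffic traces."""
    return float(np.corrcoef(a, b)[0, 1])

print("same circuit:  ", correlation(entry_trace, exit_trace))  # near 1.0
print("unrelated flow:", correlation(entry_trace, other_flow))  # near 0.0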

The particular confirmation attack they used was an active attack where the relay on one end injects a signal into the Tor protocol headers, and then the relay on the other end reads the signal. These attacking relays were stable enough to get the HSDir ("suitable for hidden service directory") and Guard ("suitable for being an entry guard") consensus flags. Then they injected the signal whenever they were used as a hidden service directory, and looked for an injected signal whenever they were used as an entry guard.

The way they injected the signal was by sending sequences of "relay" vs "relay early" commands down the circuit, to encode the message they want to send. For background, Tor has two types of cells: link cells, which are intended for the adjacent relay in the circuit, and relay cells, which are passed to the other end of the circuit. In 2008 we added a new kind of relay cell, called a "relay early" cell, which is used to prevent people from building very long paths in the Tor network. (Very long paths can be used to induce congestion and aid in breaking anonymity). But the fix for infinite-length paths introduced a problem with accessing hidden services, and one of the side effects of our fix for bug 1038 was that while we limit the number of outbound (away from the client) "relay early" cells on a circuit, we don't limit the number of inbound (towards the client) relay early cells.

So in summary, when Tor clients contacted an attacking relay in its role as a Hidden Service Directory to publish or retrieve a hidden service descriptor (steps 2 and 3 on the hidden service protocol diagrams), that relay would send the hidden service name (encoded as a pattern of relay and relay-early cells) back down the circuit. Other attacking relays, when they get chosen for the first hop of a circuit, would look for inbound relay-early cells (since nobody else sends them) and would thus learn which clients requested information about a hidden service.
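
A toy illustration of the encoding idea, in Python: treat each bit of the hidden service name as a choice between a relay and a relay-early cell. This sketches the concept only; the attackers' actual cell-level format was never published.

# Encode a hidden service name as a pattern of cell types, then recover it.
RELAY, RELAY_EARLY = "RELAY", "RELAY_EARLY"

def encode(name: str) -> list[str]:
    """Map each bit of the name to a cell type: 1 -> RELAY_EARLY, 0 -> RELAY."""
    bits = "".join(f"{byte:08b}" for byte in name.encode())
    return [RELAY_EARLY if bit == "1" else RELAY for bit in bits]

def decode(cells: list[str]) -> str:
    """Invert the mapping: read the cell types back into bytes."""
    bits = "".join("1" if cell == RELAY_EARLY else "0" for cell in cells)
    data = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
    return data.decode()

cells = encode("exampleonion1234.onion")  # sent down by the malicious HSDir
print(decode(cells))                      # read off by the malicious guard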

There are three important points about this attack:

A) The attacker encoded the name of the hidden service in the injected signal (as opposed to, say, sending a random number and keeping a local list mapping random number to hidden service name). The encoded signal is encrypted as it is sent over the TLS channel between relays. However, this signal would be easy to read and interpret by anybody who runs a relay and receives the encoded traffic. And we might also worry about a global adversary (e.g. a large intelligence agency) that records Internet traffic at the entry guards and then tries to break Tor's link encryption. The way this attack was performed weakens Tor's anonymity against these other potential attackers too — either while it was happening or after the fact if they have traffic logs. So if the attack was a research project (i.e. not intentionally malicious), it was deployed in an irresponsible way because it puts users at risk indefinitely into the future.

(This concern is in addition to the general issue that it's probably unwise from a legal perspective for researchers to attack real users by modifying their traffic on one end and wiretapping it on the other. Tools like Shadow are great for testing Tor research ideas out in the lab.)

B) This protocol header signal injection attack is actually pretty neat from a research perspective, in that it's a bit different from previous tagging attacks which targeted the application-level payload. Previous tagging attacks modified the payload at the entry guard, and then looked for a modified payload at the exit relay (which can see the decrypted payload). Those attacks don't work in the other direction (from the exit relay back towards the client), because the payload is still encrypted at the entry guard. But because this new approach modifies ("tags") the cell headers rather than the payload, every relay in the path can see the tag.

C) We should remind readers that while this particular variant of the traffic confirmation attack allows high-confidence and efficient correlation, the general class of passive (statistical) traffic confirmation attacks remains unsolved and would likely have worked just fine here. So the good news is traffic confirmation attacks aren't new or surprising, but the bad news is that they still work. See https://blog.torproject.org/blog/one-cell-enough for more discussion.

Then the second class of attack they used, in conjunction with their traffic confirmation attack, was a standard Sybil attack — they signed up around 115 fast non-exit relays, all running on 50.7.0.0/16 or 204.45.0.0/16. Together these relays summed to about 6.4% of the Guard capacity in the network. Then, in part because of our current guard rotation parameters, these relays became entry guards for a significant chunk of users over their five months of operation.
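
Some back-of-the-envelope arithmetic shows why 6.4% of guard capacity matters. Clients at the time kept three entry guards, chosen roughly in proportion to bandwidth and rotated every four to eight weeks; the 45-day average rotation below is our assumption, and the model simplistically redraws all three guards at each rotation.

# Rough sketch: chance a client picked at least one attacker-run guard.
malicious_share = 0.064        # attacker's fraction of guard capacity
guards_per_client = 3          # guards kept per client at the time
rotations = 5 * 30 / 45        # ~5 months of attack / 45-day rotation (assumed)

# At least one malicious guard in a single selection round:
p_round = 1 - (1 - malicious_share) ** guards_per_client
# At least one malicious guard at some point over the attack's lifetime:
p_total = 1 - (1 - p_round) ** rotations

print(f"per selection round: {p_round:.1%}")   # ~18%
print(f"over five months:    {p_total:.1%}")   # ~48%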

We actually noticed these relays when they joined the network, since the DocTor scanner reported them. We considered the set of new relays at the time, and made a decision that it wasn't that large a fraction of the network. It's clear there's room for improvement in terms of how to let the Tor network grow while also ensuring we maintain social connections with the operators of all large groups of relays. (In general having a widely diverse set of relay locations and relay operators, yet not allowing any bad relays in, seems like a hard problem; on the other hand our detection scripts did notice them in this case, so there's hope for a better solution here.)

In response, we've taken the following short-term steps:

1) Removed the attacking relays from the network.

2) Put out a software update for relays to prevent "relay early" cells from being used this way.

3) Put out a software update that will (once enough clients have upgraded) let us tell clients to move to using one entry guard rather than three, to reduce exposure to relays over time.

4) Clients can tell whether they've received a relay or relay-early cell. For expert users, the new Tor version warns you in your logs if a relay on your path injects any relay-early cells: look for the phrase "Received an inbound RELAY_EARLY cell" (a minimal check is sketched below).
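
Here is one way to run that check; the log path is an assumption, so adjust it to your own Tor configuration.

# Scan a Tor log for the relay-early warning described above.
with open("/var/log/tor/notices.log", encoding="utf-8", errors="replace") as log:
    hits = [line.rstrip() for line in log
            if "Received an inbound RELAY_EARLY cell" in line]

if hits:
    print(f"{len(hits)} warning(s): a relay on your path injected relay-early cells")
    print("\n".join(hits[:5]))  # show the first few occurrences
else:
    print("No RELAY_EARLY warnings in this log.")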

The following longer-term research areas remain:

5) Further growing the Tor network and diversity of relay operators, which will reduce the impact from an adversary of a given size.

6) Exploring better mechanisms, e.g. social connections, to limit the impact from a malicious set of relays. We've also formed a group to pay more attention to suspicious relays in the network:
https://blog.torproject.org/blog/how-report-bad-relays

7) Further reducing exposure to guards over time, perhaps by extending the guard rotation lifetime:
https://blog.torproject.org/blog/lif...of-a-new-relay
https://blog.torproject.org/blog/imp...changing-guard...

8) Better understanding statistical traffic correlation attacks and whether padding or other approaches can mitigate them.

9) Improving the hidden service design, including making it harder for relays serving as hidden service directory points to learn what hidden service address they're handling:
https://blog.torproject.org/blog/hid...need-some-love

OPEN QUESTIONS:
Q1) Was this the Black Hat 2014 talk that got canceled recently?
Q2) Did we find all the malicious relays?
Q3) Did the malicious relays inject the signal at any points besides the HSDir position?
Q4) What data did the attackers keep, and are they going to destroy it? How have they protected the data (if any) while storing it?

Great questions. We spent several months trying to extract information from the researchers who were going to give the Black Hat talk, and eventually we did get some hints from them about how "relay early" cells could be used for traffic confirmation attacks, which is how we started looking for the attacks in the wild. They haven't answered our emails lately, so we don't know for sure, but it seems likely that the answer to Q1 is "yes". In fact, we hope they *were* the ones doing the attacks, since otherwise it means somebody else was. We don't yet know the answers to Q2, Q3, or Q4.





BitTorrent Starts Testing Bleep, its New P2P Messaging Platform
Janko Roettgers

SUMMARY: BitTorrent’s new P2P chat app is dubbed Bleep, and Windows users can now line up to become part of an invite-only test.

BitTorrent is slowly starting to take the wraps off its upcoming P2P chat initiative: The company started an invite-only pre-alpha test of a new Windows chat client dubbed “Bleep” on Wednesday, and it also revealed that it plans to make the underlying peer-to-peer technology available to other chat apps and messaging service providers as well.

BitTorrent Senior Product Manager Jahee Lee explained the product name in a blog post this way:

“Why Bleep, you might ask? Well, basically, we never see your messages or metadata. As far as we’re concerned, anything you say is ‘bleep’ to us.”

BitTorrent’s new Bleep chat client doesn’t rely on any central servers to find and manage contacts. Instead, the company is using Distributed Hash Tables (DHTs), which are basically decentralized sets of data that can be queried by any connected client. DHTs have been in use by file-sharing apps and other P2P-based clients for some time, but BitTorrent married the technology with SIP, a common standard for messaging and VOIP applications.
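
As a rough sketch of the core idea, consider Kademlia-style XOR distance, the scheme BitTorrent's existing DHT is built on: nodes and keys get IDs in the same space, a record lives on the nodes whose IDs are closest to its key, and any client can therefore compute where to look without asking a central server. The contact-record key below is hypothetical.

import hashlib

def make_id(name: str) -> int:
    """Derive a 160-bit ID, as BitTorrent's DHT does with SHA-1."""
    return int.from_bytes(hashlib.sha1(name.encode()).digest(), "big")

# Eight toy nodes, each identified by the hash of its name.
nodes = {make_id(f"node-{i}"): f"node-{i}" for i in range(8)}

# A (hypothetical) key under which a contact record would be stored.
target = make_id("alice@example")

# The record belongs on the node whose ID is closest to the key by XOR.
closest = min(nodes, key=lambda nid: nid ^ target)
print("store/lookup at:", nodes[closest])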


The result is not only a decentralized architecture for Bleep, but also one that could be used by other SIP-compatible clients as well. “As long as the messaging application is using SIP it should be straight-forward, in theory, to switch over from a server-based client to our platform,” explained BitTorrent’s Director of Communications Christian Averill. The company hasn’t set a timetable for integrating third-party services, but Averill said it is open to talk to companies interested in using its P2P technology for messaging.

So why would a messaging app provider switch from a proven server-based technology to BitTorrent’s P2P approach — or why, for that matter, would users switch from their preferred chat app to Bleep? The company touts the absence of a central server or directory as a safeguard against government wiretapping and other kinds of snooping, and it also promises better security for the actual messages. BitTorrent’s Senior Director of Product Development Farid Fadaie explained it this way on the company’s engineering blog:

“We are using secure encryption protocols such as curve25519, ed25519, salsa20, poly1305, and others. Links between nodes are encrypted. All communication is end to end encrypted. This should be the new normal in the post-Snowden era.”
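
Those primitives are the ones libsodium combines in its “box” construction (Curve25519 key agreement with XSalsa20-Poly1305 authenticated encryption) and in Ed25519 signatures. A minimal sketch using the PyNaCl bindings gives a feel for them; to be clear, this only illustrates the primitives, not Bleep’s actual protocol, which BitTorrent has not published in detail.

# pip install pynacl -- libsodium bindings for Python.
from nacl.public import PrivateKey, Box
from nacl.signing import SigningKey

# Each peer holds a long-term Curve25519 keypair.
alice, bob = PrivateKey.generate(), PrivateKey.generate()

# End-to-end encrypt: Curve25519 key agreement feeds an XSalsa20 stream
# cipher, authenticated with a Poly1305 MAC. A random nonce is generated.
ciphertext = Box(alice, bob.public_key).encrypt(b"hello, bob")
assert Box(bob, alice.public_key).decrypt(ciphertext) == b"hello, bob"

# Ed25519 signatures let peers prove who sent what.
signer = SigningKey.generate()
signed = signer.sign(b"directory record")
signer.verify_key.verify(signed)  # raises BadSignatureError if tampered with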

Bleep isn’t the first attempt to decentralize chat. Skype used to be entirely P2P-based, but has been changed in recent years to rely on a hybrid architecture that makes use of hosted servers for directory and lookup functionality. The Guardian reported last year that the NSA has the capability to collect audio and video data from Skype calls.
http://gigaom.com/2014/07/30/bittorr...ging-platform/





What a New Law About Cellphone Unlocking has to do with Coffee, Cars and Consumer Freedom
Brian Fung

So a bill allowing cellphone unlocking is headed to the president's desk. President Obama has pledged to sign the legislation, giving relief to the more than 114,000 people who signed a White House petition calling for more progressive rules on cellphone use. Now what?

The answer is a much wider battle in Congress over not only cellphone unlocking but also the underlying aspects of copyright law that made it an issue in the first place. In the coming months, expect to hear a lot about something called "circumvention"; according to a House Judiciary Committee aide, lawmakers are going to take a specific look this fall at the Copyright Act's provisions that presume cellphone unlocking and similar activities to be illegal by default.

The results of that fight, advocates say, will likely shape the future of all technologies involving intellectual property — ranging from self-driving cars to media and entertainment to the Internet-connected home.

In the context of cell phones, circumvention involves bypassing the controls that a wireless carrier has placed on a phone so that the device can't be used with a different network. What the rest of us might call "cellphone unlocking" is vitally important for anyone who's tried to switch carriers — for a trip abroad, for instance, or to another service provider here at home. Cellphone unlocking makes buying a whole new device unnecessary when switching carriers.

For the past couple years it's actually been illegal to unlock your cellphone without first asking permission from your wireless carrier — something you could only do, by the way, at the end of your contract. The new bill passed by Congress overturns the government decision that made unlocking illegal, but policy experts say this is just a temporary fix. Here's why.

Every three years, the Library of Congress — which handles copyright issues for the government — has the opportunity to look at technologies designed to circumvent the locks manufacturers place on machines to protect their intellectual property. Cellphone unlocking is one example of this potentially law-breaking technology. Until recently, the Library of Congress generally concluded that cellphone unlocking deserved an exemption. But in 2012, it decided otherwise, opting not to renew the exemption.

So it's great for consumer choice that Congress passed this latest bill; it effectively reverses the Library of Congress' 2012 decision. But 2012 + 3 = 2015, meaning that the Library of Congress is going to revisit the question of cellphone unlocking again — you guessed it — next year. The LoC could decide all over again to make cellphone unlocking illegal, undoing the effects of the legislation that President Obama's about to sign.

To avoid a pointless back-and-forth, copyright reform advocates say the law that makes circumvention illegal should be changed. Some House lawmakers led by Rep. Zoe Lofgren (D-Calif.) support a bill that would do just that.

Circumvention keeps technology firmly in the manufacturers' hands, preventing customers or third parties from legally making their own repairs or doing the tinkering that has inspired many an inventor, according to the advocates. In November, the Electronic Frontier Foundation wrote about an emerging anti-circumvention system for automobiles being developed by Renault.

"Instead of selling consumers a complete car that they can use, repair, and upgrade as they see fit, Renault has opted to lock purchasers into a rental contract with a battery manufacturer and enforce that contract with digital rights management (DRM) restrictions that can remotely prevent the battery from charging at all," EFF wrote in a blog post.

It's not just cars that'll increasingly be subject to the circumvention provision of the Copyright Act. Hearing aids, e-books for the blind, Keurig coffee machines, even farm equipment — all these technologies are reliant on software to a growing degree. And the owners of that software have an interest in protecting it from theft or unsanctioned modification. So applying DRM may make a lot of sense if you're a business, but it makes life harder if you're a consumer. (Anyone who's grappled with DRM for music will probably agree.)

"Many manufacturers are realizing that if they put a digital chip in these devices, they're able to control them in a way that traditionally they haven't been able to do," said Derek Khanna, a copyright reform advocate who also pushed for the cellphone unlocking bill. "If a [Keurig] customer wants to use a different K-cup, they're potentially committing a felony."

Altering the circumvention provision of the Copyright Act could change all that. And the House Judiciary Committee seems open to considering it.
http://www.washingtonpost.com/blogs/...sumer-freedom/





Hold the Phone: A Big-Data Conundrum
Sendhil Mullainathan

One advantage of being a professor is that you can ramble about your eccentric theories to a captive audience. For example, I often grumble to my graduate students that every time a new iPhone comes out, my existing iPhone seems to slow down. How convenient, I might think: Wouldn’t many business owners love to make their old product less useful whenever they released a newer one? When you sell the device and control the operating system, that’s an option.

This particular conspiracy theory has its adherents. But it is especially eccentric for an economist to entertain because economics argues that this type of strategy may not be as good for the bottom line as it sounds. (Catherine Rampell gave a terrific rundown of the economic arguments around planned obsolescence and Apple conspiracy theories last October on the Economix blog of The New York Times.)

Apple would not comment on such theories. But there are two simple reasons that planned obsolescence might not maximize profits. First, the legal risk. Second, competition and consumer rationality should combine to thwart this strategy. All a competitor needs to do is to offer a smartphone that doesn’t become a brick as quickly, and more people should buy it.

But these are theoretical arguments. And my experience, though constituting a sample size of one, is empirical.

Generally, my students know enough to ignore my grumbling. But in this instance, Laura Trucco, a Ph.D. student in economics at Harvard, followed a hunch. She wanted to see whether my experience was unique. But how? When people become frustrated with a slow phone, she reasoned, they search Google to figure out what to do about it. So, in theory, data on how often people search for “iPhone slow,” as provided by Google Trends, can measure the frustration globally. (Data for only the United States show similar results.)

Because this data is available weekly, she was able to cross-reference these searches against release dates of new phones. The charts show the results, which are, to say the least, striking. In the top chart, there are six distinct spikes, and they correspond to releases of new iPhones.
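
To give a flavor of that cross-referencing step, here is a small sketch with made-up weekly volumes (Trucco worked from actual Google Trends exports): flag weeks whose search volume jumps well above the trailing baseline, then ask whether the flagged weeks line up with release dates.

import statistics

weekly_volume = [40, 42, 41, 45, 43, 44, 95, 90, 60, 50, 48, 47]  # toy data
release_week = 6  # index of the week a new model ships (assumed)

def spike_weeks(series, window=4, threshold=2.0):
    """Weeks whose volume exceeds the trailing mean by `threshold` sigmas."""
    spikes = []
    for i in range(window, len(series)):
        base = series[i - window:i]
        mean, sd = statistics.mean(base), statistics.stdev(base)
        if sd and series[i] > mean + threshold * sd:
            spikes.append(i)
    return spikes

print("spikes at weeks:", spike_weeks(weekly_volume))               # [6]
print("release week flagged:", release_week in spike_weeks(weekly_volume))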

At a minimum, this shows that my experience is not unique. Yes, phones feel slower over time as they hold more software and as our expectations of speed increase. But the spikes show that the feeling doesn’t grow gradually; it comes on suddenly in the days after a new phone is released.

Yet that’s all it shows: People suddenly feel that their phone is slowing down. It doesn’t show that our iPhones actually became slower. Imagine that someone points out a buzzing sound in your office. Until then, you hadn’t noticed it. But now you can’t hear anything else. Perhaps this is the digital equivalent of that experience: Hearing about a new release makes you contemplate getting a new and faster phone. And you suddenly notice how slow your old phone is.

To test if this is the reason, we can use an important difference between Apple and Google Android. In Apple’s case, the company sells the device and makes the operating system. In principle, this creates the motive (to sell more devices) and the means (control over the operating systems) to slow down the old phone.

Google has the means (it controls the Android operating system), but not the motive because it doesn’t make money directly from selling new hardware. Conversely, Samsung or other sellers of Android phones have the motive but not the means.

If the perception that your phone is slower is attributable to the psychological effect of hearing about a new release, it should be there for both Android and Apple phones. A new phone of either kind should make you focus on your existing one. The conspiracy theory, however, should apply only to one platform.

The second chart shows searches for “Samsung Galaxy slow.” In this chart, there are no noticeable spikes or anything correlated to the release of new Galaxy phones. Try other types of Android phones, and, similarly, there are no new spikes. This is suggestive, though it’s important to note that new releases of Apple products inevitably draw much more media attention than those of other phones.

Still, if attention on new devices is what makes old ones feel slow, why are the spikes on Apple product release dates, and not when the company announces the new products? In 2008, for example, the iPhone 3G was announced a full month before its release. There was a spike at the release, but not at the announcement.

This data has an even more benign explanation. Every major iPhone release coincides with a major new operating system release. Though Apple would not comment on the matter, one could speculate — and many have — that a new operating system, optimized for new phones, would slow down older phones. This could also explain the Samsung-iPhone difference: Because only 18 percent of Android users have the latest operating systems on their phones, whereas 90 percent of iPhone users do, any slowdown from a new operating system would be naturally bigger for iPhones.

The important distinction is of intent. In the benign explanation, a slowdown of old phones is not a specific goal, but merely a side effect of optimizing the operating system for newer hardware. Data on search frequency would not allow us to infer intent. No matter how suggestive, this data alone doesn’t allow you to determine conclusively whether my phone is actually slower and, if so, why.

In this way, the whole exercise perfectly encapsulates the advantages and limitations of “big data.” First, 20 years ago, determining whether many people experienced a slowdown would have required an expensive survey to sample just a few hundred consumers. Now, data from Google Trends, if used correctly, allows us to see what hundreds of millions of users are searching for, and, in theory, what they are feeling or thinking. Twitter, Instagram and Facebook all create what is evocatively called the “digital exhaust,” allowing us to uncover macro patterns like this one.

Second, these new kinds of data create an intimacy between the individual and the collective. Even for our most idiosyncratic feelings, such data can help us see that we aren’t alone. In minutes, I could see that many shared my frustration. Even if you’ve never gathered the data yourself, you’ve probably sensed something similar when Google’s autocomplete feature automatically suggests the next few words you are going to type: “Oh, lots of people want to know that, too?”

Finally, we see a big limitation: This data reveals only correlations, not conclusions. We are left with at least two different interpretations of the sudden spike in “iPhone slow” queries, one conspiratorial and one benign. It is tempting to say, “See, this is why big data is useless.” But that is too trite. Correlations are what motivate us to look further. If all that big data does — and it surely does more — is to point out interesting correlations whose fundamental reasons we unpack in other ways, that already has immense value.

And if those correlations allow conspiracy theorists to become that much more smug, that’s a small price to pay.
http://www.nytimes.com/2014/07/27/up...conundrum.html





Popular Android Apps Inherit Bugs from Recycled Code
Brett Winterford

Study reveals security concerns in top 50 list.

At least half of the 50 most popular Android mobile apps have inherited security vulnerabilities through the reckless re-use of software libraries, according to the security team that uncovered the ‘Heartbleed’ vulnerability in OpenSSL.

Researchers at Codenomicon, which first published information about the OpenSSL vulnerability and coined the ‘Heartbleed’ name, will this week publish findings which name and shame some of the world’s most successful app developers for their lax approach to security.

Preliminary results from the study reveal that over half of the 50 apps send user data to third party advertising networks without user permission - often in clear text.

The researchers concluded that many of the developers of these applications were not aware of the vulnerabilities they were shipping in the code.

Olli Jarva, chief security specialist at Codenomicon, said 80 to 90 percent of mobile app software is made up of re-used libraries, most of which are available under open source.

He said it was natural that developers did not want to "invest in reinventing the wheel” with every app they push out.

But while “in theory” the open source community should result in better quality code, owing to the number of developers contributing, the numerous bugs in OpenSSL proved this was not the case, he said.

“We’re seeing the end products inherit vulnerabilities - sometimes it’s just poor software design or logic errors in implementations, and sometimes those bugs are identified and patched. Sometimes, like in the case of Heartbleed, they are not identified for two years."

More concerning is when “developers act intentionally,” Jarva said.

“Some people might have been providing a vulnerability on purpose in order to do something nasty” once the code has been distributed.

It’s rare that developers do their due diligence on who created the libraries before they embed them in the apps, he said.

“Who are they working with? Do they have sideline jobs somewhere else? The developers might be getting their dollars from ad networks," Jarva said.

End users who purchase or commission the development of mobile apps are unlikely to be aware that the apps reuse software libraries connecting them to advertising networks which exfiltrate private data without user consent, he said.

The preliminary study found that close to half of the top 50 Android apps on the market submit the user’s Android ID to third party advertising networks.

One in ten apps send either the user’s device ID (IMEI code) or location data to a third party, and one even sends the user’s mobile phone number. One in ten applications connected to more than two ad networks.

The study found that over 30 percent of the apps transmit private data in plain text, and plenty more are not encrypting the transfer of this data in line with best practice.

“The issues are invisible to users,” Jarva said. “A lot of things are happening behind the scenes; it’s only afterwards that they know what has been done.”

Jarva said IT security should be concerned by any app that sends irrelevant or sensitive information to third, fourth and fifth parties if this communication doesn’t align with what the app purports to do.

There are sandboxing tools available that allow an administrator to scan the binaries in an installation file and “reveal the true characteristics of the app” in under a minute, he said.

“It’s not a huge amount of data to analyse.”
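
A crude version of that kind of scan is easy to sketch: an APK is just a zip archive, so one can pull URL-like strings out of classes.dex and match them against a watchlist of ad-network hosts. The host list and file name here are illustrative; real analysis tools also inspect permissions, certificates and runtime network behaviour.

import re
import zipfile

AD_HOSTS = [b"admob", b"flurry", b"mopub", b"inmobi"]  # assumed watchlist

def scan_apk(path: str) -> dict:
    """Return URL counts from the app's bytecode, flagging ad-network hosts."""
    with zipfile.ZipFile(path) as apk:
        dex = apk.read("classes.dex")  # compiled app code inside the APK
    urls = set(re.findall(rb"https?://[\x21-\x7e]{4,80}", dex))
    flagged = {u for u in urls if any(h in u.lower() for h in AD_HOSTS)}
    return {"urls_found": len(urls), "ad_network_urls": sorted(flagged)}

print(scan_apk("example.apk"))  # hypothetical file name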

Jarva said open source does not provide a "free lunch".

“We have to take care to test well enough the libraries we use so we can be confident they are safe enough to be used,” he said.

Jarva said IT managers usually turn to a whitelisting strategy to overcome these issues, but were struggling to keep up with the volume of new apps being released every day.

“It’s too time consuming,” he said.

“At the end of the day, we have to make the developers and those that employ them understand the importance of testing.

"The difficulty we face is that the motivating factor for app delivery is rarely the quality of security. More testing means more time spent, and that means more cost for the developer and a higher price for the solution. It only becomes an issue when something bad happens.”
http://www.itnews.com.au/News/390365...cled-code.aspx





The Judges Approving the NSA's Surveillance Requests Keep Buying Verizon Stock
Lee Fang

When the National Security Agency would like to take a look at all of the metadata of phone calls made by people using Verizon, a program revealed last summer by Edward Snowden, they must obtain approval from the secretive Foreign Intelligence Surveillance Court (better known as the FISA Court), which typically grants such requests. VICE has obtained disclosures that reveal for the first time since this program was made public that FISA Court judges have not only owned Verizon stock in the last year, but that at least one of the judges to sign off on the NSA orders for bulk metadata collection is a proud shareholder of the company complying with these requests.

On May 28 last year, Judge James Zagel, a FISA Court member since 2008, purchased stock in Verizon. In June of this year, Zagel signed off on a government request to the FISA Court to renew the ongoing metadata collection program.

He's not the only one. We filed a request to the courts for the personal finance statements for all of the FISA Court judges. About a month ago, federal judges began turning in their disclosures, which cover the calendar year of 2013. The disclosures show that FISA Court Judge Susan Wright purchased Verizon stock valued at $15,000 or less on October 22. FISA Court Judge Dennis Saylor has owned Verizon stock, and last year collected a dividend of less than $1,000. The precise amount and value of each investment is unclear—like many government ethics disclosures, including those for federal lawmakers, investment amounts are revealed within certain ranges of value.

The FISA Court rotates the handling of requests from the government; in essence, each judge takes turns overseeing surveillance asks from the Feds. Judge Roger Vinson, the judge who signed off on the order disclosed by Snowden last year, requested an extension for filing his personal finance statement. While it's not clear how the rotation schedule works, it's certainly plausible Judge Saylor or Judge Wright will soon be asked to renew the next request by the NSA for metadata from telecom companies.

Do the investments constitute a conflict of interest? Federal judges are bound by an ethics law that requires them to recuse themselves from cases in which they hold a financial stake in the outcome, or in cases in which their "impartiality might reasonably be questioned."

In the past, revelations about stock ownership have invalidated certain court decisions. For example, after an eye-opening investigation from the Center for Public Integrity, which revealed that a federal judge who participated in a mortgage foreclosure-related decision owned stock in Wells Fargo, a case was re-opened. The FISA Court is different. For one thing, FISA proceedings are ex parte, meaning Verizon isn't even a party for the NSA requests. However, telecom companies certainly have a stake in how they comply with government orders, and some ethicists say judges would be well served if they simply steer clear of these types of investments.

"I think prudence would suggest that a FISA judge would not acquire investments in these telecommunication stocks," says Professor William G. Ross, an expert on judicial ethics at Samford University's Cumberland School of Law in Alabama. "I'm not saying there is a conflict of interest, which my impression says there's probably not," Ross says, adding, "this is between what's improper and what's prudent."

District court clerks told VICE that judges typically do not offer responses on the record for these types of inquiries. Judge Saylor's office could not offer a comment, and a request for comment was also sent to the other judges.

Last year, Gawker reported that many FISA Court judges have owned various telecommunication stocks over the years. But the ethics forms we obtained show that since the Snowden revelation, FISA Court judges have been specifically purchasing and holding stock in the company that is the only named telecom giant known for its compliance with the NSA's bulk data orders.
http://www.vice.com/read/the-judges-...izon-stock-725





Personal Privacy Is Only One of the Costs of NSA Surveillance
Kim Zetter

There is no doubt the integrity of our communications and the privacy of our online activities have been the biggest casualty of the NSA’s unfettered surveillance of our digital lives. But the ongoing revelations of government eavesdropping have had a profound impact on the economy, the security of the internet and the credibility of the U.S. government’s leadership when it comes to online governance.

These are among the many serious costs and consequences the NSA and those who sanctioned its activities—including the White House, the Justice Department and lawmakers like Sen. Dianne Feinstein—apparently have not considered, or acknowledged, according to a report by the New America Foundation’s Open Technology Institute.

“Too often, we have discussed the National Security Agency’s surveillance programs through the distorting lens of a simplistic ‘security versus privacy’ narrative,” said Danielle Kehl, policy analyst at the Open Technology Institute and primary author of the report. “But if you look closer, the more accurate story is that in the name of security, we’re trading away not only privacy, but also the U.S. tech economy, internet openness, America’s foreign policy interests and cybersecurity.”

Over the last year, documents leaked by NSA whistleblower Edward Snowden, have disclosed numerous NSA spy operations that have gone beyond what many considered acceptable surveillance activity. These included infecting the computers of network administrators working for a Belgian telecom in order to undermine the company’s routers and siphon mobile traffic; working with companies to install backdoors in their products or network infrastructure or to devise ways to undermine encryption; intercepting products that U.S. companies send to customers overseas to install spy equipment in them before they reach customers.

The Foundation’s report, released today, outlines some of the collateral damage of NSA surveillance in several areas, including:

• Economic losses to US businesses due to lost sales and declining customer trust.
• The deterioration of internet security as a result of the NSA stockpiling zero-day vulnerabilities, undermining encryption and installing backdoors in software and hardware products.
• Undermining the government’s credibility and leadership on “internet freedom” and governance issues such as censorship.

Economic Costs to U.S. Business
The economic costs of NSA surveillance can be difficult to gauge, given that it can be hard to know when the erosion of a company’s business is due solely to anger over government spying. Sometimes, there is little more than anecdotal evidence to go on. But when the German government, for example, specifically cites NSA surveillance as the reason it canceled a lucrative network contract with Verizon, there is little doubt that U.S. spying policies are having a negative impact on business.

“[T]he ties revealed between foreign intelligence agencies and firms in the wake of the U.S. National Security Agency (NSA) affair show that the German government needs a very high level of security for its critical networks,” Germany’s Interior Ministry said in a statement over the canceled contract.

Could the German government simply be leveraging the surveillance revelations to get a better contract or to put the US on the defensive in foreign policy negotiations? Sure. That may also be part of the agenda behind data localization proposals in Germany and elsewhere that would force telecoms and internet service providers to route and store the data of their citizens locally, rather than let it pass through the U.S.

But, as the report points out, the Germans have not been alone in making business decisions based on NSA spying. Brazil reportedly scuttled a $4.5 billion fighter jet contract with Boeing and gave it to Saab instead. Sources told Bloomberg News “[t]he NSA problem ruined it” for the US defense contractor.

Governments aren’t the only ones shunning US businesses. American firms in the cloud computing sector are feeling the pressure as consumers and corporate clients reconsider using third-party storage companies in the U.S. for their data. Companies like Dropbox and Amazon Web Services reportedly have lost business to overseas competitors like Artmotion, a Swiss hosting provider. The CEO of the European firm reported that within a month after the first revelations of NSA spying went public, his company’s business jumped 45 percent. Similarly, 25 percent of respondents in a survey of 300 British and Canadian businesses earlier this year said they were moving their data outside the US as a result of NSA spying.

The Information Technology and Innovation Foundation has estimated that repercussions from the spying could cost the U.S. cloud computing industry some $22 to $35 billion over the next few years in lost business.

Will the NSA spying revelations have long-term effects? Or will customers return to U.S. companies once the news fades into the background? It’s hard to tell.

But German chancellor Angela Merkel has suggested that Europe build a separate permanent internet to keep data local and prevent it from traversing networks the NSA can more easily monitor. Germany also has instituted new data rules that prohibit any company from obtaining a federal contract unless it can guarantee that it will protect data stored in Germany from foreign authorities. These kinds of policies and infrastructure changes tend to remain long after the circumstances that spawned them have passed.

Deterioration of Cybersecurity
Out of all the revelations to come to light in the past year, the most shocking may well be the NSA’s persistent campaign to undermine encryption, install backdoors in hardware and software and amass a stockpile of zero-day vulnerabilities and exploits.

“For the past decade, N.S.A. has led an aggressive, multipronged effort to break widely used Internet encryption technologies,” according to a 2010 memo from Government Communications Headquarters, the NSA’s counterpart in the UK, leaked by Edward Snowden.

Furthermore, a story from Pro Publica noted, the NSA “actively engages the US and foreign IT industries to covertly influence and/or overtly leverage their commercial products’ designs” to make them more amenable to the NSA’s data collection programs and more susceptible to exploitation by the spy agency.

The NSA, with help from the CIA and FBI, also has intercepted network routers from US manufacturers like Cisco to install spy tools before they’re shipped to overseas buyers, further undermining customer trust in US companies. Cisco senior vice president Mark Chandler wrote in a company blog post that his and other companies ought to be able to count on the government not interfering “with the lawful delivery of our products in the form in which we have manufactured them. To do otherwise, and to violate legitimate privacy rights of individuals and institutions around the world, undermines confidence in our industry.”

All of these activities are at direct odds with the Obama administration’s stated goal of securing the internet and critical infrastructure and undermine global trust in the internet and the safety of communications. The actions are particularly troubling because the insertion of backdoors and vulnerabilities in systems doesn’t just undermine them for exploitation by the NSA but makes them more susceptible for exploitation by other governments as well as by criminal hackers.

“The existence of these programs, in addition to undermining confidence in the internet industry, creates real security concerns,” the authors of the report note.

Undermining U.S. Support for Internet Freedom
Finally, the NSA’s spying activities have greatly undermined the government’s policies in support of internet freedom around the world and its work in advocating for freedom of expression and combating censorship and oppression.

“As the birthplace for so many of these technologies, including the internet itself, we have a responsibility to see them used for good,” then-Secretary of State Hillary Clinton said in a 2010 speech launching a campaign in support of internet freedom. But while “the US government promotes free expression abroad and aims to prevent repressive governments from monitoring and censoring their citizens,” the New American report notes, it is “simultaneously supporting domestic laws that authorize surveillance and bulk data collection.” The widespread collection of data, which has a chilling effect on freedom of expression, is precisely the kind of activity for which the U.S. condemns other countries.

This hypocrisy has opened a door for repressive regimes to question the US role in internet governance bodies and has allowed them to argue in favor of their own governments having greater control over the internet. At the UN Human Rights Council in September 2013, the report notes, a representative from Pakistan—speaking on behalf of Cuba, Iran, China and other countries—said the surveillance programs highlighted the need for their nations to have a greater role in governing the internet.

The report makes a number of recommendations to address the problems the NSA’s spying has created. These include strengthening privacy protections for Americans and non-Americans, developing clear policies about whether and under what legal standards it is permissible for the government to secretly install malware on a computer or network, and working to restore trust in encryption systems and standards.
http://www.wired.com/2014/07/the-big...talking-about/





Lords: Right To Be Forgotten Is Wrong, Unworkable, Unreasonable
Liat Clark

Europe's right to be forgotten ruling, which states that everyone has the right to wipe their digital slate clean, is simply "wrong", a House of Lords report has concluded. As a result, it argues, Google has been faced with an "unworkable and unreasonable situation".

The Lords EU Sub-Committee -- which deals with topics as disparate as immigration, health, sport and education -- delivered this bold statement after consulting with the Information Commissioner's Office, Minister for Justice and Civil Liberties Simon Hughes and Google, amongst others. In a lengthy statement, Chairman of the Sub-Committee Baroness Prashar said the reality was "crystal clear" -- "neither the 1995 Directive, nor the Court of Justice of the European Unions's (CJEU) interpretation of it, reflects the incredible advancement in technology that we see today, over 20 years since the Directive was drafted".

The report supports the UK government's own stance that new regulation needs to invalidate the CJEU ruling.

The right to be forgotten is an interpretation of Article 12 of the Data Protection Directive, laid down by the European Parliament in 1995 and relating to the protection and processing of personal data. For close to two decades, the law was not interpreted as having a "right to be forgotten" clause. But when one Mario Costeja González found that the first Google search result for his name related to a 1998 story about his property being repossessed, he demanded Google remove the link. Since the report was no longer representative of his financial situation, González declared it an invasion of privacy -- Google was highlighting a thing of the past, not allowing him to live it down.

After years of legal battles, the CJEU agreed with González. Since the ruling, Google -- much perturbed by the turn of affairs and claiming to be totally unprepared to deal with it -- says it has been faced with more than 90,000 takedown requests, and argues the law is totally impractical and unjust.

The Lords' committee appears to totally agree.

It believes that, at its very heart, the CJEU ruling is flawed. It should not make search engines judge and jury of the web -- it is not their job, and is not stipulated in the law. On top of this, people do not have an inherent right to have factual information about them scrubbed from digital history. "We do not believe that individuals should have a right to have links to accurate and lawfully available information about them removed, simply because they do not like what is said," Baroness Prashar said.

This has been a major concern ever since the right to be forgotten was first floated. In the days following the European ruling, Google highlighted how these fears were coming to fruition when it told the FT that 31 percent of takedown requests from the UK and Ireland related to frauds or scams; 20 percent to arrests or convictions for violent or serious crimes; and 12 percent to child pornography arrests. Others came from police and government, or celebrities.

Further justifying these fears, when links began to be removed and their publishers notified, it became clear anyone and everyone would be taking a punt and asking for embarrassing or disparaging information to be removed. The BBC's Robert Peston highlighted the problem when he lambasted the takedown of a link to his 2007 blogpost on the career problems of a former Merrill Lynch boss -- the suggestion being that someone in a position of power was attempting to scrub their record clean.

The Lords report concluded that the right to be forgotten is unworkable for two main reasons.

First off, it totally ignores the fact that pretty much every other search engine (bar Yahoo and of course Bing, which rather embarrassingly had to volunteer to be included in the whole affair) doesn't have the spending power and infrastructure to implement the ruling.

Since the ruling went into effect, Google says it has received more than 90,000 removal requests. The search giant has repeatedly emphasised that it is not equipped to deal with the response itself, recently stating: "This is a new process for us. Each request has to be assessed individually and we're working as quickly as possible to get through the queue." In spite of this, European regulators are apparently irked by Google's perceived ineptitude. Google appears to have been painfully implementing the ruling to a tee, which has meant it okayed more than half the requests received (outrage ensued, naturally, as many of the links removed went through to factual, solid journalism relating to public figures). That diligence also meant Google only removed the search results from its European search engines, in line with the law, meaning anyone could switch to .com or other portals to access them. Europe has apparently taken issue with this, and with the fact Google alerted websites to the removals. It also complained Google was not passing on enough information to national regulators, who are left to deal with all the complaints Google rejects.

For all these reasons, it's clear both Google and individual countries are grappling with the enormous problem the new application of the law has created.

The Committee's second point is a legal one -- it says the interpretation of the Directive was, in itself, totally wrong. It was "wrong in principle" to force search engines to be judge and jury of the web, particularly when "based on vague, ambiguous and unhelpful criteria". People should not have the right to pick and choose what is recorded in history about them, said Baroness Prashar, and search engines should not legally be considered "data controllers" -- an argument Google has used for years.

Rather damningly, Prashar concludes that yes, law should try and keep up with technology. But when we do so it must be "sensible" and should take into account the data logistics of what is being asked.

It should, she said, also "decide not to try and enforce the impossible".

The report somewhat echoes the many concerns Google has voiced leading up to and following the CJEU ruling. And many members of the public agree with those concerns. When Wired.co.uk published a blog suggesting Google, in its rather slow and seemingly arbitrary implementation of the law, was attempting to highlight its inadequacies in order to get it quashed, a number of readers wrote in to say: good. Many see the impracticalities and dangers of the law, and the Lords report validates those fears and concerns.

There is an argument, however, that the report emphasises the rights of businesses over the societal and legal issues that brought it to the fore in the first place. Laurence Eastham, editor of The Society for Computers and Law's online publication, comments that it lays "too heavy an emphasis on the inconvenience that arises for business -- e.g. seeming to suggest that start-ups cannot cope with 'privacy by design'".

He continues: "It is worth emphasising that the Committee publishing this report does not complain of a lack of balance in the CJEU judgment -- as so many of us have. It does not want any such right to exist at all."

Julia Powles, a law researcher at the University of Cambridge, is one academic who has been pushing for a careful, considered and balanced reaction to the law.

"The essence of my concern with the Lords is misdirected fire," she tells Wired.co.uk. "I agree that data protection as a system has major flaws. Fixing perceived problems with the perceived right to be forgotten is only a partial fix." The Lords report, and conclusions like it, could leave Google with an "escape route", she says, "and everyone else with onerous obligations, routinely ignored".

"We need to look at the whole edifice and debate why we have it and how we can make it workable."

Much as with DRIP, the surveillance law rushed through Parliament earlier this month, the UK appears to be dealing with a European ruling it does not like by deciding to totally ignore and defy it.

Of this, Powles notes: "Politically, the UK is fast becoming Europe's pariah when it comes to digital rights. With this and with DRIP, we see knee-jerk responses that go against the broader sentiment of the continent and, importantly, go against individual interests. It is really quite remarkable to have a public statement on behalf of the UK so strongly decrying Europe's superior court."

The debate, however, is far from over. Google's own Advisory Council -- launched to deal with the law's fallout and including human rights lawyers, editors and Wikipedia's Jimmy Wales -- is inviting public comment, and will hold meetings across Europe where selected experts can testify. September dates have already been set for public meetings in Madrid, Rome, Paris, Warsaw, Berlin and London.
http://www.wired.co.uk/news/archive/...otten-is-wrong





Now You Can Tell the FCC to Overturn State Limits on Municipal Broadband

FCC takes comments on petition to preempt laws in North Carolina and Tennessee.
Jon Brodkin

The Federal Communications Commission just started taking public comments on whether it should preempt state laws that limit the growth of municipal broadband in Tennessee and North Carolina.

Twenty states have passed such limits, which protect private Internet service providers from having to compete against cities and towns that seek to provide Internet, TV, and phone service to residents. After FCC Chairman Tom Wheeler said he intends to use the commission's authority to preempt the state laws, the commission received petitions from two public entities that want to expand broadband offerings.

"On July 24, 2014, the Electric Power Board of Chattanooga, Tennessee, and the City of Wilson, North Carolina filed separate petitions asking that the Commission act pursuant to section 706 of the Telecommunications Act of 1996 to preempt portions of Tennessee and North Carolina state statutes that restrict their ability to provide broadband services," the FCC said today. "The Electric Power Board is an independent board of the City of Chattanooga that provides electric and broadband service in the Chattanooga area. The City of Wilson provides electric service in six counties in eastern North Carolina and broadband service in Wilson County. Both Petitioners allege that state laws restrict their ability to expand their broadband service offerings to surrounding areas where customers have expressed interest in these services, and they request that the Commission preempt such laws."

The FCC opened two proceedings, one for North Carolina and one for Tennessee. Initial comments will be accepted until August 29, and reply comments will be accepted until September 29.

These two proceedings will serve as test cases. Wheeler points to the FCC's authority to promote competition in local telecommunications markets by removing barriers to investment, saying that this allows the commission to preempt state laws that shield private ISPs from competition with municipal networks. But he's already getting pushback from House Republicans, who voted to prevent the FCC from preempting state broadband laws. That vote may not become law given that the Senate is controlled by Democrats, but the FCC could also face a lawsuit if it grants the Tennessee or North Carolina petitions.
http://arstechnica.com/business/2014...pal-broadband/





Consumers OK with Data Limits on Wireless, But Not Wired Broadband

GAO finds that internet users were confused about how much data they use
Gautham Nagesh

Consumers are OK with caps on how much data they can download on their wireless devices every month, but are much more concerned about limits on their home Internet usage, a government watchdog agency has found.

The Government Accountability Office conducted focus groups in four cities for its report on Internet usage-based pricing, or charging consumers based on how much data they use. While consumers were generally accepting of the data limits on wireless usage, they showed "strong negative reactions" to the idea of capping the amount of data they can use on their wired broadband connections at home, citing the importance of unlimited bandwidth to their lives and fear that broadband providers could use data caps as a way to increase the amount they charge for Internet service.

The GAO said that the four major national wireless carriers -- AT&T Inc., Verizon Communications Inc., T-Mobile US Inc. and Sprint Corp. -- charge consumers if they exceed their monthly data allowance, and that consumers have responded by limiting the amount of video they watch on their mobile devices, logging onto Wi-Fi networks when possible, or upgrading their data plans. In contrast, only about half of the wired broadband providers surveyed by the GAO currently impose some form of usage-based pricing, mostly by letting consumers pay less for a smaller amount of data a month.

In a finding that could bolster regulators' push for transparency from broadband providers, the GAO found that consumers were largely confused about how much data they use, and that many could opt for a less-expensive data plan without fear of exceeding their monthly data cap.

However, the report acknowledged that consumers' data use could increase as more people choose streaming video services from Netflix or Amazon in lieu of cable or satellite connections. Broadband equipment maker Sandvine estimated that consumers who turn to the Internet to replace traditional pay-TV typically consume an average of 212 gigabytes of data a month, roughly in line with many of the existing wired broadband data caps.
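
As a sanity check on that Sandvine figure, the arithmetic is simple multiplication. The sketch below assumes roughly 3 GB per hour for HD streaming -- a commonly cited ballpark, though actual bitrates vary by service -- and is illustrative only:

HD_GB_PER_HOUR = 3.0  # assumed average for HD streaming; varies by service

def monthly_streaming_gb(hours_per_day, gb_per_hour=HD_GB_PER_HOUR, days=30):
    """Estimate monthly data use from average daily streaming hours."""
    return hours_per_day * gb_per_hour * days

# A cord-cutting household streaming about 2.5 hours of HD a day:
print(monthly_streaming_gb(2.5))  # 225.0 GB -- near Sandvine's 212 GB average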

"While broadband providers are experimenting with these new business models, consumers are left wondering whether they will have to foot the bill and how much more it will be," said Rep. Anna Eshoo (D., Calif.), who requested the study.

A staff aide to Ms. Eshoo said she plans to submit the GAO's preliminary findings to the Federal Communications Commission as part of its rule-making on how broadband providers can manage traffic on their networks. Advocates of net neutrality, the principle that all Internet traffic should be treated equally, argue that broadband providers could use data caps as a way to force consumers to pay extra for video and other services.

The GAO found that usage-based pricing could pose a risk to innovation by discouraging consumers from adopting high-bandwidth services, such as streaming online video. Experts have suggested the net impact of data caps could be very similar to allowing broadband providers to charge content companies for fast and slow broadband lanes to consumers.

"Now in the midst of the net neutrality debate, the topic of usage-based pricing is more relevant than ever. While much of the talk has focused on the anticompetitive impact of fast and slow lanes, data caps, particularly when applied discriminatorily, could have the same damaging impact on the free and open Internet as we know it," Ms. Eshoo said.

The GAO conducted focus groups in Des Moines, Iowa, Las Vegas, Baltimore and New York. Each group contained nine or 10 individuals (77 total) of varying ages, races, genders, education and income levels, as well as self-identified "light" and "heavy" Internet users.
http://online.wsj.com/news/article_e...MDIwOTEyNDkyWj





ISPs Tell Government that Congestion is “Not a Problem,” Impose Data Caps Anyway

Shocking government research also finds Internet users don’t want data caps.
Jon Brodkin

After consulting focus groups of Internet customers, government researchers have come to a conclusion that should surprise no one: people don't want data caps on home Internet service.

But customers are getting caps anyway, even though ISPs admit that congestion isn't a problem. The US Government Accountability Office (GAO) today released preliminary findings of research involving surveys of cellular carriers, home Internet providers, and customers.

The majority of top wireline ISPs are at least experimenting with data caps. But while cellular carriers say they impose usage-based pricing (UBP) to manage congestion on wireless networks, that's not the case with cable, fiber, and DSL. "Some wireless ISPs told us they use UBP to manage congestion," the GAO wrote. On the other hand, "wireline ISPs said that congestion is not currently a problem."

Why set up data limits and charge extra when users go over them, then? "UBP can generate more revenues for ISPs to help fund network capacity upgrades as data use grows," the GAO wrote.

The GAO said it interviewed "some experts" who think usage-based pricing "may be unnecessary because the marginal costs of data delivery are very low, [and] heavier users impose limited additional costs to ISPs." Limiting heavy users could even "limit innovation and development of data-heavy applications," the GAO wrote.

Customers told the GAO they don't want data caps, at least on home Internet.

Eight focus groups of nine or 10 people each were polled about data caps on both cellular service and wireline home Internet. While they were generally accepting of limits on cellular data, most did not want any limits on home Internet usage, in part because they manage limited wireless plans by connecting mobile devices to their home Wi-Fi. The GAO wrote:

In only two groups did any participants report experience with wireline UBP [usage-based pricing]. However, in all eight groups, participants expressed strong negative reactions to UBP, including concerns about:

• The importance of the Internet in their lives and the potential effects of data allowances.
• Having to worry about data usage at home, where they are used to having unlimited access.
• Concerns that ISPs would use UBP as a way of increasing the amount they charge for Internet service.


While all four major cellular carriers impose some form of data limits, seven of the 13 top wireline ISPs surveyed by the GAO have deployed usage-based pricing "to some extent." Some of the limits are stricter than others, the GAO said (a sketch of the most common fee structure follows the list):

• Three wireline ISPs use UBP with data allowance tiers and impose overage fees on customers who exceed their allowance. (Overage fees are generally $10 a month for 50 GB of additional data.)
• Two have data allowance tiers but do not impose fees for overage.
• One offers a voluntary low-data plan at a discounted rate.
• One is testing UBP approaches that include overage fees in select markets of varying sizes.
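
As promised above, here is a minimal sketch of the most common structure (the first bullet): block-priced overage at the GAO's reported rate of $10 per 50 GB of additional data. The allowance and usage figures in the example are hypothetical, not any specific ISP's:

import math

BLOCK_GB = 50      # overage sold in 50 GB blocks, per the GAO figure above
BLOCK_FEE = 10.00  # $10 per block

def overage_fee(usage_gb, allowance_gb):
    """Each 50 GB block (or part thereof) over the allowance costs $10."""
    excess = max(0.0, usage_gb - allowance_gb)
    return math.ceil(excess / BLOCK_GB) * BLOCK_FEE

print(overage_fee(usage_gb=320, allowance_gb=300))  # 10.0 -- one block started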


The GAO didn't name any specific ISPs, but they are publicly known. GigaOm surveyed 15 ISPs last November and found that eight capped data, at least for some customers. Those ISPs with data caps include Comcast, AT&T, CenturyLink, Cox, Charter, Suddenlink, MediaCom, and CableOne.

What’s next? Well, probably nothing.

This information isn't likely to cause any immediate change in government policy. The GAO report came in response to a request in May 2013 by US Rep. Anna Eshoo (D-CA). Preliminary findings were summarized in a slide deck (see it here), while a more extensive report is expected in November.

Eshoo plans to share the findings with the Federal Communications Commission. "Now in the midst of the net neutrality debate, the topic of usage-based pricing is more relevant than ever," Eshoo said in a press conference. "While much of the talk has focused on the anti-competitive impact of fast and slow lanes, data caps, particularly when applied discriminatorily, could have the same damaging impact on the free and open Internet as we know it."

The FCC is planning to implement net neutrality rules that would prevent blocking of applications and guarantee a minimum level of service. The FCC also last week reminded wireline ISPs and cellular carriers of their obligations to disclose accurate information about network management practices, performance, and the commercial terms of their services. The FCC hasn't proposed any restrictions on data caps and usage-based pricing, however.

The focus groups interviewed by the GAO included both heavy and light Internet users and "a mix of ages, races, genders, and education and income levels." The participants were drawn from Baltimore, MD; Des Moines, IA; Las Vegas, NV; and New York, NY.

While the focus group participants generally didn't object to wireless data caps, they expressed "confusion regarding wireless data usage," including "uncertainty over plan details, such as their data allowance," and "uncertainty whether their plans were subject to throttling." (Hint: they are being throttled.)

The focus group participants were accustomed to not having to consider wireline data usage, and their opposition to home Internet data caps was "in part driven by confusion about the amount of data used by Internet applications," the GAO concluded. But while focus group participants seemed to overestimate the amount of data used by activities such as online shopping and "leaving social media applications running in the background," they were savvy enough to limit their use of video streaming on cellular networks and save data by connecting wireless devices to their in-home Wi-Fi networks.

"Participants were accustomed to unlimited wireline Internet access at home and prefer not having to maintain awareness about data consumption," the GAO wrote. "Some participants said that multi-person households, each with multiple devices, would pose challenges to them in tracking Internet data consumption. In all eight groups, participants said that they frequently connect their wireless devices to their in-home Wi-Fi without worrying about data usage."

Nonetheless, some of the participants would have "positive reactions" to wireline usage-based pricing if it could result in them paying less money. "Some focus group participants thought it was more fair to pay only for the data used—akin to utilities, such as water or electricity," the GAO wrote.

So far, though, the GAO found that "consumers may not be fully benefiting from lower-cost options under UBP." The GAO pointed to one "wireline ISP [that] offers a small monthly discount for a 5GB/month data allowance. However, according to that ISP, only a small percent of its customers have signed up for that option even though almost 20 percent of its customers use 5GB a month or less."

As we've written before, this ISP is Time Warner Cable. While the GAO seems confused about why people turn down the offer, the answer is pretty clear: Time Warner is only offering $5 monthly discounts to customers who agree to a cap, which started out at 5GB but was increased to 30GB last year. TWC charges $1 for every extra gigabyte, and up to $25 in overage charges each month. It's no surprise that TWC customers would turn down a $5 discount when it comes with the risk of paying $25 in extra charges.
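
The arithmetic makes the customer's calculation plain. A minimal sketch, assuming the figures reported here ($5 discount, 30 GB cap, $1 per extra gigabyte, $25 monthly overage ceiling); a negative result means the capped plan saves money that month:

def twc_capped_plan_delta(usage_gb, cap_gb=30, discount=5.00,
                          per_gb=1.00, overage_ceiling=25.00):
    """Net monthly cost change vs. staying on the unlimited plan."""
    overage = min(max(0.0, usage_gb - cap_gb) * per_gb, overage_ceiling)
    return overage - discount

print(twc_capped_plan_delta(20))  # -5.0: a light month saves $5
print(twc_capped_plan_delta(60))  # 20.0: one heavy month erases 4 months of savings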
http://arstechnica.com/business/2014...a-caps-anyway/





$12 a Month for Facebook – Sprint Tramples Over Net Neutrality with New Prepaid Plan
Kyle Wiggers

Today, Sprint dispensed with all subtlety. Without any pretense of net neutrality whatsoever, the carrier unveiled a plan with options to pay more for unfettered access to social media and streaming music, depending on the tier.

The Virgin Mobile Custom plan, sold under Sprint’s Virgin Mobile brand, provides unlimited access to one of four social media services – Facebook, Twitter, Instagram, or Pinterest – on top of your data plan for $12 a month. An additional $10 will net unlimited use of all four, while $5 more grants unlimited streaming from any one music app. The base plan also includes 20 minutes of talk time and 20 texts, both of which can be upgraded. Lines start at $6.98 a month; “unlimited” access costs $5 extra. Plans can be adjusted on the fly, even daily if so desired.
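
Because the plan is priced as stackable add-ons, a month's bill is just a sum of the options chosen. A minimal sketch using the prices quoted above (the data plan itself is priced separately, from $8 a month per this article; the function and constant names are shorthand for illustration, not Virgin's own):

BASE_LINE = 6.98        # 20 minutes of talk and 20 texts
ONE_SOCIAL = 12.00      # unlimited access to one of the four social apps
ALL_FOUR_EXTRA = 10.00  # added on top of the single-app price
ONE_MUSIC = 5.00        # unlimited streaming from any one music app

def monthly_addons(social_apps=0, music=False):
    """Sum the Virgin Mobile Custom add-ons described above."""
    cost = BASE_LINE
    if social_apps == 1:
        cost += ONE_SOCIAL
    elif social_apps == 4:
        cost += ONE_SOCIAL + ALL_FOUR_EXTRA
    if music:
        cost += ONE_MUSIC
    return cost

print(monthly_addons(social_apps=4, music=True))  # 33.98, before the data plan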

The plan, President of Prepaid at Sprint Dow Draper told the Wall Street Journal, isn’t currently part of a promotion – none of the companies featured are subsidizing connection costs, unlike with AT&T’s Sponsored Data program – but he said “it’s definitely possible” down the road.

The new plan embodies the anti-net neutrality schemes advocates have been warning about for years. Instead of allowing data to flow unimpeded, Virgin Mobile Custom very clearly discriminates against a huge number of apps, ultimately relegating them to more restrictive data plans. If Sprint’s goal, as Mr. Draper implies, is to provide the Internet at palatable prices for poorer consumers, then lower-cost, capped but open access with an option to pay for more might be more appropriate (the cheapest data package Virgin is offering starts at $8 a month). Heck, T-Mobile does it free for tablets – why can’t Sprint do the same for prepaid phones?

These plans will be made available through Walmart beginning August 9. Supported handsets include the LG Unify, LG Pulse, and ZTE Emblem.
http://www.droid-life.com/2014/07/30...-prepaid-plan/





'Disturbing' Loophole to Throttle Unlimited Data
Sam Gustin

Federal Communications Commission Chairman Tom Wheeler is "deeply troubled" about Verizon Wireless's recently-announced plan to begin slowing down data speeds for some customers, he wrote in a strongly-worded letter to the company's CEO on Wednesday.

Last week, Verizon Wireless, the nation's largest mobile broadband provider, announced that starting in October, some customers who have unlimited plans and are heavy data users will have their speeds limited when they are connecting to cell sites that are experiencing heavy traffic.

Verizon Wireless calls this practice "Network Optimization." Others call it "throttling." And Wheeler is not happy about it.

"I am deeply troubled by your July 25, 2014 announcement that Verizon Wireless intends to slow down some customers' data speeds on your 4G LTE network starting in October 2014," Wheeler wrote in a letter to Daniel Mead, President and CEO of Verizon Wireless.

According to Verizon Wireless, customers who fall within the top 5 percent of data users, who have fulfilled their minimum contractual commitment, and are on unlimited plans using a 4G LTE device "may experience slower data speeds when using certain high bandwidth applications, such as streaming high-definition video or during real-time, online gaming, and only when connecting to a cell site when it is experiencing heavy demand."

As of March 2014, the top 5 percent of data users were using 4.7 GB or more of data each month, according to Verizon Wireless. The company has applied a similar policy to users on its 3G network since 2011; now the company wants to extend the policy to its higher-speed LTE network.
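
Expressed as decision logic, the policy Verizon describes is a conjunction of account conditions plus live congestion at the serving cell site. The following is a sketch of the stated criteria only, not Verizon's actual implementation:

TOP5_THRESHOLD_GB = 4.7  # top-5-percent line as of March 2014, per Verizon

def may_be_slowed(unlimited_plan, lte_device, past_min_contract,
                  monthly_gb, cell_site_congested):
    """All account conditions must hold, and slowing applies only
    while the serving cell site is experiencing heavy demand."""
    return (unlimited_plan and lte_device and past_min_contract
            and monthly_gb >= TOP5_THRESHOLD_GB and cell_site_congested)

print(may_be_slowed(True, True, True, 6.0, True))   # True
print(may_be_slowed(True, True, True, 6.0, False))  # False: site not congested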

Verizon justifies the plan by calling it a "network management" practice—a highly-loaded term-of-art in the telecom policy community that refers to ways in which broadband providers can manipulate speeds in order to ensure that their networks run smoothly and securely.

Wheeler is not buying it.

"Reasonable network management concerns the technical management of your network; it is not a loophole designed to enhance your revenue streams," Wheeler wrote in the letter, which was dated July 30 and obtained by Motherboard. "It is disturbing to me that Verizon Wireless would base its network management on distinctions among its customers' data plans, rather than on network architecture or technology."

In essence, Wheeler is accusing Verizon Wireless of cloaking a business decision designed to push users off unlimited data plans—which it discontinued offering to new customers in 2011—as a "network management" practice.

Verizon Wireless denies that its "Network Optimization" policy amounts to throttling. "The difference between our Network Optimization practices and throttling is network intelligence," the company says. "With throttling, your wireless data speed is reduced for your entire cycle, 100% of the time, no matter where you are."

But this claim is somewhat "nonsensical," according to Jon Brodkin at Ars Technica. "Throttling is still throttling whether it happens one percent of the time or 100 percent, of course," Brodkin wrote last week.

In his letter, Wheeler asks for a "prompt response" to several questions. This one may be the most pertinent: "How does Verizon Wireless justify this policy consistent with its continuing obligations under the 700 MHz C Block open platform rules?"

In 2008, Verizon agreed to pay $4.7 billion for the highly coveted 700 MHz C Block of wireless spectrum in a closely watched FCC auction, and in doing so agreed to abide by open platform provisions set by the FCC. As part of its bid, the company agreed not to deny, limit, or restrict the ability of users to download and use applications of their choosing on the network.

"We will officially respond to the Chairman's letter once we have received and reviewed it," a Verizon Wireless spokesperson said in a statement emailed to Motherboard. "However, what we announced last week was a highly targeted and very limited network optimization effort, only targeting cell sites experiencing high demand. The purpose is to ensure there is capacity for everyone in those limited circumstances, and that high users don't limit capacity for others."
http://motherboard.vice.com/read/fcc...unlimited-data





AT&T, IBM Research and ACS Create Faster Way to Distribute Bandwidth in the Cloud
Jonathan Vanian

SUMMARY:
The new SDN prototype, developed as part of the U.S. Government’s DARPA CORONET program, is basically a powerful resource management system that can coordinate data flow and hand out more bandwidth when needed.


AT&T, IBM and Applied Communication Sciences have teamed up to develop a software-defined networking (SDN) prototype technology that supposedly will give cloud service providers a faster way to access extra bandwidth in case something unexpected happens in their data centers.

A common problem facing a lot of cloud providers is having to deal with changes in performance requirements when a spike in user activity hits or some sort of disaster strikes that causes a cloud server to shut down. In instances like these, providers need to ensure that they can distribute extra bandwidth throughout their data center to take up the slack. The new SDN prototype is basically a powerful resource management system that can coordinate data flow and hand out more bandwidth when needed, the companies said.

The three companies developed the tech as part of the U.S. Government’s DARPA CORONET program, created in 2007 and designed to improve network architecture. To test the new system, the team of company scientists used an OpenStack cloud as their test bed. The system works with the help of a new IBM cloud platform technology that manages all virtual machine (VM) network applications running on OpenStack software; doing so can help users automatically monitor server load and discover if a server is peaking or going offline. The IBM platform then communicates with AT&T’s SDN wide area network (WAN) orchestrator, which handles all of the data server connection requests and can distribute the appropriate bandwidth when needed.
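
The division of labor described above maps onto a simple control loop: a monitor watches per-server load and asks the WAN orchestrator for more bandwidth when a server peaks or drops offline. The sketch below is illustrative only; every class and method name is a hypothetical stand-in, not the actual IBM or AT&T API:

import time

CONGESTION_THRESHOLD = 0.8  # fraction of capacity; assumed trigger point

class WanOrchestrator:
    """Stand-in for the SDN WAN orchestrator that provisions links."""
    def request_bandwidth(self, server_id, extra_gbps):
        print(f"provisioning +{extra_gbps} Gbps toward {server_id}")

def monitor(get_server_loads, orchestrator, cycles=3, interval_s=5):
    """Poll per-server load; escalate congested or offline servers."""
    for _ in range(cycles):
        for server_id, load in get_server_loads().items():
            if load is None:  # server offline: shift its traffic elsewhere
                orchestrator.request_bandwidth(server_id, extra_gbps=10)
            elif load > CONGESTION_THRESHOLD:
                orchestrator.request_bandwidth(server_id, extra_gbps=1)
        time.sleep(interval_s)

monitor(lambda: {"vm-a": 0.92, "vm-b": None, "vm-c": 0.4},
        WanOrchestrator(), cycles=1, interval_s=0)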

Once the system was set up, the team recorded setup times of 40 seconds and claimed that they were able to get results in under a second by using advanced reconfigurable optical add-drop multiplexer (ROADM) equipment, which helps allocate wavelength distribution for increased bandwidth.

Earlier this month, a team of MIT researchers developed their own networking management system called Fastpass that helps data get transferred across networks when periods of heavy traffic cause routers or network nodes to be congested.
http://gigaom.com/2014/07/29/att-ibm...-in-the-cloud/





Netflix Signs Peering Deal With AT&T to Reduce Buffering
Andy Fixmer

Netflix has reached an agreement with AT&T to give its streaming service direct access to AT&T's network, with the goal of reducing buffering when subscribers watch shows like House of Cards and Orange Is the New Black.

The so-called peering arrangement went into effect on Tuesday, said two people with knowledge of the situation, who asked not to be named because an announcement isn't planned. The sources declined to provide financial terms of the deal. AT&T confirmed the deal in an e-mailed statement.

"We reached an interconnect agreement with Netflix in May and since then have been working together to provision additional interconnect capacity to improve the viewing experience for our mutual subscribers,'' an AT&T spokeswoman said in the statement. "We’re now beginning to turn up the connections, a process that should be complete in the coming days."

Netflix has said it's paying some of the biggest Internet service providers in the U.S. for direct access to their networks in order to sidestep congestion that has caused frustrations for subscribers streaming high-definition videos. The company already signed similar "peering" or "interconnect" agreements with Comcast and Verizon this year.

Netflix, as well as Google's YouTube, provides a regularly updated video quality report that ranks the speed of ISPs. AT&T's services have typically ranked low on Netflix's list, although Verizon's services haven't fared much better since it signed a peering agreement with Netflix.

Earlier this year, Netflix made clear it wasn't happy about having to pay Comcast to deliver its content faster to consumers. It has waged a public campaign about the danger such deals pose to net neutrality, even as it has signed agreements with more providers.

In June, the Federal Communications Commission said it would try to obtain details of deals between Internet providers and content companies, including Netflix and YouTube.

But the agreements wouldn't be covered by the FCC's newly proposed rules for regulating how companies control the flow of data on the Internet. Net neutrality advocates are calling for the agency to take aggressive steps to ensure that ISPs are not able to play favorites among content providers.

Historically, peering agreements have been a common part of how the Internet operates — and haven't required payments between companies — by allowing users and content providers to send data across numerous networks. The goal was to balance traffic from one provider's network to another's.

More recently, Internet providers have begun to seek compensation for the traffic that some companies, such as Netflix, create on their networks. These deals create direct physical connections between networks.

Despite consumer frustrations over buffering speeds, Netflix's popularity continues to grow. The company said this month it had surpassed 50 million streaming subscribers, with 36.2 million in the U.S. and 13.8 million internationally. House of Cards, Netflix's first exclusive series, earned 13 Emmy nominations for its second season, while Netflix's prison drama Orange Is the New Black received 12 nominations.
http://mashable.com/2014/07/29/netfl...-peering-deal/





Atheists, Believe It or Not, Have Their Own Channel

Atheist TV has its premiere, on Roku and online
Neil Genzlinger

Atheists are angry, and watch out, because now they have a television channel.

This week the organization American Atheists announced the premiere of Atheist TV, available through the streaming service Roku and over the Internet. That news will certainly prompt assorted knee-jerk reactions in some quarters, and perhaps some confusion:

“Atheist TV? It’ll be full of incest and smut and debaucheries of all kinds. Oh, wait; that’s HBO.”

“Atheist TV? It’ll be nonstop mockery of conservative Christians and Republicans and Middle America. Oh, wait; that’s Comedy Central.”

“Atheist TV? It’ll be godless wiccans and flesh-eating zombies and serial killers and all manner of other people who lack the Judeo-Christian morals that built America. Oh, wait; that’s practically every mainstream network and cable channel.”

Tuesday night, at a party for the debut, David Silverman, president of American Atheists, described a channel that won’t be any of the sordid things that certain religious types might envision, but that will be a challenge to a lot of things those people hold dear. The channel, he said, will “provide a breadth of content, from science to politics to comedy, all centered around our common freedom from religion.”

American Atheists, founded in 1963, is a serious organization that advocates the absolute separation of church and state and a view of life that emphasizes the here and now and provable. The channel, Mr. Silverman said in the first streamed broadcast, will have no psychics, no ghost hunters, no “science fiction presented as science fact,” and will be “a place we can call our own, where we can speak the truth as frankly as we want.” It intends, he said, “to promote the idea that religion can and should be criticized.”

That will make it a lonely outpost. Religion isn’t hard to find on television, including some negative images of it — see the debate over the Muslim characters on the new FX drama “Tyrant” — but not many outlets that rely on advertising dollars are willing to ask probing questions about religion as big business, religion as an instigator of wars, religion as a suppressor of intellectual inquiry.

At first, Atheist TV will be limited, offering interviews with leading atheists, film from atheist conventions and other content from the Richard Dawkins Foundation and like-minded organizations. But it has plans to introduce original programming.

Among the people helping to bring that about, the channel has announced, will be the producer Liz Bronstein, whose credits include reality shows like “Whale Wars,” on Animal Planet, which is part of Discovery Communications — a company that Mr. Silverman slammed hard on Tuesday night.

“The TV networks kowtow to the liars who make money off of misinformation,” he said, singling out for special contempt outlets that mix silly supernatural gunk with more serious science and nature shows.

“The Discovery Channel treats ghosts like they’re real,” he said, adding later, “Bigfoot, psychics, aliens, ghosts, spirits, gods, devils — all bunk, all pushed by the so-called truthful and scientific stations in an effort to placate the waning religion segment at the expense of the growing segment of atheists who should be, but are not, their target audience.”

Whew. If he sounds peeved, well, it’s hard being an atheist in the United States, where plenty of people behave in decidedly un-Christian ways, but to speak ill of Christianity or other religions can be career-ending. How low in the hierarchy are American atheists? Dogs had their own channel before atheists did. Sarah Palin, too.

So expect a fair amount of bluntness when Atheist TV gets rolling. The outlet may have enemies in Very High Places: At the Tuesday event, seven minutes into the streaming of the first broadcast, the Internet feed in the room gave out. But if that was God sending a message, Mr. Silverman wasn’t deterred.

“Atheist TV is live,” he said, “and it’s going to stay live, 24/7, until the sun burns out.”
http://www.nytimes.com/2014/08/02/ar...nd-online.html

















Until next week,

- js.



















Current Week In Review





Recent WiRs -

July 26th, July 19th, July 12th, July 5th


Jack Spratts' Week In Review is published every Friday. Submit letters, articles, press releases, comments, questions etc. in plain text English to jackspratts (at) lycos (dot) com. Submission deadlines are Thursdays @ 1400 UTC. Please include contact info. The right to publish all remarks is reserved.


"The First Amendment rests on the assumption that the widest possible dissemination of information from diverse and antagonistic sources is essential to the welfare of the public."
- Hugo Black