22-10-14, 07:58 AM
JackSpratts
Join Date: May 2001
Location: New England
Posts: 10,013
Peer-To-Peer News - The Week In Review - October 25th, '14

Since 2002

"If the US gets its way, then criminal penalties will apply even against users who were not seeking financial gain from sharing or making available copyrighted works, such as fans and archivists. Such a broad definition is ripe for abuse." – EFF

October 25th, 2014




Latest TPP Leak Reveals Even Harsher Copyright Rules
Emma Woollacott

Wikileaks has released a new draft of the intellectual property chapter of the Trans-Pacific Partnership (TPP) trade agreement, revealing that the US is still pushing for draconian measures on copyright infringement.

“By publishing this text we allow the public to engage in issues that will have such a fundamental impact on their lives,” says Wikileaks editor in chief Julian Assange.

The US has pushed all along to introduce features of the controversial Digital Millennium Copyright Act (DMCA), such as compelling ISPs to alert customers accused of illegal downloads and, possibly, to take the infringing material down. ISPs that failed to do so would themselves be liable for any copyright infringement by their customers.

But the latest draft goes even further: the US wants to see these rules covering not just ISPs, but anyone providing internet services. And, as Alberto Cerda of Georgetown University Law Center points out to TorrentFreak, this means that coffee shops could potentially be held liable for copyright infringement by their customers.

Meanwhile, the copyright itself could be enforced for longer. While the previously-leaked draft showed that some countries were proposing flexibility on copyright terms, it seems that all are now agreed that there should be a universal minimum term, whether life-plus-50-years, life-plus-70 or life-plus-100.

The US is also calling for criminal sanctions for copyright infringement, even where the infringement isn’t being carried out for commercial reasons.

“If the US gets its way, then criminal penalties will apply even against users who were not seeking financial gain from sharing or making available copyrighted works, such as fans and archivists,” write Jeremy Malcolm and Maira Sutton of the Electronic Frontier Foundation (EFF). “Such a broad definition is ripe for abuse.”

Indeed, a similar provision in a free trade agreement between Colombia and the US led to copyright laws that saw a Colombian graduate student arrested for posting another student’s academic paper online without permission.

But while the last leaked draft of the TPP, dated November 2013, showed strong international opposition to this criminalization plan, Canada now seems to be the only serious hold-out.

This may, suggests James Love of Knowledge Ecology International, be because this new draft gives some countries extra time to implement the agreement – meaning that current governments won’t necessarily have to carry the can for their decisions.

“Developing countries are being asked to accept very restrictive standards for intellectual property in return for transition periods that defer the harm until current governments are no longer held accountable,” he says. “This will be a short-term benefit in exchange for a long term harm.”

The draft calls for countries to introduce criminal penalties for unauthorised access to, misappropriation of or disclosure of trade secrets “by any person using a computer system”. This would apply where the actions led to commercial advantage or financial gain; where they were directed by “a foreign economic entity”; or where they were detrimental to a country’s economic interests, international relations, national defence or national security.

These are very broad provisions, and don’t allow any exceptions in the public interest, such as journalism or whistleblowing.

“This text goes far beyond existing trade secrets law, which in the United States and other common law countries is usually a matter for the civil not the criminal courts,” write Malcolm and Sutton of the EFF.

“No public interest exception, such as for journalism, is provided. In practice, this could obligate countries into enacting a draconian anti-hacking law much like the Computer Fraud and Abuse Act (CFAA) that was used to prosecute Aaron Swartz.”

These changes matter all the more because of the inclusion in the trade agreement of so-called Investor-State Dispute Settlement (ISDS) – a system that allows corporations to sue governments for decisions that result in a loss of profits. UN figures uncovered by the Independent newspaper recently revealed that US companies have already made billions of dollars from suing foreign governments under similar ISDS agreements.

The good news is that the US appears to be fairly isolated in some of its more extreme requests, with Canada pushing back hard: indeed, according to Wikileaks, Canada has registered its opposition to proposals 56 times, more than any other country. Canada recently enacted its own copyright legislation and is working hard to keep it; and while it now looks like the only country still putting up much of a fight against the US, this latest leak could work in its favor.

The next round of negotiations is due to take place in Australia at the end of this month.
http://www.forbes.com/sites/emmawool...pyright-rules/





iiNet Back in Court Over P2P File Sharing

Dallas Buyers Club, LLC takes group of ISPs to court over customers' alleged piracy
Rohan Pearce

For the second time, Internet service provider iiNet has been dragged to court over the alleged copyright infringement of some of its customers.

Dallas Buyers Club, LLC on 14 October initiated legal action in the Federal Court aimed at obtaining the details of a number of iiNet customers.

The company is seeking as part of a process of preliminary discovery the details of customers linked to IP addresses that it alleges were involved in downloading the 2013 movie Dallas Buyers Club using BitTorrent or other peer-to-peer software.

"In plain terms, Dallas Buyers Club wants the names and contact details of our customers they believe may have illegally shared their film," an iiNet blog entry stated.

Documents filed by Dallas Buyers Club list other prospective respondents as iiNet subsidiaries Adam Internet and Internode, as well as ISPs Amnet Broadband and Dodo, and Wideband Networks Pty Ltd.

iiNet said it would oppose the action by Dallas Buyers Club LLC.

"We are concerned that our customers will be unfairly targeted to settle any claims out of court using a practice called 'speculative invoicing'," the blog entry states.

"Speculative invoicing, as practiced overseas, commonly involves sending intimidating letters of demand to subscribers seeking significant sums for an alleged infringement. These letters often threaten court action and point to high monetary penalties if sums are not paid."

Computerworld Australia has asked representatives of Dallas Buyers Club LLC whether more ISPs are likely to be targeted for court action.

A number of lawsuits relating to unauthorised downloads of Dallas Buyers Club have been filed by the movie's studio, Voltage, in the US.

The iiNet case is due for a directions hearing on 4 November in the Federal Court in NSW.

iiNet was previously embroiled in a copyright-related court case brought against it by a number of movie studios. The lengthy court process, which began in 2009 and concluded in 2012, resulted in a High Court victory for the ISP.

That case revolved around the responsibility of the ISP for preventing the alleged copyright infringing activities of its customers.

The High Court's judgement noted that the movie studios that brought the action against the ISP, rather than pursue iiNet customers alleged to have downloaded movies, "seek to fix iiNet with the liability of a secondary infringer in relation to those primary infringements."

The judgement noted that "iiNet had no direct technical power at its disposal to prevent a customer from using the BitTorrent system to download the appellants' films on that customer's computer".

Reversing the outcome of that trial has been an explicit goal of Attorney-General George Brandis. The government has outlined a range of proposed changes to how online copyright enforcement functions in Australia in order to crack down on piracy.
http://www.computerworld.com.au/arti...-file-sharing/





BitTorrent Performance Test Shows How Much Faster Sync is Compared to Google Drive, OneDrive, and Dropbox
Emil Protalinski

Now that its file synchronization tool has received a few updates, BitTorrent is going on the offensive against cloud-based storage services by showing off just how fast BitTorrent Sync can be. More specifically, the company conducted a test that shows Sync destroys Google Drive, Microsoft’s OneDrive, and Dropbox.

The company transferred a 1.36 GB MP4 video clip between two Apple MacBook Pros using two Apple Thunderbolt to Gigabit Ethernet Adapters, the Time.gov site as a real-time clock, and the Internet connection at its headquarters (1 Gbps up/down). The timer started when the file transfer was initiated and then stopped once the file was fully synced and downloaded onto the receiving machine.

Sync performed 8x faster than Google Drive, 11x faster than OneDrive, and 16x faster than Dropbox.

Sync’s time might seem ridiculously low, almost as if the Internet wasn’t involved at all. You have to remember, however, that BitTorrent’s headquarters has a ridiculously fast connection both downstream and upstream.
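
A quick back-of-the-envelope check (my own arithmetic, not from BitTorrent's post) shows why a very low Sync time is still plausible on a link like that:

# Rough floor for moving the 1.36 GB test file over a saturated 1 Gbps link.
# Assumes the link is fully utilized and ignores protocol overhead.
file_size_gb = 1.36          # size of the test video in gigabytes
link_speed_gbps = 1.0        # BitTorrent HQ connection, up and down

file_size_gigabits = file_size_gb * 8              # about 10.88 gigabits
floor_seconds = file_size_gigabits / link_speed_gbps

print(f"Theoretical minimum transfer time: {floor_seconds:.1f} seconds")
# Prints roughly 10.9 seconds, so a Sync result that looks "too fast to
# involve the Internet" is still consistent with a saturated gigabit pipe.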

BitTorrent ran three tests for each service: In the morning at the start of the workday, in the afternoon, and in the evening when the company’s San Francisco office was mostly empty and Internet use was at a minimum. The reported times are averages of those three runs for each service.

Running our own test

The results are impressive, and while we didn’t doubt them given how Sync is designed, we figured doing the test ourselves was worth a shot.

BitTorrent made a persuasive argument in its blog post:

It’s important here to note that Dropbox, Google Drive and Microsoft OneDrive all rate-limit uploads and do not fully utilize the 1 Gbps bandwidth available (in regards to the office Internet connection, not the LAN switch). We’re confident that a slower Internet connection would yield similar results.

Most people don’t have access to such speeds. My ISP, for example, claims I have a 35 Mbps download and 3 Mbps upload connection, which is certainly slower than the above. I do sometimes get faster speeds than that, especially if I’m transferring over FTP or BitTorrent, but I honestly wasn’t expecting anything too crazy in this case.

Nonetheless, I grabbed the first four episodes (1.37 GB) of The Wire, which just so happens to be one of the best TV shows ever made, and downloaded Sync onto my Windows desktop. I then sent the link along to my laptop, grabbed Sync there as well, and hit approve to get things rolling.

The whole process was done in 1 minute and 6 seconds, according to the timer on my Nexus 5. Yet this is for a test performed on the same network. What happens if I want to send the episodes to my friend across town?

The transfer process was much longer. Times were in the double digit minutes, and largely depended on what connections my friends had.

Yet it’s worth noting that Google Drive, OneDrive, and Dropbox still performed worse. They were limited by the same download bandwidth, but the upload section of the process was notably much slower (many ISPs worldwide offer much slower upload speeds than download speeds).

The bottom line

The reason Sync is so fast comes down to the fact that it is based on the BitTorrent protocol. This means it is designed to take the shortest and fastest path when getting file pieces from one device to another, and it doesn’t have to rely on third-party servers that are typically involved when sending files via cloud services.

That being said, your mileage will definitely vary. You may find that OneDrive is faster than Google Drive, or that Dropbox is king. Sync will definitely be the fastest if you and the receiver are on the same network or physically close (that’s exactly why BitTorrent is trying to pitch Sync to businesses, especially if they need to transfer files between different operating systems).

Of course, it’s possible that Sync will not be the fastest for you, especially if you and the receiver are close to a Google, Microsoft, or Dropbox data center. Sync should win if it can max out your connection, but if for whatever reason it can’t (factors include your ISP, time of day, and overall congestion on the network), the alternatives are certainly good ones.

At the end of the day, Sync may be the fastest in most cases, but speed probably isn’t the only feature you want.
http://venturebeat.com/2014/10/22/bi...e-and-dropbox/





Silicon Valley Stirs Up Hollywood
Nick Bilton

Last week in San Francisco, Vanity Fair held its New Establishment Summit, where old-media titans like Bob Iger of Disney could schmooze with new techies like Evan Spiegel of Snapchat.

But not everyone was sure what they were doing there. “I thought you decided not to come to this?” a Hollywood man in his mid-40s was overheard saying to another executive.

“I wasn’t going to, but I woke bolt upright in bed this morning at 5 a.m. and got on the next flight up here,” the executive replied. “I had this moment of realization that this is the future of our business, not what’s going on down there.”

These two men weren’t alone. Baby billionaire nerds populated the stage, but the audience looked more like a red carpet Hollywood screening: Brad Grey, the chairman of Paramount Pictures; Richard Lovett, president of Creative Artists Agency; Brian Grazer, a founder of Imagine Entertainment; and Jon Feltheimer, chief executive of Lions Gate Entertainment, just to name a few.

The “up here”/“down there” divide was evident throughout the two-day conference.

I bumped into George Lucas, the creator of “Star Wars,” in the hallway of the Yerba Buena Center for the Arts, and asked him why all these Hollywood tycoons and directors were in attendance, when they could be lounging by their pools 380 miles away.

Mr. Lucas pointed his nose in the air and sniffed like a rabbit. “Do you smell that?” he said. (I wiggled my nose a little, but beyond the whiff of bad coffee, I got nothing.) “It’s the smell of money,” he said. “That’s why they’re here.”

The problem was, these Down There executives weren’t sure if the tech moguls wanted to join forces, buy them or destroy them. (Maybe it’s all of the above, just not in that particular order.) It was like watching tourists go cage-diving with sharks, the scent of blood and chum in the water, and as the sharp teeth and fins become visible, wondering if the bars are strong enough to keep the predators at bay.

Hollywood and Silicon Valley have always been at odds. Down There thinks that without its movies and TV shows, Up Here would have no content to show. Up Here thinks that Down There can be disrupted, just like every other industry. But with Hollywood having one of the worst summers in recent memory, and the economics of the movie industry undergoing technological upheaval, Silicon Valley may have the upper hand.

“It’s no accident that this conference is happening right here, right now, because there is a convergence with Hollywood and Silicon Valley that is undeniable,” said Allan Loeb, a screenwriter based in Los Angeles, on the first day of the conference. “Silicon Valley knows it needs content, and Hollywood knows it needs money.”

“But,” he added, leaning in and whispering, “because Silicon Valley is so much richer than Hollywood, what I believe is about to occur is that Silicon Valley is going to buy Hollywood.”

He wasn’t the first attendee to offer this prediction. I heard several venture capitalists mention that for one-fifth of the $22 billion that Facebook paid for WhatsApp, the messaging platform, it could buy Lions Gate or AMC Networks, valued at $4 billion each. Facebook could easily make that content exclusive. (“Breaking Bad,” brought to you by Facebook.) In one session, Kara Swisher of Re/code talked about Apple buying Disney, as if it were buying a carton of milk at the deli.

On Wednesday, HBO announced that it would offer a stand-alone streaming service in 2015. Clearly, the company is trying to teeter on a tightrope of old and new, hoping to make it to the other side without a loud splat.

Mr. Lucas’s vision of Hollywood isn’t so generous as to include studios being acquired. He believes that technology has made filmmaking so cheap that movies will be created and distributed by everybody, cutting Hollywood out of the process.

Was Hollywood concerned? It seemed to be clueless, judging by one panel discussion titled “Who Owns the Screen?” Susan Wojcicki, the chief executive of YouTube, spoke about how younger consumers are increasingly watching content that Hollywood has no hand in making.

As Exhibit A, she told the audience to look at PewDiePie (pronounced Pew Dee Pie), a Swedish gamer who records himself playing video games and who happens to be YouTube’s most followed personality, with 31 million subscribers.

After the session, people in the hallway kept asking one another if they had heard of this “Pew Dye Pie” guy. It reminded me of my grandparents asking if I had ever watched “the MTV” when I was 8.

Graydon Carter, the editor of Vanity Fair, agreed that Silicon Valley and Hollywood still had a lot to work out. I asked why Vanity Fair had chosen to host the conference near Silicon Valley, and not Hollywood or even New York.

“You fish where the fishes are,” he said. “In so much of America, people are in love with the past, but Northern California is a place where people are in love with the future.”

The big question is, will Down There be part of the future that Up Here is about to create.
http://www.nytimes.com/2014/10/16/fa...hollywood.html





Streaming Music Has Left Me Adrift
Dan Brooks

It’s hard to imagine now, but there once was a time when you could not play any song ever recorded, instantly, from your phone. I call this period adolescence. It lasted approximately 30 years, and it was galvanized by conflict.

At that time, music had to be melted onto plastic discs and shipped across the country in trucks. In order to keep this system running smoothly, a handful of major labels coordinated with broadcasters and retailers to encourage everyone to like the same thing, e.g. Third Eye Blind. This approach divided music into two broad categories: “popular” and what I liked.

Lest history remember industry versus indie as a distinction without difference, I should point out that mainstream rock was genuinely awful in the two decades before Napster. Classic rock gave way to glam metal, which was vanquished by Nirvana and grunge, whose promise quickly curdled into the cynical marketing strategy known as “alternative.” From Journey to Smash Mouth, the major-label system peddled an enormous quantity of objectively hideous music in its waning years.

In a now-famous 1993 essay for The Baffler, the musician and recording engineer Steve Albini described how this system pauperized bands to enrich a series of middlemen. The structure of most contracts meant that artists paid back almost all their royalties in managers’ and recording fees. The occasional hits profited the artists far less than they did the labels, whose marketing departments ignored most of their catalogs to focus on the hits. For a majority of bands, signing with a major label was the first step toward going out of business. Albini called it “the problem with music”: the major-label system acted as an anticurator by making good music harder to find. For me, adopting an indie-snob identity (subset punk) didn’t just solve this problem and provide me with a lifetime of sound-as-art, it also gave me something to talk about with other pointy-haired youngsters I ran across.

Then a different subset of nerds invented MP3 encoding, and everything changed. The good news is that digital distribution neatly solved Albini’s problem with music: Now that nearly every piece of recorded sound is as easy to find as any other, everyone can finally listen to what we snobs wanted them to hear all along. (Also on the plus side, labels have joined bands in not making any money.) The bad news is that we have lost what was once a robust system for identifying kindred spirits. Now that we all share the same record collection, music snobs have no means to recognize one another. We cannot flip through a binder of CDs and see a new friend, a potential date. By making it perfectly easy to find new music, we’ve made it a little more difficult to find new people.

Before Spotify solved the problem with music forever, esoteric taste was a measure of commitment. When every band was more or less difficult to hear by virtue of its distance from a major label, what you liked was a rough indicator of the resources you had invested in music. If you liked the New York City squat-punk band Choking Victim, it was a sign you had flipped through enough records and endured enough party conversations to hear about Choking Victim. The bands you listened to conveyed not just the particular elements of culture you liked but also how much you cared about culture itself.

Like blasted pecs or a little rhinestone flag pin, esoteric taste in music is an indicator of values. Under the heel of the major-label system in the early ’90s, indie taste meant more than liking weird bands. To care about obscure bands was to reject the perceived conformity of popular culture, to demand a more nuanced reading of the human experience than Amy Grant’s “Baby Baby” and therefore to assert a certain kind of life. That assertion was central to my identity as a young adult, and I found that people who shared it were more likely to agree with me on seemingly unrelated issues. Like all aesthetics, taste in music is a worldview.

But music is not just an aesthetic pursuit; it’s also intensely social. You listen to music at parties, around the house with your college roommates, in the car on the way to high school — anywhere meaningful interpersonal connections are made or, importantly, not. If you prefer to put on the radio, you have something in common with most people, and therefore nobody. But if you put on the Brian Jonestown Massacre, you will quickly identify who else in the room is a bit like you.

The translation of musical taste to social acceptance was in many ways terrifyingly complex and arbitrary. I once attended a party at the home of a poetry professor who, in her meticulous preparations, happened to leave out one CD: Stephen Malkmus and the Jicks. It was a gutless choice, the act of a person who reads music magazines. Any other album would have revealed her taste, but instead she had only shown that she understood what our kind liked.

Her transparency scandalized me, because at that time I understood musical taste as a central element of personality. In college, I was horrified to learn that a smart and culturally sophisticated woman I had been dating owned just six CDs. I couldn’t comprehend how such a sensitive — and, given the circumstances, evidently charitable — person could not be interested in music. I felt like a sommelier walking into A.A. At a level of understanding since replaced by OkCupid match percentage, I knew I was taking a long shot. Years later, when my friends and I discussed the powerful and surely arbitrary forces that had kept us single, we toyed with the idea that “into music” was a deal-breaker quality in a mate.

The application of such reasoning over two decades of romantic entanglements, night-life chums and road-trip mixes makes it safe to say that musical taste has plotted the course of my life. Since age 14, it has determined my leisure hours and then my career, and by extension, my friends and lovers. We embraced art and rejected a major-label system that cared only about selling records. Oddly, we expressed our position by buying records. The problem with my life as an anticorporate bohemian was that it was predicated on a consumer behavior.

Joseph Heath and Andrew Potter explore this contradiction in their 2004 book, “Nation of Rebels: How Counterculture Became Consumer Culture.” They argue that contemporary consumer culture is driven not by a desire to keep up with the Joneses but by the opposite impulse: to individuate. We believe our purchases distinguish us from a perceived mainstream of numb consumers, so we cannot stop buying things.

Certainly, this reasoning lay at the core of my indie identity. But when nerds figured out how to play music over the Internet, it rendered indie culture inert. The shift away from physical albums destroyed that mechanism of consumer individuation. When getting into a band became as easy as typing its name into a search box, particular musical tastes lost their function as signifiers of commitment. What you listened to ceased to be a measure of how much you cared and became a mere list of what you liked.

Worse, this list was no more ethically righteous than anyone else’s. You didn’t have to support local businesses or hang with freaky beatniks to hear Choking Victim anymore, so liking them became no better (or worse) than liking Pearl Jam.

Last spring, I befriended a charming stranger on the basis of our mutual interest in the Slits. If you haven’t heard them, statistics suggest that you will enjoy their cover of “I Heard It Through the Grapevine” and nothing else. When she started talking about the import version of “The Peel Sessions,” I knew I had met somebody special. We went back to her apartment and played each other songs on Spotify for three hours. Thanks to the Internet, we both had all the same albums.

Such connections are still possible, even in this new world of abundant content. But have they become too possible — so possible, in the universal digital distribution of Slits records, that they have lost meaning? What is a Slits fan like now that her habits differ from those of a Kesha fan only in the letters she types into a box? If I passed her in a store aisle, would she notice that we used to hate the same media conglomerates? I worry she would not. Now that the tyranny of the majors has been overthrown, the members of the resistance don’t recognize one another anymore.

The digital age has given everyone in America a better music collection than the one I put together over the last 20 years, and in so doing it has leveled us. James Murphy describes this problem in the LCD Soundsystem song “Losing My Edge”:

I heard you have a compilation of every good song ever done by anybody:

Every great song by the Beach Boys, all the underground hits, all the Modern Lovers tracks.

I heard you have a vinyl of every Niagara record on German import.

I heard that you have a white label of every seminal Detroit techno hit: 1985, ’86, ’87.

I heard that you have a CD compilation of every good ’60s cut and another box set from the ’70s.


In a swoop, the Internet devalued the consumer capital that Murphy amassed through years of collecting records as a D.J. But at least he was married before it happened.

As generational problems go, this one is pretty mild. My grandfather, for example, had to stop Hitler from overrunning Europe. But in the same way that he came back from France saying beaucoup a lot, these seismic changes alter us in ways we don’t perceive. Consuming music, an act central to my being for as long as I can remember, has changed forever. Who knows how that will change me?

My record collection is no longer a lifestyle, a biography, a status. The identities that I and a generation of fellow aesthetes spent our lives fashioning are suddenly obsolete. They turned out to be mere patterns of consumption, no more resilient than the patterns of production that provoked them. Not content to ruin music for the first three decades of my life, the major labels have collapsed and ruined dating too. I will probably never forgive them, if I ever get around to forgiving myself.
http://www.nytimes.com/2014/10/19/ma...me-adrift.html





Researcher Finds Tor Exit Node Adding Malware to Binaries
Dennis Fisher

A security researcher has identified a Tor exit node that was actively patching binaries users download, adding malware to the files dynamically. The discovery, experts say, highlights the danger of trusting files downloaded from unknown sources and the potential for attackers to abuse the trust users have in Tor and similar services.

Josh Pitts of Leviathan Security Group ran across the misbehaving Tor exit node while performing some research on download servers that might be patching binaries during download through a man-in-the-middle attack. Downloading any kind of file from the Internet is a dodgy proposition these days, and many users know that if they’re downloading files from some random torrent site in Syria or The Marshall Islands, they are rolling the dice. Malware runs rampant on these kinds of sites.

But the scenario that worries security experts much more involves an attacker being able to control the download mechanism for security updates, say for Windows or OS X. If an attacker can insert malware into this channel, he could cause serious damage to a broad population of users, as those update channels are trusted implicitly by users and their machines. Legitimate software vendors typically will sign their binaries, and modified ones will cause verification errors. What Pitts found during his research is that an attacker with a MITM position can actively patch binaries–if not security updates–with his own code.

Pitts built a framework called BDF (Backdoor Factory) that can patch executable binaries with shell code in such a way that the binary will execute as intended, without the user noticing. He wanted to see whether anyone was conducting this kind of attack on the Internet right now, so he decided to have a look at Tor, the anonymity network, which is used by people around the world.

“To have the best chance of catching modified binaries in transit over the Internet, I needed as many exit points in as many countries as possible. Using Tor would give me this access, and thus the greatest chance of finding someone conducting this malicious MITM patching activity,” Pitts wrote in his explanation of the research.

“After researching the available tools, I settled on exitmap. Exitmap is Python-based and allows one to write modules to check exit nodes for various modifications of traffic. Exitmap is the result of a research project called Spoiled Onions that was completed by both the PriSec group at Karlstad University and SBA Research in Austria. I wrote a module for exitmap, named patchingCheck.py, and have submitted a pull request to the official GitHub repository. Soon after building my module, I let exitmap run. It did not take long, about an hour, to catch my first malicious exit node.”

The exit node in question was in Russia, and Pitts discovered that the node was actively patching any binaries he downloaded with a piece of malware. He downloaded binaries from a variety of sources, including Microsoft.com, and each of them came loaded with malicious code that opens a port to listen for commands and starts sending HTTP requests to a remote server.
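
A rough way to hunt for this kind of in-transit tampering yourself, sketched here as a standalone check rather than Pitts' actual exitmap module, is to fetch the same file directly and through a local Tor SOCKS proxy and compare digests. The URL below is a placeholder, and the socks5h proxy scheme requires requests to be installed with PySocks:

import hashlib

import requests  # install as requests[socks] (PySocks) for the socks5h proxy below

# Placeholder download URL and the default local Tor SOCKS port; adjust as needed.
URL = "https://example.com/some-installer.exe"
TOR_PROXIES = {
    "http": "socks5h://127.0.0.1:9050",
    "https": "socks5h://127.0.0.1:9050",
}

def sha256_of(url, proxies=None):
    """Download a file (optionally through Tor) and return its SHA-256 digest."""
    resp = requests.get(url, proxies=proxies, timeout=120)
    resp.raise_for_status()
    return hashlib.sha256(resp.content).hexdigest()

direct_digest = sha256_of(URL)             # fetched over the normal connection
tor_digest = sha256_of(URL, TOR_PROXIES)   # fetched through whichever exit Tor picks

if direct_digest != tor_digest:
    print("Digest mismatch - the file may have been modified in transit.")
else:
    print("Digests match:", direct_digest)

This only flags a problem when the direct path is clean and the server returns identical bytes on each request; Pitts' exitmap-based approach is far more systematic because it can iterate over specific exit relays.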

Pitts informed officials at the Tor Project, who quickly flagged the exit node as bad.

“We’ve now set the BadExit flag on this relay, so others won’t accidentally run across it. We certainly do need more people thinking about more modules for the exitmap scanner. In general, it seems like a tough arms race to play,” Roger Dingledine, one of the original developers of Tor, wrote in a message on a Tor mailing list Friday.

In terms of defending against this sort of attack, Pitts suggested that encrypted download channels are the best option, both for users and site operators.

“SSL/TLS is the only way to prevent this from happening. End-users may want to consider installing HTTPS Everywhere or similar plugins for their browser to help ensure their traffic is always encrypted,” he said via email.

Pitts said that the relay in Russia was the only one he found that was exhibiting this malicious behavior, but that doesn’t mean it’s not happening elsewhere.

“Out of over 1110 exit nodes on the Tor network, this is the only node that I found patching binaries, although this node attempts to patch just about all the binaries that I tested. The node only patched uncompressed PE files. This does not mean that other nodes on the Tor network are not patching binaries; I may not have caught them, or they may be waiting to patch only a small set of binaries,” he said.

This isn’t the first time that attackers have been found using such an attack in the wild. In 2012 the Flame malware was seen using a complicated technique that involved the attackers using a forged Microsoft certificate to impersonate a Windows Update server and distribute Flame to more users. That attack involved a lot of moving parts and was a highly targeted attack, whereas the Tor attack Pitts found is applicable to a much wider potential population.

“The problem of modified binaries is not limited to Tor. We highlight the example because of some of the misconceptions people have about Tor providing increased safety. In general, users should be wary of where they download software and ensure they are using TLS/SSL. Sites not supporting TLS/SSL should be persuaded to do so,” Pitts said.
http://threatpost.com/researcher-fin...inaries/109008





What We Give Away When We Log On to a Public Wi-Fi Network
Maurits Martijn

In his backpack, Wouter Slotboom, 34, carries around a small black device, slightly larger than a pack of cigarettes, with an antenna on it. I meet Wouter by chance at a random café in the center of Amsterdam. It is a sunny day and almost all the tables are occupied. Some people talk, others are working on their laptops or playing with their smartphones.

Wouter removes his laptop from his backpack, puts the black device on the table, and hides it under a menu. A waitress passes by and we ask for two coffees and the password for the WiFi network. Meanwhile, Wouter switches on his laptop and device, launches some programs, and soon the screen starts to fill with green text lines. It gradually becomes clear that Wouter’s device is connecting to the laptops, smartphones, and tablets of local cafe visitors.

On his screen, phrases like “iPhone Joris” and “Simone’s MacBook” start to appear. The device’s antenna is intercepting the signals that are being sent from the laptops, smartphones, and tablets around us.

More text starts to appear on the screen. We are able to see which WiFi networks the devices were previously connected to. Sometimes the names of the networks are composed of mostly numbers and random letters, making it hard to trace them to a definite location, but more often than not, these WiFi networks give away the place they belong to.

We learn that Joris had previously visited McDonald’s, probably spent his vacation in Spain (lots of Spanish-language network names), and had been kart-racing (he had connected to a network belonging to a well-known local kart-racing center). Martin, another café visitor, had been logged on to the network of Heathrow airport and the American airline Southwest. In Amsterdam, he’s probably staying at the White Tulip Hostel. He had also paid a visit to a coffee shop called The Bulldog.

Session 1:

Let everyone connect to our fake network

The waitress serves us our coffee and hands us the WiFi password. After Slotboom is connected, he is able to provide all the visitors with an internet connection and to redirect all internet traffic through his little device.

Most smartphones, laptops, and tablets automatically search and connect to WiFi networks. They usually prefer a network with a previously established connection. If you have ever logged on to the T-Mobile network on the train, for example, your device will search for a T-Mobile network in the area.

Slotboom’s device is capable of registering these searches and appearing as that trusted WiFi network. To demonstrate, I suddenly see the name of my home network appear on my iPhone’s list of available networks, as well as my workplace, and a list of cafes, hotel lobbies, trains, and other public places I’ve visited. My phone automatically connects itself to one of these networks, which all belong to the black device.

Slotboom can also broadcast a fictitious network name, making users believe they are actually connecting to the network of the place they’re visiting. For example, if a place has a WiFi network consisting of random letters and numbers (Fritzbox xyz123), Slotboom is able to provide the network name (Starbucks). People, he says, are much more willing to connect to these.

We see more and more visitors log on to our fictitious network. The siren song of the little black device appears to be irresistible. Already 20 smartphones and laptops are ours. If he wanted to, Slotboom is now able to completely ruin the lives of the people connected: He can retrieve their passwords, steal their identity, and plunder their bank accounts. Later today, he will show me how. I have given him permission to hack me in order to demonstrate what he is capable of, though it could be done to anyone with a smartphone in search of a network, or a laptop connecting to a WiFi network.

Everything, with very few exceptions, can be cracked.

The idea that public WiFi networks are not secure is not exactly news. It is, however, news that can’t be repeated often enough. There are currently more than 1.43 billion smartphone users worldwide and more than 150 million smartphone owners in the U.S., where more than 92 million adults own a tablet and more than 155 million own a laptop. Each year the worldwide demand for more laptops and tablets increases. In 2013, an estimated 206 million tablets and 180 million laptops were sold worldwide. Probably everyone with a portable device has once been connected to a public WiFi network: while having a coffee, on the train, or at a hotel.

The good news is that some networks are better protected than others; some email and social media services use encryption methods that are more secure than their competitors. But spend a day walking in the city with Wouter Slotboom, and you’ll find that almost everything and everyone connected to a WiFi network can be hacked. A study from threat intelligence consultancy, Risk Based Security, estimates that more than 822 million records were exposed worldwide in 2013, including credit card numbers, birth dates, medical information, phone numbers, social security numbers, addresses, user names, emails, names, and passwords. Sixty-five percent of those records came from the U.S. According to IT security firm Kaspersky Lab, in 2013 an estimated 37.3 million users worldwide and 4.5 million Americans were the victim of phishing—or pharming—attempts, meaning payment details were stolen from hacked computers, smartphones, or website users.

Report after report shows that digital identity fraud is an increasingly common problem. Hackers and cybercriminals currently have many different tricks at their disposal. But the prevalence of open, unprotected WiFi networks does make it extremely easy for them. The National Cyber Security Center, a division of the Ministry of Security and Justice, did not issue the following advice in vain: “It is not advisable to use open WiFi networks in public places. If these networks are used, work or financial related activities should better be avoided.”

Slotboom calls himself an “ethical hacker,” or one of the good guys; a technology buff who wants to reveal the potential dangers of the internet and technology. He advises individuals and companies on how to better protect themselves and their information. He does this, as he did today, usually by demonstrating how easy it is to inflict damage. Because really, it’s child’s play: The device is cheap, and the software for intercepting traffic is very easy to use and readily available for download. “All you need is 70 Euros, an average IQ, and a little patience,” he says. I will refrain from elaborating on some of the more technical aspects, such as equipment, software, and apps needed to go about hacking people.

Session 2: Scanning for Names, Passwords and Sexual Orientation

Armed with Slotboom’s backpack, we move to a coffeehouse that is known for the beautiful flowers drawn in the foam of the lattes, and as a popular spot for freelancers working on laptops. This place is now packed with people concentrating on their screens.

Slotboom switches on his equipment. He takes us through the same steps, and within a couple of minutes, 20 or so devices are connected to ours. Again we see their MAC addresses and login history, and in some cases their owners’ names. At my request, we now go a step further.

Slotboom launches another program (also readily available for download), which allows him to extract even more information from the connected smartphones and laptops. We are able to see the specifications of the mobile phone models (Samsung Galaxy S4), the language settings for the different devices, and the version of the operating system used (iOS 7.0.5).

The latter can be extremely valuable information for a malicious hacker. If a device has an outdated operating system, for example, there are always known “bugs,” or holes in the security system that can be easily exploited. With this kind of information, you have what you need to break into the operating system and take over the device. A sampling of the coffeehouse customers reveals that none of the connected devices have the latest version of the operating system installed. For all these legacy systems, a known bug is listed online.

We can now see some of the actual internet traffic of those around us. We see that someone with a MacBook is browsing the site Nu.nl. We can see that many devices are sending documents using WeTransfer, some are connecting to Dropbox, and some show activity on Tumblr. We see that someone has just logged on to FourSquare. The name of this person is also shown, and, after googling his name, we recognize him as the person sitting just a few feet away from us.

Information comes flooding in, even from visitors who are not actively working or surfing. Many email programs and apps constantly make contact with their servers—a necessary step for a device to retrieve new emails. For some devices and programs, we are able to see what information is being sent, and to which server.

And now it’s getting really personal. We see that one visitor has the gay dating app Grindr installed on his smartphone. We also see the name and type of the smartphone he’s using (iPhone 5s). We stop here, but it would be a breeze to find out to whom the phone belongs. We also see that someone’s phone is attempting to connect to a server in Russia, sending the password along with it, which we are able to intercept.

Session 3: Checking out Studies, Hobbies and Relationship Problems

Many apps, programs, websites, and types of software make use of encryption technologies. These are there to ensure that the information sent and received from a device is not accessible to unauthorized eyes. But once the user is connected to Slotboom’s WiFi network, these security measures can be circumvented relatively easily, with the help of decryption software.

To our shared surprise, we see an app sending personal information to a company that sells online advertising. Among other things, we see the location data, technical information of the phone, and information of the WiFi network. We can also see the name (first and last) of a woman using the social bookmarking website, Delicious. Delicious allows users to share websites—bookmarks—they are interested in. In principle, the pages that users of Delicious share are available publicly, yet we can’t help feeling like voyeurs when we realize just how much we are able to learn about this woman on the basis of this information.

First we google her name, which immediately allows us to determine what she looks like and where in the coffeehouse she is sitting. We learn that she was born in a different European country and only recently moved to the Netherlands. Through Delicious we discover that she’s been visiting the website of a Dutch language course and she has bookmarked a website with information on the Dutch integration course.

In less than 20 minutes, here’s what we’ve learned about the woman sitting 10 feet from us: where she was born, where she studied, that she has an interest in yoga, that she’s bookmarked an online offer for an anti-snore mantra, that she recently visited Thailand and Laos, and that she shows a remarkable interest in sites that offer tips on how to save a relationship.

Slotboom shows me some more hacker tricks. Using an app on his phone, he is able to change specific words on any website. For example, whenever the word “Opstelten” (the name of a Dutch politician) is mentioned, people see the word “Dutroux” (the name of a convicted serial killer) rendered on the page instead. We tested it and it works. We try another trick: Anyone loading a website that includes pictures gets to see a picture selected by Slotboom. This all sounds funny if you’re looking for some mischief, but it also makes it possible to load images of child pornography on someone’s smartphone, the possession of which is a criminal offense.

Password Intercepted

We visit yet another cafe. My last request to Slotboom is to show me what he would do if he wanted to really harm me. He asks me to go to Live.com (the Microsoft email site) and enter a random username and password. A few seconds later, the information I just typed appears on his screen. “Now I have the login details of your email account,” Slotboom says. “The first thing I would do is change the password of your account and to indicate to other services you use that I have forgotten my password. Most people use the same email account for all services. And those new passwords will then be sent to your mailbox, which means I will have them at my disposal as well.” We do the same for Facebook: Slotboom is able to intercept the login name and password I entered with relative ease.

Another trick that Slotboom uses is to divert my internet traffic. For example, whenever I try to access the webpage of my bank, he has instructed his program to re-direct me to a page he owns: a cloned site that appears to be identical to the trusted site, but is in fact completely controlled by Slotboom. Hackers call this DNS spoofing. The information I entered on the site is stored on the server owned by Slotboom. Within 20 minutes he’s obtained the login details, including passwords for my Live.com, SNS Bank, Facebook, and DigiD accounts.

I will never again be connecting to an insecure public WiFi network without taking security measures.
https://decorrespondent.nl/1101/What...40493-53737dba





E.T. Phone Home?

This repository provides a corpus of network communications automatically sent to Apple by OS X Yosemite; we're using this dataset to explore how Yosemite shares user data with Apple.

The provided data was collected using our Net Monitor toolkit; more information regarding usage and methodology is provided below.

Examples

The following occur with all privacy options enabled -- including disabling analytics (i.e., Diagnostics and Usage Data).

About this Mac

When the user selects 'About this Mac' from the Apple menu, Yosemite phones home and s_vi, a unique analytics identifier, is included in the request. (s_vi is used by Adobe/Omniture's analytics software).

If we search the logs for the cookie value, we can find:

• Where the identifying cookie was first set -- when the user visited http://www.apple.com in Safari, with an expiration of two years.
• Where else the cookie is sent to Apple -- for example, when both Spotlight and Help phone home.
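
For example, a small helper along these lines (a hypothetical script with a placeholder dataset path and cookie value, not something shipped with the repository) could scan the captured per-connection logs for a given identifier:

import os

# Placeholders: point these at a checked-out dataset directory and at the
# s_vi value observed for your own machine.
DATASET_DIR = "eff-user-r0"
COOKIE_VALUE = "EXAMPLE_S_VI_VALUE"

def find_value(dataset_dir, needle):
    """Yield (log path, matching line) pairs for every logged request containing the value."""
    for root, _dirs, files in os.walk(dataset_dir):
        for name in files:
            if not name.endswith(".log"):
                continue
            path = os.path.join(root, name)
            with open(path, errors="replace") as fh:
                for line in fh:
                    if needle in line:
                        yield path, line.rstrip()

for path, line in find_value(DATASET_DIR, COOKIE_VALUE):
    print(f"{path}: {line}")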

DuckDuckGo for Privacy

Having read DuckDuckGo's privacy statements, you might decide to switch Safari's default search to DuckDuckGo. If we enter a new search in Safari, we can then search the logged data to see who the search terms are actually sent to.

The logs show that a copy of your Safari searches is still sent to Apple, even when DuckDuckGo is selected as your search provider and 'Spotlight Suggestions' are disabled in System Preferences > Spotlight.

Non-Cloud Mail Account

When setting up a new Mail.app account for the address admin@fix-macosx.com, which is hosted locally, searching the logs for "fix-macosx.com" shows that Mail quietly sends the domain entered by the user to Apple, too.

Methodology, Usage, and Caveats

Two different datasets are provided; these were generated in independent VMs with fresh installs of Mac OS X Yosemite:

• eff-user-r0

o All data sharing options disabled.
o Location services disabled.
o iCloud not used.
o No Apple ID used.
o DuckDuckGo selected as Safari search engine

• icloud-user-r0

o Installed with all default options, including sending of "Diagnostics and Usage Data".
o iCloud and most iCloud features enabled, including iCloud drive.

All TCP/SSL connections are logged with one file per connection: <application path>/<iso 8601 time>-<username>-<src addr>-<dest-addr>.log. Non-TCP traffic (such as UDP or ICMP) is logged in pcap format in udp-monitor/*.pcap.

Caveats

• This data was collected over the course of a few hours, and with only minimal interaction with the system and applications. It is not a complete representative set of all data potentially collected by Yosemite; for example:
o icloud-user-r0 dataset does not contain the diagnostics data periodically sent to Apple.
o Cursory usage means that application-specific logs are not representative -- e.g., when setting up a Mail account, we only entered information on the first screen.
• Correlation of sockets with file system executable paths is reasonably accurate; actual correspondence should be sanity checked (we've seen cases where proc_pidpath() returned paths for processes that could not be running).
• TLS traffic using client certificates cannot be captured in plaintext by default. For example, NM captures the key exchange performed by apsd (Apple Push Services Daemon), that establishes a client certificate, but NM can't transparently sniff future communications protected by that certificate without the addition of apsd-specific protocol handling.
• Not all traffic is logged in plaintext, so the lack of a match on a search should not be treated as conclusive; it may be necessary to decode data that was encoded for transmission via URL encoding, base64, protobuf, etc.
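
As a small illustration of that last caveat, a helper along these lines (hypothetical, with a made-up example value rather than anything from the datasets) can take a quick first pass at URL-decoding and base64-decoding an opaque string pulled from a log:

import base64
import binascii
from urllib.parse import unquote

def try_decode(value):
    """Best-effort decodings of an opaque value copied out of a request log."""
    results = {"raw": value, "url-decoded": unquote(value)}
    try:
        padded = value + "=" * (-len(value) % 4)  # repair stripped base64 padding
        results["base64"] = base64.b64decode(padded)
    except (ValueError, binascii.Error):
        pass  # not valid base64; leave that entry out
    return results

# Made-up example value; decodes to b'Opaque example value'.
for kind, decoded in try_decode("T3BhcXVlIGV4YW1wbGUgdmFsdWU").items():
    print(kind, decoded, sep=": ")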

Contributing

Help is requested in all of the following areas:

• Finding and documenting privacy issues.
• Enhanced automated dataset visualization/decoding.
• Adding application-specific support for processes using client-certificates to SSLsplit.
• Automated (re-)generation of the datasets (e.g., scripting installation and application use).
• Using net-monitor to gather data from AirDrop, Handoff, and other technologies that are difficult to run in a VM environment.
• Exploring work-arounds (e.g., sandboxing, firewalling).

https://github.com/fix-macosx/yosemite-phone-home/





Sprint Promises to Take 2G Into the Internet of Things

They'll be no refarming round here
Bill Ray

Sprint has committed to keeping its 2G network operational beyond AT&T and Verizon, hoping to sign up some machines even if fleshy humans wander away.

Announcing a deal with u-blox to provide embedded modules which are pin compatible with GSM kit already in use, Sprint promised to maintain its CDMA network for "the long term", hoping to attract machines put off by AT&T's decision to shut down 2G in 2017.

Smart meters, cars, home automation and industrial applications all need low-bandwidth connectivity - the Internet of Things won't happen without it. Right now 2G telephone networks are just about the only option; the 900MHz ISM band will mesh a metropolitan area and White Space devices are on their way, but for the moment the cellular operators have a window of opportunity.

2G telephone networks aren't well suited to M2M applications - the signalling overhead and synchronous bandwidth work against them - but the largely-ubiquitous coverage and decent building penetration offset that.

2G cellular modules are expensive, in M2M terms, costing around $20, but a 3G module will come in at twice that, while LTE is several times the price. That $20 is also largely consumed in patent fees, so the price won't drop much more despite technical innovation.

Machines have a much longer life than phones, so operators wanting to grab some IoT revenue are promising to keep their 2G networks alive for at least a decade, and often longer, which is the motivation behind Sprint's commitment.

Verizon isn't being as precise as AT&T about its 2G network, but admits it will be shutting down 2G around 2021. T-Mobile is refarming its 2G spectrum while publicly saying it will reserve enough to keep a 2G network operational, as it too fancies some machine-to-machine pie.

But keeping a 2G network operational is expensive, and if it's only going to be used to connect a few thousand 'leccy meters then there's not a lot of point. Operators hoping to manage those connections need to be upbeat, though, as any failing confidence will prevent utilities committing to a 2G future.
http://www.theregister.co.uk/2013/04/24/sprint_2g/





T-Mobile Quietly Hardens Part of its U.S. Cellular Network Against Snooping
Ashkan Soltani and Craig Timberg

Wireless carrier T-Mobile US has been quietly upgrading its network in a way that makes it harder for surveillance equipment to eavesdrop on calls and monitor texts, even on the company’s legacy system.

The upgrade involves switching to a new encryption standard, called A5/3, that is harder to crack than older forms of encryption. Testing by The Washington Post has found T-Mobile networks using A5/3 in New York, Washington and Boulder, Colorado, instead of the older A5/1 that long has been standard for second-generation (2G) GSM networks in the United States. More advanced technologies, such as 3G and 4G, already use stronger encryption.

T-Mobile, the fourth-largest wireless carrier in the United States, declined to describe the extent of its network upgrades, saying in a statement, “T-Mobile is continuously implementing advanced security technologies in accordance with worldwide recognized and trusted standards.”

Deutsche Telekom, the majority shareholder of T-Mobile, last year announced plans to make A5/3 standard on all of its 2G networks in Germany. That came after news reports, based on documents provided by former National Security Agency contractor Edward Snowden, that the NSA was eavesdropping on phone calls by German Chancellor Angela Merkel, causing massive backlash in Germany. (The Post reported in December that the NSA can decode texts and conversations using A5/1 encryption.)

In places where T-Mobile is using A5/3 encryption, mass surveillance becomes more difficult because equipment that passively collects cellular signals from the air often cannot decode calls. Active attacks, involving a device called an “IMSI catcher,” may still be able to eavesdrop on individual calls by manipulating a phone’s security settings directly, without having to crack the encryption.

AT&T, the largest provider of GSM cell phone services in the country, said last year that it was deploying A5/3 encryption for parts of its network. “AT&T always protects its customers with the best encryption possible in line with what their device will support,” the company said in a statement.

Tests by the Post in New York, Washington, and Boulder, Colorado showed that AT&T calls used the older A5/1 encryption, making them more vulnerable to interception by law enforcement officials or criminals with access to advanced surveillance technology. The tests were performed using a custom application called Darshak which was released at the Black Hat security conference in August.

AT&T plans to shut down its 2G network in 2017, replacing it with newer, more secure technology. Nationwide, an estimated 13 percent of cellular connections used 2G technology in 2013, and that percentage is expected to fall to 7.2 percent by 2018, according to Cisco, a leading manufacturer of telecommunications network equipment. Internationally, 2G calls are much more prevalent, making up 68.4% of the market in 2013.

The other largest U.S. cellular carriers, Verizon and Sprint, use different technologies in their networks.
http://www.washingtonpost.com/blogs/...inst-snooping/





How Verizon’s Advertising Header Works
Jonathan Mayer

Over the past couple of days, there’s been an outpouring of concern about Verizon’s advertising practices. Verizon Wireless is injecting a unique identifier into web requests, as data transits the network. On my phone, for example, here’s the extra HTTP header.[1]

X-UIDH: OTgxNTk2NDk0ADJVquRu5NS5+rSbBANlrp+13QL7CXLGsFHpMi4LsUHw
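
For readers who want to check their own connection, one quick test is to request a plain-HTTP page that echoes request headers back and look for X-UIDH in the response. The sketch below is our illustration rather than anything from the original post; it assumes the public echo endpoint httpbin.org and the Python requests library, and it only works over the Verizon cellular data connection (not Wi-Fi), since the header is injected in transit and only into unencrypted HTTP traffic.

# Rough self-test for an injected tracking header (illustrative sketch only).
# Run over a cellular data connection; Wi-Fi bypasses the carrier, and HTTPS
# traffic cannot be modified in transit.
import requests

resp = requests.get("http://httpbin.org/headers", timeout=10)
headers_seen = resp.json()["headers"]   # httpbin echoes back the request headers

uidh = headers_seen.get("X-Uidh") or headers_seen.get("X-UIDH")
if uidh:
    print("Carrier-injected identifier present:", uidh)
else:
    print("No X-UIDH header observed on this connection.")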

After poring over Verizon’s related patents and marketing materials, here’s my rough understanding of how the header works.

In short, Verizon is packaging and selling subscriber information, acting as a data broker on real-time advertising exchanges. Questionable. By default, the information appears to consist of demographic and geographic segments.[2] If a user has opted into “Verizon Selects,” then Verizon also shares behavioral profiles built by deep packet inspection.

Whatever the merits of Verizon’s new business model, the technical design has two substantial shortcomings. First, the X-UIDH header functions as a temporary supercookie.[3] Any website can easily track a user, regardless of cookie blocking and other privacy protections.[4] No relationship with Verizon is required.

Second, while Verizon offers privacy settings, they don’t prevent sending the X-UIDH header.[5] All they do, seemingly, is prevent Verizon from selling information about a user.

Much better designs are possible. Verizon doesn’t need to supercookie its wireless subscribers to sell their advertising segments.[6] And it certainly doesn’t need to send a supercookie if a user isn’t participating.

(The original post includes a diagram of the header’s data flow, built with phone, server, cloud, and cash assets from The Noun Project.) Thanks to the participants in Princeton’s Web Tracking and Transparency Workshop, who provided valuable feedback.

______________________


1. In my (very limited) testing, the header was injected into every HTTP request from my iPhone 6 Plus. Some subscribers have reported not seeing the header, or only seeing the header with certain requests.

2. Verizon’s case studies also suggest the system can be used for advertising attribution.

3. According to a comment on Hacker News, the X-UIDH value changes each week. I can’t (yet) confirm that. Over the past two days, anyway, the X-UIDH value for my phone has been static.

4. HTTP blocking, like Adblock Plus or Privacy Badger, would still be effective.

5. If I understand correctly, the demographic and geographic advertising segments are opt out, associated with Verizon’s CPNI privacy preference. Behavioral segments are opt in, associated with the “Verizon Selects” preference (formerly “Relevant Mobile Advertising”).

6. For example, Verizon could send an encrypted ID and nonce with each request. A recipient website would not be able to use the values to track a user.
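
To make footnote 6 concrete, here is a minimal sketch, in Python, of the kind of design described there: encrypt the subscriber ID under a carrier-held key with a fresh nonce on every request, so the header value changes each time and recipient websites cannot link requests together, while the carrier can still recover the ID for its own systems. This is our illustration of that idea, not anything Verizon has published; it assumes the third-party cryptography package and a made-up subscriber ID.

# Hypothetical rotating alternative to a static X-UIDH value (sketch only).
import os
import base64
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

CARRIER_KEY = AESGCM.generate_key(bit_length=128)   # known only to the carrier

def make_header(subscriber_id: str) -> str:
    nonce = os.urandom(12)                           # fresh nonce per request
    ciphertext = AESGCM(CARRIER_KEY).encrypt(nonce, subscriber_id.encode(), None)
    return base64.b64encode(nonce + ciphertext).decode()

def carrier_decode(value: str) -> str:
    raw = base64.b64decode(value)
    nonce, ciphertext = raw[:12], raw[12:]
    return AESGCM(CARRIER_KEY).decrypt(nonce, ciphertext, None).decode()

# Two requests from the same (made-up) subscriber produce unlinkable values.
print(make_header("981596494"))
print(make_header("981596494"))
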
http://webpolicy.org/2014/10/24/how-...-header-works/





Local Police Tracking Mobile Phones Using Military Gear
Peter Robison

Freddy Martinez, a 27-year-old systems administrator, was in Chicago’s Daley Plaza last February protesting National Security Agency surveillance programs when a sedan with the green-lettered license plates of an unmarked police vehicle pulled up nearby. He’d noticed trouble with dropped calls at previous demonstrations, including the 2012 NATO summit. He opened an app on his phone that spots nearby cellular transmitters and saw a new signal. He wondered if it might be coming from the car.

Martinez filed a request under state public records laws for information about mobile-phone surveillance equipment used by the Chicago Police Department. In April the department produced invoices for military-grade spy gear that identify and track mobile phones in real time, Bloomberg Businessweek reports in its Oct. 20 issue. The department declined to say more, citing exemptions under the Homeland Security Act and the Arms Control Export Act.

In September, Martinez sued in Illinois state court seeking details about how the department has used the equipment. “Whether you think this is good or bad technology for the police to have, the public is entitled to some sense of how it’s being used,” says Matthew Topic, Martinez’s lawyer. Chicago police declined to comment on pending litigation.

In the past decade, local law enforcement agencies have spent millions on sophisticated surveillance tools using grants from the U.S. Department of Homeland Security, which has showered more than $35 billion on cities, counties, and states for terrorism prevention and disaster preparedness. Funding has also come from federal drug enforcement grants and nonprofits that fundraise for police departments.

Tech Intruder

Civil liberties advocates say the technology intrudes on the activities of innocent people, not just criminal suspects. Police can use the devices without anyone knowing -- and without having to get approval from judges, as they would to obtain phone records. “We’re talking about really sophisticated, invasive technology being used with next to no oversight by local cops,” says Christopher Soghoian, a senior policy analyst with the American Civil Liberties Union.

One of the main beneficiaries is Harris Corp., a Melbourne, Florida, defense contractor. The company is authorized by the Federal Communications Commission to sell a mobile-phone tracking tool known as the StingRay. The suitcase-size devices send signals that mimic cell towers, tricking phones as far as a mile away into transmitting identifying information.

StingRay Sales

Chicago spent at least $150,000 on StingRay equipment, including software upgrades and antennas, according to the documents released in response to Martinez’s request. Houston, San Francisco, and Miami have mobile-phone tracking devices made by Harris, too. The company, which also makes military radios and manages air traffic control systems for the Federal Aviation Administration, routinely refuses to discuss the StingRay. “We refer everything back to the law enforcement agencies that are reportedly using them,” says spokesman Jim Burke.

At least 44 police forces across 18 states have mobile-phone surveillance gear, and others borrow it from federal agencies such as the Federal Bureau of Investigation or the Drug Enforcement Administration, according to the ACLU.

Police in Tacoma, Washington, told city officials they planned to use their StingRay to detect mobile phone-detonated bombs like those often seen in Iraq. City council members and judges were surprised to learn police use it about three times a month in routine investigations, according to the Tacoma News Tribune. Once, it was used to recover a stolen city laptop. Officials told the newspaper they had confidence in the department’s ability to manage the technology.

Crime Prevention

“If our law enforcement need access to information to prevent crime or keep us safe, that’s a legitimate use of the technology,” said Mayor Marilyn Strickland.

Few guidelines exist for governing the use of mobile-phone trackers, says Brian Owsley, an assistant professor at Indiana Tech Law School. As a federal magistrate judge in Texas, he denied a request from the DEA to use the StingRay in a 2012 drug trafficking case because the agency offered no plan for protecting innocent people from being monitored.

In other cases, authorities have kept mobile-phone surveillance secret from judges. Police in Tallahassee, Florida, used StingRays more than 200 times without seeking warrants, according to testimony in a sexual battery and theft case. Police found the defendant by tracking the victim’s mobile phone. A Florida appeals court last year threw out the conviction, saying it violated Fourth Amendment protections against unreasonable searches.

“There’s no doubt somebody’s saying, ‘We don’t have time to get a court order, but we’re going to find out who’s at this protest,’ ” says Owsley.
http://www.bloomberg.com/news/2014-1...tary-gear.html





East Bay CHP Officer Accused Of Stealing Nude Photos Says It’s ‘Game’ For Police

An East Bay California Highway Patrol officer accused of stealing nude photos of a DUI suspect is speaking out, claiming that officers have stolen images for years.

Officer Sean Harrington of Martinez reportedly confessed to stealing explicit photos from the suspect’s phone, and said he forwarded those images to at least two other CHP officers.

According to court documents obtained by the Contra Costa Times, Harrington told investigators this kind of image stealing has been going on for years in the state law enforcement agency, stretching all the way from Los Angeles to his own Dublin station.

Harrington called the photo stealing a “game” and said he had done the same thing to female arrestees a “half dozen times in the last several years.”

Text messages obtained by the newspaper also show the kind of exchanges going on. In one exchange involving Harrington and another officer, Harrington said, “Just return a favor down the road buddy” with a smiley face.

Suspicions that the photos were stolen were first raised when the suspect said she synced her phone after her arrest and noticed that six photos had been sent from her phone to another account.

“They’re personal private photos she meant to keep private on her own cellphone and were not meant for anybody else’s eyes,” Richard Madsen, the victim’s attorney told KPIX 5.

In a written statement to KPIX 5, CHP Commissioner Joseph Farrow said, “The allegations anger and disgust me. We expect the highest levels of integrity and moral strength from everyone in the California Highway Patrol and there is no place in our organization for such behavior.”
http://sanfrancisco.cbslocal.com/201...an-harrington/





At Austin Airport, Wi-Fi Predicts How Long the Security Line Will Be

Using signals from passengers' devices, the system collects data that can help travelers plan ahead
Stephen Lawson

The Internet can ease travel concerns in many ways, including flight-delay information, maps of road congestion, and ride-sharing apps. But a Wi-Fi network at the Austin, Texas, airport can now answer one of the great unknowns: How long will I have to wait in line at security?

That information is available thanks to fairly simple technology implemented on a Cisco Systems network run by global Wi-Fi provider Boingo Wireless. It's an early example of how the so-called Internet of Things can make some parts of life easier.

Austin-Bergstrom International Airport got the nation's first airport Wi-Fi network in 2000, according to Boingo, which has run the airport's Wi-Fi since 2008. Now it's become one of the first airports to implement Passpoint, the standard that lets users of some devices get on networks and roam between them without entering a username and password. The Cisco network that supports Passpoint can also use location technologies for additional services.

Travelers don't even need to get on the network to take advantage of the security-wait warning system. A forecast for how long each line will take appears on screens right outside the security checkpoint. And any traveler who goes through security with a device that has Wi-Fi or Bluetooth turned on also helps to make the system work, according to Boingo CTO Derek Peterson. Boingo has launched the wait-sensing technology at three airports, all in trial mode, and Austin's is the first facility where it's displaying the information.

Here's how the system works: Wi-Fi devices with standard settings turned on constantly send out signals looking for nearby Wi-Fi devices and access points. Access points near the security checkpoints detect those signals and the unique MAC (media access control) addresses associated with them. Using that data, the system determines when that device entered the area of the queue and when it reached the other end of the checkpoint, after the owner finished with security.

In some areas, the airport does the same thing with beacons that detect Bluetooth signals emitted by users' devices. The unique Bluetooth ID identifies each device, so it works the same way as a MAC address. In some areas, the system uses both Wi-Fi and Bluetooth.
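
As a rough illustration of the matching step described above (our own sketch, not Boingo's implementation), the wait for each device is simply the gap between when its hashed identifier is first seen at the entrance of the queue and when it is seen again past the checkpoint:

# Illustrative wait-time estimator from anonymized entry/exit sightings.
import hashlib
from statistics import median

def anonymize(device_id: str) -> str:
    # Hash the MAC or Bluetooth ID so no raw identifier needs to be stored.
    return hashlib.sha256(device_id.encode()).hexdigest()[:16]

def wait_times(entry_sightings, exit_sightings):
    """Each argument is an iterable of (device_id, unix_timestamp) pairs."""
    entered = {anonymize(d): t for d, t in entry_sightings}
    waits = []
    for d, t_exit in exit_sightings:
        key = anonymize(d)
        if key in entered and t_exit > entered[key]:
            waits.append(t_exit - entered[key])
    return waits

# Made-up sightings from two devices passing through the checkpoint.
entries = [("aa:bb:cc:dd:ee:01", 1000), ("aa:bb:cc:dd:ee:02", 1030)]
exits = [("aa:bb:cc:dd:ee:01", 1600), ("aa:bb:cc:dd:ee:02", 1690)]
print(f"median wait: {median(wait_times(entries, exits)) / 60:.1f} minutes")  # -> 10.5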

Despite using a unique identifier for each device, the system doesn't identify the person carrying that device, Peterson said: The airport isn't concerned with who's made it through security, just how long it took them to get through.

There's one wait-time display for each line to go through security, so before they get in line, travelers can join the one with the shortest wait. In effect, they do their own load balancing, which can minimize the wait time for everyone.

The signs help set travelers' expectations, which has reduced complaints about the wait, Peterson said. But the displays are just the beginning. All that data is stored -- again, without any names associated with it -- and can be analyzed to estimate how long the security line will be at any given time and day. Boingo has shared the information with the airport and the Transportation Security Administration (TSA), which then made changes that reduced wait times, he said. It may also share the data with airlines, which could use it to estimate wait times for their customers. Delivered well before their flights, those estimates could help them decide when to leave for the airport.

Data from Wi-Fi and Bluetooth devices makes for pretty accurate estimates of wait times, according to Peterson. With about two months of historical data, the system can build a predictive model for normal days that's about 99 percent accurate, he said. In Austin, where the system has more than a year of data built up, it can now account for holiday travel surges. But other events, like the shooting of a TSA agent that shut down part of Los Angeles International Airport last year, can't really be built into a predictive model.

Boingo envisions using the network's location capabilities for other services, too. For example, the airport could predict busy times for restrooms with a detection system similar to the one at the security checkpoints, and it could track movable airport assets such as wheelchairs by equipping them with Wi-Fi radios, Peterson said.

Additional services, for travelers who opt in to them, could include notifications and retail promotions that are based on where someone is in the airport and how much time they have to get to the gate and board, he said.

Wireless location technology has improved dramatically just in the past six to 18 months, according to Peterson. "It's come a long way in the accuracy that you're able to get off of both Bluetooth and Wi-Fi," he said.

As for luggage, seemingly the most misplaced thing in airports, there may be a Wi-Fi solution for that problem, too.

"That is one of the applications that we've toyed with," Peterson said.
http://www.itworld.com/article/28377...e-will-be.html





Congress to the FBI: There's 'Zero Chance' We'll Force Apple to Decrypt Phones
Jason Koebler

The FBI's director wants Congress to force Apple and Google to do away with default smartphone encryption. Congress, however, doesn’t look to be with him.

Last week, FBI director James Comey suggested that encryption "threatens to lead all of us to a very dark place" and suggested that if Apple and Google don't remove default encryption from iOS and Android then "Congress might have to force this on companies."

But years of National Security Agency surveillance, other privacy oversteps, and surveillance creep by the federal government have left lawmakers skittish about doing anything that'll be seen as expanding the surveillance state, even if Congress still isn't ready to roll back the laws it already has on the books.

"To FBI Director Comey and the Admin on criticisms of legitimate businesses using encryption: you reap what you sow," California Republican Rep. Darrell Issa tweeted. "The FBI and Justice Department must be more accountable—tough sell for them to now ask the American people for more surveillance power."

Issa holds considerable power on such matters, and The Hill reported that other lawmakers have echoed his sentiments. Rep. Zoe Lofgren (a California Democrat who has been staunchly anti-surveillance for some years now) said that Comey's proposal would have "zero chance" of passing; Sen. Ron Wyden (D-Ore.) told the publication that he doubts more than "a handful" of lawmakers would support such a bill.

Americans have watched their government mislead the public about data collection and resist necessary oversight.

— Darrell Issa (@DarrellIssa) October 17, 2014


So, while it’s disappointing Congress won’t roll back NSA surveillance, it’s at least heartening to hear that Congress thinks that passing a bill like Comey has suggested would be political suicide.

Comey repeatedly suggested that "bad guys" use encryption to evade law enforcement, and that it's time to have a "conversation as a country about where we are, where we want to be, with respect to the authority of law enforcement."


The Electronic Frontier Foundation points out that we’ve already been through this, back in the 1990s, in what was called the “Crypto Wars.” The Communications Assistance for Law Enforcement Act states that companies “shall not be responsible for decrypting, or ensuring the government’s ability to decrypt” communication.

“The law specifically ensures that a company is not required to essentially become an agent of the FBI rather than serving your security and privacy interests,” the EFF said in a statement.

That conversation may be occurring right now, and it appears to be a quick one: The people’s elected representatives aren't with you, James.
http://motherboard.vice.com/read/con...decrypt-phones





CIA Apparently 'Impersonated' Senate Staffers to Gain Access to Documents on Shared Drives
Tim Cushing

The CIA is still fighting for creative control of its most anticipated 21st century work: the Torture Report. Long before it got involved in the ongoing redaction battle, it was spying on those putting the report together, namely Senators and Senate staffers. Hands were wrung, apologies were made and it was medically determined that Sen. Dianne Feinstein doesn't have an ironic bone in her body.

The Torture Report's final cut now seemingly lies in the hands of White House Chief of Staff Denis McDonough -- a rather strange place for it to be considering the administration has no shortage of officials willing to offer their input on national security issues. But McDonough's ill-fitting position as go-between to the Senate and the CIA isn't the most interesting part of the story, although it appears he's trying to keep the "hanging" of CIA director John Brennan from being a foregone conclusion. Neither he nor the White House have suggested a replacement scapegoat, so Brennan may end up paying the price despite having the administration's full support. You can't just drop something as damaging as the Torture Report on the American public and simply walk away from it. A symbolic sacrifice still needs to be made, even if the underlying problems continue to be ignored.

No, the most interesting part of the latest Torture Report details almost falls off the end of the page over at The Huffington Post. It's more hints of CIA spying, ones that go a bit further than previously covered.

According to sources familiar with the CIA inspector general report that details the alleged abuses by agency officials, CIA agents impersonated Senate staffers in order to gain access to Senate communications and drafts of the Intelligence Committee investigation. These sources requested anonymity because the details of the agency's inspector general report remain classified.

"If people knew the details of what they actually did to hack into the Senate computers to go search for the torture document, jaws would drop. It's straight out of a movie," said one Senate source familiar with the document.


Impersonating staff to gain access to Senate Torture Report work material would be straight-up espionage. Before we get to the response that mitigates the severity of this allegation, let's look at what we do know.

The CIA accessed the Senate's private network to (presumably) gain access to works-in-progress. This was denied (badly) by CIA director John Brennan. The CIA also claimed Senate staffers had improperly accessed classified documents and reported them to the DOJ, even though they knew the charges were false. Then, after Brennan told his agency to stop spying on the Senate, agents took it upon themselves to improperly access Senate email accounts. This is all gleaned from a few public statements and a one-page summary of an Inspector General's report -- the same unreleased report EPIC is currently suing the agency over.

Now, there's this: accusations that the CIA impersonated Senate staffers in hopes of accessing Torture Report documents. Certainly a believable accusation, considering the tactics it's deployed in the very recent past. This is being denied -- or, at least, talked around.

A person familiar with the events surrounding the dispute between the CIA and Intelligence Committee said the suggestion that the agency posed as staff to access drafts of the study is untrue.

“CIA simply attempted to determine if its side of the firewall could have been accessed through the Google search tool. CIA did not use administrator access to examine [Intelligence Committee] work product,” the source said.


So, it was just an innocuous firewall test. And according to this explanation, it wasn't done to examine the Senate's in-progress Torture Report. But this narrative meshes with previous accusations, including those detailed in the Inspector General's report.

Logging on to the shared drives with Senate credentials would allow agents to check the firewall for holes. But it also would allow them to see other Senate documents, presumably only accessible from that "side" of the firewall. While there's been no mention of "impersonation" up to this point, the first violation highlighted by the IG's report seems to be the most likely explanation of what happened here.

Five Agency employees, two attorneys and three information technology (IT) staff members, improperly accessed or caused access to the SSCI Majority staff shared drives on the RDINet

Accessing another part of the shared network/drive by using someone else's credentials is low-level hackery, but not the first thing that springs to mind when someone says "impersonation." A supposed firewall test would be the perfect cover for sniffing around previously off-limits areas. Much of what has come to light about the agency's actions hints at low-level espionage. There's still more buried in the IG report that the agency is actively trying to keep from being made public. Even if these activities didn't specifically "target" Senate work material, it was all there and able to be accessed. It doesn't really matter what the CIA says it was looking for. The fact that it was done at all, and done with such carefree audacity, is the problem. There are presumably ways to perform these checks that don't involve Inspectors General, damning reports and multiple hacking accusations.
https://www.techdirt.com/articles/20...d-drives.shtml





Facebook Rebukes DEA For Impersonating Woman Online

After BuzzFeed News revealed that the Drug Enforcement Administration had created a phony Facebook page using a real woman’s name — without her knowledge — the company has told the agency it committed a “serious breach” of Facebook’s terms of service.
Chris Hamby

Facebook has bluntly told the U.S. Drug Enforcement Administration to stop using phony accounts and posing as real people in its investigations.

The company’s rebuke, delivered Friday in a sharply critical letter to the law enforcement agency, comes after BuzzFeed News disclosed that a DEA agent had created a bogus Facebook account, impersonated an upstate New York woman, and posted racy photos of her and an image of her young son from her seized cell phone — all without her knowledge. The agent used the account to contact suspected criminals.

Lawyers from the U.S. Department of Justice have defended the agent’s actions in court filings. But Facebook strongly disagreed.

“We regard the DEA’s conduct to be a knowing and serious breach of Facebook’s terms and policies,” the social media site’s chief security officer, Joe Sullivan, wrote in the letter to DEA Administrator Michele Leonhart. He also said the company is “deeply troubled” by the agency’s legal position.

Facebook removed the fake profile after BuzzFeed News revealed that it had been created by the agent, and a Justice Department spokesman responded to the story by saying, “The incident at issue in this case is under review by Justice Department officials.”

In a statement about Facebook’s letter, the spokesman said, “That review is ongoing, but to our knowledge, this is not a widespread practice among our federal law enforcement agencies.” The Associated Press first reported on the letter.

The actions by the DEA agent, Timothy Sinnigen, came to light because the woman he impersonated, Sondra Arquiett, is suing him and the government in federal court, saying the bogus profile violated her privacy and placed her in danger.

Law enforcement officers had arrested Arquiett in 2010, accusing her of being part of a drug ring. But evidence emerged that she was a bit player. She accepted responsibility and pled guilty, and a judge sentenced her to probation, which she has completed.

But while she was awaiting trial, Sinnigen created a Facebook page using her real name, which was then Sondra Prince. He posted photos from her phone, including one of her posing, legs spread, on the hood of a BMW and another of her with her son and niece, who were young children.

When BuzzFeed News disclosed the profile’s existence, it was still viewable by anyone with a Facebook account.

In a court filing, a U.S. Attorney stated that Sinnigen sent a friend request to a fugitive, accepted other friend requests, and used the account “for a legitimate law enforcement purpose.”

The government offered a defense of Sinnigen’s actions in a court document: “Defendants admit that Plaintiff did not give express permission for the use of photographs contained on her phone on an undercover Facebook page, but state the Plaintiff implicitly consented by granting access to the information stored in her cell phone and by consenting to the use of that information to aid in an ongoing criminal investigations [sic].”

Facebook’s letter addresses this claim, saying, “Facebook is deeply troubled by the DEA’s claims and legal position.” Facebook’s Sullivan wrote that “our terms and Community Standards — which the DEA agent had to acknowledge and agree to when registering for a Facebook account — expressly prohibit the creation and use of fake accounts.” He added, “Facebook has long made clear that law enforcement authorities are subject to these policies.”
http://www.buzzfeed.com/chrishamby/f...g-woman-online





New Evidence of the NSA Deliberately Weakening Encryption
Tom Leinster

One of the most high-profile ways in which mathematicians are implicated in mass surveillance is in the intelligence agencies’ deliberate weakening of commercially available encryption systems — the same systems that we rely on to protect ourselves from fraud, and, if we wish, to ensure our basic human privacy.

We already knew quite a lot about what they’ve been doing. The NSA’s 2013 budget request asked for funding to “insert vulnerabilities into commercial encryption systems”. Many people now know the story of the Dual Elliptic Curve pseudorandom number generator, used for online encryption, which the NSA aggressively and successfully pushed to become the industry standard, and which has weaknesses that are widely agreed by experts to be a back door. Reuters reported last year that the NSA arranged a secret $10 million contract with the influential American security company RSA (yes, that RSA), who became the most important distributor of that compromised algorithm.

In the August Notices of the AMS, longtime NSA employee Richard George tried to suggest that this was baseless innuendo. But new evidence published in The Intercept makes that even harder to believe than it already was. For instance, we now know about the top secret programme Sentry Raven, which

works with specific US commercial entities … to modify US manufactured encryption systems to make them exploitable for SIGINT [signals intelligence].

(page 9 of this 2004 NSA document).
_____________

The Intercept article begins with a dramatic NSA-drawn diagram of the hierarchy of secrecy levels. Each level is colour-coded. Top secret is red, and above top secret (these guys really give it 110%) are the “core secrets” — which, as you’d probably guess, are in black. From the article:

the NSA’s “core secrets” include the fact that the agency works with US and foreign companies to weaken their encryption systems.

(The source documents themselves are linked at the bottom of the article.)

It’s noted that there is “a long history of overt NSA involvement with American companies, especially telecommunications and technology firms”. Few of us, I imagine, would regard that as a bad thing in itself. It’s the nature of the involvement that’s worrying. The aim is not just to crack the encrypted messages of particular criminal suspects, but the wholesale compromise of all widely used encryption methods:

The description of Sentry Raven, which focuses on encryption, provides additional confirmation that American companies have helped the NSA by secretly weakening encryption products to make them vulnerable to the agency.

The documents also appear to suggest that NSA staff are planted inside American security, technology or telecomms companies without the employer’s knowledge. Chris Soghoian, principal technologist at the ACLU, notes that “As more and more communications become encrypted, the attraction for intelligence agencies of stealing an encryption key becomes irresistible … It’s such a juicy target.”

Unsurprisingly, the newly-revealed documents don’t say anything specific about the role played by mathematicians in weakening digital encryption. But they do make it that bit harder for defenders of the intelligence agencies to maintain that their cryptographic efforts are solely directed against the “bad guys” (a facile distinction, but one that gets made).

In other words, there is now extremely strong documentary evidence that the NSA and its partners make strenuous efforts to compromise, undermine, degrade and weaken all commonly-used encryption software. As the Reuters article puts it:

The RSA deal shows one way the NSA carried out what Snowden’s documents describe as a key strategy for enhancing surveillance: the systematic erosion of security tools.

The more or less explicit aim is that no human being is able to send a message to any other human being that the NSA cannot read.

Let that sink in for a while. There is less hyperbole than there might seem when people say that the NSA’s goal is the wholesale elimination of privacy.

This evening, I’m going to see Laura Poitras’s film Citizenfour (trailer), a documentary about Edward Snowden by one of the two journalists to whom he gave the full set of documents. But before that, I’m going to a mathematical colloquium by Trevor Wooley, Strategic Director of the Heilbronn Institute — which is the University of Bristol’s joint venture with GCHQ. I wonder how mathematicians like him, or young mathematicians now considering working for the NSA or GCHQ, feel about the prospect of a world where it is impossible for human beings to communicate in private.
https://golem.ph.utexas.edu/category...weakening.html





Computer Users Who Damage National Security Could Face Jail

Human rights experts criticise proposed legislation saying new law could be used to target legitimate whistleblowers
Matthew Taylor

Government plans that mean computer users deemed to have damaged national security, human welfare, the economy or the environment will face a life sentence have been criticised by experts who warn that the new law could be used to target legitimate whistleblowers.

Last week the Joint Committee on Human Rights raised concerns about the proposals and the scope of such legislation.

“Legal certainty requires that criminal offences are precisely defined so that individuals know how to avoid such sanctions,” its report stated. “Vagueness is not permissible in the definition of criminal offences.”

Professor Peter Sommer, cyber security academic and expert witness, said legitimate whistleblowers could be targeted, adding that existing legislation under the Computer Misuse Act, which allows a maximum sentence of 10 years, was sufficient.

“There is almost certainly adequate legislation to deal with situations that arise in relation to computer misuse … after that if you want to pursue a matter of terrorism against national security, for example, then pursue that matter under the appropriate terrorism legislation.”

Sommer said he suspected the plan was prompted by politicians who wanted “the opportunity to stand up and sound tough”, but he warned there could be serious consequences. “If this is not more carefully defined it could go after people who you and I and many others may classify as a whistleblower.”

The executive director of Open Rights Group, Jim Killock, warned that the legislation was too widely drawn and called for greater protection for potential whistleblowers.

“As the internet affects more areas of our lives, computer legislation drafted in one context may be more widely applied than originally intended. We would hope that an increase in penalties under the Computer Misuse Act would be matched with additional protections – for example, through a public interest defence.”

The government says the legislation was needed to deal with catastrophic cyber attacks “which result in loss of life, serious illness or injury or serious damage to national security, or a significant risk thereof”.

It says that as well as targeting cyber terrorists, the new offence in the proposed update to the Computer Misuse Act 1990 would also hand harsher sentences to those hackers carrying out industrial espionage, believed to be a growing menace affecting UK business.

A Home Office spokesperson said: “Serious and organised crime blights lives and causes misery across the UK. It is a threat to our national security and costs hard-working taxpayers at least £24bn a year.”

He added that the reliance on computer systems and the degree to which they are interlinked is “ever increasing and a major cyber attack on our critical infrastructure would have grave consequences.

“Through this bill we will ensure that in the event of such a serious attack those responsible would face the justice they deserve.”
http://www.theguardian.com/law/2014/...rity-face-jail





Avast Antivirus Was Spying On You with Adware (Until This Week)
Lowell Heddings

We warned you at the beginning of the year that many of your browser extensions are spying on you, tracking what you are visiting, and even inserting ads into pages. These aren’t just no-name developers either: even Avast, one of the most trusted antivirus vendors, was in on the game.

Before we go even one step further, it’s important to note that they recently disabled the spying “shopping” feature in their browser extension. So if you are running the latest Chrome with extensions updated, you are fine. For now.

So Avast has stopped integrating the spying extension, but this is about the principle: you should be able to trust your antivirus provider. Why are they adding a feature that spies on your browsing, inserts ads… and all without properly notifying you?

And why, at the same time, are they claiming to stop spyware, even uninstalling other shopping extensions from other vendors, while they were doing the same thing they are supposed to stop?

Avast removes other Shopping extensions while leaving theirs enabled

On our test system, the only spyware and crapware that Avast actually detected and removed were the ones that competed with their own shopping extension.

About a week ago, we were playing around with installing a lot of nonsense from crapware sites, so we loaded up trusty Avast antivirus to see how much of the malware it would actually catch during the process. We were shocked to find out that some of the adware wasn’t from a third-party, but from Avast itself.

The problem lies in the SafePrice component of their Online Security extension, which adds shopping recommendations (ads) as you are browsing around the web.

Here’s the thing: many people actually want shopping extensions that help them find better prices — in fact, one of the HTG staff writers recently asked me what was the best way to find better prices. As a standalone product, if you specifically and deliberately choose to install something like this, there’s nothing wrong with it.

The problem is that Avast snuck this component into their browser extensions, which have at least 10 million users for the Chrome version alone. And then they enabled it by default.

Note: as we were doing research for this article, they updated their extension to not include the shopping feature, but it was there since maybe around last December.

Spying, You Say?

You might remember earlier how we said that this extension is spying on you and, unlike many websites, we’re definitely not going to make some claim like that without proof of what is really going on. So we loaded up Fiddler to see what’s really going on behind the scenes and under the hood and behind the curtain.

As it turns out, every single URL that you visit was being sent to Avast servers — first there would be a check to /urlinfo on one of their servers, passing in a unique ID that represents you on every single request. In this way they can build a list of every single page you have ever visited. They claim on their web site that they remove all personally identifying information, but how, exactly, are they able to do that when they are tracking every single page you visit and sending back that URL with a unique ID to represent you?

That unique tracking ID is the biggest problem here: while it might not identify you by name, it’s enough to tie your whole browsing history together, and that’s a scary thing.
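
To see why that matters, consider a minimal sketch (ours, not Avast's actual server code, with made-up log entries): if every URL lookup carries the same identifier, the records trivially collapse into one browsing timeline per install, no name required.

# Why a persistent per-install ID amounts to a browsing profile (sketch).
from collections import defaultdict

# Hypothetical log entries: (install_id, url, unix_timestamp)
lookups = [
    ("uid-7f3a", "https://example.com/health/condition-x", 1414000000),
    ("uid-7f3a", "https://bank.example/login", 1414000200),
    ("uid-7f3a", "https://news.example/article", 1414000900),
]

profiles = defaultdict(list)
for install_id, url, ts in lookups:
    profiles[install_id].append((ts, url))   # one timeline per unique ID

for install_id, history in profiles.items():
    print(install_id, [url for _, url in sorted(history)])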

And remember, you didn’t ask for this. You just wanted to keep yourself safe online with a trusted antivirus provider.

The Bottom Line: Browser Extensions Have Wayyyy Too Much Power

This behavior, while ridiculous and sad from a company you should trust, isn’t new at all. Almost every product and service on the Internet and almost every browser extension, app, and website, are doing some form of tracking. Here on How-To Geek we use Google Analytics to see our site statistics, and our advertisers probably use a lot of other tracking that we can’t control. And it’s the same with every single web site.

Personal information and big data have become the standard; because after all: if a product is free, the real product is you. If you are browsing and reading a completely free web site, it’s not that big of a deal… after all, sites like ours need to pay our writers, and advertisements are the only way to do that. The problem is when it’s across everything you do.

The problem is that most browser extensions have access to everything you are seeing on the Internet, across every web site. And they aren’t properly disclosing this to you.

So the next time an extension says it can “Read and modify all your data on the websites you visit”, perhaps you should click that “Remove from Chrome” button instead.
http://www.howtogeek.com/199829/avas...til-this-week/





The FCC as Data Security Cop: $10 Million Fine for Carriers’ Security Breaches
Michael Cooney

FCC says TerraCom and its affiliate YourTel stored Social Security numbers, names, addresses in the open

The FCC took a big stand today, saying it will fine TerraCom and YourTel America $10 million because, the agency said, both carriers violated the privacy of phone customers’ personal information. The action is the agency’s first data security case and the largest privacy enforcement action in the Commission’s history.

The FCC said TerraCom and its affiliate YourTel stored Social Security numbers, names, addresses, driver’s licenses, and other sensitive information belonging to their customers on unprotected Internet servers that anyone in the world could access.

In their privacy policies, the FCC said, the two companies stated that they had in place “technology and security features to safeguard the privacy of your customer specific information from unauthorized access or improper use.” Yet, from September 2012 through April 2013, the sensitive documents they collected from consumers were apparently stored in a format accessible via the Internet and readable by anyone.

Even after the companies learned of this security breach, they allegedly failed to notify all potentially affected consumers, depriving them of any opportunity to take steps to protect their personal information from misuse by Internet thieves, the FCC stated.

The personal information was gathered to demonstrate eligibility for the Universal Service Fund’s Lifeline program, which offers discounted phone services for low-income consumers.

“Consumers trust that when phone companies ask for their Social Security number, driver’s license, and other personal information, these companies will not put that information on the Internet or otherwise expose it to the world,” said Travis LeBlanc, Chief of the FCC’s Enforcement Bureau in a statement. “When carriers break that trust, the Commission will take action to ensure that they are held accountable for unjust and unreasonable data security practices.”
http://www.networkworld.com/article/...-breaches.html





Fingerprints Are Usernames, Not Passwords
Dustin Kirkland

As one of the maintainers of eCryptfs, and a long time Thinkpad owner, I have been asked many times to add support to eCryptfs for Thinkpad's fingerprint readers.

I actually captured this as a wishlist bug in Launchpad in August 2008, but upon thinking about it a bit more, I later closed the bug "won't fix" in February 2009, and discussed in a blog post, saying:

Hi, thanks so much for the bug report. I've been thinking about this quite a bit lately. I'm going to have to mark this "won't fix" for now. The prevailing opinion from security professionals is that fingerprints are perhaps a good replacement for usernames. However, they're really not a good replacement for passwords. Consider your laptop... How many fingerprints of yours are there on your laptop right now? As such, it's about as secret as your username. You don't leave your password on your spacebar, or on your beer bottle :-) This wikipedia entry (although it's about Microsoft Fingerprint Readers) is pretty accurate: http://en.wikipedia.org/wiki/Microso...erprint_Reader So, I'm sorry, but I don't think we'll be fixing this for now.

I'm bringing this up again to highlight the work released last week by The Chaos Computer Club, which has demonstrated how truly insecure Apple's TouchID is.

There may be civil liberties at issue as well. While this piece is satire, and Apple says that it is not sharing your fingerprints with the government, we've been kept in the dark about such things before. I'll leave you to draw your own conclusions on that one.

But let's just say you're okay with Apple sharing your fingerprints with the NSA. As I've already told you, they're not private at all. You leave them on everything you touch. And let's say you're insistent on using fingerprint (biometric) technology because you can. In that case, your fingerprints might identify you, much as your email address or username identifies you, perhaps from a list.

I could see some value, perhaps, in a tablet that I share with my wife, where each of us have our own accounts, with independent configurations, apps, and settings. We could each conveniently identify ourselves by our fingerprint. But biometrics cannot, and absolutely must not, be used to authenticate an identity. For authentication, you need a password or passphrase. Something that can be independently chosen, changed, and rotated. I will continue to advocate this within the Ubuntu development community, as I have since 2009.
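
A tiny sketch of that distinction, using plain Python and made-up data, may help: a biometric match can select which account is being claimed (identification), but only a secret the user can change, such as a passphrase, should prove the claim (authentication). A real system would use a salted key-derivation function rather than a bare hash; this is purely illustrative.

# Identification (fingerprint -> which account) vs. authentication (secret proof).
import hashlib
import hmac

ACCOUNTS = {
    "fp-template-alice": {"user": "alice",
                          "pw_hash": hashlib.sha256(b"correct horse").hexdigest()},
    "fp-template-bob": {"user": "bob",
                        "pw_hash": hashlib.sha256(b"battery staple").hexdigest()},
}

def identify(fingerprint_template: str):
    # Like a username: selects an account, proves nothing.
    return ACCOUNTS.get(fingerprint_template)

def authenticate(account, passphrase: str) -> bool:
    # Like a password: something secret that can be rotated if compromised.
    candidate = hashlib.sha256(passphrase.encode()).hexdigest()
    return hmac.compare_digest(candidate, account["pw_hash"])

account = identify("fp-template-alice")
print(account["user"], authenticate(account, "correct horse"))   # alice True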

Once your fingerprint is compromised (and, yes, it almost certainly already is, if you've crossed an international border or registered for a driver's license in some US states and countries), how do you change it? Are you starting to see why this is a really bad idea?

There are plenty of inventions that exist, but turned out to be bad ideas. And I think fingerprint readers are another one of those.

This isn't a knock on Apple, as Thinkpads have had embedded fingerprint readers for nearly a decade. My intention is to help us stop and think about the place of biometrics in security. Biometrics can be used as a lightweight, convenient mechanism to establish identity, but they cannot authenticate a person or a thing alone.

So, if you have any respect for the privacy of your data, or your contacts' information, please don't use fingerprints (or biometrics, in general) for authentication.
http://blog.dustinkirkland.com/2013/...names-not.html





EFF Relaunches Surveillance Self-Defense
Jillian York

We’re thrilled to announce the relaunch of Surveillance Self-Defense (SSD), our guide to defending yourself and your friends from digital surveillance by using encryption tools and developing appropriate privacy and security practices. The site launches today in English, Arabic, and Spanish, with more languages coming soon.

SSD was first launched in 2009, to “educate Americans about the law and technology of communications surveillance…” and to provide information on how to use technology more safely. Not long after, in the midst of the 2009 Iranian uprising, we launched an international version that focused on the concerns of individuals struggling to preserve their right to free expression in authoritarian regimes.

In the time since the Snowden revelations, we’ve learned a lot about the threats faced by individuals and organizations all over the world—threats to privacy, security, and free expression. And there is still plenty that we don’t know. In creating the new SSD, we seek to help users of technology understand for themselves the threats they face and use technology to fight back against them. These resources are intended to inspire better-informed conversations and decision-making about digital security and privacy, resulting in a stronger uptake of best practices, and the spread of vital awareness among our many constituents.

We invite you to take a look at SSD, and to provide us with feedback (we’ve made it easy: there’s a feedback dropdown on every page). Right now, the site is available in just three languages, but we soon plan to expand, with Vietnamese, Russian, Persian, and several other languages in our sights. And if you think we’ve missed something, please let us know. The threats are always changing, so our advice should change to keep up.
https://www.eff.org/deeplinks/2014/1...e-self-defense





Assange Court Ruling Expected on Monday
AFP

The court could have announced its decision as early as Friday on an appeal by Assange's lawyers against the arrest warrant hanging over him for allegations of rape and sexual molestation.

But on Friday lunchtime the Svea Court of Appeal told The Local it was still waiting for the prosecutor to respond to the latest appeal document. The prosecutor has until Monday October 27th to do so.

Earlier in the week, Assange commented on the upcoming development in his case.

"We will win because the law is very clear. My only hope is that the court is following the law and is not pressured politically to do anything outside of the law," he said via a video link screened at a human rights film festival in Barcelona on Wednesday.

Swedish prosecutors want to question the 43-year-old Australian, who could also face trial in the United States over WikiLeaks' publication of a trove of sensitive military and diplomatic communications.

Assange has been holed up since 2012 in the Ecuadorian embassy in London. Ecuador granted him political asylum the same year.

If the Swedish court scraps the European arrest warrant against Assange, it could mean that he would be able to leave the Ecuadorian embassy for the first time in more than two years.

"As time goes by, political pressure decreases and understanding increases. So I am very confident I will not remain in this situation. I'm completely confident," Assange said.

Assange fears the warrant against him is aimed at eventually extraditing him from Sweden to the United States. Swedish prosecutors said last month that idea was "far-fetched".
http://www.thelocal.se/20141023/assa...-court-ruling/





BBC to Publish 'Right to be Forgotten' Removals List
Dave Lee

The BBC is to publish a continually updated list of its articles removed from Google under the controversial "right to be forgotten" rule.

The ruling allows people to ask Google to remove some types of information about them from its search index.

But editorial policy head David Jordan told a public meeting, hosted by Google, that the BBC felt some of its articles had been wrongly hidden.

He said greater care should be given to the public's "right to remember".

Following the ruling, Google set up a form on its site allowing people to request which links should be taken down.

The European Court of Justice (ECJ) said links that were "inadequate, irrelevant or no longer relevant" should not appear when a specific search - usually a person's name - was made.

Google decided to notify affected websites each time a link had been removed.

The BBC will begin - in the "next few weeks" - publishing the list of removed URLs it has been notified about by Google.

Mr Jordan said the BBC had so far been notified of 46 links to articles that had been removed.

They included a link to a blog post by Economics Editor Robert Peston. The request was believed to have been made by a person who had left a comment underneath the article.

An EU spokesman later said the removal was "not a good judgement" by Google.

Real IRA

The list will not republish the story, or any identifying information. It will instead be a "resource for those interested in the debate", Mr Jordan said.

He criticised the "lack of a formal appeal process" after links have been taken down, noting one case where news of the trial involving members of the Real IRA was removed from search results.

"Two of whom were subsequently convicted," Mr Jordan explained.

"This report could not be traced when looking for any of the defendants' names. It seems to us to be difficult to justify this in the public's interest."

He suggested that Google implement some changes to the process of making a "right to be forgotten" request - such as requiring the identity of the person to be shared with the publication, on condition of confidentiality.

The meeting, hosted by Google chairman Eric Schmidt, is the latest of several that have taken place around Europe in the past two months. The next, on 4 November, will be held in Brussels.

However, supporters of the ruling said the meetings were a "PR exercise" for Google - which would rather not deal with requests - rather than an open debate.

"They want to be seen as being open and virtuous, but they handpicked the members of the council, will control who is in the audience, and what comes out of the meetings," said Isabelle Falque-Pierrotin, head of CNIL - France's data protection body.

People keen to get data removed from Google's index must:

• Provide weblinks to the relevant material
• Name their home country
• Explain why the links should be removed
• Supply photo ID to help Google guard against fraudulent application

http://www.bbc.co.uk/news/technology-29658085





Gigabit Cellular Networks Could Happen with 24GHz Spectrum, FCC Says

Indoor coverage will be difficult with ultra-high frequencies, however.
Jon Brodkin

The Federal Communications Commission is starting to plan for cellular networks that can send users gigantic streams of data, but there are technical challenges to be solved and years of work ahead.

A Notice of Inquiry issued unanimously by the commission on Friday identifies frequencies of 24GHz and above as being able to provide gigabit or even 10Gbps speed. This would be a major change because today’s cellular networks use frequencies from 600MHz to 3GHz, with so-called “beachfront spectrum” under 1GHz being the most desirable because it can be used to deliver data over long distances. AT&T and Verizon Wireless control the most beachfront spectrum.

"It was long assumed that higher spectrum frequencies—like those above 24 GHz—could not support mobile services due to technological and practical limitations," the FCC said in a press release. "New technologies are challenging that assumption and promise to facilitate next generation mobile service—what some call '5G'—with the potential to dramatically increase wireless broadband speeds."

Rather than replacing today’s lower-spectrum systems, networks with frequencies of 24GHz and above could complement them by providing much higher data rates over short distances—perhaps very short. A new Wi-Fi standard that uses 60GHz can deliver up to 7Gbps but only if the transmitter and receiver are in the same room, for example.

The FCC’s notice talks about frequencies as high as 90GHz. Anything over 30GHz is classified as “millimeter wave frequencies,” which are blocked by walls. Indoor coverage is going to be tough.

“[W]hatever licensing regimes we adopt should take into account the fact that signals from carriers’ outdoor base stations will rarely be able to penetrate into the interiors of buildings, where around 75 percent of cellular data usage occurs today,” the FCC wrote. “Reaching such spaces will almost certainly require the deployment of indoor base stations.”

The FCC is asking experts for input on technical and licensing questions. The FCC wants to examine potential ways “mobile services can avoid interfering with each other” and with existing technologies that share the same frequency bands or operate in adjacent ones. The FCC said it intends to adopt licensing schemes that minimize the potential of interference while giving carriers what they need to boost data speeds.

Carriers holding exclusive licenses to spectrum are the norm today, but the FCC said it’s possible that carriers could share higher-frequency spectrum by using “dynamic beamforming capabilities” that distribute signals on different paths so as not to interfere with one another. Licenses could also be granted for smaller geographic areas than usual to minimize the amount of unserved real estate.

Another question is how much contiguous spectrum will be needed by operators in order to deliver high data rates.

“Nokia’s studies suggest the need for 2 gigahertz of contiguous spectrum to achieve maximum data throughputs of 10Gbps and at least 100Mbps at the cell edge with a maximum latency of no more than 1 millisecond,” the FCC wrote. “Samsung has demonstrated 500 megahertz systems in the 28GHz band with data rates ranging from 260Mbps to 1Gbps. We also seek comment on whether technology will allow licensees to effectively aggregate smaller, non-contiguous blocks of spectrum for use in providing mobile services, possibly reducing the need for large blocks of contiguous spectrum.”
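
A back-of-envelope Shannon-capacity calculation (our own illustration, not part of the FCC notice) shows why such wide channels matter: C = B * log2(1 + SNR), so multi-gigahertz bandwidths reach 10Gbps-class rates even at modest signal-to-noise ratios.

# Rough capacity check for the bandwidths mentioned above (illustrative only).
from math import log2

def shannon_capacity_gbps(bandwidth_hz: float, snr_db: float) -> float:
    snr = 10 ** (snr_db / 10)
    return bandwidth_hz * log2(1 + snr) / 1e9

print(f"{shannon_capacity_gbps(2e9, 10):.1f} Gbps")    # 2 GHz at 10 dB SNR -> ~6.9
print(f"{shannon_capacity_gbps(2e9, 20):.1f} Gbps")    # 2 GHz at 20 dB SNR -> ~13.3
print(f"{shannon_capacity_gbps(500e6, 20):.1f} Gbps")  # 500 MHz at 20 dB SNR -> ~3.3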

Although millimeter waves require line-of-sight access to receivers, creative strategies might expand their use. Field trials conducted “at New York University and the University of Texas with funding from the US Army and Samsung… found that 39GHz mobile base stations can sustain 100 percent coverage in cells with a 200-meter radius in high-density urban areas,” the FCC said. “Receivers equipped with highly directional, steerable antennas were able to capture and combine as many as 14 links with rooftop-mounted transmitters despite obstructions in propagation paths, i.e., the receivers were able to ‘see’ multiple reflected signals from places where lines of sight to base stations were blocked. Samsung has conducted similar trials at 28 and 38 GHz, respectively, in Suwon, Korea, and Austin, Texas.”

While these higher frequencies may end up being used in 5G networks that replace today’s 4G ones, “[t]he truth is that 5G wireless technologies are likely to use many spectrum bands and may or may not include these millimeter wave frequencies,” Commissioner Michael O’Rielly wrote in a statement on the FCC’s decision.

5G commercial offerings could reach consumers within six years, Commissioner Ajit Pai wrote.

The FCC said it does not plan to require use of a specific technology or 5G standard. Instead, the commission said it intends to “let innovation and market competition drive the direction of technological development, and to put in place regulations that can accommodate future technological advances.”
http://www.computerworld.com/article...d-bidding.html





32 Cities Want to Challenge Big Telecom, Build Their Own Gigabit Networks
Jason Koebler

More than two dozen cities in 19 states announced today that they're sick of big telecom skipping them over for internet infrastructure upgrades and would like to build gigabit fiber networks themselves and help other cities follow their lead.

The Next Century Cities coalition, which includes a couple of cities that already offer gigabit fiber internet to their residents, was formed to help communities that want to build their own broadband networks navigate the logistical and legal challenges of doing so.

Over the last several months, there's been a Federal Communications Commission-backed push for cities to build their own broadband networks because big telecom companies like Comcast, AT&T, and Verizon either don't or won't offer competitive broadband speeds in certain parts of the country.

"Across the country, city leaders are hungry to deploy high-speed Internet to transform their communities and connect residents to better jobs, better health care, and better education for their children,” Deb Socia, the group's executive director, said in a statement. “These mayors are rolling up their sleeves and getting the job done."

That's turned out to be a tricky proposition in a legal environment where more than 20 states have passed legislation (lobbied for by telecom companies and ALEC, a controversial, big business-backed “charity” that writes legislation for states) making it illegal or legally difficult for cities to build their own networks.

Of the cities involved in the coalition, 12 are located in states where there are legal barriers to building community networks. Those cities include Austin and San Antonio, Texas; Chattanooga, Morristown, Jackson, and Clarksville, Tenn.; Kansas City, Mo.; Lafayette, La.; Montrose, Colo.; Mount Vernon, Wash.; Raleigh and Wilson, N.C.; and Winthrop, Minn. To be fair, some of these cities, such as Wilson, Chattanooga, and Austin, already have gigabit service (Wilson and Chattanooga built theirs before the restrictive laws were passed; Austin has Google Fiber).

"Towns and communities struggle with limited budgets, laws that restrict their opportunity to build/support a network that fits their needs, and even market pressures," the group of cities said in a recent blog post.

FCC chairman Tom Wheeler has repeatedly said he's willing to help cities preempt state laws barring community networks if they file a petition with the FCC. It'd appear that many of these cities are at least looking into the idea, with Chattanooga and Wilson already filing preemption requests with the agency.

That's riled up conservative lawmakers and small government types, who say it's a federal government overreach to tell states what they can and cannot do. Cities say that's absurd, because, well, what is a state government if not a slightly bigger government trampling on the will of local decision makers and citizens?

The cities involved in this coalition say that their residents, who languish with mediocre DSL connections and intermittent access, are being lost in the national debate.

"We are at a crossroads. Too few communities have the Internet infrastructure to deliver on the promise of America. Too few commentators and policymakers recognize that truly next-generation Internet is indispensable in the 21st century," the group wrote.

"If you are a city equipped with gigabit infrastructure, join us. If you want this infrastructure but face difficulty in attaining it, join us," it said. "If you want to be part of a movement of cities and leaders who believe that next-generation Internet infrastructure will be a decisive factor for America’s cities in the decades to come, join us."
http://motherboard.vice.com/read/32-...gabit-networks





Lowly DSL Poised for Gigabit Speed Boost

Internet service providers are getting a new option called G.fast that can extend the lifespan of existing copper phone lines yet again.
Stephen Shankland

DSL was one of the first widely adopted technologies for bringing high-speed Internet access to homes and businesses, but it hasn't been the fastest.

That's all changing. At the Broadband World Forum in Amsterdam this week, several companies are announcing and demonstrating products that bring DSL -- or digital subscriber line -- into a future with a speed of 1 gigabit per second. That's about 1,000 times the data-transfer speed the technology offered when it arrived in the late 1990s.

The DSL upgrade comes through a new technology called G.fast. Among those making network equipment chips to enable the technology are industry giant Broadcom, China-based Triductor Technology and Israeli startup Sckipio. The technology itself should arrive in homes starting in 2016.

The reasons people want fast broadband are plentiful: video chat with friends, high-definition TV from services like iTunes and Netflix, online backup of family photos, streaming music from Spotify, and an easier adjustment to the new era when apps are updated frequently. You can multiply all these uses by the growing number of devices in homes that tap into the Internet -- laptops, mobile phones, tablets, game consoles, thermostats, TVs, security systems. With a maxed-out gigabit connection, you can download a 4GB high-definition movie in about half a minute.
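
That half-minute figure is easy to check; the arithmetic below assumes the link actually sustains its full nominal 1Gbps and that "4GB" means four gigabytes.

# Quick check of the "about half a minute" figure (assumes the full nominal rate
# is sustained and that 4GB means 4 gigabytes, i.e. 32 gigabits):
movie_gigabits = 4 * 8
link_gbps = 1.0
print(movie_gigabits / link_gbps, "seconds")   # 32.0 -- roughly half a minute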

In many countries, though, particularly the United States, cable TV wiring has led the way for high-speed broadband. And when Google decided it wanted to accelerate the arrival of this high-tech future, it picked even faster fiber-optic lines, which deliver 1Gbps today. All that has given DSL a bad rap; it runs over the phone system's twisted pairs of copper wires, which are more susceptible to radio-frequency interference.

With G.fast, Internet service providers and carriers will get a new way to boost DSL speeds. And even though the upgrade will be expensive, it's necessary, said Ovum analyst Kamalini Ganguly.

"In many countries, doing nothing is not an option any more in my opinion," Ganguly said. "In particular, this is true where there is significant coverage and competition from cable companies, who I expect to be embarking on another upgrade over the next few years that will enable them to support 1Gbps services....We are also likely to see some fiber to the home."

DSL upgrades have at least one big advantage: they build on networks that already exist. Much of the world doesn't have cable-TV infrastructure at all, and still less of it has fiber-optic connections. Phone networks, though, are widely deployed and served about 422 million DSL subscribers globally in 2013, according to analyst firm IHS.

That should rise to 480 million by 2018. But reflecting the competitive threat to DSL equipment makers, fiber optic links are expected to spread much more rapidly -- from 113 million in 2013 to 200 million in 2018.

Vectoring and G.fast

It will be a while -- a couple of years at least, most likely -- before DSL customers see gigabit speeds. The telecommunications industry is only now installing G.fast's predecessor, vectored VDSL. But the first stage of the next transition is beginning with the arrival of chips for the communications gear in telecom networks -- and in the home network equipment on the other end of the connection.

At the conference, Triductor will demonstrate its G.fast chip, which it plans to ship in the first quarter of 2015. Sckipio, too, plans G.fast demonstrations: its DP3000 chip for network gear and its CP1000 chip for the home network equipment.

Broadcom is pushing G.fast chips both for home gear and network infrastructure, showing off prototypes of its electronics.

The company won't say how much it is charging, but it's clear what the sales pitch is: a more affordable way for network operators to speed up broadband.

"We think it's very competitive relative to the alternative of deploying fiber," said Jim McKeon, a Broadcom senior director of product marketing. "We believe we are going to be the first to market with this. We're very excited for the potential for G.fast to unlock the hidden value of existing copper plant that is distributed worldwide for DSL."

As a standard, G.fast isn't quite done yet -- McKeon expects ratification at the end of the year or early in 2015 -- but firmware updates will let Broadcom tweak its products to comply with final refinements. The technology is governed by a pair of standards at the International Telecommunication Union -- ITU-T G.9700 and ITU-T G.9701.

Physics constraints

Faster broadband is great, but there's a good reason it's expensive for ISPs and telecommunications companies to build out their infrastructure: physics.

Broadband today works by transmitting electrical signals down wires. High-frequency signals whose voltage changes rapidly can carry more ones and zeros in a given amount of time. But the quality of high-frequency signals degrades faster than for lower frequencies, limiting the practical wiring length.
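
For a rough feel for that trade-off: on copper twisted pair, loss in decibels grows roughly with the square root of frequency (the skin effect) and linearly with length. The sketch below uses a made-up constant purely to show the shape of the curve; it does not model any real cable or the G.fast specification.

import math

# Illustrative only: attenuation on copper twisted pair grows roughly with
# sqrt(frequency) and linearly with length. K is a made-up constant chosen to
# show the shape of the trade-off, not a real cable parameter.
K = 0.05   # hypothetical dB per meter at 1 MHz

def loss_db(freq_mhz, length_m):
    return K * math.sqrt(freq_mhz) * length_m

for length_m in (50, 200, 1000):
    print(f"{length_m:>5} m loop: {loss_db(17, length_m):6.1f} dB around 17 MHz (VDSL2-style band), "
          f"{loss_db(106, length_m):6.1f} dB around 106 MHz (G.fast-style band)")
# The higher band fades far faster with distance, which is why the gear has to
# sit so close to the customer before the fastest speeds become possible.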

That's why broadband providers have been placing their network gear closer to homes -- often in boxes under sidewalks, in cabinets by roads, or in boxes attached to telephone poles. That's also why it's so expensive to upgrade broadband networks: ISPs have had to extend their networks to bring that gear closer to their customers.

To reach its full gigabit-per-second potential, G.fast will require broadband providers to place network equipment close to customers' buildings -- 50 meters (about 160 feet) away or less, McKeon said. A 200-meter distance will still be good enough for about 600Mbps, he said. Some operators likely will bring their network gear all the way to a building, then use G.fast to provide connections to customers inside without having to replace the building's copper wiring.

Shorter distances aren't the only requirement for top speed. G.fast also depends on the vectoring technology introduced with VDSL, which minimizes interference between different customers' copper lines.

The basic problem is that copper wires act like broadcasting and receiving antennas, McKeon said. A signal in one line can cause a fainter signal in a nearby second line, muddying its signals. Meanwhile, the second line's signals interfere with the first. Multiply that by the number of lines in a network cabinet, and you've got a problem.

What vectoring does is take advantage of the fact that communications chips know what data will be sent over which lines and when. Complicated calculations can then augment each line's signal with a compensating voltage to counteract the expected interference. That, in effect, tidies things up so high-frequency signals can be sent.
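
A toy example of that idea, purely for illustration: if the equipment knows the crosstalk between lines (the matrix H below, with made-up values), it can pre-distort what it transmits so the crosstalk cancels at the far end. Real vectoring, standardized as ITU-T G.993.5, is far more elaborate than this zero-forcing sketch.

import numpy as np

# Toy vectoring sketch (made-up numbers; real G.993.5 vectoring is far more involved).
# Row i of H says how much of every line's transmit signal arrives on line i:
# the diagonal is the wanted path, the off-diagonal entries are crosstalk.
H = np.array([[1.00, 0.15, 0.05],
              [0.12, 1.00, 0.10],
              [0.04, 0.11, 1.00]])

data = np.array([1.0, -1.0, 1.0])        # symbols intended for the three lines

plain = H @ data                          # no vectoring: crosstalk corrupts the symbols
precoded = np.linalg.solve(H, data)       # vectoring: pre-distort the transmit signals
vectored = H @ precoded                   # crosstalk now cancels at the receivers

print("without vectoring:", np.round(plain, 3))      # e.g. [ 0.9  -0.78  0.93]
print("with vectoring:   ", np.round(vectored, 3))   # [ 1. -1.  1.]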

Gigabit push

Broadband providers are eager to get high-speed connections to customers. "Being perceived as a technology leader was the overwhelming driver for offer[ing] gigabit broadband services," according to results published Monday in a Broadbandtrends survey of 88 providers.

One early G.fast supporter is Telekom Austria, which last week announced it's got the technology working in real-world tests. Widespread installation is set for 2016.

"This technology will enable us to offer urban areas data rates 10, even up to 20, times higher than ever before," Telekom Austria CEO Hannes Ametsreiter said in a statement. "Fiber to the home remains our long-term vision, but we consider G.fast as an intelligent interim solution until fiber will have a similar coverage as we have with copper now."

European customers are likely to favor G.fast in particular, Triductor CEO Tan Yaolong said. That's because labor costs are very high in that region, which discourages extensive renovation projects.

Expense aside, fiber competition with DSL is very real. Google has increased its visibility with its Google Fiber project, which is providing 1Gbps broadband in Kansas City, Missouri, Kansas City, Kansas, and Provo, Utah, with Austin, Texas, on the way. US broadband giant AT&T has begun countering Google with its GigaPower program.

In the longer run, wireless broadband will play a role too. The US Federal Communications Commission last week announced that it's examining wireless frequencies of 24GHz and higher: "It was long assumed that higher spectrum frequencies -- like those above 24 GHz -- could not support mobile services due to technological and practical limitations. New technologies are challenging that assumption and promise to facilitate next-generation mobile service -- what some call '5G' -- with the potential to dramatically increase wireless broadband speeds."

But wireless broadband suffers from capacity limits that aren't as much a burden for fixed-line communication technology like DSL, cable TV, and fiber optics.

"The demands of applications like multiple 4K TV streams to a household mean that serving the home via wireless isn't viable," said Ovum analyst Steven Hartley. "There are segments for which wireless makes sense -- low-end users, short-term renters, migrants. But these are also likely to be lower-spending segments."
http://www.cnet.com/news/lowly-dsl-b...t-speed-boost/





Comcast’s Net Neutrality Commitments Aren’t Good Enough, Senator Says

Sen. Leahy asks Comcast to swear off fast lanes even after NBC deal expires.
Jon Brodkin

Sen. Patrick Leahy (D-VT) today called on Comcast to make a long-term pledge that it won't charge content providers for faster access to its subscribers.

Comcast already agreed to follow network neutrality provisions until September 2018 as part of its 2011 purchase of NBCUniversal. While the agreement with the US government doesn't specifically prevent Comcast from signing paid prioritization deals, the company has said it has no plans to do so. Comcast has been touting its net neutrality commitments while making the case that it should be allowed to purchase Time Warner Cable, the second-biggest cable company in the US after Comcast itself.

Leahy, chairman of the Senate Judiciary Committee, wrote a letter to Comcast Executive VP David Cohen today, saying he worries about "the risk of paid prioritization agreements through which websites could be charged for priority access over the Internet." Leahy wants "meaningful pledges from our Nation's broadband providers that they share the American public's commitment to an Internet that remains open and equally accessible to all."

"As a condition of the Comcast-NBC Universal merger, Comcast is bound to the net neutrality principles embodied in the FCC's Open Internet Order through the end of 2018," he wrote. "Those rules should be viewed as a minimum level of protection to promote competition online and Comcast's commitment to those principles should extend well beyond the imminent cut-off date of 2018. As the antitrust regulators continue to evaluate Comcast's proposed transaction with Time Warner Cable, and regardless of whether it is approved, I ask Comcast to pledge that it will not engage in paid prioritization. I also ask that Comcast pledge not to engage in any activity that prioritizes affiliated content or services over unaffiliated content or services, helping to ensure that vertical integration does not threaten competition online."

Comcast is reviewing the letter, a company spokesperson told Ars.

The Federal Communications Commission is considering whether to create net neutrality rules that ban paid prioritization. Leahy's letter to Comcast noted that he has "introduced legislation with Congresswoman Doris Matsui that would ban paid prioritization arrangements and has urged the FCC to enact meaningful net neutrality rules to preserve the Internet we know today."

Out of all major ISPs, Verizon may be the likeliest to offer paid fast lanes for Internet content providers. The company sued the FCC to overturn previous net neutrality rules issued in 2010, and Verizon said in court arguments that it would offer fast lanes if it's allowed to.

Comcast has been criticized for charging Netflix for a direct connection to the edge of its network, but that kind of arrangement isn't targeted by most net neutrality proposals. Generally, net neutrality rules focus only on whether traffic is prioritized after it enters the Internet service provider's network.
http://arstechnica.com/tech-policy/2...-senator-says/





Three House Dems, Three Proposals for Net Neutrality. Here’s What They Look Like.
Brian Fung

Rep. Anna Eshoo is urging federal regulators to oversee Internet providers using Title II of the Communications Act — a move that would give the Federal Communications Commission more latitude to prevent the sort of traffic discrimination net neutrality advocates say would hurt the open Internet.

In a letter to the FCC this week, the California Democrat said the agency could selectively apply only those parts of the law that deal with reasonable rates and unjust discrimination — what are known as Sections 201 and 202. The rest of the law, she said, could be waived under a process known as "forbearance." (My colleague Nancy Scola has a good overview of what that means.)

"It is true that some of these laws do apply only to telephone services," said Eshoo, referring to Title II's historical role in regulating phone companies like AT&T. "But others are the source of timeless principles that can and do apply to all two-way telecom services, including broadband."

Eshoo's proposal tracks closely with one by her colleague, Rep. Zoe Lofgren (D-Calif.). In an earlier letter to the FCC, Lofgren also called for a mix of Title II and forbearance — while leaning on another part of the law that could help the FCC make a stronger case for forbearance. In striking down much of the FCC's original net neutrality rules this year, a federal court granted the agency a little more authority under what's called Section 706.

Section 706, found under Title I of the FCC's congressional charter, is how the agency could regulate broadband under new net neutrality rules. Some advocates have been pushing for the FCC to rely primarily on its Section 706 authority to draft the new regulations, but consumer groups argue that won't be enough.

A third proposal by another California Democrat, Rep. Henry Waxman, also relies on Title II, but advances an alternative that waives the very provisions of the law that Eshoo's letter says are the most important — the language against "unjust discrimination." This may sound counterintuitive, but Waxman appears to agree with industry officials' arguments that the phrase "unjust discrimination" actually could still allow Internet fast lanes — if broadband providers can claim that the discrimination is a "just" and reasonable practice.

"Indeed, section 201(b) expressly permits the creation of 'different classes of communications' with different charges so long as they are deemed 'just and reasonable' by the Commission," Waxman wrote in a letter to the agency, adding that the FCC has already "permitted tiered pricing structures based on volume and term discounts, different levels of quality of service" and other factors that could set a precedent for Internet fast lanes.

So, Waxman is proposing regulating broadband under Title II, but waiving Sections 201 and 202, along with much of the rest of Title II. But why does he even bother with Title II in the first place if he throws so much of it out?

To understand, we have to go back to this year's court decision. The D.C. Circuit said the FCC's old net neutrality rules were invalid because they imposed rules on Internet providers that could only be applied to "common carriers" regulated under Title II (remember that Internet providers are currently regulated under Title I). The wall between Title I services and Title II services is pretty strong, but Waxman's idea would knock down that wall for broadband providers, technically putting them under Title II. Then, rather than actually regulating the Internet providers using Title II requirements, the FCC would draw up new requirements based on Section 706.

That's a lot to digest. But we now have three House Democrats all promoting variations on the same idea: Use the most well-worn part of the FCC's authority to deal with a relatively novel problem.
http://www.washingtonpost.com/blogs/...hey-look-like/





Meet ‘Forbearance,’ the Obscure Governing Tool that Just Might Resolve the Net Neutrality Debate
Nancy Scola

The net neutrality debate might soon, mercifully, be wrapping up, as the Federal Communications Commission prepares to issue a new round of rules. And as the FCC does so, it's exceptionally likely that we'll hear one word again and again: forbearance.

While we prep for the home stretch, it's worth taking a moment to understand exactly what that deceptively dull concept means and where it came from. For that, we turn to Harold Feld. A senior vice president at the advocacy group Public Knowledge (which supports strong net neutrality regulation), Feld lived through the telecom debates of the '90s and aughts. He can cite the proper dates, details and provisions of the U.S. legal code without pausing to double-check his notes. In short, he's the right person for the job.

"Forbearance" means much the same in normal English as it does in telecommunications law -- to restrain oneself from doing something. In short, it might be key to resolving the net neutrality fight. And yet it is one of modern governing's most poorly-understood tools.

Forbearance becomes important in the net neutrality debate because it is being proposed as a sort of negotiated settlement. Take the approach Silicon Valley Democrat Rep. Anna Eshoo advocated on Wednesday as a "light touch." The plan would move broadband Internet into Title II of the Communications Act of 1934. That's the more heavily regulated title, the one that includes landline telephones. But under Eshoo's plan, the FCC would choose not to enforce some of the act's several dozen provisions that broadband providers find the most onerous, such as those imposing limits on how telecom products can be priced. That's forbearance.

How we got forbearance in the first place

It's simple enough in concept, somewhat more complicated in practice. But that's largely because it's a regulatory mechanism meant to cope with an enormously complex field. Feld walks us through the history:

It was the mid-1990s. The AT&T monopoly had been broken up by the courts a decade earlier. Mobile phones were popping up all over. This was a time of increased competition, and yet the 1934 Communications Act that governed the landscape had been written for an era when only a handful of big companies ruled. Congress aimed to light a fire under that competition and began drafting a telecommunications law overhaul. It was, says Feld, "one big experiment."

And yet, there was a bit of a mismatch: The telecom field was in flux and only likely to become more so. "Congress by necessity writes in broad, sweeping language," says Feld, a sensible approach given that nobody wants to have to draft a new telecom law every few years. And so Congress included some regulatory wiggle room in its landmark 1996 Telecommunications Act: a provision -- "Section 10" -- that requires the FCC to refrain from enforcing parts of the law unless enforcement is needed to protect consumers or otherwise serve the public good.

Why the FCC has the power and others don't

It's a power that's unique to the nation's main communications agency, explains Feld, because of how varied the telecom landscape is in the United States. In some parts of America, phone and broadband service is closer to 1934 than to 2014. To steal a line attributed to William Gibson, "The future is here, it's just not evenly distributed yet." So the FCC, which is supposed to be the public sector's foremost expert on such matters, is granted the power to make in-the-weeds and in-the-wires decisions, like, say, setting up a checklist that a Baby Bell might have to tick off before it can offer long-distance service in rural Iowa.

That said, says Feld, Congress had the foresight to know that the five members of the FCC and their staff might not be itching to give up their powers. "We don't trust regulatory agencies," he jokes about the congressional mindset, "to let go of their precious rules."

And so, there are two ways forbearance works. The first is that the agency can simply decide on its own to stop regulating.

The second is more a forcing of its hand: Telecommunications providers ask the FCC to suspend some rule in some context so as to free the provider from whatever restriction. If the agency doesn't respond before a year runs out, that petition is "deemed granted," and the targeted law no longer applies. That process creates a regulatory mechanism unique in American governing: The FCC is given enormous power to decide how the telecommunications market works. But if it doesn't routinely exercise that power, all bets are off, and telephone, cable and broadband Internet companies are free to go at each other with little restriction.

The future of forbearing (and changing your mind)

Some Internet providers, to be absolutely clear, don't think the Title II-plus-forbearance solution is any sort of solution at all. They argue that it's too onerous a process to get the FCC to stop enforcing its rules city by city, technology by technology. But Feld frets about something else. In the nearly 20 years that those regulators have had forbearance powers, they've never really figured out how to un-forbear -- that is, to decide that while waiving a rule was once a smart thing to do, some change in the telecommunications market makes it no longer the best option. There are few, if any, examples of such a reversal, says Feld.

But if the FCC opts to embrace net neutrality through a forbearance scheme, it will likely have to figure out how to make unforbearance work, too -- a good reminder that within the U.S. telecommunications landscape, one of the few constants is reinvention.
http://www.washingtonpost.com/blogs/...rality-debate/





Hungary Plans New Tax on Internet Traffic, Public Calls for Rally
Marton Dunai

Hungary plans to impose a new tax on Internet data transfers, a draft 2015 tax bill submitted to parliament late on Tuesday showed, in a move that could hit Internet and telecoms providers and their customers hard.

The draft tax code contains a provision for Internet providers to pay a tax of 150 forints (37 pence) per gigabyte of data traffic, though it would also let companies offset corporate income tax against the new levy.

Within hours of the tax provision being published over 100,000 people joined a Facebook group protesting the levy, which they fear providers will pass on to them. Thousands said they would rally against the tax, which they said was excessive, outside the Economy Ministry on Sunday.

Prime Minister Viktor Orban's government has in the last few years imposed special taxes on the banking, retail and energy sectors as well as on telecommunications providers to keep the budget deficit in check, jeopardising profits in some sectors of the economy and unnerving international investors.

Economy Minister Mihaly Varga defended the move on Tuesday, saying communications technology has changed the way people use telecom services and therefore the tax code needed to be changed. His ministry said it expects the tax to generate annual revenue of 20 billion forints.

However, fixed-line Internet traffic in Hungary reached 1.15 billion gigabytes in 2013, and mobile internet added a further 18 million gigabytes; that traffic would generate revenue of about 175 billion forints under the new tax, according to consultancy firm eNet.
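
eNet's number is a straight multiplication of the reported traffic by the proposed per-gigabyte rate:

# Multiplying the 2013 traffic figures quoted above by the proposed rate:
fixed_gb  = 1.15e9        # fixed-line gigabytes, 2013
mobile_gb = 18e6          # mobile gigabytes, 2013
rate_huf  = 150           # forints per gigabyte

print(f"{(fixed_gb + mobile_gb) * rate_huf / 1e9:.0f} billion forints")  # ~175 billion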

Traffic has probably grown since, eNet partner Gergely Kis told Reuters, so the tax could hit Internet providers by more than 200 billion forints, if left unaltered.

The entire internet service sector's annual revenue came to 164 billion forints at the end of 2013, according to the Central Statistics Office (KSH).

The government's low revenue estimate suggests it will impose a cap on the amount of tax any single Internet provider will have to pay. In view of the public reaction, the ruling Fidesz party asked the government to set a maximum level on the tax payable by individuals.

"The Fidesz parliament group insists that the data traffic tax be paid by service providers, therefore we propose changes to the bill," Fidesz parliament group leader Antal Rogan said in an emailed statement.

"We think it is practical to introduce an upper limit in the same fashion and same magnitude that applied to voice-based telephony previously."

Under the current tax code, private individuals' payments are capped at 700 forints ($2.90) a month, while companies cannot pay more than 5,000 forints a month.

A government spokesman was not immediately available for comment.

STOCK HIT, INTERNET USERS UNITE

Analysts at Equilor Securities said on Wednesday that the Internet service market leader, Deutsche Telekom's (DTEGn.DE) subsidiary Magyar Telekom (MTEL.BU), could expect to pay about 10 billion forints if there were no limit on the proposed tax.

"Although corporate taxes offset this amount Magyar Telekom has paid only 200-300 million forints worth of such tax in recent years because its parent company used tax breaks," Equilor noted.

"The company could theoretically pass on the burden to its clients but that requires a business policy decision so it's too early to say much about that. The tax could, however, boost uncertainty about a resumption of dividend payments at Magyar Telekom."
Magyar Telekom recently said it would pay no dividend for 2014 in order to keep its debt in check.

The company said the "drastic" new tax threatened to undermine planned investments in broadband network infrastructure, and called for the proposal to be withdrawn. It said industry players were not consulted about the idea.

Magyar Telekom shares were down 2.9 percent at 1221 GMT, underperforming the blue chip index .BUX, which was down 0.3 percent.

The Association of IT, Telecommunications and Electronics Companies said in a statement on Wednesday that the tax would force providers to hike prices, which would feed through into consumer prices in general and hinder economic growth.

"The real losers of the Internet tax are not the Internet companies but their clients, users, and all Hungarians who would now access the services they have used much more expensively, or in an extreme case, not at all," the Association said.

Balazs Nemes, one of those who began the Facebook page protesting the move, said: "In more developed nations, broadband Internet access is considered part of human rights.

"Only the darkest dictatorships want to control the Internet either financially or with raw power," he said.

"We pay VAT, the Internet service providers pay corporate taxes, so what justifies making web use a luxury when we do basic things like arranging medical appointments, university applications or banking online?"

(Reporting by Marton Dunai and Gergely Szakacs; Editing by Hugh Lawson)
http://uk.reuters.com/article/2014/1...0IB0RI20141022

















Until next week,

- js.



















Current Week In Review





Recent WiRs -

October 18th, October 11th, October 4th, September 27th


Jack Spratts' Week In Review is published every Friday. Submit letters, articles, press releases, comments, questions etc. in plain text English to jackspratts (at) lycos (dot) com. Submission deadlines are Thursdays @ 1400 UTC. Please include contact info. The right to publish all remarks is reserved.


"The First Amendment rests on the assumption that the widest possible dissemination of information from diverse and antagonistic sources is essential to the welfare of the public."
- Hugo Black