P2P-Zone  

Old 15-07-15, 07:39 AM   #1
JackSpratts
 
 
Join Date: May 2001
Location: New England
Posts: 10,013
Default Peer-To-Peer News - The Week In Review - July 18th, '15

Since 2002

"Overall, depending on the advertising load of the site’s home page, an ad blocker cuts data transfers by half to three quarters." – Frédéric Filloux


"Anyway, if DEF CON wants a talk on how to hook up a Raspberry Pi to a UbiQuiTi NanoStation LOCOM9 in order to bridge WiFi, I'll happily give that talk." – Robert Graham


July 18th, 2015




A German Pirate Just Saved Our Right to Take Public Selfies
Liat Clark

The European Parliament has rejected proposals to place copyright restrictions on photos of public places. The proposals would have forced members of the public to secure permission from architects or rightsholders before sharing selfies taken in front of architectural landmarks on social media.

502 MEPs voted against the proposals -- part of a move to create a resolution streamlining European-wide copyright law -- including the very MEP who first tabled the inclusion, Jean-Marie Cavada.

Our right to take and share architecturally important selfies (known as the Freedom of Panorama) was saved by Julia Reda, MEP and German Pirate Party politician. Reda wrote the original draft upon which the resolution, passed on 9 July, was based.

"As a result, most Europeans will continue to be able to post selfies online and view photos of famous buildings on Wikipedia unencumbered by copyright," Reda told WIRED.co.uk. "At the same time, the fact that the attack on freedom of panorama for a time enjoyed the support of a majority demonstrates that many MEPs have yet to fully understand the cultural shift caused by the internet and its consequences for copyright. Much work remains until we have a European copyright framework fit for the digital age."

UK law already protects our right to take photos of public buildings, under the 1911 Copyright Act. But countries including France, Italy and Belgium require photographers and filmmakers to obtain licenses and pay fees to work in public places. Reda points out that those publicly supporting the highly unpopular move to make these kinds of rules Europe-wide were mainly French collection societies making millions from the permissions systems. "A French newspaper, Les Echos, reported yesterday that YouTube in fact does have a contract with a collecting society for such uses in France," Reda says.

This is in line with MEP Cavada's original plan for the proposal -- to target platforms like Wikipedia and Facebook, not their end users. End users would, however, have been legally vulnerable since every time you upload a photo to Facebook, for instance, you grant it the right to commercially use that photo.

"This amendment would be disastrous," Michael Maggs, chair of Wikimedia UK, said ahead of the vote. "Many citizens use Facebook, Tumblr and other commercial social media sites, and uploads to such sites would put photographers at legal risk, even if no money changes hands. Non-commercial is not the same as non-profit, and large numbers of educational, charity and academic sites would be affected, including Wikipedia." He warned that if such a clause were to pass, it could put international filmmakers off working in Europe.

That such a proposal got this far seems incredible. Reda tells WIRED.co.uk that the likely reason the Legal Affairs Committee passed the proposal, leading to the vote, will not be that it was "carefully evaluated" and agreed upon. Instead: "[MEPs] simply extended to this amendment the same attitude and convictions they apply to all copyright reform issues" -- that more copyright protection will always be a good thing.

Reda does not think the effort by a minority to alter the Freedom of Panorama could rear its head again, saying: "That's hard to imagine after the public outcry and the decisive vote: only 40 MEPs voted to keep the call for a restriction, 502 against -- including, in the end, even Mr Cavada himself." The UK's Royal Institute of British Architects (RIBA) publicly stated it was against the proposals in June, warning they could have "negative implications, and represent a potentially damaging restriction of the debate about architecture and public space"; while a Change.org petition against the changes racked up half a million supporters before the vote.

Reda does, however, hope to push for the Freedom of Panorama to be part of Europe's copyright "harmonisation" later in the year -- it didn't make it that far in yesterday's vote.

Other good news from yesterday's vote, and more successes for Reda, include the voting down of an amendment brought by German MEPs that would have seen the introduction of a Europe-wide "snippet tax", which has seen Google News stop publishing German news stories to avoid paying a levy on them. There was also a call to stop further geoblocking, which Reda says would have prevented "cultural minorities from accessing content in their language across borders", and "a call to enable e-lending and digitisation for libraries and text and data mining for scientists". Making it all in all a good day for creative freedoms.

It's definitely not time to rest, though, warned Reda, indicating to WIRED.co.uk that the concerning anti-internet company culture shows no sign of slowing -- and could ultimately backfire on Europe.

"One of the remaining worrying points in the report is a very negative outlook on internet platforms inserted by MEPs," she says. "Even the freedom of panorama restriction proposal can be seen in this context -- the conviction that online services are parasitic foreigners unfairly profiting off 'our' culture. The Commission has also announced plans to review the role of platforms as part of their digital single market strategy.

"An attempt seems to be mounting to increase the liability of social networks, search engines, apps etc. for copyright infringement committed by their users. This is dangerous, as it risks privatising law enforcement, incentivises service providers to actively screen user content which is both a privacy concern and can lead to overblocking when providers decide to err on the side of caution. The end result may be a concentration of the market to a few large intermediaries who can afford to put in place complicated monitoring systems -- thus increasing, rather than limiting, the dominance of today's big players."
http://www.wired.co.uk/news/archive/...-german-pirate





FBI Extends Piracy-Hunt to Romania, Sites Shut Down
Martin Anderson

The FBI has assisted Romanian authorities in the closure of three piracy-based torrent sites in the region. A report from the prosecutor's office in Romania's High Court of Cassation and Justice details a cooperative investigation dating back four years which has now resulted in raids and site seizures, including the domain serialepenet.ro, now taken over with a government seizure notice:

Roughly (machine) translated, the warning reads: 'This domain name is seized in accordance with article 249 of the Penal Code. This domain name is the subject of a criminal case.'

Additionally fisierulmeu.ro has been taken offline. Torrentfreak reports that authorities carried out searches of four locations, including the homes of individuals and also companies thought to be working in cooperation with the pirate sites, or furnishing services to them, and that documentation and computer equipment was seized during these raids.

The central hub of the operation was reportedly tracked down to an office in Bucharest, with a connection intimated to web-hosting firm Xservers, a cloud-hosting company with a range of business hosting solutions. Unsubstantiated media allegations implicate Xservers in money-laundering for the pirate operations, with any official word on the matter still pending. An undisclosed number of men have been arrested on suspicion of tax evasion and intellectual property offences, among other charges.

The FBI has a strong presence in Romania for more than the pursuit of redress for copyright-holders in the U.S., maintaining Legats (legal attaches) who represent the director of the FBI in Romanian territory, and who work in concert with Romanian and Moldovan investigative authorities at quite an intimate level.

FBI programs in Romania have spent more than $4.6mn (£2.9mn) on interdepartmental programs and initiatives with Romania-based authorities since 2007, costs which include the placement of an agent within the country's Organised Crime Directorate – an organization within the Romanian police force – and additional personnel at the Southeast European Cooperative Initiative (SECI) centre, an investment amounting to $3.4mn (£2.1mn). Though the FBI does not have default powers of investigation or arrest without a specific government warrant, this does not usually seem to be a hindrance – in 2010 the FBI participated in the arrest of 50 members of a fraud gang who had extracted approximately three million dollars from victims in the U.S., executing 100 search warrants in the process. Additionally the FBI funds the training of Romanian officers from various branches of enforcement in cross-over schemes between the two countries.
http://thestack.com/fbi-romania-pira...ut-down-140715





Feds Bust Through Huge Tor-Hidden Child Porn Site Using Questionable Malware

FBI seized server, let site run for two weeks before shutting it down.
Cyrus Farivar and Sean Gallagher

A newly unsealed FBI search warrant application illustrates yet another example of how the government deploys malware and uses sophisticated exploits in an attempt to bust up child pornography rings.

The 28-page FBI affidavit (text-only, possibly NSFW) was unsealed in a federal court in Brooklyn, New York earlier this month. It describes a North Carolina server hosting a Tor hidden service site. The setup was seized in February 2015, but law enforcement allowed it to run for two additional weeks as a way to monitor its nearly 215,000 users.

Currently, at least three men—Peter Ferrell, Alex Schreiber, and James Paroline—have been charged in connection with this site.

Ferrell, username "plowden23," is the target of the search warrant affidavit. Schreiber, 66, of Queens, is a former New York City schoolteacher. The two New York men have been released on bond.

Paroline remains in federal custody without bail in New Jersey. The criminal complaint against him states that "during an interview with law enforcement officers, defendant PAROLINE admitted" that while working at a nursery school and a summer camp counselor in New Jersey, he "inappropriately touched minor children."

Lawyers for two of the suspects did not respond to Ars' request for comment.

Mia Isner-Grynberg, the federal public defender for Ferrell, told Ars: "Thanks for reaching out. I'm sorry, but I don't generally comment on pending cases."

Kelly Langmesser, an FBI spokeswoman, also declined to respond to specific questions. "Because this is an ongoing matter, we are not commenting on the case," she told Ars.

Legal warrant or not?

Legal experts told Ars that there are significant questions about precisely how the unnamed Tor site was breached, exactly how its "Network Investigative Tool" (or NIT, i.e., malware) works, how many of the users were outside of the judicial district, and if the seized server contained other non-criminal content.

"This is another example of the FBI obtaining a warrant that they are not yet authorized to obtain or execute based on the lack of technical expertise of the judiciary," Ahmed Ghappour, a law professor at the University of California, Hastings, told Ars. Ghappour pointed to a proposed change to Rule 41 that is currently working its way through the judicial system. He has written at length about this potential upcoming modification to Rule 41.

If the proposal is passed as currently drafted, federal authorities would gain an expanded ability to conduct "remote access" under a warrant against a target computer whose location is unknown or outside of a given judicial district. It would also apply in cases where that computer is part of a larger network of computers spread across multiple judicial districts. For now, in the United States, federal warrants are issued by judges who serve one of the 94 federal judicial districts and are typically only valid for that particular jurisdiction.

With the Tor-server effort, the affidavit does not clearly indicate how the malware was specifically deployed, nor if it was used against users outside of the Eastern District of New York.

"As you say, [the amendment to] Rule 41 has not yet been implemented, and so the variety of users on this website that were abroad to the extent that they were hacked as a result of the execution of this warrant, that would be in violation of the current venue restrictions of Rule 41," Ghappour added. "Even if someone from out of state was to have their computer searched as a result, that would be outside the bounds of the venue restriction of the current rule."

Hanni Fakhoury, a former federal public defender and current attorney with the Electronic Frontier Foundation, told Ars that there are no specific statutes or cases that currently deal with government-sanctioned malware deployed against criminal suspects.

"Rather, it would just be governed by the same principles and standards that would apply to other forms of electronic communications," Fakhoury told Ars. "So if law enforcement is using the malware to monitor electronic communications in real time, it would need a wiretap order to monitor. And if the malware needs to be installed on a specific computer, they would need to get a search warrant to do that (and that’s what it looks like they did here, at least according to paragraph 21 on page 11 of the affidavit). There are some really tricky technical questions about whether these warrants are ‘particular,’ specifically because many times the actual location of the computer is unknown."

“Website A”

The affidavit only refers to "Website A" and doesn’t refer to Tor by name, but anyone familiar with how Tor works would recognize its description.

"The court filings scrupulously avoid naming Tor (or mentioning hacking). Instead, they provide a detailed description of an anonymizing ‘Network’ and how a particular website was hidden in that ‘Network’," Jonathan Mayer, a Stanford University legal scholar and current computer science doctoral candidate, told Ars. "There's only one software tool with the described popularity and with the described client and server functionality. That's Tor."

As FBI Special Agent John Robertson wrote:

Websites that are accessible only to users within the Network can be set up within the Network and Website A was one such website. Accordingly, Website A could not generally be accessed through the traditional Internet. Only a user who had installed the appropriate software on the user’s computer could access Website A. Even after connecting to the Network, however, a user had to know the exact web address of Website A in order to access it. Websites on the Network are not indexed in the same way as websites on the traditional Internet. Accordingly, unlike on the traditional Internet, a user could not simply perform a Google search for the name of Website A, obtain the web address for Website A, and click on a link to navigate to Website A. Rather, a user had to have obtained the web address for Website A directly from another source, such as other users of Website A, or from online postings describing both the sort of content available on Website A and its location.
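The "appropriate software" and unindexed addressing the agent describes correspond to Tor's SOCKS interface: a client never resolves a hidden-service name through ordinary DNS, but instead hands the entire hostname to the local Tor proxy inside a SOCKS5 CONNECT request. A minimal sketch of that framing (per RFC 1928), assuming Tor's conventional local proxy at 127.0.0.1:9050; the onion hostname below is a placeholder, not the site from this case:

```python
import socket


def socks5_connect_request(host: str, port: int) -> bytes:
    """Build a SOCKS5 CONNECT request using the DOMAINNAME address type,
    so the hostname (e.g. a .onion address) is resolved by the proxy,
    never by local DNS."""
    name = host.encode("ascii")
    return (
        b"\x05\x01\x00"            # VER=5, CMD=CONNECT, RSV=0
        + b"\x03"                  # ATYP=3: domain name
        + bytes([len(name)]) + name
        + port.to_bytes(2, "big")  # destination port, network byte order
    )


def open_via_tor(host: str, port: int, proxy=("127.0.0.1", 9050)):
    """Open a TCP stream through a locally running Tor SOCKS proxy
    (requires a Tor daemon listening on the proxy address)."""
    s = socket.create_connection(proxy)
    s.sendall(b"\x05\x01\x00")           # greeting: one auth method, "none"
    if s.recv(2) != b"\x05\x00":
        raise OSError("proxy refused no-auth SOCKS5")
    s.sendall(socks5_connect_request(host, port))
    reply = s.recv(10)                   # VER, REP, RSV, ATYP, BND.ADDR/PORT
    if reply[1] != 0:
        raise OSError(f"SOCKS5 connect failed, code {reply[1]}")
    return s
```

Because the DOMAINNAME address type carries the raw hostname to the proxy, the .onion lookup happens entirely inside the Tor network — which is exactly why "Website A" could not be reached from the traditional Internet.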

The court filing provides extensive descriptions of both the types of child pornography available on Website A (Ars will not repeat those here) and the malware’s capabilities:

Pursuant to that authorization, on or about and between February 20, 2015, and March 4, 2015, each time any user or administrator logged into Website A by entering a username and password, the FBI was authorized to deploy the NIT which would send one or more communications to the user’s computer. Those communications were designed to cause the receiving computer to deliver to a computer known to or controlled by the government data that would help identify the computer, its location, other information about the computer, and the user of the computer accessing Website A. That data included: the computer’s actual IP address, and the date and time that the NIT determined what that IP address was; a unique identifier generated by the NIT (a series of numbers, letters, and/or special characters) to distinguish the data from that of other computers; the type of operating system running on the computer, including type (e.g., Windows), version (e.g., Windows 7), and architecture (e.g., x86); information about whether the NIT had already been delivered to the computer; the computer’s Host Name; the computer's active operating system username; and the computer’s MAC address.
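The categories of data the affidavit lists (OS type, version, and architecture, host name, logged-in username, MAC address, plus a unique identifier) are all available to ordinary unprivileged code. A hedged Python sketch of how a payload might assemble that same fingerprint from standard-library calls; the function name and field names here are invented for illustration, not taken from the NIT:

```python
import getpass
import platform
import socket
import uuid


def collect_host_fingerprint():
    """Gather the same categories of host data the affidavit describes.

    Illustrative only: a real NIT would also report the machine's
    public IP address as observed by the receiving server.
    """
    mac = uuid.getnode()  # 48-bit hardware address as an integer
    return {
        "unique_id": str(uuid.uuid4()),      # distinguishes this host's report
        "os_type": platform.system(),        # e.g. "Windows", "Linux"
        "os_version": platform.release(),    # e.g. "7", "6.1"
        "architecture": platform.machine(),  # e.g. "x86", "x86_64"
        "host_name": socket.gethostname(),
        "os_username": getpass.getuser(),
        "mac_address": ":".join(
            f"{(mac >> s) & 0xFF:02x}" for s in range(40, -8, -8)
        ),
    }
```

Nothing in this list requires an elevated exploit to read; the exploit's job was only to get code running outside the browser sandbox and to smuggle the result out past Tor.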

Pulling back the curtain

Over the past few years, the FBI has used a number of tools to pull back the veil of privacy provided by Tor to identify suspected child pornography rings and other "darknet" markets. There are several possible ways in which first the server itself, and then the users, were exposed. It's possible that the server had been identified for months before the FBI seized it and used it as a "honeypot" to track and identify the individuals connecting to it.

The FBI's NIT was used in previous child pornography investigations. It's cited in court papers for the case USA v Cottom et al, which is currently being tried in the Nebraska US District Court. A team of experts hired by the defense—Dr. Ashley Podhradsky, Dr. Matt Miller, and Josh Stroschein of Dakota State University—performed forensic analysis of the NIT, reverse-engineering the code. They found it used the same techniques as Rapid7's Metasploit "decloaking engine"—a component of the Metasploit framework that in this case used a known Flash vulnerability to extract information about computers running an older, unpatched version of the Tor Browser Bundle. (Ironically, Metasploit's core developer for several years was also named Matt Miller—but he now works at Microsoft.)

Although the NIT leveraged an exploit to extract identity information from computers connecting to the Tor service, the defense expert investigators wrote that they "do not consider the NIT to be 'hacking'" because the NIT "exploited a configuration setting that did not require offensive-based actions." The NIT bypassed Tor by creating a direct socket connection that eschews Tor's routing -- in this particular case, by using a Flash component. This functionality, the experts noted, was identical to Metasploit's decloaking code.

Tor only routes Transmission Control Protocol (TCP) traffic and does not handle other Internet communications protocols. The exploit took advantage of this to send information about the system that the exploit executed on over the public Internet, both revealing its public address and tying that address to the website the exploit was launched from. A "policy file" on the server hosting the exploit is checked by the exploit package "to see which type of method to use on the client side," the expert investigators wrote to the court. "The choices given in the NIT were Java, Javascript, or Flash. This allows the NIT to only connect via Flash when it is the 'best method' available."
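The experts' description of the "policy file" amounts to a preference-ordered capability check: the server advertises which delivery methods may be used, and the exploit package picks the "best" one the client actually supports. A hedged sketch of that selection logic; the method names come from the court filing, but the function and the exact preference ordering are our own assumptions:

```python
# Preference order is an assumption: the filing says only that Flash is
# selected when it is the "best method" available on the client side.
PREFERENCE = ("Flash", "Java", "Javascript")


def choose_delivery_method(allowed_by_policy, available_on_client):
    """Pick the highest-preference method that both the server-side
    policy file permits and the client browser actually supports."""
    for method in PREFERENCE:
        if method in allowed_by_policy and method in available_on_client:
            return method
    return None  # no usable method: the decloaking payload cannot fire
```

The design choice matters for the defense analysis: because the method is chosen per-client from a server-side policy, patched or locked-down Tor Browser installs (with no Flash plugin) would simply yield no usable method and remain anonymous.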

In a conversation with Ars about the most recent FBI affidavit, security researcher and former Tor developer Runa Sandvik said she believes that the same Metasploit-based NIT was used to unmask the 215,000 users of the site seized by the FBI. Alternatively, she said the FBI may have used a honeypot technique that feeds site visitors a link to a webpage outside of Tor, next using a variety of traffic analysis methods and information provided by the site users themselves to aid in identifying them. "The FBI could have used that type of method too and not relied on [JavaScript] or Flash," she noted.

To use any of these techniques to "uncloak" users, however, the FBI must first find and gain control over the server those users visit. At this point, it's clear that the FBI has found a number of ways to identify servers running Tor "hidden services." Last November, as part of Operation Onymous, FBI and Europol officials identified and seized at least 27 servers hosting over 400 Tor hidden services—including the Silk Road 2.0 marketplace. While court papers filed in the Silk Road 2.0 case claim that an undercover Homeland Security investigator managed to get hired as an "admin" for the marketplace, the FBI could have used other techniques to identify hosting services that might have servers running Tor sites.

In a blog post last November, former Tor Project Director Andrew Lewman noted that ten Tor "exit nodes"—the last stops for Tor traffic before leaving the anonymizing network—had been taken offline during Operation Onymous. He noted that it was possible that law enforcement was operating Tor network nodes in an effort to identify hidden services and users. Even narrowing down the location of services to a particular hosting service's network would allow law enforcement to approach the hosting company. At that point, all the feds need to do is ask the service which servers running within their data centers matched the traffic profile of Tor.
http://arstechnica.com/tech-policy/2...nable-malware/





Dotcom's File-Sharing Hive Mega 'Sues for Copyright Infringement'

Fat man bites dog

A company founded by Kim Dotcom, the file-sharing website kingpin who is accused of mass copyright infringement, is reportedly suing another biz for ... copyright infringement.

Cloud storage website Mega.co.nz has apparently fired legal threats at MegaSearch.co.nz, a website that lists links to downloads of unlicensed material, alleging MegaSearch uses Mega's logo and trademarks without permission, TorrentFreak reports.

Mega has closed down scrapers before: MegaSearch.me was shuttered a couple of years ago. Mega works by allowing people to back up files – anything from personal photos and text to movies and pirated software – to its servers, and then share those files with other people, if necessary.

Quite how involved Dotcom really is in Mega isn't clear. The porky profiteer transferred the site's assets to his then-wife Mona Verga Dotcom (really) and children, but the couple have since separated.

Dotcom, whose birth name is Kim Schmitz, lost assets worth $20m – and the number plates GUILTY, MAFIA, and STONED – when his New Zealand mansion was raided by armed FBI agents in 2012. His bank accounts, which held $175m, were also frozen. He is now fighting off extradition to the US, where he is expected to face piracy charges over his previous file-sharing venture – MegaUpload.com.
http://www.theregister.co.uk/2015/07..._infringement/





High Court Quashes Regulations Allowing People to Copy CDs

Move follows judge’s recent ruling that government was legally mistaken in deciding not to introduce compensation scheme for musicians who faced losses

The high court has quashed regulations introduced by the government to allow members of the public to lawfully copy CDs and other copyright material bought for their own private use. The move follows a judge’s recent ruling that the government was legally incorrect in deciding not to introduce a compensation scheme for songwriters, musicians and other rights holders who faced losses as a result of their copyright being infringed.

The decision was won by the Musicians’ Union, UK Music and the British Academy of Songwriters, Composers and Authors, with a legal team led by two QCs, Ian Mill and Tom de la Mare. UK Music estimated that the new regulations, without a compensation scheme, would result in loss of revenues for rights owners in the creative sector of £58m a year.

The Department for Business, Innovation and Skills said when introducing the new regulations that they would cause only zero or insignificant harm, thus making compensation unnecessary. But Mr Justice Green, sitting in London, ruled last month that the evidence relied on by the government simply did not justify the claim that the harm would be “de minimis”. On Friday, in a further decision, he said: “It is clear that I should quash the regulations. I make clear this covers the entirety of the regulations and all the rights and obligations contained therein.”

The changes had come into force last October under the Copyright and Rights in Performances (Personal Copies for Private Use) Regulations 2014. Prior to 1 October, it was unlawful, for example, to “rip” or copy the contents of a CD on to a laptop, smartphone or MP3 player for personal use, although the format-shifting activity had become commonplace. The regulations introduced an exception into UK copyright law permitting the making of personal copies, as long as they were only for private use.

Jo Dipple, CEO of UK Music, said: “Last month, the high court agreed with us that the government acted unlawfully when it introduced an exception to copyright for private copying without fair compensation. We therefore welcome the court’s decision today to quash the existing regulations. It is vitally important that fairness for songwriters, composers and performers is written into the law. My members’ music defines this country.”

The judge stressed that the case had raised a range of legal issues of wide significance for UK and EU law, most of which he had decided in the government’s favour.

During a three-day hearing in April, Mill told the judge the law on private copying had been in an unsatisfactory state for decades. But the problem had been “massively exacerbated” by new digital technology and the internet, and the quality and speed of reproduction and copying they allowed.

Mill said the music industry welcomed the government’s new measures “but objects to the lack of a fair compensation scheme to compensate rights owners for the harm caused – both historically and in the future – by private copying infringements of their rights”. The UK government, unlike the majority of other European countries, had failed to provide appropriate compensation, he said.

Pushpinder Saini, representing the Department for Business, Innovation and Skills, contended that no credible evidence emerged during a lengthy consultation process that prejudice to rights holders “would be anything other than minimal”. The measures adopted by the UK authorities were far more limited in scope than those adopted in other EU member states, Saini submitted.

Under the new regulations, only the individual who purchased the original copy of the work, and not others such as a friend or family member, is legally allowed to copy it.

Saini argued that the music industry case “boils down to an opportunistic attempt to obtain a financial benefit which, if the exception had never been introduced, they would never have received”. But the judge rejected the government’s stance, saying it was “simply not justified” by the particular evidence it was relying on with regard to the compensation issue.
http://www.theguardian.com/uk-news/2...-cds-musicians





Goodbye, Music Tuesday: Starting Today, Albums Come Out On Friday
Leah Scarpelli

If you didn't find any new albums on iTunes or in your local music store earlier this week, it's because beginning July 10, new music around the world is being released on Fridays.

For more than 25 years, Tuesday has been the standard release day for new albums in America — a tradition Keith Caulfield, co-director of charts at Billboard, says had a lot to do with shipping in the pre-digital era.

"One particular retailer might get that album on, say, Monday morning before they open," Caulfield says. "And they can have it on their shelf. Boom, great! So, if you walk in and you want Michael Jackson's new Bad album, they will have it."

"However," he continues, "a store a couple blocks down the road may have not got their shipment." That store couldn't do anything but wait until it showed up.

In 1989, the recording industry settled on Tuesday as the day every retailer could start selling new releases at the same time — but that was just in the U.S. Albums came out on Mondays in the U.K. and Canada, Fridays in Australia and Germany. Recently, the industry decided it needed a global standard.

"In the digital world, you can't make consumers wait," says Adrian Strain, head of communications for the International Federation of the Phonographic Industry (IFPI), a trade group representing over 1,300 record labels worldwide.

Under the old system, Strain says, a fan in Britain could buy a new album on Monday and upload it — so her friend in Australia could listen before it came out there on Friday.

Strain says "New Music Fridays," the nickname for the new global release schedule, "should give less reason for those people who can't get the new release legally to go to illegal sites."

The new release date isn't just popular with industry officials. The IFPI asked consumers across eight countries when they would like to get new music. Of those who expressed an opinion, 68 percent said Friday or Saturday.

Still, album sales have been declining for years. Nielsen SoundScan just released its midyear report, and total album sales are down 4 percent from the same period last year. Total album consumption was up, thanks in part to the growth in music streaming services.

So, it might not seem to really matter when albums come out. But it does to the people who still sell them, like those at Amoeba Music in Hollywood, which calls itself the world's largest independent record store. Co-founder Marc Weinstein says he was not consulted about the new global album release day.

"It's not something we would choose to have happen," Weinstein says. "I mean, it's a logistic nightmare on a lot of levels."

Weinstein explains that Amoeba now has to change its ad schedule, weekend staffing and live in-store performances, which were typically held when albums came out on Tuesdays.

"It gave us an opportunity to get a bump in the middle of the week when a lot of people would come in on a Tuesday, which normally wouldn't be a busy day," he says.

With many stores already struggling to survive, "this is gonna be perceived as kind of another nail in the coffin for brick-and-mortar retail, and it's kind of sad that no one takes any of that into account when they make these kind of fundamental changes in the way things work," Weinstein says.

Adrian Strain says not everyone will be happy with such a big change — but that the industry can only follow what it thinks the music fan wants.
http://www.npr.org/sections/therecor...-out-on-friday





Neil Young is Finished with Streaming

The singer removes all his music from streaming services.

Neil Young is officially done with streaming according to a new post on his Facebook page. The songwriter and mastermind behind his own hi-fi music project has announced he will remove all his music from streaming services.

In the post, which was accompanied by a picture of a melting vinyl record, he explained it wasn't about money; he simply doesn't want his music heard in compromised quality any longer. He said the decision isn't final and he's willing to come back when things improve. That could be sooner than he thinks, considering Tidal already offers lossless streaming and Apple Music recently announced it will include hi-fi options in its next update. Read the statement below:

Streaming has ended for me. I hope this is ok for my fans.

It’s not because of the money, although my share (like all the other artists) was dramatically reduced by bad deals made without my consent.

It’s about sound quality. I don’t need my music to be devalued by the worst quality in the history of broadcasting or any other form of distribution. I don’t feel right allowing this to be sold to my fans. It’s bad for my music.

For me, It’s about making and distributing music people can really hear and feel. I stand for that.

When the quality is back, I’ll give it another look. Never say never.

Neil Young


Ultimately, Young has been a little mixed with his messages here, since he also recently called the vinyl revival “a fashion statement”. Maybe he won’t rest until we all have Ponos.
http://www.factmag.com/2015/07/15/ne...ith-streaming/





U.S. Judge Says Internet Streaming Service Should be Treated Like Cable
Andrew Chung

A U.S. judge ruled on Thursday that online television service FilmOn X LLC should be treated like a traditional cable system in order to transmit the programs of the nation's broadcasters over the Internet.

The ruling, coming as consumer TV-watching habits increasingly migrate to the Internet, is the first to view a streaming service as a cable provider and could have major implications for broadcasters if it is upheld by higher courts.

Broadcasters have been aggressively litigating against such services, contending they violate their copyrights and threaten their ability to generate advertising and control subscription fees.

U.S. District Judge George Wu in Los Angeles said in his ruling that FilmOn X is entitled to a compulsory license under the Copyright Act to retransmit the broadcasters' programs if it meets the law's requirements.

Acknowledging the major commercial consequences of his decision, Wu said he would allow an immediate appeal to the 9th U.S. Circuit Court of Appeals.

He also left in place an injunction against FilmOn X's operations that the broadcasters had won in 2012, so FilmOn will still not be able to stream their content pending the appeal.

"The broadcasters have been trying to keep their foot on the throat of innovation," said FilmOn X's lawyer, Ryan Baker, in an interview. "The court’s decision today is a win for technology and for the American public."

In a statement, Fox Networks said the opinion "contravenes all legal precedent" and vowed to appeal.

The dispute stems from two lawsuits that Fox, Walt Disney Co's ABC network, CBS Corp, Comcast Corp's NBCUniversal and several others filed against FilmOn X in 2012.

The networks successfully shut down Aereo, a more prominent competitor to FilmOn X, when the U.S. Supreme Court in June 2014 said that company violated the broadcasters' copyrights in retransmitting their programs to subscribers' devices via the Internet.

Aereo then tried to argue in a Manhattan federal court it should be seen as analogous to cable, eligible for a compulsory license. The judge in that case disagreed. The company, backed by Barry Diller's IAC/InterActiveCorp, has since gone bankrupt.

Both Aereo and FilmOn X, founded by Internet entrepreneur Alki David, use similar technology that allows viewers to watch network television captured via remote antennas and sent over the Internet.

The case is Fox Television Stations, Inc v. FilmOn X, LLC, in the U.S. District Court for the Central District of California, No. 12-cv-6921.

(Reporting by Andrew Chung; Editing by Cynthia Osterman)
http://www.reuters.com/article/2015/...0PR02620150717





Cord Cutting Is About To Punch ESPN Squarely In The Face
Karl Bode

If there's a primary reason for ridiculously-high cable TV prices, it's sports content generally, and ESPN specifically. On one hand, sports programming is one of the biggest reasons that people continue to pay for traditional TV. But with the slow but steady rise in cord cutting and an increase in so-called "skinny bundle" streaming services, it's pretty clear that the "worldwide leader in sports" is starting to get a little bit nervous. Cord cutting has hit segments like kids broadcasting harder than other areas, but it's increasingly clear the death of the traditional cable cash cow is headed in ESPN's direction at a pretty reasonable clip.

According to a recent Wall Street Journal report, the channel is tightening its belt after starting to feel the cord cutting (and more accurately, the cord trimming) pinch. ESPN has lost 7.2 million viewers in the last four years, and a little more than three million in the last year:

Since July 2011, ESPN’s reach into American homes has dropped 7.2%, from more than 100 million households—roughly the size of the total U.S. pay-TV market—to 92.9 million households, according to Nielsen data. Viewership of SportsCenter, its marquee and high-margin sports-news show, has sagged since September, due in part to the fact that younger consumers are increasingly finding sports news at their fingertips on smartphone apps.

There's a cable and broadcast industry narrative that consumers just can't live without sports, and the blathering talking heads on ESPN somehow get included in this argument. But a recent survey by DigitalSmiths suggested that only 35.7% of consumers would include ESPN in their cable lineup if they were able to pick and choose their channels (a la carte TV). In fact, the channel came in at 20th place in terms of the most desired channels among those surveyed. So according to SNL Kagan data, there are about 94.5 million homes each paying $6.41 per month ($7.5 billion annually) for a channel they're not really all that interested in.

That's pretty clearly not sustainable, and ESPN could be served by getting ahead of the curve and launching its own direct-to-consumer streaming service. But the Journal points out that the company's current contracts with pay TV providers state that if ESPN goes that route, the cable operators have the right to boot ESPN out of their core channel lineups:

If ESPN offers its channel as a direct-to-consumer streaming service, some pay-TV operators have the contractual right to boot ESPN out of their most widely-sold channel packages and sell it a la carte, according to people familiar with the matter. ESPN would have to charge about $30 a month per customer in an over-the-top offering to make the same money using that model, analysts say. But those distributors would have the right to undercut ESPN in their retail pricing, the people said.

And you might recall that ESPN sued Verizon when the company decided to pull ESPN out of the core channel lineup, arguing at the time that this was necessary to protect "innovation":

ESPN is at the forefront of embracing innovative ways to deliver high-quality content and value to consumers on multiple platforms, but that must be done in compliance with our agreements. We simply ask that Verizon abide by the terms of our contracts.

In other words, if ESPN actually decides to get out ahead of cord cutting and cord trimming by focusing on a direct-to-consumer effort, they'll open the door to more cord cutting and cord trimming, since they'll no longer be able to force people to pay an arm and a leg for a product many of them don't actually watch. Isn't the Internet video revolution kind of beautiful?
https://www.techdirt.com/articles/20...ely-face.shtml





Password Sharing: Are Netflix, HBO Missing $500 Million by Not Cracking Down?

Subscription VOD players, fighting for market share, don't have easy fixes to stop cheaters
Todd Spangler

Netflix, HBO and other Internet video-subscription providers are theoretically leaving megabucks on the table from customers nefariously sharing login info with nonpaying users. So why aren’t they aggressively trying to block the millions of freeloaders gorging on “Game of Thrones” or “Orange Is the New Black”?

Illicit password-sharing would appear to be a serious issue for subscription VOD players: The practice will cost the sector upwards of $500 million worldwide in 2015, according to a recent report from research firm Parks Associates.

It’s certainly a striking claim. About 6% of U.S. broadband households use an over-the-top video service paid by someone living outside of the household, the firm estimated. Unauthorized password-sharing is most rampant among consumers 18-24, with 20% of OTT users in that age bracket binge-watching on someone else’s dime, Parks says. The data is based on a consumer survey of 10,000 U.S. broadband households conducted in Q3 2014.

But the reason subscription-video services are not moving to actively stamp out password sharing, at least for the time being, is that they don’t want to screw up the customer experience — especially as they’re in growth mode, adding new subscribers every month.

First, think of it along the lines of the tech-biz maxim “it’s not a bug, it’s a feature.” Netflix and HBO Now are specifically designed for multiple (authorized) members of a household to watch on several different screens at once. Is a college kid piggybacking off mom and dad’s Netflix account out of bounds? It’s a gray area. But anytime-anyplace multistream capabilities are a core part of why people love SVOD services. And the goal is to encourage as much usage per account as possible, because that drives up the perceived value of the subscription, so such “virtual households” are tolerated.

The real problem is, SVOD providers really can’t block unauthorized users if they have a legit password without instituting an additional form of authentication. Netflix and HBO want to make it as easy as possible to watch their streaming services; if they started asking for your mother’s maiden name or some other proof you’re entitled to the goods, customers would get irritated.

Look, HBO is not going to require a fingerprint scan or Social Security number before you can watch the latest “True Detective” episode. SVOD services could do heavy two-factor authentication for a preset number of devices per account, but again, that would stunt users’ ability to stream on any Internet-connected thing with a screen (e.g., from your in-laws’ smart TV on Thanksgiving).

That said, Netflix has effectively taken password-sharing into account in its pricing strategy. Since 2013, it has offered a “family plan” with up to four concurrent streams for $11.99 monthly (versus two streams for the standard $8.99 service). Sure, that’s designed for families — or, say, four cheapskate buddies who can get Netflix for $3 each. HBO, for both the standalone HBO Now and HBO Go cable add-on services, provides up to three simultaneous streams per account.
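The cheapskate arithmetic the article alludes to is easy to make concrete. A minimal sketch using the prices quoted above (the article's figures; rounding is mine):

```python
# Back-of-the-envelope split of Netflix's four-stream "family plan",
# using the monthly prices quoted in the article.
family_plan = 11.99   # up to four concurrent streams
standard = 8.99       # two concurrent streams

# Four people splitting one family-plan account:
per_person = family_plan / 4
print(f"Cost per person on a shared family plan: ${per_person:.2f}")  # $3.00

# The premium for doubling the stream count over the standard tier:
print(f"Upgrade cost over standard: ${family_plan - standard:.2f}")   # $3.00
```

In other words, the four-stream tier costs exactly the same $3 extra that each of the four hypothetical "cheapskate buddies" would pay, which is presumably not a coincidence in Netflix's pricing.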

Technically, sharing passwords with anyone outside your household violates SVOD providers’ terms of service, which specify that access to the services is only for personal use and “nontransferable.” (Note that for Hulu, password sharing isn’t as much of a concern because it provides just a single concurrent stream per account to subs.)

Execs at Netflix and HBO have regularly insisted that password-lending scofflaws are not a big concern. And then there’s this: Many password “borrowers” may eventually become paying customers. A 2013 study found that 41% of HBO freeloaders and 33% of Netflix non-subscribers said they’d be willing to pay for their own accounts in the next six months.

To be sure, the industry is closely watching to see if password sharing becomes worse. HBO reserves the right to “change the maximum number of simultaneous streams and/or registered devices per account that you may use at any time, in its sole discretion,” according to the cabler’s terms of service for HBO Now. Netflix has similar verbiage on its subscriber agreement.

Right now, though, in the scramble for market share, putting up with password-sharing cheaters is a cost of doing business that Netflix and HBO don’t have an easy way to solve.
https://variety.com/2015/digital/new...al-1201538908/





At Netflix, Big Jump in Users—and Costs

Company plans expansion to more countries as profit declines 63%
Shalini Ramachandran and Maria Armental

Netflix Inc. added a better-than-expected 3.28 million streaming subscribers in the June quarter, as the video service continued to sacrifice profit amid an ambitious international expansion.

The company, based in Los Gatos, Calif., plans to aggressively expand service to more countries by the end of next year, with Japan, Portugal, Italy and Spain slated to join later this year. Netflix also said that it continues to look for a partner to enter China.

Netflix’s second-quarter results, however, show the inherent risks associated with expanding operations and doing business abroad. Profit fell 63% in the quarter as costs increased to buy and create content, and the strong U.S. dollar lowered the value of revenue generated outside the U.S.

On a video conference call, Netflix Chief Executive Reed Hastings struck a cautious note for investors, saying that international markets may not blossom right out of the gate. “How we do in the first year in a new market is not that determinate of the long-term,” he said. In the next couple of years, Netflix will “have a clearer picture of how we will do in markets that are quite different from the U.S.,” he said.

Netflix had negative free cash flow of $229 million in the second quarter, wider than the $163 million outflow in the first quarter, as the company continues to spend on original shows. Nevertheless, the service affirmed its commitment to the originals strategy, noting that nearly 90% of its subscribers have “engaged” with its original shows and movies.

The company projected that its spending on content would approach $5 billion in 2016, and expenses for marketing will be nearly $1 billion next year. The higher spending comes as Netflix, the pioneer of Internet TV, faces increased competition from traditional media companies, such as Comcast Corp., which said it plans to roll out a streaming service to its broadband subscribers this summer.

Netflix shares, which started trading Wednesday at the new price reflecting the company’s 7-for-1 stock split, rose 9.7% in late trading to $107.63. As of 4 p.m. Wednesday, the shares were down 2.2% to $98.13 in regular trading on the Nasdaq. Netflix shares have more than doubled this year, giving Netflix a more expensive forward-looking price-earnings ratio than 99% of U.S. companies with market values over $1 billion, according to data from FactSet.

Netflix’s surprising 3.28 million bump in subscribers—the company’s projection was for 2.5 million additions, while analysts, on average, were expecting 3.14 million, according to FactSet—occurred as the streaming service released several original shows in the quarter.

The new shows included Marvel’s “Daredevil,” the science-fiction series “Sense8” and the third season of its high-profile dark comedy series “Orange is the New Black.”

The company has been placing a big emphasis on its initiative to back original films that will premiere on Netflix at the same time as they debut in theaters. Last month, it announced a partnership with Brad Pitt’s Plan B Entertainment to create a new movie called “War Machine” to debut next year. In the coming months, several Netflix original movies will premiere on the service at the same time as in select theaters, including “Crouching Tiger, Hidden Dragon: The Green Legend,” and Adam Sandler’s “The Ridiculous Six.”

The content is aimed at helping to separate Netflix from a wave of new online video services jumping into the streaming market, in the U.S. and abroad. Traditional TV networks such as HBO, CBS and Lifetime have launched their own online video services domestically, while pay-TV operators like Dish Network Corp. are catering to younger audiences with inexpensive, skinny-bundle Web TV services. Abroad, potential rivals like Eros International’s Eros Now in India are beefing up their content libraries ahead of Netflix’s arrival.

Overall, Netflix reported a quarterly profit of $26.3 million, or six cents a share, down from $71 million, or 16 cents a share, a year earlier. Revenue rose to $1.64 billion from $1.34 billion.
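As a quick sanity check, the year-over-year changes can be recomputed from the article's own figures (a sketch; all amounts are in millions of dollars as reported):

```python
# Year-over-year comparison of the Netflix Q2 figures quoted above
# (amounts in millions of dollars, per the article).
profit_2015, profit_2014 = 26.3, 71.0
revenue_2015, revenue_2014 = 1640.0, 1340.0

profit_decline = 1 - profit_2015 / profit_2014
revenue_growth = revenue_2015 / revenue_2014 - 1

print(f"Profit decline: {profit_decline:.0%}")   # 63%, matching the headline
print(f"Revenue growth: {revenue_growth:.0%}")   # 22%
```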

The results topped the company’s guidance. Analysts surveyed by Thomson Reuters expected, on average, earnings of four cents a share on $1.65 billion in revenue.

International operations again weighed on profit as the segment’s second-quarter loss widened to $92 million, from a year-earlier loss of $15 million. The company expects a loss of $77 million in the current quarter. Mr. Hastings said on the call that the company’s international losses will peak next year.

Including customers signed up for free trials, Netflix now has 65.55 million streaming members world-wide. It added a net 900,000 members in the U.S. and 2.37 million abroad in the second quarter.
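The regional net adds quoted above sum, up to rounding, to the headline quarterly figure:

```python
# Net subscriber additions for the quarter, in millions,
# as quoted in the article.
us_adds = 0.90
intl_adds = 2.37

total = us_adds + intl_adds
print(f"Total net adds: {total:.2f} million")
# 3.27 million; the 3.28 million headline figure reflects rounding
# of the regional components.
```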

Netflix expects to add another 1.15 million customers in the U.S. in the third quarter and 2.4 million from its international operations.

Mr. Hastings said he would like the streaming service to add 5 million to 6 million subscribers annually at a steady pace for the next several years until it reaches 60 million to 90 million subscribers in the U.S.—a target he has maintained for a long time. U.S. customers currently stand at 42.3 million.

Separately, Netflix said it would support Charter Communications Inc.’s deal to buy Time Warner Cable Inc. so long as Charter adheres to its new policy not to charge content companies and long-haul telecom carriers to interconnect to its network.

Though Netflix grudgingly struck a “paid” interconnection deal with Comcast early last year, it lashed out publicly against the practice of such charges and made the deal a cornerstone of its opposition to Comcast’s ill-fated merger with Time Warner Cable.

It’s clear that Netflix hopes its public handshake with Charter bears other fruit. Mr. Hastings said on the call that “it would be great” if regulators reviewing AT&T’s proposed $49 billion acquisition of DirecTV looked at applying as a condition the “precedent” set by Charter’s free interconnection policy.
http://www.wsj.com/article_email/net...ODEzNTYxMDU2Wj





Netflix to Support Charter Acquisition of Time Warner Cable
Alex Sherman

Netflix Inc. will support Charter Communications Inc.’s $55 billion acquisition of Time Warner Cable Inc. in exchange for free access to Charter’s customers, according to filings the companies sent to the Federal Communications Commission Tuesday.

Charter won’t charge any website to deliver its content more efficiently until at least Dec. 31, 2018, the company said in a filing. Netflix filed a separate document that said it’s committed to supporting Charter’s deal for Time Warner Cable announced in May, given this commitment by Charter.

Netflix is a major customer of Charter and Time Warner Cable, and its support could help allay concerns by regulators that the deal might be bad for customers. Charter’s new “peering” policy prevents the broadband provider from charging any Internet company for faster access, which is particularly important for video-streaming services such as Netflix, Time Warner Inc.’s HBO Go, and Google Inc.’s YouTube. Netflix accounts for about 37 percent of all download traffic in peak evening hours, according to research from Sandvine, a Canada-based networking company.

Netflix opposed an earlier failed bid by Comcast Corp. for Time Warner Cable after reluctantly agreeing last year to pay Comcast for access. With settlement-free “peering,” neither Charter nor Netflix would pay each other.

“This new policy and the commitment to apply it across the ‘New Charter’ footprint is a substantial public-interest benefit and will support scaling the Internet to meet consumers’ growing demand for online services and help foster continued innovation across the Internet ecosystem,” Christopher Libertelli, vice president of global public policy at Netflix, wrote in the filing.
Net Neutrality

Charter shares climbed 2.2 percent to close at $181.00 in New York, putting the stock up 8.6 percent this year. Time Warner Cable shares rose 1.4 percent to $186.00.

Content companies such as Netflix want to get information from their servers to the networks of consumer Internet providers. Netflix has traditionally used a third company to make this connection. But it also has the option of connecting through its own content delivery network, cutting out the middleman and dealing with companies such as Charter and Comcast directly. The controversy surrounding Netflix’s deal with Comcast came because Netflix agreed to pay for this direct connection. Critics say these payments are a kind of extortion on the cable operators’ part.

The FCC approved rules earlier this year in an effort to curb the power of broadband operators to slow or block traffic. The agency claimed power to judge whether Internet service providers offer fair terms for accepting Web traffic from companies like Netflix and data shippers such as Cogent Communications Holdings Inc. and Level 3 Communications Inc.
http://www.bloomberg.com/news/articl...e-warner-cable





Comcast Candidly Admits Its High Prices Helped Create Netflix
Karl Bode

Responding to Netflix's record-setting second quarter numbers, top Comcast lobbyist David Cohen this week was willing to admit that Comcast's refusal to seriously compete on TV pricing helped create the streaming giant that Netflix is today. Speaking to attendees of the New England Cable & Telecommunications Association, Cohen called Netflix a "frenemy" Comcast inadvertently encouraged via high television pricing:

While Cohen sees Netflix as a complement to Comcast’s cable offering, he acknowledges that streaming services, especially those that offer slimmer video packages like Sling TV and Sony PlayStation Vue, could potentially be more attractive to price-conscious consumers. "Part of this is a self-inflicted wound,” Cohen said. “We have made video too expensive."

You don't say? While broadcaster programming costs are the primary culprit, cable operators have contributed to these soaring rates through a variety of additional charges, whether that's a modem rental fee, a "Broadcast TV" fee, the increase in DVR and set top rental costs, charges to pay your bill in person, charges to pay your bill over the phone, etc.

Regardless, most everyone agrees this price-hike parade is unsustainable, and Comcast, owner of NBC, is culpable all along the supply chain. Cohen of course also was quick to point out how even though Netflix may have added 3.2 million subscribers last quarter, Netflix still has to come through Comcast to get to them:

Cohen added while some fear that more Netflix customers means less cable customers, he reminded the audience that reliable broadband is a crucial element of the streaming service. "Remember, you can’t get Netflix without broadband service,” Cohen said. “Those are 3 million customers of our broadband service."

You'll know the cable and broadcast industry is truly taking Netflix seriously when it begins to seriously compete on price, something the industry at large has gone to great, great lengths to avoid. The second cable operators are forced to compete on price on the video end, however, they'll likely compensate for lost TV revenue by increasing broadband prices. There are a few ways for Comcast to do that, either through interconnection fees or the company's slowly expanding use of usage caps and overages.
https://www.dslreports.com/shownews/...Netflix-134494





Comcast 2 Gig Pricing Arrives And The Install Cost Is a Doozy
Karl Bode

A new Comcast website has finally revealed pricing for the company's heavily promoted two gigabit offering. According to the freshly unveiled Comcast website for the "Gigabit Pro" product, the company's standard pricing for the symmetrical two gigabit service will be $300 a month. That pricing doesn't include a whopping $500 installation fee and $500 activation fee, or the other surcharges Comcast applies to the connection post sale.

Fortunately, Comcast says they'll be offering a $159 per month early promotional price for the service if users agree to a two-year contract. Update: this price is apparently only being offered in one small area and only if users sign a three year contract.

Comcast announced back in April that it would be offering two gigabit ("Gigabit Pro") service to 18 million homes by the end of the year. The company, however, had gone out of its way to avoid discussing what it hoped to charge; the steep $1,000 in installation and activation fees is likely the major reason why.
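To put the quoted prices in perspective, here is a rough two-year cost comparison; a sketch that includes the roughly $1,000 in one-time fees but ignores the unspecified surcharges, taxes and any early termination fee:

```python
# Rough two-year total cost of Comcast's "Gigabit Pro" tier under
# the two price points quoted in the article. Ignores taxes and the
# unspecified post-sale surcharges the article mentions.
months = 24
one_time_fees = 500 + 500        # installation + activation

standard_total = 300 * months + one_time_fees   # $300/month standard rate
promo_total = 159 * months + one_time_fees      # $159/month promo rate

print(f"Standard ($300/mo): ${standard_total:,}")   # $8,200
print(f"Promo ($159/mo):    ${promo_total:,}")      # $4,816
```

Note the promo price, per the update above, apparently requires a three-year contract in one small area; 24 months is used here purely for an apples-to-apples comparison.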

There's also an unspecified early termination fee, which if Comcast's existing 505 Mbps tier is any indication, will also be well over a thousand dollars.

To get the service, Comcast's website says customers must "generally live within a third of a mile of our fiber network in the cities where Gigabit Pro is offered." The company says that installation of the new tier may take "six to eight weeks" to complete. It's unclear if the company plans to reduce the speedy service's price in markets where Comcast faces pressure from municipal broadband or Google Fiber.
https://www.dslreports.com/shownews/...a-Doozy-134458





Gigabit Marketing Drives Adoption of Slower Tiers
Karl Bode

If you think gigabit speeds (or two gigabit speeds) are largely a marketing ploy, you're right, since there are few services that can utilize even half that amount of bandwidth. But numerous ISPs have now stated that offering gigabit service appears to drive adoption of slower speeds as well. TDS Telecom recently stated that adoption of its $35 a month 100 Mbps tier has surged since it started offering gigabit speeds.

Apparently users call in to see what the excitement is about, realize they probably don't need that much speed (or don't want to pay that price), then stumble off with a fast-but-not-ridiculously-fast slower tier.

"As soon as you begin to shout you have a 1 Gig service it begins to draw people's attention and when you explain to them why 1 Gig is closer to reality than you think, they begin to examine what service they have and they call for more bandwidth," Hawaiian Telecom CEO Scott Barber tells Fierce Cable.

In what will be a sort of blasphemy for our regular readers, the exec points out that there's a lot of people out there for whom "gigabit" is meaningless, since they have absolutely no idea what speed they currently have.

"Only one customer knew exactly what they had but all of them were guessing," Barber said. "People will order service knowing specifically what they are ordering, and a year or two later they forgot, so when you're able to talk about 1 Gig in the marketplace being available people begin to pay attention to what they have and the phones start to light up."

Of course the kind of people who don't know what speed they have may be looking to faster speeds to resolve issues like interconnection and streaming performance, which obviously have nothing to do with breaking the gigabit threshold.
https://www.dslreports.com/shownews/...r-Tiers-134464





Obama Unveils ConnectHome to Get Low-Income Households Online

The pilot program will launch in 27 cities and one tribal nation and reach more than 275,000 low-income households. Some communities will receive broadband connections at no charge.
Don Reisinger and Marguerite Reardon

The Obama administration on Wednesday announced a broad initiative that aims to provide high-speed Internet service to low-income households.

Dubbed ConnectHome, the new initiative will bring high-speed broadband access to over 275,000 low-income households across the US. According to the White House, the pilot program will launch in 27 cities including New York, Boston and Seattle, as well as the Choctaw Tribal Nation in Oklahoma. The effort will initially connect nearly 200,000 children to the Web, according to the White House.

The pilot program is part of the Obama administration's continuing effort to close the digital divide, ensuring that everyone, regardless of income, has access to high-speed Internet service. The president in March created the Broadband Opportunity Council, comprising 25 federal agencies and departments charged with giving more people access to broadband, which Obama sees as a critical component for US economic growth and competitiveness.

According to the Pew Research Center, 92 percent of households with incomes between $100,000 and $150,000 have broadband access, but less than half of households below the $25,000 income level can tap into high-speed Internet. The American Library Association, which applauded the president's move on Wednesday, said 5 million households with school-age children do not have high-speed Internet service.

"While nearly two-thirds of households in the lowest-income quintile own a computer, less than half have a home internet subscription," the White House said in a statement on Wednesday. "While many middle-class U.S. students go home to Internet access, allowing them to do research, write papers, and communicate digitally with their teachers and other students, too many lower-income children go unplugged every afternoon when school ends. This 'homework gap' runs the risk of widening the achievement gap, denying hardworking students the benefit of a technology-enriched education."

The US government has been actively seeking ways to connect more low-income households to the Web. In June, the Federal Communications Commission voted to advance a proposal that would allow qualifying households to use their $9.25-per-month Lifeline subsidy on either phone or broadband service.

"Broadband has gone from being a luxury to a necessity," FCC Chairman Tom Wheeler said at the time. "But the fact of the matter is that the majority of Americans earning less than $25,000 a year don't have broadband at home."

For its ConnectHome program, the US government is partnering with several organizations in the private and public sectors. Google Fiber, for instance, will provide free monthly home Internet service to select public housing communities in Atlanta; Durham, North Carolina; Kansas City, Missouri; and Nashville. Another provider, CenturyLink, is offering broadband service to HUD households in Seattle for $9.95 a month for the first year, increasing that monthly rate to $14.95 for the next four years. Cox Communications and Sprint, among others, will also participate.

Other companies and organizations will offer add-on services. Big-box retailer Best Buy, for instance, will provide computer training and technical support to HUD residents. The James M. Cox Foundation, a not-for-profit associated with Internet service provider Cox Communications, will provide 1,500 tablets at $30 each to students and their families in Macon, Georgia. The open-source project GitHub is offering $250,000 to help with "digital literacy," Public Broadcasting Service will produce new content for children and the American Library Association will provide on-site training.

"Librarians know from our work in communities every day that far, far too many Americans currently lack the technology access and skills to participate fully in education, employment and civic life," ALA President Sari Feldman said in a statement Wednesday. "Broadband is essential, and we are so pleased the Obama administration has made home broadband access a priority."

ConnectHome is being funded by private industry, nonprofit organizations and local leaders, Julian Castro, U.S. Secretary of Housing and Urban Development, said during a press call Wednesday. Together, they have committed to spending $70 million over the next several years, Jeff Zients, director of the White House National Economic Council, said on the call. The federal government will not be contributing money beyond the $50,000 allocated by the Department of Agriculture for broadband-related equipment deployed in the Choctaw tribal nation, Zients said. He also confirmed that the 27 local governments where the program is launching will not be required to pay for the program.

Castro said the program has been limited by design in order to show that it works.

"My hope is that it will demonstrate great results, which will provide an opportunity to help expand it," Castro said. "This is a demonstration project, and we're focused on making sure it's done right."

The agency plans to measure the success of the program in two ways. First, it will look at the number of families who did not have broadband access before the program launched and who chose to sign up for service afterward. The second measurement will be more complicated: the agency hopes to analyze how young people who participated in the program perform academically compared to students with similar socioeconomic backgrounds who do not have access to it.

"This is really a proof-of-concept program," Zients said. "We're trying to help as many students as we can while helping to bridge the digital divide."

President Obama will officially announce the ConnectHome initiative at an event in Durant, Oklahoma, on Wednesday. He is expected to outline the program and link it to his ConnectED initiative, a program announced in 2013 that aims to get 99 percent of US students in kindergarten through grade 12 high-speed Internet access in schools and libraries within five years.

"Since the president took office, the private and public sectors have invested over $260 billion into new broadband infrastructure, and three in four Americans now use broadband at home," the White House said on Wednesday. "Thanks to smart spectrum policies and world-leading technology, fast 4G wireless broadband is now available to over 98 percent of Americans -- up from zero percent in 2009."
http://www.cnet.com/news/obama-unvei...eholds-online/





News Sites Are Fatter and Slower Than Ever
Frédéric Filloux

An analysis of download times highlights how poorly designed news sites are. That’s more evidence of poor implementation of ads… and a strong case for ad blockers.

Website designers live in a bubble; they're increasingly disconnected from users. Their work requirements include design (fonts, layout elements), advertising (multiple ad-serving and analytics scripts), data collection (even though most sites collect far more data than they are able to process), A/B testing, and other marketing voodoo.

Then, when a third party vendor shows up with the tool-everyone-else-uses, the pitch stresses simplicity: ‘Just insert a couple of lines of code’, or ‘A super-light javascript’. Most often, corporate sales and marketing drones kick in: ‘We need this tool’, or ‘Media-buying agencies demand it’. The pressure can even come from the newsroom struggling to improve its SEO scores, asking for new gadgets “To better pilot editorial production”, or “To rank higher in Google News”.

Quite often, these contraptions are used to conceal professional shortcomings that range from an inability to devise good ad formats that won't be rejected by users (and, better, clicked on), to a failure to provide good journalism that will naturally find its way to users without needing titillating stimuli.

This troublesome trend also reveals a leadership failure: As no one is given the final authority over performance, a web site (and sometimes an app) ends up as a disorganized pile of everyone’s demands. Lack of choice leads to anarchy.

In the process, publishers end up sacrificing a precious commodity: SPEED.

Today, a news site's web page consists of a pile of scripts and requests to multiple hosts, in which relevant content makes up only an insignificant share of the freight. (On the same subject, see this post by John Gruber, and Dean Murphy's account of an hour with Safari Content Blocker in iOS 9.)

Consider the following observations: when I click on a New York Times article page, it takes about four minutes to download two megabytes of data through… 192 requests, some to the Times' hosts, most to a flurry of other servers hosting scores of scripts. Granted: the most useful part — a 1,700-word / 10,300-character article plus pictures — will load in less than five seconds.

But when I go to Wikipedia, a 1,900-word story will load in 983 milliseconds, requiring only 168 kilobytes of data through 28 requests.

To put it another way: a Wikipedia web page carrying the same volume of text will be:
— twelve times lighter in kilobytes
— five times faster to load relevant items
— about 250 times faster to fully load all elements contained in the page because it will send 7 times fewer requests.
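Taken at face value, the figures quoted above yield exactly these ratios; a quick arithmetic check in Python (the inputs are the article's own measurements, rounded):

```python
# Figures quoted above: NYT article page vs. a comparable Wikipedia page.
nyt_bytes, nyt_requests = 2_000_000, 192       # ~2 MB over 192 requests
nyt_full_load_s, nyt_text_load_s = 240, 5      # ~4 min total, ~5 s for the text
wiki_bytes, wiki_requests, wiki_load_s = 168_000, 28, 0.983

print(round(nyt_bytes / wiki_bytes))           # 12  (times lighter)
print(round(nyt_text_load_s / wiki_load_s))    # 5   (times faster for relevant items)
print(round(nyt_full_load_s / wiki_load_s))    # 244 (times faster to fully load)
print(round(nyt_requests / wiki_requests))     # 7   (times fewer requests)
```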

And the New York Times is definitely not the worst in its league.

I hear cohorts of technical people yelling that I'm comparing oranges — a barebones, advertising-free, non-profit web site — and peaches — a media-rich commercial site that must carry all the bells and whistles. I surely am, and that is exactly my point. Having spent hours measuring web sites from different perspectives — and knowing how the sausage factory works — I'm certain we went way too far in overloading our web sites with poorly implemented, questionable features.

Two major industry trends should force us to reconsider the way we build our digital properties. The first is the rise of ad blockers, which pride themselves on providing faster navigation and putting less strain on computers. Users have made publishers pay a steep price for sacrificing browsing comfort and speed: in some markets, more than 50% of visitors use ad-blocking extensions.

The second trend is the rise of mobile surfing, which accounts for half of pageviews in mature markets. And in emerging countries, users leapfrog desktops and access the web en masse through mobile.

Let’s consider the ad blockers problem.

To assess their impact on page-loading time, I selected eight news sites with different characteristics: free and mostly ad-supported (Buzzfeed, Spiegel, HuffPo and the Guardian), subscription-based (FT.com), mixed model (NYT), or ad-free and text-based (Medium).

The table below shows how the activation of an ad blocker impacts requests and data transfers. In my test, since a page can still continue to download elements 15 minutes after the initial request, I arbitrarily measured the number of requests and amount transferred after the site comes to some kind of rest, i.e. once the most important elements have been downloaded.

Note the number of server requests: it ranges from 99 (Medium) to 400+ for the HuffPo or FT.com. Requests by a site like the Washington Post are impossible to measure, since the site never stops downloading, endlessly calling data for auto-play videos; these megabytes are often rendered by Flash Player, well known for overheating CPUs (you can hear your laptop's fans make a drone-like noise).

One caveat: This is a crude assessment. I used Chrome’s developer tools, setting my AdBlockPlus on and off, and making about fifty downloads on these URLs; the values must only be considered as indicative, even if the ad blockers’ impact seems quite reliable. Another note: ad blockers cut requests and transfers of ad formats, but many scripts embedded in a page are not affected.

Overall, depending on the advertising load of the site's home page, an ad blocker cuts data transfers by half to three quarters.

Now, let’s have a look at mobile issues.

Unfortunately, today’s analysis is restricted to mobile sites and does not include applications. But it gives an idea of how web sites are implemented for mobile use, whether through responsive design (dynamic adjustment to the window’s size) or dedicated mobile sites.

First, let's consider the effect of network conditions encountered in real life, and rarely factored in by developers, who enjoy super-fast networks in their offices.

I took an article page of the New York Times mobile site, to see how fast it loads.

No surprise: the page takes six times longer on a regular 2G network than on 4G. Based on these observations, most mobile news sites would need to be redesigned to deal with the harsh realities of many cellular networks in Africa or Asia. (By ignoring this, publishers leave an open field to Google and Facebook ultra-light apps.)

For the test, I took seven news sites (FT.com doesn’t have a mobile version, nor a responsive one — it relies on its great web app for mobile use).

The test sample is an article page, with no ad blocking activated. I measured the difference between web and mobile versions, simulating an iPhone 5 operating over a “good 3G network” with a 1 Mbps transfer rate and a 40-millisecond round-trip time. Again, I quit measuring requests and byte transfers after 30 seconds, which coincides with apparent completion for most sites (even if many continue to download minor stuff almost endlessly).

For mobile use, this shows most analyzed properties reduce the number of requests and bytes transferred – by up to 80% for the Guardian as an example. However, if we consider that the speed ratio between 3G networks and wifi access is 1:20, there is room for serious improvement.
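The throttling parameters used in the test also put a hard floor under load time, no matter how lean the page; a back-of-the-envelope sketch (the page size and request count below are hypothetical, not measurements from the test):

```python
def min_load_time(total_bytes, num_requests, bandwidth_bps=1_000_000, rtt_s=0.040):
    """Crude lower bound for a throttled link: serial byte transfer plus one
    round trip per request (ignores parallel connections and TCP slow start)."""
    return total_bytes * 8 / bandwidth_bps + num_requests * rtt_s

# A hypothetical 1 MB page fetched over 100 requests on the simulated
# "good 3G" link (1 Mbps, 40 ms RTT) can never load in under ~12 seconds.
print(round(min_load_time(1_000_000, 100), 1))  # 12.0
```

Real pages fare worse, since connection setup and TCP slow start add overhead the model ignores.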

Since its initial release 22 years ago, the Hypertext Markup Language (HTML) has gone through many iterations that make web sites richer and smarter than ever. But this evolution also came with loads of complexity and a surfeit of questionable features. It's time to swing the pendulum back toward efficiency and simplicity. Users are asking for it and will punish those who don't listen.
http://www.mondaynote.com/2015/07/13...wer-than-ever/





Moxie Marlinspike: The Coder Who Encrypted Your Texts

Dreadlocked programmer has spooked the FBI by creating a tool that police can’t crack
Danny Yadron

In the past decade, Moxie Marlinspike has squatted on an abandoned island, toured the U.S. by hopping trains, he says, and earned the enmity of government officials for writing software.

Mr. Marlinspike created an encryption program that scrambles messages until they reach the intended reader. It’s so simple that Facebook Inc. ’s WhatsApp made it a standard feature for many of the app’s 800 million users.

The software is effective enough to alarm governments. Earlier this year, shortly after WhatsApp adopted it, British Prime Minister David Cameron called protected-messaging apps a “safe space” for terrorists. The following week, President Barack Obama called them “a problem.”

That makes the lanky, dreadlocked and intensely private coder a central figure in an escalating debate about government and commercial surveillance. In a research paper released Tuesday, 15 prominent technologists cited three programs relying on Mr. Marlinspike’s code as options for shielding communications.

His encrypted texting and calling app, Signal, has come up in White House meetings, says an attendee. Speaking via video link last year as part of a panel on surveillance, former National Security Agency contractor Edward Snowden, who leaked troves of U.S. spying secrets, urged listeners to use “anything” that Mr. Marlinspike releases.

That endorsement was “a little bit terrifying,” Mr. Marlinspike says. But he says he sees an opening, following Mr. Snowden’s revelations, to demystify, and simplify, encryption, so more people use it. He finds most privacy software too complicated for most users.

The former teenage hacker studies popular apps like Snapchat and Facebook Messenger, trying to understand their mass appeal. He says he wants to build simple, “frictionless” apps, adopting a Silicon Valley buzzword for “easy to use.”

“I really started thinking about, ’How do I be more in touch with reality?’ ” he says.

Those who know him say he has both the will and the technical chops to popularize complex technology.

A few years ago, Matthew Green, a cryptographer and professor at Johns Hopkins University, unleashed his students on Mr. Marlinspike’s code. To Prof. Green’s surprise, they didn’t find any errors. He compared the experience to working with a home contractor who made “every single corner perfectly squared.”

During chats about surveillance and security, Mr. Marlinspike also won over Morgan Marquis-Boire, a researcher who has worked on security for Google Inc. In a fellowship recommendation for Mr. Marlinspike, Mr. Marquis-Boire wrote, “There are very few people who write privacy tools that I trust, and Moxie is one of them.”

Mr. Marlinspike says it is more important that users trust his software than trust him. “It’s easier to trust that I haven’t made mistakes,” he says.

Even by the standards of privacy activists, Mr. Marlinspike is unusually secretive about himself. He won't give his age, except to say he is "in his 30s." In an interview, he wouldn't say whether Moxie Marlinspike was his birth name. In a 2011 online interview with the website Slashdot, however, he wrote, "the name my parents put on my birth certificate is 'Matthew.' " Friends and former associates say they know him only as Moxie.

Consumer encryption tools like Mr. Marlinspike’s have been around since the early 1990s, but most are so cumbersome that few people use them. A popular email-encryption program, PGP, or Pretty Good Privacy, requires users to swap a series of thousands of random letters and numbers with anyone they wish to contact. Sending a message requires several clicks, a password, and sometimes, copying and pasting.

A young Mr. Marlinspike once thought users would eventually adopt such tools. “That hasn’t really worked out,” he says now.

Phil Zimmermann, who invented PGP, says he rarely uses it because "it doesn't seem to work well on the current version of Macintosh."

Such headaches have limited the use of encryption to a level law enforcement has mostly learned to live with. Big technology companies like Google, Microsoft Corp. and Yahoo Inc. usually maintain access to customer messages and provide user emails and contact information to authorities when faced with a court order, even if they oppose it. Consumer services like these typically haven’t had strong encryption.

Adding easy-to-use encryption that companies can’t unscramble to products used by millions changes that calculus. After Apple Inc. tweaked its iPhone software so that the company could no longer unlock phones for police, the director of the Federal Bureau of Investigation accused Apple of aiding criminals. Apple Chief Executive Tim Cook counters that he is defending user privacy.

The incident sparked a continuing war of words between Silicon Valley and Washington. “Encryption has moved from something that is available to something that is the default,” FBI Director James Comey told a congressional panel Wednesday. “This is a world that in some ways is wonderful and in some ways has serious public-safety ramifications.”

Technology companies, once cozy with Washington, sound increasingly like Mr. Marlinspike. Apple, Facebook, Google and others are resisting efforts to give the government access to encrypted communications.

Last fall, WhatsApp added Mr. Marlinspike’s encryption scheme to text messages between users with Android smartphones, but there is no easy way to verify that the encryption software is actually turned on. The app maker, acquired by Facebook for $22 billion last year, plans to extend encryption to images and iPhone messages, a person familiar with the project said.

Behind the clash lurks this reality: Even if the big tech companies come around, there are others like Mr. Marlinspike who will pick fights with code.

Mr. Marlinspike argues for safe spaces online. His personal Web address is thoughtcrime.org, a reference to George Orwell’s “1984.”

As a teenager, Mr. Marlinspike says, he was more interested in breaking software than creating it. He turned to protecting data as he grew more concerned about surveillance.

He moved to San Francisco in the late 1990s and worked for several technology companies before the dot-com bust, including business-software maker BEA Systems Inc. Since then, he often has lived on the edge of the Bay Area’s tech-wonk scene.

During the mid-2000s, he and three friends refurbished a derelict sailboat and spent summers being blown around the Bahamas, without a backup motor, as depicted in a home movie Mr. Marlinspike posted online.

In 2010, Mr. Marlinspike’s company, Whisper Systems, released an encryption app, TextSecure. Twitter Inc. bought Whisper Systems for an undisclosed sum in 2011 primarily so that Mr. Marlinspike could help the then-startup improve its security, two people familiar with the transaction said. He worked to bolster privacy technology for the social-media firm, leaving in 2013.

Around that time, the State Department was looking to use technology to support pro-democracy movements overseas. Mr. Marlinspike’s work caught the attention of Ian Schuler, manager of the department’s Internet freedom programs. Encrypted messaging was viewed as a way for dissidents to get around repressive regimes.

With help from Mr. Schuler, Radio Free Asia’s Open Technology Fund, which is funded by the government and has a relationship with the State Department, granted Mr. Marlinspike more than $1.3 million between 2013 and 2014, according to the fund’s website.

Mr. Marlinspike was hardly a conventional Washington player. He and a government official missed meeting one another at a San Francisco burrito joint because the visitor assumed the dreadlocked Mr. Marlinspike couldn’t be the person he was there to see, Messrs. Schuler and Marlinspike said.

Mr. Marlinspike now runs a new firm, Open Whisper Systems, from a low-rent workspace in San Francisco’s Mission District. He has received other grants but says he isn’t interested in venture capital, partly because he would have to promise returns to investors.

His latest app, Signal, promises users secure text messages and voice calls. He acknowledges that it still has some kinks. Calls can drop if a user receives a traditional phone call while on an encrypted call. Mr. Marlinspike won’t disclose how many people use the app.

He still has work to do if he wants typical users to adopt encrypted communications.

But its minimalist blue-and-white design looks like something that could have emerged from Facebook.

Mr. Marlinspike says the San Francisco Police Department called last year to ask whether the app was secure enough for its officers to use. A spokesman for the department said it “did look at this vendor.”
http://www.wsj.com/article_email/mox...MjEzMDkxMjAwWj





Anti-Surveillance Tool ProxyHam Will Never See the Light of Day
Steve Ragan

Earlier this month, several news outlets reported on a powerful tool in the fight between those seeking anonymity online and those who push for surveillance and for taking that anonymity away.

The tool, ProxyHam, is the subject of a recently canceled talk at DEF CON 23 and its creator has been seemingly gagged from speaking about anything related to it. Something's off, as this doesn't seem like a typical cancellation.

Privacy is important, and if recent events are anything to go by – such as the FBI pushing to limit encryption and force companies to include backdoors in consumer-oriented products and services, or the recent Hacking Team incident that exposed the questionable and dangerous world of government surveillance – striking a balance between law enforcement and basic human freedoms is an uphill struggle.

Over the last several years, reports from various watchdog organizations have made it clear that anonymity on the Internet is viewed as a bad thing by some governments, and is starting to erode worldwide.

Whistleblowers, journalists, human rights activists, or anyone who wishes to express their opinions against the state are being tracked and targeted by the very governments they're discussing or protesting.

The documents leaked by Edward Snowden prove that privacy is a basic right that's easily dismissed by some governments, and the Hacking Team incident shows there's a booming business market in helping them succeed.

Organizations such as Hacking Team or Gamma International have developed the tools and tactics needed to help oppressive governments, enabling them with the ability to track people no matter their location or how they connected to the Web.

While tools such as Tor or VPNs can help, the problem is that once a person's IP address has been linked to a physical location their anonymity ceases to exist.

Given that governments often control the infrastructure being used, unmasking people in this fashion has only gotten easier over the last decade or so.

While it is true that criminals can be flagged and arrested using pinpointing techniques and lawful interception tools – and that's a good thing – normal citizens expressing their basic human rights are also targeted and arrested (including journalists), which is horrendous.

Not every government is guilty of such acts, but several of them are, and that's why it's important that people be empowered to speak freely and to do so anonymously if there's a need.

Enter ProxyHam, a tool created by Ben Caudill, a researcher for Rhino Security Labs, which can help human rights activists, whistleblowers, journalists, and privacy advocates remain anonymous online.

Designed to augment existing privacy tools, ProxyHam is a Raspberry Pi computer with Wi-Fi enabled. There are three antennas: one is used to connect to a public Wi-Fi network, and the other two are used to transmit the Wi-Fi signal over a 900 MHz frequency.

By using a 900 MHz radio, ProxyHam can connect to a Wi-Fi network up to two miles away and blend in with traffic on that spectrum. So if the person using it were to be tracked via IP address to a physical location, all anyone would find at that location is the ProxyHam box.

Caudill had planned a talk at DEF CON 23 centered on ProxyHam, which would've included a demonstration and the release of full hardware schematics, as well as source code. While everything needed to develop a device would've been offered, pre-configured units were planned for sale at a cost of $200.

On Friday, the talk was canceled. Caudill was vague in his responses to the public. Based on brief public remarks, it's clear that he cannot speak about the topic or explain further.

In fact, all he can say is that the talk is canceled, the ProxyHam source code and documentation will never be made public, and the ProxyHam units developed for Las Vegas have been destroyed. The banner at the top of the Rhino Security website promoting ProxyHam has gone away too. It's almost as if someone were trying to pretend the tool never existed.

Talks have been canceled at DEF CON before. So in that sense, this talk isn't the first and it won't be the last.

However, given the topic, the nature of the tool, and the current privacy climate – it's a strange coincidence that a tool with such value and usefulness would be promoted and then removed from the public.

I don't believe in coincidences.

Could Caudill have changed his mind? Yes, but that's unlikely: he was excited to release this tool, share the information with the public, and protect those most at risk for using their voice.

Therefore, while it is pure speculation on my part since no one can speak on record, it looks as if a higher power – namely the U.S. government – has put its foot down and killed this talk.

It isn't perfect, but a tool like ProxyHam – combined with Tor or other VPN services – would be powerful.

Such a combination would make tracking dissidents or whistleblowers (even with custom malware or tools from the likes of Hacking Team) increasingly difficult the more that ProxyHam was developed.

In fact, while the first version offered strong support to existing privacy tools, further developments were planned that would've not only improved things, but made them more affordable.

While the chance for abuse is a valid point to make – and one law enforcement certainly would make – criminal elements have abused VPNs and Tor before, so that's not a strong argument. Honestly, criminals have been twisting legitimate tools and resources for their own gain for quite some time now.

At the same time, that criminals could abuse the tool is the only argument a government needs to make.

When faced with legal threats, most researchers will bend because there's no other option available to them. No one wants to face fines and jail time over code.

Caudill isn't talking, and clearly he can't. Offering his apologies in an email when asked for comment, he responded to questions by repeating what was said on Twitter:

"...ProxyHam development would cease immediately, all existing units and prototypes destroyed, no further information or source code would be made available, and the DEF CON talk on whistleblowers and anonymity would be cancelled..."

Again, ProxyHam was under development for more than a year and Caudill was excited for it to go public. Now that's all gone, and there's nothing to suggest this was his intention.

Rather, given the state of things as they pertain to privacy and legal matters here in the U.S., it appears that his hand was forced – legally – complete with gags and destruction orders.

If a government agency killed this project, then it's a sad day for privacy and security research.

Salted Hash has reached out to DEF CON to see if they can offer any additional details.

Update:

Shortly after this story was posted, readers on Twitter pointed out that there doesn't appear to be an FCC license for Caudill listed. Salted Hash has reached out to confirm the status, because it is possible the FCC intervened on the talk for that reason, or because there were devices for sale. (Thanks: @t0x0pg & @Err0r10 )

Another reader noted that using the devices would violate FCC rules, but there have been other talks where such a conflict potentially exists, and those were not canceled.

Shortly after this update was posted, Caudill responded to questions about the FCC, stating that licensing had nothing to do with it. The 900 MHz licensing was something they were just starting to look at, but the ProxyHam devices were limited to 1 watt, as required by the FCC.

"Proxyham devices did not break the FCC standards as the 900MHz antennas were capped at the 1-watt limit," he said.

Update 2:

When asked about patents, and whether those held by Ubiquiti or Intel are related to the problems he currently faces, Caudill told Salted Hash that IP-related matters were not at issue.

"[There's] no IP related issues," he said. The answer was the same when it came to potential issues with the FCC.

The FCC question resurfaced because if encryption were used, it would violate FCC Part 97, which bars amateur radio operators from encrypting. There's also the issue of sales, which under FCC Part 95 (sub-one-watt consumer devices) requires validation – a slow and often expensive process.

Adding context, Michael Harris, Principal Security Analyst and Adjunct Instructor at the University of Missouri, commented via email:

"Many hams have experimented with IP over ham bands; lower frequencies have throughput issues, as one might expect, and gear up at the 1.2 GHz range is still too expensive. The current sweet spot is in the 800 to 900 MHz range, but it is saturated by many other services fighting for that space, from legacy cell phones to industrial controls doing short-haul data to many spread-spectrum and frequency-hopping commercial radios.

"That general frequency range is a really noisy place to be and a proliferation of ProxyHam devices in that range would cause lots of problems in whatever particular band was selected. There is a huge fight over that frequency range going on already not just here but worldwide."

So if patents were not a problem, and if the FCC wasn't a problem – as confirmed by Caudill himself – why was this tool forced out of the public's reach? We may never know.

There is another possible reason, one that I felt was too extreme when I first penned this rant: a National Security Letter.

If an NSL was issued, unless Caudill goes the way of Lavabit, he has little recourse and almost no defense against it. There have been cases where NSLs have been used inappropriately, but it's rare to actually see proof in such cases until long after the fact.

But again, this is pure speculation. The point of the rant was that people need privacy tools, and ProxyHam would have made a great addition to the existing mix, but now we'll never get it.

For the record, when I asked Caudill whether he had received an NSL, he would only answer, "No comment."

Update 3:

There was an AMA on Reddit about ProxyHam earlier this month, for those who don't know. Also, Rob Graham has posted his thoughts about the issue on the Errata Security blog.
http://www.csoonline.com/article/294...umstances.html





ProxyHam Conspiracy is Nonsense
Robert Graham

This DEF CON conspiracy theory concerns the "ProxyHam" talk, canceled under mysterious circumstances. It's nonsense.

The talk was hype to begin with. You can buy a 900 MHz bridge from Ubiquiti for $125 and attach it to a Raspberry Pi. How you'd do this is obvious. It's a good DEF CON talk, because it's the application that's important, but the technical principles here are extremely basic.

I don't know why the talk was canceled. One likely reason is that the stories (such as the one on Wired) sensationalized the thing, so maybe their employer got cold feet. Or maybe the FBI got scared and really did give them an NSL, though that's incredibly implausible.

Anyway, if DEF CON wants a talk on how to hook up a Raspberry Pi to a UbiQuiTi NanoStation LOCOM9 in order to bridge WiFi, I'll happily give that talk. It's just basic TCP/IP configuration, and if you want to get fancy, some VPN configuration for encryption. Just give me enough lead time to actually buy the equipment and test it out. Also, if DEF CON wants to actually set this up in order to get long-distance WiFi working to other hotels, I'll happily buy a couple units and set them up this way.


Update: An even funner talk, which I've long wanted to do, is to do the same thing with cell phones. Take a cellphone, pull it apart, disconnect the antenna, then connect it to a highly directional antenna pointed at a distant cell tower -- several cells away. You'd then be physically nowhere near where the cell tower thinks you are. I don't know enough about how to block signals in other directions, though -- radio waves are hard.
http://blog.erratasec.com/2015/07/pr...l#.VaQ9bflVhBc





Britain's 'Vital' Emergency Surveillance Law Ruled Unlawful
Michael Holden

Britain has been given nine months to produce new surveillance legislation it says is vital to national security after London's High Court ruled on Friday that emergency measures rushed through parliament last year were unlawful.

The court backed a judicial challenge from two prominent lawmakers and other campaigners that powers which compelled telecoms firms to retain customer data for a year were inconsistent with European Union laws.

Prime Minister David Cameron had said the measures were vital to protect the country, which is on high alert because of the threat posed by Islamic State militants and from Britons who have travelled to Iraq and Syria to fight with them.

"The court has recognised what was clear to many last year, that the government’s hasty and ill thought through legislation is fatally flawed," said lawmaker David Davis, a long-time campaigner against state intrusion who was defeated by Cameron in the race to become Conservative Party leader in 2005.

Tom Watson, currently standing to be the deputy leader of the opposition Labour Party, added: "The government was warned that rushing through important security legislation would end up with botched law."

Last year's legislation was fast-tracked after the European Court of Justice threw out an EU directive requiring companies to retain data for 12 months.

Cameron had argued that scrapping this could deprive police and intelligence agencies of access to information about who customers contacted by phone, text or email, and where and when, which forms a crucial part of most investigations.

He said the emergency law would not provide new powers but only enshrine existing capabilities.

However, the High Court ruled the law should be "disapplied", saying it did not provide precise rules to ensure data was only obtained to prevent or detect serious crimes, and also failed to ensure there was judicial or independent oversight over the access to the data.

It gave the government until March 2016 to come up with a replacement.

"We disagree absolutely with this judgement and will seek an appeal," said Security Minister John Hayes.

Cameron's government, fresh from winning an election in May, has already promised new legislation to extend the powers of police and spies to monitor communications and web activities so they can keep up with technological changes.

Friday's decision also comes after three major inquiries concluded that British spies were not knowingly carrying out illegal mass surveillance following disclosures by former U.S. security agency contractor Edward Snowden.

(Editing by Stephen Addison)
http://uk.reuters.com/article/2015/0...0PR15H20150717

















Until next week,

- js.



















Current Week In Review





Recent WiRs -

July 11th, July 4th, June 27th, June 20th

Jack Spratts' Week In Review is published every Friday. Submit letters, articles, press releases, comments, questions etc. in plain text English to jackspratts (at) lycos (dot) com. Submission deadlines are Thursdays @ 1400 UTC. Please include contact info. The right to publish all remarks is reserved.


"The First Amendment rests on the assumption that the widest possible dissemination of information from diverse and antagonistic sources is essential to the welfare of the public."
- Hugo Black