Old 10-02-16, 08:15 AM   #1
JackSpratts
 
Join Date: May 2001
Location: New England
Posts: 10,013
Peer-To-Peer News - The Week In Review - February 13th, '16

Since 2002

"In one fell swoop, a network has been created that likely has a greater level of access to science than any individual university, or even government for that matter, anywhere in the world." – Simon Oxenham

February 13th, 2016




BitTorrent Launches First P2P Live Streaming App on Apple TV
Janko Roettgers

BitTorrent Inc. just released a new video streaming app called OTT News for Apple TV that promises live video coverage of the New Hampshire Primary with a team of journalists hired by the company for this very occasion. But the bigger story isn’t about Trump or Bernie, for once: OTT News uses peer-to-peer technology, instead of a traditional content delivery network, to stream live video to consumers — which may just be a first for Apple’s living room streamer.

BitTorrent released the app with little fanfare for tvOS and iOS Tuesday, as well as for Android earlier this week. That’s because the app is very much a test case for the company, which has been working on P2P-powered live video streaming in varying capacities for more than seven years.

BitTorrent Live, as the initiative was initially called, was originally the brainchild of BitTorrent inventor Bram Cohen, who early on realized that live streaming depended on very low latencies, something that existing P2P protocols couldn’t deliver. That’s why Cohen rewrote the P2P technology for BitTorrent Live from scratch.

BitTorrent started testing live streaming for desktop PCs in 2011, and invited publishers on its platform in 2012. But the technology required users to download a separate app in order to watch live streams in their browser — too big a hurdle for most consumers. BitTorrent eventually shut down its existing efforts in 2014 to instead focus on mobile live streaming.

OTT News is the first app to officially make use of this new approach, but the company is already looking for additional partners to white-label its technology for their own apps. A blog post published Monday promises that “a solution for all live news broadcasts and live streaming from events will be available this year.”
https://variety.com/2016/digital/new...nt-1201701129/





Keybase, a Revolutionary Encrypted File Sharing Platform
Gautham

Keybase is a new encrypted cloud storage and file sharing application different from all the other services out there. Digital storage has moved from good old floppy disks, CDs, DVDs and magnetic hard disks to cloud storage. There are a lot of players out there offering cloud-based file storage services. Some of the big names in cloud-based storage include Google (with Google Drive), Microsoft (OneDrive), Apple (iCloud), Dropbox, Box and others.

Using cloud storage services comes with its own advantages in terms of accessibility and flexibility around storage space requirements. On the downside, there are many security and privacy concerns associated with these services. There have been multiple instances where hackers have accessed files stored on these accounts, the best-known being the iCloud celebrity-photo leaks popularly known as ‘Celebgate’ or ‘The Fappening’. There are also apprehensions about how these cloud storage services use consumer data and even the files stored on their servers.

Keybase Secure File Sharing Solution

Keybase is offering a solution that addresses these concerns. The free and secure file sharing service is slated to offer encrypted storage built on public-key cryptography. Keybase was founded by Chris Coyne and Maxwell Krohn, and it offered encrypted messaging services before branching out into encrypted storage and file sharing. Currently in its alpha stage, Keybase offers users 10 GB of free space. Users can create two kinds of folders to store files – public and private. The cryptographically secure file mount allows users to create public, signed directories at the location /keybase/public/yourname. All files stored inside this directory can be readily shared with anyone. Similarly, private files can be stored in a directory named /keybase/private/yourname. Keybase generates cryptographic public keys for users’ files, which are stored in a Merkle tree.

Keybase Directories

The files saved on Keybase are automatically signed, and anyone accessing the shared files will see them as text files (except for images). Static content shared in a public folder can be accessed by anyone by navigating to keybase.pub/public/yourname. Private folders, by contrast, can be accessed only by the people the file owner has chosen to share them with, and only after the owner grants access to that particular directory for a specific user. Sharing private files requires creating and populating a folder using the following structure – /keybase/private/yourname,johndoe (this private folder is shared with John Doe). Users can also share files with people identified by their social media accounts by appending an ‘@’ symbol and the platform’s name to the person’s username.
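
To make the folder conventions concrete, here is a minimal Python sketch that assembles KBFS-style paths following the structure described above. The helper functions and usernames are purely illustrative assumptions, not part of any Keybase client API.

    # Illustrative only: builds Keybase-style folder paths following the
    # conventions described above. Usernames are hypothetical examples.

    def public_folder(user):
        # World-readable, signed folder: /keybase/public/<user>
        return f"/keybase/public/{user}"

    def private_folder(*users):
        # End-to-end encrypted folder shared by the listed users:
        # /keybase/private/<user1>,<user2>,...
        return f"/keybase/private/{','.join(users)}"

    print(public_folder("yourname"))                      # /keybase/public/yourname
    print(private_folder("yourname"))                     # /keybase/private/yourname
    print(private_folder("yourname", "johndoe"))          # shared with John Doe
    print(private_folder("yourname", "johndoe@twitter"))  # sharing via a social account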

Keybase does not sync all the files to shared devices, but works as an on-demand streaming service. Only when a file is requested does the application pull it for the user (provided he or she has the required permissions). While public files are not encrypted, private directories enjoy end-to-end encryption using NaCl device-specific keys. The Keybase servers do not save a copy of the user’s private key, which means even the servers have no way to peek into the content saved in private folders. Currently, one has to create a paper key for the NaCl key associated with the user account; in the near future, however, the company plans to release applications that generate key pairs automatically for the Keybase service.
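
For a rough picture of the kind of per-device public-key encryption described here, the sketch below uses the PyNaCl library to encrypt a message from one device keypair to another. It is a generic NaCl box example under assumed device keys, not Keybase's actual implementation.

    # Generic NaCl public-key encryption sketch using PyNaCl (pip install pynacl).
    # Illustrates the idea of device-specific keypairs; NOT Keybase's actual code.
    from nacl.public import PrivateKey, Box

    # Each device holds its own keypair; only public keys ever leave the device.
    alice_device = PrivateKey.generate()
    bob_device = PrivateKey.generate()

    # Encrypt a private file's contents from Alice's device to Bob's device.
    sending_box = Box(alice_device, bob_device.public_key)
    ciphertext = sending_box.encrypt(b"contents of /keybase/private/alice,bob/notes.txt")

    # Bob's device decrypts with its own private key and Alice's public key.
    receiving_box = Box(bob_device, alice_device.public_key)
    plaintext = receiving_box.decrypt(ciphertext)
    assert plaintext.startswith(b"contents")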

Keybase and Blockchain

Keybase makes use of the Bitcoin blockchain for record keeping and version control purposes. All key addition and removal data published on the Merkle tree are hashed into the blockchain. By hashing the information on the blockchain, the platform indefinitely preserves signing data while protecting it from any network attacks that may compromise it.
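
The general technique is easy to sketch: compute a Merkle root over the key-change records and publish a hash of it somewhere tamper-evident, such as a blockchain transaction. The toy Python below shows a simple binary Merkle tree over made-up records; the record format and hashing details are assumptions for illustration, not Keybase's actual data structures.

    # Toy illustration of anchoring key-change records with a Merkle root.
    # Record format and hashing details are invented; not Keybase's actual scheme.
    import hashlib

    def sha256(data):
        return hashlib.sha256(data).hexdigest()

    def merkle_root(leaves):
        # Hash each record, then combine pairwise until a single root remains.
        level = [sha256(leaf.encode()) for leaf in leaves]
        while len(level) > 1:
            if len(level) % 2:                  # duplicate the last node on odd levels
                level.append(level[-1])
            level = [sha256((level[i] + level[i + 1]).encode())
                     for i in range(0, len(level), 2)]
        return level[0]

    key_events = [
        "add-key:alice:device1:2016-02-01",
        "add-key:alice:device2:2016-02-03",
        "revoke-key:alice:device1:2016-02-07",
    ]

    root = merkle_root(key_events)
    # In the scheme described above, this root (or a hash of it) would be embedded
    # in a Bitcoin transaction, so later tampering with the history is detectable.
    print("Merkle root to anchor:", root)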

Keybase is on its way to changing the way cloud storage services work by offering a secure, private and more efficient service that is free for up to 10 GB of usage. The platform is currently available as an invite-only alpha build. Invites can be obtained by requesting one on the website.
http://www.newsbtc.com/2016/02/07/ke...-file-sharing/





Anti-Piracy Group BREIN Demands Torrents Time Cease and Desist
Emil Protalinski

Not even a week has gone by since Torrents Time appeared on the scene, and the site has already been served with a cease-and-desist letter. Anti-piracy group BREIN, based in the Netherlands, has deemed the streaming tool an “illegal application” and demands the administrators “cease and desist the distribution of Torrents Time immediately.”

Torrents Time provides an embedded torrent client that lets users download and play the files inside torrents with one click. There is no need to download and install a separate BitTorrent client, download and open the torrent, or go in and actually play the downloaded video file. After you install the plugin, everything happens in the browser. Torrent sites, including the notorious Pirate Bay, have very quickly adopted the solution.

Here is the crux of BREIN’s complaint:

Through your website Torrents Time (https://torrents-time.com/) you are distributing the illegal application ‘Torrents Time’ that structurally and systematically facilitates, enables and participates in the making available of infringing content without the authorization of the respective copyright and neighbouring rights holders. ‘Torrents Time’ is enabling the illegal distribution of popular titles of films and of TV-Series that are published by the rights holders represented by BREIN and which have not been licensed for distribution through your system.

The group argues that because Torrents Time is hosted on servers located in the Netherlands, the country’s law applies to Torrents Time, which BREIN further describes as “primarily engaged in the facilitating, enabling and participating in the making available of infringing files.” The key word here is “primarily” as of course Torrents Time, like any BitTorrent client, can be used to obtain torrents that do not contain pirated content. BREIN adds that the tool “causes extensive damage to rights holders for which you – and all others involved with the management of the site – are liable.”

In addition to the cease and desist, the anti-piracy group also requests “the geographical address of yourself and the entity or other person(s) who are responsible for the distribution of Torrents Time.” The reason? So that BREIN can hold Torrents Time liable “in case you continue your unlawful activities.” If Torrents Time does not comply, BREIN will request that the site’s host take down the site and provide it with the administrators’ names and addresses. Oh, and BREIN will hold Torrents Time “liable for all (further) incurred costs including legal fees.”

Torrents Time isn’t playing ball. The administrators have responded with their own letter, and the tone is very dismissive. Here is the introduction:

From the outset, please be informed that my clients deny all the suppositions and assumptions in your letter, including the fact that Brein represents right holders and that you are qualified to take action on behalf of an un-named un-identified entities. That having been said, your allegations as to the legal nature of my Clients’ are certainly denied as frivolous and without substantiation. In your letter, you take the liberty of accusing my Clients of distributing an “illegal application.” We deny that allegation, as being un-substantiated, false and illegal in itself, under the laws of the Netherlands.

Torrents Time is countering with the simple fact that its tool can be used to watch legally obtained videos. Indeed, the Torrents Time website in itself does not let users stream pirated content.

The response letter further goes on to point out that no court has ever ruled that Torrents Time breaches copyright or neighboring rights, because the tool was released just a few days ago and “was carefully crafted not to do anything whatsoever so as to breach copyright or neighboring rights.” The lawyer also states, “I am confident that the outcome of a court proceeding against my Clients’ Torrent Time will end with a ruling against anyone who challenges the legality of Torrents Time.”

The letter then attempts to go on the offensive:

You are therefore advised to seriously re-think your cease and desist demand and advise my Clients that you withdraw your demands. You are also hereby warned not to attempt to take action against any third party who utilizes Torrents Time or hosts it or co-operates therewith in any other manner. Failing to comply with my demands herein will prove itself as enormously costly to your organization and its members and could lead to criminal proceedings against yourself, on the grounds of illegal threats and extortion, the consequences of which I’m sure you are very well aware of.

In the circumstances, and in order not to incur un-necessary legal bills, I advised my Clients to send you a draft of my letter, unsigned, so as to stop the fight before it becomes unstoppable. Please do not take this gesture as a sign of weakness but as a good faith action. We are, like yourself, professionals very well versed with the subject matter at hands, for more than 3 decades. We hope that you are a member of the legitimately acting legal society and not a mob thug.


Torrents Time is growing quickly, and the administrators are eager to provide commentary. Indeed, this is how we found out about the BREIN cease and desist.

“We launched last week and in a few days changed forever the way people use the treasures found through torrents sites, directly from their browser,” Torrents Time told VentureBeat. “It means that from today on, any user who is able to use Facebook can enjoy almost any movie or TV show that was created in this world. Torrents Time is revolutionizing the world of torrents, here and now. Because it’s a revolution, you can expect a bloodshed, like the fate of all revolutions. We already managed to get a cease and desist letter from BREIN.”

It would appear Torrents Time is almost proud to receive the cease-and-desist letter. The administrators were certainly expecting one, though maybe not in the same week they released the tool. Their whole demeanor is very confident — this is a fight they believe is an inevitable win.

“But because it’s the people’s revolution, a network of hundreds of millions of people who wish to consume Free Content, the people will prevail and the illegal harassment by the film and TV producers industry who claim that p2p ruins their business model will be defeated,” the administrators added. “With p2p we make the world smaller, a people’s village where all the neighbors can watch together the staff they like. Freely! Like they watch YouTube together or share content on Facebook.”

And again they reiterate that this is not about piracy: “Torrents Time is not a pirate’s tool. It’s cool and it’s legal. We are certain it will improve the world.”

BREIN and other groups naturally disagree and will be doing everything in their power to stop Torrents Time from taking off. But the streaming solution is already out there, and even if they could, wiping the Torrents Time team off the face of the Earth wouldn’t help.

You can’t put the genie back in the bottle.

BREIN’s letter and the response from Torrents Time are embedded below:
http://venturebeat.com/2016/02/08/an...se-and-desist/





PayPal Blocks VPN, SmartDNS Provider’s Payments Over Alleged Copyright Violations

Did the US online payments giant just declare war on VPNs?
Glyn Moody

PayPal has stopped accepting payments for Canadian outfit UnoTelly—a provider of VPN and SmartDNS services—because these might be used to facilitate copyright infringement.

UnoTelly said in an update on its website that Paypal had "severed payment processing agreement unilaterally and without prior warning." It added: "Paypal indicated that UnoTelly is not allowed to provide services that enable open and unrestricted Internet access."

Ars sought comment from PayPal on this story; however, the company had not got back to us by the time of publication. We'll update this story if the online payments giant does get in touch.

UnoTelly told its customers that it had no control over PayPal's decision, and apologised for the inconvenience.

PayPal wrote in an email, seen by TorrentFreak: "Under the PayPal Acceptable Use Policy, PayPal may not be used to send or receive payments for items that infringe or violate any copyright, trademark, right of publicity or privacy, or any other proprietary right under the laws of any jurisdiction."

UnoTelly told the blog: "We are disappointed at PayPal’s unilateral action and the way it acted without prior warning. We provide both DNS resolution and secure VPN services. Our services are network relays that connect people around the world."

However, its website also points out that its SmartDNS service "removes geo-blocks imposed by streaming sites and allows you to watch geo-restricted channels regardless of where you live" — a method that copyright holders have been fighting against.

The larger problem is that all VPN services can be used to circumvent geo-blocking, which would arguably fall foul of PayPal's ban on anything that "avoids, bypasses, removes, deactivates or impairs a technological measure without the authority of the copyright owner." In the past, PayPal has cracked down on BitTorrent sites, Usenet providers, and file-hosting services. So VPNs might be next.

PayPal is widely used around the world—it currently claims 179 million users—so the withdrawal of its services can have serious negative consequences for a company or a non-profit organisation. Sometimes that has been done intentionally, as in the case of WikiLeaks, but frequently it is the result of an error, for example when the funds of crowdfunded projects like MailPile and ProtonMail were blocked.

The rise of Bitcoin and other digital currencies suggests that alternative mass-market payment systems may evolve at some point. However, the fact that UnoTelly—in the immediate future, at least—will not be offering any replacement for PayPal, other than credit cards, highlights the extent to which the world of online payments remains something of a monoculture, which can be a tough market for all sorts of different providers.
http://arstechnica.co.uk/tech-policy...ht-violations/





Researcher Illegally Shares Millions of Science Papers Free Online to Spread Knowledge

Welcome to the Pirate Bay of science.
Fiona MacDonald

A researcher in Russia has made more than 48 million journal articles - almost every single peer-reviewed paper ever published - freely available online. And she's now refusing to shut the site down, despite a court injunction and a lawsuit from Elsevier, one of the world's biggest publishers.

For those of you who aren't already using it, the site in question is Sci-Hub, and it's sort of like a Pirate Bay of the science world. It was established in 2011 by neuroscientist Alexandra Elbakyan, who was frustrated that she couldn't afford to access the articles needed for her research, and it's since gone viral, with hundreds of thousands of papers being downloaded daily. But at the end of last year, the site was ordered to be taken down by a New York district court - a ruling that Elbakyan has decided to fight, triggering a debate over who really owns science.

"Payment of $32 is just insane when you need to skim or read tens or hundreds of these papers to do research. I obtained these papers by pirating them," Elbakyan told Torrent Freak last year. "Everyone should have access to knowledge regardless of their income or affiliation. And that’s absolutely legal."

If it sounds like a modern day Robin Hood struggle, that's because it kinda is. But in this story, it's not just the poor who don't have access to scientific papers - journal subscriptions have become so expensive that leading universities such as Harvard and Cornell have admitted they can no longer afford them. Researchers have also taken a stand - with 15,000 scientists vowing to boycott publisher Elsevier in part for its excessive paywall fees.

Don't get us wrong, journal publishers have also done a whole lot of good - they've encouraged better research thanks to peer review, and before the Internet, they were crucial to the dissemination of knowledge.

But in recent years, more and more people are beginning to question whether they're still helping the progress of science. In fact, in some cases, the 'publish or perish' mentality is creating more problems than solutions, with a growing number of predatory publishers now charging researchers to have their work published - often without any proper peer review process or even editing.

"They feel pressured to do this," Elbakyan wrote in an open letter to the New York judge last year. "If a researcher wants to be recognised, make a career - he or she needs to have publications in such journals."

That's where Sci-Hub comes into the picture. The site works in two stages. First, when you search for a paper, Sci-Hub tries to immediately download it from fellow pirate database LibGen. If that doesn't work, Sci-Hub is able to bypass journal paywalls thanks to a range of access keys that have been donated by anonymous academics (thank you, science spies).

This means that Sci-Hub can instantly access any paper published by the big guys, including JSTOR, Springer, Sage, and Elsevier, and deliver it to you for free within seconds. The site then automatically sends a copy of that paper to LibGen, to help share the love.
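
Stripped to its logic, that two-stage flow is a cache-then-fetch fallback. The Python sketch below is purely hypothetical and mirrors the description above; the function names, the archive object and the access keys are invented for illustration and are not Sci-Hub's actual code.

    # Hypothetical sketch of the two-stage workflow described above.
    # All names, keys and helpers are invented for illustration only.

    def download_via_institution(doi, access_key):
        # Stand-in for fetching a paywalled PDF with a donated institutional login.
        # A real implementation would perform an authenticated HTTP request here.
        return f"<pdf bytes for {doi} via {access_key}>"

    def fetch_paper(doi, archive, access_keys):
        # Stage 1: check the LibGen-style archive first (it acts as a cache).
        if doi in archive:
            return archive[doi]

        # Stage 2: fall back to publisher sites using donated institutional access,
        # then deposit a copy back into the archive for the next requester.
        for key in access_keys:
            paper = download_via_institution(doi, key)
            if paper:
                archive[doi] = paper
                return paper
        return None

    archive = {}
    print(fetch_paper("10.1000/example.doi", archive, ["donated-key-1"]))
    print(fetch_paper("10.1000/example.doi", archive, []))  # now served from the archive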

It's an ingenious system, as Simon Oxenham explains for Big Think:

"In one fell swoop, a network has been created that likely has a greater level of access to science than any individual university, or even government for that matter, anywhere in the world. Sci-Hub represents the sum of countless different universities' institutional access - literally a world of knowledge."

That's all well and good for us users, but understandably, the big publishers are pissed off. Last year, a New York court delivered an injunction against Sci-Hub, making its domain unavailable (something Elbakyan dodged by switching to a new location), and the site is also being sued by Elsevier for "irreparable harm" - a case that experts predict could win Elsevier between $750 and $150,000 for each pirated article. Even at the lowest estimations, that would quickly add up to millions in damages.

But Elbakyan is not only standing her ground, she's come out swinging, claiming that it's Elsevier that has the illegal business model.

"I think Elsevier’s business model is itself illegal," she told Torrent Freak, referring to article 27 of the UN Declaration of Human Rights, which states that "everyone has the right freely to participate in the cultural life of the community, to enjoy the arts and to share in scientific advancement and its benefits".

She also explains that the academic publishing situation is different to the music or film industry, where pirating is ripping off creators. "All papers on their website are written by researchers, and researchers do not receive money from what Elsevier collects. That is very different from the music or movie industry, where creators receive money from each copy sold," she said.

Elbakyan hopes that the lawsuit will set a precedent, and make it very clear to the scientific world either way who owns their ideas.

"If Elsevier manages to shut down our projects or force them into the darknet, that will demonstrate an important idea: that the public does not have the right to knowledge," she said. "We have to win over Elsevier and other publishers and show that what these commercial companies are doing is fundamentally wrong."

To be fair, Elbakyan is somewhat protected by the fact that she's in Russia and doesn't have any US assets, so even if Elsevier wins their lawsuit, it's going to be pretty hard for them to get the money.

Still, it's a bold move, and we're pretty interested to see how this fight turns out - because if there's one thing the world needs more of, it's scientific knowledge. In the meantime, Sci-Hub is still up and accessible for anyone who wants to use it, and Elbakyan has no plans to change that anytime soon.
http://www.sciencealert.com/this-wom...pen-up-science





20 Years Ago Today: The Most Important Law On The Internet Was Signed, Almost By Accident
Mike Masnick

The internet as we know it would be a very, very different place if 20 years ago today, President Clinton hadn't signed the Communications Decency Act. To be fair, nearly all of the CDA was a horrible mess that was actually a terrible idea for the internet. A key part of the bill was about "cleaning up" pornography on the internet. However, to "balance" that out, the bill included Section 230 -- added by two Congressmen in the House of Representatives: Ron Wyden and Chris Cox. They had pushed this clause as a separate bill, the Internet Freedom and Family Empowerment Act, but it didn't get enough traction. It was only when they attached it to the Communications Decency Act (which had passed the Senate without it), that it was able to move forward. And thus, 20 years ago today, when President Clinton signed the CDA, most of the attention was on the "stopping indecency" part, and very little on the "throw in" of Section 230. And yet, there's a strong argument that Section 230 may be one of the most important laws -- perhaps the most important -- passed in the past few decades.

As you hopefully already know, a year later, in Reno v. ACLU, the Supreme Court tossed out basically all of the CDA as unconstitutional. The only tidbit of the law that remained valid? You guessed it: Section 230. And, of course, it became the key law in enabling the internet to grow the way it did. It's been said in the past, fairly accurately, that no law contributed more to the growth of the internet than CDA 230, and that's because of a fairly simple and straightforward principle. CDA 230 simply said that an internet service is not liable for actions of its users. This meant that new websites and internet services didn't need to carefully monitor and track everything that every user did to make sure it wasn't violating a law. That meant the legal risks and liability for creating services that allowed the public to create all kinds of content went way down.

Without a robust Section 230, it's difficult to see many of the most popular platforms today existing. It's no surprise that soon after CDA 230 we saw the rise of blogging and social media -- and almost always coming from American companies. Both would be significantly more difficult without Section 230's protections. In fact, much of the push for Section 230 came in response to a horrible court case, Stratton Oakmont v. Prodigy, in which an internet bulletin board commenter attacked financial firm Stratton Oakmont, and its president, for apparently being involved in criminal and fraudulent activity. Stratton Oakmont -- now perhaps well known as the firm portrayed as doing all sorts of criminal and fraudulent things in the movie The Wolf of Wall Street -- sued Prodigy for the comment and won. The liability from such a ruling scared numerous online platforms, in particular because a key part of the ruling was that because Prodigy posted "guidelines" and removed posts with offensive language, it suddenly became a "publisher" of the content, and was liable for that content.

A key, and often overlooked, part of Section 230 is that it actually does encourage sites to take proactive measures to filter content, by noting that any kind of moderation or guidelines absolutely does not remove the protections of Section 230. As such, sites get to decide for themselves whether or not to moderate their content in any way, without facing the legal risk of suddenly being declared the publisher. Other countries have no such protections, leading to some dangerous rulings, and creating something akin to a "right to be forgotten" in some instances.

There have been numerous cases testing Section 230 over the years -- and the law has remained strong and in place -- though it is still being challenged to this day. The biggest and most important case was Zeran v. AOL, the first case testing Section 230, in which the court found that Section 230 was a powerful tool that kept sites from being held responsible for content posted by users.

Section 230 has been powerful in so many ways. It has both enabled and protected free speech online by letting companies set up platforms where people can speak openly. Without it, the internet would be much more limited as a platform for communicating to the public. As the 4th Circuit noted in its ruling in the Zeran case:

The amount of information communicated via interactive computer services is therefore staggering. The specter of tort liability in an area of such prolific speech would have an obvious chilling effect. It would be impossible for service providers to screen each of their millions of postings for possible problems. Faced with potential liability for each message republished by their services, interactive computer service providers might choose to severely restrict the number and type of messages posted.

It has protected privacy, by making it clear that websites have no duty to monitor and track their users in order to avoid liability. It has created incentives to build tremendous economic value, by making it clear that companies could be formed to enable public communications, such as blogging, forums and social media -- without being sued into bankruptcy over misuse. And it has enabled better moderation of platforms, since choosing how to moderate certain content doesn't force them to give up the law's protections.

It is difficult to express just how important Section 230 has been over the past 20 years other than to say that, without it, it's unlikely that you would be able to comment on Techdirt today. It's also unlikely that you'd have tools like Twitter or Facebook or Yelp or AirBnb. Any service that relies on public input owes a huge debt to Section 230, and it's quite incredible that it was basically included as an "add-on" that very few noticed when it was signed.

So, as we're hanging out here on the internet today, in a place that is alive only because of Section 230, please thank (now Senator) Ron Wyden in particular for his role in creating Section 230, and pay attention, because there are very powerful forces working right now to undermine Section 230 entirely. It's been a key driver of free expression and economic growth for the past 20 years, and it would be a shame to undermine that now.
https://www.techdirt.com/articles/20...accident.shtml





Dallas Buyers Club Abandons Fight Against Aussie Pirates
Hannah Francis

It's a happy day for Aussie pirates: The Hollywood studio behind the film Dallas Buyers Club has abandoned its fight to extract huge sums of cash from alleged copyright infringers.

Dallas Buyers Club LLC had until midday Thursday to lodge a second appeal against an August Federal Court decision which effectively prevented it from engaging in so-called "speculative invoicing" in Australia.

Michael Bradley of Marque Lawyers, who represented the studio in the case, confirmed to Fairfax Media the studio had not appealed the decision by the deadline.

He said while his client was disappointed by the final outcome of the case, it was not the end of the war against piracy in Australia.

"The problem isn't going away; Australia is still one of the most prominent jurisdictions for infringement, and rights holders will continue to feel that they're losing a lot of money, so I expect they'll continue to look for ways of deterring that behaviour," Mr Bradley said.

"The infringement is so large scale and the financial losses involved are so big, I don't think that's the end of the story."

He said the case was "very technical" and centred around access to information, with no bearing on the actual underlying legal issues of infringement and damages.

"Presumably infringers will be happy about the result; whether it will influence their behaviour [in regard to continuing to pirate], I have no idea," Mr Bradley said.

However, Graham Phillips of Thomson Geer lawyers, who led the defence by iiNet and other internet service providers whose customers' details were at the centre of the case, said the outcome effectively defeated the speculative invoicing business model in Australia.

Dallas Buyers Club LLC's application ultimately failed because the studio overreached, he said.

"The demands they wanted to make were excessive, unsupported by the evidence they collected," Mr Phillips said.

Mr Phillips praised the ISPs for defending their customers' privacy, singling out iiNet's outspoken former chief regulatory officer Steve Dalby for leading the charge.

"The case is a great legacy for Steve Dalby ... who was keen to protect his customers from DBC's unfair speculative invoicing practice," Mr Phillips said.

Rights holders could succeed where Dallas Buyers Club had failed in obtaining details of alleged pirates, if they were able to prove in court their claims for damages would be reasonable and within the law, he said.

Federal Court Justice Nye Perram granted Dallas Buyers Club access to the names and addresses of the alleged pirates back in April, but put a temporary stay on access until the studio could prove to the court it would not threaten and pursue individuals for excessive amounts of money.

Justice Perram rejected "several versions" of the studio's proposed correspondence with individuals before deciding in August to lift the stay but impose strict conditions on access. These included that the studio only seek damages from individuals for the cost of obtaining the film plus some out-of-pocket expenses, and that it forfeit a $600,000 bond if the terms were breached. The restrictions effectively made any further action from the studio against individual pirates prohibitively expensive.

In September, Dallas Buyers Club LLC appealed the decision and sought access to the contact details of only 472 alleged pirates with a $60,000 bond – 10 per cent of the original amount – plus the right to seek further compensation. Justice Perram rejected the request in December, ordering the case to be thrown out on February 11 unless further action was taken.
http://www.theage.com.au/technology/...0160210-gmr37y





Warner Music Pays $14 Million to End 'Happy Birthday' Copyright Lawsuit

The music publisher will also not stand in the way for a judge to declare the song to be in the public domain.
Eriq Gardner

Sing the song, blow out the candles, eat the cake and unwrap the gifts.

According to a court filing on Monday, music publisher Warner/Chappell will pay $14 million to end a lawsuit challenging its hold on the English language's most popular song, "Happy Birthday to You." Additionally, the settlement stipulates a proposed final judgment and order that would declare the song to be in the public domain. A memorandum in support of the settlement sings the praises of the deal as "truly, an historic result." U.S. District Judge George H. King will have to sign off on it.

The revelation of the settlement terms comes after King came to the conclusion this past September that Warner and its predecessor didn't hold any valid copyright to the song and never acquired the rights to the "Happy Birthday" lyrics. At the time, the judge stopped short of declaring that the song was in the public domain, and just before a trial was set to begin in December exploring the history of a song dating back to a 19th century schoolteacher named Patty Smith Hill and her sister Mildred Hill, the sides reached an agreement.

Warners was expecting to have "Happy Birthday" under copyright until 2030. An IP valuation expert retained by the plaintiffs estimated that the song would reap between $14 million and $16.5 million over the next 15 years.

"The judicial determination that 'Happy Birthday' is in the public domain also has substantial value," states the memorandum in support of the settlement. "Because Defendants have charged for use of the Song, untold thousands of people chose not to use the Song in their own performances and artistic works or to perform the Song in public. This has limited the number of times the Song was performed and used. After the Settlement is approved, that restraint will be removed and the Song will be performed and used far more often than it has been in the past. While there is no way to make a reliable estimate of the increase that will result, there can be no dispute that the increase will be substantial."

An agreement to have a judge declare the song in the public domain is no doubt unusual and will likely command some attention by the judge on review.

But for now, the settlement provides a big final act to the class action lawsuit brought by film director Jennifer Nelson, who was making a documentary about "Happy Birthday" and was asked to pay a $1,500 license fee. She sued to hinder Warners from ever forcing film and TV producers, or others, to pay again. The plaintiffs argued that a song appearing in early 20th-century children's textbooks had to be in the public domain because of general publication, abandonment or the length of the copyright term.

By agreeing to the settlement, Warners avoids going to trial to determine whether it should be punished for collecting licensing money for many decades. The music publisher also forgoes an appeal that it teased. The defendant continues to believe that a 1935 copyright registration should have entitled it to a presumption of copyright validity and that the song isn't in the public domain, but it has agreed to a judgment that states otherwise.

The plaintiffs were represented by attorneys led by Mark Rifkin, who according to the settlement terms will be seeking a $4.62 million fee, a third of the $14 million settlement fund. The rest would go to those who have paid to license "Happy Birthday" and meet the definition of the proposed class. Those folks are estimated to have spent more than $50 million on licensing fees on "Happy Birthday" over the years.

Last week, in announcing its quarterly earnings, Warner Music Group partly blamed an operating loss on expenses related to the "Happy Birthday" settlement. A hearing on the settlement is scheduled in March.
https://www.hollywoodreporter.com/th...million-863120





New Ways Into the Brain’s ‘Music Room’
Natalie Angier

Whether to enliven a commute, relax in the evening or drown out the buzz of a neighbor’s recreational drone, Americans listen to music nearly four hours a day. In international surveys, people consistently rank music as one of life’s supreme sources of pleasure and emotional power. We marry to music, graduate to music, mourn to music. Every culture ever studied has been found to make music, and among the oldest artistic objects known are slender flutes carved from mammoth bone some 43,000 years ago — 24,000 years before the cave paintings of Lascaux.

Given the antiquity, universality and deep popularity of music, many researchers had long assumed that the human brain must be equipped with some sort of music room, a distinctive piece of cortical architecture dedicated to detecting and interpreting the dulcet signals of song. Yet for years, scientists failed to find any clear evidence of a music-specific domain through conventional brain-scanning technology, and the quest to understand the neural basis of a quintessential human passion foundered.

Now researchers at the Massachusetts Institute of Technology have devised a radical new approach to brain imaging that reveals what past studies had missed. By mathematically analyzing scans of the auditory cortex and grouping clusters of brain cells with similar activation patterns, the scientists have identified neural pathways that react almost exclusively to the sound of music — any music. It may be Bach, bluegrass, hip-hop, big band, sitar or Julie Andrews. A listener may relish the sampled genre or revile it. No matter. When a musical passage is played, a distinct set of neurons tucked inside a furrow of a listener’s auditory cortex will fire in response.

Other sounds, by contrast — a dog barking, a car skidding, a toilet flushing — leave the musical circuits unmoved.

Nancy Kanwisher and Josh H. McDermott, professors of neuroscience at M.I.T., and their postdoctoral colleague Sam Norman-Haignere reported their results in the journal Neuron. The findings offer researchers a new tool for exploring the contours of human musicality.

“Why do we have music?” Dr. Kanwisher said in an interview. “Why do we enjoy it so much and want to dance when we hear it? How early in development can we see this sensitivity to music, and is it tunable with experience? These are the really cool first-order questions we can begin to address.”

Dr. McDermott said the new method could be used to computationally dissect any scans from a functional magnetic resonance imaging device, or F.M.R.I. — the trendy workhorse of contemporary neuroscience — and so may end up divulging other hidden gems of cortical specialization. As proof of principle, the researchers showed that their analytical protocol had detected a second neural pathway in the brain for which scientists already had evidence — this one tuned to the sounds of human speech.

Importantly, the M.I.T. team demonstrated that the speech and music circuits are in different parts of the brain’s sprawling auditory cortex, where all sound signals are interpreted, and that each is largely deaf to the other’s sonic cues, although there is some overlap when it comes to responding to songs with lyrics.

The new paper “takes a very innovative approach and is of great importance,” said Josef Rauschecker, director of the Laboratory of Integrative Neuroscience and Cognition at Georgetown University. “The idea that the brain gives specialized treatment to music recognition, that it regards music as fundamental a category as speech, is very exciting to me.”

In fact, Dr. Rauschecker said, music sensitivity may be more fundamental to the human brain than is speech perception. “There are theories that music is older than speech or language,” he said. “Some even argue that speech evolved from music.”

And though the survival value that music held for our ancestors may not be as immediately obvious as the power to recognize words, Dr. Rauschecker added, “music works as a group cohesive. Music-making with other people in your tribe is a very ancient, human thing to do.”

Elizabeth Hellmuth Margulis, the director of the Music Cognition Lab at the University of Arkansas, said that when previous neuroscientists failed to find any anatomically distinct music center in the brain, they came up with any number of rationales to explain the results.

“The story was, oh, what’s special about music perception is how it recruits areas from all over the brain, how it draws on the motor system, speech circuitry, social understanding, and brings it all together,” she said. Some researchers dismissed music as “auditory cheesecake,” a pastime that co-opted other essential communicative urges. “This paper says, no, when you peer below the cruder level seen with some methodologies, you find very specific circuitry that responds to music over speech.”

Dr. Kanwisher’s lab is widely recognized for its pioneering work on human vision, and the discovery that key portions of the visual cortex are primed to instantly recognize a few highly meaningful objects in the environment, like faces and human body parts. The researchers wondered if the auditory system might be similarly organized to make sense of the soundscape through a categorical screen. If so, what would the salient categories be? What are the aural equivalents of a human face or a human leg — sounds or sound elements so essential the brain assigns a bit of gray matter to the task of detecting them?

To address the question, Dr. McDermott, a former club and radio disc jockey, and Dr. Norman-Haignere, an accomplished classical guitarist, began gathering a library of everyday sounds — music, speech, laughter, weeping, whispering, tires squealing, flags flapping, dishes clattering, flames crackling, wind chimes tinkling. Wherever they went, they asked for suggestions. Had they missed anything?

They put the lengthy list up for a vote on the Amazon Mechanical Turk crowdsourcing service to determine which of their candidate sounds were most easily recognized and frequently heard. That mass survey yielded a set of 165 distinctive and readily identifiable sound clips of two seconds each. The researchers then scanned the brains of 10 volunteers (none of them musicians) as they listened to multiple rounds of the 165 sound clips.

Focusing on the brain’s auditory region — located, appropriately enough, in the temporal lobes right above the ears — the scientists analyzed voxels, or three-dimensional pixels, of the images mathematically to detect similar patterns of neuronal excitement or quietude.

“The strength of our method is that it’s hypothesis-neutral,” Dr. McDermott said. “We just present a bunch of sounds and let the data do the talking.”

The computations generated six basic response patterns, six ways the brain categorized incoming noise. But what did the categories correspond to? Matching sound clips to activation patterns, the researchers determined that four of the patterns were linked to general physical properties of sound, like pitch and frequency. The fifth traced the brain’s perception of speech, and for the sixth the data turned operatic, disclosing a neuronal hot spot in the major crevice, or sulcus, of the auditory cortex that attended to every music clip the researchers had played.
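
The study's actual analysis is a custom voxel decomposition, but the general shape of the computation (factoring a sounds-by-voxels response matrix into a small number of shared components) can be sketched with off-the-shelf tools. The toy below runs scikit-learn's non-negative matrix factorization on random data purely to show that shape; the data, the choice of NMF and the dimensions are all assumptions, not the authors' method.

    # Toy sketch: factor a stimulus-by-voxel response matrix into a few components.
    # Random data and scikit-learn's NMF are used purely for illustration;
    # this is NOT the study's actual decomposition method.
    import numpy as np
    from sklearn.decomposition import NMF

    rng = np.random.default_rng(0)
    n_sounds, n_voxels, n_components = 165, 1000, 6

    # Pretend fMRI responses: one row per sound clip, one column per voxel.
    responses = rng.random((n_sounds, n_voxels))

    model = NMF(n_components=n_components, init="random", random_state=0, max_iter=500)
    sound_weights = model.fit_transform(responses)  # how strongly each sound drives each component
    voxel_weights = model.components_               # how strongly each component loads on each voxel

    print(sound_weights.shape)  # (165, 6): six response profiles across the sound set
    print(voxel_weights.shape)  # (6, 1000): where each profile lives across voxels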

“The sound of a solo drummer, whistling, pop songs, rap, almost everything that has a musical quality to it, melodic or rhythmic, would activate it,” Dr. Norman-Haignere said. “That’s one reason the result surprised us. The signals of speech are so much more homogeneous.”

The researchers have yet to determine exactly which acoustic features of music stimulate its dedicated pathway. The relative constancy of a musical note’s pitch? Its harmonic overlays? Even saying what music is can be tricky.

“It’s difficult to come up with a dictionary definition,” Dr. McDermott said. “I tend to think music is best defined by example.”

Justice Potter Stewart of the Supreme Court likewise said of pornography that he knew it when he saw it. Maybe music is a kind of cheesecake after all.
http://www.nytimes.com/2016/02/09/sc...usic-room.html





Music Can’t Last Forever, Not Even on the Internet
Klint Finley

Recorded music was once incredibly fragile. Before the days of digital music, an independent band might press only a few thousand, or even a few hundred copies of a vinyl record. Those albums only became more rare over the years as copies were scratched, broken, or thrown out. Likewise, master recordings could be damaged or lost, making the record difficult or impossible to reissue.

But today, thanks to the wonders of digitization, recordings can be backed up and saved indefinitely. When a formerly obscure band hits it big, fans can instantly find their early work, without having to hunt it down in used record stores or waiting for a reissue, thanks to streaming music services.

The trouble is that, even as music has become more durable, it has—paradoxically—also become more ephemeral. Your physical records don’t evaporate if the store you bought them from closes shop or the record label that published them goes out of business. If a streaming music company goes under, a stockpile of important cultural artifacts could go with it.

Fears that exactly this could happen erupted this week when a financial statement from popular audio hosting site SoundCloud surfaced online. The company, which has become a vital resource for independent musicians and podcasters, lost $44.19 million in 2014 even as it increased revenue to $15.37 million, according to the regulatory document filed with the UK government. The revelation led to immediate speculation that SoundCloud could go offline, taking with it the 110 million audio tracks it hosts.

It wouldn’t be the first time a massive trove of digital music disappeared. In 2003, CNET shut down the original version of music publishing service mp3.com, which once hosted 750,000 song files. Three years later, the Internet Underground Music Archive, consisting of over 680,000 songs, went offline as well.

Threat to the Underground

SoundCloud says those fears are overblown. The financial document in question is now more than a year old, and it doesn’t reflect the $77 million in funding the company secured last year. “We’re focusing on enabling creators to get paid for their creativity,” a spokesperson said in a provided statement. “And on building a financially sustainable platform that our community can enjoy for years to come.”

It’s too early to write the obituary for SoundCloud, but it isn’t the only audio service struggling to make ends meet. Last November, Rdio confirmed that it would file for bankruptcy and sell its assets to Pandora.

Now Pandora itself is rumored to be for sale. Spotify, meanwhile, may be looking for a $500 million cash infusion. Losing SoundCloud, however, would be a bigger cultural blow than losing another of the major streaming sites.

While those services all host a similar catalog of music, SoundCloud has become home to countless unsigned musicians and independent broadcasters. Although unsigned artists can upload their music to services like Apple Music and Spotify, they generally have to pay a middleman like CD Baby or Tunecore to do so. SoundCloud enables musicians to publish their work directly and for free, without the need for a lengthy approval process. Meanwhile, more established artists can use it to preview tracks or connect directly with fans.

“If the service goes the way of Grooveshark, it won’t just be underground artists like Plastician that lose their access to a wealth of undiscovered talent,” FACT Magazine wrote last year of the company. “It’ll be the majors losing their access to the next generation of hitmakers too.”

Saved From Oblivion

Fortunately there are alternatives to SoundCloud, such as Bandcamp, which a spokesperson told us has been profitable since 2012, and YouTube, which has become an increasingly important part of Google’s overall strategy. But SoundCloud users would have to re-upload all of their work—if they even still have copies of it. Much of what lives on SoundCloud today would likely vanish forever.

Of course, someone could end up hosting backups of the site. An organization called the Archive Team has dedicated itself to preserving the web, and has already managed to save many sites from oblivion–sometimes years after the fact. For example, in 2012 the Internet Underground Music Archive made a surprise comeback when the organization uploaded backups of the original site to the Internet Archive. Archive Team founder Jason Scott says the group is already looking at SoundCloud, though there are several other sites the team plans to archive first.

But history tells us that today’s most financially stable companies and organizations could become tomorrow’s nostalgic memories. Even the venerable Internet Archive could one day disappear. That’s why the Archive Team started a project to back it up. Perhaps something like the Interplanetary File System, which aims to create a more distributed way of storing data online, could one day make the web more resilient to the ebbs and flows of corporate dollars. But until then we must remember that nothing lasts forever, not even on the Internet.
http://www.wired.com/2016/02/music-c...-the-internet/





Surprise, Surprise: The RIAA Gets Streaming All Wrong
Emma Grey Ellis

Platinum status has been pretty elusive lately, mostly because it’s hard to sell a million albums to people who aren’t buying albums anymore. When the Recording Industry Association of America tweaked its gold and platinum certification process last week, though, seemingly everyone became a platinum artist.

To get with the times, the RIAA now counts audio and visual streaming numbers toward an artist’s total album sales. Grab a pencil: now 1,500 streams = 10 track downloads = one album purchase. Due to the change, a ton of artists saw their albums go gold or platinum overnight: Kendrick Lamar, Big Sean, Shawn Mendes, Miranda Lambert, Hozier, and more crossed into the million-copy club. (Don’t worry, if your favorite artist isn’t on the list, all you have to do is stream their tracks 1.5 billion times!)
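
Under that ratio, certification math is a small unit conversion. The snippet below is just an illustrative calculator built from the figures quoted in this article (1,500 streams = 10 track downloads = 1 album); the example sales numbers are made up, and the 500,000 and 1,000,000 unit thresholds are the usual gold and platinum marks.

    # Illustrative album-equivalent calculator based on the ratio quoted above
    # (1,500 streams = 10 track downloads = 1 album). Example numbers are made up.

    def album_units(albums_sold, track_downloads, streams):
        return albums_sold + track_downloads / 10 + streams / 1500

    GOLD, PLATINUM = 500_000, 1_000_000

    units = album_units(albums_sold=300_000, track_downloads=2_000_000, streams=750_000_000)
    print(f"{units:,.0f} album-equivalent units")  # 1,000,000
    print("platinum" if units >= PLATINUM else "gold" if units >= GOLD else "neither")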

The good news is you and your Spotify habit may no longer be an obstacle to an artist’s success. The bad news, though, is the new system sets up a world in which streaming is both default and devalued. Rather than choose between two sides—placating artists by coming out against streaming, or embracing streaming as the will of the people—the RIAA has chosen to stake out a middle ground that’s a solution for neither, and an affront to both.

Precious Metals

The music industry has generally responded positively to the change, though Anthony “Top Dawg” Tiffith, CEO of Top Dawg Entertainment, tweeted that he (and presumably Kendrick Lamar) aren’t even acknowledging To Pimp a Butterfly as platinum until actual money changes hands.

we don’t stand behind this @RIAA bs. ole skool rules apply, 1 million albums sold is platinum.until we reach that #, save all the congrats.

— dangeroo kipawaa TDE (@dangerookipawaa) February 1, 2016

On the other end of the spectrum is Nicki Minaj, who maintains that her streaming numbers should have earned The Pinkprint triple platinum by now, and if her Instagram posts are to be believed, she’s willing to go to court to get that fixed.

There’s no question that going platinum is important for artists. In an email to WIRED, Liz Kennedy Holman, director of the RIAA’s Gold & Platinum Program, likened it to artists “earning their entry into an exclusive club.” But we all know that this is about more than qualifying to own a shiny plaque. “The natural way of things is that it’s a good promotional tool,” said Jay Cooper, a Los Angeles lawyer who has represented artists like Katy Perry. “The fact that it’s gone platinum means that the public accepts it. And if people see that the public accepts it, more people are going to say, ‘Hey, maybe I should buy that.'”

So, given that the RIAA is—or tries to be—a consumption-driven barometer of public opinion, it’s appropriate for them to update their process to reflect current habits. And Holman was quick to draw our attention to this “organic tweet” from Iggy Azalea:

@sighzach no. thats how music is being consumed, the world did it. RIAA is just reacting to how ppl consume music.

— IGGY AZALEA (@IGGYAZALEA) February 5, 2016


But more significantly, this rules change is also a move toward legitimizing streaming.

Islands in the Stream

Obviously, we’ve all been streaming for a while. But despite Spotify having brought the business model to the US nearly five years ago, artists have long complained (Looking at you, T-Swift. And you, David Lowery.) that they’re not seeing enough buck for their streaming bang, which still remains a concern.

“The real question is, will [the rules change] have any impact on the flow of royalties to artists,” says attorney Lisa Alter, who specializes in music copyright law at New York firm Alter, Kendrick & Baron. “Hopefully, it will focus attention on how much music is being streamed, and how the money really flows … I’ve been involved in record label audits where it was difficult for even the most experienced auditors to say what and why artists are being paid.”

It’s no real mystery why TDE isn’t feeling the RIAA’s change of tune: the old school was far more lucrative for artists. That monetary difference between physical album sales (or even downloads) and streaming is the only reason to oppose it, and TDE has been roundly called out for its stance.

“The Kendrick Lamars, the Taylor Swifts … those people can afford to say no,” says Kenneth Abdo, an entertainment lawyer whose Minnesota-based practice deals with emerging artists who don’t have that luxury. Indie musicians cling to streaming despite the fact that it doesn’t pay the bills, in hopes of being discovered and breaking big. “It’s a Catch-22,” Abdo says.

This RIAA rules change hasn’t made the situation less ambiguous, either. Certainly figuring streaming data into one of the most recognized metrics of success is a nod in streaming’s favor, a sign of acceptance that it’s not going anywhere. But this is where we have to call shenanigans. The RIAA’s ratio (1,500 streams = 10 track downloads = 1 album purchase) is a clear indication they think streaming is a lesser form of listening—and that’s where the trouble lies.

Second-Class Streams

If you don’t subscribe to a streaming service, you have to pay to listen to an album even once. Can you imagine if you had to listen to an album 100 or 150 times through (roughly what 1,500 streams works out to for a 10-to-15-track album) for it to count as officially purchased? That’s pretty unrealistic, and not only because it’s a 60-hour commitment.

If the RIAA’s claim to relevance here is that it is rewarding the music we appreciate by measuring how much we consume it, it should follow that one physical album listen = one streaming listen. (Though, if we’re going to really make this shift, streaming services need to be paying songwriters accordingly. Just ask Aloe Blacc. Or Lowery.)

What they have now relegates streaming to a second tier. Which ultimately might be fine and even correct. After all, there’s another kind of populist, ad-supported music service that the RIAA has never taken into account: the radio. The radio isn’t a platform artists have traditionally expected to make millions from. Like streaming, it feels social and trend-driven. It’s not as serious a commitment as buying an album. And there have probably always been people who enjoy hearing songs on the radio (or choosing them on the jukebox) but don’t then go on to buy the album.

So it’s hard to say if the situation streaming has created for the music industry is totally new, or just another version of the status quo. But the RIAA had the opportunity to throw its weight around here, and instead pulled its punch. Either streaming is the new platform to rule them all—and considering how we’re shifting from an ownership-driven culture to an access-driven culture, that seems about right—or it’s the new radio. What it can’t be is both, and that’s where the RIAA decision leaves it.
http://www.wired.com/2016/02/riaa-streaming-platinum/





TV Producers May Start Making You Wait for New Shows Online
Anick Jesdanun

The Golden Age of Online Television may be in peril.

Streaming TV has gotten popular as several online services such as Netflix make past seasons of TV shows available for binge-watching, while Hulu offers episodes from the current season.

Now, some television companies are balking at giving them timely access to shows.

The big worry: Making streaming TV too pleasant might encourage viewers to cut back or drop their cable service. Cable and satellite companies now pay TV networks billions of dollars a year to carry their channels. In turn, TV production companies make a lot from licensing fees paid by the networks.

Cord-cutting could jeopardize all of those arrangements, and the audience and ad revenue boost from the Internet might not be enough to make up for any revenue losses from traditional TV.

Time Warner Inc., which has both networks and a production business, has been exploring the possibility of holding back some of its DC Comics superhero shows such as "The Flash" and "Supergirl." If it took that step, viewers might have to wait years to watch the most recent episodes online; now, they're typically available no more than a year after airing.

Hulu may be next. The Wall Street Journal recently reported that Time Warner is in talks to invest in Hulu and has told Hulu's owners that it wants to curtail current-season TV episodes, which Hulu now makes available as early as the next day.

It's not yet clear what such restrictions might look like. Time Warner and Hulu didn't respond to requests for comment, though Time Warner may reveal more of its digital strategy when it reports quarterly earnings Wednesday.

The tremors emanating from Time Warner are just the latest instance of established media companies looking to protect their established partners and deals, whether viewers like it or not.

Hulu already has pulled back in recent years. When it launched nearly a decade ago, the service offered most shows from its network parents the morning after they aired. These days, many Fox and ABC shows require a Hulu or cable subscription for next-day viewing. Otherwise, viewers have to wait eight days — or a month in the case of Fox's "So You Think You Can Dance." And for Fox, Hulu now has just the past four episodes — not five — for free.

All online services, meanwhile, have been dabbling in creating their own television shows. Netflix won accolades for such original shows as "House of Cards," while Hulu commissioned a fourth season of "The Mindy Project" when Fox canceled it. Original shows help services set themselves apart from each other and could ease the impact of any pullback in traditional TV shows.

But for now, these services are mostly about giving viewers a chance to catch up on what's been shown on traditional TV — and giving viewers less of a reason to tune in.

"To a certain degree, you can't put the genie back in the bottle," said Anthony DiClemente, an analyst at Nomura Securities. "Once people are accustomed to it, that's going to be the expectation."

Traditional cable and satellite TV services have been in slow decline for years, in part because younger audiences aren't signing up and are turning to online options instead. The decline isn't big enough to threaten cable and television companies immediately, but they are taking notice. And the numbers hide the fact that many subscribers are opting for cheaper packages with fewer channels.

"What we're seeing is a constant game of tug-o-war," said John Buffone, an analyst at the NPD Group.

As services like Netflix and Hulu boom, he said, television companies are looking for ways they can hold onto more of those streaming revenues themselves.

The changes are especially noticeable at Hulu, which is owned by parents of the very television networks — Fox, ABC and NBC — threatened by changes in the way we watch TV. Hulu has set itself apart by offering new TV episodes faster than its rivals; making viewers wait longer could limit its appeal.

"Hulu's DNA has been recent episodes of TV shows," said Glenn Hower, an analyst at the research firm Parks Associates.

The apparent anxiety at television companies is common to any industry that's faced what Harvard business professor Clayton Christensen calls "The Innovator's Dilemma." That's when established companies find their big, lucrative businesses undercut by innovative rivals with cheaper — and, at least at first, less profitable — alternatives. The big companies can't embrace the new approaches without helping cannibalize their own cash cows.

Viewed through that prism, Hulu has been a mostly successful half-measure. Its biggest accomplishment may have been to help accustom people to paying for TV over the Internet. Given that it was established when piracy was rampant, that's no small feat.

Though Hulu still offers ad-supported shows you can watch for free, more viewers are paying at least $8 a month for viewing on mobile and streaming-TV devices and for full current seasons of some shows. As of last April, the last time Hulu disclosed figures, it had 9 million paying subscribers, a 50 percent increase from the previous year. Netflix, meanwhile, had 45 million U.S. streaming subscribers at the end of 2015, a 14 percent increase.

Yet the traditional TV business remains sizable — and will remain so for years. Even with a 1 percent drop over the past year, 98.3 million U.S. households subscribe to a cable or satellite TV service, research firm MoffettNathanson estimates.

As a result, expanded streaming deals might not make up for what TV companies might lose if that big business continues to shrink. The fees that cable and satellite companies pay television networks and stations to carry their channels are estimated at $60 billion this year, up 6 percent from 2015, according to media research firm SNL Kagan.

From that standpoint, Hower said, it makes sense to cling to older, well-established partners — the cable companies.

___

AP Business Writer Ryan Nakashima in Los Angeles and Tali Arbel in New York contributed to this report.
http://www.newstimes.com/business/te...ew-6815289.php





CBS Says Super Bowl 50 Broke Streaming Records With 3.96 Million Unique Viewers
Sarah Perez

CBS reported earlier this morning that its live stream of Super Bowl 50 broke all prior streaming records for the big game. Now the network has released numbers to back up that claim: according to an announcement this afternoon, 3.96 million unique viewers tuned in to watch the Broncos versus the Panthers across laptops, desktops, tablets, connected TV devices and mobile phones.

In addition, CBS said that viewers consumed more than 402 million total minutes of coverage, watching for more than 101 minutes each on average. During the game, viewers consumed more than 315 million minutes of coverage, with an average minute audience of 1.4 million.

To put those figures in historical context, NBC last year said more than 1.3 million people watched the Super Bowl through its web stream, which was then the highest number of concurrent users to date.

In addition, NBC’s live stream had also set records for average viewers per minute (800,000) and total minutes (213 million), per Adobe Analytics data cited by the network in its release at the time.

Prior to that, Fox’s 2014 Super Bowl live stream had peaked at around 1.1 million viewers, and averaged 528,000 viewers per minute.

It’s no surprise that the streaming figures are continuing to grow every year. Not only are more viewers than ever watching television via the web and other devices, the networks themselves are making it easier for viewers to find the game, no matter what platform the end user prefers.

For example, this year CBS made the live stream available on its CBSSports.com website on PCs and tablets, via its CBS Sports app on iPad, Android and Windows 10 tablets, and on connected TVs via Apple TV (3rd and 4th generation devices), Roku and Roku TV models, Google’s Chromecast, Microsoft’s Xbox One, Amazon Fire TV and select Android TV devices from Sony, Sharp and Philips.

Not counted in CBS’ figures was ESPN’s airing of the game in Spanish on its ESPN Deportes channel. However, Verizon’s exclusive mobile stream via the Verizon Go90 and NFL Mobile apps was included, CBS tells us. (Disclosure: TechCrunch parent AOL is owned by Verizon.)

With the diversity of platforms where the stream was available, it’s difficult to make an accurate apples-to-apples comparison between this year’s numbers and last year’s. What we do know is that as more ways to live stream become available, the viewers will come.

Elsewhere

In addition to CBS’s live stream, other notable numbers related to the big game were also released today.

For example, YouTube said that people spent 300,000 hours watching Super Bowl ads and teaser videos on its service during the game, and overall it saw nearly 4 million hours of ads and teasers watched so far. The ads and teasers have been watched over 330 million times, with 60 percent of that coming from mobile devices, YouTube also said, adding that’s the first time the majority of views have happened on mobile.

Meanwhile, on social media, the Super Bowl continued to dominate the evening’s chatter, as usual. Twitter didn’t have overall numbers to share at time of publication, but noted that it exceeded 3.9 million tweets during the Halftime Show.

Nielsen also noted the game was the most tweeted event to date.

Facebook reported today that 60 million people were discussing the game on the social network, with 200 million posts, comments and likes. However, this year’s event was only the second-highest level of conversation Facebook has measured for any Super Bowl to date. Last year’s matchup between the New England Patriots and Seattle Seahawks still holds the record on the platform.

Facebook-owned Instagram did well, too, saying that 38 million people had 155 million interactions on its service related to Super Bowl 50.
http://techcrunch.com/2016/02/08/cbs...llion-viewers/





VideoLAN Celebrates 15 Years of VLC, the World’s Most Used Media Player
Tech Geek

VideoLAN announced today that VLC has turned 15 years old. Technically the project is almost five years older, but this is the anniversary of VLC being released under the GPL license, which happened on February 1st, 2001.

Under the GPL license it is now worked on by many contributors across the world, but prior to that it was developed by students at Ecole Centrale Paris. Jean-Baptiste Kempf, the president of VideoLAN, states:

If you’ve been to one of my talks, (if you haven’t, you should come to one), you know that the project that became VideoLAN and VLC, is almost 5 years older than that, and was called “Network 2000”.

Moreover, the first commit on the “VideoLAN Client” project is from August 8th 1999, by Michel Kaempf, and had 21275 lines of code already, so the VLC software was started earlier in 1999.

However, the most important date for the birth of VLC is when it was allowed to be used outside of the school, and therefore when the project was GPL-ized: February 1st, 2001.


VLC is the darling of the internet thanks to its ability to play almost any video format on most of the popular platforms, and on some decidedly unpopular ones. It has been ported to Windows, Mac, GNU/Linux, iOS, Windows RT, Android, Android TV, Apple TV, Solaris, BSD, OS/2, Tizen and Chrome OS.

VideoLAN and its contributors are now working hard on bringing us VLC 3.0, which will unify most of the mobile ports and add more GPU decoding, better adaptive streaming and Chromecast integration.
http://techrampage.com/software/vide...ia-player-1369





Netflix Black Market Sees Passwords Selling for Just 25 Cents
Trevor Mogg

It’s long been known that hackers are nabbing and selling Netflix passwords, but a new report this week from security firm Symantec suggests the problem is growing following the streaming site’s recent international expansion to 130 new regions.

For hackers, the expanding membership base of Netflix, which is now available in a total of 190 regions globally, means there are more opportunities than ever to steal and sell on passwords.

While the cost of a subscription for the streaming service already seems pretty reasonable when you look at the (legal) alternatives, the rise of the black market in Netflix passwords shows some people are willing to pay a lot less even if it means breaking the law.

According to Symantec, hackers grab passwords mainly through phishing attacks where a Netflix user is tricked into hitting a malicious link in an email or website that leads them unknowingly to a fake login page for the service. Malware is also being used to harvest account information, the California-based security firm said.

It also reveals that some cybercriminals are selling Netflix passwords on the dark Web for as little as 25 cents a pop. An ad lifted from the Web by Symantec shows a password vendor offering a minimum purchase of four accounts for a total of $1, adding that it has 300,000 passwords in stock. Its “terms of service” instructs customers not to change any account details as this would obviously alert the genuine subscriber to unauthorized activity.

Assuming the account details are indeed left untouched by the intruder, as a legitimate user you could still notice that your account’s been compromised if your “recently watched” list says you’ve already steamed through the entire season of Making a Murderer when you know darn well you haven’t (though why haven’t you?).

The video-streaming service now has 75 million users worldwide, a figure that indicates there’s plenty of potential for the black market in stolen Netflix passwords to expand and go on operating.

If you think your Netflix account has been receiving an unwelcome visitor (or visitors), be sure to check out DT’s easy fix here.
http://www.digitaltrends.com/home-th...-black-market/





Treasurer Introduces 'Netflix Tax' for GST on Digital Products to Parliament
AAP

Digital products such as books, movies, games, apps and e-books purchased by Australians from overseas could soon attract the GST.

Treasurer Scott Morrison introduced draft laws to Parliament on Wednesday seeking to level the playing field between Australian and overseas business.

A software subscription service provided by an Australian company now attracts the GST, but one based overseas does not.

If the legislation clears Parliament, overseas companies will have to collect and remit GST on products purchased by Australian consumers.

The so-called "Netflix tax" is expected to raise $350 million over four years from July 2017, with funds going to the states and territories.

"It ensures Australian businesses selling digital products and services are not disadvantaged relative to overseas businesses that sell equivalent products in Australia," Mr Morrison told Parliament.

The legislation is in line with OECD guidelines, which recommend consumption be taxed in the destination country of imported digital goods and services.

Mr Morrison noted the model has been implemented recently by the European Union and other countries, such as Japan and New Zealand, are developing similar rules.
http://www.smh.com.au/federal-politi...0160210-gmq88u





US Senate Passes Bill Making Internet Tax Ban Permanent
Chris Morran

Nearly two decades ago, Congress passed the first Internet Tax Freedom Act, establishing that — with a handful of grandfathered exceptions — local, state, and federal governments couldn’t impose taxes on Internet access. Problem is, that law has had to be renewed over and over, each time with an expiration date. But today, the U.S. Senate finally passed a piece of legislation that would make the tax ban permanent.

Recent attempts to make this ban permanent — by simply striking out the current end date on the latest renewal of the Tax Freedom Act — had stalled in both the House and Senate. And so, as we reported in December, the language was simply tacked on to HR 644, the Trade Facilitation and Trade Enforcement Act of 2015, a piece of legislation authorizing funding for Customs and Border Protection that was deemed likely to pass. Today, it passed the Senate with a 75-20 vote.

“Internet Tax Freedom saves Oregonians and most Americans hundreds of dollars a year in taxes,” writes Oregon Sen. Ron Wyden, a co-author of the original Tax Freedom Act and an outspoken proponent of the permanent ban. “There’s no ban on wireless taxes and Americans pay an average 17 percent (!) tax on their mobile service.”

In addition to removing the end date on the existing tax ban, the revised law puts an end date of June 30, 2020 for those few states that are still collecting sales tax on Internet services.

When the original law was passed, more than a dozen states were allowed to continue charging taxes for online access. In the intervening years, many have opted to drop these taxes. Currently, only Hawaii, New Mexico, North Dakota, Ohio, South Dakota, Texas, and Wisconsin still collect taxes for going online.

What the legislation does not resolve is the hotly contested issue of taxes for online purchases. For example, states are currently allowed to collect sales tax from an online retailer if it has a physical presence in that state. It’s a wrestling match that has resulted in a patchwork of settlements and arrangements around the country, with many larger e-tailers like Amazon collecting taxes in some states but not in others. The retail industry has repeatedly pushed for a federal law that would be more precise about exactly when states can collect sales tax, but to no avail.
http://consumerist.com/2016/02/11/se...ban-permanent/





Net Neutrality Again Puts F.C.C. General Counsel at Center Stage
Cecilia Kang

Every day for one month last fall, Jonathan Sallet, the general counsel at the Federal Communications Commission, sneaked into a small, windowless office at the agency, its location undisclosed except to senior staff.

From 6 a.m. until early evening, with Bach streaming in the background, he worked mostly alone, marking up stacks of law books and standing in front of a lectern. His job: Defend in court the F.C.C.’s most contentious policy — rules to classify broadband Internet providers as utilities, widely called net neutrality.

“I did nothing for one month but prepare,” Mr. Sallet said in an interview. “I talked a lot to the wall.”

His arguments, though — like nearly all of his actions for the agency — have had far-reaching reverberations.

In his two years as the F.C.C.’s general counsel, Mr. Sallet, 63, has taken center stage in some of the most divisive debates in Washington. He helped shape and then defend the net neutrality law. His input helped kill the Comcast-Time Warner Cable merger last year. In recent days, the cable industry has closely tracked his thinking about a merger between Charter and Time Warner Cable, concerned about a similar result.

“Typically a general counsel is like an administrator,” said Reed Hundt, a former Democratic chairman of the F.C.C. “But in Jon you have an administrator who is also a policy maven and political strategist.”

Mr. Sallet will draw more of the spotlight in coming months, a period that could shape the tech and cable landscape for years to come. The agency is set to take a position on the Charter and Time Warner Cable deal, as well as vote to open the market of set-top cable boxes to new competitors. And the decision in the net neutrality case, the one he prepared so long and hard for, is also expected.

Each weekday morning, Mr. Sallet is among a small cadre of top advisers who meet at 8:45 to discuss policy and strategy with Tom Wheeler, the chairman of the F.C.C. Mr. Wheeler, a former lobbyist for the cable and wireless industries, knows firsthand how the agency’s actions can be derailed by the pressure of lobbying by the powerful businesses he used to represent.

“He is a multifaceted talent with the ability to see all angles of an issue and provide sage counsel,” Mr. Wheeler said of Mr. Sallet.

In the recent interview in his office, where the slim and soft-spoken Mr. Sallet keeps a keyboard and a guitar, he said swift changes in how people communicate and consume media have thrust the agency — and him — into the spotlight.

For decades, Mr. Sallet has been an inside player in telecom policies. After graduating from law school at the University of Virginia and working as a clerk for Justice Lewis F. Powell, he spent 30 years representing companies like MCI and working on tech policy at the Commerce Department.

While at the Commerce Department, he was introduced to the Mosaic web browser by Vice President Al Gore. That was in 1993, the same year Mr. Wheeler showed him a demonstration of a wireless phone.

These days, Mr. Sallet said, policies related to smartphones and the Internet get such strong reactions that they require more than a broad view of economics and markets. They also require an ability to navigate pointed political criticism.

That climate makes sticking to a set of principles all the more important, he said. For him, those ideas are that networks should be open for consumers and that the F.C.C. should “take actions to avoid a society of haves and have-nots.”

Net neutrality “involves competition issues, the mergers are quite clearly about competition,” he said.

Mr. Sallet’s stance has not slowed criticism. The F.C.C. recently proposed an overhaul of the TV set-top box market, a move that could open a new area of business for tech giants like Google and Amazon. Cable and media companies are fighting the plan, saying it would erode the core of their business.

In addition, some academics criticized the agency’s decision last spring to block Comcast’s merger with Time Warner Cable, after a review led by Mr. Sallet. Critics said the companies did not compete in the same markets, so a merger would not have reduced the number of options for consumers.

The review of Charter’s acquisition of Time Warner Cable is expected to yield a decision this spring. Though analysts are generally confident it will be approved, the agency is likely to attach several requirements that ensure the merged company doesn’t use its size to keep programmers from offering content to streaming providers.

The harshest criticisms have been for Mr. Sallet’s role in advising and writing up net neutrality policy — with technology and telecom companies arguing that Mr. Sallet, a Democrat, has made too many decisions based on politics.

In October 2014, Mr. Wheeler, also a Democrat, introduced rules that could have let cable and telecom firms carve the Internet into various high-quality and low-quality tiers, according to some legal experts and net neutrality advocates. Top officials at the agency were reluctant to create stronger rules that ban such practices by regulating Internet service providers like utilities.

Mr. Sallet had talked in public about a desire for a middle ground “hybrid” rule that prevented broadband providers from blocking or unfairly slowing down traffic but didn’t expand the agency’s regulation over the industry. That middle ground was quickly attacked by many net neutrality advocates, drawing millions of comments of protest, many of them inspired by a late-night monologue by the comedian John Oliver.

The pressure intensified the next month, when President Obama took the unusual step of saying that broadband service providers should be treated like utilities, subject to rules akin to those placed on phone services. Soon after, Mr. Wheeler directed Mr. Sallet to rewrite the draft rules to more aggressively regulate the Internet service providers.

“Ultimately Jon is the lawyer and Wheeler is his client,” said Robert D. Atkinson, president of the Information Technology and Innovation Foundation.

To complicate the situation, the rules were almost guaranteed to face strong legal challenges. Telecom and cable companies immediately sued to overturn the regulations, and the case was handed to the United States Court of Appeals for the District of Columbia, the same court that had struck down net neutrality rules twice before.

Mr. Sallet made his arguments defending the latest rules in December, winning plaudits from all sides for his work in the courtroom. A decision is expected this spring. Other legal challenges to the policy are likely.

Mr. Sallet and his supporters said his positions, and those of the F.C.C., should not come as a surprise. Mr. Sallet and top advisers have talked frequently about the agency’s push to keep big telecom and cable firms — which control the video and broadband pipes into homes — from stifling new technologies.

There are even echoes of this philosophy in a speech he helped write in 1994 for Mr. Gore. In the speech, at the University of California, Los Angeles, Mr. Gore called for an overhaul of rules for phone and cable communications to encourage the development of the coming Internet economy.

“We look back now and see that was incredibly prescient,” Mr. Sallet said.
http://www.nytimes.com/2016/02/08/te...ter-stage.html





India Introduces Net Neutrality Rules Barring Facebook's Free Internet
Sankalp Phartiyal and Himank Sharma

India introduced new rules on Monday to prevent Internet service providers from having different pricing policies for accessing different parts of the web, in a setback to Facebook Inc's plan to roll out a pared-back free Internet service to the masses.

The new rules by the regulator came after a two-month consultation process that saw Facebook launch a big advertising campaign in support of its Free Basics program, which runs in more than 35 developing countries around the world.

The program offers pared-down Internet services on mobile phones, along with access to the company's own social network and messaging services, without charge.

The service, earlier known as internet.org, has also run into trouble in some other countries which have accused it of infringing the principle of net neutrality - the concept that all websites and data on the Internet are treated equally.

Critics and Internet activists argue that allowing access to a select few apps and web services for free would put small content providers and start-ups that don't participate in it at a disadvantage.

On Monday, the Telecom Regulatory Authority of India (TRAI), which had suspended the free Facebook service pending a policy decision, said Internet service providers would not be allowed to discriminate on pricing for different web services.

"Essentially everything on the internet is agnostic in the sense that it cannot be priced differently," TRAI chairman Ram Sevak Sharma told a news conference.

Facebook did not immediately respond to Reuters' request for comment.

Although the new rules will also have implications for Indian telecom operators' plans to make money from rapidly surging web traffic through differential pricing, Facebook's campaign turned the spotlight on the social networking giant.

Free Basics is part of the U.S.-based social media firm's ambition to expand in its largest market outside the United States. Only 252 million out of India's 1.3 billion people have Internet access.

"We are delighted by the regulator's recognition of the irreversible damage that stands to be done to the open Internet by allowing differential pricing," said Mishi Choudhary, a New York-based lawyer who led an online campaign against Facebook.

(Reporting by Sankalp Phartiyal; Writing by Himank Sharma; Editing by Sumeet Chatterjee and Mark Potter)
http://uk.reuters.com/article/us-ind...-idUKKCN0VH162





Free Basics And Facebook’s Waterloo In India
Vivek Wadhwa

The Telecom Regulatory Authority of India made a wise decision by banning Facebook’s Free Basics internet service.

The project was ill-conceived and showed a lack of understanding of India’s culture and values. Mark Zuckerberg surely had good intentions in wanting to provide Internet access to hundreds of millions of people who lack access. But he went about it in the wrong way. In the process, he alienated India’s technology community and weakened his support in the Indian government.

Free Basics was essentially a walled garden in which Facebook and the telecom providers selected which websites people could visit. Rather than being able to do Google searches and explore the web as we are able to, users of Free Basics would live in a world in which Facebook was the center of the universe and experience only what it allowed them to. This is not an experience that any web user should have.

Facebook reportedly spent tens of millions of dollars in advertising and it implored all of its Indian users to send an email to the Telecom Regulatory Authority to support its program.

In its advertising, it used the example of a farmer named Ganesh, who would be able to find weather information and prepare for monsoons, look up commodity prices to get better deals, and invest in new crops and livestock.

The problem was that Ganesh would have a tainted view of the world and be able to use only a limited set of apps—and these were probably in the wrong language. India has dozens of languages and dialects.

There is no way that Facebook would have been able to or should have been allowed to determine what was right for Ganesh. This would be like a corporation or government dictating what services your hospital could offer and what treatments it would provide—or what books your children could read.

In using its money and platform to try to control public opinion, Facebook trampled over the nascent Indian technology community — which has been demanding the same level of net neutrality that Silicon Valley asks for. It didn’t listen to the people who were protesting against its program; instead, it tried to drown out their voices.

This regulatory loss is a PR disaster for Facebook because Indians are now celebrating the victory over a foreign corporation that was trying to colonize parts of the Internet. Indians still cherish and celebrate the freedom that they gained from their British colonizers in 1947—who had tried to impose Victorian values.

Facebook acted arrogantly and didn’t attempt to understand Indian values and markets.

What is limiting the spread of the Internet in India isn’t the cost of mobile data. Cell phone plans and data access are really cheap there. The problem is that most people can’t afford smartphones or tablet computers. But this is changing because prices of computing devices are dropping.

Lower-end smartphones can already be purchased for around $50 in India, and data access costs as little as 50 cents for 100 MB. A farmer who can afford to buy such a device can certainly afford the data.

Facebook should have used the tens of millions of advertising dollars it spent to instead subsidize the purchase of smartphones. It could also have negotiated with the telecom carriers to bundle in unrestricted data access. This would have earned it applause and gratitude.

Facebook needs to consider such a strategy now. It needs to show Indian users that it really was trying to uplift the masses—rather than trying to lock them into its limited platform.
http://techcrunch.com/2016/02/08/fre...rloo-in-india/





Facebook's India Stumble Could Embolden Other Regulators
Jeremy Wagstaff and Himank Sharma

India's decision to effectively ban Facebook's pared-back free Internet service is a major blow to the social network's plans, and may prompt other regulators to demand equal online access for their users.

Facebook will have to reconsider its approach in the light of India's new rules preventing Internet service providers from having different pricing policies for accessing different parts of the Web, analysts said.

"This is a major setback for Facebook," said Naveen Menon, lead analyst at A.T. Kearney in Singapore. "Not only because India was expected to be such a critical piece of the overall Internet.org success story, but more so because it has potential dangerous knock-on effects for the universal access initiative in other markets."

Internet.org is Facebook's umbrella initiative to bring Internet access to the unconnected. Part of that is the Free Basics program, which Facebook has launched in around three dozen emerging countries. The service has been criticized outside India, too, with Facebook accused of infringing the principle of net neutrality - the concept that all websites and data on the Internet be treated equally.

Critics and Internet activists argue that allowing free access to a select few apps and Web services disadvantages small content providers and start-ups that don't participate.

Ram Sevak Sharma, chairman of the Telecom Regulatory Authority of India (TRAI), told Reuters he hoped its ruling would clarify ambiguity about net neutrality and "that India has set the record straight that will be followed [the] world over."

In Facebook posts after Monday's ruling, founder Mark Zuckerberg said Free Basics was just one part of a larger initiative that includes solar-powered planes, satellites and lasers, and pairing with local entrepreneurs to provide wireless hotspots.

Expanding these approaches with or without the operators was one option for Facebook now, as well as legal workarounds where the service is repackaged, said Martin Geddes, a UK-based telecoms consultant.

Facebook could also challenge the ruling in the courts, but a more likely move, said Marc Einstein, Asia-Pacific director at Frost and Sullivan, would be to sit down with the TRAI "to try to come up with a solution that's deemed a little more neutral."

Facebook executives were not immediately available for comment, but India-born Karthik Naralasetty, whose blood donor matching service Socialblood is available in more than 20 countries via Free Basics, said Facebook was already re-thinking its approach.

"Facebook is re-thinking what it's doing, coming up with better plans," he said by telephone. "Communications will have to improve. They have to get the buy-in of different governments before they go into those countries."

FIGHT GOES ON

It won't be easy.

For one thing, said Neil Shah, a director of Counterpoint Research in Mumbai, Free Basics made little headway in India before it was suspended in December, gaining 1 million users. Only 252 million of India's 1.3 billion people have Internet access.

Opponents of the service said they would continue to fight.

"Facebook is not going to take it lying down and they will try and figure out a way for it to happen one way or the other," said Sachin Bhatia, co-founder of Indian dating app TrulyMadly. "Our job is to keep at it non-stop to ensure Internet freedom is not threatened."

Regional telecoms operators which partner Facebook, such as Indonesia's PT Indosat, controlled by Qatar's Ooredoo, and Globe Telecom in the Philippines, said the ruling would not lead them to reconsider the partnerships.

"The Indian experience is very isolated," said Vicente Froilan Castelo, general counsel of Globe Telecom.

(Reporting by Jeremy Wagstaff and Himank Sharma, with additional reporting by Ruma Paul in Dhaka, Eveline Danubrata in Jakarta and Neil Jerome C. Morales in Manila; Editing by Ian Geoghegan)
http://uk.reuters.com/article/us-ind...-idUKKCN0VI10E





Facebook Ordered To Stop Tracking Non-Users In France
Natasha Lomas

Yet more privacy problems for Facebook in Europe. Now the French data protection authority, the CNIL, has issued the company with a formal notice to get its house in order and comply with European data protection law or face possible referral to the CNIL’s select committee which could then choose to pursue a sanction against the company.

Facebook has been given three months to make the changes deemed necessary by the CNIL. If it does so to the DPA’s satisfaction it will not face any sanctions, the DPA said yesterday.

TechCrunch understands Facebook is in the process of reviewing the order from the CNIL. A spokesperson provided the following statement regarding the action: “We are confident that we comply with European Data Protection law and look forward to engaging with the CNIL to respond to their concerns.”

Those concerns are multiple, and were unearthed by an investigation triggered after Facebook amended its privacy policy in fall 2014. Specifically, the CNIL is unhappy that Facebook collects the browsing activity of Internet users who do not have a Facebook account.

“Indeed,” the CNIL notice reads, “the company does not inform Internet users that it sets a cookie on their terminal when they visit a Facebook public page (e.g. page of a public event or of a friend). This cookie transmits to Facebook information relating to third-party websites offering Facebook plug-ins (e.g. Like button) that are visited by Internet users.”

It also notes that Facebook collects user data concerning sexual orientation, religious and political views “without the explicit consent of account holders”. Nor does it inform users on the sign up form “with regard to their rights and the processing of their personal data”.

Advertising cookies are also set by Facebook “without properly informing and obtaining the consent of Internet users”, the CNIL asserts, noting that users are not offered any tools to prevent the compilation of info for targeted advertising — which it says “thereby violates their fundamental rights and interests, including their right to respect for private life”.

Perhaps most surprisingly, Facebook also stands accused of continuing to use the now illegal Safe Harbor data transfer mechanism, which was invalidated by the European Court of Justice last October — so a full four months ago.

And although Europe and the US have apparently agreed a new deal (called the EU-US Privacy Shield), this has yet to come into force, so cannot yet be relied upon by companies wanting to legalize data transfers across the Atlantic. And, last week the head of the CNIL, who also heads up the WP29 group of European DPAs, reiterated that Safe Harbor is not an option — stressing that companies continuing to use the invalidated framework are “in an illegal situation” and could face sanctions from DPAs.

Alternative data transfer methods were detailed by the European Commission last fall, after the Safe Harbor strikedown, so it’s rather surprising that Facebook has apparently not switched to using one of these alternatives to govern its Europe to US data transfers. We’ve asked Facebook about this point and will update this story with any response.

Update: Facebook claims it is not in fact using Safe Harbor to transfer data — pointing to prior comments it made last year, in which it said: “Facebook, like many thousands of European companies, relies on a number of the methods prescribed by EU law to legally transfer data to the US from Europe, aside from Safe Harbor.”

The CNIL goes on to add that it has made its formal notice against Facebook public due to “the seriousness of the violations and the number of individuals concerned by the Facebook service” — noting the site has more than 30 million users in France.

Its action follows a lawsuit brought against Facebook by the Belgian data protection authority last summer, which was also concerned with how it tracks non-users. The Belgian legal action led to the threat of daily fines for Facebook if it did not amend the operation of its tracking cookies — which it subsequently did, switching to requiring users to log in to view pages on the site.

As well as investigations by the French and Belgian DPAs, Facebook is also being probed by Spanish, Dutch and German (Hamburg) data protection authorities. This working group of five DPAs was set up in March 2015 explicitly to investigate its new privacy policy.

The CNIL notes that investigations by all the respective DPAs are “ongoing at the national level and within an international administrative cooperation framework”. So Facebook’s problems in Europe associated with its amended privacy policy look to be far from over.

The new EU-US Privacy Shield is also at least two months out from being approved by the WP29, so there’s no quick fix for companies needing to legalize transatlantic data transfers (although there are a range of alternative mechanisms that can be used, such as standard contractual clauses and model contracts).
http://techcrunch.com/2016/02/09/fac...ers-in-france/





Facebook Developing Radio Wave Mesh to Connect Offline Areas
Alice MacGregor

As part of its wider Internet.org initiative to deliver connectivity to poor and rural communities, Facebook is actively developing a new network technology which uses millimetre wave bands to transmit data.

The project is similar to, and has potential to conflict with, that of Chaitanya Kanojia, former CEO of Aereo, whose new company Starry is looking at boosting internet speeds through thin air rather than using traditional wired infrastructure.

According to reports, Facebook engineer Sanjai Kohli has filed two patents which outline a ‘next generation’ data system, which would make use of millimetre wave technology deployed as mesh networks. A Facebook representative also confirmed that the work is ‘part of the Connectivity Lab which supports the mission of Internet.org — to connect the four billion people who don’t have internet access.’

Kohli’s patents detailed a type of centralised, cloud-based routing system which ‘dynamically adjusts route and frequency channel assignments, transmit power, modulation, coding, and symbol rate to maximize network capacity and probability of packet delivery, rather than trying to maximize the capacity of any one link.’
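
The patent language is abstract, but the underlying idea, optimizing the mesh as a whole rather than each link on its own, can be shown with a toy sketch. The Python fragment below is purely illustrative: the three-hop topology, the capacity model and every parameter are invented here, not taken from Facebook's filings, and it only covers channel and power choices, not modulation or symbol rate.

import itertools

LINKS = ["A-B", "B-C", "C-D"]        # a tiny three-hop mesh backhaul
CHANNELS = [1, 2]                    # available millimetre wave channels
POWER_LEVELS = [1, 2]                # abstract transmit-power settings

def link_capacity(power, interference):
    """Invented model: more power helps, sharing a channel with a neighbour hurts."""
    return power * 10 - interference * 6

def network_capacity(assignment):
    """Score an assignment by its weakest link, i.e. end-to-end throughput."""
    capacities = []
    for i in range(len(LINKS)):
        channel, power = assignment[i]
        neighbours = [assignment[j] for j in (i - 1, i + 1) if 0 <= j < len(LINKS)]
        interference = sum(1 for ch, _ in neighbours if ch == channel)
        capacities.append(link_capacity(power, interference))
    return min(capacities)

# A central controller searches over all per-link (channel, power) choices and
# keeps the combination that maximizes capacity for the network as a whole,
# rather than tuning any single link in isolation.
candidates = itertools.product(itertools.product(CHANNELS, POWER_LEVELS), repeat=len(LINKS))
best = max(candidates, key=network_capacity)
print(dict(zip(LINKS, best)), "->", network_capacity(best))

In this toy version the controller happily puts the middle hop on a different channel from its neighbours even if that is not the best choice for any one link, which is the flavour of trade-off the patents describe.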

The social media site has been dedicated to finding new ways to connect more remote locations and low-income populations to the internet, and ultimately its platform. These efforts have included experimenting with drone-mounted lasers and existing satellite technology to deliver internet to communities in Africa.

With only one of the patents granted so far, it is unclear whether Facebook’s millimetre wave technology will ever come to fruition. The research may soon become another flashpoint in the controversy over the social network’s Internet.org and Free Basics plans, and in the wider global net neutrality debate.

Earlier this week, Facebook was dealt a significant blow by Indian telecom regulator TRAI. The watchdog voted against differential pricing for data tariffs, scuppering Facebook’s Free Basics project, which sought to provide a range of free internet services to India’s poor.
https://thestack.com/cloud/2016/02/1...offline-areas/





Verizon Accused Of Net Neutrality Foul By Zero-Rating Its Go90 Mobile Video Service
Natasha Lomas

Verizon, the parent of TechCrunch’s parent AOL, is being accused of violating net neutrality principles by excluding its own mobile video streaming service, go90, from data charges — thereby creating an unequal playing field.

On Friday it emerged (via The Verge) that Verizon would not be charging its own customers for the data they consume over go90. An end of January update to the go90 Android app (v1.4.0) notes the service can now be used over LTE without it counting against Verizon customers’ data plans:

If you’re a Verizon wireless post-paid customer, stream go90 videos over LTE without using up your data.

Verizon has long since stopped offering new customers unlimited data plans, and instead sells a selection of tiered plans starting at $30 per month for 1GB rising to $100 for 18GB (you also need to pay $20 per month per smartphone line, plus taxes). So its data costs can get pretty expensive, especially if you’re in the habit of watching video on the go. Which makes a zero-rated video service sound pretty appealing.
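
Some quick back-of-the-envelope arithmetic shows why that matters for video. The two plan sizes in the sketch below are the ones quoted above; the data-per-hour figure is a rough assumption about standard-quality mobile streaming, not a Verizon number.

# How much non-zero-rated video fits in a tiered data bucket (illustrative only).
plans_gb = {"$30 plan": 1, "$100 plan": 18}   # monthly data allowances quoted above
GB_PER_HOUR = 0.35                            # assumed ~350 MB per hour of mobile video

for name, gigabytes in plans_gb.items():
    hours = gigabytes / GB_PER_HOUR
    print(f"{name} ({gigabytes} GB): roughly {hours:.0f} hours of video per month")

On those assumptions, a couple of hours of streaming can exhaust the cheapest bucket entirely, which is exactly why exempting go90 from the meter is such an advantage.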

Yet such a freebie has the obvious knock-on effect of penalizing other video services that do cost data to consume (and thereby eat into users’ data allowances). Hence critics calling foul over the move — which looks like an attempt to circumvent US net neutrality laws, passed by the FCC a year ago.

Verizon’s argument for the legality of the service is that go90 is open to other content providers via its sponsored data program, FreeBee Data 360, which lets content providers pay for customers’ data costs. A Verizon spokesperson told Re/code: “FreeBee Data 360 is an open, non-exclusive service available to other content providers on a non-discriminatory basis. Any interested content provider can use FreeBee Data 360 to expand their audiences by giving consumers the opportunity to enjoy their content without incurring data charges.”

However, as Re/code notes, Verizon is pulling the strings here. So rivals merely wanting to compete on an equal footing with Verizon’s free service would still have to pour data costs into Verizon’s coffers just to do so. Which again skews the competitive landscape.

Carrier AT&T has also said it is planning to launch a mobile video service involving sponsored data — so, as industry watchers have noted, the Verizon move is something of a test bed to see how the FCC responds.

The body has already said it is looking at zero rating to determine whether the practice violates its general conduct statement. In the meantime, carriers are clearly seeing how far they can push things…
http://techcrunch.com/2016/02/07/ver...video-service/





Gearing Up for the Cloud, AT&T Tells Its Workers: Adapt, or Else
Quentin Hardy

Thirty-four years ago, Kevin Stephenson got his younger brother, Randall, a job with the telephone company.

Kevin, then 23, and Randall, 22, had tried selling cattle feed with their father near their home in Moore, Okla., but that didn’t pan out. Kevin was hired to do accounting at a local Southwestern Bell office. Randall, who was in college, needed a bit more help. “He had trouble getting hired,” Kevin said. “I talked to someone I knew in personnel.”

The brothers had different tastes. Kevin liked to be outside, and now, at 57 years old, he works in Norman, Okla., fixing the decades-old copper lines that still connect to landline telephones in most homes as well as to modern Internet conduits like high-speed fiber optics. Randall liked numbers and stayed indoors, rising through the management ranks.

Southwestern Bell became SBC Communications and took on the old AT&T name through an acquisition in 2005. By 2007, Randall was running the place.

Today, Randall Stephenson, AT&T’s chairman and chief executive, is trying to reinvent the company so it can compete more deftly. Not that long ago it had to fight for business with other phone companies and cellular carriers. Then the Internet and cloud computing came along, and AT&T found itself in a tussle with a whole bunch of companies.

AT&T’s competitors are not just Verizon and Sprint, but also tech giants like Amazon and Google. For the company to survive in this environment, Mr. Stephenson needs to retrain its 280,000 employees so they can improve their coding skills, or learn them, and make quick business decisions based on a fire hose of data coming into the company.

In an ambitious corporate education program that started about two years ago, he is offering to pay for classes (at least some of them) to help employees modernize their skills. But there’s a catch: They have to take these classes on their own time and sometimes pay for them with their own money.

To Mr. Stephenson, it should be an easy choice for most workers: Learn new skills or find your career choices are very limited.

“There is a need to retool yourself, and you should not expect to stop,” he said in a recent interview at AT&T’s Dallas headquarters. People who do not spend five to 10 hours a week in online learning, he added, “will obsolete themselves with the technology.”

Kevin? He admires his younger brother, but he is among the many AT&T lifers who are not that keen to participate in this reinvention of old Ma Bell. “I’m riding the copper train all the way down,” he said.

He talks about the changes with obvious affection for both his brother and his longtime employer. In interviews, many veteran AT&T employees around the country showed a surprising amount of emotion toward a company that has been broken up, rebuilt and reinvented several times.

But that doesn’t mean everyone is particularly eager to rebuild and reinvent themselves for a new AT&T. Even if it means, as Randall put it, obsolescence.

Companies’ reinventing themselves to compete with more nimble competitors is hardly a new story. Many have tried, and a handful have even succeeded. Mr. Stephenson wants AT&T to be among those few.

In the last three years, he has spent more than $20 billion annually, primarily on building the digital business. DirecTV was acquired in a $63 billion deal last year, and several billion more was spent to buy wireless businesses in Mexico and the United States. Even for a company with $147 billion in 2015 revenue and over $400 billion in assets built up over more than a century, it’s a lot.

By 2020, Mr. Stephenson hopes AT&T will be well into its transformation into a computing company that manages all sorts of digital things: phones, satellite television and huge volumes of data, all sorted through software managed in the cloud.

That can’t happen unless at least some of his work force is retrained to deal with the technology. It’s not a young group: The average tenure at AT&T is 12 years, or 22 years if you don’t count the people working in call centers. And many employees don’t have experience writing open-source software or casually analyzing terabytes of customer data.

If you don’t develop the new skills, you won’t be fired — at least AT&T won’t say as much — but you won’t have much of a future. The company isn’t too worried about people leaving, since executives estimate that eventually AT&T could get by with one-third fewer workers.

Mr. Stephenson declined to project how many workers he might have by 2020, when the cloud-based system is supposed to be fully in place. One thing about cutting people in an aging work force, he noted, is that “demography is on our side.” Other senior executives say shrinking the work force by 30 percent is not out of the question.

Maybe so, but count Kevin among the skeptics of how fast AT&T’s transformation will happen.

“I’m proud of my brother,” he said, “but he’s not going to get rid of this stuff as fast as he thinks.”

Eyes on the Cloud

Long ago, a phone system created wire lines between callers, and operators moved plugs in their switchboards to connect people. Over time, that was automated to become something closer to a computer, with digital fibers and wireless towers. Much of the setup, however, still needed lots of people to tend hardware that had been built for particular tasks, like feeding one neighborhood’s calls into a nationwide backbone of wires, fiber and switches.

Mr. Stephenson has concentrated on things related to cloud computing, a technology setup that is more like the computer guts of Google or Amazon than the circuits and switches of a phone company. This cloud system will eventually touch ventures in landline phones, wireless, high-speed online services, cable TV and now satellite, thanks to the DirecTV purchase.

Analysts give him good marks but say he has a long way to go. “They want to be 75 percent done by 2020, and last year they did the first 5 percent,” said Akshay Sharma, an analyst with the research firm Gartner.

Google and Amazon are increasingly in businesses that look like what AT&T does, and they thrive on analyzing the data they gather about customers. Google, for example, is offering high-speed Internet access in some cities. Amazon is selling video entertainment, as well as hosting new kinds of phone systems in its cloud.

AT&T wants to build products and services as fast as this competition. Data from satellite TV could be analyzed for viewing habits and someday used, for example, to sell football fans a replay app for their AT&T mobile phones.

In 2012, Mr. Stephenson realized, much to his dismay, that his staff was woefully unschooled for the new technology. Vision 2020, as the company calls it, is a program that combines online and classroom-based course work in subjects like digital networking and data science, as well as a look at old skills that can be transferred to new careers.

Everything at AT&T is changing, from the services customers are offered to the way they are charged for them. One service called Network on Demand, for example, allows customers to increase the size of their Internet pipes without calling a technician, something that used to take weeks. And Mr. Stephenson’s employees have to be able to deal with all that.

“If we can’t do it, mark my words, in three years we’ll be managing decline,” he said.

A possible answer showed up on a sweltering Dallas afternoon in 2012 when Sebastian Thrun arrived. Mr. Thrun, a Silicon Valley technologist and onetime Stanford professor, is known for his futuristic work on self-driving cars and Internet-based learning. He is also the founder of Udacity, an online education company.

Inside a glass-walled office where Mr. Stephenson meets presidential candidates and corporate titans, Mr. Thrun gave him a pitch on funding an online master’s degree in engineering that Udacity proposed to teach in conjunction with the Georgia Institute of Technology. Within five minutes, the two men were cross-legged on the floor, Mr. Stephenson eager to try a physics course.

His first online learning began with an unexpected challenge: getting online. AT&T’s own Wi-Fi was too clunky, Mr. Thrun recalled. Eventually they used Mr. Thrun’s smartphone, which ran on the network of T-Mobile, a rival carrier.

The building’s Wi-Fi is now said to be better, and elsewhere in AT&T the first employees are getting their online Georgia Tech degrees.

Eboni Bell, 24, a product manager for smartphone software in AT&T’s Atlanta office, sees the Vision 2020 retraining as the chance of a lifetime. The company provided tuition assistance for much of her two-year Udacity/Georgia Tech master’s degree in computer science, which it says cost $6,600. Single and childless, she doesn’t mind the hours it takes.

“I leave the office at 7 p.m., work at home until midnight, and Saturdays and Sundays are committed to school,” she said.

Ms. Bell, who wants to work in software architecture and design, plans to keep taking courses. “I need to know what my competitors are doing,” she said. “I can’t see myself staying with one product too long — it makes me feel like I’m not growing.”

So far most of the people who have taken the new courses are managers, and seem interested in learning very technical skills. Among the most popular courses are web development, data analysis, introduction to programming and writing apps for the iPhone.

“It’s great for those who want to make the transition,” said Mr. Sharma of Gartner. “If you don’t want to change, it’s a good time to retire and enjoy life.”

AT&T’s workers receive weekly emails and video broadcasts about learning online. Vision 2020 includes an internal website where employees can enter their job titles, see what types of careers might be available in a software-driven company, and what courses they need to take to get them. Over time, their grades are logged, and depending on how they do on the schoolwork, different new courses are suggested. Eventually, performance reviews will include data on what people studied, how well they did and whether, like Ms. Bell, they are willing to keep learning.

Eventually, the plan is that desire for learning will be taken into account when promotions are considered.

Across the country in Orange, Calif., Patti Cunningham, a 61-year-old technician, is struggling. Ms. Cunningham, who has worked at AT&T for 43 years, has not signed up for any courses, and can barely recall receiving emails about the new plans. (An AT&T spokesman produced dozens of such messages, going back to early 2014.) Essentially, she does not see a place for herself.

“This new concept of training on your own time, everything changing all the time, if you want to keep working, do more things on your own time — I guess they have to do it,” she said, sitting in a run-down room at her union local. “But I don’t see a need to be involved.”

Christopher Shelton, national head of the Communications Workers of America, said the changes were inevitable, and he believed his people would go along with them.

“We realized a long time ago that you can’t fight technology change and win,” he said. “Our contracts spell out training programs and policies to make sure that members get training to update their skills as technology changes.” Still, he thinks the expectations about home study after a hard day’s work are too much.

AT&T will reimburse up to $8,000 a year in tuition; the amount was raised just last month. The company claims that a year into the program, over half of the work force, mostly managers, has started training, sometimes with dozens of short online courses.

“People are going to have to work hard, but it’s not insane,” said William Blase, who oversees personnel for all of AT&T. “There’s going to be an expectation that your compensation will be tied to continuing to learn. We’re at a crossroads as a business — and a country — where education has to keep up with technology.”

If there is one thing on which Ms. Bell and Ms. Cunningham agree, albeit with different sentiments, it is that the change is necessary. Eventually phone calls, texts, sensor data and TV shows, along with much else, will all run on fiber-optic networks and be managed largely with software.

Ms. Bell works with a lot of older colleagues. “One of my co-workers has been looking at the same database for 20 years,” she said. “It sounds harsh, but if she doesn’t adapt, there won’t be room for her.”

That message has not been lost on Jacobie Davis, an 18-year employee in Richardson, Tex., who works in tech support for older-style gear. At 39, he is the youngest person in his office, and is scrambling to study both new networks and data science. “I try to put in 15 hours a week,” he said. “By 2020, my technology will be gone.”

For Some, a Tough Sell

In some ways, cloud computing is not as radical a technology shift as all the puffy language suggests. Big banks of computers still run software, as they have been doing in many industries since the 1960s. They have more power, because their chips have more transistors that enable them to do more, and they connect to more things thanks to fiber-optic cable and wireless.

The big difference is something called virtualization, which amounts to software that allows many machines to operate like one piece of computer hardware. This made it possible to run software that in effect interacted with other software instead of hardware. This, in turn, means the possibility of changing functions around rapidly by typing a few lines of code.
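
As a loose illustration of that idea (a toy Python sketch, not AT&T's systems; every name and rule in it is invented), a "network function" becomes an ordinary piece of software that can be swapped out with a couple of lines of code rather than a hardware visit:

    # Toy illustration of virtualization: network functions as plain software
    # objects that can be replaced at runtime. Names and rules are made up.

    def basic_firewall(packet):
        # original "appliance": only allow HTTPS traffic
        return packet["port"] == 443

    def stricter_firewall(packet):
        # replacement function rolled out in software, not by swapping hardware
        return packet["port"] == 443 and packet["src"].startswith("10.")

    virtual_functions = {"firewall": basic_firewall}

    def allow(packet):
        # the "network" consults whatever functions are currently installed
        return all(check(packet) for check in virtual_functions.values())

    print(allow({"port": 443, "src": "203.0.113.9"}))   # True under the old rules
    virtual_functions["firewall"] = stricter_firewall    # the few lines of code
    print(allow({"port": 443, "src": "203.0.113.9"}))   # False under the new rules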

The new systems also collect more data, quickly analyze information about what people and things are doing, and react. That is how online ads are personalized for you, and increasingly how maps reflect current traffic conditions, or streetlights adjust to suit parking congestion.

Now what once took a year of analysis and deployment can instead happen in days, even minutes.

These concepts can be tough in some reaches of AT&T, where lives and work have not changed all that much.

In Dayton, Ohio, Kirk Warrenburg came out of a job in a bowling alley and started wiring cards for telegraph systems 40 years ago. Now he works on AT&T’s signaling network, which makes sure billions of calls get through.

He has taken 16 courses — Udacity courses and in-house “nano courses,” each about two weeks long — in the last year. He doesn’t see himself changing jobs, however, because the old machines still need someone to care for them. Younger workers, he thinks, won’t want to be in his dead end.

“Writing a telegraph circuit was like writing a recipe for a field technician,” he said. “A lot of legacy systems are still around here. I’ll be long gone before they will.”

Some other older employees besides Kevin Stephenson think the 2020 target will come and go, but the basics won't change.

The 2020 effort “is just a start,” said Kenny Williams, 64, a testing technician and the head of Ms. Cunningham’s union local in Southern California. “I’ve inoculated my people against worrying. They need a fiber network for this that doesn’t exist out here yet. Seventy percent of my folks are safe; the other 30 have to be found jobs, or they’ll take the golden handshake” and retire.

As he sees it, much of the urgency comes from the threat of Google. In 2015 Google Fiber, Google’s high-speed Internet service, caused AT&T to do something uncommon in its history: lower its prices because of competition. “In 40 years here I hadn’t seen that,” Mr. Williams said. “Their people aren’t in unions — we’re a lot more on AT&T’s side than theirs.”

AT&T recently began rolling out fiber in about 50 cities in the United States, in what it hopes is a bigger move than Google can make. Still, putting a cloud system all the way across a diverse, continentwide network will take years, which is why Mr. Williams feels safe.

Face to Face With Google

What happens next at AT&T — and how fast that will happen — is a matter of disagreement in the Stephenson family.

“I go out to houses away from the cities, and there’s not a lot of fiber there,” Kevin said. Fiber would open the way for all that new technology. He takes comfort in looking at patches linemen did on fiber systems decades ago — from both the jury-rigged craftsmanship and the way they have endured.

But Randall said his brother was not necessarily like the rest of the work force because there will always be hard, outdoor tasks for people like him. “There will be people turning screws and digging trenches. I’ll be long gone before that is over. But other guys I know in Oklahoma will do a skills pivot” with additional training, he said.

Besides, it’s not just about his brother. It’s about most of the economy.

“Everybody is going to go face to face with a Google, an Amazon, a Netflix,” he said. “You compete based on data, and based on customer insights you get with their permission. If we’re wrong, it won’t play well for anyone here.”
http://www.nytimes.com/2016/02/14/te...t-or-else.html





AT&T to Run Field Trials of 5G Wireless in Austin this Year

Move is well ahead of industry standard for faster 5G
Matt Hamblen

AT&T announced today it will begin field trials of faster 5G wireless technology this summer in Austin, Texas.

The 3GPP industry standard for 5G, also known as Fifth Generation wireless, is not expected to be completed until 2020, with the earliest phase completed in 2018.

Wireless speeds with 5G could be 10 to 100 times faster than with 4G LTE, which generally averages in the 10 Mbps to 20 Mbps range for users downloading data.
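
As a rough, back-of-the-envelope illustration (the 5 GB file size and the specific speed points below are assumptions, chosen only to span the 10x-100x range described above):

    # Back-of-the-envelope download times for a hypothetical 5 GB file.
    FILE_GB = 5
    file_bits = FILE_GB * 8e9  # gigabytes to bits (decimal)

    for label, mbps in [("4G LTE at 15 Mbps", 15),
                        ("5G at 10x (150 Mbps)", 150),
                        ("5G at 100x (1,500 Mbps)", 1500)]:
        minutes = file_bits / (mbps * 1e6) / 60
        print(f"{label}: about {minutes:.1f} minutes")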

Both AT&T and Verizon have ambitious 5G rollout plans, prompted by the recent explosion of wireless video and Internet of Things connectivity. AT&T estimates that traffic on its wireless network grew 150,000% from 2007 to 2015, largely because of video; more than 60% of its wireless traffic in 2015 was video.

Self-driving cars, robots, smart cities and other technologies are expected to test networks like never before, and "5G will help make them a reality," said John Donovan, chief strategy officer at AT&T Technology and Operations.

AT&T said it is working with Ericsson and Intel on laboratory tests of 5G in the second quarter, with the outdoor tests and trials starting in the summer. By the end of the year, AT&T expects to make 5G connections to fixed locations, such as buildings and homes, while wireless connections to moving objects, like cars and devices used by passengers aboard trains, are harder to achieve.

AT&T's trials are intended to precede full 5G standard adoption so that the carrier can "pivot to compliant commercial deployments once 5G technology standards are set," AT&T said in a statement.

5G is also expected to be more efficient and cost-effective for carriers. AT&T plans to build its version of 5G on a software-centric architecture that adapts quickly to new demands, Donovan said. That means AT&T will deliver 5G in connection with software defined networks (SDN), big data, new security tools and open source software, he added.

SDN is expected to allow AT&T to virtualize 75% of its network by 2020. In 2015, about 6% was virtualized, a number that should reach 30% in 2016. About 14 million wireless customers use the virtualized network already. SDN that uses open source software will save costs, as well.

With a virtualized network, AT&T can turn routers, firewalls and other network equipment into virtual functions that run on commodity hardware, primarily servers.
http://www.computerworld.com/article...this-year.html





AT&T Fights to Keep Your Internet as Slow as Possible
Joan McCarter

Last February, the Federal Communications Commission made history by taking sweeping action to promote a fast, fair, and open internet. Their net neutrality ruling made the most headlines, but a second ruling preventing states from blocking localities seeking to develop municipal broadband was nearly as huge. Both have been fought tooth and nail—and taken to court—by industry. That's made Chattanooga, Tennessee, ground zero in a war financed by AT&T.

Chattanooga, Tenn., is more than 2,400 miles from Silicon Valley, but residents of the Southern city have access to broadband that's 50 times faster than the majority of Internet connections in technology's capital. Why, you ask? Chattanooga's municipally owned electric utility, EPB, provides its broadband Internet.

Chattanooga's neighbors would like to set up a similar arrangement, but AT&T, which delivers much slower broadband in the area — when it delivers at all — is trying to block the plan, saying the government should not compete with private enterprise.

Angry Tennessee consumers and legislators aren't backing down. "Don't fall for the argument that this is a free market versus government battle. It is not. AT&T is the villain here, and so are the other phone and cable companies," said Sen. Todd Gardenhire (R-Tenn.) at a community rally, according to the Chattanooga Times Free Press.

Just let that sink in for a second—Chattanooga residents enjoy broadband 50 times faster than Silicon Valley, and it's a local government that provided it. Which makes AT&T's claim that government interference is what's getting in the way of their advancing technology ring pretty hollow. In the news article referenced above, an AT&T flak, Daniel Hayes, actually said "[p]olicies that discourage private-sector investment put at risk the world-class broadband infrastructure American consumers deserve and enjoy today." As if AT&T were actually providing world-class broadband. As if AT&T gave a flying fig about providing world-class broadband to the millions of people who are trapped in markets where it has a stranglehold. They care even less about people in rural communities that don't have service at all.

What AT&T is trying to do in Chattanooga definitely has an impact on the rest of the nation—they want to put a stop to municipal broadband there. They want to bully other states and localities and prevent them from doing what Chattanooga has done—actually deliver world-class broadband, with no profit at all going to AT&T.
http://www.dailykos.com/stories/2016...w-as-possible?





Congressmen Upton, Walden Latest To Insist Nobody Needs Faster Broadband
Karl Bode

A little over a year ago, the FCC voted to raise the minimum definition of broadband from 4 Mbps downstream, 1 Mbps upstream -- to 25 Mbps downstream, 3 Mbps upstream. The standard better reflects household usage in the gigabit connection and Netflix binge watching era. However, the broadband industry has been whining like a petulant child ever since, largely because the change highlights how a lack of competition and the resulting failure to upgrade networks means a huge swath of the country doesn't technically have broadband.

Outraged by the FCC's sudden decision to have standards, incumbent broadband providers convinced six Senators to write in and scold the FCC last month, arguing that 25 Mbps was just a crazy metric, and that nobody needs that kind of bandwidth:

"Looking at the market for broadband applications, we are aware of few applications that require download speeds of 25 Mbps. Netflix, for example, recommends a download speed of 5 Mbps to receive high-definition streaming video, and Amazon recommends a speed of 3.5 Mbps. In addition, according to the FCC's own data, the majority of Americans who can purchase 25 Mbps choose not to."

As we noted then, the Senators apparently don't have teenage kids (or have them and don't pay attention to what they do), since 25 Mbps is a pretty reasonable standard for a household of hungry gamers, streamers, and social media addicts. And while the Senators use Netflix HD streaming as the holy grail for what constitutes "real" bandwidth usage, they apparently didn't realize that as Netflix moves to 4K, each stream will eat 25 Mbps all by itself. In the age of Google Fiber and gigabit cable, 25 Mbps is a pretty fair per household metric; in fact the upstream standard probably isn't high enough.
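
A quick, hypothetical tally makes the point; apart from Netflix's own 25 Mbps and 5 Mbps recommendations cited here, the per-stream figures below are assumptions:

    # Hypothetical concurrent demand in a busy household, in Mbps.
    demand = {
        "Netflix 4K stream": 25.0,            # Netflix's recommendation for Ultra HD
        "Netflix HD stream, second TV": 5.0,  # Netflix's HD recommendation
        "Online gaming session": 3.0,         # assumption
        "Video call": 2.5,                    # assumption
        "Browsing and social media": 2.0,     # assumption
    }
    total = sum(demand.values())
    print(f"Concurrent demand: {total:.1f} Mbps")  # comfortably above 25 Mbps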

But this being Congress, the technical realities don't matter nearly as much as the campaign contribution cash tied to the end of the telecom talking points memo. Not to be outdone by the manufactured outrage of their friends in the Senate, Congressmen Fred Upton and Greg Walden have similarly decided to waste everybody's time with a letter of their own (pdf), which accuses the FCC of "troubling actions" that "distort – or outright ignore – the FCC’s requirements to produce honest, data-driven reports to inform policymakers and the public."

Why, the Congressmen argue, does the FCC feel the need to mess with such an obviously competitive market?:

"The Communications Act requires the FCC to assess and report on the state of broadband deployment, the level of video competition, and the level of effective competition in the nation's mobile wireless market. Since 2011, it appears that the Commission has applied inconsistent definitions and analyses in making those determinations. Those reports have then been used to justify Commission actions to intervene in seemingly competitive markets. Despite the plain language of the Communications Act, the FCC's actions seem to benefit specific classes of competitors and do not promote competition. This behavior concerns us.

Yes, that's the Chairman of the Subcommittee on Communications and Technology complaining about having standards.

Of course the only reason the markets were "seemingly competitive" is that for fifteen years, the FCC has been basing policy on flimsy standards and cherry-picked industry data. Once the FCC raised the standards and started thinking a little more independently, phone companies that were happily selling snail-esque DSL at next-generation prices were suddenly outed for not trying very hard. Under the new standard, FCC data suggests 31 million Americans don't technically have broadband, and two-thirds of homes lack access to speeds of 25 Mbps from more than one provider.

Again, the real outrage isn't really that the FCC is some kind of rogue agency setting unrealistic standards just to make giant companies cry, the real outrage stems from the fact that the new standard makes it harder than ever to pretend that the United States is a competitive broadband market.
https://www.techdirt.com/articles/20...roadband.shtml





The US Ranks 55th in Terms of LTE Download Speeds

Report from OpenSignal compares country's LTE speeds and coverage
James Vincent

The quality of a country's mobile network is often decided by a recipe that's two parts economics, and one part geography. While small, developed nations like South Korea and Hong Kong can easily provide complete coverage and fast speeds to their dense populations, larger, poorer countries often struggle to deliver full bars to all of their territory. Countries that are big and rich, like America, tend to get networks that are somewhere in the middle — good on coverage, for example, but not so great on speed, as a report into LTE in the US by OpenSignal showed earlier this week. Now, the network-testing company has released its worldwide report for Q4 2015, allowing us to see how America stacks up with the rest of the globe.

Unfortunately, not much has changed since we last checked in on LTE. The US is still slipping behind the rest of the world when it comes to download speeds, with an average of 10 Mbps — ranking it 55th worldwide. It does much better when it comes to coverage (subscribers get an LTE signal 81 percent of the time — seventh best in the world), but it's still suffering from the first mover disadvantage. Like Japan and Sweden, the US got its LTE network early, but the technology is now old, and there are plenty of subscribers using it — meaning slower speeds for all. For context, the global average for download speeds on LTE is 13.5 Mbps, while Singapore offers the fastest networks, with downloads as quick as 40 Mbps. And in Q2 2015, America's average download speed was 9 Mbps.

Meanwhile, newer networks with up-to-date tech and fewer subscribers deliver faster speeds. OpenSignal notes that countries in South America, Eastern Europe, and the Middle East tend to demonstrate this trend (although their coverage might not be so good). Romania, for example, offers only 61 percent coverage for its LTE network, but has speeds as fast as 33 Mbps, ranking it sixth in the global leaderboard. But although the trend globally is for rising speeds (they're up nearly a whole megabit compared to last year), countries with established networks are finding it harder to improve with limited spectrum available to them. Looks like we might have to wait for 5G to get going before we get real next-generation mobile data.

Here's a global leaderboard for LTE speeds, and for more detailed data check out OpenSignal's full report.
http://www.theverge.com/2016/2/4/109...age-opensignal





Aereo Founder’s New Startup Wants to Bring You Wi-Fi—And Cut Out the Providers
Issie Lapowsky

Because it wasn’t enough to piss off every major television broadcaster in America with his last company, Aereo, now Chet Kanojia is taking on the country’s biggest Internet providers with a new start-up called Starry.

The company, which Kanojia officially announced in New York City this morning after keeping it under wraps for a year, aims to offer people wireless Internet access at speeds that are faster than wired broadband at a fraction of the cost. The goal is to circumvent not only the hefty infrastructure cost of wired networks, but also the companies that build and provide those networks—as well as all of the complexities of getting a technician to come to your home and install that network. Instead, Starry allows anyone to plug in a small device at home and receive the Internet instantly over a wireless connection.

“This is how it should be in our opinion,” Kanojia said. “Wired infrastructure is just difficult.”

Here’s how it works: Starry utilizes what are known as high-frequency millimeter waves to deliver the signal to people’s homes. To broadcast that signal, Starry installs so-called Starry Beams on rooftops throughout a city. Each Beam can cover roughly 2 kilometers, sending connectivity directly to hubs called Starry Points, which people can place just outside their window to pick up a signal. This setup means that Starry will launch city by city, region by region, as it installs these networks of Beams. Its first market will be Boston, with beta tests launching this summer.
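
For a rough sense of scale (this is not Starry's math; the city area and overlap factor below are assumptions, and the 2 kilometers is read as each Beam's reach), the figures imply a deployment of a few dozen Beams per city:

    import math

    # Illustrative estimate of rooftop Beams needed to blanket a city.
    CITY_AREA_KM2 = 230        # assumption: roughly Boston plus inner suburbs
    BEAM_RANGE_KM = 2.0        # coverage range cited in the article
    OVERLAP_FACTOR = 2.0       # assumption: overlap and obstructions waste capacity

    beam_footprint = math.pi * BEAM_RANGE_KM ** 2
    beams = math.ceil(CITY_AREA_KM2 / beam_footprint * OVERLAP_FACTOR)
    print(f"Ballpark: {beams} Beams for {CITY_AREA_KM2} km^2")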

Kanojia declined to say just how much a monthly plan for Starry would cost, except to say that it will be much cheaper than standard broadband, because Starry’s own costs are expected to be much lower. According to the company’s estimates, the average wired network costs about $2,500 per home to deploy. Starry’s cost, Kanojia says, is just $25 per home.

The Next Fight

In many ways, Starry is the continuation of Kanojia’s longstanding mission to give consumers more choice in how they connect to the Internet and television. With Aereo, Kanojia wanted to give cord-cutters a way to watch live over-the-air television without buying a full cable package. To do that, it built warehouses full of mini-antennae and argued that those antennae were no different than the bunny ears people are free to buy for their own homes.

The broadcasters begged to differ, arguing that if they had to pay for the rights to copyrighted programming, then Aereo should, too. In the end, the Supreme Court sided with the broadcasters, forcing Aereo to close up shop for good. Now, Kanojia is again bracing for a fight which is, while equally difficult to pull off, slightly less risky.

Kanojia, for his part, believes the regulatory winds are blowing in his favor. “I think in general there is a desire for competition,” he says.

There will, of course, be ample competition, not only from giants like Comcast, but also newer providers like Google, which has been expanding its Google Fiber network across the country in recent years. It will also face off against startups like Karma which, though still in their infancy, are also experimenting with wireless access.

But while Starry is competing directly with these companies, it’s also building routers that anyone can use, no matter their internet service provider. The Starry Station is a sleekly designed device with a touchscreen interface that allows people to track the devices on their network and the overall health of their network on a daily basis, and lets them install things like parental controls and, possibly, even ad-blocking technology at the network level. The router, which retails for $349.99, is intended to be a high-tech upgrade from the black box routers that leave cobwebs of wires on living room floors across America.

If Starry’s plan works, it could accomplish the very thing Aereo set out to do, which is free consumers from the bundle. In Aereo’s case, it was the cable bundle. In Starry’s it’s the broadband bundle. Something tells us another battle with the industry incumbents won’t be far behind.
http://www.wired.com/2016/01/aereo-f...eless-startup/





New 1-Terabit Internet Satellites Will Deliver High-Speed Internet to Remote Areas

ViaSat's next satellite carries triple the network capacity
Sean O'Kane

US-based satellite company ViaSat is teaming up with Boeing to build three new satellites that will deliver high-speed internet to remote areas around the world. The partnership was announced yesterday, months before the company is scheduled to launch its previous generation satellite, ViaSat-2, on a SpaceX Falcon 9 rocket.

The new ViaSat-3 satellites will be capable of much more. Each satellite will carry with it a total network capacity of 1 Tbps (yes, Terabit per second), about triple what ViaSat-2 is capable of. That will allow ViaSat to deliver 100 Mbps service to remote residential properties in the Americas, Europe, the Middle East, Africa, and Asia. The company claims that work is already underway on the first two satellites, and that Boeing is already preparing them for launches by the end of 2019.

Beyond residential connections, ViaSat says the new satellites will be capable of increasing in-flight connectivity on commercial airlines, business-class jets, and government aircraft. They will also be able to provide 1 Gbps connections to "maritime, oceanic and other corporate enterprise applications such as oil and gas platforms." All told, the company says the three new satellites could deliver twice (or more) the total network capacity of the 400 or so commercial communications satellites currently orbiting the Earth combined.
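
Some illustrative capacity arithmetic (the contention ratio below is an assumption, not a ViaSat figure) shows what 1 Tbps means in subscriber terms:

    # How many 100 Mbps subscribers might share one 1 Tbps satellite?
    capacity_mbps = 1_000_000   # 1 Tbps, per the article
    plan_mbps = 100             # advertised residential tier
    contention = 20             # assumption: ISPs typically share capacity ~20:1

    full_rate_users = capacity_mbps // plan_mbps
    subscribers = full_rate_users * contention
    print(f"{full_rate_users:,} users at full rate simultaneously")
    print(f"~{subscribers:,} subscribers at a {contention}:1 contention ratio")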

But ViaSat is far from alone in these pursuits — in fact, the race to provide emerging markets with high-speed internet access has long been in full swing. SpaceX and Virgin Galactic are ramping up efforts to provide internet connections from space using small armies of satellites. Google has a few different ideas, like providing 5G connection using solar-powered drones, or using massive balloons to create widespread internet access. And then there's Facebook, which has a solar-powered internet drone of its own, is partnering with French satellite operator Eutelsat to provide internet to sub-Saharan Africa, and is also potentially working on a millimeter-wave radio mesh network solution similar to the one being teased by Starry.

All of these options face massive challenges — satellite internet can still be disrupted by weather, and Facebook, Google, SpaceX, and Virgin are years away from rolling out some of their solutions — but they each point to the same utopian endgame: a much more connected future for everyone.
http://www.theverge.com/2016/2/10/10...es-rural-homes





Japanese Scientists Push 100Gbps Wireless Broadband Using 300GHz
Mark Jackson

Last week we reported that Japanese scientists had managed to build a wireless network that could send data at a speed of 56Gbps (Gigabits per second) using the 72-100GHz (GigaHertz) radio frequency bands (here) and now another team claims it can hit 100Gbps by pushing into the TeraHertz (300GHz+).

At present most existing home WiFi networks prefer to use the 2.4GHz and 5GHz bands, with a few newer services (802.11ad) also starting to make use of 60GHz to deliver peak network speeds of around 4.6Gbps over a short distance. As a general rule, the higher the frequency, the shorter its range but the more data you can push (there is more spectrum available to use).

The future generation of 5G based Mobile Broadband services may also aim to harness the performance of even higher frequency ranges (e.g. 6GHz to 100GHz) for ultrafast speeds (10Gbps), although such networks would need much more powerful signals and a greater density of expensive infrastructure. Indoor coverage could also be a big problem as the signals would struggle to penetrate through walls.

By comparison the latest development of a TeraHertz (THz) transmitter (300GHz+), which was implemented as a silicon CMOS integrated circuit and can transmit a signal running at 10Gbps per data channel over multiple channels in the 275-305GHz band, is yet another big leap forwards.

Prof. Minoru Fujishima, Hiroshima University, said:

“Now THz wireless technology is armed with very wide bandwidths and QAM-capability [quadrature amplitude modulation]. The use of QAM was a key to achieving 100 Gigabits per second at 300 GHz.

Today, we usually talk about wireless data-rates in megabits per second or gigabits per second. But I foresee we’ll soon be talking about terabits per second. That’s what THz wireless technology offers. Such extreme speeds are currently confined in optical fibers. I want to bring fiber-optic speeds out into the air, and we have taken an important step toward that goal.

We plan to develop receiver circuits for the 300-GHz band as well as modulation and demodulation circuits that are suitable for ultrahigh-speed communications.”
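
To see roughly how those pieces fit together, here is a small illustrative calculation; the 16-QAM order is an assumption made for the sake of the example, not the team's published parameter:

    import math

    # How the 100 Gbps figure fits into the 275-305 GHz band described above.
    band_ghz = 305 - 275                 # about 30 GHz of usable spectrum
    per_channel_gbps = 10                # per-channel rate quoted in the article
    channels = 100 // per_channel_gbps
    print(f"{channels} x {per_channel_gbps} Gbps channels inside {band_ghz} GHz of spectrum")

    # QAM packs several bits into each symbol, which is why the symbol rate per
    # channel can stay well below the bandwidth available to it.
    QAM_ORDER = 16                       # assumption: 16-QAM, i.e. 4 bits per symbol
    bits_per_symbol = math.log2(QAM_ORDER)
    print(f"~{per_channel_gbps / bits_per_symbol:.1f} Gbaud per channel at {QAM_ORDER}-QAM")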

However the practical problems of operating a wireless network in the THz band should not be underestimated and crucially we’re not told precisely what distance would be needed to achieve 100Gbps. For comparison, the speed of 56Gbps using the 72-100GHz band was only achieved over a distance of just 10cm (centimetres), but that wasn’t the same setup.

On the other hand the new approach could provide another way for devices, such as different parts of a computer, to communicate without needing to run a physical link between each section. Similarly it might potentially be used as a new means of connecting devices that are in close proximity, such as a TV and Tablet in your living room.

Sadly the THz band is presently only available for research purposes, although its future allocation is due to be discussed at the 2019 World Radiocommunications Conference (WRC). But it’s worth pointing out that the terahertz region also sits close to the part of the spectrum associated with lasers, so there may be health considerations too.
http://www.ispreview.co.uk/index.php...ng-300ghz.html





Breakthrough Enables Downloads 50,000 Times Faster Than 'Superfast' Broadband
Hannah Francis

Imagine you could download the entire Game of Thrones series in high-definition in a fraction of a second.

That dream is one step closer to reality after British researchers simulated download speeds 50,000 times faster than 'superfast' 24 megabits per second (Mbps) broadband, breaking a world record.

The University College London team achieved speeds of 1.125 terabits per second (Tbps), the highest throughput ever recorded using a single receiver.

To put that in perspective, the National Broadband Network is promising to deliver download speeds of 25Mbps to all Australians by 2020. One terabit is a million megabits, making the speeds in the study 45,000 times faster than the NBN target speed.

Other commercial networks — and other countries — have faster speeds than what NBN is promising, but nothing like what the researchers have achieved.

"While current state-of-the-art commercial optical transmission systems are capable of receiving single channel data rates of up to 100 gigabits per second (or 100,000Mbps ), we are working with sophisticated equipment in our lab to design the next generation core networking and communications systems that can handle data signals at rates in excess of 1Tbps," said the project's lead researcher Dr Robert Maher.

The UCL team used 15 super-fast optical fibre channels and a single receiver, applying coding techniques commonly used to compress signals over Wi-Fi but not yet widely used in fibre communications.
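
Working backwards from the headline figure, using only the numbers reported here:

    # What the 1.125 Tbps record implies per optical channel.
    total_tbps = 1.125
    channels = 15
    per_channel_gbps = total_tbps * 1000 / channels
    print(f"~{per_channel_gbps:.0f} Gbps per channel")  # roughly 75 Gbps each

    # The record is for a single receiver decoding all 15 channels at once as
    # one "super-channel", rather than for the speed of any individual channel.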

"This ultimately resulted in us achieving the greatest information rate ever recorded using a single receiver," Dr Maher said.

The channels were grouped together to create one "super channel", which the researchers believe will be the way forward for the internet as the world's demand for data and speed explodes.

"Super-channels are becoming increasingly important for core optical communications systems, which transfer bulk data flows between large cities, countries or even continents," Dr Maher said.

The team is now testing the setup over longer distances to see how the speeds stack up in the real world, where data can experience distortion due to the sometimes thousands of kilometres it travels via optical fibres.

The research is part of broader work looking at how to improve internet speeds using fibre optic cables to support the infrastructure needed for the growing use of cloud and e-health services, and the so-called internet of things.

At last count in 2014, Australians' hunger for data jumped 33 per cent in the year, surpassing an exabyte — more than 9 million terabits.

That jump may prove to have been even bigger for 2015, with the proliferation of popular internet video streaming services like Netflix in the period. Many internet service providers also began to offer special deals with unmetered data for streaming services.
http://www.smh.com.au/technology/inn...11-gms3bd.html





ISPs Want “Flexible” Privacy Rules that Let them “Innovate” with Customer Data

ISPs should be able to choose how they protect customer data, they tell FCC.
Jon Brodkin

Broadband industry lobby groups urged the Federal Communications Commission on Thursday not to impose privacy rules that dictate "specific methods" of protecting customer data, since that would prevent "rapid innovation."

ISPs should have "flexibility" in how they protect customers' privacy and security, said the letter from the American Cable Association, Competitive Carriers Association, Consumer Technology Association, CTIA, the Internet Commerce Coalition, the National Cable & Telecommunications Association, and USTelecom. Together, these groups represent the biggest home Internet service providers and wireless carriers such as Comcast, AT&T, Verizon, Time Warner Cable, Charter, Sprint, T-Mobile, and many smaller ones.

"Rules dictating specific methods quickly become out of date and out of step with constantly changing technology, and will only hamper innovation and harm consumers," they wrote.

The debate stems from the FCC's decision to reclassify fixed and mobile broadband providers as common carriers under Title II of the Communications Act. The FCC has said it intends to enforce Section 222 of Title II, which requires telecommunications carriers to protect the confidentiality of customers' proprietary information. But since the commission's existing privacy rules apply to telephone service rather than broadband, the FCC has to draw up new rules for Internet service. The phone rules protect personal information such as the numbers customers call and when they call them.

Under Section 222, the FCC could impose some version of its Customer Proprietary Network Information (CPNI) rules on broadband providers. The ISPs' letter tried to discourage the FCC from doing so, noting that the commission's Title II decision is still undergoing judicial review. Broadband industry groups complained about potential new privacy rules in their lawsuit against the FCC's Title II reclassification, saying that CPNI rules could force them to create processes to ensure that customer data is not used in marketing without customer approval.

Although ISPs and wireless carriers already have to follow CPNI rules when they sell telephone service, they want more flexibility in broadband.

The lobby groups used some form of the word "innovate" 10 times in yesterday's letter. Specific privacy rules, they said, could "create consumer confusion and stifle innovation." Stronger privacy rules could also make it hard for ISPs to "innovate and compete" or to develop "innovative new business models" and "innovative products and services."

If the FCC does adopt Section 222 rules, it should make them similar to the Federal Trade Commission's privacy regulations, which "combine strong protections for consumers with flexibility that allows for rapid innovation," the letter said.

"Under the FTC regime, all companies in the Internet ecosystem must ensure that their privacy and data security practices are neither deceptive nor unfair," the lobby groups wrote. "As a result, consumers are protected and all companies that collect consumer data should be able to innovate and adapt to the inevitable changes in technology and the market for online services."

ISPs also have incentives to protect customer data because otherwise they might not "earn and maintain their customers' loyalty," the groups said.

ISPs are fighting against an effort led by consumer advocacy groups such as the American Civil Liberties Union, the Electronic Frontier Foundation, Free Press, and Public Knowledge. Those groups, plus a few dozen more, last month urged the FCC to make its privacy rules stronger than the FTC's.

Even FTC Commissioner Julie Brill has said she welcomes the FCC becoming a "brawnier cop on the privacy beat," the consumer advocacy groups noted in their letter.

The FCC's rules should "protect consumers from having their personal data collected and shared by their broadband provider without affirmative consent, or for purposes other than providing broadband Internet access service," the groups said. "The proposed rules should also provide for notice of data breaches, and hold broadband providers accountable for any failure to take suitable precautions to protect personal data collected from users. In addition, the rules should require broadband providers to clearly disclose their data collection practices to subscribers, and allow subscribers to ascertain to whom their data is disclosed."
http://arstechnica.com/business/2016...rt-innovation/





Senate Sends Sweeping Trade Enforcement Bill to Obama
Jackie Calmes

The Senate gave overwhelming final approval Thursday to the most comprehensive overhaul of customs law in decades, giving presidents new tools to combat unfair trade, yet falling short of bipartisan demands for penalties against other nations that manipulate their currencies.

Senators voted 75 to 20 for the Trade Facilitation and Trade Enforcement Act, a blend of bills the House and Senate passed separately nearly a year ago. The bipartisan compromise was reached by negotiators in December, with White House support, and was quickly approved by the House. Senate action stalled partly because of an unrelated dispute over taxing Internet sales.

The White House, in a statement, hailed the bill’s passage as “an important milestone in our overall trade agenda,” and said President Obama would sign it into law “to help strengthen enforcement of the rules and level the playing field for American workers and businesses.”

Senator Orrin G. Hatch, Republican of Utah and chairman of the Senate Finance Committee that is responsible for trade issues, called the package “a major step forward in advancing a robust agenda for international trade that better reflects the realities of the 21st-century global economy.”

The trade enforcement package took shape last year as a way to persuade trade skeptics in Congress to support a separate measure renewing presidents’ bolstered authority to make trade deals. The idea was to assure legislators that the federal government would have enhanced powers to act against trade violations once Mr. Obama completed negotiations on the 12-nation Trans-Pacific Partnership accord.

Congress did narrowly approve so-called trade-promotion authority for Mr. Obama and his successor, but the enforcement package is not expected to similarly improve Mr. Obama’s low odds of winning congressional approval this year of the Pacific agreement, which was concluded in October.

The administration, in its statement, reiterated its objection to a provision that the United States make it a condition of trade deals that other nations not join a movement to boycott Israeli businesses operating in Israeli-occupied Palestinian territories. While the administration opposes the so-called Boycott, Divestment and Sanctions movement against Israel, it objected that the language of the trade bill — by covering the occupied territories — “contravenes longstanding U.S. policy” against Israel’s construction of settlements there.

In the main, the trade measure defines a new process for the Customs and Border Protection service to act quickly against foreign businesses that evade anti-dumping laws and American duties on imports, or that traffic in counterfeit goods. It also addresses a loophole that allows some imports made with forced or child labor to get through customs, and includes new protections for businesses’ intellectual property rights.

Among the strongest supporters were the National Association of Manufacturers and the National Retail Federation; both served notice to lawmakers that this would be among the “key votes” when they assess whom to support at election time.

But labor unions, led by the A.F.L.-C.I.O., were adamantly opposed. Labor had supported last year’s Senate version of the trade bill, which mandated import duties against countries found to manipulate their currencies to make their exports cheaper to American consumers.

That penalty was dropped, however, from the final bill, and House Republicans won other concessions that infuriated unions. The new language, proposed by Mr. Hatch and two Democrats – Senator Michael Bennet of Colorado and Senator Thomas R. Carper of Delaware – calls for the president to seek “enhanced bilateral engagement” with suspected manipulators, and to keep violators out of trade pacts with the United States.

Critics dismissed the language as toothless. China and Japan are considered the worst offenders at manipulating currency.

Most of the 20 senators who opposed the trade bill were Democrats, though more than half of Democrats – led by Senator Ron Wyden of Oregon, the senior Democrat on the Senate Finance Committee – strongly backed the final compromise.

Unions, joined by senators from both parties, also objected to a provision unrelated to trade. It would prohibit state and local governments from taxing Internet access. That, the critics said, would close off a revenue source to those governments as they seek funds for public services.

But Senator Richard J. Durbin, Democrat of Illinois, and others got a promise from the majority leader, Senator Mitch McConnell of Kentucky, to allow a vote this year on a bill allowing states to tax sales of retailers that sell only online. Supporters say online sellers have a competitive advantage over “brick and mortar” stores that must collect sales taxes.
http://www.nytimes.com/2016/02/12/bu...-approval.html





Google Reverses Its Decision To Ban Ad Blocking Apps From The Google Play Store
Sarah Perez

Google appears to have reversed its earlier decision to ban ad blockers from the Google Play store – a move which had seen the company pulling apps like Adblock Fast and stalling the updates for others, like Crystal’s ad blocker. Now, following an appeal from Rocketship, the developers behind Adblock Fast, Google has re-approved and republished its app to Google Play.

The decision represents a change in course for Google, regarding its position on what sort of apps the company will allow in its app store for Android devices.

From a source with knowledge of the situation, TechCrunch learned at the time of the original decision that Google had planned to only support mobile browsers that could block ads – including those with built-in ad-blocking features, like the Adblock Plus browser, as well as those that supported ad blocking via extensions, as with Firefox, Javelin, and Dolphin browsers.

However, Google had decided that standalone ad blocking apps distributed via APKs, like Crystal and Adblock Fast, would not be permitted under its new guidelines.

Those apps and others had emerged following Samsung’s introduction of ad blocking support within its own mobile web browser in early February. The feature worked a lot like how Apple’s Safari supports ad blocking. That is, third-party developers can take advantage of Samsung’s new Content Blocker API, which allows them to build apps that work with the browser to block ads and other unwanted content that can slow down web pages, like trackers.
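
In spirit, such blockers match each outgoing request against filter rules supplied by the third-party app. A minimal sketch of that idea follows; the patterns are made up and this is not Samsung's actual API:

    import re

    # Toy request filter: block URLs that match any rule in the list.
    BLOCK_PATTERNS = [
        r"://ads\.",           # generic ad subdomains
        r"/tracking/pixel",    # tracker endpoints
        r"doubleclick\.net",
    ]

    def should_block(url: str) -> bool:
        return any(re.search(pattern, url) for pattern in BLOCK_PATTERNS)

    print(should_block("https://ads.example.com/banner.js"))    # True
    print(should_block("https://example.com/article.html"))     # False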

But there was already some indication that Google may have been debating its decision ahead of Adblock Fast’s reinstatement, as Google’s policy was being inconsistently applied.

While Adblock Fast was pulled, Crystal was merely slowed down, for example – Google blocked its app update from going through, citing the same “violation” of its Android Developer Distribution Agreement as the reason.

In Section 4.4, the company informs third-party developers that they cannot interfere with “the devices, servers, networks, or other properties or services of any third-party including, but not limited to, Android users, Google or any mobile network operator.”

As no further guidance was offered, affected developers said they understand Google’s request to mean they should not interfere with Samsung’s web browser.

That decision obviously didn’t make sense as Samsung’s release of its ad blocking API indicated that was exactly what the company wanted third-party developers to do.

TechCrunch understands from a person familiar with the situation at Google that the company will now allow apps that integrate with one another through authorized channels, like APIs, but will continue to prohibit apps on Google Play that interfere with the functionality of other apps in an unauthorized manner.

Rocketship says it submitted an appeal last Monday – the day its app update was rejected – which was before the app itself was pulled from the store on Tuesday.

Today, Rocketship explains what happened since. The company says it received an email on Friday noting that Google had accepted its appeal, then the app was republished this morning.

In addition to being back on Google Play, Rocketship was also able to update the app to version 1.1.0, which now extends support back to Android 4.0 and clarifies the onboarding experience for those users who didn’t yet have Samsung’s web browser installed.

As Google flip-flopped on its decision, other ad blocker makers remained more or less untouched by the shifting policy. For instance, Adblock Plus, which released a version of its ad blocker for Samsung Browser, tells us they never heard from Google nor had their app pulled.

Crystal also remains online. Its developer, Dean Murphy, tells us he appealed a few times, as well, and Google has just now accepted his update.

Google declined to comment.
http://techcrunch.com/2016/02/09/goo...le-play-store/





Wired Is Launching an Ad-Free Website to Appease Ad Blockers

Readers can pay $3.99 for a four-week subscription to a version of its site without advertising.
Joshua Brustein

More than 1 in 5 people who visit Wired Magazine’s website use ad-blocking software. Starting in the next few weeks, the magazine will give those readers a choice: stop blocking ads, pay to look at a version of the site that is unsullied by advertisements, or go away. It’s the kind of move that was widely predicted last fall after Apple allowed ad-blocking in the new version of its mobile software, but most publishers have shied away from it so far.

Wired plans to charge $3.99 for four weeks of ad-free access to its website. In many places where ads appear, the site will simply feature more articles, said Mark McClusky, the magazine’s head of product and business development. The portion of his readership that uses ad blockers is likely to be receptive to a discussion about their responsibility to support the businesses they rely on for information online, McClusky said.

There are legitimate reasons that people use ad blockers, according to McClusky, like a desire to speed up web browsing or not wanting to be tracked online. But Wired has bills to pay. “I think people are ready to have that conversation in a straightforward way,” he said.

The magazine’s editors are explaining the move in a note to readers:

"At WIRED, we believe that change is good. Over the past 23 years, we’ve pushed the boundaries of media, from our print magazine to launching the first publishing website. We even invented the banner ad. We’re going to continue to experiment to find new ways to bring you the stories you love and to build a healthy business that supports the storytelling. We hope you’ll join us on this journey. We’d really appreciate it."

This idea didn't originate with Wired. Many publishers have been flirting with a subscribe-or-see-ads model. Google has even offered a way for websites to accept donations in exchange for ad-free experiences, although it hasn’t gotten much uptake. So far, though, the fear of alienating readers has outweighed the fear of losing revenue to ad blockers.

MediaRadar, a company that makes software for advertising sales departments, recently found evidence of anti-ad-blocking activity at only four percent of large online publishers.

And while most of the attention, and the wave of new ad blockers, came in the wake of Apple's decision to let iPhone owners use the software, Wired says almost all of its ad-blocking activity happens on desktop, not mobile.

Conde Nast, which publishes Wired, has been conducting gentle experiments with anti ad-blocking approaches for months with a small percentage of its readers. This is the first time that everyone who visits Wired will be subject to the actual rules of ad-blocking. McClusky couldn’t say whether the subscription price would offset the average monthly advertising revenue Wired brings in from each reader, but subscription services generally generate more revenue per user than those supported by advertising. McClusky describes the $1 weekly subscription price as an opening volley, and says the magazine may tweak it as it sees how people respond.

The tricky part for online publishers is that they’ve been offering their content without directly charging for it for long enough that readers aren’t accustomed to the idea that they should pay for it. Newer companies like Pandora and Spotify have offered a choice between subscription or advertising with little blowback from users. “Making it apparent to the customer that there’s a choice is a good thing,” said Todd Krizelman, the chief executive of MediaRadar.

While ad blockers have raised the scary possibility that the public may be rejecting advertising altogether, the experiences of the music services point in exactly the opposite direction. Their biggest challenge has been convincing enough people to pay to go ad-free.
http://www.bloomberg.com/news/articl...se-ad-blockers





Why Stack Overflow Doesn’t Care About Ad Blockers
Steve Feldman

Hi! I’m Steve Feldman, Senior Ad Ops Manager at Stack Overflow. My whole life I’ve been fascinated by advertising. Even as a kid, I wondered what a company was trying to tell me by using one word instead of another in an ad. Over time, I developed a strong (read: subjective) opinion of what makes an ad ’good,’ and what makes an ad ’bad.’ It took me many years-- in fact, it wasn’t until joining the Ad Sales team at Stack Overflow-- to finally figure out that the common thread shared by the best ads is relevance. Maintaining that relevance is how we’ve managed to avoid one of the biggest issues facing publishers today: ad blocking.

What’s the deal with ad blockers?

At this point, it’s pretty clear that ad blocking is a big deal. A recent study suggesting the advertising industry is set to lose over $22 billion in 2015 alone as a result of ad blockers is setting off alarm bells. That is a LOT of money. Companies are scrambling to ‘fix’ the ad blocking problem, as active users of ad blocking utilities hit nearly 200 million. But it’s not just that tiny stop sign in the toolbar raising alarms. Apple caused a panic when they announced that iOS9 would permit the use of ad blockers, as many see mobile ads as an important piece of revenue for the industry.

First, the ad industry went up in arms over ad blocking, offering suggestions like developing ways to deliver specific ads to users employing ad blockers. Then, they considered going after Apple when it announced iOS 9 would permit ad blockers. Later, they began asking users to turn off their ad blockers as a sign of good faith. That did not go so well for some. Finally, they prevented Ad Block Plus from attending an industry event. Through all of this, those of us at Stack Overflow sighed and shrugged our shoulders. Clearly, many in the industry just don’t get it. Publishers can’t win by forcing ads -- especially low-quality ads -- in people’s faces. But some in the industry do get it. Eyeo (the company behind Adblock Plus) outlined in their ‘Acceptable Ads Manifesto’ some strong ideas for how to improve digital advertising-- not to mention the IAB’s L.E.A.N. Ads program. While there is criticism for both of these solutions, the positive takeaway is that powerful organizations are finally moving toward addressing the problem. Reddit is proactive in their public outreach with their ads. Quartz is trying new and interesting ways to engage with users, to mixed reviews. I’m going to toot our own horn by saying that Stack Overflow started doing these things a long time ago via numerous channels on our Meta sites for both Stack Overflow and Stack Exchange.

Ads at Stack Overflow

The display ads team grew from just two in 2012 to nine today. In that time, traffic on SO tripled, and sales have grown with it. One of the attractions for new hires on our team is the unique relationship we have with our users and the challenge that represents for a salesperson. We entered into an agreement with Stack Overflow users long ago that we wouldn’t subject them to low-quality ads. Think scantily-clad women selling flight deals, weight-loss supplement promos or wacky waving inflatable arm-flailing tube-men promoting car dealerships. But really: anything that doesn’t speak specifically to the Stack Overflow audience is not permitted. We also don’t accept rich media like animated ads, expandable ads, or video, which are the norm for most publishers today. This strict policy means we leave money on the table, but our team wants to protect Stack Overflow from those kinds of ads, as they run the risk of alienating that established trust.

Salespeople and campaign managers on our team do much more than they do at other companies. They’re more involved with a campaign from start to finish. From explaining to a new client how reputation works to working with ad ops to suggest a shift to a new and popular tag like [tag:swift] because it is attracting many new users. This may seem irrelevant to the ad blocking debate, but it’s not. It encourages edification and awareness for people who otherwise would have little or none, which in turn breeds respect and appreciation. This works for a new member of our team much better than simply saying ‘Stack Overflow is important for reasons x, y, and z.’ And they grow to learn over time what IDEs and SDKs are, and it’s remarkable to watch. This acquisition of knowledge really just means that our team cares about keeping ads useful and relevant on Stack Overflow.

We don’t care!

The truth is: we don’t care if our users use ad blockers on Stack Overflow. More accurately: we hope that they won’t, but we understand that some people just don’t like ads. Our belief is that if someone doesn’t like them, and they won’t click on them, any impressions served to them will only annoy them-- plus, serving ads to people who won’t click on them harms campaign performance. That focus on relevance and performance arrives early in the QA process. Whether it’s our sales people explaining that ads must have borders, or our campaign managers checking landing pages to ensure they adequately inform, we are thorough.

An important part of the QA process is ensuring that not just the creative, but the advertiser is relevant to our audience. Every single ad to appear on any of our sites is vetted by the operations team.* We check copy and content on the ads as well as the landing pages. What we repeatedly ask ourselves in this QA process is quite simple: is this relevant to users? ‘Kiss your hosting problems goodbye’ with a provocative image is not something we want on our sites, and I’m sure our users don’t either. The purpose of this heavy QA is to ensure that our users get the most out of their experience on Stack Overflow. The content is helpful-- why can’t the ads be the same?

[Caption for an example ad image, not reproduced here: This ad tries to be relevant, but falls on its face. Also it’s fake. Sorry, future herpetologists.]

The Value of Valuing User Experience

User experience is always on our minds. Indeed, others believe that putting user experience ahead of revenue is a path toward long-term growth for publishers. As the chief revenue officer of The Washington Post said, “...the product experience has to be every bit as good as the content.” Our approach is in harmony with that belief, as we keep ads confined to certain areas, and we permit users to downvote or close ads that they don’t like.* This allows users control over their experience. QA, curating content and advertisers, and a consideration of the user experience have been successful tools preventing ad blockers from hindering our growth.

The recent resizing of the sidebar from the non-standard, completely made-up (for reasons unknown to us) 220x250 to the industry-standard 300x250 went through thorough research prior to launch. The problem boiled down to this: we wanted to increase the sidebar size, but wanted to ensure that the content wouldn’t be harmed in any way. Bret and the ad server team dug in and investigated the screen size of every user across the Stack Exchange network and concluded that only about 2% of users would be affected by the change. As a result, we proceeded confident that our new increase in size would be a net gain for all involved.
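
A toy version of that kind of analysis might look like the sketch below; the sample widths and the required width are invented, and as noted above the real network-wide answer came out at about 2%:

    # Fraction of visitors whose viewport is too narrow for an 80px-wider sidebar.
    sample_widths = [2560, 1920, 1680, 1600, 1440, 1366, 1366, 1280, 1280, 1152]  # made-up sample
    REQUIRED_WIDTH = 1100   # assumption: content column + 300px sidebar + padding

    affected = sum(1 for w in sample_widths if w < REQUIRED_WIDTH)
    print(f"{affected / len(sample_widths):.0%} of sampled visitors affected")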

Stack Overflow’s mission to make the world a better place for developers remains a central tenet for the Display Ads team. Everyone on our team considers the impact on our users in most of our primary functions. This dedication to keeping content relevant and beneficial to users is what makes the Big Scary Ad Blocking Problem not so big or scary for us. We want advertising to benefit our users and be a resource, not an eyesore. We want advertising on Stack Overflow to be better for our users and advertisers than anywhere else. I’m proud of what we’ve accomplished so far, and I’m excited to see what’s next.

*Special props to the sales support team, who manage these functions for the Careers sales team. This QA process used to go through the ops team on the Display Ads side, but it became too much for an army of two to handle. Sales support now consists of seven excellent support specialists.

By Steve Feldman, Senior Ad Ops Manager
https://blog.stackoverflow.com/2016/...t-ad-blockers/





DLL Hijacking Issue Plagues Products like Firefox, Chrome, iTunes, OpenOffice

Oracle patches Java installer against DLL hijacking issue
Catalin Cimpanu

Oracle has released new Java installers to fix a well-known security issue (CVE-2016-0603) that also affects a plethora of other applications, from Web browsers to antivirus products, and from file compressors to home cinema software.

The problem is called DLL hijacking (or DLL side-loading) and refers to the fact that malware authors can place a DLL bearing the same name as one a legitimate application expects to load in specific locations on the target's filesystem, causing the application to inadvertently load the malicious DLL instead of the safe one.
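
In case it helps to see what a fix on the vendor side looks like: the usual mitigation is for the installer to lock down the DLL search path before doing anything else, so a rogue DLL dropped next to the downloaded .exe is simply never picked up. Below is a minimal sketch in C using the documented Win32 calls SetDllDirectoryW and SetDefaultDllDirectories; it is illustrative only, not Oracle's (or anyone else's) actual patch.

    #include <windows.h>

    #ifndef LOAD_LIBRARY_SEARCH_SYSTEM32
    #define LOAD_LIBRARY_SEARCH_SYSTEM32 0x00000800  /* missing from older SDK headers */
    #endif

    /* Call this before the installer loads anything else. */
    static void harden_dll_search_path(void)
    {
        /* Drop the current directory (where an attacker's DLL would sit)
           from the default DLL search order. */
        SetDllDirectoryW(L"");

        /* Where the OS supports it (Windows 8+, or Windows 7 with KB2533623),
           restrict implicit DLL loads to the system directory. Resolved at
           run time so the installer still starts on older systems. */
        typedef BOOL (WINAPI *set_default_dirs_fn)(DWORD);
        set_default_dirs_fn set_default_dirs = (set_default_dirs_fn)
            GetProcAddress(GetModuleHandleW(L"kernel32.dll"),
                           "SetDefaultDllDirectories");
        if (set_default_dirs != NULL)
            set_default_dirs(LOAD_LIBRARY_SEARCH_SYSTEM32);
    }

    int main(void)
    {
        harden_dll_search_path();
        /* ...the rest of the installer; any DLL it genuinely needs should be
           loaded by its absolute path under the system directory. */
        return 0;
    }

The underlying bug in the products listed below is not in the DLLs themselves but in trusting whatever directory the installer happens to be launched from, typically the user's Downloads folder.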

DLL hijacking is a very well-known issue

This type of attack is very old and has long been known to software vendors, and especially to malware authors, who sometimes prefer it because it lets them hijack legitimate applications rather than rely on convincing users to double-click and execute a malicious binary of their own.

If you've been keeping an eye on infosec sites like Packet Storm, SecLists, or Security Focus, you'll have noticed that German security researcher Stefan Kanthak has been quite busy testing the installers of various software products against this vulnerability.

Here's a short (probably incomplete) list of applications that he found vulnerable to this attack: Firefox, Google Chrome, Adobe Reader, 7Zip, WinRAR, OpenOffice, VLC Media Player, Nmap, Python, TrueCrypt, and Apple iTunes.

Mr. Kanthak also seems to have paid special attention to antivirus software installers. Here are some of the security products he discovered vulnerable to DLL hijacking: ZoneAlarm, Emsisoft Anti-Malware, Trend Micro, ESET NOD32, Avira, Panda Security, McAfee Security, Microsoft Security Essentials, Bitdefender, Rapid7's ScanNowUPnP, Kaspersky, and F-Secure.

Oracle was the first to take his reports seriously, patching Java and VirtualBox

According to a blog post from last Friday, February 5, Oracle decided to release new versions for its Java 6, 7, and 8 installers that protect users from this type of attack.

"Java SE users who have downloaded any old version of Java SE prior to 6u113, 7u97 or 8u73 for later installation should discard these old downloads and replace them with 6u113, 7u97 or 8u73 or later," said the company in its announcement.

Besides the updates to the Java SE installer, the company also addressed the same type of issue (tracked as CVE-2016-0602) in its VirtualBox installer as part of its quarterly security update last month.

Since it's pretty hard to track bug reports across the many vendors affected by this issue, we've emailed Mr. Kanthak to ask whether vendors other than Oracle have addressed it so far. We'll update the article with his response.

UPDATE: Mr. Kanthak has told Softpedia that "most of the companies/vendors I contacted patched their products." Rapid7 went so far as to withdraw its ScanNow product altogether.

"Some of the companies/vendors which did not reply to my reports in the first place contacted me after they became aware of the [public disclosure] posts and fixed their installers, or are working on a fix now," Mr. Kanthak also added.

Additionally, there are other software products for which Mr. Kanthak has reported vulnerabilities but not yet published a public disclosure; those companies are now working on fixes.
http://news.softpedia.com/news/dll-h...e-500060.shtml





Obama Seeks Over One-Third Rise in U.S. Cyber Security Funding
Dustin Volz

President Barack Obama's budget proposal for the 2017 fiscal year seeks $19 billion for cyber security across the U.S. government, a surge of $5 billion over this year, according to senior administration officials.

The request comes as the Obama administration has struggled to address the growing risk posed by criminals and nation states in the digital world.

The initiative, to be released later on Tuesday, is more than a one-third increase from the $14 billion sought last year and will include $3.1 billion for technology modernization at various federal agencies.

It is unclear whether the Republican-controlled Congress will approve the increase.

The request for a cash infusion is the latest signal from the White House that it intends to make cyber security a top priority in the last year of Obama’s presidency.

The move follows a series of high-profile hacks against the government and companies like Sony Pictures and Target, which were largely met with legislative inaction and administrative uncertainty about how best to address evolving cyber threats.

Those difficulties played out publicly last year when the Office of Personnel Management announced it had fallen victim to a massive hack that lifted sensitive information on roughly 22 million individuals from its databases.

The White House will also announce Tuesday plans for a presidential commission on cyber security, which will make recommendations on how to strengthen defenses over the next decade. Officials, who briefed reporters before the formal release of the Obama budget, said they would create a new position of federal chief information security officer.

A government watchdog report last month concluded that the government’s cyber defense system, known as Einstein, is ineffective at combating hackers.

“No matter how good we get, we will never stop 100 percent of intrusions,” Michael Daniel, special assistant to the president and cybersecurity coordinator, told reporters in the briefing before the release of the budget plan.

Obama will also sign an executive order Tuesday to create a permanent Federal Privacy Council, which aims to connect privacy officials across the government to develop comprehensive guidelines for how personal data is collected and stored.

The president’s budget proposal will also call for $62 million to expand efforts to attract and retain qualified cyber professionals working for the government, with things like student loan forgiveness and the creation of a CyberCorps Reserve program, where Americans can obtain college scholarships if they pursue technical jobs in government.

(Reporting by Dustin Volz; Editing by Richard Cowan and Andrew Hay)
http://uk.reuters.com/article/us-oba...-idUKKCN0VI0R1





McCain Pushes for Encryption Legislation in Fight Against ISIS
Katie Bo Williams

Sen. John McCain (R-Ariz.) is calling for legislation that would require tech firms to build their products in such a way that they can crack open encrypted content in response to legal requests from authorities.

"By taking advantage of widely available encryption technologies, terrorists and common criminals alike can carry out their agendas in cyber safe havens beyond the reach of our intelligence agency tools and law enforcement capabilities. This is unacceptable," the Senate Armed Services chairman writes in a Bloomberg op-ed.

McCain’s proposal would not dictate “what those systems should look like.” Instead, it would require “technological alternatives” to end-to-end encryption, which prevents even the manufacturer from accessing communications.

“This would allow companies to retain flexibility to design their technologies to meet both their business needs and our national security interests,” McCain said.
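
For context, "end-to-end" means the only keys capable of decrypting a message live on the users' devices, so there is nothing the provider could hand over even under court order. Here is a minimal sketch of that property using the libsodium crypto_box API; the library choice is ours for illustration and is not something McCain's proposal or any particular messaging product specifies.

    #include <sodium.h>
    #include <stdio.h>

    int main(void)
    {
        if (sodium_init() < 0)
            return 1;

        /* Each user generates a keypair on their own device; the secret keys
           never leave those devices and are never seen by the service provider. */
        unsigned char alice_pk[crypto_box_PUBLICKEYBYTES], alice_sk[crypto_box_SECRETKEYBYTES];
        unsigned char bob_pk[crypto_box_PUBLICKEYBYTES], bob_sk[crypto_box_SECRETKEYBYTES];
        crypto_box_keypair(alice_pk, alice_sk);
        crypto_box_keypair(bob_pk, bob_sk);

        /* Alice encrypts to Bob's public key. The relaying service only ever
           sees the nonce and ciphertext, neither of which can recover the plaintext. */
        const unsigned char msg[] = "hello bob";
        unsigned char nonce[crypto_box_NONCEBYTES];
        unsigned char ciphertext[crypto_box_MACBYTES + sizeof msg];
        randombytes_buf(nonce, sizeof nonce);
        crypto_box_easy(ciphertext, msg, sizeof msg, nonce, bob_pk, alice_sk);

        /* Only Bob's secret key, held on his device, can open the message. */
        unsigned char plaintext[sizeof msg];
        if (crypto_box_open_easy(plaintext, ciphertext, sizeof ciphertext,
                                 nonce, alice_pk, bob_sk) != 0)
            return 1;
        printf("Bob reads: %s\n", plaintext);
        return 0;
    }

Any "technological alternative" that lets a company decrypt on demand means a copy of a key, or an extra recipient, has to exist somewhere outside those two devices, which is precisely the added attack surface the companies and cryptologists quoted later in this article object to.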

The proposal comes with lawmakers increasingly divided on the need for legislation to address encryption technology.

The top two members of the House Intelligence Committee said last week that they have not made any decisions about endorsing a bill regulating encryption standards.

“I don’t think we’re any closer to a consensus on that than we were, I think, six months ago,” Rep. Adam Schiff, the committee’s top Democrat, said at a Christian Science Monitor breakfast. “Or if there is a consensus, it is that a legislative solution, I think, is very unlikely.”

Following the deadly terrorist attacks in San Bernardino, Calif., and Paris, fears that terrorists were using encryption technology to plan attacks beyond the reach of U.S. surveillance prompted a number of lawmakers to call for new legislation.

Senate Intelligence Committee Chairman Richard Burr (R-N.C.) is working on a bill with his committee’s ranking member, Sen. Dianne Feinstein (D-Calif.), that would force companies to decrypt data under court order.

But tech companies and cryptologists have pushed back, arguing that providing any guaranteed access to law enforcement opens up the day-to-day functions of the Internet — like banking — to hackers.

“There have been people that suggest that we should have a backdoor. But the reality is if you put a backdoor in, that backdoor's for everybody, for good guys and bad guys,” Apple CEO Tim Cook said in a December interview with “60 Minutes.”

Last fall, Apple rejected a court order to turn over communications sent using its iMessage feature, citing its encryption system.

McCain alluded to those concerns, but insisted “this is not the end of the analysis.”

“We recognize there may be risks to requiring such access, but we know there are risks to doing nothing,” McCain writes.

He compared his proposal to wiretap laws enacted in the 1990s that required telecommunications providers to “enable law enforcement officials to conduct electronic surveillance pursuant to court order,” but did not dictate the technology’s design.

Some lawmakers have taken a more measured approach. House Homeland Security Committee Chairman Michael McCaul (R-Texas) and Sen. Mark Warner (D-Va.) worry that a bill like Burr and Feinstein’s would weaken encryption.

They’re pushing legislation that would establish a national committee to study the topic first, then present potential suggestions to Congress about how police could get at encrypted data without endangering Americans’ privacy or security.

McCain echoed FBI Director James Comey, who in recent months has sought to recast the question of how to provide access to encrypted data as a business challenge, not a technological one.

“We have to encourage companies and individuals who rely on encryption to recognize that our security is threatened, not encouraged, by technologies that place vital information outside the reach of law enforcement,” McCain wrote.

“Developing technologies that aid terrorists like Islamic State [in Iraq and Syria] is not only harmful to our security, but it is ultimately an unwise business model.”
http://thehill.com/policy/cybersecur...t-against-isis





U.S. Lawmakers Seek to Bar States from Mandating Encryption Weaknesses
Dustin Volz

U.S. House of Representatives lawmakers will introduce bipartisan legislation on Wednesday that would prohibit states from requiring tech companies to build encryption weaknesses into their products.

The move marks the latest foray into an ongoing debate over encryption between Silicon Valley and Washington. While tech companies generally oppose weakened security standards, federal authorities have warned about a "going dark" phenomenon in which criminal suspects use powerful encryption in their communications so that investigators cannot access a phone's content, even with a warrant.

The ENCRYPT Act, sponsored by Democratic Representative Ted Lieu and Republican Blake Farenthold, would prevent any state or locality from mandating that a “manufacturer, developer, seller, or provider” design or alter the security of a product so it can be decrypted or surveilled by authorities, according to bill text viewed by Reuters.

The legislation is in response to proposals in recent months in New York and California that would require companies to be able to decrypt their smartphones manufactured after 2017, Lieu said.

"It is completely technologically unworkable for individual states to mandate different encryption standards in consumer products," Lieu told Reuters in an interview. "Apple (AAPL.O) can't make a different smartphone for California and New York and the rest of the country."

It is unclear how much momentum the bill will have in the House, though the chamber has staked out positions sympathetic to digital privacy in recent years.

Encryption has been an area of disagreement between tech companies and law enforcement authorities for decades, but it gained renewed scrutiny after Apple and Google (GOOGL.O) began offering strong encryption by default on their products in 2014.

FBI Director James Comey told a Senate panel on Tuesday that federal investigators have still been unable to access the contents of a cellphone belonging to one of the killers in the Dec. 2 shootings in San Bernardino, California, because of encryption.

But technology companies, privacy advocates and cryptographers say any mandated vulnerability would expose data to hackers and jeopardize the overall integrity of the Internet.

A study from the Berkman Center for Internet and Society at Harvard University released last month, citing some current and former intelligence officials, concluded that fears about encryption are overstated in part because new technologies have given investigators unprecedented means to track suspects.

(Editing by Richard Cowan and Matthew Lewis)
http://uk.reuters.com/article/us-usa...-idUKKCN0VJ0VI





U.S. Can't Ban Encryption Because it's a Global Phenomenon, Harvard Study Finds
Patrick Howell O'Neill

After a two-year campaign from the FBI, U.S. intelligence officials, and powerful politicians calling for backdoor access into Americans’ encrypted data, a new Harvard study argues that encryption is a worldwide technology that the United States cannot regulate and control on its own.

The study, titled “A Worldwide Survey of Encryption Products,” aimed to catalog all the encryption products available online today. Researchers identified 546 encryption products from developers outside the U.S., a number representing two-thirds of the 865 that are available worldwide.

The point of the research is clear: There’s a whole world of cryptography outside the United States. Any U.S. law that mandates so-called “backdoors” in encryption technology—Sen. Richard Burr (R-N.C.) is currently writing a bill that may do just that—will just push the business outside American borders.

The migration has already begun. Silent Circle, an encrypted communications company started in America, made the move to Switzerland in 2014 to avoid American government attempts to access their data.

Open-source projects that have their code freely available online and whose developers and supporters are spread out across the world may be more “jurisdictionally agile” and able to move toward countries, like the Netherlands, that disavow backdoors, the study found.

The new study is authored by independent and Harvard researchers Bruce Schneier, Kathleen Seidel, and Saranya Vijayakumar.

The researchers expect non-U.S. tech companies to take advantage of any anti-encryption policy to come out of America.

“The potential of an NSA-installed backdoor in U.S. encryption products is rarely mentioned in the marketing material for the foreign-made encryption products,” the study explains. “This is, of course, likely to change if U.S. policy changes.”

“If U.S. products are all backdoored by law, I guarantee you stuff coming out of Finland is going to make a big deal of that,” Schneier told the Daily Dot.

Despite pretensions about the superiority of American-made technology, non-U.S. encryption products are just as good as American-made software, the study concluded.

“Cryptography is very much a worldwide academic discipline, as evidenced by the quantity and quality of research papers and academic conferences from countries other than the U.S. Both recent [National Institute of Standards and Technology] encryption standards—AES and SHA-3—were designed outside of the US, and the 4 submissions for those standards were overwhelmingly non-US. Additionally, the seemingly endless stream of bugs and vulnerabilities in US encryption products demonstrates that American engineers are not better than their foreign counterparts at writing secure encryption software. Finally, almost all major U.S. software developers have international teams of engineers, both working in the U.S. and working in non-U.S. offices.”

FBI Director James Comey, the leading voice in the campaign against strong encryption, agrees with some of what the study concludes. Last year, Comey said the solution to “going dark” was to construct a legal regime spanning North America, Europe, and China that requires tech companies to build backdoors for governments into their products, effectively acknowledging that changes to U.S. law are not enough to stymie increased use of encrypted technology.

Thanks to stalwart rhetoric from Comey and other state and federal authorities, the debate over encryption has reached a new intensity in the last year. Across the divide, a virtual consensus of technologists from academia, industry, and civil society argue that backdoors into encryption will harm both the cybersecurity and privacy rights of Americans.

"So let me be crystal clear: Weakening encryption or taking it away harms good people who are using it for the right reason," Apple CEO Tim Cook, one of the most vocal and powerful defenders of encryption, said in 2015.

Even former NSA chief Michael Hayden stands firmly against government backdoors into encryption.

But powerful figures like Comey and Manhattan District Attorney Cyrus Vance, Jr. have been vocal opponents of the rising popularity of encryption.

Apple's “unilateral decision” to encrypt iPhones will harm American national security by allowing “homegrown violent extremists and terrorists to communicate with each other, to send messages without law enforcement being able to identify what they’re saying,” Vance argued last year.

Because they encrypt data on the device, he added, iPhones are going to be "the terrorist community's device of choice."
http://www.dailydot.com/politics/wor...tion-products/





UK Politicians Green-Light Plans to Record Every Citizen's Internet History

But recommend that no encryption backdoors should be installed
James Vincent

Surveillance legislation proposed by the UK last November has been examined in detail by the country's politicians, with a new report recommending 86 alterations, but broadly approving the powers requested by the government. The parliamentary committee scrutinizing the draft Investigatory Powers Bill said that companies like Apple and Facebook should not be required to decrypt messages sent on their services, but approved plans to record every UK citizen's browsing history for 12 months. The committee also gave a thumbs up to the bulk retention of data, and the targeted hacking of individuals' computers, known as "equipment interference."

Confusing wording like "data includes any information that is not data"

The Investigatory Powers Bill will be the first legislation to fully codify digital surveillance in the UK, and has been dubbed the "snoopers' charter" by critics (a name used to refer to similar laws rejected a few years ago). The Bill has been attacked by ISPs, privacy advocates, the UN, and the world's largest tech companies, with critics agreeing that the Bill is being rushed into law and that its wording is confusing. Critics point to portions of the law like the statement that "data includes any information that is not data." The UK's home secretary and the Bill's principal architect, Theresa May, later explained that this was supposed to refer to things like paper.

This latest report repeats these complaints, stressing the need for clarity in the Bill's language. However, it also gives its approval to a number of controversial items. The report's authors say that bulk interception and surveillance should be "fully justified" in a rewrite of the legislation, and note that although these powers might contravene the EU's right to privacy, "security and intelligence agencies would not seek these powers if they did not believe they would be effective." This is despite the fact that this sort of mass surveillance (already in place, of course, just not officially legislated) has often proven ineffective, as with last year's terrorist attacks in Paris.

Similarly, the committee found no faults with the government's plans to force ISPs to store users' web history for 12 months at a time. This information (known as Internet Connection Records or ICRs) would be available to police without a warrant, with the report noting: "We heard a good case from law enforcement and others about the desirability of having such a scheme. We are satisfied that the potential value of ICRs could outweigh the intrusiveness involved in collecting and using them."

Evidence submitted to the committee pointed out that these records would reveal "sensitive information" about citizens' political, religious, and sexual preferences, as well as their health and daily activities, while ISPs noted that storing this data securely would be a "technical challenge." Experts also testified to the difficulty of sorting this data, as many apps like Facebook and Twitter keep a near-constant connection to the internet, and internet users can access sites they're not aware of. One expert noted that he created a blog with a "tiny one-pixel image in the corner" that showed up as Pornhub.com in visitors' internet history.

Good news for American tech giants

By comparison, the committee were much more wary of the UK's desire to access encrypted data, including chat logs from apps like Apple's iMessage and Facebook's WhatsApp. "The Government still needs to make explicit on the face of the Bill that [internet companies] offering end-to-end encrypted communication or other un-decryptable communication services will not be expected to provide decrypted copies of those communications if it is not practicable for them to do so."

Facebook, Microsoft, Google, Yahoo, and Twitter all submitted evidence to the committee saying the proposed legislation would be harmful, impacting individuals' privacy while emboldening more authoritarian regimes like Russia and China to demand similar access to users' data. Apple submitted evidence separately, although CEO Tim Cook also took the time to personally criticize the Bill, saying: "If you halt or weaken encryption, the people that you hurt are not the folks that want to do bad things. It’s the good people. The other people know where to go."
http://www.theverge.com/2016/2/11/10...ttee-criticism





FBI Director Says Investigators Unable to Unlock San Bernardino Shooter's Phone Content
Dustin Volz and Mark Hosenball

FBI Director James Comey said on Tuesday that federal investigators have still been unable to access the contents of a cellphone belonging to one of the killers in the Dec. 2 shootings in San Bernardino, California, due to encryption technology.

Comey told the Senate Intelligence Committee that the phenomenon of communications "going dark" due to more sophisticated technology and wider use of encryption is "overwhelmingly affecting" law enforcement operations, including investigations into murder, car accidents, drug trafficking and the proliferation of child pornography.

"We still have one of those killer's phones that we have not been able to open," Comey said in reference to the San Bernardino attack.

Syed Rizwan Farook, 28, launched the Islamic State-inspired attack with his wife, Tashfeen Malik, 29, at a social services agency in the California city, leaving 14 dead.

Comey and other federal officials have long warned that powerful encryption poses a challenge for criminal and national security investigators, though the FBI director added Tuesday that "overwhelmingly this is a problem that local law enforcement sees."

Technology experts and privacy advocates counter that so-called "back door" access provided to authorities would expose data to malicious actors and undermine the overall security of the Internet.

A study from the Berkman Center for Internet and Society at Harvard, released last month and citing some current and former intelligence officials, concluded that fears about encryption are overstated in part because new technologies have given investigators unprecedented means to track suspects.

Senator Ron Wyden, an Oregon Democrat, asked Director of National Intelligence James Clapper to provide a declassified response to the Berkman study within 60 days. Clapper agreed to the request.

The White House last year abandoned a push for legislation that would mandate U.S. technology firms to allow investigators a way to overcome encryption protections, amid rigorous private sector opposition. But the issue has found renewed life after the shootings in San Bernardino and Paris.

Senators Richard Burr and Dianne Feinstein, the Republican and Democratic leaders of the intelligence panel, have said they would like to pursue encryption legislation, though neither has introduced a bill yet.

(Reporting by Dustin Volz and Mark Hosenball; editing by Sandra Maler and G Crosse)
http://www.reuters.com/article/us-ca...-idUSKCN0VI22A





US SPY CHIEF: We Might Hack Your Fridge to Spy on You

Spies might use smart fridges and other internet-connected devices in the home to spy on you, US intelligence chief James Clapper has admitted.

Speaking to the US Senate on Tuesday, Clapper publicly acknowledged — for the first time, The Guardian reports — that intelligence agencies might take advantage of the new possibilities presented by having computers built into ever-more home appliances.

"In the future, intelligence services might use the [internet of things] for identification, surveillance, monitoring, location tracking, and targeting for recruitment, or to gain access to networks or user credentials," he said.

So a fridge that tells you when you're out of milk might sound pretty cool. But spies could hack it to track your movements in your own home, or as a stepping stone for hacking into other devices on your network.

American intelligence figures have spoken about the Internet of Things' vulnerabilities — and its opportunities for spooks — before. "'Transformational' is an overused word, but I do believe it properly applies to these technologies," then-CIA chief David Petraeus said in 2012, "particularly to their effect on clandestine tradecraft."

A recent study published by the Berkman Center for Internet and Society highlighted exactly this. Taking issue with frequent claims from law enforcement that evidence is "going dark" due to increasing use of encryption, it argues that increasing numbers of devices present ever-more opportunities for surveillance.

"We’re questioning whether the ‘going dark’ metaphor used by the FBI and other government officials fully describes the future of the government’s capacity to access communications," Berkman fellow and cryptographer Bruce Schneier said. "We think it doesn’t. While it may be true that there are pockets of dimness, there other areas where communications and information are actually becoming more illuminated, opening up more vectors for surveillance."

"Appliances and products ranging from televisions and toasters to bed sheets, light bulbs, cameras, toothbrushes, door locks, cars, watches and other wearables are being packed with sensors and wireless connectivity," the report says. "The audio and video sensors on IoT devices will open up numerous avenues for government actors to demand access to real-time and recorded communications."
http://www.newstimes.com/technology/...on-6820088.php

















Until next week,

- js.



















Current Week In Review





Recent WiRs -

February 6th, January 30th, January 23rd, January 16th

Jack Spratts' Week In Review is published every Friday. Submit letters, articles, press releases, comments, questions etc. in plain text English to jackspratts (at) lycos (dot) com. Submission deadlines are Thursdays @ 1400 UTC. Please include contact info. The right to publish all remarks is reserved.


"The First Amendment rests on the assumption that the widest possible dissemination of information from diverse and antagonistic sources is essential to the welfare of the public."
- Hugo Black
Old 13-02-16, 01:14 PM   #2
multi
Thanks for being with arse
 
multi's Avatar
 
Join Date: Jan 2002
Location: The other side of the world
Posts: 10,343
Default

good stuff Jack!
that article about the Section 230 law is pretty important
I wonder if the TPP might be a danger to that?
__________________

i beat the internet
- the end boss is hard
Old 13-02-16, 09:28 PM   #3
malvachat
My eyes are now open.
 
malvachat's Avatar
 
Join Date: Jan 2004
Location: Oxford uk
Posts: 1,409
Default

I look forward to this news page every week.
Takes about an hour to get though.
But I have lots of time now.
I love being retired.
I don't miss work one bit.
__________________
Beer is for life not just Christmas
Old 15-02-16, 07:57 AM   #4
JackSpratts
 
JackSpratts's Avatar
 
Join Date: May 2001
Location: New England
Posts: 10,013
Default

tpp is a danger multi, and here's to its demise.

happy to oblige mal. and here's to geezer duffers everywhere, me included.

- js.
__________________
Thanks For Sharing