P2P-Zone > Peer to Peer

07-05-14, 07:23 AM   #1
JackSpratts
Join Date: May 2001
Location: New England
Posts: 10,013

Peer-To-Peer News - The Week In Review - May 10th, '14

Since 2002

"There doesn’t seem to be any empirical evidence that either blocking websites or sending harsh notices to customers ... does anything to reduce the incidence of piracy. Show me the evidence." – Steve Dalby

May 10th, 2014




Online Piracy Crackdown Looms
Madeleine Heffernan

Media industry lobbying against online piracy is intensifying, amid growing expectations the Abbott government will move shortly to clamp down on internet service providers and the national pastime.

Fairfax Media has been told that federal cabinet will consider two proposals to crack down on illegal downloads as early as this week.

One would require internet service providers to issue warnings to people who repeatedly download illegally. The other would force ISPs to block file-sharing websites such as The Pirate Bay.

The government has promised to make "significant" changes to Australia's copyright laws as a first-term commitment, although a spokesman for Arts Minister and Attorney General George Brandis said there was no firm timetable for this. The topic is also battling for attention ahead of the federal budget.

Senator Brandis has warned that the government could legislate if a voluntary industry code of practice for ISPs isn't agreed. He has argued that ISPs "need to take some responsibility" for illegal downloading, because they "provide the facility which enables this to happen".

The ALP, which unsuccessfully sought a voluntary scheme while in government, said it would examine any policy proposal put forward. But it said there was no single solution and the government was yet to "put forward a coherent policy proposal".

"Labor supports the freedom of internet users, while also recognising that the rights of artists and copyright holders need to be protected," a spokeswoman for shadow attorney general Mark Dreyfus said.

News Corp Australia, half owner of pay TV company Foxtel, told Fairfax Media that copyright infringement "hurts the creative community - it undermines investment, employment, business models and innovation.

"We support the Attorney General's approach, and while there isn't a silver bullet, evidence from overseas suggests that such initiatives do work," spokesman Stephen Browning said.

Australians are among the biggest pirates per capita. Debate continues about whether this is driven by opportunism, the delays for overseas content to reach the country, or an aversion to its higher prices.

Justin Diddams, media analyst at Citi, said last week that the "increased volume of pirated content consumption is demand driven, more out of necessity" than "some deep ingrained convict desire to steal".

Estimates of how much piracy costs content holders also vary, depending on how many illegal downloads are counted as forgone sales.

ISP iiNet in 2012 won a four-year legal battle against 34 parties including Village Roadshow, Disney Enterprises and Dreamworks Films, relating to whether it was responsible for its users’ illegal downloads. The High Court ruled that iiNet had not authorised copyright infringements.

iiNet chief regulatory officer Steve Dalby told Fairfax Media that ISPs should not be held responsible for "protecting the rights of American companies" and that the changes outlined above could cost "in the order of tens of millions".

"There doesn't seem to be any empirical evidence that either blocking websites or sending harsh notices to customers ... does anything to reduce the incidence of piracy. Show me the evidence," Mr Dalby said.

"As a secondary issue, if we are convinced that it actually will reduce the level of piracy, then we need to talk about who is going to pay for it."

The Communications Alliance, the industry body for ISPs, has also argued that rights holders should fund the cost of any scheme, and ensure that content is available quickly and affordably.

Telco giant Telstra said it "continue[d] to stand willing to engage in constructive discussion with industry and government to help address online piracy through means which balance the interests of all stakeholders including our customers and shareholders".

Harold Mitchell, chairman of commercial TV industry body Free TV, said Australia needs to "consistently search for solutions that enable it to continue to develop its own products."
http://www.smh.com.au/business/onlin...505-37r3g.html





Deal to Combat Piracy in UK with 'Alerts' is Imminent
Dave Lee

After years of wrangling, a deal between entertainment industry bodies and UK internet service providers to help combat piracy is imminent.

BT, Sky, TalkTalk and Virgin Media will send "educational" letters to customers believed to be downloading illegally.

But a document seen by the BBC shows that rights holders are set to make do with considerably weaker measures than originally asked for.

The first letters - known as "alerts" - are expected to be sent out in 2015.

The deal has been struck with the BPI, which represents the British music industry, and the Motion Picture Association (MPA), which covers film.

The bodies had originally suggested the letters should tell repeat infringers about possible punitive measures.

They also wanted access to a database of known illegal downloaders, opening the possibility of further legal action against individuals.

However, following almost four years of debate between the two sides, the final draft of the Voluntary Copyright Alert Programme (Vcap) contains neither of those key measures.

Steve Kuncewicz, a lawyer specialising in online and internet law, said the agreement had been "watered down beyond all recognition".

"I imagine the content owners are going to be very angry about it," he added.

"There's no punitive backstop to any of this."

Softened

Instead, letters sent to suspected infringers must be "educational" in tone, "promoting an increase in awareness" of legal downloading services.

The rights holders have agreed to pay £750,000 towards each internet service provider (ISP) to set up the system, or 75% of the total costs, whichever is smaller.

A further £75,000 (or 75% of total costs) will be paid each year to cover administration costs.
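
That cost-sharing formula is easy to pin down precisely. As a minimal sketch of the contribution rule (Python; the figures are the ones reported above, and the function name is ours, not from the agreement):

    def rightsholder_contribution(total_cost_gbp: float, cap_gbp: float) -> float:
        """Rights holders pay 75% of an ISP's costs, capped at 750,000 GBP
        for setup and 75,000 GBP per year for administration."""
        return min(0.75 * total_cost_gbp, cap_gbp)

    # A 1.2m GBP setup bill hits the 750k cap; an 800k bill does not.
    print(rightsholder_contribution(1_200_000, 750_000))  # 750000.0
    print(rightsholder_contribution(800_000, 750_000))    # 600000.0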

The BBC understands the UK's remaining ISPs are likely to join the agreement at a later stage.

It is expected that the deal will be finalised soon. The UK's data regulator, the Information Commissioner's Office, is advising the parties about the collection of data about which customers receive alerts.

A record of which accounts had received alerts, and how many, will be kept on file by the ISPs for up to a year.

The rights holders will receive a monthly breakdown of how many alerts have been sent out, but only the ISPs will know the identities of the customers involved.

A cap for the total number of alerts that can be sent out per year has also been set.

Between the four ISPs, no more than 2.5 million alerts can be sent out. As and when other ISPs join the agreement, this cap will be adjusted.

A maximum of four alerts - by either email or physical letter - can be sent to an individual customer account. Language will "escalate in severity" - but will not contain threats or talk of consequences for the accused users.

After four alerts, no further action will be taken by the ISPs.

Bitter argument

When it comes to piracy, the two sides have been locked - with the government playing occasional mediator - in discussion since the controversial introduction of the Digital Economy Act in 2010.

Enacted just before the end of the last Labour government, the DEA included various measures to deal with copyright infringement including, but not limited to, cutting off repeat offenders from the internet.

But ISPs, and various internet rights groups, put up vocal opposition - arguing that it would force ISPs to police their users, while raising questions about whether internet access should be considered a human right.

In response to frustration from content industries that the measures were taking too long to enforce, the Department for Culture, Media and Sport suggested that a voluntary agreement be put forward.

But as recently as September 2013, the proposals were described as "unworkable" by ISPs, and so the final draft required considerable compromise, mostly on the part of the content industry groups.

In a joint statement, the BPI and MPA said: "Content creators and ISPs, with the support of government, have been exploring the possibility of developing an awareness programme that will support the continuing growth of legal creative content services, reduce copyright infringement and create the best possible customer experience online."

The groups declined to comment specifically on the leaked document. The four ISPs involved all confirmed that they were in discussion.

In the meantime, the content creators have been successful in having more than 40 websites blocked so that users in the UK could not reach them unless using technical workarounds.

The alert system is planned to run for three years - with regular reviews on its effectiveness.

The agreement states that an ineffective system would lead rights holders to call for the "rapid implementation" of stronger measures, as outlined in the Digital Economy Act.

On the new plans, a source close to the discussions told the BBC: "The rights holders have accepted they can't use this process to go after individuals.

"The ISPs have insisted that already established, legal routes are used in that scenario. Instead, the purpose of any campaign will be to inform and raise awareness rather than punitive action."

How the system will work

• Rights holders identify IP addresses of devices or locations they believe to be downloading files illegally. This is done through a variety of methods, including "listening in" to traffic on BitTorrent networks - a common practice
• A "Copyright Infringement Report", known as a CIR, will be sent to the ISP involved. It will contain the IP address as well as the date and time the alleged infringement took place
• The ISP will then match that report to a customer account it knows was connected to the internet from that IP address at the time of the download
• It will then send out an alert to that customer about the behaviour, either as an email or a physical letter
• Identifying users in this way is not an exact science. Sometimes, the ISP will be unable to determine which customer account was being used at the time.
• Furthermore, in the alerts, no individual person will be directly accused - as a single IP address could be used by several people at a time, or even, to use one example, by someone using a neighbour's wifi without their knowledge.

Source: Voluntary Copyright Alert Programme
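
To make the flow above concrete, here is a minimal sketch in Python. It is not Vcap's actual implementation: the CIR fields, the session log and the four-alert cap are modelled on the description in this article, and every name in it is hypothetical.

    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class CIR:
        """Copyright Infringement Report: IP address plus date and time."""
        ip: str
        timestamp: datetime

    # Hypothetical ISP session log: (ip, session start, session end, account)
    SESSIONS = [
        ("203.0.113.7", datetime(2015, 3, 1, 18, 0),
         datetime(2015, 3, 1, 23, 59), "acct-1001"),
    ]

    ALERTS_SENT = {}   # account -> alerts already sent (kept for up to a year)
    MAX_ALERTS = 4     # after four alerts, no further action is taken

    def match_account(cir):
        """Match a CIR to the account using that IP at that time (may fail)."""
        for ip, start, end, account in SESSIONS:
            if ip == cir.ip and start <= cir.timestamp <= end:
                return account
        return None  # identification is not an exact science

    def process_cir(cir):
        account = match_account(cir)
        if account is None:
            return  # unable to determine the account; no alert is sent
        count = ALERTS_SENT.get(account, 0)
        if count >= MAX_ALERTS:
            return  # cap reached: the ISP takes no further action
        ALERTS_SENT[account] = count + 1
        print(f"Educational alert {count + 1} of {MAX_ALERTS} sent to {account}")

    process_cir(CIR("203.0.113.7", datetime(2015, 3, 1, 20, 30)))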

http://www.bbc.com/news/technology-27330150





Google Blocks Filesharing Website Demonoid Over Malware Downloads

Relaunched site blames advertisers and removes all ads, but anti-piracy campaigners will be overjoyed
Stuart Dredge

Anyone searching Google for the site, which relaunched earlier this year after a lengthy period offline, sees a message warning that "This site may harm your computer".

If they decide to click through, they are taken to a Google "Malware Warning" page rather than the website, suggesting that they return to Google and pick another search result, or continue at their own risk. The page also points to a diagnostic report explaining why Demonoid has been blocked.

"Of the 78 pages we tested on the site over the past 90 days, 7 page(s) resulted in malicious software being downloaded and installed without user consent," explains that page, citing malicious software hosted on the adv-inc-net.com domain, as well as "intermediaries for distributing malware" tlvmedia.com and torrpedo.net.

The block goes beyond Google's search engine, too. Users of its Chrome web browser are also being warned off Demonoid. "The website ahead contains malware!" explained a warning page encountered by The Guardian today when trying to visit the site using Chrome.

"Google Chrome has blocked access to demonoid.ph for now. Even if you have visited this website safely in the past, visiting it now is very likely to infect your Mac with malware."

Demonoid is still accessible through Apple's Safari browser, but is blocked by Mozilla's Firefox, which shows its own "Reported Attack Page! This web page at demonoid.ph has been reported as an attack page and has been blocked based on your security preferences" warning page when users try to access the site.

According to technology news site TorrentFreak, Demonoid is blaming advertisers on its site for the detected malware. "We run content from a lot of ad networks in our ad banners, and a lot of banners from each," it explained in a statement.

"One of those banners started serving malware, so we disabled all ads until we are 100% sure of the culprit and get it removed. We are also taking the proper steps to get us out of all the blacklists."

Anti-piracy campaigners will be rubbing their hands with joy at the news, though. A recent report by British film, TV and video body The Industry Trust claimed that 90% of the top piracy sites in the UK contain malware or credit card scams.

The City of London Police's Intellectual Property Crime Unit (PIPCU), which is funded by the UK's Department for Business, Innovation and Skills, backed the study, which also claimed that 77% of people who had visited a pirate website had found "unwanted extras" including malware, spyware and intrusive pop-up ads.

"People need to know that by visiting copyright infringing websites they are running the risk of having their personal details stolen and used fraudulently, as well as exposing their computer to malicious malware and viruses," said PIPCU head DCI Andy Fyfe at the time.

News of Google's Demonoid block also comes at a time when PIPCU – in partnership with various creative industries – is trying to squeeze the flow of advertising revenue from respectable brands to filesharing sites. In April, PIPCU launched an Infringing Website List of sites on which it believes brands and agencies should ensure their ads aren't carried.

If this strategy is successful, then it may in turn force filesharing sites to look elsewhere for their banner ads: gambling and porn, for example. If this also brings more risk of malware-toting ads like those found on Demonoid, rightsholders will be sure to point this out regularly as an incentive for people not to frequent these sites.

Google's decision to block Demonoid could bring more pressure on the internet giant from those rightsholders, though. Music bodies like the BPI and IFPI have long lobbied Google to remove sites from its search index if they are consistently linked to piracy.

Until now, their strategy has focused on sending a barrage of takedown notices for infringing links on Google. The Demonoid news may encourage them to also conduct more research into malware on filesharing sites, and present those results to Google in an effort to get them blocked.
http://www.theguardian.com/technolog...cious-software





RightsCorp Wants To Bring Its Copyright Protection Methods To The UK

The US company is forcing ISPs to participate in the copyright crusade, but will this approach work in Europe?
Max Smolaks

Notorious US copyright enforcement firm RightsCorp is set on bringing its services – and controversial methods – to Europe and the UK.

The company, which works on behalf of copyright holders, specialises in monitoring popular torrent trackers and identifying the IP addresses of online pirates. It then forces Internet Service Providers (ISPs) to send letters to the owners of those addresses, demanding $20 (£12) for each illegally downloaded song or movie, with legal action threatened as the alternative.

“We are expanding in Canada first, but we are investigating a launch in Europe. I can’t give any specific dates, but we are getting a great reception from everyone we have spoken to [in the UK],” RightsCorp co-founder and CEO Robert Steele told TechWeekEurope.

“In Germany and the UK some copyright holders engage law firms and run large-scale campaigns to bring court orders and file lawsuits against file-sharers. We are sending a much simpler, less expensive communication that resolves the matter before it goes to court. That’s why we are optimistic that there will be a way to do this in Europe.”

Who doesn’t like money?

Since 2011, RightsCorp has achieved more than 60,000 court settlements in the US. To find its victims, the company directly monitors BitTorrent resources like IsoHunt or The Pirate Bay. The monitoring software simply connects to the peers as another file-sharer – once the connection has been established, it can see what files are being downloaded.

According to research from the University of Birmingham published in 2012, it takes just three hours from the start of a download for an average BitTorrent user to be monitored by a copyright enforcement agency.
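
Monitoring of this kind needs nothing more than the ordinary BitTorrent wire protocol. The sketch below (Python; the peer address and info-hash are hypothetical, and a real monitor would first obtain peer lists from a tracker or the DHT) performs only the standard 68-byte handshake, which is enough to confirm, and log, that an IP address is participating in a given swarm:

    import socket

    def in_swarm(peer_ip, peer_port, info_hash):
        """Attempt a standard BitTorrent handshake with a peer; a valid
        reply implies the peer is active in the torrent's swarm."""
        assert len(info_hash) == 20  # SHA-1 of the torrent's info dict
        msg = (bytes([19]) + b"BitTorrent protocol"  # pstrlen + pstr
               + bytes(8)                            # reserved flag bytes
               + info_hash
               + b"-MON0001-MONITOR0001")            # 20-byte peer_id (made up)
        try:
            with socket.create_connection((peer_ip, peer_port), timeout=5) as s:
                s.sendall(msg)
                reply = s.recv(68)  # a handshake reply is 68 bytes long
                # The peer confirms the swarm by echoing the same info_hash.
                return len(reply) == 68 and reply[28:48] == info_hash
        except OSError:
            return False

    # Usage (hypothetical peer and info-hash):
    # if in_swarm("198.51.100.23", 6881, bytes.fromhex("aa" * 20)):
    #     print("peer is in the swarm; log IP address and timestamp")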

Copyright holders are perfectly aware they are losing money due to online piracy, but they have no tools to fight this cultural phenomenon. And then RightsCorp comes along, offering to claw some of those losses back for free, since any expenses would be taken from the copyright infringers themselves.

Steele says that the US legislation places the responsibility for any file-sharing activity on the user and not the ISP, as long as the ISP had no knowledge that said file-sharing activity took place. And this is where it gets interesting – RightsCorp makes detailed reports and sends them to service providers, so they can no longer claim ignorance.

Under US law, they are then forced to take action, and thereby arguably betray their customers’ trust. As a result, the ISP is likely to lose some of its business. Meanwhile, RightsCorp takes half of the settlement money in fees – as much as the artists it claims to represent.

"We have sent FedEx packages to nearly every ISP in the United States with the login credentials for their RightsCorp dashboard, and we send them weekly emails with the list of all their subscribers suspected of infringement," Steele told us. "We are compelling them with the evidence that they have to do something to maintain their 'shield', or our clients can sue them."

A similar approach to copyright enforcement is already practiced in the UK by companies like Golden Eye, which legally challenged O2 broadband to disclose information on more than 900 of its customers in 2012.

It is important to note that the IP addresses and peer-to-peer sharing information provided by RightsCorp may not in fact constitute solid court evidence. Experts suggest that recipients of such copyright infringement letters keep calm, do not admit guilt and under no circumstances agree to settle. Agreeing to settle is an admission of guilt in this instance, and may prejudice future accusations.

Last month, Minister for Intellectual Property Lord Younger warned that the enforcement of intellectual property in the UK is going to get tougher. At the same time, by June the government is planning to introduce exceptions to copyright law which will allow consumers to make copies of legitimately acquired CDs, DVDs and e-books for personal use.
http://www.techweekeurope.co.uk/news...hods-uk-144925





Dropbox Scrambles To Block Leaks Of Shared Data

Dropbox and Box users are leaking private data through Google Analytics and Adwords, a competitor has revealed
Peter Judge

Dropbox has moved to fix a weakness that allows users of its service, along with those of its arch-rival Box, to accidentally leak private data to other web users. However, security experts have warned that the danger remains.

When users share a file on Dropbox or similar services they send a link intended for the reader alone, but there are two ways in which these links can be leaked to third parties, allowing them to access the files without any restriction.

The vulnerability was uncovered by Dropbox's competitor Intralinks, which stumbled across the problem and found links to documents including bank statements and mortgage applications during routine use of Google's AdWords and Analytics services. It says that traffic arriving at a website contains details of the link which sent it there. This can then be picked up by services such as Google Analytics, providing ways for shared Dropbox and Box documents to be leaked to other people.

Open the Dropbox!

“During a routine analysis of Google AdWords and Google Analytics data mentioning competitors’ names (Dropbox and Box), we inadvertently discovered the fully clickable URLs necessary to access these documents that led us to live folder contents, some with sensitive data,” said Intralinks.

Security expert Graham Cluley provides a more detailed explanation on his blog, detailing two ways users can leak a shared document URL to other people.

The first is via Google AdWords, and was noticed by Intralinks. "If a user, attempting to access the document that has been shared with them, puts the Share link into a search engine rather than their browser's URL box (an easy finger fumble to make)," says Cluley, "then the advertising server receives the Share link as part of the referring URL."

Intralinks bought an Adwords campaign around Dropbox and Box and when a user carried out a search including either of those words, they saw an advert for Intralinks in their browser. At the same time, Intralinks got details of the search terms as part of the referring URL. The aforementioned finger-fumble sent the direct Dropbox or Box link to Intralinks, giving the company access to a lot of personal files.

The other way data can leak is if users include links within shared documents. A shared document can contain links to third-party sites, and if users click those links from within the document, the destination sites receive the referring URL (the shared document's link) in their traffic statistics.
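
That second leak is ordinary HTTP behaviour: when a browser follows a link, it typically sends the page the click came from in the Referer header. A minimal sketch of what a linked third-party site could log (Python standard library only; a hypothetical local setup):

    from http.server import BaseHTTPRequestHandler, HTTPServer

    class RefererLogger(BaseHTTPRequestHandler):
        def do_GET(self):
            # If the visitor clicked through from a shared document,
            # the document's (secret) share URL shows up right here.
            referer = self.headers.get("Referer", "(none)")
            print(f"visit to {self.path} referred by: {referer}")
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"logged")

    # Any site linked from inside a shared document could log this.
    HTTPServer(("localhost", 8000), RefererLogger).serve_forever()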

Fixing the problem

The best way to avoid the problem is to not include any clickable links within any shared documents, and to restrict access to the document to only the people you are sharing it with.

Box allows users to restrict access, but the option is not activated by default, and Intralinks' experience shows it is often not used. People using the free version of Dropbox can't restrict access to links – they have to pay for the Business version.

Dropbox claims to have fixed the problem in a blog post, but Cluley says the job is not complete: “Dropbox has published a blog post overnight about the issue, saying it has taken action against the hyperlink disclosure vulnerability,” he said. “In other words, Dropbox has fixed one of the problems, but not the one which revealed the private documents to Intralinks.”
http://www.techweekeurope.co.uk/news...ed-data-144992





CertainSafe File Sharing Wins Coveted PCMag Editors' Choice Award
Press release

"CertainSafe is a great solution for businesses of any size, and the fact that it's certified for compliance with HIPAA and other standards can be a huge plus if your business is subject to those regulations. I'm impressed with the security possibilities of MicroEncrypt technology, and with the product's ease of use," writes Neil J. Rubenking of PCMag.

CertainSafe is a complete data security solution that features:

• no software to install; completely web-based
• ultra-secure file sharing and data storage
• Secure Messaging as an alternative to email
• Military grade shredding standards
• PCI Level 1 DSS certified data centers, for compliance with HIPAA, PCI, PFI, PHI, PII and others
• Brandable user interface

Read the review in its entirety at: http://www.pcmag.com/article2/0,2817,2455725,00.asp

CertainSafe's proprietary MicroTokenization technology means each file is MicroEncrypted down to the byte level, making a data breach mathematically improbable. Combined with multi-layer login security, CertainSafe is the most secure file sharing and data storage platform on the market today.

"We are thrilled to have a media outlet as respected as PCMag write such a positive review of CertainSafe, and especially honored to receive their Editors' Choice award," says Steven Russo, Executive Vice President of Transcertain, parent company of CertainSafe. "Businesses and consumers will find no safer file sharing and storage solution anywhere, bar none."

PCMag.com is the world's largest publisher of tech reviews and news reaching 1 in 5 online consumers in the United States and 1 in 10 globally.

Learn more about CertainSafe at http://www.certainsafe.com.
http://www.virtual-strategy.com/2014...s-choice-award





Europe's Cybersecurity Policy Settings Under Attack
AFP

Even as Europe powered up its most ambitious ever cybersecurity exercise this month, doubts were being raised over whether the continent's patchwork of online police was right for the job.

The exercise, called Cyber Europe 2014, is the largest and most complex ever enacted, involving 200 organizations and 400 cybersecurity professionals from both the European Union and beyond.

Yet some critics argued that herding together normally secretive national security agencies and demanding that they spend the rest of 2014 sharing information amounted to wishful thinking.

Others questioned whether the law enforcement agencies taking part in the drill should be involved in safeguarding online security, in the wake of American whistleblower Edward Snowden's revelations of online spying by western governments.

"The main concern is national governments' reluctance to cooperate," said Professor Bart Preneel, an information security expert from the Catholic University of Leuven, in Belgium.

"You can carry out all of the exercises you want, but cybersecurity really comes down to your ability to monitor, and for that, national agencies need to speak to each other all the time," Preneel said.

The Crete-based office coordinating the EU's cybersecurity, the European Union Agency for Network and Information Security (ENISA), calls itself a "body of expertise" and cannot force national agencies to share information.

As with most aspects of policing and national security, the EU's 28 members have traditionally been reluctant to hand over powers to a central organization, even when -- as in the case of online attacks -- national borders are almost irrelevant.

'Citizens and economy at risk'

Cyberattacks occur when the computer information systems of individuals, organizations or infrastructure are targeted, whether by criminals, terrorists or even states with an interest in disrupting computer networks.

The EU estimates that over recent years there has been an increase in the frequency and magnitude of cybercrime and that the attacks go beyond national borders, while the smaller-scale spreading of software viruses is also an increasingly complex problem.

The EU's vulnerability has been highlighted over recent years by a number of high-profile cyberattacks, including one against Finland's foreign ministry in 2013 and a network disruption of the European Parliament and the European Commission in 2011.

And with Europe's supply of gas from Russia focusing attention on energy security, the highly computerized "smart" energy grids which transport and manage energy in the EU are also seen as vulnerable.

Yet the view from Brussels is that the member states' reluctance to work together on cybersecurity amounts to "recklessness", with one EU source saying national governments were "happy to put their citizens and economy at risk rather than coordinate across the EU."

ENISA was established in 2001 when it became clear that cybersecurity in the EU would require a level of coordination. Unlike other EU agencies, ENISA does not have regulatory powers and relies on the goodwill of the national agencies it works with.

The agency is undaunted by its task, arguing that the simulations it stages every two years, taking in up to 29 European countries, are both effective and necessary in preparing a response to cyber-attacks.

This week's simulation created what ENISA described as "very realistic" incidents in which key infrastructure and national interests came under attack, "mimicking unrest and political crisis" and "disrupting services for millions of citizens across Europe."

Responsibility with industry

However, Amelia Andersdotter, a Swedish member of the European Parliament with the libertarian Pirate Party, is dismissive of both the exercise and the European online security model.

Andersdotter, along with a number of European experts, is calling for reforms to move responsibility for cybersecurity away from law enforcement agencies toward civilian bodies.

Their argument is that a civilian agency would be better placed to coordinate a response with industry, which Andersdotter argues has not done enough to safeguard cybersecurity.

At present, she told AFP, industry actors in software or infrastructure simply report cybercrime to authorities without being required to compensate or inform consumers.

A civilian authority would end what Andersdotter calls the "conspiracy of database manufacturers and law enforcement agencies" by placing greater responsibility with industry.

What most experts agree on is that European companies and consumers are vulnerable to cybersecurity threats, and that can have an impact on people's willingness to use online services.

James Wootton, from British online security firm IRM, said the ENISA exercises are a step in the right direction, but are not enough.

"The problem is nation states wanting to fight cybercrime individually, even when cybercrime does not attack at that level," Wootton says, arguing that national law enforcement agencies often lack the required resources.

"So it is good to look at this at the European level, but what power does ENISA have? What can they force countries to do?"

Eurostat figures show that, by January 2012, only 26 percent of EU enterprises had a formally defined information technology security plan in place.

One industry insider said the view in Brussels is that EU cybersecurity was "like teenage sex: everyone says they are doing it but not that many actually are."
http://www.securityweek.com/europes-...s-under-attack





Russia Quietly Tightens Reins on Web With ‘Bloggers Law’
Neil MacFarquhar

Russia has taken another major step toward restricting its once freewheeling Internet, as President Vladimir V. Putin quietly signed a new law requiring popular online voices to register with the government, a measure that lawyers, Internet pioneers and political activists said Tuesday would give the government a much wider ability to track who said what online.

Mr. Putin’s action on Monday, just weeks after he disparaged the Internet as “a special C.I.A. project,” borrowed a page from the restrictive Internet playbooks of many governments around the world that have been steadily smothering online freedoms they once tolerated.

The idea that the Internet was at best controlled anarchy and beyond any one nation’s control is fading globally amid determined attempts by more and more governments to tame the web. If innovations like Twitter were hailed as recently as the Arab uprisings as the new public square, governments like those in China, Pakistan, Turkey, Iran and now Russia are making it clear that they can deploy their tanks on virtual squares, too.

China, long a pioneer in using sophisticated technology to filter the Internet, has continually tightened censorship. It has banned all major Western online social media sites, including Facebook, Twitter, YouTube and Google, though it seems not to be bothered by Alibaba, its homegrown e-commerce site, which has filed the paperwork for what could be the biggest public stock offering ever.

Nevertheless, even Beijing’s own social media champion, Weibo, valued at $3.6 billion in a public stock offering this year, has come under mounting censorship pressure as the government fine-tunes its policing of expression.

Under the pressure of a corruption scandal, Turkey recently imposed bans on Twitter and YouTube over tapes alleging corruption by the country’s prime minister, Recep Tayyip Erdogan. Although the YouTube ban remains, Twitter service was restored in April only after the Constitutional Court overturned the ban.

During protests against the government in Venezuela in February, there were reports that the government there was blocking online images from users. In recent years, Pakistan has banned 20,000 to 40,000 websites, including YouTube, saying they offend Muslims. Facebook was blocked for a while in 2010, but is now accessible.

The level of challenge is rising, but “we also see the amount of resources going into censorship increasing greatly,” Jonathan Zittrain, a professor at Harvard Law School who specializes in Internet law, said in a telephone interview.

Widely known as the “bloggers law,” the new Russian measure specifies that any site with more than 3,000 visitors daily will be considered a media outlet akin to a newspaper and be responsible for the accuracy of the information published.

Besides registering, bloggers can no longer remain anonymous online, and organizations that provide platforms for their work such as search engines, social networks and other forums must maintain computer records on Russian soil of everything posted over the previous six months.

“This law will cut the number of critical voices and opposition voices on the Internet,” said Galina Arapova, director of the Mass Media Defense Center and an expert on Russian media law. “The whole package seems quite restrictive and might affect harshly those who disseminate critical information about the state, about authorities, about public figures.”

Mr. Putin has already used the pliable Russian Parliament to pass laws that scattered the opposition, hobbled nongovernmental organizations and shut down public protests. Now, riding a wave of popular support after hosting the Winter Olympics and annexing Crimea, he has turned his attention to regulating the Internet, as well as burnishing his credentials as the worldwide champion of conservative values.

Aside from the Internet law signed Monday, the Russian leader signed a new profanity law that levies heavy fines for using four common vulgarities in the arts, including literature, movies, plays and television.

Speaking in St. Petersburg in late April, Mr. Putin voiced his suspicions about the Internet, even while noting that it had become a public market of huge proportions.

“You know that it all began initially, when the Internet first appeared, as a special C.I.A. project,” he said in remarks broadcast live nationally, before adding that “special services are still at the center of things.” He specifically thanked Edward J. Snowden, the former National Security Agency contractor granted asylum in Russia, for revealing to the world how efficient the N.S.A. was at collecting information.

Mr. Putin went on to say that someone writing online whose opinion affects thousands or even hundreds of thousands of people should be considered a media outlet. He said he was not talking about a ban, only acting “the way it is done all over the world.”

Russian Internet pioneers despaired that Mr. Putin was really talking about the Chinese model of curtailing any political discussion online.

“It is part of the general campaign to shut down the Internet in Russia,” said Anton Nossik, an early online media figure here. “They have not been able to control it until now, and they think they should implement the Chinese model. But they don’t understand how it works. The Chinese model also stimulates the development of local platforms, while the Russian laws are killing the local platform.”

Russia is among a growing list of countries that have sought to shut down Internet voices circumventing a subservient national news media. Many leaders see the Internet as the key tool behind antigovernment demonstrations and are determined to render it ineffective.

Yet polls conducted in 24 countries last spring by Pew Research found that most people are against government censorship of the Internet, including 63 percent in Russia and 58 percent in Turkey.

Another Russian Internet law, one that went into effect on Feb. 1, gave the government the power to block websites. It immediately used the law against its most vocal critics, like Alexei Navalny and Garry Kasparov, as well as online news sites that reported on demonstrations and other political activity.

In April, Pavel Durov, the 29-year-old founder of Vkontakte, Russia’s popular version of Facebook, said he had fled the country because he feared the consequences of refusing to turn over information the government requested about activists in Russia and Ukraine. Critics said he had fled after cashing out, and United Capital Partners, the owner of a 48 percent stake in the company, posted a lengthy statement online saying he was trying to divert attention from legal issues surrounding his running of the company.

Aleksandr Zharov, who runs Roskomnadzor, the government agency that supervises the Internet, told the state-run RIA Novosti news agency last month that the law was necessary because people need to be held responsible for what they say on the web. “What he would never say face to face, he often allows himself online,” Mr. Zharov was quoted as saying.

The lack of transparency in Russia creates a kind of fog around countless issues, and the Internet is no different. Many critics and even some supporters of the new law said it was too vague to understand.

The Internet needs to be regulated by law just like publishing, said Robert A. Shlegel, among the youngest members of Parliament from United Russia, Mr. Putin’s party. But Internet savvy among legislators is weak, he added. “The law, as it is, is so raw,” he said. “It is clear that the person who wrote it just doesn’t understand.”

The law does not specify how the government will count the 3,000 daily visitors, for example. Even before Mr. Putin signed it, two of the largest blogging platforms, Yandex and LiveJournal, announced that henceforth their publicly visible counters would stop below 3,000.

Ms. Arapova said other murky issues included who would be considered a provider. For instance, will large international social media or search sites like Google, Twitter and Facebook have to keep their data in Russia or face fines and possible closing?

In California, both Twitter and Facebook said they were studying the law but would not comment further.

Ms. Arapova said the law would undoubtedly have a chilling effect in terms of who would go online. Whistle-blowers who work for corrupt government agencies, for example, would theoretically no longer be able to post anonymously.

The actual impact of the law will not be measurable until after it goes into effect on Aug. 1, Ms. Arapova said. If the law is actively enforced, punishments range from fines of up to $142,000 to the temporary closing of the blog.

Like the Internet law, the ban on four vulgar words was met with a combination of dismay and derision among artists. (The words, not mentioned in the law either, are crude terms for male and female genitalia, sex and a prostitute.) Many people thought it would be widely ignored, but the very idea that the Kremlin was trying to censor the arts rankled.

“We feel like we are back in kindergarten again when they said, ‘Don’t pee in your bed and don’t eat with your hands and don’t use that word,’ ” said Viktor V. Yerofeyev, a popular writer. “On the one hand, the Russian government says the Russian people are the best. On the other hand, it doesn’t trust the people.”


Reporting was contributed by Andrew Roth and Alexandra Odynova from Moscow; Tim Arango from Baghdad; Declan Walsh from London; Marjorie Connelly, Noam Cohen and Peter Lattman from New York; and Vindu Goel from San Francisco.
http://www.nytimes.com/2014/05/07/wo...ggers-law.html





US Government Begins Rollout of its 'Driver's License for the Internet'
Tim Cushing

An idea the government has been kicking around since 2011 is finally making its debut. Calling this move ill-timed would be the most gracious way of putting it.

A few years back, the White House had a brilliant idea: Why not create a single, secure online ID that Americans could use to verify their identity across multiple websites, starting with local government services. The New York Times described it at the time as a "driver's license for the internet."

Sound convenient? It is. Sound scary? It is.

Next month, a pilot program of the "National Strategy for Trusted Identities in Cyberspace" will begin in government agencies in two US states, to test out whether the pros of a federally verified cyber ID outweigh the cons.


The NSTIC program has been in (slow) motion for nearly three years, but now, at a time when the public's trust in government is at an all time low, the National Institute of Standards and Technology (NIST -- itself still reeling a bit from NSA-related blowback) is testing the program in Michigan and Pennsylvania. The first tests appear to be exclusively aimed at accessing public programs, like government assistance. The government believes this ID system will help reduce fraud and overhead, by eliminating duplicated ID efforts across multiple agencies.

But the program isn't strictly limited to government use. The ultimate goal is a replacement of many logins and passwords people maintain to access content and participate in comment threads and forums. This "solution," while somewhat practical, also raises considerable privacy concerns.

[T]he Electronic Frontier Foundation immediately pointed out the red flags, arguing that the right to anonymous speech in the digital realm is protected under the First Amendment. It called the program "radical," "concerning," and pointed out that the plan "makes scant mention of the unprecedented threat such a scheme would pose to privacy and free speech online."

And the keepers of the identity credentials wouldn't be the government itself, but a third party organization. When the program was introduced in 2011, banks, technology companies or cellphone service providers were suggested for the role, so theoretically Google or Verizon could have access to a comprehensive profile of who you are that's shared with every site you visit, as mandated by the government.


Beyond the privacy issues (and the hints of government being unduly interested in your online activities), there are the security issues. This collected information would be housed centrally, possibly by corporate third parties. When hackers can find a wealth of information at one location, it presents a very enticing target. The government's track record on protecting confidential information is hardly encouraging.

The problem is, ultimately, that this is the government rolling this out. Unlike corporations, citizens won't be allowed the luxury of opting out. This "internet driver's license" may be the only option the public has to do things like renew actual driver's licenses or file taxes or complete paperwork that keeps them on the right side of federal law. Whether or not you believe the government's assurances that it will keep your data safe from hackers, keep it out of the hands of law enforcement (without a warrant), or simply not look at it just because it's there, matters very little. If the government decides the positives outweigh the negatives, you'll have no choice but to participate.
http://www.techdirt.com/articles/201...internet.shtml





Exclusive: Emails Reveal Close Google Relationship with NSA

National Security Agency head and Internet giant’s executives have coordinated through high-level policy discussions
Jason Leopold

Email exchanges between National Security Agency Director Gen. Keith Alexander and Google executives Sergey Brin and Eric Schmidt suggest a far cozier working relationship between some tech firms and the U.S. government than was implied by Silicon Valley brass after last year’s revelations about NSA spying.

Disclosures by former NSA contractor Edward Snowden about the agency’s vast capability for spying on Americans’ electronic communications prompted a number of tech executives whose firms cooperated with the government to insist they had done so only when compelled by a court of law.

But Al Jazeera has obtained two sets of email communications dating from a year before Snowden became a household name that suggest not all cooperation was under pressure.

On the morning of June 28, 2012, an email from Alexander invited Schmidt to attend a four-hour-long “classified threat briefing” at a “secure facility in proximity to the San Jose, CA airport.”

“The meeting discussion will be topic-specific, and decision-oriented, with a focus on Mobility Threats and Security,” Alexander wrote in the email, obtained under a Freedom of Information Act (FOIA) request, the first of dozens of communications between the NSA chief and Silicon Valley executives that the agency plans to turn over.

Alexander, Schmidt and other industry executives met earlier in the month, according to the email. But Alexander wanted another meeting with Schmidt and “a small group of CEOs” later that summer because the government needed Silicon Valley’s help.

“About six months ago, we began focusing on the security of mobility devices,” Alexander wrote. “A group (primarily Google, Apple and Microsoft) recently came to agreement on a set of core security principles. When we reach this point in our projects we schedule a classified briefing for the CEO’s [sic] of key companies to provide them a brief on the specific threats we believe can be mitigated and to seek their commitment for their organization to move ahead … Google’s participation in refinement, engineering and deployment of the solutions will be essential.”

Jennifer Granick, director of civil liberties at Stanford Law School’s Center for Internet and Society, said she believes information sharing between industry and the government is “absolutely essential” but that “at the same time, there is some risk to user privacy and to user security from the way the vulnerability disclosure is done.”

The challenge facing government and industry was to enhance security without compromising privacy, Granick said. The emails between Alexander and Google executives, she said, show “how informal information sharing has been happening within this vacuum where there hasn’t been a known, transparent, concrete, established methodology for getting security information into the right hands.”

The classified briefing cited by Alexander was part of a secretive government initiative known as the Enduring Security Framework (ESF), and his email provides some rare information about what ESF entails, the identity of some participant tech firms and the threats they discussed.

Alexander explained that the deputy secretaries of the Department of Defense, Homeland Security and “18 US CEOs” launched the ESF in 2009 to “coordinate government/industry actions on important (generally classified) security issues that couldn’t be solved by individual actors alone.”

“For example, over the last 18 months, we (primarily Intel, AMD [Advanced Micro Devices], HP [Hewlett-Packard], Dell and Microsoft on the industry side) completed an effort to secure the BIOS of enterprise platforms to address a threat in that area.”

“BIOS” is an acronym for “basic input/output system,” the system software that initializes the hardware in a personal computer before the operating system starts up. NSA cyberdefense chief Debra Plunkett in December disclosed that the agency had thwarted a “BIOS plot” by a “nation state,” identified as China, to brick U.S. computers. That plot, she said, could have destroyed the U.S. economy. “60 Minutes,” which broke the story, reported that the NSA worked with unnamed “computer manufacturers” to address the BIOS software vulnerability.

But some cybersecurity experts questioned the scenario outlined by Plunkett.

"There is probably some real event behind this, but it's hard to tell, because we don't have any details," wrote Robert Graham, CEO of the penetration-testing firm Errata Security in Atlanta, on his blog in December. "It's completely false in the message it is trying to convey. What comes out is gibberish, as any technical person can confirm."

And by enlisting the NSA to shore up their defenses, those companies may have made themselves more vulnerable to the agency’s efforts to breach them for surveillance purposes.

“I think the public should be concerned about whether the NSA was really making its best efforts, as the emails claim, to help secure enterprise BIOS and mobile devices and not holding the best vulnerabilities close to their chest,” said Nate Cardozo, a staff attorney with the Electronic Frontier Foundation’s digital civil liberties team.

He doesn’t doubt that the NSA was trying to secure enterprise BIOS, but he suggested that the agency, for its own purposes, was “looking for weaknesses in the exact same products they’re trying to secure.”

The NSA “has no business helping Google secure its facilities from the Chinese and at the same time hacking in through the back doors and tapping the fiber connections between Google base centers,” Cardozo said. “The fact that it’s the same agency doing both of those things is in obvious contradiction and ridiculous.” He recommended dividing offensive and defensive functions between two agencies.

Two weeks after the "60 Minutes" broadcast, the German magazine Der Spiegel, citing documents obtained by Snowden, reported that the NSA had inserted backdoors into BIOS, doing exactly what Plunkett accused a “nation state” of doing during her interview.

Google's Schmidt was unable to attend the mobility security meeting in San Jose in August 2012.

“General Keith.. so great to see you.. !” the Google executive wrote. “I’m unlikely to be in California that week so I’m sorry I can’t attend (will be on the east coast). Would love to see you another time. Thank you !” Since the Snowden disclosures, Schmidt has been critical of the NSA and said its surveillance programs may be illegal.

Army Gen. Martin E. Dempsey, chairman of the Joint Chiefs of Staff, did attend that August 8, 2012, briefing. Foreign Policy reported a month later that Dempsey and other government officials — no mention of Alexander — were in Silicon Valley “picking the brains of leaders throughout the valley and discussing the need to quickly share information on cyber threats.” Foreign Policy noted that the Silicon Valley executives in attendance belonged to the ESF. The story did not say that “mobility threats and security” was the top agenda item along with a classified threat briefing.

A week after the gathering, Dempsey said during a Pentagon press briefing: “I was in Silicon Valley recently, for about a week, to discuss vulnerabilities and opportunities in cyber with industry leaders … They agreed — we all agreed on the need to share threat information at network speed.”

Google’s co-founder, Sergey Brin, had attended previous meetings of the ESF group but, according to Alexander’s email, he also could not attend the August 8, 2012, briefing in San Jose due to a scheduling conflict. However, it's unknown if someone else from Google was sent.

A few months earlier, however, Alexander had emailed Brin to thank him for Google’s participation in the ESF.

“I see ESF’s work as critical to the nation’s progress against the threat in cyberspace and really appreciate Vint Cerf (Google’s vice president and chief Internet evangelist), Eric Grosse (vice president of security engineering) and Adrian Ludwig’s (lead engineer for Android security) contributions to these efforts during the past year,” Alexander wrote in a January 13, 2012 email.

“You recently received an invitation to the ESF Executive Steering Group meeting, which will be held on January 19, 2012. The meeting is an opportunity to recognize our 2012 accomplishments and set direction for the year to come. We will be discussing ESF’s goals and specific targets for 2012. We will also discuss some of the threats we see and what we are doing to mitigate those threats … Your insights, as a key member of the Defense Industrial Base, are valuable to ensure ESF’s efforts have measurable impact.”

A Google spokesperson declined to answer specific questions about Brin and Schmidt’s relationship with Alexander, or about Google’s work with the government.

“We work really hard to protect our users from cyber attacks, and we always talk to experts— including in the US government — so we stay ahead of the game,” the spokesperson said in a statement to Al Jazeera. “It's why Sergey attended this NSA conference.”

Brin responded to Alexander the following day, though the head of the NSA hadn't used the appropriate email address when contacting the Google co-founder.

“Hi Keith, looking forward to seeing you next week. FYI, my best email address to use is [redacted],” Brin wrote. “The one your email went to – sergey.brin@google.com – I don’t really check.”
http://america.aljazeera.com/article...ef-google.html





House to Advance Bill to End Mass NSA Surveillance

The USA Freedom Act, with more than 140 cosponsors, will get a vote in the House Judiciary Committee on Wednesday.
Dustin Volz

A bill that would effectively end one of the National Security Agency's most controversial spy programs is finally getting its day in congressional court.

The House Judiciary Committee will hold a markup of an amended version of the USA Freedom Act on Wednesday, a surprising and sudden move that would essentially nullify the government's ability to collect bulk metadata of Americans' phone records.

The maneuver may also be a counter to plans the House Intelligence Committee has to push forward a competing bill that privacy advocates say would not go far enough to curb the government's sweeping surveillance programs.

Indeed, just hours after the Freedom Act earned a markup date, the Intelligence Committee announced it, too, would move forward with a markup of its own NSA bill—the FISA Transparency and Modernization Act—on Thursday.

The more aggressive Freedom Act is sponsored by Rep. Jim Sensenbrenner, the one-time mastermind behind the post-9/11 Patriot Act, from which both the Obama and Bush administrations have derived much of the legal authority for their surveillance programs. Sensenbrenner, a Wisconsin Republican, has vocally condemned NSA spying since Edward Snowden's leaks surfaced last June. The bill has long been supported by privacy and civil-liberties groups who view it as the best legislative reform package in Congress.

A manager's amendment has been posted to House Judiciary's website that a source close to the negotiations said is a "compromise" on the Freedom Act. The "bipartisan substitute" from House Judiciary Chairman Bob Goodlatte; Rep. John Conyers, the panel's top Democrat; and others would prohibit the bulk collection of data under Section 215 of the Patriot Act but keep intact the "relevancy" standard for collection authority. The original Freedom Act had sought to completely rewrite the relevancy standard.

But the bill does grant the government "emergency authority" to use Section 215 for "tangible things."

"The attorney general may authorize the emergency production of tangible things, provided that such an application is presented to the court within seven days," a section-by-section breakdown of the amendment reads. "If the court denies an emergency application, the government may not use any of the information obtained under the emergency authority except in instances of a threat of death or serious bodily harm."

In addition, the compromise language would adopt new standards for national security letters, which are used by the FBI, and make the Foreign Intelligence Surveillance Court more transparent. The court, which oversees the NSA's spying activities, has been criticized for not being a robust check against the agency.

The bill would also provide companies that give the government access to their stored metadata some liability protection, and allow them to seek compensation for complying.

The swift move follows a jurisdictional squabble last month between the House Judiciary and House Intelligence committees. Some Judiciary members, who are among the most vocal NSA critics in Congress, said they were being intentionally sidelined after the House parliamentarian gave primary jurisdiction of another surveillance reform bill to the Intelligence Committee instead of Judiciary.

In a joint statement, several Judiciary members, including Goodlatte, Conyers, and Sensenbrenner, said they "as the committee of primary jurisdiction" on intelligence matters have concluded that the NSA's sweeping spy programs are in need of reform.

"Over the past several months, we have worked together across party lines and with the administration and have reached a bipartisan solution that includes real protections for Americans' civil liberties, robust oversight, and additional transparency, while preserving our ability to protect America's national security," the lawmakers said. "We look forward to taking up this legislation on Wednesday and continuing to work with House leaders to reform these programs."

But the House Intelligence Committee struck back just as quickly on Monday. Hours after the Judiciary's surprise, the intelligence panel announced it would vote on its own NSA bill on Thursday.

In January, President Obama announced his administration would reform the way NSA collects and stores telephone metadata, which includes phone numbers and call durations but not the contents of a call, of virtually all Americans.

The compromise language revealed by the House Judiciary Committee appears to be an attempt to align more closely with the way forward the president has outlined. Obama has said he cannot act before Congress gives him a bill that closely resembles his preferred reforms.

The Freedom Act has 143 cosponsors in the House and a mirror bill in the Senate sponsored by Patrick Leahy, a Vermont Democrat and chairman of the Senate Judiciary Committee.
http://www.nationaljournal.com/tech/...lance-20140505





DOJ Seeks New Authority to Hack and Search Remote Computers

The agency asks that judges be allowed to issue warrants to search computers outside their judicial districts
Grant Gross

The U.S. Department of Justice wants new authority to hack and search remote computers during investigations, saying the new rules are needed because of complex criminal schemes sometimes using millions of machines spread across the country.

Digital rights groups say the request from the DOJ for authority to search computers outside the district where an investigation is based raises concerns about Internet security and Fourth Amendment protections against unreasonable searches and seizures.

"By expanding federal law enforcement's power to secretly exploit 'zero-day' vulnerabilities in software and Internet platforms, the proposal threatens to weaken Internet security for all of us," Nathan Freed Wessler, a staff attorney with the American Civil Liberties Union, said by email.

The proposal, which was made public Friday, raises serious privacy concerns, Wessler added, because it would "significantly expand the circumstances under which law enforcement can conduct secret, remote searches of the sensitive contents of people's computers. Our computers contain a wealth of private information about us, and it is crucial that the courts place strict limits on secret electronic searches by law enforcement."

The DOJ proposal comes after nearly a year of leaks about broad U.S. National Security Agency surveillance programs.

But a change in the federal rules of criminal procedure is needed to investigate botnets and crimes involving anonymizing technologies, the DOJ said in a September letter to the Advisory Committee on the Criminal Rules. The DOJ has asked the U.S. court system to give judges authority to issue search warrants for computers outside their districts.

Investigators are increasingly encountering crimes where they "can identify the target computer, but not the district in which it is located," Mythili Raman, then an acting assistant attorney general, wrote in the letter. "Criminals are increasingly using sophisticated anonymizing technologies when they engage in crime over the Internet."

Raman, now working at a private law firm, also pointed to criminals' use of botnets as a reason for the rules change. A large botnet investigation could involve computers in dozens of judicial districts, she wrote.

"Criminals are using multiple computers in many districts simultaneously as part of complex criminal schemes, and effective investigation and disruption of these schemes often requires remote access to Internet-connected computers in many different districts," Raman wrote. "Botnets are a significant threat to the public: they are used to conduct large-scale denial of service attacks, steal personal and financial data, and distribute malware designed to invade the privacy of users of the host computers."

Yet, current rules of criminal procedure established by the U.S. court system allow magistrate judges to issue search warrants for property outside the judge's district in only limited circumstances, the DOJ noted. The DOJ's request for the rules change is scheduled to be discussed at the meeting of the U.S. courts' Committee on Rules of Practice and Procedure in Washington, D.C., later this month.

A DOJ spokesman downplayed privacy concerns, saying judges would have to issue warrants for the remote computer searches. The rules change would relate only to expanded venues for warrant applications, he said.

"The key thing to highlight is that our proposal would not authorize any searches or remote access not already authorized under current law," spokesman Peter Carr said by email. "The probable cause and particularity standards we have to meet to obtain the warrant from the court do not change, and the execution of the warrant remains under the supervision of the court."
http://www.computerworld.com/s/artic...mote_computers





Snapchat Settles Charges With F.T.C. That It Deceived Users
Jenna Wortham

The disappearing act of messages on Snapchat, the mobile messaging service, has not been as foolproof as the company promised.

The Federal Trade Commission on Thursday said Snapchat had agreed to settle charges that the company was deceiving users about the ephemeral nature of the photos and video messages sent through its service.

In marketing the service, Snapchat has said that its messages “disappear forever.” But in its complaint, the commission said the messages, often called snaps, can be saved in several ways. The commission said that users can save a message by using a third-party app, for example, or employ simple workarounds to take a screenshot of messages without detection.

The complaint also said Snapchat transmitted users’ location information and collected sensitive data like address book contacts, despite saying that it did not collect such information. The commission said these practices allowed security researchers to compile a database of 4.6 million user names and phone numbers during a recent security breach.

“If a company markets privacy and security as key selling points in pitching its service to consumers, it is critical that it keep those promises,” Edith Ramirez, the chairwoman for the Federal Trade Commission, said in a statement. “Any company that makes misrepresentations to consumers about its privacy and security practices risks F.T.C. action.”

Under the terms of the settlement, Snapchat will be prohibited from misrepresenting how it maintains the privacy and confidentiality of user information. The company will also be required to start a wide-ranging privacy program that will be independently monitored for 20 years. Fines could ensue if the company does not comply with the agreement.

“While we were focused on building, some things didn’t get the attention they could have,” the company said in a statement.

The company added: “Even before today’s consent decree was announced, we had resolved most of those concerns over the past year by improving the wording of our privacy policy, app description, and in-app just-in-time notifications. And we continue to invest heavily in security and countermeasures to prevent abuse.”

The company declined an interview request.


Snapchat warns users about potential data collection in its privacy statement. The company says: “There may be ways to access messages while still in temporary storage on recipients’ devices or, forensically, even after they are deleted. You should not use Snapchat to send messages if you want to be certain that the recipient cannot keep a copy.”

The commission’s complaint could not have come at a worse time for the company. Last year, Snapchat turned down a multibillion-dollar buyout offer from Facebook.

The company is based in Los Angeles and is run by Evan Spiegel and Bobby Murphy, two former fraternity brothers at Stanford. The service was first released in 2011 and quickly gained a following among high school students in Southern California. In recent months it has become one of the most sought-after businesses in the tech industry, drawing attention from venture capital firms in Silicon Valley as well as companies like Facebook and Google.

The company does not reveal the number of people currently using its service, but says it now shuttles more than 700 million messages back and forth to users each day.

Snapchat is one of many tech start-ups that promise varying degrees of privacy or, in some cases, anonymity to their users. Mobile applications like Whisper, Secret, Confide and a number of others also assure their users that the content they share and sometimes even their identities are secure.

The case and the resulting settlement are part of a larger, continuing investigative effort by the commission to hold start-ups and companies accountable for their marketing claims and privacy assurances to consumers.
http://www.nytimes.com/2014/05/09/te...ommission.html





John McAfee Launches Chadder, a New Encrypted Private Messaging App
Paul Sawers

Antivirus software pioneer John McAfee has launched a new messaging app called Chadder, with a focus on encryption and security.

The app is produced by Future Tense Private Systems (FTC), a company founded by McAfee that previously developed the Dcentral1 app, in conjunction with Etransfr. FTC says that “other privacy and security solutions are under development” too.

The core raison d’être of Chadder is privacy, with only the recipient able to read the message – would-be interceptors will only see “garbled, encrypted text”. When you first sign up, you do so by providing a nickname, username, and password. That’s it. Then, when you’re in, you should be able to add your email address and mobile number (to help people find you); otherwise, you can generate a code to share and link up with individuals that way.
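Chadder has not published its protocol, but the general public-key pattern the company describes (only the recipient’s key can unlock a message, so anyone in the middle sees ciphertext) can be sketched in a few lines of Python. The PyNaCl library and everything in the sketch below are illustrative assumptions, not Chadder’s actual implementation:

    # End-to-end encryption sketch using the PyNaCl library: the sender
    # encrypts to the recipient's public key, so an interceptor sees
    # only "garbled, encrypted text". Illustrative only.
    from nacl.public import PrivateKey, Box

    alice_key = PrivateKey.generate()   # each party keeps its private key
    bob_key = PrivateKey.generate()     # and shares only the public half

    sending_box = Box(alice_key, bob_key.public_key)
    ciphertext = sending_box.encrypt(b"meet at noon")

    print(ciphertext.hex())             # what an interceptor would see

    receiving_box = Box(bob_key, alice_key.public_key)
    print(receiving_box.decrypt(ciphertext))  # b'meet at noon'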


However, when we tested this earlier today, we could not get it to work. We tried using ‘Generate Code’, and also ‘Find by Email’, and we weren’t able to find other users. It’s still quite an early-stage ‘beta’ product, so hopefully this issue will be fixed soon – it’s quite a big stumbling block.

In a rare platform-launch reversal, Chadder is available now for Android and Windows Phone, though a version for iOS is currently being tested with a view to launch “in the coming weeks”.

Meanwhile, the official Chadder promo skit can be found at the link below.
http://thenextweb.com/apps/2014/05/0...messaging-app/





As Domestic Abuse Goes Digital, Shelters Turn to Counter-Surveillance with Tor
George LeVines

Sarah’s abuser gained access to every password she had. He monitored her bank accounts and used her phone to track her location and read her conversations. She endured four years of regular physical and emotional trauma enabled by meticulous digital surveillance, and existing support services, from shelters to police, were almost powerless to help her.

“We wish we could just stop the clock because we need to catch up,” said Risa Mednick, director of the Cambridge domestic violence prevention organization Transition House.

To fight back, Transition House and others turn to the same methods used by intelligence agencies in order to keep their clients safe.

Sarah’s case — one severe enough that using her real name would put both her and the domestic violence prevention community at risk — exemplifies how digital components of abuse stymie social workers more accustomed to dealing with physical and emotional trauma.

Mednick, whose organization worked with Sarah and others in similarly abusive relationships, saw first hand how her own staff struggled to handle casework involving technology. She responded by putting out a query for assistance.

Last fall the Tor Project — a nonprofit that builds anonymous Web browsing and communication tools — answered Mednick’s query. Since then, the two groups have been working to develop a resource that will provide staff and advocates with the base level of technological know-how required to address casework with a digital abuse component.

“Abuses with technology feel like you’re carrying the abuser in your pocket. It’s hard to turn off,” said Kelley Misata, a Tor spokesperson.

The Tor Browser Bundle is free software that works like most ordinary browsers but comes configured to make it harder for individuals to be tracked: it hides the user’s location and IP address from the websites they visit, and it deletes traces such as browsing history from the computer the browser runs on.
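For readers curious what routing traffic through Tor looks like in practice, here is a minimal sketch that sends an ordinary web request through a locally running Tor client’s SOCKS proxy. It assumes Tor’s default SOCKS port 9050 (the Tor Browser Bundle listens on 9150 instead) and the Python requests library with its optional SOCKS support installed; it is an illustration, not part of any tool described in this story:

    # Route an ordinary HTTP request through a local Tor client.
    # Assumes a Tor daemon on its default SOCKS port 9050 and
    # "pip install requests[socks]" for SOCKS proxy support.
    import requests

    proxies = {
        "http": "socks5h://127.0.0.1:9050",   # socks5h resolves DNS via Tor too
        "https": "socks5h://127.0.0.1:9050",
    }

    # check.torproject.org reports whether a request arrived via Tor.
    resp = requests.get("https://check.torproject.org/",
                        proxies=proxies, timeout=30)
    print("Congratulations" in resp.text)  # True if the circuit worked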

To better understand the dangers and the prevention community’s response to digital abuse, I asked Transition House to connect me with one of their clients, someone who experienced total loss of control over their connected life.

Transition House introduced me to Sarah. She agreed to let her story be used only under the condition of anonymity. The domestic violence prevention community expressed concern at the prospect of reaching out to her abuser or any member of his family, fearing retaliation both toward Sarah and Transition House.

Police reports and sworn affidavits, interviews with Sarah’s family, her best friend, a current and former co-worker, and Transition House all corroborated Sarah’s story.

BECOMING A PRISONER

Sarah and her abuser first struck up a relationship after meeting in a recreational soccer league in June of 2008. Things seemed to be going well after a weeklong vacation early on.

But two months in, after a brief breakup, he put his hands around her neck and threatened her life, Sarah said.

He blamed his behavior on a cocktail of drugs — some prescription, some not — to manage the long hours and stress incurred from what he told her was an undercover position at the Federal Bureau of Investigation.

The man insisted, citing the need to protect his cover, that Sarah grant him access to every aspect of her life: Cell phone, social media, bank accounts, website passwords. Everything.

Feeling she had nothing to hide, Sarah complied. That compliance soon evolved into a complete relinquishing of freedoms.

Each day became the same: She went to work. If she left a building she was to notify him. If she didn’t notify him, he called. When work ended she went home. If she didn’t go home, he called.

“The question I always asked was how does someone end up in that situation?” her best friend said. “And the answer — from having witnessed it — is, gradually.”

That gradual evolution is crucial to understanding abuse, Mednick said.

Abuse works slowly: First abusers often forbid Facebook, then friends of the opposite sex, then friends altogether, then access to transportation, then privacy of any kind. Without noticing, a victim feels suddenly suffocated and intensely vulnerable.

On New Year’s Eve of 2008, Sarah’s partner passed out in their car after an argument over the gratuity on their bar tab. She tried to help him up the stairs, but when he came to he began throwing her, repeatedly shoving her to the ground, and finally kicking her into a wall before passing out again.

“That night I was done with it,” she said. “I felt like I couldn’t talk to anybody because if I did, he would know. I felt more alone than I ever felt before. I was a prisoner in my own head because I couldn’t tell anybody what was going on.”

To escape, Sarah took about a hundred ibuprofen in an attempt to end her life.

TROUBLES WITH TECHNOLOGY

The first iPhone came out seven years ago. But for the law, those enforcing it, and providers of domestic violence prevention services, contemporary and pervasive use of Web and mobile computing technologies is still a challenge.

Sarah’s case represents a larger trend, one where the lines blur between digital and real-life abuse. Most laws governing abuse and stalking came before the cell phone — and so did most social workers — making response to the trend challenging and slow.

“We’re almost looking at Tor as technological epidemiologists,” Mednick said.

Law enforcement, policy makers, and service providers know very little about that epidemic.

The Cyber Crimes Division of the Massachusetts Attorney General’s Office does not keep statistics specifically on cyberstalking. Nor does the Massachusetts District Attorney’s Association. Even the Washington, D.C.-based National Network to End Domestic Violence (NNEDV) only managed to dig up a single table generated by the National Institute of Justice — with data gathered in 2006.

“Unfortunately the most comprehensive statistics kept by the FBI don’t drill down to that level of detail,” said Cindy Southworth of NNEDV in an email.

Today, many abuse cases contain at least one digital facet because abuse is about power and control and most victims are using some form of technology, Southworth said.

Andrew Lewman, the Tor Project’s executive director, understands better than most the challenges facing advocates and social workers in domestic violence prevention roles. Lewman works directly with abuse victims whose partners are in law enforcement or intelligence professions.

“You have a whole separate set of issues,” he said.

The world Lewman works in looks especially grim. He sees abusers posting tips and tricks to online forums, telling others how to achieve masterful levels of surveillance and control.

For example, one abuser might hack a company’s password database and share the whole thing with others online, Lewman said. Digital communities have sprung up where individuals teach each other how to compromise cell phones to track a victim’s whereabouts, listen to conversations in a room, take pictures, and read texts and email so that they can learn about the victim’s behavior on a microscopic level.

Often the language to describe the surveillance is couched in protective terms, such as monitoring a child’s activities or queries posed to check if a partner is cheating. Commercially available software advertises easy tracking of exact locations, call logs, text messages, and more, often in an interface as easy to use as Google Maps.

And while digital stalkers often know nothing more about technology than the average person, their devotion is intense.

“Most of them quit their jobs and do this full time or they’ve been fired,” Lewman said. “They spend all their time thinking about what they’re going to do next.”

In 2013 more than half of all US adults carried a smartphone according to the Pew Research Center. To turn a device against someone, stalkers need only basic smartphone knowledge and $40. With a hacked phone an abuser can track GPS location and gather exacting details through access to email, text messaging, and other apps.

By contrast, in order to help victims in such a predicament, social workers need a clear line of communication, one insulated from abuser snooping. Establishing that requires learning the basics of encryption — a word many Americans only became familiar with when stories arose around the National Security Agency and Edward Snowden.

“That’s where Tor is useful because most social workers are not tech savvy,” said David Adams. “We don’t even know half the features on our cell phones.”

Adams is co-executive director at Emerge, an organization founded in 1977 that runs education programs for abusers, treating their actions as learned behavior capable of being unlearned.

Emerge is developing a training module with Lewman and Tor that both educates social workers to be more aware of technological aspects of abuse and gives victims tools to “immunize” themselves, Adams said.

That project is still in the early phases. Lewman first needs to understand how to teach social workers to protect themselves before helping others. He thinks of the digital abuse epidemic like a doctor might consider a biological outbreak.

“Step one, do not infect yourself. Step two, do not infect others, especially your co-workers. Step three, help others,” he said.

In the case of digital infections, like any other, skipping those first two steps can quickly turn caretakers into infected liabilities. For domestic violence prevention organizations that means ensuring their communication lines stay uncompromised. And that means establishing a base level of technology education for staff with generally little to no tech chops who might not understand the gravity of clean communication lines until faced with a situation where their own phone or email gets hacked.

While the Tor Project seeks funding to create a program that will give social workers that basic tech education, domestic violence advocates and victims remain in limbo when faced with challenging digital abuses.
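That base level need not be daunting. As a rough illustration of the encryption basics mentioned earlier (a generic teaching example, not part of any curriculum described in this story), the widely used Python cryptography package can lock up a note so that anyone without the key sees only an opaque token:

    # The "basics of encryption" in one snippet, using the Python
    # "cryptography" package's Fernet recipe. Without the key, an
    # intercepted note is unreadable. Generic example, not any
    # organization's actual tooling.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()   # must be shared over a trusted channel
    f = Fernet(key)

    token = f.encrypt(b"client intake notes")
    print(token)                  # opaque without the key
    print(f.decrypt(token))       # b'client intake notes'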

AN INTERVENTION

The overdose of ibuprofen landed Sarah in the hospital, but she survived. Ultimately she and her partner got back together.

It was 2010, two years into the relationship. Sarah worked at a domestic violence shelter in Greater Boston. Every day she helped people in situations much like her own. She felt like a fraud.

“It was shoved in my face every day,” she said. “I didn’t tell anybody what was going on.”

Familiar patterns emerged. Sarah went to work and went home, notifying him of her every move. He called if she failed to do so. Slowly other staff at the domestic violence shelter learned about Sarah’s situation.

She worried that he had weapons and the degree of stalking escalated to a point where she feared for her life.

Sarah’s co-workers staged something of an intervention, sending her through a high-risk assessment team. The team acts to triage an abuse case and assess risk — including that of homicide — based on data points from law enforcement and service providers.

The communities of Cambridge, Arlington, and Belmont assembled the team in 2009, bringing together the Middlesex district attorney, law enforcement, and domestic violence community service providers from each town.

Surveillance within an abusive relationship raises a red flag for an increased risk of homicide, according to Adams, who wrote a book on abusive men who murder.

The team assessed Sarah’s situation as very high risk.

The domestic violence shelter she worked at also worried about the integrity of the organization and the services it provided, because Sarah’s abuser knew of the shelter’s location and its business operations from information he gleaned by surveilling her phone and Web accounts.

The team developed an elaborate plan to procure a restraining order and get Sarah on a flight out of state to stay with a trusted friend.

Obtaining a restraining order presented a number of complications, starting when Sarah’s partner drove by the courthouse, watching as she walked up the stairs to file in early fall of 2010.

Filling out the paperwork, she feared he would follow her into the courtroom. She knew him by at least three different names seen on various identifications in his possession. She included all three in the restraining order. In the affidavit she explained the drive-by that had just happened and detailed a number of past violent outbursts.

A local detective served Sarah’s abuser the restraining order by phone. He called back that day to acknowledge the order.

Police decided to confiscate Sarah’s smartphone in order to determine whether it was in fact being used as a surveillance tool. They also decided she would be safer traveling to the airport by police cruiser rather than with a co-worker, as the plan originally intended.

Sarah flew from Logan to stay with her best friend. Having complete access to her digital life, her partner knew exactly where she had gone.

“Before she got out here, I went out and bought a baseball bat and a couple cans of mace and prepared for the fact that he might follow her out here,” Sarah’s best friend said.

The friend planned to provide a safe place for her to relax and recover. Her plan and reality didn’t line up.

“It set in pretty quick for me that she wasn’t done with him,” her friend said.

The friend found Google searches for pay phones in the area and assumed Sarah was trying to make contact with her partner. According to Sarah he had emailed her, violating the restraining order. She called to notify the domestic violence shelter of the violation.

In another email, Sarah’s partner said that police had arrested one of his family members under the restraining order because the primary name that she knew him by — and one of the names she inscribed on the paperwork — in fact belonged to the family member.

Feeling guilty for putting the wrong person in jail, Sarah returned to the Boston area to get the family member released and to address threats he had made against her and her father.

“For me, for my friends, for my family, in that moment going back to him was the safest thing I could do,” Sarah said.

RESTRAINT

For the domestic violence prevention community, restraining orders are an important tool but they come with a price. Victims and social workers devote hours and receive a token of moderate protection in return. Mednick calls the process an “endurance test” in which returning to the court over and over is not uncommon, often with the abusive person bringing their family and attorney, sitting directly across the aisle from the victim.

“It can feel very scary and you never know what’s going to happen,” Mednick said.

Scared is exactly how Sarah felt in the summer of 2012. She narrowly avoided being struck by a car when her partner shoved her into oncoming traffic after becoming angry with her for interacting with men at a gay bar.

She filed for a second restraining order later that week, court records show. The processing took far longer than with the first restraining order. From the Friday that she filed throughout the fall, she returned to the courthouse every 10 days in order to keep the order alive, five times in total.

In Massachusetts, service of a restraining order must occur in person unless a judge grants a special circumstance allowing another manner of delivery. For more than a month police tried serving Sarah’s partner, to no avail. Eventually a judge allowed police to serve him via voicemail, but warned Sarah that the order might not hold up should the defendant decide to fight it in court.

According to the affidavit she wrote and later recounted during an interview, it was only with the help of five strangers that Sarah managed to get in her car and leave the night he pushed her into oncoming traffic. In the affidavit she also revealed his supposed FBI cover and said she had not returned home after the incident for fear of seeing him.

Since obtaining that second restraining order Sarah has neither seen nor spoken to her former partner, except on one occasion when he saw her driving on U.S. I-93 and attempted to run her into a guardrail before exiting the highway.

HEALING

When Transition House staff learned about the courthouse drive-by, the gravity of technological abuse struck home.

“He knew to drive by a court that was completely in a different town,” said one staff member with detailed knowledge of Sarah’s case.

Since then, response to digital abuse at some organizations has improved. Staff members ask questions to determine if cases contain a digital component earlier in the screening process. They are wary of client cell phones and compromised communication channels. If they feel a victim carries a bugged phone they replace it. If the abuse is happening across social media they encourage not using the platform in question.

However, digital abuse often remains hidden until well after a victim’s first contact with the organization, leaving the organization vulnerable to compromised communication lines, Mednick said.

In more severe cases the Tor Project gets involved and introduces some of their anonymizing tools to provide an added layer of security.

Meanwhile, Sarah’s restraining order remains precarious in the eyes of the law because it was served over voicemail. A court advocate told Sarah that her abuser was likely ducking the restraining order because there were warrants out for his arrest, she said. But when contacted, the Middlesex County Clerk’s office said it found no warrants, at least under the name Sarah believed truly belonged to him.

Over the years Sarah’s partner came to know and hold power over almost every detail of her life. In all she endured two lost jobs and two restraining orders. She was coerced into pregnancy and then miscarried. She survived monthly physical violence, persistent emotional trauma, and a suicide attempt.

More than a year out of the relationship — and lots of therapy later — Sarah said she still feels like she spent four years of her life with a person she knew absolutely nothing about. She eventually realized that he was likely not undercover FBI, but instead an abusive, married man with a drug habit.

“I had no idea who he was,” Sarah said.

Six months after I first met with Sarah, she reflected on the struggle that started in June 2008.

“Nobody is going to believe all of this stuff,” Sarah said. “Even now I have a lot of shame. I have a lot of blaming myself.”

After obtaining the second restraining order, she often called her best friend to ask whether particular social interactions were normal; it was behavior for which she had been punished for so long, her best friend said.

“The entire thing was just such a surreal experience,” said Sarah’s friend. “It’s a process, healing — a long, never-ending process.”

Sarah now works two jobs and is considering a career change. She goes out with friends at night and runs during the day. All things that she could not or would not do for a very long time.

“I started playing soccer again,” Sarah said.
http://betaboston.com/news/2014/05/0...ance-with-tor/





Changing Channels: Americans View Just 17 Channels Despite Record Number to Choose From

Americans have no shortage of options in every aspect of their lives. The proliferation of devices for consuming content has enabled more choices than most can count. But the “problem” of having too many options—including a growing expanse of content—doesn’t seem to be having an impact on our TV viewing preferences.

According to Nielsen’s forthcoming Advertising & Audiences Report, the average U.S. TV home now receives 189 TV channels—a record high and significant jump since 2008, when the average home received 129 channels. Despite this increase, however, consumers have consistently tuned in to an average of just 17 channels.

This data is significant in that it substantiates the notion that more content does not necessarily equate to more channel consumption. And that means quality is imperative—for both content creators and advertisers. So the best way to reach consumers in a world with myriad options is to be the best option.
http://www.nielsen.com/us/en/newswir...oose-from.html





Level 3 Calls Out Comcast, TWC and Others for ‘Deliberately Harming’ Their Own Broadband Service
Zach Epstein

Level 3, a tier 1 Internet service provider based in Colorado, has called out Comcast, Charter, Time Warner Cable and other top U.S. ISPs for “deliberately harming the service they deliver to their paying customers.”

In a thorough post that goes into great detail about the networks that deliver Internet service to homes and businesses across the globe, Level 3’s VP of Content and Media Mark Taylor explained “peering,” a term that has been pulled into the mainstream media recently. Netflix, as we’re sure you have read, has agreed to pay certain ISPs a “ransom” in order to reduce peering congestion and deliver faster streaming video to its subscribers.

“Level 3 builds a route map of the Internet by connecting its tens of thousands of customers together and allowing them to communicate. So a Level 3 customer in Hong Kong can communicate with a Level 3 customer in Sao Paulo. But to complete the map we also need to fill in interconnection to everyone who isn’t a direct Level 3 customer, so that our customers can also communicate with those who are not our customers,” Taylor explained on Level 3’s blog. “We do that through connections to other networks and their customers. This latter sort of connectivity is often called peering. Peering connections allow for exchanges of traffic between the respective customers of each peer.”

The executive went on to explain the process in great detail, and also to explain some issues that might cause peering congestion and slow down Internet service for subscribers.

“Level 3 has 51 peers that are interconnected in 45 cities through over 1,360 10 Gigabit Ethernet ports (plus a few smaller ports). The distribution of that capacity with individual peers ranges from a single 10 Gigabit Ethernet port to 148 ports,” Taylor wrote.

He then said that the average utilization across those interconnect ports is 36%. However, ports with 12 of Level 3’s 51 peers are utilized in excess of 90%, a level at which they are saturated, causing service slowdowns and packet loss. Level 3 is currently working with six of those 12 partner ISPs to upgrade capacity and resolve the issues.

The remaining six peers, however, refuse to work with Level 3 to address the congestion. These ports have been saturated for more than a year according to Taylor, but the ISPs still refuse to work toward a resolution.

“They are deliberately harming the service they deliver to their paying customers,” Taylor wrote. “They are not allowing us to fulfil the requests their customers make for content.”

Which six ISPs are we talking about here? Taylor stops short of naming them, but he still manages to shame them.

“Five of those congested peers are in the United States and one is in Europe,” he said. “There are none in any other part of the world. All six are large Broadband consumer networks with a dominant or exclusive market share in their local market. In countries or markets where consumers have multiple Broadband choices (like the UK) there are no congested peers.”

Taylor also noted that the ISPs in question “happen to rank dead last in customer satisfaction across all industries in the U.S.,” and he linked to the American Customer Satisfaction Index, which regularly ranks ISPs including Comcast, Time Warner Cable, Charter, Cox, Verizon and Cablevision at the bottom of customer satisfaction surveys.
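Taylor’s point that 90 percent average utilization means saturation is easy to sanity-check with a toy model: traffic rises and falls over the day, so a port that averages 90 percent full spends its busiest hours pinned at capacity, dropping whatever doesn’t fit. The short Python sketch below assumes a smooth daily cycle with a 15 percent swing; the figures are illustrative, not Level 3’s measurements:

    # Toy model, not Level 3 data: demand follows a smooth daily cycle
    # averaging 90% of port capacity with an assumed 15% swing. Any
    # minute in which demand exceeds capacity means a saturated port
    # dropping packets.
    import math

    capacity = 1.0      # normalized port capacity
    mean_util = 0.90    # average utilization cited by Taylor
    swing = 0.15        # assumed peak-to-trough amplitude

    saturated_minutes = 0
    for minute in range(24 * 60):
        t = 2 * math.pi * minute / (24 * 60)
        if mean_util + swing * math.sin(t) > capacity:
            saturated_minutes += 1

    print(f"saturated roughly {saturated_minutes / 60:.1f} hours a day")  # ~6.4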
http://bgr.com/2014/05/06/comcast-in...ision-level-3/





Comcast is Destroying the Principle that Makes a Competitive Internet Possible
Timothy B. Lee

Conservatives love the internet. They don't just love using it, they also love to point to it as an example of the power of free markets. And they're right. The internet has had a remarkable 20-year run of rapid innovation with minimal government regulation.

That was possible because the internet has a different structure than other communications networks. Most networks, like the 20th century telephone market, are natural monopolies requiring close government supervision. But the internet is organized in a way that allows markets, rather than monopolists or government regulators, to set prices.

That structure has been remarkably durable, but it's not indestructible. And unfortunately, it's now in danger. In recent years, Comcast has waged a campaign to change the internet's structure to make it more like the monopolistic telephone network that came before it, making Comcast more money in the process.

Conservatives are naturally and properly skeptical of government regulation. But this is a case where the question isn't whether to regulate, but what kind of regulation is preferable. If federal regulators don't step in now to preserve the structures that make internet competition possible, they will be forced to step in later to prevent the largest ISPs from abusing their growing monopoly power.

The old, busted way to run a network

The telephone industry was dominated by a single monopoly called AT&T. Everyone paid AT&T a flat subscription for a telephone. Long-distance calls came with an extra per-minute fee.

If you want to build a network that reaches everyone in America and provides a consistent quality of service, this isn't a bad way to do it. But it has some obvious problems that practically require government oversight.

The simple danger is that monopolies tend to charge high prices. Another problem is that monopolies are often bad for innovation. If you had an idea for a new telecommunications product in 1950, you needed AT&T's permission to try it. And AT&T generally refused to allow third-party devices to be attached to its network. Regulators had to step in frequently to force AT&T to be more accommodating toward third-party innovators.

In 1984, AT&T was forced to spin off its local telephone business into seven independent companies that were dubbed the "Baby Bells." The economics of long-distance calling became a lot more complicated. Before, making a long-distance call just involved one company, AT&T. Now it involved three companies: a Baby Bell at each end and a long-distance company in the middle. AT&T was one option for long-distance service, but it competed against rivals such as Sprint or MCI.

The long-distance market is based on a sender-pays principle. The customer who dials the phone pays for the call. The payment goes to the long-distance company of the customer's choice. The long-distance company, in turn, makes a payment to the local phone company that operates the other end of the connection.

The terminating monopoly problem

This restructuring of the telephone market helped to create a competitive market for long-distance service. But there's still a serious problem, known in telecom jargon as a "terminating access monopoly." Suppose an Ameritech customer in Detroit wants to call her sister, a BellSouth customer in Atlanta. She has several options for long-distance service. AT&T, MCI, and Sprint are all competing for her business. But no matter which long-distance company she chooses, that long-distance provider is ultimately going to need to connect to BellSouth to complete the call. That means BellSouth always gets to collect a fee for the call.

THANKS TO COMPETITION AMONG TRANSIT PROVIDERS, PRICES HAVE FALLEN BY A FACTOR OF 1000

In a revealing post on its public policy blog, the modern AT&T (which has reunited with four of its seven former subsidiaries) described what happened as a result: "In the late ‘90s, CLECs began to tariff ever-increasing rates for terminating access services." In plain English, certain phone companies (CLECs are competitive local exchange carriers) began demanding higher and higher fees to complete incoming calls, a problem that ultimately forced the FCC to regulate the market more strictly.

Unfortunately, while all-knowing, perfectly benevolent regulators could make this work, in practice regulators tend to be neither all-knowing nor benevolent. Special interests can "capture" regulatory agencies and bend rules to their own ends, and even when acting in perfect good faith, regulators may simply make the wrong call. Even worse, a regulated system tends to discourage innovation, since any company that would be disadvantaged by change can use the regulatory process to block it.

At the same time, regulation is the worst solution to the terminating monopoly problem except for all the alternatives. The sender-pays rule creates terminating monopolies. Absent price regulation, that will necessarily lead to exorbitant prices.

How the internet solved the terminating monopoly problem

In the bill-and-keep internet, companies at each "end" of a connection bill their own customers — whether that customer is a big web company like Google or an average household. Neither end pays the other for interconnection. Instead, the Internet Service Provider (ISP) at each end is responsible for ensuring that its traffic can reach the ISP at the other end. This is part of the service that the ISP sells to its customers, a guarantee that traffic will get where you want it to go.

ISPs typically do this by hiring a third party to provide "transit," the service of carrying data from one network to another. Transit providers often swap traffic with one another without money changing hands.

"EVERY DAY I HAVE SOMEONE COME UP TO ME AND SAY 'COMCAST CAME UP TO US ASKING FOR MONEY'"

The terminating monopoly problem occurs when a company at the end of a network not only charges its own customers for their connection, but charges companies in the middle of the network an extra premium to be able to reach its customers. In a bill-and-keep regime, the money always flows in the other direction — from customers to ISPs to transit companies. And because the market for transit is highly competitive, there's no need for government regulation of transit fees. It's an ordinary market where if a transit company tries to charge too much, ISPs will switch to another company.
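To make the direction of the money concrete, here is a tiny sketch contrasting the two regimes; the names and dollar figures are invented for illustration, not real rates:

    # Toy contrast of the two settlement regimes; names and dollar
    # figures are invented for illustration.

    def bill_and_keep():
        # Each end bills its own customers; the peering link is free.
        return {
            ("ContentCo", "TransitCo"): 100,    # content side buys transit
            ("Household", "ConsumerISP"): 50,   # consumer side buys access
            ("TransitCo", "ConsumerISP"): 0,    # settlement-free peering
        }

    def sender_pays():
        # The terminating ISP also charges the middle of the network,
        # recreating the telephone-era terminating access monopoly.
        flows = bill_and_keep()
        flows[("TransitCo", "ConsumerISP")] = 30  # terminating fee
        return flows

    for regime in (bill_and_keep, sender_pays):
        print(regime.__name__)
        for (payer, payee), dollars in regime().items():
            print(f"  {payer} -> {payee}: ${dollars}")

The structural point is simply which way the arrow on the peering link points: under bill-and-keep it carries no money at all, so there is nothing for a terminating network to hold hostage.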

Bill-and-keep has worked well. Thanks to competition among transit providers, average transit prices have fallen by a factor of 1000 since 1998.
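As a back-of-the-envelope check (my arithmetic, not the article's), a 1000-fold decline over the 16 years from 1998 to 2014 implies prices falling by roughly 35 percent every year:

    # A 1000x total price drop over 16 years, expressed as a steady
    # annual rate of decline. Purely illustrative arithmetic.
    factor = 1000
    years = 2014 - 1998
    annual_decline = 1 - (1 / factor) ** (1 / years)
    print(f"{annual_decline:.0%} per year")  # ~35%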

People who love the internet's lack of regulation have bill-and-keep to thank. By solving the terminating monopoly problem, bill-and-keep makes possible a robust and competitive market for internet connectivity that requires minimal government oversight.

Comcast is trying to break bill-and-keep

Over the last four years, Comcast has engaged in a campaign to undermine the bill-and-keep system. The effort first came to public attention in 2010. Level 3 had just signed a contract to host Netflix content, and Level 3 asked Comcast to upgrade a connection between them to accommodate the higher traffic. Level 3 expected this to be an easy sell since Comcast had previously paid Level 3 for transit service. But instead, Comcast demanded that Level 3 pay it for the costs of the upgrade.

COMCAST HAS ENGAGED IN A CAMPAIGN TO UNDERMINE THE BILL-AND-KEEP SYSTEM

Since then, Comcast has evidently begun demanding that other transit and content providers pay it for faster connections too. "Every day I have someone come up to me and say 'Comcast came up to us asking for money,'" says Tim Wu, the Columbia law professor who coined the term "network neutrality."

Comcast itself has been silent on the details of these agreements, but the company's defenders take it for granted that transit providers should be paying Comcast, not the other way around. For example, Dan Rayburn has argued that "the reason for the poor [Netflix] quality streaming is that Cogent refuses to pay Comcast to add more capacity." This, of course, is begging the question. Why should Cogent pay Comcast to deliver content that Comcast customers requested in the first place?

In a letter to the FCC defending its handling of the dispute with Level 3, Comcast provides an answer. Comcast argued that the two companies' "traffic ratio" — the ratio between the traffic Comcast was sending Level 3 and the traffic Level 3 was sending Comcast — had been thrown out of balance by the growth of Netflix streaming. Comcast portrayed it as a standard industry practice for the network that sends a disproportionate amount of traffic to pay the receiving network for the costs of carrying the traffic.

But that's not how the internet works. Consumer-facing ISPs have always received more traffic than they send out. Comcast itself sells "unbalanced" internet service to its customers, with download speeds much faster than upload speeds. That makes it inevitable that ISPs like Comcast will receive more data than they send. But in the bill-and-keep model, ISPs generally pay transit providers for connectivity, regardless of traffic ratios.

The traffic ratio rule Comcast advocated in 2010 was a variation on the sender-pays rule. It would create the same kind of terminating monopoly problem that plagued the long-distance telephone market. But that might not seem like a bad thing if you own the monopoly.

The importance of market share

Two factors tend to make the bill-and-keep model stable. One is competition in the consumer ISP market. If customers can easily switch between broadband providers, then it would be foolish for a broadband provider to allow network quality to degrade as a way to force content companies to the bargaining table.

A MERGED CABLE GIANT WOULD HAVE EVEN MORE LEVERAGE TO DEMAND MONOPOLY RENTS

The second factor is ISP size. When ISPs are relatively small, payments naturally flow from the edges of the network to the middle because small edge networks need large transit networks to reach the rest of the internet.

Imagine, for example, if the Vermont Telephone Company, a tiny telecom company that recently started offering ultra-fast internet services, tried to emulate Comcast. Suppose it began complaining that Netflix was sending it too much traffic and demanding that its transit providers start paying it for the costs of delivering Netflix content to its subscribers. Netflix and the big transit companies that provide it with connectivity would laugh at this kind of demand. It would be obvious to everyone that VTel needs transit service more than transit providers need VTel.

But when an ISP's market share gets large enough, the calculus changes. Comcast has 80 times as many subscribers as Vermont has households. So when Comcast demands payment to deliver content to its own customers, Netflix and its transit suppliers can't afford to laugh it off. The potential costs to Netflix's bottom line are too large.

This provides a clear argument against allowing the Comcast/Time Warner merger. Defenders of the merger have argued that it won't reduce competition because Comcast and Time Warner don't serve the same customers. That's true, but it ignores how the merger would affect the interconnection market. A merged cable giant would have even more leverage to demand monopoly rents from companies across the internet.

A century ago, the Wilson administration decided not to press its antitrust case against AT&T, allowing the firm to continue the acquisition spree that made it a monopoly. In retrospect, that decision looks like a mistake. Wilson's decision not to intervene in the market led to a telephone monopoly, which in turn led to 70 years of regulation and a messy, 10-year antitrust case.

Obviously, the combination of Comcast and Time Warner would not dominate the internet the way AT&T dominated the telephone industry. But recent events suggest that Comcast is already large enough to threaten competition on the internet. Preventing the company from getting even larger might avoid the need for a lot more regulation in the years ahead.

Comcast declined to comment for this story.
http://www.vox.com/2014/5/6/5678080/...aining-telecom





AT&T Claims Common Carrier Rules Would Ruin the Whole Internet

AT&T is just spouting "scary mumbo-jumbo," net neutrality proponent says.
Jon Brodkin

AT&T today urged the Federal Communications Commission to avoid reclassifying broadband Internet access as a telecommunications service, which is something network neutrality advocates are asking the FCC to do.

Reclassification would open broadband providers up to common carrier rules under Title II of the Communications Act, similar to regulations that have covered our phone system since 1934. Recent calls for reclassification of broadband stem from a federal appeals court ruling that the FCC could not impose strict network neutrality rules, such as prohibitions against blocking Web services and Internet fast lanes, without first declaring the providers to be common carriers.

AT&T’s anti-regulatory view isn’t surprising. It’s already arguing that the Public Switched Telephone Network should be shut down and replaced with largely unregulated Internet-based voice service.

The company’s eight-page ex parte filing claims that the reclassification of broadband would backfire in all sorts of unintended ways that would wreak havoc on the Internet, without even achieving the goal of banning paid prioritization deals in which Web services pay for faster access to consumers. Here’s a copy of the letter (thanks to Wall Street Journal reporter Gautham Nagesh for passing it along).

One of the most interesting arguments made by AT&T is that Title II reclassification would force giant changes in the peering and interconnection markets. AT&T Senior VP Robert Quinn wrote:

Indeed, reclassification would raise a host of issues that reclassification proponents have completely ignored in their advocacy. For example, if broadband Internet access service is a telecommunications service, then broadband Internet access providers could be entitled to receive transport and termination fees under section 251(b)(5). The Commission could not avoid this occurrence by establishing a bill-and-keep regime because, unlike voice traffic, Internet traffic is asymmetric. And because Internet traffic would now be subject to reciprocal compensation, virtually every settlement free peering arrangement would have to be replaced by newly negotiated arrangements implementing the reciprocal compensation provisions of the Communications Act. Moreover, in those instances in which reciprocal compensation does not apply, ISPs would be entitled to file tariffs for the collection of charges for terminating Internet traffic to their customers.

Reclassification could also bring lots of new requirements for ISPs that don’t directly serve consumers, AT&T argued:

[i]t is foolish to think that the Commission could reclassify the provision of broadband Internet access to consumers as a telecommunications service without similarly reclassifying a broad array of other functionally analogous services in the Internet ecosystem. For example, there is no logical or legally sustainable basis to distinguish between ISPs serving consumer “eyeballs” and those serving content and other edge providers. Likewise, transit providers and content delivery networks (CDNs) would be telecommunications service providers subject to Title II, as would connected device customers. (The latter would be resellers of telecommunications services and thus telecommunications service providers in their own right.) Indeed, the logic behind reclassification would dictate that when a search engine connects an advertising network to a search request or effectuates a connection between a search user and an advertiser, it too would be providing a telecommunications service. And so too would an email provider that transmits an email or a social network that enables a messaging or chat session.

...

The key legal rationales for any Title II reclassification decision would thus necessarily extend to any Internet provider that holds itself out to customers as arranging for the transmission of data from one point on the Internet to another, whether or not it owns transmission facilities. As discussed above, this category would extend to ISPs such as Earthlink and AOL that do not own last-mile transmission facilities; to content delivery networks (“CDNs”) such as Akamai that hold themselves out to the commercial public as transporters of data to distant points on the Internet; to providers of e-readers like Amazon.com, which provides Internet access through the Kindle; to companies like Google that provide advertising-supported Internet search services and, on behalf of countless commercial customers, arrange for the transmission of advertising content to end users; and to a variety of other online transport providers ranging from Netflix to Level 3 to Vonage. In short, Title II reclassification would be a sledgehammer, not a scalpel.


This is nonsense, AT&T critics say

AT&T doesn’t seem to accept the possibility that the FCC could limit reclassification to a specific type of Internet service. In fact, the FCC’s treatment of broadband as an “information service” that isn’t subject to common carrier regulation is the result of several decisions tailored to specific types of services. It started in 2002 when the FCC classified cable modem service as an “information service.” This continued in 2005 when the FCC declared wireline broadband (such as DSL and fiber) an information service. In 2007, wireless broadband was classified as an information service.

"As usual, AT&T's positions are laughable at best—though disingenuous is more like it," Matt Wood, policy director of consumer advocacy group Free Press, told Ars. "Nothing in Title II says that every last provision has to apply to any Title II service. That's the whole point of forbearance. The fact that broadband providers could be entitled to something doesn't mean they actually are entitled to it, or that AT&T's cost-causation story is true."

"To look at a real-world example instead of an AT&T fantasy: mobile voice is a common carrier service, and yet CMRS [commercial mobile radio service] providers are barred from charging access fees," Wood also said. "And many enterprise market broadband services are telecom services yet still subject to bill-and-keep and privately negotiated contracts. Nothing about Title II commands the results that AT&T's straw man argument suggests."

Public Knowledge Senior VP Harold Feld agreed. "To a large extent, this is just scary mumbo-jumbo to make Title II look big and complicated," Feld told Ars. "It's like 'technobabble' on Star Trek. People make phone calls every day, but AT&T has no right to demand to negotiate special 'access charges' from Public Knowledge (based on content) or Pizza Hut (based on volume). I pay my phone provider to negotiate my 'termination fee' and 'access charges' etc., etc."

Many of the issues raised by AT&T were already hashed out in FCC proceedings in 2010, "which saves a lot of time," Feld said.

As for paid prioritization, AT&T claims reclassification won’t give net neutrality advocates what they want:

The supreme irony here is that Title II reclassification would not even preclude the paid prioritization arrangements that are purportedly animating reclassification proposals. Title II does not require that all customers be treated the same as reclassification proponents seem to believe. Rather, by its express terms, Title II prohibits only “unjust and unreasonable” discrimination, and it is well established that Title II carriers may offer different pricing, different service quality, and different service quality guarantees to different customers so long as the terms offered are “generally available to similarly situated customers.”

Despite what AT&T says, the federal appeals court that struck down the FCC’s net neutrality rules didn’t dispute the FCC’s authority to impose common carrier obligations if it were to reclassify broadband, and the court said that anti-discrimination and anti-blocking rules are common carrier obligations.

Wood noted that while "Title II allows for reasonable discrimination in concept, the FCC can prohibit paid-prioritization as an unjust and unreasonably discriminatory practice. The fact that Title II may allow some types of discrimination does not mean that it must allow all types of discrimination."

"We have 80 years of telecom law," Feld said. "There are lots of precedents and arguments on how the Commission could interpret things. Very few things in the Act are as compulsory or absolute as opponents of Title II like to argue... I am confident that the commission can craft a Title II regime that adequately protects consumers and competition without being unduly burdensome on providers."

AT&T does make one prediction that sounds pretty likely: Reclassification would lead to years of legal wrangling. After all, Verizon appealed the FCC's original net neutrality order and won in court, leading to the latest FCC proposal to outlaw blocking but allow paid prioritization.

“Beyond all this, any forbearance decision today could be prone to judicial challenge and attempted reversal by future Commissions," AT&T said. "No issue would ever be settled, and the Internet ecosystem would be subject to a state of perpetual regulatory uncertainty.”

So far, FCC Chairman Tom Wheeler has resisted calls to reclassify broadband as a telecommunications service but said he "won't hesitate" to change his mind if ISPs misbehave.
http://arstechnica.com/tech-policy/2...hole-internet/





Level 3 Claims Six ISPs Dropping Packets Every Day Over Money Disputes

Network provider doesn't name and shame ISPs guilty of "permanent congestion."
Jon Brodkin

Network operator Level 3, which has asked the FCC to protect it from "arbitrary access charges" that ISPs want in exchange for accepting Internet traffic, today claimed that six consumer broadband providers have allowed a state of "permanent congestion" by refusing to upgrade peering connections for the past year.

Level 3 and Cogent, another network operator, have been involved in disputes with ISPs over whether they should pay for the right to send them traffic. ISPs have demanded payment in exchange for accepting streaming video and other data that is passed from the network providers to ISPs and eventually to consumers.

When the interconnections aren't upgraded, it can lead to congestion and dropped packets, as we wrote previously regarding a dispute between Cogent and Verizon. In a blog post today, Level 3 VP Mark Taylor wrote:

A port that is on average utilized at 90 percent will be saturated, dropping packets, for several hours a day. We have congested ports saturated to those levels with 12 of our 51 peers. Six of those 12 have a single congested port, and we are both (Level 3 and our peer) in the process of making upgrades—this is business as usual and happens occasionally as traffic swings around the Internet as customers change providers.

That leaves the remaining six peers with congestion on almost all of the interconnect ports between us. Congestion that is permanent, has been in place for well over a year and where our peer refuses to augment capacity. They are deliberately harming the service they deliver to their paying customers. They are not allowing us to fulfill the requests their customers make for content.

Five of those congested peers are in the United States and one is in Europe. There are none in any other part of the world. All six are large Broadband consumer networks with a dominant or exclusive market share in their local market. In countries or markets where consumers have multiple Broadband choices (like the UK) there are no congested peers.

Taylor didn't name these companies and Level 3 hasn't answered our request for more information. Comcast may not be one of them, as Level 3 grudgingly agreed to pay Comcast after a dispute over Netflix traffic that began in late 2010.

AT&T, Verizon, and Time Warner Cable are among the ISPs that have warred with Netflix, transit providers that carry Netflix and other traffic, or both. And Level 3 did point the finger at AT&T in a previous post in March.

Level 3 today said it has "settlement-free" agreements with 48 of its 51 peers, meaning they exchange traffic without any money changing hands. The company is interconnected with those 51 peers "in 45 cities through over 1,360 10 Gigabit Ethernet ports (plus a few smaller ports)."
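
For a rough sense of the scale behind those figures, here is a back-of-the-envelope calculation in Python. The peer counts and the 90 percent saturation rule of thumb come from Taylor's post; treating all 1,360 ports as exactly 10GbE is a simplifying assumption (Taylor notes a few smaller ports exist).

# Back-of-the-envelope math from the figures in Taylor's post.
# Assumes every port is exactly 10GbE; a few are actually smaller.
ports = 1360
port_gbps = 10
print(f"Aggregate peering capacity: ~{ports * port_gbps / 1000:.1f} Tbps")

# Taylor's rule of thumb: a port averaging 90% utilization is
# saturated, dropping packets, for several hours a day.
def effectively_saturated(avg_utilization):
    return avg_utilization >= 0.90

peers, congested, upgrading = 51, 12, 6
print(f"{congested} of {peers} peers run saturated ports; "
      f"{congested - upgrading} of them refuse to add capacity.")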

"In one case, a peer pays us for access to a number of routes in a region where their network doesn’t go; a choice they made rather than buying Internet Services from another party," Taylor wrote.

The company that pays Level 3 is likely Cogent.

Although Netflix is now paying Comcast and Verizon for direct connections to their networks, Level 3 still carries "some Netflix traffic," a Level 3 spokesperson told Ars.
http://arstechnica.com/information-t...oney-disputes/





Talk of an Internet Fast Lane Is Already Hurting Some Startups

The cost of delivering content over the Internet may determine which Internet products and services succeed in coming years.

Some VCs say the FCC’s latest net neutrality proposal will raise costs for startups that need fast connections or use a lot of bandwidth.
David Talbot

Some venture capitalists at the cutting edge of Internet innovation say they will shun startups requiring fast connections for video, audio, or other services, mindful that the U.S. Federal Communications Commission may let ISPs charge extra fees to major content providers.

Proposed rules being drafted by the FCC’s chairman, Tom Wheeler, would allow ISPs to charge content providers like Netflix to ensure speedy service, so long as those charges are “commercially reasonable.” The rules are scheduled to be released for public comment May 15.

In the absence of clear rules, some ISPs have already begun requesting—and receiving—access fees. Netflix recently agreed to pay big ISPs like Comcast interconnection fees to ensure a high quality of service, but Netflix CEO Reed Hastings then wrote in a blog post that the United States needs a strict form of net neutrality, with no such tolls, because users who are already paying high prices for fast service should be able to get what content they want.

The cable industry says such charges are sensible, especially when a few large content providers like Netflix can take up a large fraction of traffic. But if deep-pocketed players can pay for a faster, more reliable service, then small startups face a crushing disadvantage, says Brad Burnham, managing partner at Union Square Ventures, a VC firm based in New York City. “This is absolutely part of our calculus now,” he says.

Burnham says his firm will now “stay away from” startups working on video and media businesses. It will also avoid investing in payment systems or in mobile wallets, which require ultrafast transaction times to make sense. “This is a bad scene for innovation in those areas,” Burnham says of the FCC proposal.

This will be the third time the FCC has tried to impose regulations on discrimination in data delivery, following two losses on earlier versions in federal court (see “Net Neutrality Quashed: New Pricing, Throttling, and Business Models to Follow”). The latest proposal has been interpreted as a reversal, in that it would allow carriers to charge extra for certain services.

Wheeler said in a blog post last week that he was being misunderstood. “There has been a great deal of discussion about how our proposal to follow the court’s roadmap will result in a so-called ‘fast lane’ and Internet ‘haves’ and ‘have-nots.’ This misses the point,” he wrote.

Wheeler said the rules are designed “to ensure that everyone has access to an Internet that is sufficiently robust to enable consumers to access the content, services and applications they demand, as well as an Internet that offers innovators and edge providers the ability to offer new products and services.”

History shows that some Web-based products and services are most likely to take root when access to Internet users is free. The founders of Foursquare, as an example, were able to set up their mobile social-networking service and reach 100,000 users with a mere $25,000 budget, Burnham says. “The thing that has been so remarkable about the Internet is that it’s been possible for a small startup to reach a global audience at no cost,” he adds. “An entrepreneur can get a product in the market, demonstrate real interest, and then go to talk to investors.”

Other VCs, particularly those who fund broadband providers, have another view. They say the explosion of video service has triggered massive costs that far exceed the growth in subscribers. Gillis Cashman, a managing partner at MC Partners in Boston, says it makes sense to charge extra to big content providers like Netflix, whose services at peak hours can sometimes consume more than 30 percent of total Internet traffic. Video is “significantly congesting these networks, and causing real issues for carriers where they have to spend a lot of money upgrading networks, and pushing fiber deeper into their networks,” he says. “There is currently no model for monetizing that required investment.”

Some less financially invested observers have little sympathy for this argument. Rob Faris, research director at the Berkman Center for Internet & Society at Harvard University, notes that broadband providers typically have had very high profit margins and often charge tiered rates based on the speeds consumers desire. “You can’t credibly argue that consumers aren’t paying enough for access to maintain higher-bandwidth services,” he says.

Not all the costs of network upgrades are high, either. While it’s true that digging trenches to install fiber-optic cables is always going to be expensive, improvements in electronics and software also boost speeds. “One reason why there isn’t more investment in new fiber-optic networks by the carriers is simply that boosting speeds on existing networks is more profitable for them,” Faris says.

If the FCC does let ISPs impose access fees, new business models and technologies for imposing those charges will emerge. Wireless carriers like Verizon are also working out “fast lane” technology, as are Web optimization companies like Akamai, which did not return a request for comment for this story (see “Akamai’s Plan for a Wireless Data Fast Lane” and “Verizon Plans a Fast Lane for Some Apps”). And AT&T offers so-called sponsored data, which lets a broadcaster subsidize its bits so you can watch shows (and ads) free on your phone while other streams of data count against your monthly caps.
http://www.technologyreview.com/news...some-startups/





Mozilla’s Crazy Plan to Fix Net Neutrality and Turn Broadband into a Utility – and Why it Could Work
Stacey Higginbotham

SUMMARY:
Mozilla thinks it has found a way to ensure true network neutrality without going back and reclassifying broadband. Will its regulatory sleight of hand find support?


Well here’s an interesting twist on the net neutrality debate. Mozilla, the open source foundation behind the Firefox browser, thinks that it has found a legal way to get the Federal Communications Commission to protect network neutrality and to give consumer activists calling for the agency to regulate broadband as a utility what they want.

Mozilla filed a petition with the FCC Monday asking the agency to recognize that, in addition to the relationship between ISPs and their end customers, there is a separate relationship between the content provider (Amazon, Google, Netflix, etc.) and the ISP, and that this second relationship should be classified as transport under Title II of the Communications Act.

Instead of saying the ISP simply has a duty to deliver all packets over its pipes to the consumer without discrimination, Mozilla claims there is a second legal obligation: a duty the ISP owes to content providers, who expect their packets to be delivered in a neutral manner. Mozilla suggests that the FCC split this relationship into two relationships, and that it classify the content-provider and ISP relationship as transport.

In that way, the FCC can continue making rules on the consumer side under the current regulatory regime, but apply more regulatory oversight between the ISP and the content provider. The proposal would also protect net neutrality on wireless networks. From the Mozilla blog post on the topic:

Categorizing remote delivery services as telecommunications services is consistent with the guidelines set by both Congress and the DC Circuit Court of Appeals, and would give the FCC ample ability to adopt and enforce meaningful net neutrality. With clear authority and effective rules, ISPs would be prevented from blocking or discriminating against any edge provider, whether on a wireline or wireless network.

A bit of history — okay, a lot of it.

There’s a lot to unpack here, so let’s take it in order. Back in 2002, when broadband was gaining popularity mostly as a delivery vehicle for email, the FCC ruled that broadband was an information service, with email and various web pages being the information. That set it apart from phone lines, which were classified as transport services, a designation that gave the FCC more power to make sure no one played unfairly.

In a series of rulings, the FCC declared DSL, cable broadband and wireless broadband information services under Title I of the Communications Act. That, from a net neutrality perspective today, was the beginning of the end.

In 2010, a federal appeals court struck the first real blow in the reclassification fight, ruling that because cable broadband was classified as an information service, the FCC couldn’t censure Comcast for blocking P2P files. At the time, since the agency was in the process of writing formal net neutrality rules, as opposed to the basic principles it had followed since 2005, the thinking was that it would solidify its position and reclassify broadband as transport under Title II.

It didn’t. Former chairman Julius Genachowski was too afraid of the telco lobby and Congress to close the gaping hole in the net neutrality rules. Instead, he tried to walk the line between reclassifying broadband and keeping telcos happy, which led to the court decision in January that gutted the agency’s net neutrality rules.

Back to the present

Now, with Chairman Tom Wheeler at the helm, proponents of network neutrality are calling for the agency to reclassify broadband as it should have done in 2010 — undoing the series of decisions it made going back as far as 2002, when broadband providers often did provide email, storage and other so-called information services.

Instead, Wheeler is taking a path to net neutrality that will allow ISPs to create different traffic lanes and possibly even offer companies — from new services to content providers like Amazon or Netflix — the chance to pay to get their traffic priority on last mile networks.

This is a solution that would fundamentally change the internet — although Wheeler denies that ISPs will have free rein to start charging. The agency’s proposed rules include a no-blocking provision and would require transparency around any prioritization. The agency plans to start the process of making those rules the law of the land with the release of a Notice of Proposed Rulemaking on May 15. The idea is that Wheeler’s compromise will take effect before the end of the year.

The next steps

Mozilla’s proposal calls on the FCC to issue a declaratory ruling, and the hope is that by petitioning the agency now, Mozilla can influence the content of the Notice of Proposed Rulemaking so this suggested new relationship is discussed in parallel. Chris Riley, senior policy engineer at Mozilla, says he’s hoping the agency can issue a declaratory ruling recognizing this relationship, and thereby protect net neutrality, before the end of the year.

But can this work? It’s a neat way to call Wheeler’s bluff on the reclassification issue, which is so politically charged that he truly can’t touch it. Instead of attacking the cable companies and telcos head-on over reclassification, he could come at them from the side. However, Wheeler has made statements in the past indicating he’s comfortable with a double-sided market for broadband, which means he may not want to impose this new relationship on ISPs.

Such an action, if implemented, would also undoubtedly lead to lawsuits, throwing net neutrality into doubt for even longer. Still, it’s about time someone changed the terms of this debate to reflect how the internet has evolved since 2002, when the FCC decided it wasn’t a utility. Since then, as people have abandoned ISP-specific email, portals and the like to surf for content and choose services delivered from the wider internet, it has become clear that ISPs are a conduit for content and services, not a provider of them.

Mozilla seeks to get the FCC to recognize this in a way that might be politically viable. Hopefully the agency takes Mozilla up on the idea.
http://gigaom.com/2014/05/05/mozilla...it-could-work/





Tech Firms Write to U.S. FCC to Oppose 'Net Neutrality' Plan
Alina Selyukh

Over 100 leading technology companies, including Google Inc, Facebook Inc, Twitter Inc and Amazon.com Inc, have written to U.S. telecom regulators to oppose a new "net neutrality" plan that would regulate how Internet providers manage web traffic.

The letter to Federal Communications Commission Chairman Tom Wheeler and the agency's four commissioners, warning of a "grave threat to the Internet," came amid calls for a delay in a vote on the plan scheduled for May 15.

"Rushing headlong into a rulemaking next week fails to respect the public response to his (Wheeler's) proposal," FCC Commissioner Jessica Rosenworcel said on Wednesday in remarks prepared for delivery to an industry meeting. She called for a delay of "at least a month" on Wheeler's plan.

FCC spokesman Neil Grace said Wheeler still intends to put forward his proposals for public comment next week, as planned, as part of a "robust public debate" on the Internet.

"Moving forward will allow the American people to review and comment on the proposed plan without delay, and bring us one step closer to putting rules on the books to protect consumers and entrepreneurs online," Grace said in a statement.

With two Republican commissioners broadly opposed to regulation of Internet traffic, the support of two Democrats on the panel - Rosenworcel and Mignon Clyburn - is critical.

Tens of thousands of public comments have been received by the FCC on Wheeler's plan over the past two weeks, and commission staff has met with nearly 100 stakeholders, including public interest groups and Internet content providers.

The latest to weigh in is the consortium of technology and Internet companies, which ranged from household names to small startups. They called on the FCC to "take the necessary steps to ensure that the Internet remains an open platform for speech and commerce."

Commission rules should not permit "individualized bargaining and discrimination," the companies said.

Engine Advocacy and New America's Open Technology Institute, long-time supporters of Open Internet policies, helped organize the effort.

(Reporting by Alina Selyukh; Editing by Ros Krasny, Sandra Maler and Jan Paschal)
http://www.reuters.com/article/2014/...A4615420140507





Sen. Franken Spearheads a Major Campaign to Save Net Neutrality
Brad Reed

In addition to taking a leading role in trying to kill the Comcast-Time Warner Cable merger, Senator Al Franken is also now leading a campaign to stop the Federal Communications Commission from letting ISPs create Internet “fast lanes” with its latest net neutrality proposal. In a new video posted by the Progressive Change Campaign Committee, Franken makes his case that the FCC’s controversial plan could hinder future innovation and consumer choice by giving big incumbent companies a permanent competitive advantage over up-and-coming startups.

As an example, Franken points to the battle between YouTube and Google’s short-lived Google Video platform last decade. Under traditional network neutrality principles, neither YouTube nor Google Video was given preferential treatment, and consumers were free to choose YouTube, which eventually won out as the superior platform and was bought by Google. Under the FCC’s new proposal, however, the danger is that Google Video could have paid to deliver its videos at significantly faster speeds than YouTube, which wouldn’t have been able to afford its own “fast lane.”

Because of this, Franken says he wants to rally the public to tell the FCC to scrap its plan.

“We cannot allow the FCC to implement a pay-to-play system that silences our voices and amplifies that of big corporate interests,” Franken says. “We have come to a crossroads. Now is the time to rise up and make our voices heard to preserve net neutrality. We paid for a free and open Internet. We can’t let it be taken away.”

You can watch Franken’s full video below.
http://bgr.com/2014/05/07/senator-fr...lity-campaign/





F.C.C. Commissioner Asks for Delay on New Net Neutrality Rules
Edward Wyatt

A Democratic member of the Federal Communications Commission called Wednesday on the agency’s chairman to delay a proposal for new net neutrality rules, throwing into doubt whether the chairman will be able to muster enough votes at an F.C.C. meeting next week to issue proposed rules.

Jessica Rosenworcel, one of three Democrats on the five-member commission, said in a speech Wednesday that a delay was warranted because of a “torrent of public response” to the idea that the commission’s rules might create a fast lane on the Internet for companies willing to pay for it.

Last month, Tom Wheeler, the F.C.C. chairman, said he would aim to get a new set of proposed Open Internet rules before the commission at its May 15 meeting. The commission would then vote on whether to put the proposal out for public comment before adopting a final version.

Since then, tens of thousands of individuals, companies, interest groups and others have visited with or written to the F.C.C. about the topic, with most of them opposing any sort of paid access that might cause some Internet content to be favored over others.

Shannon Gilson, a spokeswoman for Mr. Wheeler, said that he intends to go ahead with his planned introduction of a proposal.

“Chairman Wheeler fully supports a robust public debate on how best to protect the Open Internet, which is why he intends to put forward his proposals for public comment next week,” Ms. Gilson said. “Moving forward will allow the American people to review and comment on the proposed plan without delay, and bring us one step closer to putting rules on the books to protect consumers and entrepreneurs online.”

The idea that all Internet content should be treated equally as it flows from content providers to consumers and back, known as net neutrality, has been debated for at least a decade. A federal appeals court has twice thrown out F.C.C. attempts to codify permissible behavior among companies that provide high-speed Internet service.

Ms. Rosenworcel said, essentially, that the commission needs to stop and take a breath to allow the F.C.C.’s legal experts to engage the public in an online dialogue about what net neutrality means and how, or whether, it should be enforced.

“While I recognize the urgency to move ahead and develop rules with dispatch, I think the greater urgency comes in giving the American public opportunity to speak right now, before we head down this road,” Ms. Rosenworcel said in an address to a meeting of the Chief Officers of State Library Agencies.

Under F.C.C. rules, the commission must stop accepting public comment one week before it votes on a proposal, meaning that commissioners could no longer be lobbied beginning Thursday.

“I think it’s a mistake to cut off public debate right now as we head into consideration of the chairman’s proposal,” Ms. Rosenworcel said. “I think we should delay our consideration of his rules by at least a month,” she added.

The two Republican F.C.C. commissioners have said that they do not believe that the agency should impose strict, or possibly any, net neutrality requirements.

The fifth commissioner, Mignon Clyburn, a Democrat, has said that she continues to oppose the idea of an Internet fast lane and that this is an opportunity for the commission “to take a fresh look and evaluate our policy in light of the many developments that have occurred over the last four years.”
http://bits.blogs.nytimes.com/2014/0...utrality-rules





The FCC Can’t Handle all the Net Neutrality Calls it’s Getting, Urges People to Write Emails Instead

Brad Reed

The Federal Communications Commission would rather read your thoughts about net neutrality than hear them. Columbia Law School professor and leading net neutrality activist Tim Wu points out that calling the FCC’s main consumer hotline now gets you a message asking you to write an email to the commission if you’re calling about FCC chairman Tom Wheeler’s controversial net neutrality plans. That suggests either that the FCC is being flooded with more calls about net neutrality than its operators can handle, or that it is simply tired of fielding them and would rather receive emails instead. Either way, it looks as though people are speaking up about the issue.

This week has been a very bad one for Wheeler’s proposal, which would create Internet “fast lanes” letting Internet service providers charge companies more to ensure faster traffic delivery. Several big-name tech companies this week — including Google, Microsoft, Facebook, Amazon and Netflix — wrote a joint letter to the FCC telling it to back off any plan that would create a two-tiered Internet, urging it instead to adopt policies that would protect not only against the blocking of websites but also the Internet’s traditional architecture, in which all packets are delivered on a first-come, first-served basis.

What’s more, two FCC commissioners have come out and said that they want to delay the vote on Wheeler’s proposal, which is scheduled to take place at an FCC meeting on May 15th. With at least two commissioners seeking to delay the vote and expressing opposition to parts of the plan, it remains unclear whether Wheeler will have the votes to get it passed even if he decides not to table it.
http://bgr.com/2014/05/09/fcc-net-ne...y-controversy/





Google’s Fiber Effect: Fuel for a Broadband Explosion

The mere promise of Google Fiber seems to be enough to send rivals scrambling to deliver ultrafast Internet service at a reasonable price. Just look at Austin, Texas.
Marguerite Reardon

Dallas Miller's virtual soldier needed more firepower, and he couldn't think of a better weapon than Google Fiber.

The 28-year-old Austin, Texas, native is an avid player of the shooter game Battlefield. But he was frustrated by the spotty performance of the 20Mbps connection available through his AT&T U-verse Internet service. In the middle of an online match, his game often froze, leaving his avatar unable to move or shoot. Other times, the game would pause or buffer as he fired, his opponent suddenly popping up in another location as the game lurched forward in real time.

"I had the maximum U-verse service at 20Mbps," he said. "But I never really got that speed. It was always slower."

Miller was among the first in Austin to sign up online for Google Fiber when it was announced in April 2013. His hope: that Google's $70-a-month 1Gbps service would be the answer to his problems.

But then he got a call from AT&T with an offer for its new GigaPower service. Even though the 1Gbps service wasn't yet available, AT&T offered Miller 300Mbps -- 15 times the speed he was paying for. The best news was that the cost of his service would drop from $208 a month to $120. When AT&T finishes upgrading the electronics on the network later this year, he expects a 50-fold improvement over his old speed. With a connection that fast, Miller could stream at least five high-definition videos at the same time without buffering and still have enough bandwidth to play his games and surf the Web.

"It was a no-brainer," he said of the switch.

Call it the Google Fiber effect. Google makes a splashy announcement that it intends to build a super high-speed network in a city. Competition follows, which translates into higher-speed services and lower prices for consumers.

A year after Google unveiled its plans in Austin, investments in gigabit fiber networks are being announced across the country. Incumbent Internet providers, like AT&T, and new entrants alike are taking elements of the Google Fiber playbook and applying them to their own deployments as they try to stay ahead of Google. AT&T last week said it was talking to 21 major metropolitan areas about an expansion of its U-verse with GigaPower fiber service. Others such as regional wireless operator C-Spire, which is using the Google Fiber business plan to build a fiber broadband network in Mississippi, are creating new lines of business using existing infrastructure.

How Google chooses the cities it deploys fiber to has been a mystery, but sister site TechRepublic recently attempted to bring some clarity to the selection process.

Within a week of Google's declaration last spring that it planned to build a fiber network in the city of Austin, AT&T, which is based a few hours' drive away in Dallas, announced its own Austin fiber network. And in less than a year's time, AT&T and local cable operator Grande Communications have beaten Google to market with their own ultra-high speed services using newly built fiber networks.

Like Google, which offers service over its fiber network in two cities today, these companies are striving for 1Gbps speeds at affordable prices -- less than $100 a month -- making ultra high-speed broadband a much more attractive offering for consumers, who stream lots of video, play online games and want to upload photos and other files in seconds rather than hours. Even slow-moving incumbent cable operator Time Warner Cable has increased speeds on its traditional copper cable infrastructure.

"Google Fiber has been the biggest driver of the fiber-to-the home movement," said Blair Levin, executive director of the Gig.U project and head of the committee that wrote the 2010 National Broadband Plan for the Federal Communications Commission.

In 2008, the Organization for Economic Co-operation and Development (OECD) ranked the US as No. 15 out of 30 countries when it came to broadband penetration and speeds. With the US in the midst of a massive recession back then, prospects for investment in new broadband infrastructure looked dismal. But Google Fiber seems to have lit a fire under the feet of the broadband industry.

"In 2009 when we were writing the National Broadband Plan it looked like the US was headed toward a significant under-investment in broadband infrastructure by 2020," Levin said. "Other countries were well ahead of us. But I have to say since Google's announcements, things are a whole lot better than what we had predicted five years ago."

Google says that it has also noticed an uptick in gigabit projects throughout the US, as broadband providers recognize that people have a "need for speed."

"The truth is, people across America want access to faster Internet," Jenna Wandres, a spokeswoman for Google Fiber said in an email. "There's a growing demand for speed from folks, who don't want to wait for videos to buffer, and who don't want to fight their family members for bandwidth. This was really the main reason we decided to build Fiber back in 2009."

Google is still going through Austin's permitting process before it begins its initial fiber deployment. Currently, Kansas City and Provo, Utah, are the only cities in which Google Fiber is available. Earlier this year, the company listed 34 cities in nine metro markets that it was considering for deployment.

An ideal city

Austin, a city of about 865,000 people, might be the luckiest city in the country when it comes to Internet access.

Proud Austinites will rattle off a list of reasons why their city is ripe for massive capital investments in new, speedier Internet infrastructure.

The once-small college town, which is also home to the Texas state legislature, often makes it onto top 10 lists of best places to live in the US. This, coupled with the city's thriving tech and arts scene, has made it one of the fastest-growing cities in the country. Whole Foods and Dell are headquartered here, and Apple, Samsung, Facebook, and Dropbox are opening offices. SXSW, the popular music and tech festival, also makes its home here.

As a result, Austinites are particularly tech-savvy, according to AT&T's executives. Not only did the city have a higher concentration of Apple iPhone users compared to big cities like Chicago or New York when the smartphone was released in 2007, but broadband consumers in Austin often use 15 percent to 20 percent more data than the average AT&T U-verse customer, according to Dave Nichols, AT&T state president of Texas, who is a key lobbyist for the company in Texas.

"When we decided to launch our fiber service, we couldn't think of a better place than Austin," he said. "When it comes to technology it's very forward-looking."

AT&T maintains it has been planning this fiber upgrade for a long time, and that Google's announcement didn't affect the timing of its network.

But Rondella Hawkins, the telecommunications and regulatory affairs officer for the city of Austin, said she had never heard about AT&T's plans before Google's news came out. Hawkins was part of the original committee that put together Austin's application to become the first Google Fiber city. The city ultimately lost out to Kansas City.

"Our application for Google would have been a good tip-off to the incumbents that we were eager as a community to get fiber built," Hawkins said in an interview. "But we never heard from them. Until Google announced that it was going to deploy a fiber network in Austin, I was unaware of AT&T's plans to roll out gigabit fiber to the home."

Grande Communications' CEO Matt Murphy admits that without Google in the market, his company wouldn't have moved so aggressively on offering gigabit speeds. It also wouldn't be offering its service at the modest price of $65 a month, considering that the average broadband download speed sold in the US is between 20Mbps and 25Mbps for about $45 to $50 a month.

"1 gigabit per second is such a leap in terms of speeds," Murphy said. "It's nothing we would have even considered doing yet without Google in the market."

Even with such a tech-centric crowd, it's hard to imagine that three companies -- AT&T, Grande and Google -- decided at roughly the same time that this city should be among the first to get ultra-high-speed broadband. It's even harder to believe that all three players would decide to offer service 50 to 100 times faster than what they currently sell, at a cost only about $20 to $30 more than their average broadband package.

This is a huge leap in speed for a very small price increase, considering that AT&T currently offers 6Mbps DSL service for $35 a month. In markets where it offers its regular U-verse broadband service, AT&T charges $45 a month for 18Mbps service and $65 for a 45Mbps service.
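
One way to see just how large that leap is: a quick dollars-per-megabit comparison of the tiers quoted above, sketched in Python. The prices and speeds come from the article; the per-megabit framing is simply one illustrative metric, not anything the providers publish.

# Price per megabit for the tiers quoted in the article.
tiers = {
    "AT&T 6Mbps DSL": (35, 6),
    "AT&T U-verse 18Mbps": (45, 18),
    "AT&T U-verse 45Mbps": (65, 45),
    "Gigabit fiber (GigaPower / Google Fiber)": (70, 1000),
}
for name, (dollars, mbps) in tiers.items():
    print(f"{name}: ${dollars}/month = ${dollars / mbps:.2f} per Mbps")

# Output runs from about $5.83 per Mbps for DSL down to roughly
# $0.07 per Mbps for the gigabit tiers.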

While it's clear that Google Fiber is not coming to every community, the pressure is on.

It's not surprising, then, that in every city in AT&T's 22-state footprint where Google is considering deploying fiber, AT&T also plans to bring GigaPower. That's a total of 14 markets, including Austin, the Triangle region of North Carolina, and Atlanta, home to AT&T's mobility division.

Major cities not on the Google roadmap include San Francisco and Los Angeles.

While AT&T refuses to acknowledge that its gigabit fiber plans are answering the competitive challenge posed by Google Fiber, others say that Kansas City may have been a wake-up call.

"I think all the providers have learned some valuable lessons from Google's Kansas City deployment," said Julie Huls, president and CEO of the Austin Technology Council. Kansas City went live with Google Fiber in November 2012. "Speed to market and speed to deployment really matters and will determine the winners in a market. So it doesn't pay to be a laggard."

The lessons of Google Fiber

Google wasn't the first company to use fiber to deliver high-speed broadband, but it was the first company to offer such high speeds at $70 a month. It was also the first to come up with a business plan that requires participation from the city government and community.

Google specifically asked cities to cut the red tape required to make deployment more efficient and economical. And it asked communities to rally support and commit residents to subscribe to the service before it agreed to install the expensive infrastructure.

"What Google recognized that others didn't is that Americans want to have the best communications infrastructure," Gig.U's Levin said. "When you say to a community, 'Who wants fiber and a chance to have the most advanced network in the country and possibly the world?' you get a whole bunch of hands going up."

AT&T's executives admit that Google has made it easier for AT&T and others to work with cities where it wants to deploy its own gigabit fiber service.

"Since Google Fiber came on the scene, we've seen a significant shift in how municipalities view network operators," said Eric Boyer, senior vice president of U-verse. "They see how Kansas City was able to work with Google and now, they're willing to do that with other providers."

Specifically, cities such as Austin are trying to speed up the permit and inspections processes.

"In the past, certain permitting processes cost us millions of dollars," said Eric Small, vice president of Fiber broadband planning for AT&T. "But now the city is interested in working with us to reduce those expenses."

Need for speed? Maybe yes, maybe no

Other broadband operators have built networks capable of delivering 1Gbps service. Cable operators, which use a different network technology, have already demonstrated download speeds at that level. Verizon Communications, which was the first major broadband provider to install a full fiber network, has stopped short of delivering 1Gbps service, even though it is capable of delivering such speeds.

Cable operators and Verizon have said that customers don't need or want a service at those speeds.

"We're continuing to see a growing interest for faster broadband among our customer base," Bill Kula, a spokesman for Verizon, said in an email. "However, widespread adoption of 1Gbps is not evident as of yet."

Indeed, today very few Americans have connections at that speed, but demand for broadband itself is increasing. Pew Research found in its most recent survey, conducted in September, that about 70 percent of Americans have broadband service, which is up from 66 percent the previous year. But Pew and the Federal Communications Commission have a very low benchmark for what constitutes broadband: download speeds of 4Mbps and uploads of 1Mbps.

To put this in perspective: a single DVD-quality Netflix movie requires a broadband connection of about 3Mbps. You need speeds of at least 5Mbps if you want to stream that movie in high-definition. With a 1Gbps connection you could stream at least five HD videos at the same time and still have plenty of bandwidth to surf the web, check email, and upload pictures to Facebook. Also, with a 1Gbps connection you can simply do things much faster. For instance, you could download an entire HD movie in about 33 seconds.
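
Those figures are easy to sanity-check. The short Python sketch below uses the article's per-stream bitrates; the 4GB size assumed for an HD film is an assumption chosen to roughly reproduce the 33-second figure, not a number from the story.

# Sanity-checking the article's bandwidth arithmetic.
link_mbps = 1000            # a 1Gbps connection
hd_stream_mbps = 5          # HD stream bitrate, per the article
streams = 5
print(f"Five HD streams: {streams * hd_stream_mbps}Mbps, "
      f"leaving {link_mbps - streams * hd_stream_mbps}Mbps of headroom")

movie_gb = 4                # assumed size of an HD movie, in gigabytes
seconds = movie_gb * 8 * 1000 / link_mbps
print(f"A {movie_gb}GB HD movie downloads in ~{seconds:.0f} seconds at 1Gbps")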

But cable operators and Verizon are skeptical about whether consumers really need to be streaming five HD movies at once. And speeds that are a tenth as fast as the gigabit service (100 Mbps) can also offer speedy downloads.

These companies have a point. Even Grande CEO Murphy admits that most consumers don't need to go that fast. He added that even if they subscribe to such a service, the equipment and devices in the home aren't capable of delivering those full speeds. Few customers even subscribed to the company's highest tier of service, which previously topped out at 100Mbps, before it introduced the 1Gbps service.

David Noonan, who covers broadband for consultancy IBB, said that most families couldn't consume enough online media to justify a 1Gbps connection.

"But it doesn't mean that they don't want it," he said. What Google and other broadband providers are doing, then, when they tout gigabit services is this: marketing.

Murphy admits that going to such speeds has been great publicity. "We've gotten an unbelievable amount of PR from raising the speeds," he said. "As a small provider we rarely have something as new and noteworthy."

Getting the price right

Even if 1Gbps is overkill for most consumers, speeds of 100Mbps or even 300Mbps may not be. Incumbent providers such as Comcast and Verizon offer such speeds in certain markets, but the pricing on these services is often well over $100. For example, Comcast and Verizon each charge more than $300 a month for their 500Mbps services, which are available only in certain markets.

It's little surprise that Comcast and Verizon have seen few customers sign up for these services, which has led executives, such as Brian Roberts, the CEO of Comcast, to conclude that consumers don't really see a need for these speeds.

But the reality is that consumers likely don't see a need that justifies exorbitant prices.

Google, however, entered the market at $70 a month, which is $20 to $25 above the average price that most customers are comfortable spending on Internet service, said Murphy. Even with that difference, some consumers may find the pricing a stretch. But the overwhelmingly higher speed can often entice customers into a higher-priced package.

That's exactly what happened to Austin local David Greene. Greene, who for the past 12 years has gone without cable TV, agreed to take the U-verse TV package on top of the $70-a-month broadband service simply because it was only $50 more a month.

Greene said he is willing to pony up the extra money for AT&T's video package because he is getting such a great deal on his 1Gbps broadband service. Even though he's paying $65 more per month, he said it's worth it for the nearly 100-fold increase in broadband speed.

Execs at AT&T agree that the prices on other top-tier broadband speeds have been too high.

"People aren't willing to spend five times more for the higher-speed service," said AT&T's Small. "They might spend 50 percent more, but not the multiple it has been in the last few years."

Even at the $70 price point, AT&T may have to fight to retain customers once Google Fiber is up and running in Austin. Greene says he is satisfied with AT&T's GigaPower service and has been more than happy with the company's customer support, but he could be persuaded to make a change.

"I'm no brand loyalist," he said. "I'd absolutely switch if Google offered a better deal."
http://www.cnet.com/news/googles-fib...and-explosion/

















Until next week,

- js.



















Current Week In Review





Recent WiRs -

May 3rd, April 26th, April 19th, April 12th

Jack Spratts' Week In Review is published every Friday. Submit letters, articles, press releases, comments, questions etc. in plain text English to jackspratts (at) lycos (dot) com. Submission deadlines are Thursdays @ 1400 UTC. Please include contact info. The right to publish all remarks is reserved.


"The First Amendment rests on the assumption that the widest possible dissemination of information from diverse and antagonistic sources is essential to the welfare of the public."
- Hugo Black