P2P-Zone  

JackSpratts
 
Join Date: May 2001
Location: New England
Posts: 10,013
Peer-To-Peer News - The Week In Review - July 4th, '15

Since 2002

"Secrecy is no way to trade. We need to know what the government is preparing to trade away in our names." – Senator Ludlam, Greens Party


"At the time, what put a thorn in my ass was seeing fans get sued. Now I can just pay a subscription and get all this music… I think it is a healthier answer. You ask a lot of kids today how much pirating are you doing, and I think it's down." – Nikki Sixx


July 4th, 2015




Surveillance Court Rules That N.S.A. Can Resume Bulk Data Collection
Charlie Savage

The Foreign Intelligence Surveillance Court ruled late Monday that the National Security Agency may temporarily resume its once-secret program that systematically collects records of Americans’ domestic phone calls in bulk.

But the American Civil Liberties Union said Tuesday that it would ask the United States Court of Appeals for the Second Circuit, which had ruled that the surveillance program was illegal, to issue an injunction to halt the program, setting up a potential conflict between the two courts.

The program lapsed on June 1, when a law on which it was based, Section 215 of the USA Patriot Act, expired. Congress revived that provision on June 2 with a bill called the USA Freedom Act, which said the provision could not be used for bulk collection after six months.

The six-month period was intended to give intelligence agencies time to move to a new system in which the phone records — which include information like phone numbers and the duration of calls but not the contents of conversations — would stay in the hands of phone companies. Under those rules, the agency would still be able to gain access to the records to analyze links between callers and suspected terrorists.

But, complicating matters, in May the Court of Appeals for the Second Circuit, in New York, ruled in a lawsuit brought by the A.C.L.U. that Section 215 of the Patriot Act could not legitimately be interpreted as permitting bulk collection at all.

Congress did not include language in the Freedom Act contradicting the Second Circuit ruling or authorizing bulk collection even for the six-month transition. As a result, it was unclear whether the program had a lawful basis to resume in the interim.

After President Obama signed the Freedom Act on June 2, his administration applied to restart the program for six months. But a conservative and libertarian advocacy group, FreedomWorks, filed a motion in the surveillance court saying it had no legal authority to permit the program to resume, even for the interim period.

In a 26-page opinion made public on Tuesday, Judge Michael W. Mosman of the surveillance court rejected the challenge by FreedomWorks, which was represented by a former Virginia attorney general, Ken Cuccinelli, a Republican. And Judge Mosman said the Second Circuit was wrong, too.

“Second Circuit rulings are not binding” on the surveillance court, he wrote, “and this court respectfully disagrees with that court’s analysis, especially in view of the intervening enactment of the USA Freedom Act.”

When the Second Circuit issued its ruling that the program was illegal, it did not issue any injunction ordering the program halted, saying it would be prudent to see what Congress did as Section 215 neared its June 1 expiration. Jameel Jaffer, an A.C.L.U. lawyer, said on Tuesday that the group would now ask for one.

“Neither the statute nor the Constitution permits the government to subject millions of innocent people to this kind of intrusive surveillance,” Mr. Jaffer said. “We intend to ask the court to prohibit the surveillance and to order the N.S.A. to purge the records it’s already collected.”

The bulk phone records program traces back to October 2001, when the Bush administration secretly authorized the N.S.A. to collect records of Americans’ domestic phone calls in bulk as part of a broader set of post-Sept. 11 counterterrorism efforts.

The program began on the basis of presidential power alone. In 2006, the Bush administration persuaded the surveillance court to begin blessing it under Section 215 of the Patriot Act, which says the government may collect records that are “relevant” to a national security investigation.

The program was declassified in June 2013 after its existence was disclosed by the former intelligence contractor Edward J. Snowden.

It remains unclear whether the Second Circuit still considers the surveillance program to be illegal during this six-month transition period. The basis for its ruling in May was that Congress had never intended for Section 215 to authorize bulk collection.

In his ruling, Judge Mosman said that because Congress knew how the surveillance court was interpreting Section 215 when it passed the Freedom Act, lawmakers implicitly authorized bulk collection to resume for the transition period.

“Congress could have prohibited bulk data collection” effective immediately, he wrote. “Instead, after lengthy public debate, and with crystal-clear knowledge of the fact of ongoing bulk collection of call detail records,” it chose to allow a 180-day transitional period during which such collection could continue, he wrote.

The surveillance court is subject to review by its own appeals panel, the Foreign Intelligence Surveillance Court of Review. Both the Second Circuit and the surveillance review court are in turn subject to the Supreme Court, which resolves conflicts between appeals courts.

Wyn Hornbuckle, a Justice Department spokesman, said in a written statement that the Obama administration agreed with Judge Mosman.

Since the program was made public, plaintiffs have filed several lawsuits before regular courts, which hear arguments from each side before issuing rulings, unlike the surveillance court’s usual practice, which is to hear only from the government. Judge Mosman’s disagreement with the Second Circuit is the second time that the surveillance court has rejected a contrary ruling about the program by a judge in the regular court system.

In a lawsuit challenging the program that was brought by the conservative legal advocate Larry Klayman, Judge Richard J. Leon of Federal District Court in the District of Columbia ruled in December 2013 that the program most likely violated the Fourth Amendment, which prohibits unreasonable searches and seizures.

But in March 2014, Judge Rosemary M. Collyer, a Federal District Court judge who also sits on the secret surveillance court, rejected Judge Leon’s reasoning and permitted the program to keep going. The Obama administration has appealed Judge Leon’s decision to the Court of Appeals for the District of Columbia.

The Freedom Act also contains a provision saying that whenever the surveillance court addresses a novel and significant legal issue, it must either appoint an outside “friend of the court” who can offer arguments contrary to what the government is saying, or explain why appointing one is not appropriate.

The first test of that reform came last month when another judge on the court, F. Dennis Saylor IV, addressed a separate issue raised by the passage of the Freedom Act. Judge Saylor acknowledged that it was novel and significant, but declined to appoint an outside advocate, saying the answer to the legal question was “sufficiently clear” to him without hearing from one.
http://www.nytimes.com/2015/07/01/us...ollection.html





China Adopts New Security Law to Make Networks, Systems 'Controllable'
Michael Martina

China's legislature adopted a sweeping national security law on Wednesday that covers everything from territorial sovereignty to measures to tighten cyber security, a move likely to rile foreign businesses.

A core component of the law, passed by the standing committee of the National People's Congress (NPC), is to make all key network infrastructure and information systems "secure and controllable".

President Xi Jinping has said China's security covers areas including politics, culture, the military, the economy, technology and the environment.

But foreign business groups and diplomats have argued that the law is vague and fear it could require technology firms to make their products in China or submit source code to inspectors, forcing them to expose intellectual property.

Zheng Shuna, vice chairwoman of the Legislative Affairs Commission of the NPC standing committee, downplayed those concerns, saying China welcomes "all countries' businesses to operate in China and provide legitimate services according to law".

"We will continue to follow the path of peaceful development but we absolutely will not give up our legitimate rights and absolutely will not sacrifice the country's core interests," she said at a briefing.

The security of territorial seas and airspace is among those core interests, which, according to the legislation, China will take "all necessary measures" to safeguard.

The law, which comes amid tensions with neighbors over disputes in the South China and East China Seas, passed through the NPC standing committee, the top body of China's rubber stamp parliament, by a vote of 154 to zero, with one abstention.

'GROWING INFLUENCE OF HARDLINERS'

The national security law is part of a raft of government legislation - including laws on anti-terrorism, cyber security and foreign non-government organizations - that have drawn criticism from foreign governments, business and civil society groups.

Those policies, many of which have cyber security components, have emerged after former National Security Agency contractor Edward Snowden disclosed that U.S. spy agencies planted code in American tech exports to snoop on overseas targets.

"The fact that these different pieces of legislation are all moving forward in tandem indicates the seriousness of Beijing's commitment as well as the growing influence of hardliners shaping China's technology policy agenda," Samm Sacks, an analyst at U.S.-based consulting firm Eurasia Group, said in an emailed statement.

Critics have argued that the extensive nature of the law, which covers everything from China's deep sea and space assets to "harmful cultural influences", constitutes national security overreach.

Its passage also coincides with a crackdown on dissent, as the government has detained and jailed activists and blamed "foreign forces" for the pro-democracy protests in Hong Kong last year.

Hong Kong and Macau must "fulfill responsibilities to safeguard national security" according to the law, which also covers crimes of subversion and inciting rebellion. That reference could spark more fears of Beijing encroaching on Hong Kong's rule of law.

Britain returned Hong Kong to China in 1997 under a "one country, two systems" formula, with the promise of a high degree of autonomy. Unlike on the mainland, Hong Kong does not have laws criminalizing subversion of the state. Macau, a former Portuguese colony, returned to China in 1999.

Some seven months after Hong Kong police forcibly cleared pro-democracy protesters from the streets, tens of thousands of people were expected to rally for free elections on Wednesday as the city marks the 18th anniversary of its return to China.

(Additional reporting by Sui-Lee Wee; Editing by Raju Gopalakrishnan and Nick Macfie)
http://uk.reuters.com/article/2015/0...0PB39H20150701





Jitters in Tech World Over New Chinese Security Law
Paul Mozur

When a draft of China’s new national security law was made public in May, critics argued that it was too broad and left much open to interpretation.

In the final form of the law, which the government said Wednesday had been enacted, Beijing got more specific, but in a way that is sending ripples through the global technology industry.

New language in the rules calls for a “national security review” of the technology industry — including networking and other products and services — and foreign investment. The law also calls for technology that supports crucial sectors to be “secure and controllable,” a catchphrase that multinationals and industry groups say could be used to force companies to build so-called back doors — which allow third-party access to systems — provide encryption keys or even hand over source code.

As with many Chinese laws, the language is vague enough to make it unclear how the law will be enforced, but it suggests a new front in the wider clash between China and the United States over online security and technology policy.

The United States has accused China of state-sponsored hacking attacks against American companies to gain a commercial advantage, and of creating policies meant to force the transfer of intellectual property to Chinese companies.

In turn, China maintains that the disclosures by Edward J. Snowden, the former United States National Security Agency contractor, about American online espionage give it plenty of reason to wean itself from foreign technology that may have been tampered with by United States intelligence agencies.

The most recent policy clash between the United States and China came in April, and it ended with Beijing’s saying that it would withdraw a law that restricted which technology products could be sold by foreign companies to Chinese banks. Groups that represent companies like Apple, Google and Microsoft had pushed against that law.

At the time, the withdrawal was billed as a victory for foreign companies, but the recent additions to the Chinese national security law show that it might have been a brief reprieve. The changes to the law are also likely to increase lobbying pressure on the United States by multinationals aimed at a separate Chinese measure, a counterterrorism law that Beijing is expected to pass this year and that could include stronger restrictions on foreign technologies being sold into China.

“I think it’s a perfect storm: The cybersecurity concerns because of Snowden and the techno-nationalist perspective have really gained strength over the past few years,” said Adam Segal, a senior fellow at the Council on Foreign Relations in New York. “China is not particularly swayed by or sympathetic to arguments that the foreign companies have made, and they’re going to push forward on all these fronts.”

At a news conference on Wednesday in Beijing, Zheng Shuna, deputy director of the legislative affairs commission of the National People’s Congress, China’s legislature, underscored those concerns.

“China’s cybersovereignty shall be respected and maintained,” she said, using the term Beijing has adopted to argue that countries should be allowed to enact whatever laws are necessary to manage the Internet and information technology within their borders.

“Raising the idea of ‘safeguarding national cybersovereignty’ in the National Security Law is a response to the needs of the development of the Chinese Internet,” Ms. Zheng added. “It provides the legal basis for managing cyberactivity on China’s soil and resisting activities which jeopardize China’s cybersecurity.”

The concept of a governmental body’s reviewing the national security concerns for issues like foreign investment in the technology industry is not reserved to the Chinese. The Committee on Foreign Investment in the United States, for instance, oversees and occasionally bars investment in sensitive industries by countries like China.

Still, foreign companies argue that Beijing is likely to use such an oversight body to favor local companies and to push multinationals to do more to help Chinese companies develop more advanced capabilities.

Mr. Segal pointed to one phrase in the new Chinese law that could be particularly problematic: “secure and controllable.”

“Since no one knows how you implement that phrase,” he said, “foreign companies are worried about what that’s going to mean. Does it mean they have to give access through back doors, or are they going to have to partner with Chinese firms?”

Mr. Segal added that one recourse would be to argue through the World Trade Organization that what China considers national security concerns are not valid.
http://www.nytimes.com/2015/07/03/bu...urity-law.html





Sloppy Cyber Threat Sharing Is Surveillance by Another Name
Jennifer Granick

This post is the latest installment of our “Monday Reflections” feature, in which a different Just Security editor examines the big stories from the previous week or looks ahead to key developments on the horizon.

Imagine you are the target of a phishing attack: Someone sends you an email attachment containing malware. Your email service provider shares the attachment with the government, so that others can configure their computer systems to spot similar attacks. The next day, your provider gets a call. It’s the Department of Homeland Security (DHS), and they’re curious. The malware appears to be from Turkey. Why, DHS wants to know, might someone in Turkey be interested in attacking you? So, would your email company please share all your emails with the government? Knowing more about you, investigators might better understand the attack.

Normally, your email provider wouldn’t be allowed to turn this information over without your consent or a search warrant. But that could soon change. The Senate may soon make another attempt at passing the Cybersecurity Information Sharing Act, a bill that would waive privacy laws in the name of cybersecurity. In April, the US House of Representatives passed by strong majorities two similar “cyber threat” information sharing bills. These bills grant companies immunity for giving DHS information about network attacks, attackers, and online crimes.

Sharing information about security vulnerabilities is a good idea. Shared vulnerability data empowers other system operators to check and see if they, too, have been attacked, and also to guard against being similarly attacked in the future. I’ve spent most of my career fighting for researchers’ rights to share this kind of information against threats from companies that didn’t want their customers to know their products were flawed.

But, these bills gut legal protections against government fishing expeditions exactly at a time when individuals and Internet companies need privacy laws to get stronger, not weaker.

Worse, the bills aren’t needed. Private companies share threat data with each other, and even with the government, all the time. The threat data that security professionals use to protect networks from future attacks is a far narrower category of information than what the bills being considered by Congress would cover, and it will only rarely contain private information.

And none of the recent cyberattacks — not Sony, not Target, and not the devastating grab of sensitive background check interviews on government employees at the Office of Personnel Management — would have been mitigated by these bills.

None of this has stopped private companies from crowing about their need for corporate immunity, but it should stop Congress from giving it to them. We don’t need to pass laws gutting privacy rights to save cybersecurity.

These bills aren’t needed and aren’t designed to encourage sharing the right kind of information. These are surveillance bills masquerading as security bills.

Instead of removing (non-existent) barriers to sharing — and undermining American privacy in the process — Congress should consider how to make sharing worthwhile. I’ve been told by many entities, corporate and academic, that they don’t share with the government because the government doesn’t share back. Silicon Valley engineers have wondered aloud what value DHS has to offer in their efforts to secure their employer’s services. It’s not like DHS is setting a great security example for anyone to follow. OPM’s Inspector General warned the government about security problems that, left unaddressed, led to the OPM breach.

And there’s a very serious trust issue. We recently learned that the NSA is sitting on the domestic Internet backbone, searching for foreign cyberthreats, helping the FBI and thinking about how to get authority to scan more widely. You can see the lifecycle now. Vulnerable company reports a threat to DHS, NSA programs its computers to search for that threat, vulnerable company’s proprietary data gets sucked in by FBI. Any company has to think at least twice about sharing how they are vulnerable with a government that hoards security vulnerabilities and exploits them to conduct massive surveillance.

Cybersecurity is a serious problem, but it’s not going to get better with Congress doing whatever it politically can instead of doing what it should. It’s not going to get better by neutering the few privacy protections we have. Good security is supposed to keep your information safe. But these laws will make your private emails and information vulnerable. Lawmakers have got to start listening to experts, and experts are saying the same thing. Don’t just do something, do the right thing. And if you can’t do the right thing, then don’t do anything at all.
http://justsecurity.org/24261/sloppy...-surveillance/





Most Internet Anonymity Software Leaks Users’ Details

Services used by hundreds of thousands of people in the UK to protect their identity on the web are vulnerable to leaks, according to researchers at QMUL and others.
Will Hoyles

Virtual Private Networks (VPNs) are legal and increasingly popular for individuals wanting to circumvent censorship, avoid mass surveillance or access geographically limited services like Netflix and BBC iPlayer. Used by around 20 per cent of European internet users, they encrypt users’ internet communications, making it more difficult for people to monitor their activities.

The study of fourteen popular VPN providers found that eleven of them leaked information about the user because of a vulnerability known as ‘IPv6 leakage’. The leaked information ranged from the websites a user is accessing to the actual content of user communications, for example comments being posted on forums. Interactions with websites running HTTPS encryption, which includes financial transactions, were not leaked.

The leakage occurs because network operators are increasingly deploying a new version of the protocol used to run the Internet, called IPv6. IPv6 replaces the previous IPv4, but many VPNs only protect users’ IPv4 traffic. The researchers tested their ideas by choosing fourteen of the most popular VPN providers and connecting various devices to a WiFi access point which was designed to mimic the attacks hackers might use.
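To make the idea of IPv6 leakage concrete, here is a minimal sketch, not taken from the QMUL study, for checking whether a VPN tunnel also covers IPv6 traffic: it asks an IPv4-only and an IPv6-only address-lookup service which public address each protocol presents. The icanhazip.com endpoints are an assumption; any equivalent "what is my IP" service will do.

```python
# Minimal sketch (not from the study): does the VPN also tunnel IPv6?
# The lookup endpoints below are assumptions; swap in any service you trust.
import urllib.request

IPV4_URL = "https://ipv4.icanhazip.com"  # answers only over IPv4
IPV6_URL = "https://ipv6.icanhazip.com"  # answers only over IPv6

def public_ip(url):
    try:
        return urllib.request.urlopen(url, timeout=5).read().decode().strip()
    except OSError:
        return None  # no route over that protocol, or the lookup failed

print("Public IPv4 address:", public_ip(IPV4_URL))
print("Public IPv6 address:", public_ip(IPV6_URL))

# With a leak-free VPN both addresses should belong to the VPN provider,
# or the IPv6 lookup should fail outright if the tunnel blocks IPv6.
# An IPv6 address belonging to your own ISP while the VPN is up is the
# kind of leakage the study describes.
```

Running the check once with the VPN disconnected and once with it connected makes the leak obvious: if the IPv6 address does not change, the tunnel is not covering IPv6.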

Researchers attempted two of the kinds of attacks that might be used to gather user data – ‘passive monitoring’, simply collecting the unencrypted information that passed through the access point; and DNS hijacking, redirecting browsers to a controlled web server by pretending to be commonly visited websites like Google and Facebook.

The study also examined the security of various mobile platforms when using VPNs and found that they were much more secure when using Apple’s iOS, but were still vulnerable to leakage when using Google’s Android.

Dr Gareth Tyson, a lecturer from QMUL and co-author of the study, said:

“There are a variety of reasons why someone might want to hide their identity online and it’s worrying that they might be vulnerable despite using a service that is specifically designed to protect them.

“We’re most concerned for those people trying to protect their browsing from oppressive regimes. They could be emboldened by their supposed anonymity while actually revealing all their data and online activity and exposing themselves to possible repercussions.”
http://www.qmul.ac.uk/media/news/items/se/158459.html





Avira Wins Case Upholding its Right to Block Adware
Mark Wilson

Security firm Avira has won a court case that can not only be chalked up as a win for consumer rights, but could also set something of a precedent. German company Freemium.com took Avira to court for warning users about 'potentially unwanted applications' that could be bundled along with a number of popular games and applications.

Freemium.com downloads included a number of unwanted extras in the form of browser toolbars, free trial applications, adware, and other crapware. Avira's antivirus software warned users installing such applications; Freemium objected and sent a cease-and-desist letter, claiming anti-competitive practices. But the court ruled in Avira's favor, saying it could continue to flag up and block questionable software.

Freemium runs a number of download and gaming sites, and its own installer is frequently used in place of a regular setup executable. It is used to install software from partners and advertisers to generate extra funds. The company claimed its ad revenue has dropped by 50 percent since its software had been flagged. The court said that Avira was well within its rights to issue warnings about potentially unwanted components. Freemium was ordered to pay all court costs.

Avira CEO Travis Witteveen said:

“This ruling establishes a major legal milestone in the fight against misleading consumers into unintentionally installing unwanted software onto their computers. Earlier this year we established clear guidelines defining unethical software behavior, and defining what our security software will block. We believe in 'freemium' and advertising-supported business models, however they must remain transparent and ethical in their implementation.”

Freemium had argued that users were given the ability to opt out of installing extra software, but the court found that opting out was not made sufficiently clear.
http://betanews.com/2015/06/29/avira...-block-adware/





'Blind Agreement' and Closed-Door Deals: Report Slams TPP Negotiations

With sign-off on the Trans-Pacific Partnership edging closer and critics warning the deal could "attack internet freedoms," a Parliamentary review has slammed the negotiating process for a lack of oversight.
Claire Reilly

As a trade deal between Australia and its allies edges closer towards completion, leading critics to warn of an impending "attack [on] internet freedoms," a parliamentary committee has slammed the deal-making process saying it lacks adequate "oversight and scrutiny."

The comments come as Australia engages in closed-door negotiations with 11 other countries over the Trans-Pacific Partnership -- a trade agreement that could change the copyright and piracy landscape in Australia and have major ramifications for the way Australians access online content.

A joint-Parliamentary report on the TPP and other trade deals has declared that "not all is right with the current process" and that politicians and key stakeholders are being "kept in the dark" on the negotiation process.

The "Blind Agreement" report warns that under the current system, "Parliament is faced with an all-or-nothing choice" on whether or not to approve trade agreements and can only officially review trade laws once they've officially passed.

"This does not provide an adequate level of oversight and scrutiny," the report reads. "Parliament should play a constructive role during negotiations and not merely rubber-stamp agreements that have been negotiated behind closed doors."

The Committee, comprising three Labor Senators, two Liberal Senators and one Senator from the Greens, made a series of 10 recommendations for reform. However, a "dissenting report" appended to the text noted that the Coalition (Liberal) Senators "disagree with all of the findings and recommendations of the majority report" and maintain that "Australia's treaty-making system works well."

The full text of the TPP has not been made public, with only select unverified chapters surfacing through Wikileaks, including a chapter covering intellectual property and copyright.

Greens Senator Scott Ludlam, also a member of the committee that drafted the Blind Agreement report, has heavily criticised the secrecy of TPP negotiations saying this IP chapter alone has the power to "attack internet freedoms and criminalise downloading."

"We know from other leaks the TPP covers everything from giving America the right to put Australian Internet users under surveillance, to giving multinational companies the rights to sue governments for the laws they make," said Senator Ludlam.

"Secrecy is no way to trade. We need to know what the government is preparing to trade away in our names."

Australia's Productivity Commission has also warned the TPP could bring a further tightening of digital freedoms, giving rights holders the power to increase 'technical protection measures' such as geo-blocks or digital rights management (DRM) protections on content.

But it's not just Australian politicians who have called the negotiation process into question, with US Democratic Senator Barbara Boxer recently taking to the floor of the US Senate to explain the convoluted steps she needed to take to view the text of the massive trade agreement.

"Follow this: You can only take a few of your staffers who happen to have a security clearance, because, God knows why, this is secure, this is classified," she said.

"The guard says...'You can take notes, but you have to give them back to me, and I'll put them in a file.' So I said: 'Wait a minute. I'm going to take notes and then you're going to take my notes away from me and then you're going to have them in a file, and you can read my notes? Not on your life.'"

Despite concerns over transparency stateside, the US Senate last week voted to give President Barack Obama the power to "fast track" the TPP. This grants the President authority to put a final draft of the TPP before Congress for a 'yes-or-no' vote, but Congress will not have power to amend any part of the trade agreement.

Back at home, this kind of "all-or-nothing" approval was a key point of contention raised in the joint-Parliamentary report on the TPP.

But while the committee examining Australia's deal-making future has called for reform on this front, amongst a total of 10 recommendations, the TPP marches closer to completion with Federal Trade Minister Andrew Robb saying less than a fortnight ago, "We are literally one week of negotiation away from completing this extraordinary deal."
http://www.cnet.com/au/news/blind-ag...ation-process/





This Online Anonymity Box Puts You a Mile Away From Your IP Address
Andy Greenberg

In the game of anonymity-versus-surveillance online, the discovery of the user’s IP address usually means game over. But if Ben Caudill has his way, a network snoop who successfully hunts a user through layers of proxy connections to a final IP address would be met with a dead end—while the anonymous user remains safe at home more than a mile away.

At the upcoming DefCon hacker conference in Las Vegas next month, Caudill plans to unveil ProxyHam, a “hardware proxy” designed to use a radio connection to add a physical layer of obfuscation to an internet user’s location. His open-source device, which he built for $200, connects to Wi-Fi and relays a user’s Internet connection over a 900 megahertz radio connection to their faraway computer, with a range of between one and 2.5 miles depending on interference from the landscape and buildings. That means even if investigators fully trace the user’s internet connection, they’ll find only the ProxyHam box the person planted in a remote library, cafe, or other public place—and not their actual location.

Caudill, a researcher for the consultancy Rhino Security Labs, compares his tool to typical tactics to hide the source of an Internet connection, like using a neighbor’s Wi-Fi, or working from a coffee shop instead of home. But “the problem with Wi-Fi as a protocol is that you can’t get the range you need. If the FBI kicks down the door, it may not be my door, but it’ll be so close they can hear me breathe,” says Caudill. “[ProxyHam] gives you all the benefits of being able to be at a Starbucks or some other remote location, but without physically being there.”

ProxyHam, which Caudill says he’ll offer for sale at cost to DefCon attendees and will also teach users how to build with instructions on his website and ProxyHam’s Github page (both available after DefCon), is actually two devices. The first part is a box the size of a large dictionary, containing a Raspberry Pi computer connected to a Wi-Fi card and a small 900 megahertz antenna, all of which is meant to be plugged in at some inconspicuous public place—Caudill suggests a dark corner of a public library. On the other end of the radio connection, the user plugs a 900 megahertz antenna into his or her ethernet port. (In the picture accompanying the original article, Caudill uses a giant Yagi antenna, but he says a much smaller $57 flat patch antenna works, too.)

Caudill intends ProxyHam to protect sensitive Internet users, such as dissidents and whistleblowers, for whom tools like VPNs and even the anonymity software Tor may not provide sufficient security. If an attacker can manage to install malware on the user’s PC, for instance, that malware can circumvent Tor and send the user’s IP address directly to the attacker. But with ProxyHam, that malware attack would only lead investigators to the ProxyHam device, not the user. “The KGB isn’t kicking in your door,” says Caudill. “They’re kicking in the door of the library 2.5 miles away.”

To avoid radio detection on the user’s end, ProxyHam’s wireless signals are designed to look indistinguishable from the many cordless telephones that use the same frequency. And Caudill says the rise of more internet-connected wireless gadgets will provide further cover for ProxyHam users over time. “There are a ton of devices jumping into that space and communicating there,” he says. “It’s not feasible to say ‘we’ll chase down everyone who has this device communicating on this frequency.’ It’s a needle in a haystack.”

No one should depend on ProxyHam alone—particularly until its security has been proven in real-world testing, says Micah Lee, a security technologist for The Intercept and occasional developer for the anonymous whistle-blowing software SecureDrop. But Lee points out that it can be used in combination with existing anonymity software like VPNs and Tor. “It seems like a thing to augment your Tor usage rather than replace it. In that sense, it seems like a good idea,” he says. Lee himself counsels anonymous leakers who use SecureDrop to send secrets to a news organization to first connect to a public Wi-Fi network. ProxyHam, he says, could accomplish something similar. “No matter how many hops over the Internet you use, if there’s someone spying on everything, they can connect all the dots. But if one of the hops isn’t over the Internet and is instead over a radio link, it’ll be a lot harder to connect those dots.”

The version of ProxyHam Caudill intends to sell at DefCon will be fairly basic. But in future versions he’s still developing, Caudill says the device will also include accelerometers designed to detect and warn users if it’s been moved from its hiding place. He’s even hoping to include a microphone that can act as a “black box” recorder to relay to the owner the last few moments of audio the ProxyHam hears before it’s disconnected. All of that, says Caudill, is intended to prevent investigators from discovering a ProxyHam and then tampering with it to eavesdrop on its communications or to trap a user who comes to fix or retrieve it.

Going to the trouble of buying and planting a ProxyHam device—one that if used safely, you may never see again—may sound like paranoia. But Caudill intends ProxyHam to protect the very most sensitive people on the internet, those for whom mere software protections aren’t good enough. “Journalists and dissidents in Arab Spring countries, for instance…these people have very high security requirements,” Caudill says. “This is that last-ditch effort to remain anonymous and keep yourself safe.”
http://www.wired.com/2015/07/online-...ay-ip-address/





EU Plans to Destroy Net Neutrality by Allowing Internet Fast Lanes

Potentially a big defeat for EU internet users, online startups and future innovation.
Glyn Moody

A two-tier Internet will be created in Europe as the result of a late-night "compromise" between the European Commission, European Parliament and the EU Council. The so-called "trilogue" meeting to reconcile the different positions of the three main EU institutions saw telecom companies gaining the right to offer "specialised services" on the Internet. These premium services will create a fast lane on the Internet and thus destroy net neutrality, which requires that equivalent traffic is treated in the same way.

In a fact sheet on the agreement, the European Commission tries to hide the reality that net neutrality is being destroyed by defining something called the "open Internet": "Under today's agreement, paid prioritisation in the open Internet will be banned. Based on this new legislation, all content and application providers will have guaranteed access to end-users in the open Internet. This access should not be dependent on the will or particular commercial interest of Internet service providers."

But running alongside this "open Internet," on the same network, there will be "specialised services," which are not open and where paid prioritisation is permitted: "The new EU net neutrality rules guarantee the open Internet and enable the provision of specialised or innovative services on condition that they do not harm the open Internet access." The caveat is vague, and in practice will not prevent "specialised services" competing with those offered on the "open Internet"—the Commission mentions "internet TV" as an example of a specialised service—so large companies will be able to offer premium services at attractive prices, which startups with limited resources will find hard to match.

The Commission is aware of that threat to fair competition. In its press release, it says the new rules will mean that "access to a start-up's website will not be unfairly slowed down to make the way for bigger companies." However, this only applies to its newly-defined "open Internet," where all traffic must be treated fairly. It does not apply to specialised services, which will be able to pay telecoms companies for faster delivery than rivals on the "open Internet." Inevitably, this tilts the playing field in favour of established players with deeper pockets.

The Commission's main argument for introducing "specialised services" is to encourage new offerings: "more and more innovative services require a certain transmission quality in order to work properly, such as telemedicine or automated driving." But it would be unwise to run these kinds of critical services over a connection that was also running traditional Internet services: instead, a dedicated connection used only for that service would be needed. In that case, prioritisation and net neutrality would not be an issue because it would not be used for anything else.

If a service is not critical, it would not need to be prioritised over traditional internet offerings. If people wanted better connectivity for that "internet TV" service, they could upgrade their whole connection, not just pay a premium for the "specialised services" part. In other words, the argument for "specialised services" does not stand up to scrutiny. In fact, the real motivation behind introducing the concept is simply that telecoms want to be able to charge those offering online services extra for preferential delivery.

The new rules are further biased in favour of incumbents by allowing zero rating: "Zero rating, also called sponsored connectivity, is a commercial practice used by some providers of Internet access, especially mobile operators, not to count the data volume of particular applications or services against the user's limited monthly data volume." This, too, creates a two-tier Internet, since the services that are not provided free by ISPs are at a big disadvantage compared to those that are. Zero rating will encourage bigger companies to pay for a place on zero-rate programmes, thus raising the barrier to entry for new, competitive services.

The European Commission's press release skirts over another major loss for European innovation: the failure to bring in a unified approach to releasing radio spectrum in the EU. It simply says: "Member States receive valuable revenues from the sale of spectrum rights. While these revenues should stay exclusively with the Member States, we need a more harmonised management of radio spectrum at EU level given its vital importance for connectivity." That's a huge missed opportunity to free up spectrum across the EU for innovative services, particularly from smaller companies. Once more, the main beneficiaries are the established telecoms companies that can use their entrenched positions to throttle competition.

The only real benefit for EU citizens from last night's agreement is the eventual abolition of roaming charges for mobile phone use in the EU from June 2017, but even here the situation is not as rosy as the European Commission would have us believe. According to the Green MEP Michel Reimon, who took part in the trilogue, telecoms companies will still be allowed to add on special roaming charges to protect their domestic rates, provided the relevant national government agrees (original in German).

Calling the details of the deal "blurry" and "ambiguous," particularly on "specialised services," Joe McNamee of the digital rights group EDRi wrote today: "This is 'just' a provisional agreement. First, the explanatory recitals need to be finalised. Then, the EU institutions need to decide if they are really prepared to create such legal uncertainty for European citizens and business. This will become clear in the coming weeks."
http://arstechnica.co.uk/tech-policy...et-fast-lanes/





Americans’ Internet Access: 2000-2015

As internet use nears saturation for some groups, a look at patterns of adoption
Andrew Perrin and Maeve Duggan

The Pew Research Center’s unit studying the internet and society began systematically measuring internet adoption among Americans in 2000. Since then, Pew Research has conducted 97 national surveys of adults that have documented how the internet has become an integral part of everyday life across diverse parts of society.

84% of American Adults Use the Internet

Year Percent
2000 52%
2001 55%
2002 59%
2003 61%
2004 63%
2005 68%
2006 71%
2007 74%
2008 74%
2009 76%
2010 76%
2011 79%
2012 83%
2013 84%
2014 84%
2015 84%

Source: Pew Research Center surveys, 2000-2015.

A new analysis of 15 years’ worth of data highlights several key trends: For some groups, especially young adults, those with high levels of education, and those in more affluent households, internet penetration is at full saturation levels. For other groups, such as older adults, those with less educational attainment, and those living in lower-income households, adoption has historically been lower but rising steadily, especially in recent years. At the same time, digital gaps still persist.

In this report, we cover some of the major demographic trends that lie beneath the topline adoption numbers and highlight:

• Age differences: Older adults have lagged behind younger adults in their adoption, but now a clear majority (58%) of senior citizens uses the internet.
• Class differences: Those with college educations are more likely than those who do not have high school diplomas to use the internet. Similarly, those who live in households earning more than $75,000 are more likely to be internet users than those living in households earning less than $30,000. Still, the class-related gaps have shrunk dramatically in 15 years as the most pronounced growth has come among those in lower-income households and those with lower levels of educational attainment.
• Racial and ethnic differences: African-Americans and Hispanics have been somewhat less likely than whites or English-speaking Asian-Americans to be internet users, but the gaps have narrowed. Today, 78% of blacks and 81% of Hispanics use the internet, compared with 85% of whites and 97% of English-speaking Asian Americans.
• Community differences: Those who live in rural areas are less likely than those in the suburbs and urban areas to use the internet. Still, 78% of rural residents are online.

The full story is told in the charts below:

Internet Usage by Age

The proportion of young adults ages 18-29 who use the internet has always outpaced overall adoption levels among older groups. But while older adults still report lower levels of internet use today, seniors have the greatest rate of change since 2000.

Young Adults Are Most Likely to Use The Internet, but Seniors Show Faster Adoption Rates

Year 18-29 30-49 50-64 65 or older
2000 70% 61% 46% 14%
2001 72% 65% 50% 14%
2002 76% 70% 54% 18%
2003 78% 72% 56% 22%
2004 77% 75% 61% 24%
2005 83% 79% 66% 28%
2006 86% 82% 70% 32%
2007 89% 85% 71% 35%
2008 89% 84% 72% 38%
2009 92% 84% 75% 40%
2010 92% 85% 74% 43%
2011 94% 87% 77% 46%
2012 96% 91% 79% 54%
2013 97% 92% 81% 56%
2014 97% 92% 81% 57%
2015 96% 93% 81% 58%

Source: Pew Research Center surveys, 2000-2015.

In 2000, 70% of young adults used the internet and that figure has steadily grown to 96% today. At the other end of the spectrum, 14% of seniors used the internet in 2000, while 58% do so today. Not until 2012 did more than half of all adults ages 65 and older report using the internet.
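As a back-of-the-envelope check on the "rate of change" claim, the sketch below, which is ours rather than Pew's, recomputes the percentage-point and relative change for each age group from the 2000 and 2015 figures in the table above.

```python
# Minimal sketch: recompute the change in adoption by age group from the
# 2000 and 2015 figures in the table above (values are percentages).
adoption = {
    "18-29":       (70, 96),
    "30-49":       (61, 93),
    "50-64":       (46, 81),
    "65 or older": (14, 58),
}

for group, (y2000, y2015) in adoption.items():
    points = y2015 - y2000   # percentage-point increase
    factor = y2015 / y2000   # growth factor since 2000
    print(f"{group:12s} +{points:2d} points ({factor:.1f}x)")

# Seniors show by far the largest relative growth (roughly 4x),
# even though their 2015 adoption level remains the lowest.
```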

Internet Usage by Educational Attainment

Since the Pew Research Center began consistently measuring internet penetration, educational attainment has been one of the strongest indicators of use. While a large majority of the well-educated has consistently been online, those without a college degree saw greater rates of adoption over the past 15 years and have notably lowered the access gap.

While Less-Educated Adults Are Catching Up, Their Internet Adoption Rates Are Still Below Those of College Graduates

Year College+ Some college HS grad Less than HS
2000 78% 67% 40% 19%
2001 81% 68% 43% 21%
2002 83% 73% 48% 24%
2003 85% 75% 51% 25%
2004 86% 76% 53% 27%
2005 89% 80% 58% 32%
2006 91% 83% 61% 37%
2007 92% 85% 65% 40%
2008 93% 86% 65% 38%
2009 94% 87% 68% 40%
2010 93% 87% 68% 41%
2011 94% 89% 72% 50%
2012 96% 91% 75% 55%
2013 96% 92% 76% 60%
2014 96% 91% 76% 55%
2015 95% 90% 76% 66%

Source: Pew Research Center surveys, 2000-2015.

Adults with a college or graduate degree are the most likely to use the internet, with almost all of these adults (95%) saying they are internet users. This proportion has always been high – fifteen years ago, 78% of adults with at least a college degree used the internet. But the situation in 2000 was much different for those with less education: in that year, only 19% of those without a high school diploma reported that they were internet users. For those who have not completed high school, 66% now use the internet, still below where college graduates were in 2000.

Internet Usage by Household Income

Another marker of class differences – household income – is also a strong indicator of internet usage. Adults living in households with an annual income of at least $75,000 a year are the most likely to use the internet, with 97% of adults in this group currently reporting they are internet users. Those living in households with an annual income under $30,000 a year are less likely to report internet usage, with 74% of adults doing so now.

Those In Higher-Income Households Are Most Likely To Use Internet

Year $75K+ $50K-$74,999 $30K-$49,999 Less than $30K
2000 81% 72% 58% 34%
2001 84% 75% 60% 36%
2002 85% 76% 64% 39%
2003 87% 81% 66% 41%
2004 88% 83% 68% 44%
2005 92% 86% 73% 49%
2006 92% 86% 75% 52%
2007 93% 86% 74% 58%
2008 95% 88% 78% 54%
2009 95% 92% 79% 60%
2010 95% 88% 81% 61%
2011 97% 90% 85% 64%
2012 97% 93% 87% 71%
2013 97% 93% 86% 72%
2014 96% 93% 86% 74%
2015 97% 95% 85% 74%

Source: Pew Research Center surveys, 2000-2015.

These trends have been consistent over time, although the more recent rise of smartphones has provided internet access to lower-income people, sometimes with lower prices, sometimes with other attractive technology features. Indeed, a recent report released by Pew Research found that lower-income Americans are increasingly “smartphone-dependent” for internet access.

Internet Usage by Race/Ethnicity

Since 2000, English-speaking Asian-Americans have shown consistently higher rates of internet usage compared to whites, blacks, and Hispanics. Fully 72% of English-speaking Asian-Americans said they were internet users 15 years ago when Pew Research began to regularly measure internet access. Whites and Hispanics would not cross this threshold until 2006, and blacks would reach this level in 2011. In 2014, fully 97% of English-speaking Asian-Americans reported being internet users.

Among different racial and ethnic groups, African-Americans have seen the greatest growth rate between 2000 and today, though they are still less likely than whites and English-speaking Asian-Americans to be internet users.

English-speaking Asian-Americans Are the Most Likely To Report Internet Usage

Year Asian, English-speaking White, non-Hispanic Hispanic Black, non-Hispanic
2000 72% 53% 46% 38%
2001 73% 57% 50% 40%
2002 73% 60% 58% 47%
2003 74% 63% 58% 50%
2004 77% 65% 61% 49%
2005 75% 70% 71% 55%
2006 85% 72% 73% 59%
2007 84% 75% 76% 64%
2008 89% 75% 74%* 63%
2009 90% 79% 68%* 69%
2010 89% 78% 71% 68%
2011 91% 81% 73% 73%
2012 95% 84% 79% 77%
2013 95% 85% 81% 79%
2014 97% 85% 81% 79%
2015 N/A 85% 81% 78%

*Note 1: In December 2008, the Pew Research Center began offering national general population surveys in both Spanish and English, helping to increase the share of Hispanics who participated and improving the representativeness of our national surveys. Those who preferred to take interviews in Spanish were more likely to be recently arrived immigrants who had somewhat lower education levels, lower household income, and less connection to some technologies compared with other Hispanics living in the U.S. This helps to explain the break in the usage rate trend among U.S. Hispanics between 2008 and 2009.

Note 2: The results reported here on Asian-Americans are limited to English speakers only. The surveys reported here were conducted only in English and Spanish. Those who speak other Asian languages but are not comfortable speaking English are less likely to respond to these phone surveys.

Note 3: The 2015 data come from a survey that does not include enough Asian-Americans to yield statistically-reliable findings.

Source: Pew Research Center surveys, 2000-2015. Asian American sample size for 2015 is too low to report.

In December 2008, Pew Research began offering all surveys of the U.S. population in Spanish as well as English. This change ensured better coverage of the national population, including more recently arrived Hispanic immigrants. More recently arrived Hispanic immigrants are more likely to have limited English ability, have lower levels of income and formal education, and have less internet experience than other Hispanics living in the U.S. Thus, we report two separate time trends for Hispanics: the first leading up to late 2008 when Pew Research Center surveys of the U.S. population were only available in English, and the second, from late 2008 onward, when all Pew Research national surveys were administered in both English and Spanish.

Furthermore, the trends presented here on Asian-Americans are limited to English speakers only. The respondents classified as Asian-American said in surveys that they were “Asian or Pacific Islander” when asked to identify their race. As Pew Research surveys are only offered in English and Spanish, the Asian-Americans who respond are English speakers or bilingual. Those who speak other Asian languages but are not comfortable speaking English are less likely to respond to these phone surveys. Pew Research Center does not usually report on Asian-American technology use in its reports as surveys do not typically contain enough Asian-American respondents to yield statistically reliable findings. Aggregating surveys, as is done here, does yield sufficient cases of English-speaking Asian-Americans to report the findings.

Internet Usage by Community Type

Adults who live in urban or suburban communities have shown consistently higher levels of internet adoption, compared with rural residents. This gap has persisted even as internet adoption has risen in all three types of communities.

Rural Citizens Are Less Likely To Use Internet

Year Urban Suburban Rural
2000 53% 56% 42%
2001 55% 59% 46%
2002 61% 63% 49%
2003 64% 65% 51%
2004 65% 67% 53%
2005 69% 70% 60%
2006 71% 73% 62%
2007 75% 77% 63%
2008 75% 77% 63%
2009 73% 76% 68%
2010 78% 79% 69%
2011 80% 82% 73%
2012 84% 84% 76%
2013 86% 85% 78%
2014 85% 85% 79%
2015 85% 85% 78%

Source: Pew Research Center surveys, 2000-2015.

In 2000, 56% of suburban residents, 53% of urban residents, and 42% of rural residents were internet users. Today those figures stand at 85%, 85%, and 78% respectively. Rural communities tend to have a higher proportion of residents who are older, lower-income, and have lower levels of educational attainment – additional factors associated with lower levels of internet adoption.

Internet Usage by Gender

Today, men and women are equally likely to be internet users, a trend that has not wavered throughout the 15 years these surveys have been conducted. However, the earliest Pew Research surveys found that men were more likely than women to be internet users. For instance, a 1995 survey found 9% of men and 4% of women had used a “modem to connect to any computer bulletin boards, information services such as Compuserve or Prodigy.”

Gender Parity Has Been the Norm In Internet Usage

Year Men Women
2000 54% 50%
2001 57% 53%
2002 61% 57%
2003 63% 60%
2004 66% 61%
2005 69% 67%
2006 72% 70%
2007 75% 73%
2008 74% 73%
2009 77% 75%
2010 77% 76%
2011 80% 78%
2012 83% 82%
2013 84% 84%
2014 84% 84%
2015 85% 84%

Source: Pew Research Center surveys, 2000-2015.

By 2000, when Pew Research began tracking internet use more consistently, 54% of men were internet users, compared with half of women. This modest gap continued, gradually shrinking until 2008 when a statistically indistinguishable 74% of men and 73% of women identified as internet users. Today, 85% of men and 84% of women report being internet users.
http://www.pewinternet.org/2015/06/2...ess-2000-2015/





The Wait-for-Google-to-Do-It Strategy

America’s communications infrastructure is finally getting some crucial upgrades because one company is forcing competition when regulators won’t.
James Surowiecki

It’s too often said that some event “changed everything” in technology. But when it comes to the history of broadband in the United States, Google Fiber really did. Before February 2010, when Google asked cities to apply to be first in line for the fiber-optic lines it would install to deliver Internet service to homes at a gigabit per second, the prospects for upgrading Americans’ wired broadband connections looked dismal. The Federal Communications Commission was on the verge of releasing its first National Broadband Plan, which stressed the importance of affordable, abundant bandwidth and the need to spread it by “overbuilding”—stringing fiber to houses and businesses even if they already had service over cable and phone lines with relatively low capacity. Yet at the time, as Blair Levin, executive director of the broadband plan, told me, “for the first time since 1994, there was no national provider with plans to overbuild the current network.”

This was not because of technological hurdles. Instead, it was a simple matter of incentives. Building much faster networks was an expensive task, one that would require the kind of hefty capital expenditures that Wall Street typically frowns upon. (Verizon’s spending on its FIOS TV and high-speed Internet service, for instance, came in the face of deep skepticism from investors, which eventually led the company to curtail its expansion of FIOS nationally.) And since Internet service in most cities was supplied by either a near monopoly or a cozy duopoly in which the two players—typically a cable company and a major telecom provider—barely competed against each other, there was little competitive pressure to improve. As long as all the players kept the status quo intact, it seemed, Internet providers could look forward to years of making sizable profits without having to put much money into their networks. The Internet as we know it was only 15 years old, but ISPs were already shifting into harvesting mode: maximizing revenue from their infrastructure rather than upgrading it. Forget gigabit Internet. The National Broadband Plan set a goal of getting 100 million homes affordable access to download speeds of just one-tenth of a gigabit, or 100 megabits, per second. (Only 15 percent of American homes have connections above 25 megabits now.)

State and local governments had done little to disrupt the status quo or push ISPs to invest in upgrades. And governments also showed little interest in subsidizing, let alone fully paying for, a better infrastructure themselves. (There was money allocated to broadband investment in the 2009 stimulus bill, but it went mainly to wire underserved areas rather than lay fiber.) On the municipal level, most cities still had building regulations and permit requirements that, inadvertently or not, tended to discourage the laying of new line, particularly by new entrants. And in many cases, even if cities were interested in building or operating their own high-speed networks, state laws barred them from doing so. The result of all these factors was that the United States, slowly but certainly, began falling well behind countries like Sweden, South Korea, and Japan when it came to affordable, abundant bandwidth.

Five years later, things look very different. The United States is still behind Sweden and South Korea. But fiber-to-the-home service is now a reality in cities across the country. Google Fiber, which first rolled out in Kansas City in the fall of 2012, is now operating in Austin, Texas, and Provo, Utah, and Google says it will expand next to Atlanta, Salt Lake City, Nashville, and Charlotte and Raleigh-Durham, North Carolina, with another five major metro areas potentially on the horizon. The biggest impact, though, has arguably been the response from big broadband providers. In the wake of Google Fiber’s debut, AT&T announced that it would begin offering one-gigabit connections at prices that would previously have seemed impossible, and the company says it might expand that service into a hundred cities. CenturyLink and Cox now have gigabit service in a few cities, and Suddenlink promises an offering in the near future. (Whether such promises will be kept is, of course, a different question, but the mere fact that they’ve been made is striking.) And even in areas where gigabit connections may be a long time coming, cable companies have dramatically improved speeds for their customers, often at no added cost. Time Warner Cable—one of whose executives declared, at a public conference, that it wasn’t offering gigabit service because consumers didn’t want it—offers connections today that are five times the speed of what was its fastest connection a couple of years ago.

Google Fiber has also inspired action on the municipal level. Gig.U, of which Blair Levin is now executive director, is working on bringing gigabit connections to more than two dozen college towns (where the demand for ultra-high-speed connections is obvious). A consortium of cities in Connecticut is talking with the Australian investment bank Macquarie about a public-private partnership to build a fiber network that the cities would eventually own (an approach similar to the one Stockholm used to build its fiber network). Seeing how Chattanooga, Tennessee, went ahead and built its own network, wiring every home with fiber, cities everywhere are looking to streamline their permit processes in order to make laying these new networks as simple (and affordable) as possible. “When you talked to mayors a few years ago, they would tell you about all the other problems they had that mattered much more than bandwidth,” Levin says. “When you talk to them today, they recognize that this is something they really need, and that it isn’t about streaming TV but about making sure businesses and schools and health-care facilities are going to have what they need in the future.”

None of this means that we’ve reached a true tipping point when it comes to fiber. The share of the country’s homes connected to fiber lines was still only about 3 percent at the end of 2013. But compared with where the U.S. was just a few years ago, progress has been dramatic. Had Google not chosen to do what it did, we’d probably still be stuck with the lack of investment and slow downloads that were our lot in 2010. As Levin puts it, “I would like to believe that all this happened because we made such a brilliant case for the benefits of abundant bandwidth in the National Broadband Plan. But that’s not the case. Without Google, this would not have happened.”

That raises the obvious question, of course, of just why Google did this, given that investing in physical networks is a long way from its core business. Google Fiber was introduced as “an experiment,” but as it has expanded, the company has said that it views the project as a real business and is managing it that way. And obviously, even if the direct return on the investment in Google Fiber ends up being small (as seems likely, given that Google is charging similar prices for gigabit connections as cable companies charge for much slower ones), the company will reap ancillary benefits from making the Internet more valuable and driving more traffic online.

In the end, though, the reason Google has invested in fiber is less important than the practical outcome of that investment. In effect, what the company is doing—both in building these networks and in pushing national providers to upgrade—is providing a public good whose spillover benefits are likely to be immense, and one that neither the government nor the private sector was doing much to deliver. This is somewhat similar to what Google did, on a smaller scale, back in 2008, when the FCC was auctioning off sections of the airwaves to wireless providers. The FCC had announced that if bids for a certain slice of the spectrum exceeded $4.6 billion, it would attach an open-access requirement that existing wireless providers didn’t want to have to follow. So Google placed a bid that was above the FCC’s price. It did so not in the expectation of winning (though it was prepared to spend the money if it did) but, rather, in order to ensure that regardless of who won—in this case, Verizon—the open-access requirement would go into effect. One might speculate that a similar dynamic is at work in Project Fi, Google’s new wireless-service offering, which challenges most wireless providers’ traditional pricing strategies (as well as their dependence on privately owned networks).

What Google’s doing, in these cases, is using its deep pockets in the interest of broader social ends, with seemingly little concern for short-term returns. This strategy has historical precedent. In the early years of the American republic, there was little appetite for government spending on public works, like roads and canals. But the country needed better roads to facilitate the growth of trade and commerce. So the states turned to private companies, which built turnpikes that they then operated as toll roads. In the late 18th and early 19th centuries, hundreds of these companies invested millions of dollars in laying thousands of miles of road, in effect providing the basic infrastructure for travel in the United States.

What’s interesting about these companies is that while they were, in theory, for-profit, and while they had shareholders, in most cases there was no expectation that they would actually turn a profit in operating the roads—tolls were kept low enough to encourage traffic and commerce. Instead, the shareholders—who were typically local merchants and manufacturers—saw their investments in turnpikes as a way to collectively provide a public good that, not incidentally, would also deliver benefits to them as business owners and consumers. They knew, of course, that other businesses would benefit from these roads even if they didn’t invest in them (the nature of a public good being that everyone can use it). But that didn’t mean the investment wasn’t worth making. It’s hard not to see a similar logic underlying much of what Google does.

When it comes to the current state of innovation and the economy, the implications of Google Fiber are complicated. On the one hand, it is a testament to the power of competition. Google’s willingness to invest the money in a new network threatened cable and telecommunications companies’ dominance, and took customers away from them. That shifted the economic calculus. It’s no coincidence that the cities and regions where cable companies first announced they were building fiber, and offering high-speed connections at affordable prices, have been the places where Google Fiber either is or is going.

At the same time, though, it’s depressing that ensuring competitive broadband markets required the intervention of an outsider like Google. Indeed, the system as it was five years ago was designed to keep us stuck in the broadband dark ages. The government didn’t really do anything to change that, either by acting to make markets more competitive or by investing on its own. We just got bailed out by Google.

The unnerving thing is that so much of the present and future of broadband has come down to the whims of a single company, and a company that, in many ways, doesn’t look or act much like most American firms. If Google didn’t have such a dominant position in search and online advertising, giving it the resources to make big investments without any requirement of immediate return, Google Fiber wouldn’t have happened. And if Google’s leadership weren’t willing to make big long-term investments in projects outside the core business, or if the company didn’t have a dual-share structure that preserved its founders’ power and somewhat insulated its executives from Wall Street pressure, gigabit connections would more than likely be a fantasy in the United States today. As Levin puts it, “We got fortunate that a company with a real long-term view came into this market.” It might be good to design technology policy so that next time around, we don’t need to get so lucky.
http://www.technologyreview.com/revi...o-it-strategy/





Aussie ISP Bakes in Geo-Dodging for Netflix, Hulu

No need for separate VPN or DNS manipulation to stream video.
Tony Yoo

A new Australian ISP is integrating geo-blocking circumvention into its broadband service, allowing customers to access streaming services like Hulu, Netflix USA, BBC iPlayer and Amazon Prime.

When Yournet launches in August, customers will be able to sign up for broadband that lets them instantly change the country they appear to be surfing from.

"On our client website there will be a region-switching option," founder Raj Bhuva told CRN.

The ISP expects the US and the United Kingdom to be popular options, but will also offer the ability to choose Canada, New Zealand, France and Germany.

The service means customers won't need to purchase a separate VPN or DNS manipulation service on top of their broadband connection.

"[Yournet] is definitely not a VPN… We're importing the technology in from New Zealand and the technical details I can't go into too much." Bhuva said. "But it's more advanced than a smart DNS and won't easily be blocked by anyone."

The Sydney entrepreneur said that the service has performance advantages compared to methods currently used by many Australians to watch overseas content.

"[Performance] is one of the big advantages. It won't affect your internet connection whatsoever. VPN routes your traffic to the other side of the world – we're not doing that."
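
Yournet has not disclosed how its region switching works, but the performance claim in that quote generally rests on selective routing: only the small amount of traffic that has to appear to originate overseas is relayed, while everything else takes the normal direct path. As a purely illustrative sketch in Python (the domain list and relay names below are invented for the example and are not Yournet's system):

# Illustrative split-routing decision, not Yournet's actual implementation.
# Geo-sensitive hostnames are sent via an overseas relay; all other traffic
# goes out the normal, direct route, which is why bulk speeds are unaffected.
GEO_SENSITIVE = {"netflix.com", "hulu.com", "bbc.co.uk"}   # invented list

def route_for(hostname: str, region: str) -> str:
    """Return 'direct' or the name of a relay for the chosen region."""
    for domain in GEO_SENSITIVE:
        if hostname == domain or hostname.endswith("." + domain):
            return "relay-" + region            # e.g. relay-us, relay-uk
    return "direct"

print(route_for("www.netflix.com", "us"))       # relay-us
print(route_for("www.smh.com.au", "us"))        # direct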

With the only available plan costing $129.95 per month, the Yournet service is not cheap, but the company is targeting heavy video users craving overseas streaming.

"We're aiming for the premium end of the market," Bhuva said, adding that the $129.95 plan includes unlimited data, free geographic switching and superior performance.

"Because we've designed this network for the current age of internet content and video streaming, our contention issues are significantly lower than the [rest of] the market out there," he said. "So you shouldn't get any dropouts, buffering or pixelation. And you can have multiple devices streaming and surfing the web at the same time."

Bhuva is a British expatriate who currently works as a senior technology manager for Sydney ISP Alpha Dot Net Australia. Although his employer has invested in Yournet, the venture is his own, with the technology licensed from New Zealand's Global Mode.
http://www.crn.com.au/News/406071,au...flix-hulu.aspx





Changing the Game: Study Reaffirms the Massive Impact Netflix is Having on Pay TV
Zach Epstein

The term “game-changer” isn’t always a meaningless buzzword, despite its overuse across every corner of the technology industry. There are, in fact, companies that truly have changed the game, and there is little question that Netflix can be counted among them. It wasn’t the first site to offer streaming movies and TV shows, but it certainly popularized the practice of streaming your entertainment rather than watching it on TV. It has also had a profound impact on the pay TV industry, helping prompt a growing number of people to cancel their cable and satellite TV packages in favor of web-based viewing.

Now, a new study shows that Netflix may have had an even more profound impact on pay TV than most people realize.

Evidence is mounting that cord cutting is becoming an increasingly popular trend as households ditch pricey pay TV packages in favor of less expensive streaming services. Netflix is clearly a leader in this area, with a recent study suggesting that Netflix alone is responsible for more than one-third of all internet traffic in North America during peak hours.

What many people don’t realize, however, is that Netflix isn’t just a supplemental service anymore. Netflix subscribers are abandoning their traditional pay TV packages in droves, and a recent study sought to quantify just how many Netflix subscribers have left pay TV or have never had a pay TV subscription to begin with.

An off-site survey commissioned by CutCableToday found that only two-thirds of Netflix subscribers also subscribe to a cable or satellite television service. The study surveyed 829 current Netflix customers.

Also of note, the survey found that 9% of those users who still subscribed to pay TV planned to cancel their subscriptions, and another 16% were unsure if they would continue to pay for cable or satellite TV in the future.

Networks are finally beginning to react to the cord cutting trend. HBO and ESPN are among the first networks to offer streaming plans completely independent of pay TV packages, and several additional networks have signed on to Sling TV, which offers a streaming-only live TV service.

How are cord cutters enjoying life without pay TV? CutCableToday’s survey found that an overwhelming majority of Netflix subscribers — 92% — are satisfied with the service.
https://bgr.com/2015/06/30/netflix-c...pay-tv-impact/





Inside Facebook’s Blu-Ray Cold Storage Data Center
Rich Miller

The temperature remains constant as you walk through Facebook’s custom data storage facility. But as you approach the back of the room, you transition from cold storage to even colder storage.

In a row of 14 racks housing square enclosures, Facebook is test-driving the future of its long-term data storage. The racks are packed with thousands upon thousands of Blu-Ray disks.

That’s right: the optical media that plays your movies can now back up all your status updates and Facebook photos.

The Blu-Ray storage system is part of the company’s evolving effort to manage a flood of incoming data, with users uploading more than 900 million photos every day. To keep pace with all those uploads, Facebook must constantly seek new ways to add capacity.

“It’s amazing how much storage you can do with Blu-Ray,” said Keven McCammon, Datacenter Manager for Facebook’s facility in western North Carolina. “There are times when you can look back to look forward.”

The racks of Blu-Ray storage are still in the testing phase. But Facebook has high hopes for Blu-Ray as a tool to optimize its infrastructure. McCammon and his team are putting the system through its paces at the company’s massive East Coast data center campus.

“We’re doing some pretty extensive testing right now,” said McCammon. “We want to really make sure it can function in a production environment and can scale.”

With 1.44 billion users, Facebook’s storage needs may seem otherworldly to most users. But many companies will soon face similar challenges managing the explosive growth of data storage. The hyperscale data center players pioneer the design strategies and best practices that will soon filter into many other data centers. Facebook’s cold storage journey offers insights for managing the coming data tsunami.

Retooling Tiered Storage for the Hyperscale Age

Facebook’s cold storage system is a web-scale implementation of tiered storage, a strategy that organizes stored data into categories based on their priority and then assigns the data to different types of storage media to reduce costs. The goal is to create a top tier consisting of high-performance enterprise hardware and networking, while lower tiers can use commodity hardware or, for rarely-used assets, backup tape libraries.
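
As a concrete sketch of the idea, a tiering policy reduces to a rule that maps each object's access pattern to a class of storage. The thresholds and tier names below are assumptions chosen for the example, not Facebook's actual policy:

from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class ObjectStats:
    object_id: str
    last_access: datetime
    reads_last_30_days: int

def choose_tier(stats: ObjectStats, now: datetime) -> str:
    """Map an object's recent access pattern to a storage tier: hot data
    on flash, warm data on commodity hard disks, rarely touched data in
    a cold/archival tier."""
    idle = now - stats.last_access
    if stats.reads_last_30_days > 100 or idle < timedelta(days=7):
        return "flash"          # high-performance tier
    if stats.reads_last_30_days > 0 or idle < timedelta(days=90):
        return "hdd"            # commodity spinning disks
    return "cold_archive"       # spun-down disks or optical media

# A photo nobody has opened in a year is a cold-storage candidate.
old_photo = ObjectStats("photo-123", datetime(2014, 6, 1), 0)
print(choose_tier(old_photo, datetime(2015, 7, 1)))   # cold_archive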

The storage world has changed since tiered storage made its debut in 1990, but Facebook is applying many of the principles in its infrastructure, albeit with different technologies.

Facebook was an early adopter of solid state drives (SSDs), storage devices that use integrated circuits as memory. SSDs have no moving parts, unlike traditional hard disk drives (HDDs), which contain spinning disks and moveable read/write heads. Most importantly, SSDs are faster than hard disks, and can accelerate key parts of Facebook’s infrastructure.

While focusing on SSD and Flash in the high-performance portions of its storage infrastructure, Facebook continues to use plenty of hard disk drives to store photos. In 2012 it created a custom storage tray through the Open Compute Project, known as Knox.

By 2013, Facebook was storing more than an exabyte of images that were rarely accessed, with 82 percent of traffic focused on just 8 percent of photos. So the company created a “cold storage” design to shift these rarely-viewed photos from its expensive high-performance server farms to simpler data centers with no generators or UPS (uninterruptible power supply). Emergency backup power isn’t needed because the facility does not serve live production data.

“Reducing operating power was a goal from the beginning,” wrote Facebook’s Krish Bandaru and Kestutis Patiejunas in a blog post outlining the cold storage system. “So, we built a new facility that used a relatively low amount of power but had lots of floor space. The data centers are equipped with less than one-sixth of the power available to our traditional data centers, and, when fully loaded, can support up to one exabyte (1,000 PB) per data hall.”

Facebook operates cold storage facilities at its Prineville, Oregon and North Carolina data center campuses. Custom software optimizes the energy use of trays filled with commodity HDDs. By reducing the disk activity and power draw, the design has slashed the amount of airflow needed to cool the racks.

Facebook wasn’t done yet. At the 2014 Open Compute Summit, it unveiled a prototype for an “ultra-cold” storage system using Blu-Ray disks as the storage medium and a robotic arm to retrieve data. The robotic arm is similar in concept to systems used to retrieve tape cartridges in tape storage libraries. (See the prototype in action in this video from Facebook).

Blu-Ray is an optical data storage format that uses blue lasers to read the disk. It’s not for primary storage, as data can’t be retrieved instantly. But using Blu-Ray disks offers savings of up to 50 percent compared with the first generation of Facebook’s cold storage design, since the Blu-Ray cabinet only uses energy when it is writing data during the initial data “burn,” and doesn’t use energy when it is idle.

How Design Tweaks Drive Big Savings

Facebook has built three cold storage data centers on the Forest City campus. Only one is in use, with the other two reserved for future capacity. The single-story buildings are a departure from the company’s primary server farms, which have an upper floor that functions as a huge cooling plenum, treating fresh air to be used to cool servers in the data halls on the first floor.

In the cold storage design, air enters the facility through louvers in the side of the building. The cooling is handled by a series of air handlers along the exterior wall, which consolidate the multi-step cooling and filtering into a single piece of equipment.

The cold storage racks are more densely packed than standard Open Compute Knox storage units. A cold storage rack can hold 32 trays, which each contain 15 standard 4-terabyte drives, allowing Facebook to store nearly 2 petabytes of data in every rack.
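
The arithmetic behind that capacity figure, using the numbers quoted above (decimal terabytes assumed):

trays_per_rack  = 32
drives_per_tray = 15
tb_per_drive    = 4            # standard 4 TB drives

tb_per_rack = trays_per_rack * drives_per_tray * tb_per_drive
print(tb_per_rack)             # 1920 TB
print(tb_per_rack / 1000)      # 1.92 PB -- "nearly 2 petabytes" per rack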

“We spin them up, fill them with data and then spin them down. Every now and again we spin them up again to make sure they’re working fine,” said McCammon, who noted the design “doesn’t draw as much power, and you don’t need as much air. In the other data hall, the drives are always going.”

The Blu-Ray racks are even more efficient. They are different in appearance from the Knox storage trays, with five units per Open Rack, each housing carousels filled with Blu-Ray disks. When data must be accessed, the action happens in the rear of the unit. That’s where the robotic retrieval system has been condensed into a space in the bottom of the rack. When it springs into action, the arm travels along tracks on either side of the rack, fetches a Blu-Ray disk, pulls data off the disc and writes it to a live server.

“In a Blu-Ray scenario, you don’t even need to spin them up,” said McCammon. “It also takes less power to perform a write. It also reduces our network traffic, and so we need fewer switches.”

“It’s incredibly efficient,” he added. “It’s truly lights out.”

Facebook isn’t alone in tapping Blu-Ray as a medium for data storage archives. It’s a concept that dates to optical jukebox systems, and companies like Hie Electronics and DISC Group offer smaller scale data archives using Blu-Ray and robot retrieval.

Through the Open Compute Project, the Facebook Blu-Ray initiative has created a commercial spinoff of its own. Last year Facebook hardware guru Frank Frankovsky left to found a startup focused on adapting the Blu-Ray optical storage system for the enterprise. Last month that startup, Optical Archive, was acquired by Sony.

Is the enterprise ready for movie disks powering its long-term storage? For now, Facebook will continue testing the Blu-Ray cold storage racks. Very soon other hyperscale players and enterprises will have more information about the performance of Blu-Ray storage at scale, and its potential for use beyond Facebook and OCP. In the meantime, the data will keep coming.
http://datacenterfrontier.com/inside...e-data-center/





Music Piracy is Down but Still Very Much in Play
Ryan Faughnder

Apple's biggest rival when it launches its $10-a-month streaming music service on Tuesday might not be Spotify or Tidal, but piracy.

About a fifth of Internet users around the world continue to regularly access sites offering copyright infringing music, according to the International Federation of the Phonographic Industry.

In the U.S. alone, 20 million people still get music through peer-to-peer file-sharing networks, according to research firm MusicWatch. And newer methods have emerged, such as mobile apps and software that rips audio from YouTube.

"It's a tremendous problem," said analyst Russ Crupnick of MusicWatch. "The good news is some of the very traditional ways of stealing are down pretty dramatically."

The rise of convenient, licensed streaming has helped cut U.S. file-sharing rates in half in the last decade. Anti-piracy efforts of the Recording Industry Assn. of America — representing Universal Music Group, Sony Music Entertainment, Warner Music Group and others — have also contributed to the drop off.

But the music industry is still trying to recover from piracy's heyday. Last year, total music industry revenue was about $15 billion worldwide, well below the 1999 peak of $38 billion.

Free downloads have maintained their allure for people who want to build music collections and refuse to go to iTunes, Amazon or Wal-Mart.

Part of the problem is getting people who grew up in the age of Napster, LimeWire and Kazaa to pay anything for music, industry experts say. Many young people don't see anything wrong with downloading from unauthorized sites or ripping from YouTube.

"We now have a generation of people for whom the value proposition of music has changed," said Larry Rosin, co-founder and president of Edison Research. "A lot of people saw this as revenge for being ripped off by the industry for years."

The RIAA brought thousands of lawsuits against individual alleged thieves early on. While the legal actions brought widespread attention to copyright theft, many viewed them as heavy-handed. Even elderly suspected culprits and teenagers were sued.

"At the time, what put a thorn in my ass was seeing fans get sued," said Mötley Crüe bassist Nikki Sixx. "Now I can just pay a subscription and get all this music… I think it is a healthier answer. You ask a lot of kids today how much pirating are you doing, and I think it's down."

The failed 2011 anti-piracy congressional bills known as SOPA and PIPA were an added black eye for the RIAA and fellow Hollywood supporters.

Industry groups have since emphasized softer measures to combat illegal downloads. The record companies have worked to get Internet service providers to help prevent copyright theft and pushed search engines to demote sites in users' results. The RIAA has also tried to pressure brands to not allow their advertisements to appear on offending web destinations.

"Infringing site operators don't care about music, they care about eyeballs," said RIAA Deputy General Counsel Victoria Sheckler.

Taking down file-sharing sites, many of which operate offshore, is akin to whack-a-mole. The founders of the website The Pirate Bay were convicted of aiding copyright theft in 2009, but it remains one of the most popular sites for free music and movies. Swedish police raided and shut down Pirate Bay late last year, only to watch it rise again with a new Phoenix logo.

Others have been added to the wall of defunct services. LimeWire was discontinued in 2010 and agreed to pay the record industry $105 million to settle a 2011 copyright case. And Kim Dotcom's Megaupload was shut down in 2012.

Most recently, the music industry forced the demise of Grooveshark, the streaming service that once counted tens of millions of visitors and got its music from user uploads, rather than from the labels.

Facing $736 million in potential damages, Grooveshark's parent company Escape Media agreed to close the site in April as part of a settlement with the record companies. "We failed to secure licenses from rights holders for the vast amount of music on the service," the company said in an apology letter on the site.

Former executive John Ashenden said in a blog post he was "overwhelmingly impressed" that Grooveshark was able to "fight the line" for so long.

"How many young people do you know that still download music (legally or illegally) instead of using a paid or ad-promoted streaming music service? I would wager that it is quite few— likely a minority," he wrote.
http://www.latimes.com/business/la-e...620-story.html





Americans Stream 135 Billion Songs in First Half of Year
Dawn Chmielewski

Americans streamed a staggering 135 billion songs and music videos in the first half of 2015, nearly double the figure from the same period a year ago, according to the latest data from Nielsen.

Growing demand for songs delivered on demand via Internet services such as Spotify, YouTube, Slacker and others helped fuel overall music consumption in the last six months.

“Obviously, the streaming piece is really great news, when you’re talking about darn near 100 percent growth … with no new players,” Dave Bakula, senior vice president of Nielsen Entertainment, told Re/code, noting the tallies do not reflect the high-profile launch of the updated Apple Music.

The streaming numbers underscore a fundamental shift in how people get their music. Digital song sales fell 10.4 percent to 531.6 million. Total album sales, including CDs and digital albums, declined 4 percent to 116 million units.

Taylor Swift’s “1989” ranked as the top-selling and most-consumed album of the year so far, despite a highly publicized Spotify protest that limited the number of times songs from her new album were streamed (a mere 188,213 times). Still, Swift sold two million albums.

Drake’s album, “If You’re Reading This It’s Too Late,” was the No. 2 album with 409 million audio streams and 1.4 million units sold.
https://recode.net/2015/07/02/americ...-half-of-year/





Seeking Genuine Discovery on Music Streaming Services
Ben Ratliff

One day last week I woke up and decided I wanted to have the best morning ever.

So I went to Songza — the music-streaming service that offers only predetermined playlists, many of them designed around mood, time of day or activity — and chose the playlist “Best Morning Ever.”

The listening experience felt similar to the mainstream-urban-radio format called rhythmic contemporary. A few different styles may have been represented, directly or through reference — hip-hop with Missy Elliott’s “We Run This,” postdisco with Mariah Carey’s “Fantasy,” Antillean pop with Becky G’s “Can’t Stop Dancin’ ”— but this was still a closed loop of pop for an off-the-rack kind of morning.

Later in the day I checked out Spotify’s “Evening Calm” playlist. It was consistently soft and warm, but focused almost exclusively on the last five years of English-speaking, postfolk Caucasians singing and playing acoustic guitars. If I was being soothed, I was also being sold to. For the streaming services to take over your listening, they need to know who you are. They hope that you can be slotted into predictable contours: a look, a format, a genre.

Even given unlimited options, many people would prefer to listen like this, and that’s fine. But I have a recurring problem with most streaming-service playlists: Though I keep listening out of animal curiosity for what comes next, I am almost never surprised. I always feel as if I’m shopping somewhere, and the music reflects What Our Customers Like to Listen To. The experience can feel benignly inhuman.

With the start of Apple Music this coming Tuesday, I have been thinking about the potential of streaming services. Apple Music, Spotify, Pandora and others oversee an astonishing library — each offers upward of 30 million songs. Not everything, but enough to suggest the concept of “everything.” You can’t listen to it all, of course. There is only so much time to waste.

This is where all those playlists come in, and the other ways services manage or choose songs for you. For music-data analysts and others, “discovery” is very important — the idea that listeners can be guided, through stated or anticipated preferences, toward music just slightly to the side of what they like, and ideally by pressing a single button.

That great library is virtual, not physical. You may never know what you aren’t being exposed to. So will the management of your “discovery” involve a narrowing of your library or an expansion of it? And can the streaming services help us “discover” without confining us?

If there is a basic human desire to have “everything,” there is also a need to manage it. The creators of the great library of Alexandria, in the third century B.C., sought to bring together the history of intellectual activity. Their goal, it is said, was a half-million volumes, and they contacted rulers of the world, asking them to send scrolls of all kinds. It was overseen by the poet and scholar Callimachus, who understood that users would need to easily find things, and so he cataloged it.

According to Alberto Manguel’s book “A History of Reading,” Callimachus may have been the first to organize a catalog alphabetically. He separated the library’s holdings into drama, oratory, lyric poetry, legislation and so forth; he also attached biographical descriptions and other metadata to them. He was one of the important early stratifiers of genre, which has helped define the reception of creative work — and to some extent the making of it — ever since. All the same, as Mr. Manguel writes elsewhere in his book, “all classifications are ultimately arbitrary.”

The Callimachuses of the music-streaming services know that you need help navigating the great library of music. But they also seem to sense that you associate the concept of musical genres, and their ever more imprecise names — R&B, country, Latin, punk — with the old ways.

So what are you going to do? Will you dig deeper into the old classifications and consider yourself a lifer with, say, blues or classical music? (I should mention that I have been impressed by the depth of some streaming-service playlists, particularly Songza’s, within a given genre or artist; most are attributed to “Songza’s music experts.”) Will you listen according to a radiolike model, either with a purely algorithmic logic, or playlists sequenced by humans, or the 24-hour live radio service promised by Apple Music? Will you listen to playlists based on mood or time of day or occasion — which usually turn out to be presentations of genre, or at least format, by another name? Or will you find some way to better expand your listening?

I wonder if there is a way of listening that acknowledges the vastness of the library and casts it in a positive light, even for the indifferent.

That might involve a greater trust in non-obvious juxtapositions, and in the person programming the juxtapositions. It might be interesting for “Evening Calm” to smuggle elements from other traditions or ethnicities or centuries: say, Sylvia Rexach, or Gilberto Gil, or Ravi Shankar, or Debussy. And it might be even more interesting if the smuggler was someone you felt you knew and respected.

Some of Tidal’s playlists created by artists have been promising. Beyoncé’s “Drop It for Summer” list, posted in late May, was two hours of hip-hop associated with regional or viral dance moves from the last decade or so — a subject she knows a lot about — plus Bob & Earl’s “Harlem Shuffle” from 1963. Usher’s “How U Hear It Part 2: Inspiration,” from a few weeks ago, includes tracks from the rapper Young Thug, the German experimental rock band Can and one particularly non-obvious segue: Frank Zappa’s ridiculous “Don’t Eat the Yellow Snow” followed by Aretha Franklin’s sublime “Ain’t No Way.” Both used the great library to make fresh or personal associations, and passed that associative energy on to you.

The old classification systems in music have done important work. They may have reduced and simplified and excluded; they also helped make culture knowable by delimiting the options. But music is everywhere now, and each of us has many more small relationships with it. I am hoping that we can create some bigger ones, too, worthy of our great and mysterious souls.

That Usher playlist could be the first 14 songs generated by his own music library in shuffle mode. But its hard shifts apparently come from his own tastes, and the playlist therefore becomes more interesting than most curation by streaming services. The same probably could be said of the first 14 songs from your library on shuffle, if you own that many songs.

Streaming services will find more clever ways to anticipate listeners’ tastes, which do often follow patterns. And I can see the point of listening to predictable playlists while you’re doing something else; it keeps you company and isn’t distracting.

But for “discovery,” I’d rather be guided by a single person of distinction — whether Beyoncé, or Usher, or one of the musicians that Apple Music has hired for its radiolike features, or one of the countless other brilliant musical thinkers out there. The streaming services should sign them up furiously, for playlists and shows and podcasts, and pay them well for the privilege.

Let those people confuse listeners, or challenge precepts of fashion, by being themselves. That’s where discovery will always lie: in the suggestions of actual human beings. Divisions by genre, format, or mood may be arbitrary. Individual people never are.
http://www.nytimes.com/2015/06/29/ar...-services.html





Audio Overkill? Some Question Benefits of 'High-Res' Music
Ryan Nakashima

Its backers say it does for music lovers what ultra high-definition television has done for couch potatoes.

It's a digital format that packs nearly seven times the data found on CDs, touted as producing crystal-clear sounds with a sharpness that'll blow consumers away. Advocates like Neil Young and major record labels say the format that's the high end of what's known as "high-resolution" audio restores textures, nuances and tones that listeners sacrifice when opting for the convenience of music compressed into formats like MP3s or Apple's AAC.

But some recording-technology experts say this super high-res format — known by its 192 kHz, 24-bit technical specs — is pricey digital overkill, an oversized "bit bucket" that contains sounds only dogs or dolphins can truly enjoy.
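
The "nearly seven times" figure follows from the raw PCM data rates of the two formats; a quick back-of-the-envelope comparison (stereo assumed for both):

def pcm_kbps(sample_rate_hz: int, bits: int, channels: int = 2) -> float:
    """Uncompressed PCM bit rate in kilobits per second."""
    return sample_rate_hz * bits * channels / 1000

cd     = pcm_kbps(44_100, 16)     # Red Book CD audio
hi_res = pcm_kbps(192_000, 24)    # 192 kHz, 24-bit "high-res"

print(cd, hi_res)                 # 1411.2 vs 9216.0 kbps
print(round(hi_res / cd, 2))      # ~6.53x -- "nearly seven times" the data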

Some cynics say the push to high-res audio is just another attempt to get consumers to rebuy music they already own.

Marc de Oliveira did just that in February when he bought Bob Dylan's latest album, "Shadows In The Night" from the Young-backed PonoMusic store. Already having bought the CD from a physical record store, the Copenhagen-based 49-year-old IT consultant splurged on a 24-bit version, hoping to feel more present in the room where Dylan recorded.

Instead, he stumbled on a blog that analyzed the file and found no more than 16 of the 24 bits were used, the same as on the CD. After months of de Oliveira trying to get a refund, Pono's Vice President of Content Acquisition Bruce Botnick replied to his posts saying that Dylan himself liked sample CDs cut in the studio. Engineers mastered the album from those discs, forever locking this particular release at the lower specs.
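
The blog's finding is straightforward to check in principle: if a nominally 24-bit file is a 16-bit CD master padded out, the bottom eight bits of every sample are zero. A minimal sketch of that test, assuming the samples have already been decoded to signed 24-bit integers (this is not Pono's or the blogger's actual tooling):

def always_zero_low_bits(samples) -> int:
    """Count the low-order bits that are zero in every sample -- the
    signature of a lower-resolution master padded into a 24-bit
    container. 'samples' is an iterable of signed 24-bit integers."""
    combined = 0
    for s in samples:
        combined |= s & 0xFFFFFF               # OR all bit patterns together
    if combined == 0:
        return 24                              # pure silence: no bits used
    return (combined & -combined).bit_length() - 1   # index of lowest set bit

# A "24-bit" file that is really a CD rip: each 16-bit value shifted left
# by 8 bits, so the bottom byte never changes.
padded = [v << 8 for v in (-32768, -1, 1, 12345, 32767)]
print(24 - always_zero_low_bits(padded))       # -> 16 effective bits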

Still, that hasn't changed what Pono is charging for the file, $17.99, versus the physical CD, which costs $9.70 on Amazon.

"They should have probably been more active about not accepting that as a real 24-bit file," de Oliveira said.

More than 90 percent of the PonoMusic store is represented essentially by digital copies, or rips, of CDs, Botnick acknowledged to The Associated Press in an interview at his Ojai, California-based studio. To be fair, they're labeled as such. And those files are still in a higher category than AAC files or MP3s, which eliminate some sounds in the compression process.

But of the other albums on PonoMusic labeled higher-than-CD quality, Botnick says about 70 to 75 percent "we know are real," meaning they've researched the recording history to verify the file has more information than just a CD rip or has some other quirk in the original recording justifying a mixed or lower resolution.

He said efforts are being made to further assure consumers of the "provenance," or origins of recordings, and how they got to be labeled high-resolution.

"It's a real fact-finding job" and "it's going to take some time" to handle the thousands of albums in question, he said. Until then, it's a case of "buyer beware," he said.

And while audiophiles may be aware of the rarified, often hard-to-detect benefits of the high-resolution files, average music lovers can easily over-value the claims made by backers, according to Mark Waldrep, a recording engineer, college professor and writer of the "Real HD-Audio" blog.

Studios are re-releasing older recordings in giant data containers that are sometimes barely merited, he says.

That conclusion was reinforced when he analyzed high-res Warner Music re-releases of Joni Mitchell's "All I Want" from the 1971 album "Blue" and "Ain't No Way" from Aretha Franklin's 1968 album "Lady Soul," which The Associated Press bought from the PonoMusic store.

"You're buying a container that's really 50-60 or even 70 percent zeroes. It's all empty information," he said. "The frequencies you're buying up here are either all zeroes, or hiss, which contributes nothing to the enjoyment of the music, unless you're into hiss."

And very few new albums, if any, are being made in the super-high-resolution specs that Pono is touting.

Giles Martin, the Grammy-winning producer of the "Love" soundtrack for The Beatles-Cirque du Soleil show in Las Vegas, says the highest fidelity he records at is 96 kHz, 24 bits, after which there's no benefit in boosting the playback specs. "You can't upscale audio," he says. "There's a compromise in having huge high-res files that don't sound any different than other ones."

From the record labels' point of view, part of the re-mastering process is simply to preserve aging analog tapes at the highest practical digital format.

George Lydecker, a vice president of engineering and archiving at Warner Music, says a CD-specification release of Franklin's "Lady Soul" wouldn't have been as accurate a reproduction partly because placing a necessary filter at the lower frequency required by CDs creates some distortion. Instead, the 192 kHz, 24-bit file that was released "is like standing in the studio live and hearing Aretha belt it out."

The album goes for $17.99 on the PonoMusic store. A CD can be had for $4.99 on Amazon.

While not all people will be able to hear a difference, some will.

"For the first time, you can get the file (that was) approved by the mastering engineer in the studio," says Jim Belcher, Universal Music's vice president of technology and production. "And for a lot of people that doesn't make sense. For a segment of the market that really cares about audio quality, they want that."

And that's the other thing. Even with a $400 PonoPlayer or some other high-end playback device like a Sony Hi-Res Walkman or Astell and Kern AK100II, or even the latest smartphones from Samsung and Apple, audiophiles who want to hear the true benefits of high-resolution audio should also have headphones or speakers capable of playing back those high frequencies that only few humans can hear. In some cases, that could require a headphone amplifier.

John Siau, director of engineering at high-end equipment maker Benchmark Audio, argues that consumers are fooling themselves if they believe they can appreciate high-res audio without the proper high-end equipment.

"There's no point in having high-resolution playback formats if your playback equipment can't even match CD quality," he says.
http://www.newstimes.com/business/te...of-6351306.php





Until next week,

- js.





Current Week In Review





Recent WiRs -

June 27th, June 20th, June 13th, June 6th

Jack Spratts' Week In Review is published every Friday. Submit letters, articles, press releases, comments, questions etc. in plain text English to jackspratts (at) lycos (dot) com. Submission deadlines are Thursdays @ 1400 UTC. Please include contact info. The right to publish all remarks is reserved.


"The First Amendment rests on the assumption that the widest possible dissemination of information from diverse and antagonistic sources is essential to the welfare of the public."
- Hugo Black