P2P-Zone  

Old 05-09-18, 07:10 AM   #1
JackSpratts
 
 
Join Date: May 2001
Location: New England
Posts: 10,013
Default Peer-To-Peer News - The Week In Review - September 8th, ’18

Since 2002

"Many audio extractions qualify as non-infringing fair uses under copyright. Providing a service that is capable of extracting audio tracks for these lawful purposes is itself lawful, even if some users infringe." – EFF






































September 8th, 2018




Canadians Are Actually Getting Sued For File-Sharing

If you torrent all your movies, this one's for you.
Jessica Chin

Turns out, those pesky warning emails from your internet service provider about those movies you download actually mean something.

When you engage in unauthorized downloading of content like movies or TV shows, the creators will often contact ISPs about it. Your ISP will then send you an email notifying you that they've been contacted (as they have been required to do by law since 2015).

Since the so-called "notice and notice" emails aren't actual threats of legal action, most people ignore them.

However, if the unauthorized activity continues, you may get another email stating that the copyright holder has the right to sue.

In most cases, the intellectual property holders who go after individuals for file-sharing can only get a list of IP addresses. They have filed "John Doe" lawsuits, wherein a lawsuit targets an unlimited number of unidentified people who've allegedly committed copyright infringement.

Damages up to $5,000

But copyright holders have cracked down after the Federal Court ordered internet service providers in 2016 to name the people being targeted.

For the past year and a half, movie rights holders have launched lawsuits naming more than 1,000 Canadians, and in some of those suits, have won damages up to $5,000, according to the National Post.

Lawyer James Plotkin of CazaSaikaley told TorrentFreak that he's seen settlements where the defendants will pay $5,000, which is actually the maximum liability for non-commercial copyright infringement.

"I am therefore puzzled as to why individuals would agree to settle for their likely maximum liability at trial," Plotkin said. "I see no rational basis for paying that amount."

But if you happen to get caught up in a suit, Plotkin's advice is to lawyer up.

How is this happening?

It all started seven years ago with a case involving the Oscar-winning film "The Hurt Locker."

Voltage Pictures, the movie's owner, has been trying to sue Canadians over its unauthorized sharing since 2011. In 2016 it sued 50,000 Canadian John Does in a reverse class-action lawsuit, in which a large entity or corporation sues a group of people, instead of the other way around.

Voltage sought a court order in 2016 to force Rogers Communications to name a single one of those John Does.

That case, which is still ongoing and is before the Supreme Court of Canada, is about who should foot the bill for naming customers who've allegedly infringed copyright, which Rogers says costs about $100 an hour, according to the Post.

If Voltage wins, lawyers say the current lawsuits will be the "tip of the iceberg."

"The lower the cost for the plaintiff up front, the easier for them to increase the number of defendants. That can make the difference between naming a lot of defendants and a few," Toronto intellectual property lawyer Graham Honsa told the Post.
https://www.huffingtonpost.ca/2018/0...ng_a_23520421/





YouTube Download Sites are Biggest Piracy Threat to Music Industry

A third of young people use sites that illegally convert videos from YouTube to MP3 files, but a major crackdown is underway
Anthony Cuthbertson

Websites dedicated to “stream ripping” music from YouTube represent the biggest threat to the global music business, industry figures have warned, as a major crackdown seeks to address it.

Sites that allow YouTube videos to be converted into an MP3 file and illegally downloaded to someone’s phone or computer are attracting millions of visitors, with estimates suggesting that a third of 16-24-year-olds in the UK have ripped music from the Google-owned platform.

Other platforms affected by the illegal ripping sites include DailyMotion, SoundCloud and Vimeo, but YouTube is by far the most pirated.

The results of a crackdown that began in 2016 are beginning to be seen, thanks to a coordinated effort by organisations representing record companies in the US and the UK.

Earlier this week, stream ripping website MP3Fiber was forced to shut down following legal pressure.

However, dozens of sites offering similar services still remain active and are easily accessible through Google, whose search engine provides more than 100 million results for the term “YouTube MP3 converter”. YouTube is even host to videos that explain exactly how to do it, some of which have amassed tens of thousands of views over several years.

The illegal sites are able to monetise their popularity through online adverts, making millions in revenue that rightfully belongs to the content creators and copyright holders. Such widespread use makes these stream ripping sites a bigger threat to copyright holders and the music industry generally than illegal torrent sites like The Pirate Bay. Beyond the stream ripping websites are apps that offer a similar service, such as the Videoder app, which has over 40 million users worldwide.

The British Phonographic Industry (BPI), the UK record industry’s trade association, is among those attempting to clear these illegal sites from the internet and encourage the use of legal music streaming sites.

“Although coordinated action by the record industry is delivering results, with major platforms like YouTube-mp3 closed down, we must continue to act against illegal sites that build huge fortunes by ripping off artists and labels,” Geoff Taylor, the chief executive of the BPI and Brit Awards, told The Independent.

Mr Taylor said the “fantastic range” of legal streaming services, such as Spotify and Apple Music, could be under threat if the issue of stream ripping sites is not properly dealt with. The best way to do this, he said, is through a coordinated crackdown.

“We hope that responsible advertisers, search engines and hosting providers will also reflect on the ethics of supporting sites that enrich themselves by defrauding creators,” he said.

But chasing down and shutting down such sites is not a simple issue, with some digital rights groups arguing that they provide a legitimate service, even if they are misused by some people.

The Electronic Frontier Foundation (EFF) claims that even referring to them as “stream ripping” sites is misstating copyright law.

“There exists a vast and growing volume of online video that is licensed for free downloading and modification, or contains audio tracks that are not subject to copyright,” the EFF told the US Office of the United States Trade Representative last year.

“Moreover, many audio extractions qualify as non-infringing fair uses under copyright. Providing a service that is capable of extracting audio tracks for these lawful purposes is itself lawful, even if some users infringe.”

For its part, YouTube says it has “invested heavily” in copyright and content management tools, such as Content ID – a technology that provides rights holders with ways to manage and monetise their content.

The Google-owned platform’s terms of service state that users are forbidden from accessing content on YouTube through third parties, while disabling features that prevent the copying of content is also not allowed.

A YouTube spokesperson reiterated this to The Independent and said it is working to prevent such practices from taking place.

“Our terms of service prohibit the downloading or copying of videos on YouTube without explicit consent from the copyright holder,” the spokesperson said. “Once notified of an infringing tool, or service that allows the downloading of a YouTube video without permission from the content owner we take appropriate action.”
https://www.independent.co.uk/life-s...-a8505131.html





Wikimedia Warns EU Copyright Reform Threatens the ‘Vibrant Free Web’
Natasha Lomas

The Wikimedia Foundation has sounded a stark warning against a copyright reform proposal in Europe that’s due to be voted on by the European Parliament next week. (With the mild irony that it’s done so with a blog post on the commercial Medium platform.)

In the post, also emailed to TechCrunch, María Sefidari Huici, chair of the Wikimedia Foundation, writes: “Next week, the European Parliament will decide how information online is shared in a vote that will significantly affect how we interact in our increasingly connected, digital world. We are in the last few moments of what could be our last opportunity to define what the Internet looks like in the future.

“The next wave of proposed rules under consideration by the European Parliament will either permit more innovation and growth, or stifle the vibrant free web that has allowed creativity, innovation, and collaboration to thrive. This is significant because copyright does not only affect books and music, it profoundly shapes how people communicate and create on the internet for years to come.”

Backers of the reform proposals argue they will help European creatives be fairly recompensed for their work. But critics argue the proposals are not balanced and will chill the creative freedoms of web users to share and comment on content online.

The two articles attracting the most controversy in the reforms are:

• Article 11; which proposes a neighboring copyright for snippets of journalistic content — requiring news aggregators such as Google News to gain a license from the publisher to use this type of content (branded a ‘link tax’ by critics);
• Article 13; which seeks to shift liability for platform users’ copyright infringements onto the platforms themselves — and which critics contend will therefore push them towards creating upload filters to monitor all content before it’s posted, having a chilling effect on Internet expression. Critics sometimes dub this component ‘censorship machines’.

In July MEPs issued a smackdown to the Commission by refusing to back the reforms — and voting to reopen debate. Another vote is due next week, with amendments in the process of being tabled now, hence Wikimedia’s intervention.

In her blog post, Sefidari Huici urges MEPs to remember the original objective for the update: “To make copyright rules that work for better access to a quickly-evolving, diverse, and open internet.”

“The very context in which copyright operates has changed completely. Consider Wikipedia, a platform which like much of the internet today, is made possible by people who act as consumers and creators. People read Wikipedia, but they also write and edit articles, take photos for Wikimedia Commons, or contribute to other Wikimedia free knowledge projects. Content on Wikipedia is available under a free license for anyone to use, copy, or remix,” she writes.

“Every month, hundreds of thousands of volunteers make decisions about what content to include on Wikipedia, what constitutes a copyright violation, and when those decisions need to be revised. We like it this way — it allows people, not algorithms, to make decisions about what knowledge should be presented back to the rest of the world.”

She also warns that changes to EU copyright could have serious implications for Wikipedia and other collaborative non-profit websites, urging MEPs to “institute policies that promote the free exchange of information online for everyone”.

“We urge EU representatives to support reform that adds critical protections for public domain works of art, history, and culture, and to limit new exclusive rights to existing works that are already free of copyright,” she writes.

On Article 13 specifically she warns this would push platforms towards creating “costly, often biased systems to automatically review and filter out potential copyright violations on their sites”, warning: “We already know that these systems are historically faulty and often lead to false positives. For example, consider the experience of a German professor who repeatedly received copyright violation notices when using public domain music from Beethoven, Bartók, and Schubert in videos on YouTube.”

“The internet has already created alternative ways to manage these issues,” she adds. “For instance, Wikipedia contributors already work hard to catch and remove infringing content if it does appear. This system, which is largely driven by human efforts, is very effective at preventing copyright infringement.”

She also argues that the copyright reform debate has been dominated by market relationships between large rights holders and for-profit internet platforms — saying this too narrow slice “does not reflect the breadth of websites and users on the internet today”.

“Wikipedians are motivated by a passion for information and a sense of community. We are entirely nonprofit, independent, and volunteer-driven. We urge MEPs to consider the needs of this silent majority online when designing copyright policies that work for the entire internet,” she adds, calling for MEPs to create a copyright framework that reflects “the evolution of how people use the internet today”.

“We must remember the original problem policymakers set out to solve: to bring copyright rules in line with a dramatically larger, more complex digital world and to remove cross-border barriers. We should remain true to the original vision for the internet — to remain an open, accessible space for all.”

Asked for a response to Wikimedia’s criticisms, a spokeswoman for the European Commission pointed us to an FAQ where it discusses what will happen to online encyclopaedias based on content uploaded by users — and claims these sites will not fall under the scope of the reform (because “the vast majority of the content on Wikipedia is uploaded with the consent of their rights holders” — something critics of the reform dispute).

She also sent us a general comment from Commission spokesperson, Nathalie Vandystadt, in which she states:

The new copyright rules are necessary in order to allow creators and the press to get a better deal when their works are made available online. At the same time, our proposal safeguards free speech and ensures that online platforms – including 7,000 European online platforms – can develop new and innovative offers and business models. It will not ban memes or hyperlinks, as has often been claimed in the public debate.

The Commission presented its balanced proposal two years ago, in September 2016. We have discussed the proposal with all relevant actors. We now expect the European Parliament to reach a position and stand ready to start negotiations on this important reform with the Parliament and the Council of the EU as soon as possible. The process has been long enough. Any further delay at this stage would put at risk adoption before the next European elections.


It’s not the first time Wikimedia has made a high profile intervention in the reform debate; Wikipedia founder Jimmy Wales added his name to an open letter in June warning that it “takes an unprecedented step towards the transformation of the Internet from an open platform for sharing and innovation, into a tool for the automated surveillance and control of its users”.

While, in July, several local language versions of the Wikipedia encyclopaedia voted to temporarily black out their content to protest the copyright proposals.

It remains to be seen whether MEPs will be swayed by all this public pressure — not least given all the counterlobbying they are getting behind the scenes.

Commenting on the state of play for the copyright reform ahead of the vote later this month, Marietje Schaake, a Dutch Member of the European Parliament, told us it’s too close to call right now.

“Right now it is impossible to say how the copyright vote will play out next week. I have been working hard on a sensible compromise that respects our fundamental rights, but we don’t know until tomorrow which amendments will be voted on,” she told TechCrunch. “MEPs and political groups are still making up their minds, and the margins are very tight. The votes could swing either way.”

Schaake said it’s likely more clarity will emerge tomorrow, once it’s clear who has tabled what (in terms of amendments) that will then get voted on by the whole parliament next week.

On the controversial Article 13 portion of the reform, which would make platforms directly liable for copyright infringements by users, options likely to be on the table include some previous texts (such as the text produced by the Commission, or the original Legal Affairs Committee (Juri) text), which, having already been rejected, are unlikely to gain a majority.

The new proposal by @AxelVossMdEP still makes platforms liable for all their users’ copyright infringements, with no safeguards to prevent filtering. Platforms can either filter everything or get a license for every work in the world. https://t.co/VBLzy3FC2v #SaveYourInternet

— Julia Reda (@Senficon) August 31, 2018


An amendment suggesting full deletion of the article is also likely to be tabled — but also probably wouldn’t get majority backing given the level of backing the reform has behind it.

There may also be a version of the text produced by the Internal Market and Consumer Protection committee, which had joint competency on Article 13 of the proposal with the Juri committee but at the vote in July argued that its position had not been taken into account by the Juri text (which it criticized as not achieving “the needed balance”).

On top of that, additional new compromise versions — which “aim to remove the worst parts of Article 13”, as Schaake puts it — are also likely to be tabled. But with votes predicted to be tight, it’s hard to say which way MEPs will jump.

In July, the parliament voted by 318 votes to 278, with 31 abstentions, to reject the negotiating mandate that had been proposed by the Juri committee the month before.

As a result, the parliament’s position was reopened for debate, amendment and a vote — which will be held during an afternoon plenary session on September 12.

The EC’s VP and commissioner for the digital single market, Andrus Ansip, has described the scale of lobbying from “all sides” around the copyright reform proposals as “astonishing”.

“Everyone claims that their rivals will kill creativity, or kill innovation, or kill the internet — or kill all of it at the same time. This all has to stop. It is getting us nowhere,” he wrote in a blog post in late July. “It is good to have a lively debate about copyright – but not one which has descended into slogans and exaggeration.

“We need to go beyond that, to find an acceptable and workable compromise that gives Europeans the right kind of copyright laws for the digital age. They deserve nothing less. And it is achievable.”

“Today, the debate sounds as if we had to choose between protecting artists or the internet,” he added. “I do not agree with this. What we should be doing — together — is to protect both: to make sure artists are paid fairly for their work, and at the same time protect freedom of expression and creativity on the internet. So we should not accept anything that puts that freedom in danger.

“Neither should we accept leaving artists and quality media unprotected. Those were my starting points for the Commission’s proposal. They have not changed.”

Ansip also wrote that he would like to see the parliament move closer to the Commission’s original proposals in its September vote, writing: “I genuinely believe that it was a good proposal, taking all opposing interests into account. That was not easy to achieve in itself.”

How or even whether MEPs manage to compromise on such a contentious issue remains to be seen. And in the meanwhile the lobbying isn’t letting up.

Pirate Party MEP Julia Reda has been campaigning for a copyright reform fit for the digital age for her entire career as an elected member of the European parliament.

Instead, she finds herself fighting against the threat of “censorship machines” and “link taxes”, as she sees it.

“The proposals for a “compromise” currently circulating in the EP don’t go nearly far enough to address the concerns that Article 11 will harm hyperlinking and Article 13 will lead to the widespread installation of upload filters which will massively overblock legal posts,” she tells us. “In fact, they may make matters worse.”

“The latest compromise proposal on the “link tax” now specifies that reproducing “individual words” from an article as part of a link is okay – but that just confirms that reproducing the full title of an article in a link, as is commonplace on the web, would require a license. It would be left to lengthy and expensive court cases to define how many words still count as “individual words” – links, the basic building block of the internet, would be mired in legal uncertainty for years.

“The proposal on platform liability was simplified by removing mentions of specifically which measures should be deployed to prevent copyrighted content from appearing online, but along with that, also all safeguards, complaint mechanisms and the objective not to remove legal works. Even if the text no longer mentions upload filters per se, total liability for all user-uploaded content leaves platforms with no other choice but to filter uploads as best they can, erring on the side of caution. This is, at best, a cosmetic change.”

Reda will be filing her own counter-proposals — aimed at removing “the threats to freedom of speech and the basic functioning of the internet”, as she couches it — from the text.

The deadline for amendments is Wednesday night so there is still time for proposals to shift. After that it will be up to MEPs to decide how they vote.
https://techcrunch.com/2018/09/04/wi...rant-free-web/





Google: Sorry Professor, Old Beethoven Recordings on YouTube are Copyrighted

Op-ed: How one German professor had a bad experience with overly broad upload filters.
Ulrich Kaiser

Imagine you are a teacher at a public school and you want to use a free recording of Beethoven’s Fifth Symphony in your classroom. As an author of music textbooks and a music theory professor, I am always looking for creative ways to develop teaching materials as Open Educational Resources (oer-musik.de) so that everyone can share and learn from these important recordings. In an effort to develop a set of these materials, I recently began to digitize both my own records as well as records from my employer.

Under German law, the copyright term for recordings which were made prior to January 1, 1963 has expired, meaning they have entered the public domain. Recordings taken after that date were given extended protection in 2013 and thus cannot be digitized. Aware of this rule, I only undertook to upload recordings which were taken before the 1963 date in order to fully comply with the law. Despite that precaution, the process that followed presented a number of unexpected challenges.

The first video I uploaded to YouTube promoted the website where my digitized copies of public domain recordings are available to download. In this video, I explained my project while examples of the music played in the background. Less than three minutes after uploading, I received a notification that there was a ContentID claim against my video.

ContentID is a system, developed by YouTube, which checks user-uploaded videos against databases of copyrighted content in order to curb copyright infringement. This system took millions of dollars to develop and is often pointed to as a working example of upload filters by rights holders and lawmakers who wish to make such technology mandatory for every website which hosts user content online. However, these claims ignore the widespread reports of its often flawed execution.
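The matching structure described above — checking uploads against a database of rights-holder content — can be sketched in a few lines. This is an illustrative simplification only: the real ContentID uses perceptual audio fingerprints that survive re-encoding and noise, whereas this sketch hashes raw sample chunks exactly; the catalog name and threshold are invented for the example.

```python
# Hypothetical sketch of fingerprint-index matching (NOT ContentID's real
# algorithm): hash fixed-size chunks of a reference catalog, then count how
# many chunks of an upload hit the index.

from collections import defaultdict

CHUNK = 4  # samples per fingerprint window (tiny, for illustration only)

def fingerprints(samples):
    """Yield a hashable fingerprint for each fixed-size chunk."""
    for i in range(0, len(samples) - CHUNK + 1, CHUNK):
        yield tuple(samples[i:i + CHUNK])

def build_index(catalog):
    """Map fingerprint -> list of (work_id, chunk_offset) for a catalog."""
    index = defaultdict(list)
    for work_id, samples in catalog.items():
        for offset, fp in enumerate(fingerprints(samples)):
            index[fp].append((work_id, offset))
    return index

def match(index, upload, threshold=2):
    """Return catalog works with at least `threshold` fingerprint hits."""
    hits = defaultdict(int)
    for fp in fingerprints(upload):
        for work_id, _ in index.get(fp, ()):
            hits[work_id] += 1
    return {work for work, n in hits.items() if n >= threshold}

# Invented toy data: one "work" of twelve samples.
catalog = {"symphony_5": [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12]}
index = build_index(catalog)
print(match(index, [1, 2, 3, 4, 5, 6, 7, 8]))  # upload shares two chunks
print(match(index, [9, 9, 9, 9]))              # no catalog overlap
```

The false positives the article goes on to describe follow naturally from this design: the index can only say that audio matches a claimed reference, not whether the claimant actually holds enforceable rights in it.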

In fact, when I replied to the claim on my introductory video stating that the claimant’s own website said that the date of the recording’s first publication was in 1962, and thus it was in the public domain, the claim was withdrawn with no further ado. This interaction sparked a curiosity in me: were other users uploading public domain music to YouTube receiving similar requests?

Opening a test account

I decided to open a different YouTube account “Labeltest” to share additional excerpts of copyright-free music. I quickly received ContentID notifications for copyright-free music by Bartók, Schubert, Puccini and Wagner.

Again and again, YouTube told me that I was violating the copyright of these long-dead composers, despite all of my uploads existing in the public domain. I appealed each of these decisions, explaining that 1) the composers of these works had been dead for more than 70 years, 2) the recordings were first published before 1963, and 3) these takedown requests did not justify any property rights under the German Copyright Act.

I only received more notices, this time about a recording of Beethoven’s Symphony No.5, which was accompanied by the message: “Copyrighted content was found in your video. The claimant allows its content to be used in your YouTube video. However, advertisements may be displayed.” Once again, this was a mistaken notification.

The recording was one by the Berlin Philharmonic under the direction of Lorin Maazel, which was released in 1961 and is therefore in the public domain. Seeking help, I emailed YouTube, but its reply, “[...] thank you for contacting Google Inc. Please note that due to the large number of enquiries, emails received at this email address support-de@google.com cannot be read and acknowledged” was less than reassuring.

I wish I could tell you that the ending to this tale was wholly happy. It is true that many of my contestations of these copyright violations were successful, and the videos were not taken down from YouTube. However, I intended to release all of my videos under a free license so that they could be used in the future for others to educate and inform students about these beautiful works.

Even in cases where my defenses against the ContentID claims were successful, the videos were not reverted to this free license, making it much more difficult for others to use and share these digitized works in the way I originally intended.

Filters like ContentID can be useful for platforms that host large amounts of user-generated content, but as my story exposes, they have significant flaws which can lead to the diminishment of educational and cultural resources online. In addition to the copyright issues, the technology appears unable to always recognize the musical piece playing.

For example, my video of Hans Hotter’s performance of Franz Schubert’s “Der Atlas” was automatically matched; however, the recording ContentID identified was actually one by Dietrich Fischer-Dieskau. When lawmakers mandate the use of these types of filters for all platforms regardless of their size or existing effective practices, they miss the nuance of how the Internet operates and the technical flaws in automatic content detection which still exist, just as these upload filters missed the nuance of my public domain uploads.
https://arstechnica.com/tech-policy/...-of-beethoven/





Local Product Quotas for Netflix, Amazon to Become Law, EU Official Says (EXCLUSIVE)
Nick Vivarelli

Quotas obligating Netflix, Amazon and other streaming services operating in the European Union to dedicate at least 30% of their on-demand catalogs to local content are set to become enshrined in law soon.

Roberto Viola, head of the European Commission department that regulates communications networks, content and technology, said the new rules, which will also demand visibility and prominence of European product on streamers, are on track to be approved in December.

“We just need the final vote, but it’s a mere formality,” he told Variety at the Venice Film Festival.

Netflix, Amazon and other streamers will be required to fund TV series and films produced in Europe by commissioning content, acquiring it or paying into national film funds through a small surcharge added to their subscription fee, something which is already happening in Germany. Netflix tried unsuccessfully to fight the German surcharge in court.

Viola said that, starting in December, the EU’s 28 member states would have 20 months to apply these new norms and that countries “could choose to raise the quota from the 30% minimum to 40%.” EU nations can each choose whether the 30% includes sub-quotas on original productions in their countries and whether they want to follow the German model of adding a small surcharge on streamer subscription fees to support the national production fund.

Viola noted that Netflix isn’t that far from having a 30% portion of European content on its platform already, but said that the new rules are clearly intended to force streamers to up their investments in Europe.

He added that, in October, the EU will publish figures showing the percentage of European works already present on the various streaming platforms. “It doesn’t have a legal value, but will help national regulators apply the rules,” he said.

Other European Union rules being developed are intended to force streamers and user-generated platforms like YouTube to pay increased copyright fees to film and TV directors and writers.

“It’s a paradox that, in the digital world, the platforms are getting the largest shares of the revenue stream and those who create the content and drive traffic get the smaller share,” he said. “There is what we call a value gap there which the Internet world has created. … Artists and creatives must be able to renegotiate their contracts.”
https://variety.com/2018/film/news/n...aw-1202924740/





YouTube, Netflix Videos Found to Be Slowed by Wireless Carriers
Olga Kharif

• Wehe app shows videos streaming at fraction of available speed
• As FCC backs off, researcher becomes net-neutrality watchdog

The largest U.S. telecom companies are slowing internet traffic to and from popular apps like YouTube and Netflix, according to new research from Northeastern University and the University of Massachusetts, Amherst.

The researchers used a smartphone app called Wehe, downloaded by about 100,000 consumers, to monitor which mobile services are being throttled when and by whom, in what likely is the single largest running study of its kind.

Among U.S. wireless carriers, YouTube is the No. 1 target of throttling, where data speeds are slowed, according to the data. Netflix Inc.’s video streaming service, Amazon.com Inc.’s Prime Video and the NBC Sports app have been degraded in similar ways, according to David Choffnes, one of the study’s authors who developed the Wehe app.

From January through early May, the app detected "differentiation" by Verizon Communications Inc. more than 11,100 times, according to the study. This is when a type of traffic on a network is treated differently than other types of traffic. Most of this activity is throttling.

AT&T Inc. did this 8,398 times and it was spotted almost 3,900 times on the network of T-Mobile US Inc. and 339 times on Sprint Corp.’s network, the study found. The numbers are partly influenced by the size of the networks and user bases. C Spire, a smaller privately held wireless operator, had the fewest instances of differentiation among U.S. providers, while Verizon had the most.

"If you are a video provider, you have a patchwork of different carriers doing different things to your network traffic," Choffnes said. "And the patchwork can change any time." Consumers often don’t know about this throttling, he added, noting that he discovered AT&T began slowing down some of his apps earlier this year.

Carriers say they’re throttling to manage internet traffic. To deliver the videos people want to watch on their phones, sacrifices in speed are required, according to the three largest U.S. wireless companies, Verizon, AT&T and T-Mobile. Terms-of-service agreements tell customers when speeds will be slowed, like when they exceed data allotments. And people probably don’t notice because the video still streams at DVD quality levels. If you want high-definition video, you can pay more, the carriers say.

While slowing speeds can reduce bottlenecks and congestion, it raises questions about whether all traffic is treated equally, a prime tenet of net neutrality. The principle states that carriers have to treat all data on their networks the same, and not discriminate by user, app or content. The Federal Communications Commission under President Barack Obama enshrined net-neutrality rules in 2015. After Donald Trump won the 2016 election, a Republican-led FCC voted to scrap the regulations.

The Wehe app has so far conducted more than 500,000 tests involving more than 2,000 internet service providers in 161 countries. It measures how fast each wireless network delivers video from services like Netflix and YouTube and compares those speeds with the network speeds available at that time. For example, in one recent test of the app, Netflix speeds were 1.77 megabits per second on T-Mobile, compared with the 6.62 megabits-per-second speed available to other traffic on the network at the same time.
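The comparison at the heart of such a test can be sketched in a few lines. The function names and the 80% threshold below are illustrative assumptions, not Wehe's actual statistics; the numbers plugged in are the T-Mobile/Netflix figures quoted above.

```python
# Hypothetical sketch of differentiation detection: compare the throughput a
# service achieves against a control transfer made at the same time on the
# same network, which shows the bandwidth actually available.

def differentiation_ratio(service_mbps, control_mbps):
    """Return the fraction of available bandwidth the service received."""
    if control_mbps <= 0:
        raise ValueError("control throughput must be positive")
    return service_mbps / control_mbps

def is_throttled(service_mbps, control_mbps, threshold=0.8):
    """Flag likely throttling when the service receives well under the
    bandwidth the control traffic shows is available. The threshold is an
    assumption for illustration, not Wehe's actual decision rule."""
    return differentiation_ratio(service_mbps, control_mbps) < threshold

# The T-Mobile/Netflix measurement quoted in the article:
ratio = differentiation_ratio(1.77, 6.62)
print(f"Netflix received {ratio:.0%} of available bandwidth")  # about 27%
print(is_throttled(1.77, 6.62))  # True
```

A real detector would repeat this over many paired transfers and apply a statistical test, since single measurements are noisy.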

In recent months, Choffnes has become a new kind of net-neutrality watchdog since the FCC vote in 2017. He’s been retained by the French government to use the Wehe app to audit for net-neutrality violations. State and local governments in the U.S. have come calling, too. Choffnes said he also shared his findings with the Federal Trade Commission, which took over the job of policing U.S. internet service providers from the FCC.

"Efforts like Wehe are an important approach to detect whether internet service providers are engaging in traffic shaping, i.e., slowing down traffic of certain online services or apps," said Florian Schaub, a privacy and mobile-computing expert at the University of Michigan. "Now that net neutrality has been repealed by the FCC, it is important for consumers and researchers to watch out for ISPs starting to make use of their new ‘freedom’ in that way, and then call ISPs out for it."

Throttling was happening well before the FCC stopped enforcing net neutrality. T-Mobile has been streaming video at different speeds since it started offering free streaming through Binge On in 2015. It was an agreement between customers, T-Mobile and video providers like Netflix.

“We do not automatically throttle any customers,” said Rich Young, a Verizon spokesman. “To manage traffic on our network, we implement network management, which is significantly different than blanket throttling.”

John Donovan, head of AT&T’s satellite, phone and internet operations, said "unequivocally we are not selectively throttling by what property it is. We don’t look at any traffic differently than any other traffic."

He compared AT&T throttling to an electricity grid where some customers sign up for rolling blackouts in return for cheaper service. That’s what Choffnes’s research is detecting, the AT&T executive said. "We talked to him about some of his methodologies," he added.

Choffnes’s work is funded by the National Science Foundation, Google parent Alphabet Inc. and ARCEP, the French telecom regulator. Amazon provided some free services for the effort, too. He even has a deal with Verizon to measure throttling at all U.S. carriers for a public report that’s yet to be published. Choffnes said Verizon can’t restrict his ability to publish research and the companies that support him don’t influence his work.

Choffnes became an internet celebrity in December, when Apple Inc. rejected his Wehe app from the App Store. He tweeted about it, and the news website Motherboard wrote about it. Following an outcry, Apple approved and published his app. Wehe had only a handful of users before the episode, but quickly gained tens of thousands of new testers.

The net-neutrality debate came to the forefront again in recent days after Verizon limited the data speeds of California firefighters as they battled a blaze. The company said it made a “customer-support mistake” in limiting access to emergency responders.

"As we saw with Verizon throttling the Santa Clara County Fire Department, ISPs are happy to use words like ‘unlimited’ and ‘no throttling’ in their public statements, but then give themselves the right to throttle certain traffic by burying some esoteric language in the fine print," Jeremy Gillula, tech policy director at Electronic Frontier Foundation, said. "As a result, it’s especially important that consumers have tools like this to measure whether or not their ISP is throttling certain services. Only tools like this can really keep ISPs honest."

With no federal net-neutrality rules in the U.S., legislators and regulators from state and local governments including New York City and Massachusetts have reached out to Choffnes for advice on writing their own replacement rules, he said.

"I’ve always wanted to focus on areas where not only I benefit as a user but also pretty much everyone else will benefit," he said. "Problems where we can have a real-world impact." His prior work includes ways to improve download speeds of BitTorrent, a peer-to-peer protocol used for sharing files.

Once Wehe collects a year’s worth of data, Choffnes hopes to present the trove at a major technology research conference. His recent paper on the project was rejected by the Internet Measurement Conference partly because it didn’t have 12 months of data, he said.

With his app, Choffnes said he wants to give regulators the tools to monitor the marketplace.

"I would not contest the term watchdog -- that certainly was the goal of our project," he said.

— With assistance by Scott Moritz, and Tom Giles
https://www.bloomberg.com/news/artic...research-finds





Why Google Fiber is High-Speed Internet’s Most Successful Failure
Blair Levin, Larry Downes

In 2010, Google rocked the $60 billion broadband industry by announcing plans to deploy fiber-based home internet service, offering connections up to a gigabit per second — 100 times faster than average speeds at the time. Google Fiber, as the effort was named, entered the access market intending to prove the business case for ultra-high-speed internet. After deploying to six metro areas in six years, however, company management announced in late 2016 that it was “pausing” future deployments.

In the Big Bang Disruption model, where innovations take off suddenly when markets are ready for them, Google Fiber could be seen as a failed early market experiment in gigabit internet access. But what if the company’s goal was never to unleash the disrupter itself so much as to encourage incumbent broadband providers to do so, helping Google’s expansion in adjacent markets such as video and emerging markets including smart homes?

Seen through that lens, Google Fiber succeeded wildly. It stimulated the incumbents to accelerate their own infrastructure investments by several years. New applications and new industries emerged, including virtual reality and the Internet of Things, proving the viability of an “if you build it, they will come” strategy for gigabit services. And in the process, local governments were mobilized to rethink restrictive and inefficient approaches to overseeing network installations.

The story of Google Fiber provides valuable lessons for future network transformations, notably the on-going global race to deploy next-generation 5G mobile networks. It seems, then, a good time to review the story of how the effort came into being, what it achieved, and what it teaches investors, consumers, and community leaders eager to ensure continued private spending on internet infrastructure.

In 2009, Congress charged the Federal Communications Commission with the development of a National Broadband Plan (NBP). The plan set aggressive targets for expanding high-speed broadband service throughout the U.S., continuing to rely almost entirely on private investment. The overall goal: to ensure at least 100,000,000 Americans had access to broadband speeds of 100 Mbps by 2020.

As it turned out, providers blew past that milestone as early as 2016. But in 2009, no leading carrier was planning a major upgrade of its existing physical plant. This was a break from the previous decade, when technical improvements and competing technologies meant constant upgrades, advancing from dial-up through early cable-based broadband, DSL service offered over the analog phone network, early fiber-based deployments (notably Verizon FiOS) and cable’s last major upgrade, known as DOCSIS 3.0.

By 2009, however, Verizon had scaled back plans for more fiber and DSL technology was falling behind improvements in cable. Major markets were migrating to two segments — a high-end served by cable and a low-end served by DSL.

The broadband market was experiencing a classic “prisoner’s dilemma,” where neither cable nor DSL providers felt a competitive threat from the other requiring substantial new investment, trusting in relative peace within their own market segment. Continued expansion of broadband capacity was on the verge of stalling.

Google sets off a “game of gigs”

In response to requests from the NBP team, Google suggested construction of a fiber-based gigabit testbed to demonstrate the competitive and economic importance of new applications that would not be possible without next-generation infrastructure — including virtual reality, smart grids, autonomous vehicles, advanced tele-health, electronic government, and distance-based education.

Rather than wait for incumbent providers or a government-funded experiment, the company announced that it would build a small number of experimental gigabit networks itself. To everyone’s surprise, Google was overwhelmed with cities promoting themselves for the test, receiving 1,100 proposals rather than the 10 to 50 they expected.

Cities saw great value to their communities of being one of the testbeds. They also understood that what Google was looking for was not tax breaks or other financial incentives so much as speed in execution, and in particular commitments from the participating communities to minimize build-out delays—and help in lowering construction costs. In short, Google wanted partners, not antagonists.

The finalists offered administrative efficiency—a single master contract, a sole point of contact in city government, streamlined procedures for permits to install equipment on city-owned property, and permission to dig up city streets to lay conduit. These costs—in dollars, time, and political conflict—had proven to be a major hindrance for network deployment, and Google knew that its test wouldn’t unleash entrepreneurial and competitive energy if it couldn’t deploy quickly.

Google was always coy about whether its real goal was to become a nationwide broadband provider, or simply to stimulate investment in next-generation networks by incumbent providers and other new entrants. What is clear is that Google’s own interest in fiber stemmed from a conviction that faster speeds would eventually generate more revenue and services for the broader Alphabet enterprise, making the investment justifiable if not profitable.

Becoming a competitive ISP itself was a secondary aspiration.

So Google went about announcing locations, and incumbent broadband ISPs, including AT&T, CenturyLink, Comcast, and Time Warner Cable, would quickly counter by promising improved pricing, faster speeds, network upgrades or some combination of the three. A “game of gigs” had erupted.

In the end, Google announced plans to build in 34 cities, playing a kind of broadband whack-a-mole game. Incumbents, who initially dismissed the effort as a publicity stunt, accelerated and reprioritized their own deployments city by city as Google announced follow-on expansion.

As the game of gigs played out, city leaders were forced to offer the same administrative advantages to incumbents as they had to Google Fiber. Construction costs fell, and the speed of deployments increased. Only six years after Google’s initial announcement, according to the Fiber Broadband Association, 30% of urban residents had access to gigabit Internet service.

Though Google appears to have paused future deployments, the broadband business has permanently changed. Fiber investments by former telephone companies have accelerated or restarted. More advanced DSL using fiber-copper hybrid technology was rushed into operation, as were new fiber-to-the-home services from AT&T, CenturyLink and Frontier. Cable companies once again upgraded their technology, accelerating deployment of gigabit-capable standards. New technologies — including low-orbit satellites and “fixed wireless” — were developed for remote and rural locations.

The two-tiered market of high-speed cable and lower-speed DSL broadband has given way to a free-for-all, forcing adoption of more disruptive strategies by incumbents and new entrants alike. The result is increased competition between providers and among cities and regions eager for game-changing private investment.

But we believe Google Fiber’s most significant impact was to change the nature of relations between infrastructure providers and local authorities. Even after substantial deregulation in the 1980s and 1990s, and even as separate networks and technologies converged on a single internet-based standard, local governments continued to treat network providers as quasi-governmental public utilities, regulating their construction efforts and access to public rights of way with cumbersome procedures developed decades earlier.

Thanks to Google Fiber, the monopoly mindset gave way to one in which both sides understood the other could walk away. Cities learned that inefficient construction management would lead providers to invest elsewhere, while ISPs came to see that cities could only do so much to improve the economics of upgrades and new deployments. Following Google’s lead, the ISPs and the cities created public-private partnerships such as Research Triangle’s North Carolina Next Generation Network, in which both got more, in terms of their goals, than they gave.

A shift in competition and investor mindsets

The Google Fiber experiment caused a reexamination of basic assumptions about competition in what was seen as a static infrastructure industry. Somehow, a powerful new entrant that dominated an adjacent market began a competitive service, deployed it city-by-city, and ignited investment and new competition among the incumbents.

As the authors of the National Broadband Plan hoped, enthusiasm for gigabit internet testbeds broke a logjam in infrastructure investment, accelerating fiber deployments perhaps by as much as two years and stimulating incumbents to commit an estimated $7 to $10 billion in additional capital spending.

Google Fiber’s entrance also — and crucially — changed investor mentality. Wall Street had punished Verizon for investing in its FiOS network ahead of market demand, and likely would have punished other incumbents for upgrading copper-based networks to compete with cable, which had a cheaper upgrade path. Similarly, Wall Street would have punished cable for upgrading its technology when it was already beating DSL in both performance and market share.

Investors, on the other hand, did not punish Google for entering the ISP market. When cities and their residents unexpectedly embraced the vision of gigabit Internet, investors then allowed the incumbents to respond to the new opportunities and threats Google Fiber created.

At the same time, to the benefit of all stakeholders, Google’s entrance into the broadband market exposed long-standing federal, state, and local regulatory inefficiencies that made deployment slow and expensive. With wasteful processes reformed, providers improved the efficiency of their capital investments. Consumers got new services. Cities saw revitalized industries and positive press coverage.

As the U.S. and other economies now undertake even more expensive deployment of next-generation 5G mobile network technologies, heeding the lessons of Google Fiber will distinguish the winners from the losers.

What it means for a 5G mobile network

5G promises speeds and new applications that will make mobile broadband competitive even with fiber. And deployment will likely follow the new city-by-city model pioneered by Google Fiber. Local governments will again have to rethink their approach to construction oversight, including permitting, zoning, franchising, tower siting, and fees.

And there is evidence that they are. For example, rental costs for rights of way, pole attachment rents and other recurring charges, long seen by some cities as a rich source of funding for budget shortfalls, are now being fiercely negotiated by providers. In Boston, Sacramento, and other cities that have secured early 5G investment, local governments are finding that carriers are more than willing to deal, but may walk away if officials demand too many concessions.

For example, authorities are finding, as with Google Fiber, that they must offer competitive rates or risk delaying private investment in new networks, a critical source of local development and competitiveness. In effect, they are learning to balance long-term objectives against short-term fee maximization.

The winners once again will be those communities that appreciate the importance of forming early and comprehensive private-public partnerships with network operators and their investors.

That’s quite a legacy for a project that, at least on paper, looks like a failed experiment. And it’s yet another example of the very different rules that apply in the growing list of industries being rapidly transformed by digital change.
https://hbr.org/2018/09/why-google-f...essful-failure





Brett Kavanaugh's Net Neutrality Views Could have a Broad Impact if he Joins the Supreme Court

The Trump nominee's positions on abortion and presidential authority have made headlines. But his ruling on internet access could hurt many Americans.
Gigi Sohn

Most critiques of the nomination of Judge Brett Kavanaugh, President Trump’s nominee to replace Justice Anthony Kennedy on the Supreme Court, focus on his positions on a woman’s right to choose, his extreme deference to presidential power or his views on sensible gun laws.

But, as the Senate Judiciary Committee begins its hearings on the nomination, let me add: His decided opposition to net neutrality and any oversight of big broadband and cable companies like Comcast, AT&T and Verizon represent another incredibly problematic aspect of his judicial rulings that could have a broad impact on Americans for decades to come.

And, unlike issues like abortion, where Judge Kavanaugh has never issued an opinion to which he might easily be held to task, he made his views on the 2015 net neutrality rules unambiguous in 2017.

In a long dissent when the full D.C. Circuit Court of Appeals declined to even rehear opponents' arguments that the Federal Communications Commission didn't have the authority to regulate broadband providers as "common carriers," Judge Kavanaugh made it clear that he believes that big broadband and cable companies should be able to control your Internet experience as they see fit.

First, Kavanaugh stated that, because the net neutrality rules and the FCC’s decision to classify broadband providers as common carriers were “one of the most consequential regulations ever issued by any executive or independent agency in the history of the United States,” they were what he called “major rules” that need express and unambiguous Congressional authority, which he finds lacking.

But Kavanaugh’s arguments were significantly flawed. The Supreme Court really hasn’t adopted a “major rules” doctrine - he cobbled one together from bits and pieces of law review articles and examples from prior cases.

His theory on "major rules" also directly contradicted Supreme Court precedent: In a 2005 decision, the court ruled that the FCC has the discretion to decide how and under what part of the Communications Act broadband providers should be regulated. And conservative hero Justice Antonin Scalia said in a dissent to that decision that the Communications Act was clear and unambiguous that broadband providers are common carriers that are barred from engaging in discrimination.

Second, Kavanaugh's dissent espoused a radical view of the First Amendment which, not surprisingly, favored broadband and cable providers and ignored your right to a free and open Internet.

Kavanaugh first argued that companies (which provide nothing more than an on-ramp to the internet) should have the same First Amendment rights as newspaper publishers — that, essentially, their right to choose to slow down or speed up the parts of the internet you might access is on a par with the right to a free press, or with your own right to speak free of government interference.

Further, he argued that the government’s interest in ensuring an open, free and non-discriminatory internet for all Americans is not a substantial one that justifies the net neutrality rules — in other words, that there's no reason the government should want to protect the free and open internet more than your broadband and cable company’s ability to decide what you see.

The fact of the matter is that there are nearly 50 years of Supreme Court precedent suggesting that the public’s right to speak and be heard is preeminent when weighing First Amendment rights in communications media. And, in the absence of net neutrality, companies can clearly limit the rights of their customers to communicate in the online space.

According to Kavanaugh, the only way the net neutrality rules could be justified under the First Amendment would be if broadband providers have “market power.” But he believes that there is “vibrant competition” in the market for internet access, and as a result, the rules are unconstitutional.

Many Americans, who have but one broadband choice, could have told him that isn't true. The FCC’s own numbers show that Americans in more than 40 percent of census blocks have a “choice” of just one broadband provider and, in more than 70 percent of census blocks, there is a choice of no more than two.

What’s most dangerous about Kavanaugh’s views on government oversight of corporations and the First Amendment is that they threaten most, if not all, constraints on broadband internet access providers’ anti-consumer and anti-competitive behavior. Broadband privacy rules wouldn’t likely pass the test laid out by Kavanaugh in his dissent; likely neither would an FCC prohibition against broadband providers throttling first responders, which Verizon did to the Santa Clara County Central Fire Department during the Mendocino Complex fire, the largest in California history.

While Kavanaugh’s views are already far outside of the mainstream from his current perch, there is reason to be very concerned about what could happen if he is confirmed to the high court. A Supreme Court with Justices Kavanaugh and Gorsuch will tilt to the far right, where eliminating sensible protections for consumers and competition, as well as giving cable and broadband providers the same constitutional rights as ordinary Americans, could become the norm in the not too distant future. Among the factors that ought to disqualify Kavanaugh from confirmation, this one shouldn't be ignored in the hearings.
https://www.nbcnews.com/think/opinio...act-ncna906086





BitTorrent Embraces Streaming Torrents and Takes uTorrent Web Out of Beta
Mark Wycislik-Wilson

Acknowledging that we are now very much in the streaming age, BitTorrent has launched the first version of µTorrent Web. The aim of the browser-based tool is to make torrenting as simple as possible and -- most importantly -- support torrent streaming.

It remains to be seen how many people are willing to switch from a dedicated app to a browser-based torrenting experience, but the promise that you can "play while you download, no more staring at progress bars" is certainly alluring.

Files are streamable near-instantly as they download, but they are also saved locally in the way you're used to. µTorrent Web is available for Chrome, Firefox, Internet Explorer, Microsoft Edge and Opera, and the release finds BitTorrent partnering with Adaware (see below for one of the consequences of this) to check torrents for signs of malware. The client also lets you download torrents without having to visit torrent websites.
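BitTorrent hasn't published µTorrent Web's internals, but clients that support playback during download typically replace pure rarest-first piece selection with a hybrid that prioritizes pieces just ahead of the playback position. A minimal sketch of that idea — all names and the window size are hypothetical, not BitTorrent's actual code:

```python
# Streaming-friendly piece selection: prefer pieces in a window just ahead
# of the playback position so the player never stalls, and fall back to
# rarest-first for everything else.

def next_piece(missing, playback_piece, rarity, window=8):
    """Pick the next piece index to request.

    missing        -- set of piece indices not yet downloaded
    playback_piece -- index the media player is currently reading
    rarity         -- dict mapping piece index -> number of peers holding it
    window         -- how far ahead of playback to fetch strictly in order
    """
    # Urgent: anything inside the playback window, earliest first.
    urgent = [p for p in missing
              if playback_piece <= p < playback_piece + window]
    if urgent:
        return min(urgent)
    # Otherwise fall back to rarest-first for the rest of the file.
    return min(missing, key=lambda p: rarity[p])

rarity = {3: 9, 5: 7, 20: 1, 40: 2}
print(next_piece({3, 5, 20, 40}, playback_piece=4, rarity=rarity))  # 5
print(next_piece({20, 40}, playback_piece=4, rarity=rarity))        # 20
```

The first call returns 5 because it falls inside the playback window; the second has no urgent pieces left, so the rarest remaining piece (20) wins. Real clients layer peer bandwidth, endgame mode, and buffering estimates on top of this.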

BitTorrent says of µTorrent Web:

µTorrent Web is a Web-based torrent client for Windows that meets the streaming demands of today's users. Available for Windows across all major browsers, µTorrent Web makes it possible to quickly download and play torrent files inside the browser. With a simple download to play experience as the focal point of µTorrent Web, we see more users successfully downloading and playing torrents than with any other product in BitTorrent's history.

Speaking in a video about the new product, µTorrent Web product designer Cory Keller talks about the new version of the torrenting tool.

You can download the Windows app right now, and BitTorrent promises that a Mac version is in the works.

It's worth noting that the installer includes (optional) bundleware in the form of Adaware Internet Security and the Opera web browser, so keep an eye out for that.
https://betanews.com/2018/09/05/utorrent-web/





Tor Browser Gets a Redesign, Switches to New Firefox Quantum Engine

Tor Browser finally updated to use new-and-improved Firefox Quantum codebase. This includes new Photon UI.
Catalin Cimpanu

Two major browsers have received redesigns of their frontend user interface (UI) this week -- Google Chrome on Tuesday, and the Tor Browser yesterday.

After Chrome updated its UI for the first time in ten years, the Tor Browser has also rolled out a new interface with the release of v8, on Wednesday.

The Tor Browser has always been based on the Firefox codebase, but it has typically lagged a few releases behind. Mozilla rolled out a major overhaul of the Firefox codebase in November 2017, with the release of Firefox 57, the first release in the Firefox Quantum series.

Firefox Quantum came with a new page rendering engine, a new add-ons API, and a new user interface called the Photon UI.

Because these were major, code-breaking changes, it took the smaller Tor team some time to integrate all of them into the Tor Browser codebase and make sure everything worked as intended.

The new Tor Browser 8, released yesterday, is now in sync with the most recent version of Firefox, the Quantum release, and also supports all of its features.

This means the Tor Browser now uses the same modern Photon UI that current Firefox versions use, supports the same speed-optimized page rendering engine, and has dropped the old XUL-based add-ons system in favor of the new WebExtensions API used by Chrome, Opera, Vivaldi, Brave, and the rest of the Chromium browsers.

But there are Tor Browser-specific changes as well. The biggest of these is that the Tor team has revamped the onboarding screen that appears the first time users install and run the browser.

This screen, the Tor Browser team says, has been simplified to help new users set up a proper (and safe) connection to the Tor network from the get-go. Since most users of the Tor Browser use it for the privacy and anonymity it provides, and since the first-run setup previously contained lots of technical terms, this was a crucial and most welcome redesign.

But from all the changes to the onboarding system, the one that stands out is the modification to the "request bridge" mechanism.

"For users where Tor is blocked, we have previously offered a handful of bridges in the browser to bypass censorship. But to receive additional bridges, you had to send an email or visit a website, which posed a set of problems," the Tor Project explained yesterday.

"To simplify how you request bridges, we now have a new bridge configuration flow when you launch Tor. Now all you have to do is solve a captcha in Tor Launcher, and you'll get a bridge IP.

"We hope this simplification will allow more people to bypass censorship and browse the internet freely and privately."

The second biggest change was to the Tor Circuit button. This allows users to randomly change the path of servers through which their data travels across the Tor network.

Previously, this required pressing the "Onion button" and selecting a vaguely worded option. In Tor Browser 8, users only have to click the URL info icon and click on the giant blue button that says "New Circuit for this Site."

Users who didn't like the new Firefox UI and took refuge in the Tor Browser offshoot will surely not like this redesign either, but a privacy-focused browser like the Tor Browser needs to keep its codebase updated against the latest exploits and bugs. The v8 release is therefore a welcome upgrade, and one users shouldn't ignore just because they dislike the new UI.

Tor Browser 8 is based on the Firefox ESR 60 version. The current Firefox version is v62, which was also released yesterday.
https://www.zdnet.com/article/tor-br...uantum-engine/





SKS Keyservers Being Used as Piracy Sites
Yakamo K

I was recently poking around an SKS keyserver dump of more than 5 million PGP keys to see if there's anything fun stored in them. I also wanted to inspect what the recent attacks on the vulnerable keyservers left behind.

Just from a glance at the data I was looking at, I think it's easy to say that more than 50% of it is junk from people messing around.

The most interesting find is the use of magnet links in UIDs. I found one old magnet link that did not work, and a small collection of links that appeared to still be active.

I suppose it's a pretty clever way to use the keyservers, as the data is distributed automatically and cannot be removed. A pirate's wet dream!

If you're curious about searching through SKS keyserver dumps, you can use pgpdump or python-pgpdump. Please let me know if you find anything interesting.
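As an illustration of what such a search might look like, here is a minimal sketch that greps decoded dump text for magnet URIs with a regular expression, rather than properly parsing the OpenPGP packets as pgpdump would. The regex and the sample UID are illustrative only:

```python
# Scan raw keyserver-dump text for magnet URIs embedded in UID strings.
# A real scan would parse the OpenPGP packets first; a plain regex over
# the decoded text is enough to illustrate the idea.
import re

# BitTorrent info-hashes are 40 hex chars or 32 base32 chars; grab the
# rest of the URI up to whitespace, quotes, or a closing angle bracket.
MAGNET_RE = re.compile(r"magnet:\?xt=urn:btih:[0-9A-Za-z]{32,40}[^\s\"'>]*")

def find_magnet_links(text):
    """Return the unique magnet links found anywhere in the given text."""
    return sorted(set(MAGNET_RE.findall(text)))

# A made-up UID string of the kind described above:
sample_uid = 'uid "totally legit <magnet:?xt=urn:btih:' + "a" * 40 + '&dn=example>"'
print(find_magnet_links(sample_uid))
```

Running this over a multi-gigabyte dump would mean streaming it in chunks rather than loading it whole, but the matching logic stays the same.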
https://medium.com/@mdrahony/sks-key...s-59ce5144101f





Get Ready for Atomic Radio

Using a laser to detect the effect of radio waves on certain atoms is the basis for a new kind of antenna that resists interference and can receive a wider range of signals.

The basic design of the radio antenna hasn’t changed in a century. The antenna is usually a set of metal rods roughly half the size of the wavelength they are designed to receive. The electric field in a passing radio wave accelerates electrons inside these rods, converting energy from the wave into a tiny electrical current that can be amplified.

But physicists would dearly love to make antennas more capable and more secure. It would be good, for example, if simple antennas could receive a wider range of wavelengths and be more resilient to electromagnetic interference.

Enter David Anderson at Rydberg Technologies in Ann Arbor, Michigan, and a couple of colleagues, who have reinvented the antenna from scratch. Their new device works in an entirely different way from conventional antennas, using a laser to measure the way radio signals interact with certain types of atoms.

The secret sauce in the new device is Rydberg atoms. These are cesium atoms in which the outer electron is so excited that it orbits the nucleus at great distance. At these distances, the electron's energy levels are extremely closely spaced, and this gives the atoms special properties. Indeed, even a small electric field can nudge the electron from one level to another.

Radio waves consist of alternating electric fields that readily interact with any Rydberg atoms they come across. This makes them potential sensors.

But how to detect this interaction? A gas made of Rydberg atoms has another property that turns out to be useful—it can be made transparent by a laser tuned to a specific frequency. This laser essentially saturates the gas’s ability to absorb light, allowing another laser beam to pass through it.

However, the critical frequency at which this happens depends crucially on the properties of the Rydberg atoms in the gas. When these atoms interact with radio waves, the critical frequency changes in response.

That’s the basis of the radio detection. Anderson and co create a gas of cesium atoms excited into Rydberg states. They then use a laser tuned to a specific frequency to make the gas transparent.

Finally, they shine a second laser through the gas and measure how much light is absorbed, to see how the transparency varies with ambient radio waves.

The signal from a simple light-sensitive photodiode then reveals the way the radio signals are frequency modulated or amplitude modulated.

And that’s it: an antenna consisting of a cloud of excited cesium atoms, zapped by laser light that flickers in time to any ambient radio waves. They call it atomic radio.
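The striking part is that no demodulation circuitry sits between the atoms and the audio. A toy numerical model (an illustration only, not the team's actual signal chain) shows why: if the probe-laser transmission is assumed to track the slowly varying amplitude of the RF field, the photodiode voltage traces the AM envelope, i.e. the audio waveform, directly:

```python
import math

# Toy model of "atomic" AM reception. Assumption: probe transmission is
# linear in the RF field's envelope over the operating range. The atoms
# respond to the field amplitude, so the photodiode output already *is*
# the audio waveform -- no mixer or demodulator stage in the chain.
def am_envelope(t, audio_hz, depth=0.5):
    return 1.0 + depth * math.sin(2 * math.pi * audio_hz * t)

def photodiode_samples(audio_hz=1000.0, rate=48000, n=48):
    # n=48 samples at 48 kHz covers one full period of a 1 kHz tone.
    return [am_envelope(i / rate, audio_hz) for i in range(n)]

samples = photodiode_samples()
# The recovered waveform swings between 0.5 and 1.5 around a baseline of 1.0.
print(round(min(samples), 3), round(max(samples), 3))
# -> 0.5 1.5
```

In a conventional receiver, the same envelope would have to be recovered from the carrier by dedicated mixing and filtering electronics; here it appears for free in the optical readout.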

Anderson and co have put their device through its paces using microwaves and say it works well. “We demonstrate an atom-based receiver for AM and FM microwave communication,” they say.

Among its advantages over conventional antennas is the huge range of signals it can detect—over four octaves from the C band to the Q band, or wavelengths from 2.5 to 15 centimeters. The antenna itself is a small vapor cell that can create and hold cesium gas excited into Rydberg atoms.

But perhaps most revolutionary is that the detection does not involve conventional radio circuitry. “The atomic radio wave receiver operates by direct real-time optical detection of the atomic response to AM and FM baseband signals, precluding the need for traditional de-modulation and signal-conditioning electronics,” say Anderson and co.

That means the device should be more or less insensitive to the kind of electromagnetic interference that can render conventional antennas useless.

To test the device, the team have used it to receive AM and FM microwave signals of a recording of a human voice singing “Mary Had a Little Lamb.” “The demonstrated atomic radio exhibits good performance over the entire human audio band,” they say.

The new antenna is not perfect. For example, its dynamic range is a little less than is usually expected of a radio receiver. But the team is optimistic that it can be significantly improved.

Atomic radios are on their way.
https://www.technologyreview.com/s/6...-atomic-radio/





More than 1 in 4 American Users have Deleted Facebook, Pew Survey Finds
Hamza Shaban

Nearly three-quarters of American Facebook users have changed how they use the social media app in the past year, following a barrage of scandals involving the abuse of personal data, foreign interference in U.S. elections and the spread of hateful or harassing content on the platform.

The findings were released Wednesday in a new survey by the Pew Research Center, the same day that Facebook chief operating officer Sheryl Sandberg is testifying before a Senate panel hearing to discuss how the company is combating foreign interference on the platform similar to the Russia-linked efforts seen during the 2016 election.

The survey revealed that 74 percent of U.S. adult Facebook users have taken at least one of the following actions: changed their privacy settings, taken a break from the app or deleted it altogether.

Pew found that more than 1 in 4 Americans have deleted the app from their phones. Fifty-four percent tweaked their privacy settings, and 42 percent stopped using the app for several weeks or longer. Those interventions were also much more likely to have been taken by younger people, who outpaced older users in each of the three actions. For instance, 64 percent of 18- to 29-year-olds changed their privacy settings in the past year, compared with 33 percent of people aged 65 and older.

Pew conducted the research between May 29 and June 11, surveying 4,594 people.

Facebook said in a statement to The Washington Post that users manage their information through the app's privacy controls every day. "Over recent months we have made our policies clearer, our privacy settings easier to find and introduced better tools for people to access, download, and delete their information. We've also run education campaigns on and off Facebook to help people around the world better understand how to manage their information on Facebook."

While the survey suggests large portions of Americans are abandoning the platform or scaling back their usage, Facebook reported stable daily active user numbers in its most recent earnings report. Analysts have said, however, that the company may face challenges in acquiring new users in mature markets such as the United States and Europe. Facebook said that 185 million users are on the platform in the United States and Canada every day, unchanged from last quarter. Most of Facebook's user growth now comes from Asia.

Debra Aho Williamson, an analyst at eMarketer, said that the survey rings true with the public backlash over Facebook's data privacy scandals and with continued concerns over false news reports, election meddling, and negativity on the platform.

"It does show that consumers have a heightened awareness of privacy and how social media companies use their data. People are getting fed up with the idea that they may not have as much control as they think they do," she said. "There is an undercurrent of people feeling like they are not sure social media is positive for them and if it is a good use of their time."

Williamson noted that other research has not supported the case that Facebook is shedding users and that it's possible users who have shunned the app later returned to it.

"Surveys are a good barometer of how people are feeling, but at the end of the day, it's really hard to let go completely," she said. "If you take a break you feel like you missed something."
http://www.courant.com/consumer-revi...905-story.html





Facebook, Twitter Execs Admit Failures, Warn of 'Overwhelming' Threat to Elections
Dell Cameron

Openly recognizing their companies’ past failures in rare displays of modesty, Facebook and Twitter executives touted new efforts to combat state-sponsored propaganda across their platforms before the Senate Intelligence Committee on Wednesday, acknowledging that the task is often “overwhelming” and proving a massive drain on their resources.

Despite frequent and contradictory remarks by President Donald Trump, America’s top national security officials have continued to warn of ongoing foreign influence operations aimed at the 2018 and 2020 U.S. elections. Weeks ago, FBI Director Christopher Wray said that U.S. officials had been targeted using traditional tradecraft, and that the bureau had detected criminal efforts to suppress voting and provide illegal campaign contributions.

Among other tactics employed by foreign rivals, senior officials at FBI, Homeland Security, and U.S. Cyber Command cited open-ended efforts to spread disinformation on social media, directly targeting U.S. voters, as well as ongoing cyberattacks against the nation’s voting infrastructure. “Our adversaries are trying to undermine our country on a persistent and regular basis,” said Wray, “whether it’s election season or not.”

In opening remarks on Wednesday, Facebook’s chief operating officer, Sheryl Sandberg, acknowledged that Facebook had been “too slow to act” in 2016 against the Kremlin-backed campaign that was designed to sow discord among American voters. “That’s on us,” she said, describing Moscow’s meddling as “completely unacceptable” and a violation of Facebook’s values “and of the country we love.”

“We’re investing for the long term because security is never a finished job,” Sandberg added, noting that Facebook has increased its security and communications staff to 20,000 people, doubling it over the past year. “Our adversaries are determined, creative, and well-funded,” she said. “But we are even more determined—and we will continue to fight back.”

Twitter CEO Jack Dorsey, meanwhile, portrayed the matter as not just a threat to democracy, but as a threat to the overall health and security of his business, saying that above all else, Twitter’s goal is to serve a “global public conversation.” Dorsey also acknowledged a range of threats faced by his company, including widespread abuse, manipulation by foreign powers, and “malicious automation” (i.e., bots).

“Any attempts to undermine the integrity of our service is antithetical to our fundamental rights,” he said, calling freedom of expression a “core tenet” upon which Twitter is based.

Google, which was also asked to appear before the committee, was chided by Democrats and Republicans alike for declining to send one of its top executives. Instead of offering to send CEO Sundar Pichai or Google co-founder Larry Page, the company offered Kent Walker, its senior vice president of global affairs, who published his would-be “testimony” on his blog on Tuesday. An empty chair was left at the table next to Sandberg and Dorsey to signify Google’s absence.

“Given its size and influence, I would have thought the leadership at Google would want to demonstrate how seriously it takes these challenges and to lead this important public discussion,” said Sen. Mark Warner, the committee’s vice chairman.

Sen. Marco Rubio, Republican of Florida, likewise took shots at Google for skipping the hearing. “They’re not here today. Maybe it’s because they’re arrogant,” he said, also suggesting that Google may have avoided the hearing due to a report published by BuzzFeed News late Tuesday evening, outlining successful efforts by alleged Kremlin-linked trolls to purchase advertisements on the websites of several media brands, including CNN, HuffPost, and the Daily Beast.

Recognizing that technology often evolves too quickly for regulators to keep pace, Chairman Richard Burr said that recent arguments over whether social media companies should be considered “content publishers” were counterproductive. “I think that ambiguity has given rise to something of a convenient identity crisis, whereby judgments about what is and isn’t allowable on social media are too episodic, too reactive, and too unstructured,” he said.

“The information your platforms disseminate changes minds. It hardens opinions. It helps people make sense of the world,” Burr continued. “When you control that, or even influence a little of it, you’re in a position to win wars without firing a shot. That’s how serious this is.”

Questioned by Sen. Ron Wyden, Democrat of Oregon, over whether Facebook or Twitter had seen evidence that Russia or Iran had “supported, coordinated, or attempted to amplify” any online hoaxes, Dorsey and Sandberg seemed unsure of what precisely constitutes a hoax. “We certainly have evidence to show they have utilized our systems and gamed our systems to amplify information,” Dorsey responded. “I’m not sure—in terms of the definition of hoax, in this case—but it is likely.”

Sandberg recalled Facebook’s takedown of 650 pages and accounts linked to Iran two weeks ago, saying some had been linked to state-owned media. “Some of them were pretending to be free press and they weren’t free press,” she said. “It depends how you define a ‘hoax,’ but I think we’re certainly seeing them use misinformation campaigns.”

Twitter likewise removed 284 accounts late last month linked to Iran, for engaging in what it called “coordinated manipulation.”

When asked by Sen. Susan Collins, Republican of Maine, what steps Twitter had taken to notify users who had been potentially duped by fake accounts imitating Americans—naming specifically the defunct Twitter account that masqueraded as the Tennessee Republican Party—Dorsey acknowledged that his company had dropped the ball in this area.

“We simply haven’t done enough,” he said. “In this particular case, we didn’t have enough communication going out, in terms of what was seen and what was tweeted and what people were falling into.”

“We need to meet people where they are,” added Dorsey. “If we determine that people were subject to any falsehoods or manipulation of any sort, we do need to provide them the full context of that and this is an area of improvement for us and something we’re going to be diligent to fix.”

Dorsey will also appear later today before the House Energy and Commerce Committee, where he’s expected to firmly refute claims by Republican lawmakers that Twitter has censored accounts expressing right-wing views.

Additional reporting by Bryan Menegus, Rhett Jones.
https://gizmodo.com/facebook-twitter...whe-1828829221





Five Eyes Threaten to Force Encryption Backdoors: 'Privacy is Not Absolute'

The Five Eyes intelligence alliance issued an encryption ultimatum to tech companies and device makers.
Ms. Smith

“Privacy is not absolute,” said Five Eyes (US, UK, Canada, Australia and New Zealand) and tech companies will either have to give Five Eyes access to encrypted data, communications and devices or else.

Or else what? According to the recently issued Statement of Principles on Access to Evidence and Encryption, it seems the government intelligence alliance is ready to bring the pain by pursuing “technological, enforcement, legislative or other measures to achieve lawful access solutions.”

That is but one statement that came after government representatives of the five countries met in Australia at the end of August. None of the statements issued specifically mention “backdoors;” in fact, Five Eyes calls encryption “vital to the digital economy, a secure cyberspace and the protection of personal, commercial and government information. The five countries have no interest or intention to weaken encryption mechanisms.”

But from there, it’s the same old song and dance. Baddies – such as criminals and terrorists – use encryption and end-to-end encryption, and keeping intelligence and law enforcement from accessing their encrypted data and communications makes it difficult “to protect” communities. Giving intelligence agencies the ability to access encrypted data is for the children! Tech companies, device manufacturers and carriers have a “mutual responsibility” to “assist authorities to lawfully access data, including the content of communications.”

“Privacy laws must prevent arbitrary or unlawful interference, but privacy is not absolute,” Five Eyes said. “The increasing gap between the ability of law enforcement to lawfully access data and their ability to acquire and use the content of that data is a pressing international concern that requires urgent, sustained attention and informed discussion on the complexity of the issues and interests at stake.”

Five Eyes was disappointed that digital industry CEOs were not interested in attending the meetings and put emphasis on “freedom of choice for lawful access solutions,” but not freedom of choice to prevent access to encrypted communications.

The Governments of the Five Eyes encourage information and communications technology service providers to voluntarily establish lawful access solutions to their products and services that they create or operate in our countries. Governments should not favor a particular technology; instead, providers may create customized solutions, tailored to their individual system architectures that are capable of meeting lawful access requirements. Such solutions can be a constructive approach to current challenges.

Should governments continue to encounter impediments to lawful access to information necessary to aid the protection of the citizens of our countries, we may pursue technological, enforcement, legislative or other measures to achieve lawful access solutions.


Each Five Eyes jurisdiction “will consider how best to implement the principles of this statement, including with the voluntary cooperation of industry partners. Any response, be it legislative or otherwise, will adhere to requirements for proper authorization and oversight, and to the traditional requirements that access to information is underpinned by warrant or other legal process.”

On top of all that, Five Eyes called upon the tech industry “to meet public expectations regarding online safety” by doing the following:

• Developing and implementing capabilities to prevent illegal and illicit content from ever being uploaded, and to execute urgent and immediate takedown where there is a failure to prevent upload.
• Deploying human and automated capabilities to seek out and remove legacy content.
• Acting on previous commitments to invest in automated capabilities and techniques (including photo DNA tools) to detect, remove and prevent re‑upload of illegal and illicit content, as well as content that violates a company's terms of service.
• Prioritizing the protection of the user by building user safety into the design of all online platforms and services, including new technologies before they are deployed.
• Building upon successful hash sharing efforts to further assist in proactive removal of illicit content.
• Setting ambitious industry standards, and increasing assistance to smaller companies in developing and deploying illicit content counter-measures.
• Building and enhancing capabilities to counter foreign interference and disinformation.
• Preventing live streaming of child sexual abuse on all platforms.

https://www.csoonline.com/article/33...-absolute.html





Google Notifies People Targeted by Secret FBI Investigation

Dozens of people reported receiving an email from Google revealing a potential FBI investigation into people who purchased malware.
Lorenzo Franceschi-Bicchierai

At least dozens of people have received an email from Google informing them that the internet giant responded to a request from the FBI demanding the release of user data, according to several people who claimed to have received the email. The email did not specify whether Google released the requested data to the FBI.

The unusual notice appears to be related to the case of Colton Grubbs, one of the creators of LuminosityLink, a $40 remote access tool (or RAT) that was marketed to hack and control computers remotely. Grubbs pleaded guilty last year to creating and distributing the hacking tool to hundreds of people.

Several people on Reddit, Twitter, and on HackForums, a popular forum where criminals and cybersecurity enthusiasts discuss and sometimes share hacking tools, reported receiving the email.
[Image: A copy of the email, posted by a Reddit user.]

“Google received and responded to legal process issued by Federal Bureau of Investigation (Eastern District of Kentucky) compelling the release of information related to your Google account,” the email read, according to multiple reports from people who claimed to have received it.

The email included a legal process number. When Motherboard searched for it within PACER, the US government’s database for court case documents, it showed that it was part of a case that’s still under seal.

Despite the lack of details in the email, as well as the fact that the case is still under seal, it appears the case is related to LuminosityLink. Several people who claimed to have received the notice said they purchased the software. Moreover, Grubbs’ case was investigated by the same district mentioned in the Google notice.

Luca Bongiorni, a security researcher who received the email, said he used LuminosityLink for work, and only with his own computer and virtual machines.

The FBI declined to comment. Google did not respond to a request for comment. Lawyers that specialize in cybercrime told me that it’s not unusual for Google to disclose law enforcement requests when it is allowed to.

“It looks to me like the court initially ordered Google not to disclose the existence of the info demand, so Google was legally prohibited from notifying the user. Then the nondisclosure order was lifted, so Google notified the user. There's nothing unusual about that per se,” Marcia Hoffman, a lawyer who specializes in cybercrime, told Motherboard in an online chat. “It's common when law enforcement is seeking info during an ongoing investigation and doesn't want to tip off the target(s).”

What may be unusual and controversial is for the FBI to try to unmask everyone who purchased software that may not necessarily be considered illegal.

“If one is just buying a tool that enables this kind of capability to remotely access a computer, you might be a good guy or you might be a bad guy,” Gabriel Ramsey, a lawyer who specializes in internet and cybersecurity law, told Motherboard in a phone call. “I can imagine a scenario where that kind of request reaches—for good or bad—accounts of both type of purchasers.”
https://motherboard.vice.com/en_us/a...-investigation

















Until next week,

- js.



















Current Week In Review





Recent WiRs -

September 1st, August 25th, August 18th, August 11th

Jack Spratts' Week In Review is published every Friday. Submit letters, articles, press releases, comments, questions etc. in plain text English to jackspratts (at) lycos (dot) com. Submission deadlines are Thursdays @ 1400 UTC. Please include contact info. The right to publish all remarks is reserved.


"The First Amendment rests on the assumption that the widest possible dissemination of information from diverse and antagonistic sources is essential to the welfare of the public."
- Hugo Black