P2P-Zone  

Old 26-09-18, 07:01 AM   #1
JackSpratts
 
 
Join Date: May 2001
Location: New England
Posts: 10,013
Peer-To-Peer News - The Week In Review - September 29th, ’18

Since 2002


September 29th, 2018




Russian Communications Regulator Blocks Biggest Sites For Pirated Movies – Head
Daniyal Sohail

Russia's communications regulator Roskomnadzor has blocked about 6,000 illegal websites sharing links to pirated movies in recent years, the watchdog's head Alexander Zharov said on Monday.

"The Russian internet used to be a real piracy haven a few years ago ...

Now the situation has changed radically as 6,000 [piracy] websites were blocked, and 11,000 [websites] removed [illegal] content ... The largest piracy websites are now blocked," Zharov said at a meeting with Russian President Vladimir Putin.

The law to tackle online piracy in Russia by cutting off access to websites found to be pirating movies was introduced in August 2013.

The legislation was updated in May 2015 and expanded to cover websites sharing pirated music, books and software.
https://www.urdupoint.com/en/technol...ge-439358.html





Judge Orders Cloudflare to Turn Over Identifying Data in Copyright Case

Copyright trolls want personal information of operators behind ShowBox, Popcorn Time, and YTS websites
Cal Jeffrey

In context: Back in May, several studios started targeting movie-pirating sites and services. Dallas Buyers Club, Cobbler Nevada, Bodyguard Productions, and several other copyright owners filed a lawsuit against ShowBox, a movie-streaming app for mobile devices.

The companies tried pressuring CDN and DDoS protection provider Cloudflare into releasing information on the operators of some of these platforms. However, Cloudflare told them if they wanted such information they would have to get it the right way — through legal action.

The plaintiffs did just that. A subpoena was issued in the case from a federal court in Hawaii. The documents were not made public, but TorrentFreak was able to obtain a portion of the subpoena from a source.

The court order demands the details of the operators behind the Showboxbuzz website, Showbox.software, the Rawapk website, Popcorn Time, and others. Cloudflare has not filed a motion to quash, so it appears likely that the company will hand over the requested data.

TorrentFreak notes that other than ShowBox, the targets listed in the subpoena are not named in the original lawsuit from May. It surmises that these sites and services might have links to the defendants in the legal action.

Even if Cloudflare turns over identifying information to the courts, it is highly likely that it will be of little use. Most pirate site operators go to great lengths to mask their identities. Rarely does one information request result in a direct link to the real operator.

In fact, one of the plaintiffs in the suit, Cobbler Nevada LLC, had a lawsuit shot down twice (once in appeals court) back in August because the judge ruled that an IP address was not sufficient to establish identity.

It is also worth mentioning that Cobbler and other plaintiffs in the lawsuit are known copyright trolls.
https://www.techspot.com/news/76544-...copyright.html





Sky TV Taking Pirate Sites to Court

Sky TV is stepping up its attack on piracy. The pay-TV operator says it will start legal action against some of the biggest websites offering pirated content before the end of the year.

Today it released research showing 30 percent of New Zealand adults regularly pirate content and 10 percent do it on a weekly basis. Mark Jennings reports.


The bosses at Sky TV have had a feeling that pirates were costing the company serious money. Now they know it’s the case.

Three hundred thousand New Zealanders are regularly watching illegal streams of sports. The figure comes from a major research project Sky commissioned from Research Now in May this year.

The Navigators surveyed 1009 people aged 18 and over.

It found 50 percent had viewed pirated content at some time in the past and nearly a third of the population do it regularly or at least every six months.

Most (81 percent) gave “it’s not available in New Zealand” as the reason for their digital piracy. Nearly 70 percent said not being able to afford to pay for the content was a key reason for illegally downloading or streaming it.

Sophie Moloney, Sky TV’s general counsel, accepts that cost is a genuine issue.

“We have been trying to grapple with this problem and it is one of the reasons we have created a Neon package for $11.99. Socioeconomic factors do come into it but there is also a problem with higher income earners and these are the people we really want to have a conversation with.”

Moloney, who says her experience working in the UK and the Middle East has made her passionate about “content protection”, wants people to understand the impact piracy has on the creative industry.

“People say it is only Hollywood, but they don’t understand that there are a lot of people involved behind the scenes in producing content.”

She cites Game of Thrones as an example of what it takes to get a series to air.

“With GOT there were 3589 people involved in getting the first episode from page to screen. With every All Blacks game 80 of our people are involved (getting it to air).”

Moloney says people also need to realise that there are “criminals behind these websites” and it is risky to download movies and other content.

“The pirate sites make money in a number of ways, but they also use spyware, malware and ransomware. I saw this when I worked in the Middle East. People paid the ransom, or they were never going to see their kids’ photos that had been locked up, ever again.

“When you get material from these sites you don’t know what else you are getting.”

Moloney said Sky had not quantified how much money it was losing from people pirating material but thought it would be “in the millions.” It planned to do some impact analysis later this year.

Sky knows that it is fighting what has become “normalised” behaviour for a lot of people.

The research showed that most pirates know others who watch pirated content and feel it is an easy thing to do. Still, Moloney believes the “group behaviour” also opens up an opportunity.

“If we can get one person in a group (of pirates) to change their thinking, that could have a significant impact.

“For instance, people wouldn’t go into a shop stocking a New Zealand clothing brand and steal something every couple of months, so we need to interrupt their thought process when they are looking at pirated content.”

The “conversation” Moloney says she wants to have with people will be kick-started when Sky takes legal action in the next few months.

It has two initial targets: the big international operation The Pirate Bay and a sports streaming site it declined to name.

Moloney said Sky believes the Copyright Act is being breached and a High Court ruling in its favour would mean it could get ISPs to block the pirate websites.

“There would certainly be significant upfront legal costs for Sky but once the precedent is established it should be straightforward for the ISPs to block these sites and that includes the proxy and mirror sites of The Pirate Bay.

“The research shows it is time to take this forward and we are in the process of that now. We have already begun discussions with the ISPs.”

Site blocking has been effective in many overseas countries.

A Motion Picture Association study from 2016 revealed that 32 countries in Europe have legislation for blocking overseas websites and 15 countries have successfully had cases processed through the courts.

In the Asia Pacific region, South Korea has blocked 403 websites, Indonesia 215 and Australia 78.

MPA says that without piracy, box office revenue would have been 14 to 15 percent higher.

A UK case study shows “site blocking” had led to a 22 percent decrease in piracy for all users affected by the blocks and a corresponding increase in the use of legal streaming sites like Netflix and the BBC.

Moloney says Sky’s research has indicated strong support for “site blocking” in New Zealand.

Fifty-one percent of pirates and 68 percent of non-pirates would be happy for their ISPs to block piracy websites if it were a court requirement.

Sky is also trying a softer approach to go with the legal manoeuvring. It has licensed a video game called ‘Copycat Combat’ and is piloting an education programme called Kiwa in an Auckland primary school.
https://www.newsroom.co.nz/2018/09/2...sites-to-court





Illegal Streams, Decrypting m3u8's, and Building a Better Stream Experience
Jon Luca

Having not lived in the US for the majority of my life, I often needed to rely on illegal streams to watch American sports games. The experience on these streams is, to say the least, extremely poor. Most have some sort of crypto miner running in the background, as well as dozens of ads covering the stream. I wholeheartedly support ads for free content but unfortunately the sorts of ads that show up on these streams are terrible at best and malicious at worst.

I wanted to do some research and figure out if there was a way to quickly export the stream and watch it elsewhere.

Background

HTTP Live Streaming, or HLS, is the de facto live streaming standard. There’s a healthy open source community around building tools and infrastructure for HLS, most notably hls.js. HLS is designed for reliability and dynamically adapts to network conditions by optimizing playback for the available speed of wired and wireless connections. It’s a fairly complicated spec, but the core of it is as follows.

Live streams start at the input, which is usually either a live event or a static file that is being streamed. For live events, the server requires a media encoder, which can be off-the-shelf hardware, and a way to break the encoded media into segments and save them as files. These files are then categorized into a playlist file, usually with a file extension .m3u8.

The client software begins by fetching the index file, using a URL that identifies the stream. The index file, in turn, specifies the location of the available media files, decryption keys, and any alternate streams available. For the selected stream, the client downloads each available media file in sequence. Each file contains a consecutive segment of the stream. Once it has a sufficient amount of data downloaded, the client begins presenting the reassembled stream to the user. A playlist, example below, is just a collection of these ts links.

#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:9
#EXT-X-MEDIA-SEQUENCE:1808
#EXT-X-KEY:METHOD=AES-128,URI="<redacted>",IV=0xcbbc4b1952b18ab386984ab67d2df816,KEYFORMAT="identity",KEYFORMATVERSIONS="1"
#EXTINF:6.0,
media-uqjt9f59i_1808.ts
#EXTINF:4.0,
media-uqjt9f59i_1809.ts
#EXTINF:6.0,
media-uqjt9f59i_1810.ts
#EXTINF:4.0,
media-uqjt9f59i_1811.ts
#EXTINF:6.0,
media-uqjt9f59i_1812.ts
#EXTINF:4.0,
media-uqjt9f59i_1813.ts
#EXTINF:6.0,
media-uqjt9f59i_1814.ts
#EXTINF:4.0,
media-uqjt9f59i_1815.ts

Each of these ts files is the actual media chunk, either a part of an MP4, HEVC, or some other media format.


The client is responsible for fetching any decryption keys, authenticating or presenting a user interface to allow authentication, and decrypting media files as needed.

This process continues until the client encounters the EXT-X-ENDLIST tag in the index file. If no EXT-X-ENDLIST tag is present, the index file is part of an ongoing broadcast. During ongoing broadcasts, the client loads a new version of the index file periodically. The client looks for new media files and encryption keys in the updated index and adds these URLs to its queue.
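
For a live stream, then, a client just keeps re-fetching the playlist and queueing whatever segment URIs it hasn't seen before. A minimal sketch of that loop in plain JavaScript, ignoring CORS and the decryption step (the playlist URL and the poll interval are placeholders):

const PLAYLIST_URL = "https://example.com/stream/playlist.m3u8"; // placeholder
const seen = new Set();

async function poll() {
  const text = await (await fetch(PLAYLIST_URL)).text();
  const lines = text.split("\n").map((l) => l.trim());

  // A playlist containing EXT-X-ENDLIST is a finished broadcast.
  if (lines.includes("#EXT-X-ENDLIST")) {
    console.log("Broadcast ended, stopping.");
    return;
  }

  // Segment URIs are the non-blank lines that aren't tags.
  for (const line of lines) {
    if (line && !line.startsWith("#") && !seen.has(line)) {
      seen.add(line);
      console.log("New segment:", new URL(line, PLAYLIST_URL).href);
    }
  }

  // A real client would derive the poll interval from EXT-X-TARGETDURATION.
  setTimeout(poll, 6000);
}

poll();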

Extraction

This means that it should be fairly trivial to extract the .m3u8 (the live stream playlist format) from a livestream and just plug it into a site that’s nothing more than a wrapper for a <video> element.

I experimented with it for a bit, and found the fastest way to do so would be to go to the Network tab and just filter by m3u8. At that point you can just copy and paste the URL into any program that plays m3u8 and watch the stream (VLC and QuickTime both work).
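
The “wrapper around a <video> element” really is that small. A minimal sketch using hls.js, assuming the library is already loaded on the page and there is a <video id="video"> element (the playlist URL is a placeholder):

// Attach an .m3u8 playlist to a <video> element with hls.js.
const streamUrl = "https://example.com/stream/playlist.m3u8"; // placeholder
const video = document.getElementById("video");

if (Hls.isSupported()) {
  const hls = new Hls();
  hls.loadSource(streamUrl);
  hls.attachMedia(video);
  hls.on(Hls.Events.MANIFEST_PARSED, () => video.play());
} else if (video.canPlayType("application/vnd.apple.mpegurl")) {
  // Safari can play HLS natively, no hls.js needed.
  video.src = streamUrl;
  video.play();
}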

This works perfectly well for unobfuscated and simple streams. However, after trying a few streams like this, they stopped working. The URL and parameters would be exactly right, but VLC or any other player would not work.

The streamers had started using HTTP headers for verification.

Issues

All of the new streams would now have additional headers, with the most common one being the “Referer” header. If the Referer sent with the request did not match the one they were expecting, they would not respond with the playlist file. Some even used JWTs in HTTP-only cookies to make sure their stream wasn’t being ripped by others (ironic).

Unfortunately most common desktop players do not allow you to customize the cookies and headers sent with your request. I had to find another solution.
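
Outside the browser the Referer check itself is trivial to satisfy; the problem is only that stock players won't send the header for you. A quick sketch with Node's built-in https module (the URLs and Referer value are placeholders):

// Fetch a Referer-protected playlist from Node, where any header can be set.
const https = require("https");

const options = {
  headers: {
    // The site only returns the playlist when the Referer matches its player page.
    Referer: "https://example-stream-site.com/watch", // placeholder
    "User-Agent": "Mozilla/5.0",
  },
};

https.get("https://example.com/stream/playlist.m3u8", options, (res) => {
  let body = "";
  res.on("data", (chunk) => (body += chunk));
  res.on("end", () => console.log(body)); // the raw .m3u8 text
});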

I wanted to make a tool that would quickly extract the stream and start playing it, without needing to manually type in any headers, options, or cookies. Unfortunately, Chrome does not provide easy direct access to your network requests. Only devtools extensions have access to the chrome.devtools API. When you do have access to that API, though, you can export any network request in the HTTP Archive Format, which contains everything you need to exactly recreate the request.

I built a quick chrome extension that would do just that.
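
The interesting part of the extension is just the devtools page: chrome.devtools.network can hand back the whole HAR log, which can then be filtered for playlist requests and base64 encoded. A rough sketch of that piece (run from a devtools page declared in the extension's manifest):

// Grab the HAR log, keep the .m3u8 entries, and encode each one so it can be
// pasted into the wrapper page. Each entry carries the request URL, headers,
// and cookies needed to recreate the request.
chrome.devtools.network.getHAR((harLog) => {
  const playlists = harLog.entries.filter((entry) =>
    entry.request.url.includes(".m3u8")
  );
  for (const entry of playlists) {
    const token = btoa(JSON.stringify(entry));
    console.log(entry.request.url, token);
  }
});

// Requests made after devtools was opened arrive here as HAR entries too.
chrome.devtools.network.onRequestFinished.addListener((request) => {
  if (request.request.url.includes(".m3u8")) {
    console.log("live playlist request:", request.request.url);
  }
});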

Once I had the HAR blob, I could just base64 encode it and pass it to an HTML file I created that was a simple wrapper around hls.js, seen below.

function loadHar() {
  let token = $("#token").val();
  if (token) {
    try {
      let harString = atob(token);
      let har = JSON.parse(harString);
      console.log(har);
      let request = har.request;
      let streamUrl = request.url;
      let cookies = request.cookies;
      let headers = request.headers;
      loadVideo(streamUrl, cookies, headers);
    } catch (e) {
      console.error(e);
    }
  }
}
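
The loadVideo call above isn't shown in the post; a plausible sketch is an hls.js instance whose xhrSetup hook replays the captured headers and cookies on every playlist and segment request. Note that a browser silently drops forbidden headers like Referer or Cookie here, which is exactly the wall described in the next section:

// A sketch of loadVideo: replay the HAR entry's headers and cookies on every
// hls.js request. Forbidden headers (Referer, Cookie, ...) get dropped by the
// browser, so in a plain page this only helps for custom header checks.
function loadVideo(streamUrl, cookies, headers) {
  const video = document.getElementById("video");
  const hls = new Hls({
    xhrSetup: (xhr, url) => {
      // Open first so setRequestHeader is legal inside this hook.
      xhr.open("GET", url, true);
      for (const h of headers) {
        if (h.name.startsWith(":")) continue; // skip HTTP/2 pseudo-headers from the HAR
        xhr.setRequestHeader(h.name, h.value);
      }
      if (cookies.length) {
        // HAR cookies are name/value pairs; fold them into one Cookie header.
        xhr.setRequestHeader("Cookie", cookies.map((c) => `${c.name}=${c.value}`).join("; "));
      }
    },
  });
  hls.loadSource(streamUrl);
  hls.attachMedia(video);
  video.play();
}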


At this point, I could scan a page for playlist files, retrieve the HAR, encode it, and pass it onto my small static site. It still didn’t work though.

Chrome Security

Chrome refuses to set certain headers, one of which is ‘Referer’. Any time I tried to deconstruct the HAR and recreate the XMLHttpRequest it would warn me in the console that the headers I chose to set were unsafe, and refuse to apply them. There was no way around this, unfortunately. So I turned to the fastest way of keeping the work I had done so far while reaching a working solution - Electron.

Electron

Electron is a cross platform open source framework for creating desktop apps. While it is a little bloated (you’re effectively running Chrome + Node, which makes anyone with limited amounts of RAM shudder), it is effectively a way to run webapps on your desktop. The real plus is that it doesn’t suffer from the same restrictions as a static HTML page.

I started by just porting everything over to electron - it worked pretty much out of the box, displaying my custom site.

However, since Electron is based on Chromium, and I was making regular XMLHttpRequests, it still wouldn’t let me apply those headers. I’d have to use a Node library to make my HTTP requests, but the library I was using (hls.js) is only a browser library, and lacks a direct integration with Node.

Fortunately hls.js provides a way to overload their HTTP loader function. I didn’t really need a custom loader though - I just needed a version of XMLHttpRequest that didn’t need to go through Chromium.

Unsafe Clones

There are a few libraries out there that emulate XMLHttpRequest in Node. I found one that worked and imported it, and overloaded the custom loader with one that was exactly the same but used Node’s http module rather than Chrome’s.

Unfortunately this library also refused to set unsafe headers. This time it was easy to fix though - I could just fork it, and remove the code that checks for those headers! Thus, node-xhr2-unsafe was born (published [url=https://github.com/jonluca/node-xhr2-unsafe]here[/url] and on npm).
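
One way to wire the fork in - a simpler variant of the loader override described above, not necessarily the exact approach the post used - is to point the renderer's XMLHttpRequest at the Node-backed constructor before hls.js makes any requests. The require name below is assumed; check the package for the real one:

// Electron renderer (with nodeIntegration enabled): swap in the Node-backed
// XMLHttpRequest so hls.js requests go through Node's http module and can
// carry headers like Referer and Cookie.
const UnsafeXHR = require("xhr2-unsafe"); // assumed module name
window.XMLHttpRequest = UnsafeXHR;

const streamUrl = "https://example.com/stream/playlist.m3u8"; // placeholder
const hls = new Hls(); // its default loader now uses the swapped-in XHR
hls.loadSource(streamUrl);
hls.attachMedia(document.getElementById("video"));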

After all those changes, and an experience in yak shaving that would’ve made Donald Knuth proud, I got the streams working. I also added some options to make the viewing experience better, such as dark mode and forcing LIVE mode.

ESPN and Fox?

Within the last six months an interesting trend has arisen - the illegal streams now link directly to the actual ESPN, Fox, and CBS streams. If you inspect the network requests you’ll see the streamer is linking straight to these companies’ streams. The streams were beautiful quality and would hardly ever stutter or have the awkward “let me minimize the window I’m streaming and check my email” problem you’d get with bootlegged streamers.

How were they doing this? All the real providers use the encryption built into HLS to prevent illegal streams.

The streamers had done something much more clever. They were actually authenticating all these streams with valid keys. They did this by setting up an actual semi-proxy server to the official keyservers and wrapping the requests with the valid keys.

Let me explain.

Each playlist file has a field where you can go and retrieve the keys to decrypt the stream.

The streamers would override the XMLHttpRequest open function, and if the url was to the keyserver, they would send it to their proxies.

(function (open) {
  XMLHttpRequest.prototype.open = function (method, url, async, user, pass) {
    if (url.indexOf('media-api') != -1) {
      rewrittenUrl = url.replace("https://media-api.foxsportsgo.com/programs/", "http://<redacted>.000webhostapp.com/fskey.php?gth=");
    } else if (url.indexOf('media-api') != -1) {
      rewrittenUrl = url.replace("https://media-api.foxsportsgo.com/programs/", "http://<redacted>.000webhostapp.com/fskey.php?gth=");
    } else if (url.indexOf('playback.svcs.plus.espn.com') != -1) {
      rewrittenUrl = url.replace("https://playback.svcs.plus.espn.com/events/", "http://<redacted>.000webhostapp.com/espnplus.php?gth=");
    } else {
      rewrittenUrl = url;
    }
    open.call(this, method, rewrittenUrl, async, user, pass);
  };
})(XMLHttpRequest.prototype.open);

Code snippet taken directly from one of their streams


At this point I wanted to know what these streams were doing. Certain streams would Base64 encode the URL actually being requested and pass it on to their proxies.

I tried changing the URL to my personal site, and what do you know, it replied back with my site!

The “aHR0cHM6Ly9qb25sdS5jYQ==” in the URL is just https://jonlu.ca base64 encoded.

At this point I made an educated guess - it takes the URL passed in the URL parameter, applies the hard-coded authentication from the server, makes the request, then replies back with the results. However, the server isn’t currently checking whether the host it’s sending its cookies/headers to is the correct one (i.e. ESPN, Fox, CBS, etc.).

I built up a quick HTTP server that would just dump all the requests’ contents to stdout. I threw this up on my server at https://jonlu.ca/stream/, then base64 encoded that URL and sent it to their proxy key server.
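
The dump server needs almost nothing: accept any request, print the method, path, headers, and body, and reply with something harmless. A minimal sketch with Node's http module:

// Minimal request logger: prints everything about each incoming request,
// which is where the proxied session credentials showed up.
const http = require("http");

http
  .createServer((req, res) => {
    console.log(`${req.method} ${req.url}`);
    console.log(JSON.stringify(req.headers, null, 2));
    let body = "";
    req.on("data", (chunk) => (body += chunk));
    req.on("end", () => {
      if (body) console.log(body);
      res.writeHead(200, { "Content-Type": "text/plain" });
      res.end("ok");
    });
  })
  .listen(8080, () => console.log("listening on :8080"));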

I monitored my logs and all of a sudden there it was - the valid session for the streamer!

In the interest of privacy I’ve commented out partial credentials, but they’re entirely valid and work for any stream on ESPN.

Moving Forward

Unfortunately implementing reverse AES keyservers was a bit beyond the scope of what I wanted this weekend project to be. I settled for the Electron app I made - the best part is that this will stream directly to a Chromecast, or behave like a native video. There are 0 (non-stream) ads, and I have much more direct, low level control of the stream myself. This has worked for pretty much every live event I’ve watched online, including Apple’s keynotes.

I don’t think there’s much else to explore here - most streams are exactly reproducible at the touch of a button, and those that aren’t are so complicated that it’s not worth investing the effort. If you find a stream that you can’t seem to reverse I’d love to hear about it! Leave it in the comments or email me directly.
https://blog.jonlu.ca/posts/illegal-streams





SiriusXM to Acquire Pandora in $3.5 Billion Deal
Jem Aswad

SiriusXM will acquire Pandora in an all-stock transaction valued at approximately $3.5 billion, the companies announced Monday.

According to the announcement, the deal will create “the world’s largest audio-entertainment company,” with more than $7 billion in projected revenue in 2018 and more than 100 million monthly listeners, combining SiriusXM’s 36 million subscribers and Pandora’s 70 million-plus monthly active users. It also moves SiriusXM and its parent company, Liberty Media, aggressively into the streaming market.

The transaction is expected to close in the first quarter of 2019 and is subject to approval by Pandora stockholders; expiration or termination of any applicable waiting period under the Hart-Scott-Rodino Antitrust Improvements Act and certain competition laws of foreign jurisdictions; and other customary closing conditions.

In June 2017, after months of circling, SiriusXM struck a deal to invest $480 million in Pandora. The stock purchase gave the satellite radio company a 19% stake in Pandora, and included a provision to curb SiriusXM’s ability to take over the entire company by limiting SiriusXM from acquiring more than 31.5% of Pandora without the approval of the Pandora board.

SiriusXM’s takeover of Pandora has been unanimously approved by both the independent directors of Pandora and by the board of directors of SiriusXM, according to Monday’s announcement.

In a 45-minute conference call Monday morning, SiriusXM CEO Jim Meyer spoke of how the deal creates “the leading audio streaming company in the U.S., with over 70 million monthly users” and “a win-win for both companies,” singling out the fact that “Pandora’s advertising [business] is multiples bigger than SiriusXM’s.” Pandora has struggled in recent years due to competition from streaming services like Spotify, which is the world’s largest, and Apple Music, and only recently added an on-demand component to its offerings.

“This strategic transaction builds on SiriusXM’s position as the leader in subscription radio and a critically acclaimed curator of exclusive audio programming with the addition of the largest U.S. audio streaming platform,” the announcement reads in part. “Pandora’s powerful music platform will enable SiriusXM to significantly expand its presence beyond vehicles into the home and other mobile areas. Following the completion of the transaction, there will be no immediate change in listener offerings.”

In an interview at Goldman Sachs’ Communacopia conference earlier this month, Pandora CEO Roger Lynch gave no indication that the deal was in the works, saying only that since the equity purchase last year Liberty Media has been involved “on a board level and as an investor; they have not been involved at an operational level.”

According to the companies, the combined entity will drive long-term growth by:

• Capitalizing on cross-promotion opportunities between SiriusXM’s base of more than 36 million subscribers across North America and 23 million-plus annual trial listeners and Pandora’s more than 70 million monthly active users, which represents the largest digital audio audience in the U.S.
• Leveraging SiriusXM’s exclusive content and programming on Pandora’s ad-supported and subscription tiers to create unique audio packages, while also using SiriusXM’s automotive relationships to drive Pandora’s in-car distribution.
• Continuing investments in content, technology, innovation, and expanded monetization opportunities through both ad-supported and subscription services in and out of vehicles.
• Supporting and strengthening Pandora’s brand.
• Creating a promotional platform for emerging and established artists, curated and personalized in ways to deliver the most compelling audio experience that connects artists to their fan bases, as well as new listeners.

With regard to the combined company’s future relationship with music companies, Meyer pointed out that it will be paying some $2 billion in royalties annually to record labels. Questioned further, he said, “I can’t tell you how the record labels are going to behave, but Roger and I are hoping for a strong working relationship. We [represent] a sizable piece of money that goes to pay the audio ecosystem, and I believe we’ll find a [positive relationship].”

Meyer also acknowledged that despite SiriusXM’s deep penetration of the automobile audio market, a significant number of people do not ultimately pay to subscribe to the service, and “We would benefit from having a free funnel, and we will have a scaled user base of 65 million people providing us an opportunity to refer millions of users who do not subscribe to Sirius to the country’s largest free [digital] radio option.”

He also acknowledged that since SiriusXM acquired the 19% stake in Pandora, the two companies “have done nothing” thus far to create synergies. “We didn’t get to that phase of the relationship because we decided to let the Pandora management get the ship righted,” Meyer said. “Trying to overlay that was never our intention in the first year. I am very confident that we will successfully accomplish working together and finding places for one-plus-one.”

Pandora CEO Roger Lynch added, “There’s no shortage of ideas that Jim and I have had on things we can do together, everything from the content side to things we can bundle in the car.” He added that the SiriusXM team has been “very supportive of our strategy.”

In prepared statements announcing the deal, Meyer said: “We have long respected Pandora and their team for their popular consumer offering that has attracted a massive audience, and have been impressed by Pandora’s strategic progress and stronger execution. We believe there are significant opportunities to create value for both companies’ stockholders by combining our complementary businesses. The addition of Pandora diversifies SiriusXM’s revenue streams with the U.S.’s largest ad-supported audio offering, broadens our technical capabilities, and represents an exciting next step in our efforts to expand our reach out of the car even further. Through targeted investments, we see significant opportunities to drive innovation that will accelerate growth beyond what would be available to the separate companies, and does so in a way that also benefits consumers, artists, and the broader content communities. Together, we will deliver even more of the best content on radio to our passionate and loyal listeners, and attract new listeners, across our two platforms.”

In the same announcement, Lynch said: “We’ve made tremendous progress in our efforts to lead in digital audio. Together with SiriusXM, we’re even better positioned to take advantage of the huge opportunities we see in audio entertainment, including growing our advertising business and expanding our subscription offerings. The powerful combination of SiriusXM’s content, position in the car, and premium subscription products, along with the biggest audio streaming service in the U.S., will create the world’s largest audio entertainment company. This transaction will deliver significant value to our stockholders and will allow them to participate in upside, given SiriusXM’s strong brand, financial resources and track record delivering results.”

According to the announcement, the owners of the outstanding shares in Pandora that SiriusXM does not currently own will receive a fixed exchange ratio of 1.44 newly issued SiriusXM shares for each share of Pandora they hold. Based on the 30-day volume-weighted average price of $7.04 per share of SiriusXM common stock, the implied price of Pandora common stock is $10.14 per share, representing a premium of 13.8% over a 30-day volume-weighted average price. The transaction is expected to be tax-free to Pandora stockholders. SiriusXM currently owns convertible preferred stock in Pandora that represents a stake of approximately 15% on an as-converted basis.
https://variety.com/2018/biz/news/siriusxm-to-acquire-pandora-1202954221/





Satellite Company Partners with Bezos' AWS to Bring Internet Connectivity to the 'Whole Planet'

• Iridium is partnering with AWS to develop a satellite-based network called CloudConnect for Internet of Things (IoT) applications.
• "Now that Amazon has put our language into the cloud platform, they can extend their applications to the satellite realm," Iridium CEO Matt Desch told CNBC.
• The CloudConnect network will focus on "where cellular technologies aren't," Desch said.

Michael Sheetz

Iridium Communications announced a partnership with Amazon Web Services on Thursday, to develop a satellite-based network called CloudConnect for Internet of Things (IoT) applications.

"We're really covering the whole planet ... with terrestrial networks today it's still only 10 percent or 20 percent" of the Earth, Iridium CEO Matt Desch told CNBC. "Everybody today can connect pretty easily with very little effort. Now that Amazon has put our language into the cloud platform, they can extend their applications to the satellite realm."

CloudConnect, which the company expects to launch in 2019, makes Iridium "the first, and only, satellite provider now connected to" Amazon Web Services, Desch said. The CloudConnect network will focus on "where cellular technologies aren't," Desch said, bringing the rest of the world within reach of AWS.

Amazon has been looking to hire people to work on "interconnecting space system networks," CNBC reported earlier this month. The company has never publicly discussed such a project.

Shares of Iridium rose 7.1 percent in trading, hitting an all-time high of $21.98 a share.

The company is nearly finished putting its Iridium NEXT constellation of 75 satellites into orbit. SpaceX is launching the $3 billion satellite network for Iridium, with the eighth and final launch happening later this year. Desch has called SpaceX, now the satellite company's sole launch provider, "critical" to Iridium's commercial success.

Once online, Iridium NEXT will offer services such as higher broadband communications speeds and global airplane tracking. Iridium describes the IoT aspect of the network as a "catalyst for strong subscriber growth." Desch said the network hosts "about half a million" active devices, growing at a rate of about 20 percent per year for the last three years. With AWS onboard, Desch gave a very bullish estimate for his IoT services: "Easily this could expand to tens of millions of devices."

"We have the best bandwidth over anybody," Desch said. "Our network is super efficient at how it can manage these bytes of information."

Using AWS, the most widespread cloud-computing service in the world, applications all speak the same "language," Desch said. It can take companies months or years to connect their applications into a new cloud suite, Desch explained, saying IoT devices are sometimes "talking in proprietary languages" or "they have to learn to talk from scratch." Add that to the 80 percent of the world where "it's still hard to connect things up" and one can see why CloudConnect will be "optimized for connecting things very efficiently around the world," Desch said.

"We're talking things where a couple dollars can deliver really timely information in seconds from anywhere-to-anywhere in the planet," Desch said.

Desch expects CloudConnect to initially cater to large things like agricultural equipment or cargo ships in the open sea but said "it will move downwards into smaller and smaller vehicles, such as drones." Iridium is also looking at partnering with low-cost satellite companies like Myriota, Hiber and Fleet – as those will focus on a different range of IoT data.

"There are eight or 10 of these new networks that people want to develop with new satellites," Desch said. "We're more of the high end, when you've got to really get the data and it's got to be real time."
https://www.cnbc.com/2018/09/27/amazon-partners-with-iridium-for-aws-cloud-services-via-satellite.html





FCC Hiding Evidence Of Suspected Russian Role In Ending Net Neutrality: Lawsuit

The suit filed by The New York Times alleges an "orchestrated campaign'' by Russians to corrupt democratic rule-making.
Mary Papenfuss

The Federal Communications Commission has obstinately hidden information concerning its system for gathering public input about its unpopular plan to kill net neutrality amid signs of Russian manipulation of the comment procedure, according to a New York Times lawsuit.

Freedom of Information Act requests by the newspaper concerning the comment system were turned down repeatedly by the FCC as the Times attempted to investigate possible influence by Russia after huge numbers of comments were linked to Russian emails.

Stonewalling by the FCC has made the American public the “victim of an orchestrated campaign by the Russians to corrupt the notice-and-comment process and undermine an important step in the democratic process of rule-making,” states the Times’ lawsuit, which was filed Thursday in U.S. District Court in the Southern District of New York.

The agency also ignored similar demands — at least nine times — from the New York attorney general last year as that office investigated millions of suspicious comments.

The FCC voted last year to end net neutrality, upending the American internet system. The change allows internet service providers to block, slow down, or charge extra for certain content.

The FCC’s bungled comment procedure has long been the target of criticism. As many as 2 million comments were fraudulently submitted in other people’s names without their knowledge, and the system was overrun with bots, a favorite tool of the Russians. The system also crashed for a period of time as the FCC was overwhelmed by a massive number of comments supporting net neutrality.

FCC Commissioner Jessica Rosenworcel revealed in an Op-Ed after the comment debacle that the commission received half a million comments from Russian email addresses, and nearly 8 million comments from email domains associated with FakeMailGenerator.com — all with nearly the exact same wording.

A cyber-security company in July issued a report linking FCC comment emails to Russian email addresses named in indictments of Russians and Russian companies as part of special counsel Robert Mueller’s probe into foreign interference in the 2016 presidential election, according to the Times’ suit.

The Times initially filed a request in June 2017 for FCC server logs linked to the system for accepting public comments. The request, the Times stated in the lawsuit, “involves records that will shed light on the extent to which Russian nationals and agents of the Russian government have interfered with the agency notice-and-comment process about a topic of extensive public interest.”

The FCC refused, saying that fulfilling such a request would breach the privacy of people sending comments, would put security practices at risk and would be overly burdensome.

Public comments are open to public review — or identifying information can easily be redacted, the Times argued. It also pared back its request — a number of times — to reduce any security risk and the burden of fulfilling the request.

The paper finally filed suit after being stonewalled by the FCC for over a year.

“We are disappointed that The New York Times has filed suit to collect the commission’s internal web server logs — logs whose disclosure would put at jeopardy the commission’s ... security practices for its Electronic Comment Filing System,” an FCC representative told Ars Technica.
https://www.huffingtonpost.com/entry/fcc-shielding-evidence-of-russian-role-in-killing-net-neutraility-lawsuit_us_5ba72892e4b0375f8f9db029





Ajit Pai Slams Cities and Towns as FCC Erases $2 Billion in Local Fees

FCC orders cities and towns to slash permit fees for 5G equipment.
Jon Brodkin

The Federal Communications Commission today finalized an order that will prevent city and town governments from charging wireless carriers about $2 billion worth of fees related to deployment of wireless equipment such as small cells.

The decision has angered both large and small municipalities, as we reported last week.

The FCC's Republican majority says that limiting local fees will cause carriers to build 5G networks in rural and sparsely populated areas where it would otherwise be financially unfeasible. But the order doesn't require carriers to deploy any more broadband than they otherwise would have, and carriers already promised nationwide 5G networks before the FCC made its proposal.

"Comb through the text of this decision—you will not find a single commitment made to providing more service in remote communities," FCC Commissioner Jessica Rosenworcel, the FCC's only Democrat, said before today's vote. "Look for any statements made to Wall Street—not one wireless carrier has said that this action will result in a change in its capital expenditures in rural areas."

The $2 billion savings is less than 1 percent of the estimated $275 billion that carriers will have to spend to deploy 5G small cells throughout the US. That level of savings won't spur extra deployment "because the hard economics of rural deployment do not change with this decision," Rosenworcel said.

The FCC order suggests up-front application fees of $100 for each small cell and annual fees of up to $270 per small cell, saying that these should cover local governments' costs for processing applications and managing deployments on public property. Cities and towns that charge more than that would likely face litigation from carriers and would have to prove that the fees are a reasonable approximation of all costs and are "non-discriminatory."

The FCC order also limits the kinds of aesthetic requirements cities and towns can impose on carrier deployments. The FCC is telling municipalities "which fees are permissible and which are not, about what aesthetic choices are viable and which are not, with complete disregard for the fact that these infrastructure decisions do not work the same in New York, New York, and New York, Iowa," Rosenworcel said.

Pai slams cities for “extracting” fees

FCC Chairman Ajit Pai said the decision "has won significant support from mayors, local officials, and state legislators," and he criticized those cities and towns that oppose the FCC's decision.

"To be sure, there are some local governments that don't like this order," Pai said. "They would like to continue extracting as much money as possible in fees from the private sector and forcing companies to navigate a maze of regulatory hurdles in order to deploy wireless infrastructure."

Pai claimed that the fees carriers are charged in big cities prevent them from investing in rural areas. "Big-city taxes on 5G slow down deployment there and also jeopardize the construction of 5G networks in suburbs and rural America," he said.

But Pai offered no evidence that deployment decisions in rural areas are affected by permit fees in big cities.

Carriers' previous actions show that savings from tax cuts and deregulation don't necessarily cause new deployment. Comcast and AT&T laid off thousands of employees less than a year ago after claiming they would create thousands of new jobs in exchange for a federal tax cut. Shortly after the FCC voted to eliminate net neutrality rules, Charter announced a "meaningful decline" in capital investment.

"We see this play out in NYC, where poles are priced as low as $12 per month in underserved areas yet there are very few providers looking to install in those communities," New York City CIO Samir Saini wrote in a blog post yesterday. "Our colleagues in rural areas tell us they haven’t been able to attract companies even when offering poles at NO cost." (NYC charges much higher fees in more affluent areas.)

Saini accused the FCC of "handing taxpayer-owned assets over to multi-billion dollar telecommunications companies, and encouraging them to run wild on our public rights of way."

"Without local control, multiple companies could pile many different installations on a single light pole," Saini wrote. "Imagine a mass of new equipment on a single structure, ruining streetscapes and potentially interfering with first responder, electric utility and other critical equipment."

The National Association of Counties and the National League of Cities also criticized the FCC decision.

“Over 100 local governments from 22 states filed comments in opposition to the proposed ruling during the FCC's comment period," the group said. "The FCC's impractical actions will significantly impede local governments' ability to serve as trustees of public property, safety and well-being. The decision will transfer significant local public resources to private companies, without securing any guarantee of public benefit in return."

The FCC move will also force cities and towns to act on carrier applications within 60 or 90 days.

"By narrowing the window and resources for evaluating small cell applications, the FCC is effectively hindering our ability to fulfill public health and safety responsibilities during the construction and modification of broadcasting facilities," the counties' and cities' group said.

FCC vote pleases industry lobbyists

Today's vote to preempt local regulatory decisions was supported by CTIA, the major wireless carriers' primary lobby group.

"[T]his decision will promote billions in investment and significant job creation," CTIA CEO Meredith Attwell Baker said. "It creates a common-sense national framework that will also enable the wireless industry to accelerate the deployment of 5G for millions of Americans."

But the order will likely lead to lawsuits because the FCC mostly disregarded the concerns of localities, according to consumer advocacy group Public Knowledge.

"Rather than finding consensus between industry and state and local government stakeholders, the Commission's Declaratory Ruling and Order overwhelmingly sides with industry on nearly every issue, resulting in a vote that will almost certainly be challenged in the courts and create uncertainty, rather than predictability, for small cell wireless deployments," Public Knowledge Senior Policy Counsel Phillip Berenbroick said.
https://arstechnica.com/tech-policy/2018/09/ajit-pai-slams-cities-and-towns-as-fcc-erases-2-billion-in-local-fees/





How Bad Maps Are Ruining American Broadband

ISPs are painting over US broadband problems, and the FCC is letting it happen
Karl Bode

Like countless other American cities, Cleveland, Ohio, suffers from a lack of meaningful broadband competition. With only one or two largely apathetic ISPs to choose from, high prices, slow speeds, limited deployment, and customer service headaches are the norm. It’s particularly bad in the city’s poorer, urban areas. AT&T has avoided upgrading lower-income minority neighborhoods at the same rate as higher-income parts of the city, despite decades of subsidies and tax breaks intended to prevent that from happening, according to a report by the National Digital Inclusion Alliance (NDIA). Even in more affluent neighborhoods, users are lucky if they have an ISP that can deliver speeds over 50 Mbps.

The problem is much bigger than Cleveland, but the FCC isn’t ready to do much about it. US customers pay some of the highest prices for broadband in the developed world, and broadband availability is sketchy at best for millions of Americans. But instead of tackling that problem head on, the FCC is increasingly looking the other way, relying on ISP data that paints an inaccurately rosy picture of Americans’ internet access. And as long as regulators are relying on a false picture of US broadband access, actually solving the problem may be impossible.

As it currently stands, ISPs are required to deliver Form 477 data to the FCC indicating broadband availability and speed twice a year. But the FCC doesn’t audit the accuracy of this data, despite the fact that ISPs are heavily incentivized to overstate speed and availability to downplay industry failures. The FCC also refuses to make the pricing data provided by ISPs available to the public.

Worse, the FCC’s methodology declares an entire ZIP code as “served” with broadband if just one home in an entire census block has it. As a result, the government routinely declares countless markets connected and competitive when reality tells a very different story.

The FCC’s $350 million broadband map, for example, relies on the agency’s Form 477 data to help educate users on broadband availability. But users who plug their address into the map will quickly find that it hallucinates not only the number of broadband options available in their area, but also the speeds any local ISPs can provide. A recent FCC update fixed none of these problems.

In Cleveland, the FCC’s map insists that city residents have at least six ISPs to choose from. But if you look closer, you’ll find that Cleveland residents really only have one option (Charter’s Spectrum) if they want a good connection. The other options the FCC cites include substandard satellite broadband, which is plagued by high latency and usage limits, and AT&T DSL, which is listed twice by the FCC but is patchy in its availability.

Given the unaudited unreliability of this data, the reality is likely even uglier. (This data doesn’t include pricing or restrictions on your line like usage caps and overage fees, which are glorified price hikes only made possible by said lack of competition.)

“The best maps we have at the federal level are awful,” notes Christopher Mitchell of the Institute for Local Self-Reliance (ILSR), a group dedicated to helping communities improve broadband availability. Even if the data were accurate, Mitchell notes, it’s usually 18 months old by the time it’s integrated into policy conversations.

“Broadband data is not a fine wine or cheese,” Mitchell says. “There is no reason to store it in a cave to age.”

Mitchell’s organization recently took a closer look at the disparity between reality and FCC data in Rochester, Minnesota, home of the Mayo Clinic. It’s a city of 114,000 people in southeastern Minnesota, the third-largest city in the state. Like Cleveland, Rochester residents are hungry for better, cheaper broadband.

According to the FCC’s data, Rochester is awash with broadband options. The agency insists that as many as a dozen broadband providers are available to most city residents. But according to the ILSR report, the reality is far different.

At least 4,000 of the 215,000 residents living within a 30-mile radius of the Rochester city center lack access to any broadband whatsoever. Another 42,000 people lack access to any fixed-line broadband options, driving them toward satellite broadband, which is considered the black sheep of the broadband sector due to cost, high latency, and daily or monthly usage restrictions.

Wireless is often promoted as a wonderful alternative to fixed-line broadband, but that’s not always the case. Wireless is often expensive, loaded with inconsistent restrictions, and users in rural markets often find themselves booted from the network for what’s often moderate usage. A monopoly over the fiber lines feeding cell towers only complicates the problem.

In Rochester, 19,000 consumers have the choice of only one local cable broadband provider — Charter’s Spectrum — and reality looks absolutely nothing like the picture ISPs and the FCC try to paint, Mitchell’s group found.

“Even where residents have a choice in broadband, anyone looking for speeds in excess of 40 Mbps will almost certainly have to subscribe to Charter Spectrum,” the report concludes.

In policy conversations, ISP lobbyists lean heavily on the FCC’s flawed data to falsely suggest that American broadband is dirt cheap and ultra competitive, despite real-world evidence to the contrary. ISPs also use this false reality to imply meaningful consumer protections aren’t necessary because the market is healthy (as we saw during the fight over net neutrality).

Some cities like Rochester have eyed either building their own broadband networks or striking public / private partnerships to fix the problem. But incumbent ISPs not only use the false FCC data to imply such efforts aren’t necessary, they have also lobbied for (and, in some cases, written) protectionist laws in more than 20 states prohibiting that from happening.

On the wider policy level, having accurate data is incredibly important as the government determines which areas are in need of broadband subsidies. That was a major point of contention at a recent FCC oversight hearing, as states vie for $4.5 billion in rural broadband deployment funds intended to shore up connectivity gaps.

“The maps stink, and we’ve got to be more proactive in getting them fixed,” said Sen. Jon Tester (D-MT) at the hearing. “The providers created this problem by showing you a map that’s covered in red,” he said, suggesting that “we’ve got to kick somebody’s ass” to get the problem fixed.

Other government agencies agree. The Government Accountability Office released a study last week stating that the FCC routinely overstates broadband availability on tribal lands, actively harming the government’s ability to get these marginalized populations connected.

“Residents of tribal lands have lower levels of broadband Internet access relative to the US as a whole, but the digital divide may be greater than currently thought,” the GAO said. “FCC data overstated tribes’ broadband availability and access to broadband service. These overstatements limit FCC and tribal users’ ability to target broadband funding to tribal lands.”

This inaccurate data “could affect FCC’s funding decisions and the ability of tribal lands to access broadband in the future,” the GAO wrote. That’s of particular concern as the Ajit Pai-led FCC contemplates reducing broadband subsidies for tribal areas as part of the agency’s slow dismantling of Lifeline, a program designed to help bridge America’s digital divide.

The GAO provided a laundry list of recommendations to fix the problem, but it seems unlikely that the FCC — given its eagerness to please incumbent providers — will rush to fix a problem that has plagued America for years.

“By painting a far rosier picture of the digital divide than is warranted, policymakers have a far less sense of urgency about fixing the problem,” notes Gigi Sohn, a lawyer under the previous FCC. “And of course, if you don’t know the breadth of a problem, policymakers can’t be very strategic or targeted in fixing it.”

Fixing the data collection methodology at the heart of the problem shouldn’t be complicated, Sohn said, but ISPs have routinely lobbied against nearly every effort to do so.

“I would require the Internet access providers to, at a minimum, do a block-by-block mapping, and preferably, every home or building, along with prices, which would then be reported on the Form 477,” Sohn said when asked how she’d go about fixing the problem.

“It’s just a lack of political will to ask the companies to do more,” Sohn said.
https://www.theverge.com/2018/9/24/17882842/us-internet-broadband-map-isp-fcc-wireless-competition





US Wireless Video Streaming Sucks, Study Says

The US lags behind dozens of countries thanks to arbitrary throttling practices.
Karl Bode

A new study suggests that the United States lags behind dozens of other countries in terms of video streaming quality over wireless networks. Open Signal’s latest State of Mobile Video Report examined wireless video quality in more than 69 countries, utilizing 90 billion measurements across 8 million phones between May and August of this year.

The report examined video load times, the volume of stuttering and buffering during video playback, and overall video resolution on wireless networks. Countries were then ranked on a scale of 0-100, ranging from scores of 0-40 (poor) to 75-100 (excellent).

The United States didn’t fare well. US carriers were ranked 34th in terms of average network speeds (16.5 Mbps) and 59th in terms of users’ “overall video experience.” That’s on the heels of previous studies showing that US residents pay some of the highest prices for mobile data in the developed world (before one even includes a bevy of obnoxious, hidden fees).

A big reason for the United States’ poor showing? Carrier policies that artificially restrict video quality across the network via throttling or “deprioritization.” As a result, you may have more than enough bandwidth to enjoy a solid YouTube or Netflix stream, but artificial limits imposed by your cellular carrier can still hamper video quality.

“As our tests sample video at different resolutions, any downgrading of video quality—say from HD to SD—would have an impact on our scores,” OpenSignal wrote in its new report. “The U.S. is a prime example of such policies at work.”

Last year, competition from T-Mobile forced larger carriers like Verizon and AT&T to back away from punitive usage caps and steep overage fees and re-introduce simpler, more popular unlimited data plans.

But as California firefighters recently discovered, these “unlimited” plans feature a laundry list of often unclear limits and caveats, from restrictions on using your phone as a modem, to throttling that kicks in if you consume a set amount of bandwidth. Carriers have also started throttling all video by default unless users are willing to pony up significantly more money.

Verizon, for example, now bans all 4K video by default, and throttles back all video on its unlimited plans to 480p (about 1.5 Mbps) unless you’re willing to pay significantly more. Other mobile carriers, like Sprint, have experimented with throttling music and game performance unless users pony up additional cash.

“To prevent their networks from becoming overloaded with video traffic, operators have put streaming restrictions on their different tiers of unlimited plans,” Open Signal said. “Depending on the type of video, a 720p stream can consume twice as much or more data than a 480p stream,” the firm insisted.

But a study highlighted by Motherboard last week showed how these video limits often have absolutely nothing to do with network congestion, and everything to do with nickel and diming consumers.

Researchers at Northeastern University conducted half a million data traffic tests across 161 countries to help determine which ISPs routinely hamstring streaming performance. They found that carriers consistently apply arbitrary restrictions on video streaming that have nothing to do with managing network load.

Imposing arbitrary video quality restrictions that users then have to pay more to avoid is the new normal in the post-net neutrality landscape. And with looming wireless sector mergers poised to reduce competition, and the FCC’s 2015 net neutrality rules now defunct, there’s probably going to be a lot more of this kind of behavior waiting for users just over the horizon.
https://motherboard.vice.com/en_us/article/438ngj/us-wireless-video-streaming-sucks-study-says





News Site to Investigate Big Tech, Helped by Craigslist Founder

The Markup, dedicated to investigating technology and its effect on society, will be led by two former ProPublica journalists. Craig Newmark gave $20 million to help fund the operation.
Nellie Bowles

When the investigative journalist Julia Angwin worked for ProPublica, the nonprofit news organization became known as “big tech’s scariest watchdog.”

By partnering with programmers and data scientists, Ms. Angwin pioneered the work of studying big tech’s algorithms — the secret codes that have an enormous impact on everyday American life. Her findings shed light on how companies like Facebook were creating tools that could be used to promote racial bias, fraudulent schemes and extremist content.

Now, with a $20 million gift from the Craigslist founder Craig Newmark, she and her partner at ProPublica, the data journalist Jeff Larson, are starting The Markup, a news site dedicated to investigating technology and its effect on society. Sue Gardner, former head of the Wikimedia Foundation, which hosts Wikipedia, will be The Markup’s executive director. Ms. Angwin and Mr. Larson said that they would hire two dozen journalists for the site’s New York office and that stories would start going up on the website in early 2019. The group has also raised $2 million from the John S. and James L. Knight Foundation, and $1 million collectively from the Ford Foundation, the John D. and Catherine T. MacArthur Foundation, and the Ethics and Governance of Artificial Intelligence Initiative.

Ms. Angwin compares tech to canned food, an innovation that took some time to come under real scrutiny.

“When canned food came out, it was amazing,” said Ms. Angwin, who will be the site’s editor in chief. “You could have peaches when they were out of season. There was a whole period of America where every recipe called for canned soup. People went crazy for canned food. And after 30 years, 40 years, people were like, ‘Huh, wait.’

“That is what’s happened with technology,” Ms. Angwin said, calling the 2016 election a tipping point. “And I’m so glad we’ve woken up.”

The site will explore three broad investigative categories: how profiling software discriminates against the poor and other vulnerable groups; internet health and infections like bots, scams and misinformation; and the awesome power of the tech companies. The Markup will release all its stories under a Creative Commons license so other organizations can republish them, as ProPublica does.

Ms. Angwin, who was part of a Wall Street Journal team that won a Pulitzer Prize in 2003 for coverage of corporate corruption, said the newsroom would be guided by the scientific method and each story would begin with a hypothesis. For example: Facebook is allowing racist housing ads. At ProPublica, Ms. Angwin’s team bought ads on the site and proved the hypothesis.

At The Markup, journalists will be partnered with a programmer from a story’s inception until its completion.

“To investigate technology, you need to understand technology,” said Ms. Angwin, 47. “Just like I got an M.B.A. when I was a business reporter, I believe that technologists need to be involved from the very beginning of tech investigations.”

Ms. Angwin has known Mr. Newmark since 1997, when she wrote about him while a reporter at The San Francisco Chronicle.

“Craig is ideal for us because he has no interest or temperament for trying to interfere in coverage,” she said.

Mr. Newmark, who splits his time between San Francisco and New York, has for years kept a low profile. But he worries about what he sees as a lack of self-reflection among engineers.

“Sometimes it takes an engineer a while to understand that we need help, then we get that help, and then we do a lot better,” Mr. Newmark said. “We need the help that only investigative reporting with good data science can provide.”

Craigslist, which Mr. Newmark founded in the mid-1990s, helped to decimate print newspapers’ main source of revenue at the time: classified advertising. Recently, he has given several substantial donations to journalistic institutions, including $20 million to the CUNY Graduate School of Journalism.

“We’re in an information war now,” Mr. Newmark said.

For many years, the outrageous success of Silicon Valley companies — and the aggressive public relations teams who worked for them — kept many journalists at a remove.

The societal effects of tech were hard to quantify, and moral responsibility was often sloughed off on something called an algorithm, which most people could not quite explain or examine. Even if, as in the case of Facebook, it influenced around 2.5 billion people.

At ProPublica, Ms. Angwin and Mr. Larson subverted the traditional model of tech reporting altogether. They did not need access. With the right tools, they could study impact.

“There’s an opportunity for more reporters to use statistics to uncover societal harms,” said Mr. Larson, who has been doing data-driven journalism for a decade. “And then Julia’s gift is she takes data journalism and doesn’t make it like an academic report.”

Some of Ms. Angwin and Mr. Larson’s reporting tactics may violate tech platform terms of service agreements, which ban people from performing automated collection of public information and prohibit them from creating temporary research accounts. Ms. Angwin has been a strong defender of these practices and has argued that tech companies ought to allow reporters to be an exception to their rules.

“Without violating those rules, journalists can’t investigate our most important platform for public discourse,” Ms. Angwin wrote in August.

The two worked together on investigations like one into criminal sentencing software, which took a year. Ms. Angwin would report and write. Mr. Larson would measure and analyze. In the end, they proved that the algorithm was racially biased.

Mr. Larson, who will be The Markup’s managing editor, said the result was just as much a surprise to readers as it was to those who had made the biased algorithm.

“Increasingly, algorithms are used as shorthand for passing the buck,” said Mr. Larson, 36. “We don’t have enough people to look at parole decisions, so we’re going to pass it on to the computer and the computer is going to decide, and once they go into production, there’s no oversight.”

The two also showed how big tech companies were helping extremist sites make money, how African-Americans were overcharged for car insurance, and how Facebook allowed political ads that were actually scams and malware.

“There are unintended consequences,” Mr. Larson said. “In all three of those cases, it was a complete surprise to the people who made those algorithms as well.”

Engineers being surprised by the tools they have made is, to the Markup team, part of the problem.

“Part of the premise of The Markup is the level of understanding technology and its effects is very, very low, and we would all benefit from a broader understanding,” Ms. Gardner said. “And I would include people who work for the companies.”

Ms. Angwin said part of her goal was to help readers understand what exactly they should be worried about when it comes to tech.

“We’re all a little uncertain,” Ms. Angwin said. “The evidence isn’t in. I want to be providing the evidence.”

She hopes the stories they take on will lead to better government and corporate policies.

“We are a numbers-driven data society,” Ms. Angwin said. “That’s the price of entry these days for political change — a data set.”

And searching for that information, Ms. Angwin said she was not worried about getting Facebook or Google to return her phone calls.

“I’ve never been on Google’s or Facebook’s campus and I imagine I’ll never be invited,” she said. “I’m kind of a dorky scientist just over here measuring stuff.”
https://www.nytimes.com/2018/09/23/business/media/the-markup-craig-newmark.html

















Until next week,

- js.



















Current Week In Review





Recent WiRs -

September 22nd, September 15th, September 8th, September 1st

Jack Spratts' Week In Review is published every Friday. Submit letters, articles, press releases, comments, questions etc. in plain text English to jackspratts (at) lycos (dot) com. Submission deadlines are Thursdays @ 1400 UTC. Please include contact info. The right to publish all remarks is reserved.


"The First Amendment rests on the assumption that the widest possible dissemination of information from diverse and antagonistic sources is essential to the welfare of the public."
- Hugo Black