P2P-Zone  

Old 17-06-20, 06:49 AM   #1
JackSpratts
 
 
Join Date: May 2001
Location: New England
Posts: 10,013
Default Peer-To-Peer News - The Week In Review - June 20th, ’20

Since 2002


June 20th, 2020




Facebook Says it Doesn’t Need News Stories for its Business and Won’t Pay to Share them in Australia

Social media giant rejects ACCC proposal, saying it could cut out news completely without any significant impact on its business
Naaman Zhou and Amanda Meade

Facebook has rejected a proposal to share advertising revenue with news organisations, saying there would “not be significant” impacts on its business if it stopped sharing news altogether.

On Monday, the social media giant issued its response to the Australian Competition and Consumer Commission, which has been tasked with creating a mandatory code of conduct aimed at levelling the playing field.

The treasurer, Josh Frydenberg, told the ACCC to develop a code after multiple Australian media companies and regional newspapers cut jobs, or folded entirely, as a result of advertising downturn during the Covid-19 pandemic.

Facebook and Google have previously refused to accept they needed to pay for using news content.

In its submission to the watchdog, Facebook said it rejected many of the ACCC’s potential ideas, and said there was a “healthy rivalry” between itself and news organisations.

The social media giant said it supported the idea of a code of conduct between digital platforms and news publishers, but that it and Google were being “singled out” unfairly.

Facebook also said it could cut out news completely without any significant impact on its business.

“We made a change to our News Feed ranking algorithm in January 2018 to prioritise content from friends and family,” the company said. “These changes had the effect of reducing audience exposure to public content from all pages, including news.

“Notwithstanding this reduction in engagement with news content, the past two years have seen … increased revenues, suggesting both that news content is highly substitutable with other content for our users and that news does not drive significant long-term value for our business.

“If there were no news content available on Facebook in Australia, we are confident the impact on Facebook’s community metrics and revenues in Australia would not be significant.”

Facebook said that news represents “only a very small fraction of the content in the average Facebook users’ newsfeed” because Facebook was primarily a service used to connect with family and friends.

“It is not healthy nor sustainable to expect that two private companies, Facebook and Google, are solely responsible for supporting a public good and solving the challenges faced by the Australian media industry,” it said.

“The code needs to recognise that there is healthy, competitive rivalry in the relationship between digital platforms and news publishers, in that we compete for advertising revenue.”

The company said the revenue-sharing proposal would force it to “subsidise a competitor” and “distort advertising markets, potentially leading to higher prices”.

Despite this, the company said it was still committed to supporting Australian news, and had “invested” in the industry.

Facebook said it had sent 2.3bn clicks to Australian news publishers in the five months from January to May 2020, which it estimated to be worth $195.8m to the news organisations.

“Despite claims of an ‘imbalance’ that should impede the striking of such agreements, we have been steadily increasing our investments in the Australian news ecosystem,” Facebook said. “We continue to ramp up our direct financial contributions to the news industry – not to make a profit – rather because we believe news is a public good and it plays an important social function.”

Instead of the ACCC’s proposal of a body that could issue financial penalties and impose binding dispute resolution, Facebook proposed the creation of an “Australian Digital News Council” that would mediate complaints from news organisations, modelled on the Australian Press Council.

The social media company also objected to the focus on itself and Google, and disputed the idea that it “possesses unequal bargaining power compared to some of the largest media companies in Australia”.

“The decision to limit the initial version of the code to two US companies is discriminatory and will inevitably give an unfair advantage to Facebook’s competitors in the technology sector, including rivals from countries that propagate different and undesirable visions for the internet.”
https://www.theguardian.com/media/20...m-in-australia





This Senator Is Seemingly Obsessed With Threatening the Internet Archive

While the country deals with a pandemic and mass protests, Sen. Thom Tillis has repeatedly sent letters to the Internet Archive that warn its projects may be illegal.
Karl Bode

In the wake of a raging pandemic, the Internet Archive has been working overtime to preserve internet history and expand the public’s access to digital library collections. That recently culminated in the creation of a National Emergency Library that made 1.4 million ebooks available to the public at a time when traditional libraries pose a health risk.

Increasingly, the organization’s reward has been a parade of headaches, both from the entertainment industry and its loyal allies in Congress.

Last week, the Internet Archive was forced to dramatically scale back its emergency library thanks to a publisher lawsuit and political pressure from politicians like North Carolina Senator Thom Tillis, who accused the organization of violating the country’s often draconian—and frequently ridiculous—copyright laws.

Now Tillis is taking aim at another Internet Archive effort, the Great 78 Project.

The Great 78 Project is a communal effort geared towards the preservation, research, and discovery of the 3 million 78rpm discs produced between 1898 and 1950. Often made from shellac, far more fragile than the resin commonly used today, many of these recordings are at risk, and the digitization effort has created a historical archive of some amazing work.

The organization’s overall album preservation effort was buoyed by the Internet Archive’s recent purchase of the entire 500,000 album catalog of Bop Street Records, one of Seattle’s oldest and most beloved record stores—forced to close due to the COVID-19 crisis.

Enter Tillis, who in a new letter complains that both the Great 78 Project and the acquisition of the Bop Street collection could potentially run afoul of America’s aggressive copyright laws.

“It is clear that these sound recordings were very recently for sale in a commercial record shop and likely contain many sound recordings that retain significant commercial value,” Tillis said. “This raises serious alarms about copyright infringement.”

Tillis also complained that the Great 78 Project potentially violated the Orrin G. Hatch-Bob Goodlatte Music Modernization Act (MMA), passed in 2018 to help modernize U.S. copyright laws for the internet streaming era. In his letter, Tillis demands that the Archive provide detailed evidence it isn’t violating copyright by July 10.

But copyright experts approached by Motherboard aren’t quite sure what the fuss is about.

“The statement about MMA not allowing for ‘streaming or downloading’ is very odd,” John Bergmayer, Legal Director at consumer group Public Knowledge, told Motherboard.

“The MMA specifically allows for non-commercial ‘use’ of pre-72 sound recordings,” he said. “The ‘covered activities’ that MMA creates new liability for, but exempts noncommercial use from, specifically includes ‘permanent download, limited download, or interactive stream.’”

In short, Bergmayer noted that while there could be some thorny but minor flaws in the Archive’s implementation, the law specifically carves out exemptions for such efforts.

Even if it didn’t, that’s a flaw with U.S. copyright law, not the Internet Archive’s proposal, copyright expert and Techdirt.com founder Mike Masnick told Motherboard.

“It's in that murky space where it's unclear whether or not some of it violates copyright law, but if it does, it would highlight yet another fundamental problem with copyright law in working against culture, rather than in support of it,” Masnick said. “Indeed, if anything, Tillis should work on fixing the law and rolling back copyright term extension that makes this so murky.”

With the country facing record levels of unemployment, massive protests against racial injustice, and a raging pandemic, it’s unclear why Tillis’ top priority appears to be badgering a good faith internet preservation effort. His office did not respond to a request for comment as to whether Congressional resources would be better spent elsewhere.

It's worth noting the RIAA originally alerted Motherboard to Tillis' letter, but declined to comment on it.

Masnick noted this is part of a broader effort by Tillis to expand the nation’s already controversial patent and copyright laws, including the Digital Millennium Copyright Act (DMCA).

“In Tillis' short time atop the Senate Judiciary Committee's IP Subcommittee, he has shown himself to be strongly aligned with legacy ‘maximalist’ interests on both copyright and patent law,” Masnick said. “He is seeking to rewrite the DMCA to make it even more in favor of legacy copyright interests and also introduced laws to expand patent eligibility massively.”

With the nation reeling from an historic pandemic and the DMCA routinely under fire for already being an over-aggressive mess, copyright experts say Tillis’ priorities don’t really make a whole lot of sense.
https://www.vice.com/en_us/article/a...ternet-archive





After Merger, T-Mobile Lays Off Hundreds of Sprint Employees
Zack Whittaker, Brian Heater

In a conference call on Monday lasting under six minutes, T-Mobile vice president James Kirby told hundreds of Sprint employees that their services were no longer needed. He declined to answer his employees’ questions, citing the “personal” nature of employee feedback, and ended the call.

TechCrunch obtained leaked audio of that call, which was said to be one of several calls held by T-Mobile leadership throughout the day to lay off staff across the organization. The layoffs come just two months after its contested $26 billion Sprint merger was finally completed.

On the call, Kirby said T-Mobile was eliminating Sprint’s inside sales unit (BISO), a sales division that focuses on small businesses across the United States. The executive didn’t say exactly how many staff were laid off. Almost 400 people were in the phone meeting, a person on the call told TechCrunch.

Kirby is heard saying that the division’s layoffs would make way for 200 new positions, and encouraged employees to apply for one of the new positions using T-Mobile’s external careers page, spelling out the web address on the call twice. Some impacted employees may be able to shift to new roles, though the carriers don’t appear to have done much to facilitate the moves beyond encouraging staff to apply.

The employees who were laid off Monday will keep their jobs for another two months, until August 13, said Kirby. A person on the call told TechCrunch that the severance packages amount to two weeks’ pay for every year on the job, but some employees may get more.

Employers are required to give two months notice in advance of mass layoffs under the WARN Act.

T-Mobile leadership held several conference calls with employees to announce layoffs across various Sprint divisions on Monday on both the business and consumer sides, according to the person on the call. The person said that they were unaware of any T-Mobile employees affected by the layoffs.

“They cut people from every division, but BISO seems to have been hit the hardest,” the person said.

One employee described their frustration. “I just feel the company needs to acknowledge the pain they are putting people through during a pandemic — severance package or not.”

When reached, a T-Mobile spokesperson did not comment by our deadline.

T-Mobile closed the Sprint merger on April 1. The deal merged the nation’s third- and fourth-largest carriers in a manner they insisted would keep them more competitive with the No. 1 and No. 2 services — AT&T and Verizon (TechCrunch’s parent company) — which have long dominated the category.

The merger was, understandably, subject to intense regulatory scrutiny in the months leading up to its final approval, as it would effectively reduce the country’s key carriers from four down to three. Among T-Mobile’s chief selling points was the claim that — in addition to increased competition — a merger would create more jobs.

“In total, New T-Mobile will have more than 11,000 additional employees on our payroll by 2024 compared to what the combined standalone companies would have,” then-chief executive John Legere claimed in an open letter last April.

The exact effect the merger has had on employee headcount isn’t entirely clear, but last month the Communications Workers of America estimated that it would impact some 30,000 jobs due to the consolidation of retail stores and corporate roles.

“T-Mobile has made no written, verifiable commitments to the FCC to protect jobs,” the union wrote. “While T-Mobile has tried to muddy the waters with vague loophole-ridden pledges to maintain jobs for current T-Mobile and Sprint employees, three-quarters of current employees selling the companies’ services work for authorized dealers and are not covered by the jobs pledge — 88,000 workers in total.”
https://techcrunch.com/2020/06/16/t-...ayoffs-sprint/





T-Mobile’s Outage Yesterday was so Big that Even Ajit Pai is Mad

But Pai's FCC has a history of letting carriers off easy.
Jon Brodkin

T-Mobile's network suffered an outage across the US yesterday, and the Federal Communications Commission is investigating.

FCC Chairman Ajit Pai, who takes an extremely hands-off approach to regulating telecom companies, used his Twitter account to say, "The T-Mobile network outage is unacceptable" and that "the FCC is launching an investigation. We're demanding answers—and so are American consumers."

No matter what the investigation finds, Pai may be unlikely to punish T-Mobile or impose any enforceable commitments. For example, an FCC investigation last year into mobile carriers' response to Hurricane Michael in Florida found that carriers failed to follow their own previous voluntary roaming commitments, unnecessarily prolonging outages. Pai himself called the carriers' response to the hurricane "completely unacceptable," just like he did with yesterday's T-Mobile outage. But Pai's FCC imposed no punishment related to the bad hurricane response and continued to rely on voluntary measures to prevent recurrences.

T-Mobile CEO Mike Sievert confirmed the outage in a blog post. "Starting just after 12pm ET and continuing throughout the day, T-Mobile has been experiencing a voice and text issue that has intermittently impacted customers in markets across the US," Sievert wrote. Sievert reported that the "issues are now resolved" just after 1am ET, about 13 hours after the outage began.

T-Mobile mistake may have caused outage

The outage may have been self-inflicted when T-Mobile was making network configuration changes. Cloudflare CEO Matthew Prince last night tweeted that T-Mobile was "making some changes to their network configurations today. Unfortunately, it went badly. The result has been for around the last 6 hours a series of cascading failures for their users, impacting both their voice and data networks." The T-Mobile problem was "almost certainly entirely of their own team's making," he also wrote.

Sievert attributed the outage to "an IP traffic related issue that has created significant capacity issues in the network core throughout the day," but he did not say what caused the traffic problem or whether it was due to a T-Mobile mistake. We asked T-Mobile to explain the outage cause and will update this article if we get a response.

T-Mobile President of Technology Neville Ray described the problem as "a voice and data issue that has been affecting customers around the country" and said T-Mobile engineers were working to fix it.

(Update: Ray provided some detail on the outage cause about eight hours after this article published, saying that "the trigger event is known to be a leased fiber circuit failure from a third party provider in the Southeast. This is something that happens on every mobile network, so we’ve worked with our vendors to build redundancy and resiliency to make sure that these types of circuit failures don’t affect customers. This redundancy failed us and resulted in an overload situation that was then compounded by other factors." The overload then caused an "IP traffic storm that spread from the Southeast to create significant capacity issues across the IMS (IP Multimedia Subsystem) core network that supports VoLTE calls." To prevent recurrences, T-Mobile said it has "worked with our IMS and IP vendors to add permanent additional safeguards to prevent this from happening again and we're continuing to work on determining the cause of the initial overload failure.")

The T-Mobile outage was so large that it apparently caused some people to think other carriers and websites were down, too. Business Insider wrote that "Downdetector and customers on social media reported that AT&T and Verizon service was down," but both AT&T and Verizon said their networks were doing fine. "A Verizon spokesperson also told Business Insider the carrier was 'operating at normal service levels' and said that, given that 'another national carrier' was having issues, calls to and from that carrier might get an error message, resulting in reports of issues," the article said.

Prince wrote that the phone-service outage "caused a lot of T-Mobile users to complain on Twitter and other forums that they weren't able to reach popular services." Downdetector, an outage-monitoring website, "scrapes Twitter" for such reports and consequently "report[ed] those services as being offline" even though they weren't, he wrote. This contributed to the spread of rumors about a "massive DDoS attack" that also did not happen, he wrote.
Don’t expect much FCC action

Mobile voice services like T-Mobile's are still classified as common-carrier services under Title II of the Communications Act, but the FCC under Pai deregulated the home and mobile broadband industry and has taken a hands-off approach to ensuring resiliency in phone networks.

"This is, once again, where pretending that broadband is not an essential telecommunications service completely undermines the FCC's ability to act," longtime telecom attorney and consumer advocate Harold Feld, the senior VP of advocacy group Public Knowledge, told Ars today. "We're not talking about an assumption that T-Mobile necessarily did anything wrong. But when we have something this critical to the economy, and where it is literally life and death for people to have the service work reliably, it's not about 'trusting the market' or expecting companies to be on their best behavior. We as a country need to know what is the reality of our broadband networks, the reality of their resilience and reliability, and the reality of what happens when things go wrong. That takes a regulator with real authority to go in, ask hard questions, seize documents if necessary, and compel testimony under oath."

Several provisions of Title II common-carrier rules that Pai has fought against "give the FCC authority to make sure the network is resilient and reliable," Feld said. The FCC gutting its own authority "influences how the FCC conducts its investigations," he said. "[FCC] staff and the carriers know very well that if push comes to shove, companies can simply refuse to give the FCC information that might be too embarrassing. So the FCC is stuck now playing this game where they know they can't push too hard or they get their bluff called. Carriers have incentive to play along enough to keep the FCC or Congress from re-regulating, but at the end of the day it's the carriers—not the FCC—that gets to decide how much information to turn over."
https://arstechnica.com/tech-policy/...it-pai-is-mad/





FCC Has 'Serious Doubts' SpaceX Starlink Can Deliver a Low Latency Internet Service

The Federal Communications Commission doesn't believe low-Earth orbit satellite providers can deliver internet service while keeping latency below 100ms.
Stephanie Mlot

The Federal Communications Commission has "serious doubts" that low-Earth orbit (LEO) satellite providers—including SpaceX—can deliver internet service while keeping latency under 100 milliseconds.

In a lengthy report on the Rural Digital Opportunity Fund phase I auction (scheduled for late October), the FCC confirmed that while LEO companies are allowed to apply for rural broadband funding as low-latency providers, they should expect a fight. "Short-form applicants seeking to bid as a low-latency provider using low-Earth orbit satellite networks will face a substantial challenge demonstrating to Commission staff that their networks can deliver real-world performance to consumers below the Commission's 100 ms low-latency threshold," the FCC said.

SpaceX has been hogging the headlines lately, over the weekend completing yet another Starlink mission, launching 58 branded satellites into orbit. Elon Musk's rocket firm plans to provide internet service in the northern US and Canada by year's end, with near-global coverage in 2021.

The company is confident its satellites' roundtrip latency is up to par for consumers and the Federal Communications Commission. "SpaceX explained that its system easily clears the Commission's 100 ms threshold for low-latency services, even including its 'processing time' during unrealistic worst-case situations," David Goldman, director of satellite policy, wrote in a May 29 letter to the FCC.

Still, SpaceX has a lot to prove and little time to do it. While the firm has launched nearly 500 satellites, it isn't yet offering a commercial service. And companies must submit auction applications by July 15—one month from today.

"The record demonstrates significant concern regarding applicants that propose to use technologies that have not been widely deployed to offer services at high speeds or low latency, or have not been deployed at all on a commercial basis to retail consumers," the FCC said. "Auction 904 is not the appropriate venue to test unproven technologies using universal service support."

SpaceX last year earned regulatory approval to develop a satellite constellation that offers a low-cost, high-performance solution to providing fast internet access. Called Starlink, it will eventually consist of close to 12,000 satellites spread across multiple orbits. Musk recently hinted at a private beta launch this summer, with a public beta by the end of the year, "starting with high altitudes," which, he confirmed, could include the German market.
https://www.pcmag.com/news/fcc-has-s...tency-internet





SpaceX Starlink Internet Prepares for Beta Users

Thanks to its latest launch, SpaceX now has over 500 Starlink internet satellites in orbit. With this advance, the company is now inviting people to apply to become Starlink beta testers.
Steven J. Vaughan-Nichols

Sick of being stuck with low internet speeds out in the sticks? Want to free yourself from your low-quality rural cable, DSL, or -- you poor devil -- modem charges and speeds? SpaceX is finally cracking open the door to its Starlink low-earth orbit (LEO) internet service.

The June 13 launch carried 58 Starlink satellites into space. With this latest launch, there are now 540 Starlink satellites in orbit.

According to SpaceX founder and CEO Elon Musk, SpaceX needs about 400 Starlink satellites to provide "minor" coverage and 800 for "moderate" coverage. The initial Starlink mega-constellation will have 12,000 satellites. But that's far from the end. In late May, SpaceX applied to the FCC to launch as many as 30,000 Starlink satellites.

Still, 540 satellites is enough for SpaceX to start inviting users to apply to become beta testers. The website now invites you to "Get updates on Starlink news and service availability in your area" by filling out a form with an email address and zip code. The form allows prospective customers to sign up for updates and access to a public beta test of the Starlink service.

Once you do, you'll get an e-mail reading:

Starlink is designed to deliver high-speed broadband internet to locations where access has been unreliable, expensive, or completely unavailable. Private beta testing is expected to begin later this summer, followed by public beta testing, starting with higher latitudes.

If you provided us with your zip code, you will be notified via email if beta testing opportunities become available in your area. In the meantime, we will continue to share with you updates about general service availability and upcoming Starlink launches.


By the time the beta is open, late-summer or early fall, there may well be about 800 satellites ready to deliver broadband internet to Americans living in the northern contiguous states. We don't know how far south the initial beta coverage area extends.

If you're accepted for the beta, you can expect to get a user terminal with a flat disc antenna, which measures 0.48 meters in diameter. Musk describes these as looking like a "little UFO on a stick." The terminal's antenna will aim itself at the best satellite signals.

According to SpaceX, Starlink will offer speeds of up to a gigabit per second at latencies from 25 milliseconds to 35 milliseconds. That's much faster than old-school satellites. HughesNet, the grandpa of satellite Internet, offers download speeds up to 25Mbps and upload speeds up to 3Mbps.

SpaceX has yet to release data on its upload speeds, but my best guess is it will be far slower than its 1Gbps download speed. I strongly suspect it won't be able to do much better than a geostationary satellite's 3Mbps. Still, that's a usable speed.

When it comes to latency, SpaceX has the older satellite Internet services beat all hollow. HughesNet has a latency of over 500 milliseconds. That's a half-second in people time. Starlink promises to have a latency of between 15ms and 25ms. Good Earth-bound broadband gives you latency of about 8ms to 20ms. This low latency will enable you to run video-conferencing and play high-end video games.
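The latency gap between old-school and LEO satellite service follows directly from orbital altitude. A back-of-the-envelope calculation (an illustrative sketch only, assuming ideal straight-overhead paths at the speed of light and ignoring all processing and ground-network delay) shows why geostationary service can never get much below half a second while a low orbit can:

```python
# Rough minimum round-trip time for satellite internet.
# Illustrative physics only -- real networks add processing,
# queuing, and terrestrial backhaul delay on top of this floor.
C_KM_S = 299_792  # speed of light in vacuum, km/s

def min_rtt_ms(altitude_km: float) -> float:
    """Minimum round trip: user -> satellite -> ground station -> back,
    i.e. four one-way hops at the satellite's altitude, straight overhead."""
    return 4 * altitude_km / C_KM_S * 1000

geo = min_rtt_ms(35_786)  # geostationary orbit (HughesNet-style service)
leo = min_rtt_ms(550)     # approximate altitude of Starlink's first shell

print(f"GEO floor: {geo:.0f} ms, LEO floor: {leo:.0f} ms")
# GEO comes out near 480 ms -- so HughesNet's 500+ ms is mostly physics.
# LEO comes out under 10 ms, leaving room for Starlink's promised 15-25 ms
# even after real-world overhead.
```

The altitudes are the standard published figures; the function and its name are just for this sketch.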

More recently, Musk tweeted that initially you can expect "Around 20ms. It's designed to run real-time, competitive video games. Version 2, which is at lower altitude could be as low as 8ms latency."

If all goes well with the beta, Musk has said Starlink will cost about $80 per month. It won't be available everywhere, though, at first. The US and Canada will get the service first. Eventually, it will be global.

However, if you're already living in a city or suburb where you can get 100Mbps internet service, odds are you may not be eligible for Starlink. As SpaceX tweeted: "Starlink's [goal] will deliver high-speed broadband Internet to locations where access has been unreliable, expensive, or completely unavailable."

Starlink is not looking to compete with your local ISP. Or, at least not yet anyway. Stay tuned.
https://www.zdnet.com/article/spacex...or-beta-users/





Meet the Big Tech Critic Behind Hey, Basecamp’s Radical New Email Platform

The Basecamp cofounder David Heinemeier Hansson has become increasingly outspoken about Big Tech’s privacy violations and monopolistic tendencies. Now he’s inviting you to join the cause—by switching your email provider.
Ainsley Harris

Silicon Valley has plenty of critics, but not many of them are tech company founders. Fewer still regularly tweet about breaking up Google, quitting Facebook, and holding Tesla to account for its Autopilot failures.

No wonder David Heinemeier Hansson, cofounder and CTO of Basecamp and creator of web application framework Ruby on Rails, has over 400,000 Twitter followers.

“A handful of companies have colonized the web, and they’re choking it,” he says. Don’t get him started on venture capital firms.

Two years ago, he and fellow Basecamp cofounder and CEO Jason Fried decided to do something about it. The culmination of that work is a paid, $99-per-year email service called Hey, which launches today. Along with protecting users from the types of invasive surveillance tactics that have become de rigueur online, Hey also contains some radical ideas about the way that modern correspondence should work. Silicon Valley will be watching the product closely: Consumers like to say they value their privacy, but are they finally willing to pay for it?

From tech founder to tech critic

Until recently, Hansson and Fried were mostly content to nurture their profitable project management software business, which has existed in some form since 1999, and write books advocating for their ideas on topics like remote work and productivity. It’s not a bad life: Hansson lives in Malibu with his family and makes time for a serious car racing hobby (he won his class at Le Mans in 2014).

For Hansson, comfort with those rhythms started to change after the 2016 election, during which political data firm Cambridge Analytica provided the Trump campaign with private information about Facebook users, obtained without their consent, in order to assist the campaign with voter targeting. The episode, which a whistleblower came forward to publicly describe in March of 2018, revealed the dark underbelly of Facebook’s highly profitable ad targeting machine.

“It wasn’t Cambridge Analytica in and of itself, it was the Pandora’s box it opened in terms of laying bare the way the industry actually worked,” Hansson says.

He started to do his homework on ad tech and privacy, and was increasingly horrified by what he found. In December of 2018, after running a series of targeted ads on Facebook and feeling “disgusted” by how invasive it could be, he announced that Basecamp was becoming a “Facebook-free” business: no Facebook ads, no presence on any of Facebook’s platforms, and no Facebook-based login. In January of this year, he testified before Congress in a hearing on online platforms and market power alongside three other technology business leaders (including the CEO of Sonos, who is locked in a legal battle with Google).

“At first I was a little cynical, to be honest, that I was just going to show up and deliver my little speech,” Hansson says. “When you’re on Twitter a lot, you’re in the bubble of thinking that everyone is paying attention to the same things you know. Then you get into the real world. It felt like we were delivering new information to Congress.”

He walked away from the experience with a newfound sense of purpose. “We need stringent and overdue legislation, we need to break up monopolies, and we need to give consumers better alternatives,” he says. That last piece became a personal and professional refrain: “Abstinence-only advocacy doesn’t work. Effective advocacy needs to present compelling alternatives.”

A complete reimagining of email

Most people haven’t tried a new email service since Gmail launched 16 years ago, if not earlier. A handful of startups have played around with email interfaces in the years since, trying to make the experience cleaner and mobile-friendly, but no one has touched concepts as foundational as the inbox itself.

Hansson and Fried argue that now is the time to do just that. They have made several radical changes to the inbox, the most glaring of which is that you, the email recipient, have control over who is allowed to appear there. That means you screen all first-time senders.

They’ve also separated out what they call “The Feed” and the “Paper Trail,” so that there are distinct places for emails like newsletters and shipping confirmation notices. Because The Feed requires opt-in confirmation, it’s much more pleasant to browse than Gmail’s cluttered Promotions tab. It’s also more private: Hey strips incoming messages of the tracking tools known as spy pixels that have become common practice in many emails. (The service indicates any emails that originally had tracking capabilities by displaying a small binoculars icon next to them.)
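
Hey's exact filtering is proprietary, but the basic idea of spotting spy pixels can be sketched with Python's standard-library HTML parser. The "1x1 image" heuristic below is an illustrative assumption, not Hey's actual rule; real trackers are also identified by known tracking domains.

```python
from html.parser import HTMLParser

class PixelFinder(HTMLParser):
    """Collects the src of <img> tags that look like 1x1 tracking pixels."""

    def __init__(self):
        super().__init__()
        self.trackers = []

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        a = dict(attrs)
        # Heuristic (an assumption for this sketch): a 1x1 image is
        # almost certainly a tracker, not real content.
        if a.get("width") == "1" and a.get("height") == "1":
            self.trackers.append(a.get("src"))

def find_tracking_pixels(html: str) -> list:
    """Return the src attributes of suspected spy pixels in an email body."""
    finder = PixelFinder()
    finder.feed(html)
    return finder.trackers
```

A mail client using this approach would then strip the flagged tags before rendering, so the tracking request is never made.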

Overall, the idea behind Hey and its $99 annual price tag is to make email peaceful, rather than distracting. In my testing of a beta version, Hey largely fulfills that promise. If anything is lacking, it’s an integrated calendar. I didn’t mind the extra upfront work associated with prescreening; thanks to smart design choices, the payoff feels worthwhile. I also appreciated that my “human intelligence” (such as it is) obviated the need for a black-box algorithm to do the work of sorting on my behalf.

For Hansson, that is the essential difference between Hey and its competitors. “Email is getting turned into the [Facebook] News Feed. It’s an editorialized inbox that the likes of Google have designed for you,” he says. Reporting by The Markup earlier this year, for example, found that some Democratic presidential candidates’ email updates were getting sent to spam, while others were reliably claiming prime inbox territory. “It’s the same power that Facebook sits on; it’s their moral prerogative whether to use it. I don’t think that’s good. I think it’s completely dystopian.

“We’re turning that on its head,” he adds. “The only curation in your inbox is what you do through your choices.”

So far, over 40,000 people have joined the Hey wait list. Access to the service starts today, on a rolling basis: “200,000 customers would be an overwhelming success for us, but would also be a drop in the bucket,” Hansson says. (Basecamp has grown over the years to serve 3.3 million users.)

Hansson’s goal, if anything, is to generate enough buzz and momentum to put real pressure on email’s entrenched market leaders, and the technology industry more broadly. “We can’t rely on the people who created the problems to solve the problems,” he says. “We have to look for solutions outside of Silicon Valley.”
https://www.fastcompany.com/90516667...email-platform





Ripple20 Vulnerabilities will Haunt the IoT Landscape for Years to Come

Security researchers disclose 19 vulnerabilities impacting a TCP/IP library found at the base of many IoT products.
Catalin Cimpanu

Cyber-security experts have today revealed 19 vulnerabilities in a small library designed in the 1990s that has been widely used and integrated into countless enterprise and consumer-grade products over the last 20+ years.

Affected products include smart home devices, power grid equipment, healthcare systems, industrial gear, transportation systems, printers, routers, mobile/satellite communications equipment, data center devices, commercial aircraft devices, various enterprise solutions, and many others.

Experts now fear that all products using this library will most likely remain unpatched due to complex or untracked software supply chains.

Problems arise from the fact that the library was not only used by equipment vendors directly but also integrated into other software suites, which means that many companies aren't even aware that they're using this particular piece of code, and the name of the vulnerable library doesn't appear in their code manifests.

The Ripple20 vulnerabilities

These vulnerabilities -- collectively referred to as Ripple20 -- impact a small library developed by Cincinnati-based software company Treck.

The library, believed to have been first released in 1997, implements a lightweight TCP/IP stack. Companies have been using this library for decades to allow their devices or software to connect to the internet via TCP/IP connections.

Since September 2019, researchers from JSOF, a small boutique cyber consultancy firm located in Jerusalem, Israel, have been looking at Treck's TCP/IP stack, due to its broad footprint across the industrial, healthcare, and smart device market.

Their work unearthed serious vulnerabilities, and the JSOF team has been working with CERTs (computer emergency response teams) in different countries to coordinate the vulnerability disclosure and patching process.

In an interview with ZDNet last week, JSOF said this operation involved a lot of work and different steps, such as getting Treck on board, making sure Treck has patches on time, and then finding all the vulnerable equipment and reaching out to each of the impacted vendors.

Efforts have been successful, Shlomi Oberman, chief executive officer at JSOF, told ZDNet. Oberman credited CERT/CC for playing a major role in coordinating the vulnerability disclosure process with all impacted vendors.

Treck, initially reticent and suspicious that it was the target of an extortion attempt, is now fully on board, Oberman said.

In an email to ZDNet on Monday, Treck confirmed that patches are now available for all the Ripple20 vulnerabilities.

Work on Ripple20 only halfway done

But JSOF said the work on identifying all the vulnerable devices is not yet done. The researchers said they named the 19 vulnerabilities Ripple20 not because there were 20 of them at the outset, but because of the ripple effect they'll cause in the IoT landscape in 2020 and the years to come.

Researchers say they only scratched the surface when it comes to discovering all the devices that have implemented Treck's TCP/IP library, and that many equipment vendors will need to verify their own code going forward.

Oberman said that while not all of the Ripple20 vulnerabilities are severe, there are a few that are extremely dangerous, allowing attackers to take over vulnerable systems from a "remote" scenario.

In a security advisory that goes live today, reviewed by ZDNet under embargo, the US Department of Homeland Security has assigned four of the Ripple20 vulnerabilities ratings of 10 and 9.8 on the CVSSv3 vulnerability severity scale (which runs from 0 to 10). These are:

• CVE-2020-11896 - CVSSv3 score: 10 - Improper handling of length parameter inconsistency in the IPv4/UDP component when handling a packet sent by an unauthorized network attacker. This vulnerability may result in remote code execution.
• CVE-2020-11897 - CVSSv3 score: 10 - Improper handling of length parameter inconsistency in the IPv6 component when handling a packet sent by an unauthorized network attacker. This vulnerability may result in an out-of-bounds write.
• CVE-2020-11898 - CVSSv3 score: 9.8 - Improper handling of length parameter inconsistency in the IPv4/ICMPv4 component when handling a packet sent by an unauthorized network attacker. This vulnerability may result in exposure of sensitive information.
• CVE-2020-11899 - CVSSv3 score: 9.8 - Improper input validation in the IPv6 component when handling a packet sent by an unauthorized network attacker. This vulnerability may result in exposure of sensitive information.
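
The first two entries are classic length-parameter inconsistency bugs: the parser trusts a length field inside the packet rather than the packet's real size. In C, that mismatch can drive an out-of-bounds read or write. A minimal sketch of the missing sanity check, using a toy two-byte length header rather than Treck's actual wire formats:

```python
import struct

def parse_packet(buf: bytes) -> bytes:
    """Parse a toy header: 2-byte big-endian payload length, then payload.

    Rejects packets whose declared length disagrees with the actual
    buffer size -- the class of check whose absence underlies several
    Ripple20 bugs. (Illustrative format only, not Treck's code.)
    """
    if len(buf) < 2:
        raise ValueError("truncated header")
    (declared,) = struct.unpack_from(">H", buf, 0)
    payload = buf[2:]
    if declared != len(payload):
        # An attacker controls 'declared'; trusting it blindly is the bug.
        raise ValueError("length field inconsistent with packet size")
    return payload
```

In memory-safe Python the worst outcome of skipping the check is wrong data; in the C code of an embedded TCP/IP stack, the same omission can corrupt memory and lead to remote code execution.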

These four vulnerabilities, when weaponized, could allow attackers to easily take over smart devices or any industrial or healthcare equipment. Attacks are possible via the internet if the devices are connected online, or from local networks if the attacker gains a foothold on an internal network (for example, via a compromised router).

These four vulnerabilities are ideal both for botnet operators and for targeted attacks. Testing all systems for the Ripple20 vulnerabilities and patching these four issues, in particular, should be a priority for all companies, primarily due to Treck's large footprint across the software landscape.

The impact of the Ripple20 vulnerabilities is currently expected to be on par with the Urgent/11 vulnerabilities disclosed in July 2019, which are still being investigated to this day, with new vulnerable devices being found and patched on a regular basis. The comparison is not accidental: Urgent/11 impacted the TCP/IP (IPnet) networking stack of the VxWorks real-time operating system, another product widely used in the IoT and industrial landscape.

Just like in the case of Urgent/11, some products will remain unpatched, as some have gone end-of-life or their vendors have shut down operations in the meantime.

JSOF has been invited to speak about these vulnerabilities at the Black Hat USA 2020 security conference.
https://www.zdnet.com/article/ripple...years-to-come/





Email the Most Common Method of File Sharing, Despite Risks

Email among the most popular targets for cybercriminals.
Sead Fadilpašić

UK consumers use email as their primary method of file sharing, despite the associated cybersecurity risks. This is according to a new report from NordLocker, which claims that 56 percent of UK users lean on email to send and receive files.

Based on a poll of 1,400 consumers in both the UK and the US, the report found consumers are aware of the importance of safeguarding data.

Respondents equated data exposure with losing a wallet or personal documents, or with returning home to find the front door open. Further, having an email password exposed was said to be as damaging as losing a job or getting injured.

Still, half of consumers share devices with other users, and just one in ten use encryption to protect their files.

Email is among the most popular targets for cyberattacks and, as NordLocker encryption specialist Oliver Noble notes, “if your email gets hacked, all of your attachments, such as sensitive documents or private photos, can fall into the hands of criminals.”

The poll found that more than half of UK consumers (55 percent) have fallen victim to a cyberattack at least once. More than a third of users have owned a device infected with a virus, and a fifth have clicked on a link in a fraudulent email.
https://www.itproportal.com/news/ema...despite-risks/





Inside the Underground Trade of Pirated OnlyFans Porn

Motherboard investigated the ways people download subscriber-only content in bulk and repost it for free or for profit around the internet.
Samantha Cole, Joseph Cox

In late February, rumors of a massive database of adult content stolen from OnlyFans subscription-only accounts spread through social media. At the time, both the people spreading it and the many models fearing the hundreds of gigabytes of material called it the "OnlyFans leak."

It wasn't a security breach of the OnlyFans platform, which hosts creators' content for subscribers who pay on average $5 a month per subscription. It wasn't a hack or a leak at all, but rather someone simply downloading and dumping stolen content from hundreds of models into a hosting service, and sharing the download link.

Taking content from OnlyFans accounts manually, by downloading videos one by one, would be laborious and time-consuming. But a user with some basic technical knowledge can automate this process and collect content by using a tool called a "scraper." This software churns through the OnlyFans site and downloads whatever videos and photos a user has access to.

Any subscriber could run one of these programs and locally download all of the content a model has behind a paywall, then re-host it on free sites, resell it, or in the worst cases, use it to dox and harass models.

A Motherboard investigation has uncovered an entire supply chain of people using scraping programs to steal sex workers' labor without permission, in some cases by the hundreds of terabytes, and to distribute it on other adult sites or sell scraping services through Discord.

Following the OnlyFans "leak," Motherboard spoke with several performers and owners of premium adult content platforms about the problem of scraping. They agreed that it's an issue, and that casual consumers of porn don't realize that some of the videos they watch on tube sites for free are actually content stolen from OnlyFans and other subscription sites.

Motherboard has found a wide range of tools online that let people easily download and store OnlyFans content, whether the original creator likes it or not. Some users claim to be scraping huge volumes of content, which they say is then resold without disclosing that it originated from OnlyFans. Motherboard also gained access to Discord servers where administrators explicitly offer a paid service to scrape OnlyFans profiles.

"Some people own server farms that download terabytes of data every day from OF [OnlyFans]," a developer of one of the OnlyFans scrapers, who we've declined to name in this piece to limit access to their tools, told Motherboard in an email.

Some of these tools were Chrome browser extensions that give users the option to download content directly from the OnlyFans site. Google removed an OnlyFans scraping Chrome extension when approached for comment by Motherboard.

But the types of tools varied, often requiring some coding knowledge to operate. Motherboard used some of these tools to verify that they work.

One of the people doing the scraping said they made their own OnlyFans scraper because they collect images, videos, and text from the internet. Motherboard previously covered how so-called data hoarders trade and accumulate passwords, names, and other personal information from data breaches like any other sort of collectible. They differ slightly from archivists, who preserve digital information at risk of disappearing, like writing or art. They said they also worked more on their own script when another tool made by someone else stopped working.

"It’s very disappointing that some people refuse to recognize our work as something valuable"

They said this first tool stopped working when OnlyFans required users to run JavaScript in their browser to view content. "I believe [this] was the first step to protect their models from being scraped easily, but they ended up making the site easier to scrape since I found their API," they added.

But these tools are not only for people who save OnlyFans content for their personal use. The scraper said they develop their script for others' benefit too, and pointed to an underground trade in OnlyFans material where people monetize the stolen content.

"Some people own adult websites that use my scripts to download and upload content," they said.

Another data collector who used the moniker DHRB told Motherboard in an online chat that they use another script to download content from OnlyFans accounts when they are running promotions, meaning that a user can temporarily subscribe for free. In that small window, the script then grabs all the content it can. DHRB referred to the technique as "timed promotion sniping."

"We've fully scraped accounts that have thousands of videos. We don't compress anything either since we prefer quality over storage space. Literally everything gets scraped. Images, videos, audio and text," DHRB said.

DHRB described what appears to be a supply chain of OnlyFans content, with material being sold from one person to another, and one that original creators may be unaware that their content is ending up in.

"The data I scrape is resold to a few clients who either own adult websites that host pirated content or people who resell content on Discord. I only handle OnlyFans though," DHRB said. "It really is just going down a rabbit hole. One person sells to another, and then that person sells to another, and so on."

DHRB declined to name the adult sites that buy the content they scrape. "Those sites don't say they get content from me or my partner though. It's better that they claim it as their own so they can build trust with their user base," they said.

But the content isn't just being reposted on difficult-to-access forums or sites casual consumers haven't heard of; it's all over free tube sites like Pornhub, YouPorn, XVideos, and xHamster, easy to find using keyword searches of the subscription sites they're stolen from. These sites then monetize the scraped content with ads that appear next to the uploaded videos.

Pornhub did not respond to a request for comment. A spokesperson for xHamster told Motherboard that while they monitor and sometimes ban keywords for searching abusive or non-consensual material (such as "Iggy Azalea" or "R. Kelly sex tapes," they said), moderating keywords that include other premium sites is more difficult, because performers sometimes upload subscription content to the site themselves to attract more subscribers.

Beyond adult sites, multiple people run their own Discord servers where material is traded, or in some cases, where they offer to scrape OnlyFans content for a fee. One administrator advertised scraping one account for $7, five accounts for $25, or 10 accounts for $50. This is often cheaper than a user would ordinarily pay to subscribe to an OnlyFans creator every month, and the buyer then gets to keep the videos and photos on their hard drive, likely without the permission of the creator. The administrator takes payment via PayPal, Cash App, or Bitcoin, and uploads the scraped content onto a file sharing site for customers to download.

After the publication of this piece, Discord removed multiple servers and banned the owners of each, a Discord spokesperson told Motherboard in an email.

Motherboard also found accounts on Reddit that advertise cloud storage folders allegedly containing content scraped from specific performers. When approached for comment, Reddit directed Motherboard to its user agreement, which states that it expects users to respect intellectual property and that it bans repeat copyright infringers. To have infringing content removed, copyright holders must first contact Reddit and file a takedown request.

The Discord administrator also sells OnlyFans user accounts that come pre-loaded with credits, which buyers can then use to purchase other content on the OnlyFans site, such as private videos that are not on a creator's public feed. Judging by messages in the Discord, it appears either these are hacked OnlyFans accounts or the administrator is loading them with credits via stolen payment information.

The person scraping said they believe these Discord administrators are using the script they created, as the resulting folder structure of scraped material is the same.

Last year, professional dominatrix Mistress Harley discovered over 500 items she'd posted for sale for about $10 each had been reposted in full to a site dedicated to reposting scraped adult content. She originally sold the videos on ManyVids, a site where performers can sell individual clips.

"Many pirates will subscribe for one month and then rip all the content they can find, in some cases issuing a credit card fraud chargeback for the one month that they subscribed for in order to steal all your content," Harley told Motherboard. "If you know that people would rather steal from you than pay for your content and encourage you to keep making more content, it does reduce the work I'm willing to put into content."

"Stolen content reposted on free tube sites usually makes your content less desirable and forces you to continuously step your game up to satisfy your clientele and that can likely lead to a common 'burnout' a lot faster," Romi Chase told Motherboard. "Other than that, it’s just completely wrong especially that many times because of the free porn, men seem to see little to no value in our work."

Despite looking effortless in the finished product, it can take six to seven hours to film a full-length video, Chase said—time spent planning, prepping, filming and editing.

"It’s very disappointing that some people refuse to recognize our work as something valuable while in fact we provide a type of service just like any other worker," she said.

Chase said she believes OnlyFans is doing a good job of protecting creators by requiring users to agree not to repost content viewed or purchased on the platform as part of its terms of service, offering watermarking services to trace stolen content, and not allowing subscribers to download directly from within the platform.

Romi Rain, another performer who sells content on subscription sites, said that even with rampant content theft (she found a video ripped from her OnlyFans reposted for free to Pornhub as we were talking) premium sites have been a huge improvement over the centralized studio system.

"Performers have more control over their content than ever and DEFINITELY make more money from it," Rain said. "Ironically the pandemic really spawned confidence in the content revolution in porn. The safety net of knowing you wouldn’t immediately go broke if you spoke out about something or stopped shooting five scenes a week with 12 hour days for a surprisingly low fee, has been everything."

"There’s a social bias against sex workers that’s made this more permissible"

On a technical level, OnlyFans is not stopping or tangibly slowing down scraping, however. Clearly the scraping tools work reliably enough for people to use them at scale.

Larger sites such as Facebook and LinkedIn aggressively try to stamp out scraping with both technical and legal measures, but making any site invulnerable to scraping is difficult. A spokesperson for OnlyFans did not answer specific questions on what technical anti-scraping measures it takes, but the spokesperson told Motherboard that the platform has a dedicated anti-piracy team that issues DMCA takedowns on behalf of its creators.

"This procedure is inclusive of all required notices to move any infringement up to litigation if target websites refuse to comply. OnlyFans also notifies the offending domain registrars and hosting services as well as reporting to all major search engines," the spokesperson said. "With a duty to help battle against illegal piracy, OnlyFans is firmly in the fight to protect user content. Takedown success rates this year have been over 75 percent across offending image hosting sites, torrent providers, and cyber lockers."

OnlyFans isn't the only platform with a content scraping problem, and it's also not the only platform claiming it has robust anti-piracy policies in place. Tube sites also have terms of service clauses that forbid users from uploading content they don't own, but stolen and copyrighted content has been a widespread problem on those platforms for as long as they've existed, and they often put the responsibility of getting content removed on the original creator.

But with OnlyFans' sudden rise in popularity during the coronavirus pandemic, it's become an even more lucrative target for content thieves who get around those policies using scraping tools.

Other platforms Motherboard talked to are also grappling with how to prevent theft.

Dominic Ford, founder and CEO of premium adult content site JustForFans, said his platform uses a two-pronged approach to piracy. Models can report piracy to his anti-piracy company Porn Guardian, which works to get stolen content removed from offending sites and collects legal damages. JustForFans also uses proprietary fingerprinting technology, which Ford says is embedded in content to identify who is streaming videos.

"If that video then gets uploaded somewhere else, we can identify who on our site was the original pirate. We can then shut them down and help our model pursue them legally if they so choose," Ford said.

Bella French, founder and CEO of ManyVids, said that her platform employs a team to handle theft and fraud, and also uses a combination of third-party anti-piracy companies to automatically identify and remove stolen content, as well as using unique stream links for each user, generated once the user’s permission to view models' content is validated.

But French also acknowledges that the problem of theft is as much an issue with sex worker stigma as it is technological.

"We are making every effort available to us at ManyVids to do what we can to protect the content creators and minimize the risk of piracy but I don’t think we can rely on technology as the panacea in this case," French said. There must be a "sea change," she said, in how people perceive the work of models, and how they consume and pay for their porn.

"This is an industry filled with hard-working people that need to generate income to be able to survive and ideally thrive," she said. "It’s a message we must get across and the entire industry must come together on this if we stand a chance of removing the stain of piracy from our industry for good."

At another premium content platform, FanCentro, vice president Kat Revenga told Motherboard that content theft is one of the biggest challenges creators face.

"There’s a social bias against sex workers that’s made this more permissible," Revenga said. "Those stealing the content feel that it’s their right to take it, that the creators deserve the violation by virtue of the work they do."

When creators alert them of content that's been stolen, FanCentro investigates the claim and helps them get it taken down. Following the OnlyFans "leak," the platform offered anyone affected who signed up for FanCentro free DMCA protection.

"Platforms, especially ones whose success is built on the work of sex workers, need to step up," Revenga said.

OnlyFans subscriptions range from $4.99 to $49.99 a month—on average, they cost less than a streaming service like Netflix or Spotify—but some people still look for ways to get that content for free, downloading it from sites where thieves post content for sale or at no charge.

Motherboard asked one of the scrapers we talked to whether they've considered content theft from the models' perspectives.

"I'm going to be truthful and say while I do understand where they're coming from, I also like keeping the content I paid for and many others do too. This is the most common justification for the script," they said.

"[Content scraper tools] are always going to be around, people are going to record/download your content, send it to their friends, or just leak it on the internet just to spite you. The best thing you can do is to get DMCA protection to guard your content," they said.

One of the people doing the scraping Motherboard spoke to blames the models for this outcome—not the thieves.

“Before you decide to put your face on the adult side of the internet where your main audience is lonely men, you’ll need to consider the fact that people are actively scraping your content to build a database of faces so they can cross-reference images on other social media sites," they said.

Many adult sites, including OnlyFans, allow models to set up geo-blocking to prevent people they know in real life within a certain region from finding their content. Once it's scraped and reposted or resold elsewhere, however, that protection goes away. Anyone can see a video, and once it goes viral, there's no telling where it will end up online. The repercussions of adult content going viral within a model's community can be deeply damaging and, in the worst cases, deadly.

But the biggest concern creators have when their content is stolen is still financial. As platforms like OnlyFans change their policies to cut into their income even more, performers have to work even harder to make a living that outpaces the theft. Until consumers treat sex workers' labor as valuable, the demand for free content will continue.

"When you steal or view stolen content, you’re literally taking away someone’s income," Revenga said. "Their ability to pay rent, buy groceries or pay for education... Influencers depend on platforms to keep their content safe, and we need to take the appropriate precautions to prevent being part of the problem."

"In general, piracy affects models very directly. Models make their money directly from consumers, and conversely, piracy is directly stealing from models," Ford said. "It was bad enough when studios suffered, but pirates think studios are all-powerful and wealthy and wouldn’t feel it. This wasn’t true then and isn’t true now. And it’s even worse with stealing fan content, because users know this is money coming directly out of the hands of models. That makes this crime more cruel and personal."
https://www.vice.com/en_us/article/5...n-scraper-leak

















Until next week,

- js.



















Current Week In Review





Recent WiRs -

June 13th, June 6th, May 30th, May 23rd

Jack Spratts' Week In Review is published every Friday. Submit letters, articles, press releases, comments, questions etc. in plain text English to jackspratts (at) lycos (dot) com. Submission deadlines are Thursdays @ 1400 UTC. Please include contact info. The right to publish all remarks is reserved.


"The First Amendment rests on the assumption that the widest possible dissemination of information from diverse and antagonistic sources is essential to the welfare of the public."
- Hugo Black
__________________
Thanks For Sharing