P2P-Zone  

JackSpratts

Peer-To-Peer News - The Week In Review - April 18th, ’20

Since 2002

April 18th, 2020




This Group Is Pirating Medical Device Standards and Sharing Them With DIY Projects

Licensing medical technical standards costs hundreds of dollars. The Human Standards Project is sharing them all in a huge torrent to fight the coronavirus.
Matthew Gault

A pandemic is raging and America is short on medical supplies. New York City plans to start seizing unused ventilators, medical professionals are sharing a hack that can double ventilator capacity, and medical device manufacturers are fighting with the people trying to make it easier to repair the machines.

Now, a decentralized group of data hoarders called the Human Standards Project is pirating (where needed), collating, and otherwise making available medical information and technical standards that could help people working on solutions for the coronavirus pandemic.

Technical standards are specifications for services, systems, and equipment that ensure quality, safety, compatibility, and efficiency in manufacturing. Typically, people have to purchase technical standards from one of the private firms that create them in order to view them.

ASTM International is one of the larger firms that standardize medical equipment. Its 2020 book of information about medical and surgical equipment costs $314 for a print volume or $828 for a license for multiple people to view it online. ASTM International has publicly shared its standards for medical gowns, masks, gloves, hand sanitizers, and respirators. But it’s not the only firm setting standards, and much remains behind a paywall. The Human Standards Project is gathering decades' worth of information from ASTM International and others and making it available online, for free. Technical-standard information like this will help the DIY community build its own medical equipment safely.

“These peer-to-peer collections include over 50,000 international technical standards to aid those innovators and creators in their efforts,” user shrine, who is working on the project, said in a post on Reddit. “The Standards help these teams ensure their work meets internationally agreed-upon specifications for product safety. Access to Standards means the difference between life and death in the developing world every day. In a worst-case scenario governments around the world may be forced to consult these volunteer DIY projects amidst global shortages. We must prepare for the worst-case scenario.”

The Human Standards Project is sharing its 75GB collection of technical standards via IPFS, a distributed peer-to-peer network built to share data. It’s also available as a torrent. The project needs help too. Its collection of data is incomplete and it’s hunting for specific information about medical equipment. You can join the hunt here.
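
Because the collection is shared over IPFS, it is addressed by content hash and can be fetched from any public gateway or re-hosted ("pinned") by anyone. Here is a minimal Python sketch of pulling one file through the ipfs.io gateway; the content hash and filename below are placeholders for illustration, not the project's real ones:

# Fetch a single file from an IPFS collection via a public HTTP gateway.
# CID and FILENAME are hypothetical placeholders, not the real hashes.
import requests

GATEWAY = "https://ipfs.io/ipfs"        # any public IPFS gateway works
CID = "QmExampleCollectionHash"         # placeholder content hash
FILENAME = "standards-index.pdf"        # placeholder file name

resp = requests.get(f"{GATEWAY}/{CID}/{FILENAME}", timeout=60)
resp.raise_for_status()                 # fail loudly on HTTP errors

with open(FILENAME, "wb") as f:
    f.write(resp.content)
print(f"saved {len(resp.content)} bytes to {FILENAME}")
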
https://www.vice.com/en_ca/article/y...h-diy-projects





Senator Tillis Angry At The Internet Archive For Helping People Read During A Pandemic; Archive Explains Why That's Wrong
Mike Masnick

A few weeks ago, we wrote about the misguided freakout by (mainly) publishers and some authors over the Internet Archive's decision to launch the National Emergency Library during the COVID-19 pandemic, to help all of us who are stuck at home be able to digitally access books that remain in locked libraries around the country. A key point I made in that post: most (not all, but most) of the criticisms applied to the NEL project could equally apply to regular libraries. And perhaps that's why hundreds of libraries have come out in support of the project, even as those attacking the project insist that it's not an attack on libraries.

Either way, it was only a matter of time before publishers got their lapdogs in Congress to start making noise, and first out of the gate was Senator Thom Tillis, who is already deep into his attempt to make copyright law worse, and who last week sent a letter to the Internet Archive's Brewster Kahle that reads very much like it was written by book publishers. First it gets high and mighty about how the pandemic has "shown the critical value of copyrighted works to the public interest" which is just a weird way to phrase things. The fact that something valuable is covered by copyright does not automatically mean that copyright is helpful or valuable for that situation. Then it gets to the point:

I am not aware of any measure under copyright law that permits a user of copyrighted works to unilaterally create an emergency copyright act. Indeed, I am deeply concerned that your "Library" is operating outside the boundaries of the copyright law that Congress has enacted and alone has jurisdiction to amend.

A few days later, Kahle responded in a detailed and thorough letter to Tillis. It points out that the Internet Archive is well-established and recognized by the state of California as a library, and that it has already shown that it has a legal right to digitize books. It then goes on to explain that the point of the NEL is to help Tillis' own constituents access the books their tax dollars paid for while those books sit locked up, collecting dust, inside libraries that are closed during the pandemic.

The National Emergency Library was developed to address a temporary and significant need in our communities — for the first time in our nation’s history, the entire physical library system is offline and unavailable. Your constituents have paid for millions of books they currently cannot access. According to National Public Library survey data from 2018-2019, North Carolina’s public libraries house more than fifteen million print book volumes in three-hundred twenty-three branches across the State. Because those branches are now closed and their books are unavailable, the massive public investment paid for by tax-paying citizens is unavailable to the very people who funded it. This also goes for public school libraries and academic libraries at community colleges, public colleges and universities as well. The National Emergency Library was envisioned to meet this challenge of providing digital access to print materials, helping teachers, students and communities gain access to books while their schools and libraries are closed.

It also highlights something else that many had missed: the NEL does not include any books published within the last five years -- which is pretty important, since the commercial value of a book usually exists in the first couple years after publishing. Indeed, a recent study highlighted how the vast, vast, vast majority of sales tends to come soon after a book is published and then sales decline rapidly. So the argument that the NEL is somehow taking away from author income is already somewhat questionable.

And, indeed, the Archive is currently seeing evidence that suggests the NEL is not actually impacting author earnings in any significant way:

In an early analysis of the use we are seeing what we expected: 90% of the books borrowed were published more than ten years ago, two-thirds were published during the twentieth century. The number of books being checked out and read is comparable to that of a town of about 30,000 people. Further, about 90% of people borrowing the book only looked at it for 30 minutes. These usage patterns suggest that patrons may be using the checked-out book for fact checking or research, but we suspect a large number of people are browsing the book in a way similar to browsing library shelves.

The Internet Archive has also been highlighting case studies of teachers and students helped out by the NEL.

Kahle also explains to Tillis how he's wrong to say that copyright law does not allow this kind of lending. It's called fair use.

You raise the question of how this comports with copyright law. Fortunately, we do not need an “emergency copyright act” because the fair use doctrine, codified in the Copyright Act, provides flexibility to libraries and others to adjust to changing circumstances. As a result, libraries can and are meeting the needs of their patrons during this crisis in a variety of ways. The Authors Guild, the leading critic of the National Emergency Library, has been incorrect in their assessment of the scope and flexibility of the fair use doctrine in the past and this is another instance where we respectfully disagree.

The reference regarding the Authors Guild being wrong about fair use refers to its years-long fight to stop libraries from digitizing books, which resulted in a massive loss for the Guild's ridiculous interpretation of copyright and fair use.

In the end there are a bunch of important points here: even if Tillis is right that copyright is somehow proving its value in a pandemic (and he's not), that doesn't change the simple fact that this library is enabling people who cannot check out physical books from their locked community libraries to at least be able to access those books while remaining safe at home. The Internet Archive has legal scans of these books, and hundreds of libraries are supporting this effort. While it's true, as some authors and publishers highlight, that there are official ebooks for some books, many (especially older) books do not have them at all -- and those include lots of books that are commonly read in classrooms. And, as we pointed out last time, in cases where there are official ebooks, almost anyone would prefer to get those copies, because they are much easier to read and designed to be read on a reading device (specialized reading device, tablet, or phone) as compared to the NEL scans, which are straight scans of the book pages.

No matter what, it's a really bad look for Tillis to stomp around complaining that his constituents might actually be able to read books that are currently locked up in libraries. Remember that the entire intent of copyright law in the first place, and the subtitle of the US's very first copyright law, was that it be to enable learning. The Internet Archive is trying to help push forward that clear goal of copyright law... while Senator Tillis seems to want to stop it.

In the end, there's very little "there" there to the complaints about this project. It's difficult to see how it's harming author revenue in any real way, but it is clearly helping schools and students while the libraries and books they normally use are unavailable. And, there are strong arguments for why this is perfectly legal under copyright law -- and if the claim is that we should wait until that's absolutely proven in court, well, that kinda misses the whole point of helping out during a pandemic.

As professor Brian Frye recently wrote about all of this: "When you find yourself complaining about libraries, you might want to think twice about your priorities." And I'd say that counts double in the midst of a pandemic.
https://www.techdirt.com/articles/20...ts-wrong.shtml





Nebraska Education Department Accused of Software Piracy
AP

A New York-based software company is suing the Nebraska Department of Education for $15 million in federal court, accusing the department of using elements of the company’s copyrighted software in designing its own web-based program to share student and staff data.

The state’s education department contracted with eScholar, a company based in White Plains, New York, for student data collection software from 2004 until November 2019. The department paid $84,000 to eScholar with its last renewal, which ended Oct. 31. The state now uses a system it designed, but eScholar says that system pirated elements of the company’s software, the Lincoln Journal Star reported.

The lawsuit accuses the department of violating a provision of its contract that forbids the state from using any of the company’s product design elements without written consent. Because the department puts instructional materials for its new system on its website, others could steal eScholar trade secrets, the lawsuit says.
https://apnews.com/2488f20dabd55d702e0f863f0525fd89





Digital Hoarders: “Our Terabytes are Put to Use for the Betterment of Mankind”

Nerds with hoarding tendencies? Preservationists of history? Many terabytes either way.
David Rutland

Think we prefer the album version, but OK, sure Top of the Pops

Today perhaps more than ever, data is ephemeral. Despite Stephen Hawking's late-in-life revelation that information can never truly be destroyed, it can absolutely disappear from public access without leaving a trace.

It’s not just analogue data, either. Just as books go out of print, websites can drop offline, taking with them the wealth of knowledge, opinions, and facts they contain. (You won't find the complete herb archives of old Deadspin on that site, for instance.) And in an era where updates to stories or songs or short-form videos happen with the ease of a click, edits happen and often leave no indication of what came before. There is an entire generation of adults who are unaware that a certain firefight in the Mos Eisley Cantina was a cold-blooded murder, for instance.

These days, 19-year-old Peter Hanrahan spends his evenings bingeing on chart-topping music shows from the 1960s. A student from the North of England, he recently started collecting episodes of Top of the Pops—a British chart music show that ran between 1964 and 2006—after seeing the 2019 Tarantino flick, Once Upon a Time in Hollywood.

"I was searching for TOTP episodes as I found that there was a severe lack of them available on YouTube, the BBC iPlayer, or any other radio shows,” he tells Ars. “But I wanted to experience what it would have been like back then and searching because of how atmospheric the radio was in Once Upon a Time in Hollywood. It's been another way to discover music from that era."

If Hanrahan merely wanted to experience more ‘60s British chart-toppers, of course, he could have simply run to Spotify. But he wants the experience of TV as it was recorded back in the day—including live studio audiences, lip sync controversies, and alleged sex offenders.

Naturally, YouTube does have many old episodes, but the BBC has tried taking down ones featuring Jimmy Savile or Gary Glitter, for instance. Today it’s far from a complete TOTP library: only a fraction of the episodes Hanrahan is looking for are accessible on the platform. YouTube is also quick to respond to takedown notices, and episodes that are there one day can disappear the next.

His next stop is archive.org, the venerable non-profit library that boasts a tremendous 411 billion archived Web pages, 23 million books, 5.5 million movies, and a variety of other data. Often it will have what Hanrahan needs, but if not, his last resort is an obscure corner of reddit, where it is just possible that someone, somewhere, will have a copy saved.

It has taken Hanrahan a long time to find and obtain these episodes, but his work, trawling the edges of the Internet and connecting with real people, is finally paying off. In his first year as a self-confessed hoarder, Hanrahan collected more than a terabyte of data.

This impermanence of information, of course, goes far beyond old British radio. And luckily for future generations, the itch to seek it out, collect it, and store it goes beyond Hanrahan, too. It’s a sentiment currently driving thousands of individuals to band together online in the communal pursuit of archiving old media of all sorts. This ain’t the grant-and-partnerships-funded well-coordinated operation of the Internet Archive; it’s the individual-obsession-driven r/Datahoarder.

There’s a subreddit for everything

In 2020, the r/Datahoarder community on reddit is almost 200,000 members strong, with around 1,000 or so idling or posting in the subreddit at any time. The communal purpose here is exactly what it sounds like: these amateur archivists set out to collect and capture data and to preserve it for record, reference, and future reading. Often, the goal is to retain this information both online and off, through physical media or terabytes of personal hard drives and storage. In a way, you can think of r/Datahoarder like thousands of haphazard individual Internet Archives—though each member tends to have a few specific niche areas of focus.

On r/Datahoarder, you’ll find people storing data on everything from YouTube videos to game install discs. One person was even planning to copy all Australia-based websites as the country burned in the worst wildfires in its history. The post was deleted after it was pointed out that the physical servers for Australian websites are located outside the country. They’re safe for now—phew.

Some users archive every website they visit or service they use, and the gamut of media includes virtually everything: movies, music, and porn are all popular.

And for future historians, every tweet, every livestream, every TV and news show of the recent and ongoing Hong Kong democracy movement has been squirreled away by a few dedicated users. Already it's proving useful to at least one academic who visited r/DataHoarder seeking research material for their Sociology master's thesis on the Hong Kong protests.

Any hardware is welcome. While many users boast huge storage racks of expensive equipment, even humble Raspberry Pis are routinely kitted out with oversized drives and employed as real-time reddit-scrapers. That embarrassing 3am post about how you really need to get back with your ex? You may have deleted it within seconds of posting, but it's almost guaranteed that there are multiple copies in private archives—available to your ex on request.
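
For a sense of how small such a rig can be, here is a minimal Python sketch of the kind of subreddit scraper the article describes, polling Reddit's public JSON listing and appending unseen posts to a local archive. It is a toy under stated assumptions (subreddit, filename, and polling interval are arbitrary); a real archiver would persist its seen-set and handle errors and rate limits.

# Poll a subreddit's public JSON listing and archive new posts locally.
# Reddit asks for a descriptive User-Agent on these endpoints.
import json
import time
import requests

URL = "https://www.reddit.com/r/DataHoarder/new.json?limit=25"
HEADERS = {"User-Agent": "archival-sketch/0.1"}
seen = set()  # a real scraper would persist this across restarts

while True:
    listing = requests.get(URL, headers=HEADERS, timeout=30).json()
    for child in listing["data"]["children"]:
        post = child["data"]
        if post["id"] not in seen:
            seen.add(post["id"])
            with open("hoard.jsonl", "a") as f:
                f.write(json.dumps(post) + "\n")
    time.sleep(60)  # stay polite; a Raspberry Pi can run this indefinitely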

1990s-era mass storage devices such as the Iomega Zip drive occasionally float to the surface of the sub as their owners rediscover them in a cupboard under the stairs, prompting discussion of drivers, recovery methods, file formats, and readability.

The desire to save information for posterity seems to be almost universal but manifests in different ways according to each hoarder's own interest. Scroll through the boards and you'll find archived websites offering customization for Windows 98 machines and novelty cursors. You'll find users on a mission to preserve the entire Internet of a single country at a given point in time. You'll find users whose particular obsession is satellite weather forecasts for Japan, or silent movies.

As you might guess for a collection of highly motivated and obsessive tech users, r/Datahoarder started as a single IRC channel on freenode. The community eventually transitioned to the still-in-occasional-use r/datahoarders, with r/datahoarder itself being brought into existence four years ago. There is also a separate exchange subreddit, r/DHExchange, where members attempt to fill gaps in their collections.

Discussion these days is typically highly technical, largely revolving around efficient means of storing and hoarding vast quantities of data gleaned from online and elsewhere. Users trade advice on hard drive arrays running into the hundreds of terabytes, mass storage options in the cloud, and the astonishing costs associated with archiving otherwise forgotten older media like broadcasts, music, journals, and webpages.

Hanrahan didn’t get involved out of his love of the 1960s musical zeitgeist—old British music acts are only the latest archival effort he’s undertaking. In real life, Hanrahan has 12 drawers of color-coordinated Lego bricks he uses frequently and an extensive vinyl collection, which includes everything from the original The Good, the Bad, and the Ugly soundtrack to music from Red Dead Redemption II. Perhaps unsurprisingly, he also maintains a large digital games library.

"It started out as me compiling together stuff that I think is relatively hard to find, and just some cool stuff I find, like old commercials and TV intros like ABC's," he said.

As a small and whimsical fish in the data hoarding pool, Hanrahan’s storage isn't extensive but is still considerably more than what most users would have on their home systems. His storage capacity is 6TB, with 3TB given over to backups. He spends an additional £100 (roughly $130) on two 1TB drives each time he starts to run out of space. He even keeps additional drives containing his most valued data at another family member's house and updates his hoard yearly.

A brief history of archiving impulses

The urge to store rare or useful recordings and information has existed for as long as humans have had the means to act on it. The first archives of written material appeared around 3500 BC, not long after the invention of writing, and the Great Library of Alexandria was founded with the aim of acquiring and hoarding the best and most authoritative copies of every work ever produced, employing scribes to hand-copy texts onto the finest parchment available—the ancient equivalent of 8K UltraHD Blu-ray rips.

It wasn't until the 1970s, with the phenomenal success of the compact cassette tape, that amateur archiving of popular live media became possible. Teenagers in their bedrooms would record shows as they aired, taping the latest pop songs off pirate radio stations. By 1974, Billboard magazine reported that over 40 percent of people across all age groups recorded live shows from the radio, with a corresponding drop in the number of prerecorded tapes being purchased. Home taping is killing the music industry? This is where it started. Tapes were recorded and re-recorded before being condemned to disposal, or to a purgatory of eternal storage in a slowly yellowing plastic case at the back of a kitchen drawer.

The advent of Betamax and VHS soon gave hoarders a new tool. Live and pre-recorded TV shows and movies became available to watch on demand from the users' own personal libraries. As with cassette tapes, most recorded shows were later recorded over to make room for the next episode of The Bob Newhart Show or All in the Family. What most people had in mind was not a permanent archive—it was the convenience of being able to watch or listen to the latest installment of a favorite soap when it suited them.

But as VCRs gave way to DVD players, then to DVDRs, TiVo boxes, and eventually the streaming landscape we know and love today, VHS tapes suffered the same fate as cassettes. Broadcast TV, like radio, has largely been lost to the mists of time unless the creators and rights holders put in the effort to create and securely store backups.

For instance, Doctor Who is one of British television's most successful exports, and at its peak popularity in 1982, the show was being watched by a global audience of 98 million people. Today, the fandom is obsessive—poring over the tiniest plot details, stockpiling episodes, and arguing over which of the Doctor’s 13 incarnations was the greatest.

But between 1967 and 1978, the BBC routinely deleted its programming after it had been broadcast, in the belief that there was no practical value in keeping copies. Nine years of beloved Doctor Who episodes are missing. Some clips survive, and occasionally a full episode will turn up courtesy of a foreign network that found the original two-inch tape in a box down the side of the couch, but most of Doctor Who's earliest broadcasts are gone for good.

Do we really need everything?

In the specific example above, the Doctor Who rescue effort is underway, and the BBC archives are unlikely to disappear any time soon. But some r/Datahoarder users are worried about the impermanence of other types of network television, its archives, and the Internet as a whole.

Take Reddit user Cwtard. He’s worried that politics and censorship will prevent the people of the future from easily accessing the facts of today. If all that survives are news opinion shows in streaming service archives, for instance, future viewers will see only a distorted and one-sided vision of the past.

"I collect news because it is in the most danger. It is a record of what we were being led to believe as well as a record of what we were allowed to hear," he told Ars. "If there is anything that globalists, corporations, and politicians want scrubbed from the Internet—in my opinion, it is the news."

Cwtard started archiving the news in 2008 and only more recently discovered r/Datahoarder. It has become a virtual venue where he can keep an eye out for broadcasts to flesh out his incomplete collection. To him, the Internet is an impermanent place, which could vanish at any moment, and Cwtard needs the material on his servers, in his physical possession. He sees it as an obligation to ensure that a true record of the present and the past survives into the future.

Currently, Cwtard is on the lookout for old CBS Evening News broadcasts, the NBC Today Show, Hoda and Jenna, CBS Sunday Morning, Face the Nation, and 60 Minutes, as well as copies or scans of old newspapers.

"There's definitely a wider duty when you see what's coming down the pipe. At best the Internet will be subscription based with only the rich having access currently enjoyed by everyone. At worst it will be completely sanitized of anything deemed dangerous or ‘wrongthink,’" he says. “Given the geopolitical climate these days, there's a real possibility that an event could shut down the Internet completely—at least until TV 2.0 is ready to go online. In this event, you want to be able to save as much history as possible because when it comes back on—only authorized history will be allowed, in my opinion."

Cwtard isn’t wrong. Even the Internet Archive—a hugely respected institution on r/Datahoarder—is under threat. In 2019, a dispute over audiobooks threatened to take the site offline across the whole of Russia. Lawsuits can happen at any time in any part of the world, and the monolithic archive.org could be legally blocked by ISPs, its treasure trove buried forever.

Think you'll need a little more than a portable SSD (https://arstechnica.com/gadgets/2020/02/guidemaster-ars-tests-and-picks-the-best-portable-ssds-you-can-buy/) to truly become a digital hoarder.

Do I really need a server closet? Yes and no

Distrust of cloud computing is a common but not overriding theme among data hoarders. Some do trust their archives and backups to the likes of Google and Amazon; others certainly share Cwtard's view.

"Cloud storage has the same impermanence as the Internet, even less actually,” he says. “You are putting your trust in companies that have proven they can't be trusted. Why surrender your data to a company that holds different morals and values than you?... They are pushing the cloud idea. They want the public to surrender all data to them. I think they have a vision for the world where computers just access the cloud, so there's no reason to own a computer."

Then again, the idea of entrusting vast amounts of data to Web giants such as Google is a popular one outside of r/Datahoarder. Sales of desktop rigs with limitless upgradeability and a largely empty case in which to stuff RAID arrays have been in freefall for years, while sales of Google's Chromebook range (which typically offers very limited memory and non-upgradeable onboard storage, along with a free cloud account) have been soaring since the range's launch in 2011. In January 2019, Google disclosed that Chromebooks were being used by 30 million students and educators. By 2023, estimates point to Google shipping 17 million Chromebooks per year.

So although all anyone needs to start down the r/Datahoarder rabbit hole is a low-spec laptop and an account with a few terabytes in the cloud, there are reasons to focus on the physical and local. Google in particular has long had a nasty habit of snooping into its users' business, and its so-called anti-abuse mechanisms can lock access to files that the search giant suspects may be illicit copies of copyrighted material.

In the United States, anything created after January 1, 1978, whether it’s a drawing, a poem, reference material, or a blog post, has a copyright for the life of the author plus 70 years. So almost everything published to the Web since its inception, or broadcast in any form in the last 100 years, is still legally locked down. What many cloud-reliant data hoarders are doing is technically illegal, and their collections can be shut down or deleted by Google without warning. In light of that, on-premises storage is the only way to go for users who want an archive that they own outright and that is safe from prying eyes or deletion.
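
The arithmetic behind that lockdown is simple enough to sketch. A deliberately simplified Python version of the life-plus-70 rule (real US terms branch on work-for-hire status, publication date, and more, which this ignores):

# Life-plus-70 rule for post-1977, single-author US works (simplified).
# Protection runs through December 31 of the 70th year after death.
def enters_public_domain(author_death_year: int) -> int:
    return author_death_year + 70 + 1

# A blog post whose author died in 2020 stays locked down until 2091.
print(enters_public_domain(2020))  # -> 2091
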

Numerous hoarders have been hit by DMCA copyright claims, forcing them to take their publicly available archives offline. One of the more poignant incidents involved u/dunklesToast, who had amassed over 200GB of Donald Duck comics in Finnish but was forced offline after receiving a legal notice. A couple of fans had even pledged to learn the language so they could appreciate the monumental effort, while others planned a mass translation project. All for nought.

Accordingly, it’s not uncommon to see users on the r/DHExchange request subreddit asking for rare or banned movies. There are also posts seeking current releases, magazine scans to fill holes in decades-long print runs, and (like Peter's) British music shows from the distant past.

Surprisingly, the requests are often met by the people who recorded from live TV in decades long gone—often on obsolete equipment.

"I have a couple of recordings from 1999," reads one reply to yet another TOTP request. "Not the best quality unfortunately; knackered VCR heads when recording and a horrible ghost due to tall buildings nearby, but it’s watchable." The user, AU8830, has uploaded the episodes to a temporary host, each file a hefty 12GB in all of its interlaced and artifact-ridden glory. "Thanks so much man this is epic! You are a legend," reads the reply.

Even if the data, whatever form it takes, is available elsewhere online or in libraries, there is an ever-present awareness on r/Datahoarder that it could vanish at any moment. And what becomes of these personal information caches decades down the line is a risk inherent to this rogue recording approach, as compared with the institutional permanence of libraries or the Internet Archive. Hanrahan plans to give his drives to a friend who will “sustain it and store it safely,” he says (and hopes). But Cwtard's relatives and friends lack an appreciation for what he does. "My life's work will be for nothing. Tossed in the garbage like an old Atari found in an attic," he said. "Doesn't mean I don't have an obligation to do it anyway."

More than anything else, that—a sense of obligation to act for some greater informational good—may be the ultimate takeaway for anyone wading periodically into the r/Datahoarder world. Some people do come only for the British music or the Japanese weather or the Microsoft install discs. But the wider duty expressed by Cwtard is certainly echoed periodically by other users.

"A shower thought just hit me today," posted u/mamborambo on the lesser-used r/datahoarders subreddit. "With all the active archiving projects being launched recently to save historical content from Yahoo Groups, Youtube, dying mailing lists, evidence of human rights abuses, etc. etc., the datahoarders' role has been elevated from a nerd with compulsive hoarding tendencies into a champion of free speech and preservationist of history. We now boldly go where the corporate interest fails. Our terabytes are finally put to use for the betterment of mankind. Hopefully none of our home rigs fail, we remember to do our 3-2-1 backup correctly, and most importantly to make our loot accessible, because data is useless if not shared."

And if data is what you currently seek, I have a feeling I know where I can find you an archived copy.
https://arstechnica.com/gaming/2020/...nt-of-mankind/





Western Digital Admits 2TB-6TB WD Red NAS Drives Use Shingled Magnetic Recording
Chris Mellor

Some users are experiencing problems adding the latest WD Red NAS drives to RAID arrays and suspect it is because they are actually shingled magnetic recording drives submarined into the channel.

Alan Brown, a network manager at UCL Mullard Space Science Laboratory, the UK’s largest university-based space research group, told us about his problems adding a new WD Red NAS drive to a RAID array at his home. Although it was sold as a RAID drive, the device “keep[s] getting kicked out of RAID arrays due to errors during resilvering,” he said.

Resilvering is the term for adding a fresh disk drive to an existing RAID array, which then rebalances its data and metadata across the now-larger RAID group.

Brown said: “It’s been a hot-button issue in the datahoarder Reddit for over a year. People are getting pretty peeved by it because SMR drives have ROTTEN performance for random write usage.”

SMR drives

Shingled magnetic recording (SMR) disk drives take advantage of the fact that disk write tracks are wider than read tracks, partially overlapping the write tracks so that more tracks fit on a disk platter. This means more data can be stored on a shingled disk than on an ordinary drive.

However, SMR drives are not intended for random write IO use cases because the write performance is much slower than with a non-SMR drive. Therefore they are not recommended for NAS use cases featuring significant random write workloads.
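
A toy model makes the penalty concrete. In a drive-managed SMR drive, shingled tracks are grouped into zones that can only be written in order, so overwriting a single block can force the drive to rewrite everything shingled on top of it. This Python sketch uses an invented zone size; real firmware softens the worst case with a conventional cache and idle-time garbage collection:

# Toy cost model: blocks physically rewritten per single-block update.
BLOCKS_PER_ZONE = 256  # invented for illustration

def smr_write_cost(block_in_zone: int) -> int:
    # Updating one block means rewriting it plus every block shingled
    # on top of it, i.e. the rest of the zone.
    return BLOCKS_PER_ZONE - block_in_zone

def cmr_write_cost(block_in_zone: int) -> int:
    return 1  # conventional recording updates in place

print(smr_write_cost(0))    # 256: worst case, rewrite the whole zone
print(smr_write_cost(255))  # 1: the last block in a zone is cheap
print(cmr_write_cost(0))    # 1: CMR is flat regardless of position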

Smartmontools ticket

Brown noted: “There’s a smartmontools ticket in for this [issue] – with the official response from WDC in it – where they claim not to be shipping SMR drives despite it being trivial to prove otherwise.”

That ticket’s thread includes this note:

“WD and Seagate are _both_ shipping drive-managed SMR (DM-SMR) drives which don’t report themselves as SMR when questioned via conventional means. What’s worse, they’re shipping DM-SMR drives as “RAID” and “NAS” drives. This is causing MAJOR problems – such as the latest iteration of WD REDs (WDx0EFAX replacing WDx0EFRX) being unable to be used for rebuilding RAID[56] or ZFS RAIDZ sets: They rebuild for a while (1-2 hours), then throw errors and get kicked out of the set.”

(Since this article was published Seagate and Toshiba have also confirmed the undocumented use of shingled magnetic recording in some of their drives.)

The smartmontools ticket thread includes a March 30, 2020, mail from Yemi Elegunde, Western Digital UK enterprise and channel sales manager:

“Just a quick note. The only SMR drive that Western Digital will have in production is our 20TB enterprise hard drives, and even these will not be rolled out into the channel. All of our current range of hard drives are based on CMR (Conventional Magnetic Recording). [Blocks & Files emboldening.] With SMR, Western Digital would make it very clear, as that format of hard drive requires a lot of technological tweaks in customer systems.”

WD’s website says this about the WD Red 2TB to 12TB 6Gbit/s SATA disk drives: “With drives up to 14TB, the WD Red series offers a wide array of solutions for customers looking to build a high performing NAS storage solution. WD Red drives are built for up to 8-bay NAS systems.” The drives are suitable for RAID configurations.

Synology WD SMR issue

There is a similar problem mentioned on a Synology forum, where a user added a 6TB WD Red drive [WD60EFAX] to a RAID setup using three WD Red 6TB drives [WD60EFRX] in SHR1 mode. He added the fourth drive to convert to SHR2, but the conversion took two days and did not complete.

The hardware compatibility section on Synology’s website says the drive is an SMR drive.

The Synology forum poster said he called WD support to ask if the drive was an SMR or conventionally recorded drive: “Western Digital support has gotten back to me. They have advised me that they are not providing that information so they are unable to tell me if the drive is SMR or PMR. LOL. He said that my question would have to be escalated to a higher team to see if they can obtain that info for me. lol”

Also: “Well the higher team contacted me back and informed me that the information I requested about whether or not the WD60EFAX was a SMR or PMR would not be provided to me. They said that information is not disclosed to consumers. LOL. WOW.“

Price comparison

A search on Geizhals, a German-language price comparison site, shows various disk drives using shingled magnetic recording, including a results listing for WD Red SATA HDDs with SMR technology.

However, a WD Red datasheet does not mention SMR recording technology.

WD comment

We brought all these points to Western Digital’s attention and a spokesperson told us:

“All our WD Red drives are designed to meet or exceed the performance requirements and specifications for common small business/home NAS workloads. We work closely with major NAS providers to ensure WD Red HDDs (and SSDs) at all capacities have broad compatibility with host systems.

“Currently, Western Digital’s WD Red 2TB-6TB drives are device-managed SMR (DMSMR). WD Red 8TB-14TB drives are CMR-based.

“The information you shared from [Geizhals] appears to be inaccurate.

“You are correct that we do not specify recording technology in our WD Red HDD documentation.

“We strive to make the experience for our NAS customers seamless, and recording technology typically does not impact small business/home NAS-based use cases. In device-managed SMR HDDs, the drive does its internal data management during idle times. In a typical small business/home NAS environment, workloads tend to be bursty in nature, leaving sufficient idle time for garbage collection and other maintenance operations.

“In our testing of WD Red drives, we have not found RAID rebuild issues due to SMR technology.

“We would be happy to work with customers on experiences they may have, but would need further, detailed information for each individual situation.”

Comment

Contrary to what WD channel staff have said, the company is shipping WD Red drives using SMR technology. WD told us: “In a typical small business/home NAS environment, workloads tend to be bursty in nature, leaving sufficient idle time for garbage collection and other maintenance operations.”

Not all such environments are typical and there may well not be “sufficient idle time for garbage collection and other maintenance operations”.

We recommend that posters on the Synology forum, the datahoarder Reddit, and the smartmontools site get back in touch with their WD contacts, apprise them of the information above, and let them know that WD is “happy to work with customers on experiences they may have”.
https://blocksandfiles.com/2020/04/1...tic-recording/





Sneaky Marketing Redux: Toshiba, Seagate Shipping Slower SMR Drives Without Disclosure, Too
Paul Alcorn

News emerged earlier this week that Western Digital was sneaking out hard drives using inferior SMR technology -- which results in devastatingly slow performance in some types of applications -- without disclosing that fact to customers in marketing materials or specification sheets. After a bit of continued prodding, storage industry sage Chris Mellor secured statements from both Seagate and Toshiba that confirmed that those companies, too, engage in the misleading tactic of selling drives using the slow SMR technology without informing their customers. The latter two even use the tech in hard drives destined for desktop PCs.

Given that SMR drives suffer from abysmal random write performance, which is a key type of write pattern that impacts performance in desktop operating systems, the drives will result in noticeably slower performance for PC users.

It's important to understand that there are different methods of recording data to a hard drive, and of the productized methods, shingled magnetic recording (SMR) is by far the slowest. That results in a perceivable difference in performance and even compatibility issues with some types of applications (like RAID). As such, these drives are mainly intended for write-once-read-many (WORM) applications, like archival and cold data storage, and certainly not as boot drives for mainstream PC users.

Due to the complexities of increasing drive density, the industry developed SMR to boost hard drive capacity within the same footprint. The tactic revolves around writing data tracks over one another in a 'shingled' arrangement. The original concept hinged on systems designed top-down from the hardware, software, and file systems, to reduce the performance penalties. However, the complexity and cost of adopting those types of systems (host-managed SMR) prevented the industry from adopting the drives en masse.

As a middle ground, the hard drive industry, which winnowed down to three players due to the brutal economics of producing hard drives in the emerging era of SSDs, developed SMR drives that can work in any normal system (drive-managed SMR). This type of drive is cheaper for HDD vendors to produce, equating to savings that used to be passed down to the customer. But the noticeably poorer performance required vendors to disclose that critical fact to consumers.

Unfortunately, the industry has now shifted to selling these drives in product families that have traditionally consisted of 'normal' models that use the faster conventional magnetic recording (CMR) technique, but without disclosing that fact to consumers.

For WD, that consisted of working the SMR models into its WD Red line of drives, but only the lower-capacity 2TB to 6TB models. Slower SMR drives do make some measure of sense in this type of application, provided the NAS is used for bulk data storage. Still, compatibility issues have cropped up in RAID and ZFS applications that users have attributed to the unique performance characteristics of the drives.

Toshiba tells Blocks & Files that it is also selling SMR drives without disclosure, within its P300 series of desktop drives. Seagate also disclosed that it uses the tech in four models, including its Desktop HDD 5TB, without disclosure. However, Seagate, like the others, does correctly label several of its archival hard drives as using SMR tech, making the lack of disclosure on mainstream models appear to be a bit of purposeful deception.

Boosting HDD capacity is a tough proposition in the razor-thin margin world of hard drive production, and the most promising techniques, which use exotic approaches like lasers or even microwaves, aren't as economical. That stings in an industry where SSDs are fast becoming the de-facto solution due to their speedy performance, so hard drive vendors have retreated further into the 'cheap and deep' storage space. That means focusing on less-expensive and higher-capacity drives, even if it comes at the cost of reduced performance.

Now, we're all for cheaper and deeper storage tech like SMR, but given the comparatively terrible performance in random write workloads (a specification that none of the vendors reveal in their documentation), it goes without saying that this should be disclosed to customers so they can decide where, and when, to use the drives.

Hopefully enough consumers will vote with their wallets to force some sort of responsible disclosure from the hard drive vendors. Although, with all major hard drive makers apparently pushing SMR without disclosure in some form, doing so might be difficult.

You can check out our previous article for a deeper look at how SMR technology works: https://www.tomshardware.com/news/wd-fesses-up-some-red-hdds-use-slow-smr-tech
https://www.tomshardware.com/news/sn...out-disclosure





Netflix Stock Hits Record High, Is Now Worth More Than Disney
Ariel Shapiro

Netflix is one of the few companies actually thriving during the coronavirus pandemic, hitting an all-time high in the market Thursday after becoming more valuable than rival entertainment giant Disney.

'Tiger King' has been a huge boon for Netflix, drawing more than 34 million viewers in its first 10 days. (Image: Netflix)

KEY FACTS

The streamer’s stock climbed 5% by midday, after closing at a record price of nearly $427 per share on Wednesday.

With viewers turning to hits like true-crime documentary Tiger King and comfort-food reruns like The Office and Cheers, Netflix is seeing a spike in new subscriptions, according to streaming analytics firm Antenna.

Aside from increased viewership, Netflix is seeing another benefit from the crisis: like the rest of Hollywood, it has had to halt production, says Lightshed partner and analyst Rich Greenfield, freeing up its cashflow as it accumulates revenue.

What’s good for Netflix is also good for its cofounder and CEO Reed Hastings — his net worth is now $4.9 billion, a $1.2 billion increase since the end of March.

Big number

$194 billion. That is how much Netflix (NFLX) is now worth, having increased its market value by more than $50 billion so far this year. Disney (DIS), having been hit particularly hard by the coronavirus, is valued below $184 billion, down from nearly $258 billion at the end of 2019. Disney and Netflix are two of the heaviest hitters in entertainment, but they have opposite approaches. While Netflix is entirely dependent on paid subscriptions for revenue, Disney is much more varied, as reliant on tourism and merchandise sales as it is on the content it puts out. While that diversity has traditionally been its strength, it has become a liability as the pandemic forces Disney to shut down theme parks and delay film releases.

What to watch for

While that pause in production may be good for Netflix’s bottom line now, it could result in significantly less new content in the fall than planned. Even so, Netflix may still be better off than its competitors. “The reality is they have much more content in the pipeline than anyone else right now,” says Greenfield.
https://www.forbes.com/sites/arielsh...e-than-disney/





A New Connection with the Lost Art of Phone Conversation
Daphne Merkin

One of my favorite sentences in twentieth-century fiction is the one that goes: “She was a girl who for a ringing phone would drop exactly nothing.” It’s from J.D. Salinger’s short story, “A Perfect Day for Bananafish,” which appears in his supernal Nine Stories. One of its pleasures is the partially inverted syntax of the sentence; another is how quickly Salinger’s description allows you to get inside the imperturbable head of the egocentric Muriel Glass, who sits in her Florida hotel room painting her nails, and is married to the mentally unstable Seymour Glass, whom she worries about not at all.

I’ve been thinking of this sentence in recent weeks because in this, our surreal and grim season of the coronavirus, everyone is suddenly dropping exactly everything for their ringing phones. The old-fashioned fuddy-duddy telephone—which once seemed as dated as Dorothy Parker’s short story “The Telephone Call,” in which a young woman waits desperately for a man to call—is suddenly back in style.

I’m sure I’m not the only one who finds myself spending hours on the phone with friends and editors I used to converse with minimally, if at all. Surely this has everything to do with the limited and mediated intimacy provided by our more recent modes of communication—email, texting, Twitter direct messages, chat apps, FaceTime, and now the suddenly ubiquitous Zoom—as well as with our longing for a more immediate, audible sense of connection in these harrowing times. (It may be a generational thing, or my own intractable Luddism, but video-chats just don’t do it for me; they seem stagy and artificial. Besides, who wants to look at oneself bobbing up on a screen that much, even with helpful hints from the likes of Tom Ford on how to look good on camera? Whatever the lighting and the angle at which one tilts one’s phone or screen, one always looks somewhere between wan and ghoulish.)

To think of Bell Telephone’s long-ago slogan “Reach out and touch someone you love” is to be reminded anew of the primacy the phone once enjoyed. Think of the centrality of the telephone in movies like Hitchcock’s Rear Window (1954), in which James Stewart, playing a photographer confined to a wheelchair by a broken leg, is forced to do much of his communication by phone—including to call the police to rescue Grace Kelly’s character, Lisa. Or of the heart-stopping calls made by the deranged and homicidal Jessica Walter to Clint Eastwood in Play Misty for Me (1971). Indeed, there was a time when talking on the phone, at least in films, had a somewhat sensuous quality to it—almost as sexy-seeming as smoking.

Although it seems as if the phone has been around since prehistoric times, it’s only a century and a half old—a technology that seemed nothing short of magical when it first appeared. As some of us may remember from high-school chemistry classes, Alexander Graham Bell, a Scottish-born American inventor and scientist, created the first patented phone in 1876, apparently involving a water-based device (don’t ask me to explain), and his first, somewhat peremptory message was addressed to his assistant: “Mr. Watson, come here, I want to see you.”

The first rotary-dial handset was made as early as 1904; Bell’s version of the “candlestick” model of phone followed in 1919; touch-tone dialing and cordless phones came in with the 1960s, and finally mobile phones arrived in the 1970s—still, for several decades, a device for speaking to people, until smartphones made that seem dumb.

In 1959, Bell Telephone introduced a compact phone designed for use in the bedroom. It came with a light-up dial that could do double-duty as a night light. I can remember to this day my excitement when I, then in high school, got a white Princess phone, delighting in its smallness and distinctly feminine appeal. (Those Princess phones, which would go out of production in 1994, have since become collectibles.)

In this present age—one, arguably, of waning interpersonal skills—talking on the phone offers a unique way into reading another person’s cues. Although we generally assume that the best means of identifying someone’s emotions is through their facial expressions, particularly their eyes—the “window of the soul” and all that—a Yale study cited in Psychology Today in 2018 suggests that we may, in fact, be better at reading voices than faces, that we’re more emotionally intelligent on the phone.

It is certainly easier to gauge someone’s enthusiasm—or lack thereof—about getting together than it is through texting or email, which are often, at best, emotionless and perfunctory by nature. I suppose emojis—the use of which my Luddism prohibits; righteously, I feel—are a way of conveying tonality, albeit in a cutesy, unnuanced fashion. There is research that suggests the human ability to perceive nuance in voices may have developed as an evolutionary advantage, ensuring that expressions of need and distress were perceived by our ancestors. There is no research I know of, however, on the evolutionary value conferred by a large emoji vocabulary.

Then, too, talking on the phone encourages spontaneous play; there’s something improvisational about it that can lead one into entirely unexpected directions. Of course, the downside of talking on the phone is how much time it can take up.

“Talking on the phone offers the pleasure, even sensuality of hearing a voice,” notes my friend Deborah, “but they are genuinely time-consuming. I don’t know how we ever got any writing done when we talked to friends for hours. It takes all day! You can wake up, shower, check in on your mother and two friends, go for a walk and then it’s dinnertime.”

In a moment when we’re all mandated to stay at home, staring into the middle distance, this seems, on balance, more of a plus. Until now, there were times I found myself missing the hours-long phone calls I used to have with friends well into the night, detailing the ins and outs of my life to a responsive person on the other end of the line. It made me feel unalone in my life and linked in a way few things did.

So I welcome the return to phone calls, no matter how time-consuming they are—even if there will always be die-hard post-phone types, like my friend Ben, who abjures its pleasures and insists that the phone is a waste of time:

“I remember the seventies and eighties—the heyday of epic phoning—when the calls began at 5 PM and stretched on till dinner, and afterward resumed, lasting past twelve or one. And what was accomplished,” he wanted to know. “Transmission of gossip, by and large.”

Just so. I dare say it comes down to what your taste in social exchange is: whether you prefer it more remote, to-the-point, and controllable, or enjoy a more languid, digressive, and protracted mode of communication. My own preference lies with what that connoisseur of impassable differences, E.M. Forster, well understood: “Only connect.” I like to think that Forster, who lived to enjoy the heyday of the telephone conversation (he died in 1970), must have gabbed away to his Bloomsbury friends for hours, but who can say?
https://www.nybooks.com/daily/2020/0...-conversation/





For the First Time, a Spacecraft has Returned an Aging Satellite to Service

It’s a big step forward for satellite servicing.
Eric Berger

In a triumph for the nascent industry of "satellite servicing," an aging communications satellite has returned to service in geostationary orbit.

Northrop Grumman announced Friday that its Mission Extension Vehicle-1, or MEV-1, has restored the Intelsat 901 satellite and relocated it into a position to resume operations.

"We see increased demand for our connectivity services around the world, and preserving our customers’ experience using innovative technology such as MEV-1 is helping us meet that need,” Intelsat Chief Services Officer Mike DeMarco said in a news release.

After launching on a Proton rocket last October, Northrop Grumman's servicing vehicle used its mechanical docking system to latch onto Intelsat 901 on February 25, at an altitude of 36,000km above Earth. No two commercial spacecraft had ever docked in orbit before.

Since then, the MEV-1 servicer has assumed navigation of the combined spacecraft stack, reducing the satellite's inclination by 1.6 degrees and relocating it to a new orbital location, at 332.5° east. Intelsat then transitioned about 30 of its commercial and government customers to the satellite two weeks ago. The transition of service took approximately six hours and was successful.
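
For a sense of the physics, the textbook single-impulse cost of a pure plane change is Δv = 2·v·sin(Δi/2). A Python sketch using standard geostationary constants (an idealization for scale only, not Northrop Grumman's actual maneuver profile):

# Estimate the delta-v for a 1.6-degree plane change at GEO.
import math

MU = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
R_GEO = 42_164_000    # geostationary orbital radius, m (~35,786 km altitude)

v = math.sqrt(MU / R_GEO)          # orbital speed, ~3,075 m/s
dv = 2 * v * math.sin(math.radians(1.6) / 2)
print(f"{dv:.0f} m/s")             # ~86 m/s for the 1.6-degree correction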

Based on the agreement between Northrop and Intelsat, MEV-1 will provide five years of life extension services to the satellite before moving it into a graveyard orbit. MEV-1 will then be available to provide additional mission extension services, Northrop said, including orbit raising, inclination corrections, and inspections. Northrop is already building a second MEV to service another Intelsat satellite, 1002, later this year.

This satellite servicing milestone comes as both low-Earth orbit and geostationary space—where large, expensive communications satellites are placed high above the planet to hold their position over the ground—become more crowded. A service such as that offered by MEV-1 gives satellite operators not only the ability to extend the lifetime of aging assets, but also, potentially, a way to remove satellites they have lost control of.

These kinds of services are generally seen in the space community as important to keeping orbit as decluttered as possible in the coming decades, so it is good that this demonstration case worked out well.
https://arstechnica.com/science/2020...-into-service/





Frontier Communications, Saddled with Debt from Acquisitions, Files for Bankruptcy
Stephen Singer

Frontier Communications Corp., the Norwalk telecommunications company that bought Connecticut’s legacy Southern New England Telephone Co. in 2014 and was weighed down by debt from acquisitions that failed to perform, has filed for bankruptcy.

In a filing Tuesday in federal bankruptcy court in White Plains, N.Y., Frontier said it had reached a restructuring support agreement with bondholders representing more than 75% of its $11 billion in outstanding bonds. The restructuring plan is expected to reduce its debt by more than $10 billion.

It also said it has obtained commitments for $460 million in financing and, with cash on hand, it has more than $1.1 billion in liquidity.

Frontier said it expects to continue providing service to customers without interruption.

Shares plunged 27%, closing Wednesday at 28 cents; the stock had traded as high as $125.70 in 2015.

“We are undertaking a proactive and strategic process with the support of our bondholders to reduce our debt by over $10 billion on an expedited basis," said Robert Schriesheim, chairman of the board of directors’ finance committee.

In its bankruptcy filing, Frontier said that with three acquisitions between 2010 and 2016, including its Connecticut AT&T franchise, it transformed from a provider of telephone and internet services in mainly rural areas to a large, national telecommunications provider in rural, urban and suburban markets in 29 states with revenue last year of about $8.1 billion.

It anticipated that the acquisitions would yield savings from consolidation of administrative functions and lower prices. Instead, serving the new areas “proved more difficult and expensive” than Frontier expected. Integration issues made it more difficult to keep customers, it said.

In addition, “fierce competition” in telecommunications, shifting consumer preferences and accelerating demand for bandwidth and performance have been “redefining what infrastructure telecommunications companies need to compete in the industry,” Frontier said.

The conditions have contributed to the “unsustainability of the company’s outstanding funded debt obligations” amounting to $17.5 billion, it said.

As a result, “Frontier has not been able to fully realize the economies of scale” expected from its acquisitions. The number of customers shrank to 4.1 million in January from 5.4 million in 2016, it said.
https://www.courant.com/business/hc-...rzy-story.html





ISPs Ignore Toothless FCC Demand To Not Kick Users Offline During COVID-19
Karl Bode

A few weeks back, the Trump FCC put on a big show about a new "Keep America Connected Pledge." In it, the FCC proudly proclaimed that it had gotten hundreds of ISPs to agree not to disconnect users who couldn't pay for essential broadband service during a pandemic. The problem: the 60-day pledge was entirely voluntary and temporary, and because the FCC just got done obliterating its authority over ISPs at lobbyist behest (as part of its net neutrality repeal), it's largely impossible to actually enforce.

Well, guess what:

"Some people who just lost their jobs because of the coronavirus pandemic are finding that they have lost something else — phone and internet access. Across the country, suddenly unemployed residents are getting threatening notices, despite an initiative from the Federal Communications Commission that pledged last month to "Keep Americans Connected."

Yes, gosh, who could have predicted many ISPs would simply ignore voluntary guidelines from an agency that repeatedly signals that there's no real penalty for bad behavior under the Trump administration?

As a result, Sprint, Verizon, and others are all following normal procedure and shutting down accounts, even after informing their subscribers that this most certainly wouldn't happen:

"It was a surprise when my line was suddenly disconnected, because I had actually got an email saying that during this time there would be no interruptions to phone service," Aaron Joshua Perra, a hairstylist from Minneapolis, told NBC News. He had his Sprint phone shut off soon after his salon closed down last month. Sprint has since reconnected him."

Meanwhile, over in Ohio, one disabled woman tells the tale of Charter Spectrum severing her service in the middle of a telemedicine appointment, again despite insisting this would not happen:

"The phone cut out in the middle of a telemedicine visit with her brother’s doctor. Joyce Manz had called Spectrum’s customer service a few days earlier and told a representative that she would pay the phone and internet bill as soon as her disability check arrived. It would be OK, she said she was assured.

“I was in tears when the phone cut out,” the 59-year-old Cleveland resident said. “I started panicking.”

None of this is particularly surprising. The telecom industry has some of the worst customer service in the country, in large part thanks to a lack of competition and regulatory oversight. In short: ISPs don't really have much of an incentive to improve. The problem is particularly notable during disasters, when cable and broadband companies routinely try to immediately bill disaster victims for destroyed cable boxes -- even if the customer just lost everything they owned. Not because ISPs are intentionally malicious, but because they don't make fixing these systems a priority. Again, because they don't have to.

Mergers (growth for growth's sake) don't usually scale customer service to handle the growth because it's not profitable to do so. Geographical monopolies also mean there's no organic market pressure to do so either. Then you've got regulatory capture, and U.S. regulators and lawmakers that are all but owned by AT&T, Verizon, and Comcast on both the state and federal level. The current FCC can't even acknowledge there's a broadband competition problem, or that Americans pay more for broadband than most developed nations. So it's a problem that's not getting fixed.

Pai, like many of his ideological bent, operates under the illusion that if you eliminate oversight of telecom, miracles happen. That's never been true: when you eliminate regulatory oversight of an uncompetitive sector dominated by politically powerful monopolies like Comcast, those monopolies simply double down on bad behavior. Whether the check is market-based, regulatory, or antitrust, there's no real U.S. incentive to improve, because corrupt lawmakers and regulators have prioritized monopoly profits over everything else, gutted all systems of accountability, and then dressed this blind greed up as some kind of sophisticated, elaborate ethos.

Like much of the Trump administration, the FCC's reply to the complaints is largely just hubris and misdirection. In short, the agency implies that the only reason we're seeing these complaints is because the FCC brought attention to the problem:

"Although we have received some disconnection complaints recently, we think it may reflect increased attention on the FCC's work to keep people connected," the spokesman said."

That's nonsense. One, because the FCC's "solution" to this very real problem was a voluntary proposal ISPs know they don't have to adhere to because the current FCC is a bunch of feckless pushovers. Two, because the FCC doesn't even collect disconnection data. It's so typically Trumpian: coddle monopolies, then pretend said coddling is resulting in wonderful outcomes that simply aren't supported by factual reality. Rinse, wash, repeat.

The Trump FCC and its supporters claim the net neutrality repeal was a good thing because it "freed the industry from burdensome regulations." But that's fantasy. The repeal gutted the FCC's authority to hold ISPs accountable for a wide variety of bad behaviors, including obvious billing fraud. It then shoveled any remaining authority to an FTC that lacks the resources or authority to police a sector rife with hugely unpopular regional monopolies. This accountability vacuum is the entire reason the industry lobbied for the plan. All of the claims about how the repeal increased "internet freedom," encouraged "unbridled innovation," or "stoked network investment" are a heaping pile of bullshit.

It's corruption and regulatory capture, propped up by a mountain of bogus data, magical thinking, and telecom policy concepts debunked decades ago. And as former FCC staffer Gigi Sohn pointed out a few weeks back, that discarded agency authority sure would come in useful during a pandemic where broadband connections are now widely seen as an essential cornerstone of survivability.
https://www.techdirt.com/articles/20...covid-19.shtml





Internet Service in Western Colorado was So Terrible that Towns and Counties Built their Own Telecom

The new 481-mile rural Project Thor network is complete -- and run by a regional government council. It’s the opposite of what a state law intended 15 years ago
Tamara Chuang

Internet outages became a distant memory this month as a good chunk of western Colorado turned on a new broadband system. But this wasn’t built by a typical telecom. It took a band of local governments and partners from 14 rural communities to stitch together the 481-mile network, dubbed “Project Thor.”

Communities from Aspen to Meeker craved better access and affordability but also demanded reliability. Over the years, multiple outages caused by accidental cuts in the internet line would shut them off from the rest of the world. At a Granby clinic, for example, medical staff couldn’t quickly send images of stroke patients’ brain scans down to radiologists in Denver for review during an outage, putting “the patient at significant risk of long-term damage,” said Dr. Thomas Coburn, a family and emergency medicine physician and CEO of Middle Park Health.

So the Northwest Colorado Council of Governments, referred to as Northwest COG, coordinated the two-year effort of public and private organizations that couldn’t wait any longer for existing broadband providers to fix their problem.

“A service outage is extremely taxing on our hospital operations and frustrations do run very high during a down time. No internet means no email, no access to the outside world, and limited access to even our own networks,” said Rob C. Wissenbach, director of information technology services for Middle Park Health clinics, which partnered with Project Thor in Kremmling and Granby. “So far, the partnership with Thor has proven to be successful.”

Project Thor is that “middle mile” of physical lines sitting between the greater internet and an internet service provider that serves consumers.

Thor doesn’t serve home users, but Northwest COG partners can tap it for their own communities and share it with their school districts, local institutions and even local ISPs, which, in turn, can expand service to consumers.

It’s been fully operational for a few weeks now, though facilities like Middle Park Health have been able to access part of the network since last summer. While there still could be outages, Thor has multiple redundant lines. If one line gets cut, internet traffic is rerouted to another to prevent an outage for customers.

“The other day, there was a small fiber outage that affected a few of our communities,” said Nate Walowitz, director of Northwest COG’s broadband operation. “And Project Thor kept going. Nobody knew.”
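The mechanics behind that kind of seamless failover are easy to see in miniature. Below is a toy Python sketch, not Project Thor's actual routing software: the link layout is invented, using towns from this article, but it shows why a loop topology survives a single fiber cut. Remove any one link, and traffic can still reach its destination the long way around.

    from collections import deque

    # Invented ring of towns; each entry lists its directly linked neighbors.
    ring = {
        "Granby": {"Kremmling", "Breckenridge"},
        "Kremmling": {"Granby", "Meeker"},
        "Meeker": {"Kremmling", "Glenwood Springs"},
        "Glenwood Springs": {"Meeker", "Breckenridge"},
        "Breckenridge": {"Glenwood Springs", "Granby"},
    }

    def path(graph, src, dst):
        # Breadth-first search; returns a list of hops, or None if unreachable.
        seen, queue = {src}, deque([[src]])
        while queue:
            route = queue.popleft()
            if route[-1] == dst:
                return route
            for nxt in graph[route[-1]] - seen:
                seen.add(nxt)
                queue.append(route + [nxt])
        return None

    print(path(ring, "Granby", "Meeker"))  # normal route, via Kremmling
    ring["Granby"].discard("Kremmling")    # simulate a fiber cut on one link
    ring["Kremmling"].discard("Granby")
    print(path(ring, "Granby", "Meeker"))  # reroutes the long way; no outage

Production networks do this with routing protocols rather than a search over a hand-built map, but the principle is the same: as long as a second physical path exists, a single cut doesn't take anyone offline.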

Government-owned internet

Project Thor represents a municipal success story in what has been a contentious battle in Colorado over who controls internet service.

Back in 2005, state lawmakers passed a law preventing municipalities from becoming internet providers. But beginning in 2008 with the city of Glenwood Springs, communities began voting to opt out of Senate Bill 152. Today, more than 100 communities are exploring or offering their own broadband service.

Criticism came — and still comes — largely from the cable and telecom industry, which backed the 2005 bill as the fiscally responsible approach to taxpayer money.

“Where you’re using government money to put in a duplicate network in a community that already has broadband, that I think is a waste of money,” said Ron Rizzuto, a finance professor at University of Denver’s Daniels College of Business and a consultant for the cable industry. “And it’s a conflict where you have the government competing with the private sector.”

He pointed to projects in Fort Collins, where the city is constructing a broadband network. But Rizzuto feels differently about Thor.

“(Thor) is a good example of why you need the government to be the catalyst. The private sector is not stepping up,” he said. “The government sector can help entice the private sector to participate because the private sector won’t act on its own.”

Northwest COG owns the Thor network. The council is made up of local governments from Jackson, Grand, Summit, Eagle and Pitkin counties. Not all the county governments joined the broadband project as partners, though some of their cities did. And neighboring communities — including Glenwood Springs, Georgetown and Meeker — chose to join and help finance construction of the $2.6 million project.

The new network is available to large users who want to pay for it. So far, neither Comcast nor CenturyLink has joined.

CenturyLink is plugging away to expand faster internet to 50,000 homes and businesses in rural Colorado by 2021 with the help of federal funds. Last July, CenturyLink officials told The Colorado Sun that even with federal funds, it’s been able to get to only 60% of the targeted population and cannot reach every rural Colorado consumer.

Thor, however, isn’t going anywhere CenturyLink doesn’t already have service, the company said in a statement. The new network “does not add any significant route diversity and, unfortunately, does not provide the last-mile facilities needed to deliver broadband to rural Colorado consumers and businesses, which are the most expensive areas to serve.

“We hope that future taxpayer-funded projects focus on meeting the needs of unserved and underserved Coloradans, rather than competing with existing facilities and networks,” the Monroe, Louisiana-based telecom wrote.

Meanwhile, Comcast is expanding its cable broadband service into Eagle and Gypsum. Comcast still offers its service to communities involved in Project Thor and “we appreciate the valuable partnerships we have in those communities, and will remain open to others that could benefit customers,” the company said in a written statement.

Expanding broadband in rural areas has been difficult for private companies, said Chris Mitchell, director of the Community Broadband Networks at the Institute for Local Self-Reliance, which tracks municipal-owned broadband.

“Some people believe that the existing companies can be regulated to make sure everyone has access. I’ve lost faith in that approach. I don’t think you can force CenturyLink to meet rural Colorado needs,” Mitchell said. “In particular, CenturyLink’s (former) CFO has said they’re focused on urban enterprise customers and they are not trying to invest a lot in rural areas, which is a perfectly rational thing for them to do as a publicly owned company.”

That’s why Project Thor is “an impressive feat,” Mitchell said. He believes it will pay long-term dividends for those rural communities.

“Colorado already has a good system of regional collaboration that other states don’t have,” Mitchell said. “So in Colorado, I think it makes sense for local governments to play a major role in solving this problem.”

Thor’s origins

Project Thor sprang up in 2018 and was named after the Norse god of thunder (“The image of the hammer and breaking down roadblocks and breaking down barriers really worked,” Walowitz said).

But the project’s origin dates back to 2013 when the council studied internet service in the region. Back then, 76% of Colorado’s households had broadband access if you counted connections of 200 kbps as broadband, which the Federal Communications Commission did. That’s one-fifth of a megabit per second. In the Front Range, speeds were about 10 mbps.

“In sum, we find northwest Colorado broadband sits at the tail end of a middling state in a middling country,” according to the Northwest COG’s report from 2013. “The NWCCOG has decided that simply isn’t good enough.”

Northwest COG then hired Walowitz, who helped connect the dots for municipalities to get faster broadband. The council helped Jackson County attract a Wyoming broadband provider in 2017. It promoted the benefits of broadband in Rio Blanco County, where 1 gbps service is now available in Meeker and Rangely, with wireless service available to “the remaining, most remote 7,000 residents.” Red Cliff’s broadband service went live in December 2017.

But there remained a major problem for every town even after their residents voted to opt out of the 2005 law. In those rural communities, if a part of the fiber line inadvertently got cut, there would be no internet since there was often only one internet pipe into the region.

Hence, Project Thor would build a series of loops and provide alternate lines. The momentum grew, attracting local governments, a school district, a medical office, an electric co-op, internet service providers and even the Colorado Department of Transportation, which had existing fiber along local roadways.

“Most of what we did was leverage existing fiber. That’s what made the cost of this project so low comparatively. If we had to build 400 miles of fiber, that would have been a significant investment,” Walowitz said. “… We knew that if we built it on our own, it would have been unaffordable for the region.”

The state Department of Local Affairs kicked in about $1.3 million in grants, while participating communities matched the funds.

Thor’s final price tag? $2.6 million.

The affordability factor

While reliability was the key goal, more affordable service was also a priority. Years before Thor became a project, Tim Miles, technology director of the Steamboat Springs School District, wasn't having any luck negotiating a lower price for internet service from the town's main provider, CenturyLink. He said the district paid about 10 times more for the same amount of data as school districts in more urban areas. He felt helpless.

He realized he needed some political power. So he reached out to city and county officials and, together, they negotiated as a single, larger customer. It worked. Prices came down for the school district. They formed Northwest Colorado Broadband, a nonprofit cooperative that was soon joined by the local hospital, electric utility, community college and others.

But there were still reliability issues resulting in outages. So the group built its own redundant system and acquired its own internet lines. Eventually, it weaned itself off CenturyLink.

Thor was built on the same premise: as a group, members could negotiate lower prices. Northwest Colorado Broadband (NCB) had also reached the 10-gig capacity of its own network and would have had to invest heavily to accommodate more users. So the cooperative joined the Thor effort from the start.

“Thor is exactly what we did but just with more parties,” Miles said. “And now we share that cost and so I’m paying less now to join Thor, and I can turn the dial to 100 gigs. And I’m paying less than I was when I was paying (NCB) for 10 gigs.”


For the Steamboat Springs school district, Miles said he’s now paying $500 a month less for 10 times the amount of bandwidth. And that’s already a major savings from what the district once paid CenturyLink years ago.

It’s a “drastic change” in how much the internet now costs for some Thor partners. “I’ll say we were paying at least 10 times more per megabit than what we’re currently paying” for Thor, said Wissenbach, with the Middle Park Health clinics.

Northwest COG owns the network and splits its annual $1.1 million operating costs among its partners based on the amount of data they use. Partners host “Meet Me” centers in 14 communities that opted in for internet. For Northwest COG, it’s a break-even venture where partners pay per megabyte — $1.30 per MB for private-sector partners and $1.02 for public-sector partners. That’s about half what other middle-mile providers charge, Walowitz said.

“Everyone pays, whether at a large or small Meet Me center, they still pay the same cost per megabyte because we aim to be fair and equitable,” he said. “Project THOR is structured to be an enterprise that covers its costs. It is not a profit center for Northwest COG.”
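As a rough illustration of that break-even, usage-based cost sharing, here is a minimal Python sketch. Only the two rates come from the article; the partner names and usage figures are invented, and the article's per-megabyte unit is taken as given.

    # Sketch of Project Thor's break-even cost-sharing model as described
    # above. Only the two rates come from the article; the partners and
    # their usage figures are invented for illustration.
    RATES = {"private": 1.30, "public": 1.02}  # dollars per MB, as quoted

    partners = [
        ("local ISP",       "private", 5000),  # hypothetical usage
        ("school district", "public",  2000),
        ("medical clinic",  "public",  1500),
    ]

    for name, sector, usage in partners:
        print(f"{name}: ${RATES[sector] * usage:,.2f}")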

Hope for home users

Project Thor isn’t an internet service sold directly to home users. But because Thor now offers a reliable backbone, local internet service providers can tap into it and expand service into new communities or upgrade speeds for existing ones.

That’s the case for Visionary Broadband, a subsidiary of Mammoth Networks, which launched wireless broadband service in Kremmling in August. Luminate Broadband, part of Yampa Valley Electric Association, is expanding in Routt County. Both parent companies are partners in Thor.

For residents eager for faster internet, the payoff is that Thor is already helping attract more ISPs to the region. One is Allo Communications, which is rolling out service in the Town of Breckenridge. The Nebraska company agreed to expand in the town because Thor provided the needed redundancy.

“Even if it was for a maintenance period at night, it’s not responsible (to cause an outage) when you’re providing services to first responders or critical businesses or those types of things,” said Brad Moline, Allo’s president and founder. “It’s just not how we do business.”

Summit County and Breckenridge are handling the installation to residents’ homes, with the town expected to spend more than $20 million to finance the project. Allo manages the customer side of the service. According to Allo, its prices start at $60 for a 500-mbps connection. Allo relies on two sources for internet: Thor and the town’s existing provider, CenturyLink. Without two sources, Breckenridge users would be in a bind if there were a fiber cut.

“The biggest problem that the mountain communities have is with resiliency and redundancy and telecommunications services, especially internet services,” said Andy Atencio, the IT director for Summit County, a Thor partner. “So one of the things that we’re getting out of Thor is the ability to really provide redundant and resilient service to customers that are able to get connected to it.”

Ultimately, Thor exists to equip rural governments with tools to help their communities access services and other benefits that are increasingly out of reach offline, such as telemedicine, banking and even news. Walowitz said that’s Northwest COG’s goal: helping its local governments.

“There are increased numbers of our population that are getting either improved broadband or just getting broadband. And I believe that will continue to expand,” Walowitz said. “The reality is that local governments are not in the business of marketing other than, ‘Hey come to our town, it’s really cool.’ … They’re not in the telecom business. We’re helping bring them to an understanding on how they can help enable more providers to come into town and improve broadband in their communities.”
https://coloradosun.com/2020/04/16/i...-nwccog-sb152/





Your Internet is Working. Thank these Cold War-Era Pioneers Who Designed it to Handle Almost Anything

Coronavirus may have forced people to stay at home, but the Internet these scientists envisioned long ago is keeping the world connected
Craig Timberg

Coronavirus knocked down — at least for a time — Internet pioneer Vinton Cerf, who offers this reflection on the experience: “I don’t recommend it … It’s very debilitating.”

Cerf, 76 and now recovering in his Northern Virginia home, has better news to report about the computer network he and others spent much of their lives creating. Despite some problems, the Internet overall is handling unprecedented surges of demand as it keeps a fractured world connected at a time of global catastrophe.

“This basic architecture is 50 years old, and everyone is online,” Cerf noted in a video interview over Google Hangouts, with a mix of triumph and wonder in his voice. “And the thing is not collapsing.”

The Internet, born as a Pentagon project during the chillier years of the Cold War, has taken such a central role in 21st-century civilian society, culture and business that few pause any longer to appreciate its wonders — except perhaps, as in the past few weeks, when it becomes even more central to our lives.

Many facets of human life — work, school, banking, shopping, flirting, live music, government services, chats with friends, calls to aging parents — have moved online in this era of social distancing, all without breaking the network. It has groaned here and there, as anyone who has struggled through a glitchy video conference knows, but it has not failed.

“Resiliency and redundancy are very much a part of the Internet design,” explained Cerf, whose passion for touting the wonders of computer networking prompted Google in 2005 to name him its “Chief Internet Evangelist,” a title he still holds.

Comcast, the nation’s largest source of residential Internet, serving more than 26 million homes, reports peak traffic was up by nearly one-third in March, with some areas reaching as high as 60 percent above normal. Demand for online voice, video and VPN connections — all staples of remote work — has surged, and peak usage hours have shifted from evenings, when people typically stream video for entertainment, to daytime work hours.

Concerns about shifting demands prompted European officials to request downgrades in video streaming quality from major services such as Netflix and YouTube, and there have been localized Internet outages and other problems, including the breakage of a key transmission cable running down the west coast of Africa — an incident with no connection to the coronavirus pandemic. Heavier use of home WiFi also has revealed frustrating limits to those networks.

But so far Internet industry officials report they’ve managed the shifting loads and surges. To a substantial extent, the network has managed them automatically because its underlying protocols adapt to shifting conditions, working around trouble spots to find more efficient routes for data transmissions and managing glitches in a way that doesn’t break connections entirely.

Some credit goes to Comcast, Google and the other giant, well-resourced corporations essential to the Internet’s operation today. But perhaps even more goes to the seminal engineers and scientists like Cerf, who for decades worked to create a particular kind of global network — open, efficient, resilient and highly interoperable so anyone could join and nobody needed to be in charge.

“They’re deservedly taking a bit of a moment for a high five right now,” said Jason Livingood, a Comcast vice president who has briefed some members of the Internet’s founding generation about how the company has been handling increased demands.

Cerf, along with fellow computer scientist Robert E. Kahn, was a driving force in developing key Internet protocols in the 1970s for the Pentagon’s Defense Advanced Research Projects Agency, which provided early research funding but ultimately relinquished control of the network it spawned. Cerf also was among a gang of self-described “Netheads” who led an insurgency against the dominant forces in telecommunications at the time, dubbed the “Bellheads” for their loyalty to the Bell Telephone Company and its legacy technologies.

Bell, which dominated U.S. telephone service until it was broken up in the 1980s, and similar monopolies in other countries wanted to connect computers through a system much like their lucrative telephone systems, with fixed networks of connections run by central entities that could make all of the major technological decisions, control access and charge whatever the market — or government regulators — would allow.

The vision of the Netheads was comparatively anarchic, relying on technological insights and a lot of faith in collaboration. The result was a network — or really, a network of networks — with no chief executive, no police, no taxman and no laws.

In their place were technical protocols, arrived at through a process for developing expert consensus, that offered anyone access to the digital world from any properly configured device. Those devices, once numbering in the dozens, now run to the tens of billions: phones, televisions, cars, dams, drones, satellites, thermometers, garbage cans, refrigerators, watches and so much more.

The Netheads’ idea of a globe-spanning network that no single company or government controlled goes a long way toward explaining why an Indonesian shopkeeper with a phone made in China can log on to an American social network to chat — face to face and almost instantaneously — with her friend in Nigeria. That capability still exists, even as much of the world has banned or restricted international travel.

“You’re seeing a success story right now,” said David D. Clark, a Massachusetts Institute of Technology computer scientist who worked on early Internet protocols, speaking by the videoconferencing service Zoom. “If we didn’t have the Internet, we’d be in an incredibly different place right now. What if this had happened in the 1980s?”

Such a system carries a notable cost in terms of security and privacy, a fact the world rediscovers every time there’s a major data breach, ransomware attack or controversy over the amount of information governments and private companies collect about anyone who’s online — a category that includes more than half of the world’s almost 8 billion people.

But the lack of a central authority is key to why the Internet works as well as it does, especially at times of unforeseen demands.

Some of the early Internet architects — Cerf among them, from his position at the Pentagon — were determined to design a system that could continue operating through almost anything, including a nuclear attack from the Soviets.

That’s one reason the system doesn’t have any preferred path from Point A to Point B. It continuously calculates and recalculates the best route, and if something in the middle fails, the computers that calculate transmission paths find new routes — without having to ask anyone’s permission to do so.

Steve Crocker, a networking pioneer like Cerf, compared this quality to that of a sponge, an organism whose functions are so widely distributed that breaking one part does not typically cause the entire organism to die.

“You can do damage to a portion of it, and the rest of it just lumbers forward,” Crocker said, also speaking by Zoom.

Even more elementally, the Netheads believed in an innovation called “packet switching,” which broke from the telephone company’s traditional model, called “circuit switching,” which dedicated a line to a single conversation and left it open until the participants hung up.

The Netheads considered that terribly wasteful given that any conversation includes pauses or gaps that could be used to transmit data. Instead, they embraced a model in which all communications were broken into chunks, called packets, that continuously shuttled back and forth over shared lines, without pauses.

The computers at either end of these connections reassembled the packets into whatever they started as — emails, photos, articles or video — but the network itself didn’t know or care what it was carrying. It just moved the packets around and let the recipient devices figure out what to do.

That simplicity, almost an intentional brainlessness at the Internet’s most fundamental level, is a key to its adaptability. As many others have said, it’s a web of highways everyone can use for almost any purpose they desire.
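That division of labor is simple enough to sketch in a few lines of code. The toy Python below is a cartoon of the idea, with none of the addressing, headers or checksums real IP packets carry: it chops a message into numbered chunks, lets them arrive in scrambled order, and reassembles them at the destination.

    import random

    # Toy illustration of packet switching: chop a message into small
    # numbered chunks, let the network deliver them in any order, and
    # reassemble by sequence number at the destination. Real packets
    # also carry addresses, checksums and other headers.
    def to_packets(message, size=8):
        return [(seq, message[i:i + size])
                for seq, i in enumerate(range(0, len(message), size))]

    def reassemble(packets):
        return "".join(chunk for _, chunk in sorted(packets))

    msg = "The network itself didn't know or care what it was carrying."
    packets = to_packets(msg)
    random.shuffle(packets)  # packets may take different routes, arrive out of order
    assert reassemble(packets) == msg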

Many of the Internet’s founding generation have memories of trying to convince various Bellheads packet-switching was the inevitable future of telecommunications — cheaper, faster, easier to scale and vastly more efficient and adaptable.

Those anecdotes all end the same way, with the telephone company titans of the day essentially treating the Netheads as precocious but fundamentally misguided children who, some day, might understand how telecommunications technology really worked. Several acknowledged they celebrated just a bit when the telephone companies gradually abandoned old-fashioned circuit-switching for what was called “Voice Over IP” or VoIP. It was essentially transmitting voice calls over the Internet — using the same technical protocols that Cerf and others had developed decades earlier.

Leonard Kleinrock, one of three scientists credited with inventing the concept of packet switching in the 1960s, also was present for the first transmission on the rudimentary network that would, years later, become the Internet.

That was Oct. 29, 1969, and Kleinrock was a computer scientist at the University of California at Los Angeles. A student programmer tried to send the message “login” to a computer more than 300 miles away, at the Stanford Research Institute, but got only as far as the first two letters — “L” and “O” — before the connection crashed.

Retelling the story by phone, over a line using the Internet’s packet-switching technology instead of the one long preferred by the “Bellheads,” he recalled his own experience in trying to convince some phone company executives that he had discovered a technology that would change the world.

“They said, ‘Little boy, go away,’” Kleinrock said. “So we went away.”

And now Kleinrock, 85 and staying home to minimize the risk of catching the coronavirus, is enjoying a home Internet connection 2,000 times faster than the phone-booth-sized communications device that Internet pioneers used in 1969.

“The network,” he said, “has been able to adapt in a beautiful way.”
https://www.washingtonpost.com/techn...most-anything/

















Until next week,

- js.



















Current Week In Review





Recent WiRs -

April 11th, April 4th, March 28th, March 21st

Jack Spratts' Week In Review is published every Friday. Submit letters, articles, press releases, comments, questions etc. in plain text English to jackspratts (at) lycos (dot) com. Submission deadlines are Thursdays @ 1400 UTC. Please include contact info. The right to publish all remarks is reserved.


"The First Amendment rests on the assumption that the widest possible dissemination of information from diverse and antagonistic sources is essential to the welfare of the public."
- Hugo Black