P2P-Zone  

14-02-18, 09:15 AM
JackSpratts
 
Peer-To-Peer News - The Week In Review - February 17th, ’18

Since 2002


"Rejecting years of settled precedent, a federal court in New York has ruled that you could infringe copyright simply by embedding a tweet in a web page." – Daniel Nazer

February 17th, 2018




The Long, Slow Decline of BitTorrent

The long hard road out of hell...
Jonathan Bailey

In 2006, BitTorrent, or more broadly peer-to-peer (P2P) file sharing, was king.

In a study from January of that year, P2P traffic accounted for over 70% of all internet traffic. Though, at that time, BitTorrent shared the file sharing crown with other networks, it quickly moved to become the number one file sharing protocol, a title it would hold decisively by 2008, in part due to an incredible period of growth in late 2007/early 2008.

However, even by then cracks were showing in the P2P armor. By late 2007, web traffic had overtaken P2P traffic. This was largely because of the meteoric rise of YouTube.

By 2011, P2P had fallen to under 19% in North America and was beaten by Netflix during peak times. By 2013 that traffic was down to just 7.39 percent and represented a drop not just in percentage, but in actual traffic. In 2015, it was estimated to be 3 percent, a percentage that put it on par with Hulu, the fourth most popular video streaming site.

While this shift has been most acutely felt in North America, studies have found P2P to be on the decline elsewhere as well, including in Europe and Asia.

This decreased prominence was laid bare in May of this year, when pirates threatened to leak episodes of Netflix’s Orange is the New Black before their slated release. Their demand for ransom was rebuffed, and they posted the files on BitTorrent sites with almost no impact.

This raises a simple question: What’s eating away at BitTorrent? Why is BitTorrent the only kind of internet traffic that is neither growing nor expected to grow?

To understand why, we have to look at all of the variables at play, see how BitTorrent is under assault from all sides, and consider why piracy may never be the same.

Cause 1: Better Legitimate Alternatives

When the decline of BitTorrent and P2P file sharing is raised, the credit is usually given largely to legitimate alternatives such as Netflix, Hulu and Amazon Prime.

This is supported by the numbers as well. Netflix, YouTube, Amazon Video, iTunes and Hulu combine to make up well over 60% of all peak internet traffic in North America. That’s 20 times the estimated size of BitTorrent.

Most convincing of all, the rise of Netflix as a source of traffic largely mirrors BitTorrent’s fall. This makes it clear that, as users were firing up Netflix, they were doing so at the expense of P2P file sharing.

Simply put, when a month of unlimited streaming costs less than a lunch, people snap it up. Even more so when the library of content is sound and the ease/reliability of streaming is very high.

There’s no doubt that Netflix and its competitors have played a significant role in the downfall of BitTorrent.

Cause 2: Streaming Piracy

However, legitimate streaming services weren’t the only destination for former BitTorrent users. Many leapt onto pirate streaming services, which made up 74% of all visits to pirated film sites last year.

Much like their legitimate counterparts, pirate streaming sites offer greater convenience and security than BitTorrent.

In this case, that security is two-fold.

First, it includes security from viruses and other malware that’s common on BitTorrent downloads and BitTorrent software. Second, it protects against legal threats filed against BitTorrent users.

Though the risk of using BitTorrent is small, especially if done intelligently, streaming sites are seen as far safer because they don’t (or at least shouldn’t) download anything to the computer or require the user to re-share the content, which is what greatly expands a BitTorrent user’s potential legal liability.

Streaming pirate sites are more convenient than BitTorrent, especially when a permanent download isn’t useful, and tools such as Kodi boxes make it even easier. So much so that they are actively being targeted for bans.

Cause 3: Enforcement

The role of copyright enforcement in the decline of P2P file sharing is a hotly-debated one. Some headline-grabbing steps such as the arrest of the founders of The Pirate Bay did little to blunt piracy or even shutter the site.

However, in recent years anti-piracy efforts have instead focused on a more subdued approach, one that’s commonly referred to as “follow the money”. This approach works to discourage advertising and other partners, such as hosts and domain registrars, from working with pirate sites.

While arrests still do happen, such as the 2016 arrests of the alleged founders of Kickass Torrents, other sites have shut down of their own accord, such as ExtraTorrent, which closed recently.

While it’s hard to know for certain, the call for donations to save ExtraTorrent’s release group makes it clear that money was at least part of the reason. That, in turn, is likely due to a combination of dwindling ad revenue and the increasing expenses associated with running a pirate site in 2017.

A survey of pirate sites by TorrentFreak found that, even among large sites, many were either barely breaking even or losing money, making them vulnerable to this approach.

Though some BitTorrent sites make it work, it’s getting more and more difficult to do so.

Cause 4: Demographics

The peak for P2P piracy was, by most accounts, between 2003 and 2006. At the same time, piracy demographics skew heavily to a younger audience with 18-29 year olds being significantly more likely to pirate than the adult population at large.

Now, more than a decade later, the people who were heavy into P2P piracy are no longer in the traditional demographic for piracy. Though people of all ages pirate, the older people get, the less likely they are to pirate and the less they take.

As for the new generation, they didn’t grow up with Napster (which is celebrating its 18th birthday today) or with BitTorrent. They grew up with iTunes, YouTube and Netflix. They’re much more likely to have access to paid services and to use legitimate portals.

In short, young people have greater access to legitimate services than they did 10 years ago and, when they do pirate, they’re going to turn more to streaming pirate sites rather than BitTorrent.

After all, the convenience of Netflix and Spotify set the bar for what they want out of their content, whether they get it legitimately or not.

Meanwhile, generation BitTorrent grew up and left the dorm room.

Cause 5: VPN Usage

Finally, there’s one other possibility for why BitTorrent and P2P traffic seems to be falling out of favor: It’s harder to track.

Though there are no hard numbers, it’s been widely reported that VPN usage has been increasing. Whether it’s due to anti-piracy efforts, political changes or privacy concerns, by all accounts VPN usage is growing and it could change the internet.

The real problem here is that there’s no way to know just how much VPN traffic is growing or how much of it is BitTorrent. Though different traffic studies look at the internet from different vantage points, some simply would not be able to see VPN-obscured BitTorrent traffic.

Still, it’s unlikely that VPN usage accounts for the drop in interest in BitTorrent, but it may mean the decline isn’t quite as drastic as some studies indicate.

Bottom Line

For BitTorrent and P2P file sharing in general, the heyday has passed. If you’re a BitTorrent site operator, the halcyon days of the early 2000s are not coming back.

To be clear, there will always be P2P piracy on the internet. Most predictions call for P2P traffic to remain flat over the next few years. But that flatness is against the backdrop of an anticipated explosion in other types of content.

As a percentage of traffic, P2P will, almost certainly, continue to dwindle, becoming less and less significant. Some pockets of the world will hold out longer than others, but the global trend is clear.

This doesn’t mean that piracy is over and done (even P2P piracy) or that there aren’t problems with the legitimate services we have today. Creators can’t quite celebrate yet.

However, something that seemed impossible 10 years ago has come to fruition: BitTorrent is no longer the dominant force on the internet. It isn’t even the largest source of video traffic, nor is it among the top three.

The internet is now dominated by legitimate choices for video streaming.
https://www.plagiarismtoday.com/2017...of-bittorrent/





Internet Users Spoke Up to Keep Safe Harbors Safe
Mitch Stoltz

Today, we delivered a petition to the U.S. Copyright Office to keep copyright’s safe harbors safe. We asked the Copyright Office to remove a bureaucratic requirement that could cause websites and Internet services to lose protection under the Digital Millennium Copyright Act (DMCA). And we asked them to help keep Congress from replacing the DMCA safe harbor with a mandatory filtering law. Internet users from all over the U.S. and beyond added their voices to our petition.

Under current law, the owners of websites and online services can be protected from monetary liability when their users are accused of infringing copyright through the DMCA “safe harbors.” In order to take advantage of these safe harbors, owners must meet many requirements, including participating in the notorious notice-and-takedown procedure for allegedly infringing content. They also must register an agent—someone who can respond to takedown requests—with the Copyright Office.

The DMCA is far from perfect, but provisions like the safe harbor allow websites and other intermediaries that host third-party material to thrive and grow without the constant threat of massive copyright penalties. Without safe harbors, small Internet businesses could face bankruptcy over the infringing activities of just a few users.

Now, a lot of those small sites risk losing their safe harbor protections. That’s because of the Copyright Office’s rules for registering agents. Those registrations used to be valid as long as the information was accurate. Under the Copyright Office’s new rules, website owners must renew their registrations every three years or risk losing safe harbor protections. That means that websites can risk expensive lawsuits for nothing more than forgetting to file a form. As we’ve written before, because the safe harbor already requires websites to submit and post accurate contact information for infringement complaints, there’s no good reason for agent registrations to expire. We’re also afraid that it will disproportionately affect small businesses, nonprofits, and hobbyists, who are least able to have a cadre of lawyers at the ready to meet bureaucratic requirements.

Many website owners have signed up under the Copyright Office’s new agent registration system, which is designed to send reminder emails when the three-year registrations are set to expire. While the new registration system is a vast improvement over the old paper filing system, the expiration requirement is unnecessary and dangerous.

We explained these problems in our petition, and we also explained how the DMCA faces even greater threats. If certain major media and entertainment companies get their way, it will become much more difficult for websites of any size to earn their safe harbor status. That’s because those companies’ lobbyists are pushing for a system where platforms would be required to use computerized filters to check user-uploaded material for potential copyright infringement.

Requiring filters as a condition of safe harbor protections would make it much more difficult for smaller web platforms to get off the ground. Automated filtering technology is expensive—and not very good. Even when big companies use them, they’re extremely error-prone, causing lots of lawful speech to be blocked or removed. A filtering mandate would threaten smaller websites’ ability to host user content at all, cementing the dominance of today’s Internet giants.

If you run a website or online service that stores material posted by users, make sure that you comply with the DMCA’s requirements. Register a DMCA agent through the Copyright Office’s online system, post the same information on your website, and keep it up to date. Meanwhile, we’ll keep telling the Copyright Office, and Congress, to keep the safe harbors safe.
https://www.eff.org/deeplinks/2018/0...e-harbors-safe





US Court Orders TickBox to Remove Pirate Add-Ons from Sold Devices
Luke Bouma

In a landmark court order, a California federal court this week ordered the makers of TickBox TV to remove pirate Kodi add-ons from devices that have already been sold.

This comes from a lawsuit filed by Amazon, Netflix, and several major Hollywood studios. The fight is over a so-called “Kodi Box,” which is one of many streaming devices that come preloaded with third-party Kodi add-ons that offer free access to movies and TV shows.

This is a landmark ruling because, for the first time, a US court has ordered a seller not only to stop selling or promoting a device but also to update boxes already in customers’ hands to remove the infringing software.

This means people who already paid for their devices could soon find their box updated and their access to pirated content cut off. Many of these owners paid between $150 and $300 per device, depending on where they bought the device.

“TickBox shall issue an update to the TickBox launcher software to be automatically downloaded and installed onto any previously distributed TickBox TV device and to be launched when such device connects to the internet,” the injunction reads.

“Upon being launched, the update will delete the Subject [infringing] Software downloaded onto the device prior to the update, or otherwise cause the TickBox TV device to be unable to access any Subject Software downloaded onto or accessed via that device prior to the update.”

Many are looking at the TickBox TV lawsuit as a precedent-setting case in the United States. One that they hope can be used to go after other sellers of so-called “Kodi Boxes.” While this is still only an injunction and not a final ruling, all signs are pointing to a win for Netflix, Amazon, and several Hollywood studios.
https://www.cordcuttersnews.com/us-c...-sold-devices/





Federal Judge Says Embedding a Tweet Can Be Copyright Infringement
Daniel Nazer

Rejecting years of settled precedent, a federal court in New York has ruled that you could infringe copyright simply by embedding a tweet in a web page. Even worse, the logic of the ruling applies to all in-line linking, not just embedding tweets. If adopted by other courts, this legally and technically misguided decision would threaten millions of ordinary Internet users with infringement liability.

This case began when Justin Goldman accused online publications, including Breitbart, Time, Yahoo, Vox Media, and the Boston Globe, of copyright infringement for publishing articles that linked to a photo of NFL star Tom Brady. Goldman took the photo, someone else tweeted it, and the news organizations embedded a link to the tweet in their coverage (the photo was newsworthy because it showed Brady in the Hamptons while the Celtics were trying to recruit Kevin Durant). Goldman said those stories infringe his copyright.

Courts have long held that copyright liability rests with the entity that hosts the infringing content—not someone who simply links to it. The linker generally has no idea that it’s infringing, and isn’t ultimately in control of what content the server will provide when a browser contacts it. This “server test,” originally from a 2007 Ninth Circuit case called Perfect 10 v. Amazon, provides a clear and easy-to-administer rule. It has been a foundation of the modern Internet.
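
To see the technical premise behind the server test, note that an embedded tweet or in-line image is just a reference in the publisher’s markup; the browser fetches the actual bytes from the third party that hosts them. The short Python sketch below (standard library only, with hypothetical markup and URLs, not anything from the court record) shows which host a browser would actually contact.

# Minimal sketch (hypothetical markup and URLs) of the "server test" premise:
# the publisher's page holds only a reference to the image; a browser fetches
# the image bytes from the third party that hosts it.
import re
from urllib.parse import urlparse

PUBLISHER_PAGE = """
<article>
  <p>Our story about the photo...</p>
  <!-- the publisher never stores or transmits the photo itself -->
  <img src="https://media.third-party.example/photos/brady.jpg">
</article>
"""

def hosts_a_browser_would_contact(html):
    """Return the hosts that actually serve the content referenced by the page."""
    return {urlparse(url).netloc for url in re.findall(r'src="([^"]+)"', html)}

print(hosts_a_browser_would_contact(PUBLISHER_PAGE))
# {'media.third-party.example'} -> under Perfect 10, the party hosting the
# file, not the page that embeds it, is the one serving the "display".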

Judge Katherine Forrest rejected the Ninth Circuit’s server test, based in part on a surprising approach to the process of embedding. The opinion describes the simple process of embedding a tweet or image—something done every day by millions of ordinary Internet users—as if it were a highly technical process done by “coders.” That process, she concluded, put publishers, not servers, in the driver’s seat:

[W]hen defendants caused the embedded Tweets to appear on their websites, their actions violated plaintiff’s exclusive display right; the fact that the image was hosted on a server owned and operated by an unrelated third party (Twitter) does not shield them from this result.

She also argued that Perfect 10 (which concerned Google’s image search) could be distinguished because in that case the “user made an active choice to click on an image before it was displayed.” But that was not a detail that the Ninth Circuit relied on in reaching its decision. The Ninth Circuit’s rule—which looks at who actually stores and serves the images for display—is far more sensible.

If this ruling is appealed (there would likely need to be further proceedings in the district court first), the Second Circuit will be asked to consider whether to follow Perfect 10 or Judge Forrest’s new rule. We hope that today’s ruling does not stand. If it did, it would threaten the ubiquitous practice of in-line linking that benefits millions of Internet users every day.
https://www.eff.org/deeplinks/2018/0...t-infringement





Judge Tosses Playboy's Lawsuit Over Links to Centerfold Photos

The magazine sued over a Boing Boing post linking to "scans of every Playboy Playmate centerfold."
Ashley Cullins

Playboy is going to have to get creative to prove a site that linked to a third-party post containing more than 700 photos of centerfolds is liable for copyright infringement after a California federal judge dismissed the magazine's lawsuit.

Playboy in November sued Happy Mutants, claiming the company's site Boing Boing infringed its rights by linking to "Every Playboy Playmate Centerfold Ever."

"Some wonderful person uploaded scans of every Playboy Playmate centerfold to imgur," states the Feb. 29, 2016, post on Boing Boing. "It's an amazing collection, whether your interests are prurient or lofty. Kind of amazing to see how our standards of hotness, and the art of commercial erotic photography, have changed over time."

Boing Boing then linked to the imgur collection and a YouTube video, both of which appear to have since been removed.

U.S. District Judge Fernando Olguin on Wednesday dismissed Playboy's complaint with leave to amend, asking the magazine to carefully evaluate the contentions made in Happy Mutants' motion to dismiss before drafting a second amended complaint.

In short, the website owner argues that there is no evidence that Boing Boing copied or displayed the centerfold photos or that any of its users downloaded the images instead of viewing them.

Olguin quotes a decision in Quentin Tarantino's 2014 lawsuit against Gawker Media over a link to a copy of his Hateful Eight script, which at the time had yet to be produced.

"An allegation that a defendant merely provided the means to accomplish an infringing activity is insufficient to establish a claim for copyright infringement," held U.S. District Judge John F. Walter in that case. "Rather, liability exists if the defendant engages in personal conduct that encourages or assists the infringement."

Walter allowed Tarantino leave to amend, and the filmmaker ultimately withdrew his amended complaint shortly after filing it.

Playboy's amended complaint is due by Feb. 26.
https://www.hollywoodreporter.com/th...photos-1084918





Google Removes ‘View Image’ Button from Search Results to Make Pics Harder to Steal

Better for websites and photographers but worse for users
Jacob Kastrenakes

Google is making a change to image search today that sounds small but will have a big impact: it’s removing the “view image” button that appeared when you clicked on a picture, which allowed you to open the image alone. The button was extremely useful for users, since when you’re searching for a picture, there’s a very good chance that you want to take it and use it for something. Now, you’ll have to take additional steps to save an image.

The change is essentially meant to frustrate users. Google has long been under fire from photographers and publishers who felt that image search allowed people to steal their pictures, and the removal of the view image button is one of many changes being made in response. A deal to show copyright information and improve attribution of Getty photos was announced last week and included these changes.

“Today we're launching some changes on Google Images to help connect users and useful websites. This will include removing the View Image button. The Visit button remains, so users can see images in the context of the webpages they're on. pic.twitter.com/n76KUj4ioD ”

— Google SearchLiaison (@searchliaison) February 15, 2018

The intention seems to be either stopping people from taking an image altogether or driving them through to the website where the image is found, so that the website can serve ads and get revenue and so people are more likely to see any associated copyright information. That’s great news for publishers, but it’s an annoying additional step for someone trying to find a picture. Now you’ll have to wait for a website to load and then scroll through it to find the image. Websites sometimes disable the ability to right click, too, which would make it even harder for someone to grab a photo they’re looking for.

Fortunately, there’s still at least one way around it: if you right click, you can select “open image in new tab” or “view image” (or whatever your browser’s equivalent option is), and you’ll still open up the full-size picture. It’s just a bit less likely that everyone will realize this is an option. And since the “visit” site button is now the most visible button, that’s probably what’ll end up getting clicked the most.

In addition to removing the “view image” button, Google has also removed the “search by image” button that appeared when you opened up a photo. This change isn’t quite as big, however. You’ll still be able to do a reverse image search by dragging the image to the search bar, and Google will still display related images when you click on a search result. The button may have been used by people to find un-watermarked versions of images they were interested in, which is likely part of why Google pulled it.

While it’s good to see Google protecting photographers and driving traffic to websites, it’s still hard not to be a little annoyed by the changes. There are plenty of legitimate and legal uses for copyrighted images. And while it’s fair to ask users to do their due diligence by making sure they’re properly attributing photos, these changes really seem designed to stop images from being grabbed in the first place.
https://www.theverge.com/2018/2/15/1...search-results





Electronics-Recycling Innovator Faces Prison for Trying to Extend Computers' Lives

Prosecutors said Lundgren ripped off Microsoft Corp. by manufacturing 28,000 counterfeit discs
Tom Jackman

Eric Lundgren is obsessed with recycling electronics.

He built an electric car out of recycled parts that far outdistanced a Tesla in a test. He launched what he thinks is the first "electronic hybrid recycling" facility in the United States, which turns discarded cellphones and other electronics into functional devices, slowing the stream of harmful chemicals and metals into landfills and the environment. His Chatsworth company processes more than 41 million pounds of e-waste each year and counts IBM, Motorola and Sprint among its clients.

But an idea Lundgren had to prolong the life of personal computers could land him in prison.

Prosecutors said the 33-year-old ripped off Microsoft Corp. by manufacturing 28,000 counterfeit discs with the company's Windows operating system on them. He was convicted of conspiracy and copyright infringement, which brought a 15-month prison sentence and a $50,000 fine.

In a rare move though, a federal appeals court has granted an emergency stay of the sentence, giving Lundgren another chance to make his argument that the whole thing was a misunderstanding. Lundgren does not deny that he made the discs or that he hoped to sell them. But he says this was no profit-making scheme. By his account, he just wanted to make it easier to extend the usefulness of secondhand computers — keeping more of them out of the trash.

The case centers on "restore discs," which can be used only on computers that already have the licensed Windows software and can be downloaded free from the computer's manufacturer, in this case Dell. The discs are routinely provided to buyers of new computers to enable them to reinstall their operating systems if the computers' hardware fails or must be wiped clean. But they often are lost by the time used computers find their way to a refurbisher.

Lundgren said he thought electronics companies wanted the reuse of computers to be difficult so that people would buy new ones. "I started learning what planned obsolescence was," he said, "and I realized companies make laptops that only lasted as long as the insurance would last. It infuriated me. That's not what a healthy society should have."

He thought that producing and selling restore discs to computer refurbishers — saving them the hassle of downloading the software and burning new discs — would encourage more secondhand sales. In his view, the new owners were entitled to the software, and this just made it easier.

The government, and Microsoft, did not see it that way. Federal prosecutors in Florida obtained a 21-count indictment against Lundgren and his business partner, and Microsoft filed a letter seeking $420,000 in restitution for lost sales. Lundgren claims that the assistant U.S. attorney on the case told him, "Microsoft wants your head on a platter and I'm going to give it to them."

The U.S. attorney's office in Miami and Microsoft declined to comment. Senior U.S. District Judge Daniel T.K. Hurley observed that none of the discs Lundgren made were actually sold and declined to order him to pay restitution. Hurley imposed a 15-month sentence that was less than half of that called for by federal sentencing guidelines, which indicated 36 to 47 months.

In court, the judge made it clear that this was a tough case.

"This case is especially difficult," Hurley told Lundgren at his sentencing last May, "because of who you are today and in terms of who you have become." The judge received evidence of Lundgren's recycling company, IT Asset Partners; his projects to clean up e-waste in Ghana and China; and a 2016 initiative in which Lundgren's company repaired and donated more than 14,000 cellphones and $100,000 to the Cell Phones for Soldiers organization to benefit U.S. soldiers deployed overseas.

Lundgren grew up in Lynden, Wash., where, as a 16-year-old, he became the town's computer recycler after the local sheriff's department heard about his talent for fixing or reusing computer parts. Some parts of a computer — for example, the Apple touch screen — are proprietary and cannot be recycled. But 95% of a computer, Lundgren said — such as the battery or the motor or the circuits — are generic and can be reused or repurposed. He has devoted much time to recovering discarded batteries, whether from cars or computers, and reusing them in wheelchairs, electronics and various vehicles.

At 19, Lundgren moved to Los Angeles and started his first electronics recycling company, and at 20 he landed his first big client: American Airlines, refurbishing and selling about 40,000 computers a year. The computers came with the original license or "certificate of authenticity" stickers and with product key numbers on the sticker, though their hard drives had been erased, so reinstalling Windows was legal, Lundgren said.

"If they brought in a computer without a certificate of authenticity," Lundgren said of his customers, "then we'd part it out" and not refurbish and resell it. He added clients including Dell, Asus, Lenovo and Coca-Cola, handling their discarded computers.

Lundgren became intrigued with following the world's e-waste stream and wound up moving to China. "I learned the back end of what happens when things are thrown away," he said. He became more focused on reducing the ever-growing heaps of discarded plastics and glass that a "use it and toss it" society creates, eliminating the burning of electronic trash that pollutes the air and combating the leakage of computer-based chemicals that filter into the water.

While in China, Lundgren hit upon the idea of selling restore discs to computer refurbishers. The discs work if computers still have their license and product keys available, and the license transfers with the computer, no matter who owns it.

"Microsoft does not sell restore CDs," Lundgren said. "Microsoft sells licenses" that enable their software to work, from $300 for a new operating system to $25 for a license for a refurbisher who wants to resell a computer that does not already have a licensed copy of Windows.

In 2013, federal authorities intercepted shipments of 28,000 restore discs that Lundgren had manufactured in China and sent to his sales partner in Florida. The discs had labels nearly identical to the discs provided by Dell for its computers and had the Windows and Dell logos. "If I had just written 'Eric's Restore Disc' on there, it would have been fine," Lundgren said.

As a result of violating the copyright of Windows and Dell, Lundgren pleaded guilty to two of the 21 counts against him. But he believed that because the discs had no retail value and were seized before they were sold, he would not receive any prison time. His sentence was based on the financial loss involved.

Microsoft attorney Bonnie MacNaughton wrote to Hurley, the judge, describing the case as one of "software piracy," costing the computer industry billions of dollars annually, and saying that prosecution was important "to deter others from engaging in the illicit global trade in decoupled product activation keys" — meaning the sale or trade of the license stickers applied to the originally licensed computer. Microsoft calculated that Lundgren's 28,000 restore discs could have been sold to refurbishers for $20 each, and that 75% of that total was Microsoft's average profit, so it demanded restitution of $420,000.
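
For reference, the restitution figure follows directly from those stated assumptions:

28,000 discs x $20 per disc x 75% average profit = $420,000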

As their expert witness at the sentencing, prosecutors called a Microsoft program manager from Ireland to explain to the judge how the discs worked and their value. Jonathan McGloin testified that Microsoft licensed Windows to computer manufacturers such as Dell and also licensed them to make restore or recovery discs to be included with the new computers. McGloin also testified that Microsoft charges computer refurbishers about $25 for a new license and copy of the software but didn't differentiate that from what was done by Lundgren, who was not making a new copy of the software and intended his restore discs only for computers that were already licensed.

"In essence, I got in the way of Microsoft's profits, so they pushed this into federal court on false pretense," Lundgren said. He said McGloin "testified that a free restore CD was worth the same price as a new Windows operating system with a license. ... This was false and inaccurate testimony provided by Microsoft in an attempt to set a precedent that will scare away future recyclers and refurbishers from reusing computers without first paying Microsoft again for another license. ... Anyone successfully extending the life cycle of computers or diverting these computers from landfills for reuse in society is essentially standing in the way of Microsoft's profits."

Lundgren called his own expert witness, Glenn Weadock, an author of numerous software books who testified for the government in a major antitrust case against Microsoft that was resolved in 2001. Weadock was asked, "In your opinion, without a code, either product key or COA [certificate of authenticity], what is the value of these reinstallation discs?"

"Zero or near zero," Weadock said.

Why would anybody pay for one? Lundgren's lawyer asked.

"There is a convenience factor associated with them," Weadock said.

Still, Hurley decided that Lundgren's 28,000 restore discs had a value of $700,000, and that qualified Lundgren for a 15-month term along with a $50,000 fine. He denied Lundgren's request to remain free pending his appeal, but the U.S. 11th Circuit Court of Appeals granted the request as Lundgren was about to surrender for imprisonment.

"I thought it was freeware," Lundgren said of the restore discs. "If it's free, then I'm just going to duplicate the free repair tool and give it away, and that'll be fine," he thought. "The value's in the license. They didn't understand that."

His appeal is pending before the 11th Circuit.
http://www.latimes.com/business/tech...215-story.html





Man Jailed for Pirating Bible
Robert Egbe

A Federal High Court in Lagos has sentenced a book pirate, Anthony Okojie, to two years imprisonment for pirating Bibles and religious books published by Bible Society of Nigeria (BSN).

Justice Musa Kurya found Okojie guilty of a one-count charge of piracy and sentenced him without an option of fine.

The judgment, which was delivered on January 29, was made available to The Nation by the Nigerian Copyright Commission (NCC).

Okojie’s sentence commences from the day of his arrest, November 26, 2013.

The NCC arraigned the convict before Justice Kurya on November 26, 2013.

Okojie challenged his arraignment but eventually pleaded not guilty to the charge on February 13, 2014.

During trial, the Commission told the court that it received a letter from the BSN on June 12, 2013 alleging that its products (Bibles and motivational books) were being pirated by Okojie at Sango-Ota, Ogun State.

It said it investigated the claim by, among others, making a test purchase from Okojie. On June 21, 2013, NCC operatives raided Okojie’s shop at Ojolowu Shopping Complex along Abeokuta Expressway, Sango-Ota.

They found and confiscated 376 pirated works (Bibles and motivational books) belonging to the BSN, following which Okojie was arrested and arraigned.

The prosecution presented three witnesses and tendered several exhibits to establish its case.

But Okojie neither put in a defence nor called any witness; rather, he filed a no-case submission, which was eventually dismissed by Justice Kurya.

The NCC adopted its written address and Okojie’s counsel adopted his no-case submission on November 24, 2015.

The trial finally came to an end with Okojie’s conviction and sentencing.

Speaking on the judgment, NCC Lagos Office Director/Zonal Manager Mr. Obi Ezeilo said the agency will continue to fish out copyright pirates and prosecute them.

He warned pirates to desist from such acts. Ezeilo affirmed the Commission’s zero tolerance for piracy and warned bookshop operators to ensure that their premises are not used for copyright piracy.
http://thenationonlineng.net/man-jailed-pirating-bible/





An American Served >1 Year in Prison for Conduct that is 100% Legal in Europe. But it’s not Drugs. It’s Copyright. Here’s Why it Matters.
Tim Armstrong

The Administration is receiving some unusual advice from the content industries as it undertakes to renegotiate the North American Free Trade Agreement (NAFTA). It’s not surprising that content providers would weigh in, given the multiple obligations Article 17 of NAFTA imposes on Canada, Mexico, and the United States concerning intellectual property protections. If it’s your ox being gored, you’re going to have some strong opinions; it would be weird if you didn’t.

What makes the content industries’ position unusual is that they are apparently attempting to take some issues off the table entirely as subjects for negotiation. According to this recent TechDirt article (emphasis mine):

“the entertainment industries are arguing that exceptions and limitations are outdated and unnecessary in trade agreements. They say that copyright holders should be protected from piracy and unlawful use of their works, claiming that any exceptions and limitations are a barrier to the protection of American artists.”

That is, the content industries want the final agreement to mandate strong copyright protections without mandating (or, depending on how you read the word “any” in the preceding quotation, permitting) signatory countries to recognize exceptions or limitations on those rights. Copyright rights, on this view, are a one-way ratchet: they can only be strengthened everywhere, never reduced.

There are a lot of reasons to quarrel with this view, assuming it has been accurately reported. It’s profoundly ahistorical; every international agreement that defines copyright rights also fixes outer boundaries of protection. Article 10(1) of the Berne Convention requires every country to allow quotations from published works, and other parts of Berne allow copyright exceptions for teaching, news reporting, and other socially valuable purposes. The WTO TRIPS Agreement, another landmark multilateral treaty, incorporates long-recognized copyright limitations (for example, Article 9(2)’s idea/expression dichotomy) into international law. An argument that copyright limitations and exceptions have no place in trade agreements isn’t an argument against NAFTA; it’s an argument against every copyright treaty ever concluded.

An argument for international copyright law to ignore copyright limitations and exceptions is also bad policy, which is why civil society and public-interest groups have been loudly insisting that NAFTA renegotiations protect fair use and maintain balance between the rights of content creators and users. It’s why the American Library Association, the Center for Democracy and Technology, and over 80 other individuals and organizations from several nations issued the Washington Principles on Copyright Balance in Trade Agreements last fall as NAFTA renegotiation got underway in earnest.

But most of all, taking limitations and exceptions off the table as a subject of trade negotiations is shortsighted even from the perspective of people who want to maximize the scope of copyright rights. To understand why, let’s look at two recent cases involving an American citizen and a European company who did the same thing: they distributed a tool that allowed owners of Nintendo Wii entertainment systems to play games that had not been authorized by Nintendo. These tools, colloquially known as “mod chips,” bypassed the internal authentication system that the Wii used to verify that the games users were seeking to play on their consoles had been approved by Nintendo. Nintendo complained to the authorities in each jurisdiction that helping Wii owners play unauthorized games was against the law because it circumvented a technological protection that Nintendo had put in place to ensure that only authorized games were playable.

Both the United States and Europe have laws against that sort of thing. They have laws of that type because they are required, under Article 11 of the WIPO Copyright Treaty, to

“provide adequate legal protection and effective legal remedies against the circumvention of effective technological measures … that restrict acts, in respect of their works, which are not authorized by the authors concerned or permitted by law.”

The EU Court of Justice, however, refused to condemn the making of a device to circumvent the Wii’s technological protections, because those protections appeared to the court to be broader than necessary to protect Nintendo’s copyright rights. It required, in assessing the legality of the circumvention tool, a consideration of “how often [the] devices are in fact used in order to allow unauthorised copies … and how often that equipment is used for purposes which do not infringe copyright[.]”

In contrast, the American who circumvented the Wii’s technological protection went to prison for twelve months and one day; the opinion of the Court of Appeals did not even address the question whether the “mod chips” he sold could have been put to noninfringing uses.

These two cases involved the same underlying offense and essentially identical substantive laws, written to implement a single treaty obligation, yet they reached opposite outcomes for reasons neither opinion said a word about.

If you believe in meaningful limitations to copyright holders’ exclusive rights (or, for that matter, in the rule of lenity which usually governs complex criminal issues of first impression), you probably prefer the European outcome to the American one: perhaps tools should be evaluated based on whether they are mostly put to infringing or noninfringing use. Maybe the limitation on copyright holders’ exclusive rights that the European court recognized would be a good rule for the United States, too.

On the other hand, if you believe that copyright holders should have the power to control how their works (such as Nintendo’s game console) are used, then the European approach probably infuriates you and the American one probably seems correct. Maybe you wish that European law didn’t recognize a limitation on copyright holders’ exclusive rights in this instance, and that the American view prevailed instead.

The point is, whichever outcome you favor, the only way to get to that outcome is for international treaty negotiators to discuss limitations and exceptions. Taking limitations and exceptions off the table makes it impossible to move the ball in any direction, not just the one the content industries claim to oppose. That’s a foolish way to conduct negotiations, and one all the players in the NAFTA debate should reject.
https://blogs.harvard.edu/infolaw/20...hy-it-matters/





BMG V. Cox is Major Copyright Victory for Music Industry
J. Alexander Lawrence

While reversing and remanding for a new trial in light of certain errors in the jury instructions, the Fourth Circuit Court of Appeals has largely sided with the copyright holders in the dispute between BMG Rights Management and Cox Communications.[1] The case represents a major victory for the music industry in one of its first attempts to hold an internet service provider liable for unauthorized peer-to-peer file sharing by its subscribers. The decision has broad implications beyond the narrow dispute between BMG and Cox.

With respect to two important questions of copyright law on appeal, the Fourth Circuit sided with BMG. First, to receive safe harbor protection under the Digital Millennium Copyright Act, an ISP will be required to take action against a subscriber where there is evidence the subscriber has engaged in repeated acts of infringement, even if the subscriber has never been proven to be an infringer. Second, an ISP can be held contributorily liable for its subscribers' actions, notwithstanding that the service -- providing access to the internet -- is capable of substantial noninfringing uses.

The District Court Action Against Cox

On Nov. 26, 2014, BMG commenced the underlying action against Cox in the United States District Court for the Eastern District of Virginia, home of the "rocket docket." BMG claimed that thousands of Cox subscribers had used Cox's network to illegally download copyrighted works via peer-to-peer file sharing programs, and argued that Cox should be held secondarily liable for their infringement.

As its first line of defense, Cox asserted that its actions were protected by Section 512(i) of the DMCA, a "safe harbor" provision that shields service providers from secondary copyright infringement damages, so long as the service provider "reasonably implement[s]" a policy that provides for the termination of "repeat infringers" in "appropriate circumstances."

In a partial summary judgment decision issued on Nov. 19, 2015, the district court sided with BMG and held that Cox did not qualify for the DMCA safe harbor in that it had not reasonably implemented such a policy. In doing so, the court found that "before the fall of 2012 Cox did not implement its repeat infringer policy. Instead, Cox publicly purported to comply with its policy, while privately disparaging and intentionally circumventing the DMCA's requirements."[2]

It was undisputed that Cox had a written policy that prohibited subscribers from using Cox's network to infringe others' intellectual property and warned that violations could result in the "immediate suspension or termination of either ... access to the service and or [the] Cox account." In practice, however, Cox seldom terminated subscribers. After receiving a notice of a claim of infringement, Cox would place a "strike" on the subscriber's account and, for every notice but the first, would forward the notice to the subscriber. No further action would be taken until the eighth and ninth notices, when the account holder would be directed to a page warning against further infringement. After the 10th, 11th, 12th and 13th notices, the account holder would be shown an additional warning and was required to contact customer support to get back online. After 14 "strikes" within a six-month period, Cox would consider terminating the subscriber's account.
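
To make those escalation thresholds concrete, here is a minimal Python sketch of the thirteen-strike process as the district court described it (an illustration only, not Cox's actual system):

# Graduated response thresholds as described in the district court's findings
# (illustrative restatement only; not Cox's software).

def graduated_response(strike):
    """Return the action taken on a subscriber's nth infringement notice."""
    if strike == 1:
        return "record a strike; do not forward the notice"
    elif strike <= 7:
        return "record a strike and forward the notice to the subscriber"
    elif strike <= 9:
        return "direct the subscriber to a page warning against further infringement"
    elif strike <= 13:
        return "show an additional warning; subscriber must contact customer support to get back online"
    else:
        # 14 or more strikes within a six-month period
        return "consider terminating the subscriber's account"

for n in (1, 5, 8, 12, 14):
    print(n, "->", graduated_response(n))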

The district court also noted that a series of emails from Cox employees suggested that Cox was intentionally circumventing the DMCA requirements. The district court found that the emails demonstrated that "Cox employees followed an unwritten policy put in place by senior members of Cox's abuse group by which accounts used to repeatedly infringe copyrights would be nominally terminated, only to be reactivated upon request. Once these accounts were reactivated, customers were given clean slates, meaning the next notice of infringement Cox received linked to those accounts would be considered the first in Cox's graduate response procedure."[3]

A jury trial commenced in early December 2015 and resulted in a verdict in favor of BMG. The jury found that Cox's subscribers had used its service to infringe copyrighted works, and while Cox was not liable under a theory of vicarious liability, it was liable under a theory of contributory infringement for the acts of its subscribers. The jury awarded $25 million in statutory damages.[4]

Cox thereafter sought judgment as a matter of law that it could not be held contributorily liable because its Internet service is capable of substantial non-infringing uses. In turn, BMG moved for judgment as a matter of law on its vicarious infringement claim.

In a decision on Aug. 8, 2016, the district court denied both motions.[5] With respect to the vicarious infringement claim, which requires that a party have "an obvious and direct financial interest" in the infringing activity, the court noted that substantial evidence supported the jury's finding that Cox had no such financial interest in its subscribers' infringement of copyrights. The court likewise upheld the jury's contributory infringement verdict, reasoning that because Cox had general knowledge of the infringement on its network, and because Cox materially contributed to the infringement by turning a "blind eye" to the acts of its subscribers, sufficient evidence existed for a reasonable jury to find Cox contributorily liable for infringement of BMG's copyrights.

The Fourth Circuit Decision on Appeal

In its decision on appeal, the Fourth Circuit first addressed the district court's summary judgment order stripping Cox of its DMCA safe harbor. The Fourth Circuit affirmed the district court's ruling.

The Fourth Circuit rejected Cox's argument that a subscriber cannot be a "repeat infringer" unless that subscriber has been "adjudicated" and found liable for infringement. Cox had argued that "infringer" must mean an "actual infringer" rather than simply an "alleged infringer" and pointed to other provisions of the Copyright Act where Congress had used terms like "alleged infringement" or "claimed infringement," thereby indicating that Congress knew how to specify where mere allegations of infringement suffice. Cox argued that none of its subscribers had ever been proven to be infringers; they were simply accused of misconduct.

After dispensing with Cox's proffered interpretation of what constitutes a "repeat infringer," the Fourth Circuit then turned to whether Cox reasonably implemented "a policy that provides for the termination in appropriate circumstances" of its subscribers who repeatedly infringe copyrights. The Fourth Circuit found that Cox had not reasonably implemented such a policy. The Fourth Circuit held that "in carrying out its thirteen-strike process, Cox very clearly determined not to terminate subscribers who in fact repeatedly violated the policy."[6] The Fourth Circuit noted that, until September 2012, "Cox never terminated a subscriber for infringement without reactivating them."[7] The Fourth Circuit held that "[a]n ISP cannot claim the protection of the DMCA safe harbor provision merely by terminating customers as a symbolic gesture before indiscriminately reactivating them within a short timeframe."[8] While in September 2012, Cox abandoned its policy of routine reactivation, the Fourth Circuit held that the record showed that "this was a change in form rather than substance, because instead of terminating and then reactivating subscribers, Cox simply stopped terminating them in the first place."[9] The Fourth Circuit noted that Cox could only point to four terminations for repeat copyright infringement.

After addressing the DMCA safe harbor issues, the Fourth Circuit then turned to the secondary liability issues, in particular contributory infringement, in that the claim of vicarious infringement had been rejected by the jury.

The Fourth Circuit recognized the U.S. Supreme Court's holding in Sony Corporation of America v. Universal City Studios Inc., 464 U.S. 417 (1984), that as a general matter, the distributor of a product cannot be held liable for users' infringement so long as the product has "substantial non-infringing uses." In that case, the Supreme Court held that Sony was not secondarily liable for customers' use of Betamax video recorders to copy television programs, because the recorders were also capable of "substantial non-infringing uses." In Metro-Goldwyn-Mayer Studios Inc. v. Grokster Ltd., 545 U.S. 913 (2005), the Supreme Court held that this general rule set forth in Sony had its limits. In Grokster, the Supreme Court held that while the peer-to-peer file sharing network at issue did have substantial noninfringing uses, secondary liability was nonetheless available because the defendants had intentionally induced or encouraged the direct copyright infringement using the service.

Cox argued that its internet service clearly has substantial noninfringing uses and that it never intentionally induced or encouraged direct copyright infringement using its service. The Fourth Circuit disagreed. Citing to the Restatement (Second) of Torts, the Fourth Circuit held that if a person "knows that the consequences are certain, or substantially certain, to result from his act, and still goes ahead, he is treated by the law as if he had in fact desired to produce the result."[10]

In reaching this conclusion, the Fourth Circuit made clear that those who provide a subscription service -- as opposed to a one-time sale of a consumer product like a video recorder -- may find it more difficult to point to the substantial noninfringing uses of the service as a shield against secondary liability. If a copyright holder can show that the provider of a service has actual knowledge of specific acts of infringement (or was willfully blind to such acts) and failed to act to put a stop to the conduct, it can be held secondarily liable for the conduct.

Nonetheless, in that the district court instructed the jury in a manner that could lead to the conclusion that Cox could be found secondarily liable upon a showing of recklessness or negligence, the Fourth Circuit reversed and remanded and ordered a new trial.

The Impact of the Decision

While Cox has obtained a new trial and BMG has lost its multimillion-dollar judgment, the Fourth Circuit has provided important guidance regarding the availability of the DMCA safe harbor and the scope of secondary liability under the Copyright Act.

The Fourth Circuit's decision has broad implications for a whole host of service providers who rely on the DMCA safe harbors and the limits of secondary copyright infringement to avoid the potential for crippling damage awards if held liable for the acts of their subscribers. While the Fourth Circuit does not say exactly what must be done to avoid Cox's fate, the decision certainly provides guidance as to what not to do. Service providers cannot insist that copyright holders establish that subscribers have been adjudicated infringers to qualify as "repeat infringers." Likewise, service providers must act upon reasonable evidence that actual infringement occurred and cannot act to avoid obtaining knowledge of specific instances of actual infringement on their networks.
https://www.lexology.com/library/det...5-b86a573dce45





Comcast Mulling New Bid for Fox Assets: Sources
Anjali Athavaley, Jessica Toonkel

Comcast Corp is considering a new offer for Rupert Murdoch’s Twenty-First Century Fox assets, despite an agreement in December to sell them to Walt Disney Co for $52.4 billion, according to people familiar with the matter.

Comcast’s deliberations indicate that it believes it still has a chance to clinch a deal with Fox, even though its previous bid last year of more than $60 billion was rejected over concerns that regulators worried about media consolidation could thwart it, the sources said.

Comcast may decide not to make any new offer, the sources said. Its decision will be informed by how Fox justifies the deal with Disney in a regulatory filing to its shareholders sometime before they are asked to vote on the deal this summer, the sources added.

The sources asked not to be identified because the deliberations are confidential. Comcast, Disney and Fox gave no immediate comment.

Comcast might be prepared to offer protections to Fox such as agreeing to remove certain assets from the deal that prove controversial in Washington, D.C., including regional sports channels, according to the Wall Street Journal, which first reported on Comcast’s deliberations.

It is possible that, instead of re-engaging in pursuit of all of the Fox assets, Comcast could zero in on something in particular, such as European pay TV giant Sky, the Wall Street Journal report said, citing anonymous sources.

The Murdoch family, which controls Fox, preferred a deal with Disney because it would rather be paid in Disney than Comcast stock, and expects a potential deal with Disney to be cleared by U.S. antitrust regulators more easily, sources told Reuters in December.

Disney struck a deal with Fox to buy film, television and international businesses. The deal is set to bring to a close more than half a century of expansion by Murdoch, 86, who turned a single Australian newspaper he inherited from his father at the age of 21 into one of the world’s most important global news and film conglomerates. The new, slimmed down Fox will focus on TV news and sport.

Reporting by Anjali Athavaley and Jessica Toonkel in New York; Additional reporting by Rama Venkat Raman in Bengaluru; Editing by Alistair Bell
https://www.reuters.com/article/us-f...-idUSKBN1FW05C





Fox Commits to Sky News Independence to Try to Secure Sky Deal

Rupert Murdoch’s Twenty-First Century Fox (FOXA.O) said it would commit to maintain Sky News in Britain for at least five years and would establish an independent board for the channel to try to secure its takeover of pay-TV operator Sky (SKYB.L).

Britain’s competition regulator said last month that Fox’s $15.7 billion deal to buy the 61 percent of Sky it does not already own should be blocked unless a way is found to prevent Murdoch influencing Sky’s news output.

Fox disputed the regulator’s assessment, saying it was based on a number of legal and factual errors.

Nonetheless it said its proposed remedies, based on putting a “firewall” around the 24-hour news channel, went to the heart of addressing the Competition and Markets Authority’s (CMA) concerns.

“The combined effect of the Proposed Firewall Remedies is that there could be no circumstances in which, post-transaction, the MFT (Murdoch family trust) or members of the Murdoch family could influence, whether directly or indirectly, the editorial line or policy of Sky News,” the company said.

It also said it agreed with the CMA’s view that the possibility of Disney buying Fox assets, including its stake in Sky, should be taken into consideration.

The CMA in January put forward three broad possible solutions to its concerns: insulating Sky News from Fox’s influence, spinning off or divesting Sky News, or blocking the deal outright.

Fox’s proposals were released by the CMA on Monday. It is due to present media secretary Matt Hancock with a final report by May 1 and he has said he will rule on the deal by June 14.

In its own submission, Sky said it considered the remedies proposed by Fox - a structural separation of Sky News governance and editorial decision-making underpinned by robust behavioral commitments - would be an effective and comprehensive solution to the regulator’s concerns.

“The proposed remedies (...) decisively eliminate the source of the CMA’s provisional concerns – potential MFT influence over Sky News’ editorial decision-making,” it said.

“They are straightforward to implement and monitor.”

Shares in Sky were trading up 0.3 percent at 10.50 pounds at 1216 GMT.

Reporting by Paul Sandle; editing by Sarah Young and David Evans
https://www.reuters.com/article/us-s...-idUSKBN1FW16M





U.S. Seeks to Block AT&T from Citing Trump Opposition in Merger Lawsuit
Diane Bartz, David Shepardson

The U.S. Department of Justice on Friday moved to prevent AT&T Inc (T.N) from arguing that politics played a role in the government’s decision to stop its merger with Time Warner Inc (TWX.N), a deal that President Donald Trump had publicly criticized.

“There was no selective enforcement,” Justice Department lawyer Craig Conrath said at a pre-trial hearing. “The president is unhappy with CNN. We don’t dispute that. But AT&T wants to turn that into a get-out-of-jail-free card for their illegal merger.”

AT&T and Time Warner’s lawyer Daniel Petrocelli, however, cited Trump’s repeated criticism of the deal as reason to allow the company to argue that the government opposed the deal for political reasons. The company is seeking records of communications between the White House and the Justice Department that describe Trump’s views on the merger.

AT&T wants the judge to review any communications found to see if they bolster their contention that the transaction was singled out because of Trump’s anger with CNN.

The documents were requested as preparation for a March 19 trial in which Judge Richard Leon will decide if the $85 billion deal would raise prices. The Justice Department sued to stop the deal on the grounds that it is illegal under antitrust law.

The government has asked Judge Leon to rule that AT&T may not cite politics, formally known as selective enforcement, as a defense and to quash a request for documents to support that defense.

AT&T’s Petrocelli defended the request. “If there is something in those documents, it’s important for us,” he said at the hearing at the U.S. District Court for the District of Columbia.

Leon said he would rule on Tuesday.

The deal has been followed more closely than most antitrust matters because Trump attacked it while on the campaign trail in 2016. Trump has also repeatedly criticized Time Warner’s CNN news network and, in November, he reiterated his opposition to the proposed transaction.

Conrath said the government’s lawsuit was not motivated by Trump’s irritation with CNN and said it had offered several settlement options that would have allowed AT&T to acquire CNN.

As recently as November, Trump stood by his criticism of the proposed transaction.

“Personally I’ve always felt that that was a deal that’s not a good deal for the country,” the president said. “I think your pricing’s going to go up, I don’t think it’s a good deal for the country.”

Conrath warned that if the Justice Department were forced to conduct a much broader search for additional documents it could delay the trial until July. Leon also said he did not want the case to get “sidetracked.”

“We have no intention of losing this schedule,” Petrocelli said.

Petrocelli said that, in a deposition on Thursday, Time Warner CEO Jeffrey Bewkes had laid out in extensive detail the company’s belief that Trump’s improper influence was a factor.

AT&T had taken the unusual step of including the assistant attorney general for antitrust, Makan Delrahim, on its witness list as it looks to find evidence to support its position that the government was bringing a case because of Trump’s anger.

Conrath offered an affidavit from Delrahim in which he said he had not been instructed or ordered by anyone at the White House or at the Justice Department to bring the lawsuit.

Merger cases, however, are judged legal or illegal depending on whether the deal would raise prices or reduce innovation.

Petrocelli derided the government’s case as “weak,” saying that their economists had determined the price of AT&T’s DirecTV could go down and that there could be a small increase to non-DirecTV consumers. Conrath sharply disagreed, saying the deal would cause “hundreds of millions of dollars of damage.”

Reporting by Diane Bartz and David Shepardson; Writing by Susan Heavey; Editing by Jonathan Oatis and Chris Reese
https://www.reuters.com/article/us-t...-idUSKCN1G02HG





F.C.C. Watchdog Looks Into Changes That Benefited Sinclair
Cecilia Kang

Last April, the chairman of the Federal Communications Commission, Ajit Pai, led the charge for his agency to approve rules allowing television broadcasters to greatly increase the number of stations they own. A few weeks later, Sinclair Broadcasting announced a blockbuster $3.9 billion deal to buy Tribune Media — a deal those new rules made possible.

By the end of the year, in a previously undisclosed move, the top internal watchdog for the F.C.C. opened an investigation into whether Mr. Pai and his aides had improperly pushed for the rule changes and whether they had timed them to benefit Sinclair, according to Representative Frank Pallone of New Jersey and two congressional aides.

“For months I have been trying to get to the bottom of the allegations about Chairman Pai’s relationship with Sinclair Broadcasting,” Mr. Pallone, the top Democrat on the committee that oversees the F.C.C., said in a statement to The New York Times. “I am grateful to the F.C.C.’s inspector general that he has decided to take up this important investigation.”

The extent of the inspector general’s investigation, and when it might conclude, was unclear, but the inquiry puts a spotlight on Mr. Pai’s decisions and whether there had been coordination with the company. It may also force him to answer questions that he has so far avoided addressing in public.

The inquiry could also add ammunition to arguments against the Sinclair-Tribune deal. Public interest groups and Democratic lawmakers, including Mr. Pallone, are strongly opposed to the deal, arguing that it would reduce the number of voices in media and diminish coverage of local news.

Sinclair’s chief executive, Chris Ripley, has called Mr. Pai’s relaxation of media ownership rules a “landmark” development for his company and the industry. A union of Sinclair and Tribune would create the nation’s biggest television broadcaster, reaching seven out of 10 American homes. The F.C.C. and Justice Department are widely expected to approve the merger in the coming weeks.

The office of the F.C.C. inspector general, a nonpartisan post that reports to the agency and regularly updates Congress on some investigations, said it would “not comment on the existence or the nonexistence of an investigation.”

Mr. Pai’s office and Sinclair declined to comment. When the legislators called for an investigation in November, a spokesman for the F.C.C., representing Mr. Pai, said the allegations of favoritism were “baseless.”

“For many years, Chairman Pai has called on the F.C.C. to update its media ownership regulations,” the F.C.C. spokesman said. “The chairman is sticking to his long-held views, and given the strong case for modernizing these rules, it’s not surprising that those who disagree with him would prefer to do whatever they can to distract from the merits of his proposals.”

A New York Times investigation published in August found that Mr. Pai and his staff members had met and corresponded with Sinclair executives several times. One meeting, with Sinclair’s executive chairman, took place days before Mr. Pai, who was appointed by President Trump, took over as F.C.C. chairman.

Sinclair’s top lobbyist, a former F.C.C. official, also communicated frequently with former agency colleagues and pushed for the relaxation of media ownership rules. The language the lobbyist used about loosening the rules has tracked closely with the analysis and language used by Mr. Pai in speeches favoring such changes.

In November, several Democrats in Congress, including Mr. Pallone, called on the inspector general’s office to explore all communications — including personal emails, social media accounts, text messages and phone calls — between Sinclair and Mr. Pai and his staff.

The lawmakers also asked for communications between Mr. Pai’s office and the White House. They pointed to a report in March 2017 from The New York Post, in which Mr. Trump is said to have met with Sinclair’s executive chairman, David Smith, and discussed F.C.C. rules.

Some members of Congress have asked Mr. Pai for such communications, but he has not responded.

The F.C.C. inspector general, David L. Hunt, and other officials in his office met with aides in the House and Senate, including those for Mr. Pallone, in December. The F.C.C. officials told the aides that they would open an investigation, according to four people with knowledge of the meetings.

In later conversations, F.C.C. officials said that an investigation was underway, according to two other aides.

The aides, all of whom work for Democratic lawmakers, would speak only on the condition of anonymity because the investigation is private.

The investigation could put the F.C.C. inspector general’s office in a high-profile situation.

Mr. Hunt was promoted to lead the office in 2011 by Julius Genachowski, a Democrat and the F.C.C.’s then-chairman, after working in the agency for about five years. The office investigates potential violations of civil and criminal laws by agency staff members and companies that receive money from the agency. On Wednesday, the inspector general for veterans affairs, a similar position, released a scathing report about travel spending by the department’s secretary, David J. Shulkin.

The F.C.C.’s inspector general does not make public all of its investigations. But details of some investigations have been disclosed through Freedom of Information Act requests and through the office’s reports to Congress.

In 2015, the inspector general’s office looked into possible coordination between the Obama administration and the F.C.C. chairman at the time, Tom Wheeler, on the creation of so-called net neutrality rules. The rules prevented broadband providers from blocking or slowing traffic to consumers. The inspector general said its investigation found no clear evidence of improper conduct.

Antitrust experts said this new investigation may complicate the reviews of the Sinclair-Tribune deal by the F.C.C. and the Justice Department. Even if the deal were approved, they said, any conclusions of improper conduct by Mr. Pai could give fuel to critics to challenge the review in courts.

“An investigation could cast a cloud over the whole process,” said Andrew Schwartzman, a senior fellow at Georgetown Law Center’s Institute for Public Representation. “For the review, knowledge of an investigation could generate caution and even delay completion of the deal.”
https://www.nytimes.com/2018/02/15/t...-ajit-pai.html





US Representatives Pile Questions on the FCC Regarding Net Neutrality Comment Process
Devin Coldewey

The order to roll back net neutrality may have been made (though it’s not quite in effect yet), but the fight to restore it is ongoing. Twenty-four members of Congress have asked the FCC for answers on a variety of topics relating to the huge and controversial comment docket for the “Restoring Internet Freedom” order, with a due date of March 6.

The letter, from the House Committee on Energy and Commerce Democrats, can be read in its entirety here; a few highlights from the 16 multi-part questions can be found below:

• How did the FCC decide how to handle the unprecedented volume of comment files? Was there discussion of how best to do so? What resources were dedicated to this immense task?
• How were comments determined to be “devoid of substance”? How were others determined to “bear substantively” on the issue? What were the training methods and guidelines for staff making these determinations? How many staff hours were dedicated to this?
• “Several members of this Committee filed comments in the docket of this proceeding, yet a number of the arguments raised in those comments were either dismissed out of hand or overlooked entirely. How did the Commission decide which arguments filed by members of Congress should not be considered?” (That’s a nasty one.)
• Why did the FCC fail to cite a single consumer comment in the order?
• Why has the FCC failed to cooperate with the NY attorney general’s investigation into potential identity theft?
• Why did the FCC choose to not implement any kind of identity verification in its comment platform?
• The FCC says it excluded comments that used fake names, but how was it determined which these were? And if it is known which comments used fake names, why were these comments not removed from the docket?
• Internal communications, analysis and documents are requested for most of these questions.

This isn’t the first such letter to be sent to Chairman Pai, and it probably won’t be the last. While he and his staff have responded to some letters more or less publicly, others may not have been published widely. So there’s no reason to think he won’t respond, although if previous replies are any indication, the answers to these questions may be less informative than Congress (and we) would like.
https://techcrunch.com/2018/02/13/us...mment-process/





Judge Won't Let FCC's Net Neutrality Repeal Stop Lawsuit Alleging Charter Throttled Netflix
Eriq Gardner

In December, the Federal Communications Commission voted to repeal landmark net neutrality rules. The agency's "Restoring Internet Freedom Order" went beyond rolling back the rules that prohibited ISPs from blocking and throttling content. The FCC also attempted to preempt any states wishing to enact their own policies. But in the first significant decision referring to the repeal since FCC chairman Ajit Pai got his way, a New York judge on Friday ruled that the rescinding of net neutrality rules wasn't relevant to an ongoing lawsuit against Charter Communications.

New York Attorney General Eric Schneiderman filed the lawsuit almost exactly a year ago today. It's alleged that Charter's Spectrum-TWC service promised internet speeds it knew it couldn't deliver and that Spectrum-TWC also misled subscribers by promising reliable access to Netflix, online content and online games. According to the complaint, the ISP intentionally failed to deliver reliable service in a bid to extract fees from backbone and content providers. When Netflix wouldn't pay, this "resulted in subscribers getting poorer quality streams during the very hours when they were most likely to access Netflix," and after Netflix agreed to pay demands, service "improved dramatically."

This arguably is the kind of thing that net neutrality was supposed to prevent. And Charter itself pointed to the net neutrality repeal in a bid to block Schneiderman's claims that Charter had engaged in false advertising and deceptive business practices.

New York Supreme Court Justice O. Peter Sherwood isn't sold.

He writes in an opinion that the FCC's order "which promulgates a new deregulatory policy effectively undoing network neutrality, includes no language purporting to create, extend or modify the preemptive reach of the Transparency Rule," referring to how ISPs have to disclose "actual network performance."

And although Charter attempted to argue that the FCC clarified its intent to stop state and local governments from imposing disclosure obligations on broadband providers that were inconsistent with the FCC's rules, Sherwood points to other language in the "Restoring Internet Freedom Order" saying that states will "continue to play their vital role in protecting consumers from fraud, enforcing fair business practices... and generally responding to consumer inquiries and complaints."

The judge further rejects Charter's motion to dismiss where it argued that federal law preempts Schneiderman's lawsuit. The internet provider argued that the lawsuit conflicted with regulatory safe harbors by treating speed characterizations sanctioned by the FCC as deceptive and interfered with the FCC's policy of giving flexibility in the measurement of actual speeds. Sherwood sees no conflict.

And the judge sees no preemption either, writing "Spectrum-TWC fails to identify any provision of the [Federal Communications Act] that preempts state anti-fraud or consumer-protection claims, or reflects any intention by Congress to make federal law the exclusive source of law protecting consumers from broadband providers' deceptive conduct."

And then in a point of emphasis that could come into play down the road as states look to make their own end-run around the net neutrality repeal, Sherwood adds, "Here, the FCC lacks the authority to preempt state law because the FCA does not clearly authorize it to do so. Indeed, the FCC, itself, has repeatedly recognized that federal regulation of telecommunication carriers co-exists with, rather than displaces, state laws protecting the rights of consumers."
https://www.hollywoodreporter.com/th...ttling-1085591





To Kill Net Neutrality, FCC Might have to Fight More than Half of US States

Bucking FCC and Ajit Pai, lawmakers across US have proposed net neutrality laws.
Jon Brodkin

The legislatures in more than half of US states have pending legislation that would enforce net neutrality, according to a new roundup by advocacy group Free Press. So far, the states that have taken final action have done so through executive orders issued by their governors. Those are Vermont, Hawaii, Montana, New Jersey, and New York.

The legislative process obviously takes longer and is more uncertain because it requires votes by state lawmakers in addition to a governor's signature. Many bills are submitted in state legislatures without ever coming to a vote. But it wouldn't be surprising if some states impose net neutrality laws through the legislative process. The Washington State House of Representatives approved net neutrality rules by a vote of 93-5 on Wednesday, pushing the bill along to the state's Senate. In California, the state Senate passed a net neutrality bill last month.

The 27 states with pending legislation are Alaska, California, Colorado, Connecticut, Delaware, Georgia, Hawaii, Illinois, Iowa, Kansas, Maryland, Massachusetts, Minnesota, Nebraska, New Jersey, New Mexico, New York, North Carolina, Oregon, Pennsylvania, Rhode Island, South Dakota, Tennessee, Vermont, Virginia, Washington, and Wisconsin. Free Press has links to the pending bills or articles about the pending bills in nearly all of these states. (Free Press listed 26 states with legislation but we found out after this article published that Kansas also has pending net neutrality legislation, bringing the total to 27.)

"In the eight weeks since the FCC voted to take away net neutrality, a groundswell of activism by local advocates and politicians has revived prospects for lasting open-Internet safeguards," Free Press strategy director Timothy Karr wrote.

Multiple approaches to net neutrality

The potential role of states in regulating net neutrality is uncertain. The Federal Communications Commission repealed its own net neutrality rules and is attempting to preempt state and local laws that regulate net neutrality.

The executive orders from Vermont, Hawaii, Montana, New Jersey, and New York require ISPs to follow net neutrality principles if they sell Internet service to state agencies. Vermont's governor was the latest to sign such an order, doing so yesterday. Instead of imposing net neutrality restrictions on all Internet providers, these states are using their status as buyers of Internet service to protect net neutrality.

"Through the order, the State of Montana acts as a consumer—not a regulator," Montana Governor Steve Bullock said. "Because there's no mandate, and no new regulations, there's certainly no federal preemption. Companies that don't like Montana's proposed contract terms don't have to do business with the State."

Some legislative attempts, such as this Rhode Island bill, also use the approach of enforcing net neutrality through the state procurement process. But others would go beyond that by imposing net neutrality rules on all ISPs regardless of whether they contract with state agencies. The bill that passed the California Senate would essentially replicate the FCC's repealed prohibitions on blocking, throttling, and paid prioritization, for example.

Possible legal roadblocks

The Electronic Frontier Foundation is worried that the California Senate approach is likely to be thrown out of court when ISPs sue to overturn the rules. States are more likely to survive legal challenges if they use indirect approaches, such as requiring recipients of state funding to abide by net neutrality, the EFF argues.

But Harold Feld, a telecom lawyer and senior VP of consumer advocacy group Public Knowledge, argues that states can pass their own net neutrality laws.

The FCC decided in its net neutrality repeal "that Congress withheld authority over broadband from the FCC," Feld wrote. "If Congress explicitly withheld authority over broadband from the FCC, it withheld from the FCC the power to preempt any 'contrary' state authority."

One thing net neutrality advocates agree on is that reinstating the FCC rules in full would work better than a state-by-state approach. The attorneys general from 22 states and the District of Columbia have sued the FCC to reverse the net neutrality repeal and preemption of state and local laws. Separately, Democrats in Congress are trying to push through a vote to reinstate the FCC rules.

There have also been municipal efforts, such as San Francisco requiring net neutrality protections in a potential city-wide fiber network.

"It's important to have action across the map," Free Press wrote. "City and state protections are encouraging, and would likely withstand legal challenges from the phone and cable lobby, but they offer only a piecemeal solution to a national problem. If we're to protect net neutrality for everyone we must restore the Title II standard of the FCC's 2015 Open Internet Order."
https://arstechnica.com/tech-policy/...-of-us-states/





FCC Report Finds Almost No Broadband Competition at 100Mbps Speeds
Jon Brodkin

If you live in the US and want home Internet service at speeds of at least 100Mbps, you will likely find one Internet service provider in your area or none at all.

The latest Internet Access Services report was released by the Federal Communications Commission last week. The report's broadband competition chart shows that 44 percent of developed Census blocks had zero home broadband providers offering download speeds of at least 100Mbps and upload speeds of at least 10Mbps.

Forty-one percent of developed Census blocks had one ISP offering such speeds, for a total of 85 percent with zero or one ISP. The remaining 15 percent had two or three providers at that level as of the end of 2016. That's up a bit from June 30, 2016, when about 12 percent of Census blocks had at least two providers of 100Mbps services.

While the FCC tracks deployment of 100Mbps Internet, the commission uses 25Mbps downstream and 3Mbps upstream as the primary speed threshold for judging broadband progress. FCC Commissioner Jessica Rosenworcel has called for raising the FCC's broadband download standard to 100Mbps, but the FCC has kept it at 25Mbps.

At the 25Mbps/3Mbps level, 56 percent of developed Census blocks had at least two providers in the latest data. That's up from 42 percent, the percentage with at least two providers six months previously. Thirty percent had exactly one provider, and 13 percent had none.

The data is based on the extensive Form 477 filings that ISPs have to make to the FCC. While this is the FCC's best data on deployment and competition, it's always about a year behind the present time—the latest report covers broadband access as of December 31, 2016.

There were certainly new broadband deployments and speed increases in existing deployments last year despite those pesky net neutrality rules, so the numbers would look better if they went through 2017. But the FCC data can also overstate broadband competition slightly because it counts an ISP as serving an entire Census block even if it serves only one home in the block. There are more than six million developed Census blocks in the US.
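
To make that overstatement concrete, here is a toy Python example with entirely made-up numbers (not FCC data): block-level counting treats every home in a block as having every ISP that reaches any home in that block, while a household-level count can be far smaller.

# Hypothetical Census blocks: (homes in block, {ISP: homes it actually reaches}).
blocks = [
    (100, {"CableCo": 100, "FiberCo": 5}),   # FiberCo reaches only 5 of 100 homes
    (80,  {"CableCo": 80}),
    (50,  {"CableCo": 50, "FiberCo": 1}),
]

total_homes = sum(homes for homes, _ in blocks)

# Form 477-style view: if two ISPs are present anywhere in a block,
# every home in that block counts as having a competitive choice.
block_level = sum(homes for homes, isps in blocks if len(isps) >= 2)

# Household-level view (assuming the smaller ISP's footprint sits inside the larger one's).
home_level = sum(min(reach.values()) if len(reach) >= 2 else 0 for _, reach in blocks)

print(f"Block-level: {block_level} of {total_homes} homes look competitive")      # 150 of 230
print(f"Home-level:  {home_level} of {total_homes} homes actually have a choice")  # 6 of 230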

One analysis based on the FCC's June 2016 data found that more than 10.6 million US households had no access to wired Internet service with download speeds of at least 25Mbps, and an additional 46.1 million households live in areas with just one provider offering those speeds.

DSL in more Census blocks than cable or fiber

The latest FCC report covering December 2016 has another table that provides some sense of what kinds of broadband choices consumers have. It shows that 63.2 percent of developed Census blocks had one cable provider, but only 3.8 percent had two and 0.3 percent had three.

With fiber-to-the-home, more than 81 percent of developed Census blocks had no providers, while 18.3 percent of blocks had one. DSL and satellite were in a higher percentage of developed Census blocks than cable or fiber.

The data also shows that consumers frequently choose higher speeds when they have the option to do so. Out of 105.7 million fixed Internet connections, including both business and residential locations, 38.9 million had download speeds between 25Mbps and 99Mbps. Another 24.5 million connections were 100Mbps or higher.

For home Internet connections, "the median downstream speed was 50Mbps and the median upstream speed was 5Mbps," the report said. About 62 percent of residential broadband connections had speeds of at least 25Mbps downstream and 3Mbps upstream, and "[a]bout 24 percent of all residential fixed connections had a downstream speed of at least 100Mbps."

FCC Chairman Ajit Pai recently declared that broadband is being deployed to all Americans "in a reasonable and timely fashion," and he claimed that his repeal of net neutrality rules is spurring deployment. But as we wrote previously, the FCC has no data for the time after the net neutrality repeal, and the examples of new deployment cited by Pai's FCC were mostly projects that began during the Obama administration.

Satellite's impact on broadband data

The nationwide presence of satellite has the potential to make broadband competition look better than it really is in the FCC reports. In previous years, we've been told by the FCC that the broadband competition chart in each report excludes satellite services.

But that was apparently incorrect, as the FCC now tells us that the chart has always included satellite within its count of how many providers serve each Census block at different speeds. The FCC could provide the competition data with and without satellite, but it hasn't done so.

So far, that has only really affected the competition data for lower speeds. The FCC says satellite providers report offering 10Mbps speeds in 99.1 percent of developed Census blocks.

But in future reports, satellite could have a sizable impact on the 25Mbps/3Mbps competition data. HughesNet began offering 25Mbps/3Mbps service in March 2017, for example. That's going to show up as an option for consumers in future reports even though few people would choose satellite over cable or fiber because of satellite's latency and low data caps.

The fact that Internet consumers generally don't consider satellite to be a worthy alternative to cable or fiber broadband can be seen in other data from the FCC report. Satellite accounts for about 1.7 percent of all residential fixed Internet connections with speeds of at least 10Mbps despite being available in more than 99 percent of the country. Cable accounts for 72.3 percent, fiber-to-the-premises for 12.8 percent, and DSL is at 12.7 percent. There were 78.2 million total residential connections of at least 10Mbps downstream and 1Mbps upstream.

Fixed wireless services account for about 0.4 percent of 10Mbps-and-over connections. Fixed wireless' presence in the US broadband market is also set to grow as major carriers like AT&T and Verizon expand the use of wireless home Internet services in less densely populated areas.
https://arstechnica.com/information-...00mbps-speeds/





SpaceX Hits Two Milestones in Plan for Low-Latency Satellite Broadband

SpaceX got good news from the FCC and will launch two demo satellites Saturday.
Jon Brodkin

SpaceX's satellite broadband plans are getting closer to reality. The company is about to launch two demonstration satellites, and it is on track to get the Federal Communications Commission's permission to offer satellite Internet service in the US.

Neither development is surprising, but they're both necessary steps for SpaceX to enter the satellite broadband market. SpaceX is one of several companies planning low-Earth orbit satellite broadband networks that could offer much higher speeds and much lower latency than existing satellite Internet services.

Today, FCC Chairman Ajit Pai proposed approving SpaceX's application "to provide broadband services using satellite technologies in the United States and on a global basis," a commission announcement said. SpaceX would be the fourth company to receive such an approval from the FCC, after OneWeb, Space Norway, and Telesat. "These approvals are the first of their kind for a new generation of large, non-geostationary satellite orbit, fixed-satellite service systems, and the Commission continues to process other, similar requests," the FCC said today.

SpaceX's application has undergone "careful review" by the FCC's satellite engineering experts, according to Pai. "If adopted, it would be the first approval given to an American-based company to provide broadband services using a new generation of low-Earth orbit satellite technologies," Pai said.

Falcon 9 has two demo satellites

Separately, CNET reported yesterday that SpaceX's Falcon 9 launch on Saturday will include "[t]he first pair of demonstration satellites for the company's 'Starlink' service."

The demonstration launch is confirmed in SpaceX's FCC filings. One SpaceX filing this month mentions that a secondary payload on Saturday's Falcon 9 launch will include "two experimental non-geostationary orbit satellites, Microsat-2a and -2b."

Those are the two satellites that SpaceX previously said would be used in its first phase of broadband testing.

"These are experimental engineering verification vehicles that will enable the company to assess the satellite bus and related subsystems, as well as the space-based and ground-based phased array technologies," SpaceX told the FCC.

SpaceX originally told the FCC that it might launch these test satellites by the end of 2017, so the launch is slightly later than that optimistic estimate. Longer-term, SpaceX has said that it might begin the launch of operational satellites as early as 2019. Further satellites will be launched in phases, with SpaceX intending to reach full capacity with 4,425 satellites in 2024.

Gigabit speeds, low latency

SpaceX has said it will offer speeds of up to a gigabit per second, with latencies between 25ms and 35ms. Those latencies would make SpaceX's service comparable to cable and fiber. Today's satellite broadband services use satellites in much higher orbits and thus have latencies of 600ms or more, according to FCC measurements.

The demonstration satellites will orbit at 511km, although the operational satellites are planned to orbit at altitudes ranging from 1,110km to 1,325km. By contrast, the existing HughesNet satellite network has an altitude of about 35,400km, making for a much longer round-trip time than ground-based networks.
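
To see why orbit altitude dominates latency, here is a rough, illustrative calculation: a minimal Python sketch assuming a simple bent-pipe path (user to satellite to ground station and back), straight vertical hops, and no processing or queuing delay, so the figures are order-of-magnitude only.

C_KM_PER_S = 299_792.458  # speed of light in vacuum, km/s

def min_round_trip_ms(altitude_km):
    # Four altitude traversals per request/response: up, down, and back again.
    return 4 * altitude_km / C_KM_PER_S * 1000

for name, alt_km in [("SpaceX demo orbit", 511),
                     ("SpaceX operational orbit", 1110),
                     ("Geostationary orbit", 35400)]:
    print(f"{name} ({alt_km} km): ~{min_round_trip_ms(alt_km):.0f} ms minimum")

# Prints roughly 7 ms, 15 ms and 472 ms. Real-world slant paths, routing and
# processing add overhead, which is consistent with SpaceX's 25-35 ms target
# and the 600 ms or more the FCC measures for today's geostationary services.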

We asked SpaceX for an update on its satellite broadband plans today, but the company declined to comment.

OneWeb was the first company to seek FCC approval to enter the US broadband market with low-Earth orbit satellites and received approval in June 2017. OneWeb wants to offer service in Alaska as early as 2019. Boeing is also planning to offer satellite broadband.

Pai praised SpaceX and other companies for using "innovative technologies" to improve broadband access. "Satellite technology can help reach Americans who live in rural or hard-to-serve places where fiber optic cables and cell towers do not reach," Pai said. "And it can offer more competition where terrestrial Internet access is already available."
https://arstechnica.com/information-...t-test-launch/





Trump’s Infrastructure Plan has No Dedicated Money for Broadband

Broadband would be one of numerous projects competing for a pool of money.
Jon Brodkin

President Trump's new 10-year plan for "rebuilding infrastructure in America" doesn't contain any funding specifically earmarked for improving Internet access. Instead, the plan sets aside a pool of funding for numerous types of infrastructure projects, and broadband is one of the eligible categories.

The plan's $50 billion Rural Infrastructure Program lists broadband as one of five broad categories of eligible projects. Here's the full list:

• Transportation: roads, bridges, public transit, rail, airports, and maritime and inland waterway ports.
• Broadband (and other high-speed data and communication conduits).
• Water and Waste: drinking water, wastewater, storm water, land revitalization, and Brownfields.
• Power and Electric: governmental generation, transmission, and distribution facilities.
• Water Resources: flood risk management, water supply, and waterways.

Eighty percent of the program's $50 billion would be "provided to the governor of each state." Governors would take the lead in deciding how the money would be spent in their states. The other 20 percent would pay for grants that could be used for any of the above project categories.

Separately, broadband would be eligible for funding from a proposed $20 billion Transformative Projects Program, along with transportation, clean water, drinking water, energy, and commercial space.

Trump's plan would also add rural broadband facilities to the list of eligible categories for Private Activity Bonds, which allow private projects to "benefit from the lower financing costs of tax-exempt municipal bonds." The plan would also let carriers install small cells and Wi-Fi attachments without going through the same environmental and historical preservation reviews required for large towers.

Democratic lawmakers have been lobbying for $40 billion in dedicated broadband funding. That amount would raise broadband availability from 86 percent to 98 percent of the country, according to a Federal Communications Commission report released during the Obama administration.

Democrats critical of funding plan

Some Democratic and Republican lawmakers previously argued that putting broadband in a list of eligible projects isn't enough and that there should be funding dedicated to improving broadband availability. Democrats were scathing in their reviews of Trump's plan yesterday.

"With a comparatively paltry investment from the federal government over ten years—less than one-tenth of 1 percent of GDP—and no dedicated funding for rural broadband, the Administration’s plan falls far short in resources, leaving many communities behind," Rep. Mark Pocan (D-Wis.) said.

"This glaring omission is a betrayal of the rural voters that supported [Trump] in his election, and a missed opportunity to close the digital divide that separates rural and urban America," Rep. Peter Welch (D-Vt.) said. "A robust rural broadband network is essential to attract businesses, provide access to health care through telemedicine, help farmers become more efficient, and close the homework gap that hamstrings rural students."

Welch is co-chair of the House Rural Broadband Caucus, which includes three Democrats and three Republicans. Last month, all six of those lawmakers urged Trump to "include funding specifically for rural broadband deployment in unserved and underserved areas."

Republican leadership in Congress has seemed reluctant to propose dedicated broadband funding, though. Last month, we contacted the House Committee on Energy and Commerce to ask if a Republican proposal would include specific funding for broadband.

A committee spokesperson did not provide us with a direct answer and said the committee's Republican leadership was waiting to see what Trump would propose. "We intend to work with our Democratic colleagues on these bipartisan issues, and look forward to reviewing the [Trump] administration's infrastructure proposal once it is announced," the spokesperson told Ars.

Democrats’ $40 billion plan to cover 98 percent of US

Democrats had previously proposed a $40 billion broadband investment to ensure broadband coverage for 98 percent of the country.

"Congressional Republicans have not reached out to work with us on broadband infrastructure so far, which doesn't make a lot of sense since Democrats are the only ones proposing actual funding to improve and expand broadband access," Rep. Frank Pallone (D-N.J.) told Ars in a statement last month.

Pallone represents Democrats as the ranking member of the Energy and Commerce Committee. Forty billion dollars "is the funding necessary to ensure that nearly every American has broadband access," Pallone also told Ars. "The Republican proposals are far less ambitious and do not actually solve any of our country's most pressing broadband infrastructure problems."

Still, some Republicans want to add dedicated broadband funding to the infrastructure plan when it goes through Congress. Senate Broadband Caucus co-Chairwoman Shelley Moore Capito (R-W.Va.) "argues that governors would be more inclined to use cash for roads and bridges than broadband if infrastructure modes are lumped into one rural fund," according to Politico.

The $40 billion figure proposed by Democrats came from an FCC paper titled "Improving the Nation's Digital Infrastructure."

As of December 2015, 14 percent of the 160 million US residential and small- and medium-sized business locations lacked access to fiber or cable service with speeds of at least 25Mbps down and 3Mbps up, the FCC paper said. It would cost $40 billion to cover 12 of those 14 percentage points with fiber, bringing total coverage up to 98 percent, the paper estimated. Getting fiber broadband to the final 2 percent, in sparsely populated areas, would require an additional $40 billion.

"Unlike the last 2 percent, moreover, we do not expect these first 12 percent of locations will require material ongoing support once the network has been built, as subscriber revenues should be sufficient to pay for ongoing network costs," the paper said.

FCC chair praises Trump plan

The FCC paper was issued in the final days of then-Chairman Tom Wheeler's term. The FCC rescinded the paper and all of its findings without explanation shortly after Trump appointed Republican Ajit Pai to be the new chair.

Yesterday, Pai praised Trump's infrastructure plan. Pai's statement discussed improving wireless via 5G technology but did not mention expansion of fiber or cable.

"Too often, regulatory barriers make it harder and more expensive to build out broadband than it needs to be—to the detriment of American consumers," Pai said. "That's why this plan is a welcome and strong call to action. I stand ready to work with the Administration and Congress to turn this plan into a reality as we continue to bridge the digital divide and extend 5G digital opportunity to all Americans."

https://arstechnica.com/tech-policy/...for-broadband/





U.S. Senators Voice Concern Over Chinese Access to Intellectual Property

Leaders of the U.S. Senate Intelligence Committee said on Tuesday they were concerned about what they described as China’s efforts to gain access to sensitive U.S. technologies and intellectual property through Chinese companies with government ties.

Senator Richard Burr, the committee’s Republican chairman, cited concerns about the spread of foreign technologies in the United States, which he called “counterintelligence and information security risks that come prepackaged with the goods and services of certain overseas vendors.”

“The focus of my concern today is China, and specifically Chinese telecoms (companies) like Huawei and ZTE that are widely understood to have extraordinary ties to the Chinese government,” Burr said.

Senator Mark Warner, the committee’s Democratic vice chairman, said he had similar concerns.

“I’m worried about the close relationship between the Chinese government and Chinese technology firms, particularly in the area of commercialization of our surveillance technology and efforts to shape telecommunications equipment markets,” Warner said.

Both senators spoke at an annual hearing of the panel where leaders of U.S. spy agencies testify about worldwide threats.

Several U.S. lawmakers have been focusing recently on Chinese technology firms.

Last week, two other Republican senators - Marco Rubio and Tom Cotton - introduced legislation that would block the U.S. government from buying or leasing telecommunications equipment from Huawei Technologies Co Ltd or ZTE Corp (0763.HK), citing concern the Chinese companies would use their access to spy on U.S. officials.

The companies did not return calls last week seeking comment on the legislation. In 2012, they were the subject of a U.S. investigation into whether their equipment provided an opportunity for foreign espionage and threatened critical U.S. infrastructure - something they have consistently denied.

Reporting by Patricia Zengerle and Doina Chiacu; Editing by Frances Kerry
https://www.reuters.com/article/us-u...-idUSKCN1FX23M





Universities Rush to Roll Out Computer Science Ethics Courses
Natasha Singer

The medical profession has an ethic: First, do no harm.

Silicon Valley has an ethos: Build it first and ask for forgiveness later.

Now, in the wake of fake news and other troubles at tech companies, universities that helped produce some of Silicon Valley’s top technologists are hustling to bring a more medicine-like morality to computer science.

This semester, Harvard University and the Massachusetts Institute of Technology are jointly offering a new course on the ethics and regulation of artificial intelligence. The University of Texas at Austin just introduced a course titled “Ethical Foundations of Computer Science” — with the idea of eventually requiring it for all computer science majors.

And at Stanford University, the academic heart of the industry, three professors and a research fellow are developing a computer science ethics course for next year. They hope several hundred students will enroll.

The idea is to train the next generation of technologists and policymakers to consider the ramifications of innovations — like autonomous weapons or self-driving cars — before those products go on sale.

“It’s about finding or identifying issues that we know in the next two, three, five, 10 years, the students who graduate from here are going to have to grapple with,” said Mehran Sahami, a popular computer science professor at Stanford who is helping to develop the course. He is renowned on campus for bringing Mark Zuckerberg to class.

“Technology is not neutral,” said Professor Sahami, who formerly worked at Google as a senior research scientist. “The choices that get made in building technology then have social ramifications.”

The courses are emerging at a moment when big tech companies have been struggling to handle the side effects — fake news on Facebook, fake followers on Twitter, lewd children’s videos on YouTube — of the industry’s build-it-first mind-set. They amount to an open challenge to a common Silicon Valley attitude that has generally dismissed ethics as a hindrance.

“We need to at least teach people that there’s a dark side to the idea that you should move fast and break things,” said Laura Norén, a postdoctoral fellow at the Center for Data Science at New York University who began teaching a new data science ethics course this semester. “You can patch the software, but you can’t patch a person if you, you know, damage someone’s reputation.”

Computer science programs are required to make sure students have an understanding of ethical issues related to computing in order to be accredited by ABET, a global accreditation group for university science and engineering programs. Some computer science departments have folded the topic into a broader class, and others have stand-alone courses.

But until recently, ethics did not seem relevant to many students.

“Compared to transportation or doctors, your daily interaction with physical harm or death or pain is a lot less if you are writing software for apps,” said Joi Ito, director of the M.I.T. Media Lab.

One reason that universities are pushing tech ethics now is the popularization of powerful tools like machine learning — computer algorithms that can autonomously learn tasks by analyzing large amounts of data. Because such tools could ultimately alter human society, universities are rushing to help students understand the potential consequences, said Mr. Ito, who is co-teaching the Harvard-M.I.T. ethics course.

“As we start to see things, like autonomous vehicles, that clearly have the ability to save people but also cause harm, I think that people are scrambling to build a system of ethics,” he said. (Mr. Ito is a director of The New York Times Company.)

Last fall, Cornell University introduced a data science course where students learned to deal with ethical challenges — such as biased data sets that include too few lower-income households to be representative of the general population. Students also debated the use of algorithms to help automate life-changing decisions like hiring or college admissions.

“It was really focused on trying to help them understand what in their everyday practice as a data scientist they are likely to confront, and to help them think through those challenges more systematically,” said Solon Barocas, an assistant professor in information science who taught the course.

In another Cornell course, Karen Levy, also an assistant professor in information science, is teaching her students to focus more on the ethics of tech companies.

“A lot of ethically charged decision-making has to do with the choices a company makes: what products they choose to develop, what policies they adopt around user data,” Professor Levy said. “If data science ethics training focuses entirely on the individual responsibility of the data scientist, it risks overlooking the role of the broader enterprise.”

The Harvard-M.I.T. course, which has 30 students, focuses on the ethical, policy and legal implications of artificial intelligence. It was spurred and financed in part by a new artificial intelligence ethics research fund whose donors include Reid Hoffman, a co-founder of LinkedIn, and the Omidyar Network, the philanthropic investment firm of Pierre Omidyar, the eBay founder.

The curriculum also covers the spread of algorithmic risk scores that use data — like whether a person was ever suspended from school, or how many of his or her friends have arrest records — to forecast whether someone is likely to commit a crime. Mr. Ito said he hoped the course would spur students to ask basic ethical questions like: Is the technology fair? How do you make sure that the data is not biased? Should machines be judging humans?

Some universities offer such programs in their information science, law or philosophy departments. At Stanford, the computer science department will offer the new ethics course, tentatively titled “Ethics, Public Policy and Computer Science.”

The expectations for the course are running high in part because of Professor Sahami’s popularity on campus. About 1,500 students take his introductory computer science course every year.

The new ethics course covers topics like artificial intelligence and autonomous machines; privacy and civil rights; and platforms like Facebook. Rob Reich, a Stanford political science professor who is helping to develop the course, said students would be asked to consider those topics from the point of view of software engineers, product designers and policymakers. Students will also be assigned to translate ideal solutions into computer code.

“Stanford absolutely has a responsibility to play a leadership role in integrating these perspectives, but so does Carnegie Mellon and Caltech and Berkeley and M.I.T.,” said Jeremy Weinstein, a Stanford political science professor and co-developer of the ethics course. “The set of institutions that are generating the next generation of leaders in the technology sector have all got to get on this train.”
https://www.nytimes.com/2018/02/12/b...s-courses.html





UK Unveils New Technology to Fight Extremist Content Online
Danica Kirka

The British government is unveiling new technology designed to remove extremist material from social media, amid mounting pressure on companies like Facebook and Twitter to do more to remove such content from their platforms.

The software, developed by ASI Data Science with funding from the government, was announced Tuesday by Home Secretary Amber Rudd ahead of meetings with technology executives and U.S. Secretary of Homeland Security Kirstjen Nielsen this week in Silicon Valley. The program will be shared with smaller companies that don't have the resources to develop such technology, the agency said.

"I hope this new technology the Home Office has helped develop can support others to go further and faster," Rudd said before the meetings. "The purpose of these videos is to incite violence in our communities, recruit people to their cause, and attempt to spread fear in our society."

Governments and law enforcement agencies have been pressing social media companies to do more to prevent extremists from using their sites to promote violence and hatred. British Prime Minister Theresa May has called on internet companies to remove extremist propaganda from their sites in less than two hours.

But extremist content is only one type of objectionable content on the internet, with governments struggling to stem the flow of everything from child pornography to so-called fake news. The importance of the battle was underscored during the 2016 U.S. presidential election, during which Russian entities sought to influence the outcome by placing thousands of ads on social media that reached some 10 million people on Facebook alone.

Social media companies have struggled to respond. Because the companies see themselves not as publishers but as platforms for other people to share information, they have traditionally been cautious about taking down material.

Amid growing pressure, Facebook, Twitter, Google and its unit YouTube last year created the Global Internet Forum to Combat Terrorism, which says it is committed to developing new content-detection technology, helping smaller companies combat extremism and promoting "counter-speech," content meant to blunt the impact of extremist material.

Unilever, a global consumer products company and one of the world's largest advertisers, on Monday demanded results, saying it wouldn't advertise on platforms that do not make a positive contribution to society. Its chief marketing officer, Keith Weed, said he's told Facebook, Google, Twitter, Snap, and Amazon that Unilever wants to change the conversation.

"Consumers ... care about fraudulent practice, fake news, and Russians influencing the U.S. election," he said at a digital advertising conference, according to excerpts of a speech provided by Unilever. "They don't care about good value for advertisers. But they do care when they see their brands being placed next to ads funding terror, or exploiting children."

So far, though, the technology needed to detect and remove dangerous posts hasn't kept up with the threat, experts say. Removing such material still requires judgment, and artificial intelligence has not proved good enough to determine the difference, for example, between an article about the so-called Islamic State and posts from the group itself.

The software being unveiled Tuesday is aimed at stopping the vast bulk of material before it goes online.

Marc Warner, CEO of ASI Data Science, which helped develop the technology, said the social media giants can't solve this problem alone.

"The way to fight that is to cut the propaganda off at the source," he said. "We need to prevent all of these horrible videos ever getting to the sort of people that can be influenced by them."

Tests of the program show it can identify 94 percent of IS propaganda videos, according to the Home Office, which provided some 600,000 pounds ($833,000) to fund the software's development.

But experts on extremist material say that even if the software works perfectly, it will not come close to removing all Islamic State material online.

Charlie Winter, a senior research fellow at the International Center for the Study of Radicalization at King's College London, said the program focuses only on video, and video is only a small portion of "the Islamic state corpus."

“I think it's a positive step but it shouldn't be considered a solution to the problem," he said. "There's so much more that needs to be done."
https://www.newstimes.com/business/t...t-12609397.php





Head to Head, Does the Apple HomePod Really Sound the Best?
David Pogue

To convince journalists about the audio quality of its new HomePod smart speaker (here’s my review), Apple did something smart: Before we were given our review units, we were required to attend a listening session. Mine was held in Apple’s New York City public-relations loft, a mockup of an apartment.

Four speakers were on a counter against a wall: Sonos One ($200), Google Home Max ($400), the HomePod ($350), and the Amazon Echo Plus ($150).

The PR person could switch playback from one speaker to the other without missing a beat. They even had a halo light rigged to turn on behind whichever speaker was playing, so you’d know which was which.

There was not a shred of doubt: In this side-by-side comparison, the HomePod sounded better than its competitors.

Most of the reviews, including mine, said the same thing: that the HomePod isn’t as smart as the other smart speakers (among other problems, its voice control is limited to iTunes and Apple Music — no Spotify), but that it sounds amazing.

• What Hi-Fi (a British audiophile site): “The HomePod is the best-sounding smart speaker available—and by quite a margin.”
• Pocket-Lint (tech site): “The best sounding speaker of its type.”
• The Verge: “It sounds far better than any other speaker in its price range.”
• TechCrunch: “HomePod is easily the best sounding mainstream smart speaker ever. It’s got better separation and bass response than anything else in its size.”

Still, when I tweeted about the test, a couple of people were suspicious of the setup, which of course was entirely controlled by Apple. What was the source material? What was the wireless setup?

An Apple rep told me that the test songs were streaming from a server in the next room (a Mac). But each speaker was connected to it differently: by Bluetooth (Amazon Echo), Ethernet (Sonos), input miniplug (Google Home), and AirPlay (HomePod), which is Apple’s Wi-Fi-based transmission system.

Since the setup wasn’t identical, I wondered if it was a perfectly fair test. (Bluetooth, for example, may degrade (compress) the music it’s transmitting, depending on the source and the equipment.)

So I decided to set up my own test at home.

The setup

I hid the four speakers behind a curtain — a sheet of thin, sheer fabric that wouldn’t affect the sound. It took me a Sunday to figure out how to get the A/B/C/D switching to work seamlessly, but I finally managed it: All four speakers would be streaming from Spotify, all four over Wi-Fi. I’d use the Spotify app’s device switcher to hop among speakers without missing a beat.
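(If you wanted to script that A/B/C/D hop rather than tap the Spotify app by hand, here is a rough sketch using Spotify's Web API "Connect" endpoints. It assumes a Premium account and an OAuth token with the playback scopes; the token is a placeholder, and a speaker reachable only over AirPlay may not show up as a Connect device, so treat this as an illustration of the idea rather than a recipe for this exact setup.)

# Hedged sketch: hopping between speakers via Spotify's Web API Connect
# endpoints instead of the app's device picker. Token is a placeholder;
# AirPlay-only speakers may not appear in the device list.
import requests

TOKEN = "YOUR_OAUTH_TOKEN"  # placeholder, not a real credential
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

def list_devices():
    """List the Spotify Connect devices currently visible to this account."""
    resp = requests.get("https://api.spotify.com/v1/me/player/devices",
                        headers=HEADERS, timeout=10)
    resp.raise_for_status()
    return resp.json()["devices"]

def switch_to(device_id):
    """Transfer playback to the given device without stopping the music."""
    resp = requests.put("https://api.spotify.com/v1/me/player",
                        headers=HEADERS,
                        json={"device_ids": [device_id], "play": True},
                        timeout=10)
    resp.raise_for_status()

if __name__ == "__main__":
    for d in list_devices():
        print(d["name"], d["id"])
    # switch_to("<device id for the speaker you want to hear next>")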

I chose five songs, each with different styles, instrumentation, and sonic demands:

• “Star Wars: Imperial March.” Full orchestra, full volume, full of brass.
• “Havana” (Camila Cabello). Current pop hit. Distinct bass, drums, piano, and voice. Lots of rhythm.
• Brandenburg Concerto No. 3 in G Major. All strings, full range of pitches and dynamics.
• “Hallelujah” (Pentatonix). A cappella ballad, five voices, very exposed and close to the mikes.
• “Helpless” (from “Hamilton”). Broadway pit band, pop sound, female harmonies.

In these kinds of tests, volume matching is incredibly important, for a couple of reasons. As Tom’s Hardware puts it: “First, if sources are at different levels, they’re easy to tell apart. From there, the test is no longer blind. Second, us humans tend to prefer (all other factors being equal) louder sources.”
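(One way to get closer than matching by ear: record the same pink-noise clip through each speaker with a single fixed microphone and compare the RMS levels. The short sketch below is a rough illustration of that idea, not part of my actual procedure — the file names are hypothetical, and it assumes mono WAV captures plus the numpy and soundfile libraries.)

# Rough sketch: estimate each speaker's playback level from a recording of the
# same pink-noise clip, made with one fixed microphone, so volumes can be
# matched before a blind test. File names are hypothetical.
import numpy as np
import soundfile as sf

def rms_dbfs(path):
    """Return the RMS level of a recording in dB relative to full scale."""
    samples, _ = sf.read(path, dtype="float64")
    if samples.ndim > 1:          # fold stereo captures down to mono
        samples = samples.mean(axis=1)
    rms = np.sqrt(np.mean(samples ** 2))
    return 20.0 * np.log10(max(rms, 1e-12))

captures = ["A_sonos.wav", "B_homepod.wav", "C_echo_plus.wav", "D_home_max.wav"]
levels = {name: rms_dbfs(name) for name in captures}
reference = min(levels.values())  # match everything down to the quietest speaker

for name, level in sorted(levels.items()):
    print(f"{name}: {level:+.1f} dBFS, turn down by {level - reference:.1f} dB")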

For my dress rehearsal the night before, I volume-matched them as best I could by ear.

The panelists at the dress rehearsal were my wife Nicki and my friend Mike, a professional guitarist who spent years as an audio technician for big-name touring bands.

I gave each panelist a score sheet, with room for notes, and asked them to rank the four speakers, 1 through 4, after each listening test. I sat at the laptop to control the tests; I played the same section of each song for about 20 seconds on each speaker. Panelists were free to ask for re-plays, or to hear any speaker again, or to hear two speakers in a different succession.

At the end of the rehearsal, I asked the listeners to choose a winner, based on how many first-place finishes they’d marked down. Both Nicki and Mike declared the HomePod to have the best sound, hands down.
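(For what it's worth, the bookkeeping is trivial: each panelist ranks the four speakers per song, and a speaker's standing is just its count of first-place finishes. A toy tally of that scoring, with made-up ballots, looks like this.)

# Toy tally of the scoring used in these tests: each panelist ranks speakers
# A-D per song (best first), and the winner is whoever collects the most
# first-place finishes. The ballots below are invented for illustration.
from collections import Counter

# rankings[panelist][song] = list of speaker letters, best first
rankings = {
    "Panelist 1": {"Imperial March": ["B", "D", "A", "C"],
                   "Havana":         ["B", "A", "D", "C"]},
    "Panelist 2": {"Imperial March": ["B", "A", "D", "C"],
                   "Havana":         ["A", "B", "D", "C"]},
}

first_places = Counter(order[0]
                       for songs in rankings.values()
                       for order in songs.values())

for speaker, wins in first_places.most_common():
    print(f"Speaker {speaker}: {wins} first-place finish(es)")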

The test

The next day, the Yahoo film crew arrived. Our sound recordist, Dave, used level meters to help me volume-match the speakers more precisely.

My five panelists included Darwin, a professional violinist who spends a lot of time listening to recordings on nice gear; Julie, an entrepreneur and homeowner who is precisely the target market for these speakers; Dana and Tori, high-schoolers who haven’t yet begun to lose their ability to hear high frequencies; and Rob, a sound technician for Yahoo.

I didn’t tell them which speakers would be involved. I said only that there were four of them behind the curtain, and I’d refer to them as speakers A through D.

I handed out their score sheets and began the test. Five songs, 20 seconds each, free replays when requested. For each song, I played the speakers in a different order (A to D sometimes, D to A sometimes).

The results

Of course, I knew what the results would be. I’d heard them myself in the Apple demo; I’d read the other reviews; and I’d done the dress rehearsal the night before. Every time, the HomePod won the match easily.

At the end of my own listening test, then, I handed out signs that said “A,” “B,” “C,” and “D,” and asked the panelists to hold up their winners’ signs on the count of three. I knew what they would say: “B,” “B,” “B,” “B,” and “B” (that was the HomePod’s letter).

That’s not what happened.

They held up their signs. Two of them ranked the Google Home Max (“D”) as the best. Three of them ranked the Sonos One (“A”) the best.

Nobody ranked the HomePod the best.

The explanation

I actually have no great explanation for this outcome. Most of the panelists had ranked the HomePod (“B”) as first on some of the songs — just not most of the songs.

• Rob: “For me, A, the Sonos, consistently had the most robust sound of all of them.”
• Tori: “The Sonos won two of them for me. ‘B’ [HomePod] won the ‘Star Wars.’”
• Dana: “’B’ [HomePod] won one of mine. I felt like ‘A’ [Sonos], a lot of times, sounded a lot more sharp.”
• Julie: “I picked between B and D [HomePod and Google Home Max] as being the two best. B and D were pretty clear. And C [the Amazon Echo] came in consistently last for me.”
• Darwin: “I actually found A [the Sonos] to be the one that I hated the most. B [HomePod] did win one for me. It won ‘Havana,’ because it had a better low end. But I generally picked D [Google Home Max], because it had a clearer, nicer range. As a classical person, I definitely would go with D. But if I were listening to more pop stuff, I could see where ‘A’ [Sonos] could win.”

So what are we to make of this? Why did none of my panelists rank the HomePod a solid No. 1, when most critics do (and so do I)?

Was something wrong with my setup? Well, no, because the night before, using the same setup, Nicki and Mike both ranked the HomePod No. 1.

Here are my theories:

• Different music is different. My panelists all conceded that there was some variation depending on the material. “Honestly, they were pretty on par,” Rob said. “I don’t know that one stood out that much more than the other.” “It was much different with different music,” Darwin added. “It varied a lot for me, depending on the song,” Tori agreed.
• Different people are different. I said that most professional critics ranked HomePod as No. 1, but not all of them. Buzzfeed’s young critic, for example, concluded: “Ultimately, none of this is a hard science, and audio preferences are highly subjective. Reactions to its audio quality from the four people who listened to it for this review … were mixed. The HomePod outperformed other speakers in some situations and not others.” And the Wall Street Journal’s Joanna Stern wrote, “The HomePod’s bass is impressive for the size of the speaker, but in many songs, it’s far too front-and-center in the mix.”
• Nobody else did blind tests. As far as I can tell, none of the other critics who declared HomePod No. 1 actually set up their own blind A/B/C/D tests. Maybe their conclusions wouldn’t have been so emphatic if they had.
• Apple’s setup was different. Remember, Apple’s four speakers were each connected to the source material differently: Two wired, one over Wi-Fi, one over Bluetooth. Maybe that wasn’t an even playing field — and for sure, it wasn’t a real-world playing field. Most people, most of the time, just connect these speakers to their Wi-Fi networks and stream music from an online service.

What I can say for sure is this:

• The Apple HomePod generally sounds better than any other smart speaker—but only somewhat, and only in direct A/B/C/D tests. If you listened to the HomePod, Sonos, and Google Home an hour apart, you’d never be able to declare one a clear winner. (Everyone agrees that the Amazon Echo Plus is the loser in this roundup, but then again, it’s $150 and the size of a Pringles can; it’s not a fair fight.)
• You can get two Sonos Ones for the price of a single Apple HomePod. You can use them as a stereo pair, or put them in different rooms and control them by voice. And you can have your choice of 42 music services (Spotify, Pandora, TuneIn, etc.) — not just Apple Music. And you can use all of Amazon’s Alexa voice commands (and, soon, Google’s commands and even Siri’s commands!), meaning you can control a vastly larger range of smart-home devices than the HomePod can.

Everybody’s different.

Music gear (and listening tests) are famously contentious; they’re probably responsible for triggering more flame wars online than abortion and gun control put together. I’d love to hear your thoughts on Apple’s test and mine in the Comments!
http://pogueman.tumblr.com/post/1707...ally-sound#_=_





‘Black Panther’ Will Break Box-Office Records. But Could It Change the Movie Business?
Steven Zeitchik

“Black Panther,” Marvel’s African-oriented comic book adaptation, is shaping up to be the biggest box-office success ever for a film with a primarily black cast.

And it’s raising hopes for a new wave of broad-interest commercial films featuring black actors and stories.

“That a predominantly black-cast movie is getting this kind of traction finally shows what we all intuitively know: Make great art and people will respond,” said the actor-musician Common, who like many African-Americans in entertainment has criticized the industry for ignoring the potential of black-oriented films.

But some in Hollywood also worry that “Black Panther” may prove more the exception than the rule. For all the enthusiasm over the movie, they say it has attributes – like Marvel’s massive production and marketing machine – that will make it easy for executives contemplating future projects to dismiss it as an outlier. It also comes after decades during which films from black artists struggled to gain traction in Hollywood.

Hollywood has long underinvested in African-American movies. The movie industry makes assumptions about audiences that don’t match their behavior, say both African-American and white critics of the current system.

For many years, they note, studio executives believed that white moviegoers wouldn’t come out in droves for a black-driven film, a corollary to another longstanding trope: that black stars don’t fare well internationally. As it turns out, minorities now help drive the box office of many blockbusters.

“I think no matter how much money this movie makes, it will be seen by a lot of people in [studio] staff meetings and greenlight committees as just a one-off,” said a prominent African-American figure in the Hollywood development world, asking for anonymity so as not to be perceived as criticizing potential business partners. “And the question is: How many more ‘one-offs’ will we have before they realize it’s a pattern?”

Ryan Coogler’s take on the 1960s hero Black Panther, starring Chadwick Boseman, Lupita Nyong’o and Michael B. Jordan, has turned into a cultural event. Opening Friday, the movie comes from Disney and its Marvel subsidiary, two of the most potent commercial forces currently operating in Hollywood.

It is projected by tracking services to gross some $175 million at the domestic box office over the four-day holiday weekend and even has a chance to top $166 million for the Friday-Sunday period—which would put it in the top ten openings of all time. Over the coming weeks, the action-adventure is expected to sail past $350 million domestically and could well surpass $400 million. Marvel gave Coogler a budget pegged by trade reports at $200 million, nearly unheard of for a black director.

The social context is ripe for “Black Panther” to be a breakout. Rhea Combs, the curator of film and photography at the Smithsonian’s National Museum of African American History and Culture, said that “‘Black Panther’ comes in the midst of Black Lives Matter and all the other important social movements that are happening now. It makes for a very powerful combination.”

Black stories are almost never made at top production and marketing budget levels, and thus often perform at more mid-range levels; examples include the movies of Tyler Perry, which reliably gross between $50 million and $70 million.

One of the highest-grossing films to date with a largely black cast, 1988’s “Coming to America,” ranks at No. 291 on the all-time U.S. list when adjusting for inflation, with $274 million. And even that was an anomaly. The thirty years since have brought very little else of its kind, as the studios constructed most movies with primarily black casts as lower-budget niche offerings.

But “Black Panther” breaks that pattern. “What’s unprecedented is here we have a film with a black lead and majority black cast that’s also a tentpole film,” said Darnell Hunt, a professor and dean of social science at UCLA who specializes in issues of Hollywood and race, using the term for a modern expensive effects movie.

“Black Panther” has been tracking extraordinarily well with African-Americans. According to a survey conducted two weeks ago by the data analytics firm YouGov, 74% of African-Americans said they planned to see the movie on some platform in the coming months. (By contrast, the highest percentage of African-Americans who said they’d seen any of the other 17 movies in the Marvel Cinematic Universe was 44%, for 2008’s “The Incredible Hulk.”)

But it’s among white audiences that the numbers really pop: 49% of white respondents said they planned to watch “Black Panther” on some platform. That’s actually a higher proportion than has seen any other Marvel movie. The next closest was the three-part “Iron Man” franchise, which drew 46% of respondents.

“What you see here are numbers across the board with all races,” said Larry Shannon-Missal, the head of data services for YouGov in the U.S. “Black people will turn out in a very significant way, but so will white moviegoers.”

UCLA’s Hunt drew an analogy to “The Cosby Show.” “You knew black people were going to watch it,” he said. “But what made that show a hit was that a lot of white people watched it.”

For a lot of movies, that may not even be necessary. Minority audiences are actually helping propel box-office success for the biggest blockbusters. According to Hunt, who heads up an annual study of diversity in Hollywood, minority groups constituted more than 50% of U.S. ticket buyers for five of the ten highest-grossing global releases in 2015, the most recent year for which statistics are available. It was the first time that threshold had been reached. His study of 2016’s box office, due at the end of the month, could see that number rise, he said.

Other parts of the entertainment industry have long understood—economically if not ethically—the importance of diverse artists, says Marc Morial, the former mayor of New Orleans and the current president of the National Urban League. “From Motown to Prince to hip-hop, the music business understands black artists have a lot of broad appeal beyond the black community,” Morial said. “So why hasn’t Hollywood understood that?”

Until recently, Hollywood studios have indeed been far more reluctant to make movies that went beyond a single black star or that dealt with black-specific stories – even though movies with African-American leads such as Will Smith and Eddie Murphy have been major box-office champions for years.

Only in the past several years have studios begun producing movies with predominantly black casts that deal with such themes. (Some of them, including “Get Out” and “Straight Outta Compton,” topped $150 million domestically on far smaller budgets.)

For a long time, Hollywood refused to believe that white audiences would attend so-called minority movies—or even to test the theory. A group of largely white male executives rarely greenlit stories about black America.

On the rare occasions those stories were made, they tended to find profitability—witness John Singleton’s 1991 inner-city drama “Boyz N The Hood” and Spike Lee’s 1992 “Malcolm X,” both of which came from major studios and topped $75 million at the domestic box office when adjusted for inflation. But that success happened very infrequently and, observers say, with little regard for its meaning.

Still, some activists privately wonder why it took so long for Marvel to finally have a black superhero lead, particularly when many other lesser-known superhero characters were mined in the interim. Disney began actively developing “Black Panther” in 2009, driven by Nate Moore, a key Marvel executive who is African-American. Disney declined to make anyone involved in the movie available to comment.

Until Disney gave African-American filmmaker Ava DuVernay $100 million to direct next month’s “A Wrinkle In Time,” no black female had ever been handed a budget that high.

The activists also wonder if the effect of “Black Panther,” both inside and outside Hollywood, might be exaggerated.

“This movie is a fantasy,” said Todd Boyd, a professor of cinema and media studies at USC who specializes in race and popular culture. Spike Lee’s “Malcolm X,” he noted, raised awareness for a political movement and even catalyzed others to carry the mantle. “I don’t think this will change anything,” he said. “Hollywood has never put its muscle behind telling the story of the real Black Panthers, so why would a fictional movie about a fictional place make them do things differently?”

Transforming “Black Panther” into a springboard for more representation in executive suites and on film slates won’t be easy, other diversity advocates acknowledge. But they say the fact so many are embracing an African-centric story could make the strongest case for more black-oriented movies.

According to the YouGov study, 15% of Americans who have never seen a Marvel movie say they will break that pattern for “Black Panther,” suggesting the movie is broadening audiences.

“The best part of this film’s success is it will show a black movie can be not just important culturally,” said the Urban League’s Morial, “but a big winner economically.”
https://www.washingtonpost.com/news/...ovie-business/





New York Times CEO: Print Journalism has Maybe Another 10 Years

• New York Times print products may last another 10 years, says the company's CEO, Mark Thompson.
• As the company continues to build its digital presence, it will re-evaluate the demand for print, Thompson says.
• Meanwhile, the company added 157,000 new digital subscriptions in the last quarter of 2017.

Kellie Ell

The newspaper printing presses may have another decade of life in them, New York Times CEO Mark Thompson told CNBC on Monday.

"I believe at least 10 years is what we can see in the U.S. for our print products," Thompson said on "Power Lunch." He said he'd like to have the print edition "survive and thrive as long as it can," but admitted it might face an expiration date.

"We'll decide that simply on the economics," he said. "There may come a point when the economics of [the print paper] no longer make sense for us."

"The key thing for us is that we're pivoting," Thompson said. "Our plan is to go on serving our loyal print subscribers as long as we can. But meanwhile to build up the digital business, so that we have a successful growing company and a successful news operation long after print is gone."

Digital subscriptions, in fact, may be what's keeping the New York Times afloat for a new generation of readers. While Thompson said the number of print subscribers is relatively constant, "with a little bit of a decline every time," the company said last week that it added 157,000 digital subscribers in the fourth quarter of 2017. The majority were new subscribers, but that number also included cooking and crossword subscriptions.

Revenue from digital subscriptions increased more than 51 percent in the quarter compared with a year earlier. Overall subscription revenue increased 19.2 percent.

Meanwhile, the company's fourth-quarter earnings and revenue beat analysts' expectations, "even though the print side of the business is still somewhat challenged," Thompson said. Total revenue rose 10 percent from a year earlier to $484.1 million. New York Times shares have risen more than 20 percent this year.

Even with the recent market volatility, the stock is up 8 percent from last week.

Under Thompson's leadership, the New York Times has become the first news organization in the world to pass the 1 million digital-only subscription mark.

"Without question we make more money on a print subscriber," Thompson said. "But the point about digital is that we believe we can grow many, many more of them. We've already got more digital than print subscribers. Digital is growing very rapidly. Ultimately, there will be many times the number of digital subscribers compared to print."
https://www.cnbc.com/2018/02/12/prin...times-ceo.html

















Until next week,

- js.



















Current Week In Review





Recent WiRs -

February 10th, February 3rd, January 27th, January 20th

Jack Spratts' Week In Review is published every Friday. Submit letters, articles, press releases, comments, questions etc. in plain text English to jackspratts (at) lycos (dot) com. Submission deadlines are Thursdays @ 1400 UTC. Please include contact info. The right to publish all remarks is reserved.


"The First Amendment rests on the assumption that the widest possible dissemination of information from diverse and antagonistic sources is essential to the welfare of the public."
- Hugo Black