P2P-Zone  

Old 18-05-23, 05:45 PM   #1
JackSpratts
 
Default Peer-To-Peer News - The Week In Review - May 20th, ’23

Since 2002

May 20th, 2023




Supreme Court Won’t Hold Tech Companies Liable for User Posts

The justices ruled in one case that a law allowing suits for aiding terrorism did not apply to the ordinary activities of social media companies.
Adam Liptak

The Supreme Court handed twin victories to technology platforms on Thursday by declining in two cases to hold them liable for content posted by their users.

In a case involving Google, the court for now rejected efforts to limit the sweep of the law that frees the platforms from liability for user content, Section 230 of the Communications Decency Act.

In a separate case involving Twitter, the court ruled unanimously that another law allowing suits for aiding terrorism did not apply to the ordinary activities of social media companies.

The rulings did not definitively resolve the question of what responsibility platforms should have for the content posted on and recommended by their sites, an issue that has grown increasingly pressing as social media has become ubiquitous in modern life. But the decision by the court to pass for now on clarifying the breadth of Section 230, which dates to 1996, was cheered by the technology industry, which has long portrayed the law as integral to the development of the internet.

“Companies, scholars, content creators and civil society organizations who joined with us in this case will be reassured by this result,” Halimah DeLaine Prado, Google’s general counsel, said in a statement.

The Twitter case concerned Nawras Alassaf, who was killed in a terrorist attack at the Reina nightclub in Istanbul in 2017 for which the Islamic State claimed responsibility. His family sued Twitter, Google and Facebook, saying they had allowed ISIS to use their platforms to recruit and train terrorists.

Justice Clarence Thomas, writing for the court, said the “plaintiffs’ allegations are insufficient to establish that these defendants aided and abetted ISIS in carrying out the relevant attack.”

He wrote that the defendants transmitted staggering amounts of content. “It appears that for every minute of the day, approximately 500 hours of video are uploaded to YouTube, 510,000 comments are posted on Facebook, and 347,000 tweets are sent on Twitter,” Justice Thomas wrote.

And he acknowledged that the platforms use algorithms to steer users toward content that interests them.

“So, for example,” Justice Thomas wrote, “a person who watches cooking shows on YouTube is more likely to see cooking-based videos and advertisements for cookbooks, whereas someone who likes to watch professorial lectures might see collegiate debates and advertisements for TED Talks.

“But,” he added, “not all of the content on defendants’ platforms is so benign.” In particular, “ISIS uploaded videos that fund-raised for weapons of terror and that showed brutal executions of soldiers and civilians alike.”

The platforms’ failure to remove such content, Justice Thomas wrote, was not enough to establish liability for aiding and abetting, which he said required plausible allegations that they “gave such knowing and substantial assistance to ISIS that they culpably participated in the Reina attack.”

The plaintiffs had not cleared that bar, Justice Thomas wrote. “Plaintiffs’ claims fall far short of plausibly alleging that defendants aided and abetted the Reina attack,” he wrote.

The platforms’ algorithms did not change the analysis, he wrote.

“The algorithms appear agnostic as to the nature of the content, matching any content (including ISIS’ content) with any user who is more likely to view that content,” Justice Thomas wrote. “The fact that these algorithms matched some ISIS content with some users thus does not convert defendants’ passive assistance into active abetting.”

A contrary ruling, he added, would expose the platforms to potential liability for “each and every ISIS terrorist act committed anywhere in the world.”

The court’s decision in the case, Twitter v. Taamneh, No. 21-1496, allowed the justices to avoid ruling on the scope of Section 230, a law intended to nurture what was then a nascent creation called the internet.

Section 230 was a reaction to a decision holding an online message board liable for what a user had posted because the service had engaged in some content moderation. The provision said, “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

Section 230 helped enable the rise of huge social networks like Facebook and Twitter by ensuring that the sites did not assume legal liability with every new tweet, status update and comment. Limiting the sweep of the law could expose the platforms to lawsuits claiming they had steered people to posts and videos that promoted extremism, urged violence, harmed reputations and caused emotional distress.

The case against Google was brought by the family of Nohemi Gonzalez, a 23-year-old college student who was killed in a restaurant in Paris during terrorist attacks there in November 2015, which also targeted the Bataclan concert hall. The family’s lawyers argued that YouTube, a subsidiary of Google, had used algorithms to push Islamic State videos to interested viewers.

In a brief, unsigned opinion in the case, Gonzalez v. Google, No. 21-1333, the court said it would not “address the application of Section 230 to a complaint that appears to state little, if any, plausible claim for relief.” The court instead returned the case to the appeals court “in light of our decision in Twitter.”

It is unclear what the ruling will mean for legislative efforts to eliminate or modify the legal shield.

A growing group of bipartisan lawmakers, academics and activists have grown skeptical of Section 230 and say that it has shielded giant tech companies from consequences for disinformation, discrimination and violent content across their platforms.

In recent years, they have advanced a new argument: that the platforms forfeit their protections when their algorithms recommend content, target ads or introduce new connections to their users. These recommendation engines are pervasive, powering features like YouTube’s autoplay function and Instagram’s suggestions of accounts to follow. Judges have mostly rejected this reasoning.

Members of Congress have also called for changes to the law. But political realities have largely stopped those proposals from gaining traction. Republicans, angered by tech companies that remove posts by conservative politicians and publishers, want the platforms to take down less content. Democrats want the platforms to remove more, like false information about Covid-19.

Critics of Section 230 had mixed responses to the court’s decision, or lack of one, in the Gonzalez case.

Senator Marsha Blackburn, a Tennessee Republican who has criticized major tech platforms, said on Twitter that Congress needed to step in to reform the law because the companies “turn a blind eye” to harmful activities online.

Hany Farid, a computer science professor at the University of California, Berkeley, who signed a brief supporting the Gonzalez family’s case, said that he was heartened that the court had not offered a full-throated defense of the Section 230 liability shield.

He added that he thought “the door is still open for a better case with better facts” to challenge the tech platforms’ immunity.

Tech companies and their allies have warned that any alterations to Section 230 would cause the online platforms to take down far more content to avoid any potential legal liability.

Jess Miers, legal advocacy counsel for Chamber of Progress, a lobbying group that represents tech firms including Google and Meta, the parent company of Facebook and Instagram, said in a statement that arguments in the case made clear that “changing Section 230’s interpretation would create more issues than it would solve.”

David McCabe contributed reporting.
https://www.nytimes.com/2023/05/18/u...itter-230.html





ChatGPT Tax Proposed By Media Organizations In Spain In Effort To Combat Content Pirating
Benjamin A. Smith

A proposal has been put forward by media associations, spearheaded by the Asociación para la Investigación de Medios de Comunicacion (AMI) suggesting the implementation of a “ChatGPT tax” on artificial intelligence (AI). This idea harks back to the unsuccessful “Google News Tax” from the past. The premise behind this proposal is that the process of AI generating content is akin to a journalist reading a story and crafting their own version. As a result, the associations are urging OpenAI, Google, and Microsoft to engage in negotiations regarding compensation for the use of their news by AI systems.

The primary point of contention revolves around the issue of intellectual property and the utilization of copyrighted content by AI. These systems generate text by statistically analyzing vast amounts of text available on the internet, absorbing and scrutinizing billions of pieces of information. However, the question of who holds the copyright to content generated by AI remains unresolved.

The AMI’s legal action emerges at a time when Google and Microsoft are on the brink of incorporating generative AI tools into their search engines, specifically OpenAI’s ChatGPT and Microsoft’s enhanced ChatGPT model integrated into Bing. Google has responded to the AMI’s request by affirming its commitment to developing tools that assist publishers in monetizing their content and enhancing their relationship with their audience.

The Club Abierto de Editores (CLABE), an organization representing approximately 1,000 titles from 180 publishing companies, the majority of which are digital natives, aligns with Google’s stance. They argue that the question of how AI-generated content is produced is a matter that concerns all citizens, not just the media industry.

Meanwhile, European privacy regulators have initiated a comprehensive investigation into ChatGPT to ascertain whether it has utilized personal data from European Union citizens during its training process.
https://thedalesreport.com/trends/ch...tent-pirating/





Police Arrest HBO Hacker

Man leaked House of the Dragon finale online.
Leonard Bernardone

Police in Israel have arrested a man suspected of hacking into HBO servers and publishing stolen episodes of its hit show House of the Dragon to pirating sites.

Last October, the season finale of House of the Dragon (a prequel series to Game of Thrones) was leaked to torrent sites by an anonymous hacker only two days before its intended premiere.

A statement from Israel Police reveals a months-long undercover investigation was conducted by its cyber crime unit after HBO issued a complaint about the suspected hack.

Police investigators used “advanced technological measures” while investigating the suspect – a resident of Givatayim in his 30s – and notably drew on findings from social media to home in on an arrest.

In a somewhat bizarre piece of sleuthing, police found a vital clue in the inclusion of the word “Bird” in a username tied to a pirate file – which later turned out to be the name of the suspect's cat.

Israel Police said it conducted a “search of his home” and seized equipment such as computers and “digital storage devices” possibly relevant to the investigation.

At the time of the leak, HBO appeared fed up with piracy (Game of Thrones was the most pirated TV show of 2019), and boldly declared it would "aggressively" monitor and pull copies from the internet.

The TV company has long pursued illegal copies of copyrighted media, issued warnings to viewers who pirate HBO content, and sometimes even overstepped in its anti-piracy efforts – such as when it issued a misplaced takedown notice against the media player VLC.

Following investigation, Israel Police released the suspected House of the Dragon hacker and the matter is now pending consideration at the prosecutor's office.

But HBO isn’t the only big media company cracking down on piracy of its copyrighted content.

Tears of the pirating community

A week before The Legend of Zelda: Tears of the Kingdom released to rave reviews, the game had already leaked to the internet.

The game is now out, but players are still turning to illegal downloads rather than the Nintendo eShop – stirring the notoriously litigious game company into a copyright frenzy.

Nintendo issued a slew of copyright takedowns to platforms such as Discord, Twitch and Twitter for so much as hosting links to pirated versions of the game.

Later, members of the Switch Pirates subreddit warned that Nintendo was closely eyeing pirating activity – with one user in particular posting a Digital Millennium Copyright Act notice they received from internet service provider Comcast warning them about an alleged copyright infringement.

“Bruh wtf. Did Nintendo somehow find out?” said user RevolutionaryToe6738.

“They know the exact means of how I did it too wtf.”

The copyright notice also contained information on the specific file the player had downloaded, where they pirated it and what torrenting platform they used.

The gaming titan went on to subpoena social platform Discord for the personal details of a user involved in sharing copies of an official Tears of the Kingdom artbook, and leading up to the release, takedown requests were fired against wholly legitimate social media accounts that simply re-shared official preview images from the game.

Nintendo's ask-questions-later approach has often left fans disgruntled by questionable copyright action, and Nintendo even recently hit its own Zelda Twitter accounts with a takedown notice.

“Nintendo is working hard to preserve the video game industry's ability to invest in the development of new and exciting games, and to give all legally-sold Nintendo games a chance to succeed,” reads Nintendo's Anti-Piracy Programme.

“Piracy continues to be a significant threat to Nintendo's business, as well as thousands of game development companies working to provide unique and innovative games for Nintendo’s console and handheld systems.”
https://ia.acs.org.au/article/2023/p...bo-hacker.html





Supreme Court Rules Against Andy Warhol in Copyright Case

The justices considered whether the artist was free to use elements of a rock photographer’s portrait of the musician Prince.
Adam Liptak

The Supreme Court ruled on Thursday that Andy Warhol was not entitled to draw on a prominent photographer’s portrait of Prince for an image of the musician that his estate licensed to a magazine, limiting the scope of the fair-use defense to copyright infringement in the realm of visual art.

The vote was 7 to 2. Justice Sonia Sotomayor, writing for the majority, said the photographer’s “original works, like those of other photographers, are entitled to copyright protection, even against famous artists.”

She focused on the fact that Warhol and Lynn Goldsmith, the photographer whose work he altered, were both engaged in the commercial enterprise of licensing images of Prince to magazines.

“To hold otherwise would potentially authorize a range of commercial copying of photographs, to be used for purposes that are substantially the same as those of the originals,” Justice Sotomayor wrote. “As long as the user somehow portrays the subject of the photograph differently, he could make modest alterations to the original, sell it to an outlet to accompany a story about the subject, and claim transformative use.”

In dissent, Justice Elena Kagan, joined by Chief Justice John G. Roberts Jr., wrote that the decision “will stifle creativity of every sort.”

“It will impede new art and music and literature,” she wrote. “It will thwart the expression of new ideas and the attainment of new knowledge. It will make our world poorer.”

The dueling opinions, from two liberal justices who are often allies, had an unusually sharp tone.

Justice Kagan’s opinion, Justice Sotomayor wrote, was made up of “a series of misstatements and exaggerations, from the dissent’s very first sentence to its very last.”

Justice Kagan responded that Justice Sotomayor wholly failed to appreciate Warhol’s art.

“The majority does not see it,” Justice Kagan wrote. “And I mean that literally. There is precious little evidence in today’s opinion that the majority has actually looked at these images, much less that it has engaged with expert views of their aesthetics and meaning.”

The decision was also unusual for including more than a dozen reproductions of artworks by Warhol and others, most of them in color.

The portrait of Prince at issue in the case was taken in 1981 by Lynn Goldsmith, a successful rock photographer on assignment for Newsweek.

In 1984, around the time Prince released “Purple Rain,” Vanity Fair hired Warhol to create a work to accompany an article titled “Purple Fame.” The magazine paid Ms. Goldsmith $400 to license the portrait as an “artist reference,” agreeing to credit her and to use it only in connection with a single issue.

In a series of 16 images, Warhol altered the photograph in various ways, notably by cropping and coloring it to create what his foundation’s lawyers described as “a flat, impersonal, disembodied, mask-like appearance.” Vanity Fair ran one of them.

Warhol died in 1987, and the Andy Warhol Foundation for the Visual Arts assumed ownership of his work. When Prince died in 2016, Vanity Fair’s parent company, Condé Nast, published a special issue celebrating his life. It paid the foundation $10,250 to use a different image from the series for the cover. Ms. Goldsmith received no money or credit.

Litigation followed, much of it focused on whether Warhol had transformed Ms. Goldsmith’s photograph. The Supreme Court has said a work is transformative if it “adds something new, with a further purpose or different character, altering the first with new expression, meaning or message.”

Justice Kagan rejected the idea that those photographs and Warhol’s image were fungible.

“Suppose you were the editor of Vanity Fair or Condé Nast, publishing an article about Prince,” she wrote. “You need, of course, some kind of picture. An employee comes to you with two options: the Goldsmith photo, the Warhol portrait. Would you say that you don’t really care? That the employee is free to flip a coin? In the majority’s view, you apparently would.”

She added: “All I can say is that it’s a good thing the majority isn’t in the magazine business. Of course you would care!”

The majority’s analysis, Justice Kagan wrote, was simplistic and wooden.

“All of Warhol’s artistry and social commentary,” she wrote, “is negated by one thing: Warhol licensed his portrait to a magazine, and Goldsmith sometimes licensed her photos to magazines too. That is the sum and substance of the majority opinion.”

The case, Andy Warhol Foundation for the Visual Arts v. Goldsmith, No. 21-869, concerned the limits of the fair-use defense, which allows copying that would otherwise be unlawful if it involves activities like criticism and news reporting.

Lower courts differed about whether Warhol’s alterations of the photograph transformed it into something different. Judge John G. Koeltl of the Federal District Court in Manhattan ruled that Warhol had created something new by imbuing the photograph with fresh meaning.

But a three-judge panel of the U.S. Court of Appeals for the Second Circuit said that judges should compare how similar the two works are and leave the interpretation of their meaning to others.

“The district judge should not assume the role of art critic and seek to ascertain the intent behind or meaning of the works at issue,” Judge Gerard E. Lynch wrote for the panel. “That is so both because judges are typically unsuited to make aesthetic judgments and because such perceptions are inherently subjective.”

Justice Sotomayor wrote that a crucial factor in the fair-use analysis — “the purpose and character of the use, including whether such use is of a commercial nature or is for nonprofit educational purposes” — weighed in Ms. Goldsmith’s favor.

“Warhol himself paid to license photographs for some of his artistic renditions,” Justice Sotomayor wrote. “Such licenses, for photographs or derivatives of them, are how photographers like Goldsmith make a living. They provide an economic incentive to create original works, which is the goal of copyright.”

Other Warhol works, like his images of Campbell’s soup cans, were a different matter, she wrote.

“The purpose of Campbell’s logo is to advertise soup. Warhol’s canvases do not share that purpose,” Justice Sotomayor wrote. “Rather, the soup cans series uses Campbell’s copyrighted work for an artistic commentary on consumerism.”
https://www.nytimes.com/2023/05/18/u...copyright.html





Former File-Sharing Platform LimeWire Reaches $17.5m in Total Funding Following Multiple Financing Rounds
Mandy Dalugdug

LimeWire, a file-sharing platform that relaunched as a digital collectibles marketplace, has raised USD $17.5 million in multiple rounds of funding through the sale of its cryptocurrency token, LMWR.

It means that the company exceeded its $15 million funding target and is now more than halfway to its $30 million fundraising limit.

The public sale of the LMWR token kicked off on May 2 and closed on Thursday (May 11). Remaining tokens are set to be burned and LMWR will now go live on multiple global exchanges on Tuesday (May 16).

Those who purchased LMWR tokens will automatically have accounts on limewire.com using the e-mail address that they used during the purchase process. Those who bought LMWR during the community pre-sale period can also claim their tokens via the same process and their LMWR will automatically vest into a LimeWire account that they used on the community sale launchpad, Tokensoft.

The presale raised $400,000 worth of LMWR, with $200,000 sold in the first 30 minutes of the sale.

The development comes as LimeWire transitions into an NFT marketplace after becoming notorious in the early 2000s for facilitating music piracy in its peer-to-peer file-sharing software.

Last year, the company raised $10.4 million in a private sale of its LMWR token. The round was led by Kraken Ventures, Arrington Capital and GSR, with participation from Crypto.com Capital, CMCC Global, Hivemind, Hard Yaka, Red Beard Ventures, FiveT, 720Mau5, the fund behind Canadian music producer Deadmau5, as well as DAO Jones, a group of investors consisting of high-profile members from the music industry, including well-known electronic music artist Steve Aoki.

At the time, LimeWire said it intends to use the proceeds to grow its team and extend partnerships. It also plans “to onboard major music artists” onto its platform.

Shortly after that announcement, LimeWire struck its first major label deal with Universal Music Group that saw the latter providing licenses to allow LimeWire to partner with UMG artists in order to launch music-based NFT projects using the LimeWire marketplace.

LimeWire last month launched a new subscription service for all kinds of creators and brands, not just musicians.

“Since our relaunch in July last year, we have been actively engaging with artists to understand their needs and aspirations. Through open dialogue and collaboration, we’ve sought to solve ongoing challenges faced by creators in today’s digital landscape, such as difficulties with monetization, content ownership, and community management,” LimeWire said in April.

LimeWire claimed that it is the world’s first artist subscription platform with fully ownable content for fans and followers.

The subscription model will allow fans to subscribe to their favorite creators, artists and brands, as well as purchase limited paid content and interact with other members within the creator’s community.

Through blockchain technology, LimeWire said creators can protect their content from exploitation and piracy. Creators are able to generate passive revenue streams through royalties and pay-per-view income as they mint their content.
https://www.musicbusinessworldwide.c...ancing-rounds/





The Death of Ownership

Companies are taking away your ability to actually own the stuff you buy
Nathan Proctor

Andy Harding has been running his small electronics-repair shop, Salem Techsperts, in Salem, Massachusetts, for the past eight years. He does steady business fixing phones for college students and nurses from the nearby hospital. But soon after the release of the iPhone 13 in September 2021, Harding noticed a minor change to Apple's software that he thought might shut down his small shop for good.

One of the most frequent repairs Harding does — and one of his biggest revenue drivers — is fixing cracked iPhone screens. But Apple added a new feature to the latest model that would detect when the display was swapped, including screen repairs, and then disable the FaceID feature. The shift freaked out the owners of many repair shops, including Harding.

"People pay good money for a phone with FaceID, and they want it to work," Harding told me recently. "Broken iPhone screens are the number-one repair for shops like mine. I couldn't survive without that part of the business."

Eventually, Apple rolled out a software update that allowed FaceID to work after a screen repair. But the phone still warns users the screen is not genuine unless they use an "Apple-authorized" repair provider. Why does anyone need Apple's blessing to fix their phone? You already paid for it — you own the phone, and you should be able to fix it on your terms.

Apple isn't the only company to put restrictions on goods that people have already bought. As more devices in our lives run on software, manufacturers have started to exert more control over their products even after the customer has taken them home. In some cases, companies force customers to use their repair services, disabling the product if they try to fix it themselves. In other instances, corporations require people to pay for an ongoing subscription to access basic features of the goods.

Modern software allows manufacturers to tether product users to them, forever. Companies are just beginning to monetize this control, with dystopian methods and the assistance of America's unbalanced copyright laws. But there are ways that consumers and policymakers can push back on this corporate attempt to redefine what it means to "own" a product.

You bought it, but you don't really 'own' it

Imagine the start of a hypothetical summer Monday, some time in the future. You remotely start your coffee machine ($5 a month for the app to schedule brewing in advance and another $25 for recurring delivery of compatible pods) while you hit your stationary bike for a quick workout ($30 a month for access to classes). When you're ready to head into the office, the smart thermostat automatically turns down the air-conditioning (a $10-a-month feature) as you use an app on your phone to remotely start your car (which costs you $20 a month). And if you want to get any of that fixed? Put away your screwdriver, because you'll have to go to the manufacturer for even a minor tune-up.
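The fees in that hypothetical morning add up quickly. A minimal tally, using only the article's illustrative monthly prices (the item labels are my own shorthand, not product names):

```python
# Hypothetical monthly subscription fees from the scenario above
# (illustrative figures from the article; labels are assumptions)
fees = {
    "coffee machine app": 5,
    "coffee pod delivery": 25,
    "stationary bike classes": 30,
    "smart thermostat": 10,
    "remote car start": 20,
}

monthly_total = sum(fees.values())   # 90
annual_total = monthly_total * 12    # 1080

print(f"Monthly: ${monthly_total}, Annual: ${annual_total}")
```

At those prices, the "smart" household runs $90 a month, or $1,080 a year, before a single repair bill.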

While this may seem far-fetched, the explosion of subscription services for consumer products is pushing reality closer to this hypothetical. The global market for e-commerce subscriptions is expected to increase from around $73 billion in 2021 to some $904 billion in 2026. In addition to the proliferation of meal-delivery boxes and streaming services, companies are in many cases making access to the very thing you bought contingent on your payment: no subscription and you've got a brick taking up space. For companies, the appeal of subscriptions is pretty straightforward: a steady stream of revenue and a lot more money raised from their customers over time. While software development and maintenance comes with its own set of costs, the overhead is much lower than hardware manufacturing and gives companies more opportunities to make an additional sale — meaning that recurring revenue comes with huge profit margins.
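The scale of that projection is easy to miss in the raw dollar figures. A quick back-of-the-envelope calculation from the numbers cited above (roughly $73 billion in 2021 to roughly $904 billion in 2026) shows the implied compound annual growth rate:

```python
# Implied compound annual growth rate (CAGR) from the article's
# cited projection: ~$73B (2021) -> ~$904B (2026), i.e. 5 years.
start, end, years = 73e9, 904e9, 5
cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 65% per year
```

Growing about 65% per year, the market would more than double every two years over that span.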

Businesses use a slew of tactics to keep customers on the hook after they've purchased a product. One tactic is to use technical sensors to prevent unauthorized changes to the product. Take the experience of America's farmers: Newer equipment like tractors and combines often requires special tools that manufacturers offer exclusively to authorized dealers. Along with highly technical computer systems, this makes it nearly impossible for farmers to fix their own vehicles. My organization, Public Interest Research Group or PIRG, calculated that repair restrictions cost farmers an additional $4.2 billion each year, with $1.2 billion going to the local authorized dealers and another $3 billion lost to equipment downtime. Similarly, Tesla's software can detect equipment that didn't come from the company, such as after-market tow hitches, and restrict features on those cars (even while Tesla's own hitches are out of stock).

In other cases, companies have tried to block consumers from accessing certain features at all unless they pay up first. Car companies have taken the lead on pushing this trend. Mercedes-Benz and BMW made headlines for charging users monthly fees for better acceleration and the use of heated seats, respectively. You already bought the seat heater (and the luxury car that contains it), but now you need to pay for the right to turn it on? Printer companies have used similar tactics to get people to sign up for subscriptions that remotely monitor ink levels but can also shut off your machine if you fail to pay. Imagine if you had to pay the contractor who built your house a monthly fee so the light switches would work!

Finally, manufacturers use internet connectivity to monitor and control what you do. If they detect you did something they don't like (maybe hot-wiring your heated seat), they can take away or disable other features. Tesla has been accused of revoking charge capacity, fast-charging compatibility, and other features remotely. Consumers are afraid to do anything that displeases manufacturers, knowing that they can be punished.

"We need to end the continual monitoring of our behavior by some far-off manufacturer that can approve or reject the choices we make with products we bought."

You might think there ought to be a law against policies that make people simultaneously "buy" and "rent" things. But existing laws work against consumers, allowing manufacturers to control what you can and cannot do. For example, overly broad copyright laws, in the hands of overzealous manufacturers, can make it a copyright crime to bypass technical systems to tinker with or repair your own device. The Digital Millennium Copyright Act was intended to prevent people from pirating music, games, or movies. But manufacturers have argued that the DMCA applies to software or firmware needed to fix or operate a piece of hardware. This overly broad definition of intellectual property has been leveraged to prevent independent repair and has redefined consumers' relationship with the goods they buy. By this interpretation, if the manufacturer installs a digital-protection measure around the heated seats, bypassing that could be seen as essentially piracy. If that's confusing, it's because it's silly.

Manufacturers also author dense user agreements that contain language to prevent customers from tinkering with the product. Most people have run into long "Terms of Use" documents chock-full of legalese that stretch on for pages and pages. In most cases, consumers simply check "agree" with little to no knowledge of what they're signing. A 2017 Deloitte survey of 2,000 consumers found that 91% click to agree with terms and conditions without reading them.

But inside these dense documents are rules that prevent people from fixing their goods or let the company take back ownership if they don't approve of how customers use the product. Sneaking these terms and conditions past people undermines basic consumer rights.

Tinker, tailor, service, mod

I believe in truth in advertising. If you're going to sell somebody something, sell it to them. If you're going to lease something to somebody, lease it to them. But if you tether their future purchases to a secret "agreement" baked into the technology, one they don't even know about, that is deceptive. Not to mention, tinkering and fixing are American traditions. The ethos of "if it's broke, fix it" has other benefits, too: repair teaches critical skills, saves consumers money, and helps cut waste and product obsolescence. Tinkering and fixing also lead to product innovations that can benefit everyone.

There are solutions to protect ownership. The first is right-to-repair legislation, which I've worked to pass in various states over the past five-plus years. Right to repair requires manufacturers to make the parts, tools, and information needed to conduct repairs available to consumers, on fair terms. It also says that those parts and tools cannot require remote authentication to become operable, meaning no more having to ask for permission to conduct repairs. So far in 2023, 28 states have considered some form of right-to-repair legislation, and Congress has held multiple hearings on the topic. Legislatures have now passed laws in Massachusetts, Colorado, and New York — and we're just getting started.

Another step is to clarify that repair isn't a copyright crime. The Freedom to Repair Act, introduced last year, would give a broad, permanent exemption to repair activities under copyright law. In addition to passing new laws, we need to enforce the laws we already have on the books. It's supposed to be a violation of antitrust laws to create a "tying" arrangement that forces someone who buys one product to buy other products or services. Anyone with a printer who has tried to find cheaper ink knows this isn't enforced effectively.

The US Federal Trade Commission and the US Department of Justice need to crack down on embedded software that forces product owners to pay monthly fees to use hardware they own. Regulators should also crack down on toxic legal terms put into user license agreements, just as they banned certain anti-consumer terms from credit-card-use agreements.

In the digital age, we need new consumer protections to reflect our agency as product-owning people. We need to be able to repair things without fear of reprisal. We shouldn't be forced to sign away our rights when we buy something. We need to end the continual monitoring of our behavior by some far-off manufacturer that can approve or reject the choices we make with products we bought.

Until then, we, like Andy Harding of Salem Techsperts, wait nervously to see what we will lose when the newest "innovation" hits the shelves.

Nathan Proctor is the Senior Director of US Public Interest Research Group's (PIRG) Right to Repair campaign.
https://www.businessinsider.com/comp...nership-2023-5

Until next week,

- js.


Current Week In Review

Recent WiRs -

May 13th, May 6th, April 29th, April 22nd

Jack Spratts' Week In Review is published every Friday. Submit letters, articles, press releases, comments, questions etc. in plain text English to jackspratts (at) lycos (dot) com. Submission deadlines are Thursdays @ 1400 UTC. Please include contact info. The right to publish all remarks is reserved.


"The First Amendment rests on the assumption that the widest possible dissemination of information from diverse and antagonistic sources is essential to the welfare of the public."
- Hugo Black
__________________
Thanks For Sharing