P2P-Zone  

Old 26-03-08, 06:55 AM   #1
JackSpratts
 
Default Peer-To-Peer News - The Week In Review - March 29th, '08

Since 2002


"This is a historic find, the earliest known recording of sound." – Samuel Brylawski


"What are the rights of the discoverer versus the improver? Come, Parisians, don’t let them take our prize." – Édouard-Léon Scott de Martinville, inventor of recorded sound


"We don't want to be subject to laws of the Patriot Act." – Tom Puk


"Listen, I just think it’s bizarre and funny. My main consideration is that my daughter doesn’t get embarrassed about it." – Rick Astley


"We do in fact sell more CDs than we did ten years ago, and I suspect there’ll be a lot of albums sold on CD [in future] as well." – Matt Philips, British Phonographic Industry


"This deal is the direct result of public pressure, and the threat of FCC action, against Comcast. But with Comcast's history of broken promises and record of deception, we can't just take their word that the Internet is now in safe hands." – Marvin Ammori


"We have decided on our own, not due to any court order or agreement, to bring the TorrentSpy.com search engine to an end and thus we permanently closed down worldwide on March 24, 2008. We now feel compelled to provide the ultimate method of privacy protection for our users - permanent shutdown." – Justin Bunnell


"Today all big torrent sites are pressured somehow. TPB has its share of pressure, however we expected it and have a legal system that is more just in cases like this. The way that the copyright lobby is going at this is totally wrong and we can’t let them win. And we won’t let them win." – Brokep


"It's a great clusterfuck for the American mind's idea of Mexico. This teaches the rest of the world that Mexico is not just a bunch of cactuses and sombreros." – Gustavo Arellano


"I don't fall over and convulse, but it hurts. I was on the phone when it happened, and I couldn't move and couldn't speak." – RyAnne Fultz


"I think we're right on the verge of 'ultra budget' filmmaking." – Kirk Mastin


"I have lived in the shadow of this my whole life. I am so happy now, I just can’t explain it." – Laura Siegel Larson


March 29th, 2008





Creator’s Family Reclaims the Rights to Superman
Michael Cieply

Time Warner is no longer the sole proprietor of Superman.

A federal judge here on Wednesday ruled that the heirs of Jerome Siegel — who 70 years ago sold the rights to the action hero he created with Joseph Shuster to Detective Comics for $130 — were entitled to claim a share of the United States copyright to the character. The ruling left intact Time Warner’s international rights to the character, which it has long owned through its DC Comics unit.

And it reserved for trial questions over how much the company may owe the Siegel heirs for use of the character since 1999, when their ownership is deemed to have been restored. Also to be resolved is whether the heirs are entitled to payments directly from Time Warner’s film unit, Warner Brothers, which took in $200 million at the domestic box office with “Superman Returns” in 2006, or only from the DC unit’s Superman profits.

Still, the ruling threatened to complicate Warner’s plans to make more films featuring Superman, including another sequel and a planned movie based on the DC Comics’ “Justice League of America,” in which he joins Batman, Wonder Woman and other superheroes to battle evildoers.

If the ruling survives a Time Warner legal challenge, it may also open the door to a similar reversion of rights to the estate of Mr. Shuster in 2013. That would give heirs of the two creators control over use of their lucrative character until at least 2033 — and perhaps longer, if Congress once again extends copyright terms — according to Marc Toberoff, a lawyer who represents the Siegels and the Shuster estate.

“It would be very powerful,” said Mr. Toberoff, speaking by telephone on Friday. “After 2013, Time Warner couldn’t exploit any new Superman-derived works without a license from the Siegels and Shusters.”

Time Warner lawyers declined to discuss the decision, a spokesman said. A similar ruling in 2006 allowed the Siegels to recapture their rights in the Superboy character, without determining whether Superboy was, in fact, the basis for Warner Brothers’s “Smallville” television series. The decision was later challenged in a case that has yet to be resolved, said Mr. Toberoff, who represented the family in that action.

This week’s decision by Stephen G. Larson, a judge in the Federal District Court for the Central District of California, provided long-sought vindication to the wife and daughter of Mr. Siegel, who had bemoaned until his death in 1996 having parted so cheaply with rights to the lucrative hero.

“We were just stubborn,” Joanne Siegel, Mr. Siegel’s widow, said in a joint interview with her daughter, Laura Siegel Larson. “It was a dream of Jerry’s, and we just took up the task.”

The ruling specifically upheld the Siegels’ copyright in the Superman material published in Detective Comics’ Action Comics Vol. 1. The extent to which later iterations of the character are derived from that original was not determined by the judge.

In an unusually detailed narrative, the judge’s 72-page order described how Mr. Siegel and Mr. Shuster, as teenagers at Glenville High School in Cleveland, became friends and collaborators on their school newspaper in 1932. They worked together on a short story, “The Reign of the Superman,” in which their famous character first appeared not as hero, but villain.

By 1937, the pair were offering publishers comic strips in which the classic Superman elements — cape, logo and Clark Kent alter-ego — were already set. When Detective Comics bought 13 pages of work for its new Action Comics series the next year, the company sent Mr. Siegel a check for $130, and received in return a release from both creators granting the company rights to Superman “to have and hold forever,” the order noted.

In the late 1940s, a referee in a New York court upheld Detective Comics’ copyright, prompting Mr. Siegel and Mr. Shuster to drop their claim in exchange for $94,000. More than 30 years later, DC Comics (the successor to Detective Comics) gave the creators each a $20,000-per-year annuity that was later increased to $30,000. In 1997, however, Mrs. Siegel and her daughter served copyright termination notices under provisions of a 1976 law that permits heirs, under certain circumstances, to recover rights to creations.

Mr. Toberoff, their lawyer, has been something of a gadfly to Warner in the past. In the late 1990s, for example, he represented Gilbert Ralston, a television writer, in a legal battle over his rights in the CBS television series “The Wild Wild West,” which was the basis for a 1999 Warner Brothers film that starred Will Smith. The case, said Mr. Toberoff, was settled.

Compensation to the Siegels would be limited to any work created after their 1999 termination date. Income from the 1978 “Superman” film, or the three sequels that followed in the 1980s, are not at issue. But a “Superman Returns” sequel being planned with the filmmaker Bryan Singer (who has also directed “The Usual Suspects” and “X-Men”) might require payments to the Siegels, should they prevail in a demand that the studio’s income, not just that of the comics unit, be subject to a court-ordered accounting.

Mrs. Siegel and Ms. Larson said it was too soon to make future plans for the Superman character. But they were inclined to relish this moment.

“I have lived in the shadow of this my whole life,” Ms. Larson said. “I am so happy now, I just can’t explain it.”
http://www.nytimes.com/2008/03/29/bu.../29comics.html





The Google Of Peer To Peer?
John Foley

Tiversa, a five-year-old company based in Pittsburgh, specializes in knowing what kind of content is being shared over peer-to-peer networks. Until now, it's concentrated on helping businesses find and fix data leaks caused by file-sharing users. But Tiversa's got other plans for its technology, including working with advertisers to understand and respond to user activity on P2P networks.

P2P networks have long suffered a bad rap, going back to the troubles of Napster seven or eight years ago. They can be a hornet's nest of copyright violation, ill-intended searches, and leaked data. As InformationWeek reported earlier this week, P2P networks are rife with sensitive business documents and personal information, often the result of users inadvertently storing those documents in their music folders or otherwise misconfiguring the file-sharing application during installation. (See "Your Data And The P2P Peril" and "Our P2P Investigation Turns Up Business Data Galore.")

Tiversa helps businesses get the data leak problem under control, and it turns out that its technology can be used in other ways, too. The company has begun working with unnamed advertisers to see if the information it culls from P2P activity -- primarily search terms and search matches -- might be used for market intelligence or even targeted ad campaigns.

How might that work? Say someone is looking for a copy of the 2007 movie Beowulf. Tiversa's real-time monitoring system might respond to the search with an offer of a licensed copy of the movie or a related game. These are somewhat murky waters given that P2P networks have served as a distribution channel for unlicensed music and movies, but it could be another step toward commercializing and legitimizing file sharing. Just last week, the Distributed Computing Industry Association, a trade group for the P2P industry, held a conference devoted to P2P advertising and new business models.

Tiversa's technology could ultimately turn up in browsers. Tiversa COO Chris Gormley says incorporating P2P search into a browser toolbar would be a "no brainer." The company has been in discussions with a major Web portal, though it won't disclose names. "They could use our technology to create a whole different search engine," he says. "They could become the Google of peer to peer."

Don't scoff. Tiversa says 1.5 billion searches a day take place on P2P networks, several times the volume handled by Google. If that figure is accurate, it's only a matter of time before mainstream advertisers get serious about the file-sharing crowd.
http://www.informationweek.com/blog/...ogle_of_p.html





BitTorrent Sites Show Explosive Growth
Ernesto

BitTorrent’s popularity is growing every day. Despite efforts from anti-piracy outfits such as the MPAA and IFPI, torrent sites continue to grow traffic-wise, and there is no sign that this trend will be halted anytime soon.

We decided to compile a list of the 25 most popular BitTorrent sites, and see how their (relative) popularity has grown over the past three months. Out of the 25 sites in the list, 21 improved their ranking in Alexa’s list of most popular sites on the Internet.

There are a few changes in the top 10 compared to the list of 2007’s top torrent sites. Mininova is still leading the bunch, but The Pirate Bay is now in second place, in front of isoHunt. TorrentReactor and TorrentPortal traded places as well; the sites are now 7th and 9th respectively.

YouTorrent is the only newcomer, and it’s impressive that the site made it into the top 25 just two months after its launch.

Top 25 torrent sites, March 2008
Rank # Torrent Site Alexa Rank (Dec 07) Alexa Rank (Mar 08) Change (%)
1. mininova.org 63 53 + 19 %
2. ThePirateBay.org 182 130 + 40 %
3. isohunt.com 170 147 + 16 %
4. Torrentz.com 231 192 + 20 %
5. BtJunkie.org 689 469 + 47 %
6. torrentspy.com 376 585 - 36 %
7. TorrentReactor.net 909 616 + 48 %
8. GamesTorrents.com 942 641 + 47 %
9. TorrentPortal.com 699 697 no change
10. btmon.com 924 743 + 24 %
11. sumotorrent.com 1,894 1,101 + 72 %
12. myBittorrent.com 1,861 1,454 + 28 %
13. animesuki.com 1,738 1,473 + 18 %
14. Fulldls.com 1,448 1,646 - 12 %
15. bitdig.com 5,805 1,945 + 300 %
16. torrentz.ws 7,990 1,991 + 400 %
17. newtorrents.info 3,348 2,272 + 47 %
18. Torrent-Finder.com 3,404 2,635 + 29 %
19. TorrentBox.com 2,812 2,686 + 5 %
20. Fenopy.com 3,102 2,901 + 7 %
21. torrentvalley.com 5,276 3,014 + 75 %
22. youtorrent.com … 3,107 New!
23. TorrentReactor.to 3,016 3,313 - 9 %
24. www.zoozle.org 4,669 3,369 + 39 %
25. www.seedpeer.com 3,992 3,449 + 16 %

Note: Alexa’s data gathering is not perfect. The exact figures may not be completely accurate, but it is a great tool (especially the traffic rank) to compare sites within the same niche and to get a global impression of traffic shifts over time.
http://torrentfreak.com/bittorrent-s...growth-080322/





When Capacity Is Never Enough

The Cautionary Tale of Video Downloads
Tom Steinert-Threlkeld

A bit is a bit is a bit. An electronic packet of data is a packet is a packet. Unless you look inside to see what it contains.

Yet Internet-access providers such as Comcast and Time Warner Cable, the nation’s two largest cable operators, are loath to open the envelopes.

Even if those packets are generated by a relatively small number of users carrying illegally obtained goods — and hogging bandwidth in the process.

“How do you distinguish, without invading the privacy of the people communicating?” asked Federal Communications Commissioner Jonathan Adelstein at last week’s Internet Video Policy Symposium in Washington, D.C. “It’s very hard to distinguish, just looking at packets, whether they’re legal or illegal.”

What Adelstein was talking about was Internet video — and more specifically, the distribution of movies, TV shows and other content by a technique called peer-to-peer sharing.

Broadband Internet-access subscribers use their own computers — and those of anyone who wants to share shows — to manage the distribution of whatever they want to watch. Unchecked, peer-to-peer users have become the bane of network operators, who see no way to limit the ability of these Net-video hogs to usurp the bandwidth they have spent billions to install.

Sharing a single high-definition movie can employ 25 Megabits per second of capacity until completed. A standard-definition show can take up 6 Mbps. This isn’t a case for your garden-variety 256 Kbps “broadband” connection anymore.

The solution?

If you’re a policymaker, such as Adelstein, the answer looks like … more bandwidth.

But the answer is not that simple, according to cable-system operators. With peer-to-peer file-sharing, there’s never enough capacity. And broadband network owners would spend huge sums — more billions — to serve the voracious appetites of as few as 5% of their subscribership.

The Video Hog

“Video can be a real bandwidth hog sometimes,” Adelstein said Tuesday night. “Of course, one great solution for that problem is adding more capacity.”

If that were accomplished, there would be no debate about the P2P file-sharing problem. You don’t see this kind of problem in Japan, where Internet users are already connected at 100 Megabits per second, Adelstein contends.

But wait. It’s not that cut and dried, according to Comcast. In the face of this form of hogging, there’s never enough capacity. Invest in more bandwidth and, by definition, it will almost immediately be overtaken by file-sharing.

“Because these P2P protocols are designed to devour any and all available bandwidth on the network, it is not possible to build one’s way out of the need for reasonable network management,” Comcast public policy counsel Joseph Waz and a phalanx of other Comcast lawyers argued to the Federal Communications Commission in a Feb. 12 filing of comments on broadband industry practices.

How can that be? It’s relatively straightforward, according to Sena Fitzmaurice, senior director of corporate communications and government affairs for Comcast in the nation’s capital.

The primary program used for efficient distribution of files, BitTorrent, splits tasks between available personal computers. A few central servers “seed” participating PCs with content and those machines, in turn, seed others. In the end, each computer becomes part of a one-for-all, all-for-one union, using a cable network’s last-mile bandwidth to distribute heavy duty content.

To move video or other large files, the seeded computers ask for as much capacity as they can get, to deliver the final result as quickly as possible. And since much of that work takes place in neighborhoods and not network backbones, available capacity is maxed out more rapidly.
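The swarm dynamic described above can be sketched with a toy model: once a peer has been seeded, it starts uploading too, so the aggregate upstream demand on the last mile compounds round after round. This is a simplified illustration with invented numbers, not data from the article.

```python
# Hypothetical sketch: how aggregate upload demand grows as a swarm spreads.
# The doubling-per-round model and the per-peer rate are assumptions.

def swarm_upload_demand(initial_seeds, rounds, per_peer_mbps=1.0):
    """Each round, every peer that has content seeds one new peer,
    so the number of uploading peers doubles (a simplified model)."""
    peers = initial_seeds
    demand = []
    for _ in range(rounds):
        demand.append(peers * per_peer_mbps)  # aggregate Mbps consumed upstream
        peers *= 2  # each seeded peer begins seeding another
    return demand

print(swarm_upload_demand(2, 5))  # [2.0, 4.0, 8.0, 16.0, 32.0]
```

The point of the toy model is simply that demand scales with the number of participating peers, which is why added capacity gets absorbed so quickly.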

When it comes to downloading video for later viewing, as opposed to streaming it for on-demand playback (“I Want My Web TV,” March 17, 2008, page 14), network operators and economic analysts define BitTorrent — and its adherents — as bandwidth hogs.

“It’s a question that answers itself, doesn’t it?” former FCC chief economist Gerry Faulhaber, now a professor at the University of Pennsylvania’s Wharton School of Business, said at the symposium. “It’s not called BitTrickle, after all.”

By one count in the middle of last year, video and audio streaming overtook peer-to-peer applications as the top consumer of Internet bandwidth, accounting for 46% of all traffic. That came after four years of P2P overwhelmingly consuming the most bandwidth, according to Ellacoya Networks.

But even though that report indicated P2P was “losing its status as the biggest bandwidth hog on the Net, that’s not what we’re seeing,” said Yankee Group analyst David Vorhaus. Cable operators continue to report that 60% to 75% of their Internet traffic is being generated by P2P file-sharing, Vorhaus said.

And, as with streaming video, it all boils down to a very small number of outsized users. Vorhaus estimates that 5% to 10% of Internet users are generating 80% to 90% of this P2P traffic. In 2006, research conducted by CableLabs staffer Terry Shaw and Clemson University computer science professor Jim Martin found, in fact, that it only takes about 10 BitTorrent users bartering files on a 500-home network node to double the delays experienced by all other users. That’s especially true if the BitTorrent users are taking in video and everybody else is using “normal priority” services, like e-mail or Web surfing, which are characterized as “best-effort” traffic.
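The delay-doubling effect reported by Shaw and Martin is consistent with basic queueing behavior, where delay grows sharply as a shared link's utilization climbs. Below is a toy illustration using the standard M/M/1 approximation; the utilization figures are invented, and the study's actual methodology differs.

```python
# Toy illustration of shared-node congestion: in the M/M/1 queueing
# approximation, mean delay scales as 1 / (1 - utilization).
# The load levels below are invented for illustration.

def relative_delay(utilization):
    """Mean queueing delay relative to an idle link (M/M/1 approximation)."""
    assert 0 <= utilization < 1
    return 1.0 / (1.0 - utilization)

base = relative_delay(0.50)        # node at 50% load: web and e-mail traffic only
congested = relative_delay(0.75)   # a handful of BitTorrent users added
print(congested / base)  # 2.0 -> delays for everyone else have doubled
```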

Online, BitTorrent has tried to refashion itself as a legitimate purveyor of video content. Its BitTorrent Entertainment Network, akin to Apple’s iTunes Store, allows legitimate downloads of more than 10,000 movies, TV shows, games and music videos. BitTorrent is marketing its technology to businesses that want to distribute video. Its first client was Brightcove, an online distributor that serves publishers such as Reuters, National Geographic and TMZ.

But Vorhaus, Faulhaber and many other experts believe the vast majority of BitTorrent’s Web traffic consists of content that is being distributed in violation of copyright law. Indeed, 90% of P2P downloads are still of illegally copied content, according to David Hahn, vice president of product management at SafeNet, which tracks the networks.

That means if the capacity of a cable or telephone broadband network were doubled to accommodate more traffic — and such file-sharing continued — it would, in effect, only support further theft.

Pamela’s Parts

Not managing this traffic flow could be life-threatening, Faulhaber said. Under a network-neutrality regime, an episode of Baywatch would have the same priority as the transfer of your personal medical history to a hospital the minute you collapse at a restaurant. “Pamela Anderson’s parts are not as important as your heart,” said Faulhaber.

But increasing capacity to support P2P traffic may be inevitable — because legitimate uses of the technology could skyrocket. Using local computers, not just central servers, to distribute video content can be extremely efficient. And if a large, for-profit enterprise such as Amazon or Netflix or Blockbuster or The Walt Disney Co., or all of the above, become heavy users of the technology for video downloads, bandwidth would be under heavy pressure.

“That’s the real concern,” said Vorhaus.

To that point, Comcast has tried to curb the current herd of download hogs by focusing on the uploads that fuel the sharing of files over swarms of personal computers.

To limit its network load, Comcast, according to its FCC comments, manages only the uploading of files, and only when the customer is not simultaneously downloading.

Such “unidirectional sessions” indicate an automated file-sharing process is underway and can be held up until “usage drops below an established threshold” of simultaneous sessions.

This is akin to smoothing out the flow of cars onto a highway through the use of temporary stop lights at on-ramps, Comcast contends.
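A minimal sketch of the ramp-metering policy as Comcast describes it might look like the following, where upload-only sessions beyond a threshold are queued until usage drops. The class name, threshold value, and data structures are assumptions for illustration, not Comcast's actual implementation.

```python
# Hedged sketch of threshold-based management of "unidirectional" upload
# sessions: admit up to a limit, queue the rest like cars at a ramp meter.

from collections import deque

class UploadSessionManager:
    def __init__(self, max_simultaneous=8):
        self.max_simultaneous = max_simultaneous
        self.active = set()
        self.waiting = deque()  # sessions delayed until usage drops

    def request(self, session_id):
        """Admit the session if under the threshold, else queue it."""
        if len(self.active) < self.max_simultaneous:
            self.active.add(session_id)
            return "admitted"
        self.waiting.append(session_id)
        return "delayed"

    def finish(self, session_id):
        """When a session ends and usage drops below the threshold,
        release the next waiting session."""
        self.active.discard(session_id)
        if self.waiting and len(self.active) < self.max_simultaneous:
            self.active.add(self.waiting.popleft())

mgr = UploadSessionManager(max_simultaneous=2)
print(mgr.request("a"), mgr.request("b"), mgr.request("c"))  # admitted admitted delayed
```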

In effect, Comcast has tried to curb the effects of the use of a P2P application without condemning the application itself. But why?

“Why not target the thing that is causing the problem?” Georgetown Center for Business and Public Policy senior fellow Scott Wallsten said of BitTorrent. “What we have is a problem in waiting.”

Normal network-management technique or not, the Comcast practice of delaying BitTorrent traffic at peak times has prompted scrutiny from the Federal Communications Commission. After the Associated Press characterized the practice as "blocking" data, a coalition of consumer groups and law professors filed a complaint with the FCC. They claimed Comcast had violated the agency's two-year-old net neutrality policy statement and should be ordered to stop, as well as fined up to $195,000 per affected customer. Comcast has about 13 million high-speed data subscribers.

“Comcast does not block any Web site, application, or Web protocol, including peer-to-peer services, period,” its executive vice president, David Cohen, responded. “What we are doing is a limited form of network management objectively based upon an excessive bandwidth-consumptive protocol during limited periods of network congestion.”

The commission subsequently opened a formal investigation and, a few weeks ago, held a hearing at the Harvard Law School into the fairness of the practice.

The commission’s chairman, Kevin Martin, says that one troubling aspect of the case, in his view, is that Comcast at first denied the allegations. Also troubling, he said, were allegations that Comcast altered certain user information in packets to delay peer-to-peer transmissions. He said a ruling will come by July 1.

For its part, Comcast told the commission in its February comments that many of the complaints about the effects of the “blocking” had nothing to do with blocks or delays. Among the complaints: that users couldn’t check e-mail when sending files from home to work and that chat services weren’t working properly, allegedly because of bad network management.

“These commenters’ calls for Commission intervention are misplaced,” Waz and the Comcast attorneys said. “Surely, the Commission has neither the resources nor the ability to turn itself into the help desk for 60 million broadband households.”

The Capacity Panacea

The solution again, at first blush, looks like this: Add capacity. Then you could handle all bits equally, without delay.

That “would also deal with the network-neutrality issues. The more capacity, the less of an issue it becomes,” Adelstein said.

But capacity can be a chimera, said Faulhaber. Just look at Japan, a country that Adelstein cites as an exemplar of serving net video hogs and average users at the same time.

Japanese consumers are already used to Internet-access speeds of 100 Mbps — the rate touted by Comcast CEO Brian Roberts at the 2007 Cable Show as the imminent signal speed of “wideband Internet access” from cable operators.

Faulhaber notes that even in Japan, 100 Mbps is not fast enough to eliminate the need for the kind of network management that has suddenly become Comcast's cardinal sin.

Even in Japan, delaying the arrival of some content — so-called traffic-shaping — is common. This is a normal practice that allows more efficient traffic processing for all users, not just hogs.

“To say, 'We have a neutral network,’ you never do,” said Faulhaber. You have to be proactive, to give more users more services on a consistent basis, he said.

Telcos like Verizon also face bandwidth scarcity problems from peer-to-peer applications. The problem, however, is more acute for cable, since the upstream bandwidth of a DOCSIS cable modem is just a fraction of the downstream speeds.

Meanwhile, Verizon’s public-relations handling of broadband network management-related issues stands in contrast to Comcast’s.

Earlier this month, the telco announced it has tested a system called P4P, developed by researchers at Yale and the University of Washington, designed to keep peer-to-peer traffic off the Internet’s backbone networks. This technology more intelligently directs P2P traffic to local peers, instead of letting software like BitTorrent try to suck data willy-nilly from all over the world. Verizon claimed that using P4P, 58% of peer-to-peer traffic came from nearby P2P users on its network, compared with 6% before.
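The locality preference Verizon attributes to P4P can be illustrated with a small sketch: given candidate peers tagged by network region, prefer peers in the requester's own region before reaching across the backbone. The data model here is invented; real P4P relies on topology hints supplied by the ISP rather than simple region labels.

```python
# Illustrative locality-aware peer selection, in the spirit of P4P.
# The (peer_id, region) tagging is an assumption for this sketch.

def select_peers(my_region, candidates, want=4):
    """candidates: list of (peer_id, region). Local peers come first,
    so traffic stays off the backbone when local sources exist."""
    local = [p for p, r in candidates if r == my_region]
    remote = [p for p, r in candidates if r != my_region]
    return (local + remote)[:want]

peers = [("p1", "east"), ("p2", "west"), ("p3", "east"), ("p4", "europe")]
print(select_peers("east", peers, want=3))  # ['p1', 'p3', 'p2']
```

Compared with letting the client pull pieces from random peers worldwide, this kind of bias is what lets an ISP report a jump like Verizon's 6% to 58% local-traffic figure.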

Verizon attempted to position the move as an embrace of P2P, pointing out there are legitimate uses of peer-to-peer. In a press release, the telco noted that NBC Universal is using P2P as part of its NBC Direct episode-download service. NBCU has been a critic of P2P networks used to swap copyrighted material, urging the FCC last year to require broadband providers to prevent video piracy.

“No longer the dark-alley distribution system for unauthorized file sharing, advanced P2P delivery networks link content-seekers with licensed files they want,” Verizon’s announcement said.

The cable industry also says it’s working on such engineering solutions to handle P2P traffic with technology suppliers. And that’s how the issues ought to be resolved, National Cable & Telecommunications Association CEO Kyle McSlarrow said during a conference call with reporters last Thursday.

“Let the marketplace and the Internet community examine the results of what is, and is not, working,” McSlarrow said.

The management of traffic — and the curbing of bandwidth hogs — is going to get more critical as more types of video unique to the Internet — like feeds from security cameras, or video-rich faux-3D imagery found in such games as the Second Life/Google Earth Web environment Second Earth — emerge.

It’s presumed that Internet video will amount to either narcissistic homemade, personally-focused video or high-end, high-quality professional fare. The real result, though, is likely to be a host of new types of video applications that could only exist with such a low-cost mechanism of worldwide distribution.

The trick, said Faulhaber, is for video content producers to collaborate with distributors to provide a protected channel of high-quality video to legitimate viewers. Otherwise, charging for content will be impossible.

Timing is the problem, said Georgetown’s Wallsten. In essence: Knowing what to do about Net video hogs, and when to do it.

“The Internet changes really fast,” he said. “Policies can’t.”

Todd Spangler contributed to this report.
http://www.multichannel.com/article/CA6544099.html





Comcast to Stop Hampering File-Sharing
Deborah Yao

Comcast Corp., an Internet service provider under investigation for hampering online file-sharing by its subscribers, announced Thursday an about-face in its stance and said it will treat all types of Internet traffic equally.

Comcast said it will collaborate with BitTorrent Inc., the company founded by the creator of the peer-to-peer file-sharing protocol, to come up with better ways to transport large files over the Internet instead of delaying file transfers.

Since user reports of interference with file-sharing traffic were confirmed by an Associated Press investigation in October, Comcast has been vigorously defending its practices, most recently at a hearing of the Federal Communications Commission in February.

Consumer and ''Net Neutrality'' advocates have been equally vigorous in their attacks on the company, saying that by secretly blocking some connections between file-sharing computers, Comcast made itself a judge and gatekeeper for the Internet.

They also accused Comcast of stifling delivery of Internet video, an emerging competitor to the core business of the nation's largest cable operator.

It was not immediately clear what effect, if any, the move will have on the FCC's ongoing probe, but Net Neutrality groups remained skeptical.

''This deal is the direct result of public pressure, and the threat of FCC action, against Comcast,'' said Marvin Ammori, general counsel of Free Press, a media reform group. ''But with Comcast's history of broken promises and record of deception, we can't just take their word that the Internet is now in safe hands.''

Shares in Comcast rose 29 cents, or 1.5 percent, to $20 in midday trading Thursday.

Comcast has said that its practices were necessary to keep file-sharing traffic from overwhelming local cable lines, where neighbors share capacity with one another.

On Thursday, Comcast said that by year's end, it will no longer target files based on the type of protocol used, such as BitTorrent's, and will instead explore alternatives.

''The outcome will be a traffic management technique that is more appropriate for today's emerging Internet trends,'' Tony Werner, Comcast's chief technology officer, said in a statement.

One option is to delay file transfers for the heaviest downloaders, regardless of protocol, the Philadelphia-based company said.

Comcast said it also was monitoring Time Warner Cable Inc.'s experiment in placing explicit caps on the monthly downloads for new customers in Beaumont, Texas. Subscribers who go over their allotment will pay extra, much like a cell-phone subscriber who uses too many minutes in a month.

But Comcast may be wary about charging certain users more because of competitive pressure, especially after rival Verizon Communications Inc. said recently that such traffic is legitimate and that its FiOS network can handle the flow, said Harold Feld of Media Access Project, a nonprofit advocacy group in Washington, D.C.

Comcast has been hampering the BitTorrent file-sharing protocol, which, together with the eDonkey protocol, accounts for about a third of all Internet traffic, according to figures from Arbor Networks. The vast majority of that is illegal sharing of copyright-protected files, but file-sharing is also emerging as a low-cost way of distributing legal content -- in particular, video.

On Thursday, Werner all but embraced peer-to-peer file transfers, saying the techniques have ''matured as an enabler for legal content distribution.''

The company initially veiled its traffic-management system in secrecy, saying openness would allow users to circumvent it. Werner said the company now would ''publish'' the new technique and take into account feedback from the Internet community.

Comcast and BitTorrent said they want to work out network management issues privately, without the need for government intervention.

FCC Commissioner Robert McDowell agreed, saying in a statement that ''the private sector is the best forum to resolve such disputes.''

For its part, BitTorrent acknowledged that service providers have to manage their networks somehow, especially during peak times.

''While we think there were other management techniques that could have been deployed, we understand why Comcast and other ISPs adopted the approach that they did initially,'' Eric Klinker, BitTorrent's chief technology officer, said in a statement.

Comcast also said that the issue is larger than BitTorrent. It said it was in talks with other parties to find a solution, although the cable company might not have much of a choice.

Verizon recently announced that by sharing information with Pando Networks, another file-sharing company, Verizon was able to speed up file-sharing downloads for its subscribers while reducing the strain on its own network. AT&T Inc. has been looking at similar collaboration.

However, phone companies are in a better position than cable companies to deal with file-sharing traffic, since neighbors don't share capacity on phone lines.

------

Associated Press Business Writer Barbara Ortutay and Technology Writer Peter Svensson in New York contributed to this story.
http://www.physorg.com/news125835899.html





BitTorrent Inc. + Comcast = Love, Peace, Harmony…Not!
enigmax

When Robb Topolski made the initial discovery that Comcast was interfering with BitTorrent traffic, he couldn’t have imagined that it would lead to an FCC hearing or, more importantly, to apparent reconciliation this week between Comcast and the rest of the world. Thing is, Robb doesn’t believe a word of it.

Ever since the news broke that Comcast had been using ‘hacker-like’ techniques to hamper BitTorrent traffic, Comcast’s name has been dragged through the mud, with claim after claim of dirty tricks, lies, half-truths and strategic omission. It seems that nothing could go right for the company. Until this week, that is.

Apparently, everything in the Comcast garden is rosy these days, with former arch-rival BitTorrent now working things out over afternoon tea. However, not everyone is celebrating.

In 2007, Robb Topolski discovered and documented the Comcast interference, informed TorrentFreak and we published an article which ignited the whole debate. It’s safe to say, he’s been following this one closely.

Here are his thoughts on the Comcast / BitTorrent reconciliation:

—–

I’m probably a key figure as to why we’re all talking about Network Neutrality again. I was having a problem uploading on Gnutella in early 2007. I tracked it down to Comcast using Sandvine-injected RST packets. Blog stories led to press stories which led to independent confirmation. And here we are today. Peace and harmony? Probably not.

Today Comcast and BitTorrent seems to have solved world hunger — and I’d love nothing more than to be optimistic about it. But I cannot be. As they say on Slashdot — show video, or it didn’t happen. This deal is treachery, relies on how much we can trust the word of Comcast, and leaves the public interests out in the cold.

I think it’s strange that anyone believes a word that Comcast says. This is the Comcast that:

1. Told the FCC in 2005 that they would not degrade traffic in order to convince the FCC that network neutrality regulations were not needed.

2. Started degrading P2P traffic the very next year, and failed to tell anyone what they were doing.

3. Used a system that utilized forgery, and successfully placed blame on the other peer instead of Comcast.

4. Denied it when caught.

5. Then changed their story when the denials were not believed, but still never came out and said what they were doing.

6. Then they justified their actions by throwing their other Cable-Internet brothers and sisters under the bus with their “they do it too!” defense.

7. Then stealthily changed the AUP days before an FCC filing where they referred to the new provisions.

8. When the changed AUP started getting press attention, they stated that a prominent story on Comcast.net alerted millions of visitors of the change and accused Marvin Ammori of crying wolf. (Google cache proved that nothing alerted users to the changed AUP until the day after the press started asking questions.)

9. Then they packed the Harvard FCC hearing.

This company has not demonstrated that you can trust its promises, nor can you believe its assertions. Comcast just used BitTorrent Inc. as a tool to try and defang the FCC.

BitTorrent Inc. is a content provider. Vuze, who actually DID make a complaint and petition to the FCC, is a competitor. Neither BitTorrent, Vuze, nor Comcast represents the interests of 12 million Comcast users, nor the Internet Society, nor the public. And this middle-of-the-night deal was made without their input.

Nothing has changed. The RST interference continues. It was a wrongful act. BitTorrent Inc. has no right making a deal with Comcast allowing it to continue to commit wrongful acts until it finally decides it is ready to stop. The correct relief is to stop the interference immediately and to FULLY DISCLOSE what it did and to accept responsibility for those actions. (Even today, Comcast’s Policy VP refused to answer questions about the interference.)

Their word is worthless. Until the interference stops, I have no reason to believe it will. Until either meaningful competition returns to broadband, or until sufficient government regulation enforces Network Neutrality, we have no reason to think that this agreement will last through the night.

Robb Topolski

—–

TorrentFreak confronted Ashwin Navin of BitTorrent Inc. with Robb’s comments, and he told us: “We decided to collaborate with Comcast because they agreed to stop using RSTs, increase upload capacity, and evaluate network hardware that accelerates media delivery and file transfers. We’re at the beginning of the formal collaboration, but Robb’s work was instrumental to identify the offending practice. We need him and the community to keep an eye on ISPs across the world.”

“Our work with Comcast will benefit all P2P development because Comcast has agreed to manage traffic at Layer 3 (the network layer) rather than Layers 4-7 (the protocol/application layers). This is a core component of the neutrality debate,” he added.

As always, time will tell…
http://torrentfreak.com/bittorrent-c...ve-not-080329/





New Project Sniffs Out Bandwidth Shaping ISPs
Thomas Mennecke

The file-sharing world has been consumed by a continuous barrage of coverage on the Comcast vs BitTorrent battle. While Comcast may emerge from its policy of delaying traffic relatively unscathed, its image has taken a beating in the public relations department. In response to the growing distrust of ISPs, members of the p2pforum.it community developed a tool, dubbed Project Gemini, which allows users to test whether their ISPs block traffic.

According to a forum post, a spokesperson for the project states: "Our aim is to produce evidence with the technique below: we've developed two 'Live' operating systems designed to connect with one another over the Internet, to start a BitTorrent transfer, and to record the transmission - after which it will generate a report containing the analysis of the traffic."

Testing for a shaped BitTorrent transmission isn't terribly difficult, but it may be daunting for the less computer savvy. The project requires the end user to download an ISO, burn it to CD, change the boot sequence to initialize from the optical drive, and run the program, which only functions under Ubuntu. The end user isn't required to have Ubuntu installed beforehand; rather, the CD is self-contained with all the required data.
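The forum post doesn't detail how the report is generated, but the heart of any such analysis is flagging TCP resets that neither endpoint actually sent - the technique Comcast's Sandvine gear was caught using. Here is a toy Python sketch of that idea, with hand-written packet dicts standing in for a real capture; the heuristic and field names are illustrative assumptions, not Project Gemini's actual code:

```python
# Toy sketch of the traffic analysis a tool like Project Gemini might run.
# Real tools inspect a packet capture; here each packet is a simple dict,
# and a RST is flagged as suspect if its sequence number doesn't match
# anything either endpoint reports having sent (hypothetical heuristic).

def find_suspect_rsts(packets, sent_seqs):
    """Return RST packets whose sequence number neither endpoint claims to have sent."""
    return [p for p in packets
            if p["flags"] == "RST" and p["seq"] not in sent_seqs]

capture = [
    {"flags": "ACK", "seq": 1000},
    {"flags": "RST", "seq": 9999},   # appeared mid-transfer, origin unknown
    {"flags": "ACK", "seq": 1001},
]
print(len(find_suspect_rsts(capture, sent_seqs={1000, 1001})))  # 1
```

A real report would of course work from a pcap file on both ends of the transfer and compare them, but the comparison logic reduces to this kind of set membership test.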

So if you can burn an ISO - a familiar task in the file-sharing world - and are suspicious of your ISP, this program may be worth a try.
http://www.slyck.com/story1680_New_P...h_Shaping_ISPs





ISPs Limit Access to CBC Download, Users Say

Thousands of viewers have embraced CBC-TV's experiment with BitTorrent, but many Canadians have found their attempts to access a CBC show online restricted by their internet service providers.

On Sunday, CBC offered a final episode of reality TV program Canada's Next Great Prime Minister for download via BitTorrent, a file-sharing service.

The release was an experiment for the public broadcaster in new ways of offering its programming.

Downloads were in the thousands, said Tessa Sproule, executive in charge of digital programming for factual entertainment at CBC Television.

It is impossible to tell how many people actually saw the program because the files are passed from one computer user to another through BitTorrent, she added.

"It was very promising," Sproule told CBC News. "People around the world are seeing the file, probably because it's the first time.… We're very happy about it."
ISP bottlenecks

However, downloaders who blogged about the experience on the Canada's Next Great Prime Minister site complained about the very long times required to download the show.

One user received a notice that it could take 2½ hours to download, while another was quoted 11 hours.

The bottleneck is occurring because ISPs such as Rogers and Bell limit the amount of bandwidth allocated for file-swapping on BitTorrent.

The controversial practice, called traffic shaping, is meant to stop illegal downloading through BitTorrent. But it also slows the times on legal downloads such as Canada's Next Great Prime Minister.

Michael Geist, an Ottawa-based advocate of open sharing over the internet, called the CBC experiment an "enlightened approach to content distribution."

But he warned that ISP practices could get in the way.

"It would be ironic if ISP network management practices ensured that viewers outside the country enjoyed better access to the program than the Canadian taxpayers who helped fund its creation," he wrote in his blog.

Some people posting on the show blog said they'd left their computers running to help speed up downloading through BitTorrent for others.

Most users posting on the blog welcomed the CBC's experiment with BitTorrent.

"With the state of affairs of Canada's fading telecommunications industry, it's fantastic to see that CBC is pressing new boundaries. Kudos on finding new ways to provide Canadian content," said a user called Bob.

Others asked for more content to be delivered this way, including favourite shows such as Jpod, Rick Mercer Report, Fifth Estate and the news.

"I'd like to see more content delivered in this way, without restriction and I can tell you that the majority do not mind ad placement within the content," said a user called Steven G.

There were complaints about the quality of the download, with some saying the images were distorted.

Sproule said CBC is working on refining the quality of broadcast via BitTorrent.
http://www.cbc.ca/arts/tv/story/2008...rrent-cbc.html





Norwegian ISPs Refuse MPAA’s Request to Disconnect Pirates
Ernesto

After being blown off by the Norwegian police, MPAA lawyer Espen Tøndel is now demanding that ISPs disconnect Norwegian file-sharers from the Internet. According to IKT Norway, an interest group for ISPs, the lawyer has sent a letter to Norwegian ISPs on behalf of The Norwegian branch of the MPAA.

In the letter, Tøndel asks the ISPs to notify customers who share copyrighted content, and threaten to disconnect them from the internet. Tøndel also attached a document that supposedly links the IP-addresses of seeders to copyrighted works.

It seems that Norway is not alone in this: Jim Williams, the MPAA’s senior vice president, advocated a similar disconnection policy in the US yesterday. IKT Norway is not too happy about the letter, though.

“In a constitutional state, the police and the prosecuting authority have the job of investigating and indicting, not lawyers and communication engineers”, says Hallstein Bjercke from IKT Norway, in a press release.

“Most of the big ISPs in Norway are members of IKT Norway and we will support the various ISPs as best we can against what we see as a preposterous demand from Simonsen”, Bjercke adds.

He asks the ISPs to contact IKT Norway instead of answering the law firm’s letter. “In our opinion, Tøndel asks the ISPs to assist them in their private investigation of filesharers. Tøndel’s law firm asks the ISPs to use personal information about their customers in a way that would be a breach of the Norwegian laws on personal information and personal privacy, in addition to breaching the contract between individual customers and their respective ISP.”

“What Simonsen is actually asking for is confessions from the alleged filesharers, which can be used against them if Simonsen decides to sue”, Bjercke said.

IKT Norway makes it clear that the Norwegian ISPs will not take the role of investigator and judge against their own customers. “To give that kind of responsibility to the ISPs is like asking the mailman to control the contents of every letter and package he delivers,” Bjercke says.

IKT Norway is now looking into the legality of the law firm’s private investigation and of connecting personal information to the customers of Norwegian ISPs.
http://torrentfreak.com/isps-refuse-...equest-080328/





Swedish ISP Refuses To Block The Pirate Bay
enigmax

After forcing a single ISP in Denmark to block The Pirate Bay, it now appears that the IFPI has a plan to sue all of the major Swedish ISPs to force them to do the same. Telia Sonera, a large Swedish ISP is refusing to be bullied, stating that such blocking and filtering actions are illegal under EU law.

Now that the IFPI has realized that it can’t sue every file-sharer in the world, it is trying to force ISPs to block their customers from accessing filesharing sites such as The Pirate Bay. The IFPI recently forced an Israeli ISP to block access to HTTPShare.com - which boosted its visitors significantly - but it’s the block it achieved against The Pirate Bay in Denmark that is currently being used as leverage against other ISPs, this time in Sweden.

When IFPI pressure forced Danish ISP ‘Tele2′ to block access to The Pirate Bay, the trade organization chose to view this as some sort of landmark decision which could be used to make other ISPs take the same action, regardless of the likelihood that the block ordered by Denmark broke EU law. According to reports, the IFPI is using this ’successful’ block as a legal hammer to start hitting Swedish ISPs over the head with, as it formulates its plan to sue all of Sweden’s major ISPs into blocking The Pirate Bay.

One such ISP is Telia Sonera, the dominant internet provider in Sweden with a total of 106 million subscribers across Europe. According to a report, they have received a letter from the IFPI which states that legal measures will be taken against them unless they block The Pirate Bay initially, and also some other (as yet unnamed) sites connected to it.

However, Telia is highly experienced in its field and has a very clear understanding of its obligations under the law, which does not require it to block sites or filter content.

Patrik Hiselius of Telia Sonera explains: “The rules say that we as Internet carriers are not allowed to listen in on what our customers are sending to each other or are talking with each other about. That’s something police and prosecutors are allowed to do after a decision has been made about it in court.”

Unfortunately for the IFPI, the law is very stringent when it comes to wiretapping communications. Eavesdropping is allowed, but only in cases of very severe crime.

Anna Hörnlund, a lawyer with the The Swedish Post and Telecom Agency believes that it’s impossible to identify illegal file-sharing without breaking the law on wiretapping: “To get access to this information, there needs to be a crime that is punishable by imprisonment and where a prosecutor believes charges can be made that leads to prison. In those cases, the ISP can hand over the information to the police. I don’t know how they think they will get through this by suing the ISP.”

Patrik Hiselius of Telia Sonera agrees that the record companies have a problem to solve, but says they are approaching the issue in the wrong manner: “The best way to meet the demand for music and film on the Internet would be to make good, simple legal services available with good pricing. The legal sites still have lots of shortcomings when it comes to availability.”

The IFPI has recently sued Irish ISP Eircom, claiming that it’s responsible for the actions of its users. TJ McIntyre, chairman of Digital Rights Ireland (DRI) said: “ISPs are intermediaries. They are not, in law, responsible for what Internet users do, any more than [the post office] is responsible for what individuals send in the mail. In fact, European law specifically states that they may not be put under a general obligation to monitor the information they transmit.”

Clearly the IFPI believes that European law doesn’t count when applied to their interests. If their lawyers weren’t getting so financially fat from these frivolous lawsuits, maybe they would take the time to explain it to them.
http://torrentfreak.com/swedish-isp-...te-bay-080327/





Winny Copyright Infringers 'Should be Identified'
The Yomiuri Shimbun

An advisory panel to the National Police Agency that is looking into ways to prevent cybercrime released a report Thursday on copyright violation issues concerning the peer-to-peer file-sharing software Winny, proposing that Internet service providers be required to disclose the identities of customers who have used the software to illegally exchange copyrighted movies and music.

The panel, headed by Prof. Masahide Maeda of Tokyo Metropolitan University, also suggests in the report the establishment of a system to make it easier for copyright associations to demand compensation from violators.

In response to the report, Internet industry groups plan to start discussions with copyright associations regarding guidelines on disclosure of violators' identities.

The service providers are required to protect the confidentiality of communications in line with the Constitution. However, the law stipulating the responsibilities of the providers allows them to disclose a person's identity in cases where that person has clearly violated other people's rights.

However, no specific rule concerning the disclosure of identities has been implemented, so the identities of those who infringed copyrights using Winny and other so-called P2P file-sharing software have rarely been disclosed.

Many people use Winny to download copyrighted movies and music over the Internet free of charge. Losses resulting from these illegal transactions are estimated at 10 billion yen.

Members of the Telecom Services Association, the Japanese Society for Rights of Authors, Composers and Publishers and other related organizations have been discussing how to tackle the problem.

Ahead of the release of the panel's latest report, the providers and copyright groups agreed to withdraw Internet services from copyright violators.
http://www.yomiuri.co.jp/dy/national...28TDY02308.htm





TorrentSpy Shuts Down
Ernesto

A little over a year ago, TorrentSpy.com was still the most visited BitTorrent site, but times have changed. After an expensive two-year battle with the MPAA, TorrentSpy decided to throw in the towel and the site has now shut down permanently.

TorrentSpy is no more, Justin Bunnell, the founder of the site writes: “We have decided on our own, not due to any court order or agreement, to bring the TorrentSpy.com search engine to an end and thus we permanently closed down worldwide on March 24, 2008.”

The main reason for the shutdown is the ongoing legal battle with the MPAA, which started February 2006. “We now feel compelled to provide the ultimate method of privacy protection for our users - permanent shutdown,” Justin writes.

By the end of 2006 TorrentSpy was more popular than any other BitTorrent site, but this changed quickly in August 2007, when a federal judge ordered TorrentSpy to log all user data. The judge ruled that TorrentSpy had to monitor its users in order to create detailed logs of their activities, and hand these over to the MPAA.

In a response to this decision - and to ensure the privacy of their users - TorrentSpy decided that it was best to block access to all users from the US. This led to a huge decrease in traffic and revenue.

This was not enough for the MPAA, who argued that TorrentSpy had ignored the court decision. The legal battle continued, and this eventually led to a preventative closure of the site by Justin, to protect the privacy of its users.

Brokep from The Pirate Bay had this to say about the closure: “Today all big torrent sites are pressured somehow. TPB has its share of pressure, however we expected it and have a legal system that is more just in cases like this. The way that the copyright lobby is going at this is totally wrong and we can’t let them win. And we won’t let them win. Today we reached a loss of a site, but it was more a person having to give up for economical reasons than anything else. The copyright lobby has their big cards - money and influence. In the long run they will have to give up as well. And when they do, I’ll go to the US and buy Justin a well-deserved beer.”

At this point it is not clear what will happen to the TorrentSpy.com domain. The domain was offered to other BitTorrent site owners, but has not yet been sold.
http://torrentfreak.com/torrentspy-shuts-down-080327/





10 Tips and Tricks for Private BitTorrent Sites
sharky

The first thing to notice when you join a private BitTorrent site is the eye-popping quality of the torrents. Each one is carefully culled, hand-picked through a strict moderation process. However, before you start hammering away on that download link - here are a few things you need to know.

On private torrent sites, everything revolves around ratios. A 1:1 ratio (or 1.0) means that you’ve uploaded exactly the same amount of data as you’ve downloaded. Thus a 0.80 ratio indicates that you’ve uploaded less than you’ve downloaded, which harms the health of the torrent. Conversely, a 3.0 ratio means you’ve uploaded 3 times more data than you’ve downloaded. Strive to achieve at least a 1.0 ratio - each site will have specific consequences for members who maintain a ratio lower than this. Attain a ratio over 1.0 and the rewards shall follow you into the P2P afterlife.
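The ratio arithmetic can be sketched in a few lines of Python; the byte counts below are made-up examples, not figures from any real tracker:

```python
# Toy illustration of how private trackers compute share ratio:
# bytes uploaded divided by bytes downloaded.

def share_ratio(uploaded_bytes: int, downloaded_bytes: int) -> float:
    """Ratio of data uploaded to data downloaded; infinite if nothing downloaded yet."""
    if downloaded_bytes == 0:
        return float("inf")
    return uploaded_bytes / downloaded_bytes

# 700 MB downloaded but only 560 MB uploaded -> ratio 0.8, below the 1.0 target
mb = 1024 * 1024
print(round(share_ratio(560 * mb, 700 * mb), 2))  # 0.8
```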

If you’re brand-new to a private site, it will be difficult in the beginning to achieve a 1.0 ratio. Luckily, users are given a ‘grace’ period to get there. Since there are so many more seeds than leechers (a total flip-flop from public BT sites), it becomes harder to upload to others, simply because there are fewer people to share with.

So why go through all the trouble to keep an honest ratio? Because deep down, you’re an upstanding denizen of file-sharing society! Aside from that shameless pat-on-the-back, good ratios offer many perks, including an upgraded account on the tracker (i.e. VIP status), higher download speeds, free “invites” for your friends, and no waiting periods associated with accounts in arrears.

Here are Ten Tips to get your ratio in top-shape as fast as possible:

1. Start out with Smaller Files

Initially, opt for smaller (i.e. under 1 GB) files for downloading. This gives you a greater chance of someone coming along after you and downloading the same torrent (and you’ll be able to upload to them). Obviously a 700MB movie file will be more appealing to other site members than a 30GB ‘Blu-Ray’ rip.

2. Jump on the ‘Newly Released’ torrents

This is a great tip for increasing your ratio in a hurry. Camp out in your favorite private BT site, and refresh the torrent listings frequently. Newly added entries will have many more leechers than seeds, so you’ll be able to share (upload) more data. To maximize this tip, select smaller files - the “TV Episode” category works great for this.

3. Select Files that have a High ‘L’ (Leecher) Number

This is important. When selecting torrents, base your initial selections on a high number of leechers (the more, the better). This will ensure you have many avenues to upload to during (and after) the transfer. When starting out on a new private BT site, we would even go so far as to say that you should download torrents that you don’t want - just start grabbing torrents that have lots of leechers. Once your ratio gets over the 1:1 (1.0) mark, delete them.

TIP: If one of your seeding torrents remains popular, leave it running in µtorrent permanently. This will always help to boost your upload ratio.

4. Avoid ‘Zero-Leech’ torrents

When you’re new to a private site, steer clear of the ‘zero-leech’ torrents - it is impossible to increase your share ratio when there are no other downloaders. When viewing a list of torrents, look for the “Leecher” column (or just “L”) and avoid anything that has a zero (“0”) in it. Once your account ratio has become relatively stable, you can snag whatever you want.

5. Leave some tasks running in uTorrent

After a torrent finishes downloading, leave the task running (as a seed) in µtorrent to increase your upload statistics. Don’t delete (or move) the files of a running task! You can, however, extract (unRAR) the files, or copy them from one place to another. In the case of a movie/video file, you’ll be able to “burn” or “extract” the *.avi file (or even play it on the PC) without affecting the seeding torrent.

TIP: Always keep a few things running as ’seeds’ in your BT client. If you notice that they aren’t uploading, replace them with newer ones.

6. Go for the ‘Freebie’ downloads

Many private sites will offer “free” torrents that won’t count against your download statistics (thus, your ratio will remain unchanged). Grab these freebies - especially when searching for torrents on a new account.

7. Use ‘Credits’ to purchase…

A popular feature among superior private BitTorrent sites is the addition of a ‘credits’ feature for account holders. Credits can be used to ‘purchase’ VIP status, increased sharing ratios and other perks. Not all sites are the same, but some credits can be acquired just from staying active in their IRC channel, or from just having the torrents available for download in your BT client.

8. Do NOT try to ‘cheat’ the Private Trackers

There are a variety of ratio cheating tips available out there, but don’t be tempted. Trackers are fairly sophisticated and ever-evolving. If you get caught cheating, you won’t even be warned - it’s a permanent ban for you and bye-bye for good.

9. Set a proper Upload Limit

Setting a proper upload limit in the BT client makes all the difference! You’ll want to set a limit high enough to maximize uploading, but not so high that it eats into your download bandwidth. The general rule is to set it at 80 - 85% of your upload capacity. To figure this out, visit www.speedtest.net and run the simple test. Results are shown in kilobits per second, so divide the result by 8 and then multiply by 0.85. This will give you the proper number in KB/s (kilobytes per second).

In µtorrent, go to OPTIONS > Preferences… > Connection and enter your upload rate. Click “Apply” and then “OK” to save the changes. While you’re in that same ’settings’ page, make sure to use a port number from the good list (e.g. 49152 - 65535).
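The conversion in tip 9 can be written out explicitly. A minimal sketch - the 85% figure is just the rule of thumb quoted above, and the function name is ours:

```python
# Worked version of the upload-limit rule of thumb: convert a speedtest
# upload result from kilobits per second to kilobytes per second
# (divide by 8), then take 85% of it.

def utorrent_upload_limit(upload_kbps: float, fraction: float = 0.85) -> int:
    """Suggested upload cap in KB/s, from a speedtest result in kbit/s."""
    return round(upload_kbps / 8 * fraction)

# e.g. a 1000 kbit/s upstream -> cap at roughly 106 KB/s
print(utorrent_upload_limit(1000))  # 106
```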

10. And if all else fails…donate

Most sites allow for monetary contributions to keep up with the server costs. If you enjoy a particular site immensely but cannot seem to be able to approach a decent sharing ratio (due to turning off your computer at night, going to work, or sharing your computer with your kids, wife, husband or siblings), think about donating. In most cases even a not-so-generous gratuity will robustly affect your account status - plus you’ll feel good about helping out the BitTorrent community.

Other Tips - Follow ‘The Rules’

Yeah, we know: you hate rules! That’s why you probably moved out of your parents’ basement. Rules are probably why some turned to P2P.

Nevertheless, rules are an important aspect to private BT sites - they ensure healthy torrents and blazing-fast download speeds for all. Each site will have their “rules” posted - the link is usually not hard to find. Below are some general rules / tips that pertain to any private BitTorrent site:

Use an ‘Accepted’ BitTorrent client

Not all private trackers are the same - and each one has different rules in regards to which BitTorrent client is on the “allowable” list. Most sites recommend µtorrent, but only specific versions (or builds) of it. If you stick with v1.6.1 or the latest version v1.7.7 (recommended), you can’t go wrong with ANY private site (avoid any versions in between these numbers). And do not use BitComet on private trackers.

Proper BitTorrent client configuration:

Many trackers recommend that you disable DHT and Peer Exchange (PEX) in your BT client’s settings. To do this in µtorrent, go to OPTIONS > Preferences… > and select the BitTorrent tab. Remove the three checkmarks that pertain to DHT and PEX (see image below):

Do not ‘Hit & Run’ a Private BT site:

A ‘Hit & Run’ (or H&R) is when someone joins a private tracker and downloads as much as they can before making off without uploading to a proper ratio. While this practice is frowned upon even on public sites such as mininova.org, it is considered deplorable on private sites. This can (and sometimes will) lead to your IP address being banned from the site - forever.

Stick within these guidelines for HAPPY Torrenting!
http://torrentfreak.com/10-tips-tric...-sites-080323/





White House Tech Policy Called 'Magical Thinking'

Bush policy draws criticism for failing to promote effective competition.
Roy Mark

No conference on technology policy would be complete without a debate on where America stands in the global competition race.

Is the pipe half full or half empty? Not surprisingly, the talk at the second annual Tech Policy Summit was decidedly mixed.

"The U.S. is still the most dynamic broadband economy in the world," said Ambassador Richard Russell, the associate director of the White House's Office on Science and Technology Policy. "As opposed to being miles ahead, though, we're only a little ahead."

But Yale Law School's Susan Crawford called Russell's position "magical thinking. We're not doing well at all." She proceeded to call the White House's effort "completely inadequate on broadband competition."

Crawford added that what America needs is "access to a general communication structure that is open with universal access," a notion Russell characterized as a "tragic mistake" that invoked an image of a single, regulated monopoly.

"More pipes into the home is the key," Russell said.

Crawford, though, said the administration has failed to promote competition through its free-market approach, noting that even the recent 700MHz spectrum auction is not likely to change the competitive landscape much, since Verizon Wireless won the best slice for wireless broadband. "The big actors [telecoms and cable companies] are running regional duopolies," she said.

Russell said there is no need for any new legislation to create a more competitive atmosphere. "People don't understand how hard it is to write legislation," he said, citing the 1996 Telecommunications Act as a prime example.

As originally passed by Congress, lawmakers envisioned the act creating competition by forcing telecoms and cable companies to share their lines at discounted prices with competitors.

"Look how that turned out. [Congress] decided to have everyone share the same line and the wire was copper," Russell said. "The administration has opposed any new legislation because you never know how it might turn out."

Joe Waz, Comcast's vice president of external affairs and public policy, blamed the failure of the 1996 legislation on "endless gaming of the system by the incumbents." M2Z Networks Chairman and CEO Milo Medin laid the 1996 act's failure at the feet of politicians. "Congress was trying to cut the baby in half. Politicians love this sort of thing," he said.

Crawford said Congress' primary mistake was imposing a "thy shall cooperate" regime on the Baby Bells and cable companies. However, they did not and proceeded to launch a blizzard of protracted litigation.

By the end of the debate, Crawford was the only member of the panel still insisting on an activist Congress to address issues such as network neutrality and network management. The other members were putting their faith in the free market.

"This is really about the rights of unborn technology," she said.
http://www.eweek.com/c/a/Infrastruct...WK_032808_SDT2





Low-Income Residents Get High-Speed Access
Katie Hafner

Last summer, when Earthlink pulled the plug on plans to build a city-wide Wi-Fi network in San Francisco, it looked like only those with the money to pay for high-speed Internet access (or with a decent laptop and a good map of free hotspots) would be able to get it.

Such bad things should happen more often.

To bridge the gap, earlier this week, the Internet Archive, a San Francisco-based non-profit organization, began offering free Internet service to a number of public housing projects where the Internet will be piped in at speeds far greater than most high-speed systems provide.

The first project to be connected is Valencia Gardens in the city’s Mission District, with 260 units; it is now up and running as a pilot. The archive expects to wire more than 2,500 units in the city over the next eight to ten months, said Brewster Kahle, the founder of the Internet Archive, which is best known for its digital archiving work.

The apartments are connected to the Internet at 100 megabits per second, a speed that contrasts sharply with the normal high-speed Internet service offered by telephone and cable companies, which is usually less than 6 megabits per second.
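The practical difference is easy to quantify with a bit of back-of-the-envelope arithmetic (the 700 MB file size here is an illustrative assumption, roughly one CD image):

```python
# Comparing the two connection speeds mentioned above.
# File size and speed tiers are illustrative assumptions.

def download_seconds(size_mb: float, speed_mbps: float) -> float:
    """Time to download size_mb megabytes at speed_mbps megabits per second."""
    return size_mb * 8 / speed_mbps

FILE_MB = 700  # e.g. a CD image

dsl_time = download_seconds(FILE_MB, 6)      # typical DSL/cable tier
fiber_time = download_seconds(FILE_MB, 100)  # the housing project's link

print(f"6 Mbps:   {dsl_time / 60:.1f} minutes")  # ~15.6 minutes
print(f"100 Mbps: {fiber_time:.0f} seconds")     # 56 seconds
```

Roughly a factor of seventeen: a download that ties up a DSL line for a quarter of an hour finishes in under a minute at 100 megabits per second.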

Mr. Kahle said the Internet Archive has achieved such high speed by working with the city of San Francisco to connect the city’s municipal fiber optic network, which runs through the public housing developments, to an Internet Archive switching center, which in turn, connects to the Internet.

“We are pleased to be the first non-profit organization to bring public housing online,” Mr. Kahle said. “We are excited to see much faster access to the Internet as a way to experiment with advanced applications, and are pleased that the underserved get first access to advanced technology.”
http://bits.blogs.nytimes.com/2008/0.../index.html?hp





Why Old Technologies Are Still Kicking
Steve Lohr

IN 1991, Stewart Alsop, the editor of InfoWorld and a thoughtful observer of industry trends, predicted that the last mainframe computer would be unplugged by 1996. Last month, I.B.M. introduced the latest version of its mainframe, the aged yet remarkably resilient warhorse of computing.

Today, mainframe sales are a tiny fraction of the personal computer market. But with the mainframe facing extinction, I.B.M. retooled the technology, cut prices and revamped its strategy. A result is that mainframe technology — hardware, software and services — remains a large and lucrative business for I.B.M., and mainframes are still the back-office engines behind the world’s financial markets and much of global commerce.

The mainframe stands as a telling case in the larger story of survivor technologies and markets. The demise of the old technology is confidently predicted, and indeed it may lose ground to the insurgent, as mainframes did to the personal computer. But the old technology or business often finds a sustainable, profitable life. Television, for example, was supposed to kill radio, and movies, for that matter. Cars, trucks and planes spelled the death of railways. A current death-knell forecast is that the Web will kill print media.

What are the common traits of survivor technologies? First, it seems, there is a core technology requirement: there must be some enduring advantage in the old technology that is not entirely supplanted by the new. But beyond that, it is the business decisions that matter most: investing to retool the traditional technology, adopting a new business model and nurturing a support network of loyal customers, industry partners and skilled workers.

The unfulfilled predictions of demise, experts say, tend to overestimate the importance of pure technical innovation and underestimate the role of business judgment. “The rise and fall of technologies is mainly about business and not technological determinism,” said Richard S. Tedlow, a business historian at the Harvard Business School.

To survive, technologies must evolve, much as animal species do in nature. Indeed, John Steele Gordon, a business historian and author, observes that there are striking similarities in the evolutionary process of markets and biological ecosystems. Dinosaurs, he notes, may be long gone, victims of a change in climate that better suited mammals. But smaller reptiles evolved and survived, and today there are more than 8,000 species of reptiles, mainly lizards and snakes, compared with about 5,400 species of mammals.

As a media technology, radio is an evolutionary survivor. Its time as the entertainment hub of American households in the 1930s and ’40s, captured in the Woody Allen film “Radio Days,” gave way to the rise of television.

TV replaced radio as the box families gathered around in their living rooms. Radio, in turn, adopted shorter programming formats and became the background music and chat for people riding in cars or doing other things at home — “audio wallpaper,” as Paul Saffo, a technology forecaster in Silicon Valley, puts it.

While television did pose a threat to movies, it also served as a prod to innovation, including failures like Smell-O-Vision but also wide-screen, rich-color technologies like Cinerama and CinemaScope. The idea — and a good one — was to give viewers a more vivid, immersive experience than they could possibly have with television.

Today movies, like other traditional media, face the digital challenge of the Internet. And Mr. Saffo is betting that after a period of adjustment and experimentation, they will make another life-prolonging adaptation.

“Technologies want to survive, and they reinvent themselves to go on,” he said.

The survivors also build on their own technical foundations as well as the human legacy of people skilled in the use of a technology and the business culture and habits that surround it. And a change in the economic environment can sometimes lead to the renaissance of an older technology. Railroads, for example, have enjoyed a revival of investment recently as rising fuel costs and road congestion have prompted shippers to move from trucks to trains; some travelers, too, have opted for railways, along routes like the Boston-New York-Washington corridor.

The weight of legacy is underestimated, according to John Staudenmaier, editor of the journal Technology and Culture, because innovation is so often portrayed as a bold break with the past. A few stories of technological achievement fit that mold, like the Manhattan Project, but they are rare indeed.

The mainframe is the classic survivor technology, and it owes its longevity to sound business decisions. I.B.M. overhauled the insides of the mainframe, using low-cost microprocessors as the computing engine. The company invested and updated the mainframe software, so that banks, corporations and government agencies could still rely on the mainframe as the rock-solid reliable and secure computer for vital transactions and data, while allowing it to take on new chores like running Web-based programs.

“The mainframe survived its near-death experience and continues to thrive because customers didn’t care about the underlying technology,” said Irving Wladawsky-Berger, who led the technical transformation of the mainframe in the early 1990s and is now a visiting professor at the Massachusetts Institute of Technology. “Customers just wanted the mainframe to do its job at a lower cost, and I.B.M. made the investments to make that happen.”

I.B.M.’s most recent model, the z10, represents an investment of $1.5 billion and the work of 5,000 technical professionals. To nurture its ecosystem, the company partners with 400 universities worldwide in programs to teach mainframe skills.

The mainframe doomsayer, Mr. Alsop, is now a venture capitalist. In retrospect, he says, his 1991 prediction was wrong only in the timing. I.B.M. has so drastically reinvented the mainframe technology and its business model that the mainframes he wrote about are long gone. “It is a different world,” he said.
http://www.nytimes.com/2008/03/23/te...gy/23digi.html





How Did Your Computer Crash? Check the Instant Replay
Michael Fitzgerald

ANYONE who uses a computer knows what it’s like to have the system crash. Crashes are the digital world’s addition to that short list of inevitables, death and taxes. But what if you could record the crash and play it back, like TiVo for software?

That idea inspired two software engineers, Jonathan Lindo and Jeffrey Daudel, to build such a product. They have succeeded, and are now moving from the niche market where they proved the idea onto a bigger stage.

System crashes and other software flaws are more than an annoyance. A 2002 study by the National Institute of Standards and Technology estimated that software flaws cost the United States economy as much as $59.5 billion a year.

For software developers, the flaws that cause crashes rank among their biggest problems, especially the ones that can’t be reproduced, like the proverbial noise in the car engine that disappears when you visit the mechanic.

Mr. Lindo says he and Mr. Daudel found themselves overwhelmed by bugs they couldn’t find while working together at an Internet start-up in 2002. “We were spending almost all of our time not fixing the issues, but trying to get to the point where we could just see the issue, and we said, ‘Wouldn’t it be great if we could just TiVo this and replay it?”’ Mr. Lindo recalls.

Innovation by analogy is a powerful concept, says Giovanni Gavetti, an associate professor at the Harvard Business School who, with his colleague Jan W. Rivkin, has published research on how businesses can use analogic reasoning as a strategic tool. Human beings are analogy machines, he notes, dealing with new information by comparing it to things they already know something about.

It would take time for Mr. Lindo and Mr. Daudel to prove that their analogy worked. They were tackling a daunting problem — in fact, friends told them that they had a great idea, but one that was probably impossible to carry out. For one thing, they had to account for everything that can affect a program, from keystrokes, mouse movements and other software applications to network traffic and programming instructions that are designed to occur randomly. (For instance, in a computer game, the villain shouldn’t always do the same thing.)

Ideally, their tool would not slow down the system as it recorded what was happening. They were also developing it for game platforms, among the most complex of software environments.

There were already programs on the market that could do things like log all the various inputs a program received. But none of them worked as the program was running, which is what developers really want, say analysts like Theresa Lanowitz of Voke, a technology research firm. In effect, that meant these products took snapshots, not the “video” that Mr. Lindo and Mr. Daudel thought was necessary.
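The record-and-replay idea can be illustrated with a minimal sketch. This is not Replay's actual design, which the article does not describe in detail; it just shows the core trick: intercept each nondeterministic input during recording, log it, and feed the log back during replay so the program takes the identical path — including the game-villain example above.

```python
import random

class Recorder:
    """Record mode: draw fresh random values and log them."""
    def __init__(self):
        self.log = []
    def next_random(self):
        value = random.random()   # nondeterministic source
        self.log.append(value)    # capture it for later replay
        return value

class Replayer:
    """Replay mode: feed back the logged values, in order."""
    def __init__(self, log):
        self._values = iter(log)
    def next_random(self):
        return next(self._values)  # identical value, identical path

def villain_move(source):
    # The "villain" picks a move using whatever randomness it is given.
    return "attack" if source.next_random() < 0.5 else "flee"

rec = Recorder()
recorded_moves = [villain_move(rec) for _ in range(5)]

rep = Replayer(rec.log)
replayed_moves = [villain_move(rep) for _ in range(5)]

assert recorded_moves == replayed_moves  # the run is reproducible
```

A real tool must do this for every source of nondeterminism at once — keystrokes, mouse events, network packets, thread scheduling — and without slowing the program noticeably, which is what made the problem so daunting.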

Eventually, they got their technology working, and in late December 2003 quit their jobs and started Replay Solutions — so named because it would replay software crashes.

Their product, ReplayDirector, works on the Xbox gaming platform and several versions of the Microsoft Windows operating system. One customer is Electronic Arts, which began using the product in the fall of 2006, according to Steven Giles, its director of online operations.

Mr. Giles says he was referred to Replay by a venture capitalist he knows. The venture capitalist was worried that the company’s software might be “smoke and mirrors,” and Mr. Giles initially felt the same way. But when he realized that it worked, he convinced a number of developer teams at Electronic Arts to license the tool. (The venture capitalist ended up investing in Replay.)

Mr. Giles says that he liked a number of things about the tool, but that one stood out: its ability to capture bugs that cannot be reproduced.

“That’s something that nobody inside or outside our industry has really been able to solve,” he says. “We refer to it almost as magic.”

He declined to say how much Electronic Arts saves by using Replay, but says that “where it’s obvious that there are savings is when you see your most senior and expensive developers working on the game rather than chasing these ghost bugs.”

Replay’s gaming software sells for $50,000 a project, with negotiated pricing for multiple projects.

There are still pieces missing from the TiVo analogy. For instance, software developers can’t yet fast-forward to see their crashes. Still, having tackled games, Replay is now expanding into new markets. Last week, it released a beta version of its next product, for developers who write code in the Java language. Versions for other markets will also start appearing later this year, and the technology should transfer to almost any kind of software environment, says Vishwanath Venugopalan, an enterprise software analyst at the 451 Group, a research firm in New York.

THE best news for business is that Replay is one of a number of innovators, big and small, aiming to improve how software is developed, says Dana Gardner, president of Interarbor Solutions, a consulting firm in Gilford, N.H. This burst of innovation, he says, reflects the increasing importance of software across the business world.

Even a clever tool like Replay, however, cannot completely eliminate system crashes. With new software applications, “you’re always trying to solve a problem nobody’s ever solved before,” notes Michael D. Ernst, an associate professor of computer science at M.I.T. who studies programmer productivity and has independently been researching the idea of “replaying” software.

New software, then, guarantees new bugs. But having a way to replay problems should make it much faster to find — and swat — those bugs.
http://www.nytimes.com/2008/03/23/te...y/23proto.html





Buyer Beware: Pointless Text File Wins 16 Software Awards
D. Scott Pinzon

An amusing story over on Successful Software.Net highlights the risky side of relying on freeware and shareware for any mission-critical purpose.

Andy Brice, a UK-based software developer, had grown suspicious about "awards" ascribed to freeware and shareware programs that he knew lacked functions and features of rival offerings. So he invented a program named AwardMeStars. It didn't run; in fact, it wasn't actually a program. It was a text file consisting solely of the words, "This program does nothing at all," and renamed as an executable (.EXE). He had a third party submit his file to just about every software aggregation site; then he sat back to watch the results.

His non-operating, do-nothing program won 16 awards. Various sites labeled it "Certified 5-Star," "Editor's Pick," and "Cool Discovery." All of them, obviously, from sites that didn't even bother to note the blatant name of the program, or try to run it even once.
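Reproducing the decoy takes a couple of lines (the filename and sentence come from the article; the result is plain text wearing an .exe extension, not a runnable program):

```python
# Recreate Brice's decoy: a plain-text file masquerading as an executable.
from pathlib import Path

Path("AwardMeStars.exe").write_text("This program does nothing at all")

# Despite the extension, the "program" is still just text:
print(Path("AwardMeStars.exe").read_text())  # This program does nothing at all
```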

What's going on? Brice surmises that the software sites award their top rating to everything submitted, in hopes that the software authors will boast of the awards on their own sites and link back to the aggregator sites -- thus, raising the aggregator site's rankings in search engines.

Small businesses in particular love to rely upon low-cost solutions, and since shareware typically comes from an author with no marketing budget, network administrators who use free tools often find themselves downloading a piece of code they've never heard of. Well, if you're relying on "reviews" and "awards" to help you judge the reputation of that freeware, move carefully. Here is one more reason you should dedicate a computer to being your test machine, keep it off your primary network -- and try before you buy. Or, in the case of freeware, enjoy before you deploy.

Kudos to Brice for sharing his story, and further kudos to Slashdot affiliate freshmeat.net for being one of the few sites to reject Brice's "program." To see the list of sub-standard sites that issue the awards, check out Brice's full story.
http://www.watchguard.com/RSS/showar...k=RSS.16Awards





A One-Stop Site Offers All the Photo Functions, From Posting Online to Red-Eye Removal
J. D. Biersdorfer

Flickr and Facebook allow you to share photos online, and desktop programs like Picasa, iPhoto and Photoshop Elements let you make the pictures look good before you upload. But starting today with its new Photoshop Express site, Adobe is putting the two together.

After signing up for the free site at www.photoshop.com/express, members can upload their images and then edit them with Adobe’s simplified set of point-and-click controls for red-eye removal, cropping, exposure, saturation and other functions. Users can group images into Web albums and post them to popular social networking sites, all from within Photoshop Express.

Each basic account at the site, which is still in a beta test version, gets two gigabytes of online storage, although Adobe soon plans to offer more services for a fee.

If you think you went too far with your photo manipulation, Photoshop Express lets you remove changes one by one — or go all the way back to the original picture.
http://www.nytimes.com/2008/03/27/te...h/27adobe.html





At Cineplexes, Sports, Opera, Maybe a Movie
Brooks Barnes

Movie theaters are not just for movies anymore.

Coming soon will be broadcasts of live baseball games, rock concerts, classic TV shows and an array of other offerings not associated with the silver screen.

From nickelodeons to drive-ins to multiplexes, American movie theaters have always evolved with the times. But the latest evolution, set off by stagnating attendance and advances in digital technology, marks the first time that movie theaters have reinvented themselves without motion pictures as the centerpiece.

“Exhibitors are heading toward showing more than just movies faster than anyone expected,” said Ted Mundorff, chief executive of Landmark Theaters, which operates multiplexes in California, Texas and New York, among other states. “Live simulcasts of sporting events or whatever won’t displace the first week of ‘Harry Potter,’ but they might displace the fifth week.”

Chains in Tennessee and New Jersey sell $25 tickets to performances of La Scala operas. AMC and Regal, two of North America’s biggest chains, have promoted concerts (Celine Dion), marathons of classic TV shows (“Star Trek”) and seasonal events (the St. Olaf Christmas Festival). On April 24, hundreds of theaters are scheduled to show highlights from the Drum Corps International World Championships.

Few think nonmovie content will supplant movies as the primary reason people trek to the multiplex. Rather, the hope is that all the niche offerings will add up to steady supplemental income.

“I love film, but the simple fact is that we can’t count on movie attendance to grow,” said Thomas W. Stephenson Jr., president of Rave Motion Pictures, which operates theaters in 11 states.

Movie attendance inched up less than 1 percent in 2007, year-over-year, after a narrow increase in 2006 and three previous years of sharp declines — even as studios pumped out a record number of blockbuster-style pictures. Movie fans bought about 1.42 billion tickets in 2007, according to Media by Numbers, delivering $9.6 billion in revenues.

As televisions get bigger and the gap between a film’s theatrical release and DVD release shrinks, exhibitors worry that attendance could slump further. Video on demand poses another threat. Piracy, meanwhile, eats away more than $600 million annually in movie ticket sales, according to the National Association of Theater Owners.

Exhibitors have long sought to come up with new ways to fill seats. Renting out auditoriums for meetings gained popularity a decade ago. And some theaters have experimented with nonmovie content for years. Screenvision, a New York company that sells on-screen advertising for more than 1,900 theaters, simulcast the 2003 MTV Movie Awards to a handful of theaters.

What is different now? The economic need is greater, and the technology needed to show live broadcasts and high-definition films is now accessible enough, and reliable enough, to make this a real market, operators say.

About 5,000 movie screens in the United States are equipped with digital projectors, up from 200 just three years ago. Within the next two years, that number is expected to be 10,000. Digital projection systems, while expensive, give theaters the ability to pull off live, high-definition simulcasts — and also open the door for 3-D presentation, something that is expected to lift their core movie business.

“We can now replicate Carnegie Hall across the country,” said Matthew Kearney, the chief executive of Screenvision.

Perhaps not exactly. But a $40 ticket to hear the New York Philharmonic play at Carnegie Hall gets patrons a balcony seat. At a multiplex, for half that price, customers would get digital surround-sound and a close-up view.

Simulcasts of the Metropolitan Opera over the last year helped turn the tide. National CineMedia, a competitor of Screenvision, said nearly 300,000 people attended screenings in 2007, which was the inaugural season; in 2008, simulcasts of Met performances in movie theaters are expected to draw upwards of a million people.

The New York Mets could not have been happier with a simulcast last August at Ziegfeld Theater in New York, where a live organist and the team mascot led viewers in singalongs as though they were in the ballpark.

“Tickets to watch the game in the theater sold out so quickly that we’re in talks to do a bunch more of them this summer,” said Dave Howard, executive vice president for business operations for the Mets.

Doug and Margarita Gibson, on the other hand, were annoyed two weeks ago during a Landmark simulcast of a Tennis Channel exhibition match between Pete Sampras and Roger Federer. The couple paid $20 for tickets only to discover that the event was not shown in high-definition as advertised. Also, a technical problem interrupted the match for 10 minutes after the first set.

“Next time, we will just stay home and watch it,” said Mr. Gibson, who asked for, and received, a refund.

Marketing is the biggest puzzle that operators need to figure out, said Jeffrey B. Logsdon, an entertainment analyst at BMO Capital Markets. Trying to contain costs, most have relied on advertising on their Web sites and in movie listings. Still, most people do not think to seek this kind of content at the movies, he said.

Consumer psychology, Mr. Logsdon says, plays as big a role in the shift as economics. Operators want people to think of theaters as vibrant, busy places. But when weekends account for 70 percent of movie ticket sales, multiplex parking lots spend a lot of time sitting empty.

“We want people to get used to coming into our building,” said Shari Redstone, president of National Amusements, the operator of 1,500 screens and the parent company of both Viacom and CBS. “It’s less ‘let’s be a movie theater’ and more ‘let’s be a community entertainment destination.’ ”
http://www.nytimes.com/2008/03/23/bu...3multi.html?hp





Regal, Like AMC, to Add Imax Screens
Brooks Barnes

The Regal Entertainment Group, which owns the nation’s largest movie theater chain, will work with the Imax Corporation to open 31 new, large-format outlets. The two companies said they would share the cost of installation and the revenue, but declined to reveal more detailed financial information.

One of Regal’s primary competitors, AMC Entertainment, announced a similar deal with Imax in December. The movie theater business, buffeted by a decade-long boom in home entertainment that has hurt attendance, is betting that Imax’s eye-popping imagery will help lure patrons to the multiplex. Currently Imax has 150 theaters in the United States, with plans for more than 300 by 2010.

The increased emphasis on Imax comes as Hollywood makes more movies available in the format, which is still struggling to expand beyond its roots in science and history museums. Several summer blockbusters will have an Imax option, including “Kung Fu Panda,” a DreamWorks Animation picture featuring the voice of Jack Black.

Regal said it will install Imax digital projection systems in 20 of its largest markets. It noted that the equipment will help improve profitability for itself and movie studios by cutting the expenses of shipping and handling film prints. The first converted theaters will open in November.
http://www.nytimes.com/2008/03/24/bu...ia/24imax.html





Video Camera Shoot-Off
Jack

The $3,200 Canon High Definition XH-A1 vs. the $169 Flip.

"I think we're right on the verge of 'ultra budget' filmmaking." - Kirk Mastin

http://kirkmastin.blogspot.com/2008/...ormats_21.html

Things are getting very interesting for independent content creators.





Amazon Kindle and Sony Reader Locked Up: Why Your Books Are No Longer Yours

If you buy a regular old book, CD or DVD, you can turn around and loan it to a friend, or sell it again. The right to pass it along is called the "first sale" doctrine. Digital books, music and movies are a different story though. Four students at Columbia Law School's Science and Technology Law Review looked at the particular issue of reselling and copying e-books downloaded to Amazon's Kindle or the Sony Reader, and came up with answers to a fundamental question: Are you buying a crippled license to intellectual property when you download, or are you buying an honest-to-God book?

In the fine print that you "agree" to, Amazon and Sony say you just get a license to the e-books—you're not paying to own 'em, in spite of the use of the term "buy." Digital retailers say that the first sale doctrine—which would let you hawk your old Harry Potter hardcovers on eBay—no longer applies. Your license to read the book is unlimited, though—so even if Amazon or Sony changed technologies, dropped the biz or just got mad at you, they legally couldn't take away your purchases. Still, it's a license you can't sell.

But is this claim legal? Our Columbia friends suggest that just because Sony or Amazon call it a license, that doesn't make it so. "That's a factual question determined by courts," say our legal brainiacs. "Even if a publisher calls it a license, if the transaction actually looks more like a sale, users will retain their right to resell the copy." Score one for the home team.

There's a kicker, though: If a court ruled with you on that front, you still can't sell reproductions of your copy, an illegal act tantamount to Xeroxing your Harry Potters. You'd have to sell the physical media where the "original" download is stored—a hard drive or the actual Kindle or Sony Reader. Our guess is that it only gets more complicated from here. What happens when the file itself resides only on some $20-per-month Google storage locker?

For more details, have a look at the original, surprisingly readable legal summary:

Quote:
The (Potential) Legal Validity of E-book Reader Restrictions By Rajiv Batra, John Padro, Seung-Ju Paik and Sarah Calvert

Many users are unhappy that e-book readers, such as the Sony Reader and the Amazon Kindle, restrict the sharing, borrowing and transferring of e-books. While some argue that the "first sale" doctrine should allow users to transfer an e-book in the same manner as a hard-copy book, these contentious restrictions may be valid under current law.

The Sony Reader and the Amazon Kindle

The Sony Reader and the Amazon Kindle are portable media devices designed to carry and display e-books and other electronic documents. Kindle has a mobile broadband function that allows users to browse online content and download e-books while on the go. The Sony Reader, by contrast, requires users to download and manage their library of e-books via a home computer.

The contentious characteristic of both products is that they bar users from sharing their e-books with other users. For example, Kindle's license agreement grants a "non-exclusive right to keep a permanent copy...solely for your personal, non-commercial use." Consequently, Kindle users may "not sell, rent, lease, distribute, broadcast, sublicense or otherwise assign any rights to...any third party." The Sony Reader has similarly restrictive language in its license, but does allow users to copy e-books to several other Readers as long as they are registered to the same account.

The First Sale Doctrine

Some users have argued that these license restrictions violate the "first sale" doctrine. Under the Copyright Act, the first sale doctrine allows the owner of a particular copy of a work to sell, lease or rent that copy to anyone they want at any price they choose. These rights only apply, however, to the particular copy that was purchased; any unauthorized reproduction or copying of that work constitutes copyright infringement. For instance, you can't give away photocopies of Harry Potter and the Deathly Hallows, but you can auction your paperback on eBay when you're finished with it.

When it comes to digital works, however, two complications arise: first, consumers might only hold a license to the content, rather than all of the rights that come from a sale; second, without a traditional physical container for each purchased work, consumers may not practically be able to sell their "particular copy" at all.

License vs. Sale

The first sale doctrine only applies to the "owner" of a copy of a work, so end users who acquire content by license do not enjoy the right to resell their copies. Whether a transaction is a license or a sale is a factual question determined by courts—even if a publisher calls it a license, if the transaction actually looks more like a sale, users will retain their right to resell the copy. However, as more commercial transactions involve the transfer of digital content—particularly commercial software—courts have struggled to consistently make the distinction between license and sale. Software is increasingly transferred with highly restrictive licensing terms, but federal case law has not clearly determined whether these types of transfers are licenses or true sales.

Kindle and the Sony Reader are following this licensing trend and creating restrictive licenses that users must agree to upon using the product. If these agreements are found to be enforceable licenses, they could serve as the legal authority to limit users from selling or otherwise transferring the e-books they download.

Amazon vs. Sony

Both license schemes are equally restrictive, but each product limits use in a slightly different manner. Amazon Kindle's use license expressly limits the extent and use of both the device and the digital media. The Sony Reader's restrictions operate in two steps: a license to use the device and a second license to use the e-book library software (created by Sony). In both devices, users are not allowed to circumvent or alter the pre-installed software on the device.

For digital media, Kindle's agreement allows users one permanent copy. The Reader, on the other hand, allows one user to possess multiple copies as long as they are all registered to that user. Both regimes are equally restrictive on the distribution, copying, and sharing of purchased e-books (to other users).

The differences in these restrictions stem from the products' technical characteristics. Amazon's wireless store requires the terms to be agreed to initially, while the Sony Reader's reliance on iTunes-like software allows a separate use agreement. In effect, both agreements accomplish the same level of restriction, but the Sony Reader gives you a little more leeway with the number of copies.

Hard Copies vs. Digital Copies

Another possible complication stems from the inherent difference between transferring an e-book and transferring a hard-copy book. The transfer of a hard-copy book is just that; the physical transfer of one copy. The transfer of an e-book, however, requires the digital recreation or copying of that e-book. Because the first sale doctrine allows transfers of only your particular copy, and not reproductions or recreations, a digital transfer of an e-book is probably impermissible. Thus, users of Kindle and the Sony Reader can only legally transmit works by selling the physical media on which they are stored—be that the e-book readers themselves or the users' hard drives.

While the restrictions on e-books may initially seem inconsistent with the rights granted for hard-copy books, these differences are the consequence of new digital products outgrowing traditional copyright doctrines. Such issues are currently being examined by legal scholars and industry insiders, but only time will tell whether this degree of control over digital media is acceptable to society.
http://gizmodo.com/369235/amazon-kin...o-longer-yours





Feds Tout New Domestic Intelligence Centers
Ryan Singel

Federal, state and local cops are huddling together in domestic intelligence dens around the nation to fuse anti-terror information and tips in ways they never have before, and they want the American people to know about it -- sort of.

Some of the nation's top law enforcement and anti-terror officials got together to hold press briefings Tuesday and Wednesday mornings at the second annual National Fusion Center conference held in San Francisco.

Homeland Security Under Secretary Charlie Allen, formerly of the CIA, described how sharing threat assessments, and even the occasional raw intel, with the new fusion centers marks a cultural shift from the Cold War era. Back then, spies treated everyone, other departments and agencies included, as suspicious.

"Things have changed remarkably in Washington. We are talking to each other," Allen said Tuesday. "I am from the shadows of the CIA where in the Cold War, we followed a different model. That model does not apply for the kinds of threats we have today that are borderless. The threats are so different and so remarkably dangerous for our citizens."

The fifty or so U.S. fusion centers are where federal, state and local cops share intelligence, sift data for clues, run down reports of suspicious packages and connect dots in an effort to detect and thwart terrorist attacks, drug smuggling and gang fighting.

Privacy and civil liberties groups are increasingly suspicious of the fusion centers, but state and local officials have complained for years that the feds don't share any useful information. The 9/11 Commission agreed, blaming the CIA and FBI's lack of information-sharing for wasted chances to stop the airline hijackings. The commission strongly urged they change their ways and put holes in so-called "stove pipes." And in 2007, the Democrats boosted fusion centers' stature and funding in the first bill they passed after taking control of Congress.

More than $130 million in federal funds has fed the development of the fusion centers in locations as diverse as Kansas and Northern California.

On Tuesday, San Francisco police chief Heather Fong said the information flow was getting better, especially around big events being held in the city.

"When we get information, it's not how much can we amass and keep to ourselves," Fong said. "It's how much information can we obtain but appropriately share so that it positively assists others in doing their jobs around the country and the world."

The dominant catchphrase from the officials was that the centers need to focus on "all threats, all hazards." That means that the fusion centers would be working on immigration, radicalization, demographic changes, hurricanes, biological and chemical threats, as well as common criminal activity. Officials say the centers must look at even the most mundane crimes, since they can be used to fund terrorism.

By way of example, Los Angeles police chief William Bratton cited the investigation of a string of gas station stick-ups in L.A. in 2005. The robbery investigation led to the prosecution of militant Muslim convicts who were planning attacks on synagogues. That, Bratton said, illustrates why these intelligence centers need to be analyzing run-of-the-mill crimes.

"Information that might seem innocuous may have some connection to terrorism," Bratton said.

But critics say that "all hazards, all threats" approach sounds suspiciously like the government is building a distributed domestic intelligence service that could easily begin keeping tabs on Americans exercising their First Amendment rights. The scope also seems at odds with the federal government's Information Sharing Environment guidelines, which say these centers are supposed to focus on terrorism.

California's Anti-Terrorism Information Center admitted to spying on anti-war groups in 2003. And Denver's police department built its own secret spy files on Quakers and 200 other organizations.

Earlier this year, the ACLU issued a warning report about fusion centers, complete with an interactive fusion center map. The report, entitled What's Wrong With Fusion Centers, cited concerns about military units operating in the centers, as well as the potential for scope creep and data mining. How, the group asked, can citizens contest information about themselves, given the patchwork of state, local and federal sunshine laws that may or may not apply?

But in a conference keynote Tuesday, Congresswoman Jane Harman (D-California), a powerful force in intelligence matters and funding, pooh-poohed the ACLU's concerns and said she supported both fusion centers and civil liberties.

"I was frustrated when I met with the [ACLU] report authors and they could not point to a single instance of a fusion center violating someone's civil rights or liberties," Harman said. "In fact, state and local laws and protections in place at many fusion centers are more rigorous than their federal counterparts."

Tim Sparapani, the ACLU's top legislative lawyer in D.C., bristled at Harman's remarks. "Our prognosticating track record in identifying programs ripe for abuse of privacy and civil liberties is pretty solid," Sparapani wrote in an e-mail that listed several other programs that the ACLU correctly raised warning flags on.

"That's not luck," he wrote. "It's a trend based on seeing the surveillance industrial complex being built bit-by-bit and terabyte by terabyte. As sure as the sun rises in the east and sets in the west, if Fusion Centers aren't built with rigid controls they will be privacy-invading monsters."

The ACLU points to Virginia, where legislators are moving to exempt their fusion center from government sunshine laws and give legal immunity to companies that report information -- such as the name of a person accosted by a private security guard for taking pictures of a skyscraper.

On Wednesday, a trio of federal privacy and civil liberties officers, including the Department of Homeland Security's chief privacy officer Hugo Teufel, promised they were working to make sure the centers respect citizens' civil liberties and privacy.

David Gersten, the director of the civil rights and civil liberties programs at DHS, said he was working to expand their training course for Fusion Center employees to "include an examination of the history of privacy and civil liberties as they relate to intelligence and criminal investigations."

That history includes the famous 1976 Church Committee report on the FBI's notorious COINTELPRO spying program. The report warned in the introduction: "Unless new and tighter controls are established by legislation, domestic intelligence activities threaten to undermine our democratic society and fundamentally alter its nature."

THREAT LEVEL asked conference attendees about the concerns over expanding the dissemination of intelligence given the continuing trouble innocent Americans have trying to get off the nation's unified terrorist watch list.

Just this week, the Justice Department's inspector general issued a watch list audit (.pdf), finding that FBI agents were watch-listing people who they weren't even investigating. Moreover, since those names were added through a back channel, there was no scheduled review or follow-up to take them off the watch list.

Leonard Boyle, who runs the Terrorist Screening Center that curates and runs the watch list, said those problems are being fixed.

"We have streamlined our processes so [...] we avoid delays in amending nominations or removing people who ought to be removed because they are no longer suspected of having a nexus to terrorism," Boyle said.

Also present at the conference was Ambassador Thomas McNamara, who now works at the Office of the Director of National Intelligence. McNamara's group is working on custom-built XML schemas, such as a standard for Suspicious Activity Reports. The idea is to have all fusion centers and intelligence agencies using the same data format, to more easily share, search, sort and store intelligence data.
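The value of a shared data format can be pictured with a toy example. This is a minimal sketch only: the element names below are invented for illustration and do not reflect the actual DNI standard, which defines its own schema.

```python
import xml.etree.ElementTree as ET

# A hypothetical Suspicious Activity Report in a shared XML format.
# Tag names here are made up for illustration; the real standard differs.
sar_xml = """<SuspiciousActivityReport>
  <ReportID>SAR-2008-0042</ReportID>
  <Agency>Miami-Dade PD</Agency>
  <Activity>photography of infrastructure</Activity>
  <Date>2008-03-24</Date>
</SuspiciousActivityReport>"""

def parse_sar(xml_text):
    """Parse one SAR document into a plain dict keyed by tag name."""
    root = ET.fromstring(xml_text)
    return {child.tag: child.text for child in root}

record = parse_sar(sar_xml)
print(record["Agency"])  # prints "Miami-Dade PD"
```

Because every center would emit the same tags, reports from different agencies could be merged, searched and sorted with a single parser, which is the point of standardizing the format.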

Surprisingly, only three reporters showed up over the two days of the conference to hear from the officials. THREAT LEVEL was the only media outlet to show up both days.

Despite journalists taking up only two of the fifty or so chairs, officials stuck with the formality of a press conference. Each day six to eight officials stood in a semicircle flanking the lectern and took turns issuing short remarks. After each set of speeches, the director of the Iowa fusion center and designated emcee Russell Porter allowed for a handful of questions from the two-reporter audience.

And as for information sharing, the conference's openness extended only so far, and the press was not allowed into sensitive sessions such as "How to Generate Suspicious Activity Reporting" and "Commanders and Analysts: Sharing Perspectives."

Government employees manning an informational booth for the Director of National Intelligence's OpenSource.gov website refused to even describe the program, saying they would need to call in a press minder.

The website seems to indicate that the program is a way for the government to share intel reports composed by analysts who read international newspapers and watch TV stations from around the world.

THREAT LEVEL guessed we would not be able to sign up for the email blasts, due to our propensity to share information with the public. The taciturn DNI employees confirmed that fact, adding that they also couldn't share the information from OpenSource.gov due to copyright issues.
http://blog.wired.com/27bstroke6/200...out-new-d.html





Overstock.com CEO: Wikipedia Has Become An Instrument Of Mass Mind-Control

Reader Adam forwarded us this bizarre email from Patrick Byrne, CEO of Overstock.com.

After announcing deals on watches and exercise equipment, the email invites readers to "Take 5 with Patrick," which involves the CEO likening Wikipedia to mind control and Wall Street corruption. Apparently the feud between Wikipedia and Overstock goes way back. Back in late 2006, someone from Overstock edited the company's Wikipedia page to read like an advertisement. This was reverted by one of the site's editors. Over the next few months, Overstock's "director of social media," Judd Bagley, used dozens of Wikipedia identities to revert Overstock's entry and harass editors. Overstock also began a site called antisocialmedia.net, run by Bagley. Fed up, Wikipedia banned all Overstock IP addresses from editing Wikipedia pages. Since then, Byrne has used his "Take 5 with Patrick" postings to disseminate articles such as "Social Media - Hijacking the Discourse," "How to Handle a Corrupt Reporter," "A Small Thing Called, the First Amendment," and "Our Corrupt Federal Regulator the SEC."
http://consumerist.com/371015/overst...s-mind+control





Will Homeland Security and the Militarized Police State Shock You Into Submission?
Karen De Coster

This is perhaps one of the most kooky and creepy Security State tactics that I have come across: the EMD safety bracelet, which is being billed as the "last line of defence." A company called Lamperd Firearm Training Systems (scroll down) is trying to commercialize this item as an "airline security product." The company’s video hawking this device talks about the current facial recognition system called biometrics, where cameras capture photos of people and compare those images to the images of "terrorists" in its "terrorist" database. No matter how sophisticated this technology is, it can all too often allow a terrorist on board a plane, and it can also create airport bottlenecks. Ahem. The solution? The "viable, workable answer" is an electronic ID bracelet. This bracelet will replace the need for a ticket and contain all necessary information about the person, and, as a bonus, it can allow the passenger to be tracked through the terminal. Crew members would be empowered with radio frequency transmitters to subdue "hijackers." The technology will override a person’s central nervous system and zap them down quicker than you can say "Homeland Security." The company assures us that being dragged through the bracelet process is a "small inconvenience in order to assure your safe arrival." In fact, its studies show that most people would "happily opt" for wearing the bracelet to "insure their own security."

Here’s the Lamperd video on YouTube - you must watch it. Here’s the patent for this device. The patent actually reads:

Upon activation of the electric shock device, through receipt of an activating signal from the selectively operable remote control means, the passenger wearing that particular bracelet receives the disabling electrical shock from the electric shock device. Accordingly, the passenger becomes incapacitated for a few seconds or perhaps a few minutes, during which time the passenger can be fully subdued and handcuffed, if necessary. Depending on the type of transmission medium used to send the activating signal, other passengers may also become temporarily incapacitated, which is undesirable and unfortunate, but may be unavoidable.

Lamperd even posts a series of letters on its website showing interest in the product for use "outside of airport security," which, of course, is the real reason for the product. Why, it can be used for border control to subdue illegal aliens or by local law enforcement agencies to control the "criminal element"!
http://www.infowars.com/?p=995





Spy-In-The-Sky Drone Sets Sights on Miami
Tom Brown

Miami police could soon be the first in the United States to use cutting-edge, spy-in-the-sky technology to beef up their fight against crime.

A small pilotless drone manufactured by Honeywell International, capable of hovering and "staring" using electro-optic or infrared sensors, is expected to make its debut soon in the skies over the Florida Everglades.

If use of the drone wins Federal Aviation Administration approval after tests, the Miami-Dade Police Department will start flying the 14-pound (6.3 kg) drone over urban areas with an eye toward full-fledged employment in crime fighting.

"Our intentions are to use it only in tactical situations as an extra set of eyes," said police department spokesman Juan Villalba.

"We intend to use this to benefit us in carrying out our mission," he added, saying the wingless Honeywell aircraft, which fits into a backpack and is capable of vertical takeoff and landing, seems ideally suited for use by SWAT teams in hostage situations or dealing with "barricaded subjects."

Miami-Dade police are not alone, however.

Taking their lead from the U.S. military, which has used drones in Iraq and Afghanistan for years, law enforcement agencies across the country have voiced a growing interest in using drones for domestic crime-fighting missions.

Known in the aerospace industry as UAVs, for unmanned aerial vehicles, drones have been under development for decades in the United States.

The CIA acknowledges that it developed a dragonfly-sized UAV known as the "Insectohopter" for laser-guided spy operations as long ago as the 1970s.

And other advanced work on robotic flyers has clearly been under way for quite some time.

"The FBI is experimenting with a variety of unmanned aerial vehicles," said Marcus Thomas, an assistant director of the bureau's Operational Technology Division.

"At this point they have been used mainly for search and rescue missions," he added. "It certainly is an up-and-coming technology and the FBI is researching additional uses for UAVs."

Safety, Privacy Concerns

U.S. Customs and Border Protection has been flying drones over the Arizona desert and southwest border with Mexico since 2006 and will soon deploy one in North Dakota to patrol the Canadian border as well.

This month, Customs and Border Protection spokesman Juan Munoz Torres said the agency would also begin test flights of a modified version of its large Predator B drones, built by General Atomics Aeronautical Systems, over the Gulf of Mexico.

Citing numerous safety concerns, the FAA -- the government agency responsible for regulating civil aviation -- has been slow in developing procedures for the use of UAVs by police departments.

"You don't want one of these coming down on grandma's windshield when she's on her way to the grocery store," said Doug Davis, the FAA's program manager for unmanned aerial systems.

He acknowledged strong interest from law enforcement agencies in getting UAVs up and running, however, and said the smaller aircraft particularly were likely to have a "huge economic impact" over the next 10 years.

Getting clearance for police and other civilian agencies to fly can't come soon enough for Billy Robinson, chief executive of Cyber Defense Systems Inc, a small start-up company in St. Petersburg, Florida. His company makes an 8-pound (3.6 kg) kite-sized UAV that was flown for a time by police in Palm Bay, Florida, and in other towns, before the FAA stepped in.

"We've had interest from dozens of law enforcement agencies," said Robinson. "They (the FAA) are preventing a bunch of small companies such as ours from becoming profitable," he said.

Some privacy advocates, however, say rules and ordinances need to be drafted to protect civil liberties during surveillance operations.

"There's been controversies all around about putting up surveillance cameras in public areas," said Howard Simon, Florida director of the American Civil Liberties Union.

"Technological developments can be used by law enforcement in a way that enhances public safety," he said. "But every enhanced technology also contains a threat of further erosion of privacy."

(Reporting by Tom Brown; Editing by Michael Christie and Eddie Evans)
http://www.reuters.com/article/newsO...29797920080326





California Backs Off Real ID – Update
Ryan Singel

For a short moment Thursday, millions of Californians were in danger of facing pat-downs at the airport and being blocked from federal buildings come May 11.

In a Tuesday letter to Homeland Security chief Michael Chertoff, the head of California's DMV said that while California had already applied for and gotten an extension on the Real ID deadline, it wasn't actually committing to complying with Real ID rules by 2010. That's when states that ask for an extension have to begin issuing driver's licenses and state IDs that comply with the federal rules.

"California's request for an extension is not a commitment to implement Real ID, rather it will allow us to fully evaluate the impact of the final regulations and precede with necessary policy deliberations prior to a final decision on compliance," DMV director George Valverde wrote.

States have until March 31 to request a two-year extension, and DHS had said before Thursday it wouldn't grant Real ID extensions to states that don't commit to implementing the rules in the future.

That made Tuesday's letter look like enough to push California into the small rebellion against the Real ID rules.

For Californians that would mean enduring the same fate facing citizens of South Carolina, Maine, Montana and New Hampshire.

They would have needed to dig out their passport, if they had one, every time they boarded a plane, or go through an extra level of TSA screening at airport metal detectors. Los Angeles and San Francisco airports could have had security lines stretching to the Sierras.

Californians would also have been barred from buying certain medicine, entering federal court buildings or getting help at the Social Security Administration, unless they had a passport.

But after Threat Level provided Homeland Security spokeswoman Laura Keehner with the letter, Keehner said California's commitment to thinking about commitment is good enough.

"For right now, there is nothing that says they will not comply with Real ID," Keehner said.

Even though California just said it might not comply with Real ID, Keehner said that's fine since there was an ongoing process that might lead to compliance.

"It is different than saying we are not complying with Real ID," Keehner said. "If they were saying that, they would not get an extension."

At issue are long-delayed rules that require states to collect, verify and store birth and marriage certificates for nearly all citizens who have state-issued licenses or identification cards.

That means almost every driver's license holder will have to get certified documents and go into the DMV to get a new license -- and many will likely have to go in more than once.

The rules also require the nation's DMVs to interconnect their systems to prevent duplicate licenses and conform to federal standards for the physical cards themselves. DHS estimates the changes will cost from $4 billion to $20 billion, but is only offering some $80 million in direct funds.

In early January when DHS unveiled the final rules, Secretary Michael Chertoff said Real ID would make the country safer.

"For about $8 per license, Real ID will give law enforcement and security officials a powerful advantage against falsified documents, and it will bring some peace of mind to citizens wanting to protect their identity from theft by a criminal or illegal alien," Chertoff said.

Maine, Montana, South Carolina and New Hampshire are fighting the mandate, saying the rules violate state rights, will cost them billions and intrude on citizens' privacy rights.

The states aren't alone. Interest groups ranging from the AARP to the right-wing Eagle Forum to the ACLU oppose the rules, and Homeland Security's own outside privacy advisers explicitly refused to endorse Real ID as "workable or appropriate" in 2007.

In February, New Hampshire asked for the extension, but also said that the request is "not an indication of our state's intent to comply with the Real ID final rule." In 2007, New Hampshire lawmakers passed, and Democratic governor John Lynch signed, a law banning New Hampshire from complying with Real ID.

So far, DHS has not accepted New Hampshire's request for an extension.

DHS says that it is committed to rejecting the rebel states' driver's licenses as acceptable proof of identification come May 11.

DHS spokesman Russ Knocke told Threat Level two weeks ago that citizens need to lay the blame for any inconveniences on their state officials and suggested the residents apply for passports now.

Keehner reiterated that there "will be real consequences for states whose leadership chooses not to comply."

For instance, showing up with a driver's license at the airport "will be the same as showing up with no license currently," Keehner said.

She added that Secretary Chertoff held a conference call today with a number of governors to talk about Real ID, and that included Governor Mark Sanford of the rogue state of South Carolina. The group is "working together on going forward," Keehner said.

Bill Scannell, a spokesman for the Identity Project which is fighting Real ID, questioned whether DHS can keep its hard line if California joins the mix.

"California has stated quite clearly they do not intend to comply with Real ID," Scannell said. "It begs the question: Will DHS be playing hardball with the big 40-ton gorilla California in the same way it has been slapping around little tiny New Hampshire? Their issues are the same."

The letter from California comes a little more than a week after Assembly member Pedro Nava, the head of the Transportation Committee in the California legislature, introduced a resolution calling on California's congressional crew to rewrite the rules because they were too expensive and privacy-invasive.

Update: This story was changed substantially after DHS spokeswoman Laura Keehner said that the letter would not lose California's extension. The original version relied on the presumption that it could.
http://blog.wired.com/27bstroke6/200...rnia-back.html





Outsourced Passport Work Risky
Bill Gertz

The United States has outsourced the manufacturing of its electronic passports to overseas companies — including one in Thailand that was victimized by Chinese espionage — raising concerns that cost savings are being put ahead of national security, an investigation by The Washington Times has found.

The Government Printing Office's decision to export the work has proved lucrative, allowing the agency to book more than $100 million in recent profits by charging the State Department more money for blank passports than it actually costs to make them, according to interviews with federal officials and documents obtained by The Times.

The profits have raised questions both inside the agency and in Congress because the law that created GPO as the federal government's official printer explicitly requires the agency to break even by charging only enough to recover its costs.

Lawmakers said they were alarmed by The Times' findings and plan to investigate why U.S. companies weren't used to produce the state-of-the-art passports, one of the crown jewels of American border security.

"I am not only troubled that there may be serious security concerns with the new passport production system, but also that GPO officials may have been profiting from producing them," said Rep. John D. Dingell, the Michigan Democrat who chairs the House Energy and Commerce Committee.

Officials at GPO, the Homeland Security Department and the State Department played down such concerns, saying they are confident that regular audits and other protections already in place will keep terrorists and foreign spies from stealing or copying the sensitive components to make fake passports.

"Aside from the fact that we have fully vetted and qualified vendors, we also note that the materials are moved via a secure transportation means, including armored vehicles," GPO spokesman Gary Somerset said.

But GPO Inspector General J. Anthony Ogden, the agency's internal watchdog, doesn't share that confidence. He warned in an internal Oct. 12 report that there are "significant deficiencies with the manufacturing of blank passports, security of components, and the internal controls for the process."

The inspector general's report said GPO claimed it could not improve its security because of "monetary constraints." But the inspector general recently told congressional investigators he was unaware that the agency had booked tens of millions of dollars in profits through passport sales that could have been used to improve security, congressional aides told The Times.

Decision to outsource

GPO is an agency little-known to most Americans, created by Congress almost two centuries ago as a virtual monopoly to print nearly all of the government's documents, from federal agency reports to the president's massive budget books that outline every penny of annual federal spending. Since 1926, it also has been charged with the job of printing the passports used by Americans to enter and leave the country.

When the government moved a few years ago to a new electronic passport designed to foil counterfeiting, GPO led the work of contracting with vendors to install the technology.

Each new e-passport contains a small computer chip inside the back cover that contains the passport number along with the photo and other personal data of the holder. The data is secured and is transmitted through a tiny wire antenna when it is scanned electronically at border entry points and compared to the actual traveler carrying it.

According to interviews and documents, GPO managers rejected limiting the contracts to U.S.-made computer chip makers and instead sought suppliers from several countries, including Israel, Germany and the Netherlands.

Mr. Somerset, the GPO spokesman, said foreign suppliers were picked because "no domestic company produced those parts" when the e-passport production began a few years ago.

After the computer chips are inserted into the back cover of the passports in Europe, the blank covers are shipped to a factory in Ayutthaya, Thailand, north of Bangkok, to be fitted with a wire Radio Frequency Identification, or RFID, antenna. The blank passports eventually are transported to Washington for final binding, according to the documents and interviews.

The stop in Thailand raises its own security concerns. The Southeast Asian country has battled social instability and terror threats. Anti-government groups backed by Islamists, including al Qaeda, have carried out attacks in southern Thailand and the Thai military took over in a coup in September 2006.

The Netherlands-based company that assembles the U.S. e-passport covers in Thailand, Smartrac Technology Ltd., warned in its latest annual report that, in a worst-case scenario, social unrest in Thailand could lead to a halt in production.

Smartrac divulged in an October 2007 court filing in The Hague that China had stolen its patented technology for e-passport chips, raising additional questions about the security of America's e-passports.

Transport concerns

A 2005 document obtained by The Times states that GPO was using unsecured FedEx courier services to send blank passports to State Department offices until security concerns were raised and forced GPO to use an armored car company. Even then, the agency proposed using a foreign armored car vendor before State Department diplomatic security officials objected.

Concerns that GPO has been lax in addressing security threats contrast with the very real danger that the new e-passports could be compromised and sold on the black market for use by terrorists or other foreign enemies, experts said.

"The most dangerous passports, and the ones we have to be most concerned about, are stolen blank passports," said Ronald K. Noble, secretary general of Interpol, the Lyon, France-based international police organization. "They are the most dangerous because they are the most difficult to detect."

Mr. Noble said no counterfeit e-passports have been found yet, but the potential is "a great weakness and an area that world governments are not paying enough attention to."

Lukas Grunwald, a computer security expert, said U.S. e-passports, like their European counterparts, are vulnerable to copying and that their shipment overseas during production increases the risks. "You need a blank passport and a chip and once you do that, you can do anything, you can make a fake passport, you can change the data," he said.

Separately, Rep. Robert A. Brady, chairman of the Joint Committee on Printing, has expressed "serious reservations" about GPO's plan to use contract security guards to protect GPO facilities. In a Dec. 12 letter, Mr. Brady, a Pennsylvania Democrat, stated that GPO's plan for conducting a security review of the printing office was ignored and he ordered GPO to undertake an outside review.

Questionable profits

GPO's accounting adds another layer of concern.

The State Department now charges Americans $100 or more for new e-passports produced by the GPO, depending on how quickly they are needed, up from about $60 in 1998.

Internal agency documents obtained by The Times show that each blank passport costs GPO an average of just $7.97 to manufacture and that GPO then charges the State Department about $14.80 for each, a markup of more than 85 percent.

The accounting allowed GPO to make gross profits of more than $90 million from Oct. 1, 2006, through Sept. 30, 2007, on the production of e-passports. The four subsequent months produced an additional $54 million in gross profits.

The agency set aside more than $40 million of those profits to help build a secure backup passport production facility in the South, still leaving a net profit of about $100 million in the last 16 months. GPO was initially authorized by Congress to make extra profits in order to fund a $41 million backup production facility at a rate of $1.84 per passport. The large surplus, however, went far beyond the targeted funding.
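As a quick sanity check on the article's figures, the markup and net-profit arithmetic can be reproduced in a few lines (all dollar amounts are the article's; the rounding is mine):

```python
# Checking the passport-profit arithmetic reported by The Times.
unit_cost = 7.97    # GPO's average manufacturing cost per blank passport ($)
unit_price = 14.80  # price GPO charges the State Department per passport ($)

markup_pct = (unit_price - unit_cost) / unit_cost * 100
print(f"markup: {markup_pct:.1f}%")  # ~85.7%, i.e. "more than 85 percent"

gross_fy2007 = 90  # gross profit, Oct. 2006 through Sept. 2007 ($ millions)
gross_4mo = 54     # gross profit, the four subsequent months ($ millions)
set_aside = 40     # reserved for the backup production facility ($ millions)

net_16mo = gross_fy2007 + gross_4mo - set_aside
print(f"net over 16 months: ~${net_16mo}M")  # ~$104M, i.e. "about $100 million"
```

The figures line up: an $6.83 spread on a $7.97 cost is the "more than 85 percent" markup, and the 16-month net after the facility set-aside lands near the $100 million the documents indicate.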

The large profits raised concerns within GPO because the law traditionally has mandated that the agency only charge enough to recoup its actual costs.

According to internal documents and interviews, GPO's financial officers and even its outside accounting firm began to inquire about the legality of the e-passport profits.

To cut off the debate, GPO's outgoing legal counsel signed a one-paragraph memo last fall declaring the agency was in compliance with the law prohibiting profits, but offering no legal authority to back up the conclusion. The large profits accelerated, according to the officials, after the opinion issued Oct. 12, 2007, by then-GPO General Counsel Gregory A. Brower. Mr. Brower, currently U.S. Attorney in Nevada, could not be reached and his spokeswoman had no immediate comment.

Fred Antoun, a lawyer who specializes in GPO funding issues, said the agency was set up by Congress to operate basically on a break-even financial basis.

"The whole concept of GPO is eat what you kill," Mr. Antoun said. "For the average taxpayer, for them to make large profits is kind of reprehensible."

Likewise, a 1990 report by Congress' General Accounting Office stated that "by law, GPO must charge actual costs to customers," meaning it can't mark up products for a profit.

As with the security concerns, GPO officials brushed aside questions about the profits. Agency officials declined a request from The Times to provide an exact accounting of e-passport costs and revenues, saying only that it would not be accurate to claim the agency has earned the large profits indicated by the documents showing the difference between the manufacturing costs and the State Department fees.

Questioned about GPO's own annual report showing a $90 million-plus profit on e-passports in fiscal year 2007 alone, agency spokesman Mr. Somerset would say only that he believes the agency is in legal compliance and that "GPO is not overcharging the State Department."

Mr. Somerset said 66 different budget line items are used to price new passports and "we periodically review our pricing structure with the State Department."

Public Printer Robert Tapella, the GPO's top executive, faced similar questions during a House subcommittee hearing on March 6. Mr. Tapella told lawmakers that increased demand for passports — especially from Americans who now need them to cross into Mexico and Canada — produced "accelerated revenue recognition," and "not necessarily excess profits."

GPO plans to produce 28 million blank passports this year, up from about 9 million five years ago.

State Department consular affairs spokesman Steve Royster referred questions on e-passport costs to GPO.

Congress to weigh in

GPO's explanations have not satisfied lawmakers, who are poised to dig deeper.

Mr. Dingell, the House Commerce chairman, said The Times' findings are "extremely serious to both the integrity of the e-passport program and to U.S. national security" and he has asked an investigative subcommittee chaired by Rep. Bart Stupak, Michigan Democrat, to begin an investigation.

"Our initial inquiry suggests that more needs to be done to understand whether the supply chain is secure and fully capable of protecting the manufacturing of this critical document," Mr. Dingell told The Times.

Mr. Stupak said that considering the personal information contained on e-passports, "it is essential that the entire production chain be secure and free from potential tampering." He added: "The GPO needs to make every effort to ensure that future passport components are made in America under the tightest security possible."

Michelle Van Cleave, a former National Counterintelligence Executive, said outsourcing passport work and components creates new security vulnerabilities, not just for passports.

"Protecting the acquisition stream is a serious concern in many sensitive areas of government activity, but the process for assessing the risk to national security is at best loose and in some cases missing altogether," she told The Times.

"A U.S. passport has the full faith and credit of the U.S. government behind the citizenship and identity of the bearer," she said.

"What foreign intelligence service or international terrorist group wouldn't like to be able to masquerade as U.S. citizens? It would be a profound liability for U.S. intelligence and law enforcement if we lost confidence in the integrity of our passports."
http://washingtontimes.com/apps/pbcs...86493/0/SPORTS





Australian WiMAX Pioneer Trashes Technology as "Miserable Failure"

Australia’s first WiMAX operator, Hervey Bay’s Buzz Broadband, has closed its network, with the CEO labeling the technology as a “disaster” that “failed miserably.”

In an astonishing tirade to an international WiMAX conference audience in Bangkok yesterday afternoon, CEO Garth Freeman slammed the technology, saying its non-line-of-sight performance was “non-existent” beyond just 2 kilometres from the base station, that indoor performance decayed at just 400m, and that latency rates reached as high as 1,000 milliseconds. Poor latency and jitter made it unacceptable for many Internet applications, specifically VoIP, which Buzz had employed as its main selling point to lure customers away from incumbent services.

Freeman highlighted his presentation with a warning to delegates, saying “WiMAX may not work.” He said the technology was still “mired in opportunistic hype,” pointing out that most deployments were still in trials, that it was largely used by start-up carriers, and that it was supported by “second-tier vendors”, in contrast with HSPA, which has 154 commercial networks already in operation and support from top-tier vendors.

What made Freeman’s presentation most extraordinary was that just 12 months ago he fronted the same event with a generally positive appraisal of the platform, which at that stage he had deployed only a few months earlier. At the time, Freeman said his company had signed up 10% of its 55,000-user target market in just two months, a share that rose to 25% on the back of an advertising campaign highlighting value VoIP prices.

He did acknowledge at the time that the technology had indoor coverage issues, which he yesterday said had earned him a quick and negative reaction at the time from his supplier, Airspan. Other early WiMAX adopters have also reported issues with indoor coverage: VSNL in India reported indoor loss at just 200m from the base station at an IEEE conference last year.

HORSES FOR COURSES

Freeman says Buzz has now abandoned WiMAX in favour of a “horses for courses” policy. This includes use of the TD-CDMA standard at 1.9GHz—used by operators such as New Zealand’s Woosh Wireless—and a platform he described as wireless DOCSIS, a relatively little-known technology that takes HFC plant and extends its capabilities via wireless mesh. He said wireless DOCSIS operates at up to 38Mbps in the 3.5GHz spectrum and that its customer premises equipment supports two voice ports for under A$70, while boasting “huge cell coverage.” He is also employing more conventional wireless mesh platforms at 2.4GHz that support up to 10Mbps, with CPE voice ports costing less than A$80.

Despite his problems with WiMAX, Freeman believes competitors should operate their own infrastructure rather than depend on Telstra unbundled or wholesale offerings. Prior to Buzz he was involved in the rollout of regional Victorian HFC networks as an executive with Neighborhood Cable. He says wireless is essential in Hervey Bay, because ADSL is unavailable to 80% of the population owing to Telstra’s use of pairgain and RIMs, while the ADSL ports that are available are now largely exhausted. But years of successive government policies had weakened the case for standalone infrastructure, beginning with restrictive policies in the pay television market which he said undermined independent HFC deployments.

“I’m against government micromanagement of the market. Government should start to provide a conducive investment environment.”

Not all WiMAX operators are unhappy.

Internode says an Airspan-supplied network is providing consistent average speeds of 6Mbps at distances up to 30km, with CEO Simon Hackett describing the platform as “proven.”

Freeman’s frank words left many at the WiMAX event looking uncomfortable, but none more so than his co-panelist Adrian de Brenni, representing Opel Networks. De Brenni, standing in for an absent Jason Horley, said little about Opel that hasn’t already been discussed, except to state that QoS would be a feature of the future Opel wholesale offering, “including voice.”
http://www.commsday.com/node/228





Sprint, Clearwire Team Up on WiMax Cable Joint Venture

Cable companies discuss funding wireless venture

Comcast Corp and Time Warner Cable Inc, the two largest U.S. cable operators, are discussing a plan to fund a new wireless company that would be run by Sprint Nextel Corp and Clearwire Corp, people familiar with the discussions said on Tuesday.

Comcast is expected to contribute as much as $1 billion for the wireless venture, which would use emerging WiMax technology for the nationwide venture, said the sources, who were familiar with the deal but not authorized to speak to the media about it.

WiMax is a largely unproven technology that promises to support Internet access at speeds up to five times faster than traditional wireless networks, and can support a range of mobile and video applications.

WiMax is a potential substitute for fixed-wire high-speed Internet that, for instance, could be offered across an entire metropolitan region.

Time Warner Cable is expected to put in about $500 million, the sources said.

These parties are trying to raise $3 billion for the joint venture. Intel Corp may contribute $1 billion, one of the sources said. Google Inc may also be a potential funding partner, the person said.

Bright House Networks, the sixth-largest U.S. cable provider, is also involved in the discussions and would provide between $100 million and $200 million for the venture, the person said.

But the talks are still in "very early stages," the sources said, and a deal may not come to fruition.

Intel, Sprint, Google and Clearwire officials declined to comment. Officials from Comcast and Time Warner Cable were not immediately available.

Sprint, the No. 3 U.S. mobile service, has said it aims to use WiMax technology to provide wireless connections to consumer electronic devices such as music players and cameras as well as cell phones and laptops.

Sprint, which has been bleeding subscribers amid customer service problems, said recently that it was reviewing its WiMax plans after it was widely criticized by shareholders for a commitment to spend $5 billion to build the high-speed wireless network based on WiMax.

Both Sprint and Clearwire have said they were looking for outside funding for their WiMax networks.

Sprint and Clearwire also said last month they were continuing to talk, even after announcing late last year that they had ditched an agreement to let customers roam between the two companies' WiMax networks.

Meanwhile, cable companies like Comcast are making a push to enter the wireless business to fend off competition from telecom giants such as Verizon Communications. These companies have been snatching traditional cable customers by offering bundled services including high-speed Internet, cell phone, land-line and video.

(Reporting by Anupreeta Das and Duncan Martell in San Francisco and Yinka Adegoke and Sinead Carew in New York; Editing by Gary Hill and Lincoln Feast)
http://www.reuters.com/article/techn...28498320080326





Google Offers New Plan for the Airwaves

Google Inc on Monday unveiled plans for a new generation of wireless devices to operate on soon-to-be-vacant television airwaves, and sought to alleviate fears that this might interfere with TV broadcasts or wireless microphones.

In comments filed with the Federal Communications Commission, the Internet leader outlined plans for low-power devices that use local wireless airwaves to access the "white space" between television channels. A Google executive called the plan "Wi-Fi 2.0 or Wi-Fi on steroids."

"The airwaves can provide huge economic and social gains if used more efficiently ...," Google said in the comments.

Rick Whitt, Google's Washington telecom and media counsel, said this class of Wi-Fi devices could eventually offer data transmission speeds of billions of bits per second -- far faster than the millions of bits per second available on most current broadband networks. Consumers could watch movies on wireless devices and do other things that are currently difficult on slower networks.

The white-space airwaves could become available in February 2009, when TV broadcasters switch from analog to digital signals. Whitt said he expects devices using white-space spectrum could be available by the end of 2009.

Shares of Google surged $27.36, or 6.3 percent, to $460.91 amid a sharp rise in U.S. stock markets. The Nasdaq composite index was up 3.3 percent.

Google sees the white-space spectrum as a natural place to operate a new class of phones and wireless devices based on Android, Google's software that a variety of major equipment makers plan to use to build Internet-ready phones.

The Silicon Valley company also said that, in general, it stands to benefit whenever consumers have easier access to the Internet. Google's primary business is selling online ads as people perform Web searches.

The FCC filing comes less than two weeks after Bill Gates, co-founder of Google rival Microsoft Corp, urged the agency to free up the white-space spectrum so it could be used to expand access of wireless broadband.

Google and Microsoft are part of a coalition of technology companies that has been lobbying the FCC to allow unlicensed use of white-space spectrum.

The group also includes Dell Inc, Intel Corp, Hewlett-Packard Co and the North American unit of Philips Electronics.

The idea is opposed by U.S. broadcasters and makers of wireless microphones, who fear the devices would cause interference.

The FCC currently is testing equipment to see if the white-space spectrum can be used without interfering with television broadcasts.

In a compromise designed to mollify some interest groups opposed to expanding use of white-space spectrum, Google proposed a "safe harbor" on channels 36-38 of the freed-up analog TV spectrum for exclusive use by wireless microphones, along with medical telemetry and radio astronomy devices. In effect, no white-space devices could use these channels.

Google said "spectrum-sensing technologies" could be used that would automatically check to see whether a channel was open before using it, thereby avoiding interference with other devices. It said such technology is already being used by the U.S. military.

Google said the enhancements "will eliminate any remaining legitimate concerns about the merits of using the white space for unlicensed personal/portable devices."

Google also said it would provide free technical assistance to other companies seeking to take advantage of white-space airwaves. This would include having Google help to maintain "open geo databases" of local channels for use by any device certified to use the spectrum.

A proposal being studied by the FCC would create two categories of users for the airwaves: one for low-power, personal, portable devices, and a second group for fixed commercial operations.

(Editing by Tim Dobbyn and John Wallace)
http://www.reuters.com/article/ousiv...00918220080324





FCC to Investigate D-Block Auction

The 9/11 Commission joins the chorus of observers questioning the auction's outcome.
Carol Pinchefsky

The Federal Communications Commission will investigate complaints that a contractor tried to extort potential bidders for a block of spectrum set aside for emergency responders, dooming the auction to failure.

The auction of the agency's D block of the 700MHz spectrum, reserved at a lower price for emergency responders and private ventures that agree to share it with them, netted a single $472 million offer from Qualcomm, well below the FCC's $1.3 billion reserve price. Several public interest groups, along with the 9/11 Commission and members of Congress, have asked the agency to investigate accusations that Cyren Call, a consultant hired to manage the handover of the D-block spectrum, demanded payment from auction participants and discouraged bids.

Kevin J. Martin, chairman of the FCC, referred the matter March 21 to FCC Inspector General Kent R. Nilsson.

Although anti-collusion laws prevent the people, businesses, and other interests involved from discussing the specifics of the auction, organizations like The Public Interest Spectrum Coalition—comprising several public interest groups, including Public Knowledge, U.S. PIRG, Free Press, and the Media Access Project—believe that the failure to meet the reserve price was not merely a problem of logistics. The 9/11 Commission also raised objections to the auction result.

In a press release, PISC referred to a letter they sent to the FCC on March 19, 2008. "[W]hile not accusing any party of wrongdoing, the letter asks the FCC to investigate whether discussions between Morgan O'Brien of Cyren Call and possible D Block bidder Frontline Wireless caused Frontline to lose financial backing and scared off other bidders." Cyren Call, the advisor to The Public Safety Spectrum Trust Corporation, manages the public use of the spectrum in the FCC's public-private partnership.

In the last few days, allegations have surfaced that Cyren Call tried to tack on $50 million in annual fees for the winning bidder to lease the spectrum—on top of the $1.3 billion minimum licensing fee. The claims may be a whispering campaign against the company, fact, or something in between.

Representative Edward J. Markey, D-Mass, chairman of the House Subcommittee on Telecommunications and the Internet, also called for an investigation into why the auction failed to meet its reserve price. When asked if the hearing would address alleged interference by Cyren Call in the auction, Jessica Schafer, communications director for Representative Markey, said, "We're looking at several issues all related to this auction. If there are issues about tampering, I assume they would come up."

Schafer said, "The point of the hearing is we don't know what the problem was, why we didn't manage to meet the reserve price. … The point of this hearing is we need to figure out what were the concerns of all players. We want to find out what happened here and how we can make it a successful auction."
http://www.eweek.com/c/a/Government/...Block-Auction/





Bill Criminalizing WiFi Leeching Shot Down, and Rightly So
Eric Bangeman

If you use someone else's WiFi signal without permission, you're a thief. That's the conclusion of a bill introduced into the Maryland General Assembly last week. Sponsored by Delegate LeRoy E. Myers, Jr., the legislation would criminalize the unauthorized use of a wireless access point in the state; it has since received an "unfavorable report" by the House Judiciary Committee, which all but dooms its prospects of passage.

The bill's purpose is to prohibit anyone from accessing a wireless access point "intentionally, willfully, and without authorization." It appears to ban both everyday WiFi leeching and using an open access point for more nefarious activities. And if you live in a Maryland neighborhood with broadband usage caps, it would be illegal to deliberately cause someone to overuse their bandwidth allotment. Violators would be subject to fines of up to $1,000 and three years in jail, unless they tried to hack into a password-protected system or used their unauthorized access for mischief. Then the penalties could climb to $10,000 and 10 years in prison.

With 802.11b/g/n technology becoming widespread over the past several years, there has been a corresponding rise in cases of people being arrested for using open access points without permission. One of the first came in 2005, when a Florida man was convicted of accessing a computer network without authorization. He was arrested and charged with the third-degree felony after the WAP's owner discovered him surfing in an SUV outside the owner's home.

There have been other cases, including an Illinois man who pleaded guilty in 2006 to remotely accessing a computer system without permission and a Michigan man who parked his car in front of a café and used its free WiFi service—without even ordering so much as a latte. Taken together, the cases point to a troubling trend of law enforcement and politicians overreacting to what is generally a harmless activity.

In the case of Delegate Myers, the legislation was prompted by his neighbor's use of Myers' WiFi connection without permission, according to Maryland newspaper The Herald-Mail. Apparently, Myers hadn't bothered to password-protect his WAP or take rudimentary security precautions. The bill's introduction suggests that Myers would rather see the matter legislated than learn how to set up wireless security.

Sanity appears to reign in Maryland, however. The Maryland public defender's office is opposing the bill, noting the widespread availability of WiFi networks. "A technically unsophisticated user, such as a visiting parent, or simply a houseguest unfamiliar with the home's Internet could and probably would choose the first available network," the office noted in a filing. The public defender makes the obvious point (one that seems to have escaped Myers) that those wanting to prevent unauthorized access should enable routers' built-in security functions. It's so easy that there's no excuse for not doing it.

Of course, some people don't mind sharing their WiFi connections. In a piece I did earlier this year on the ethics of using unprotected wireless access points, I mentioned a friend who left his WAP unsecured as a public service. If people don't bother to protect their networks, there's no way of telling whether or not they want others to use them. Using an open WAP to cause harm should be punished in the same way as other nefarious hacking activities; using one to check e-mail, surf the 'Net, or read RSS should not be punished at all. Thankfully, the Maryland House Judiciary Committee appears to have more common sense than Del. LeRoy Myers. That said, we probably haven't seen the last of such bills.
http://arstechnica.com/news.ars/post...ightly-so.html





Hazards of Wifi
Dale Dougherty

Our town, Sebastopol, passed a resolution in November to permit a local Internet provider to offer public wireless access. This week, fourteen people showed up at a City Council meeting to claim that wireless causes health problems in general and to them specifically. These emotional pleas led the Council to rescind its previous resolution.

So, a few people in this town strongly believe a wide variety of problems are caused by low-frequency electromagnetic radiation (EMF). Some label the problem "electromagnetic hypersensitivity," or EHS. Here's the Wikipedia entry on "Electrical Sensitivity." It reports that the World Health Organization found that "there is no scientific basis for the belief that EHS is caused by exposure to electromagnetic fields."

An online petition collected 235 "signatures" opposed to public Wifi in Sebastopol. The petition reads: "The convenience of this technology does not warrant the increase in radiation and the potential risks to the health of our community."

Here's a typical comment from someone signing the petition:

I have had health challenges, and my body cannot handle wifi...it gives me headaches and makes me very sick. I would be unable to go to the store, shop. I have enough problems being limited in my travels, it is outrageous that a place so environmentally conscious would create this in our/my hometown. In Europe they are much more advanced than us, and there wifi is not allowed in cities in the European commonwealth.

The person organizing the petition believes that people don't understand the harm that electromagnetic radiation and basic electricity are doing to them. On a local bulletin board, the opponents cite bioinitiative.org.

One person writes:
We are urged us to switch from regular incandescent light bulbs to compact fluorescent (CFL) light bulbs, to save energy. However, there is a very good reason NOT to use CFL bulbs. They create electromagnetic frequencies proven to be extremely detrimental to human health.

Others write about living without electricity, except for the brief period they are using their computer to write messages in support of the petition.

One can see the fear spreading. Science should be a way to dispel such fears but it is clear with this group of people that science cannot be trusted. They put forth the idea that science should be able to prove that there is no harm and therefore eliminate any risk, and without such proof, we should not move forward. They use this logic to recommend a "precautionary" approach, which is their keyword for a "know-nothing, do-nothing" approach.

Yet another person writes:
Research is increasingly showing a correlation between adverse health symptoms and emf radiation exposure. Local and national governmental bodies in other western countries are paying attention and are beginning to legislate limits to exposure to wi-fi radiation by prohibiting it in certain locations. The trend towards increasing international concern is clear. Why are we so sanguine in this country?

Of course, the research is not specified. I can't find much about governments banning wifi, except a college in Ontario and a European directive on radiation that threatens to eliminate MRI scans. The article "Wifi Woo" on Junkscience.blogspot.com is interesting.

The effect of the resolution would have been to add a few wireless access points downtown. There are already several hundred in private homes and businesses in town. The same people who oppose public wifi still walk along streets and into buildings where they are invisibly bathing in wifi. Will this small group of people now demand that we outlaw wireless in public areas, just to accommodate their fears?

Now, I don't know that wireless (or electricity) is without harm. I can read the research that does exist and learn more -- if I have the time and reason to do so. However, I do not like the smell of fear, and when people justify actions based on their own fears, I become suspicious that the concern is unwarranted. If it wasn't wifi, it would be fluoride. Something is needed to affix their anxiety to. I can only be glad that they weren't alive when the city decided on electrification a century ago.

I plan to write an editorial for our local paper. I'd appreciate hearing from you on this issue if it has come up in your community.
http://radar.oreilly.com/archives/20...s-of-wifi.html





Analysis: Patent Reform Bill Unable to Clean up Patent Mess
Timothy B. Lee

Fixing serious flaws

Last September, the House of Representatives approved the Patent Reform Act of 2007, legislation that would make important changes to America's patent system. With the legislation being fiercely debated behind closed doors in the Senate, Ars takes a closer look at the legislation's provisions, the major players in the debate, and the legislation's prospects for curing what ails the American patent system.

There are formidable forces arrayed on both sides of the patent debate. On one side are major technology firms who are concerned that the explosion of dubious patent litigation threatens their bottom line. Their interest has been piqued by recent eye-popping awards to patent trolls. These companies want to make it more difficult for patent holders to extract large payouts from deep-pocketed firms. But at the same time, most hold large patent portfolios of their own, and are wary of changes that could cut too deeply into their licensing revenues.

On the other side are a variety of interest groups who benefit from the current patent system and are wary of changes that might reduce the profits of patent holders. These include the pharmaceutical industry, which feels well-served by existing patent rules. It also includes the patent bar, patent examiners, and other groups that are invested in the current process.

Despite the heated rhetoric on both sides, it is unclear if the legislation will do much to fix the most serious flaws in the patent system. A series of appeals court rulings in the 1990s greatly expanded patentable subject matter, making patents on software, business methods, and other abstract concepts unambiguously legal for the first time. The result has been a flood of patents of broad scope and dubious quality. With one very minor exception, none of the proposals being debated on the Hill would address these changes.

As a result, it may be hard for advocates for fundamental reform of the patent system to get too excited about this year's debates on the Hill. Although the legislation includes provisions that are likely to moderately reduce the toll that patents take on high-tech innovation, none of the proposals address the fundamental problems that have cropped up in recent years. Opponents of software patents, in particular, will find the provisions of the Patent Reform Act underwhelming. Their best hope is that the Supreme Court tackles the issue in the coming years. If that doesn't happen, then they will likely need to wait for the situation to deteriorate further before there will be sufficient political will for serious reforms.

Damage control

Probably the most important changes under consideration are reforms that would reduce the size of damages that could be awarded to successful plaintiffs in patent lawsuits. Two features of present patent laws have contributed to the astonishing size of recent damage awards. One is that when a patent covers one component of a complex product, courts have sometimes awarded damages based on the value of the entire product, even if the patent covers only a small aspect of that product. The other is the ability to win treble damages if patent infringement is found to be "willful." Both of these provisions increase the potential payoff to patent holders, encouraging more patenting and more patent litigation.

On the issue of apportionment of damages, both last fall's House legislation and the pending Senate bill dictate that when a patent covers only a portion of an invention and an appropriate royalty cannot be computed from existing licensing agreements, damages should be calculated based on "the economic value of the infringement attributable to the claimed invention's specific contribution over the prior art." Advocates hope that this provision will prevent the holder of a patent on one part of a complex system from holding the entire system ransom.

Under patent law, infringement damages can be trebled if the court finds that infringement was done willfully. But critics charge that this rule causes many firms—especially in the software industry—to avoid looking at patents at all out of fear of heightened liability. Recent court decisions have reduced this problem somewhat, but both bills would go further by limiting findings of willful infringement to cases where the infringer had received specific written notice from the patentee, the infringer intentionally copied from the patent, or the infringer continued to infringe after losing in court. This would go a long way toward ensuring that software firms could conduct patent searches without fear of heightened liability.

Procedural reforms

The legislation would make a variety of changes to patent procedures that would be a big deal for patent lawyers, but may not have significant effects outside the patent bar. Possibly the most important change would be the creation of a new system for challenging patents after they've been granted. The patent office has had a limited reexamination process since 1981, but it enjoys broad discretion over which patents to review. The Patent Reform Act would create a new "post-grant review" process that can be initiated by third parties. This process could provide a new avenue for striking down bad patents without the need for full-blown patent litigation.

Probably the most discussed procedural change is the switch from a first-to-invent to a first-to-file rule for granting patents. In most countries, the first person to file a patent application generally receives the patent, unless it can be shown that he copied the idea from another applicant. In contrast, the United States grants the patent to the first person to have developed the invention claimed in the patent, even if she wasn't the first person to file for the patent. This can cause administrative headaches for the patent office because it must sometimes conduct elaborate investigations to ascertain who was the true first inventor (although such "interference" proceedings occur for only a small fraction of patent applications). The Patent Reform Act would, for the first time, institute a first-to-file rule for granting patents, bringing the US into line with the rest of the world.

Another major procedural reform relates to disclosure. Traditionally, the patent application process has been secret, with only the applicant and Patent Office officials having access to pending applications. In 1999, Congress for the first time required disclosure of pending patent applications 18 months after they were filed. But this requirement had an important limitation: it applied only to patents that were simultaneously being sought overseas, where such disclosures are common. The House and Senate bills drop the rule about foreign filing and require that, in most cases, applications should be disclosed within 18 months.

Trolls love purple

A final key procedural reform relates to venue shopping. Certain jurisdictions—especially the Eastern District of Texas—have become notorious for being sympathetic to plaintiffs in patent cases. As a result, an inordinate number of patent cases have found their way to East Texas courtrooms. The Patent Reform Act would rein in this abuse of jurisdictional rules by restricting plaintiffs' options when filing a patent lawsuit. While the House and Senate versions differ slightly, both would require that patent litigation occur in a location where either the plaintiff or the defendant has a significant presence.

The legislation does almost nothing to rein in the Federal Circuit's increasingly permissive attitude toward patents on abstract concepts like software, business methods, and mental processes. Only one provision of the Patent Reform Act addresses this issue: the House bill includes a prohibition on patents for tax planning methods.

Evaluating the Patent Reform Act

To help us understand the implications of the Patent Reform Act, Ars talked to James Bessen, co-author of a recent book on the patent system. Bessen told Ars that the most important changes currently under consideration on the Hill are the post-grant review process, the apportionment of damages, and willful infringement. But he argued that none of these changes will address the serious flaws that plague the patent system. Rather, their most important effect will be the signal they send that the IT industry is finally taking patent issues seriously and organizing to tackle the problem in Congress. He predicted that Congress will need to revisit the issue in the coming years as patent problems continue to mount.

Bessen had three major suggestions to offer to Congress once it gets serious about fundamental reform. First, there need to be much stronger limits on the patenting of abstract ideas, including software and business methods. Bessen argued that narrowly construing patent scope will significantly improve the situation by preventing patent holders from using a single, broad patent to harass entire sectors of the technology industry.

Second, Bessen said that dramatically increasing maintenance fees would give patent holders a strong incentive to let unimportant patents lapse. That would reduce the total number of patents in effect and make it easier for potential innovators to review the remaining patents for possible infringement.

Finally, Bessen told Ars that the law needs to have protections for inadvertent infringers. Under current law, in most cases there are no legal protections available to a firm that independently develops a technology that is covered by an existing patent. An independent invention defense to patent infringement would be the strongest possible reform in this direction. A less ambitious approach would be to reduce damages in cases where independent invention can be demonstrated.

Bessen was pessimistic that any of these reforms would prove politically feasible in the short term. The interest groups that now profit from the expansion of patent scope and the resulting litigation won't accept such changes without a fight. On the other hand, as the problems with the patent system become more and more obvious, the momentum for reform may grow to the point where serious reforms become possible.

Prospects for passage

Indeed, it remains to be seen if even the modest reforms of the Patent Reform Act will survive the Congressional sausage factory. The legislation has the backing of the IT industry, but it is opposed by the pharmaceutical industry, the patent bar, and some labor unions. Negotiations have dragged on for months, as these competing interest groups have spent millions on lobbying and public relations campaigns. Earlier this month, supporters of the legislation were predicting an imminent compromise on the bill's key provisions, but no announcement has been forthcoming and insiders are now predicting that the legislation's backers will be lucky to see a vote in April.

The uphill climb faced by even this modest patent reform legislation suggests that momentum for serious reform is unlikely to originate on Capitol Hill. Two major developments could bring about serious reform. One would be continued interest in the subject from the Supreme Court, which has already heard several patent cases and will rule on another case this term. In the oral arguments in last year's AT&T v. Microsoft case, Justices Breyer and Stevens both raised questions about the patentability of software. The End Software Patents Project is focusing much of its efforts on laying the groundwork for a challenge to the patentability of software should an appropriate case come before the Supreme Court.

The other development that could add momentum to patent reform efforts is if abusive patent litigation continues to spiral out of control. Thus far, the coalition for patent reform has been somewhat disorganized, as many technology firms have been split between their desire for protection against patent trolls and their desire to collect royalties on their own patents. But a few more nine-figure jackpots for patent trolls could help focus their minds. As the costs of the patent system continue to soar, Silicon Valley's heavyweights may become more sympathetic to more sweeping reforms, and more willing to put more effort into making them happen.
http://arstechnica.com/articles/cult...atent-mess.ars





IP Hypocrisy: US Likes WTO Rulings Only When it Wins
Nate Anderson

The US likes to call out other countries for not being tough enough with intellectual property rules, and it tosses countries like Russia, China, and even Israel onto "watch lists" and "priority watch lists" in an attempt to force changes. But the US comes in for its share of IP-related criticism from other countries both small and large, too. When it happens, though, we're not nearly so quick to change our ways.

Two ongoing cases illustrate the point. First, the European Union is pushing for the US to change a pair of rules that it calls "long-standing trade irritants." Despite World Trade Organization rulings against it several years ago, the US has yet to fix either case. Ambassador John Bruton, who represents the EU in the US, said in a statement late last week that he wants to see the matters resolved.

"As the stakes continue to grow in the intellectual property arena, the US should not weaken its voice in the debate by ignoring treaty obligations and WTO decisions," Bruton said. "American delay on fixing the 'Irish Music' and 'Havana Club' cases diminish the arguments that both the US and EU countries have against China and other countries that continue to tolerate widespread intellectual property rights infringement."

The so-called "Irish Music" dispute concerns the portion of US copyright law that lets restaurants and shops play broadcast music without compensating the copyright holders. As previous coverage of this issue shows, Europe takes a fairly hard-line stance on these payments; a UK car repair chain was even targeted by collecting societies because its mechanics played their radios loud enough that customers could hear them.

The WTO ruled against the US in 2000, and the US responded not by changing its laws, but by making payments directly to European collecting societies. Now, those payments have lapsed but the US law remains.

The Havana Club issue stems from the long-standing US effort to impose sanctions on Cuba. The US reduced the rights of US companies that owned trademarks "which previously belonged to a Cuban national or company expropriated in the course of the Cuban revolution." Such marks, including "Havana Club Rum," were no longer protected in the US, could not be renewed, and could not be enforced. This has made EU companies that have invested in Cuban business less than pleased, especially since they believe that US drink makers were behind the rule change.

The US also lost a WTO ruling on the matter and was to have complied by 2005. To date, it has not.

The second case concerns Antigua and Barbuda, a small Caribbean country home to all sorts of online vices, including gambling and DRM circumvention. Antigua took the US to the WTO years ago over charges that the US was unfairly criminalizing access to Antiguan gambling websites while still allowing US-based horse racing sites to function. The WTO ruled against the US on several occasions, including a 2007 ruling that found the US had not yet bothered to comply with previous rulings.

In retaliation, Antigua has announced a radical plan in an attempt to force its larger neighbor to play ball. The Antiguan government has recently stated that it will allow piracy of US intellectual property such as movies and music unless the US changes its gambling policy (though this move has yet to be authorized by the WTO).

Apparently, it's easy to get hot and bothered when it's industries from your country that claim to be badly affected by rules elsewhere. When it comes to the claims of other countries, though, even claims that have been validated by the WTO, it's much easier to see the complexity of the situation, to spend years arguing those complexities before judges, and to do nothing even when compelled by rulings.

This sort of behavior makes it that much harder to assert some kind of moral high ground when China, Russia, and others pick and choose which of their WTO obligations they are going to comply with.
http://arstechnica.com/news.ars/post...n-it-wins.html





Wikileaks Releases Over 120 Censored Videos and Photos of the Tibet Uprising

Today Wikileaks released over 120 censored photos and videos of the Tibet uprising and has called on bloggers around the world to help drive the footage through the Chinese internet censorship regime -- the so-called "Great Firewall of China".

The transparency group's move comes as a response to the Chinese Public Security Bureau's carte-blanche censorship of YouTube, the BBC, CNN, the Guardian and other sites carrying video footage of the Tibetan people's recent heroic stand against the inhumane Chinese occupation of Tibet.

Wikileaks has also placed the collection in two easy-to-use archives together with an HTML index page so they may be easily copied, placed on websites, e-mailed across the internet as attachments and uploaded to peer-to-peer networks.

Censorship, like communism, seems like a reasonable enough idea to begin with. While 'from each according to his ability and to each according to his need' sounds unarguable, the world has learned that these words call forth a power elite to administer them with coercive force. Such elites are quick to define the needs of their own members as paramount. Similarly 'from each mouth according to its ability and to each ear according to its need' seems harmless enough, but history shows that censorship also requires an anointed class to define this "need" and to use violence against those who continue talking. Such power is quickly corrupted.

The first ingredient of civil society is the people's right to know, because without such understanding no human being can meaningfully choose to support anything, let alone a political party. Knowledge is the driver of every political process, every constitution, every law and every regulation. The communication of knowledge is without salient analogue. It is living, unique and demands its rightful place at the summit of society. Since knowledge is the creator and regulator of all law, its position beyond law commands due respect.

James Madison, Thomas Jefferson and other Enlightenment framers of the US Bill of Rights understood this well when they began the First Amendment's constitutional protections of speech and of the press with 'Congress shall make no law....'.

As knowledge flows across the world it is time to sum great freedoms of every nation and not subtract or divide them. Let us then unite in common purpose for the surest way to protect the freedoms of any nation is to protect the freedoms of every nation.
http://wikileaks.org/wiki/Wikileaks_...Tibet_uprising





BBC Website 'Unblocked in China'

People in China are able to access English language stories on the BBC News website in full, after years of strict control by Beijing.

The Communist authorities often block news sites such as the BBC in a policy dubbed the "great firewall of China".

But BBC staff working in China now say they are able to access news stories that would have been blocked before.

However, the firewall remains in place for Chinese language services on the website and for any links in Chinese.

'Without hindrance'

Beijing has never admitted to blocking access to BBC news stories - and there has been no official confirmation that the website has been unblocked.

But Chinese users trying to access pages on the site had almost always been redirected to an error message telling them: "The connection was reset."

It now appears that this is no longer the case, and access to the site is much easier.

Steve Herrmann, editor of the BBC News website, says this is a welcome development.

"We want BBC News to be as accessible in China as anywhere else in the world," he said.

"We will endeavour to continue working with the Chinese authorities to improve our access in other areas."

Technology experts say such a development would not be possible without the approval of internet service providers - which are under strict supervision by Beijing.

Surprised readers

Statistics show that traffic to the website from China has been much higher than usual.

Typically fewer than 100 people read stories from Chinese computers - but on Tuesday that figure jumped to more than 16,000.

And comments have been flooding in to BBC forums from all over China.

Ross Brown in Qingdao, Shandong province, wrote: "We were just discussing with some of our Chinese colleagues about the fact BBC website was blocked, went on to show them and we see this latest news. Excellent news."

Many of the comments came from readers of the website who have spent years accessing the stories by linking their computers to others based outside of China.

Tibet difficulties

The Chinese authorities had promised to give foreign journalists more freedom in the run-up to this summer's Olympic Games.

But analysts say that recent outbreaks of unrest in Tibet have made this promise more difficult for Beijing to uphold.

The BBC and other media organisations still find reporting from Tibet very difficult - foreign journalists were refused permission to enter the region during the recent protests.

The websites of UK newspapers the Times and Guardian - as well as video-sharing site YouTube - were blocked or partially blocked during the unrest.

This week the Chinese government has arranged a trip for foreign media organisations to Tibet - but the BBC's request to be included was rejected.

Although no official ban has ever been announced, the BBC News website has been blocked for almost a decade.

The Chinese language website has been blocked since its launch in 1999.


Are you in China? What is your reaction to this story? Is this your first time reading the BBC News website?


Name:
Email address:
Town and Country:
Phone number (optional):
Comments:

http://news.bbc.co.uk/go/pr/fr/-/2/h...ic/7312240.stm





Germany's Top Court Curtails Disputed Data Storage Law

Critics feel the law would make huge amounts of data vulnerable to misuse

In a blow to Berlin's efforts to boost anti-terrorism measures, Germany's highest court on Wednesday, March 19 blocked parts of a sweeping data-collection law that had prompted large protests by civil liberties campaigners.

Germany's constitutional court on Wednesday severely curbed parts of a wide-reaching and highly controversial data collection law that requires telecom companies to store telephone and Internet data for up to six months, dealing a setback to government efforts to fight terrorism.

The law, which went into effect in January, gave the federal government broad access to data including e-mail addresses, the length of calls, the numbers dialed and, in the case of mobile phones, the location calls are made from.

Fears of data misuse

The court in Karlsruhe on Wednesday declared that parts of the law were unconstitutional pending review. It ruled that the data may continue to be stored by telecom companies but that details may only be transferred to investigators in the event of inquiries into serious crime and only with a warrant. In cases of less serious crime, investigating authorities may only access the data subject to a final decision by the top court, judges said.

The legislation is part of an EU directive formulated in response to bomb attacks in Madrid and London in past years.

Demonstrators in Hamburg staged a mock funeral marking the "Death of Privacy"

In recent months it has sparked large protests by German civil liberties campaigners, privacy advocates and opposition politicians, who argue it would make huge amounts of personal data vulnerable to misuse and place everybody under general suspicion of being a terrorist.

Critics are also concerned that the data may be handed over to foreign intelligence agencies and could be used by entertainment companies to pursue and prosecute people who illegally download music and videos.

On Dec 31, over 30,000 opponents of the law filed a class-action suit, the biggest of its kind in Germany, against the bill.

"A serious blow to the government"

Wednesday's decision was the latest in a series of rulings against tighter security measures by Chancellor Angela Merkel's government.

Last month, the constitutional court established a new "fundamental right" for German citizens to surveillance-free computer systems. In recent years, the court has set limits on bugging homes, trawling electronic databases and monitoring computers of suspected criminals.

The decision was hailed by opposition politicians and privacy advocates.

"The grand coalition should finally draw the lesson of these verdicts and stop crossing the limits of constitutionality on citizen rights," said Claudia Roth of the opposition Greens party.

"This is a serious blow to the federal justice ministry which until recently has been trying to play down the violation of basic rights of citizens by the data collection bill," said Joerg van Essen from the opposition free-market liberal Free Democratic Party (FDP).
http://www.dw-world.de/dw/article/0,...203058,00.html





China's Battle to Police the Web

Web users in China are able to view the BBC News website for the first time in years. So how does the so-called great firewall of China work?
Darren Waters

It is not clear why China's net population, the world's largest, is suddenly able to view the BBC News website after years of being blocked. Nor is it clear how long the access will continue.

But what is certain is that China's authorities have dynamic control of what their citizens can and cannot access.

Most countries that block or filter the internet do so on a site-by-site basis. For example, Pakistan blocked YouTube recently by telling Internet Service Providers (ISPs) in the country to redirect traffic whenever someone typed in the address for the popular video sharing site.

By deliberately rewriting the net address books inside Pakistan, authorities were able to redirect traffic.

But this is a blunt method of filtering, and it relies on the authorities actively tracking the websites they want to ban.
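The DNS-level redirection described above can be sketched in a few lines. This is purely illustrative: every hostname and address below is hypothetical, and no real blocklist or resolver is reproduced.

```python
# Toy model of DNS-based blocking: queries for banned names are answered with
# a substitute address instead of the genuine one. All data here is made up.
from typing import Optional

# The resolver's genuine records (hypothetical addresses).
REAL_RECORDS = {
    "youtube.com": "208.65.153.1",
    "news.example.org": "93.184.216.34",
}

# Names the authorities want blocked, mapped to a redirect address.
BLOCKED = {
    "youtube.com": "10.10.10.10",
}

def resolve(hostname: str) -> Optional[str]:
    """Answer a lookup, substituting the redirect address for banned names."""
    if hostname in BLOCKED:
        return BLOCKED[hostname]        # the user is silently sent elsewhere
    return REAL_RECORDS.get(hostname)   # normal resolution
```

Because the substitution happens at every ISP's resolvers, users inside the country simply never learn the real address, which is exactly why the method is both simple and easy to defeat with an outside resolver.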

China does not block content or web pages in this way. Instead the technology deployed by the Chinese government, called Golden Shield, scans data flowing across its section of the net for banned words or web addresses.

There are five gateways which connect China to the internet and the filtering happens as data is passed through those ports.

When the filtering system spots a banned term it sends instructions to the source server and destination PC to stop the flow of data.
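As a rough sketch, the scan-and-reset behaviour described above amounts to something like the following. The banned terms are placeholders, and a real gateway would forge TCP reset packets to both endpoints rather than return a string; Golden Shield's actual logic is of course far more elaborate.

```python
# Toy model of keyword filtering at a gateway; the term list is hypothetical.
BANNED_TERMS = [b"banned-phrase-one", b"banned-phrase-two"]

def inspect(payload: bytes) -> str:
    """Return the gateway's verdict for one chunk of passing traffic."""
    lowered = payload.lower()
    if any(term in lowered for term in BANNED_TERMS):
        # A real filter would send forged TCP resets to the source server and
        # the destination PC here, silently killing the connection.
        return "reset"
    return "forward"
```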

Amnesty International has accused net giants Cisco and Sun Microsystems of actively assisting with the development of censorship and surveillance systems in the country.

Both firms have rejected the accusation and have said the equipment they sell to China is no different from products sold in other countries.

The dynamic nature of filtering in China gives the government more control over content and means the authorities can react to news events.

Oppressive regimes

It has been called "just-in-time filtering" and is being employed more widely around the world in oppressive regimes.

It allows authorities to block access to information around key events like elections, demonstrations etc.

Security researchers believe this form of filtering was employed on YouTube in China during the recent unrest in Tibet.

In January last year, President Hu Jintao reportedly ordered officials to regulate the internet better and "purify the online environment" ensuring that online information is "healthy" and "ethically inspiring".

This was followed by a new wave of censoring certain websites, blogs and online articles.

But there have been well-documented ways to by-pass China's firewall.

One method involves connecting to a friendly computer outside China and using it as a proxy, to access websites that are banned.

China cannot block every computer outside its borders so this method has proved popular with citizens wanting unfettered access to the net.

The problem has been in informing users in China of the IP address, the unique number of every device online, of the machines willing to act as proxy servers.

E-mail has been one method of alerting people; however, China is believed to have 30,000 people who routinely scan e-mails for this kind of information.

Organisations in the US and elsewhere have been working on technology to make the process of finding friendly computers easier.

The University of Toronto's Citizen Lab has developed software called psiphon which acts as a tunnel through the firewall.

Psiphon works through social networks. A net user in an uncensored country can download the program to their computer, which transforms it into an access point.

They can then give contacts in censored countries a unique web address, login and password, which enables the restricted users to browse the web freely through an encrypted connection to the proxy server.

Its creators say the system provides strong protection against "electronic eavesdropping" because censors or ISPs can see only that end users are connected to another computer and not view the sites that are being visited.
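At its core, a proxy of this kind is a pair of byte-copying loops shuttling traffic between the restricted user and the open web. The minimal sketch below is generic, not psiphon's actual code, and uses file-like streams so the loop itself is easy to see.

```python
import io

def relay(src, dst, bufsize: int = 4096) -> int:
    """Copy bytes from src to dst until EOF, the core loop any proxy runs in
    each direction between the restricted user and the open web. Returns the
    number of bytes relayed."""
    total = 0
    while True:
        chunk = src.read(bufsize)
        if not chunk:
            break                # EOF: the other side closed the connection
        dst.write(chunk)
        total += len(chunk)
    return total
```

In a psiphon-style deployment the machine running this loop sits outside the censored country and the user's connection to it is encrypted, so the censor observes only an opaque stream to a single address, never the sites being visited.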

China Wide Web?

But even without specialised software, some China net users are able to crack the firewall.

A report released last year by US researchers showed that the firewall was more porous than previously thought.

It found that the firewall often failed to block what the Chinese government finds objectionable, and was least effective when lots of Chinese web users were online.

But even when no technology is used to filter or ban, China's net citizens are not getting unfettered access to the web.

Western companies like Google and Microsoft have been criticised for launching services which effectively self-censor.

A search request on Google in China will not bring back the same results as it would in the US, with many websites removed from the list of returned items.

Microsoft's blog service in China does not allow people to use words such as democracy, freedom and human rights.

Many observers now feel that China is not really connected to the web at all.

Instead, net users in the country experience a China Wide Web and not the World Wide Web.
http://news.bbc.co.uk/go/pr/fr/-/1/h...gy/7312746.stm





CNet Networks to Cut 120 Jobs
AP

CNet Networks Inc. says it is laying off 120 employees as part of an effort to streamline the online media company while generating more content for its news and entertainment sites.

All the layoffs -- about 4.4 percent of CNet's work force -- will involve employees in the U.S., according to a document CNet filed Wednesday with the Securities and Exchange Commission.

CNet's suite of popular Web sites commands a huge worldwide audience, but its investors have long complained the company's profits haven't kept pace with the growth of Internet advertising.

The company indicated in the filing that the layoffs would be effective immediately and cost at least $3.8 million in severance pay, outplacement and other expenses.

Along with the technology news site CNet.com, the San Francisco-based company operates News.com, TV.com, GameSpot.com, Chow.com and BNet.com. It said in a February filing that, as of Dec. 31, it employed about 2,700 people worldwide.

CNet shares rose 6 cents Wednesday to close at $7.25.
http://ap.google.com/article/ALeqM5j...8JZIQD8VLFAS00





Motorola, Under Pressure, to Split in Two

Cellphone unit separating after push for shake-up
Ashley Heher

Motorola Inc., after months of agitation from frustrated investors, announced plans yesterday to separate its struggling handset business from other operations and form two separate publicly traded companies.

The cellphone maker has been under pressure from billionaire investor Carl Icahn for changes meant to revitalize its cellphone business, which has seen its sales and stock price plummet after the company failed to find a second act to the once-popular Razr phone.

Motorola said the handset business will operate separately from a company that includes its home and networks business, which sells TV set-top boxes and modems, and from the enterprise unit, which sells computing and communications equipment to businesses.

"Our priorities have not changed," chief executive Greg Brown said in a statement. "We remain committed to improving the performance of our Mobile Devices business by delivering compelling products that meet the needs of customers and consumers around the world."

Motorola said it hopes the transaction will be tax-free, allowing shareholders to own stock in both new companies. If the deal is approved, the two units would be separated in 2009.

Brown said Motorola will search for a new chief executive of the Mobile Device business as it works to regain favour with customers and its No. 2 position in the cellphone market. Motorola lost that spot last year to rival Samsung Electronics Co. Finland's Nokia Corp. is the industry leader.

Yesterday's announcement was just the latest shake-up at Motorola, which rode the success of the iconic Razr phone from 2005 to 2006, but has stumbled since amid stiff competition.

Last year, the company pulled back from developing markets, cut 7,500 jobs and CEO Ed Zander resigned.

A flock of executives has left the company this year, and more cuts and changes are likely as the new management team scrambles to retain control in the face of a revived threat from Icahn.

Icahn, who has been steadily increasing his Motorola position, disclosed in a filing this month that he now owns 142.4 million shares, or 6.3 per cent – up from five per cent a month ago.

Yesterday's announcement came two days after Icahn sued Motorola, seeking documents about its executives and its cellphone business.

Icahn plans to use the material in his battle to win four seats on the company's board, his second proxy fight in two years with Motorola. He rejected a concessionary offer of two seats from the company.
http://www.thestar.com/Business/article/356911





Windows XP: Going, Going ... Gone?
David DeJean

The approaching death of Windows XP may upset you, but it shouldn't come as a surprise. Microsoft Corp.'s product life-cycle guidelines have foretold the fate of XP since 2001. In fact, Microsoft has been killing off one version of a product as it is replaced with another for years now. But this time around, the approaching demise of XP is getting more attention than, say, the final passing of Windows 2000.

Why? For a couple of reasons: XP is the most widely used operating system on the planet, and its long-delayed successor, Windows Vista, is not proving to be universally popular. The companies that make up the enterprise market for Windows are dragging their feet about upgrading, and on the consumer side there are signs of a rebellion against Vista.

Microsoft has already made changes in its timetables. Last year, the company extended the sales life cycle -- the time during which PC manufacturers and system builders could sell computers with XP installed -- to June 30, 2008. It will stop selling XP altogether on Jan. 31, 2009. And it extended the mainstream support period for XP to April 14, 2009, in an effort to reassure customers made nervous by the long delays in shipping Vista.

The result of all this tweaking is that Microsoft will stop selling XP long before it stops supporting it. You may be able to run XP for as long as you want, but before too long you may not be able to buy a legitimate copy of XP to run.

So will there be any way to get a copy of XP after June 30? If you want to continue using XP, what problems will you face? If you buy a PC with Vista installed and decide you want XP instead, what are your options?

The product life-cycle guidelines

Microsoft's product life-cycle guidelines grew out of two sets of needs: Microsoft's need to make a profit, and its customers' (particularly enterprise customers) needs for some certainty about the products they were committing to.

The policy was an attempt at transparency, a promise that new products would be supported for a definite period and that as they aged Microsoft wouldn't just abandon them. Instead, the company would withdraw support in a series of scheduled steps that corresponded to the pace of technological change, allowing customers time to transition to newer products. (The guidelines apply to all Microsoft products, not just operating systems.)

The problem is that what sounds like a promise to some (particularly enterprise customers) can sound like a threat to others -- particularly consumers. And they're not taking it well.

XP timeline
June 30, 2008

PC manufacturers stop selling computers with XP installed.
Jan. 31, 2009

Microsoft stops selling XP altogether.
April 14, 2009

Mainstream support (free live support and warranty support) ends. Free maintenance is limited to security fixes.
April 8, 2014

All support for XP ends.

This incipient consumer rebellion is a relatively new phenomenon, even in the short history of PCs. For most of the '90s, Microsoft couldn't bring out new products fast enough to satisfy customers. Computing technology was exploding, and Windows exploded along with it, from Windows 3.1 to Windows 95 to Windows 98 to Windows 98 Second Edition to Windows Millennium Edition. PC sales boomed and Windows users raced to upgrade to the latest version.

But that binge left Microsoft with a huge hangover. As the new decade started, it was supporting a tangle of versions and upgrades. Then the Internet bubble burst and PC sales slowed. New products like Windows ME weren't as well received as the older ones. Microsoft needed to reduce its support liabilities and create a profit plan. The product life-cycle guidelines were the solution.

The three phases of support

First laid out in 2001 and revised in 2002 and 2004, the guidelines defined a three-phase life span and created a division between business desktop software and consumer desktop software. (In the beginning, it was easier to distinguish between business products based on the NT kernel -- like Windows NT and Windows 2000 -- and consumer products that ran on top of DOS, like Windows 98 and ME.)

• Mainstream phase: In the prime of a product's life, Microsoft provides both free and paid live support, support for warranty claims and online self-help support information. Software support and maintenance is extensive and free, with downloadable fixes and updates, service packs and freely available support for problem incidents, as well as requests for design changes and new features. Business customers may pay for additional support.
• Extended phase: Free live support and warranty support end, and free maintenance of consumer products is limited to security fixes. Self-help support information remains available online. Pay-per-incident live support remains available. Software patches and updates continue for business desktop software.
• End of life: Online support information is removed. Patches and updates cease. The product is history.

These phases were set in a schedule with definite dates and durations. Business products would be supported for 10 years -- mainstream support for five years, extended support for another five. Consumer products would get five years of mainstream support, but no extended support.

But there are two other factors in a product's life cycle -- service packs and the availability of a new version of the product:

• Service packs have a life cycle of their own. Support for each service pack ends 24 months after the next service pack release (support for Windows XP Home SP1, for example, ended in 2006, two years after the release of SP2 in 2004) or at the end of the product's support life cycle, whichever comes first.
• When it looked like mainstream support for Windows XP might run out before the next version of Windows made it to market, Microsoft amended the support life cycle policy to promise that mainstream support would last for either five years or for two years after a successor version is released, whichever period is longer.
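The two rules above amount to simple date arithmetic. The sketch below is illustrative only: the function names are made up, and Microsoft in practice rounds real end-of-support dates to scheduled quarterly dates rather than computing them exactly this way.

```python
from datetime import date

def sp_support_end(next_sp_release, product_support_end):
    """A service pack's support ends 24 months after the next service
    pack ships, or at the end of the product's support life cycle,
    whichever comes first."""
    two_years_later = next_sp_release.replace(year=next_sp_release.year + 2)
    return min(two_years_later, product_support_end)

def mainstream_support_end(product_release, successor_release):
    """Mainstream support lasts five years, or two years past the
    successor version's release, whichever period is longer."""
    five_years = product_release.replace(year=product_release.year + 5)
    two_past_successor = successor_release.replace(year=successor_release.year + 2)
    return max(five_years, two_past_successor)

# XP Home SP1: SP2 shipped in 2004, so SP1 support ran out in 2006,
# well before XP's own 2014 end of life.
print(sp_support_end(date(2004, 8, 6), date(2014, 4, 8)))  # 2006-08-06
```

The second function shows why Vista's delays stretched XP's mainstream phase: once the successor ships late, "two years after the successor" overtakes the flat five-year clock.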

While the product life-cycle guidelines set very definite limits on product life spans, Microsoft has shown a willingness to move the goal posts when it gets enough pressure. When Windows XP shipped in October 2001, it was slated to be in mainstream support until December 2006. Microsoft's internal problems with getting Vista out the door finally forced the company to extend the mainstream period for XP out to April 2009, and to make some other accommodations, like eliminating the distinction between business and consumer versions, so that XP Home will have an extended support phase just like XP Pro.

The result is that next year, on April 14, 2009, Microsoft will end mainstream support for XP, and five years later, on April 8, 2014, it will stop supporting XP at all.

The other life cycle

But even before that, XP faces a major event in an entirely different life cycle, one that Microsoft has said very little about -- the sales life cycle.

The key dates for sales come much sooner than 2009 or 2014. In fact, in only a few weeks, on June 30, 2008, Microsoft will stop selling XP through its retail and reseller channels (the resellers are big manufacturers like Dell Inc. and Hewlett-Packard Co. that sell PCs with Windows preinstalled).

System builders, the "white box" retailers who build PCs to order, will be given another seven months, but on Jan. 31, 2009, a couple of months before XP exits mainstream support, Microsoft will stop selling XP altogether (except for a version sold in some less-developed countries and a special arrangement for XP Home in China).

At least that's the current information. It could change. It has before.

In the past, the company has generally kept the previous version of Windows on the market for two years or so past the introduction of a new version. That was apparently the plan for XP. When Vista finally shipped to enterprise customers in late 2006, the on-sale dates for XP were reset to January 2008.

But the new operating system didn't capture the popular imagination quite the way Microsoft had planned. Vista's heavy demands for hardware, its rocky support for applications and peripherals, and its draconian security features have left consumers less than enthusiastic. (InfoWorld.com, for example, has collected more than 100,000 signatures on a Save Windows XP petition.)

Enterprise customers have also been slower to move to Vista than to previous versions of Windows. A Microsoft reseller, CDW Corp., reported this January the results of a poll that found that a year after its release, fewer than half of businesses were using or evaluating Vista.

Big resellers, the PC manufacturers who preinstall Windows on their products, initially switched from XP to Vista when the consumer versions of the operating systems shipped in January 2007. But by April, Dell, Lenovo, and HP were once again selling machines with XP installed. An April 4 post on Dell's Web site announced the company's intention to sell XP on certain systems "until later this summer." Nearly a year later, the company is still selling XP systems.

In September 2007, Microsoft agreed to a six-month extension of XP's on-sale dates, along with license provisions for Vista's business editions that grant buyers the right to downgrade to XP.

All this leaves Microsoft in an unfamiliar position. Its major customers -- the resellers, system builders and enterprise licensees -- and a vocal part of the Windows user base all appear to be reluctant users of Vista. None of this means that Microsoft is likely to grant XP another stay of execution. But it does mean we're going to be in for an interesting few weeks leading up to June 30.

What happens after June 30?

XP won't suddenly disappear, though. It will take some time for PCs loaded with XP to move from factories to warehouses to sellers to buyers. Shrink-wrapped FPP (full packaged product) versions of the various editions of XP will also remain on sale until supplies are exhausted. And even after June 30, there will still be two ways to obtain XP until Jan. 31, 2009.

The easiest way will be to buy a new PC with XP installed from a white box system builder. It will, of course, be a reseller's version of the operating system (white box builders tend to use the same reseller versions as the larger vendors), which is tied to the PC it's installed on and can't be transferred to another computer.

Or you can buy a new PC with a reseller version of Vista Business or Vista Ultimate installed and downgrade to XP Pro. There are enough pain points in this process that you won't want to undertake it lightly. Although you may have the right to downgrade, the maker of your PC isn't obliged to supply an XP install disk. If it's important to you, check before you buy. And although you can reinstall Vista later on, you have to do it from the installation files or media you got with the machine, so don't wipe those out by accident.

You won't be able to activate your new XP install with its previously used product key across the Internet, either. A query to Microsoft on this last point produced the following clarification:

"A customer who wishes to downgrade to XP should be able to do so using their original XP disc and original XP product key. That customer may have to call [Microsoft customer service] to get an override in case their hardware changed and their hardware ID went out of tolerance. Activation is governed by the RIT/ROT count. 'RIT' equals the number of activations on the single machine. 'ROT' equals the number of activations [of that product key] on different machines. So if the customer activated the key more than the RIT limit or if he changed the hardware, only then would they have to call a Product Activation call center."

Does that make everything clearer?
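Stripped of the jargon, Microsoft's quoted policy is a simple decision rule. The sketch below is an illustration only: the limit values are hypothetical placeholders (Microsoft doesn't publish the actual RIT/ROT limits), and the function name is invented.

```python
# Hypothetical sketch of the activation decision described above.
# RIT = activations of a product key on the same machine;
# ROT = activations of that key on different machines.
# The limit values below are placeholders, not Microsoft's real numbers.

RIT_LIMIT = 3   # assumed cap on re-activations on a single machine
ROT_LIMIT = 1   # assumed cap on activations across different machines

def activation_route(rit_count, rot_count, hardware_changed):
    """Return how an XP activation with a previously used key is handled.

    rit_count: prior activations of this key on this machine
    rot_count: prior activations of this key on other machines
    hardware_changed: True if the hardware ID drifted out of tolerance
    """
    if hardware_changed or rit_count > RIT_LIMIT or rot_count > ROT_LIMIT:
        return "call Product Activation call center"
    return "activate online"
```

In other words: Internet activation works until the counters trip or the hardware fingerprint changes, and only then does a phone call become necessary.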

Support goes on

Although the sales life cycle starts to wind down on June 30, you can keep on using XP for as long as you want to. You might want to run XP until the next version of Windows (currently called Windows 7) comes out; it's expected in 2010. Or you might want to give some other operating system a little more time to mature.

Perhaps you think that Ubuntu Linux is just a couple of versions away from real usability.

In both these cases, time is on your side. There won't be any changes in XP support until April 14, 2009, when Windows XP Service Pack 2 moves from mainstream support to extended support. Extended support's security fixes should certainly keep you going safely until April 8, 2014, or until Windows 7 actually does ship, whichever comes first.

Can You Save XP?

InfoWorld is conducting a campaign to rally XP users to demand that XP be kept available. If you want to join that campaign, you'll find these links useful:
Save XP.com

The "Save XP" petition count update

The problem is, there's support and then there's support. The last time Microsoft ended mainstream support for a version of Windows was in June 2005, when it stopped supporting Windows 2000. By the end of 2006, major software vendors had also ended their support for the operating system. New products didn't support Windows 2000, and upgrades of existing Win2K products to new versions weren't available.

This lack of upgrades to run on defunct operating systems is a natural result of market forces. Application software makers, just like Microsoft, want to minimize their support costs by supporting their products on as few operating system versions as economically possible, so when an operating system version's percentage of the installed base falls below its potential to contribute to the bottom line, the vendor will cut its support -- and deflect complaints by pointing at Microsoft.

XP is certainly much more widely used than Win2K, and it will probably be supported by application vendors for a lot longer as a result. But if you really want to stay with XP, you should be prepared to stay with your current applications as well. There may not be any upgrades.

Finally, there is one more factor that might stretch out the life of XP a bit. Benjamin Gray, an analyst at Forrester Research Inc., predicted last fall that Service Pack 3 for XP, which will ship later this year, may play a part. Big corporate customers are still looking forward to XP SP3, and Gray said he wouldn't be surprised to see Microsoft extend mainstream support for this updated version of the operating system past April 2009 in response to pressure from the enterprise market.

If you're clinging to XP because you're waiting for that stability and compatibility, whether in Vista or in the next version of Windows, or just because you're entirely happy with XP and see no reason to change, then the product life-cycle guidelines are your friend. The combination of mainstream and extended support will give you several years of protection.

And even if you find in a couple of years that you can't get an XP version of some upgraded application, extended support means that your XP machine still has some life expectancy; you won't have to junk it just because it's become a malware magnet.

But if you're holding onto XP because you're just purely mad at Microsoft, or your PC won't run Vista anyway, then you're only buying time. Sooner or later, it's inevitable. Whether you love Vista or hate it, merely tolerate XP or won't give it up until it's pried from your cold, dead fingers, it will be gone. The product life-cycle guidelines say so.
http://www.computerworld.com/action/...icleId=9070119





Gone in 2 Minutes: Mac Gets Hacked First in Contest
Robert McMillan

It may be the quickest $10,000 Charlie Miller ever earned.

He took the first of three laptop computers -- and a $10,000 cash prize -- Thursday after breaking into a MacBook Air at the CanSecWest security conference's PWN 2 OWN hacking contest.

Show organizers offered a Sony Vaio, Fujitsu U810 and the MacBook as prizes, saying that they could be won by anybody at the show who could find a way to hack into each of them and read the contents of a file on the system, using a previously undisclosed "0day" attack.

Nobody was able to hack into the systems on the first day of the contest when contestants were only allowed to attack the computers over the network, but on Thursday the rules were relaxed so that attackers could direct contest organizers using the computers to do things like visit Web sites or open e-mail messages.

The MacBook was the only system to be hacked by Thursday; however, word on the show floor was that the Linux and Vista systems would meet with some serious challenges on Friday.

Miller, best known as one of the researchers who first hacked Apple's iPhone last year, didn't take much time. Within 2 minutes, he directed the contest's organizers to visit a Web site that contained his exploit code, which then allowed him to seize control of the computer, as about 20 onlookers cheered him on.

He was the first contestant to attempt an attack on any of the systems.

Miller was quickly given a nondisclosure agreement to sign and he's not allowed to discuss particulars of his bug until the contest's sponsor, TippingPoint, can notify the vendor.

Contest rules state that Miller could only take advantage of software that was preinstalled on the Mac, so the flaw he exploited must have been accessible, or possibly inside, Apple's Safari browser.

Last year's contest winner, Dino Dai Zovi, exploited a vulnerability in QuickTime to take home the prize.

Dai Zovi, who congratulated Miller after his hack, didn't participate in this year's contest, saying it was time for someone else to win.

By late Thursday, Apple engineers were already working on patching the issue, said Aaron Portnoy, a TippingPoint researcher who is one of the contest's judges.

Miller's $10,000 payday may sound sweet, but it's not the most Miller has been paid for his work. In 2005, he earned $50,000 for a Linux bug he delivered to an unnamed government agency.

Shane Macaulay, who was Dai Zovi's co-winner last year, spent much of Thursday trying to hack into the Fujitsu Vista laptop, at one point rushing back to his Vancouver area home to retrieve a file that he thought might help him hack into the system.

But it was all in vain.

"It's one thing to find a vulnerability, it's another thing to make working exploit code," said Terri Forslof, TippingPoint's Manager of Security Response.

Forslof said that a number of "high quality" researchers have said that they will attempt to hack the machines on Friday, the last day of the conference.

She expects both systems to be hacked on Friday, when contest rules will be further eased, and hackers will be able to attack popular third-party software that can be installed on the systems. "I don't think we'll have to take any home," she said.
http://security.itworld.com/5013/mac...27/page_1.html





Amazon Takes on Apple with Copy-Protection-Free Music
Jefferson Graham

The music industry is finally comfortable selling digital music without copy protection, but the huge shift hasn't resulted in dramatically higher sales.

Instead, it produced something that major music labels have long sought: a strong No. 2 competitor to Apple.

Amazon's (AMZN) MP3 store - which sells only songs without copy protection - has quietly become No. 2 in digital sales since opening nearly six months ago. That's even though Apple (AAPL) dominates digital music with its iTunes Store (the second-largest music retailer in the world, after Wal-Mart (WMT)) and its hugely popular iPod.

The push for copy-protection-free music began nearly a year ago, when Apple and major label EMI shocked the industry by announcing a landmark arrangement to sell 150,000 songs without digital rights management (DRM) software. It was the first time a major label had agreed to such terms.

Consumers had long complained about DRM, saying it hindered what they could do with their purchases. For instance, a song sold at iTunes with DRM couldn't be played on a Microsoft (MSFT) Zune digital music player.

Apple, which claims an 80% share of digital music sales, said consumers would be ecstatic about the EMI deal and that digital sales would greatly increase. CEO Steve Jobs predicted his iTunes catalog would be 50% DRM-free by the end of 2007. But that never happened.

Warner (TWX), Sony/BMG (SNE) and Universal all opted to sell their DRM-free music on Amazon instead. "The labels think Apple has too much influence," says Phil Leigh, an analyst at Inside Digital Media.

Apple now has 2 million songs from EMI and independent labels available without DRM, out of its 6 million-song catalog. Amazon offers 4.5 million DRM-free songs.

Amazon's arrival "removed some of the stranglehold iTunes had on the market," says Ted Cohen, a former EMI Music executive and managing partner of the Tag Strategic consulting firm.

Apple originally sold each DRM-free song for a premium, $1.29, compared with 99 cents for a song with copy protection. But Apple was forced to lower the price to 99 cents when Amazon launched its MP3 download store at that price.

Pete Baltaxe, Amazon's director of digital music, won't say how many songs Amazon has sold but will say that consumers love the experience.

"What we hear a lot is, 'Thank you.' They appreciate that everything is DRM-free and so comprehensive," he says.

Apple declined to comment for this story.

'Music is mature'

The labels are also offering DRM-free songs at other digital media outlets. Universal is working with Wal-Mart, Rhapsody, Best Buy (BBY) and a handful of smaller retailers. Sony/BMG has a deal with Target (TGT). That hasn't significantly boosted sales. It hasn't hurt them either, although music label executives had argued against selling songs without copy protection, saying such a move would increase piracy.

About 239 million digital tracks have been sold this year, according to Nielsen SoundScan. That compares with 189 million at the same time last year, which is not a dramatic jump. (CD sales continue their decline: 74.3 million this year, compared with 89.2 million at the same time in 2007.)

That's because "music is mature," says Eric Garland, CEO of BigChampagne, which monitors online piracy. "The growth is in TV shows, movies and gaming."

Garland says there is no evidence that download sales will be affected by DRM-free songs. "The music consumer holds all the cards, and they have a list of complaints. They want iTunes to be 100% DRM-free. They want unlimited selection and to have more of a social component to their songs. Having one label without DRM (at iTunes) isn't enough to make an impact," Garland says.

Amazon's Baltaxe says the best defense against piracy is a good offense. "Songs sold without DRM, at high quality, with album art, that's the best way to get people to buy music instead of stealing it. DRM is a way to punish people who are buying," he says. "Offering a great product at a great price is a way to combat piracy."
http://news.yahoo.com/s/usatoday/200...ctionfreemusic





Diddy's Girl Group Danity Kane Tops U.S. Pop Chart

Danity Kane, a prefabricated MTV girl group, scored its second consecutive No. 1 on the U.S. album chart Wednesday, while pop-soul duo Gnarls Barkley's latest release entered at a disappointing No. 18 after being rushed out three weeks ahead of schedule.

Danity Kane's "Welcome to the Dollhouse" sold 236,192 copies in the week ended March 23, according to Nielsen SoundScan. It ranks as the second-biggest opening of the year, behind the 375,000-unit start for Jack Johnson's "Sleep Through the Static" in February.

Perhaps more incredibly, at a time when most artists are routinely suffering big sales drops amid a decadelong slump in the music industry, "Welcome to the Dollhouse" managed to sell about 2,000 copies more than Danity Kane's 2006 self-titled debut. The debut has sold 926,000 copies to date.

The MTV "Making the Band 3" troupe is signed to Sean "Diddy" Combs' Bad Boy imprint. It's a good day for Combs. The Los Angeles Times said it is reviewing a recent story linking the hip-hop mogul to the 1994 shooting of rapper Tupac Shakur, amid an online report that the paper may have relied on forged FBI documents. Combs strongly denied any involvement in the shooting.

The Grammy-winning Gnarls Barkley's second album, "The Odd Couple," sold 31,000 copies for its No. 18 start. Its 2006 debut, "St. Elsewhere," opened at No. 20 and peaked at No. 4 as the single "Crazy" turned into a monster hit. "St. Elsewhere" has sold 1.4 million copies to date.

"The Odd Couple" was rushed out last Tuesday, three weeks earlier than planned, likely because of an early leak on the Internet; online buyers accounted for 85% of the sales. A spokeswoman from the group's Atlantic Records label did not respond to an email seeking comment.

Elsewhere, the multi-artist "NOW 27" compilation held steady at No. 2 with 170,000 copies in its second week. Last week's champ, Rick Ross' "Trilla," fell to No. 3 with 90,000.

Rookie rapper Flo Rida may have recently topped Billboard's Hot 100 singles chart for a massive 10 weeks with "Low," but his debut album "Mail on Sunday" bowed at No. 4 with a relatively modest 86,000 copies. By contrast, "Low" has sold 3.3 million downloads since its November release, the biggest selling digital download of all time.

Johnson's "Static" slipped one to No. 5 with 67,000 in its seventh week. Sales to date stand at 939,000 copies.

Miley Cyrus' Disney double-disc "Hannah Montana 2 (Soundtrack)/Meet Miley Cyrus" soared 10 to No. 6 with 61,000. Additionally, her "The Best of Both Worlds Concert" set held at No. 10 with 51,000.

Snoop Dogg's "Ego Trippin'" slid four to No. 7 with 57,000. Blender magazine cover girl Taylor Swift's self-titled album jumped four to No. 8, also with 57,000.

Three weeks after it debuted at No. 1, Janet Jackson's "Discipline" now resides at No. 17. Its sales to date stand at about 310,000 copies; at the same point in its cycle, her previous release "20 Y.O." had sold 443,000 copies. The 2006 disc was considered a big disappointment.

Album sales, boosted by the Easter holiday, rose 11.3% over the prior week to 8.83 million units, but were down 1.1% compared to the same week in 2007.
http://www.billboard.com/bbcom/news/..._id=1003741823





BPI Admits CD Sales Booming!
P2PNet

Despite harassing their customers with bogus threats and lawsuits, saying they're "stealing music", and falsely claiming that file sharing is killing the music industry, the BPI (British Phonographic Industry) publicly states the labels do in fact sell more CDs today than they did 10 years ago.

BPI ‘members’ include Warner Music Group, EMI, Sony BMG, and Universal Music Group.

During an interview with BPI rep Matt Philips and music critic Neil McCormick, BBC News reported CD sales are in a “downward spiral”.

But, Philips responded, “We do in fact sell more CDs than we did ten years ago, and I suspect there’ll be a lot of albums sold on CD [in future] as well.”

And this is confirmed on the BPI website which states >>>

Quote:
Increasing the number of CDs bought by occasional customers is one of the reasons why we sell more CD albums in the UK than we did a decade ago. - BPI Chief Executive Geoff Taylor
Their own figures also show CD sales of BRIT (UK equivalent of Grammys) nominated artists “… give the overall market a boost as CD sales are increase (sic) by 25%.”

“Furthermore, 2007 saw the overall market increase by 9%.”

A recent BPI press release also states, “… (CD) sales remain 26% higher than those recorded a decade ago.”

While the BPI claims, “The industry continues to innovate in developing new online and mobile business models,” the truth is actually explained in the interview succinctly by McCormick:

“If you’re putting something on the internet, it doesn’t cost anything. Anybody can put anything anywhere. The music business is convulsed by fear and excitement about what that means. (They are) a rabbit in the headlights of the internet.”

Wabbit hunting season is open.

‘Nuf said.
http://www.p2pnet.net/story/15386





RIAA, MPAA Pressure US Schools

American educators continue to fight entertainment industry attempts to force US colleges and universities into not only becoming corporate copyright cops, but also installing, and paying for, software designed to filter all but corporate ‘product’ from online school networks.

“In a letter to members of Congress dated Tuesday, the Motion Picture Association of America and the Recording Industry Association of America took exception to claims by higher-education groups that online music services and technology tools to block file-sharing are costly and ineffective,” says the Chronicle of Higher Education’s Wired Campus.

“The entertainment-industry groups were responding to a letter that 13 higher-education groups sent to some members of Congress last week urging them to reject language in a House-approved bill that would require colleges to use such tools.

“The entertainment industry groups want the House language to become law. A similar Senate bill omits the language on peer-to-peer file sharing.”

A “filtering product is now deployed at approximately 70 colleges and universities across the country, and it has demonstrated the ability to impede illegal P2P activity on a number of campus networks,” the story quotes the RIAA and MPAA as saying.

It’s probably referring to Audible Magic’s CopySense which, four years ago, was brazenly touted around Congress by then newly appointed RIAA boss Mitch Bainwol. It was later analysed by the EFF (Electronic Frontier Foundation).

Network administrators, “will want to ask Audible Magic tough questions before investing in the company’s technology, lest the investment be rendered worthless by the next P2P ‘upgrade’,” said the EFF’s Chris Palmer.

Ohio University is boasting it spent more than $75,000 for a device -- CopySense -- that, “scans data crisscrossing its network for copyrighted media,” said p2pnet.

‘Disproportionately responsible’

The MPAA/RIAA letter, “says one university ‘saved $1.2-million a year in terms of bandwidth and $70,000 in personnel costs’ after installing a technology filter on a campus network,” continues Wired Campus. “The letter did not name the university.”

The letter also said US college students are, “disproportionately responsible for digital theft of copyrighted materials and that this dynamic is one that needs to be proactively addressed by the university community without further delay”.

It’s now Standard Operating Procedure for people working for, and with, the RIAA and brother organisation, Hollywood’s MPAA, “to make false statements or issue grossly inaccurate statistics they know will be picked up by the ever-faithful lamescream corporate media and repeated as though they came from reliable sources,” said p2pnet in February, going on:

“The statements and/or statistics can later be discarded as ‘mistakes’.”

An example is the now-infamous MPAA ‘study’ which concluded American students were responsible for 44% of Hollywood’s domestic losses.

It was picked up by the international press corps and reported as fact. It was also one of the principal contributing factors to Section 494 of the College Opportunity and Affordability Act of 2007.

However, the study, by LEK and dismissed by Britain’s Industry Trust for Intellectual Property Awareness (ITfIPA) as “inaccurate and out of date,” turned out to be based on specious statistics.

The figure should have been 15%, the MPAA reluctantly admitted after the damage had been done.

However, even 15% was well over the top, said Mark Luker, vice president of campus IT group Educause.

It didn’t account for the fact more than 80% of college students live off campus and aren’t necessarily using college networks, he said, going on to state 3% was a more reasonable estimate for the percentage of revenue that might be at stake on campus networks.

“The 44 percent figure was used to show that if college campuses could somehow solve this problem on this campus, then it would make a tremendous difference in the business of the motion picture industry,” Luker said. The new figures proved, “any solution on campus will have only a small impact on the industry itself”.

‘Ill conceived’

Educause has increased the level of its opposition to Section 494 of the College Opportunity and Affordability Act of 2007 as it stands, said another p2pnet post, going on:

“ ‘Educause continues to feel that the mandates contained in the House bill are expensive, ineffective, and inappropriate,’ Steve Worona, director of policy and networking programs and founding director of the Educause/Cornell Institute for computer policy and law, said.”

In a letter to CIOs at universities across the US, Educause said the bill “requires every college and university to develop a plan for offering alternatives to illegal downloading or peer-to-peer distribution of intellectual property as well as a plan to explore technology-based deterrents to prevent such illegal activity,” declaring:

“These requirements are ill conceived and would be harmful to higher education if passed into law. The legal online services they imagine are immature and have been rejected by student communities on many campuses, even when offered at no charge.”

It would, “impose new costs and regulatory burdens on both the Department of Education and campuses while doing very little to address the problem,” says ACE president David Ward.

“Therefore, we ask that the House recede to the Senate with respect to the proposed new Section 494.”

However, the major Hollywood studios, Time Warner, Viacom, Fox, Sony, NBC Universal and Disney, and Warner Music, EMI, Vivendi Universal and Sony BMG, the members of the Big 4 organised music cartel, aren’t giving in.

Higher-education groups are “trying to divert attention from the real and immediate problems at hand,” says the industry letter, according to Wired Campus, which adds:

“The letter was signed by Dan Glickman, president of the motion picture group, and Mitch Bainwol, chairman and CEO of the recording-industry group. The letter was addressed to the chairmen and ranking minority members of the Senate Committee on Health, Education, Labor, and Pensions, and the House Committee on Education and Labor.”
http://p2pnet.net/story/15366





Milestones

Richard Widmark, Actor, Dies at 93
Aljean Harmetz

Richard Widmark, who created a villain in his first movie role who was so repellent and frightening that the actor became a star overnight, died Monday at his home in Roxbury, Conn. He was 93.

His death was announced Wednesday morning by his wife, Susan Blanchard. She said that Mr. Widmark had fractured a vertebra in recent months and that his condition had worsened.

As Tommy Udo, a giggling, psychopathic killer in the 1947 gangster film “Kiss of Death,” Mr. Widmark tied up an old woman in a wheelchair (played by Mildred Dunnock) with a cord ripped from a lamp and shoved her down a flight of stairs to her death.

“The sadism of that character, the fearful laugh, the skull showing through drawn skin, and the surely conscious evocation of a concentration-camp degenerate established Widmark as the most frightening person on the screen,” the critic David Thomson wrote in “The Biographical Dictionary of Film.”

The performance won Mr. Widmark his sole Academy Award nomination, for best supporting actor.

Tommy Udo made the 32-year-old Mr. Widmark, who had been an established radio actor, an instant movie star, and he spent the next seven years playing a variety of flawed heroes and relentlessly anti-social mobsters in 20th Century Fox’s juiciest melodramas.

His mobsters were drenched in evil. Even his heroes, including the doctor who fights bubonic plague in Elia Kazan’s “Panic in the Streets” (1950), the daredevil pilot flying into the eye of a storm in “Slattery’s Hurricane” (1949) and the pickpocket who refuses to be a traitor in Samuel Fuller’s “Pickup on South Street” (1953), were nerve-strained and feral.

“Movie audiences fasten on to one aspect of the actor, and then they decide what they want you to be,” Mr. Widmark once said. “They think you’re playing yourself. The truth is that the only person who can ever really play himself is a baby.”

In reality, the screen’s most vicious psychopath was a mild-mannered former teacher who had married his college sweetheart, the actress Jean Hazelwood, and who told a reporter 48 years later that he had never been unfaithful and had never even flirted with women because, he said, “I happen to like my wife a lot.”

He was originally turned down for the role of Tommy Udo by the movie’s director, Henry Hathaway, who told Mr. Widmark that he was too clean-cut and intellectual. It was Darryl Zanuck, the Fox studio head, who, after watching Mr. Widmark’s screen test, insisted that he be given the part.

Among the 65 movies he made over the next five decades were “The Cobweb” (1955), in which he played the head of a psychiatric clinic where the staff seemed more emotionally troubled than the patients; “Saint Joan” (1957), as the Dauphin to Jean Seberg’s Joan of Arc; John Wayne’s “The Alamo” (1960), as Jim Bowie, the inventor of the Bowie knife; “Judgment at Nuremberg” (1961), as an American army colonel prosecuting German war criminals; and John Ford’s revisionist western “Cheyenne Autumn” (1964), as an army captain who risks his career to help the Indians.

The genesis of “Cheyenne Autumn” was research Mr. Widmark had done at Yale into the suffering of the Cheyenne. He showed his work to John Ford and, two years later, Ford sent Mr. Widmark a finished screenplay.

Mr. Widmark created the role of Detective Sergeant Daniel Madigan in Don Siegel’s 1968 film “Madigan.” It proved so popular that he later played the loner Madigan on an NBC television series during the 1972-73 season.

As his blond hair turned grey, Mr. Widmark moved up in rank, playing generals in the nuclear thriller “Twilight’s Last Gleaming” (1977) and “The Swarm” (1978), in which he waged war on bees. He was the evil head of a hospital in “Coma” (1978) and a United States Senator in “True Colors” (1991).

He was forever fighting producers’ efforts to stereotype him. Indeed, he became so adept at all types of roles that he consistently lent credibility to inferior movies and became an audience favorite over a career that spanned more than half a century.

“I suppose I wanted to act in order to have a place in the sun,” he once told a reporter. “I’d always lived in small towns, and acting meant having some kind of identity.”

Richard Widmark (he had no middle name) was born on Dec. 26, 1914, in Sunrise, Minn., and grew up throughout the Midwest. His father, Carl Widmark, was a traveling salesman who took his wife, Mae Ethel, and two sons from Minnesota to Sioux Falls, S.D.; Henry, Ill.; Chillicothe, Mo.; and Princeton, Ill., where Mr. Widmark graduated from high school as senior class president.

Movie crazy, he was afraid to admit his interest in the “sissy” job of acting. On a full scholarship at Lake Forest College in Illinois, he played end on the football team, took third place in a state oratory contest, starred in plays and was, once again, senior class president.

Graduating in 1936, he spent two years as an instructor in the Lake Forest drama department, directing and acting in two dozen plays. Then he headed to New York City in 1938, where one of his classmates was producing 15-minute radio soap operas and cast Mr. Widmark in a variety of roles.

“Getting launched was easy for me — too easy, perhaps,” he said of his success playing “young, neurotic guys” on “Big Sister,” “Life Can Be Beautiful,” “Joyce Jordan, M.D.,” “Stella Dallas,” “Front Page Farrell,” “Aunt Jenny’s Real Life Stories” and “Inner Sanctum.”

At the beginning of World War II, Mr. Widmark tried to enlist in the army but was turned down three times because of a perforated eardrum. So he turned, in 1943, to Broadway. In his first stage role, he played an Army lieutenant in F. Hugh Herbert’s “Kiss and Tell,” directed by George Abbott. Appearing in the controversial play “Trio,” which was closed by the License Commissioner after 67 performances because it touched on lesbianism, he received glowing reviews as a college student who fights to free the girl he loves from the domination of an older woman.

After a successful, 10-year career as a radio actor, he tried the movies with “Kiss of Death,” which was being filmed in New York. Older than most new recruits, he was, to his surprise, summoned to Hollywood after the movie was released. “I’m probably the only actor who gave up a swimming pool to go out to Hollywood,” Mr. Widmark told The New Yorker in 1961.

He had never expected 20th Century Fox to pick up the option on the contract he was forced to sign to get the role of Tommy Udo. During the seven years of his Fox contract, he starred in 20 movies, including “Yellow Sky” (1948), as the blackguard who menaces Gregory Peck; “Down to the Sea in Ships” (1949), as a valiant whaler; Jules Dassin’s “Night and the City” (1950), as a small-time hustler who dreams of becoming a wrestling promoter; and “Don’t Bother to Knock” (1952), in which the tables were turned and he was the prey of a psychopathic Marilyn Monroe.

A passionate liberal Democrat, Mr. Widmark played a bigot who baits a black doctor in Joseph Mankiewicz’s “No Way Out” (1950). He was so embarrassed by the character that after every scene he apologized to the young actor he was required to torment, Sidney Poitier. In 1990, when Mr. Widmark was given the D.W. Griffith Career Achievement Award by the National Board of Review, it was Mr. Poitier who presented it to him.

Within two years after his Fox contract ended, Mr. Widmark had formed a production company and produced “Time Limit” (1957), a serious dissection of possible treason by an American prisoner of war that The New York Times called “sobering, important and exciting.” Directed by the actor Karl Malden, “Time Limit” starred Mr. Widmark as an army colonel who is investigating a major (Richard Basehart) who is suspected of having broken under pressure during the Korean War and aided the enemy.

Mr. Widmark produced two more films: “The Secret Ways” (1961) in which he went behind the Iron Curtain to bring out an anti-Communist leader; and “The Bedford Incident” (1964), another Cold War drama, in which he played an ultraconservative naval captain trailing a Russian submarine and putting the world in danger of a nuclear catastrophe.

Mr. Widmark told The Guardian in 1995 that he had not become a producer to make money but to have greater artistic control. “I could choose the director and my fellow actors,” he said. “I could carry out projects which I liked but the studios didn’t want.”

He added: “The businessmen who run Hollywood today have no self-respect. What interests them is not movies but the bottom line. Look at ‘Dumb and Dumber,’ which turns idiocy into something positive, or ‘Forrest Gump,’ a hymn to stupidity. ‘Intellectual’ has become a dirty word.”

He also vowed he would never appear on a talk show on television, saying, “When I see people destroying their privacy — what they think, what they feel — by beaming it out to millions of viewers, I think it cheapens them as individuals.”

In 1970, he won an Emmy nomination for his first television role, as the president of the United States in a mini-series based on Fletcher Knebel’s novel “Vanished.” By the 1980s, television movies had transformed the jittery psychopath of his early days into a wise and stalwart lawman. He played a Texas Ranger opposite Willie Nelson’s train robber in “Once Upon a Texas Train,” a small-town police chief in “Blackout” and, most memorably, a bayou country sheriff faced with a group of aged black men who have confessed to a murder in “A Gathering of Old Men.”

“The older you get, the less you know about acting,” he told one reporter, “but the more you know about what makes the really great actors.” The actor he most admired was Spencer Tracy, because, he said, Tracy’s acting had a reality and honesty that seemed effortless.

Mr. Widmark, who hated the limelight, spent his Hollywood years living quietly on a large farm in Connecticut and an 80-acre horse ranch in Hidden Valley, north of Los Angeles. Asked once if he had been “astute” with his money, he answered, “No, just tight.”

He sold the ranch in 1997 after the death of Ms. Hazelwood, his wife of 55 years. “I don’t care how well known an actor is,” Mr. Widmark insisted. “He can lead a normal life if he wants to.”

Besides his wife, Ms. Blanchard, Mr. Widmark is survived by his daughter, Anne Heath Widmark, of Santa Fe, N.M., who had once been married to the Hall of Fame pitcher Sandy Koufax.

Well into his later years, the nonviolent, gun-hating Mr. Widmark, who described himself as “gentle,” was accosted by strangers who expected him to be a tough guy. There is even a story that Joey Gallo, the New York mobster, was so taken by Mr. Widmark’s performance in “Kiss of Death” that he copied the actor’s natty posture, sadistic smirk and tittering laugh.

“It’s a bit rough,” Mr. Widmark once said, “priding oneself that one isn’t too bad an actor and then finding one’s only remembered for a giggle.”
http://www.nytimes.com/2008/03/26/ar...idmark.html?hp





Neil Aspinall, Beatles Aide, Dies at 66
Allan Kozinn

Neil Aspinall, who left an accounting job to become the Beatles’ road manager when the group was still a local dance band and who went on to manage the band’s production and management company, Apple, died Sunday night in Manhattan. He was 66 and lived in Twickenham, England.

Geoff Baker, a spokesman for the family, said the cause was lung cancer. Mr. Aspinall had been undergoing treatment at Memorial Sloan-Kettering Cancer Center. He retired from Apple last year.

Of all the people in the Beatles’ orbit, Mr. Aspinall had the most durable relationship with the group; in fact, he had already been a crucial member of the Beatles' entourage for about 18 months when Ringo Starr became the Beatles' drummer. When the Beatles were inducted into the Rock and Roll Hall of Fame in 1988, George Harrison made a point of saying that Mr. Aspinall should be considered the fifth Beatle.

In November 1967, when the Beatles formed Apple to oversee their creative and business interests, they asked Mr. Aspinall, by then a trusted assistant of long standing, to manage it. He never took a formal title, but he ran a company that, in its first years, included a record label, a film production company and electronics, publishing and retailing divisions. He also quickly put the Beatles’ own complicated contractual commitments in order.

Nevertheless, expenses at Apple spun out of control as the open kitchen and bar at the company’s Savile Row office began feeding and watering a parade of journalists, would-be Apple artists and hangers-on. When the American manager Allen Klein was brought in to sort out the Beatles’ finances, Mr. Klein fired much of the staff but was told by John Lennon, “Don’t touch Neil and Mal, they’re ours,” referring to Mr. Aspinall and his assistant, Mal Evans, who had also been with the group since its Liverpool days.

During his first 25 years at the head of Apple, Mr. Aspinall oversaw a succession of lawsuits. In one, not settled until 1974, Paul McCartney sued the other Beatles to dissolve their partnership. At the same time, the Beatles as a group sued EMI Records in a royalties dispute that took 20 years to settle. Apple also sued the Broadway show “Beatlemania” for unauthorized use of the Beatles’ name and logo, and it fought several court battles against Apple Computer for trademark infringement. The last was settled in 2006, in favor of the computer company.

Mr. Aspinall was often blamed for the slow pace at which Beatles archival projects were released; several projects languished on Apple’s shelves for years, including a home-video production of the Beatles’ 1965 concert at Shea Stadium, remastered versions of the film “Let It Be” and digital download versions of all the Beatles’ studio recordings.

What the complaints did not take into account was that Mr. Aspinall could release only what Apple’s principals — Mr. McCartney, Mr. Starr, Olivia Harrison and Yoko Ono (the widows of George Harrison and John Lennon) — unanimously agreed should be released. And the interpersonal politics at Apple were such that unanimity was hard to come by.

Even so, Mr. Aspinall did oversee several important releases after 1993. These included “Live at the BBC,” a two-disc compilation of the group’s radio performances; “Yellow Submarine Songtrack,” a remixed version of the music from the “Yellow Submarine” animated film, which Apple also restored and reissued; “1,” a single-disc hits compilation; “Let It Be ... Naked,” a remixed and reconfigured version of “Let It Be,” without the string and choral overdubs that fans have long complained about; two installments of “The Capitol Albums,” which brought together mono and stereo versions of eight Beatles albums in their American (rather than British) configurations; and “Love,” a multimedia collaboration with Cirque du Soleil (and a matching recording).

His biggest achievement was “The Beatles Anthology.” The idea was to use performance film and interview clips to let the Beatles tell their own story. Originally meant to be a theatrical film, the project was begun in 1970 but shelved until the final EMI lawsuits were settled in 1989. By then, Mr. Aspinall had proposed that instead of making a film, the Beatles should contribute new interviews (along with archival interviews with John Lennon, who was murdered in 1980) to a six-hour television series and a nearly 13-hour home video edition.

When the Beatles agreed, he assembled an extraordinary archive of television and concert film, photograph collections and other materials, for use both in “The Beatles Anthology” and in other potential Apple projects. He was one of the few non-Beatles interviewed in “The Beatles Anthology” and was credited as executive producer. He retired from Apple in 2007.

Mr. Aspinall’s history with the Beatles reached back to their earliest days as a band, when he hung flyers around Liverpool advertising their performances. In February 1961, with the group’s popularity in Liverpool soaring, Mr. Aspinall gave up his job as an apprentice accountant and began driving the group from job to job, often three performances a day.

On international tours, Mr. Aspinall left the business of equipment setup to Mr. Evans and became the Beatles’ principal aide. One of Mr. Aspinall’s later jobs was to round up the pictures of the celebrities and other influential crowd members for the cover of the 1967 album “Sgt. Pepper’s Lonely Hearts Club Band.” He also accompanied Lennon and Mr. McCartney to New York in May 1968 for a series of interviews announcing Apple.

On occasion, he was drafted as a performer. He was among the singers in the celebratory chorus of “Yellow Submarine,” and he played tambura (an Indian drone instrument) on “Within You Without You,” harmonica on “Being for the Benefit of Mr. Kite” and percussion on “Magical Mystery Tour.”

Mr. Aspinall was born in Prestatyn, Wales, on Oct. 13, 1941, and grew up in Liverpool, where he attended the Liverpool Institute with Mr. McCartney and Mr. Harrison. He became friendly with the Beatles through Pete Best, their drummer from 1960 to 1962.

Mr. Aspinall, originally a boarder in Mr. Best’s house, had started a romantic relationship with Mona Best, Mr. Best’s mother. Their son, Roag Best, was born in 1962.

Mr. Aspinall accompanied Pete Best to the meeting with the Beatles manager Brian Epstein at which the drummer was fired. Mr. Aspinall decided to continue working for the group and also maintained his relationship with Mrs. Best for several years. He eventually had an opportunity to help Mr. Best make up for his missed fortune as a member of the Beatles: because several of the group’s previously unissued recordings with Mr. Best were used on “The Beatles Anthology,” Mr. Aspinall negotiated a generous royalty arrangement for the drummer.

In 1968, Mr. Aspinall married Suzy Ornstein, whose father, George “Bud” Ornstein, was head of European production for United Artists, the company for which the Beatles made the films “A Hard Day’s Night,” “Help!” and “Let It Be.” She survives him, as do their daughters Gayla, Dhara and Mandy; their son Julian, and Mr. Aspinall’s first son, Roag Best.

Mr. Aspinall made several films for the Beatles individually and collectively during his years as their principal aide. One accompanied the group’s 1969 single “Something,” for which Mr. Aspinall filmed the Beatles and their wives walking placidly through an English garden (or, in Mr. McCartney’s case, the grounds of his farm in Scotland). What Mr. Aspinall’s idyllic film avoided showing was that the Beatles were at that point barely on speaking terms. In the film, no two Beatles are seen together.

During the group’s heyday, Mr. Aspinall wrote articles about their recording sessions for “The Beatles Monthly Book,” an officially sanctioned fan magazine. Virtually alone among Beatles insiders, he resisted the temptation to publish his memoirs, but joked that if he did write them, he would arrange to have them published only after his death. He is not known to have undertaken the project.
http://www.nytimes.com/2008/03/24/ar...pinall.html?hp