13-02-08, 09:42 AM
JackSpratts

RealPlayer Users Held to Ransom
Davey Winder

It has been a couple of months now since a Russian security researcher, Evgeny Legerov, confirmed that the widely deployed media software RealPlayer was vulnerable to a zero-day exploit. The Russian company, Gleg, is in the business of selling information on such exploits and security flaws. Unfortunately, according to RealNetworks' Vice President Jeff Chasen, Gleg has been unwilling or unable to provide the necessary data to allow the alleged gaping security hole to be patched, despite repeated requests from both RealNetworks and CERT. Gleg has, on the other hand, posted a video showing the heap overflow/code execution exploit in action.

According to Chris Wysopal, CTO of application security code testing company Veracode, it was only ever a matter of when, rather than if, the commercial zero-day exploit market would find a vulnerability in widely deployed software such as this. "We don't know when this unpatched RealPlayer vulnerability was introduced into the code," Wysopal says. "It has probably been latent for many months. Real's customers were vulnerable as soon as they downloaded this version of RealPlayer. This knowledge is currently circulating in criminal circles and attackers are using it to compromise Real's customers."

The fact that Gleg apparently knew how to reproduce this problem at least a month beforehand, but did not inform the vendor, is quite frankly appalling. Indeed, there appears to be a legitimate concern over what benefit the customers of Gleg, who were informed about the problem, would get by having such client side exploit information before the vendor can patch it.

Legerov has responded to criticism by arguing that the exclusivity is required so that his customers can better understand the level of risk they face. Again, this beggars belief. What do they need to understand, other than that the client software is broken and needs to be fixed ASAP, unless there were some ulterior motive? As Wysopal says, "I know that users with RealPlayer 11 installed will undoubtedly stumble across a malicious music file and their system will have a bot installed running with their logged in privilege level. I'm not sure what additional value I would get as a Gleg customer." Unless, of course, you were RealNetworks, in which case you might be able to run the exploit in lab conditions and patch that vulnerability. But then isn't that tantamount to blackmail?

Wysopal argues, with plenty of merit, that a cooperative solution is a much safer way for customers to understand the risks of the code they run, promoting good security hygiene on the vendor side. "We have found that once vendors know that their big customers are using an independent review service they are more likely to proactively start doing security testing within their SDLC," he continues. "A vendor can't bluff their way out of a comprehensive code assessment like they can from just a single (or a few) vulnerabilities publicly reported. If their code is full of vulnerabilities their customers will know."
http://www.daniweb.com/blogs/entry2060.html





Hackers – 1: SP1 – 0
Adrian Kingsley-Hughes

With the launch of SP1 Microsoft promised to put an end to two popular hacks used by pirates to allow a non-genuine install of Windows Vista to function in the same way as a genuine install. Testing that I’ve carried out in the lab today suggests that Microsoft has been true to its word.

The two most common hacks used were the OEM BIOS hack and the grace timer hack (of which there were two flavors which were widespread).

Testing both these methods of circumventing Windows activation and Windows Genuine Advantage (WGA) has shown me that SP1 effectively ignores both these hacks. Systems that were shown as genuine prior to the installation of SP1 then require activation - and if the system isn't activated it is marked as non-genuine and enters the nag state.

Pirates trying to apply these hacks to new installations of Vista which include SP1 will find that neither method works.

I’m certain that when SP1 hits the Windows Update servers there are going to be a lot of people out there surprised to find that their systems aren’t as genuine as they thought they were. This will no doubt put a few more bucks into Microsoft’s coffers.

Will this put an end to the counterfeiting of Vista? Some I’ve spoken to in the underground community say it will, while others are confident that new circumvention methods will be discovered.

[UPDATE 02/10/08 5:05pm – It does seem that Microsoft hasn’t been successful in closing off all the hacks that allow non-genuine copies of Vista SP1 to pass as genuine ones. After a few minutes of searching the darker corners of the Internet and a few seconds in the Command Prompt, I was able to fool Windows into thinking that it was genuine. Close, but no cigar.]

[UPDATE 02/10/08 4:00pm – I’m getting scattered reports claiming that there is still a hack for Windows Vista SP1 that works. I’ll investigate further later.]
http://blogs.zdnet.com/hardware/?p=1267#





Report: Web Browsers Under Siege from Organised Crime

IBM today released the findings of the 2007 X-Force Security report, detailing a disturbing rise in the sophistication of attacks by criminals on Web browsers worldwide. According to IBM, by attacking the browsers of computer users, cybercriminals are now stealing the identities and controlling the computers of consumers at a rate never before seen on the Internet.

The study finds that a complex and sophisticated criminal economy has developed to capitalise on Web vulnerabilities. Underground brokers are delivering tools to aid in obfuscation, or camouflaging attacks on browsers, so cybercriminals can avoid detection by security software. In 2006, only a small percentage of attackers employed camouflaging techniques, but this number soared to 80 percent during the first half of 2007, and reached nearly 100 percent by the end of the year. The X-Force believes the criminal element will contribute to a proliferation of attacks in 2008.

Using these techniques, cybercriminals can infiltrate a user's system and steal their IDs and passwords or obtain personal information like National Identification numbers, Social Security numbers and credit card information. When attackers invade an enterprise machine, they could steal sensitive company information or use the compromised machine to gain access to other corporate assets behind the firewall.

The Storm Worm, the most pervasive Internet attack last year, continues to infect computers around the world through a combination of the threats the X-Force tracks, including malicious software (malware), spam and phishing. Last year, delivery of malware was at an all-time high, as X-Force reported a 30 percent rise in the number of malcode samples identified. The Storm Worm comprised around 13 percent of the entire malcode set collected in 2007.

In other findings, for the first time ever, the size of spam emails decreased sharply to pre-2005 levels. X-Force believes the decrease is linked to the drop off of image-based spam. This decrease can be counted as a win for the security industry - as anti-spam technologies became more efficient at detecting image-based spam, spammers were forced to turn to new techniques.

The X-Force has been cataloguing, analysing and researching vulnerability disclosures since 1997. With more than 33,000 security vulnerabilities catalogued, it has the largest vulnerability database in the world. This unique database helps X-Force researchers to understand the dynamics that make up vulnerability discovery and disclosure.

The new X-Force report from IBM also reveals that:

• The number of critical computer security vulnerabilities disclosed increased by 28 percent, a substantial upswing from years past.
• The overall number of vulnerabilities reported for the year went down for the first time in 10 years.
• Out of all the vulnerabilities disclosed last year, only 50 percent can be corrected through vendor patches.
• Nearly 90 percent of 2007 disclosed vulnerabilities are remotely exploitable.

The report (PDF) can be viewed on ISS.net.
http://www.net-security.org/secworld.php?id=5813





Prototype Software Sniffs Out, Disrupts Botnets
Layer 8

Researchers this week detailed a prototype system to identify and eradicate botnets in the wild.

Georgia Tech’s BotSniffer uses network-based anomaly detection to identify botnet command and control channels in a local area network without any prior knowledge of signatures or server addresses, the researchers said. The idea is to ultimately detect and disrupt botnet infected hosts in the network.

The researchers said their prototype, which was presented at the Internet Society's Network and Distributed System Security Symposium this week, is based on the fact that botnets engage in coordinated communication, propagation, and attack and fraudulent activities. BotSniffer can capture network command and control protocols and utilize statistical algorithms to detect botnets. The researchers also said they built BotSniffer detectors as plug-ins on top of the popular open source Snort intrusion detection system, but that BotSniffer is independent of Snort and not included in the Snort distribution.

“We evaluated BotSniffer using many real-world network traces. The results show that BotSniffer can detect real-world botnets with high accuracy and has a very low false positive rate,” the researchers said.

Botnet command and control traffic, which often uses Internet Relay Chat (IRC) or HTTP protocols, is difficult to detect because it follows normal protocol usage and is similar to normal network traffic. Botnet traffic volume is low as well and may contain encrypted communication, adding to the difficulty, researchers said.

“However, we observe that the bots of a botnet demonstrate spatial-temporal correlation and similarities due to the nature of their pre-programmed response activities to control commands. This helps us identify command and control within network traffic. For instance, at a similar time, the bots within a botnet will execute the same command -- obtain system information, scan the network -- and report to the command and control server with the progress/result of the task. Normal network activities are unlikely to demonstrate such a synchronized or correlated behavior. Using BotSniffer’s sequential hypothesis testing algorithm, when we observe multiple instances of correlated and similar behaviors, we can conclude that a botnet is detected.”
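The spatial-temporal correlation idea the researchers describe can be sketched in a few lines. This is a deliberately crude, hypothetical simplification for illustration, not BotSniffer's actual code or its sequential hypothesis testing algorithm: it simply flags any server that many distinct hosts contact within the same short time window.

```python
from collections import defaultdict

def detect_correlated_hosts(events, window=2.0, min_hosts=3):
    """Flag servers contacted by many hosts within the same short time
    window -- a toy stand-in for spatial-temporal correlation.

    events: list of (timestamp_seconds, host, server) tuples.
    Returns {server: set_of_hosts} for suspicious servers.
    """
    # Bucket events per server into coarse time windows.
    buckets = defaultdict(set)  # (server, window_index) -> hosts seen
    for ts, host, server in events:
        buckets[(server, int(ts // window))].add(host)

    # A server is suspicious if many distinct hosts react in one window:
    # normal traffic rarely shows such synchronized behavior.
    suspicious = {}
    for (server, _), hosts in buckets.items():
        if len(hosts) >= min_hosts:
            suspicious.setdefault(server, set()).update(hosts)
    return suspicious

# Three hosts responding to the same (hypothetical) server at once
# look bot-like; the lone mail lookup does not.
events = [
    (0.1, "h1", "irc.evil.example"), (0.3, "h2", "irc.evil.example"),
    (0.4, "h3", "irc.evil.example"), (5.0, "h1", "mail.example"),
]
print(detect_correlated_hosts(events))
```

A real detector would of course also compare the content of the responses, accumulate evidence over time, and control its false positive rate, which is where the statistical machinery comes in.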

The researchers said they consider the botnet’s use of command and control channels to be the weakest link of a botnet. “If we can take down an active command and control or simply interrupt the communication to the command and control, the botmaster will not be able to control his botnet. Moreover, the detection of the command and control channel will reveal the command and control servers and the bots in a monitored network. Therefore, understanding and detecting the command and controls has great value in the battle against botnets,” researchers said.

BotSniffer joins BotHunter, BotMiner and BotProbe as emerging techniques to fight botnets. BotHunter, for example, is a dialog-correlation-based engine that recognizes the communication patterns of malware-infected computers within a network.

Certainly tracking and eradicating botnets is a growing business. The Storm botnet, which has grown into a large remotely controlled botnet since the initial worm appeared a year ago to infect victims' machines, has a real-time tracker on Secure Computing's TrustedSource.org research portal which displays real-time information compiled through sensors maintained in 75 countries.

Big security software vendors such as McAfee, Symantec and Trend Micro have added botnet-fighting features to their packages. Others, such as Endeavor Security, working through a Department of Homeland Security-funded research program, are introducing products that can help combat malware.

In its third annual survey of network infrastructure security, network security firm Arbor Networks found that botnets are seen as the most significant threat by ISPs. It marked the first time that Arbor had listed botnets as a survey option for potential threats to Internet service; in previous editions of the survey, DDoS attacks had been the overwhelming choice as the top threat.

The Federal Bureau of Investigation's Director Robert Mueller called botnets one of the Internet’s most grave dangers. "Once under their thumbs, [botnets] can wreak all kinds of havoc, from shutting down a power grid to flooding an emergency call center with millions of spam messages."

The FBI in November said its Operation ‘Bot Roast’ had netted eight individuals who have been indicted, pled guilty, or been sentenced for crimes related to botnet activity. Additionally, 13 search warrants were served in the U.S. and by overseas law enforcement partners in connection with the operation, the FBI said. This ongoing effort has thus far uncovered more than $20 million in economic loss and more than one million victim computers.
http://www.networkworld.com/community/node/25105





Taming the Data Deluge with the New Open Source iRODS Data Grid System

In the Information Age, the freedom to easily generate and share digital forms of information is driving life-changing advances in science and medicine, dramatic expansions in communications, big gains in business productivity, and a new flowering in video, music, and other cultural expressions.

At the same time, the digital data we all love is growing explosively. In 2006, humanity produced 161 exabytes of digital data – that’s 161 billion billion bytes, or 12 stacks of books stretching from the Earth to the Sun – more data than our capacity to store it.

This deluge of data is bringing with it unprecedented challenges in organizing, accessing, sharing, and preserving digital information. To meet these challenges, the Data-Intensive Computing Environments (DICE) group at the San Diego Supercomputer Center (SDSC) at UC San Diego has released version 1.0 of iRODS, the Integrated Rule-Oriented Data System, a powerful new open-source approach to managing digital data.

“iRODS is an innovative data grid system that incorporates and moves beyond ten years of experience in developing the widely used Storage Resource Broker (SRB) technology,” said Reagan Moore, director of the DICE group at SDSC. “iRODS equips users to handle the full range of distributed data management needs, from extracting descriptive metadata and managing their data to moving it efficiently, sharing data securely with collaborators, publishing it in digital libraries, and finally archiving data for long-term preservation.”

The most powerful new feature, for which the Integrated Rule-Oriented Data System is named, is an innovative “rule engine” that lets users easily accomplish complex data management tasks. Users can automate the enforcement of, or “virtualize,” data management policies by applying rules that control the execution of all data access and manipulation operations. Rather than having to hard-code these actions or workflows into the software, the user-friendly rules let any group easily customize the iRODS system for their specific data management needs.

For example, when astronomers take new photographs in a sky survey and enter them into a data collection, the researchers can set up iRODS rules to automatically extract descriptive information and record it in the iRODS Metadata Catalog (iCAT), replicate a copy to another repository for backup, create a thumbnail for a Web-based gallery, and run an analysis program to identify related images.

An organization’s archivist can configure iRODS rules to identify and retain a collection of digital records for five years, and then move them to another site or destroy them. And if someone requests these records, the archivist can confirm that the current digital copy is indeed an authentic copy of the original. iRODS rules are being developed that will validate the trustworthiness of digital repositories.

Users can apply the growing set of existing rules or write new ones. Rules can also be developed as community-wide policies to manage data.
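The sky-survey example above maps naturally onto an event-driven rule engine. The following Python sketch is purely illustrative: real iRODS rules are written in the system's own rule language and executed by its rule engine, and every function name here is hypothetical.

```python
# Toy rule engine in the spirit of policy-driven data management:
# an event ("on_ingest") triggers an ordered list of actions,
# standing in for what iRODS rules do with real storage operations.

def extract_metadata(obj):
    """Pretend to pull descriptive metadata for the catalog."""
    return {"name": obj, "type": "image"}

def replicate(obj, target):
    """Pretend to copy the object to a backup repository."""
    return f"{obj} copied to {target}"

def make_thumbnail(obj):
    """Pretend to render a thumbnail for a web gallery."""
    return f"{obj}.thumb"

# A "rule" binds an event to the actions that must run, in order.
RULES = {
    "on_ingest": [
        lambda obj, log: log.append(("metadata", extract_metadata(obj))),
        lambda obj, log: log.append(("replica", replicate(obj, "backup-site"))),
        lambda obj, log: log.append(("thumbnail", make_thumbnail(obj))),
    ],
}

def fire(event, obj):
    """Run every action bound to an event, recording what was done."""
    log = []
    for action in RULES.get(event, []):
        action(obj, log)
    return log

print(fire("on_ingest", "sky_survey_042.fits"))
```

The point of the design is the one Moore makes: the policy lives in the rule table, not in application code, so a site can change what happens on ingest without touching the software itself.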

“One reason policy-based data management is important is that it lets communities integrate across different types of collection structures,” said Moore. “What this means is that iRODS lets one community talk to any other community independent of what data management system the other community is using. No matter which technology you pick you aren’t isolated.”

iRODS is designed to be flexible, growing seamlessly from small to very large needs.

“You can start using it as a single user who only needs to manage a small stand-alone data collection,” said Arcot Rajasekar, who leads the iRODS development team. “The same system lets you grow into a very large federated collaborative system that can span dozens of sites around the world, with hundreds or thousands of users and numerous data collections containing millions of files and petabytes of data – it’s a true full-scale distributed data system.” A petabyte is one million gigabytes, about the storage capacity of 10,000 of today’s PCs.

At SDSC alone iRODS and its predecessor SRB technology are already managing one petabyte of data and two hundred million files for 5,000 users.

“It’s an advantage that the new iRODS system is open source,” added Rajasekar. “This is bringing in collaborators from the US and as far away as France, the UK, Japan, and Australia who are contributing code, so iRODS will quickly add more features.”

“We also find that users like the open source approach and have more confidence in adopting the new technology. Open source software makes it possible to assemble a larger development team and interact with a wider range of user communities. This increases user confidence that the iRODS system will be around in the future.”

Currently the iRODS team is working with partners to help a number of projects apply the technology, including the National Archives and Records Administration (NARA), the Ocean Observatories Initiative (OOI), the National Science Digital Library, the Temporal Dynamics of Learning Center (TDLC), the UC Humanities, Arts and Social Sciences (HASS) grid and the Testbed for the Redlining Archives of California’s Exclusionary Spaces (T-RACES) project, and numerous others.

Version 1.0 of iRODS is supported on Linux, Solaris, Macintosh, and AIX platforms, with Windows coming soon. The iRODS Metadata Catalog (iCAT) will run on either the open source PostgreSQL database (which can be installed via the iRODS install package) or Oracle. And iRODS is easy to install -- just answer a few questions and the install package automatically sets up the system.

Under the hood, the iRODS architecture stores data on one or more servers, which may be widely separated geographically; keeps track of system and user-defined information describing the data with the iRODS Metadata Catalog (iCAT); and offers users access through clients (currently a command line interface and Web client, with more to come). As directed by iRODS rules, the system can process data where it is stored using applications called “micro-services” executed on the remote server, making possible smaller and more targeted data transfers.

“Because it’s a second-generation effort, iRODS isn’t like a new, untested product, since we have the knowledge from years of experience with dozens of projects using the SRB,” said iRODS software architect Mike Wan. “iRODS includes the familiar functions from the SRB, so people can jump in and easily start using the new system.”

Added iRODS senior software engineer Wayne Schroeder, “For a 1.0 release it has a large number of features -- we already knew where we were going as we developed it, and this has made it cleaner and faster.”

To help users get started with iRODS, the DICE group is offering several tutorials and workshops in the US and internationally. Following on the very popular Society of American Archivists (SAA) workshop at SDSC last summer, there will be two SAA sessions this summer, with additional tutorials in the US, Europe, and Asia.

The DICE team plans to continue supporting the widely used SRB system well into the future. But as SRB users decide to upgrade, the team is developing a seamless migration path to the more capable and faster iRODS system. As part of this, for a digital data collection at NARA the iRODS team has already migrated one million files from an SRB data grid to an iRODS data grid.

“We migrated not only the data files but also the metadata, access controls, and directory structure,” said Moore. “This is an important demonstration that users can migrate collections to different choices of data grid technology without any problem.”

In addition to Moore, Rajasekar, Wan, and Schroeder, group members who contributed to the iRODS system include: Sheau-Yen Chen, Lucas Gilbert, Chien-Yi Hou, Arun Jagatheesan, George Kremenek, Sifang Lu, Richard Marciano, Dave Nadeau, Antoine de Torcy, and Bing Zhu. Other collaborators in the iRODS project include the French Institut National de Physique Nucléaire et de Physique des Particules (IN2P3), the UK e-Science Data Management Group at Rutherford Appleton Laboratory, and the High Energy Accelerator Research Organization, KEK, in Japan.

iRODS is funded by NARA and the National Science Foundation (NSF). More information, the iRODS software download, and documentation are available at http://irods.sdsc.edu.
http://www.newswise.com/articles/view/537570/?sc=swtn





BlackBerry Service Out in North America; AT&T Says All Carriers Affected
AP

An outage has disconnected BlackBerry smart phones across North America.

AT&T Inc. says the disruption Monday is affecting all wireless carriers. AT&T first learned about the problem at about 3:30 p.m. EST.

There's no word on the cause or when the problem might be fixed.

BlackBerry maker Research In Motion did not immediately return a phone call.
http://www.newstimes.com/latestnews/ci_8231883





Routine Upgrade Blamed for BlackBerry Outage

Research In Motion said late Tuesday that a problem with a routine upgrade of an internal data-routing system appears to have caused the massive outage that left BlackBerry users with spotty or nonexistent wireless e-mail service for three hours on Monday.

“RIM’s early investigation ... points to a problem with an internal data routing system within the BlackBerry service infrastructure that had been recently upgraded,” the company said in a statement.

“The upgrade was part of RIM’s routine and ongoing efforts to increase overall capacity for longer term growth.”

RIM has been adding corporate, government and retail subscribers at a torrid pace and has had to expand its capacity in step to handle increased e-mail and other data traffic. Its total subscriber base sits at about 12 million, according to the latest available data.

The company said that similar capacity upgrades had been successful in the past, noting that once it found the problem on Monday, it was able to quickly restore service.

“No messages were lost and the system continues to operate normally today,” it said.

The outage was the second major loss of service for RIM in less than a year. The last one hit North America and occurred last April. That time, RIM blamed a new storage feature which hadn’t been properly tested.
http://www.nytimes.com/2008/02/12/te...12cnd-rim.html





Amazon’s S3 Cloud Has a Dark Lining for Startups
Brad Stone

A few days after the BlackBerry e-mail system’s latest downtime, Amazon is giving companies another reason to worry about outsourcing company-critical functions.

Amazon’s S3 service, which offers cheap, accessible Web storage for hundreds of thousands of companies, went down this morning at around 7:30 a.m. Eastern time and is only now slowly creeping back up, delivering high error rates, according to various bloggers and posters on Web forums.

S3, one of Amazon’s stable of Web services, lets businesses store their data “in the cloud,” avoiding the need to operate their own servers. It is part of the same online infrastructure that Amazon uses to run its own business. Over 330,000 developers have registered to use Amazon Web Services, up more than 30,000 from last quarter, according to Amazon’s recent quarterly earnings announcement.

An Amazon spokesman said: “We’ve resolved this issue and performance is returning to normal levels for all Amazon Web Services that were impacted. We understand the critical importance of our services to our customers, which is why operational excellence is our highest priority.”

The technical problems affected a host of startups that use S3, such as the messaging service Twitter. The New York Times also uses S3 to store and deliver articles from its historical archives, parts of which were unavailable this morning.

On Amazon’s Web developer forum, the natives are restless. “My business is effectively closed right now because Amazon did something wrong. I’ll have to reconsider using the service now,” wrote one poster at around 9 a.m. Eastern time this morning.

Will all of this hurt Amazon? This is the first significant problem for S3, so if they fix the problem quickly and communicate clearly to customers, probably not. But if it proves to be a continuing problem — like the now biannual BlackBerry blackouts — expect nervous tech folks to at the very least begin considering backup systems.
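What "considering backup systems" might look like in practice can be sketched simply: retry the primary store a few times with exponential backoff, then fall back to a secondary copy. This is a generic resilience pattern, not an actual AWS API; `primary` and `fallback` here are hypothetical callables standing in for any two storage backends.

```python
import time

def fetch_with_fallback(key, primary, fallback, retries=3, base_delay=0.01):
    """Try the primary store with exponential backoff between attempts,
    then fall back to a secondary copy if it stays unreachable."""
    delay = base_delay
    for _ in range(retries):
        try:
            return primary(key)
        except IOError:
            time.sleep(delay)  # back off before retrying
            delay *= 2         # exponential backoff
    return fallback(key)       # last resort: read the backup copy

# Simulate an outage: the primary always errors, the backup works.
def broken_primary(key):
    raise IOError("503 Service Unavailable")

def backup_store(key):
    return b"cached copy of " + key.encode()

print(fetch_with_fallback("article-1851.html", broken_primary, backup_store))
```

A site that served its archive this way during the outage would have degraded to stale copies instead of going dark entirely.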

UPDATE: Here’s another statement from an Amazon spokesman:

“For one of our services, the Amazon Simple Storage Service, one of our three geographic locations was unreachable for approximately two hours and was back to operating at over 99% of normal performance before 7 a.m. pacific standard time. We’ve been operating this service for two years and we’re proud of our uptime track record. Any amount of downtime is unacceptable and we won’t be satisfied until it’s perfect. We’ve been communicating with our customers all morning via our support forums and will be providing additional information as soon as we have it.”
http://bits.blogs.nytimes.com/2008/0...ups/index.html





HOW TO: Run Windows Update from a CD (sort of!)
James Bannan

When was the last time you installed a fresh copy of Windows XP SP2? The process is still straightforward and relatively quick...but then you think “I’ll just make sure the patches are up to date”, and then proceed to stare with horror at the 100+ security updates and critical fixes which Windows Update or WSUS demands you install.

One of the things to look forward to with XP Service Pack 3 is that the whole patching process is reset back to something more civilised, but until that time you’re still faced with the prospect of a massive blowout of installation time, chewed-up bandwidth and far too many reboots.

WSUS (Windows Server Update Services) does alleviate this somewhat by making all the patches available offline, but this solution is normally only used in a business environment – not every home user has access to Windows Server 2003 or the inclination to keep a WSUS database up-to-date.

Another option is to maintain an installation source (like a network share) with all the relevant patches slipstreamed. Again, this is effective if you’ve got the time to maintain it.

A better option which we’ve just discovered is the innovative work of Alek Patsouris. Entitled “Project Dakota”, it’s a self-contained boot CD which contains all the necessary updates to automatically patch a Windows XP SP2 system with all the patches available at the CD’s build time. It also contains Service Pack 2, so you can use it to bring non-SP2 systems into compliance, and it comes bundled with other third-party applications like Firefox and Quicktime.
Tutorial – Using Project Dakota

Download the latest Project Dakota build. It is a full CD ISO weighing in at 700MB, so this may take some time. You can burn the ISO to create a bootable CD, or use an image reading tool like WinISO, WinRAR or Daemon Tools to extract the contents to a shared network location. For the sake of this tutorial, I’ll assume that you’re using the CD option.

Insert the CD. This will bring up the autorun window and three options – “Run Project Dakota”, “Test Optical Drive” and “File Lists and other info”.

Project Dakota Autorun

The optical drive test makes sure that there are no issues with either the drive or the media, and the file list option gives the details of the patches which are available for install, and instructions on setting up a networked install. If you select this option, you’ll see that the installation script is broken up into groups or bundles of patches. After each group has been installed the system reboots.

Project Dakota Installation

Select “Run Project Dakota”. The next screen prompts you to go and relax, but unfortunately you can’t quite do this as the installation isn’t 100% automated. Although the patches don’t require any user intervention, there are four optional packages which do: .NET Framework 1.1, .NET Framework 2.0, IE7 and Media Player 11. You’re given the option whether or not to install these, and until you make a selection the overall process won’t proceed.

Automatic system restarts are built into the install script, so it’s obviously best not to be using the system you’re patching for anything else whilst Project Dakota is running.
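The group-then-reboot scheme described above can be sketched as a resumable loop: apply one bundle of patches, persist where you got to, restart, and pick up at the next bundle. This is a hypothetical illustration of the control flow only, with placeholder patch IDs, not Project Dakota's actual install script (which is not Python).

```python
# Patches are applied in bundles; after each bundle the machine reboots
# and the script resumes from persisted state. Patch IDs are placeholders.
PATCH_GROUPS = [
    ["KB000001", "KB000002"],
    ["KB000003"],
    ["KB000004", "KB000005"],
]

def run_install(state):
    """Apply the next pending group; return True once all groups are done.

    `state` is a dict that would be persisted to disk across reboots,
    recording which group to resume from.
    """
    group_index = state.get("next_group", 0)
    if group_index >= len(PATCH_GROUPS):
        return True
    for patch in PATCH_GROUPS[group_index]:
        state.setdefault("installed", []).append(patch)
    state["next_group"] = group_index + 1
    return state["next_group"] >= len(PATCH_GROUPS)

state = {}
reboots = 0
while not run_install(state):
    reboots += 1  # each pass between "reboots" installs one bundle
print(state["installed"], "reboots:", reboots)
```

Grouping with checkpoints is what makes the automatic restarts safe: a reboot mid-run loses nothing, because the script always knows which bundle comes next.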

Once the patching process is complete, you’re presented with a few more option screens to install third-party applications like Firefox, AVG and Flash, or to run various system tools/optimisation scripts like disabling System Restore or Disk Defragmenter. The applications may not be the latest available as they are bundled at the time of the ISO build, but it’s a very useful feature to have for offline installations or on systems with limited internet access.

Project Dakota - Third-Party Apps

After Project Dakota had finished running on my test system I jumped online to Windows Update and ran a check to see which patches were still outstanding. Impressively there were only nine high priority updates and two optional software updates.

Project Dakota - Outstanding Updates

You can re-run Project Dakota on a system which you’ve previously patched. You’ve got the option to roll back, uninstall all the updates and erase the patches, or overwrite everything and re-patch.

Project Dakota - Reinstallation Options

At the moment this terrific and worthwhile project is only available for Windows XP, but plans are in place to make it available for Windows 2000, Server 2000/2003, Windows Vista and Home Server.
http://apcmag.com/8006/patch_windows_xp_from_a_cd





Vista Incapable

Suit Says Microsoft Knew it Misled
Joseph Tartakoff

Quoting extensively from internal Microsoft Corp. e-mails, plaintiffs' lawyers argued Friday that the company knowingly misled consumers by allowing PC makers to emblazon "Windows Vista Capable" stickers on PCs that could run only the most bare-bones version of the operating system.

The new documents are the latest development in a lawsuit filed against Microsoft last year, charging that the company deceived consumers into thinking that the PCs they were buying could run Vista's most highly promoted features, even when they couldn't. The slogan was part of a campaign by Microsoft to maintain sales of Windows XP computers during the 2006 holiday shopping season after Windows Vista was delayed.

The hearing Friday before U.S. District Judge Marsha Pechman was held to determine whether the lawsuit merited class-action status and whether Washington law applied.

During his opening presentation, plaintiffs' lawyer Jeffrey Tilden of Gordon Tilden Thomas & Cordell quoted from numerous internal e-mails that appeared to show that employees within Microsoft had misgivings about the "Windows Vista Capable" campaign. The documents are under seal pending a ruling by Pechman.

"Even a piece of junk will qualify" for the "Windows Vista Capable" designation, wrote one employee in an e-mail that Tilden read out loud.

Another employee, Mike Nash, currently a corporate vice president for Windows product management, wrote in an e-mail, "I PERSONALLY got burnt. ... Are we seeing this from a lot of customers? ... I now have a $2,100 e-mail machine."

Jim Allchin, then the co-president of Microsoft's Platforms and Services Division, wrote in another e-mail, "We really botched this. ... You guys have to do a better job with our customers."

Another e-mail chain presented in court showed that Wal-Mart was concerned about the impact the campaign could have, and Tilden hinted that other retailers had similar concerns.

Microsoft downplayed the significance of the e-mail exchanges. "The e-mails cited in today's hearing are isolated, and in many instances, outdated and really just snippets of a broad and thorough review that took place during the development of the Windows Vista Capable program," David Bowermaster, a Microsoft spokesman, said in a statement.

He added, "Throughout this review, Microsoft employees raised concerns and addressed issues with the aim of making this program better for our partners and more valuable for consumers. In the end, we believe we achieved both objectives."

The company's marketing campaign also used the term "premium ready" to indicate when a computer was able to run the most-advanced versions of Windows Vista.

Both sides had 40 minutes to make their cases.

During his presentation, Stephen Rummage, a lawyer with Davis Wright Tremaine who represented Microsoft, argued that the case did not merit class-action status, since each customer who bought a "Windows Vista Capable" computer had different information at the time of the purchase. For instance, a Microsoft Web site provided information about the various editions of Vista, while magazines, blogs and even some retailers also explained the distinctions.

"We know that there was a wealth of information available to the public," he said. "They have not presented the court with a single document showing what people were told."

Plaintiffs' attorney Jeffrey Thomas countered that it made sense for the case, which was filed by two plaintiffs, to proceed as a class action because all the individuals who bought "Windows Vista Capable" PCs were united in that "each person in our class did not get what they paid for."

Pechman said she would issue a ruling in about 10 days.
http://seattlepi.nwsource.com/busine...ftvista09.html





EU Conducts Antitrust Raid on Intel, Retailers
David Lawsky

The European Commission conducted antitrust raids against Intel Corp's Munich offices on Tuesday and against retailers selling products of the world's largest chip maker, the Commission and the chip maker said.

The European Union watchdog's actions ratcheted up pressure on Intel and broke new ground by raiding Germany's huge Media Markt-Saturn and British electrical goods retailer DSG International Plc, which owns Dixons and Currys.

Intel has been preparing for a Brussels hearing on March 11 and 12 to answer pending charges it abused its dominance of the market for central processing units (CPUs) at the heart of every PC.

"Commission officials carried out unannounced inspections at the premises of a manufacturer of central processing units and a number of personal computer retailers," said Jonathan Todd, a Commission spokesman.

He said the Commission, accompanied by local law enforcement staff, conducted the raids because it had reason to believe the companies "may have violated EC (European Community) Treaty rules on restrictive business practices and/or abuse of a dominant market position."

Intel confirmed the raids.

"There has been a raid on our offices in Munich. As is our normal practice, we are cooperating with authorities," said Chuck Mulloy, a spokesman for the chip maker.

In London, British retailer DSG said it was part of the sweep as well.

"I can confirm that officials from the EU Commission are currently conducting an inspection at our Retail Support Centre in Hemel Hempstead," a DSG spokesman said in a statement.

Germany's Media Markt-Saturn also confirmed it was raided. It is a subsidiary of trading company Metro, which controls most of that country's retail electronics market and operates in other countries as well.

The raids come as Intel faces a closed hearing in Brussels next month on charges that it slashed prices below cost and offered huge rebates in an attempt to drive smaller competitor Advanced Micro Devices Inc out of the market.

The Commission was already investigating Media Markt-Saturn for its ties to Intel, acting on a reference from the German anti-cartel agency. The retailer sells PCs with Intel CPUs but not those by AMD.

The Commission is the EU's antitrust watchdog and has powers to fine companies up to 10 percent of their global annual revenue for competition abuses.

(Additional reporting by Jens Hack in Munich and Dan Lalor in London; Editing by David Holmes, Paul Bolding)
http://www.reuters.com/article/techn...16666220080212





Hewlett-Packard Settles Spying Case
Matt Richtel

Hewlett-Packard has agreed to a financial settlement with The New York Times and three BusinessWeek magazine journalists in connection with the company’s spying scandal that stemmed from surreptitiously obtaining private phone records.

The parties to the dispute declined to disclose the amount of the settlement, which was reached privately and not as a result of a lawsuit.

The settlement brings to a close one of a few unresolved aspects of a spying scandal that tarnished Hewlett-Packard and brought down its chairwoman, Patricia C. Dunn, and several high-ranking executives.

To trace what Ms. Dunn and others considered disloyal and risky leaks from the company’s board, H.P. retained investigators who engaged in a wide-ranging investigation in 2006. The inquiry involved "pretexting," a practice in which someone pretends to be someone else to obtain private records from phone companies.

Phone records compromised included those of the three BusinessWeek staff members — Ben Elgin, Peter Burrows and Roger Crockett — and the family phone records of The New York Times reporter, John Markoff, and his wife, Leslie Terzian Markoff.

"What H.P. did was an affront to the free press," said Terry Gross, a San Francisco lawyer who represented the reporters for BusinessWeek, part of the McGraw-Hill Companies, their families and The New York Times. "They didn’t like what reporters were writing, and they broke into their private telephone accounts to identify who their sources were."

In a written statement, Hewlett-Packard said it was pleased to have resolved the matter.

Previously, the company agreed to settle a lawsuit by the California attorney general, paying $14.5 million in fines and promising to change its corporate governance practices.

News organizations have customarily been hesitant about pursuing financial settlements with companies or people they write about, wanting to avoid the perception that coverage could be tied to compensation.

In a statement, The New York Times said it pursued a claim against Hewlett-Packard in part to send the message that "corporate misconduct aimed at silencing the press is not acceptable and will not be tolerated." The Times pursued the claim on Mr. Markoff’s behalf, and he did not individually seek compensation.

The Times donated its money from the settlement to groups including the Center for Investigative Reporting and the Investigative Journalism Program at the journalism school of the University of California, Berkeley.

Mr. Gross said the BusinessWeek reporters also planned to give some of their settlement money to charity.

The company still faces five lawsuits brought by other journalists and their families, who assert that Hewlett-Packard used pretexting to illegally obtain their phone records. Those cases, filed together in August 2007 in San Francisco Superior Court, are pending, according to the plaintiffs’ lawyer, Kevin Boyle.

Two reporters for The Wall Street Journal were also targets of Hewlett-Packard investigators, but The Journal has said it will not take part in legal action or settlement talks.
http://www.nytimes.com/2008/02/14/bu...edia/14hp.html





EU: Printer Tracking Dots May Violate Human Rights
Danny O'Brien

We've long been concerned about the human rights risks of printer tracking dots for anyone who publishes printed works with modern technology. Tracking dots are the secret marks that many popular color laser printers and photocopiers scatter across every document they touch. The marks, almost invisible to the eye, uniquely identify the printer that produced the document, and, as EFF uncovered, can even automatically encode the time and date it was created.

Anonymous self-publication and distribution have been, and remain, a vital political communication channel in many countries. A telltale pattern readable by government officials is a tool that oppressive states everywhere would love to have -- not to mention the general threat to individual privacy in countries more respectful of human rights.

It turns out that the European Commission, the executive wing of the EU (whose members include many former Eastern Bloc states), shares these concerns. When asked by Satu Hassi, Green Member of European Parliament for Finland, about the legality within Europe of America's tracking dots, Commissioner Frattini said that while the Commission could not uncover a specific law against the dots themselves,

Quote:
to the extent that individuals may be identified through material printed or copied using certain equipment, such processing may give rise to the violation of fundamental human rights, namely the right to privacy and private life. It also might violate the right to protection of personal data.

There's some irony in hearing such concerns come from Commissioner Frattini, who is currently championing his own privacy invasions with a proposed EU Passenger Name Record data-mining network.

Nonetheless, at least there is recognition in Europe of the dangers of these yellow dots. It also raises some follow-up questions. Given that including tracking systems in printers appears to be a U.S. government policy, how hard does the EU plan to pressure their ally for change in its secret agreements with printer manufacturers? Is the United States sharing its knowledge of how to decode these dots with individual EU nations' governments? And if so, what other governments, authoritarian or not, know the secret of tracking their citizens' publications?
http://www.eff.org/deeplinks/2008/02...e-human-rights





Dane, Tunisians Arrested in Cartoonist Murder Plot

A Danish citizen of Moroccan descent and two Tunisians were arrested in Denmark on Tuesday over a plot to murder one of 12 cartoonists whose drawings of the Prophet Mohammad caused worldwide uproar in 2006.

The Security and Intelligence Service (PET) said the arrests near Aarhus in western Denmark were made after lengthy surveillance to prevent a "terror-related killing" that was in an early stage of planning.

PET said it expected the 40-year-old Danish citizen to be released pending further investigation. The Tunisians will remain detained while deportation proceedings are brought against them.

According to Jyllands-Posten, the newspaper that originally published the cartoons in September 2005, the suspects are accused of planning to kill 73-year-old Kurt Westergaard.

He drew the cartoon that caused the most controversy, depicting the founder of Islam with a bomb in his turban. The paper reproduced that drawing on its Web site on Tuesday.

Westergaard, who has been under PET protection for several months, told Danish state TV he was sure the PET's interference had saved his life. But he said even with hindsight he would still have made the drawing.

The cartoons gained little initial attention but were later reprinted outside Denmark, provoking outrage among Muslims, most of whom deem any depiction of the Prophet as offensive.

Three Danish embassies were attacked and at least 50 people were killed in rioting in the Middle East, Africa and Asia. Several young Muslims have since been convicted in Denmark of planning bomb attacks, partly in protest at the cartoons.

Freedom

Prime Minister Anders Fogh Rasmussen said he was deeply concerned by the serious nature of the crime.

"Unfortunately, the matter shows that there are in Denmark groups of extremists that do not acknowledge and respect the principles on which Danish democracy is built," Rasmussen said in a statement. "In Denmark, we have freedom not only to think and talk, but also to draw."

The Islamic Faith Community, a religious Muslim organization at the centre of the cartoon controversy, condemned the plot, saying all disagreements should be handled via legal channels.

"It does not serve our purpose that people take the law into their own hands. On the contrary," it said in a statement. "We want to appeal to reason in both politicians and the media to not use this miserable example to feed the flames or use it for their own profit. No one in Denmark deserves to live in fear."

In the 2006 book "The Mohammad Crisis" written by former Reuters correspondent Per Bech Thomsen, Westergaard said he did not expect the cartoons to become a global affair.

"The idea was to illustrate that terrorists get their ammunition from the fundamentalist parts of Islam. It was not aimed at Muslims and Islam in general, but against the part that inspires and uses death and destruction," he said in the book.

Westergaard, a staff cartoonist at Jyllands-Posten who has been accused of being both anti-Semitic and anti-Christian in the past, told Thomsen he felt misunderstood.

"I was part of the project to strike a blow for freedom of expression and the anger over being threatened because one does one's work drowns out the fear," he said in the book.

(Editing by Catherine Evans)
http://www.washingtonpost.com/wp-dyn...021200198.html





Danish Newspapers To Reprint Muhammad Cartoon

Three of Denmark's largest newspapers plan to reprint a cartoon tomorrow that depicts the Prophet Muhammad wearing a bomb-shaped turban.

The announcement follows the arrests of three people suspected of plotting to kill Kurt Westergaard, who drew one of 12 Muhammad caricatures.

Publication of the drawings in 2005 and then again in 2006 led to protests in Muslim countries.

The papers say they are taking the action to show they will not be intimidated by fanatics.

Islamic law generally opposes any depiction of the prophet.
http://www.editorandpublisher.com/ea..._id=1003709991





EU Plans to Require Biometrics of all Non-European Visitors
Stephen Castle

All non-Europeans would need to submit biometric data before crossing Europe's frontiers under sweeping European Union proposals to combat illegal migration, terrorism and organized crime that are to be outlined this week.

The plans - arguably the biggest shake-up of border management in Europe since the creation of an internal travel zone - would apply to citizens of the United States and all other countries that now enjoy visa-free status.

They would, however, allow EU citizens and "low risk" frequent travelers from outside the bloc to pass through automated, fast-track frontier checkpoints without coming into contact with border guards. Voluntary programs for prescreening such visitors, who would register fingerprints and other data, would be stepped up.

The proposals, contained in draft documents examined by the International Herald Tribune and scheduled to go to the European Commission on Wednesday, were designed to bring the EU visa regime into line with a new era in which passports include biometric data.

The commission, the EU executive, argues that migratory pressure, organized crime and terrorism are obvious challenges to the Union and that the bloc's border and visa policy needs to be brought up to date.

It also wants a new European Border Surveillance System to be created, to use satellites and unmanned aircraft to help track the movements of suspected illegal migrants.

If approved by the commission this week, the measures would need the approval of all EU states.

The United States routinely requires European citizens to submit fingerprints when crossing its borders and the commission's document notes that America plans to introduce an electronic travel-authorization system for people from countries like Britain, France and Germany that are in its Visa Waiver Program.

The commission's proposals cover the Schengen zone, Europe's internal free-travel area named after the village in Luxembourg near where the original agreement between five countries was signed on June 14, 1985. Twenty-four countries are now members.

It is unclear whether Britain and Ireland, which along with Cyprus are not members of Schengen, would opt into the program.

Each year more than 300 million travelers cross EU borders, but there is no obligation for countries inside the Schengen free-travel zone to keep a record of entries and exits of non-European third-country nationals in a dedicated database. Moreover, if the visitor leaves from another Schengen country, it is often impossible to determine whether or not the visitor overstayed his or her visa.

The proposals, drafted by the European commissioner for justice and home affairs, Franco Frattini, suggest that non-Europeans on a short-stay visa would be checked against a Visa Information System that is already under construction and should be operational in 2012.

Frattini also is calling for a new database to be set up to store information on the time and place of entry and exit of non-European nationals, using biometric identifiers. Once a person's visa expired, an alert would go out to all national authorities that the visitor had overstayed his or her allotted time.

Travelers from countries with a visa requirement would need to provide biometric data at European consulates before leaving their home country. Those arriving from nations not requiring visas, like the United States, would also need to submit fingerprints and a digitalized facial image.

But the European Union would try to make the system more user-friendly for Europeans and some categories of bona fide visitors by granting them the status of "registered traveler." They would be able to have their biometric travel documents scanned and checked by machines.

All Europeans should be able to use such a system when EU countries complete the task of issuing passports with two biometric identifiers, by 2019 at the latest. The 27 EU countries started issuing passports with a digitalized facial image in August 2006 and, in June 2009, will add the holder's fingerprints. European residence permits will also contain the same identifiers.

Non-Europeans could gain the same, fast-track status providing they have not overstayed previous visas, have proof of sufficient funds to pay for their stay in Europe and hold a biometric passport.

All non-European nationals would be asked to make an electronic application, supplying key data, before their arrival, allowing them to be checked against anti-terror databases in advance.

The draft documents also highlight weaknesses in Europe's efforts to guard its borders. One paper points out that, in the eight EU countries with external borders in the Mediterranean Sea and southern Atlantic, frontier surveillance is carried out by about 50 authorities from 30 institutions, sometimes with competing competencies and systems.

The plans foresee increased use of satellites and unmanned surveillance aircraft to monitor unauthorized movements, and a computerized communication network to share information.

Frattini also wants to see a bigger role for the agency that coordinates cooperation over external borders, known as Frontex. Although the agency has been criticized in some southern European nations for failing to match the scale of the challenge over illegal migration, the commission argues that it has achieved impressive results.

In 2006 and 2007 more than 53,000 people were apprehended or denied entry at a frontier and at least 2,900 false travel documents were seized. In addition, 58 people suspected of links to illegal trafficking have been arrested.
http://www.iht.com/articles/2008/02/10/europe/union.php





FBI Worried About Surge in Sales of Book Detailing Airport Security Gaps

A book detailing gaps in airport security enjoys a surprising surge in sales; the FBI, which keeps track of sales of books that may benefit terrorists -- and also of similar library books being checked out -- wants to know more.

Counterterrorist analysts at the FBI have been monitoring the sales of books dealing with aviation and other security issues. The bureau recently sent a letter to Prometheus Books in New York to inquire about a recent spike in sales of a title critical of gaps in airport security. The book, Aviation Insecurity: The New Challenges of Air Travel, by Andrew Thomas, saw a surge in orders in the fourth quarter, raising a red flag at the FBI. Originally published in 2003, the paperback gives a close examination of security missteps at Boston's Logan International Airport, and paints a bleak picture of the security apparatus under the FAA and the airlines in the run-up to the 9/11 hijackings. In addition, it warns of continued vulnerabilities in the new aviation security system.

The 263-page book also includes an FAA executive summary from 9/11, first revealed in WND, that states that one of the hijackers shot a passenger aboard American Airlines Flight 11. More than 500 copies of the book were purchased in a short period in late November, an official familiar with the FBI inquiry said. It is still unclear who was trying to buy up available copies of the title. Thomas, the book's author, declined comment. Thomas is an assistant professor at the University of Akron in Ohio. The FBI keeps a list of security books in addition to Thomas's book, and tracks the titles through sales and distribution channels. Libraries also are monitored for activity and interest in the listed titles.
http://hsdailywire.com/single.php?id=5558





U.K. Student Records to Sit in Accessible Database
Avril Ormsby

British students aged 14 to 19 will have their school records permanently placed on an electronic database accessible to prospective employers.

The project, called Managing Information Across Partners (MIAP), will launch in September. The record will include personal details and exam results and will remain with the pupil for life.

More than 40 partners, including the Learning and Skills Council, the Department of Innovation, Universities and Skills, and the Department for Work and Pensions, are involved in the project.

The system will be based on a Unique Learner Number.

"The Unique Learner Number, necessary to acquire a learner record for the diploma is a unique identifier that can be used by a learner for life," MIAP said on its Web site. "It is a national number that is validated and is therefore deemed to be unique."

The aim is to expand the system to include other information and to allow details already available but scattered across many databases to be brought together, it said.

The pupil would have control over the record and would be able to restrict the information shared.

It is envisaged that the information could be transferred if the pupil changes school, goes to college or applies for work, MIAP said.

"This will save a lot of effort for the learner in having to present this information to a prospective employer or a college," it added.

Pupils currently have a Unique Pupil Number which is allocated by a school and used internally for administration purposes. It expires when the pupil leaves.

"At the moment both numbers will work alongside each other but it is quite likely that in the future the ULN would replace the UPN," MIAP said.

Margaret Morrissey, spokeswoman for the National Confederation of Parent Teacher Associations, said: "From the point of view of parents and children, hold on -- hold on to what is probably a good idea, but which raises concerns about data protection."

The ability of official bodies to keep personal data secure has been questioned by a spate of recent scandals.

In December, nine NHS trusts lost 168,000 patient records. A month before, the details of 25 million child benefit claimants went missing. And information on 3 million learner drivers disappeared during that time.

Government plans for national identity cards have also been criticized for their expense and so-called Big Brother infringement.
http://www.reuters.com/article/inter...50329920080213





When Surveillance Cameras Talk
Thomas K. Grose

Big Brother is not only watching you; in Barking and Dagenham, Big Brother wants a word. The disembodied voices of authority offering advice and warnings that now issue as if from thin air in the hardscrabble east London borough are, in fact, talking CCTV cameras — the latest high-tech weapon in the war on littering, graffiti, vandalism and other antisocial behavior. Sixteen of the borough's 84 surveillance cameras have been wired for sound, making London's first video monitoring network with a broadcasting capacity. A second borough, Southwark, will soon adopt the same system.

Both communities are among the 20 nationwide awarded $50,000 grants by Britain's Home Office to test the cameras, following an initial trial run last year in the Northern city of Middlesbrough. The talking cameras are the latest advance in a country that's embraced video surveillance with an enthusiasm that would make Orwell shudder. Liberty, a civil liberties group, conservatively estimates there are 4.2 million CCTV cameras currently in operation in the UK, one for every 14 residents. Anyone living or working in London will likely be captured on camera 300 times a day, the group claims. Indeed, the government's information commissioner, Richard Thomas, has called Britain a "surveillance society" in danger of becoming overly reliant on tracking technologies.

But Glynis Rogers, Barking's head of community safety, counters that CCTV surveillance is popular with the public, and calls the talking cameras "a natural progression" of the technology. She's also dismissive of Big Brother parallels. The vocal cameras, she says, assure residents that the council is "actively managing the [borough's] open spaces ... so for us, it's actually far more open than a Big Brother scenario." Barking's cameras mostly transmit such prerecorded spiels as: "CCTV is in operation in this area and antisocial behavior will be reported to the police." Another message reminds folks to keep an eye on their valuables. Eventually, the council wants to run contests to pick school children to voice some of the messages.

The system can also operate live, in real time. CCTV operators, keeping a vigilant eye on a bank of 39 monitors in their windowless office, can ad lib broadcasts, asking people, for instance, to pick up the litter they've just dropped, or warning them that their behavior's unacceptable.

Liberty is not impressed. While not wholly opposed to video surveillance, the group thinks it's been oversold as a crime-prevention method. "There's no evidence whatsoever that it actually deters crime," says Jen Corlew, the group's media director, and adding voices to the mix won't change that. "'Gimmick' is the word we've been using to refer to it."

For the most part, the people on the streets of central Barking were taking the audio messages in stride — on a recent day, few even stopped to seek the source of the sound each time one was broadcast. Barking's a working-class area with a large population of senior citizens. Incomes are low; unemployment is high; and the shopping area is bereft of the chi-chi stores and expensive coffee bars so prevalent in central London. Officials brag that crime rates are falling faster in Barking than in all of London, but many residents remain afraid to venture out at night. Not surprisingly, then, those asked on the streets and in shops were quick to voice support for the cameras. Typical was Maureen Lovely, a 66-year-old retiree: "I know it's a bit like Big Brother is watching you, but it's a good way of making people be aware. Hopefully, it will make things cleaner and quieter."

Still, some question how effective the talking cameras will be on Friday and Saturday nights, when crowds can get rowdy. Hussain Scandari, a 19-year-old college student, doubted if troublemakers and people who have been drinking heavily will pay much heed to the audio admonitions. "They'll just do their own thing." Still, if they're arrested, they won't be able to say they weren't warned.
http://www.time.com/time/world/artic...711972,00.html





Domestic Access to Spy Imagery Expands
Eileen Sullivan

A plan to use U.S. spy satellites for domestic security and law-enforcement missions is moving forward after being delayed for months because of privacy and civil liberties concerns.

The charter and legal framework for an office within the Homeland Security Department that would use overhead and mapping imagery from existing satellites is in the final stage of completion, according to a department official who requested anonymity because the official was not authorized to speak publicly about it.

The future of this program is likely to come up Wednesday when Homeland Security Secretary Michael Chertoff goes to Capitol Hill to talk about his department's spending plan.

Last fall, senior Democrats on the House Homeland Security Committee asked the department to put the program on hold until there was a clear legal framework of how the program would operate. This request came during an ongoing debate over the rules governing eavesdropping on phone calls and e-mails of suspected terrorists inside the United States.

The new plan explicitly states that existing laws which prevent the government from spying on citizens would remain in effect, the official said. Under no circumstances, for instance, would the program be used to intercept verbal and written conversations.

The department currently is waiting for federal executive agencies to sign off on the program — called the National Applications Office — and will share the details with lawmakers soon.

Domestic agencies such as the Federal Emergency Management Agency and Interior Department have had access to this satellite imagery for years for scientific research, to assist in response to natural disasters like hurricanes and fires, and to map out vulnerabilities during a major public event like the Super Bowl. Since 1974 the requests have been made through the federal interagency group, the Civil Applications Committee.

These types of uses will continue when the Homeland Security Department oversees the program and becomes the clearinghouse for these requests. But the availability of satellite images will be expanded to other agencies to support the homeland security mission. The details of how law enforcement agencies could use the images during investigations would be determined in the future after legal and policy questions have been resolved, the official said.

It is possible that in the future an agency might request infrared imaging of what is inside a house, for instance a methamphetamine laboratory, and this could raise constitutional issues. In these instances, law enforcement agencies would still have to go through the normal process of obtaining a warrant and satisfying all the legal requirements. The National Applications Office also would require that all the laws are observed when using new imaging technology.

Requests for satellite images will be vetted even more than they were when the requests went through the Civil Applications Committee. All requests will be reviewed by an interagency group that includes Justice Department officials to ensure civil rights and civil liberties are not violated.

This new effort largely follows the recommendations outlined by a 2005 independent study group headed by Keith Hall, a former chief of the National Reconnaissance Office and now vice president of the consulting firm Booz Allen Hamilton.
http://ap.google.com/article/ALeqM5g...ifwNgD8UP4GG03





US Judge Hedges Bets on Torture

A United States Supreme Court judge says it is not clear that the American constitution protects people from torture.

Justice Antonin Scalia says the constitution prohibits cruel and unusual punishments.

But he says it would be absurd not to inflict pain on someone when they had key knowledge of an imminent attack.

"Once you acknowledge that we're into a different game, how close does the threat have to be and how severe can the infliction of pain be?" he asked.

"I don't think these are easy questions at all, in either direction, but I certainly know you can't come in smugly and with great self-satisfaction and say, 'oh it's torture and therefore it's no good'," he said.

"You would not apply that in some real life situations."
http://www.abc.net.au/news/stories/2...13/2161360.htm






Retroactive Immunity: What's Next?
Jason Rosenbaum

I just got off a conference call with Chris Dodd. Not surprisingly, he said he was "disappointed" by the outcome of the votes today. He said, "We’ve just sanctioned the single largest invasion of privacy in American history."

Dodd clearly sees most of his caucus as being against him, as he was only able to pick up 20-30 votes for his amendments and against cloture. He describes in stark terms the reasoning Democrats are using to vote for retroactive immunity:

"Those who are advocating this notion that you have to give up liberties in order to be more secure are apparently prevailing."

Nevertheless, Dodd seems unbowed by these defeats. Like myself and others in the grassroots, he is now looking towards the House to resolve this issue. The House's RESTORE Act, passed last year, includes no retroactive immunity and was passed 227-189 back in November.

Firedoglake has already picked up the call to put pressure on the House. Sign their petition and help make sure the House stands by its bill.

If the House caves and a bill with retroactive immunity is reported out of conference, Dodd said he will again use "all the tools available to the single senator to delay going forward on this," including a filibuster.

All I can say is: thank you, Dodd, for standing up for the Constitution when so many others were willing to make that false choice between security and liberty. We're with you 100%.

UPDATE: If you want to thank Dodd, help him retire his campaign debt!
http://www.theseminal.com/2008/02/12...ty-whats-next/





House Leaves Surveillance Law to Expire
Carl Hulse

The House broke for a week’s recess Thursday without renewing terrorist surveillance authority demanded by President Bush, leading him to warn of risky intelligence gaps while Democrats accused him of reckless fear mongering.

The refusal of Speaker Nancy Pelosi, Democrat of California, to schedule a vote on a surveillance measure approved Tuesday by the Senate touched off an intense partisan conflict over the national security questions that have colored federal elections since 2002 and are likely to play a significant role again in November.

Trying to put pressure on Democrats, Mr. Bush offered to delay a trip to Africa to resolve the dispute and warned that failure to extend the expanded power under the Foreign Intelligence Surveillance Act, which expires Saturday, could hamper efforts to track terrorists.

“Our intelligence professionals are working day and night to keep us safe,” Mr. Bush said, “and they’re waiting to see whether Congress will give them the tools they need to succeed or tie their hands by failing to act.”

But Ms. Pelosi and other House Democrats said Mr. Bush and Congressional Republicans were at fault because they had resisted temporarily extending the bill to allow disagreements to be worked out. Democrats would not be bullied into approving a measure they considered flawed, she said.

“The president knows full well that he has all the authority he needs to protect the American people,” said Ms. Pelosi, who then referred to President Franklin D. Roosevelt’s admonition about fearing only fear itself. “President Bush tells the American people that he has nothing to offer but fear, and I’m afraid that his fear-mongering of this bill is not constructive.”

The decision by the House Democratic leadership to let the law lapse is the greatest challenge to Mr. Bush on a major national security issue since the Democrats took control of Congress last year.

Last summer, Democrats allowed the surveillance law to be put in place for six months although many of them opposed it. They have also relented in fights over spending on the Iraq war under White House pressure. But with Mr. Bush rated low in public opinion polls as he enters the last months of his presidency, Democrats are showing more willingness to challenge him.

Republicans say House Democrats are taking a risk, especially in light of the strong bipartisan Senate vote for the bill.

“They can’t pass a Mother’s Day resolution and got 68 votes for this bill,” said Representative Adam H. Putnam of Florida, chairman of the House Republican Conference.

The battle over the surveillance bill was also tangled up in the rancor over a House vote to hold in contempt Joshua B. Bolten, the White House chief of staff, and Harriet E. Miers, the former White House counsel, for refusing to testify about the firing of United States attorneys. Republicans said the House was devoting time to that issue when it could be considering the surveillance program, and they staged a walkout in protest.

The main sticking point is a provision in the Senate bill that provides legal immunity for telecommunications companies that, at the Bush administration’s request, cooperated in providing private data after the Sept. 11, 2001, attacks. Many House Democrats oppose that immunity.

Surveillance efforts will not cease when the law lapses. Administration intelligence officials said agencies would be able to continue eavesdropping on targets that have already been approved for a year after the initial authorization. But they said any new targets would have to go through the more burdensome standards in place before last August, which would require that they establish probable cause that an international target is connected to a terrorist group.

Intelligence officials also told reporters Thursday that they were worried that telecommunications companies would be less willing to cooperate in future wiretapping unless they were given immunity.

Ben Powell, general counsel for the director of national intelligence’s office, said some carriers had already asked whether they could be compelled to cooperate even without legal protection, although he indicated that none had actually threatened to halt operations.

Ms. Pelosi said that she believed that the differences could be resolved within three weeks and that she had told the chairmen of the House Intelligence and Judiciary Committees to work with their counterparts in the Senate to seek a compromise.

Congressional Republicans sharply criticized Democrats for not moving on the final measure.

“I think there is probably joy throughout the terrorist cells throughout the world that the United States Congress did not do its duty today,” said Representative Ted Poe, Republican of Texas.

Democrats said Republicans, struggling politically, were trying to create an air of crisis.

“This is a manufactured political crisis,” said Senator Richard J. Durbin of Illinois, the No. 2 Democrat. “They want something to put in front of the American people to take their minds off the state of the economy.”

Eric Lichtblau contributed reporting.
http://www.nytimes.com/2008/02/15/wa...15fisa.html?hp





Error Gave F.B.I. Unauthorized Access to E-Mail
Eric Lichtblau

A technical glitch gave the F.B.I. access to the e-mail messages from an entire computer network — perhaps hundreds of accounts or more — instead of simply the lone e-mail address that was approved by a secret intelligence court as part of a national security investigation, according to an internal report of the 2006 episode.

F.B.I. officials blamed an “apparent miscommunication” with the unnamed Internet provider, which mistakenly turned over all the e-mail from a small e-mail domain for which it served as host. The records were ultimately destroyed, officials said.

Bureau officials noticed a “surge” in the e-mail activity they were monitoring and realized that the provider had mistakenly set its filtering equipment to trap far more data than a judge had actually authorized.

The episode is an unusual example of what has become a regular if little-noticed occurrence, as American officials have expanded their technological tools: government officials, or the private companies they rely on for surveillance operations, sometimes foul up their instructions about what they can and cannot collect.

The problem has received no discussion as part of the fierce debate in Congress about whether to expand the government’s wiretapping authorities and give legal immunity to private telecommunications companies that have helped in those operations.

But an intelligence official, who spoke on condition of anonymity because surveillance operations are classified, said: “It’s inevitable that these things will happen. It’s not weekly, but it’s common.”

A report in 2006 by the Justice Department inspector general found more than 100 violations of federal wiretap law in the two prior years by the Federal Bureau of Investigation, many of them considered technical and inadvertent.

Bureau officials said they did not have updated public figures but were preparing them as part of a wider-ranging review by the inspector general into misuses of the bureau’s authority to use so-called national security letters in gathering phone records and financial documents in intelligence investigations.

In the warrantless wiretapping program approved by President Bush after the Sept. 11 terrorist attacks, technical errors led officials at the National Security Agency on some occasions to monitor communications entirely within the United States — in apparent violation of the program’s protocols — because communications problems made it difficult to tell initially whether the targets were in the country or not.

Past violations by the government have also included continuing a wiretap for days or weeks beyond what was authorized by a court, or seeking records beyond what were authorized. The 2006 case appears to be a particularly egregious example of what intelligence officials refer to as “overproduction” — in which a telecommunications provider gives the government more data than it was ordered to provide.

The problem of overproduction is particularly common, F.B.I. officials said. In testimony before Congress in March 2007 regarding abuses of national security letters, Valerie E. Caproni, the bureau’s general counsel, said that in one small sample, 10 out of 20 violations were a result of “third-party error,” in which a private company “provided the F.B.I. information we did not seek.”

The 2006 episode was disclosed as part of a new batch of internal documents that the F.B.I. turned over to the Electronic Frontier Foundation, a nonprofit group in San Francisco that advocates for greater digital privacy protections, as part of a Freedom of Information Act lawsuit the group has brought. The group provided the documents on the 2006 episode to The New York Times.

Marcia Hofmann, a lawyer for the privacy foundation, said the episode raised troubling questions about the technical and policy controls that the F.B.I. had in place to guard against civil liberties abuses.

“How do we know what the F.B.I. does with all these documents when a problem like this comes up?” Ms. Hofmann asked.

In the cyber era, the incident is the equivalent of law enforcement officials getting a subpoena to search a single apartment, but instead having the landlord give them the keys to every apartment in the building. In February 2006, an F.B.I. technical unit noticed “a surge in data being collected” as part of a national security investigation, according to an internal bureau report. An Internet provider was supposed to be providing access to the e-mail of a single target of that investigation, but the F.B.I. soon realized that the filtering controls used by the company “were improperly set and appeared to be collecting data on the entire e-mail domain” used by the individual, according to the report.
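To make the landlord analogy concrete, the difference between a correctly scoped intercept filter and the mis-set one described in the report amounts to matching a single address versus matching anything at that address's domain. A purely illustrative sketch (the addresses, domain and function names here are hypothetical, not details from the F.B.I. documents):

```python
# Illustrative only: a filter scoped to a whole e-mail domain, rather
# than one address, sweeps up every mailbox that domain hosts.

TARGET = "suspect@example-domain.org"

def matches_single_address(addr: str) -> bool:
    """Correctly scoped filter: trap only the court-authorized address."""
    return addr.lower() == TARGET

def matches_whole_domain(addr: str) -> bool:
    """Mis-set filter: trap anything at the target's domain."""
    return addr.lower().endswith("@example-domain.org")

traffic = [
    "suspect@example-domain.org",        # the authorized target
    "innocent.user@example-domain.org",  # bystander on the same domain
    "someone@elsewhere.net",             # unrelated traffic
]

trapped_correct = [a for a in traffic if matches_single_address(a)]
trapped_misconfigured = [a for a in traffic if matches_whole_domain(a)]
# The misconfigured filter collects the bystander too -- hence the
# "surge in data being collected" that tipped off the technical unit.
```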

The bureau had first gotten authorization from the Foreign Intelligence Surveillance Court to monitor the e-mail of the individual target 10 months earlier, in April 2005, according to the internal F.B.I. document. But Michael Kortan, an F.B.I. spokesman, said in an interview that the problem with the unfiltered e-mail went on for just a few days before it was discovered and fixed. “It was unintentional on their part,” he said.

Mr. Kortan would not disclose the name of the Internet provider or the network domain because the national security investigation, which is classified, is continuing. The improperly collected e-mail was first segregated from the court-authorized data and later was destroyed through unspecified means. The individuals whose e-mail was collected apparently were never informed of the problem. Mr. Kortan said he could not say how much e-mail was mistakenly collected as a result of the error, but he said the volume “was enough to get our attention.” Peter Eckersley, a staff technologist for the Electronic Frontier Foundation who reviewed the documents, said it would most likely have taken hundreds or perhaps thousands of extra messages to produce the type of “surge” described in the F.B.I.’s internal reports.

Mr. Kortan said that once the problem was detected the foreign intelligence court was notified, along with the Intelligence Oversight Board, which receives reports of possible wiretapping violations.

“This was a technical glitch in an area of evolving tools and technology and fast-paced investigations,” Mr. Kortan said. “We moved quickly to resolve it and stop it. The system worked exactly the way it’s designed.”
http://www.nytimes.com/2008/02/17/wa...on/17fisa.html





With a Death in Congress, an IP Shakeup Looks Likely
Nate Anderson

Rep. Tom Lantos' (D-CA) death from esophageal cancer last night leaves an opening at the top of the House Foreign Affairs Committee, an opening that appears to be perfectly shaped like Howard Berman (D-CA). Berman is expected to take over the foreign affairs post, which will open his current spot as chair of the Subcommittee on Courts, the Internet and Intellectual Property. Rick Boucher (D-VA), who's in favor of expanded fair use rights and DMCA reform, looks to be next in line.

Berman hails from Hollywood and has been a powerful Congressional backer of the entertainment industry. Known as Congressman Hollywood, he's pushed everything from higher radio station royalty payments to the MPAA's campaign against colleges to the current PRO-IP Act.

With Lantos' passing, Berman's is the only name reportedly up for consideration to lead the Foreign Affairs Committee. While Democratic rules keep him from chairing two committees at the same time, CQ Politics points out that Berman can actually hold onto both positions until the end of this term.

Next year, though, he'll have to give up his subcommittee chair (assuming that he wins reelection and that the Democrats still control the House), and Boucher would be next in line for the job. The Hollywood Reporter, which has an obvious interest in the story, rather strangely describes Boucher as being a "long-term advocate of expanding the ability of people to use copyrighted material for free."

That description appears aimed at making Boucher sound scary to Hollywood types, as does the label of "industry opponent." (The MPAA and RIAA, certainly not industry opponents, routinely affirm their support for fair use.)

A Boucher chairmanship would be interesting to watch, as he has previously backed not only DMCA reform but network neutrality, a federal shield law for journalists and bloggers, and increased funding for rural broadband. His power would be limited, though, by the likely presence of Berman on the subcommittee and by industry friend John Conyers (D-MI), who chairs the full Judiciary Committee.
http://arstechnica.com/news.ars/post...ks-likely.html





White House Objects to Plan for .gov P2P Security
Anne Broache

The Bush administration on Thursday questioned a proposed law that would force federal agencies to develop specific plans for guarding government computers and networks against "risks" posed by peer-to-peer file sharing.

The Democratic-sponsored bill, called the Federal Agency Data Protection Act, contains a section asking federal agencies to report to Congress what "technological" (e.g., software and hardware) and "nontechnological" methods (such as employee policies and user training) they would employ to ensure peer-to-peer file-sharing programs do not harm the security of government systems.

The proposal, introduced late last year, is the latest manifestation of congressional Democrats' concern about the perils of so-called "inadvertent" file-sharing--that is, when inexperienced or uninformed peer-to-peer users set their applications to share folders containing sensitive files without realizing they're doing so.

At a hearing last summer, Rep. Henry Waxman, chairman of the House of Representatives Committee on Oversight and Government Reform, said such a practice can pose a national security threat and warned of plans for new legislation. He and others grilled the founder of Lime Wire, a popular P2P application, about how his service warns users about the files and folders they're poised to share. At the time, a Federal Trade Commission official told politicians that it has found any risks are largely rooted in how individuals use the technology.

The Bush administration appears to be backing up that view. Without naming the peer-to-peer file-sharing provision in particular, Karen Evans, the federal government's chief information officer, told a House information policy subcommittee that she objects to singling out a particular technology when issuing computer security requirements.

"While we recognize that technologies that are improperly implemented introduce increased risk, we recommend any potential changes to the statute be technology-neutral," Evans said at the sparsely attended hearing, which barely lasted an hour.

Federal agencies are already required to report on information security plans and risks annually under a law known as the Federal Information Security Management Act, or FISMA. Based on those plans, members of Congress have taken to issuing a yearly "report card" assessing agencies' status.

Without ever mentioning the Democrats' bill, Rep. Tom Davis (R-Va.), FISMA's original author, said he agreed that a "technology-neutral" approach, which refrains from being "overly prescriptive," is the best way to go.

Davis went on to urge passage of his own federal computer security bill, which passed the last Republican-controlled House but died in the Senate. It would require federal agencies to give "timely" notice to Americans if their sensitive personal information is compromised, as there's currently no legal requirement that they do so.

Some security experts warned the committee that piling on paperwork for federal agencies, as FISMA requires, isn't necessarily the most efficient way to improve security. Alan Paller, director of research for the SANS Institute, which does computer security training, said agencies need more guidance on what security-related steps to prioritize, rather than just a long list of items to complete.

"We want to avoid a 'check the box' mentality," added Tim Bennett, president of the Cyber Security Industry Alliance, a trade group that represents security technology vendors.

Still, Bennett said his group "strongly" supports the latest bill and its peer-to-peer network section.

"File-sharing can give users access to a wealth of information but it also has a number of security risks," he said. "You could download viruses or other malicious code without meaning to. Or you could mistakenly allow other people to copy files you don't mean to share."
http://www.news.com/8301-10784_3-987...=2547-1_3-0-20





MP3 Rocket Adds Limewire's TigerTree Technology to Fight P2P 'Fake Files' and to Block the Enemies of P2P
Press release

Limewire has released open source technology that can now block the sharing of all files that lack a copyright license (although the blocking technology is not currently turned on by default). The www.mp3rocket.com founders have agreed that they will continue the fight for digital freedom, will not block their customers' files, and will never filter files based on license detection. The second Limewire technology recently released and adopted by MP3Rocket is improved file-filtering technology that blocks "fake files" and spam files from spammers and the enemies of file sharing. The new "TigerTree" hash technology blocks spammers' fake files, such as those containing spam, viruses and Trojans.

Toronto, Ontario (PRWEB) February 15, 2008 -- A group of music companies, including Sony BMG, Virgin Records and Warner Bros. Records, have accused Lime Wire and the company's officers of copyright infringement in a federal lawsuit filed in August 2007 in U.S. District Court in New York (Case 1:06-cv-05936-GEL). Despite the ongoing lawsuit, Limewire continues to enhance its software and has recently released two major open source technologies that will affect the future of global file sharing.

Possibly due to the pressure of its current lawsuit, Limewire has released technology that can now block the sharing of all files that lack a copyright license (although the blocking technology is not currently turned on by default). The www.mp3rocket.com founders have agreed that they will continue the fight for digital freedom, will not block their customers' files, and will never filter files based on license detection.

The second Limewire technology recently released and adopted by MP3Rocket is improved file-filtering technology that blocks "fake files" and spam files from spammers and the enemies of file sharing. The new "TigerTree" hash technology blocks spammers' fake files, such as those containing spam, viruses and Trojans.
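For context, "TigerTree" refers to a Merkle-tree hash (the Tiger Tree Hash used on Gnutella-style networks): a file is split into fixed-size blocks, each block is hashed, and the hashes are combined pairwise up to a single root, so a fake or corrupted file cannot match the root hash of the genuine one. A rough sketch of the tree-hashing idea, using SHA-256 as a stand-in because Python's hashlib does not ship the Tiger algorithm:

```python
import hashlib

def _h(data: bytes) -> bytes:
    # Stand-in hash; the real TigerTree uses the 192-bit Tiger algorithm.
    return hashlib.sha256(data).digest()

def tree_hash(data: bytes, block_size: int = 1024) -> bytes:
    """Merkle-tree hash: hash each fixed-size block, then combine hashes
    pairwise up to a single root. Prefix bytes (0x00 for leaves, 0x01 for
    internal nodes) keep leaves and internal nodes from colliding, as in
    the Tree Hash EXchange (THEX) construction."""
    blocks = [data[i:i + block_size]
              for i in range(0, len(data), block_size)] or [b""]
    level = [_h(b"\x00" + b) for b in blocks]
    while len(level) > 1:
        nxt = [_h(b"\x01" + level[i] + level[i + 1])
               for i in range(0, len(level) - 1, 2)]
        if len(level) % 2:          # an odd node is promoted unchanged
            nxt.append(level[-1])
        level = nxt
    return level[0]

# Two files that differ in any block produce different roots, so a
# "fake file" cannot masquerade under a known-good tree hash.
```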

The TigerTree spam filter (included in the latest 5.0.3 version of MP3Rocket) works by storing information about files: search results can be filtered by keywords, size, verified SHA-1 hashes, or even IP address. Information about what counts as spam is gathered in two ways: users mark the results they find to be fake, and the filter learns which items commonly appear in search results. Both measures contribute to a rating for each file, and once the rating crosses a certain threshold, the file is flagged as spam.

From the blocked results, the filter builds spam profiles over time. If a future search result fits a malicious profile, it too is labeled as spam.
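The rating-and-threshold scheme described above can be sketched roughly as follows. The weights, field names and threshold here are invented for illustration; they are not MP3Rocket's or Limewire's actual values:

```python
# Minimal sketch of a learned spam filter for P2P search results.
# All weights and the 0.5 threshold are hypothetical.

SPAM_THRESHOLD = 0.5

class SpamFilter:
    def __init__(self):
        # Evidence learned from user-marked spam results.
        self.bad_keywords = set()
        self.bad_hashes = set()
        self.bad_hosts = set()

    def mark_spam(self, result):
        """User marks a search result as fake; remember its traits."""
        self.bad_keywords.update(result["name"].lower().split())
        self.bad_hashes.add(result["sha1"])
        self.bad_hosts.add(result["host"])

    def rating(self, result):
        """Score a new result against the learned spam profile."""
        score = 0.0
        if result["sha1"] in self.bad_hashes:
            score += 1.0                      # exact known-fake file
        if result["host"] in self.bad_hosts:
            score += 0.4                      # host previously served spam
        words = result["name"].lower().split()
        if words:
            # Fraction of filename words previously seen in spam.
            score += 0.4 * sum(w in self.bad_keywords for w in words) / len(words)
        return score

    def is_spam(self, result):
        return self.rating(result) >= SPAM_THRESHOLD
```

In use, marking one fake result as spam is enough for a later result from the same host with an overlapping filename to cross the threshold, while unrelated results score zero.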

By automatically blocking "fake files" and spam MP3Rocket users find exactly what they are looking for faster than ever without the hassle of garbage files.

Other new features in the new free MP3Rocket's 5.0.3 release:
1. New "Watch TV" functionality provides live, on-demand streams of over 3,000 movies and TV shows.
2. New "Live Radio" allows users to listen to over 500 top live radio stations.
3. New "Game Player" provides over 1,000 fun games.
4. "Most Popular" feature allows users to find the most popular music, movies and pictures. Popular files are determined by real-time user votes. The TV, radio and game streams also use this "most popular" voting feature to improve the user experience.
5. Firewall to Firewall Transfers. Since about 60% of users are currently firewalled, this feature greatly increases the amount of content on the network.
6. Faster network connections. Using new "UDP Host Caches", MP3Rocket starts up and connects faster than ever before!
7. iTunes Integration. Windows and Mac users can now take advantage of MP3Rocket's iTunes integration.

If you liked Limewire, you should give the new 5.0.3 version of MP3Rocket a try. It's everything Limewire was and more. Judge for yourself; it's free now at www.mp3rocket.com
http://www.prweb.com/releases/2008/02/prweb701683.htm





Officials Step Up Net-Neutrality Efforts

House bill aims to ensure providers route traffic fairly
Amy Schatz, Dionne Searcey and Vishesh Kumar

Big broadband companies are headed for a clash with Washington over whether consumers have a right to get as much as they want from the Internet, as fast as they want it, without paying extra for the privilege.

Complaints that cable titan Comcast Corp. is deliberately delaying some Internet traffic are prompting moves in Washington to block efforts by broadband providers to favor some kinds of Internet traffic over others.

Congressman Edward Markey (D., Mass.) last night introduced a bill that would change federal laws to make sure Internet traffic has protections similar to phone calls, which companies are required to connect without hesitation.

Together with Rep. Chip Pickering (R., Miss.), Mr. Markey is proposing the "Internet Freedom Preservation Act," which says it is the policy of the U.S. to "maintain the freedom to use for lawful purposes broadband telecommunications networks, including the Internet, without unreasonable interference from or discrimination by network operators." Essentially, the bill would give the Federal Communications Commission more authority to police Internet providers to make sure they're delivering traffic fairly.

Meanwhile, comments are due today at the FCC in the agency's investigation of complaints that Comcast is deliberately slowing some Internet traffic, as well as a broader look at what should be done about such complaints in the future.

The stepped-up efforts by regulators and lawmakers to enforce what tech-industry officials call "net neutrality" come as an explosion in downloading of online video is prompting cable and phone companies to rethink their Internet pricing models, opening the door for fee plans based on the extent of usage.

Time Warner Inc.'s Time Warner Cable unit recently announced a trial of a capacity-based pricing plan in Texas. Consumers will pay more if they exceed caps for downloading movies and other high-capacity content. "It's not set up to be a punishment of people's responsible Internet usage," Time Warner spokesman Alex Dudley said. "This is an attempt to get the high-end users to think differently about how they consume their Internet usage."

Comcast, the No. 1 cable operator by number of subscribers, and Verizon Communications Inc. have also declined to rule out eventually adopting a similar capacity-based pricing model.

This type of pricing plan is a radical departure from current practice. Cable and phone companies in the U.S. have long adhered to plans that allow Web surfers unlimited downloading for a monthly flat fee. They typically charge higher rates for faster speeds of Internet service but consumers can generally download or upload as much content as they want.

Separately, AT&T Inc. plans to track pirated movies and other content across its network. While it is unclear how the company would execute the plans, consumers have worried that it would block content.

In a statement AT&T said, "We want to set the record straight that we have not said we are going to filter, and in fact, there is no technology solution available at this time. What we have said is that we are working with some in the content industry on the very real issue of piracy that has raised costs for all Internet users."

While the heavy usage by music and video downloaders has prompted cable companies to consider different pricing plans in the past, the fear of attacks from advocates of maintaining neutral pricing and access has stopped companies from changing their policies.

Public interest groups worry Comcast's decision to delay Internet traffic from peer-to-peer file sharing networks such as BitTorrent is just the beginning. They want to prevent companies that offer both cable and Internet access from protecting their lucrative television franchises by preventing consumers from migrating to Internet video. Comcast says it has done nothing wrong and needs freedom to manage its Internet network to ensure the best service for its customers.

Regulators now face a challenge to set rules in a rapidly shifting market where changes in technology and consumer tastes are running faster than Washington's ability to react.

Cable and telephone companies say their networks are being overwhelmed by huge increases in video downloading. In December, a record 10 billion videos were viewed online, research firm comScore reported.

AT&T says consumer broadband traffic on its network has doubled in the last two years alone. And broadband customers are using 40% more bandwidth each year. Time Warner estimates that 5% of its users account for 50% of the bandwidth usage in many parts of its network. This small percentage of users is able to absorb so much bandwidth because of the rapid growth of peer-to-peer networking services such as BitTorrent, popular among video downloaders. One user downloaded the equivalent of 1,500 high-definition films in a month, the company says.

Use of BitTorrent is exploding. The software has been downloaded 160 million times, up from 80 million a year ago, according to Ashwin Navin, the president of BitTorrent. Managing this type of heavy network use is a particular problem for cable operators because their networks are shared by neighborhoods. That means if a consumer is downloading hours of high-definition video, subscribers living nearby could see their Internet speeds slow.

After investigations by the Associated Press and the Electronic Frontier Foundation last year showed Comcast was delaying, if not totally blocking, some peer-to-peer Internet traffic, the company admitted it sometimes delays traffic. In late January, the cable company quietly changed its terms of service to disclose that fact.

In comments filed at the FCC last night, the company denied it has violated the FCC's net-neutrality principles and said it is using "carefully limited measures" to manage Internet traffic.

Further, the company asked the agency to "make it clear that [the FCC] will not be drawn into second-guessing the reasonable network management decisions that engineers and service providers must make on a daily -- and sometimes hourly -- basis to respond to a dynamic and ever-changing Internet."

Mark Cooper, of the Consumer Federation of America, said he fears the explosion in video will give companies an excuse to engage in discriminatory behavior. "The testing has begun where they will push network discrimination as far as they can to see what they can get away with," he says.

Cox Communications also says it may put other types of traffic ahead of peer-to-peer traffic. "We reserve the right to manage traffic on our network but we don't get into the specific methodologies that we use," said David Grabert, director of media relations at Cox. "You have to manage traffic on the network and prioritize certain types of traffic -- like phone traffic -- over other types of traffic."

The FCC's ability to referee net-neutrality issues isn't clear, although it plans to hold a public hearing in Boston later this month to examine the issue. Three years ago, after consumers complained that a rural North Carolina phone and Internet provider was blocking Vonage Holdings Corp., a competing Internet-phone-service provider, the FCC launched an investigation. Within a month the company, Madison River Communications, had agreed to stop and forked over $15,000 to make the investigation go away.

Soon after, however, the FCC changed the rules and deregulated Internet lines, effectively giving itself less authority to regulate Internet services. The Comcast complaint has prompted the FCC and Congress to look at remedying that problem.
http://online.wsj.com/article/SB120286741569864053.html





Internet Bill a Blow to the Gatekeepers

Reps. Ed Markey (D-Mass.) and Chip Pickering (R-Miss.) today launched the latest salvo in the struggle to keep the Internet free from gatekeepers with the introduction of the “Internet Freedom Preservation Act of 2008” (HR 5353).

The bipartisan bill protects Net Neutrality under the Communications Act and calls for a nationwide conversation to set policy about the future of the Internet.

The legislation gives hope to the millions of Americans who have called for action to ensure that the public — not phone and cable companies — control the fate of the Internet.

Taking it Public

The new bill calls on the FCC to convene at least eight “broadband summits” to collect public input on a variety of policies “that will promote openness, competition, innovation, and affordable, ubiquitous broadband service for all individuals in the United States.”

Taking the issue outside the Beltway — and beyond the corrosive influence of telecom lobbyists — is an encouraging sign for communities across the country that stand to benefit from the enormous economic and social benefits of an open Internet.

Big phone and cable companies like AT&T, Verizon, Comcast and Time Warner have padded the pockets of Washington lawyers, lobbyists and shills to kill Net Neutrality and pave the way for “network management” practices that allow blocking of certain content in favor of Web sites and services the companies prefer.

Barring Discrimination

The new bill requires the FCC to actively protect the free-flowing Internet from gatekeepers, enforcing protections that “guard against unreasonable discriminatory favoritism for, or degradation of, content by network operators based upon its source, ownership, or destination on the Internet.”

These protections would be amended into the Communications Act, according to the new legislation.

The FCC recently launched an investigation — spurred by a complaint from members of the SavetheInternet.com Coalition and thousands of letters from concerned citizens — into blocking of Internet services by cable and phone companies.

A Growing Coalition

“Americans need to ask themselves: What good is free speech if a handful of powerful corporations have the ability to shut off or slow viewpoints they find objectionable?” said International Brotherhood of Teamsters General President Jim Hoffa. “I applaud Congressman Markey and encourage other union members to stand with the 1.4 million-member strong International Brotherhood of Teamsters.”

“Gamers, the majority of whom are in the coveted 18-45 demographic, increasingly use the Internet to communicate, mobilize and play the increasingly complex games they enjoy,” said Hal Halpin, president of the Entertainment Consumers Association (ECA), a national nonprofit membership organization established to serve the needs of the millions of Americans who play computer and video games. “We look forward to participating in the discussion fostered by this important legislation.”

Markey and Pickering’s bill will reignite the grassroots campaign to restore meaningful and lasting Net Neutrality protections. Access to an open Internet connection is no longer a luxury; it’s a right that should be afforded every American.

The public now has a new chance to speak out against would-be gatekeepers that seek to distort the Internet in their favor.
http://www.savetheinternet.com/blog/...better-policy/





Comcast Defends Role As Internet Traffic Cop
Cecilia Kang

Comcast said yesterday that it purposely slows down some traffic on its network, including some music and movie downloads, an admission that sparked more controversy in the debate over how much control network operators should have over the Internet.

In a filing with the Federal Communications Commission, Comcast said such measures -- which can slow the transfer of music or video between subscribers sharing files, for example -- are necessary to ensure better flow of traffic over its network.

In defending its actions, Comcast stepped into one of the technology industry's most divisive battles. Comcast argues that it should be able to direct traffic so networks don't get clogged; consumer groups and some Internet companies argue that the networks should not be permitted to block or slow users' access to the Web.

Comcast's FCC filing yesterday was in response to petitions to the agency by the consumer group Free Press and the online video provider Vuze, which claimed that the cable company was abusing its control over its network to impede video competition.

Separately, the FCC began an investigation of Comcast's network practices after receiving those complaints. That review is ongoing, according to Comcast, which said it hasn't received any specific orders based on the complaints.

The FCC prohibits network operators from blocking applications but opens the door to interpretation with a footnote in a policy statement that provides for an exemption for "reasonable management."

Rep. Edward J. Markey (D-Mass.), chairman of the House Energy and Commerce Committee's subcommittee on telecommunications and the Internet, plans to introduce a bill today calling for an Internet policy that would prohibit network operators from unreasonably interfering with consumers' right to access and use content over broadband networks. The bill also calls for the FCC to hold eight meetings around the nation to assess whether there is enough competition among network providers and whether consumers' rights are being upheld.

"Our goal is to ensure that the next generation of Internet innovators will have the same opportunity, the same unfettered access to Internet content, services and applications that fostered the developers of Yahoo, Netscape and Google," Markey said in a written statement yesterday.

The case with Comcast illustrates the high-stakes battle between those who argue that the Internet should remain open to all traffic, and the companies who argue that some governance of their networks is in the best interest of their customers.

In its comments, Comcast said network controls are necessary, especially for heavy Web users. Specifically, the company imposes "temporary delays" of video, music and other files shared between computers using such technologies as BitTorrent.

Comcast compared its practices to a traffic-ramp control light that regulates the entry of additional vehicles onto a freeway during rush hour. "One would not claim that the car is 'blocked' or 'prevented' from entering the freeway; rather, it is briefly delayed," the company's statement said.

Marvin Ammori, the general counsel for Free Press, said Comcast's behavior is the second major example of a service provider overstepping its authority in an attempt to quash competition. In March 2005, the FCC fined Madison River Communications for blocking calls by competitor Vonage, which provided calls over the Internet.

Ammori said that by interfering with video transfers, Comcast is trying to protect its television and On Demand video services.

BitTorrent said Comcast should respond by increasing bandwidth on its networks and upgrading its systems rather than limiting how customers use its service.

"It's like putting a Band-Aid on the problem to achieve a short-term fix," said Ashwin Navin, co-founder and president of San Francisco-based BitTorrent.
http://www.washingtonpost.com/wp-dyn...=moreheadlines





Vuze to Comcast: It's Not a Fair Race When You Own the Track
Nate Anderson

The consumer groups and companies that filed the FCC challenge to Comcast's P2P "delaying" tactics responded this morning to Comcast's lengthy FCC filing. In that filing, which we covered in detail yesterday, Comcast made an aggressive defense of its policies, claiming that it only resets P2P uploads made during peak times and when no download is also in progress. Free Press, BitTorrent, and Vuze all say that's not good enough.

In a conference call, Vuze's general counsel Jay Monahan drew the starkest analogy. What Comcast is really doing, he said, wasn't at all comparable to limiting the number of cars that enter a highway. Instead, it was more like a horse race where the cable company owns one of the horses and the racetrack itself. By slowing down the horse of a competitor like Vuze, even for a few seconds, Comcast makes it harder for that horse to compete. "Which horse would you bet on in a race like that?" asked Monahan.

Vuze offers its own video content distributed through P2P technology, and it sees services like Comcast's own video-on-demand offerings as a direct competitor. But smaller, independent publishers also value P2P technology because it allows them to "compete with the big boys" by offering high-quality video files in a way that would not be possible if a nonprofit had to pay directly for all the download bandwidth.

The Participatory Culture Foundation, makers of the open-source Miro video client, called Comcast's targeting of BitTorrent and other P2P technologies a "free speech issue" since it doesn't affect any of the big media companies that can afford to pay for direct downloads.

Alternatives

The question, then, is what Comcast should do about a situation in which 15-20 P2P users can saturate the upload link on a local node that serves 400 other users. It hardly seems fair to let that happen and make everyone else suffer, but harming other companies' legitimate commercial interests by hindering their traffic is hardly fair, either.

When pressed to offer an alternative approach, the basic suggestion from Vuze and Free Press was to "build more infrastructure." This has been Verizon's approach with FiOS, and the company is always happy to talk smack about the cable industry's bandwidth constraints.

The other main alternative was to charge users by the gigabyte, rather than through flat-rate pricing that imposes no cost for downloading terabytes of data each month. (Time Warner may trial this approach, which is common in other countries but has never caught on in the US.)

Comcast insists that all of this talk blows the matter far out of proportion. The company says that it only delays traffic when the load is so high that doing nothing would cause problems, and it points out that it is currently rolling out DOCSIS 3.0, which will bring much larger upload capacities to local nodes. Delaying Vuze traffic for five or ten minutes is hardly going to harm anyone, right?

But Monahan wants the FCC to realize that those few minutes might in fact make a big difference. He points to the well-tested observation that Web surfers are impatient and that long page load times can decrease traffic. Vuze's target demographic, 18-34 tech-savvy 'Net users, is likely to be similarly impatient, and Monahan doesn't believe that Comcast should be allowed to pick which company's crucial traffic will be affected when load gets too high. If a customer has to wait for a Vuze download to finish because Comcast is delaying the traffic, she might decide that the service is unreliable and not worth her further patronage.

Everyone's dropping some knowledge

The controversy could lay the groundwork for additional FCC rules regarding ISP neutrality, so it's no surprise that just about every stakeholder is weighing in. Comcast, Vuze, and Free Press have all filed comments with the FCC, and they're joined by the Center for Democracy & Technology, the ITIF, and the Progress & Freedom Foundation. If you're looking to get your learn on, start with these filings; they're far more interesting than most FCC submissions. As Free Press notes, "the rubber has now hit the road" with regard to network management rules, and everyone knows it.

Sidenote: Vuze said that it has started to use encryption on its traffic, just strong enough to keep packet inspection gear from discerning the contents. It's another reminder of what will start happening if ISPs like AT&T institute copyright-focused filtering on national networks.
http://arstechnica.com/news.ars/post...the-track.html





BitTorrent Developers Introduce Comcast Busting Encryption
Ernesto

Several BitTorrent developers have joined forces to propose a new protocol extension with the ability to bypass the BitTorrent interfering techniques used by Comcast and other ISPs. This new form of encryption will be implemented in BitTorrent clients including uTorrent, so Comcast subscribers are free to share again.

BitTorrent throttling is not a new phenomenon; ISPs have been doing it for years. When the first ISPs started to throttle BitTorrent traffic, most BitTorrent clients introduced a countermeasure, namely protocol header encryption. This was the beginning of an ongoing cat-and-mouse game between ISPs and BitTorrent client developers, which is about to enter a new level.

Unfortunately, protocol header encryption doesn’t help against more aggressive forms of BitTorrent interference, like the Sandvine application used by Comcast. A new extension to the BitTorrent protocol is needed to stay ahead of the ISPs, and that is exactly what is happening right now.

Back in August we were the first to report that Comcast was actively disconnecting BitTorrent seeds. Comcast of course denied our allegations, and ever since there has been a lot of debate about the rights and wrongs of Comcast’s actions. On Wednesday, Comcast explained their BitTorrent interference to the FCC in a 57-page filing. Unfortunately they haven’t stopped lying yet, since they now argue that they only delay BitTorrent traffic, while in fact they disconnect people, making it impossible for them to share files with non-Comcast users.

In short, the Comcast interference works like this: a few seconds after you connect to someone in a BitTorrent swarm, Comcast sends a peer reset message (TCP RST flag) and the upload immediately stops. Most vulnerable are users in relatively small swarms, where there are only a couple of peers to upload the file to.

For the networking savvy people among us, here’s an example of real RST interference (video) on a regular BitTorrent connection. In this case, the reset happens immediately after the bitfields are exchanged. Evil? Yes - but there is hope.
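For the networking-savvy, the interference pattern described above can be sketched as a toy model. All names here are invented for illustration; the point is only the logic the article describes: a middlebox watches peer connections, flags a flow once it sees BitTorrent's bitfield message (message ID 5, per the peer wire protocol), and then kills the flow with a forged reset.

```python
# Toy model of the RST-injection behavior described in the article.
# Names are hypothetical; this is not Sandvine's actual implementation.

class ToyMiddlebox:
    """Flags BitTorrent flows by payload inspection, then 'resets' them."""

    def __init__(self):
        self.flagged = set()  # (src, dst) flows identified as BitTorrent

    def observe(self, src, dst, payload):
        """Inspect one packet; flag the flow if it carries a bitfield message."""
        # Peer wire messages are <length:4 bytes><id:1 byte><payload>;
        # the bitfield message uses id 5, sent right after the handshake.
        if len(payload) >= 5 and payload[4] == 5:
            self.flagged.add((src, dst))

    def should_inject_rst(self, src, dst):
        """A flagged flow gets a forged TCP RST, killing the upload."""
        return (src, dst) in self.flagged
```

This also makes clear why header encryption alone doesn't help: once any identifying pattern is visible on the wire, or the peer list itself is readable in tracker traffic, the flow can be flagged.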

According to the proposal, the goal of this new type of encryption (or obfuscation) is to prevent ISPs from blocking or disrupting BitTorrent connections between the receiver of a tracker response and any peer IP-port pair appearing in that tracker response.

“This extension directly addresses a known attack on the BitTorrent protocol performed by some deployed network hardware. By obscuring the ip-port pairs network hardware can no longer easily identify ip-port pairs that are running BitTorrent by observing peer-to-tracker communications. This deployed hardware under some conditions disrupts BitTorrent connections by injecting forged TCP reset packets. Once a BitTorrent connection has been identified, other attacks could be performed such as severely rate limiting or blocking these connections.”

So, the new tracker peer obfuscation technique is especially designed to be a workaround for throttling devices, such as the Sandvine application that Comcast uses. More details on the proposal can be found at BitTorrent.org, which aims to become a coordination platform for BitTorrent developers.

TorrentFreak talked to Ashwin Navin, president and co-founder of BitTorrent Inc., who has some of his employees working on the new extension. He told us: “There are some ISPs who would like people to believe that ‘slowing down’ BitTorrent or ‘metering’ bandwidth consumption serves the greater good. Consumers should be very wary of this claim.”

“In recent months, consumers enjoyed unprecedented participation in the political process thanks to the ability to upload opinions and feedback in the YouTube presidential debates. Musicians, filmmakers and artists are finding ways to connect with their audiences across the world thanks to MySpace and BitTorrent. Students are engaging with interactive learning tools in their schools. Which bandwidth-intensive application will be banned or shaped or metered next by these ISPs? The creative spirit of millions has been ignited, and our need to participate, to communicate, will not be silenced.”

“The US government should encourage ISPs to innovate and invest in their networks,” Ashwin said. “Permitting them to interfere with or interrupt the communications of consumers, to protect ISP profit margins, would be a tremendous setback for our country and economy, when we are already slipping behind the rest of the first world (UK, EU, Japan, Korea, Singapore, etc.) in broadband capacity.”

We wholeheartedly agree with Ashwin on this one, as we’ve said before. The Internet is only a few years old; if the plan is to keep using it in the future, ISPs need to upgrade their networks. So, invest in more Internet gateway capacity, 10Gbps interconnect ports, and peering agreements. BitTorrent users are not the problem; they only signal that ISPs need to upgrade their capacity, because customers will only get more demanding in the future. The Internet is no longer just about sending email and browsing text-based websites.

The new protocol extension is still under development, but the goal is, of course, to get it out as soon as possible.

Hang on…
http://torrentfreak.com/bittorrent-d...yption-080215/





Illegal Downloaders 'Face UK Ban'
BBC

People in the UK who go online and illegally download music and films may have their internet access cut under plans the government is considering.

A draft consultation suggests internet service providers would be required to take action over users who access pirated material.

But the government is stressing that plans are at an early stage and it is still working on final proposals.

Six million people a year are estimated to download files illegally in the UK.

Music and film companies say that the illegal downloads cost them millions of pounds in lost revenues.

The government proposals were first reported by the Times newspaper.

Voluntary scheme

The Times suggested that broadband firms which failed to enforce the rules could be prosecuted, and the details of customers suspected of making illegal downloads made available to the courts.

According to the Times, the draft paper states: "We will move to legislate to require internet service providers to take action on illegal file sharing."

Some of the UK's biggest internet providers, such as BT, Virgin and Tiscali have been in talks with the entertainment industry over introducing a voluntary scheme for policing pirate activity, but no agreement has been reached.

So far, they have failed to resolve how disputed allegations would be arbitrated - for example, when customers claim other people have been "piggybacking" on their internet service.

'No liability'

The Internet Service Providers Association said data protection laws would prevent providers from looking at the content of information sent over their networks.


"ISPs are no more able to inspect and filter every single packet passing across their network than the Post Office is able to open every envelope," the association said.

"ISPs bear no liability for illegal file sharing as the content is not hosted on their servers," it added.

The Department for Culture, Media and Sport said that early drafts of the document had been circulated among stakeholders.

"The content and proposals for the strategy have been significantly developed since then and a comprehensive plan to bolster the UK's creative industries will be published shortly," it added.

"We will not comment on the content of the leaked document."
http://news.bbc.co.uk/go/pr/fr/-/2/h...ss/7240234.stm





5 Reasons Why Illegal Downloaders Will Not Face a UK Ban
Matt

There’s been a lot of buzz about a story The London Times ran this morning under the headline “Internet users could be banned over illegal downloads,” which also appeared on the BBC website under the even more alarming headline “Illegal downloaders ‘face UK ban’.” Time to get a couple of things straight.

The Times says “people who illegally download films and music will be cut off from the internet under new legislative proposals to be unveiled next week.” Actually, this story is complete balderdash. But the fact that this nutty proposal is getting anywhere at all illustrates how ignorant the powers that be are about downloading.

Let’s get a couple of things straight –

1. This proposal was a draft consultation green paper, defined as “a proposal without any commitment to action.” The government receives many of these on a daily basis. They are like junk mail at Number 10 Downing Street. The Prime Minister’s toilet paper is more important than most green papers, and both are usually filed in the same place.

2. This proposal is totally and completely unworkable in the real world. ISPs will not accept liability for the contents of packets (nor should they), and it would be impossible for them to check whether every single download and upload was legal without the entire Internet grinding to a halt. This isn’t in the best interests of the government, the ISPs or the voters. Banning customers and exposing yourself to billions in liability isn’t a good business strategy. Criminalizing six million citizens and inconveniencing the rest is not a vote winner.

3. It would be impossible to tell the difference between illegal downloading and legal activities such as downloading software patches, using torrents to share stuff legally, playing online video games, using VoIP, photo sharing, telecommuting, and many others. The resistance from the private sector would be as strong as it would from the general public.

4. The very idea of this goes against the ruling of the European Court, which says EU member states are not obligated to disclose personal information about suspected file sharers. It would also fly in the face of Article 10 of the European freedom of expression laws, which gives every European the “freedom to hold opinions and to receive and impart information and ideas without interference by public authority and regardless of frontiers.”

5. WiFi piggybacking and encrypted packets make it impossible to tell who is downloading what in the first place. These techniques are only getting more sophisticated, while for the most part, the content industries collectively remain as dumb as a box of hair.

So in summary:

Insert Toilet Flushing Sound FX Here

This idea makes as much sense as trying to ban people from singing ‘Happy Birthday’ to each other over the telephone network, or burning down libraries to protect the publishing industry. But what’s frightening about such ideas is that they are still taken seriously all over the world by powerful decision makers in government and industry who have absolutely no clue about how the Internet actually works, or the damage such laws could do to democracy.

Before there is any more discussion about this, the music and film companies need to definitively prove illegal downloads cost them millions in lost revenues. CD sales are falling because nobody uses them anymore, and Hollywood is in rude health despite the pirates. There should be no more talk about changing laws and spending taxpayers’ money on this ‘problem’ until someone proves there really is one.

Furthermore, if there is a problem, taxpayers shouldn’t have to pony up in the first place. The content industries need to stop braying at governments to protect inefficient business models and look at the real solution that’s been staring them in the face for ten years.

For those who are interested, my book “The Pirate’s Dilemma: How Youth Culture Is Reinventing Capitalism” is out now through Free Press, and probably soon on a BitTorrent tracker near you.
http://torrentfreak.com/illegal-down...uk-ban-080212/





ISPs Demand Record Biz Pays Up if Cut-Off P2P Users Sue

Money meets mouth on file-sharing legislation threats
Chris Williams

ISPs are calling on the record industry to put its money where its mouth is on illegal file-sharing, by underwriting the cost of lawsuits brought by people who are wrongly accused of downloading or uploading music.

ISPA told The Register today it is worried about the cost to its members if users targeted by rights holders for copyright infringement turn out to be innocent. "We still need to establish the proof points," a spokesman said.

It's the latest public detail from long-running private negotiations that have hit mainstream media headlines today. The lobbying campaign to have government force ISPs to disconnect persistent illegal file-sharers scored a victory with a leak to The Times. The draft government document says: "We will move to legislate to require internet service providers to take action on illegal file-sharing."

The threat (note the qualifier "move to") is not news, however. A battle between the record industry and ISPs over the plans has been waged in private for over a year, and the government has indicated on more than one occasion that it backs the rights holders. The leak merely repeats earlier reports that the government is indeed preparing legislation if a voluntary settlement is not reached soon.

The details of the proposed three strikes system and the stalemate in negotiations have also been public for some time.

It's reported that the threat to legislate will be reiterated by Gordon Brown and the Culture Secretary Andy Burnham in the next two weeks when they publish a broad cultural strategy document. It will commit the government to laws on filesharing if there's no self-regulatory solution. Previously ministers had said legislation was a possibility, or expressed personal support for it, but the intention to act has been clear for months.

A Department for Culture, Media and Sport spokeswoman said no dates on legislation had been set. In January then-intellectual property minister Lord Triesman told The Register that he was aiming to have proposals ready for the Queen's speech this November.

In response to the mainstream media waking up to the process today, ISPA has been keen to emphasise the advantages of a self-regulatory system, the ISPs seemingly having resigned themselves to some sort of enforcement role against illegal filesharing.

An ISPA spokesman contrasted the success of an internet industry-led initiative such as the UK's Internet Watch Foundation, which combats child sexual abuse content online, with the cumbersome and slow introduction of the Regulation of Investigatory Powers Act (RIPA). He added that the internet trade association has not seen documents cited by The Times, but that there was nothing new in the "three strikes" system it reportedly proposes. "It's not like these sort of things haven't been said before," he said.

A negotiated settlement is still possible, as a statement from the BPI today demonstrates. Chief executive Geoff Taylor pleaded: "We simply want ISPs to advise customers if their account is being used to distribute music illegally, and then, if the advice is ignored, enforce their own terms and conditions about abuse of the account."

He called ISP arguments about privacy "bogus", though according to The Times the government is considering making them share data about who they have disconnected, which would require new user contracts under the Data Protection Act. However, after an EU ruling on file-sharing data protection in January, the BPI indicated that it wouldn't want personal data to be shared.

The BPI said: "The music business wants to partner with internet service providers to create new services that would deliver even greater value for music lovers, artists, labels and ISPs." A hint perhaps at blanket licensing of file-sharing at ISP level - the other end of the internet music equation, which the record business must resolve to survive.
http://www.theregister.co.uk/2008/02...ng_paper_leak/





ISPs Reject Monitoring Role
Mark Ward

UK net firms are resisting government suggestions that they should do more to monitor what customers do online.

The industry association for net providers said legal and technical barriers prohibit them from being anything other than a "mere conduit".

The declaration comes as the government floats the idea of persistent pirates being denied net access.

And in the US one net supplier has admitted to "degrading" traffic from some file-sharing networks.

Traffic control

Net firms have been stung into defining their position by the emergence this week of a draft government consultation document that suggests ISPs should be drafted into the fight against piracy.

It suggested that people who persistently download and share copyrighted material could have their net access removed.

A spokesman for the Internet Service Providers Association (ISPA) said the 2002 E-Commerce Regulations defined net firms as "mere conduits" and not responsible for the contents of the traffic flowing across their networks.

He added that other laws on surveillance explicitly prohibited ISPs from inspecting the contents of data packets unless forced to do so by a warrant.

The spokesman said technical issues also made it hard for net firms to take action against specific types of traffic.

For instance, he said, while some people use peer-to-peer networks to download copyrighted material many commercial services, such as Napster and the BBC's iPlayer, use file-sharing technology to distribute music and TV legally.

In the US, Comcast admitted in documents filed with the Federal Communications Commission that it does "degrade" some traffic from peer-to-peer networks.

The spokesman added: "We know that all ISPs are involved in traffic management but that is to optimise the service for all their customers."

A spokesman for Virgin Media said its traffic management system came into play during peak times - between 1600 and 2100.

Action was taken against any customer whose usage exceeded a limit associated with their tariff during that five hour window, he said.

"If you exceed that threshold we will drop your speed for five hours from when the excess is recorded," he said.
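The policy Virgin Media's spokesman describes is simple enough to sketch. The function and parameter names below are invented for illustration; only the logic comes from the article: usage above a tariff-specific threshold during the 1600-2100 peak window triggers a five-hour speed reduction from the moment the excess is recorded.

```python
from datetime import datetime, timedelta

PEAK_START, PEAK_END = 16, 21  # the 1600-2100 window cited in the article
THROTTLE_HOURS = 5

def update_throttle(now, usage_gb, limit_gb, throttled_until=None):
    """Return when throttling ends, or None if the user is unthrottled.

    Hypothetical model of the stated policy, not Virgin Media's actual code.
    """
    if throttled_until is not None and now < throttled_until:
        return throttled_until  # speed drop already in effect
    if PEAK_START <= now.hour < PEAK_END and usage_gb > limit_gb:
        # Excess recorded during peak: drop speed for five hours from now.
        return now + timedelta(hours=THROTTLE_HOURS)
    return None
```

Note that under this model the same usage sampled outside the peak window triggers nothing, which matches the spokesman's description of when the system "comes into play".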


Andrew Ferguson, an editor at Think Broadband, said net service firms manage their bandwidth in many different ways.

Almost all, he said, manage traffic but at certain times impose other systems to smooth out the peaks.

"Some firms will happily let you use as much as you like but will charge you accordingly, and business products that are more expensive often allow unlimited use," he said.

Others impose charges on customers who regularly exceed their download limits and a few manage their system so users cannot exceed a monthly download cap. The limits that firms impose can also vary widely.

"Any ISP that does not do traffic management is not going to stay in business very long," said Gavin Johns, managing director of net management firm Epitiro.

He said it was essential to ensure that services which have to be delivered in real time, such as voice and streaming video, were usable.

"Different applications use different ports and have different payloads," said Mr Johns, "They look completely different from a network point of view."

"If they didn't do traffic management we would all complain," he said.

Mr Ferguson from Think Broadband said although traffic management was common, net providers imposed it in contrasting ways.

"What varies is the degree it impacts users and the openness of providers in telling users it exists and what is and is not managed," he told the BBC News website.

"Traffic management has a poor reputation as in many cases it is used to keep bandwidth costs down for a provider with little respect to the consumers' wishes," he said.
http://news.bbc.co.uk/2/hi/technology/7246403.stm





Underbelly Ban 'Useless' Online
Darren Osborne

VICTORIAN internet users won't have to wait long to see illegal versions of the Nine Network's new crime TV series Underbelly, despite a court order banning its showing in the state.

Supreme Court judge Justice Betty King today issued a suppression order preventing the series from being screened in Victoria and ordered the removal of all excerpts from the internet.

While the series will premiere around the rest of Australia tomorrow night, the court fears its showing in Victoria could prejudice a murder trial.

However, the explosion of video on the internet, through websites such as YouTube and through BitTorrent file sharing, may allow Victorians to see the drama within hours of its broadcast interstate.

A scan of YouTube today revealed several copies of the promotional clips from the series on the website, despite their removal from Nine's site.

"This is a great problem on the internet,'' said University of NSW Cyberspace Law and Policy Centre executive director David Vaile.

"Legal jurisdiction is typically limited by geography, and by its nature the internet doesn't place much regard to geography."

Mr Vaile believed Justice King may have taken into consideration the possibility that copies of the drama would appear on the internet, but decided that they would have limited impact on potential jurors.

''(She) may well have decided that something that is not the official publisher's website will not have the same sort of impact,'' he said.

However, watching illegal versions of the underworld drama will not be without risk.

Mr Vaile said people caught uploading clips from Underbelly could face copyright and contempt of court charges.

"There is potential in some circumstances for that order to render people in contempt,'' Mr Vaile said.

He added that despite the belief that the internet provides anonymity, authorities would be able to track down any culprits.

"There is a false perception around that activity on the internet is anonymous or that's untraceable. Unfortunately, the opposite is the case," Mr Vaile said.

"The terms of service (on YouTube and Facebook) don't really offer any protection of privacy or anonymity."

Mr Vaile said website hosts such as YouTube and Facebook could also face charges regarding breach of copyright.

He added that once the clips were on the internet it would be impossible to remove them completely.

"Once it gets out onto the internet, subsequent copying is also a problem,'' Mr Vaile said.

"The further along it gets the harder it is to put back into the bag."
http://www.australianit.news.com.au/...013040,00.html





Better Than Free
Kevin Kelly

The internet is a copy machine. At its most foundational level, it copies every action, every character, every thought we make while we ride upon it. In order to send a message from one corner of the internet to another, the protocols of communication demand that the whole message be copied along the way several times. IT companies make a lot of money selling equipment that facilitates this ceaseless copying. Every bit of data ever produced on any computer is copied somewhere. The digital economy is thus run on a river of copies. Unlike the mass-produced reproductions of the machine age, these copies are not just cheap, they are free.

Our digital communication network has been engineered so that copies flow with as little friction as possible. Indeed, copies flow so freely we could think of the internet as a super-distribution system, where once a copy is introduced it will continue to flow through the network forever, much like electricity in a superconductive wire. We see evidence of this in real life. Once anything that can be copied is brought into contact with the internet, it will be copied, and those copies never leave. Even a dog knows you can't erase something once it has flowed onto the internet.

This super-distribution system has become the foundation of our economy and wealth. The instant reduplication of data, ideas, and media underpins all the major economic sectors in our economy, particularly those involved with exports — that is, those industries where the US has a competitive advantage. Our wealth sits upon a very large device that copies promiscuously and constantly.

Yet the previous round of wealth in this economy was built on selling precious copies, so the free flow of free copies tends to undermine the established order. If reproductions of our best efforts are free, how can we keep going? To put it simply, how does one make money selling free copies?

I have an answer. The simplest way I can put it is thus:

When copies are super abundant, they become worthless.
When copies are super abundant, stuff which can't be copied becomes scarce and valuable.

When copies are free, you need to sell things which can not be copied.

Well, what can't be copied?

There are a number of qualities that can't be copied. Consider "trust." Trust cannot be copied. You can't purchase it. Trust must be earned, over time. It cannot be downloaded. Or faked. Or counterfeited (at least for long). If everything else is equal, you'll always prefer to deal with someone you can trust. So trust is an intangible that has increasing value in a copy saturated world.

There are a number of other qualities similar to trust that are difficult to copy, and thus become valuable in this network economy. I think the best way to examine them is not from the eye of the producer, manufacturer, or creator, but from the eye of the user. We can start with a simple user question: why would we ever pay for anything that we could get for free? When anyone buys a version of something they could get for free, what are they purchasing?

From my study of the network economy I see roughly eight categories of intangible value that we buy when we pay for something that could be free.

In a real sense, these are eight things that are better than free. Eight uncopyable values. I call them "generatives." A generative value is a quality or attribute that must be generated, grown, cultivated, nurtured. A generative thing can not be copied, cloned, faked, replicated, counterfeited, or reproduced. It is generated uniquely, in place, over time. In the digital arena, generative qualities add value to free copies, and therefore are something that can be sold.

Eight Generatives Better Than Free

Immediacy — Sooner or later you can find a free copy of whatever you want, but getting a copy delivered to your inbox the moment it is released — or even better, produced — by its creators is a generative asset. Many people go to movie theaters to see films on the opening night, where they will pay a hefty price to see a film that later will be available for free, or almost free, via rental or download. Hardcover books command a premium for their immediacy, disguised as a harder cover. First in line often commands an extra price for the same good. As a sellable quality, immediacy has many levels, including access to beta versions. Fans are brought into the generative process itself. Beta versions are often de-valued because they are incomplete, but they also possess generative qualities that can be sold. Immediacy is a relative term, which is why it is generative. It has to fit with the product and the audience. A blog has a different sense of time than a movie, or a car. But immediacy can be found in any media.

Personalization — A generic version of a concert recording may be free, but if you want a copy that has been tweaked to sound perfect in your particular living room — as if it were performed in your room — you may be willing to pay a lot. The free copy of a book can be custom edited by the publishers to reflect your own previous reading background. A free movie you buy may be cut to reflect the rating you desire (no violence, dirty language okay). Aspirin is free, but aspirin tailored to your DNA is very expensive. As many have noted, personalization requires an ongoing conversation between the creator and consumer, artist and fan, producer and user. It is deeply generative because it is iterative and time consuming. You can't copy the personalization that a relationship represents. Marketers call that "stickiness" because it means both sides of the relationship are stuck (invested) in this generative asset, and will be reluctant to switch and start over.

Interpretation — As the old joke goes: software, free. The manual, $10,000. But it's no joke. Several high-profile companies, such as Red Hat and Apache, make their living doing exactly that: they provide paid support for free software. The copy of code, being mere bits, is free — and becomes valuable to you only through the support and guidance. I suspect a lot of genetic information will go this route. Right now getting a copy of your DNA is very expensive, but soon it won't be. In fact, soon pharmaceutical companies will PAY you to get your genes sequenced. So the copy of your sequence will be free, but the interpretation of what it means, what you can do about it, and how to use it — the manual for your genes so to speak — will be expensive.

Authenticity — You might be able to grab a key software application for free, but even if you don't need a manual, you might like to be sure it is bug free, reliable, and warranted. You'll pay for authenticity. There are nearly an infinite number of variations of the Grateful Dead jams around; buying an authentic version from the band itself will ensure you get the one you wanted. Or that it was indeed actually performed by the Dead. Artists have dealt with this problem for a long time. Graphic reproductions such as photographs and lithographs often come with the artist's stamp of authenticity — a signature — to raise the price of the copy. Digital watermarks and other signature technology will not work as copy-protection schemes (copies are super-conducting liquids, remember?) but they can serve up the generative quality of authenticity for those who care.

Accessibility — Ownership often sucks. You have to keep your things tidy, up-to-date, and, in the case of digital material, backed up. And in this mobile world, you have to carry it along with you. Many people, me included, will be happy to have others tend our "possessions" by subscribing to them. We'll pay Acme Digital Warehouse to serve us any musical tune in the world, when and where we want it, as well as any movie or photo (ours or other photographers'). Ditto for books and blogs. Acme backs everything up, pays the creators, and delivers us our desires. We can sip it from our phones, PDAs, laptops and big screens from wherever. Most of this material will be available free if we are willing to tend it ourselves, but backing it up, keeping it current, adding to it, and organizing it will be less and less appealing as time goes on.

Embodiment — At its core the digital copy is without a body. You can take a free copy of a work and throw it on a screen. But perhaps you'd like to see it in hi-res on a huge screen? Maybe in 3D? PDFs are fine, but sometimes it is delicious to have the same words printed on bright white cottony paper, bound in leather. Feels so good. What about dwelling in your favorite (free) game with 35 others in the same room? There is no end to greater embodiment. Sure, the hi-res of today — which may draw ticket holders to a big theater — may migrate to your home theater tomorrow, but there will always be new insanely great display technology that consumers won't have. Laser projection, holographic display, the holodeck itself! And nothing gets embodied as much as music in a live performance, with real bodies. The music is free; the bodily performance expensive. This formula is quickly becoming a common one for not only musicians, but even authors. The book is free; the bodily talk is expensive.

Patronage — It is my belief that audiences WANT to pay creators. Fans like to reward artists, musicians, authors and the like with the tokens of their appreciation, because it allows them to connect. But they will only pay if it is very easy to do, a reasonable amount, and they feel certain the money will directly benefit the creators. Radiohead's recent high-profile experiment in letting fans pay them whatever they wished for a free copy is an excellent illustration of the power of patronage. The elusive, intangible connection that flows between appreciative fans and the artist is worth something. In Radiohead's case it was about $5 per download. There are many other examples of the audience paying simply because it feels good.

Findability — Whereas the previous generative qualities reside within creative digital works, findability is an asset that occurs at a higher level, in the aggregate of many works. A zero price does not help direct attention to a work, and in fact may sometimes hinder it. But no matter what its price, a work has no value unless it is seen; unfound masterpieces are worthless. When there are millions of books, millions of songs, millions of films, millions of applications, millions of everything requesting our attention — and most of it free — being found is valuable.

The giant aggregators such as Amazon and Netflix make their living in part by helping the audience find works they love. They bring out the good news of the "long tail" phenomenon, which, as we all know, connects niche audiences with niche productions. But sadly, the long tail is only good news for the giant aggregators, and larger mid-level aggregators such as publishers, studios, and labels. The "long tail" is only lukewarm news to creators themselves. But since findability can really only happen at the systems level, creators need aggregators. This is why publishers, studios, and labels (PSL) will never disappear. They are not needed for distribution of the copies (the internet machine does that). Rather the PSL are needed for the distribution of the users' attention back to the works. From an ocean of possibilities the PSL find, nurture and refine the work of creators that they believe fans will connect with. Other intermediates such as critics and reviewers also channel attention. Fans rely on this multi-level apparatus of findability to discover the works of worth out of the zillions produced. There is money to be made (indirectly for the creatives) by finding talent. For many years the paper publication TV Guide made more money than all of the three major TV networks it "guided" combined. The magazine guided and pointed viewers to the good stuff on the tube that week. Stuff, it is worth noting, that was free to the viewers. There is little doubt that besides the mega-aggregators, in the world of the free many PSLs will make money selling findability — in addition to the other generative qualities.

These eight qualities require a new skill set. Success in the free-copy world is not derived from the skills of distribution since the Great Copy Machine in the Sky takes care of that. Nor are legal skills surrounding Intellectual Property and Copyright very useful anymore. Nor are the skills of hoarding and scarcity. Rather, these new eight generatives demand an understanding of how abundance breeds a sharing mindset, how generosity is a business model, how vital it has become to cultivate and nurture qualities that can't be replicated with a click of the mouse.

In short, the money in this networked economy does not follow the path of the copies. Rather it follows the path of attention, and attention has its own circuits.

Careful readers will note one conspicuous absence so far. I have said nothing about advertising. Ads are widely regarded as the solution, almost the ONLY solution, to the paradox of the free. Most of the suggested solutions I've seen for overcoming the free involve some measure of advertising. I think ads are only one of the paths that attention takes, and in the long-run, they will only be part of the new ways money is made selling the free.

But that's another story.

Beneath the frothy layer of advertising, these eight generatives will supply the value to ubiquitous free copies, and make them worth advertising for. These generatives apply to all digital copies, but also to any kind of copy where the marginal cost of that copy approaches zero. (See my essay on Technology Wants to Be Free.) Even material industries are finding that the costs of duplication are nearing zero, so they too will behave like digital copies. Maps just crossed that threshold. Genetics is about to. Gadgets and small appliances (like cell phones) are sliding that way. Pharmaceuticals are already there, but they don't want anyone to know. It costs nothing to make a pill. We pay for Authenticity and Immediacy in drugs. Someday we'll pay for Personalization.

Maintaining generatives is a lot harder than duplicating copies in a factory. There is still a lot to learn. A lot to figure out. Write to me if you do.
http://www.edge.org/3rd_culture/kell...y08_index.html





How Internet Censorship Works
Jonathan Strickland

Introduction to How Internet Censorship Works

One of the early nicknames for the Internet was the "information superhighway" because it was supposed to provide the average person with fast access to a practically limitless amount of data. For many users, that's exactly what accessing the Internet is like. For others, it's as if the information superhighway has some major roadblocks in the form of Internet censorship.

The motivations for censorship range from well-intentioned desires to protect children from unsuitable content to authoritarian attempts to control a nation's access to information. No matter what the censors' reasons are, the end result is the same: They block access to the Web pages they identify as undesirable.

Internet censorship isn't just a parental or governmental tool. There are several software products on the consumer market that can limit or block access to specific Web sites. Most people know these programs as Web filters. Censorship opponents have another name for them: Censorware.

While there are some outspoken supporters and opponents of Internet censorship, it's not always easy to divide everyone into one camp or another. Not everyone uses the same tactics to accomplish goals. Some opponents of censorship challenge government policies in court. Others take the role of information freedom fighters, providing people with clandestine ways to access information.

Beyond the Net
Sometimes the fight moves from the online world to the real one. In 2006, a group of men attacked United States citizen Peter Yuan Li in his Atlanta home. Li was an anti-censorship activist and a practitioner of Falun Gong, a spiritual following similar to Buddhism. Li maintained Web sites that criticized China's Communist party. His assailants bound him and demanded to know where he stored his information. They beat Li severely and stole two laptop computers, leaving other valuables untouched. Li believed the men were sent by the Chinese government to silence him [source: Forbes].


In this article, we'll look at the different levels of Internet censorship, from off-the-shelf Web filters to national policy. We'll also learn about the ways some people are trying to fight censorship.

We'll start off by looking at Internet censorship on the domestic level.

Internet Censorship at Home

There's no denying that the Internet contains a lot of material that most parents wouldn't want their children to see. Whether it's pornography, hate speech, chat rooms or gambling sites, many parents worry that their children will be exposed to negative or even dangerous content. While some opponents of censorship may feel that parental supervision is the best way to keep kids safe online, many parents point out that it's difficult -- if not impossible -- to oversee a child's access to the Internet all the time.

Many parents turn to software and hardware solutions to this problem. They can purchase Web filtering programs like Net Nanny or CYBERsitter to block access to undesirable Web sites. These programs usually have a series of options parents can select to limit the sites their children can access. These options tell the program which filters to enable. For example, CYBERsitter has 35 filter categories, including pornography and social networking sites [source: CYBERsitter].

Most Web filters use two main techniques to block content: Blacklists and keyword blocking. A blacklist is a list of Web sites that the Web filter's creators have designated as undesirable. Blacklists change over time, and most companies offer updated lists for free. Any attempt to visit a site on a blacklist fails. With keyword blocking, the software scans a Web page as the user tries to visit it. The program analyzes the page to see if it contains certain keywords. If the program determines the Web page isn't appropriate, it blocks access to the page.
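The two techniques can be sketched together in a few lines. This is a toy model, not how Net Nanny or CYBERsitter actually work; the hostnames and keywords below are invented for illustration.

```python
# A toy Web filter combining the two techniques the article describes:
# a blacklist of known sites, plus a naive keyword scan of page text.
BLACKLIST = {"badsite.example.com", "casino.example.net"}   # illustrative
BLOCKED_KEYWORDS = {"gambling", "jackpot"}                  # illustrative

def is_blocked(host: str, page_text: str) -> bool:
    """Block a page if its host is blacklisted or its text hits a keyword."""
    if host.lower() in BLACKLIST:        # blacklist check first: it's cheap
        return True
    words = page_text.lower().split()    # keyword scan: no sense of context,
    return any(w in BLOCKED_KEYWORDS for w in words)  # hence false positives
```

The keyword scan has no notion of context, which is exactly why early filters blocked chicken breast recipes: a bare substring or word match cannot tell an innocent cooking page from an objectionable one.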

Web Filter Controversy
Censorship opponents have some big problems with Web filtering software. Many Web filtering programs encrypt their blacklists, claiming that it helps minimize abuse. Opponents point out that the encrypted blacklist could also include Web pages that aren't inappropriate at all, including pages that criticize the creators of the Web filter. Even if the programs' creators aren't blocking these sites on purpose, it's easy for a Web filter to restrict access to the wrong sites. That's because programs that search for keywords can't detect context. For example, early Web filters would often block access to chicken breast recipes. The programs couldn't tell the difference between an innocent site about cooking and a pornographic site, so they blocked all of them indiscriminately.


Another option for parents is to install a firewall. A computer firewall provides protection from dangerous or undesirable content. Firewalls can be software or hardware. They act as a barrier between the Internet and your computer network. They only let safe content through and keep everything else out. Firewalls require a little more involvement from the network administrator (in this case, a parent) than Web filtering software. Tech-savvy parents might not have a problem installing and maintaining a firewall. Others prefer to use Web filters, which do most of the work for them.

Have you ever tried to access a Web site at your workplace only to receive an intimidating message? Some companies limit the kinds of sites employees can visit.

Big Businesses and Internet Censorship

Corporations that restrict employee Internet access usually do so for a few reasons. One of the most common reasons is to increase productivity. While employees can use the Internet for research or communication, they may also use it as a distraction. Some companies restrict Internet access severely in order to prevent employees from wasting time online.

Another corporate concern is harassment. Without restrictions, an employee could surf the Web for inappropriate content, such as pornography. If other employees see this material, they may feel that their work environment is a hostile one. Some companies resort to using Internet censorship in order to avoid lawsuits.

While several companies use Web filtering software similar to the products available for home use, many also rely on firewalls. With a firewall, a company can pick and choose which Web pages or even entire domains to block. This way, the company is more likely to avoid blocking sites that employees may need to access legitimately.

At many workplaces, when an employee attempts to access a restricted Web site, he or she will see a message that says the network administrator has identified the site as inappropriate. Usually the message includes the option to petition the network administrator if the user feels the site is wrongfully blocked. The network administrator can adjust which sites are restricted through firewall settings.

What about the corporations that provide Internet access, such as telecom and cable companies? They can play a crucial role in what content customers can access on the Internet. In the United States, there's an ongoing battle over a concept called net neutrality. In a nutshell, net neutrality refers to a level playing ground where Internet service providers (ISPs) allow access to all content without favoring any particular company or Web site. Telecom and cable companies successfully petitioned the Supreme Court to dismiss net neutrality [source: ACLU].

Without net neutrality, ISPs can charge content providers a fee for bandwidth usage. Content providers that pay the fee will get more broadband access, meaning their Web sites will load faster than competitors who didn't pay the fee. For example, if Yahoo pays a fee to an ISP and Google didn't, the ISP's customers would discover that Yahoo's search engine loads much faster than Google's. Supporters of net neutrality argue that such preferential treatment amounts to censorship.

Search Engine Censorship Part One
Most search engines self-censor their search engine results pages (SERPs) in an effort to provide users with relevant results. This is necessary because some webmasters try to trick search engines into giving their Web pages high SERP ranks. If the search engines didn't weed out and censor these pages, every SERP would be filled with irrelevant results.


Internet Censorship at the International Level

Many countries restrict access to content on the Internet on some level. Even the United States has laws that impact the kind of information you can access on the Internet in a school or public library. Some countries go much further than that -- and a few don't allow any access to the Internet at all.

The OpenNet Initiative (ONI), an organization dedicated to informing the public about Web filtering and surveillance policies around the world, classifies Web filtering into four categories:

• Political: Content that includes views contrary to the respective country's policies. The political category also includes content related to human rights, religious movements and other social causes
• Social: Web pages that focus on sexuality, gambling, drugs and other subjects that a nation might deem offensive
• Conflict/Security: Pages that relate to wars, skirmishes, dissent and other conflicts
• Internet tools: Web sites that offer tools like e-mail, instant messaging, language translation applications and ways to circumvent censorship

Think About the Children!
In 2000, the United States Congress enacted the Children's Internet Protection Act (CIPA) into law. The law imposes restrictions on Internet access in schools and public libraries that receive funding from the E-rate program, which makes certain technologies more affordable for schools and libraries. Critics of the law say that the act violates the First Amendment to the Constitution. In 2003, the Supreme Court upheld CIPA in a 5 to 4 decision [source: Supreme Court].


Countries like the United States are fairly liberal, with policies that restrict only a few Web pages, but other countries are stricter. According to Reporters Without Borders, an organization dedicated to promoting free expression and the safety of journalists, the following countries have the strongest censorship policies:

• Belarus
• China
• Cuba
• Egypt
• Iran
• Myanmar
• North Korea
• Saudi Arabia
• Syria
• Tunisia
• Turkmenistan
• Uzbekistan
• Vietnam

Some countries go well beyond restricting access. The Myanmar government allegedly keeps Internet cafés under surveillance with computers that automatically take screenshots every few minutes. China has an advanced filtering system known internationally as the Great Firewall of China. It can search new Web pages and restrict access in real time. It can also search blogs for subversive content and block Internet users from visiting them. Cuba has banned private Internet access completely -- to get on the Internet, you have to go to a public access point.

Search Engine Censorship Part Two
Recently, censorship opponents have criticized search engine companies like Yahoo and Google for helping restrictive countries maintain control of the Internet. The companies are in a delicate position -- although headquartered in the United States, they still need to obey local laws when operating in other countries.


There are several organizations dedicated to ending Internet censorship.

Opponents of Internet Censorship

In addition to the thousands of people who combat censorship through blogs every day, there are several organizations that raise awareness about Internet censorship. Some are formal organizations with prestigious memberships, while others are looser groups that aren't above advocating a guerrilla approach to getting around strict policies.

The American Civil Liberties Union (ACLU) is an adamant opponent of Internet censorship. The ACLU has filed numerous lawsuits in order to overturn censorship laws. In 2007, the ACLU convinced a federal court that the Child Online Protection Act (COPA) was unconstitutional. COPA was a law that made it illegal to present material online that was deemed harmful to minors, even if it included information valuable to adults [source: ACLU].

The OpenNet Initiative is a group that strives to provide information to the world about the ways countries allow or deny citizens access to information. The initiative includes departments at the University of Toronto, the Harvard Law School, Oxford University and the University of Cambridge. On ONI's Web page you can find an interactive map that shows which countries censor the Internet.

Reporters Without Borders also concerns itself with Internet censorship, although the group's scope extends beyond Internet practices. The group maintains a list of "Internet enemies," countries that have the most severe Internet restrictions and policies in place [source: Reporters Without Borders].

The Censorware Project has been around since 1997. Its mission is to educate people about Web filtering software and practices. At its Web site, you can find investigative reports about all the major Web filter programs available on the market as well as essays and news reports about censorship. A similar site is Peacefire.org, which began as a site dedicated to protecting free speech on the Internet for young people.

We Don't Need No Thought Control
In 2007, AT&T came under fire when music fans discovered that the company had edited out political comments in a Webcast performance by the band Pearl Jam. The band covered Pink Floyd's song "Another Brick in the Wall" and added lyrics criticizing United States President George W. Bush. AT&T cut the new lyrics out of the song before Webcasting it. After an outcry from fans, the company eventually admitted that it wasn't an isolated incident, though AT&T spokeswoman Tiffany Nels claimed that it was never AT&T's intent to remove political statements from Webcasts [source: MTV].


Other groups offer advice on how to disable or circumvent censorware. Some advocate using proxy sites. A proxy site is a Web page that allows you to browse the Web without using your own Internet protocol (IP) address. You visit the proxy site, which includes a form into which you type the URL of the restricted sites you want to visit. The proxy site retrieves the information and displays it. Outsiders can only see that you've visited the proxy site, not the sites you've pulled up.
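The same idea can be sketched in code. This is a minimal illustration of proxy-mediated browsing using a conventional HTTP proxy rather than the form-based proxy sites the article describes; the proxy address is hypothetical, not a real service:

```python
# Minimal sketch of proxy-mediated browsing: every request is routed
# through the proxy, so the destination server logs the proxy's IP
# address rather than yours. "proxy.example.org" is a hypothetical
# proxy address, not a real service.
import urllib.request

def make_proxy_opener(proxy_url="http://proxy.example.org:8080"):
    """Build an opener that sends all HTTP(S) traffic via the proxy."""
    handler = urllib.request.ProxyHandler({"http": proxy_url,
                                           "https": proxy_url})
    return urllib.request.build_opener(handler)

# Usage (requires a reachable proxy):
# opener = make_proxy_opener("http://some-real-proxy:8080")
# html = opener.open("http://blocked.example.com/").read()
```

A form-based proxy site does the same thing server-side: its web server fetches the restricted page on your behalf, so only the proxy's address appears in the destination's logs.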

It may be decades before the Internet reaches its full potential as a conduit for ideas. Ironically, it isn't going to get there through technological breakthroughs, but through changes in national and corporate policies.
http://computer.howstuffworks.com/in...censorship.htm





Canada a Top Copyright Violator, U.S. Group Says
CBC News

Canada has joined Russia and China as the biggest violators of U.S. copyright law, according to the U.S.-based International Intellectual Property Alliance.

In a report filed to U.S. Trade Representative Susan Schwab on Monday, the group recommended that Canada join the other two countries on the USTR's Priority Watch List.

Countries on this list are subject to accelerated investigations and possible trade sanctions.

The IIPA said Canada is the only country in the 30-member Organization for Economic Co-operation and Development that has yet to modernize its copyright law or meet the minimum global standards set out in the World Intellectual Property Organization treaty signed in 1996.

"Canada has taken no meaningful steps toward modernizing its copyright law to meet the minimum global standards of the WIPO internet treaties, which it signed more than a decade ago," the report said.

"In 2007, parliamentary leadership and the government, at its highest levels, acknowledged many of these deficiencies and the government listed copyright reform among its top legislative priorities. But these encouraging statements have not yet evolved into anything more concrete."

Other countries named

The group also recommended that 10 other countries be placed on the Priority Watch List, including Argentina, Chile, Costa Rica, Egypt, India, Mexico, Peru, Saudi Arabia, Thailand and Ukraine.

It also recommended that 29 other countries, including Spain, Sweden and Israel, be placed or maintained on the less critical Watch List. A total of 51 countries were named as significant violators.

Schwab's office will release its annual watch lists at the end of April.

Canadian copyright experts said the IIPA's recommendations should be taken with a grain of salt, since the organization has singled out a good number of countries that have trade dealings with the United States.

"It's little more than a lobbying exercise which lacks reliable and objective analysis," said University of Ottawa internet law professor Michael Geist. "With 51 countries, some of which are leaders in Europe and Asia, Canada is in very good company. It almost begs the question — who's the one that's really offside here?"

The IIPA, which says it represents the copyright interests of 1,900 U.S. companies, estimated the United States lost more than $18 billion U.S. in trade through copyright piracy in 2007, up 20 per cent from $15 billion a year earlier.

According to the group, Canada contributed about $511 million of that loss — up from $494 million in 2006 — while China led the pack with a $2.9-billion contribution, up from $2.4 billion.

The group's report is the latest to urge the U.S. government into pressuring Ottawa to reform copyright laws. U.S. ambassador to Canada David Wilkins has repeatedly said Canada's laws are the most lax among G7 nations, while the recording industries on both sides of the border have called for tougher rules.

Minister of Industry Jim Prentice was to introduce a draft bill in December but backed off because of widespread public opposition.

Geist, who has led the opposition against a U.S.-style bill, said the legislation may be introduced as early as this week. Given the government's other priorities, including the upcoming budget and debate over Canada's military involvement in Afghanistan, he said it's unlikely much time has been spent on revising the bill.

"It's more likely they've revised their communications strategy on how to sell it," he said.

Teachers, musicians, artists, telecommunications companies, retailers, privacy commissioner Jennifer Stoddart, as well as more than 40,000 members of a Facebook group devoted to the issue have voiced their opposition to what they say was the overly restrictive, U.S.-style legislation that Prentice was expected to table.
http://www.cbc.ca/technology/story/2...copyright.html





Business Coalition Opposes Harsh Copyright Reform

Google, Rogers, Retail Council among those urging changes to legislation
CBC News

A who's who of powerful companies and business associations have banded together to push for less restrictive copyright reform, driving a stake into the heart of the federal government's argument for its new copyright bill.

The Business Coalition for Balanced Copyright, a group that includes Google, Yahoo, Rogers, Telus, the Canadian Alliance of Broadcasters and the Retail Council of Canada, among others, on Tuesday sent its stance on seven key copyright principles to Industry Minister Jim Prentice, Canadian Heritage Minister Josée Verner and several other cabinet ministers.

According to the document obtained by CBCNews.ca, the coalition wants any new copyright legislation to include measures that enshrine the rights of consumers to use the copyrighted material they buy in different ways, as well as the rights of companies to do so in their daily business practices.

The group also said internet service providers should not be held liable for copyright violations that occur on their networks, and that Canada should put into place measures that prevent the highly punitive lawsuits seen in the United States.

"We're looking for a balanced approach between the rights of the users and the rights of the copyright holders," said Pam Dinsmore, vice-president of regulatory affairs for Rogers. "Our concern would be that the government would err too much on the side of the copyright holders."

Google's Canada policy counsel Jacob Glick said harsh copyright rules would limit innovation on the internet by preventing users from engaging in acts such as parody, or making the mash-up videos that have become popular on its YouTube website.

"Canada's current approach to fair dealing ossifies the tiny and exhaustive list of exceptions to copyright and as such stifles cultural and technological innovation," Glick wrote on Google's public policy blog. "Flexible exceptions and limitations, which encourage creativity and innovation, are integral to balanced copyright law."

The coalition joins a long list of individuals and concerned parties opposed to the proposed legislation, including teachers, librarians, musicians, privacy commissioner Jennifer Stoddart, and more than 40,000 members of a Facebook group devoted to the issue.

The proposed bill has not been made public yet, but Prentice's silence on its contents and a lack of public consultation have led many to believe the new law will mirror the Digital Millennium Copyright Act passed in the United States in 1998. Critics say the U.S. rules have been overly restrictive, haven't curbed copyright violations and have spawned a wave of excessive lawsuits, such as the one in which a Minnesota woman was ordered to pay $222,000 U.S. last year for sharing music online.

Prentice was to introduce the bill in December but shelved it after the public voiced its opposition. He promised to revisit it early this year.

Critics on Wednesday said the coalition's stance undermines one of Prentice's key arguments in favour of the bill — that the business community has been demanding reforms.

"Those claims have now been completely undermined," wrote University of Ottawa internet law professor Michael Geist on his blog. "With such an impressive list of backers, the Industry Minister must now surely recognize that his proposed bill is opposed by the very industries that he has promised to support."

The coalition wants the following in any new copyright legislation:

• Expanded "fair dealing" provisions for users, which give a "large and liberal" interpretation of how consumers can use the copyrighted material they buy. The Copyright Act needs to be amended to accommodate long-standing and accepted uses, such as the copying of a song from one format to another — for example, from a CD to a computer — or the recording of a television program for later viewing, they say. A number of Canada's major trading partners already allow such uses.
• A clause that prevents copyright owners from going after people or companies who circumvent for non-commercial reasons the technological protection measures placed on content. A record label, for example, should not be able to sue a consumer who gets around copy-protection measures in order to transfer a song to an iPod.
• No surcharges on downloadable content. Copyright owners have been pushing for downloads to be considered as "communications to the public," and say they should therefore be subject to an additional fee. The coalition believes such a charge would unfairly double the delivery cost of online music, films, games and other software.
• A scrapping of the surcharge on recordable media, including CDs and MP3 players. The surcharge was originally introduced to compensate artists for revenue they were losing through illegal downloads, which were ending up on the recordable media. But with the proliferation of legal online stores, artists are being compensated for many downloads. The government should therefore consider getting rid of the tax or scaling it back, the coalition says.
• An exemption for violating copyright as part of legitimate business practices, such as when a broadcaster copies a show for its archives.
• No liability for internet service providers for the actions of their users.
• A limit on the damages that can be awarded to copyright holders in lawsuits against those infringing on their copyrights.

Prentice has also been criticized for listening to lobby groups, particularly in the United States, who want Canada to back up the World Intellectual Property Organization treaty it signed in 1996 with strong laws. The U.S.-based International Intellectual Property Alliance this week asked U.S. Trade Representative Susan Schwab to put Canada on its priority watch list, alongside Russia and China, because they say Canada is a top violator of U.S. copyrights.

The parties that make up the coalition, which does want to see copyright reform — but with fewer restrictions — first began talking in December, when they realized they weren't going to be consulted, they said.

"With the lack of consultations, [we decided] we should come together and build a common position," said Kim Furlong, vice-president of government relations for the Retail Council of Canada. "The ministers will now have something to look back on to see this is where the business community stands."

Furlong said the coalition signatories — which also include the Canadian Association of Internet Providers, a division of CATAlliance, the Canadian Cable Systems Alliance, the Canadian Wireless and Telecommunications Association, the Computer and Communications Industry Association, Third Brigade, Tucows, Cogeco Cable, EastLink, MTS Allstream and SaskTel — hold the seven principles in varying levels of priority. The issues of fair dealing and media copying are of particular concern to the Retail Council, she said.

Telecommunications providers such as Rogers, on the other hand, are more concerned about being held liable for what their customers use the internet for, or for how services such as music downloads on cellphones are taxed, Rogers' Dinsmore said.

A spokesperson for Prentice could not be immediately reached for comment.

Furlong said it is unlikely the proposed legislation will be introduced in Parliament before the federal budget is tabled on Feb. 26. The Conservatives wouldn't want the bill to die on the order paper, which could happen if the budget forces an election, and are therefore likely to hold off until afterward, she said.

"That would be wise," she said.
http://www.cbc.ca/technology/story/2...copyright.html





Canadian DMCA On Hold?
Michael Geist

Rumours tonight indicate that the government has again decided to delay introducing the Canadian DMCA. With the House of Commons off next week and the budget coming the following week, if this is true it would appear that there will be no copyright legislation for at least another month (assuming there is no election). It is impossible to pinpoint any one reason for the delay - the public outcry through Facebook, the treaty issue, the opposition from education, the impact of copyright on some MPs electoral chances, the privacy concerns, the outcry from artists groups, the op-eds, and the creation of a powerful business coalition calling for balanced copyright may have all played a role.

The key question now is whether the government accepts that it is not the communications strategy that needs fixing, but the bill itself. I would argue that there is a deal to be had that would leave virtually all stakeholders sufficiently satisfied to garner broad support (or at least limit the heated and potentially costly opposition). A bill that implements WIPO by rendering circumvention an infringement only where it occurs for infringing purposes, the introduction of a more flexible fair dealing provision, the establishment of a notice-and-notice system for infringing content, and the creation of intermediary protection for third party content could serve as the foundation for a forward-looking package that meets consumer needs, business demands, and U.S. pressure. Over the past 10 weeks we have witnessed the passion and interest of thousands of Canadians on the copyright issue. The government may have heard enough to hold off on a Canadian DMCA. Let's hope it uses this opportunity to consult and build broad support for a copyright solution that serves Canadian interests.
http://www.michaelgeist.ca/content/view/2694/125/





World Leader in Movie Piracy Flees from the Mounties
enigmax

Last month we reported that Geremi Adam, producer of some of the highest quality pirate movie copies ever seen on the Internet, had been caught and had been ordered to appear in court in January. Adam, aka ‘maVen’ had other ideas - and has disappeared.

Between 2004 and 2006, Geremi Adam delighted the movie piracy scene with some of the highest quality Telesync movies ever seen. From ‘The Bourne Supremacy’ and ‘Harry Potter and the Goblet of Fire’ through to ‘Spongebob Squarepants’ and plenty of other titles, the work of ‘maVen’ set a very high standard for quality pirated movies.

Following an FBI investigation into ‘maVen’, his file was handed to the Royal Canadian Mounted Police (RCMP) in April 2006. By September of the same year, Geremi Adam was arrested by police outside a Montreal theater after ‘camming’ the movies ‘How to Eat Fried Worms’ and ‘Invincible’. They seized his laptop and other equipment but later released him. A month later he was arrested again outside another theater.

Facing a $25,000 fine and six months in jail, Geremi Adam was ordered to appear in court on 30th January. Clearly unimpressed by the prospect of being locked up and/or bankrupted, he failed to appear in court and has gone on the run.

“We have a warrant and the police officers will try to find Mr. Adam,” said federal prosecutor Yacine Agnaou in a statement. “When he is found, he will be mandated to appear in court.”

Meanwhile, outside the court, a group calling itself ‘Hors-d-Oeuvre’ staged a demonstration, saying that ‘maVen’ is being unfairly treated and that all media should be available on the Internet for free. “Free Geremi!” they chanted in unison.

Fortunately for Adam, he committed the alleged offenses before tough new legislation was introduced to punish movie cammers caught in Canada. He will likely escape the severe punishment of two years in prison, a fate awaiting fellow cammer Louis-Rene Hache. RCMP Staff Sgt. Noel St-Hilaire said: “Unfortunately at the time there was no legislation that forbid anyone from filming in a cinema. There’s not much we could do then other than issue a warning.”

Although not turning up at court is likely to inflame the situation, it will be interesting to see whether the Mounties are prepared to put any serious effort into ‘getting their man’ in this instance.
http://torrentfreak.com/maven-world-...nties-0800212/





DVD Rip Automates One-Click DVD Ripping



Windows only: Rip and back up any DVD to your hard drive with DVD Rip, a freeware Windows application that automates the entire DVD-to-hard-drive backup process. All you need to do is insert your DVD, run DVD Rip, and let it take care of the rest. Why? A while back I explained why I'd soured on optical media, the gist of which was the ease with which DVDs are damaged. Sick of scratched, skippy DVDs, I put together a simple AutoHotkey script that automated DVD rips in conjunction with a freeware application called DVD Shrink. I've since gone back and drastically improved the original DVD Rip application, complete with options and improved automation.

DVD Rip Automated DVD Backup
Version: 0.2
Released: February 12, 2008
Creator: Adam Pash

License: DVD Rip is licensed under the GNU Public License. If you'd like to take a look at the source, you can download it here.

What it does: Automates the process of ripping and backing up DVDs to your hard drive in conjunction with the freeware application DVD Shrink. See the video below for a better idea of how it works.

Again, DVD Shrink is the application doing the actual ripping. DVD Rip just makes the process completely painless. I originally created it so that I could insert a DVD in my media center PC, run DVD Rip from the media center, and then let everything rip in the background. DVD Shrink normally compresses the DVD image by about half, so you retain the entire menu structure while taking much less space.

As I said above, the reason I rip every DVD is to avoid dealing with scratches. Normally a rip will smooth over those unreadable sections without any issue, so after DVD Rip and DVD Shrink are finished, you'll have a perfect, playable copy.

Installation

DVD Rip is distributed as a simple executable, which means there's nothing to install. Just drag the DVD Rip.exe file to wherever you want it to live (might I recommend C:\Program Files\DVD Rip\) and double-click the application whenever you want to rip a new DVD.

How It Works

The first time you run DVD Rip, you'll need to configure a few settings—telling it where you want to save DVDs by default, whether you want to use the default DVD title as it's displayed in your DVD drive (like D:\THIS_DVD) or manually set the title (like in the video), and a few more. If this is starting to sound tedious for an application that's supposed to take all of the pain out of DVD backup, don't worry—you only need to go through the settings the first time you run the application. After that they won't show up again (unless you hold the Shift key when you run the application, which you would do if you decided you wanted to adjust a setting).

Roadmap

The initial release of DVD Rip only handles ripping DVDs complete with menus to your hard drive. That may be enough for some people (it is for me), but others would like to go the next step and copy the DVD to a blank DVD. I'm looking into adding automation for backing up the DVD to a blank DVD with the open source application ImgBurn.

Also, you can play your ripped DVDs (menus and all) with VLC, and it works like a charm, but I'm looking into a similar helper application for managing and playing your ripped DVDs.

Lastly, I'm looking into methods for adding DVD Rip to the Windows Autoplay menu. I never watch DVDs on my computer before ripping them, so if I'm putting a DVD movie in the drive, it's to rip it. With a DVD Rip option in the Autoplay menu, you could set your computer to automatically devour any DVD movie you inserted in your computer. This may or may not be your cup of tea, but if you're big on backing up DVDs and—like me—you never put a DVD in your drive with the intention of watching it then and there, it could be useful.

Changelog

• Version 0.1: Released
• Version 0.2: Added GUI options, improved reliability and portability

Bug Reports and Feature Requests

DVD Rip works like a charm on my Vista and XP computers, but that's really the only place it's been tested, so there may very well be a few bugs here and there that need to be worked out. If you've got feature requests or bug reports, I'd love to hear them in the comments. Don't have a commenter account? Here's how to get one.
http://lifehacker.com/355281/dvd-rip...ck-dvd-ripping





Apple Releases Apple TV 'Take 2'
Arnold Kim

Apple has released the Take 2 update for the Apple TV today. The software update is available for download from your Apple TV itself. The free software update allows Apple TV owners to rent movies without the use of a computer in standard and high definition formats. Library titles cost $2.99 and new releases are $3.99. High definition versions add $1 to the price of rental. Movies can be kept for 30 days but expire 24 hours after playback begins.
http://www.macrumors.com/2008/02/12/...take-2-update/





iPhone Usage Shocks Search Giant
Slash Lane

Google on Wednesday said it has seen 50 times more search requests coming from Apple iPhones than any other mobile handset -- a revelation so astonishing that the company originally suspected it had made an error culling its own data.

"We thought it was a mistake and made our engineers check the logs again," Vic Gundotra, head of Google’s mobile operations told the Financial Times during this week's Mobile World Congress in Barcelona.

Should other companies follow in Apple's footsteps by making web access commonplace on their mobile handsets, Gundotra believes the number of mobile searches could outpace fixed internet search "within the next several years."

That of course means big increases in incremental advertising revenues for the Mountain View, Calif.-based search giant. Though Google's primary revenue driver remains online advertising, the company has never separated out its mobile revenues from those of traditional computer-based browsers.

Gundotra, however, told the Times that the mobile segment was growing “above expectations”, both in terms of usage and revenues.

"The world is changing. Users want an internet without fences. They know how to type in Google.com if they want to get to it," he said. "Two years ago the operators were still playing the role of gatekeepers but that is no longer the role for them."

The mobile boss also reiterated a long-running company position on the mobile handset market, which is that Google is unlikely to build its own mobile hardware despite widespread speculation to the contrary.

"We want every phone to be a Google phone," he said. "We are ultimately talking about thousands of devices. The best way to do this would be to get Google’s mobile operating system, Android, deployed on as many types of handsets as possible."

Google has said the first Android-based mobile handsets from third-party manufacturers will begin shipping during the second half of 2008.
http://www.appleinsider.com/articles...rch_giant.html





Cell Phone Use Linked To Increased Cancer Risk

A recent study says frequent cell phone users face a 50% greater risk of developing tumors of the parotid gland than those who don't use cell phones.
Thomas Claburn

Frequent cell phone users face a 50% greater risk of developing tumors of the parotid gland than those who don't use cell phones, according to a recently published study.

The parotid gland is the largest human salivary gland; it's located near the jaw and ear, where cell phones are typically held.

The reported annual incidence of salivary gland tumors is one to three per 100,000 people, according to a 2006 article by Mark Kidd in Ear, Nose and Throat Journal. Based on that data, a 50% increase would raise one's theoretical high-end risk of developing a tumor in the head from 0.003% per year to 0.0045% per year.
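As a quick sanity check on that arithmetic, using the figures quoted above:

```python
# Reproduce the article's arithmetic: a high-end baseline incidence of
# 3 per 100,000 people per year, raised by a 50% relative increase.
baseline = 3 / 100_000        # 0.0030% per year
increased = baseline * 1.5    # 0.0045% per year after a 50% increase
print(f"{baseline:.4%} -> {increased:.4%}")  # 0.0030% -> 0.0045%
```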

To put the possible danger into perspective, consider that the annual incidence of death by car crash in the United States is about 14 per 100,000 people, according to Department of Transportation statistics.

The study, led by Tel Aviv University epidemiologist Dr. Siegal Sadetzki, appeared last December in the American Journal of Epidemiology.

Sadetzki's findings are sure to add to confusion surrounding the already contentious debate about the health effects of cell phone radiation. Many other studies in recent years have found no increased risk of cancer due to mobile phone use, but a few have stopped short of ruling the possibility out and a few have said increased risk of cancer is small but real.

The U.K. Mobile Telecommunications and Health Research Programme last year "found no association between short term mobile phone use and brain cancer," and noted that "the situation for longer term exposure is less clear."

Professor Kjell Mild of Sweden's Orebro University, however, also published a study last year and found that using a cell phone over a period of more than 10 years raises the risk of brain cancer and that children are particularly susceptible to this risk because of their developing skulls.

In 2006, the American Journal of Epidemiology published a Swedish salivary gland study, "Mobile Phone Use and Risk of Parotid Gland Tumor," and the authors found no increased risk of tumors caused by cell phone use.

One area where the two parotid gland studies differ is in the number of participants. The 2006 Swedish study included 172 people with benign and malignant tumors, and 681 healthy control subjects. Sadetzki's study included nearly 500 people with benign or malignant tumors and about 1,300 healthy control subjects.

Sadetzki says that the Israelis were early cell phone adopters and heavy users of the technology, a tendency that suggests higher radio frequency exposure than other populations. Her study found an increased risk of cancer for frequent cell phone users in rural areas, which may be attributable to the increased radiation output required when phones try to communicate in areas with fewer antennas. She believes that frequent mobile phone users and children face the largest increased risk of health effects.

"While I think this technology is here to stay, I believe precautions should be taken in order to diminish the exposure and lower the risk for health hazards," Sadetzki said in a statement. She recommends the use of hands-free devices at all times, holding the phone away from one's body, and making shorter, less frequent calls. She also advises that parents limit the amount of time children can talk on mobile phones.

And if you really want to protect your health, buckle up and drive with care.
http://www.informationweek.com/news/...leID=206504325





Gates Foundation’s Influence Criticized
Donald G. McNeil Jr.

The chief of malaria for the World Health Organization has complained that the growing dominance of malaria research by the Bill and Melinda Gates Foundation risks stifling a diversity of views among scientists and wiping out the world health agency’s policy-making function.

In a memorandum, the malaria chief, Dr. Arata Kochi, complained to his boss, Dr. Margaret Chan, the director general of the W.H.O., that the foundation’s money, while crucial, could have “far-reaching, largely unintended consequences.”

Many of the world’s leading malaria scientists are now “locked up in a ‘cartel’ with their own research funding being linked to those of others within the group,” Dr. Kochi wrote. Because “each has a vested interest to safeguard the work of the others,” he wrote, getting independent reviews of research proposals “is becoming increasingly difficult.”

Also, he argued, the foundation’s determination to have its favored research used to guide the health organization’s recommendations “could have implicitly dangerous consequences on the policy-making process in world health.”

Dr. Tadataka Yamada, executive director of global health at the Gates Foundation, disagreed with Dr. Kochi’s conclusions, saying the foundation did not second-guess or “hold captive” scientists or research partnerships that it backed. “We encourage a lot of external review,” he said.

The memo, which was obtained by The New York Times, was written late last year but circulated this week to the heads of several health agency departments, with a note asking whether they were having similar struggles with the Gates Foundation.

A spokeswoman for the director general said Dr. Chan saw the memo last year but did not respond to it. It is “the view of one department, not the W.H.O.’s view,” said the spokeswoman, Christine McNab. The agency has cordial relations with the foundation, and the agency’s policies are set by committees, which include others besides Gates-financed scientists, she said.

The Gates Foundation has poured about $1.2 billion into malaria research since 2000. In the late 1990s, as little as $84 million a year was spent — largely by the United States military and health institutes, along with European governments and foundations. Drug makers had largely abandoned the field. (China was developing a drug, artemisinin, that is now the cornerstone of treatment.)

The World Health Organization is a United Nations agency with a $4 billion budget. It gives advice on policies, evaluates treatments — especially for poor countries — maintains a network of laboratories and sends teams to fight outbreaks of diseases, like avian flu or Ebola. It finances little research; for diseases of the poor, the Gates Foundation is the world’s biggest donor.

Dr. Kochi, an openly undiplomatic official who won admiration for reorganizing the world fight against tuberculosis but was ousted from that job partly because he offended donors like the Rockefeller Foundation, called the Gates Foundation’s decision-making “a closed internal process, and as far as can be seen, accountable to none other than itself.”

Moreover, he added, the foundation “even takes its vested interest to seeing the data it helped generate taken to policy.”

As an example, he cited an intervention called intermittent preventive treatment for infants, known as IPTi.

Other experts said IPTi involved giving babies doses of an older anti-malaria drug, Fansidar, when they got their shots at 2 months, 3 months and 9 months. In early studies, it was shown to decrease malaria cases about 25 percent. But each dose gave protection for only a month. Since it is not safe or practical to give Fansidar constantly to babies because it is a sulfa drug that can cause rare but deadly reactions and because Fansidar-resistant malaria is growing, World Health Organization scientists had doubts about it.

Nonetheless, Dr. Kochi wrote, although it was “less and less straightforward” that the health agency should recommend it, the agency’s objections were met with “intense and aggressive opposition” from Gates-backed scientists and the foundation. The W.H.O., he wrote, needs to “stand up to such pressures and ensure that the review of evidence is rigorously independent of vested interests.”

Amir Attaran, a health policy expert at the University of Ottawa who has criticized many players in the war on malaria, said he thought Dr. Kochi’s memo was “dead right.” His own experience with Gates-financed policy groups, he said, was that they are cowed into “stomach-churning group think.” But Dr. Attaran said he believed that scientists were not afraid of the foundation, but of its chief of malaria, Dr. Regina Rabinovich, whom he described as “autocratic.”

Dr. Rabinovich, when told of Dr. Attaran’s characterization, said she did not want to respond. Dr. Yamada of the Gates Foundation called it “unfortunate and inaccurate.”

“I’m not a grantee of hers,” he said, “but she’s an extremely knowledgeable leader. And if she has an opinion, she’s entitled to it.” He said he did not know the details of the IPTi issue, but added that researchers often differed about policy implications.

There have been hints in recent months that the World Health Organization feels threatened by the growing power of the Gates Foundation. Some scientists have said privately that it is “creating its own W.H.O.”

One oft-cited example is its $105 million grant to create the Institute for Health Metrics and Evaluation at the University of Washington. Its mission is to judge, for example, which treatments work or to rank countries’ health systems.

These are core W.H.O. tasks, but the institute’s new director, Dr. Christopher J. L. Murray, formerly a health organization official, said a new path was needed because the United Nations agency came under pressure from member countries. He said his institute would be independent of that pressure.
http://www.nytimes.com/2008/02/16/sc...16malaria.html





Yahoo Said to be in Talks with News Corp. to Thwart Microsoft
AP

Yahoo Inc. is discussing a possible partnership with News Corp. in its latest effort to repel Microsoft Corp. or prod its unsolicited suitor into raising its current takeover bid, according to a person familiar with the talks.

The specifics of the joint venture still hadn't been worked out, said the person, who didn't want to be identified because of the sensitivity of the matter.

Both The Wall Street Journal and a prominent blog, TechCrunch, reported that News Corp. is interested in folding its popular online social network, MySpace.com, and other Internet assets into Yahoo - an idea that first came up last year. News Corp. owns The Wall Street Journal.

News Corp. and a private equity firm also would buy significant stakes in Yahoo in a complex deal designed to boost the Sunnyvale-based company's market value above Microsoft's initial bid of $44.6 billion, or $31 per share.

A Yahoo spokesman said the company continues to "carefully and thoroughly" evaluate alternatives that will enrich its long-term shareholders.

News Corp. spokeswoman Teri Everett declined to comment.

Although News Corp. Chairman Rupert Murdoch made it clear in a conference call last week that his New York-based company had no interest in an outright acquisition of Yahoo, he didn't rule out the possibility of a deal involving MySpace.

Yahoo rejected Microsoft's offer Monday, insisting that its Internet franchise is worth more money. Microsoft has held firm on its bid so far, dubbing it "full and fair" while threatening to launch a hostile takeover attempt.

Besides talking with News Corp., Yahoo has explored an advertising partnership with its biggest rival, Internet search leader Google Inc.

Although Google probably could help elevate Yahoo's recently drooping profits, the alliance would likely face antitrust hurdles because the two companies operate the Web's two biggest ad networks and eliminating one would reduce competition.

If Yahoo is able to work out a deal with News Corp., analysts believe Microsoft will simply raise its offer because it needs the acquisition to counteract Google's dominance of the online ad market - a battleground that is rapidly reshaping the technology and media industries.

"Buying Yahoo makes tremendous sense for Microsoft, more sense than any other company in the world," said Ken Marlin, a New York investment banker specializing in media and technology deals.
http://www.siliconvalley.com/news/ci_8251519





Four US Newspaper Companies Form Online Ad Partnership
Jeremy Kirk

Four major U.S. newspaper chains launched an online advertising network on Friday that will let advertisers book national campaigns through a single point of contact, reaching 50 million people a month across the U.S.

Investors include the Tribune, Gannett, Hearst and New York Times companies, which publish flagship newspapers such as the Chicago Tribune, USA Today, the San Francisco Chronicle and the New York Times, respectively.

The network, QuadrantOne, will let an advertiser place ads on hundreds of Web sites focused on 27 major markets, targeting users by what they are viewing, their online behavior and demographic information.

QuadrantOne is most notable for the online players that aren't participants, such as Google, Yahoo or Microsoft. This latest move by the newspaper companies may be designed to assert greater control over their print and Web properties.

Yahoo reached a landmark revenue-sharing deal in November 2006 with seven U.S. publishers. Yahoo provides search services, places job ads on its own HotJobs site and sells Web advertising. The deal was expanded in April 2007, with some 264 newspapers distributing their content on Yahoo's portal.

One of QuadrantOne's investors, Hearst, is also participating in the Yahoo deal.

Google's PrintAds program lets customers who are already buying contextual Web-based ads also place ads in 600 daily and weekly U.S. newspapers. Google also offers easy tools for customers to design their own ads and upload them to a particular newspaper.

The Chicago Tribune, the New York Times and the San Francisco Chronicle participate in PrintAds.

If newspapers develop better ways to sell their own online ads, they may not have to share revenue with their Web counterparts such as Yahoo and Google.

Online advertising accounted for about US$16.9 billion in revenue in 2006 and is expected to rise to $50.3 billion by 2011, according to a December 2007 report from the Yankee Group.

The U.S. newspaper industry is in dire straits, in part because of a bumpy U.S. economy, declining print readership and falling print advertising revenue.

Critics argue that newspaper companies waited far too long to revamp their businesses with the surge in online publishing and advertising over the last 15 years.

Companies such as Google, which has made a fortune in Web-based advertising, have reaped some gains at the expense of newspapers, as advertisers look for cheaper and more targeted ways to reach buyers.

Classified advertising, once a bread-and-butter source of revenue for newspapers, has also declined over the years because of online listing sites such as Craigslist.
http://www.thestandard.com/news/2008...ad-partnership





New York Times Plans to Cut 100 Newsroom Jobs
Richard Pérez-Peña

After years of resisting the newsroom cuts that have hit most of the industry, The New York Times will bow to growing financial strain and eliminate about 100 newsroom jobs this year, the executive editor said Thursday.

The cuts will be achieved "by not filling jobs that go vacant, by offering buyouts, and if necessary by layoffs," said the executive editor, Bill Keller. The more people who accept buyouts, he said, "the smaller the prospect of layoffs, but we should brace ourselves for the likelihood that there will be some layoffs."

The Times has 1,332 newsroom employees, the largest number in its history; no other American newspaper has more than about 900. There were scattered buyouts and job eliminations in The Times’ newsroom in recent years, but the overall number continued to rise, largely because of the growth of its Internet operations.

Shares in The New York Times Company rose almost 5 percent Thursday after the newsroom staff reductions were reported, closing at $18.84, up 86 cents.

The Times Company has made significant cuts in the newsrooms of some of its other properties, including The Boston Globe, as well as in non-news operations. Company executives say the overall head count is 3.8 percent lower than it was a year ago.

But with the industry’s economic picture worsening, the company is under increased pressure from shareholders — notably two hedge funds that recently bought almost 10 percent of the common stock — to do something dramatic to improve its bottom line.

For 2007, it recently reported earnings of $209 million on revenue of $3.2 billion.

Newspaper industry ad revenue fell about 7 percent last year, and 4.7 percent at The Times Company, and executives around the industry have projected that 2008 will be equally bad.

Other large newspapers have made much bigger cuts, proportionally, than those The Times is planning; some newsrooms are more than 20 percent smaller than they were early in this decade.

Even so, eliminating jobs has grown harder “because the low-hanging fruit is gone, and so is some of the higher-hanging fruit,” Mr. Keller said. And he suggested that the cuts could not help but affect the newspaper’s journalism.

“To meet our budget goals, we will have to do a little less, and every time we do less, we cede a bit of advantage,” he said. “Our challenge will be to set our priorities in such a way that we do less in the areas that damage our competitiveness least.”

The Times has a newsroom budget of more than $200 million. It is one of a very few news organizations that have not reduced their coverage of Iraq, which costs about $3 million a year, and expenses have also been increased by an unusually long and competitive presidential campaign.

The Times also faces increased competition from The Wall Street Journal, which was acquired in December by the News Corporation. With Rupert Murdoch, the News Corporation’s chairman, calling for The Journal to become an alternative to The Times, The Journal is stepping up its coverage of politics and government.

The Journal has about 750 newsroom personnel, a figure that does not include some of the support staff that most newspapers include in the tally. That is the largest that number has ever been, and News Corporation executives have said they expect it to grow.

The Los Angeles Times has fewer than 900 newsroom employees, down from about 1,200 early in this decade. The Washington Post has about 800, down from a peak of about 900.
http://www.nytimes.com/2008/02/14/bu...cnd-times.html





BBC Warns Staff Over Internet Pictures
Tara Conlan and Jemima Kiss

BBC editorial staff have been told to be cautious about using photos from social networking websites, in guidance warning that the practice raises a number of legal and ethical issues.

The BBC does not yet have a fixed policy on content from social networking sites, but an update for editorial staff and producers, sent on Friday and seen by MediaGuardian.co.uk, warned that the easy availability of pictures does not remove the "responsibility to assess the sensitivities in using it".

"Simply because material may have been put into the public domain may not always give the media the right to exploit its existence. The use of a picture by the BBC brings material to a much wider public than a personal website that would only be found with very specific search criteria," the email said.

"Consideration should be given to the context in which it was originally published including the intended audience."

Editorial staff were told that they need to consider the original context of photos and how their use might impact grieving or distressed friends and relatives. Photos also need to be verified before use.

There are further concerns around copyright of photographs copied and pasted from the web, which may belong to either the host site or one of its users.

Issues were raised over the use of photos from personal profile pages on sites such as MySpace after the shootings at Virginia Tech, and following the recent spate of suicides in Bridgend.

Writing about the issue on the BBC's news blog recently, the news website editor, Steve Hermann, said it is reasonable to assume that photos on a social networking site user's personal profile would be seen only by that person's family and friends.

"The boundary between what's public and what's private isn't always easy to define online, and I think it's also true to say it's not something people always give a huge amount of thought to when posting," Hermann added.

"For most people, most of the time, the media and wider public won't be focusing on them. That gives them a certain anonymity."
http://www.guardian.co.uk/media/2008...1/bbc.medialaw





At Harvard, a Proposal to Publish Free on Web
Patricia Cohen

Publish or perish has long been the burden of every aspiring university professor. But the question the Harvard faculty will decide on Tuesday is whether to publish — on the Web, at least — free.

Faculty members are scheduled to vote on a measure that would permit Harvard to distribute their scholarship online, instead of signing exclusive agreements with scholarly journals that often have tiny readerships and high subscription costs.

Although the outcome of Tuesday’s vote would apply only to Harvard’s arts and sciences faculty, the impact, given the university’s prestige, could be significant for the open-access movement, which seeks to make scientific and scholarly research available to as many people as possible at no cost.

“In place of a closed, privileged and costly system, it will help open up the world of learning to everyone who wants to learn,” said Robert Darnton, director of the university library. “It will be a first step toward freeing scholarship from the stranglehold of commercial publishers by making it freely available on our own university repository.”

Under the proposal Harvard would deposit finished papers in an open-access repository run by the library that would instantly make them available on the Internet. Authors would still retain their copyright and could publish anywhere they pleased — including at a high-priced journal, if the journal would have them.

What distinguishes this plan from current practice, said Stuart Shieber, a professor of computer science who is sponsoring the faculty motion, is that it would create an “opt-out” system: an article would be included unless the author specifically requested it not be. Mr. Shieber was the chairman of a committee set up by Harvard’s provost to investigate scholarly publishing; this proposal grew out of one of the recommendations, he said.

The publishing industry, as well as some scholarly groups, has opposed some forms of open access, contending that free distribution of scholarly articles would ultimately eat away at journals’ value and wreck the existing business model. Such a development would in turn damage the quality of research, they argue, by allowing articles that have not gone through a rigorous process of peer review to be broadcast on the Internet as easily as a video clip of Britney Spears’s latest hairdo. It would also cut into subsidies that some journals provide for educational training and professional meetings, they say.

J. Lorand Matory, a professor of anthropology and African and African American studies at Harvard, said he sympathized with the goal of bringing down the sometimes exorbitant price of scientific periodicals, but worried that a result would be to eliminate a whole range of less popular journals that are subsidized by more profitable ones.

Art history periodicals, for example, are extremely expensive to publish because of the reproduction costs, and subscriptions pay for those as well as some of the discipline’s annual gatherings.

Professor Matory also pointed out that “any professor who wants to put his or her article up online can.”

Asked about the Harvard proposal, Allan Adler, vice president for legal and governmental affairs at the Association of American Publishers, said that mandates are what publishers object to, as when Congress required that any work financed by the National Institutes of Health be funneled through PubMed Central, an open-access repository maintained by the National Library of Medicine.

“As long as they leave the element of choice for authors and publishers,” he said, “there isn’t a problem.”

Supporters of open access say that the current system creates a different set of problems for academics. Expensive journals cut into a library’s budget for scholarly books and monographs, which hurts academic publishers, which hurts the coming generation of scholars who must publish to gain tenure.

Professor Shieber also doubts that free distribution would undermine the journal industry. “We don’t know if that would happen,” he said. “There is little evidence to support that it would.” Nearly all scholarly articles on physics have been freely available on the Internet for more than a decade, he added, and physics journals continue to thrive.

As for the vote, Professor Shieber said: “As far as I know, everyone I’ve ever talked to is supportive of the underlying principle. Still, there is a difference between an underlying principle and a specific proposal.”
http://www.nytimes.com/2008/02/12/books/12publ.html





Machines 'to Match Man by 2029'
Helen Briggs

Machines will achieve human-level artificial intelligence by 2029, a leading US inventor has predicted.

Humanity is on the brink of advances that will see tiny robots implanted in people's brains to make them more intelligent, said engineer Ray Kurzweil.

He said machines and humans would eventually merge through devices implanted in the body to boost intelligence and health.

"It's really part of our civilisation," Mr Kurzweil said.

"But that's not going to be an alien invasion of intelligent machines to displace us."

Machines were already doing hundreds of things humans used to do, at human levels of intelligence or better, in many different areas, he said.

Man versus machine

"I've made the case that we will have both the hardware and the software to achieve human level artificial intelligence with the broad suppleness of human intelligence including our emotional intelligence by 2029," he said.

"We're already a human machine civilisation, we use our technology to expand our physical and mental horizons and this will be a further extension of that."

Humans and machines would eventually merge, by means of devices embedded in people's bodies to keep them healthy and improve their intelligence, predicted Mr Kurzweil.

"We'll have intelligent nanobots go into our brains through the capillaries and interact directly with our biological neurons," he told BBC News.


CHALLENGES FACING HUMANITY
Make solar energy affordable
Provide energy from fusion
Develop carbon sequestration
Manage the nitrogen cycle
Provide access to clean water
Reverse engineer the brain
Prevent nuclear terror
Secure cyberspace
Enhance virtual reality
Improve urban infrastructure
Advance health informatics
Engineer better medicines
Advance personalised learning
Explore natural frontiers


The nanobots, he said, would "make us smarter, remember things better and automatically go into full emergent virtual reality environments through the nervous system".

Mr Kurzweil is one of 18 influential thinkers chosen by the US National Academy of Engineering to identify the great technological challenges facing humanity in the 21st century.

The experts include Google founder Larry Page and genome pioneer Dr Craig Venter.

The 14 challenges were announced at the annual meeting of the American Association for the Advancement of Science in Boston, which concludes on Monday.
http://news.bbc.co.uk/go/pr/fr/-/1/h...as/7248875.stm





Silicon Valley Starts to Turn Its Face to the Sun
G. Pascal Zachary

CAN Silicon Valley become a world leader in cheap and ubiquitous solar panels for the masses?

Given the valley’s tremendous success in recent years with such down-to-earth products as search engines and music players, tackling solar power might seem improbable. Yet some of the valley’s best brains are captivated by the challenge, and they hope to put the development of solar technologies onto a faster track.

There is, after all, a precedent for how the valley tries to approach such tasks, and it’s embodied in Moore’s Law, the maxim made famous by the Intel co-founder Gordon Moore. Moore’s Law refers to rapid improvements in computer chips — which would be accompanied by declining prices.

A link between Moore’s Law and solar technology reflects the engineering reality that computer chips and solar cells have a lot in common.

“A solar cell is just a big specialized chip, so everything we’ve learned about making chips applies,” says Paul Saffo, an associate engineering professor at Stanford and a longtime observer of Silicon Valley.
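The hope the article describes is that solar follows the same curve as chips. As a hedged illustration (not a calculation from the article; the doubling period and starting figures are assumptions chosen only to make the arithmetic concrete), Moore's Law can be treated as a simple doubling rule, with cost per unit of capability halving on the same schedule:

```python
# Toy model of a Moore's Law-style curve: capability doubles, and cost per
# unit of capability halves, once every `period` years. All inputs below are
# illustrative assumptions, not data from the article.

def doublings(years: float, period: float = 2.0) -> float:
    """Number of doublings that fit into `years`."""
    return years / period

def projected_capability(start: float, years: float, period: float = 2.0) -> float:
    """Capability after `years`, doubling every `period` years."""
    return start * 2 ** doublings(years, period)

def projected_cost(start_cost: float, years: float, period: float = 2.0) -> float:
    """Cost per unit of capability, halving every `period` years."""
    return start_cost / 2 ** doublings(years, period)

# Over a decade at a 2-year doubling period: 5 doublings, so 32x the
# capability and 1/32 of the per-unit cost.
print(projected_capability(1.0, 10))   # 32.0
print(projected_cost(100.0, 10))       # 3.125
```

The point of the sketch is only the shape of the curve: sustained doubling is what turned chips into cheap commodities, and it is the trajectory solar entrepreneurs in the article are betting on.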

Financial opportunity also drives innovators to exploit the solar field. “This is the biggest market Silicon Valley has ever looked at,” says T. J. Rodgers, the chief executive of Cypress Semiconductor, which is part-owner of the SunPower Corporation, a maker of solar cells in San Jose, Calif.

Mr. Rodgers, who is also chairman of SunPower, says the global market for new energy sources will ultimately be larger than the computer chip market.

“For entrepreneurs, energy is going to be cool for the next 30 years,” he says.

Optimism about creating a “Solar Valley” in the geographic shadow of computing all-stars like Intel, Apple and Google is widespread among some solar evangelists.

“The solar industry today is like the late 1970s when mainframe computers dominated, and then Steve Jobs and I.B.M. came out with personal computers,” says R. Martin Roscheisen, the chief executive of Nanosolar, a solar company in San Jose, Calif.

Nanosolar shipped its first “thin film” solar panels in December, and the company says it ultimately wants to produce panels that are both more efficient in converting sunlight into electricity and less expensive than today’s versions. Dramatic improvements in computer chips over many years turned the PC and the cellphone into powerful, inexpensive appliances — and the foundation of giant industries. Solar enterprises are hoping for the same outcome.

To be sure, Silicon Valley’s love affair with solar could be short-lived.

“We’ve seen a lot of pipe dreams in the industry over the years, a lot of wild claims never came through,” says Lisa Frantzis, a specialist in renewable energy at Navigant Consulting in Burlington, Mass.

Another brake on the pace of solar innovation might be consumer behavior. It often can be hard to get consumers to change their habits, and homeowners may be slow to swap out expensive water heaters for newfangled solar solutions. Reliability is also an issue: while current solar technologies have proved relatively durable, it’s unknown how resilient the next generation of solar will be.

“We need technologies that can survive on a rooftop for 20 years,” says Barry Cinnamon, chief executive of Akeena Solar Inc. of Los Gatos, Calif., a designer and installer of solar systems.

Affordable solar development is also still dependent on government subsidies.

“Mass adoption requires technological innovations that dramatically lower costs,” says Peter Rive, the chief operations officer of SolarCity in Foster City, Calif., a system designer and installer.

So what does the valley bring to the mix? Expertise in miniaturization and a passion for novelty among its entrepreneurs.

“There are suddenly a lot of new ideas coming into this field,” says Paul Alivisatos, a professor of chemistry at the University of California, Berkeley, who also has his own solar start-up.

One novel approach is called “solar thermal,” which uses large mirrors to generate steam to run conventional turbines that generate electricity.

In 2006, Vinod Khosla, a veteran venture capitalist best known as a co-founder of Sun Microsystems, discovered an obscure Australian company, Ausra, pursuing solar thermal. He persuaded the management of Ausra to move to Silicon Valley and helped it raise money.

Ausra recently signed a deal with PG&E, the big California utility company, to supply a large solar plant. “The best work in solar is happening in Silicon Valley,” Mr. Khosla says.

Another exciting area is thin-film solar, in which cells are created in roughly the same way that memory is created on dense storage devices like hard-disk drives — allowing the nascent industry to tap into the valley’s expertise.

At Nanosolar, for instance, some of those in top management come directly from Seagate Technology and I.B.M., two traditional titans in computer storage.

The promise of Solar Valley has investors opening their wallets as never before. But some worry that promising technologies of today must be renewed, and quickly, if the logic of Moore’s Law is to define solar.

“There’s a lot of money being thrown at the problem and that’s healthy; it gives it a real chance of succeeding,” Mr. Alivisatos says. “But so much of our effort is going into short-term victories that I worry our pipeline will go dry in 10 years.”

The fear of a solar bubble is legitimate, but after years of stagnation, entrepreneurs say the recent developments in the field are welcome. Long ignored by the most celebrated entrepreneurs in the land and now embraced as one of the next big things, solar energy may gain traction because of a simpler rule than Moore’s Law: where there’s a will, there’s a way.
http://www.nytimes.com/2008/02/17/business/17ping.html





No static at all

FCC Plan to Let AM Stations Use FM at Night Controversial
Matthew Lasar

Almost everyone has signaled thumbs up on a Federal Communications Commission plan that would allow AM radio stations to duplicate their programming on FM translators. The formal comment period for the proceeding has concluded, but filings keep arriving. The latest came in on February 8th from Rep. Wayne Gilchrest (R-MD), endorsing the remarks of his constituent, Richard Gelfman, owner of Chestertown AM oldies station WCTR, "The Town."

"We strongly urged the Federal Communications Commission to adopt new rules which would allow daytime only stations such as ourselves to better serve local communities by granting the right to use FM translators (or any other means) to operate at night," Gelfman wrote. "As a matter of fact, without this ability, I do not believe that daytime AM stations can continue to survive."

Over 200 comments have been filed in this proceeding, most of them in favor of the idea. But several media reform groups have raised questions about the proposal, especially its impact on communities that hope to build Low Power FM (LPFM) stations in their area.

Filling in "coverage gaps?"

Many AM stations have to dramatically reduce their signal power at night because of "skywave propagation"—in the wee hours, AM signals can bounce off the ionosphere and travel hundreds of miles beyond their designated local service area, causing interference. Unless the FCC has designated the station as a "clear channel" licensee, allowed to broadcast long distances at night, or permits it to operate on "flea power"—greatly reducing its signal—the station must shut down.

Since 1970, the Commission has limited the use of FM translators to extending the broadcasts of other FM stations. But in July 2006, the National Association of Broadcasters (NAB) petitioned the FCC to allow AM stations to operate FM translators as well. The NAB argued that such a rule change would allow many of the nation's estimated 4,814 AM stations to fill in "coverage gaps" and provide more local programming.

"For instance," NAB wrote, "daytime stations that currently air tape-delayed coverage of the local high school's sporting events or a local political debate because they must turn off their transmitter at sundown, will now be able to do so live." NAB filings on this issue also point out that women and minorities have a significant presence in AM radio.

And so the FCC's Media Bureau launched a 90-day proceeding on the question in early November of last year. Its proposal would allow FM translators as long as the nighttime service does not extend beyond a 25-mile radius of the AM transmitter site or the daytime coverage area of the station, whichever is smaller.
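The proposal's coverage cap reduces to a simple rule: the translator's nighttime reach may not exceed the smaller of 25 miles and the station's daytime coverage radius. A minimal sketch of that logic, under the simplifying assumption that coverage can be modeled as a circular radius in miles (real FCC contours are irregular, and these function names are hypothetical):

```python
# Hedged sketch of the proposed limit: nighttime translator coverage is
# capped at the smaller of a 25-mile radius and the AM station's daytime
# coverage radius. Coverage is idealized here as a simple circle.

CAP_MILES = 25.0

def max_translator_radius(daytime_radius_miles: float) -> float:
    """Largest permitted nighttime translator radius for a given station."""
    return min(CAP_MILES, daytime_radius_miles)

def complies(translator_radius_miles: float, daytime_radius_miles: float) -> bool:
    """Would a translator of this reach fit within the proposed limit?"""
    return translator_radius_miles <= max_translator_radius(daytime_radius_miles)

# A big daytime signal (40 mi) is still capped at 25 mi at night;
# a small daytimer (18 mi) is capped by its own daytime coverage.
print(max_translator_radius(40.0))  # 25.0
print(max_translator_radius(18.0))  # 18.0
print(complies(30.0, 40.0))         # False
```

The "whichever is smaller" clause is the key design choice: it prevents a translator from ever giving an AM station more nighttime reach than it has by day.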

A hodgepodge of smaller, locally-based AM station owners quickly wrote in to support the proposal, among them Virginia's Christian Broadcasting Service, which operates six AM stations, and a consortium of AM "daytimer station" groups in and around Kentucky led by Big River Radio, Inc. Many complain that they have lost audience to the iPod, and satellite and Internet radio. "All daytimer stations are at a competitive disadvantage, and generally cannot provide good nighttime service to their communities of license," the Big River group wrote to the FCC in early December.

Other station filings, such as Gelfman of WCTR's, extol the unique local coverage that they provide: high school and college sports, local government meetings, and agricultural weather forecasts. "In our town, Chestertown, there is one newspaper—a weekly—and we are the only other source of local news and events," Gelfman wrote. "Our service to our local community should not be so seriously limited by rules that require us, a vital community resource, to go off the air."

Giving to AM; taking from low-power FM

But a sympathetic filing submitted by Clear Channel Communications in January probably did not help the cause of these local broadcasters. Clear Channel, which owns 1,100 radio stations across the US, praised the plan as one that will "enhance competition, foster localism and promote diversity." Shortly after its comment came one of the few statements critical of the proposal, jointly filed by the LPFM advocacy group Prometheus Radio Project and the Media Access Project (MAP).

The Prometheus/Media Access comment reminds the FCC that not all AM stations provide local coverage to their communities; many, in fact, do not. And the translator proposal could take spectrum space from potential LPFM stations, whose applicants have to prove to the Commission that they enjoy an "established community presence" and are committed to "local program origination."

"Allocating FM spectrum to AM stations that may or may not provide local content is not an effective way to increase localism," Prometheus/MAP wrote to the Commission on February 4. "A more effective way to promote localism is to allocate spectrum to LPFM licensees who pledge a commitment to localism."

The FCC has launched another proceeding on operation and licensing rules for these community-based, nonprofit stations, which can broadcast at a maximum of 100 watts of power. And Representative Michael Doyle of Pennsylvania has proposed legislation that would liberalize the rules for LPFM, since an FCC-commissioned study indicated that these licenses do not threaten full-power stations, as the NAB and National Public Radio claimed in the late 1990s. In late November, the Commission recommended to Congress that it remove requirements that LPFM stations protect full-power stations operating on third-adjacent channels.

The Prometheus/MAP filing also notes that while female and minority ownership rates are better for AM than for FM stations, they're still "abysmal," at 6.63 percent for women and 10.65 percent for minorities. "The proposed rule would do nothing to increase the number of minority and female owners," the comment warns, and could make the situation even worse by making AM stations more valuable, "attracting women and minority buy-outs from well funded corporations."

But the statement does suggest conditions for AM to FM translator rules that Prometheus and MAP see as acceptable.

• AM stations may access FM translators only as a "fill-in"—presumably when some important event requires broadcast coverage in the early morning, evening, or night
• Only "standalone" stations may employ FM translators. "Many AM stations are owned by large companies," Prometheus writes, "and the interests of large companies should not be allowed to hide behind the interests of survival of genuine small businesses"
• One FM translator per standalone AM station
• No translator should go to an AM station that also owns an FM outlet in the same market.

It's unclear when the FCC will make a decision on this issue, but given that even the most skeptical commenters make allowance for the proposal, chances are that we'll eventually see some relaxation of the Commission's FM translator rules.
http://arstechnica.com/news.ars/post...roversial.html





A Victory for Jazz, or Just Grammy Being Grammy?
Ben Ratliff

When something newsworthy or popular or positive happens to a jazz musician — a big award, say — many in the jazz world feel astonished for about four seconds, then quickly act very smug. You know: We’ve been sitting here patiently, full of our aesthetic virtue, so used to being ignored, and the world has finally come around to our point of view. Are we happy about it? More to the point, what took you so long?

Are we entitled to feel that way about “River: The Joni Letters,” by Herbie Hancock, being named album of the year at the Grammys on Sunday? We could, but it would be silly. Perhaps the speculation is true that Amy Winehouse and Kanye West split the vote. And yes, it is very unusual to see a record of such modest sales win the big prize. But inasmuch as it is a jazz album, it is precisely the kind of jazz album that would win this award.

First, let’s just get this over with: Where were you in 1965, Recording Academy, when Mr. Hancock made his venerated album “Maiden Voyage”?

But look one year further back to 1964. “Getz/Gilberto” won, and besides “River,” it is the only more or less jazz album in 50 years of the Grammys to have earned this award. There might be instructive logic there to unravel why “River” beat some records that are far more successful and far more emblematic of their time. (“River” has sold around 55,000 copies, whereas Kanye West’s “Graduation” has sold two million.)

“Getz/Gilberto” was a collaboration between the jazz saxophonist Stan Getz and the bossa nova musician João Gilberto, with his wife at the time, Astrud Gilberto, as occasional singer. It shares some important qualities with “River.”

Both are quite beautiful, though practical: experiments with strong ideas made moderate. Both are syncretist collaborations between a flexible jazz musician and a famously uncompromising genius who invented his or her own style — two musicians of putatively different worlds. Both feature light-voiced singing on a little less than half the tracks.

And on both, the drums sound chastened. (When a jazz record with really assertive, swinging rhythm wins album of the year, then jazz enthusiasts can feel smug. “Good taste” — an idea that means quite a lot in this category of the Grammys — can be telegraphed quickly by reducing the role of the drums.)

“River” isn’t just a jazz record. It is a singer-songwriter record. (Norah Jones, Tina Turner, Corinne Bailey Rae, Leonard Cohen and Luciana Souza are on it, all singing Joni Mitchell songs, as is Ms. Mitchell herself.) It is soft-edged and literate and respectable. It seems, at least, intended as an audience bridger. And it is also a very Grammy-ish record, not just because Mr. Hancock, and others on it, have won various Grammys in the past.

Institutions like to congratulate themselves, and giving the prize to “River” can be understood as a celebration of the academy’s more high-minded pop impulses. The best album category, in particular, is often a corrective or an apology for any excesses or shortcomings of the present.

It can amount to a sentimental, history-minded celebration of album culture. At this point it can conjure and lament a lost world of musicians and styles from the 1970s or before, those who actually played instruments, sometimes very well, and trusted their listeners to pay attention to them in 40-minute chunks. From the last 15 years of the award this idea could explain the victories of Ray Charles, Norah Jones, Steely Dan, Santana, Bob Dylan, Tony Bennett and Eric Clapton. (It doesn’t explain Celine Dion, but if we perfectly understood the mind-set of academy members we wouldn’t watch the show.)

Some of what “River” accomplishes as a jazz record is serious indeed. Mr. Hancock’s version of Duke Ellington’s “Solitude” is the modern jazz process itself: a complete reharmonization of a familiar song, with rhythm that keeps vanishing and reappearing. Wayne Shorter’s saxophone playing on many tracks, including his own “Nefertiti” and Ms. Mitchell’s “Edith and the Kingpin,” is casually brilliant — some kind of strange, subconscious vernacular. And though Ms. Mitchell has never called herself a jazz singer, her vocal performance on “The Tea Leaf Prophecy” has a rhythmic assurance that a lot of self-identifying jazz singers could use.

It’s a cool-tempered album, almost drowsy. In so many ways it does seem a strange choice, not just in its modest commercial profile but in that it’s the first album to win this particular award for either Mr. Hancock or Ms. Mitchell. Yet it is also august and exquisitely acceptable: precisely the qualities that this category of the Grammy Awards tends to orient itself around.
http://www.nytimes.com/2008/02/12/ar...c/12gramm.html





Belaboring the obvious

Researchers: Musicians Like to Sing About Drugs and Sex

According to a new study conducted by medical researchers, thirty-three percent of popular songs contain explicit content and forty-two percent of songs hint at substance abuse. Rap was the most frequent offender, with seventy-seven percent of songs making reference to drugs or sex, with country music a surprising silver medalist with a thirty-six percent explicit content rate. The study also proves the old war cry “sex, drugs and rock n’ roll” to be factually incorrect, as only fourteen percent of rock songs contain offending lyrics. So how did the medical researchers come to their conclusion? They analyzed the lyrics of a total of 287 songs from 2005 that encompassed all musical genres. This reminds us of that Russian study that proved heavy metal’s subject matter is heavy. Further cementing how useless this new study actually is, the researchers failed to draw any conclusions on how hearing all these drug references affects young listeners.
http://www.rollingstone.com/rockdail...drugs-and-sex/





Internet "Creates Pedophiles" According to "Expert"
Brian Ribbon

In the latest sensationalist article about pedophiles on the internet, the director of a Spanish vigilante organization has claimed that the internet "creates pedophiles". While conflating pedophilia with child sexual abuse, the "expert" quoted in the article incorrectly states that "studies show that some pedophiles feel attracted to children from an early age, but the majority of them develop the tendency later on"; he then claims that "the internet can become a catalyst for people belonging to the latter group."
http://yro.slashdot.org/article.pl?sid=08/02/11/1725238





Mobile Firms to Block Child Porn
BBC

Mobile firms from across the world have launched a new alliance which aims to block paedophiles using phones to send or receive child sexual abuse images.

The GSMA, the global association for mobile firms, has launched the Mobile Alliance, and says it is vital to act as web access via phones improves.

Among planned measures will be a block on mobile phone access to websites which host abusive content.

There will also be hotlines to report services carrying inappropriate images.

Swift removal

The Alliance has been founded by the GSMA, Hutchison 3G Europe, mobilkom austria, Orange FT Group, Telecom Italia, Telefonica/02, Telenor Group, TeliaSonera, T-Mobile Group, Vodafone Group and dotMobi.

It says its primary aim is "to create significant barriers to the misuse of mobile networks and services for hosting, accessing, or profiting from child sexual abuse content".

The Alliance says that, while the vast majority of child sexual abuse content is accessed through conventional internet connections, safeguards need to be in place as broadband networks being rolled out by phone firms could lead to similar misuse by paedophiles.

Among the measures is a commitment by members of the Alliance that they will implement "Notice and Take Down" procedures that will enable the swift removal of any child sexual abuse content which they are notified about on their own services.

"As our industry rolls out mobile broadband networks that provide quick and easy access to multimedia Web sites, we must put safeguards in place to obstruct criminals looking to use mobile services as a means of accessing or hosting pictures and videos of children being sexually-abused," said Craig Ehrlich, GSMA chairman.

"We call on governments across the world to support this initiative by providing the necessary legal clarity to ensure that mobile operators can act effectively against child sexual abuse content and to step up international enforcement against known sources."

The initiative was welcomed by Viviane Reding, European Commissioner for Information Society and Media.

Her role encompasses regulations on e-communication in Europe and she said: "This gives a very clear signal that the mobile industry is committed to making the Mobile Internet a safer place for children."

British success

Arun Sarin, chief executive of Vodafone, said the initiative was one of a number of measures Vodafone was implementing to combat misuse of its network. "Protecting young people wherever they are is of paramount importance to Vodafone," he said.

Joachim Horn of T-Mobile pledged that his company would "continue to be at the industry's forefront to maintain a high level of child safety," while Orange's Olaf Swantee said it was important for all telecoms firms to "share key learnings" to address the issue.

Sarah Robertson, spokeswoman for The Internet Watch Foundation, said the move would back up work already done in Britain, where she said the watchdog "already had an excellent working relationship" with all the major mobile phone providers.

"We're in the business of helping to block access to sites which carry these images. We welcome the support of the GSMA and look forward to establishing any relevant relationships with international providers."

She predicted use of the "Notice And Take Down" procedures would prove vital, saying that it had helped to police online content in the UK, with figures showing that the proportion of the world's child sex sites that are hosted in the UK had been cut from 18% to 1% over a six-year period.
http://news.bbc.co.uk/go/pr/fr/-/2/h...gy/7238739.stm





GOP's McKee Resigns After Home Is Searched

The Hagerstown home of Maryland Delegate Robert A. McKee, R-Washington, was searched by police officers for reasons that haven't been publicly disclosed.

Philip Rucker

Robert A. McKee, a long-serving Republican delegate from Western Maryland, announced his resignation yesterday after authorities, who say they are conducting a child pornography investigation, seized two computers, videotapes and printed materials from his Hagerstown home.

First elected to the House of Delegates in 1994, McKee was chairman of the Western Maryland delegation and sponsored legislation to protect minors from sexual predators. McKee, 58, also resigned yesterday from his post as executive director of Big Brothers Big Sisters of Washington County, a child mentorship program where he has worked for 29 years.

"For me, this is deeply embarrassing," McKee said in a statement. "It reflects poorly on my service to the community."

The FBI's cyber-crimes unit and the county sheriff's office are reviewing the materials seized from McKee's home Jan. 31, federal and local authorities said.

"No charges have been filed as of this time," Sheriff Douglas W. Mullendore said.

Mullendore declined to elaborate on what he called a child pornography investigation and said authorities have not searched the delegate's office in Annapolis.

McKee's resignation will take effect at 8 p.m. Monday. He said in his statement that he will cooperate with authorities.

"In the meantime, I have entered treatment," McKee said, without providing specifics. "My primary focus is to get well and stay well. I know this can only happen with the support and prayers of my family and friends and the help of professionals."

The delegate's resignation came after the Hagerstown Herald-Mail reported yesterday that local authorities had obtained a warrant to search his home. McKee was at the State House on Thursday morning but skipped a committee hearing that afternoon and was absent from a floor session yesterday.

McKee, who is considered a political moderate, has sponsored bills this year dealing with minors, including the Child Protection From Predators Act and a proposal to collect DNA samples from sexual predators. McKee has sponsored several other sexual offender and child abduction bills in previous years.

For decades, McKee has been involved in youth athletics and children's groups, according to his General Assembly biography. He has served in officer positions in two Little League groups and as secretary of a parent and child center advisory committee.

During the 1970s, McKee was a reservist in the U.S. Navy. He is a former chaplain for the Hagerstown Jaycees and is a trustee and community services chairman at First Christian Church.

"In the long run, I hope and pray that my work in the local community for the last three decades will speak louder than the challenges I now face," McKee said.

House Republicans convened an emergency meeting yesterday to discuss the situation. Minority Leader Anthony J. O'Donnell (Calvert) said he had not spoken with McKee since news broke about the investigation and said he would not speculate on the investigation.

"It's troubling," O'Donnell said. "But look, in this country, we afford our citizens the right to hear details."

House GOP leaders issued a statement saying the Washington County Republican Central Committee will meet soon to select a replacement.

McKee served on the Ways and Means Committee and was well respected by his colleagues, said Del. Sheila E. Hixson (D-Montgomery), the committee's chairwoman.

"He's capable, works well, represents his party and constituents well and is very well liked by the committee leaders in both parties," Hixson said.
http://www.washingtonpost.com/wp-dyn...=moreheadlines





Pr0n Baron Challenges Google and Yahoo! to Build Better Child Locks

'This is not about First Amendment rights, it is about protecting children'
Cade Metz

The world's largest porn studio says that Google and Yahoo! should "erect stronger barriers" to keep porn away from the world's children.

Steven Hirsch, the co-chairman and co-founder of Vivid Entertainment, is to deliver this message on Saturday in New Haven, Connecticut as he addresses an army of Yale University MBA candidates.

"Responsible companies in the adult industry such as ours have done a great deal to deter minors from accessing adult material," Hirsch proclaims from inside a Vivid press release. "None of the search engines and portals, but particularly Yahoo and Google, has taken any significant steps in this direction.

"Vivid will work with any company that is ready to make it much more difficult for children to be exposed, even inadvertently, to material intended only for adults. This is not about First Amendment rights, it is about protecting children."

They are endangered, the porn king says, because the likes of Google and Yahoo! do a poor job of promoting their porn filters and age-verification tools.

And while we are on the subject...

Hirsch also says that he does his best to convince his porn stars they shouldn't be porn stars. "I do interview all of the Vivid Girls personally before we sign them to exclusive contracts," he continues. "But, guess what? I spend more time trying to talk a new girl OUT of becoming a porn star as I do discussing the deal points of her contract once she's convinced me that she really does want to go down that path."

And he wants everyone to understand that he's a serious businessman. "In the end, running the world's biggest adult film studio isn't that much different than running any other studio except that our product is pretty much exclusively about sex," he rambles on. "The truth is, Vivid Entertainment is a business like any other. And my job is concerned as much with cost of goods, margins and EBITDA as it is with trying to come up with the idea for the next 'Debbie Does Dallas...Again.'"

Oh, and he knows his tech too. "We always believed it was important to stay on top of all new technologies," he insists, before listing all sorts of things that only occasionally involve new technologies.

"But we were also the first adult studio to sign talent to exclusive contracts; the first to change adult video packaging to make it more appealing to consumers and retailers; the first to really capitalize on the Internet for branding; the first to own a cable TV network; the first to shoot movies in Hi Def and to go after the wireless market in an effective way; the first to diversify with special interest labels such as Vivid-Alt, Vivid-Ed, Vivid-Celeb and even Vivid-Plus for those who like to watch women of notable stature; and the first to license our name with a professionally managed program that has included book publishing, comics, condoms, vodka, shoes, apparel and other merchandise. In publishing, 'How to Have a XXX Sex Life' by the Vivid Girls, published by HarperCollins, became a best seller and recently went into paperback."
http://www.theregister.co.uk/2008/02..._google_yahoo/





Half Of UK Men Would Swap Sex For 50 Inch TV

Nearly half of British men surveyed would give up sex for six months in return for a 50-inch plasma TV, a survey -- perhaps unsurprisingly carried out for a firm selling televisions -- said on Friday.

Electrical retailer Comet surveyed 2,000 Britons, asking them what they would give up for a large television, one of the latest consumer "must-haves."

The firm found 47 percent of men would give up sex for half a year, compared to just over a third of women.

"It seems that size really does matter more for men than women," the firm said.

A quarter of people said they would give up smoking, with roughly the same proportion willing to give up chocolate.

(Reporting by Peter Apps, editing by Paul Casciato)
http://www.reuters.com/article/techn...85284320080208





You Think It’s Easy to Schlep Those Cases in Four-Inch Heels?



Edward Wyatt

Minutes before showtime on the set of “Deal or No Deal,” Wendi San George, the program’s chief stylist, is trying to head off a crisis.

Jenelle Moreno, the bearer of suitcase No. 17, is just back from a beach vacation, and her tan is gleaming under the lights. “Can we just put some powder on her chest?” Ms. San George whispers frantically into her walkie-talkie as 2 of the show’s 14 makeup artists scurry across the stage.

“Lisa has a black dot on her left knee,” Ms. San George reports, referring to Lisa Gleave, the carrier of suitcase No. 3. “Then come over to No. 5,” also known as Ursula Mayes. “Her part is really funky to me. Put it in the middle with no bangs like it usually is. And where is Anya?”

The reply crackles back over the radio: “Wardrobe malfunction.” Soon enough Anya Monzikova, holding suitcase No. 10, rushes onstage pulling at the upper portion of her red sequined halter dress, trying to keep what little fabric there is covering at least a portion of her left breast.

For the 26 women who take the stage each week on the NBC hit game show, life is not all glamour and sequins and witty repartee with the host, Howie Mandel. At this taping in mid-January, for instance, there was the 14-hour workday, 8 ½ hours of which involved some or all of the models standing on an Arctic-like soundstage in short, short sleeveless dresses and four-inch heels.

The models are a popular part of this game show that “has no trivia, no stunts, no skill,” as Mr. Mandel put it. “The first time I heard about it, I thought, there’s no game.” One television critic, Phil Rosenthal of The Chicago Tribune, put it less delicately, calling the program an elaborate version of “How many fingers do I have behind my back?”

The game is played like this: A contestant chooses one of 26 suitcases, each worth an amount of money, from 1 cent to $1 million. Leaving that chosen case unopened, the contestant opens the remaining cases, a few at a time, and the amounts assigned to them are wiped off the board. Periodically a caller known as the banker and working for the producers offers the contestant an amount of money to stop playing and give up the sum in the originally selected case. The trick is, that amount might be $1 million — or 1 cent.
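The rules above amount to a simple procedure, and they can be sketched as a short simulation. Note the assumptions: the board amounts below only loosely approximate the show's actual values, and the banker's offer formula is not public, so the 80-percent-of-expected-value heuristic here is purely invented for illustration.

```python
import random

# Hypothetical prize board: 26 amounts from 1 cent to $1 million
# (assumption: real board values differ slightly).
AMOUNTS = [0.01, 1, 5, 10, 25, 50, 75, 100, 200, 300, 400, 500, 750,
           1000, 5000, 10000, 25000, 50000, 75000, 100000, 200000,
           300000, 400000, 500000, 750000, 1000000]

def play_round(rng=None):
    """Simulate one game, printing a banker offer after each batch
    of opened cases. Returns the amount in the contestant's case."""
    rng = rng or random.Random()
    cases = AMOUNTS[:]
    rng.shuffle(cases)
    chosen = cases.pop()          # the contestant's sealed case
    while cases:
        # Open a few cases; their amounts are wiped off the board.
        for _ in range(min(3, len(cases))):
            cases.pop()
        in_play = cases + [chosen]
        # Invented banker heuristic: a fraction of the expected
        # value of the amounts still in play.
        offer = 0.8 * sum(in_play) / len(in_play)
        print(f"{len(in_play)} cases left, banker offers ${offer:,.2f}")
    return chosen
```

A contestant who always refuses the offer walks away with whatever `play_round` returns, which illustrates the show's hook: the sealed case might hold $1 million or 1 cent.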

Though no contestant has won the $1 million grand prize in two and a half years of production, “Deal or No Deal” is an unqualified success. (In a continuing effort to give away the $1 million top prize, the producers have periodically increased the number of $1 million cases, to 12 of the 26 in Wednesday’s episode, scheduled for 8 p.m.)

The program’s average ratings have fallen only slightly from its first season to its current, the third, when it has regularly drawn 15 million viewers per episode. One measure of the show’s success is a soon-to-come spinoff, a half-hour syndicated version that will air five days a week. Mr. Mandel now calls the show “the most exciting thing I’ve ever done.”

The models are an important part of the prime-time success, said Scott St. John, an executive producer. “There is, of course, the visual appeal they have, but it goes beyond that,” he said. “They have wit and charm, and we let that come through on the show.”

They are a diverse group. Stacey Gardner, the usual holder of suitcase No. 2, graduated from law school and says she passed the California bar exam in 2005. Pilar Lastra, No. 14, was Playboy’s Miss August 2004. Aliké Boggan, No. 20, interprets services for the hearing-impaired at her church. Aubrie Lemon, who usually carries No. 23 but who was No. 6 at a recent taping, plays the harp and says she passed the qualifying exam for Mensa.

“It’s nice to kind of exercise my brain a little bit and show I still have it up there,” Ms. Lemon said. “It can just go numb if you sit here for 10 or 12 hours. But a lot of us here are very smart. You would be surprised because we all have this Barbie-doll facade.”

Models have a long history with game shows. “The Price Is Right” had Barker’s Beauties; “Let’s Make a Deal” had Carol Merrill. And, of course, “Wheel of Fortune” made Vanna White a household name.

But no game show has a veritable army of women that is as much a part of the show as the host and the contestants. And in dramatic terms the producers use them to spectacular effect, with an over-the-horizon march at the top of each episode that is one of the most visually compelling entrances on television.

In a lot of modeling ensembles, particularly those that have the goal of making the women uniform, all the models would have to be a similar height. But here they range from 5 foot 4 to over 6 feet. Their dresses, however, are designed so that their hems form a continuous line across a row of four to six women.

Chosen by Dina Cerchione, the costume designer, the dresses are discovered on the rack at a store and then ordered from the manufacturer in lots of 33. Often they are evening gowns that, over three fittings with each model, are cut to miniskirt length. Shoes too tend to be off the rack, from brands like Alfani, Nina and Aldo.

“I think we want to be sexy, but we are a family show,” Ms. Cerchione said. Sexy often wins out, however. Dresses emphasize certain parts of the models’ anatomy — not their clavicles — and it is not much of a secret that some of the women have been surgically enhanced in those areas. (Others simply receive extra padding.)

But the rest of their features — ethnicity, hair color — make the models individually distinct.

“With 26 girls you can’t have all blondes or all brunettes,” said J. C. Carollo, the model casting director. “You have to spice things up.” More than half of the models have been with the show since its beginning, but there is turnover and demand for new talent. While several of the models said they could live just on what they earn from the show, it shoots only two or three days every three weeks. That leaves plenty of time for them to pursue other modeling jobs — a key practice in a profession where the span of career is often only a few years.

The models themselves dismiss the notion that they are little more than eye candy.

“I would be very upset if someone said that to me,” said Lindsay Clubine, bearer of suitcase No. 26. “The girls here are involved in a lot of different charities.” Like Marisa Petroro, No. 18, for example. She had a tumor removed from her arm on her 19th birthday and underwent a year of chemotherapy and radiation; she now is a national spokeswoman for the Sarcoma Foundation of America.

“The girls here are not like Lindsay Lohan or Paris Hilton,” Ms. Clubine said. “They are pretty, but they have good heads on their shoulders.”
http://www.nytimes.com/2008/02/13/ar...deal.html?8dpc

















Until next week,

- js.



















Current Week In Review





Recent WiRs -

February 9th, February 2nd, January 26th, January 19th

Jack Spratts' Week In Review is published every Friday. Submit letters, articles, press releases, comments or questions in plain text English to jackspratts (at) lycos (dot) com. Submission deadlines are Thursdays @ 1400 UTC. Please include contact info. The right to publish all remarks is reserved.


"The First Amendment rests on the assumption that the widest possible dissemination of information from diverse and antagonistic sources is essential to the welfare of the public."
- Hugo Black