Old 17-11-05, 05:28 PM   #2
JackSpratts
 
 
Join Date: May 2001
Location: New England
Posts: 10,018

Researchers Unveil $100 Laptop For Schoolkids
Andy Sullivan



Researchers unveiled a $100, hand-cranked laptop computer on Wednesday and said they hoped to place it in the hands of millions of schoolchildren around the globe.

About the size of a textbook, the lime-green machines can set up their own wireless networks and operate in areas without a reliable electricity supply, Massachusetts Institute of Technology (MIT) researchers said at a United Nations technology summit.

"These robust, versatile machines will enable children to become more active in their own learning," U.N. Secretary-General Kofi Annan said at a press conference where the machine was unveiled.

The goal is to provide the machines free of charge to children in poor countries who cannot afford computers of their own, said MIT Media Lab chairman Nicholas Negroponte.

Governments or charitable donors will pay for the machines but children will own them, he said.

"Ownership of the laptops is absolutely critical," he said. "Have you ever washed a rented car?"

Brazil, Thailand, Egypt and Nigeria are candidates to receive the first wave of laptops starting in February or March, and each will buy at least 1 million units, he said.

The laptop is not yet in production but one company has offered to build it for $110 and four others are considering bids as well, he said.

The computers operate at 500 MHz, about half the processor speed of commercial laptops, and will run on Linux or some other open-source operating system, he said.

They can be folded in different ways to serve as an electronic book, a television or a computer. A bright yellow hand crank that sticks out prominently from the side serves as an alternate power source when batteries or an electric outlet are not available.

The computer uses a screen from a portable DVD player, which can be switched from color to black and white to make it easily viewable in bright sunlight, said Mary Lou Jepsen, the project's chief technical officer.

A free laptop program in the state of Maine has increased school attendance and boosted participation, Negroponte said.

"If you get those kinds of results, I'm going to build the machines," he said. "There's enough passion and enough kids that are able to do things they were not able to do before that justifies it."

Negroponte said the machines might be commercially available to the general public at a higher price -- perhaps $200 or so. But their bright color and distinctive appearance should discourage anybody from stealing or buying one from a student, he said.
http://today.reuters.com/news/newsAr...archived=False





Opening The Door On A CD-Less Music Label
John Borland

Forty years ago Jac Holzman left a deep mark on popular music with the release of The Doors' first album on his independent Elektra music label. Today he wants to do the same with Cordless Recordings.

Holzman's Cordless label is the first all-digital music label operated by a major record company, the Warner Music Group, launching Thursday on the Web and on digital music services such as iTunes and RealNetworks' Rhapsody.

Music from the label's first six bands is being sold only online for now, in three-song "clusters" instead of full albums. Instead of big tours, the bands will be promoted on blogs and sites like MySpace.

Even more eyebrow-raising from the traditional big labels' perspective: artists get to keep ownership of the master recordings they release under Cordless. If they want to release their music elsewhere after a short contract is up, more power to them.

If that sounds a little like an indie music label, it's not an accident. The 73-year-old Holzman says he's trying to infuse the new venture with the spirit of the independent labels he created and managed for 20 years, even if it exists in the arms of a major corporation.

"Independent record making is a process and a point of view and a flexibility," Holzman said, noting that he and his partners have already agreed to sign bands just hours after hearing them. "There's a nimbleness that larger companies, where decision mechanisms have become cumbersome, have lost."

The Cordless Recordings label is an ambitious experiment in several ways for Warner Music, which has increased its focus on digital distribution since being sold by parent company Time Warner in late 2003 and going public earlier this year.

Warner's new owner, Edgar Bronfman Jr., has repeatedly highlighted for investors his belief that digital markets are responsible for the industry's growth, and recently told attendees at a big gathering for the mobile-phone industry that it was "the music industry's most important conference." (Shelby Bonnie, the CEO of News.com parent CNET Networks, joined Warner Music Group's board earlier this week.)

Cordless Recordings is a bet that relatively inexpensive Internet distribution and marketing may give labels a cost-effective way to nurture bands over time, instead of spending as much as hundreds of thousands of dollars to record and market a first album.

It's also an experiment with patience. The idea is to release short three-song clusters online every few months over the course of nearly two years, allowing musicians to grow artistically and build an audience, an approach that differs radically from betting everything on a single 12-song album.

"It seems like a smarter way of spending money," said Larry Little, co-founder of Los Angeles-based From the Future Management, which represents bands including The Posies and Film School. "It sets the band up in a position where they don't necessarily have to deliver right out of the gate. Today, the pressure has been to have a hit right away."

Cordless calling
For Holly Brewer, singer in the band Humanwine, contact with Cordless came as a surprise.

The group, which plays an eclectic mix of music that Brewer describes as "punk rock Mary Poppins," is one of the first six bands to be released on Cordless. She says a friend passed an MP3 of their music to Holzman, and the label called not long afterward.

The band members hadn't heard of Warner's digital plans but were impressed by Holzman.

"He's really what people say about him," Brewer said. "He's honestly beyond making a legacy, and he just wants to put out good music. He's probably not nearly as dangerous as he was 20 years ago."

Holzman came to the project almost by accident. His roots stretch back decades, to 1950, when he started Elektra as one of the first independent record labels in the country, ultimately signing The Doors as his most lasting band. He sold Elektra to Warner in the early 1970s and has been in and out of the company since, serving as Warner Music's Chief Technology Officer in the early 1990s before leaving for another start-up.

His return to Warner came shortly after Bronfman's purchase of the label nearly two years ago. He met with Bronfman, who showed him a list of potential digital ideas and asked if he wanted to be involved with any of them.

Holzman liked the idea of a digital-only label and has worked since to put the infrastructure for Cordless in place.

The first six bands are all young and relatively new to recording. Along with Boston-based Humanwine, they include Jihad Jerry & The Evildoers, Breakup Breakdown, Dangerous Muse, Nozzle, and Koishii & Hush. There's no common thread, other than that Holzman and the others helping to run the label liked the music, the executives said.

They aren't likely to make an immediate significant impact on Warner's bottom line. But that's not the point, or not yet. Cordless is an experiment for both the label and the bands on it.

For bands, which will get only a small advance that will cover some recording costs, it's a bet that the Internet can help build their reputation without having a CD available in Tower Records or other stores. In return, they get the extraordinarily rare right to keep permanent ownership of their music.

On the label side, it's an attempt to reach out to a music-consuming world that is increasingly deserting radio and record stores for iTunes and MySpace.

"The scene today is one of some confusion," Holzman said. "Nobody knows which way to jump or where it's going to go. But we intend with Warner and Cordless to point the direction we want to go. Whether anybody else follows is another matter."
http://news.com.com/Opening+the+door...3-5942975.html





'Spyware' Vendor Bangs Copyright Shield
John Leyden

RetroCoder, developer of the SpyMon remote monitoring program, is brandishing copyright law in a bid to protect its software from being detected by anti-spyware or anti-virus products.

SpyMon is marketed as a means for the paranoid to surreptitiously monitor the activities of their partners or kids online - behaviour that has brought it to the attention of security vendors.

RetroCoder has countered by confronting visitors to SpyMon's download page with a 'copyright notice' (http://www.spymon.com/download.htm) which states that the software may not be examined by security researchers.

"If you do produce a program that will affect this softwares ability to perform its function then you may have to prove in criminal court that you have not infringed this warning. Infringement of a copyright licence is a criminal offence," RetroCoder's End User Licensing Agreement (EULA) states.

It's questionable whether this agreement would withstand legal challenge but RetroCoder is making good on its threat to take security vendors to task for detecting its product. Anti-spyware maker Sunbelt Software has been sent a nastygram (http://sunbeltblog.blogspot.com/2005/11/retrocoder.html) threatening legal action against it for labelling SpyMon as spyware.

"If you read the copyright agreement when you downloaded or ran our program you will see that anti-spyware publishers / software houses are NOT allowed to download, run or examine the software in any way. By doing so you are breaking EU copyright law, this is a criminal offence. Please remove our program from your detection list or we will be forced to take action against you," RetroCoder said.

Sunbelt Software is standing firm in its decision to label SpyMon as malware. It's far from alone in labeling SpyMon as potentially harmful. CA, for example, designates the software as a keystroke logger.

Red Herring

RetroCoder's effort to cow security vendors is far from unique. Simon Perry, CA's VP of security strategy in EMEA, said security makers are getting hit by such legal threats on a regular basis. "I'm not aware of any developer successfully using this tactic in order to get a security vendor to back off," he said.

Perry said that the copyright threat tactic was something of a red herring. "A copyright license has nothing to do with looking at software. You don't need to decompile code to look at its behaviour. By looking at whether software 'does what it says on the tin' you can say whether it's potentially harmful or not," he said.
http://www.theregister.co.uk/2005/11/14/spymon/






Gas Pipe Broadband?
Marguerite Reardon

Imagine accessing the Internet over the same pipe that provides you with natural gas for cooking.

It may sound nuts today, but a San Diego company called Nethercomm is developing a way to use ultra wideband wireless signals to transmit data at broadband speeds through natural-gas pipes. The company claims its technology will be able to offer 100 megabits per second to every home, which is more than enough to provide voice, video and high-speed Internet access.

Needless to say, there's a big caveat here: These claims have yet to be tested. Nethercomm has no working products and has not tried the technology in the field.

"When I first heard about it, it seemed pretty outrageous," said Joe Posewick, president of EN Engineering, an engineering firm that helps natural gas companies build distribution facilities. "But the more we talked to Nethercomm and other experts in the industry, the more we realized that it could be a viable technology that could revolutionize the natural-gas industry.

"Of course, we have to see if it really works," Posewick added. "There's been no proof of concept yet."

So how does broadband in gas pipes work? Nethercomm is adapting ultra wideband radio transmitters and receivers to send wireless signals through the natural-gas pipe at the same time the pipe is delivering gas fuel. Ultra wideband, or UWB, is a developing communication technology that delivers very high-speed network data rates, but at higher power levels it can interfere with other wireless signals.

That's not usually a problem when ultra wideband signals are transmitted in pipes buried underground. As a result, tremendous amounts of data could be transmitted through a gas line without causing problems.

At least, that's the idea. Nethercomm and the technology it's developing are still in the early days. The company hasn't yet announced any licensing deals with ultra wideband equipment makers. The 12-person company, which has no venture backing at the moment, is also trying to raise money to start a pilot program with broadband providers and gas companies by next summer.

Some skeptics may scoff at the idea of using a natural-gas pipe for broadband, but it's not so easy to dismiss the man behind the technology.

Patrick Nunally, founder and CEO of Nethercomm and one of the inventors of gas line broadband, has a hefty track record. Until May 2005, he worked as chief technology officer for Patriot Scientific, a company that designs microprocessor technology for the U.S. Department of Defense. Prior to that, he was president and CEO of Intertech, a company he founded in 1998 that specialized in intellectual-property development for embedded processor and communications systems. He has also served as chief executive and chief technology officer for several other technology development companies.

Nunally holds more than 134 patents worldwide, predominantly in wireless and signal processing. He's been honored with awards from the IEEE (Institute of Electrical and Electronics Engineers) and was named an IEEE/IAE (Institute for the Advancement of Engineers) fellow in 1994. He has even received a formal citation from former President Bill Clinton for his efforts in furthering technology development in the United States.

If transmitting broadband through natural-gas pipes works as Nethercomm's execs think it can, it could have a major impact on the broadband access market. A recent U.S. Supreme Court decision and changes in the Federal Communications Commission's classification of DSL have made it more difficult for independent service providers to use existing cable or phone infrastructure to reach broadband customers.

What's more, the old copper infrastructure that is currently used to deliver DSL service doesn't have enough capacity to support new applications like high-definition television service. While phone companies like SBC Communications and Verizon Communications have already begun spending billions of dollars to upgrade their networks to provide more capacity, technology that uses existing pipes into people's homes could augment these new networks. "The cable guys view us as a threat," Nunally said. "But the phone companies that we have talked to seem more interested in working with us."

Internet service providers and the phone companies, in fact, have been looking for alternative ways to deliver broadband services to customers. EarthLink has been busy exploring new technologies such as broadband over power lines and citywide Wi-Fi. Back in October, it announced a contract to build a wireless broadband network for the city of Philadelphia.

A chicken-and-egg problem
EarthLink has also been testing ways to use electrical power lines to deliver broadband service into homes. It's currently testing services with Duke Power in Charlotte, N.C., Progress Energy in Raleigh, N.C., and Consolidated Edison in New York. Google is also looking into citywide Wi-Fi in San Francisco, and it has invested in a broadband-over-power-line service provider called Current Communications Group.

Other carriers, such as BellSouth, have started offering services that use prestandard WiMax technology. It now offers service in Athens, Ga., Palatka, Fla., and parts of New Orleans.

There are technical limitations to each of these technologies. For example, broadband on power lines has interference issues. Citywide Wi-Fi and WiMax don't offer the kinds of bandwidth needed to deliver high-capacity applications like high-definition television. WiMax supports data speeds of 75Mbps, but that capacity is shared among dozens of users. BellSouth's service in New Orleans only offers 1.5Mbps.
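The shared-capacity point is easy to check with back-of-envelope arithmetic. A minimal sketch in Python, using the figures quoted above; the user count of 50 is an illustrative assumption, since the article only says "dozens":

```python
# Per-user throughput when a WiMax cell's capacity is shared evenly.
# Figures from the article: 75 Mbps total capacity; 50 users assumed.
def per_user_mbps(total_mbps: float, users: int) -> float:
    """Evenly divided share of a cell's total capacity."""
    return total_mbps / users

share = per_user_mbps(75.0, 50)
print(f"{share} Mbps per user")  # 1.5 Mbps, matching BellSouth's New Orleans offering
```

With 50 subscribers sharing one cell, each gets only 1.5Mbps, which is why a raw 75Mbps figure says little about the experience of any one household.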

Natural-gas piping could be a good solution to the problem. It already serves more than 70 percent of households and well over 35 percent of businesses in the United States, according to West Technology Research Solutions, an independent market research firm. Because the lines are underground, more powerful transmitters can be used, which ramps up bandwidth to 100Mbps for every household.

Delivering broadband through gas pipes could be much cheaper than technology available today, according to a recent study by West Technology Research Solutions. The analyst firm estimates it would cost a phone company about $500 per customer to deploy broadband in gas pipes. Deploying DSL over its existing copper infrastructure costs about $1,000 per customer. Fiber to the home is even more expensive, costing about $2,000 per customer.
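Those per-customer estimates can be put side by side directly. A minimal sketch, using only the West Technology figures quoted above:

```python
# Deployment cost per customer, per the West Technology estimates in the article.
costs = {
    "broadband over gas pipes": 500,
    "DSL over existing copper": 1000,
    "fiber to the home": 2000,
}

cheapest = min(costs, key=costs.get)
for tech, dollars in sorted(costs.items(), key=lambda kv: kv[1]):
    ratio = dollars / costs[cheapest]
    print(f"{tech}: ${dollars} per customer ({ratio:.0f}x the cheapest)")
```

On these numbers, DSL costs twice as much per customer as the gas-pipe approach and fiber four times as much, which is the cost gap West calls "a big factor."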

Nunally estimates a deployment for a city of a million homes would cost $2 million.

"If a comparable service can be delivered cheaper through natural-gas pipes, then that is a big factor," said George West, senior analyst at West Technology.

While most analysts are not sold on the technology just yet, many believe it sounds plausible.

"In theory it could work," said Craig Mathias, a principal analyst at Farpoint Group, a consultancy specializing in wireless and mobile technologies. "Ultra wideband technology is pretty tolerant. But I'm not sure how well it could work within all the twists and turns inside a natural-gas pipe."

At least one ISP says it's interested in learning more about broadband in gas pipes.

"We are just starting to look into it," said Kevin Brand, vice president of product management for EarthLink. "We still need to see how real the technology is, whether it will really work, and how much it will cost to deploy. It doesn't have the ubiquity of power line technology, but natural gas goes into a great number of homes. It's interesting."

West predicts that 3.9 million households will subscribe to broadband services delivered through gas pipes by 2008. And by 2010, he predicts that number will grow to 18.6 million subscribers.

That is, if someone has the guts to test it.

"Utilities are conservative by nature," said EN Engineering's Posewick. "There's a little bit of a chicken-and-egg problem. You need to prove to them that the technology works, but to know if it works, we need someone who can help test it."
http://news.com.com/Gas+pipe+broadba...?tag=nefd.lede





Gandalf Spells Thousand-Fold Broadband Boost

EU funded project will offer DSL at 1.25Gbps
Robert Jaques

An EU funded project dubbed 'Gandalf' is aiming to cast a spell over the internet by boosting broadband speeds by a factor of more than 1,000.

The EU Information Society Technologies' Gandalf project has been created to offer unprecedented data transfer rates seamlessly over fixed-line and wireless connections.

Scientists working on the initiative claim to have developed a groundbreaking technique to increase data rates a thousand-fold compared to existing DSL, and a hundred-fold compared to Wi-Fi.

The technology also allows data to flow over wireless and fixed-line communications, making the project the only initiative in the world to progress so far in both areas.

"Why not use the same technology for both fixed-line and wireless? That was the fundamental question that drove the project," explained Gandalf co-ordinator Javier Martí at the Technical University of Valencia in Spain.

"We also saw the need to address the additional challenge of obtaining high rates of data transfer, exceeding 1Gbps, over both cable and radio."

In order to overcome these hurdles, the project partners developed a technique using an optical feeder that allows data to be sent over cable in a format which also allows it to be transmitted over wireless networks.

This duality ensures that the access nodes and the modems in the homes or offices of end users are all the same regardless of whether they are receiving data via cable, radio or both.

The upshot is that the technology is relatively cheap to deploy and will ultimately reduce costs for operators by alleviating bandwidth requirements at the transmitter end.

It will also simplify the electronics involved at the transmitter and receiving ends, and result in cost savings which could be passed on to end users, the researchers promised.

"The major advantage for operators is that the cost of implementing Gandalf is minimal," said Martí.

"We estimate that it would not cost more than say €50,000 or €60,000 to implement it across an entire network, which is peanuts for operators."

The system would give operators access to more clients without having to undergo costly public works projects to lay new fibre optic cable.

Existing cable could be used to relay data to the closest access node to clients' homes before being converted into a wireless signal.

Laboratory tests have so far achieved a data transfer rate over both fixed-line and wireless of 1.25Gbps and the project partners are currently studying the other capabilities of the system.

"We are planning to test the capabilities of creating wireless access at double radio bands using the base-band and intermediate frequencies," explained Martí.

"So far we have managed Wi-Fi at 5GHz and WiMax at 40GHz. That makes sense in hotspots which can become very crowded very quickly, say in a football stadium during a soccer match.

"The technique would allow additional bandwidth to be added immediately as and where needed.

"The technology is here and it works, the only thing remaining is standardisation. Depending on how well that goes I imagine a product will be on the market within a year from now.
www.vnunet.com/2145795





Other Nations Hope to Loosen U.S. Grip on Internet
Victoria Shannon

When Libya lost the use of its Internet domain ".ly" for five days last year, it needed help from an agency in California that reports to the United States Commerce Department.

Anyone looking to do business with an .ly Web site or e-mail an .ly address probably met with a "file not found" or "no such person" message. For anyone on the Internet, Libya was just not there.
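What "Libya was just not there" means in practice is that DNS lookups for .ly names simply failed. A minimal sketch of checking whether a name resolves, using Python's standard socket module; the hostnames here are purely illustrative:

```python
import socket

def resolves(hostname: str) -> bool:
    """Return True if DNS can map the hostname to an IP address."""
    try:
        socket.getaddrinfo(hostname, None)
        return True
    except socket.gaierror:  # name doesn't exist, or no resolver reachable
        return False

# During the outage, every lookup under .ly behaved like a lookup under
# .invalid, a top-level domain reserved to never resolve:
print(resolves("host.invalid"))  # False
```

From a browser's or mail server's point of view there is no difference between a domain that was never registered and a country-code domain whose operator has gone dark; both produce the same resolution failure.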

In a day when Internet access is critical to world commerce - let alone casual communication - even a five-day lapse is a hardship. And when one government needs the help of another to make its citizens visible again on that network, it can be a damaging blow to its sovereignty, and perhaps a matter of national security, even if the cause was a dispute over payments, as in the Libyan case.

What if, by historical chance, France or Britain controlled country domain names on the Internet? Would the United States settle for asking another government to fix its own addresses?

That kind of power to hinder or foster freedom of the Internet, centralized in a single government, is the crucial issue for many of the 12,000 people expected in Tunis this week for a United Nations summit meeting on the information age.

Four years of high-level talks on Internet governance will conclude with the meeting. On its eve, a figurative ocean separates the American position - that the Internet works fine as it is - from most of the rest of the world. That includes the European Union, which says the Internet is an international resource whose center of gravity must move away from Washington.

Managing operators of country-level domain names like .ly, .de and .co.uk is one way the United States, through Icann, a nonprofit agency based in California, controls the Internet. This organization is a consequence of the network's development from research in American universities, laboratories and government agencies in the 1970's.

Icann, which is short for the Internet Corporation for Assigned Names and Numbers, is a central authority in an essentially decentralized, neutral and ungoverned global network of networks. So that one computer can easily find another, Icann runs the addressing system, giving out blocks of unique identifiers to countries and private registries.

Whether this week's final debates break the deadlock and produce any agreement to give other governments more sway over Internet policy was in some doubt last week. Even a recent discussion of Internet governance between President Bush and José Manuel Barroso, president of the European Commission, did not bring the sides any closer.

"Our strong preference is to have a document that everyone can be proud of," said David Gross, deputy assistant secretary of state, who is leading the American delegation with Michael Gallagher, assistant secretary of commerce.

"We would be sorely disappointed not to have a document at all, but that would be better than to have a bad document," Mr. Gross said from Tunis, where the negotiations resumed Sunday before the official start of the summit meeting on Wednesday.

A delegate from the European Union said that its call for a new intergovernmental body to set the principles for running the Internet still stands and that the solitary American relationship with Icann "is not sustainable" in the long term.

The official, who spoke on condition that he not be identified because he was not authorized to speak for the delegation, conceded that "there is a need for clarification" in the European Union statement.

But he said the 25-member European contingent was unanimous in its stance, which he called "a middle ground between two extremes: those who are for a complete overhaul and those who are for the status quo."

Mr. Gross and other Americans dismiss the European view as "top down" control of the Internet, as opposed to the private-sector-led, "bottom up" approach of Icann.

But the essential problem is that "the United States holds most of the cards, and if it isn't willing to give any up, it can't be forced to," said Milton Mueller, a partner in the Internet Project, a nonprofit group.

When the first part of the summit meeting took place two years ago in Geneva, many participants feared that the United Nations, through the International Telecommunication Union, wanted to govern Internet issues.

"Today, the I.T.U. is off the table," Mr. Mueller said.

But Mr. Mueller, a participant in the meeting and a longtime follower of developments at Icann, said the Americans had handled their position poorly in the face of global opposition since then. "Americans are so parochial when it comes to these things," he said. "They have no idea how it sounds to 200 other countries when they say, 'The Internet really is nongovernmental - except for us.' Why were they so surprised? In the U.S., that contradiction becomes invisible to you."

The agreement that lets Icann operate under Commerce Department oversight expires in September. The United States government, however, said in June that it would not let its oversight of the master file that decodes Internet addresses lapse despite the agreement.

But Mr. Mueller says the summit meeting fireworks may lead the Bush administration to consider other options that are not so unilateral.

Such small, longer-term steps may not come soon enough for some developments. Two trends herald an enormous demand for unique Internet addresses of the kind that Icann manages, and global participants are eager that the policy and political questions be settled quickly.

One trend is the move by media businesses to make their products available online. Each song, video clip, book or other digital product requires a unique identifier to locate it on the Internet, even if the file is not a Web site per se.

The other is the desire of manufacturers and wholesalers to put radio tags in their physical products for inventory and other supply-chain management. To be tracked over the Internet, each tag also needs its own Internet "address," leading to what the I.T.U. is calling an "Internet of things."

Paul Twomey, the Australian who is chief executive of Icann, said the cooperative and democratic Internet was unlike any other network that had been governed so far - not like the world's railroads, the postal systems or telecommunications - and so required a fresh, untried approach.
http://www.nytimes.com/2005/11/15/technology/15net.html





US To Keep Control Of Internet Traffic System
Andy Sullivan and Astrid Wendlandt

The United States will keep control of the domain-name system that guides online traffic under an agreement on Wednesday seen as a setback to efforts to internationalize one of the pillars of the Internet.

Negotiators at the United Nations World Summit on the Information Society said they had agreed to set up a forum to discuss "spam" e-mail and other Internet issues and explore ways to narrow the technology gap between rich and poor countries.

But oversight of the domain-name system will remain with the United States, a setback for the European Union and other countries that had pushed for international control of one of the most important technical aspects of the Internet.

The European Union said in a statement that the agreement would lead to "further internationalization of Internet governance, and enhanced intergovernmental cooperation to this end."

"In the short term, U.S. oversight is not immediately challenged, but in the long term they are under the obligation to negotiate with all the states about the future and evolution of Internet governance," said a member of the EU delegation who declined to be identified.

The U.S. said the agreement essentially endorses the status quo.

"There's nothing new in this document that wasn't already out there before," said Ambassador David Gross, the head of the U.S. delegation.

"We have no concerns that it could morph into something unsavory," he said about the forum.

The summit was launched two years ago with a focus on bringing technology to the developing world, but U.S. control of the domain-name system had become a sticking point for countries like Iran and Brazil, who argued that it should be managed by the United Nations or some other global body.

The United States argued that such a body would stifle innovation with red tape. The EU in recent months had sought to reach a compromise between the two sides.

"Let me be absolutely clear: the United Nations does not want to take over, police or otherwise control the Internet," said UN Secretary General Kofi Annan. "Day-to-day running of the Internet must be left to technical institutions, not least to shield it from the heat of day to day politics."

Under the agreement, a California nonprofit body known as the Internet Corporation for Assigned Names and Numbers, or ICANN, will continue to oversee the system that matches addresses like "reuters.com" with numerical addresses that computers can understand.

Individual countries will have greater control over their own domains, such as China's .cn or France's .fr. Disputes have arisen on occasion between national governments and the independent administrators assigned to manage these domains by ICANN.

Businesses, technical experts and human-rights groups will be allowed to participate along with governments in the forum, which will first meet in early 2006.

"Internet governance requires a multi-stakeholder approach. This is why we have suffered such agonies in our discussions on Internet governance," said Yoshio Utsumi, who heads the International Telecommunication Union, the UN organization that sponsored the summit.

(Additional reporting by Huw Jones in Brussels)
http://today.reuters.com/news/newsAr...&srch=internet





Microsoft Enters the High-Performance Computing Fray
John Markoff

In January a group of Microsoft researchers set out to discover how much computing power they could buy for less than $4,000 at a standard online retailer.

They found the answer at NewEgg.com, where they were able to purchase - for just $3,632 - 9.5 gigaflops of computing speed. That is the amount of computing power offered by a Cray Y-MP supercomputer in 1991 at a cost of $40 million.
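The arithmetic behind that comparison is worth making explicit. A quick sketch using only the figures quoted above (the gigaflop ratings are the article's, not independently verified):

```python
# Back-of-envelope cost per gigaflop, using only the article's figures.
cray_cost = 40_000_000     # Cray Y-MP, 1991 (USD)
cray_gflops = 9.5          # the article's figure for the Y-MP

newegg_cost = 3_632        # 2005 commodity parts from NewEgg.com (USD)
newegg_gflops = 9.5

cost_1991 = cray_cost / cray_gflops
cost_2005 = newegg_cost / newegg_gflops

print(f"1991: ${cost_1991:,.0f} per gigaflop")
print(f"2005: ${cost_2005:,.0f} per gigaflop")
print(f"Ratio: {cost_1991 / cost_2005:,.0f}x cheaper")
```

By these numbers, a gigaflop fell from over $4 million to under $400 in fourteen years, a price collapse of roughly four orders of magnitude.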

The plunging cost of computing power is both an opportunity and a challenge to Microsoft, which on Tuesday plans to unveil its first entry into the market for high-performance scientific and technical computing.

The company's Windows Compute Cluster Server 2003 software is scheduled to become available in the first half of next year and is intended to give scientists and engineers a simple way to gain high-performance computing from their existing Microsoft desktop computers.

The high-performance market is growing more quickly than the rest of the server market, according to the International Data Corporation. Last year, the percentage of high-performance servers grew from 7 percent to 10 percent of the X86 computer server market. The new Microsoft software is meant for systems with up to 64 processors, but can be extended to much larger machines as well, if they are linked internally on high-speed data networking connections, the company said.

"Our focus is not on the very highest-end systems but on divisional and departmental computing systems," said Kyril Faenov, Microsoft's director of high-performance computing.

The entry is significant, because until now Microsoft has largely been excluded from the high-performance market, which is dominated by Linux and Unix software.

However, Microsoft is planning a significant marketing push into the field with a keynote speech by Bill Gates, the company's co-founder and chairman, on Tuesday at the annual supercomputing trade show taking place this week in Seattle.

Microsoft, based in Redmond, Wash., will also announce that it is planning to provide international support for 10 supercomputing institutes around the world, including Cornell University, the University of Utah, University of Stuttgart and Shanghai Jiao Tong University.

"We think the big deal here is to give a lot more people access to a level of computation that was not available before," said Craig Mundie, a senior vice president at Microsoft and one of the company's three chief technology officers.

Microsoft is hoping to leverage its monopoly position in desktop computing by offering scientists and engineers a single computing environment with the thousands of applications now available for desktop and server operating systems.

The low end of the technical computing market is, however, already highly competitive, with many software systems available as inexpensive open-source programs.

For example, the Linux Rocks program has been developed for more than a decade by a small group of engineers led by Philip M. Papadopoulos, program director of grid and cluster computing at the San Diego Supercomputer Center.

That program, which is freely available and is used by more than 500 academic and technical computing sites, can be installed on a 128-processor system in as little as eight minutes with the aid of BitTorrent file-sharing software.

To move into the scientific and technical computing world, Microsoft will have to overcome several obstacles, Mr. Papadopoulos said.

"Most users are Unix friendly, that's the environment they work in," he said. If Microsoft wants to move scientists and engineers into its software environment, it will have to demonstrate compatibility and prove that it offers an easier environment in which to do parallel programming, he said.

Microsoft executives said they were still refining pricing for the Compute Cluster Server software, but added that prices would be similar to those for the company's multi-processor server system.

Microsoft's advantage will come in helping scientists manage the huge explosion of data generated by new sensor technology now being widely used in the sciences, Mr. Mundie said. By lowering the cost of maintaining a computing system, the new software will also be an advantage for Microsoft, he added.
http://www.nytimes.com/2005/11/15/te...y/15super.html





Apple to Raise iTunes Prices?

EMI chief says Apple will change to a flexible pricing structure within 12 months.

Apple Computer may blink after all on its dogged determination to keep its price per download for iTunes songs at an across-the-board $0.99.

The company that created the portable digital device sensation, the iPod, and the iTunes Music Store that drives it, may be about to give in to pressure from music publishers such as Warner Music Group and EMI Music.

In a story in Thursday’s Wall Street Journal, Alain Levy, chief executive of EMI, is quoted as saying that he discussed the issue with Apple CEO Steve Jobs and he believes that Apple plans to change its one-price policy.

According to the paper, Mr. Levy said the issue is not really whether Apple will introduce flexible pricing, but when it will do it.

Shares of Apple were down $0.28 to $64.67 in recent trading.

Back in September, Mr. Jobs blasted music industry executives, calling them greedy for pushing for an increase in the price of downloaded music, saying their demands, if met, would serve to encourage piracy.

Warner Music Group CEO Edgar Bronfman Jr. rose to the challenge, saying the single-price strategy was unfair to the music industry, the artists, and the consumer (see Bronfman Fires Back at Apple). He said the market should decide what music downloads should cost, not a single retailer like Cupertino, California-based Apple.

“Some songs should be $0.99 and some songs should be more. I don’t want to give anyone the impression that $0.99 is a thing of the past,” he said.

Mr. Bronfman said the digital music industry had become the only one that did not have flexible pricing. He could have pointed to the movie industry as a glaring exception to that rule. No matter which movie one attends at the multiplex, whether it’s a hit or a dog, it costs the same.

Sharing iPod Revenue
“We are selling our songs through iPod, but we don’t have a share of iPod’s revenue,” he said. “We want to share in those revenue streams. We have to get out of the mindset that our content has promotional value only.”

At the time Mr. Bronfman called for flexible pricing, Apple refused to respond to questions about the notion that superior or more popular music could be priced higher than $0.99. But according to Mr. Levy, Apple will relent in the next 12 months.

Sales of digital music have been growing in the last couple of years and now account for more than 5 percent of the global recorded music market. That is about three times the level of a year ago.

“There are some reports showing that single-track downloads in the U.S. are slowing, but we are also seeing full album downloads in the U.S. accelerating in growth,” said Eric Nicoli, chairman of EMI Group. “We are interested in digital distribution of every kind all over the world. We are interested in downloads, in subscription models, in legalized peer-to-peer, in every aspect of mobile distribution.”

EMI, which posted its six-month performance on Wednesday, said its digital sales grew by 142.4 percent to $76.5 million, representing 4.9 percent of total group revenues, up from 2.1 percent of the firm’s revenues in the prior year’s first half (see A Sour Note on File Sharing).
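Those percentages imply a prior-period figure the article doesn't state directly; a quick back-of-envelope check using only the numbers quoted above:

```python
# Implied prior first-half digital revenue from EMI's reported growth.
current = 76.5        # USD millions, digital sales this half
growth = 1.424        # 142.4 percent growth over the prior-year half

prior = current / (1 + growth)
print(f"Implied prior first-half digital sales: ${prior:.1f}M")
```

So EMI's digital revenue roughly went from about $31.6 million to $76.5 million half-over-half, consistent with the jump from 2.1 percent to 4.9 percent of group revenues.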
http://www.redherring.com/Article.as...unes+Prices%3f





The Empire Strikes Back

How the media industry plans to break Apple's stranglehold on digital content.
Erick Schonfeld

Nobody said the digital life would be easy. That’s a lesson the media industry is learning as its businesses erode thanks to file swappers on one side and the margin-squeezing dictates of Apple (AAPL) on the other. No matter what the industry does, when its products become digital, it loses control over them. Even with Apple and its amazingly successful iTunes store, the music labels have to sell their songs the way Apple tells them to: one at a time, for 99 cents each. Now, as Apple introduces video into the mix, the movie companies, too, are concerned about ceding control to Apple.

“The content owners don’t like that,” notes Stefan Roever, the CEO of a startup called Navio that wants to help shift the balance of power back to the media companies. How? Imagine if you went to a music site to buy a single download for 99 cents, but instead you were offered the option to purchase the perpetual right to that song. With this right, you could download the song to your PC, your iPod, or your cell phone in whatever format was appropriate. And if you got a new computer, or if the digital-rights-management software protecting the file changed one day, you wouldn’t need to buy the song again. Your rights to the song would be stored online. Pay once, and it would be yours forever. If you lost it, you’d just download it again. Or you could share the song with a friend, or even resell it, depending on what rights you bought.
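The "rights locker" idea described above can be sketched as a toy data structure: ownership is a server-side record of rights, not a file on a device. Everything here is illustrative; the class names, fields, and format list are invented for the sketch, not Navio's actual system.

```python
# Toy sketch of a "rights locker": pay once, the right lives online,
# and the file can be re-delivered to any device in any covered format.
from dataclasses import dataclass, field

@dataclass
class SongRight:
    track: str
    resellable: bool = False
    formats: set = field(default_factory=lambda: {"mp3", "aac", "ringtone"})

class RightsLocker:
    """Server-side record of what a customer owns, independent of any file."""
    def __init__(self):
        self._rights = {}   # user -> {track: SongRight}

    def purchase(self, user, track, resellable=False):
        self._rights.setdefault(user, {})[track] = SongRight(track, resellable)

    def download(self, user, track, fmt):
        right = self._rights.get(user, {}).get(track)
        if right is None:
            raise PermissionError("no right to this track")
        if fmt not in right.formats:
            raise ValueError("format not covered by this right")
        return f"{track}.{fmt}"    # stand-in for delivering the actual file

locker = RightsLocker()
locker.purchase("alice", "great-song")
print(locker.download("alice", "great-song", "ringtone"))  # re-download any time
```

The point of the design is that losing a device, changing computers, or a DRM format change never destroys the purchase: the right, not the file, is the durable asset.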

Now imagine further that you didn’t have to go to Sony (SNE) BMG’s or Fox Music’s site to buy these songs, but instead could go to the artists’ sites themselves or any affiliate site or even a music blog that sold rights to the songs for a 10 or 15 percent split of the revenues it helped generate. Any site could become an affiliate. So if you went to the website of the music club where you’d seen a great band the night before, the rights to the band’s entire set list might be for sale on a track-by-track basis.

And what if you could also buy rights to additional digital content related to the song, such as a ringtone version for your cell phone, a screensaver or wallpaper featuring that artist, or a videogame. In other words, for a slightly higher price, you could get a bundle of digital content. You might go online to buy a movie ticket, and you’d get an additional offer to buy the soundtrack and a mobile game based on the movie, all for an extra $8. These digital rights could also be bundled with physical goods like CDs, DVDs, or coupons.

Contrast this with the current iTunes model, which works great for Apple but not so great for the media companies. Apple takes its near-monopoly position in digital music players and forces customers to use a specific application (iTunes) to buy their music. By controlling the hardware and software for iPods, Apple in effect controls most digital music distribution. That lets Apple dictate how digital songs are sold and at what price.

Although the music companies get about 70 percent of what Apple collects, the margins are still miserable compared with those for CDs. That’s because the CD is a bundled sale: one high price for a bundle of songs. Of course, you can buy whole albums on iTunes as well, but you don’t have to -- and that’s what’s breaking the business model for the music companies. “The old economics on a CD worked,” Roever explains, “because people were buying bundles. The new economics are 10 times worse.” Since technology broke the old CD bundle, Roever now wants to use his technology to help the media companies create entirely new bundles.

Fox and Sony BMG are already experimenting with Navio’s technology, which acts as the back-end digital plumbing that keeps track of all the rights, delivers files in the proper format to the proper device, and even provides Flash-based Web storefronts, thus allowing any site to become an online store. In return, Navio takes a 15 percent cut, on average. Roever says he’s in discussions with other major media players as well to “rights-enable” the entire contents of their digital catalogs.

The key is getting around Apple. Come December, Navio will launch a new version of its service that will finally allow music companies to sell songs that will play on iPods without going through Apple’s iTunes store. How will Apple react to this news? “I think they will go ballistic,” Roever predicts, before adding, “There is nothing Apple can do to prevent this.” (Like RealNetworks before it, Navio has reverse-engineered the iPod’s FairPlay software so that Navio can deliver copy-protected songs in a format that will play on the device.) If nothing else, though, Apple can at least try to bury Navio with lawyers.

But what Roever is proposing goes way beyond circumventing Apple. He is persuading media giants to experiment with selling the rights to a piece of digital music or a movie instead of focusing on selling the digital file itself. If the media companies don’t screw this one up, it could be a potentially powerful model because it gives them total control over the pricing and packaging of their digital products. It could even help stem the tide of illegal file sharing. “In the long run, it will be so much more valuable to own the right than steal the file,” Roever predicts. That will be true, though, only if the rights being sold become more desired than the underlying file.

It all sounds good on paper. The potentially disruptive nature of selling rights instead of digital files per se opens up the door to sales of new digital bundles, to dispersing distribution throughout the Web, and to the ability to sell future-proof versions of songs, movies, games, and whatnot. But it also sounds extremely complicated. Who will explain these new rights to consumers? Who will make sure that the media companies actually offer something better than what is available today through iTunes -- instead of trying to gouge customers by upselling them bundles of useless stuff they really don’t want?

In the end, Apple will still have simplicity on its side. And simplicity usually wins.
http://www.business2.com/b2/web/arti...130183,00.html





US Wants Wiretap Ability On Internet Calls Expanded
Jeremy Pelofsky

U.S. law enforcement authorities want expanded ability to tap any phone call between an Internet phone and a traditional phone if needed for an investigation, according to documents filed this week.

The U.S. Justice Department urged communications regulators to require Internet phone companies to provide the ability to conduct surveillance on services that offer only outgoing calls or incoming calls to or from the traditional phone network.

With the growth of high-speed Internet services, several companies like privately-held Vonage Holdings Corp. and Skype, which eBay Inc. recently bought, are now offering low-priced Internet telephone service as an alternative.

There are approximately 3.6 million U.S. customers who have signed up for two-way Internet phone service, known as Voice Over Internet Protocol, or VOIP, according to a new survey by TeleGeography Research.

The group projects 4.4 million U.S. subscribers by the end of the year and close to 20 million by 2010.

The Communications Assistance for Law Enforcement Act (CALEA), passed by Congress in 1994, was aimed at preserving the ability of authorities to conduct court-ordered wiretaps as technology advanced.

The U.S. Federal Communications Commission in August ruled that companies like Vonage must offer law enforcement authorities the ability to conduct surveillance on Internet phone services that can both make and receive calls to and from the traditional phone network.

However, Skype offers two independent one-way services: SkypeOut, which permits outbound calls that connect to the traditional phone network, and SkypeIn, which receives calls from the phone network and gives the customer a phone number.

Comments Filed

Without referring to Skype, the Justice Department asked that CALEA be extended to services that "enable customers to place calls to or receive calls from the Public Switched Telephone Network (PSTN)."

The agency filed comments on Monday with the FCC, which is weighing how to apply the law to new communications services.

Skype argued against applying the surveillance law beyond two-way Internet phone services, noting that the FCC decision was aimed at those that replace traditional phone service.

"Many software-based VOIP products are used not to replace traditional telephony, but as a component of electronic messaging and other information services, which Congress clearly indicated was not covered by CALEA," Skype said.

The FCC decision in August also extended the surveillance law to broadband Internet access, a move that raised concerns by educational institutions like Cornell University which said the agency overstepped its bounds.

"If Cornell is not providing services for hire, it should be exempt from CALEA," the university said in comments to the FCC filed on Friday. "Congress expressly excluded 'private networks' from CALEA's reach."

The FCC said in its order that private networks would not be subject to the wiretap requirements but those that are connected with a public network would have to comply with the law.
http://today.reuters.com/news/newsAr...&srch=internet





Blu-ray, HP At Odds Over High-Def DVD Launch Plan
Sue Zeidler

The Blu-ray Disc group, which aims to set the standard for next-generation DVDs, said on Wednesday that it would not adopt a proposal from Hewlett-Packard Co. in time for the technology's launch, leading the PC maker to say it may back a rival in the looming multibillion-dollar format war.

That would leave HP, the No. 2 PC maker, splitting support between the two leading technologies.

Long a supporter of Sony Corp.-led Blu-ray, HP in October said if two technologies it considered important to PC users were not included in Blu-ray's specifications, it would consider backing rival standard HD-DVD, championed by Toshiba Corp.

Billions of dollars are at stake as the electronics, computer, movie and television industries gear up for a technology change expected to send consumers back to the stores for equipment and discs that will play high-definition pictures and store many times the data of current discs.

But the competing technologies may end up in a war reminiscent of the VHS-Betamax format battle decades ago, which confused buyers and proved an expensive loss for many companies.

HP has championed two technologies known as iHD and mandatory managed copy. Mandatory managed copy lets users legally copy DVDs and store the digital file on a home network, while iHD provides for new interactive features and is slated to be implemented in Microsoft Corp.'s new Windows Vista operating system.

On Wednesday, Blu-ray said it would incorporate mandatory managed copy but would launch in spring 2006 with interactive features built on Sun Microsystems Inc.'s Java software.

"Mandatory managed copy will be part of Blu-ray format, but while HP's request (for interactivity) is being considered, at this point in time, the Blu-ray group is still proceeding down the path of Java," Blu-ray spokesman Andy Parsons told Reuters in an interview.

"We are taking their request seriously, but are not willing to delay the launch and are going to go forward with the Java-type option," he said.

"I'm not saying we would not implement what they've requested, but it's not going to stop the format at this time," he said. "HP is still a valued member of the Blu-ray Disc Association and I expect to see them supporting Blu-ray in upcoming promotional events."

Microsoft and Intel Corp. both support HD-DVD. HP has said its move reflected its desire to ensure customers are not forced to choose between competing formats for DVDs.

Maureen Weber, general manager of personal storage in HP's personal systems group, on Wednesday said if Blu-ray remains committed to this stance, the computer maker will indeed adopt a more neutral position versus being an exclusive Blu-ray supporter.

"If they are unable to incorporate technologies we think are critical for the PC architecture, we'll be more neutral. We'll think of cost and implementation across the board. Potentially, we could support both HD DVD and Blu-ray," she said.

"You'd see us supporting both formats in various trade show booths," she said.
http://today.reuters.com/news/newsAr...archived=False





Sony - Fighting Illegal File Sharing Illegally
Garon Anders

As students, we are poor. Because we are poor, we have a tendency to enjoy the simple pleasures in life. We hang sheets over the windows instead of curtains. Used furniture is a must. We forego framed prints and make our own art to hang on the walls. There are two places where we tend to carefully spend our money: entertainment and technology. As the most computer-savvy generation ever, we find ourselves caught in the midst of a groundbreaking technological battle. Who among us has not illegally downloaded copyrighted material? Let he who is without guilt cast the first store-bought CD.

In the past few weeks, there have been several major events that stem from the debate over sharing copyrighted material. The popular P2P service Grokster chose to settle with the Recording Industry Association of America. The settlement includes $50 million paid to the RIAA (and other plaintiffs) for the distribution of copyrighted material. Additionally, Grokster has stopped distributing its P2P software. A statement on Grokster.com reads, "The United States Supreme Court unanimously confirmed that using this service to trade copyrighted material is illegal. Copying copyrighted motion picture and music files using unauthorized peer-to-peer services is illegal and is prosecuted by copyright owners. There are legal services for downloading music and movies. This service is not one of them." Both the RIAA and the MPAA (Motion Picture Association of America) have said they will pursue their cases against the P2P services Morpheus and Kazaa.

In the past week, it was discovered that Sony/BMG was using software to protect the distribution of its music. The software, now distributed with some Sony/BMG CDs, limits the usage rights of the consumer when the disc is played on a Windows machine. When this was first discovered by a computer programmer in the UK, many took pause and noted the actions of the high-rolling music distributor. Later, it was discovered that the software was nearly impossible to remove from Windows because it installs itself as a rootkit. Today, a security firm in Amsterdam confirmed that a hacker has successfully manipulated Sony/BMG's software to distribute a Trojan known as "Stinx-E." More viruses are expected to be developed that take advantage of Sony/BMG's software.

In addition to the bad publicity that Sony/BMG faces with respect to the discovery of the rootkit found on many of their CDs, they are now facing a class action lawsuit filed in California by attorney Robert Green. According to Green, "The company appears to have violated a California state law that forbids 'inducing' the installation of spyware or similar utilities on a personal computer in order to run a particular program."

For those looking to uninstall Sony/BMG's software, a patch is available at their website, www.sonybmg.com. The company further angered consumers by failing to provide an immediate means by which to uninstall the rootkit packaged with Sony/BMG CDs. Many who visited the site were unable to download the patch because the company's website would not support Mozilla's Firefox or Opera browsers. Apparently, this has been noted and corrected as I was able to access Sony/BMG's website via Firefox earlier today.

I believe firmly that copyrighted materials should be protected from being illegally distributed. However, we have to ask ourselves, as consumers, how much more we are willing to take from firms like Sony/BMG and the RIAA. These money-hungry corporate giants are doing everything possible to alienate the consumer. Who wants to buy a CD that they cannot back up or that requires a proprietary player? Who wants to purchase music associated with a group that goes around harassing grandparents with lawsuits over the distribution of copyrighted material? Yes, CD sales are down, and there is a good reason for that. Stop producing bland, cookie-cutter, garden-variety garbage and foster a wide variety of new artists and genres!

The actions of the RIAA and Sony/BMG are reprehensible and sickening. Their actions prove that they are merely capitalist pigs whose only concern is the accrual of money. In the past three years, I have purchased two CDs and really do not have a desire to purchase more. I hope their lawsuits and malicious software serve to further alienate consumers and reduce CD sales. Maybe then, they will realize that music is not just about money and it certainly is not about lying to and cheating the consumer.
http://www.carolinianonline.com/medi...anonline.com





Unto Us The Machine Is Born
Kevin Kelly

By 2015 the internet as we know it will be dead, killed by a globe-spanning artificial consciousness, writes founding Wired editor Kevin Kelly.

THE web continues to evolve from an entity ruled by mass media and mass audiences to one ruled by messy media and messy participation. How far can this frenzy of creativity go? Encouraged by web-enabled sales, 175,000 books were published and more than 30,000 music albums were released in the US last year. At the same time, 14 million blogs were launched worldwide.

All these numbers are escalating. A simple extrapolation suggests that in the near future everyone alive will (on average) write a song, author a book, make a video, craft a weblog, and code a program. This idea is less outrageous than the notion 150 years ago that some day everyone would write a letter or take a photograph.

What happens when the data flow is asymmetrical - but in favour of creators? What happens when everyone is uploading far more than they download? If everyone is busy making, altering, mixing and mashing, who will have time to sit back and veg out? Who will be a consumer?

No one. And that's just fine. A world in which production outpaces consumption should not be sustainable; that's a lesson from economics 101. But online, where many ideas that don't work in theory succeed in practice, the audience increasingly doesn't matter. What matters is the network of social creation, the community of collaborative interaction that futurist Alvin Toffler called prosumption. As with blogging and BitTorrent, prosumers produce and consume at once. The producers are the audience, the act of making is the act of watching, and every link is both a point of departure and a destination.

But if a roiling mess of participation is all we think the web will become, we are likely to miss the big news, again. The experts are certainly missing it. The Pew Internet & American Life Project surveyed more than 1200 professionals in 2004, asking them to predict the net's next decade. One scenario earned agreement from two-thirds of respondents: "As computing devices become embedded in everything from clothes to appliances to cars to phones, these networked devices will allow greater surveillance by governments and businesses."

Another was affirmed by one-third: "By 2014, use of the internet will increase the size of people's social networks far beyond what has traditionally been the case."

These are safe bets, but they fail to capture the web's disruptive trajectory. The real transformation under way is more akin to what Sun Microsystems' John Gage had in mind in 1988 when he famously said: "The network is the computer."

His phrase sums up the destiny of the web: as the operating system for a megacomputer that encompasses the internet, all its services, all peripheral chips and affiliated devices from scanners to satellites, and the billions of human minds entangled in this global network.

This gargantuan Machine already exists in a primitive form. In the coming decade, it will evolve into an integral extension not only of our senses and bodies, but of our minds.

Today the Machine acts like a very large computer, with top-level functions that operate at about the clock speed of an early PC. It processes 1 million emails each second, which essentially means networking runs at 100 kilohertz, SMS at 1 kilohertz. The Machine's total external RAM is about 200 terabytes. In any one second, 10 terabits can be coursing through its backbone, and each year it generates nearly 20 exabytes of data. Its distributed "chip" spans 1 billion active PCs, which is about the number of transistors in one PC.

This planet-sized computer is comparable in complexity to a human brain. Both the brain and the web have hundreds of billions of neurons, or webpages. Each biological neuron sprouts synaptic links to thousands of other neurons, and each webpage branches into dozens of hyperlinks. That adds up to a trillion "synapses" between the static pages on the web. The human brain has about 100 times that number - but brains are not doubling in size every few years. The Machine is.
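The trillion-"synapse" estimate follows from round numbers. A sketch, with page and link counts as illustrative stand-ins for the essay's "hundreds of billions" of pages and "dozens" of links:

```python
# Reproducing the essay's back-of-envelope "synapse" arithmetic.
# The inputs are illustrative round numbers, not sourced figures.
pages = 100e9            # ~100 billion webpages ("hundreds of billions")
links_per_page = 10      # order-of-magnitude stand-in for "dozens"

web_synapses = pages * links_per_page     # ~1 trillion hyperlinks
brain_synapses = web_synapses * 100       # essay: the brain has ~100x more

print(f"Web 'synapses':  {web_synapses:.0e}")
print(f"Brain synapses:  {brain_synapses:.0e}")
```

The essay's real point is not the static count but the growth rate: the web's link count doubles every few years, while the brain's does not.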

Since each of its "transistors" is itself a personal computer with a billion transistors running lower functions, the Machine is fractal. In total, it harnesses a quintillion transistors, expanding its complexity beyond that of a biological brain. It has already surpassed the 20-petahertz threshold for potential intelligence as calculated by Ray Kurzweil ("Human 2.0", Next 25/10). For this reason some researchers pursuing artificial intelligence have switched their bets to the net as the computer most likely to think first.

Danny Hillis, a computer scientist who once claimed he wanted to make an AI "that would be proud of me", has invented massively parallel supercomputers, in part to advance us in that direction. He now believes the first real AI will emerge not in a stand-alone supercomputer such as IBM's proposed 23-teraflop Blue Brain, but in the vast tangle of the global Machine.

In 10 years the system will contain hundreds of millions of miles of fibre-optic neurons linking the billions of ant-smart chips embedded into manufactured products, buried in environmental sensors, staring out from satellite cameras, guiding cars, and saturating our world with enough complexity to begin to learn. We will live inside this thing.

Today the nascent Machine routes packets around disturbances in its lines; by 2015 it will anticipate disturbances and avoid them. It will have a robust immune system, weeding spam from its trunk lines, eliminating viruses and denial-of-service attacks the moment they are launched, and dissuading malefactors from injuring it again. The patterns of the Machine's internal workings will be so complex they won't be repeatable; you won't always get the same answer to a given question. It will take intuition to maximise what the global network has to offer. The most obvious development birthed by this platform will be the absorption of routine. The Machine will take on anything we do more than twice. It will be the Anticipation Machine.

ONE great advantage the Machine holds in this regard: it's always on. It is very hard to learn if you keep getting turned off, which is the fate of most computers.

AI researchers rejoice when an adaptive learning program runs for days without crashing. The foetal Machine has been running continuously for at least 10 years (30 if you want to be picky). I am aware of no other machine that has run that long with no downtime. Portions may spin down because of power outages or cascading infections, but the entire thing is unlikely to go quiet in the coming decade. It will be the most reliable gadget we have.

And the most universal. By 2015, desktop operating systems will be largely irrelevant. The web will be the only OS worth coding for. It won't matter what device you use, as long as it runs on the web OS. You will reach the same distributed computer whether you log on via phone, PDA, laptop, or HDTV.

By 2015 the '90s image of convergence will turn inside-out. Each device is a differently shaped window that peers into the global computer. Nothing converges. The Machine is an unbounded thing that will take a billion windows to glimpse even part of. It is what you'll see on the other side of any screen.

And who will write the software that makes this contraption useful and productive?

We will. Each of us already does it every day. When we post and then tag pictures on the community photo album Flickr, we are teaching the Machine to give names to images. The thickening links between caption and picture form a neural net that learns.

Think of the 100 billion times a day humans click on a webpage as a way of teaching the Machine what we think is important. Each time we forge a link between words, we teach it an idea. Wikipedia encourages its citizen authors to link each fact in an article to a reference citation. Over time, a Wikipedia article becomes totally underlined in blue as ideas are cross-referenced. That cross-referencing is how brains think and remember. It is how neural nets answer questions. It is how our global skin of neurons will adapt autonomously and acquire a higher level of knowledge.

The human brain has no department full of programming cells that configure the mind. Brain cells program themselves simply by being used. Likewise, our questions program the Machine to answer questions. We think we are merely wasting time when we surf mindlessly or blog an item, but each time we click a link we strengthen a node somewhere in the web OS, thereby programming the Machine by using it.
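The idea above, that every click strengthens a node, resembles Hebbian reinforcement: connections that fire together get stronger. As a toy illustration only (the names and numbers are invented, and no real search engine is being described), the mechanism can be sketched like this:

```python
# Toy sketch of Hebbian-style link strengthening: each click on an
# edge between two pages nudges that edge's weight upward, so the
# associations users traverse most often come to dominate.
# All URLs and tags below are hypothetical.

from collections import defaultdict

weights = defaultdict(float)  # (source, target) -> accumulated strength

def click(source, target, rate=0.1):
    """Each traversal reinforces the link a little."""
    weights[(source, target)] += rate

# Simulate users clicking one association far more often than another
for _ in range(50):
    click("flickr.com/photo123", "tag:sunset")
for _ in range(2):
    click("flickr.com/photo123", "tag:banana")

# The oft-clicked association now carries the most weight
best = max(weights, key=weights.get)
print(best)  # ('flickr.com/photo123', 'tag:sunset')
```

No single click matters much; the learning comes from aggregating billions of them, which is the essay's point about users collectively programming the Machine.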

What will most surprise us is how dependent we will be on what the Machine knows - about us and about what we want to know. We already find it easier to Google something rather than remember it. The more we teach this megacomputer, the more it will assume responsibility for our knowing. It will become our memory. Then it will become our identity. In 2015 many people, when divorced from the Machine, won't feel like themselves - as if they'd had a lobotomy.

There is only one time in the history of each planet when its inhabitants first wire up its parts to make one large Machine. Later that Machine may run faster, but there is only one time when it is born.

You and I are alive at this moment. We should marvel, but people alive at such times usually don't. Every few centuries, the steady march of change meets a discontinuity, and history hinges on that moment. We look back on those pivotal eras and wonder what it would have been like to be alive then. Confucius, Zoroaster, Buddha, and the latter Jewish patriarchs lived in the same historical era, an inflection point known as the axial age of religion. Few world religions were born after this time. Similarly, the great personalities converging upon the American Revolution and the geniuses who commingled during the invention of modern science in the 17th century mark additional axial phases in the short history of our civilisation.

Three thousand years from now, when keen minds review the past, I believe that our ancient time, here at the cusp of the third millennium, will be seen as another such era.

In the years roughly coincidental with the Netscape IPO, humans began animating inert objects with tiny slivers of intelligence, connecting them into a global field, and linking their own minds into a single thing. This will be recognised as the largest, most complex, and most surprising event on the planet. Weaving nerves out of glass and radio waves, our species began wiring up all regions, all processes, all facts and notions into a grand network.

From this embryonic neural net was born a collaborative interface for our civilisation, a sensing, cognitive device with power that exceeded any previous invention. The Machine provided a new way of thinking (perfect search, total recall) and a new mind for an old species. It was the Beginning.

Netscape's float was a puny rocket to herald such a moment. First moments are like that. After the hysteria has died down, after the millions of dollars have been made and lost, the strands of mind, once achingly isolated, come together.

Today, our Machine is born. It is on.

They couldn't have done it without you

The total number of webpages now exceeds 600 billion. That's 100 pages per person alive.

In fewer than 4000 days, we have encoded half a trillion versions of our collective story and put them in front of 1 billion people, or one-sixth of the world's population. That remarkable achievement was not in anyone's 10-year plan.

The accretion of tiny marvels can numb us to the arrival of the stupendous. Today, at any net terminal, you can get: an amazing variety of music and video, an evolving encyclopedia, weather forecasts, help-wanted ads, satellite images of any place on earth - just to name a few applications - all wrapped up in an interactive index that really works.

This view is spookily godlike. You can switch your gaze on a spot in the world from map to satellite to 3-D just by clicking. Recall the past? It's there. Or listen to the daily complaints and travails of almost anyone who blogs (and doesn't everyone?). Ten years ago you would have been told there wasn't enough money in all the investment firms in the world to fund such a cornucopia. The success of the web at this scale was impossible. But if we have learned anything in the past decade, it is the plausibility of the impossible.

In about 4000 days, eBay has gone from a marginal experiment in community markets in the San Francisco Bay area to the most profitable spin-off of hypertext. At any one moment, 50 million auctions race through the site.

What we all failed to see was how much of this new world would be manufactured by users, not corporate interests. Amazon.com customers rushed with surprising speed and intelligence to write the reviews that made the site useable. Owners of Adobe, Apple and most major software products offer help and advice on the developer's forum web pages. And in the greatest leverage of the common user, Google turns traffic and link patterns generated by 2 billion searches a month into the organising intelligence for a new economy.

No web phenomenon is more confounding than blogging. Everything media experts knew about audiences - and they knew a lot - confirmed the focus group belief that audiences would never get off their butts and start making their own entertainment.

What a shock, then, to witness the near-instantaneous rise of 50 million blogs, with a new one appearing every two seconds. These user-created channels make no sense economically. Where are the time, energy and resources coming from?

The audience.

I run a blog about cool tools. The web extends my passion to a far wider group for no extra cost or effort. My site is part of a vast and growing gift economy, a visible underground of valuable creations - free on inquiry. This spurs the grateful to reciprocate. It permits easy modification and re-use, and thus promotes consumers into producers.

The electricity of participation nudges ordinary folk to invest huge hunks of energy and time into making free encyclopedias or creating public tutorials for changing a flat tyre. A study found that only 40 per cent of the web is commercial. The rest runs on duty or passion.

This follows the industrial age, by the way, when mass-produced goods outclassed anything you could make yourself. The impulse for participation has up-ended the economy and is steadily turning the sphere of social networking into the main event.

Once, we, the public, just about only downloaded. Today, the poster child of the new internet regime is BitTorrent, under which users upload stuff while they are downloading. It assumes participation.

And the web embeds itself into every class, occupation and region. Everyone missed the 2002 flip-point when women online suddenly outnumbered men. The average user is now a 41-year-old woman.

What could be a better mark of irreversible acceptance than adoption by the technology-reluctant American rural sect, the Amish?

On a visit recently, I was amazed to hear some Amish farmers mention their websites.

"Amish websites?" I asked.

"For advertising our family business. We weld barbecue grills in our shop."

"Yes, but . . ."

"Oh, we use the internet terminal at the public library. And Yahoo!"

I knew then the battle was over.

Back to the future

Computing pioneer Vannevar Bush outlined the web's core idea - hyperlinked pages - in 1945, but the first person to try to build on the concept was a freethinker named Ted Nelson, who in 1965 envisioned his own scheme, which he called "Xanadu". But he had little success connecting digital bits on a useful scale and his efforts were known only to an isolated group of disciples. Few of the hackers writing code for the emerging web in the 1990s knew about Nelson or his hyperlinked dream machine.

At the suggestion of a computer-savvy friend, I got in touch with Nelson in 1984, a decade before Netscape made Marc Andreessen a millionaire. We met in a dark dockside bar in Sausalito, California. Folded notes erupted from his pockets, and long strips of paper slipped from overstuffed notebooks. He told me about his scheme for organising all the knowledge of humanity. Salvation lay in cutting up 3 x 5 cards, of which he had plenty.

Legend has it that Ted Nelson invented Xanadu as a remedy for his poor memory and attention deficit disorder. He was certain that every document in the world should be a footnote to some other document, and computers could make the (hyper)links between them visible and permanent. He sketched out complicated notions of transferring authorship back to creators and tracking payments as readers hopped along networks of documents, what he called the docuverse. He spoke of "transclusion" and "intertwingularity" as he described the grand utopian benefits of his embedded structure.

It was clear to me a hyperlinked world was inevitable. But what surprises me is how much was missing from Vannevar Bush's vision, Nelson's docuverse, and my own expectations. The web revolution heralded a new kind of participation that is developing into a culture based on sharing.

By ignoring the web's reality, we are likely to miss what it will grow into over the next 10 years. Any hope of discerning the state of the web in 2015 requires that we own up to how wrong we were 10 years ago.
http://www.smh.com.au/news/next/unto...e#contentSwap1
Until next week,

- js.
Current Week In Review

Recent WiRs -

November 12th, November 5th, October 29th, October 22nd

Jack Spratt's Week In Review is published every Friday. Please submit letters, articles, and press releases in plain text English to jackspratts (at) lycos (dot) com. Include contact info. Submission deadlines are Wednesdays @ 1700 UTC.

"The First Amendment rests on the assumption that the widest possible dissemination of information from diverse and antagonistic sources is essential to the welfare of the public."
- Hugo Black