Peer-To-Peer News - The Week In Review - August 13th, '11

Since 2002

"Egyptians are showing solidarity with Americans in the BART cell shutdown." – AnonyOps

August 13th, 2011

200,000 BitTorrent Users Sued In The United States
Ernesto

The avalanche of mass lawsuits in the United States targeting BitTorrent users has reached a new milestone. Since last year, more than 200,000 people have been sued for allegedly sharing copyrighted material online, and the number continues to grow at a rapid pace. Added up, the potential profit from the so-called pay-up-or-else scheme runs into the hundreds of millions of dollars.

Mass file-sharing lawsuits have been filed all across the United States in recent months, almost exclusively targeting BitTorrent users. Copyright holders have embraced this new revenue stream by the dozen and new lawsuits are being filed every week.

The United States judicial system is currently being overloaded with new cases, and a few days ago the number of targeted Internet subscribers in federal courts broke the 200,000 barrier.

Through these mass lawsuits the copyright holders are trying to obtain the personal details of (mostly) BitTorrent users who allegedly shared their material online. Once this information is handed over, they then offer the defendant the opportunity to settle the case for a few hundred up to a couple of thousand dollars, thereby avoiding a full trial and potentially even bigger financial penalties.

A fairly exhaustive spreadsheet shows that the number of Does sued since the beginning of 2010 currently stands at 201,828. Nearly all of the defendants are accused of sharing copyrighted files via BitTorrent; 1,237 allegedly used the eD2k network.

Over the course of the year several cases have been dismissed or settled, and the estimated number of defendants still at risk stands at 145,417.

Most defendants are being sued in the high-profile case brought by the makers of The Hurt Locker. As of May this year the lawsuit targeted 24,583 alleged BitTorrent users, and the first batch of settlement letters has been sent out to the people who pay for the allegedly-infringing Internet connections.

Despite the massive number of defendants, none of the cases has made it to the full jury trial that the copyright holders ask for in their original complaints. This also means that the evidence they claim to hold has never been properly tested.

It is believed that a significant number of the people accused in these cases are not the actual infringers. However, since the copyright holders prefer settlements over full trials, and because defendants don’t want to risk a $150,000 fine, the accuracy of the evidence remains a mystery.

What’s very clear is that for the copyright holders, tracking companies and lawyers, the settlement scheme is extremely profitable. If half of the original defendants eventually settle for an average fee of $2,500 they would generate a quarter billion dollars in revenue – from piracy.
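(Worked through from the article’s own figures: 201,828 defendants ÷ 2 ≈ 100,914 settlements × $2,500 ≈ $252 million.)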
https://torrentfreak.com/200000-bitt...states-110808/





High Court Agrees to Hear iiTrial

The High Court of Australia this morning granted film and television studios the right to appeal against the decision made earlier in the year in the case against Australian ISP iiNet.

The approval means the studios will have the opportunity to argue against the interpretation of the laws concerning copyright infringement by authorisation, rather than the facts of the case made against iiNet.

iiNet CEO Michael Malone said he wasn’t surprised by today’s decision, but called on the industry to come to a “workable” solution to piracy problems. “I know the Internet industry is eager to work with the film industry and copyright holders to develop a workable solution,” Malone said. “We remain committed to developing an industry solution that sees more content readily and cheaply available online as well as a sensible model for dealing with repeated copyright infringement activity.”

A group of 34 parties, comprising most major Australian and American film studios, took iiNet to court in 2009, claiming iiNet had “authorised” its users to download pirated movies and television over the Internet. After the original ruling in iiNet’s favour, made in February last year, was appealed, the Australian Federal Court ruled in February this year that iiNet was not responsible for the illegal downloading.

However, the parties lined up against iiNet decided to appeal the decision yet again, lodging an application with the High Court of Australia on the 24th of March for special leave to appeal.

The grant means that within the next 14 days a notice of appeal must be filed with the High Court and sent to iiNet, otherwise the leave grant will lapse. iiNet is given the option to cross-appeal up to seven days after receiving the notice from the studios. The number of legal procedures that must be undertaken, together with court sitting times, means it’s unlikely the hearing will occur before the 25th of October.

It could, however, stretch out until early December, with the next available sitting times after the 3rd of November not coming until the 29th of November. "We will continue to defend our position in these proceedings if necessary. I remain convinced that a genuine industry-wide solution is a better outcome for all concerned and I’m hopeful it will be developed," Malone said.
http://delimiter.com.au/2011/08/12/h...-hear-iitrial/





People Not Prepared for Copyright Law – Greens

People have not had enough warning that their file-sharing activities will be monitored from tomorrow, ahead of a new law coming into force next month, the Green Party says.

The Copyright (Infringing File Sharing) Amendment Act, designed to prevent illegal file sharing by internet users, was passed under urgency in April.

Under the law, copyright owners can send evidence of alleged infringements to internet service providers (ISPs), which would then send up to three infringement notices to the internet account holder.

If the warnings are ignored, the copyright owner can take a claim to the Copyright Tribunal and the tribunal can make awards of up to $15,000 against the account holder.

The law does not come into effect until September 1, but copyright holders will begin collecting data from tomorrow and any breaches found will count toward the three strikes.

In Parliament today, Green MP Gareth Hughes raised concerns about the work that had been done to prepare the public for the law, asking why the Government's information pack would not be on the Ministry of Economic Development's website until next week.

Commerce Minister Simon Power noted a number of organisations that had publicised the law and said the Government information pack would be available from next Wednesday.

"In addition, I note that the new measures have received a lot of coverage in the media, and I have made several press releases - actually, 10 - during the course of the development of the legislation," Mr Power said.

"There are no surprises about what will emerge on 1 September."

However, Mr Hughes said the media attention had largely focussed on MPs' lack of knowledge about file sharing rather than on the particulars of the new law.

He said news that data would be collected from tomorrow would surprise most people.

"The fact is the law was passed under urgency, the regulations were announced only six months before the legislation became live, it isn't much time for ISPs or the public to get ready for the legislation," Mr Hughes told NZPA.

"I also think the Government hasn't been making a big public promotion about it because, quite frankly, there has been quite a significant backlash against the legislation."

The lack of information provided to high schools and universities was also worrying, Mr Hughes said.

"Students are pretty skilled at getting round blocks that schools have in place now to use Facebook, they're likely to still be accessing peer-to-peer file sharing sites, and the Government should be providing advice and support to the Ministry of Education."

Parliament also risked fines or having its internet disconnected, and Mr Hughes said he had written to Speaker Lockwood Smith to clarify whether there was a plan in place to educate Parliamentary Services' internet users about the law.
http://www.nzherald.co.nz/connect/ne...ectid=10744240





8 Technical Methods That Make the PROTECT IP Act Useless
Drew Wilson

We’ve been running a series of guides that show just how easy it is to bypass general DNS censorship – the kind of censorship that has been proposed in the PROTECT IP Act, among other things. Rather than simply debate philosophically why the PROTECT IP Act will do absolutely nothing to deter copyright infringement, we decided to go one better and prove it instead.

Hiding your IP address, using a proxy, using the onion router and obtaining the IP address of a website so you won’t have to rely on a public DNS server – these can seem like very intimidating tasks for the unprepared. To be honest, when I first chose to try and figure them out, it seemed intimidating even to me – especially given that I don’t really even make use of proxy servers (or do any of the above, for that matter). So really, I felt that I could relate to a number of moderately informed users on these topics.

Certainly, being able to remain anonymous online is something that can benefit many people – especially those who are marginalized by their own government in various ways – but I personally never felt that motivated to use any of these tools, as they seemed an unnecessary layer of security when I simply browse news articles and listen to Creative Commons music, among other things. So the vast majority of the guides I’ve written over the last few weeks have been quite a learning experience for me.

The PROTECT IP Act has given me the motivation to figure out how all of these methods work, mainly due to the arbitrary nature of it all. If Hollywood doesn’t like that fan edit of a short clip, they can make the whole website disappear. If the RIAA thinks a site like SoundClick doesn’t need to be seen by anyone else, they can erase easy access to it almost with the snap of their fingers. So, how does the PROTECT IP Act work? Just look at the following from Wikipedia’s entry:

Quote:
The Protect IP Act says that an “information location tool shall take technically feasible and reasonable measures, as expeditiously as possible, to remove or disable access to the Internet site associated with the domain name set forth in the order”. In addition, it must delete all hyperlinks to the offending “Internet site”.

At a technical level domain name servers would be ordered to blacklist the suspected websites. Although the websites would remain reachable by IP address, links directing to them would be broken.[9] Also search engines—such as the already protesting Google—would be ordered to remove links in their index of the web of an allegedly infringing website. Furthermore, copyright holders themselves would be able to apply for court injunctions to have sites’ domains blacklisted.
To me, the scarier part is the fact that DNS servers would be affected by this. Forget search engines censoring websites based on copyright complaints; that has been happening for years through the DMCA. What I was more concerned about was the DNS servers, because censorship there would affect every internet user who uses a given server. So really, the taller order was figuring out how to make DNS censorship useless.

What struck me when writing these guides was just how easy some of these methods really are. In some instances, the only way to make defeating such censorship easier would be a really big red button on the side of your computer that you could press to make DNS censorship go away. As such, I am convinced at this point that the PROTECT IP Act will do absolutely nothing to curb copyright infringement. Sure, it’ll hamper free speech, sure it’s probably unconstitutional, sure it’s politically unsound, sure it’ll probably hurt small and medium business, sure it’s probably anti-competitive, sure it’ll probably cause some security headaches, but stopping copyright infringement? Not by a long shot. Not with the methods I found for circumventing such censorship, anyway.

So, without further ado, the list including pros and cons of each (each method links to a corresponding guide we wrote):

1. Using a VPN Service

Quick Explanation:
A security tunnel that protects your data as it travels from your computer to the VPN server before letting it out onto the internet. As long as that VPN service is outside the United States, it’ll be very difficult to stop users from using such services to circumvent DNS censorship.

Pros:

Very good security benefits. For the most part, it’s reliable. Plenty of technical support to go around depending on which VPN service you choose. Access pretty much everything on the internet. Very good for privacy.

Cons:
Costs money. May include bandwidth caps. Reliability of service isn’t consistent for every VPN service (though frontrunners are generally easier to spot in terms of reliability). Reportedly, you may need to install software you aren’t completely familiar with (depends on which service is being used).

2. Using Your HOSTs File

Quick Explanation:

For most users, there is actually a HOSTs file on their computer that can be used to map domain names to server IP addresses without the use of a public DNS server. If a website is censored through a DNS server, one can simply use the HOSTs file so that the public DNS server isn’t even consulted in the first place. You just type the domain name into your address bar and the website will still appear.

Pros:

Completely removes the need to use a public DNS server when accessing specific websites. Prevents links from breaking due to DNS censorship. Enables you to have greater power over how you view webpages. No installation or downloading of software.

Cons:

Requires maintenance. Not always easy to find in your system (solved by our guide). May raise security issues on a LAN with multiple users (though it’s difficult to see how in a number of cases, since one can also use the HOSTs file to increase security for others). You also need to find accurate IP addresses in the first place (solved by two other guides in this list). A side benefit: the HOSTs file is also an effective way of blocking ads on the web (hint: use 127.0.0.1 for domains that deliver ads).
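To illustrate, a HOSTs file entry is simply an IP address followed by the domain name it should resolve to, one pair per line. The entries below are a hypothetical sketch – both the addresses and the domain names are placeholders, not real records:

93.184.216.34    some-censored-site.example
127.0.0.1        ads.adnetwork.example

The first line routes around a DNS-level block by hard-coding the site’s address; the second uses the loopback trick mentioned above to blackhole an ad-serving domain.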

3. Using TOR

Quick Explanation:

TOR is more or less a network of proxies. One person accesses a proxy, and that proxy forwards the traffic to another proxy, trying to erase the user’s tracks. That proxy sends the stream to yet another proxy, and the stream keeps going through these steps until it finally reaches what is known as an “exit node”. That exit node then accesses the internet on the user’s behalf and acts as an intermediary in the process. As long as that exit node sits outside of the US, there is a very good chance it won’t be affected by DNS censorship imposed by ISPs on their DNS servers.

Pros:

Added bonus of a very strong layer of anonymity (not a 100% guarantee of anonymity, of course, but close enough). An interesting way of seeing the internet through the eyes of someone not in your country.

Cons:

You might not be able to get everything you want from the internet through this network (there may be ways of making things not break, but not without the risk of compromised security). Requires downloading software to run (though installation is minimal).
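As a rough illustration of the layering idea, here is a minimal Python sketch of onion-style wrapping. To be clear, this is a toy model and not the actual TOR protocol: it assumes the third-party "cryptography" package, the relay names are made up, and real TOR negotiates a separate key with each relay rather than sharing a key table:

from cryptography.fernet import Fernet

# One symmetric key per relay; made-up names for a three-hop toy circuit.
relays = {name: Fernet(Fernet.generate_key()) for name in ("entry", "middle", "exit")}

def wrap(message, path):
    # The client encrypts in reverse order, so each relay peels exactly one layer.
    for name in reversed(path):
        message = relays[name].encrypt(message)
    return message

def route(onion, path):
    # Each relay strips its own layer; only the exit node sees the request.
    for name in path:
        onion = relays[name].decrypt(onion)
    return onion

path = ["entry", "middle", "exit"]
print(route(wrap(b"GET http://blocked.example/", path), path))

The point of the nesting is that the entry node learns who you are but not what you asked for, while the exit node sees the request but not who sent it.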

4. Using a Web DNS Tool

Quick Explanation

Just by using publicly available DNS look-up tools, one can easily obtain server IP addresses for later use. If a domain is censored, one can simply replace the domain name part of the URL with the IP address and still access the website.

Pros:

Potentially obtain multiple IP addresses for later use. Free. Obtain the addresses once and you don’t have to worry about losing access to the site for as long as the server IP address remains the same.

Cons:

Ideally, the IP addresses should be obtained before the site is actually censored (there may be a brief window between when the domain is censored and when DNS records are updated, but there’s no telling how long that window lasts). If the website obtains a new server and changes all of its IP addresses and you don’t have the new addresses, then you could lose the ability to use the website. There’s no guarantee this will always be an option should ISPs start blocking IP addresses as well.
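As a quick aside, the same look-up can be done locally with a few lines of Python using only the standard library – the domain below is just a placeholder:

import socket

# Ask the resolver for a site's addresses and print them for safekeeping.
hostname, aliases, addresses = socket.gethostbyname_ex("example.com")
print(hostname, addresses)

Run something like this before a domain is censored and note the addresses down; afterwards they can go straight into the address bar or into a HOSTs file.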

5. Changing Your DNS Server

Quick Explanation:

Since we are talking about censoring DNS servers in the US, one can always just use a DNS server overseas (like the ones used by overseas ISPs). By changing your DNS server, you are no longer relying on a server that could be censored by the US government and/or corporate interests.

Pros:

No installation or downloading of additional software (everything you need should be on your computer already). Just a few menu clicks away. Can always be changed again at a later time without too much hassle.

Cons:

Can be a security risk to your computer if not done properly. Difficult to obtain DNS server IP addresses that are guaranteed to be available for the foreseeable future. No guarantee that ISPs won’t start blocking this type of activity.
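For concreteness: well-known public resolvers include Google Public DNS (8.8.8.8 and 8.8.4.4) and OpenDNS (208.67.222.222 and 208.67.220.220). Both are US-operated, though, so for the purposes of this method you would want the address of a comparable resolver run from overseas.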

6. Using Command Prompt

Quick Explanation:

In Windows at least, one can open up the command prompt (explained in the tutorial), type “ping [insert domain name here]” and obtain a server IP address for later use.

Pros:

No installation or downloading of any software (use what you already have on your computer). Probably the fastest way to shield yourself from censorship. Only one command is technically necessary before you get what you are after.

Cons:

Obtaining this information through the command prompt must be done before the domain is censored. Only one IP address can be obtained this way. If the website changes its server’s IP address, you’ll lose access to the site unless you have the new one as well.
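For example, on Windows the whole exchange looks something like this (the domain and address are placeholders):

C:\> ping example.com

Pinging example.com [93.184.216.34] with 32 bytes of data:
Reply from 93.184.216.34: bytes=32 time=90ms TTL=52

The address in square brackets is the one to note down for later use.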

7. Using Foxy Proxy

Quick Explanation:

It’s a simple plug-in for Firefox you can download and install. After getting a nice list of simple proxies that reside outside of the US, you have a better chance of accessing a website that has been censored by the US government and/or corporate interests.

Pros:

Easy to install. Being able to access censored websites can be merely a click away. A fast fix with minimal effort if you have access to a decent-sized list of proxies (provided in the guide).

Cons:

Reliability is not guaranteed. Technically it’s not that secure, since you are relying on a single proxy. Not usable for all kinds of web traffic. Confined to Firefox.

8. Using MAFIAAFire

Quick Explanation:

A simple plug-in for Firefox (or Chrome) you can download and install. If a website has had its domain seized, you can be redirected to an alternate domain and still access the website.

Pros:

Easy to install. Is maintained for you through updates.

Cons:

Uses DNS servers that can be censored. Depends on an alternate domain name being available in the first place (if an alternate domain doesn’t exist, then the site might not be accessible in this fashion). Technically, the update mechanism itself could be censored as well, blocking all possible updates.

Final Thoughts

This list is by no means comprehensive. Still, I think some of these methods go well beyond circumventing the kind of censorship proposed in the PROTECT IP Act.

It’ll be interesting to see how services respond, both those that support internet censorship and those that are against it. I have a feeling it will be extremely difficult to stop these already-existing methods of defeating DNS censorship. If, say, ISPs find a way to stop all of the above, a combination of some of the above, or any enhancements to any of the above, I’ll be very impressed. Good luck to the ISPs on stopping this; they are going to need it.
http://www.zeropaid.com/news/95013/8...p-act-useless/





New Anti-Censorship Scheme Could Make it Impossible to Block Individual Sites
Nicole Casal Moore

A radical new approach to thwarting Internet censorship would essentially turn the whole web into a proxy server, making it virtually impossible for a censoring government to block individual sites.

The system is called Telex, and it is the brainchild of computer science researchers at the University of Michigan and the University of Waterloo in Canada. They will present it Aug. 12 at the USENIX Security Symposium in San Francisco.

"This has the potential to shift the arms race regarding censorship to be in favor of free and open communication," said J. Alex Halderman, assistant professor of computer science and engineering at U-M and one of Telex's developers.

"The Internet has the ability to catalyze change by empowering people through information and communication services. Repressive governments have responded by aggressively filtering it. If we can find ways to keep those channels open, we can give more people the ability to take part in free speech and access to information."

Today's typical anticensorship schemes get users around site blocks by routing them through an outside server called a proxy. But the censor can monitor the content of traffic on the whole network, and eventually finds and blocks the proxy, too.

"It creates a kind of cat and mouse game," said Halderman, who was at the blackboard explaining this to his computer and network security class when it hit him that there might be a different approach---a bigger way to think about the problem.

Here's how Telex would work:

Users install Telex software. Halderman envisions they could download it from an intermittently available website or borrow a copy from a friend.

Internet Service Providers (ISPs) outside the censoring nation deploy equipment called Telex stations.

When a user wants to visit a blacklisted site, he or she would establish a secure connection to an HTTPS website, which could be any password-protected site that isn't blocked. This is a decoy connection. The Telex software marks the connection as a Telex request by inserting a secret-coded tag into the page headers. The tag utilizes a cryptographic technique called "public-key steganography."

"Steganography is hiding the fact that you're sending a message at all," Halderman said. "We're able to hide it in the cryptographic protocol so that you can't even tell that the message is there."

The user's request passes through routers at various ISPs, some of which would be Telex stations. These stations would hold a private key that lets them recognize tagged connections from Telex clients. The stations would divert the connections so that the user could get to any site on the Internet.
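To make the tagging idea concrete, here is a toy Python sketch of the general flavor of such a scheme – a Diffie-Hellman-style tag that looks random to anyone without the station's private key. This is an illustration of the concept with deliberately toy parameters, not the actual construction from the Telex paper:

import hashlib
import secrets

# Toy group parameters; a real system would use a vetted elliptic curve.
p = 2**127 - 1   # a Mersenne prime, far too small for real security
g = 3

k = secrets.randbelow(p - 2) + 2    # the Telex station's private key
station_pub = pow(g, k, p)          # public key, shipped with the client

def client_tag():
    # Client: embed (g^x, H(pub^x)) where random handshake values normally go.
    # Without k, both values are indistinguishable from random noise.
    x = secrets.randbelow(p - 2) + 2
    shared = pow(station_pub, x, p)
    return pow(g, x, p), hashlib.sha256(str(shared).encode()).digest()[:16]

def station_check(nonce, proof):
    # Station: use the private key k to recompute the shared value and
    # recognize tagged connections that should be diverted.
    shared = pow(nonce, k, p)
    return hashlib.sha256(str(shared).encode()).digest()[:16] == proof

nonce, proof = client_tag()
print(station_check(nonce, proof))   # True -> divert this connection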

Under this system, large segments of the Internet would need to be involved through participating ISPs.

"It would likely require support from nations that are friendly to the cause of a free and open Internet," Halderman said. "The problem with any one company doing this, for example, is they become a target. It's a collective action problem. You want to do it on a wide scale that makes connecting to the Internet almost an all or nothing proposition for the repressive state."

The researchers are at the proof-of-concept stage. They've developed software for researchers to experiment with. They've put up one Telex station on a mock ISP in their lab. They've been using it for their daily web browsing for the past four months and have tested it with a client in Beijing who was able to stream YouTube videos even though the site is blocked there.


The paper to be presented at USENIX Security is called "Telex: Anticensorship in the Network Infrastructure." Full text is at https://telex.cc/paper.html

J. Alex Halderman: https://jhalderm.com/

J. Alex Halderman's blog, Freedom to Tinker: https://freedom-to-tinker.com/blog/jhalderm

Telex website: https://telex.cc/

The University of Michigan College of Engineering is ranked among the top engineering schools in the country. At $180 million annually, its engineering research budget is one of the largest of any public university. Michigan Engineering is home to 11 academic departments, numerous research centers and expansive entrepreneurial programs. The college plays a leading role in the Michigan Memorial Phoenix Energy Institute and hosts the world-class Lurie Nanofabrication Facility. Michigan Engineering's premier scholarship, international scale and multidisciplinary scope combine to create The Michigan Difference. Find out more at http://www.engin.umich.edu/
http://www.eurekalert.org/pub_releas...-nas081011.php





Cameron Threatens to Shut Down UK Social Networks
Stewart Meagher

In a move worthy of China's communist regime, UK PM David Cameron wants to shut down social networks whenever civil unrest rears its head in Britain's towns and cities.

Speaking in the House of Commons, Cameron said, "Everyone watching these horrific actions will be struck by how they were organised via social media.

"Free flow of information can be used for good. But it can also be used for ill. So we are working with the police, the intelligence services and industry to look at whether it would be right to stop people communicating via these websites and services when we know they are plotting violence, disorder and criminality."

The Old Etonian didn't give any clue as to how he intends to block use of the likes of Twitter, Facebook and BlackBerry Messenger - which have all been implicated in the mob's ability to stay 17 steps ahead of the cops, who turn up hours after the nation's shops and businesses have been picked clean by gangs of feral teenagers - but the only way we can see it working is if the entire cellular network is turned off in affected areas.

That might stop the hooligans rampaging about without hindrance but it might also stop people calling to let the authorities know that... oh we don't know... 'my shop/house/80-year-old mother-in-law is on fire!'

We get the distinct feeling he might not have thought this one through.

Home Secretary Theresa May is apparently holding meetings with Facebook, RIM and Twitter's UK outposts to discuss installing a kill switch at Number 10, but tech-savvy teens will soon switch to another service once they discover the three major social network players are working for the man.

Keith Vaz says he will hold an enquiry into the unrest when MPs have had their holidays, which is really useful, as the majority of the rioters will be back at school nicking the weedy kids' lunch money by then.
http://www.thinq.co.uk/2011/8/11/cam...cial-networks/





S.F. Subway Muzzles Cell Service During Protest
Elinor Mills

The operators of the Bay Area Rapid Transit subway system temporarily shut down cell service last night in four downtown San Francisco stations to interfere with a protest over a shooting by a BART police officer, a spokesman for the system said today.

"BART staff or contractors shut down power to the nodes and alerted the cell carriers," James Allison, deputy chief communications officer for BART, told CNET. The move was "one of many tactics to ensure the safety of everyone on the platform," he said in an initial statement provided to CNET earlier this afternoon.

Activists had planned to protest the fatal shooting of Charles Blair Hill, who BART police said went after them with a knife before an officer shot him on July 3.

"Organizers planning to disrupt BART service...stated they would use mobile devices to coordinate their disruptive activities and communicate about the location and number of BART police," said the original BART statement. "A civil disturbance during commute times at busy downtown San Francisco stations could lead to platform overcrowding and unsafe conditions for BART customers, employees, and demonstrators."

The initial statement from BART said the subway system had asked the wireless carriers to suspend the service in the stations, but Allison later said BART itself pulled the plug and notified the providers after the fact.

A protest about a week after the shooting had disrupted service on BART. The move also follows reports that rioters in London were using BlackBerry devices to communicate with each other earlier this week. After someone at BlackBerry maker Research In Motion tweeted: "We have engaged with the authorities to assist in any way we can," the BlackBerry blog was defaced by hackers. One Parliament member suggested that BlackBerry service should be suspended in light of its heavy use during the unrest, but so far the service has remained available. Mobile devices and services like Twitter have also played a key role in protests throughout the Middle East in recent years.

Hackers were calling for action against BART in retaliation for the cell service disruption. The Anonymous group of online activists started promoting Operation BART on Twitter, with one profile saying: "We are going to show BART (@SFBART) how to prevent a riot #OpBART." It was unclear exactly what actions the group would take. Meanwhile, they also released a digital flyer with the headline "muBARTek," a reference to former Egyptian President Hosni Mubarak, who was ousted after demonstrations earlier this year. One Anonymous Twitter profile claimed that "Egyptians are showing solidarity with Americans in the BART cell shutdown."

The BART statement about last night's event said "Cell phone service was not interrupted outside BART stations. In addition, numerous BART police officers and other BART personnel were present during the planned protest, and train intercoms and white courtesy telephones remained available for customers seeking assistance or reporting suspicious activity."

It's unclear exactly what effect the suspension of cell service had on the protest. The San Francisco Chronicle and KTVU, among others, reported that it appeared the protest failed to happen as planned.

Cell service was suspended from about 4 p.m. to 7 p.m. PT in Embarcadero, Montgomery Street, Powell Street, and Civic Center BART stations, BART's Allison said.

Sprint, Verizon, AT&T, and T-Mobile all provide service in the Transbay Tube, according to BART. The Tube runs beneath the San Francisco Bay, connecting San Francisco to Oakland, Berkeley, and other East Bay cities.

Asked to comment on whether the wireless shutdown was lawful, Allison told CNET: "We are well within our legal rights."

In a blog post, the ACLU of Northern California said this: "Shutting down access to mobile phones is the wrong response to political protests, whether it's halfway around the world or right here in San Francisco. You have the right to speak out. Both the California Constitution and the First Amendment to the United States Constitution protect your right to free expression."

The BART statement addressed free speech: "BART's primary purpose is to provide safe, secure, efficient, reliable, and clean transportation services," it said. "BART accommodates expressive activities that are constitutionally protected by the First Amendment to the United States Constitution and the Liberty of Speech Clause of the California Constitution (expressive activity), and has made available certain areas of its property for expressive activity...."

"Paid areas of BART stations are reserved for ticketed passengers who are boarding, exiting, or waiting for BART cars and trains, or for authorized BART personnel," the statement said. "No person shall conduct or participate in assemblies or demonstrations or engage in other expressive activities in the paid areas of BART stations, including BART cars and trains and BART station platforms."

Protesters are angry over what they say is excessive use of force after the death of Hill, and of another man in 2009. A BART officer fired three shots at Hill, a 45-year-old transient, after Hill allegedly threw a bottle at officers and waved a four-inch knife at them. That followed a highly publicized fatal shooting on January 1, 2009, in which a BART officer shot Oscar Grant in the back as he lay on the ground unarmed and restrained. Video from cell phones and cameras went viral and fanned anti-BART resentment and protests. The officer was found guilty of involuntary manslaughter after claiming he meant to fire his Taser instead of his gun, and he served a two-year sentence.

AT&T spokesman Mark Siegel said, in regard to the wireless shutdown, "We have no comment on this. Suggest you speak with BART." A T-Mobile representative said the company had no comment and referred questions to BART.

Representatives from Sprint and Verizon either did not immediately respond to calls and e-mails seeking comment or said they were looking into the matter.

Updated 3:05 p.m. PT with info on Anonymous' anti-BART campaign, 2:30 p.m. with "no comment" from AT&T, 2:25 p.m. with BART saying the subway system--and not the carriers--had cut cell service, 2:15 p.m. with info on the BlackBerry's role in the London riots, ACLU comment, details on Hill shooting, and background on Oscar Grant death, 6:06 p.m. with "no comment" from T-Mobile and reports that say protest appeared not to happen.
http://news.cnet.com/8301-27080_3-20...uring-protest/





Apple Uses Courts to Buy Time to Secure iPad's Market Share
Rachel Armstrong

Apple Inc's latest victory in its intellectual property battle with Samsung Electronics is a step forward in its broader strategy of using the courts to help cement the unassailable lead its iPad has in the tablet market.

The technology giant has just won an injunction in a German court that temporarily bans Samsung from selling its flagship Galaxy tablet in most of the European Union, having won a similar ruling in Australia last week.

These injunctions are only preliminary measures, and to make any bans permanent Apple will have to provide more substantial evidence in subsequent court cases that the Galaxy infringed its patents or copied its designs.

Such cases can take months if not years to come to court -- assuming there's no settlement first -- and if Apple loses it will be liable for the business lost by Samsung in the meantime.

"Apple has a strategy of filing patents, getting some protection and trying to prevent other people from entering the market in the short-term," said Nathan Mattock, an intellectual property lawyer at Marque Lawyers in Sydney.

"If Apple's wrong it will have to pay Samsung a considerable amount of damages, so it's potentially quite risky."

Time Is Precious

But while risky, technology experts say pursuing this kind of strategy is worth it for Apple in terms of the time it buys the iPad to try and win an even greater market share.

"It's a market that's developing very fast which Apple have the lead in, so regardless of the damages they have to pay if they lose, the longer they can hold off competition the better for their business," said Andrew Milroy, vice president of information and communication technology research at consultancy Frost & Sullivan in Singapore.

In Australia, Samsung has agreed to show Apple an Australian version of the Galaxy Tab 10.1 one week before its launch there, a Samsung spokesman said.

In the first quarter of 2011, Apple's iPad accounted for 66 percent of the global tablet market, according to market researcher IDC. However, the stream of new products coming onto the market means that share is expected to drop to around 58 percent by the end of the year.

Technology experts say Apple is using the courts in order to try and stop that slide.

"Using the courts is increasingly becoming part of commercial strategy in high growth markets where the opportunities are great -- it's a tactic to try and slow the competition down by whatever means you can," said Frost & Sullivan's Milroy.

Going down this route in German courts is particularly effective as it's easier to win a preliminary injunction forcing a company to remove its products from the market straight away than it is in the United States.

Florian Mueller, who writes the software intellectual property blog FOSS Patents, said that these injunctions require evidence the products in question are causing harm to the right holder's business but "not the more complex kind of hardship and public interest analysis that is performed in the United States."

Samsung To Strike Back

The competition, however, is likely to strike back. Legal experts say Samsung will be preparing a multi-pronged case which will likely force Apple into some kind of settlement allowing it back into the market.

"Samsung's case will be a combination of 'your patent's not valid, even if it is valid its scope is very narrow and we're not infringing it anyway, plus by the way you're infringing our patent as well'," said Kimberlee Weatherall, associate director of the Intellectual Property Research Institute of Australia.

"It's posturing with a view to reaching some sort of settlement -- the stronger the position Samsung can put itself in with those multiple levels of argument the more favorable the settlement is likely to be," she added.

It's not just Samsung that Apple's big name IP lawyers, including Freehills in Australia and Morrison & Foerster in the United States, have in their sights.

The company is also involved in legal action with Taiwan's HTC and Motorola Inc, alleging patent infringements by their smartphones.

Google Battle

All of these rivals to Apple use Google Inc's Android platform, and the legal action prompted a stinging attack from Google's legal chief last week.

"They (Apple) want to make it harder for manufacturers to sell Android devices," Google's David Drummond wrote in a blog entry.

"Instead of competing by building new features or devises, they are fighting through litigation."

For now though Apple, whose strong sales mean it has built up billions of dollars in cash reserves, has enough money on its hands to finance both innovation and litigation.

"Apple has got quite a war chest so it can operate in this way, and that in the short-term at least is going to lead to their market dominance and everyone is one notice of that," said Mattock at Marque Lawyers.

(Additional reporting by Lee Chyen Yee in HONG KONG; Editing by Lincoln Feast)
http://www.reuters.com/article/2011/...77928620110810





Lawsuit Claims Apple and Publishers Colluded on eBook Pricing in Fear of Amazon.com

A Seattle law firm has filed a class-action lawsuit accusing Apple and five book publishers of engaging in price fixing around sales of electronic books. The suit, filed today in U.S. District Court in Northern California, alleges that Apple and HarperCollins Publishers, Hachette Book Group, Macmillan Publishers, Penguin Group Inc. and Simon & Schuster “colluded” on book prices in order to force Amazon.com to “abandon its pro-consumer discount pricing.”

“Fortunately for the publishers, they had a co-conspirator as terrified as they were over Amazon’s popularity and pricing structure, and that was Apple,” said Steve Berman, attorney representing consumers and founding partner of Hagens Berman. “We intend to prove that Apple needed a way to neutralize Amazon’s Kindle before its popularity could challenge the upcoming introduction of the iPad, a device Apple intended to compete as an e-reader.”

In a press release, Berman, a class-action specialist, continued:

“As a result of the pricing conspiracy, prices of e-books have exploded, jumping as much as 50 percent. When an e-book version of a best-seller costs close to – or even more than – its hard-copy counterpart, it doesn’t take a forensic economist to see that this is evidence of market manipulation.”

The suit also notes that Amazon’s pricing “threatened to disrupt the publishers’ long-established brick-and-mortar model faster than the publishers were willing to accept.”

You can read the entire press release and see a copy of the suit here.
http://www.geekwire.com/2011/suit-cl...fear-amazoncom





When Books Can Be Deceiving
Linda Morris

THE most pirated digital book title on the file-sharing website The Pirate Bay is not The Lord of the Rings. Nor is it one of the seven Harry Potter books or a title from the Twilight saga.

Apart from George R.R. Martin's historical fantasy A Game of Thrones, fiction isn't among the top 100 ebooks downloaded by users of the Swedish site.

The top title is David J. Lieberman's guide to detecting lies. The rest of the list is heavy with manuals on men's fitness, computer-hacking, digital photography, sexual techniques and brain training.

The Pirate Bay, which also offers access to music, movies and television shows, was named by the US Trade Department in March as one of a slew of websites and file-sharing services contributing to marketplace piracy. Boasting five million registered users, it is censored in Ireland, Italy and Denmark, and is battling the entertainment industry and the copyright lobby in international courts to remain online.

The site's users are mainly male, internet-savvy readers who are not regular buyers of bestselling fiction.

Beyond Pirate Bay, many other sites offer unauthorised copies of the Harry Potter series, Stephen King, Jodi Picoult and Jean M. Auel.

''Anything and everything is there to find if you want to look for it,'' says Tim Coronel, the publisher of the trade magazine Bookseller+Publisher.

Kate Eltham, the chief executive officer of the Queensland Writers Centre, says ebook piracy is not as widespread as in the music industry.

Nor do mainstream book buyers download illegally, since the varying quality, ethical concerns of older buyers and the nostalgic pull of the paperback continue to shape buying patterns.

The Nielsen Company, however, predicts a steep take-up of tablets with ebook reading capacity, with one quarter of Australian households expected to own one by Christmas.

''Piracy isn't a huge problem for the great majority of books and it won't become one as long as a wide range of titles is made available as widely as possible on a range of platforms at a reasonable price,'' Coronel says.

Mary Cunnane, the vice-president of the Australian Literary Agents' Association, says the risk is that readers will get used to the idea of file-sharing digital versions of books without cost.

''It means a loss of income for the author, and perforce agents, publishers, and retailers. It's what happened to the music industry and something publishing desperately wants to avoid.''
http://www.smh.com.au/entertainment/...812-1iqmk.html





Publishing Gives Hints of Revival, Data Show
Julie Bosman

The publishing industry has expanded in the past three years as Americans increasingly turned to e-books and juvenile and adult fiction, according to a new survey of thousands of publishers, retailers and distributors that challenges the doom and gloom that tends to dominate discussions of the industry’s health.

BookStats, a comprehensive survey conducted by two major trade groups that was released early Tuesday, revealed that in 2010 publishers generated net revenue of $27.9 billion, a 5.6 percent increase over 2008. Publishers sold 2.57 billion books in all formats in 2010, a 4.1 percent increase since 2008.

The Association of American Publishers and the Book Industry Study Group collaborated on the report and collected data from 1,963 publishers, including the six largest trade publishers. The survey encompassed five major categories of books: trade, K-12 school, higher education, professional and scholarly.

“We’re seeing a resurgence, and we’re seeing it across all markets — trade, academic, professional,” said Tina Jordan, the vice president of the Association of American Publishers. “In each category we’re seeing growth. The printed word is alive and well whether it takes a paper delivery or digital delivery.”

Higher education was especially strong, selling $4.55 billion in 2010, up 18.7 percent in three years, a trend that Ms. Jordan suggested could be traced to the expansion of two-year and community colleges and the inclination to return to school during a rough economy.

Sales of trade books grew 5.8 percent to $13.9 billion, fueled partly by e-books, the report said. Juvenile books, which include the current young-adult craze for paranormal and dystopian fiction, grew 6.6 percent over three years.

One of the strongest growth areas was adult fiction, which had a revenue increase of 8.8 percent over three years.

E-books were another bright spot, thanks to the proliferation and declining cost of e-reading devices like the Nook by Barnes & Noble and Amazon’s Kindle, and the rush by publishers to digitize older books.

In 2008 e-books were 0.6 percent of the total trade market; in 2010, they were 6.4 percent. Publishers have seen especially robust e-book sales in genre fiction like romance, mystery and thrillers, as well as literary fiction. In 2010, 114 million e-books were sold, the report said.

The survey does not include sales data from 2011, a year of substantial e-book growth. In its monthly snapshots of the industry so far this year, the Association of American Publishers has also tracked some decline in print sales.

And the report’s estimate of the size of the industry was significantly smaller than those in previous surveys of the book business, which were conducted by the Book Industry Study Group and released annually.

Dominique Raccah, the publisher of Sourcebooks, a midsize publisher in Naperville, Ill., worked on both surveys and said the older methodology was flawed.

“We probably overstated our estimates,” she said. “And today we have a better methodology and more data.”

Ms. Raccah emphasized that the newest survey incorporates data from a much larger group of publishers, in addition to distributors, wholesalers and retailers like Barnes & Noble. In its definition of what is a book, the report counted professional and scholarly journals and databases, multimedia teaching materials and mobile apps.

It also noted that publishers, many of which have expanded in recent years, are experimenting with multimedia products that go far beyond the traditional print book, she said.

“It shows that the industry as a whole is really healthy,” Ms. Raccah said. “That, I think, is exciting. You’re seeing an industry that is transforming itself.”

The newest survey showed that sales of adult hardcover and paperback books from 2008 through 2010 were relatively flat, growing about 1 percent over three years. Sales of mass-market paperbacks declined 16 percent since 2008.

Hardcover books became a slightly smaller portion of the trade market, dropping to 37.7 percent in 2010 from 39.6 percent in 2008. Trade paperbacks also lost some market share, dipping to 37.8 percent in 2010 from 39.5 percent in 2008.

Professional publishing, which focuses on science, medicine, law, technology and the humanities, increased by 6.3 percent from 2008 through 2010, to $3.75 billion. Scholarly publishing, the smallest category in the business, saw net revenue grow by 4.7 percent since 2008.
https://www.nytimes.com/2011/08/09/b...ince-2008.html





After Much Ado, a Google Book Deal in France
Eric Pfanner

France has caused plenty of headaches for Google. Its politicians have denounced the U.S. Internet giant as a cultural imperialist; its publishers have called it a copyright cheat.

Yet France is suddenly the only country in the world in which Google has managed to achieve a longstanding business goal. A few days ago Google signed an agreement with the publisher Hachette Livre under which tens of thousands of French-language books will be pulled out of ink-on-paper purgatory and provided with a digital afterlife.

Hachette and Google reached a preliminary deal last year, but it was overshadowed by a far broader agreement between Google and U.S. authors and publishers that would have settled longstanding litigation. Like the deal with Hachette, the U.S. agreement involved books that were out of print but still protected by copyright, a category that accounts for the vast majority of the world’s books.

But last winter, a U.S. judge, Denny Chin, rejected the American settlement, and talks have stalled since. Meanwhile, with a final agreement in place in France, Google says it intends to start selling e-book versions of the Hachette titles by the end of the year, when it introduces a French version of its digital bookstore, Google Editions.

The deal with Hachette, which is part of the media conglomerate Lagardère, does not end Google’s problems with French publishers. At least three of them, Albin Michel, Flammarion and Gallimard, are pursuing lawsuits against the company, saying it illegally scanned their books. Another publisher, La Martinière, previously won a similar court case against Google.

Yet Google and Hachette, the biggest French publisher, with about one-quarter of the French market, said they hoped their deal could serve as a model for a broader rapprochement.

“We would love to implement similar arrangements with other French publishers, and it’s something that we have in mind as we talk to other partners,” said Simon Morrison, copyright policy and communications manager at Google in London.

Hachette said it would make digital copies of scanned books available to the Bibliothèque Nationale de France and other libraries, “thus contributing to the advancement of French culture,” as the company put it in a statement.

Could the agreement end up showing the way forward in the negotiations on a revised U.S. deal?

There are several key differences between the French accord and the U.S. proposal that Judge Chin rejected. One is that Hachette retains control of which books can be scanned and sold by Google, just as it does with copyrighted works that remain in print. Under the U.S. proposal, Google would have been free to digitize any out-of-print books, unless the copyright holders expressly opted out of the settlement.

In a hearing in New York last month, Judge Chin asked representatives of Google, the Association of American Publishers and the Authors Guild whether it might be possible to negotiate a so-called opt-in agreement — in other words, along the lines of the Hachette deal. Both sides were noncommittal, according to news reports of the hearing.

Until now, Google has steadfastly resisted switching to an opt-in system in the U.S. talks. Doing so would be a big setback for a company that says its mission is “to organize the world’s information and make it universally accessible and useful.” Some authors or publishers would opt out. So-called orphan works, those whose copyright holders cannot be clearly identified or tracked down, might never make it into the digital future.

Judge Chin has given Google, the publishers and the authors until Sept. 15 to come up with a revised deal. If nothing is settled by then, the litigation that prompted the talks is set to restart, six years after the authors and publishers originally sued Google.

Meanwhile, the French digital book business, which so far has trailed far behind the United States, with e-book sales still in their infancy, is about to pull ahead in one small but significant way.
https://www.nytimes.com/2011/08/08/t...in-france.html





Massachusetts Woman Sues Over Gmail Snooping
Jay Greene

A Massachusetts woman has filed a class action suit against Google for snooping into e-mail sent by people who don't have Gmail accounts to those that do, and using the information gleaned to sell targeted advertising.

Debra L. Marquis filed a class action suit on behalf of Massachusetts residents who do not have Gmail accounts and have sent mail to accounts with Gmail addresses. People without Gmail accounts have not consented to having their emails scanned, which Google does with Gmail messages in order to serve advertising that's presumably relevant to Gmail customers.

"In order to target advertisements to Gmail users," the suit alleges, "Google intercepts electronic communications to and from Gmail users with a device, without prior consent of the non-Gmail users," a violation of a Massachusetts wiretapping law.

According to the suit, Marquis is an AOL account holder. On behalf of the class, Marquis is seeking damages of $100 per day for each violation, or $1,000, whichever is higher, as well as punitive damages. The suit also seeks an injunction preventing Google from violating the specific Massachusetts law.

In a statement, Google said, "We're not going to comment on the ongoing litigation. But to be clear, Gmail has from the beginning used automated scanning technology to show our users relevant advertisements that help to keep our services free."

Marquis' lawyer did not return a call seeking comment.

Gmail's scanning has occasionally been a lightning rod for privacy advocates. Google has faced similar suits in the past. And rivals, such as Microsoft, have taken the company to task for the practice. Last month, a Microsoft video surfaced featuring the "Gmail Man," a friendly letter carrier who peeks into giant red Gmail envelopes to the revulsion of recipients.
http://news.cnet.com/8301-1023_3-200...mail-snooping/





Why Microsoft, Google and Facebook Want Your Email
Joe Brockmeier

Tally up the total number of searches on Google, Yahoo and Bing and you have about 3.5 billion searches per day. It's estimated that Twitter has about 300 million accounts, and Facebook claims more than 750 million active users. As impressive as those numbers are, though, they're a drop in the bucket compared to email, which is estimated at 294 billion messages per day.

That, says Jeff Hardy, is why Google and Microsoft really want your email. Not because of the numbers, per se, but because of what the numbers represent.

Hardy, vice president of SmarterTools, says that Microsoft wants email because it's profitable – and wants to convince companies that email is expensive to provide. Google, on the other hand, presents email as free, because it wants the information that comes with it. And Facebook is trying to get into the email game because it's the only way to continue its growth in the face of flattening usage.

Other companies should be competing for that business, says Hardy, because email is dirt cheap to provide. In his presentation for HostingCon 2011, Hardy says that you can get full enterprise-class email for less than $0.04 per mailbox, or Exchange-replacement mailboxes for less than $0.39 per mailbox. He also says that each account reaches up to 10 people on average, and providers get to market their products and services to all of those users. Passing up email is passing up an opportunity to create loyalty.

Hardy seems to have a good point. Two, actually. One is that email gets overlooked compared to the new hotness(es) like Twitter or Facebook. But we still do much, much more via email. Perhaps too much, but that's a topic for another post. The other is that companies that offer hosting (his audience at HostingCon 2011) should be focused on providing email and getting it right. There's plenty of room, even today, for solid email hosting plans and for companies to wrangle email into much more business.
https://www.readwriteweb.com/enterpr...-and-faceb.php





When Knowledge Isn’t Written, Does It Still Count?
Noam Cohen

“MAKING fun of Wikipedia is so 2007,” a French journalist said recently to Sue Gardner, the executive director of the foundation that runs the Wikipedia project.

And so Ms. Gardner, in turn, told an auditorium full of Wikipedia contributors and supporters on Thursday in Haifa, Israel, the host city for the seventh annual Wikimania conference, where meetings and presentations focus on the world’s most used, and perhaps least understood, online reference work.

Once routinely questioned about its reliability — what do you mean, anyone can edit it? — the site is now used every month by upwards of 400 million people worldwide. But with influence and respect come responsibility, and lately Wikipedia has been criticized from without and within for reflecting a Western, male-dominated mindset similar to the perspective behind the encyclopedias it has replaced.

Seeing Wikipedia as The Man, in so many words, is so 2011.

And that’s a problem for an encyclopedia that wants to grow. Some critics of Wikipedia believe that the whole Western tradition of footnotes and sourced articles needs to be rethought if Wikipedia is going to continue to gather converts beyond its current borders. And that, in turn, invites an entirely new debate about what constitutes knowledge in different parts of the world and how a Western institution like Wikipedia can capitalize on it.

Achal Prabhala, an adviser to Ms. Gardner’s Wikimedia Foundation who lives and writes in Bangalore, India, has made perhaps the most trenchant criticism in a video project, “People are Knowledge,” that he presented in Haifa (along with its clunky subtitle, “Exploring alternative methods of citation for Wikipedia”).

The film, which was made largely with a $20,000 grant from the Wikimedia Foundation, spends time showing what has been lost to Wikipedia because of stickling rules of citation and verification. If Wikipedia purports to collect the “sum of all human knowledge,” in the words of one of its founders, Jimmy Wales, that, by definition, means more than printed knowledge, Mr. Prabhala said.

In the case of dabba kali, a children’s game played in the Kerala state of India, there was a Wikipedia article in the local language, Malayalam, that included photos, a drawing and a detailed description of the rules, but no sources to back up what was written. Other than, of course, the 40 million people who played it as children.

There is no doubt, he said, that the article would have been deleted from English Wikipedia if it didn’t have any sources to cite. Those are the rules of the game, and those are the rules he would like to change, or at least bend, or, if all else fails, work around.

“There is this desire to grow Wikipedia in parts of the world,” he said, adding that “if we don’t have a more generous and expansive citation policy, the current one will prove to be a massive roadblock that you literally can’t get past. There is a very finite amount of citable material, which means a very finite number of articles, and there will be no more.”

Mr. Prabhala, 38, who grew up in India and then attended American universities, has been an activist on issues of intellectual property, starting with the efforts in South Africa to free up drugs that treat H.I.V. In the film, he gives other examples of subjects — an alcohol produced in a village, Ga-Sabotlane, in Limpopo, South Africa, and a popular hopscotch-type children’s game, tshere-tshere — beyond print documentation and therefore beyond Wikipedia’s tried-and-true method.

There are whole cultures, he said, that have little to no printed material to cite as proof about the way life is lived.

“Publishing is a system of power and I mean that in a completely pleasant, accepting sense,” he said mischievously. “But it leaves out people.”

But Mr. Prabhala offers a solution: he and the video’s directors, Priya Sen and Zen Marie, spoke with people in African and Indian villages either in person or over the phone and had them describe basic activities. These recordings were then uploaded and linked to the article as sources, and suddenly an article that seems like it could be a personal riff looks a bit more academic.

For example, in his interview with a South African villager who explained how to make the alcoholic drink, morula, she repeatedly says that it is best if she demonstrates the process. When the fruit is ready, said the villager, Philipine Moremi, according to the project’s transcript of her phone conversation, “we pry them open. We are going to show you how it is done. Once they are peeled, we seal them to ferment and then we drink.”

The idea of treating personal testimony as a source for Wikipedia is still controversial, and reflects the concerns that dominated the encyclopedia project six years ago, when arguably its very existence was threatened.

After a series of hoaxes, culminating in a Wikipedia article in 2005 that maligned the newspaper editor John Seigenthaler for no discernible reason other than because a Wikipedia contributor could, the site tried to ensure that every statement could be traced to a source.

Then there is the rule “no original research,” which was meant to say that Wikipedia doesn’t care if you are writing about the subway station you visit every day: find someone who has written reliably on the color of the walls there.

“The natural thing is getting more and more accurate, locking down articles, raising the bar on sources,” said Andrew Lih, an associate professor of journalism at the University of Southern California, who was an early contributor to Wikipedia and has written a history of its rise. “Isn’t it great we have so many texts online?”

But what works for the most developed societies, he said, won’t necessarily work for others. “Lots of knowledge is not Googleable,” he said, “and is not in a digital form.”

Mr. Lih said that he could see the Wikipedia project suddenly becoming energized by the process of documenting cultural practices around the world, or down the street.

Perhaps Mr. Prabhala’s most challenging argument is that by being text-focused, and being locked into the Encyclopedia Britannica model, Wikipedia risks being behind the times.

An 18-year-old is comfortable using “objects of trust that have been created on the Internet,” he said, and “Wikipedia isn’t taking advantage of that.” And, he added, “it is quite possible that for the 18-year-old of today that Wikipedia looks like his father’s project. Or the kind of thing his father might be interested in.”

Ouch.
https://www.nytimes.com/2011/08/08/b...wikipedia.html





News Corp.’s Soft Power in the U.S.
David Carr

Over the last month, many Americans watched from a distance in horror or amusement as it became evident that the News Corporation regarded Britain’s legal and political institutions as its own private club.

That could never happen in the United States, right?

As it turns out, a News Corporation division has twice come under significant civil and criminal investigations in the United States, but neither inquiry went anywhere. Given what has happened in Britain with the growing phone-hacking scandal, it is worth wondering why.

Both cases involve News America Marketing, an obscure but lucrative division of the News Corporation that is a big player in the business of retail marketing, including newspaper coupon inserts and in-store promotions. The company has come under scrutiny for a pattern of conduct that includes below-cost pricing, paying customers not to do business with competitors and accusations of computer hacking.

News America Marketing came to control 90 percent of the in-store advertising business, according to Fortune, aided in part by a particularly quick and favorable antitrust decision made by the Justice Department in 1997. That year, the News Corporation announced it wanted to buy Heritage Media, a big competitor, for about $754 million in stock plus $600 million in assumed debt. The News Corporation said it would sell the broadcast properties and hang onto the marketing division, which serviced 40,000 groceries and other retailers.

The deal would make News America Marketing the dominant player in the business and, for that reason, the San Francisco field office of the Justice Department recommended to Washington that the News Corporation’s takeover bid be challenged on antitrust grounds. Typically, such a request from a field office would carry great weight in Washington and, at a minimum, delay the deal for months.

But the Justice Department brass overrode San Francisco’s objections and gave its blessing in just two weeks. So who ran the antitrust division at the Justice Department at the time? Joel Klein, who this year became an executive vice president at the News Corporation, head of its education division and a close adviser to Rupert Murdoch on the phone-hacking scandal in Britain.

It’s worth noting that less than a year later, the Justice Department division led by Mr. Klein blocked the News Corporation from selling its share of a satellite company to PrimeStar, owned by a group of cable providers, on antitrust grounds, so any suggestion that a department of the United States government was snugly in the hip pocket of Mr. Murdoch would not be correct.

None of this suggests that Mr. Klein cut some sort of a deal that resulted in a job 14 years later. But the speed of the antitrust decision surprised even the people involved in the takeover. One of the participants, who declined to be identified discussing private negotiations, said he thought the sale was effectively blocked before the surprising turnaround.

“After that meeting with the San Francisco office, we all looked at each other and said, ‘This deal is not going to happen,’ ” he said.

My colleague Eric Lipton and I spent a few days trying to tease apart who made the actual decision to give the purchase the go-ahead — “It was as if a magic button had been pushed somewhere. We were all in shock,” said one of the same participants in the deal — but there is no paper trail.

People who worked at the Justice Department back then either could not recollect how the decision was made or declined to share information if they knew.

A spokeswoman for the News Corporation released this statement: “Joel didn’t know Mr. Murdoch at the time of the Heritage Media transaction 14 years ago. A year later, the D.O.J. under his leadership challenged the PrimeStar transaction in which News Corporation had a major interest. Any suggested inference is ludicrous.”

A lawyer who worked in the Justice Department in Washington at the time but did not want to be identified discussing internal matters, said: “This decision was made on the merits. The front office in Washington didn’t think a case could be won in court based on the very narrow definition of the market.”

But in retrospect, the anticompetitive fears of the San Francisco office were well founded.

After the Heritage Media deal, News America Marketing was in a position to throw its weight around and it did just that, drawing a variety of lawsuits in which competitors claimed they had been threatened and harassed. The News Corporation has settled those cases at a cost of over $650 million, and now the F.B.I. is looking into whether there was a pattern of illicit tactics by that division of the News Corporation.

“The way this whole thing got started was a horrible mistake. The government was bamboozled or worse,” said Thomas J. Horton, a law professor at the University of South Dakota who used to work at the Justice Department and represented a competitor, Insignia Systems of Minneapolis, in a lawsuit against News America Marketing. “The company has a long history of behaving unethically with no regard for our system of justice or legal ethics. They are ruthless.”

One of News America Marketing’s other competitors was Floorgraphics, a small New Jersey company that did in-store ads. George Rebh, who founded Floorgraphics along with his brother Richard, met with Paul V. Carlucci, head of News America, in 1999 at a Manhattan restaurant, and the News Corporation executive got right to the point.

“I will destroy you,” Mr. Carlucci said, according to his deposition in the Floorgraphics suit against News America, adding, “I work for a man who wants it all, and doesn’t understand anybody telling him he can’t have it all.” (Mr. Carlucci is now the publisher of the News Corporation-owned New York Post.)

Just in case the Rebh brothers did not get the point, court records indicate that beginning in October 2003, someone working out of the Connecticut headquarters of News America Marketing gained access to the Floorgraphics computer network, which included a collection of advertisements the company had created for its customers.

The News Corporation’s executives, as they have in the case of the phone hacking in Britain, said they had no idea that people working for them were engaged in such activity.

But in 2004, a Floorgraphics board member sent a letter to David F. DeVoe, chief financial officer of the News Corporation, detailing that Floorgraphics computers had “been breached by News America, as identified by their I.P. addresses.” News America has since admitted in court to breaching its competitor’s computers, but attributed it to lax security and a rogue employee.

According to correspondence that has been forwarded to members of the New Jersey Congressional delegation, Mr. Rebh also got in touch with the F.B.I., which sent two special agents to the Floorgraphics offices in 2004. One of the agents, Susan Secco, followed up with an e-mail in which she commented on the evidence Floorgraphics had compiled.

“I believe I have all I need to conduct interviews, as there is an excellent paper trail,” she wrote.

She then got in touch with the United States attorney in New Jersey and, after an initial burst of interest, the case died a slow death. The United States attorney at the time in New Jersey was Chris Christie, now governor of New Jersey and a rising star in the Republican Party.

Michael Drewniak, the governor’s press secretary, said politics played no role. “The U.S. attorney’s office receives thousands of referrals each year from parties seeking criminal investigations. Any decision to prosecute or not prosecute is based strictly on the strength of the evidence or lack thereof,” he said.

Two senior lawyers who supervised the unit that handled the initial investigation — Kevin O’Dowd and Deborah L. Gramiccioni — are now senior aides to Mr. Christie in the governor’s office. A state official said neither Mr. O’Dowd, then an assistant United States attorney, nor Ms. Gramiccioni, then chief of the office’s commercial crimes unit, recalled the details of the case and suggested that it had been handled at a lower level.

In early 2005, Mr. Rebh urged Representative Rush Holt and New Jersey’s two senators at the time, Jon S. Corzine and Frank R. Lautenberg — both Democrats — to press Mr. Christie and Alberto R. Gonzales, then the United States attorney general, to pursue the matter. The Federal Trade Commission asked for jurisdiction and was denied by the Justice Department. Frustrated, the Rebhs went to the Secret Service, but the case died for lack of cooperation.

Floorgraphics filed a civil lawsuit in federal court in 2009, but the suit was dropped when the News Corporation agreed to buy the assets of Floorgraphics for $29.5 million.

Given the pattern of conduct revealed recently in Britain, there is renewed interest in how News America behaved in the in-store business. Mr. Lautenberg and Mr. Holt sent a letter last month to Attorney General Eric H. Holder Jr., reminding him about the original accusations made by Floorgraphics and suggesting that the Justice Department revisit the case.

Although the statute of limitations on many of the ostensible crimes has expired, Mr. Lautenberg and others have indicated that the Senate Commerce Committee may hold hearings to investigate whether there was a broad pattern of misconduct by the News Corporation.

It’s too early to say what the result of these accusations and inquiries might be. And certainly no one has credibly said that the News Corporation’s employees here have hacked phones as they did in Britain, or replicated in America the kind of cozy, possibly corrupt relationships British employees fostered with officials.

Then again, maybe they didn’t have to. In America, where the News Corporation does most of its business and also has a long reach into film, TV, cable and politics, the company’s size and might give it a soft, less obvious power that it has been able to project to remarkable effect.
https://www.nytimes.com/2011/08/08/b...in-the-us.html





As Sun Storms Ramp Up, Electric Grid Braces for Impact
Victoria Jaggard

Storms are brewing about 93 million miles (150 million kilometers) away, and if one of them reaches Earth, it could knock out communications, scramble GPS, and leave thousands without power for weeks to months.

The tempest is what's known as a solar storm, a flurry of charged particles that erupts from the sun. Under the right conditions, solar storms can create extra electrical currents in Earth's magnetosphere—the region around the planet controlled by our magnetic field.

The electrical power grid is particularly vulnerable to these extra currents, which can infiltrate high-voltage transmission lines, causing transformers to overheat and possibly burn out.

"The concern is if the electric grid lost a number of transformers during a single storm, replacing them would be difficult and time-consuming," said Rich Lordan, senior technical executive for power delivery and utilization at the Electric Power Research Institute (EPRI).

"These power transformers are very big devices, and the lead time to get a replacement can be two months—if there's a spare one stored nearby. If a utility has to order a new one from the manufacturer, it could take six months to up to two years to deliver."

The danger is becoming more critical, as the sun is approaching what's known as solar maximum—the high point in our star's roughly 11-year cycle of activity. Scientists anticipate stronger storms around solar max, in 2013.

Using the latest sun-watching satellites and computer models, scientists have been trying to improve solar storm predictions. At the same time, electricity operators are developing plans for how to respond to solar storm warnings and determine what the consequences for the grid might be in a worst-case scenario.

"Geomagnetic storms are low-probability, high-impact events," Lordan said. "When assessing the risk to the grid, one has to ask, What's the level of storm intensity that the grid system should be prepared for?

"Based on the data and the scenarios we can reasonably expect, I believe the power-delivery system can operate through a solar storm."

Listening for the Solar Whistle

Earth is being constantly bombarded by charged particles from the sun, which emits material in all directions. This is known as the solar wind. But sometimes the sun ramps up magnetic activity on its surface, triggering huge flares of plasma.

Such "solar flares are like the whistle on a freight train," said Joe Kunches, a space scientist for the National Oceanic and Atmospheric Administration's Space Weather Prediction Center (SWPC). The big impacts come from coronal mass ejections (CMEs), cloud-like bundles of plasma that are sent racing off the sun's upper atmosphere, or corona, during periods of intense surface activity.

That's not to say every CME is a harbinger of doom—the clouds are highly directional and can miss Earth entirely or strike only glancing blows.

"The sun doesn't give a hoot about us," Kunches said. "It erupts and produces lots of energy, and sometimes we get a direct hit and sometimes we don't."

However, said Antti Pulkkinen, a sun researcher with NASA Goddard Space Flight Center, "if these clouds do move toward Earth's near-space environment . . . they can carry billions of tons of matter moving at 2,000 kilometers [1,242 miles] a second."
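
Those figures make a back-of-the-envelope arrival-time estimate easy, and they explain why one- to two-day warnings are plausible. The 500 km/s "typical CME" speed below is my added comparison point, not a number from the article:

    # Sun-Earth transit time for a CME, using the article's figures:
    # ~150 million km of distance and up to ~2,000 km/s of cloud speed.
    # The slower 500 km/s case is an assumed "typical" CME for contrast.
    SUN_EARTH_KM = 150_000_000

    for speed_km_s in (2000, 500):
        hours = SUN_EARTH_KM / speed_km_s / 3600
        print(f"{speed_km_s} km/s -> about {hours:.0f} hours ({hours / 24:.1f} days)")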

When the cloud reaches our magnetosphere, its charged particles become electromagnetically coupled to Earth's magnetic field, generating large electrical currents millions of amperes strong, Pulkkinen said. The sprawling electrical grid on Earth's surface then acts like an antenna, allowing these currents to flow into transmission lines.

"These storms are by their basic nature global," Pulkkinen added. But the risks to electrical grids are greatest at higher latitudes, since the largest electric currents are funneled toward Earth around the Poles.

For instance, in 1989 the transmission system for Canada's Hydro Quebec electricity provider collapsed during a solar storm, leaving millions of people without power for nine hours or more. And the "Halloween storms" of 2003 triggered blackouts in the city of Malmö, Sweden, and likely caused transformer failures in South Africa.

"Because they are located closer to the magnetic North Pole, Canadian utilities are deeply involved in monitoring geomagnetically induced currents, modeling impacts for vulnerability, and refining their operational protocols," EPRI's Lordan said.

European utilities and the South African electricity provider ESKOM also are preparing for the upcoming solar maximum, in part with advice and data from NASA.

The AC/DC Problem

Technically, geomagnetically induced currents aren't that strong compared with the currents that normally flow between power plants and electricity consumers. For electricity to travel long distances, it needs to be transformed to high voltage and back again, to limit energy loss due to resistance in the transmission wires.
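
The reasoning is simple Ohm's-law arithmetic: for a fixed power transfer, raising the voltage lowers the current, and resistive loss falls with the square of the current. A minimal sketch, with an assumed, purely illustrative line resistance:

    # Resistive loss on a transmission line: P_loss = I^2 * R, with
    # I = P / V. The resistance and voltages below are illustrative
    # assumptions, not figures from the article.
    power_w = 500e6   # 500 MW being transferred
    line_r = 1.0      # ohms of line resistance (assumed)

    for volts in (110e3, 345e3, 765e3):   # common transmission voltages
        amps = power_w / volts
        loss_w = amps ** 2 * line_r
        print(f"{volts / 1e3:.0f} kV: {amps:,.0f} A, "
              f"loss {loss_w / 1e6:.2f} MW ({100 * loss_w / power_w:.2f}%)")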

Trouble arises because the extra currents from solar storms are direct current (DC) flows, and the electricity transmission system is used to handling alternating current (AC) flows, said EPRI's Lordan.

The extra DC flows saturate transformers, which start to overheat, causing their insulation to break down and their parts to experience accelerated aging. Above a certain temperature, a transformer will fail.

At the same time, the saturated transformer starts to consume what's known as reactive power.

"When you look at power in the system, there's real power—like that in incandescent light bulbs—and then there's 'imaginary' power called reactive power, measured in vars," Lordan said.

Reactive power is produced when the current and voltage are out of phase. This type of power flow needs to be carefully managed to keep the voltage steady in transmission lines.

During a solar storm, however, any saturated transformers draw on more reactive power than what normal control equipment can handle. This can lead to voltage collapse, when it's no longer possible to push the needed power through transmission wires.
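
In the single-phase textbook picture, real power is V·I·cos(φ) in watts and reactive power is V·I·sin(φ) in vars, where φ is the phase angle between voltage and current. A small illustration with assumed numbers shows how a growing phase shift trades watts for vars:

    # Real vs. reactive power for a phase angle phi between voltage and
    # current (single-phase simplification; values are assumed).
    import math

    v_rms, i_rms = 345e3, 1000.0   # assumed RMS voltage and current

    for phi_deg in (0, 15, 45, 75):
        phi = math.radians(phi_deg)
        real_mw = v_rms * i_rms * math.cos(phi) / 1e6
        reactive_mvar = v_rms * i_rms * math.sin(phi) / 1e6
        print(f"phi = {phi_deg:2d} deg: {real_mw:6.1f} MW real, "
              f"{reactive_mvar:6.1f} Mvar reactive")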

Even without a full collapse, fluctuating voltage in the transmission system can cause the grid to become unstable, which can impact transformers, relays, capacitors, and even the generators at power plants.

A Satellite Shield

In 2007 NASA Goddard began a collaborative effort with EPRI called the Solar Shield Project, which uses monitoring data from several sun observatories to run state-of-the-art computer simulations and make solar storm predictions.

Solar Shield first collects a constant stream of data from satellites such as the Solar and Heliospheric Observatory (SOHO) and the Solar Terrestrial Relations Observatory (STEREO).

"When an operator sees an eruption on the sun, he or she will derive the three-dimensional parameters of that eruption, such as size, speed, and direction," NASA's Pulkkinen said. The resulting model can provide one- to two-day warnings of incoming solar storms to EPRI, which then disseminates the alert to participating utilities across the North American power industry.

"If operators know a couple days beforehand that there's a good likelihood of a storm, they can postpone maintenance of critical lines," Pulkkinen said. This step maximizes the amount of the grid available, reducing strain if localized portions fail.

"Operators can also bring in more reserve power to the system to make it as stable as possible," he said. If particular transformers start showing signs of trouble, operators can reduce their load or disconnect them.

If the storm is expected to be severe enough, "the most dramatic action is to close down the entire grid," Pulkkinen said. "If the system is turned off, the extra DC currents alone won't harm the transformers."

But such a move would be "the last mitigation measure in the toolbox," he said, because switching the system off intentionally would result in temporary blackouts.

"The industry's goal is to provide safe, reliable, and cost-effective power," EPRI's Lordan said. "Utilities would be watching the system closely and would attempt to operate through the storm."

"If the utility system did go down because of a voltage collapse, utilities would have to wait for the storm to pass and then activate procedures to bring the system back up," Lordan said.

"If transformers are lost, the system can operate around a certain number of failed units. But if it's hundreds of transformers, then the industry would quickly get together and move the spares where they are most needed."

The European Union is working on a similar solar-storm alert project called SPACECAST, which is projected to be operational by March 2012.

Trying to Avoid Surprises

As with other natural disasters, the ability to react to a solar storm depends first on the accuracy of monitoring and prediction efforts, which in turn need to be based on real-world physics.

But unlike hurricane predictions, for instance, "we have a tougher nut to crack, because the space weather system is so vast," NOAA's Kunches said. "If you were to make the sun the size of a basketball, Earth by comparison would be like a pinhead. Then you put the basketball at one end of a full-size court and the pinhead at the other end."

In addition, space weather forecasters don't yet have all the pieces of information needed to say for sure whether an incoming storm is going to be the type that will create geomagnetically induced currents.

"The strength of a CME is a function of the polarity of the embedded magnetic field in the plasma," Kunches said. "Polarity dictates whether the storm will be short-lived, very strong, etcetera, and stronger storms are more likely to induce geomagnetic currents. But we don't have that information until the storm is very near Earth."

What's more, even with a host of sun-watching instruments and monitoring centers, sometimes the sun simply "throws us a curveball," he said.

Between SOHO, STEREO, and NOAA's GOES satellites, "we're looking like crazy back at the sun, and we still get one out of ten or twenty surprise CMEs that just don't show up very well in the imagery," Kunches said.

Not only do the satellite-watchers miss some events, they also run the risk of false positives. Forecasters were duped just a few weeks ago, on June 21. "All our instruments saw what appeared to be an Earth-directed CME as plain as the nose on your face, so we put out a warning," Kunches recalled. "And it turns out nothing happened."

New Era in Space-Weather Forecasts

Answers may come from recently launched satellites such as NASA's Solar Dynamics Observatory (SDO), which is now watching the sun around the clock in high resolution, taking pictures every tenth of a second in multiple wavelengths.

"One of the goals of SDO is to provide us with the keys to unlock the physics of solar eruptions," NASA's Pulkkinen said. "The SDO team can't predict when the eruptions will happen, but it can observe them and help us predict from there."

Overall, he added, "from a space-weather forecasting viewpoint, we're living in a very exciting time. This is the first time in history we're able to make one- to two-day predictions. It's the first time we have the observation capacity via satellites, and the first time we have full-scale models and the computational power to run those models."

According to NOAA's Kunches, perhaps the most vital aspect today in space weather forecasting and mitigation is well-coordinated communication.

"It's important to be as well-educated about the sun as possible," he said. "There's a recognition in the emergency management community and other levels of government that, as best we can, we need to communicate about space weather.

"If something does happen, even if we didn't predict it very well, the idea is that we can get the word out quickly, and people will know what to do."
http://news.nationalgeographic.com/n...ity-grid-risk/





Lightning in Dublin Knocks Amazon, Microsoft Data Centers Offline
Rich Miller

A lightning strike has caused power outages at the major cloud computing data hubs for Amazon and Microsoft in Dublin, Ireland. The incident has caused downtime for many sites using Amazon’s EC2 cloud computing platform, as well as users of Microsoft’s BPOS (Business Productivity Online Suite).

Amazon said that lightning struck a transformer near its data center, causing an explosion and fire that knocked out utility service and left it unable to start its generators, resulting in a total power outage. While many sites were restored, Amazon said some sites that rely on one of its storage services may take 24-48 hours to fully recover. The company is bringing additional hardware online to try and speed the recovery process, but is advising customers whose sites are still offline to re-launch them in a different zone of its infrastructure.

Amazon said the event affected one of the EC2 Availability Zones in its Dublin data center, which is the company’s primary European hub for its cloud computing platform.

Generator Systems Disrupted

“Normally, upon dropping the utility power provided by the transformer, electrical load would be seamlessly picked up by backup generators,” Amazon said in an update on its status dashboard. “The transient electric deviation caused by the explosion was large enough that it propagated to a portion of the phase control system that synchronizes the backup generator plant, disabling some of them.”

“Power sources must be phase-synchronized before they can be brought online to load. Bringing these generators online required manual synchronization. We’ve now restored power to the Availability Zone and are bringing EC2 instances up.”

Amazon said the power outage began at 10:41 a.m. Pacific time, with instances beginning to recover about three hours later at 1:47 p.m. Pacific time. Recovery is taking longer for some user instances, including those using Amazon Elastic Block Storage (EBS), the company said.

UPDATE: As of 10:45 p.m. Eastern, Amazon reported that 60% of the impacted instances have recovered and are available. "Stopping and starting impaired instances will not help you recover your instance," AWS said. "For those looking for what you can do to recover more quickly, we recommend re-launching your instance in another Availability Zone."
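
For EC2 users of that era, the re-launch step amounted to starting a fresh instance with a different placement. A minimal sketch using the boto 2.x library; the AMI ID, key pair and zone names are placeholders, and a real recovery would also need to reattach storage and repoint DNS:

    # Re-launch an instance in a different Availability Zone (boto 2.x).
    # AMI ID, key name and zone are placeholders, not values from Amazon.
    import boto.ec2

    conn = boto.ec2.connect_to_region("eu-west-1")   # the Dublin region

    reservation = conn.run_instances(
        image_id="ami-xxxxxxxx",   # placeholder: the AMI behind your instance
        key_name="my-keypair",     # placeholder
        instance_type="m1.small",
        placement="eu-west-1b",    # pick a zone other than the impaired one
    )
    print("Launched:", [i.id for i in reservation.instances])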

UPDATE 2: Early Monday Amazon said that problems with EBS were slowing the recovery. “Restoring these volumes requires that we make an extra copy of all data, which has consumed most spare capacity and slowed our recovery process,” Amazon said in a status update. “We are in the process of installing additional capacity in order to support this process both by adding available capacity currently onsite and by moving capacity from other availability zones to the affected zone. While many volumes will be restored over the next several hours, we anticipate that it will take 24-48 hours until the process is completed.”

Microsoft Outage

The Twitter feed for Microsoft online services reported that a European data center power issue had affected access to its BPOS services. Microsoft reported that services were starting to come back online as of 7:30 pm Eastern/4:30 pm Pacific.

A Microsoft statement said that a “widespread power outage in Dublin caused connectivity issues for European BPOS customers. Throughout the incident, we updated our customers regularly on the issue via our normal communication channels.”

Key European Cloud Computing Hub

Dublin has become a key cloud computing gateway to Europe and beyond for U.S. companies due to several factors, including the city’s location, connectivity, climate and ready supply of IT workers. Dublin’s temperature is ideal for data center cooling, allowing companies to use fresh air to cool servers instead of using huge, power-hungry chillers to refrigerate cooling water.

This allowed Microsoft to design and build one of the world’s most efficient data centers, a huge facility that hosts the company’s cloud services for Europe and operates entirely without chillers. At 550,000 square feet, it is also one of the world’s largest data centers.

Amazon opened a data center in Dublin in December of 2008 to house the European availability zones for its EC2 cloud computing services. The company recently acquired a 240,000 square foot building in Dublin which will be converted into an expansion data center.

The company’s property moves reflect the rapid growth of its European cloud computing operation, which was chronicled by Netcraft in December.
https://www.datacenterknowledge.com/...nters-offline/





Start-Up to Release 'Stone-Like' Optical Disc that Lasts Forever

New optical disc aims for consumer market first, then corporate archives
Lucas Mearian

Start-up Millenniata and LG plan to soon release a new optical disc and read/write player that will store movies, photos or any other data forever. The data can be accessed using any current DVD or Blu-ray player.

Millenniata calls the product the M-Disc, and the company claims you can dip it in liquid nitrogen and then boiling water without harming it. It also has a U.S. Department of Defense (DoD) study backing up the resiliency of its product compared to other leading optical disc competitors.

Millenniata CEO Scott Shumway would not disclose what material is used to produce the optical discs, referring to it only as a "natural" substance that is "stone-like."

The M-Disc

Millenniata's M-Disc is made of a stone-like substance that the company claims does not degrade over time.

Like DVDs and Blu-ray discs, the M-Disc platters are made up of multiple layers of material. But unlike those discs, there is no reflective layer or dye layer. Instead, during the recording process a laser "etches" pits into the substrate material.

"Once the mark is made, it's permanent," Shumway said. "It can be read on any machine that can read a DVD. And it's backward compatible, so it doesn't require a special machine to read it - just a special machine to write it."

While Millenniata has partnered with LG for the initial launch of an M-Disc read-write player in early October, Shumway said any DVD player maker will be able to produce M-Disc machines by simply upgrading their product's firmware.

Millenniata said it has also proven it can produce Blu-ray format discs with its technology - a product it plans to release in future iterations. For now, the platters store the same amount of data as a DVD: 4.7GB. However, the discs write at only 4x or 5.28MB/sec, half the speed of today's DVD players.

"We feel if we can move to the 8X, that'd be great, but we can live with the four for now," Shumway said, adding that his engineers are working on upping the speed of recording.

Millenniata is also targeting the long-term data archive market, saying archivists will no longer have to worry about controlling the temperature or humidity of a storage room. "Data rot happens with any type of disc you have. Right now, the most permanent technology out there for storing information is a paper and pencil -- until now," Shumway said.

In 2009, the Defense Department's Naval Air Warfare Center Weapons Division facility at China Lake, Calif., was interested in digitizing and permanently storing information. So it tested Millenniata's M-Disc against optical discs from five other vendors: Delkin Devices, Mitsubishi, JVC, Verbatim and MAM-A.

"None of the Millenniata media suffered any data degradation at all. Every other brand tested showed large increases in data errors after the stress period. Many of the discs were so damaged that they could not be recognized as DVDs by the disc analyzer," the department's report states.

Recordable optical media such as CDs, DVDs and Blu-ray discs are made of layers of polycarbonate glued together. One layer of the disc contains a reflective material and a layer just above it incorporates an organic transparent dye. During recording, a laser hits the dye layer and burns it, changing the dye from transparent to opaque and creating bits of data. A low-power laser can then read those bits by either passing through the transparent dye layer to the reflective layer or being absorbed by the pits.

Over long periods of time, DVDs are subject to de-lamination problems where the layers of polycarbonate separate, leading to oxidation and read problems. The dye layer, because it's organic, can also break down over time, a process hastened by high temperatures and humidity.

While the DVD industry claims DVDs should last from 50 to 100 years, according to the National Institute of Standards and Technology (NIST), DVDs can break down in "several years" in normal environments. Additionally, NIST suggests DVDs should be stored in spaces where relative humidity is between 20% and 50%, and where temperatures do not rise above 68 degrees Fahrenheit.

Gene Ruth, a research director at Gartner, said generally he's not heard of a problem with DVD longevity. And, while he admits that a DVD on a car dashboard could be in trouble, the medium has generally had a good track record.

But Ruth said he can see a market in long-term archiving for a product such as the M-Disc because some industries, such as aircraft engineering, healthcare and financial services, store data for a lifetime and beyond.

Millenniata partnered with LG to provide M-Ready technology in most of its DVD and Blu-ray drives. Shumway said the products will begin shipping next month and should be in stores in the beginning of October.

"We felt it was important that we first produce this with a major drive manufacturer, someone that already had models and firmware out there," Shumway said.

Unlike DVDs, which come in 10-, 25-, 50- or 100-disc packs, M-Discs will be available one at a time, or in groups of two or three for just under $3 per disc. Millenniata is also courting system manufacturers in the corporate archive world.

"We're working with some very large channels as we train their distribution networks to launch this," he said. "At the same time, we're launching this at Fry's [Electronics] so consumers can see it and be introduced to this technology."
https://www.computerworld.com/s/arti...asts_forever





Hacker to Demonstrate 'Weak' Mobile Internet Security
Kevin J. O'Brien

A German computer engineer said Tuesday that he had deciphered the code used to encrypt most of the world’s mobile Internet traffic and that he planned to publish a guide to prompt global operators to improve their safeguards.

Karsten Nohl, who published the algorithms used by mobile operators to encrypt voice conversations on digital phone networks in 2009, said during an interview he planned to demonstrate how he had intercepted and read the data during a presentation Wednesday.

Mr. Nohl said he and a colleague, Luca Melette, intercepted and decrypted wireless data using an inexpensive, modified, 7-year-old Motorola cellphone and several free software applications. The two intercepted and decrypted data traffic in a five-kilometer, or 3.1-mile, radius, Mr. Nohl said.

The interceptor phone was used to test networks in Germany, Italy and other European countries that Mr. Nohl declined to identify. In Germany, Mr. Nohl said he was able to decrypt and read data transmissions on all four mobile networks — T-Mobile, O2 Germany, Vodafone and E-Plus. He described the level of encryption provided by operators as “weak.”

In Italy, Mr. Nohl said his interceptions revealed that two operators, TIM, the mobile unit of the market leader, Telecom Italia, and Wind did not encrypt their mobile data transmissions at all. A third, Vodafone Italia, provided weak encryption, he said.

A spokeswoman for the GSM Association, the industry group based in London that represents global telephone operators, said the group would await details of Mr. Nohl’s research before commenting. A spokesman for O2, which is owned by Telefónica of Spain, said the operator followed Mr. Nohl’s research closely and would take account of his findings in its own operations.

Vodafone said in a statement that “We regularly review security measures and carry out risk assessments to prevent the kind of exploit described. We implement appropriate measures across our networks to protect our customers’ privacy.”

Mr. Nohl said he developed his interception technology on an internal broadband network he set up at his research firm, Security Research Labs, in Berlin. His tests focused on mobile data networks that ran on the General Packet Radio Service, or GPRS, technology, which is used widely across the globe.

GPRS networks were introduced in 2000 as successors to GSM digital networks and were the first mobile networks to deliver significant data besides short text messages. GPRS networks are still widely used as backups for newer, faster 3G wireless networks, and consumers are often diverted to GPRS grids when they reach the limits of their monthly data plans.

Rogers Communications, a Canadian operator, estimates that 90 percent of mobile data traffic still runs on GPRS networks.

Mr. Nohl said he was surprised to find that the two Italian operators, TIM and Wind, did not encrypt their data traffic at all. In a statement, TIM would not confirm Mr. Nohl’s claims.

“TIM confirms that it uses state-of-the-art radio mobile technologies from primary international vendors to guarantee the protection of its mobile communications,” it said.

Mr. Nohl, who said he works for mobile operators who hire him to detect vulnerabilities in their systems, said many operators continue to run unencrypted data networks because it allows them to more easily filter out competing, unwanted services like Skype, an Internet-based service that allows consumers to make voice and video calls without using the operators’ voice networks.

“One reason operators keep giving me for switching off encryption is, operators want to be able to monitor traffic, to detect and suppress Skype, or to filter viruses, in a decentralized fashion,” Mr. Nohl said. “With encryption switched on, the operator cannot ‘look into’ the traffic anymore while in transit to the central GPRS system.”

Mr. Nohl said he intended to release his instructions at a conference of the Chaos Computer Club, a computer hackers’ group, which is being held near Berlin in Finowfurt, Germany. They will describe how to convert a Motorola C-123 cellphone, which is designed to run open-source software, into an interception device. But he said he would not release the keys to unlock the encryption used by operators to secure GPRS networks.

Mr. Nohl said his research was intended to prod mobile operators to improve the security of the wireless Internet, which he said was rudimentary compared with the safeguards protecting data sent over conventional, fixed-line computer networks. He said he destroyed the data he had intercepted from networks in Europe, and did not condone eavesdropping, a crime in Europe.

“We are releasing the software needed to reprogram cheap Motorola phones to become GPRS interceptors,” Mr. Nohl said. “This exposes operators with no encryption, like those in Italy, to immediate risk.”

Mr. Nohl said the release of the information would give mobile operators “a few months” to improve security before other hackers recreated his results and attempted to breach security of the mobile broadband networks.
https://www.nytimes.com/2011/08/10/t...-security.html





Exploit Writer Spills Beans on Secret iPhone Function

iOS debugger of no use to anyone ... except hackers
Dan Goodin

Black Hat Independent security consultant Stefan Esser made waves earlier this year when a technique he developed for hacking iPhones was adopted by JailbreakMe and other mainstream jailbreaking services.

The Register caught up with the German researcher at the Black Hat security conference in Las Vegas just ahead of his scheduled talk titled Exploiting the iOS Kernel. Here are the highlights of the discussion, including details about an undocumented debugger that can only be accessed using a custom-built connector:

El Reg: In a nutshell, what's your presentation about?

Esser: The idea is once you execute remote code you have the big problem that you still can't do anything on the iPhone because of all of the protections. In order to disable these protections you have to get into the kernel and disable them. If you cannot do that, you cannot put up a rootkit. So you need kernel exploits for rootkits.

I'm explaining how to actually exploit the kernel and how to make use of different parts. I show what you go after in the kernel, where you disable these security features. Most of the people don't know how to do kernel exploitations. This is a short course on how to do it.

You said earlier that one of the ways you go about exploiting the iOS kernel is making use of secret functionality. What is it?

It's a kernel debugger. It's actually not used. Developers should not have access to it. Apple even gives no normal way for an iOS developer to access that. They have it in the Mac OS, so they just left it inside the iOS kernel. It helps an attacker to make the exploit more stable, to make it easier to write the exploit.

It's an Apple-endorsed kernel debugger that people use to develop kernel drivers on Mac OS. On iOS, they're not supposed to do that, but Apple just left it inside, maybe thinking that no one can use it because normally the kind of interfaces you need, like Ethernet and serial, are not exposed from the iPhone. By default there's no way to speak with it on the iPhone.

So how do you access it, then?

When you look at the connector of the iPhone, there are two pins that are like serial. There's no public cable or something like that that will expose them, but you can build your own cable and then speak serial with the iPhone.

Does that mean an attacker has to have physical access to the iPhone he wants to exploit?

No. That's only for development of the exploit. It makes development of the exploit easier. It makes it far easier to do the exploit.

What kind of information does the debugger give you that you otherwise wouldn't have?

A debugger gives you complete control over the CPU at the moment of the crash, so you can do anything. You can read memory, write memory, read the registers, change the values of the registers. The development time will be far shorter, and it's not fishing in the dark anymore. It's like you have full light.

In a nutshell, how do you go about exploiting an iOS device?

When you have a kernel debugger, you start with having a bug in the kernel. Once you have a bug, the first thing you'd do is write some trigger code that will cause the crash, and either you analyze the crash dump or you take a kernel debugger and try to get the program to jump to your own code to execute.

I also show how, when you have a heap overflow, you manage to control the layout of the heap so that you can write an exploit that leverages the heap overflow. So that when you trigger the buffer overflow you can control what you're actually overwriting and how to gain code execution from that.

How is it different doing that in iOS than in Unix, Linux, Windows, or OS X?

There are a lot of similarities, especially between iOS and Mac OS. During the talk I will highlight some of the differences that in some cases make it easier to exploit and in other cases make it harder to exploit in iOS. A lot of the techniques are known but were never applied to iOS before. It's more proof that it's possible and how real-world examples work.

How would you describe exploitation of the iOS kernel relative to other ones? Is it harder?

The big difference is that iOS has code-signing, so you cannot just put some shell code in there or use the Windows way to have a short ROP [return-oriented programming] payload that makes some memory readable, writeable, and executable, and then just jump there. That's not possible in iOS. In iOS, you have to do the whole kernel exploit in return-oriented programming, which makes it a lot harder to create an actual exploit.

So the security features of the iPhone make kernel exploitation a lot harder, but once you're in the kernel, there are no mitigations inside the kernel to protect the kernel itself from being exploited. The kernel is just there to guard the userland, but the kernel itself has no mitigations inside.

If you were advising Apple, would you say, "Remove the debugger from iOS"?

There is actually no reason to keep it in there. I would advise them from a security point of view to remove it. And there are other features that are not really used and make exploit development easier. These are things Apple can remove that would make exploitation harder.

Like what?

There's a function in iOS and Mac OS also that gives you some information about the state of the heap. With this information, the whole controlling-the-heap thing is easier. It's still possible to do it without it, but the exploit can be made more robust, more stable, with having this feature.

What kind of communications have you had with Apple? Have you been speaking with anyone in their security department about your work?

Not really. The only thing I spoke with them [about] was they asked me to apply for a job.

When did that happen? Are you interested?

Right after I released the first of the exploits for jailbreaking the iPhone, in April or so. At the moment I'm just evaluating other options.
http://www.theregister.co.uk/2011/08..._hacking_tool/





Why 193,000 People Stopped Paying for TV Last Quarter
Ryan Lawler

With Cablevision reporting a loss of 23,000 subscribers and Dish Network shedding 135,000 in the second quarter, the U.S. pay TV industry has lost nearly 200,000 subscribers overall — and those are just the ones we know about. But if there was a lack of concern about cord cutters on second-quarter earnings calls, it's not because operators were unaware of the losses; it's because in most cases, they didn't want those subscribers anyway.

As seen in the chart below, public pay-TV providers collectively shed 193,000 subscribers in the most recent quarter. While losses by cable providers are nothing new, they are usually offset by stronger growth in satellite and IPTV providers picking up the slack. That didn’t happen this quarter, as somewhat weak growth by IPTV providers and a big loss at Dish highlighted what seems to be an exodus of pay TV subscribers amidst a weak economy.

Company 2Q Video Net Adds/Losses
Comcast -238,000
Time Warner Cable -130,000
Charter -79,000
Cablevision -23,000
Dish Network -135,000
DirecTV 26,000
AT&T 202,000
Verizon 184,000
Total -193,000

When the numbers actually shake out, things are likely to be even worse than this. Keep in mind that these are just the top eight public pay TV providers, and most of those above operate in metropolitan markets. There are a number of Tier 2 and Tier 3 providers not in this list, and many of those are in rural or underserved areas where the down economy has hit even harder.

Is competition really the cause?

On most of the earnings calls we sat in on over the past several weeks, there seemed to be a common refrain: Cable and satellite providers were losing subscribers in part due to increased competition and deals from the telco providers — Verizon and AT&T — who are aggressively buying share with steep upfront discounts.

But a look at the actual numbers doesn’t seem to bear that out. AT&T added 202,000 video subscribers in the second quarter, while Verizon added 184,000 in the same period. The addition of about 386,000 video subscribers combined is not out of line with previous quarters, and in fact is actually a little low compared to the 410,000 the telcos signed up in the first quarter or the 440,000 they added in the fourth quarter.

Will the real cord cutters please stand up?

If those pay TV subscribers aren’t actually going to competitors, where are they going? Most likely they’ve actually become cord cutters — two words that we didn’t hear much of on those earnings calls. In part, that’s because the rhetoric around cord cutters as anti-establishment, online video-watching rebels has largely been dispelled.

Studies have found those going without cable aren’t doing so because of over-the-top streaming offerings. Instead, those who are choosing to go without cable are doing so because they either don’t see much value in pay TV packages, can’t afford to keep paying for TV, or some combination of the two.

Operators acknowledge that the few video subscribers who have left the pay TV ecosystem so far have most commonly been on the bottom end of the cable value chain — that is, generally low-income users that just paid for TV and didn’t subscribe to broadband, HD or other higher-value services. And for most operators, that’s ok because they weren’t very high-margin customers anyway.

The myth of the higher-value customer

Cable providers are increasingly seeking ways to get more money out of their existing subscriber base. As a result, we’ve seen steady increases in average revenue per user (ARPU) as users sign up for more HD, more premium channels, more DVR set-top boxes throughout the home. That’s the reason Comcast’s ARPU stands at about $140, when basic cable service starts at about $39 based on some introductory offers.

On the other side, operators are increasingly shying away from customers who might not want to pay for the premium cable package, multiple DVRs and other bells and whistles. DirecTV and Dish Network both run credit scores of potential subscribers to weed out those who might turn out to be flakes and cancel after an introductory deal is over. The goal — to get customers signed up for as many value-added services as possible — is not just about driving up revenues, but about making those services sticky and increasing customer lock-in.

The problem is that in a world where all the cable operators are trying to sell ever-more expensive packages of services, there’s a sad truth of business they’re running up against, and it’s that not everyone is a luxury car buyer. That is, not everyone is in the market for the biggest and best. But in the cable world, there’s very little choice if all you want is a Kia.

Will cable reach a tipping point?

It’s not enough to blame the weak economy when things get rough and folks stop paying for cable; there’s also a structural problem with the way the industry views its subscribers. In the quest for higher margins and customer retention, those companies are generally willing to sacrifice subscribers at the low end if it means they can get more out of their so-called higher-value customers.

The question is how long the industry can keep pushing ARPU up before it starts to shed some of its better customers — those that aren’t necessarily poor, but don’t have $150 or more a month to spend on entertainment. There’s the old belief that TV is recession-proof, as consumers hunker down and spend more time at home rather than going out when their disposable income gets low. But at some point, the value proposition has to break down — especially when there are other ways to get low-cost video entertainment from services like Netflix or Hulu.
http://gigaom.com/video/cord-cutters-q2-2011/





The Top 10 Most Pirated Movies Of All Time
James Bruce

Last week we introduced the top 10 most pirated games of all time as a good indicator of real success. Can the same logic be applied to movies? Will the biggest grossing movies also be the most downloaded?

Read on to find out, in our list of the top 10 most downloaded and pirated movies, ever. This time, I’ve included box office stats from IMDB as well, to see how they correlate.

1. Avatar (2010) 16.58 Million Downloads/Box Office $2.03 Billion

An industry-changing movie from director James Cameron, Avatar was the first movie to be shot and directed from the start with 3D as the target viewing method, and it continues to be a benchmark for all other 3D releases today. The movie explores an epic struggle between the natives of the planet Pandora and Earth's military forces as they attempt to mine the incredibly valuable unobtanium, with the most breath-taking CG we've seen yet.

2. Kick-Ass (2010) 11.4 Million Downloads/Box Office $48 Million

Based on a comic book, Kick-Ass follows a boy who wants to be a super hero, despite a complete lack of super powers, training, or any real reason to become one. He soon finds himself famous thanks to YouTube, though, and gains the attention of a ridiculously hard 11-year-old girl and her gun-nut crime-fighting father.

Described as “morally reprehensible” by some, “a ridiculously entertaining, perfectly paced, ultra-violent cinematic rush” by others, Kick-Ass received mixed reviews to say the least.

3. Star Trek (2009) 10.96 Million Downloads/Box Office $385 Million

With a fan base of dedicated Trekkies begging for it, who wouldn't want to make a Hollywood-quality Star Trek blockbuster? Though previous movies have been hit and miss, this one managed to bring the franchise to a mass audience, and we are the better for it. An all-star cast and an intriguing plot about the early days of the USS Enterprise put this movie safely in the top 3.

4. Transformers: Revenge of the Fallen (2009) 10.6 Million Downloads/Box Office $836 Million

Is it possible to make a heartwarming film about giant transforming vehicle-robots? It seems so. Full of fantastic effects and a surprising number of comedic moments, Transformers is an instant hit for any child of the '80s. It also made for a great cinematic experience, which explains the high box office takings.

5. Inception (2010) 9.72 Million Downloads/Box Office $820 Million

A thrilling action movie that takes the concept of multi-layered, shared lucid dreaming and combines it with guns and bombs. The premise is planting an idea in someone's head through their dreams – resulting in a mind-blowing movie that may take a few viewings to understand fully. Even more mind-blowing is that lucid dreaming – becoming conscious while remaining inside a dream – is not, in fact, science fiction at all.

6. Shutter Island (2010) 9.49 Million Downloads/Box Office $127 Million

With DiCaprio in the lead role in both this and Inception, he's doing pretty well for himself. Another one that may give you a headache as you try to make sense of what you've just seen, Shutter Island follows a detective sent to investigate a murderer who has escaped from a hospital for the criminally insane. I can say no more without spoiling the plot, but suffice it to say it's an absolute winner.

7. RocknRolla (2009) 9.43 Million Downloads/Box Office $5 Million

Reminiscent of Lock, Stock and Two Smoking Barrels, this is a true London action/crime gun-fest, full of cockney accents, Americans we can all hate, and copious amounts of drugs. An absolutely dismal box office performance, but clearly still worth watching if it's free.

8. The Hangover (2009) 9.18 Million Downloads/Box Office $277 Million

Guy gets engaged, guy goes to Vegas with friends for a man-party, guys get drunk, guys lose the aforementioned engaged friend... good times and hilarity ensue. It's pretty textbook stuff, yet something about this movie has made it one of the best comedies of recent years. A reasonable success, and the sequel promises to be one of the biggest hits of this year too.

9. Iron Man 2 (2010) 8.8 Million Downloads/Box Office $312 Million

The only real sequel on this list (Star Trek doesn't count!), Iron Man 2 is another movie that really wants to be experienced in a cinema. It follows the continued adventures of a rich CEO who invents an armoured robot suit that effectively makes him a superhero – much to the chagrin of his own company, which makes weapons for the world. Then a new rival emerges on the scene, resulting in epic robot battles.

10. Twilight (2009) 8.72 Million Downloads/Box Office $382 Million

Sexy male vampires and werewolves, seemingly engineered to have an almost drug-like appeal to young girls. The female lead doesn't quite fit in with her social group – until she discovers her knight in shining armour, in the form of a glittering teenage vampire heart-throb. A joke to many, but clearly appealing to many more, Twilight has become a powerful cultural phenomenon of this century.

So do the download figures correlate with the box office? For the most part they do, but there are a few anomalies. First, Kick-Ass and RocknRolla stand out as heavily downloaded and pirated movies that were not commercially successful; I suspect this is because they were relatively unknown, niche titles, so people weren't willing to risk a pricey cinema trip on them. Second, Transformers and Inception punch above their list positions at the box office; I suspect this is because they are FX-heavy movies that practically demand the full “cinema experience”. (A quick numerical check of that correlation follows below.)
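
If you want to put a number on that eyeball test, the correlation is easy to compute. Here's a minimal, dependency-free Python sketch using the download and box office figures from the list above; the Pearson calculation is standard, and only the rounding of the inputs is ours:

from math import sqrt

# (title, downloads in millions, box office in millions of USD),
# taken straight from the list above
films = [
    ("Avatar", 16.58, 2030), ("Kick-Ass", 11.4, 48),
    ("Star Trek", 10.96, 385), ("Transformers: ROTF", 10.6, 836),
    ("Inception", 9.72, 820), ("Shutter Island", 9.49, 127),
    ("RocknRolla", 9.43, 5), ("The Hangover", 9.18, 277),
    ("Iron Man 2", 8.8, 312), ("Twilight", 8.72, 382),
]

def pearson(xs, ys):
    # Pearson correlation coefficient of two equal-length sequences
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

downloads = [d for _, d, _ in films]
gross = [g for _, _, g in films]
print(round(pearson(downloads, gross), 2))  # ~0.82: a strong positive link, driven largely by Avatar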

Note: These numbers were compiled from TorrentFreak.com's annual summaries. Bear in mind that the figures for 2011 aren't out yet, so it will be interesting to see where this year's statistics fit in – they may even eclipse the entire top 10.
http://www.makeuseof.com/tag/top-10-...d-movies-time/





Sorry, Hollywood: Piracy May Make a Comeback
Janko Roettgers

The U.S. credit ratings downgrade, tumbling stocks and international instability have made not just financial analysts nervous this week. Consumers are also starting to wonder whether we’re about to enter another recession. Whenever that happens, people start to tighten their belts and cut unnecessary expenses — like paying for movies and TV shows. Add in the Netflix price hike as well as new authentication plans from broadcasters like Fox, and you’ve got yourself a perfect storm for piracy.

Hollywood has in recent years embraced the idea that piracy is passé. Consumers have been moving toward licensed, paid services like Netflix in record numbers, with Netflix now accounting for close to three times as much residential downstream bandwidth as BitTorrent during peak hours. Netflix CEO Reed Hastings echoed this sentiment just two months ago when he told an industry audience that his company “finally beat BitTorrent.”

P2P is alive and well

[Chart: File sharing traffic is still growing - just not as fast as other applications.]

The problem with this narrative is that piracy never actually declined; it just didn't grow as fast as other types of media consumption. Throughout the day, BitTorrent is still responsible for an average of 21.6 percent of residential Internet traffic in North America, according to Sandvine. That's only slightly less than Netflix, which accounts for 22.2 percent. Cisco even estimates that global consumer file sharing traffic will more than double over the next four years, growing from 6 PB in 2011 to 13.9 PB in 2015 (the implied growth rate is sketched below).
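
Taken at face value, those Cisco figures imply a steep annual growth rate. Here's a back-of-the-envelope Python sketch of the arithmetic; the 6 and 13.9 figures are simply the estimates quoted above, and the compound-growth framing is ours, not Cisco's:

# Implied compound annual growth rate (CAGR) of Cisco's file sharing
# estimate, using the figures quoted above (2011 -> 2015)
start, end, years = 6.0, 13.9, 4

cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%}")  # prints "23.4%" -- i.e. more than doubling over four years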

With memories of the housing slump still fresh, many people could simply return to BitTorrent and download movies for free instead of going to the cinema or paying for VOD. To make matters worse, Fox will give people another reason to fire up their BitTorrent client next week: starting Monday, only DISH and Hulu Plus subscribers will have access to Fox TV shows the day after they air. Everyone else is supposed to wait eight days for their MasterChef and Glee fix; and past experience of how the availability of free legal content affects piracy suggests that up to a third of Fox's online audience may go back to BitTorrent.

Consumers cancel cable — is Netflix next?

Consumers will receive another reminder that paying for entertainment can be expensive two weeks after Fox's pay TV wall goes up: Netflix is splitting its DVD and online video subscriptions starting Sept. 1, and the huge backlash the announcement provoked in July suggests a significant number of subscribers may simply cancel outright. Even subscribers who downgrade to one of the two plans might want to get some of their entertainment elsewhere, and free torrent sites are always just a few clicks away.

If Hollywood needed a reminder of how dire the situation is, it came just a few days ago, when it became clear that 193,000 people stopped paying for TV last quarter. Not all cord cutters will flock to BitTorrent, but there will very likely be at least some cross-over.

[Image: LimeWire is dead, but people may just look elsewhere.]

Of course, piracy in 2011 isn't like it used to be: rights holders recently succeeded in getting some major ISPs on board for an anti-piracy program that could further curtail file sharing. And then there was the demise of LimeWire at the end of last year. The file sharing service may have fallen out of fashion with hardcore P2P users long ago, but it still had a huge footprint among casual downloaders.

Piracy is moving to the cloud

But LimeWire, and to a degree even BitTorrent, have been replaced by one-click-hosting and streaming sites, as well as a new generation of personal cloud media services that deliver movies and TV shows without the risk of getting sued. Cisco estimates that non-P2P file sharing will grow three times as fast as torrent-based file sharing from 2010 to 2015, and Sandvine noted in May that “a significant portion of traffic is associated with online back-up and file storage sites,” only to explain that much of this isn't caused by personal backups, but by file sharing through services like MegaUpload and RapidShare. Streaming sites that ignore take-down notices are also becoming increasingly popular, especially since they're much easier to use for consumers who are more familiar with Hulu than with The Pirate Bay.

It’s only a few more weeks until the TV season starts. With legal catch-up services becoming less appealing and Netflix raising its prices, it looks like the piracy comeback is just around the corner as well.
http://gigaom.com/video/file-sharing-is-back/





Leaked AT&T Letter Demolishes Case For T-Mobile Merger

Lawyer accidentally decimates AT&T's #1 talking point
Karl Bode

Yesterday a partially-redacted document briefly appeared on the FCC website, accidentally posted by a law firm working for AT&T on the $39 billion T-Mobile deal (somewhere there's a paralegal looking for work today). AT&T engaged in damage control, telling reporters that the document contained no new information, but our review of the document shows that's simply not true. Data in the letter undermines AT&T's primary justification for the massive deal, while highlighting how AT&T is willing to pay a huge premium simply to reduce competition and keep T-Mobile out of Sprint's hands.

We've previously discussed how AT&T's claims of job gains and network investment from the deal don't hold up, with overall network investment actually being reduced by the elimination of T-Mobile. While AT&T and the CWA are busy telling regulators the deal will increase network investment by $8 billion, out of the other side of its mouth AT&T has been telling investors the deal will reduce investment by $10 billion over six years. Based on historical averages, T-Mobile would have invested $18 billion during that time frame, which means an overall reduction in investment; a quick reconciliation of those figures follows below.
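
The netting behind that conclusion is easy to check. Below is a back-of-the-envelope Python sketch; the dollar figures come from the paragraph above, while the assumption that they can simply be added together is ours, not AT&T's:

# Back-of-the-envelope reconciliation of the investment figures cited
# above (billions of USD over six years). The netting logic is an
# assumption for illustration, not AT&T's own arithmetic.
att_claimed_increase = 8    # what AT&T/CWA tell regulators the deal adds
tmobile_standalone = 18     # what T-Mobile would have invested on its own

net_change = att_claimed_increase - tmobile_standalone
print(net_change)  # -10: matches the $10 billion reduction AT&T told investors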

Yet to get the deal approved, AT&T's key talking point to regulators and the press has been the claim that it needs T-Mobile to increase LTE network coverage from 80% to 97% of the population. Except it has grown increasingly clear that AT&T doesn't need T-Mobile to accomplish much of anything, and likely would have pushed to 97% anyway simply to keep pace with Verizon. AT&T, which has fewer customers and more spectrum than Verizon (or any other company, for that matter), already has all the resources and spectrum it needs for uniform LTE coverage without this deal.

For the first time, the letter pegs the cost of bringing AT&T's LTE coverage from 80% to 97% at $3.8 billion -- roughly a tenth of the $39 billion price tag on the T-Mobile deal. The letter highlights how the push for 97% coverage came from AT&T's marketing department, which was well aware that stopping LTE investment at 80% would leave the company at a competitive disadvantage to Verizon. Marketing likely didn't want a repeat of the Luke Wilson map fiasco of a few years back, when Verizon made AT&T look foolish over its poor 3G coverage.

The letter also notes that AT&T's supposed decision "not" to build out LTE to 97% was cemented during the first week of January, yet public documents (pdf) indicate that at the same time AT&T was already considering buying T-Mobile, having proposed the deal to Deutsche Telekom on January 15. In the letter, AT&T tries to make it seem as if the decision to hold off on that final 17% of LTE expansion was based on cost. Yet the fact that the company was willing to shell out $39 billion one week later, combined with AT&T's track record with these kinds of tactics, suggests AT&T executives knew the 80-to-97% expansion promise would be a useful carrot to dangle in front of politicians.

While the $39 billion price certainly delivers AT&T some equipment, employees, and spectrum, most of T-Mobile's network replicates AT&T's existing resources in major markets, and T-Mobile's network is significantly less robust in rural markets where AT&T would want to expand. While the deal provides AT&T with a shortcut to sluggish tower builds in a few select markets, by and large AT&T will be faced with terminating many redundant positions and decommissioning a lot of duplicative equipment. They'll also have to close a large number of retail operations and independent retailers.

Again, the reality appears to be that AT&T is giving Deutsche Telekom $39 billion primarily to reduce market competition. That price tag eliminates T-Mobile entirely -- and makes Sprint (and by proxy new LTE partner LightSquared and current partner Clearwire) more susceptible to failure in the face of 80% AT&T/Verizon market domination. How much do you think wireless broadband market dominance is worth to AT&T over the next decade? After all, AT&T will be first to tell you there's a wireless data "tsunami" coming, with AT&T and Verizon on the shore eagerly billing users up to $10 per gigabyte.

Regardless of the motivation behind rejecting a 97% LTE deployment, the letter shows that AT&T's claim that it needs T-Mobile to expand LTE coverage from 80% to 97% simply isn't true. That's a huge problem for AT&T, since nearly every politician and non-profit that has voiced support for the merger did so based largely on this buildout promise. It's also a problem for the DOJ review, since proof that AT&T could complete its LTE build for far less than the cost of this deal means the deal doesn't meet the DOJ's standard for merger-specific benefits.
https://secure.dslreports.com/showne...-Merger-115652





Anonymous: Facebook's Going Down November 5
Chris Matyszczyk

The more Facebook seems to dominate the world, the closer it seems to be to its end.

Earlier this year, there was dastardly nonsense being peddled that Facebook would shut down March 15. However, now we have news of an apparently credible threat.

It comes from Anonymous, the interesting group of people who express their principles in an activist way by infiltrating the systems of the unsuspecting or the merely complacent.

The Village Voice has pointed me to an Anonymous press release that states quite unequivocally that Facebook is going to get it on November 5.

Should that date not be dear to you, it is the day when British people let off fireworks to mark the foiling of Guy Fawkes, a man who felt the passionate need to detonate Britain's Houses of Parliament.

For those who haven't bothered with British history (short version: a lot of colonizing, eating meat, and pretending to be friendly), you might have seen Guy Fawkes immortalized in the fine movie "V For Vendetta".

Anonymous' press release is candid to its core. It declares: "If you are a willing hacktivist or a guy who just wants to protect the freedom of information then join the cause and kill facebook for the sake of your own privacy."

It then offers an interesting accusation: "Facebook has been selling information to government agencies and giving clandestine access to information security firms so that they can spy on people from all around the world. Some of these so-called whitehat infosec firms are working for authoritarian governments, such as those of Egypt and Syria."

There follows a hearty disquisition about how Facebook allegedly owns your information, regardless of whether you delete your account, and makes money from it.

Anonymous makes very clear on which side it believes Facebook has made its bed and laid its skateboard. It says: "Facebook is the opposite of the Antisec cause. You are not safe from them nor from any government. One day you will look back on this and realise what we have done here is right, you will thank the rulers of the internet, we are not harming you but saving you."

I know that, for many, Facebook will represent a slightly different target than those receiving the hacktivists' attention before--such as the International Monetary Fund, News Corp., or the Iranian government.

The question, though, on the lips of many, may be why Anonymous has given Facebook notice. You may also wonder whether Facebook is quaking, whether it has taken any precautions--and whether, perhaps, in the spirit of sharing, it has reached out to Anonymous to broker some sort of truce. (The offer of a super-duper fan page, perhaps?)

I have, naturally, reached electronically in Facebook's direction, seeking answers to these prewar questions.

It will be troubling if, come November 5, your Mom can't post pictures of the twins' birthday party. Are the hacktivists more powerful than the company that allegedly espouses "hacker culture"? That might make for a very good movie.
http://news.cnet.com/8301-17852_3-20...wn-november-5/

















Until next week,

- js.



















Current Week In Review





Recent WiRs -

August 6th, July 30th, July 23rd, July 16th

Jack Spratts' Week In Review is published every Friday. Submit letters, articles, press releases, comments, questions etc. in plain text English to jackspratts (at) lycos (dot) com. Submission deadlines are Thursdays @ 1400 UTC. Please include contact info. The right to publish all remarks is reserved.


"The First Amendment rests on the assumption that the widest possible dissemination of information from diverse and antagonistic sources is essential to the welfare of the public."
- Hugo Black