P2P-Zone  

JackSpratts
 
Peer-To-Peer News - The Week In Review - April 19th, '14

Since 2002

"Tails puts the essential tools in one place, with a design that makes it hard to screw them up. I could not have talked to Edward Snowden without this kind of protection. I wish I’d had it years ago." – Barton Gellman
"Tails puts the essential tools in one place, with a design that makes it hard to screw them up. I could not have talked to Edward Snowden without this kind of protection. I wish I’d had it years ago." – Barton Gellman


"Mona and I are getting our New Zealand assets back, unless the Crown appeals." – Kim Dotcom

April 19th, 2014




'Game of Thrones' Sets Piracy World Record, But Does HBO Care?
Paul Tassi

Due to the relative inaccessibility of HBO in most parts of the world, hidden behind a cable subscription paywall, then a separate subscription just for the channel itself, the channel sees piracy rates like no other. That’s especially true for its most popular show, Game of Thrones, which is usually the most pirated program of the year, every year.

But this weekend, the show broke a piracy world record after one of the most talked-about episodes ever aired. Everyone wanted to see what all the fuss was about at the royal wedding (no spoilers here), and as such, 1.5 million people downloaded the episode in the first day. The record? 193,418 people were sharing a single copy of the episode simultaneously. The previous record was 171,572 people sharing one copy of the Game of Thrones season three finale.

So, this seems like a pretty big deal, and the fact that HBO already has a streaming service in place, HBO Go, would indicate that a solution to this problem is obvious. HBO should unlock Go from the need to have a cable subscription. $15 a month gets you unlimited access to all their programming, and no need to order 300 other channels you couldn’t care less about.

That’s the utopian view of the situation, but unfortunately it overlooks many important factors. The biggest one is that HBO is owned by Time Warner, a company whose business is deeply tied to cable. They have little reason to encourage people to cancel their cable once they can get HBO by itself. Furthermore, they really don’t seem to care about Game of Thrones piracy much at all. Time Warner CEO Jeff Bewkes explains:

“Basically, we’ve been dealing with this issue for years with HBO, literally 20, 30 years, where people have always been running wires down on the back of apartment buildings and sharing with their neighbors,” he said. “Our experience is, it all leads to more penetration, more paying subs, more health for HBO, less reliance on having to do paid advertising… If you go around the world, I think you’re right, Game of Thrones is the most pirated show in the world. Well, you know, that’s better than an Emmy.”

When you see these sorts of record-breaking piracy numbers, it’s easy to think the end is nigh and HBO will wave the white flag. But I’ve written about this in the past. Even if piracy still exists, the war is essentially won. Most companies have realized that a pirated download does not automatically equal a lost sale. Many of these pirates wouldn’t subscribe to HBO Go even if it was standalone, as piracy is still easy and free. As such, the small number of converts wouldn’t be worth cutting ties between HBO and cable.

That’s not to say HBO doesn’t care about piracy at all. On torrent sites, HBO shows are constantly flagged, and it’s one of the only channels that still sends people letters via their ISPs about illegal activity. I never hear about anyone actually getting fined or sued for piracy, but it’s clear they’re not completely rolling over.

And I won’t say that HBO will never release Go on its own. They’ve already experimented with doing so in some regions, experiments that are still ongoing, and it’s pretty clear to everyone that the bloated cable model is dying a (far too slow) death. It will not be around forever, and HBO is ahead of the game by already having a (mostly) functional streaming service like Go that can be spun off when the time comes for Time Warner to throw in the towel.

But for now, Game of Thrones suffers an “acceptable” level of piracy that doesn’t really hurt the bottom line for HBO or Time Warner. As I said in my popular piece “You Will Never Kill Piracy, and Piracy Will Never Kill You,” it’s impossible for one side to completely wipe out the other. Piracy will never be stopped entirely, but neither will piracy take down the movie, television, music or video game industry. It’s just not possible. The “winners” of the piracy war have learned to simply ignore it for the most part, or even embrace it as HBO claims to.

Game of Thrones will be a massive success for likely its entire lifespan, and pirates serve as free marketing, just another group to tell the world how awesome the show is.
http://www.forbes.com/sites/insertco...does-hbo-care/





Never Mind Bitcoin. Remember BitTorrent? File-Sharing Firm Sharpens Image
Javier Espinoza

BitTorrent Inc. is rolling out a range of new products aimed at going beyond the file-sharing technology that shot it to fame, but which has landed many sites that use that technology in hot water.

Matt Mason, the chief content officer for the San Francisco outfit, was in London recently and spoke to Digits about the company’s efforts to burnish its image.

“Our biggest problem is that people just assume that BitTorrent is some piracy thing,” said Mr. Mason.

The company was an early leader in distributing videos online, pioneering an open-source protocol that essentially divides a film into pieces and sends them to multiple sources. Those sources can then work together to quickly reassemble video files on users’ computers.
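
For the curious, a minimal sketch of that piece-splitting idea in Python follows. The 256 KiB piece size and the file name are purely illustrative assumptions, and real .torrent files store raw SHA-1 digests rather than the hex strings printed here.

import hashlib

PIECE_SIZE = 256 * 1024  # 256 KiB; an assumed, illustrative piece size

def piece_hashes(path):
    """Split a file into fixed-size pieces and fingerprint each with SHA-1,
    the way a torrent records piece hashes so peers can verify chunks
    fetched from different sources."""
    hashes = []
    with open(path, "rb") as f:
        while True:
            piece = f.read(PIECE_SIZE)
            if not piece:
                break
            hashes.append(hashlib.sha1(piece).hexdigest())
    return hashes

# Hypothetical usage against a local video file
for i, digest in enumerate(piece_hashes("movie.mp4")):
    print(f"piece {i}: {digest}")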

About a decade ago, the peer-to-peer technology became the bête noire of the mainstream entertainment industry, as millions of people started uploading and downloading music and movies for free. Sites unaffiliated with BitTorrent popped up using the technology–and many of them have met with legal and regulatory scrutiny.

BitTorrent never became a target of legal action itself, and sought several years ago to legitimize its business by working with content providers.

It now operates as a file transfer and storage company, with a secure messaging platform. Later this year, it’s launching an encrypted chat service and a mobile app built around live-content streaming.

The company claims 170 million users per month. It underwent a broad restructuring in 2008. While it doesn’t report earnings, it has been profitable since 2009 and doubled in size since 2011, according to Mr. Mason.

In 2013, the company launched BitTorrent Sync, which lets users synchronize large files across their personal devices. It already has around two million users. It’s a similar service to Dropbox, Box and Google Drive, but without using cloud services. With BitTorrent Sync, which is still in beta, data is only stored on the user’s device and devices to which that user grants access.

BitTorrent is also testing paywalls, Mr. Mason said. The aim is for fans to pay content creators and rights holders directly for their work. According to Mr. Mason, the fee structure will be favorable to the artist.
http://blogs.wsj.com/digits/2014/04/...sharpen-image/





LongAccess Wants To Cold-Store Your Digital Life For 30 Years
Mike Butcher

Tech people are probably used to the idea that we’ll have our memories and photos stored online. Sure, we have it in the backs of our minds that it’ll all be on some dusty hard drive somewhere we can access in the nursing home – at least we HOPE that’s the case. At the same time, very few people know how to store something in the cloud for a very long time – other than hoping Google Drive might still be around in the era of flying cars, 150-year-old people and teleportation.

But very long-term cloud storage for enterprises is already big business. Few companies have set out to do the same for the ordinary person in the street. Long-term storage is usually considered an enterprise product, with very different characteristics; EVault from Seagate, for instance, recently announced the EVault Long-Term Storage Service.

And most people think about backups, not storage. This is good for the “next day” after a failure, but rarely good in thirty years’ time.

So business documents, research data and kids photos, plus the accompanying issues of encryption, plus storage media and technology obsolescence, not to mention financial commitment… well it all adds up to a problem.

A new startup, Longaccess, is making a big promise: to securely and privately store files, so that the user never has to do anything about them again. Founded in 2013 in Athens, Greece, Longaccess’ idea is to offer prepaid storage and access for 30 years.

Users don’t have to pay again (after the upfront cost) and they don’t have to migrate their files as technology evolves. And it comes with a printed Certificate that their kids, business associates or lawyer can use to reach the files if something happens. Think of it like a safe deposit box in a bank. (We’ll get to the printed thing in a moment.)

Today LongAccess releases a new feature that allows you to copy files directly from Dropbox to Longaccess.

Now, there is some competition. Small startups like SixSafe and Holdon.to (yet to launch) are looking at this market already. And Sony’s “Archival Disk” could well be a competing product, when and if it’s available on the market. But the jury is still out.

The LongAccess founders say they started Longaccess in the first place because – as we know – in 10 or 20 years even DVD drives will be obsolete, as will USB disks.

So the details: pricing starts at free for one year for 250MB, then €7 for 30 years for a 1GB capsule, and goes up to €399 for a 100GB capsule for 30 years (that’s €13.30 per year).

The average cost per GB is low because the company uses cold storage. In addition, files are encrypted on the client side using AES256 encryption and a randomly generated key that is never shared with the service itself.

Once an archive upload is complete, the application generates a unique certificate: a text file containing the archive ID and the encryption key. You then print the Certificate and the hard copy acts as a physical token that makes it easy to access these digital rights for the long term. No passwords, no emails, etc – just the certificate.

Users can keep the Certificate in digital format (it’s just an HTML file), but the hard copy of the Certificate, with its archive ID and key, could be printed on paper or even engraved into metal or stone. However they want to keep it.
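
The article doesn’t include Longaccess’ actual code, but the flow it describes (a random AES-256 key generated client-side and never shared with the service, plus a printable certificate carrying the archive ID and key) can be sketched roughly as below. This is a hypothetical illustration using the Python cryptography package; GCM mode, the UUID archive ID and the certificate layout are assumptions, not the company’s real format.

import os
import uuid
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def make_archive(plaintext: bytes):
    """Encrypt data client-side with a random 256-bit key and build a
    'certificate' holding the archive ID and key. Only the ciphertext
    would ever be uploaded; the key stays with the user."""
    key = AESGCM.generate_key(bit_length=256)   # never sent to the service
    nonce = os.urandom(12)
    ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)
    archive_id = str(uuid.uuid4())              # assumed ID format
    certificate = (
        f"Archive ID: {archive_id}\n"
        f"Key: {key.hex()}\n"
        f"Nonce: {nonce.hex()}\n"
    )
    return archive_id, nonce + ciphertext, certificate

archive_id, blob, cert = make_archive(b"family photos, will, tax records")
print(cert)   # print this and file it away; the service only ever sees `blob`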

“We are not competing with other cloud storage services, like Dropbox or SkyDrive”, Panayotis Vryonis, Longaccess CEO, told us. “I use them every day for saving, editing and sharing, they are my workbench. Longaccess, on the other hand, is more like a safe deposit box at a bank: it’s not as convenient for everyday use, but if you want to go the extra mile, you can be sure that whatever you put in it, will be safe, secure and your kids will find it.”

The company has raised €210,000 in total, largely led by Greece-based Openfund.
http://techcrunch.com/2014/04/17/lon...-for-30-years/





Kim Dotcom Gets NZ Assets Back

Cars, cash and property seized from Kim Dotcom by police in the 2012 raid at his Coatesville mansion could be returned to him within the next 14 days.

When police raided the Dotcom Mansion in January 2012 they seized $6 million of luxury cars - including 15 Mercedes-Benz, a pink 1959 Cadillac and a Rolls-Royce Phantom - and $10 million from financial institutions.

They also took a number of personal items belonging to Dotcom and his wife Mona.

The property was seized under a foreign restraining order made by the United States District Court two days before the raid.

However today an order was made in the High Court at Auckland declining a police request to effectively extend the order so Dotcom's assets could be retained.

Minutes ago Dotcom tweeted about the ruling.

"Breaking News: High Court ruling just now. Mona and I are getting our New Zealand assets back, unless the Crown appeals," he said.
http://www.nzherald.co.nz/nz/news/ar...ectid=11239397





MPAA and RIAA Members Uploaded Over 2,000 Gigabytes to Megaupload
Ernesto

This month both the MPAA and RIAA filed civil lawsuits against Megaupload and its founder Kim Dotcom for massive copyright infringement. What they failed to mention, however, is that many of their members' employees were actually sharing files on the site. In addition, Disney, Warner Brothers and Fox were all eager to set up content distribution or advertising deals with Megaupload.

Following in the footsteps of the U.S. Government, this month the major record labels and Hollywood’s top movie studios filed lawsuits against Megaupload and Kim Dotcom.

While the legal action doesn’t come as a surprise, there is a double standard that has not been addressed thus far.

The entertainment industry groups have always been quick to brand Megaupload as a pirate haven, designed to profit from massive copyright infringement. The comment below from MPAA’s general counsel Steve Fabrizio is a good example.

“Megaupload was built on an incentive system that rewarded users for uploading the most popular content to the site, which was almost always stolen movies, TV shows and other commercial entertainment content,” Fabrizio commented when the MPAA filed its suit.

However, data from Megaupload’s database shared with TorrentFreak shows that employees of MPAA and RIAA member companies had hundreds of accounts at the file-storage site. This includes people working at Disney, Warner Bros., Paramount, 20th Century Fox, Universal Music Group, Sony, and Warner Music.

In total, there were 490 Megaupload accounts connected to MPAA and RIAA members, and those users sent 181 premium payments. Together, they uploaded 16,455 files, amounting to more than 2,097 gigabytes of storage.

Remember, those are only from addresses that could be easily identified as belonging to a major movie studio or record label, so the real numbers should be much higher.

But there’s more. The same companies that are now asking for millions of dollars in damages due to massive copyright infringement were previously eager to work with Megaupload and Megavideo.

As we noted previously, Disney, Warner Brothers, Fox and others contacted Kim Dotcom’s companies to discuss advertising and distribution deals.

For example, Shelina Sayani, Digital Marketing Coordinator for Warner Bros, offered a deal to syndicate “exciting” Warner content to Megaupload’s Megavideo site.

Subject: Warner Bros. – Looking for Content Manager
Date: Wed, 14 Jan 2009 08:55:50 -0800
From: Sayani, Shelina
To: demand@megavideo.com
Dear Megavideo,

I’m writing from Warner Bros., offering opportunities to syndicate our exciting entertainment content (e.g. Dark Knight, Harry Potter, Sex and the City clips and trailer) for your users. Could you please pass on my information to the appropriate content manager or forward me to them? Thanks so much for your time.

Shelina Sayani
WB Advanced Digital Services
3300 W Olive Ave, Bldg 168 Room 4-023
Burbank, CA 91505
818.977.4668


Similarly, Disney attorney Gregg Pendola reached out to Megaupload, not to threaten or sue the company, but to set up a deal to have Disney content posted on the Megavideo site.

Subject: Posting on Megavideo.com
From: “Pendola, Gregg”
Date: 8/13/2008 10:06 AM
To: love@megavideo.com
My name is Gregg Pendola. I am Executive Counsel for The Walt Disney Company. Certain properties of The Walt Disney Company have content that they would like to post on your site.

However, we are uncomfortable with a couple of the provisions of your Terms of Use that we feel may jeopardize our rights in our content. We were hoping that you would be amenable to reviewing a 1-page agreement we have drafted that we would like to use in place of your Terms of Use.

Is there someone I can contact to discuss this? Or someone I can email the Agreement to for review?

Thanks. Gregg

Gregg Pendola
Executive Counsel
The Walt Disney Company


For Fox, the interest in Megaupload wasn’t necessarily aimed at spreading studio content, but to utilize Megaupload’s considerable reach by setting up an advertising deal. In this email former Senior Director Matt Barash touts FAN, the Fox Audience Network.

Subject: Fox Ad Partnership
Date: Mon, 23 Feb 2009 08:09:14 -0800
From: Matt Barash
To: sales@megaupload.com
I’m reaching out to see if you have a few minutes to discuss the recently launched Fox Audience Network.

FAN is now up and running and fully operational, utilizing best of breed optimization technology to bring cutting edge relevancy to the ad network landscape.
We are scaling rapidly and seeking the right 3rd party publishers to add as partners to our portfolio.

Please let me know if you have some time to chat this week about how we can work together to better monetize your inventory.

Best,
Matt

Matt Barash
Director, Publisher Development
Fox Audience Network


The above are just a few examples of major industry players who wanted to team up with Kim Dotcom. Now, several years later, the same companies accuse the site of being one of the largest piracy vehicles the Internet has ever seen.

If the MPAA and RIAA cases proceed, Megaupload’s defense will probably present some of these examples to highlight the apparent double standard. That will be an interesting narrative to follow, for sure.
http://torrentfreak.com/mpaa-and-ria...upload-140418/





New Bill to Crack Down on Illegal Downloads has Privacy Experts Worried
Justin Ling

You might want to think twice about downloading a pirated copy of the new Captain America movie — or any other film — thanks to a new piece of federal legislation that was quietly tabled in the Senate this week.

Bill S-4, the Digital Privacy Act, was introduced in the upper chamber on Tuesday, and privacy experts are concerned that the bill is carte blanche for companies to share Canadians’ personal information with big media companies who are trying to crack down on copyright infringement.

The crux of the legislation, which tightens regulations on the steps that companies need to take if security breaches compromise their users’ personal information, has received some tepidly positive reviews from analysts. But it’s the section tucked in the middle that’s raising eyebrows.

Currently, under the Personal Information Protection and Electronic Documents Act (PIPEDA) companies can only share their users’ private data with government and police, in limited circumstances, as they investigate a crime.

Once S-4 becomes law, PIPEDA will allow companies to share Canadians’ information with other companies if they believe there has been a breach of agreement, or a case of fraud.

In other words, says digital advocacy group OpenMedia, pirating a copy of Game of Thrones onto your laptop will mean that HBO may soon have your number. All they’ll have to do is call up your internet service provider and ask for the information of each user who has infringed their copyright.

The practice began in the United States, where companies — unflatteringly referred to as “copyright trolls” — have issued mass mailings to users who pirate copyrighted material. The letters range from cease-and-desist requests to notices of legal action and, more and more commonly, demands for reparation. In the U.S., companies have been known to seek as much as $75,000 or more for violations.

David Christopher, communications manager for OpenMedia, says it’s a dangerous precedent, and S-4 will allow for it.

“It also opens the door to telecom firms handing our private data to U.S.-style copyright trolls, without any court order or judicial oversight,” he says. “Worst of all, we’d never know when we’d been a victim of these privacy breaches as the disclosures would be kept secret.”
http://news.nationalpost.com/2014/04...perts-worried/





SA’s First Digital Pirate Receives Five Year Suspended Sentence
Adam Oxford

The first prosecution of a South African for sharing a media file over the internet has concluded this afternoon in the Commercial Crimes Court in Bellville, Cape Town. The offender, Mr Majedien Norton, pleaded guilty to copyright infringement and received a five year wholly suspended sentence, without a fine, in a plea bargain agreement with the state.

“It’s a huge relief for me and my wife,” Norton told htxt.africa upon leaving the court, “I’m just glad we can put this behind us now and move on.”

Norton uploaded a torrent link and seeded a digital version of Four Corners, a film about gangster life in the poor communities of the Cape Flats, to popular sharing website The Pirate Bay in November last year. The file, Norton says, was created after ripping a DVD purchased by a friend from a street vendor.

The charges were initially brought by the South African Federation Against Copyright Theft (SAFACT), which claimed that Norton’s file was a first generation creation from a file acquired directly from the film studio. Its director, Corne Guldenpfenning, originally claimed that she was ‘extremely certain of all the rights issues’ around the case and that “You have to be extremely careful before making a first example”.

We’re awaiting SAFACT comment regarding the outcome of the case and whether or not the organisation sees it as a success.

The trial has received a lot of coverage both online and in local papers and magazines. Pictures of Norton – an IT engineer who has recently been retrenched – and his family were widely circulated in the media after being lifted from Facebook.

The director of Four Corners, South African Ian Gabriel, told htxt.africa that he was philosophical about the effects of online copyright infringement and that while it will have cost the production company some viewers, others who would never have heard of the film may now have paid to see it.

“I think the way people think now, digitally, they don’t see piracy as piracy any more,” Gabriel said. “They see it as sharing. We will definitely not get as many people to the cinemas as we would have if the film were not pirated. At the same time, there are people who have seen the film who would never have got to the cinema. I’m pleased the film is reaching those people because there’s a message of pride and self-recognition and of choice for ordinary people that the film is delivering and it’s important that message be heard.”

The film was released into theatres at the end of last month, and is currently the 51st highest grossing movie of the year according to stats at Box Office Mojo. Other major South African releases – including the latest Leon Schuster flick – are widely available online. With box office returns so low in the country – even Long Walk to Freedom took less than $2.5m in receipts here – international and DVD sales are vital under the current model to keep funding relatively big budget productions.
http://www.htxt.co.za/2014/04/17/sas...nded-sentence/





Record Labels Sue Pandora Over Pre-1972 Recordings
Eriq Gardner

The popular streaming service is hit with a complaint in New York court with potentially big consequences.

The major record labels are now suing Pandora for exploiting sound recordings made prior to Feb. 15, 1972.

Last September, a similar lawsuit was filed against Sirius XM. The subject of the lawsuit has to do with the fact that sound recordings didn't begin falling under federal copyright protection until the above date. As such, the streaming service might not be able to rely upon SoundExchange, the performance rights organization that collects digital and satellite royalties on behalf of sound recording copyright owners. The record labels are testing this belief, now asserting New York state misappropriation claims over older music being streamed on Pandora.

"Pandora's refusal to pay Plaintiffs for its use of these recordings is fundamentally unfair," says the lawsuit. Among the artists whose songs are said to be infringed upon Pandora are Bob Dylan, The Beatles, David Bowie, Elvis Presley, James Brown and Led Zeppelin.

If the plaintiffs prevail, many of the songs could be off Pandora, as Capitol Records, Sony Music, Universal Music, Warner Music and ABKCO Music are demanding an injunction in addition to compensatory damages, punitive damages and all proceeds gained as a result of the exploitation of pre-'72 music. Pandora has more than 250 million users.

The complaint points out that Pandora features specific stations that leverage the older music including "50s Rock 'n' Roll," "60s Oldies," "Motown," "Doo-Wop," "Early Jazz," and others.

Pandora works by attempting to feed its users songs that are similar to their favorites, but the algorithm can sometimes be reliant on what users pick in the first place. For example, the lawsuit says that on the "Beatles" station, a user can expect to hear a recording from the John Lennon group approximately four times during a three-hour period.

The RIAA circulated the lawsuit on Thursday along with quotes from artists or their heirs. "It’s an injustice that boggles the mind," says Booker T. & the MG's Steve Cropper. "Just like the programmers who deserve to be paid for their work, I deserve to be paid for mine.”

The plaintiffs are being represented by the same attorneys at Mitchell Silberberg & Knupp who are representing them in the ongoing case against SiriusXM.

In the SiriusXM case, there could be important clarity on the issue of pre-'72 music coming in weeks. The plaintiffs have brought a motion for jury instruction and there is a hearing set for May 14. The satellite radio company has argued "there is no state law that requires SiriusXM (or any of the hundreds of thousands of other U.S. businesses that publicly perform music) to pay license fees for Pre-1972 Recordings."

Even if SiriusXM is wrong, and a judge says that misappropriation laws protect pre-1972 music, the defendant could assert affirmative defenses like laches, or the prejudice that has come from waiting decades before suing. SiriusXM also says a ruling would impact radio and television broadcasters, bars, restaurants and website operators using pre-'72 music.

One more point: The SiriusXM case comes in a California court, while the latest Pandora lawsuit was filed in New York. If federal law doesn't apply, judges in different states could come to different conclusions over the issue of pre-'72 music. At the moment, there might be nothing but costs stopping the record labels from pursuing Pandora and SiriusXM in 49 other states.
http://www.hollywoodreporter.com/thr...ora-pre-697327





How to Exploit Home Routers for Anonymity
Dan McInerney

This article is just a demo for educational purposes. To those who say this sort of information should be censored, I say you can close your eyes and shout, “la-la-la-la-this-doesn’t-exist” all you want but that won’t make practices like those outlined below disappear. Only through awareness can you grow and protect yourself and others.

Download device-pharmer
git clone https://github.com/DanMcInerney/device-pharmer
Device-pharmer will take advantage of Shodan and concurrently test 1000 hosts from the search results to find open targets. It will print the IP, port, and title of the page if the connection was successful. All successful connections will be logged in _results.txt in the current working directory. Device-pharmer will be included by default in the next update of Kali.
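
device-pharmer’s actual source is on GitHub at the address above; purely as a rough sketch of what “search Shodan, then test the results concurrently” involves, something like the following hypothetical Python would do the core job, using the official shodan package and requests. The query string, placeholder API key and 12-second timeout mirror the examples later in this article; logging and error handling are stripped down.

import concurrent.futures
import requests
import shodan

API_KEY = "YOUR_SHODAN_API_KEY"   # placeholder; see the next step

def check(host):
    """Try a plain HTTP GET against one Shodan result and report what answered."""
    ip, port = host["ip_str"], host.get("port", 80)
    try:
        r = requests.get(f"http://{ip}:{port}", timeout=12)
        return f"{ip}:{port} -> {r.status_code}"
    except requests.RequestException:
        return None

api = shodan.Shodan(API_KEY)
results = api.search("dir-300")   # one page (up to 100 hosts) on a free account
with concurrent.futures.ThreadPoolExecutor(max_workers=50) as pool:
    for hit in pool.map(check, results["matches"]):
        if hit:
            print(hit)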

Get a Shodan API key
1) Sign up for a free Shodan account
http://www.shodanhq.com/account/register
Recommended.

OR

2) Search Google for one
site:pastebin.com shodan api key
This is not an optimized search. It’s just to give you an idea of how to find this sort of information.

Choose a router model to target
Search Google/Amazon/Cuil for routers with baked-in VPN support. Perhaps “vpn router” might do the trick ;). PPTP and OpenVPN are probably the easiest to set up. We’ll pretend for the rest of this exercise that the common D-Link DIR-300 is a router with baked-in PPTP VPN support via stock firmware.

(Optional) Find a free HTTP proxy
https://hidemyass.com/proxy-list
Choose Speed: Fast and Connection time: Fast

OR

git clone https://github.com/DanMcInerney/elite-proxy-finder
Run:
python elite-proxy-finder.py -s 1
This script scrapes a couple of reliable proxy sites for high-anonymity public proxies and concurrently tests the results against https://www.yahoo.com. It will then display how fast each proxy was. The -s option specifies that you only want to show the fastest of the 100+ proxies it’ll find and test. I might just add this into device-pharmer as an option like '--auto-proxy'.
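
As a stripped-down, hypothetical version of that “fetch a known page through each proxy and time it” step, the sketch below uses requests; the candidate proxy addresses are placeholders and the Yahoo test URL is the one the script reportedly checks against.

import time
import requests

CANDIDATES = ["123.12.12.123:8080", "124.13.13.124:3128"]   # placeholder proxies
TEST_URL = "https://www.yahoo.com"

def proxy_speed(addr):
    """Return the seconds taken to fetch TEST_URL through a proxy, or None on failure."""
    proxies = {"http": f"http://{addr}", "https": f"http://{addr}"}
    start = time.time()
    try:
        requests.get(TEST_URL, proxies=proxies, timeout=15)
        return time.time() - start
    except requests.RequestException:
        return None

timed = sorted((t, p) for p in CANDIDATES if (t := proxy_speed(p)) is not None)
if timed:
    print(f"fastest proxy: {timed[0][1]} ({timed[0][0]:.1f}s)")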

Search Shodan using device-pharmer
python device-pharmer.py -s 'dir-300' -a Wutc4c3T78gRIKeuLZesI8Mx2ddOiP4 --proxy 123.12.12.123:8080 --timeout 30

Alternatively if you know the default username/password you can tell the script to attempt to login to each device found:
python device-pharmer.py -s 'dir-300' -a Wutc4c3T78gRIKeuLZesI8Mx2ddOiP4 --proxy 123.12.12.123:8080 --timeout 30 -u admin -p password

-s: Search Shodan for 'dir-300'; use single or no quotes
-a: Shodan API key
--proxy: Proxy all requests through this server (optional)
--timeout: By default it’s 12 seconds, but since we’re proxying our requests we’ll want to increase that to account for the lag the proxy is going to introduce (optional)
-u: Try logging in with this username (optional)
-p: Try logging in with this password (optional)

If you have a free account you will only be given one page of results, which amounts to 100 hosts. Plenty. If you have a pro account then you can use the -n option to specify how many pages of results you want to run through, like "-n 5".

Example results in the log file dir-300_results.txt without attempting to log in:

Set up dynamic DNS
http://www.noip.com
Register a free account then go to Manage Hosts > Add Host and fill it out. Max of 3 hosts.

Visit one of the results from the log file “dir-300_results.txt” in your browser
1) Look for the dynamic DNS settings (usually under a link like “DDNS”) and set it up with your noip account
2) Look for the PPTP VPN settings once you’re in, enable it if necessary, and create an account for yourself.

Set up network manager
1) apt-get install network-manager-pptp-gnome
Assuming you’re in Kali.

2) http://support.vpninja.net/hc/en-us/...-04-PPTP-Setup
Follow the instructions here.

Clear the router logs

Probably a good idea to do this before and after every session you make to the router. Safety first, of course. Usually you can find the logs in a link like “Settings” or “System” within the router web interface. If you can completely turn off the logs, even better.

Voila!

Your own personal VPN that you don’t have to share. VPNs are great at bypassing pesky internet filters and improving your anonymity online. The nice thing about this is that you don’t have to wonder whether or not your VPN provider is saving logs; you are in control of that.

Ultimately this is one of the less malicious things you can do with this power. If you really wanted to do harm you could change the DNS to point to a malicious server, amongst other ideas. As the internet-of-things ramps up, the amount of low-hanging fruit you can find using the methods described here is going to explode like the Cambrian explosion.
http://danmcinerney.org/how-to-explo...for-anonymity/





The Insanely Fast WiFi Router You’ll Probably Never Need
Brian Fung

The average American household connects to the Internet at a rate of 10 megabits per second. Not bad, but also not fantastic — by way of comparison, a single HD Netflix stream takes up 5.8 Mbps of bandwidth. Now with that as our baseline, consider the speeds of the country's fastest Internet connections today: 1 Gbps, or a gigabit per second. That's equivalent to 1,000 Mbps, or roughly 100 times faster than the national average.

But if you thought that was fast, wait until you hear about a new WiFi router that's coming next year. It's capable of 10 Gbps — 10 gigabits per second. That's a thousand times the rate of the average American broadband connection. It's mindboggling. You could theoretically stream 1,724 Netflix movies, all in HD, all at the same time and not see any lag. And the manufacturer, Quantenna, says its 10-gigabit WiFi router will be shipping next year, though it offered no word on its price tag.
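
Those comparisons are easy to reproduce from the numbers quoted above (all figures come from this article):

router_mbps = 10_000     # the 10 Gbps router, expressed in Mbps
avg_home_mbps = 10       # average US household connection
hd_stream_mbps = 5.8     # one HD Netflix stream

print(router_mbps / avg_home_mbps)        # 1000.0 -- a thousand times the average connection
print(int(router_mbps / hd_stream_mbps))  # 1724   -- simultaneous HD streams, in theory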

The new device is being announced just a year after the standards body responsible for WiFi put the stamp on 802.11ac, the technical protocol that provides for 1 Gbps WiFi. To see a 10x jump so soon on top of that announcement is pretty crazy.

Let's pause for a second. This doesn't necessarily mean your Internet service is going to get faster; if you've subscribed to a 50 Mbps plan, for instance, that's what you'll continue to get from your Internet provider. What we're talking about is the rate at which your WiFi router passes data from your Internet provider to your PC. If you have a 1 Gbps fiber optic connection but a really old router, your router will prevent you from getting the most of your subscription because the data is being bottlenecked when it reaches your home. You need the right equipment for the right job.

Since the average household Web connection is still lagging at 10 Mbps, it'll be hard for most people to take advantage of the 10-gig router right away. They simply don't consume enough data to need the giant pipes provided by this new technology. But it's still an awesome indicator of where the future may be headed.
http://www.washingtonpost.com/blogs/...ly-never-need/





Faster Wi-Fi on Flights Leads to Battle in the Sky
Alwyn Scott and Victoria Bryan

Wi-Fi in the sky is taking off, promising much better connections for travelers and a bonanza for the companies that sell the systems.

With satellite-based Wi-Fi, Internet speeds on jetliners are getting lightning fast. And airlines are finding that travelers expect connections in the air to rival those on the ground - and at lower cost.

But the fast evolution of rival systems and standards, such as Ku band and Ka band, poses a big question for airlines: which one to choose?

Equipping fleets can cost hundreds of millions of dollars, and airlines don't want to see their investment quickly become outdated due to newer technology. That's made some cautious about signing up.

"We don't want to end up with a Betamax," said Peter Ingram, chief financial officer of Hawaiian Airlines, referring to the Sony video format that eventually lost out to the VHS standard, leaving many consumers with obsolete systems.

Hawaiian is still considering which system to use.

The drive for in-flight connectivity also has intensified after the disappearance on March 8 of Malaysia Airlines Flight 370 with 239 people aboard. Search teams are scouring parts of the Indian Ocean for the missing aircraft, and it might have been better tracked if a satellite system capable of streaming cockpit data had been on board.

GLOBAL MARKET

The U.S. market for airborne Internet got a big boost last November after the U.S. Federal Aviation Administration allowed passengers to use smartphones, tablets and e-readers throughout a flight, ending a long-standing ban on their use during takeoff and landing.

While the change hasn't been adopted worldwide, the FAA's move is expected to lead to greater use of devices, and bandwidth, on planes.

About 40 percent of U.S. jetliners already have some Wi-Fi, but the race is on to wire the rest of a growing global fleet, and to make the existing connections better.

The number of commercial planes worldwide with Wi-Fi, cell service or both is expected to more than triple over the next 10 years, to 14,000 from about 4,000 currently, with much of that growth in Asia, according to research firm IHS.

Even with a tripling, only half of the worldwide fleet will be wired in 2022, suggesting demand for new systems will last longer.

Much of the U.S. fleet will need upgrades to access satellites, since many planes currently are equipped for ground-based transmission, which is typically slower than satellite.

"Passengers of the future want to be connected when they want," Chris Emerson, senior vice president of marketing at Airbus, told Reuters during the Aircraft Interiors Expo in Hamburg.

"Everyone wants Internet the way they have it on the ground, so it has to be cheap or free."

GREATER SPEED

Satellite technology will speed up onboard connections sevenfold, to about 70 megabits per second next year, fast enough to download a two-hour high-definition movie in about four minutes. Of course, that bandwidth will be shared among all of the users on the flight, which could number 200 or more.
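
A rough back-of-the-envelope check on those figures, using only the numbers quoted in this article (the implied movie size is an estimate, not a stated spec):

link_mbps = 70      # per-plane satellite link speed expected next year
passengers = 200    # one plausible full load sharing the link
minutes = 4

print(link_mbps * minutes * 60 / 8 / 1000)   # 2.1  -- GB moved in four minutes, a plausible 2-hour HD file
print(link_mbps / passengers)                # 0.35 -- Mbps per person if everyone is online at once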

Satellites also will allow service to reach developing markets in Asia and Latin America, and to offer expanded service in the U.S. and European markets.

Investors expect the global expansion and faster speeds will fuel greater use of services, with revenue split between the providers and the airlines.

It also will drive hardware sales, as airlines outfit aircraft with antennas, radios and routers. Honeywell, for example, makes fuselage-top antennas that link to the Global Xpress network provided by Inmarsat PLC, which operates on the Ka band.

In demonstrating the GX system at the Hamburg show, Honeywell said the system can deliver up to 50 megabits per second consistently around most of the globe, and it plans to test it on its own plane this summer, while Air China is expected to start trials with it in late 2014 or early 2015.

"GX is going to be a real game changer for airlines and their passengers from 2015 when the service comes online," John Broughton, director of product marketing for GX at Honeywell, said in an interview.

A rival standard, Ku band, operates in a lower frequency band. While it may be able to achieve higher bandwidth than Ka band in certain areas, its overall connectivity is not as consistent, especially on long-haul flights over oceans, experts said.

Gogo used the Hamburg show to announce its 2Ku system, which will use a special dual antenna made by ThinKom Solutions Inc to raise the capacity of the Ku band system to 70 Mbps, a leap from its current systems that operate at 3 to 10 Mbps. Gogo also offers Ka band satellite connectivity and built its business on ground-based cell-tower technology in the United States.

"BETAMAX" RISK

The improving systems mean customers will demand better connections. Some frequent fliers with status on several airlines say they choose flights based on Wi-Fi availability.

"It becomes an ante at the table," said Jonathan Schildkraut, an analyst at Evercore Partners, which co-managed Gogo's IPO last June.

But the variety of systems poses tough decisions for airlines, which risk choosing a technology that could become outdated.

Ingram, the CFO of Hawaiian Airlines, said the choice and cost of a system is especially important for his fleet since it mostly carries people on vacation - people who don't want to be tethered to the office.

"The technology in the Wi-Fi space for trans-Pacific flying is still evolving," he said, "so we haven't made any final decisions yet."

German airline Deutsche Lufthansa AG knows the perils. It originally worked with Connexion, a Boeing unit developing in-flight Wi-Fi that operated a decade ago but failed to attract enough customers.

"We were a little bit unfortunate," CEO Christoph Franz said in an interview. "We had spent millions to equip our aircraft."

Lufthansa has since outfitted more than 90 percent of its long-haul planes with satellite connectivity.

But it is taking a step-by-step approach for other planes, outfitting about 30 Airbus A321 aircraft with a system that can stream content from an onboard server to handheld devices, but doesn't connect to the internet.

"We need a decent provider for that, but we didn't want our customers to wait," Franz said. He expects a "triple-digit-million" euro investment to outfit the full fleet.

"We are ready to do this," he said. "But we have to look at the bill. We will see which system at the end of the day turns out to be the most affordable and the fastest."
http://www.reuters.com/article/2014/...A3D06U20140414





Google: Still No Plans to Bring Fiber to New York

Google is hiring a Fiber regional sales manager in NYC, but that's it.
Jon Brodkin

A Google Fiber job posting in New York City has a bunch of tech news sites excited about the prospect of Google bringing its fiber Internet service to the Big Apple.

It would certainly be a nice consolation prize for a city bemoaning the comical failure of the New York Knicks. But Google says there are no such plans.

"Don't read into the job listing," a Google spokesperson told Ars. "We've had a full team of folks working on Fiber in the New York office (and other locations around the world) for years. We don't currently have any plans to bring Google Fiber to New York. We're entirely focused on building out our networks in Kansas City, Austin, and Provo, and on exploring the possibility of bringing Fiber to the 34 locations we announced in February."

The job listing is for a regional sales manager position and says, "You will manage multiple teams that evangelize Google Fiber services to MDU (multi-dwelling apartments and condos) and large SMB owners. You will hire and manage a team that proactively reaches out and articulates how Google Fiber Solutions can help make their work more productive." The successful applicant will "lead and motivate multiple sales teams across multiple locations."

Google is also hiring regional sales managers in Mountain View, California and Austin, Texas. Google Fiber is coming to Austin and is being considered for Mountain View and nearby cities.

Google recently announced that it chose nine metro areas around the country for potential Fiber deployments. The closest ones to New York City are Raleigh-Durham in North Carolina and Atlanta, Georgia. New York City already has fiber in the form of Verizon FiOS, and Google has focused mostly on underserved areas where municipal officials are willing to provide expedited permitting and other perks. There are still millions of Americans without broadband, so there are plenty of areas where Google Fiber is needed.

One thing that is clear is that Google is building up its Fiber team. Job listings indicate that more than 60 positions are open. There is one other Google Fiber position open in New York, for a network infrastructure design manager.
http://arstechnica.com/business/2014...r-to-new-york/





Verizon Brings Fake Grassroots Campaign To New Jersey To Claim Support For Not Bringing Real Broadband
Kate Cox

New Jersey might not be that large a state, but its geography and its dense population make it easy to understand how running a broadband connection to 100% of residents could be a cumbersome and expensive project. So what’s a corporation stuck with a twenty-year-old public interest obligation to provide those broadband connections to do? Create a fake tidal wave of public support for their attempt to weasel out of it, of course!

Verizon’s been taking that last option, as Ars Technica reports. There’s a legal process underway between the state and Verizon, and Verizon has mobilized an army of largely imaginary concerned citizens on their behalf to support their case.

In order for this to make any sense, let’s take a quick trip to the department of backstory.

The original agreement, part of the “Opportunity New Jersey” plan, dates to 1993. At that very early time, telephone operator New Jersey Bell made an agreement with the state of New Jersey (PDF) that included a provision for getting broadband digital service to 100% of the state by the year 2010. The agreement defined broadband as:

[S]witching technologies matched with transmission capabilities support data rates up to 45,000,000 bits per second [45 Mbps] and higher, which enables services, for example, that will allow residential and business customers to receive high definition video and to send and receive interactive (i.e. two way) video signals

That agreement was extremely foresightful, considering that in 1993 the few households regularly getting online at all were using incredibly slow dial-up modems and mostly connecting to services like AOL, CompuServe, and Prodigy. (On our CRT screens! Uphill in a blizzard!) But what neither NJ Bell nor the state foresaw at the time was the re-consolidation of the phone companies, and many of the little Bells eventually joining back up and becoming Verizon.

Verizon did not meet that 100% coverage goal by 2010. Nor did they meet it in any of the four years since. It’s not surprising that they fell short, given that they are both trying to kill off their old copper land lines and slowing, or even halting, their FiOS expansion. That combination doesn’t leave a whole lot of room in the strategic plan for “run more wires through New Jersey.”

New Jersey is understandably not pleased with Verizon’s lack of follow-through, particularly as the company has received twenty years’ worth of concessions from the state in order to make it happen. If Verizon has only reached two million of New Jersey’s three million households in twenty years, then those tax breaks and increased customer rates aren’t getting NJ what they were supposed to.

NJ’s utilities board required Verizon to explain why they haven’t met the terms of the roll-out, which kicked off a legal process in which Verizon has said that they shouldn’t have to anymore because it’s complicated and expensive. Plus, they say, competition now exists and anyway they can totally make good on the remainder with wireless 4G service.

Now we’re back to the present day, where the legal back-and-forth between Verizon and New Jersey hit the public comment stage last month. In order to bolster their claims, Verizon went and found a whole lot of public to make comments in their favor. 418 identical pro-Verizon e-mails were sent to the utility board between March 22 and March 24, and 315 copies of a second form e-mail claiming to be from Verizon employees were sent on March 19 and 20.

In total, Ars reports, at least 792 form comments were submitted on Verizon’s behalf in the span of a couple of weeks. And while that would be one thing if Verizon really did get happy customers or gung-ho employees to send the letters… it seems they didn’t.

Ars Technica picked some e-mail addresses from the list to spot-check. Some were invalid and received bounce-backs. One was a Verizon customer who was shocked to find that “he” had sent the message at all:

“I am a customer only to Verizon and I was not contacted by them to submit anything,” the person told Ars. “If they did, I would’ve slammed them. They are gougers. If AT&T was where I lived, I would switch in a heart beat.”

When this customer was shown the e-mail he allegedly sent to state officials, he said, “That would mean someone did it on my behalf. I can assure you that I did not send that response.”


Broadband news site Stop The Cap also tried to contact the supposed Verizon supporters and found much the same problem. Of the 150 e-mail addresses they tried, 35 were invalid. So they contacted another 35, and a dozen of those were also invalid. Of the remainder, many were Verizon employees or retired Verizon employees. One was a lawyer who represents Verizon (and did not disclose it). Five had “no idea what we were talking about” and claimed they never sent any e-mails either for or against Verizon.

Ars’s report includes a good technical discussion of what’s at stake for New Jersey residents now. Whether or not Verizon will be obliged to run more wires through the Garden State is up in the air, but what isn’t is the fact that broadband access has gotten more important than ever. The digital divide is real, and even Comcast admits that expanding access is a necessity.

Creating a broadband plan in 1993 was remarkably ahead of its time. But faking grassroots support in order to pretend it’s not needed anymore? That’s a great way for Verizon to set the state back.
http://consumerist.com/2014/04/17/ve...eal-broadband/





Microsoft Slashes Windows XP Custom Support Prices Just Days Before Axing Public Patches

Reduces after-retirement support costs for large enterprises as much as 95%
Gregg Keizer

Just days before Microsoft retired Windows XP from public support, the company drastically reduced the price of custom support agreements that give large companies and government agencies another year of XP patches, experts reported today.

"I believe that Microsoft changed prices because it decided that not enough customers were enrolling in the program, and it was apprehensive of the ramifications of any Windows XP vulnerabilities," said Daryl Ullman, co-founder and managing director of the Emerset Consulting Group, a firm that specializes in helping companies negotiate software licensing deals.

At Ullman's recommendation, one Emerset client had spurned a $2 million deal two weeks ago to provide 10,000 XP PCs with custom support. But Microsoft came back days later with a price of just $250,000. Ullman advised his client to jump at what he called "an insurance policy," and the firm signed on the dotted line.

Others told Computerworld of similar deals Microsoft offered at the last minute to get customers to commit to another year of patches.

Custom support agreements, or CSAs, provide critical security updates for an operating system that's been officially retired, as Windows XP was April 8. CSAs are negotiated on a company-by-company basis and also require that an organization have adopted a top-tier support plan, dubbed Premier Support, offered by Microsoft.

The CSA failsafe lets companies pay for security patches beyond the normal support lifespan while they finish their migrations to Windows 7.

Windows XP's retirement was major news last week, and not only in the technology press, because the nearly-13-year-old OS still powers almost 28% of the world's personal computers. With the patch spigot turned off, many security experts, including Microsoft's, believe that cyber criminals will have a field day hacking XP PCs.

Although Microsoft has been beating the dump-XP drum for years, it has had mixed results getting everyone off the aged operating system. Most cite a combination of budgetary issues, the stability and familiarity of XP, the poor reception of Windows 8, and sheer inertia as the causes of Windows XP's persistence.

The turnabout on CSAs was a marked change from late 2012 and early 2013, when Microsoft significantly boosted prices by reinstituting a $200 per-device model and setting top-end caps of as much as $5 million.

Michael Silver, an analyst with Gartner, had tracked those price increases last year. Today, he said several Gartner clients had reported massive price breaks in the last two weeks. "Microsoft made it much more affordable, but still priced to encourage companies to migrate," he said.

The new ceiling is $250,000, according to several sources, although the $200-per-device price remained in place.

Like Ullman, Silver attributed Microsoft's discounting to a fear of the backlash that would result if a large customer's PCs were infected with malware after the patch halt. "[A CSA] provides a modicum of protection to organizations and to Microsoft, which likely seeks to avoid public criticism for any Windows XP security breaches," Silver wrote in a note to clients April 8, the same day Microsoft retired the OS.

Sources familiar with Microsoft's position claimed that the company changed its CSA pricing tune after chief operating officer Kevin Turner returned to Redmond at the beginning of the month from a swing through the sales force, where he got an earful about customers with thousands of XP machines and no chance of making the migration deadline. The decision to drop prices was made shortly after that.

Ullman and Silver corroborated the timeline, saying they began hearing about the price reductions around the first of the month.

Microsoft's decision was the right one, said Ullman.

"This was an enormous change," Ullman said. "It shows a change at the way they look at their customers and might be part of a fresh atmosphere at Microsoft. I don't think it was about a change of heart about pricing, but instead Microsoft being a responsible software provider, stepping up to be responsible, realizing that there were all kinds of reasons why companies haven't upgraded XP, and providing a solution for a product that's there, that's reliable."

Microsoft has made several other moves of late -- all after Satya Nadella was appointed CEO to replace Steve Ballmer -- that signal a different attitude than, say, even three years ago, including shipping a touch-first Office for the iPad before one was ready for Windows 8.1.

"[The earlier CSA pricing] was a bad call," Ullman continued. "But someone said, 'This is wrong and we need to step up and be reasonable.' I see this as Microsoft helping customers migrate at their own pace."

Silver was less impressed. "What people wanted was longer support for Windows XP," he said in an interview. "But there was no way that Microsoft was going to blink on that. There was no way they were going to change the [support retirement] date. So the only thing they could do was lower the price. That way they wouldn't anger too many existing customers who had spent the time and money migrating from XP."

Still, Silver also noted that the winds had shifted in Redmond. "They wouldn't have moved this fast earlier," he said.

Because Microsoft adjusted the cap, not the $200 per-device pricing, the lower prices will benefit larger organizations. Ullman said that the new CSA minimums were 750 PCs, with a minimum payment of $150,000 for a year's worth of support.
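
Taking the reported $200 per device, the $150,000 floor and the $250,000 ceiling at face value, the post-reduction pricing is simple to model; this hypothetical sketch also reproduces the Emerset anecdote from earlier in the piece.

PER_DEVICE = 200     # USD per XP machine for a year of custom support
MINIMUM = 150_000    # reported floor (750 PCs at $200 each)
CAP = 250_000        # reported new ceiling

def csa_cost(devices):
    """Yearly CSA cost under the reported post-reduction pricing."""
    return max(MINIMUM, min(devices * PER_DEVICE, CAP))

print(csa_cost(10_000))   # 250000 -- vs. the $2 million first quoted to Emerset's client
print(csa_cost(750))      # 150000 -- the stated minimum engagement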

Gartner advised companies that had already signed a CSA to go back to Microsoft and ask for a review and renegotiation of their current contract pricing and terms.

Under Microsoft's rules, companies can sign a CSA at any time -- there is no deadline, something Ullman said was very unusual for Microsoft -- and have immediate access to all the critical security updates that have been released since April 8. Payment for the first year of fixes, however, is retroactive, meaning that if two organizations sign a CSA, one today, another in December, the span covered will be from April 8, 2014, to April 8, 2015, for both.

The general public cannot obtain the same critical XP security updates which will be provided to the large companies and other organizations that negotiate a CSA with Microsoft.

Instead, Microsoft has encouraged consumers and very small businesses still running Windows XP to upgrade their hardware to Windows 8.1 or purchase new PCs with that OS, an appeal that has been characterized by some as deaf to reality.
http://www.computerworld.com/s/artic...public_patches





Vicious Heartbleed Bug Bites Millions Of Android Phones, Other Devices

Not the exclusive province of servers, Heartbleed can hack end users, too.
Dan Goodin

The catastrophic Heartbleed security bug that has already bitten Yahoo Mail, the Canada Revenue Agency, and other public websites also poses a formidable threat to end-user applications and devices, including millions of Android handsets, security researchers warned.

Handsets running version 4.1.1 of Google's mobile operating system are vulnerable to attacks that might pluck passwords, the contents of personal messages, and other private information out of device memory, Google warned on Friday. Marc Rogers, principal security researcher at Lookout Mobile, a provider of anti-malware software for Android phones, said some versions of Android 4.2.2 that have been customized by the carriers or hardware manufacturers have also been found to be susceptible. Rogers said other releases may contain the critical Heartbleed flaw as well. Officials with BlackBerry have warned the company's messenger app for iOS, Mac OS X, Android, and Windows contains the critical defect and have released an update to correct it.

The good news, according to researchers at security firm Symantec, is that major browsers don't rely on the OpenSSL cryptographic library to implement HTTPS cryptographic protections. That means people using a PC to browse websites should be immune to attacks that allow malicious servers to extract data from an end user's computer memory. Users of smartphones, and possibly those using routers and "Internet of things" appliances, aren't necessarily as safe.

Chief among vulnerable devices are those running Android. While exploiting vulnerable handsets often isn't as simple as attacking vulnerable servers, the risk is high enough that users should tightly curtail use of their Android devices until they are sure their handsets aren't susceptible, Lookout's Rogers advised.

"If you have a vulnerable device and there's no fix available for you, I would be very cautious about using that device for sensitive data," he told Ars. "So I would be cautious about using it for banking or sending personal messages."

How Android phones are vulnerable

Rogers said the most likely scenario for an attacker exploiting a vulnerable Android device is to lure the user to a booby-trapped website that contains a cross-site request forgery or similar exploit that loads banking sites or other sensitive online services in a separate tab. By injecting malicious traffic into one tab, the attacker could possibly extract sensitive memory contents corresponding to the sites loaded in other tabs, he said. A less sophisticated version of the attack—but also one that's easier to execute—might simply inject the malicious commands into a vulnerable Android browser and opportunistically fish for any sensitive memory contents that may be returned.

Luckily, Android's security sandbox design prevents a malicious app from being able to access memory contents used by separate apps. Also fortunate is the fact that the majority of Android phones aren't susceptible. Still, the risk shouldn't be dismissed. About 34 percent of Android devices run on version 4.1.x of the mobile OS, according to figures supplied by Google. Google has said it's working with partners to roll out a patch, but as Ars has chronicled before, millions of Android smartphones never, or only rarely, receive available updates that patch dangerous security defects.

What's more, the threat of a vulnerable Android device being exploited by someone on the same Wi-Fi network as the targeted user, or by someone combining a Heartbleed attack with a separate exploit, should be enough to give people pause, even if they don't intend to visit banking sites or connect to Web-based e-mail or other sensitive services, Rogers counseled.

"The risk is that someone could either man-in-the-middle your Internet connection or use a cross-site request forgery-type attack or could use some kind of malicious thing to trick you into doing something secure and then fish out your secure credentials while you do that," he said. "That risk is sufficiently high as to say that you should be careful if your device is vulnerable."

Because Android is frequently customized for specific devices or manufacturers, it's possible some versions besides 4.1.1 and 4.2.2 are vulnerable. For that reason, Android users should download Heartbleed Detector, a free app developed by Lookout. In the vast majority of the tests Ars carried out, it found various Android versions contained a vulnerable version of OpenSSL, but that the Heartbeat extension that hosts the coding bug wasn't enabled, making the devices immune to attack. The sole exception was when Ars executed the app on a handset running version 4.1.1, which the app flagged as vulnerable.
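
To make that check concrete, here is a rough sketch, not Lookout's actual code, of the two questions a detector like this asks: is the device's OpenSSL build in the affected 1.0.1 through 1.0.1f range, and were TLS heartbeats compiled in at all? A build made with OPENSSL_NO_HEARTBEATS is not exploitable even if the version number falls in the affected range.

    import re

    def openssl_version_affected(version: str) -> bool:
        # Matches 1.0.1 and 1.0.1a-1.0.1f; 1.0.1g (the fixed release), 1.0.0,
        # 0.9.8 and 1.0.2 all fall outside the pattern or the letter range.
        m = re.match(r"1\.0\.1([a-z]?)$", version.strip())
        if not m:
            return False
        letter = m.group(1)
        return letter == "" or letter <= "f"

    def device_vulnerable(version: str, heartbeats_enabled: bool) -> bool:
        # Both conditions must hold for the device to be exploitable.
        return openssl_version_affected(version) and heartbeats_enabled

    print(device_vulnerable("1.0.1e", heartbeats_enabled=False))  # False: the common case Ars saw
    print(device_vulnerable("1.0.1e", heartbeats_enabled=True))   # True: e.g. the Android 4.1.1 handset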

Security researchers have only begun to analyze the risks Heartbleed poses to people using home and small-office Internet routers, modems, and all kinds of other devices that rely on OpenSSL. It's too early to say which, or how many, of the appliances are susceptible to exploits that extract passwords or other data. But until more thorough audits are performed, users shouldn't rule out the possibility.
http://arstechnica.com/security/2014...other-devices/





19-Year-Old Canadian Arrested for Heartbleed Hack
Stephanie Mlot

Stephen Arthuro Solis-Reyes faces charges in relation to the breach of taxpayer data from the Canada Revenue Agency website.

A 19-year-old Canadian was arrested on Tuesday for his alleged role in the breach of the Canada Revenue Agency (CRA) website, the first known arrest for exploiting the Heartbleed bug.

Stephen Arthuro Solis-Reyes of London, Ontario faces one count of Unauthorized Use of Computer and one count of Mischief in Relation to Data.

On Monday, CRA Commissioner Andrew Treusch announced that over the course of six hours, the Social Insurance Numbers of about 900 taxpayers were removed from CRA systems. The hack occurred only a day after CRA services were fully restored, following last week's temporary shutdown due to the Heartbleed bug.

"The RCMP treated this breach of security as a high priority case and mobilized the necessary resources to resolve the matter as quickly as possible," Assistant Commissioner Gilles Michaud said in a statement.

A search of the suspect's home led to the seizure of computer equipment. Police provided no further details about the ongoing investigation.
Solis-Reyes is scheduled to appear in court in Ottawa on July 17.

Uncovered early last week by a team of researchers from Google Security and Codenomicon, the Heartbleed weakness has been roaming the Internet for two years, leaving the door to encrypted data and personal information wide open to scammers.

Now, Web-based organizations are scrambling to patch their systems before they become the next Canada Revenue Agency.
Those 900 residents whose data was compromised can expect a registered letter informing them that they've been impacted; for added security, the agency will not be making phone calls or sending emails.

It will, however, provide the affected users with free access to credit protection services and will apply additional protections to their CRA accounts to prevent future unauthorized activity.
http://www.pcmag.com/article2/0,2817,2456699,00.asp





Tor Begins Blacklisting Exit Nodes Vulnerable to Heartbleed
Michael Mimoso

The Tor Project has begun blacklisting exit nodes vulnerable to the Heartbleed vulnerability in OpenSSL.

Researcher Collin Mulliner, with the Systems Security Lab at Northeastern University in Boston, published the results of an experiment he conducted using a publicly disclosed Heartbleed proof-of-concept exploit against 5,000 Tor nodes. Mulliner said that 1,045 nodes, or a little more than 20 percent, were vulnerable to the bug.

Mulliner said only Tor exit nodes were leaking plaintext user traffic, including host names, credentials and web content. Mulliner conducted his experiment for three days last Friday through Sunday, and his results are a point-in-time snapshot. A post yesterday from Tor Project leader Roger Dingledine on the Tor mailing list said that 380 vulnerable exit keys were being rejected.

Heartbleed was publicly reported on April 7. The vulnerability lies in the heartbeat function in OpenSSL 1.0.1 through 1.0.1f, which leaks up to 64 KB of memory at a time to any client or server that queries a machine running the vulnerable crypto library. The memory leaks can disclose in plaintext anything from user credentials to private server keys if the attack is repeated enough. Several researchers have already managed to retrieve private SSL keys in an online challenge from vendor CloudFlare. Speculation is that intelligence agencies and/or hackers may have been exploiting it since November. Mulliner said he did not try to extract private keys from Tor, nor did he think it was possible.
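
The bug itself is easy to see in miniature. The following self-contained Python sketch simulates the over-read, with no network code and no real exploit: a vulnerable heartbeat handler trusts the length field supplied by the peer and echoes back that many bytes, even when the actual payload is far shorter, so whatever sits in adjacent process memory spills into the response. The buffer contents below are invented placeholders.

    MEMORY = bytearray(
        b"...session=3f9a...user=alice&password=hunter2..."
        b"-----BEGIN RSA PRIVATE KEY-----..."
    )  # stand-in for whatever happens to sit in the process's memory

    def handle_heartbeat(payload: bytes, claimed_length: int) -> bytes:
        # Buggy behavior: copy 'claimed_length' bytes starting at the payload
        # without checking that the payload is really that long.
        buffer = bytearray(payload) + MEMORY   # payload stored next to other data
        return bytes(buffer[:claimed_length])

    print(handle_heartbeat(b"hello", 5))    # honest request: echoes b"hello"
    print(handle_heartbeat(b"hello", 90))   # malicious request: leaks the "secrets" too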

Tor promises anonymity to its users by using proxies to pass encrypted traffic from source to destination. Mulliner said he used a random list of 5,000 Tor nodes from the Dan.me.uk website for his research; from the 1,045 vulnerable nodes he discovered, he recovered plaintext traffic that included Tor plaintext announcements, while a significant number of nodes leaked user traffic in the clear.

“I found a significant amount of plaintext user traffic, complete Web traffic, session IDs; everything you would find if you ran Heartbleed against a normal Web server,” Mulliner said.

Heartbleed saves attackers the work of setting up their own exit node and waiting for traffic to pass through it. Using Heartbleed, all a hacker would have to do is query a vulnerable exit node to obtain traffic, Mulliner said.

Dingledine yesterday published the first list of rejected exit nodes and said those nodes will not be allowed back on the network.

“I thought for a while about trying to keep my list of fingerprints up-to-date (i.e. removing the !reject line once they’ve upgraded their openssl), but on the other hand, if they were still vulnerable as of yesterday, I really don’t want this identity key on the Tor network even after they’ve upgraded their OpenSSL,” Dingledine wrote. He added that he hopes others will add to this list as other vulnerable relays are discovered.
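
For reference, the published blacklist itself is nothing exotic: it is a series of !reject lines keyed by relay identity fingerprint, telling a directory authority to refuse descriptors from those relays. A hypothetical excerpt (the fingerprints below are made-up placeholders, not real relays) would look like this:

    !reject 0123456789ABCDEF0123456789ABCDEF01234567
    !reject 89ABCDEF0123456789ABCDEF0123456789ABCDEF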

Tor acknowledged some of its components were vulnerable to Heartbleed in a post to its blog on April 7.

Mulliner said it was a fairly straightforward process to write a script to run a Heartbleed proof of concept.

“Anybody who can get the Python script can play around with it,” Mulliner said, adding that there are likely fewer vulnerable Tor nodes now than when he ran his scans last week since some have likely been patched and Tor has begun blacklisting. “The data is dated, but it’s a good picture of that point in time.”
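
A scan like the one described is mostly plumbing around whichever proof-of-concept checker one trusts. Here is a minimal sketch, assuming a hypothetical check_heartbleed(host, port) helper standing in for a public proof-of-concept script, and a nodes.txt file listing one host:port per line:

    from concurrent.futures import ThreadPoolExecutor

    def check_heartbleed(host: str, port: int) -> bool:
        # Placeholder: wire in whichever proof-of-concept checker you trust.
        raise NotImplementedError

    def scan(path: str = "nodes.txt", workers: int = 20):
        with open(path) as fh:
            targets = [line.strip().rsplit(":", 1) for line in fh if line.strip()]
        with ThreadPoolExecutor(max_workers=workers) as pool:
            results = list(pool.map(lambda t: (t, check_heartbleed(t[0], int(t[1]))), targets))
        vulnerable = [t for t, hit in results if hit]
        print(f"{len(vulnerable)} of {len(targets)} nodes vulnerable")
        return vulnerable
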
http://threatpost.com/tor-begins-bla...rtbleed/105519





New ‘Google’ for the Dark Web Makes Buying Dope and Guns Easy
Kim Zetter

The dark web just got a little less dark with the launch of a new search engine that lets you easily find illicit drugs and other contraband online.

Grams, which launched last week and is patterned after Google, is accessible only through the Tor anonymizing browser (the address for Grams is: grams7enufi7jmdl.onion) but fills a niche for anyone seeking quick access to sites selling drugs, guns, stolen credit card numbers, counterfeit cash and fake IDs — sites that previously only could be found by users who knew the exact URL for the site.

“I noticed on the forums and reddit people were constantly asking ‘where to get product X?’ and ‘which market had product X?’ or ‘who had the best product X and was reliable and not a scam?’” Grams’ creator told WIRED in a chat session. “I wanted to make it easy for people to find things they wanted on the darknet and figure out who was a trustworthy vendor.”

He wouldn’t provide his real name and asked instead to be referred to by the pseudonym he uses on Reddit, “gramsadmin.”

Although Grams is still in beta, it’s already serving up results from eight online markets, thanks to an API the developer made available to site owners to allow his engine to scrape their product listings.

These include SilkRoad2, which popped up in the wake of the original Silk Road’s demise following the arrest of its alleged founder Ross Ulbricht and the seizure of that site by the feds.

Other sites included in the search listings so far include Agora, BlackBank, Cloud-Nine, Evolution, NiceGuy, Pandora, and The Pirate Market.

The search engine results include the vendor’s name and location and the price of the product.

The engine also includes a number of Google-like features, including an “I Feel Lucky” search button (our test of it produced listings for high-quality crystal meth) and other features that allow users to filter out results for sites they don’t want to see and to sort items by price or by the most recent listings.

There are even plans for advertising a la Google AdWords, according to the developer of Grams, who has been posting announcements about his progress on Reddit.

“I am working on the algorithm so it is a lot like Google’s it will have a scoring system based how long the listing has been up, how many transactions, how many good reviews. That way you will see the best listing first,” he wrote in one of his posts.

“Within the next two weeks Grams will have a system similar to Google AdWords where vendors can buy keywords and their listings will go to the top of the search results when those keywords are searched for,” he wrote in another post. “They will be bordered with an advertisement disclaimer so users know those are paid results.”

One more advantage that Grams provides? It helps users locate sites that have gone down — due to a DDoS attack or other event — and relaunched under new URLs.

Gramsadmin says he coded the engine on his own, working 14-hour days for the last two weeks, and would love help, though he suspects this will be difficult.

“I have many ideas and features I am trying to implement,” he told WIRED. “I would love to hire programmers, but it is very hard to hire a good programmer you can trust and still remain anonymous.”

Among the features to be added are profile pages for vendors, to include contact info for the vendor in case a market goes down, and customer reviews of the vendors across multiple markets where they peddle their wares.

“I don’t have the capabilities yet to spider all of the darknet, so for now [I'm] working on making an automated site submitter for people to submit their sites and get listed in our search engine,” he said. “I will also be making it easier for advertisers to buy ads through an automated system.”

The dark web community has been so appreciative of his project, they have already sent out their version of a welcome wagon.

“Grams did get hit by a DDoS attack after the launch of the beta version,” he said. “It took us down for a few hours. [But] every major darknet site gets DDoS’d though so I took it as a ‘Welcome to the neighborhood’ message.”
http://www.wired.com/2014/04/grams-s...gine-dark-web/





Obama Lets N.S.A. Exploit Some Internet Flaws, Officials Say
David E. Sanger

Stepping into a heated debate within the nation’s intelligence agencies, President Obama has decided that when the National Security Agency discovers major flaws in Internet security, it should — in most circumstances — reveal them to assure that they will be fixed, rather than keep mum so that the flaws can be used in espionage or cyberattacks, senior administration officials said Saturday.

But Mr. Obama carved a broad exception for “a clear national security or law enforcement need,” the officials said, a loophole that is likely to allow the N.S.A. to continue to exploit security flaws both to crack encryption on the Internet and to design cyberweapons.

The White House has never publicly detailed Mr. Obama’s decision, which he made in January as he began a three-month review of recommendations by a presidential advisory committee on what to do in response to recent disclosures about the National Security Agency.

But elements of the decision became evident on Friday, when the White House denied that it had any prior knowledge of the Heartbleed bug, a newly known hole in Internet security that sent Americans scrambling last week to change their online passwords. The White House statement said that when such flaws are discovered, there is now a “bias” in the government to share that knowledge with computer and software manufacturers so a remedy can be created and distributed to industry and consumers.

Caitlin Hayden, the spokeswoman for the National Security Council, said the review of the recommendations was now complete, and it had resulted in a “reinvigorated” process to weigh the value of disclosure when a security flaw is discovered, against the value of keeping the discovery secret for later use by the intelligence community.

“This process is biased toward responsibly disclosing such vulnerabilities,” she said.

Until now, the White House has declined to say what action Mr. Obama had taken on this recommendation of the president’s advisory committee, whose report is better known for its determination that the government get out of the business of collecting bulk telephone data about the calls made by every American. Mr. Obama announced last month that he would end the bulk collection, and leave the data in the hands of telecommunications companies, with a procedure for the government to obtain it with court orders when needed.

But while the surveillance recommendations were noteworthy, inside the intelligence agencies other recommendations, concerning encryption and cyber operations, set off a roaring debate with echoes of the Cold War battles that dominated Washington a half-century ago.

One recommendation urged the N.S.A. to get out of the business of weakening commercial encryption systems or trying to build in “back doors” that would make it far easier for the agency to crack the communications of America’s adversaries. Tempting as it was to create easy ways to break codes — the reason the N.S.A. was established by Harry S. Truman 62 years ago — the committee concluded that the practice would undercut trust in American software and hardware products. In recent months, Silicon Valley companies have urged the United States to abandon such practices, while Germany and Brazil, among other nations, have said they were considering shunning American-made equipment and software. Their motives were hardly pure: Foreign companies see the N.S.A. disclosures as a way to bar American competitors.

Another recommendation urged the government to make only the most limited, temporary use of what hackers call “zero days,” the coding flaws in software like Microsoft Windows that can give an attacker access to a computer — and to any business, government agency or network connected to it. The flaws get their name from the fact that, when identified, the computer user has “zero days” to fix them before hackers can exploit the accidental vulnerability.

The N.S.A. made use of four “zero day” vulnerabilities in its attack on Iran’s nuclear enrichment sites. That operation, code-named “Olympic Games,” managed to damage roughly 1,000 Iranian centrifuges, and by some accounts helped drive the country to the negotiating table.

Not surprisingly, officials at the N.S.A. and at its military partner, the United States Cyber Command, warned that giving up the capability to exploit undisclosed vulnerabilities would amount to “unilateral disarmament” — a phrase taken from the battles over whether and how far to cut America’s nuclear arsenal.

“We don’t eliminate nuclear weapons until the Russians do,” one senior intelligence official said recently. “You are not going to see the Chinese give up on ‘zero days’ just because we do.” Even a senior White House official who was sympathetic to broad reforms after the N.S.A. disclosures said last month, “I can’t imagine the president — any president — entirely giving up a technology that might enable him some day to take a covert action that could avoid a shooting war.”

At the center of that technology are the kinds of hidden gaps in the Internet — almost always created by mistake or oversight — that Heartbleed created. There is no evidence that the N.S.A. had any role in creating Heartbleed, or even that it made use of it. When the White House denied prior knowledge of Heartbleed on Friday afternoon, it appeared to be the first time that the N.S.A. had ever said whether a particular flaw in the Internet was — or was not — in the secret library it keeps at Fort Meade, Md., the headquarters of the agency and Cyber Command.

But documents released by Edward J. Snowden, the former N.S.A. contractor, make it clear that two years before Heartbleed became known, the N.S.A. was looking at ways to accomplish exactly what the flaw did by accident. A program code-named Bullrun, apparently named for the site of two Civil War battles just outside Washington, was part of a decade-long effort to crack or circumvent encryption on the web. The documents do not make clear how well it succeeded, but it may well have been more effective than exploiting Heartbleed would be at enabling access to secret data.

The government has become one of the biggest developers and purchasers of information identifying “zero days,” officials acknowledge. Those flaws are big business — Microsoft pays up to $150,000 to those who find them and bring them to the company to fix — and other countries are gathering them so avidly that something of a modern-day arms race has broken out. Chief among the nations seeking them are China and Russia, though Iran and North Korea are in the market as well.

“Cyber as an offensive weapon will become bigger and bigger,” said Michael DeCesare, who runs the McAfee computer security operations of Intel Corporation. “I don’t think any amount of policy alone will stop them” from doing what they are doing, he said of the Russians, the Chinese and others. “That’s why effective command and control strategies are absolutely imperative on our side.”

The presidential advisory committee did not urge the N.S.A. to get out of the business entirely. But it said that the president should make sure the N.S.A. does not “engineer vulnerabilities” into commercial encryption systems. And it said that if the United States finds a “zero day,” it should patch it, not exploit it, with one exception: Senior officials could “briefly authorize using a zero day for high priority intelligence protection.”
http://www.nytimes.com/2014/04/13/us...cials-say.html





Guardian and Washington Post Win Pulitzer Prize for NSA Revelations

Pair awarded highest accolade in US journalism, winning Pulitzer prize for public service for stories on NSA surveillance
Ed Pilkington

The Guardian and the Washington Post have been awarded the highest accolade in US journalism, winning the Pulitzer prize for public service for their groundbreaking articles on the National Security Agency’s surveillance activities based on the leaks of Edward Snowden.

The award, announced in New York on Monday, comes 10 months after the Guardian published the first report based on the leaks from Snowden, revealing the agency’s bulk collection of US citizens’ phone records.

In the series of articles that ensued, teams of journalists at the Guardian and the Washington Post published the most substantial disclosures of US government secrets since the Pentagon Papers on the Vietnam war in 1971.

The Pulitzer committee praised the Guardian for its "revelation of widespread secret surveillance by the National Security Agency, helping through aggressive reporting to spark a debate about the relationship between the government and the public over issues of security and privacy".

Snowden, in a statement, said: "Today's decision is a vindication for everyone who believes that the public has a role in government. We owe it to the efforts of the brave reporters and their colleagues who kept working in the face of extraordinary intimidation, including the forced destruction of journalistic materials, the inappropriate use of terrorism laws, and so many other means of pressure to get them to stop what the world now recognises was work of vital public importance."

He said that his actions in leaking the documents that formed the basis of the reporting "would have been meaningless without the dedication, passion, and skill of these newspapers".

The Pulitzers have been bestowed since 1917, under a bequest from the legendary newspaper publisher Joseph Pulitzer, who established the honour in his will as a means of encouraging public-spirited journalism. Awards were given in 22 categories this year: the Boston Globe received the Pulitzer for breaking news reporting for its "exhaustive and empathetic" coverage of the Boston marathon bombing. Journalists in the Globe newsroom observed a period of silence on Monday in memory of the victims, a day before the one-year anniversary of the attack.

At the Guardian, the NSA reporting was led by Glenn Greenwald, Ewen MacAskill and film-maker Laura Poitras, and at the Washington Post by Barton Gellman, who also co-operated with Poitras. All four journalists were honoured with a George Polk journalism award last week for their work on the NSA story.

The NSA revelations have reverberated around the world and sparked a debate in the US over the balance between national security and personal privacy. On the back of the disclosures, President Obama ordered a White House review into data surveillance, a number of congressional reform bills have been introduced, and protections have begun to be put in place to safeguard privacy for foreign leaders and to increase scrutiny over the NSA’s mass data collection.

"We are truly honoured that our journalism has been recognised with the Pulitzer prize," said Alan Rusbridger, the editor-in-chief of the Guardian. "This was a complex story, written, edited and produced by a team of wonderful journalists. We are particularly grateful for our colleagues across the world who supported the Guardian in circumstances which threatened to stifle our reporting. And we share this honour, not only with our colleagues at the Washington Post, but also with Edward Snowden, who risked so much in the cause of the public service which has today been acknowledged by the award of this prestigious prize."

Janine Gibson, the editor-in-chief of Guardian US, said: "We're extremely proud and gratified to have been honoured by the Pulitzer board. It's been an intense, exhaustive and sometimes chilling year working on this story, and we're grateful for the acknowledgement by our peers that the revelations made by Edward Snowden and the work by the journalists involved represent a high achievement in public service."

Among the disclosures were:

• the NSA’s mass dragnet of phone records of millions of Americans.

• the program codenamed Prism used by the NSA and its UK counterpart GCHQ to gain back-door entry into the data of nine giant internet companies including Google and Facebook.

• the cracking of internet encryption by the NSA and GCHQ that undermined personal security for web users.

• NSA surveillance of phone calls made by 35 world leaders.

The coverage of the Snowden leaks presented a particularly thorny issue for the 19-strong panel of journalists, academics and writers who recommend the winners. The stream of disclosures invoked strong and polarised reactions in the US and around the world.

In January, Obama said that the debate on the acceptable limits of government surveillance prompted by the articles “will make us stronger”. But other prominent US politicians such as Mike Rogers, Republican chairman of the House intelligence committee, have suggested journalism based on Snowden’s leaks was tantamount to dealing in stolen property.

Snowden has been charged with three offences in the US. He is the eighth person to be charged with breaking the 1917 Espionage Act by the Obama administration – more than all the prosecutions brought under previous presidents combined.

The Guardian's US operation, headquartered in New York, was incorporated as an American company in 2011 and recognised last year by the Pulitzer board as a US news outlet eligible to be considered for its prizes.

Last month Rusbridger was given a special award at the European press awards; earlier this month the Guardian was named newspaper of the year in the UK; and it has been awarded other prizes for online and investigative journalism in Germany, Spain and the US.

The Snowden stories were edited from New York by Gibson, and Guardian US deputy editor Stuart Millar. The UK end of the reporting was led by deputy editor Paul Johnson and investigations editor Nick Hopkins.

Others on the team of journalists included Spencer Ackerman, James Ball, David Blishen, Gabriel Dance, Julian Borger, Nick Davies, David Leigh and Dominic Rushe. In Australia the editor was Katharine Viner and the reporter Lenore Taylor.

Among the other Pulitzers, Will Hobson and Michael LaForgia of the Tampa Bay Times won for local reporting for their investigation into squalid housing conditions for the city’s homeless population; and the New York Times’s Tyler Hicks and Josh Haner took the two photography prizes.

The Pulitzer for fiction writing went to Donna Tartt for The Goldfinch, while Annie Baker won the prize for drama for her play set in a cinema, The Flick. Become Ocean, a piece by John Luther Adams commissioned by the Seattle Symphony, won the Pulitzer for music.

All the awards are administered by Columbia University. The full list is here.
http://www.theguardian.com/media/201...sa-revelations





Out in the Open: Inside the Operating System Edward Snowden Used to Evade the NSA
Klint Finley

When NSA whistle-blower Edward Snowden first emailed Glenn Greenwald, he insisted on using email encryption software called PGP for all communications. But this month, we learned that Snowden used another technology to keep his communications out of the NSA’s prying eyes. It’s called Tails. And naturally, nobody knows exactly who created it.

Tails is a kind of computer-in-a-box. You install it on a DVD or USB drive, boot up the computer from the drive and, voila, you’re pretty close to anonymous on the internet. At its heart, Tails is a version of the Linux operating system optimized for anonymity. It comes with several privacy and encryption tools, most notably Tor, an application that anonymizes a user’s internet traffic by routing it through a network of computers run by volunteers around the world.
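
Tails routes everything through Tor transparently, but the underlying mechanism is simple enough to show in a few lines: a running Tor client exposes a local SOCKS proxy, by default on 127.0.0.1:9050, and an application just points its connections at it. A minimal sketch, assuming the requests library with SOCKS support is installed (pip install requests[socks]) and that the check.torproject.org test endpoint is reachable:

    import requests

    TOR_PROXY = {
        "http": "socks5h://127.0.0.1:9050",   # socks5h: DNS lookups also go through Tor
        "https": "socks5h://127.0.0.1:9050",
    }

    resp = requests.get("https://check.torproject.org/api/ip",
                        proxies=TOR_PROXY, timeout=30)
    print(resp.json())  # reports whether the request really exited via Tor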

Snowden, Greenwald and their collaborator, documentary film maker Laura Poitras, used it because, by design, Tails doesn’t store any data locally. This makes it virtually immune to malicious software, and prevents someone from performing effective forensics on the computer after the fact. That protects both the journalists, and often more importantly, their sources.

“The installation and verification has a learning curve to make sure it is installed correctly,” Poitras told WIRED by e-mail. “But once the set up is done, I think it is very easy to use.”

An Operating System for Anonymity

Originally developed as a research project by the U.S. Naval Research Laboratory, Tor has been used by a wide range of people who care about online anonymity: everyone from Silk Road drug dealers, to activists, whistleblowers, stalking victims and people who simply like their online privacy.

Tails makes it much easier to use Tor and other privacy tools. Once you boot into Tails — which requires no special setup — Tor runs automatically. When you’re done using it, you can boot back into your PC’s normal operating system, and no history from your Tails session will remain.

The developers of Tails are, appropriately, anonymous. All of WIRED's questions were collectively — and anonymously — answered by the group's members via email.

They’re protecting their identities, in part, to help protect the code from government interference. “The NSA has been pressuring free software projects and developers in various ways,” the group says, referring to a conference last year at which Linux creator Linus Torvalds implied that the NSA had asked him to place a backdoor in the operating system.

But the Tails team is also trying to strike a blow against the widespread erosion of online privacy. “The masters of today’s Internet, namely the marketing giants like Google, Facebook, and Yahoo, and the spying agencies, really want our lives to be more and more transparent online, and this is only for their own benefit,” the group says. “So trying to counterbalance this tendency seems like a logical position for people developing an operating system that defends privacy and anonymity online.”

But since we don’t know who wrote Tails, how do we know it isn’t some government plot designed to snare activists or criminals? A couple of ways, actually. One of the Snowden leaks shows the NSA complaining about Tails in a PowerPoint slide; if it’s bad for the NSA, it’s safe to say it’s good for privacy. And all of the Tails code is open source, so it can be inspected by anyone worried about foul play. “Some of us simply believe that our work, what we do, and how we do it, should be enough to trust Tails, without the need of us using our legal names,” the group says.

According to the group, Tails began five years ago. “At that time some of us were already Tor enthusiasts and had been involved in free software communities for years,” they say. “But we felt that something was missing to the panorama: a toolbox that would bring all the essential privacy enhancing technologies together and made them ready to use and accessible to a larger public.”

The developers initially called their project Amnesia and based it on an existing operating system called Incognito. Soon the Amnesia and Incognito projects merged into Tails, which stands for The Amnesic Incognito Live System.

And while the core Tails group focuses on developing the operating system for laptops and desktop computers, a separate group is making a mobile version that can run on Android and Ubuntu tablets, provided the user has root access to the device.

Know Your Limitations

In addition to Tor, Tails includes privacy tools like PGP, the password management system KeePassX, and the chat encryption plugin Off-the-Record. But Tails doesn’t just bundle a bunch of off-the-shelf tools into a single package. Many of the applications have been modified to improve the privacy of their users.

But no operating system or privacy tool can guarantee complete protection in all situations.

Although Tails includes productivity applications like OpenOffice, GIMP and Audacity, it doesn’t make a great everyday operating system. That’s because over the course of day-to-day use, you’re likely to use one service or another that could be linked with your identity, blowing your cover entirely. Instead, Tails should only be used for the specific activities that need to be kept anonymous, and nothing else.

The developers list several other security warnings in the site documentation.

Of course the group is constantly working to fix security issues, and they’re always looking for volunteers to help with the project. They’ve also applied for a grant from the Knight Foundation, and are collecting donations via the Freedom of the Press Foundation, the group that first disclosed Tails’s role in the Snowden story.

That money could go a long way toward helping journalists — and others — stay away from the snoops. Reporters, after all, aren’t always the most tech-savvy people. As Washington Post reporter Barton Gellman told the Freedom of the Press Foundation, “Tails puts the essential tools in one place, with a design that makes it hard to screw them up. I could not have talked to Edward Snowden without this kind of protection. I wish I’d had it years ago.”
http://www.wired.com/2014/04/tails





Lavabit Held in Contempt of Court for Printing Crypto Key in Tiny Font

US attorney: Lavabit "treated court orders like contract negotiations."
Joe Silver

A federal appeals court on Wednesday upheld a contempt of court ruling against Ladar Levison and his now-defunct encrypted e-mail service provider, Lavabit LLC, for hindering the government's investigation into the National Security Agency leaks surrounding Edward Snowden.

In the summer of 2013, Lavabit was ordered to provide real-time e-mail monitoring of one particular user of the service, believed to be Snowden, the former NSA contractor turned whistleblower. Instead of adequately complying with the order to turn over the private SSL keys that protected his company's tens of thousands of users from the government's prying eyes, Levison chose instead to shut down Lavabit last year after weeks of stonewalling the government.

Levison reluctantly turned over his encryption keys to the government, although not in a manner that the government deemed useful—he provided a lengthy printout in tiny type, a move the authorities said was objectionable. “The company had treated the court orders like contract negotiations rather than a legal requirement,” US Attorney Andrew Peterson, who represented the government, told PC World.

Levison said, “I have only ever objected to turning over the SSL keys because that would compromise all of the secure communications in and out of my network, including my own administrative traffic.”

In the opinion, Judge G. Steven Agee of the Fourth US Circuit Court of Appeals didn't rule on the merits of Levison's claims and found that a procedural error on his part forced the court’s hand.

Judge Agee, who was joined in his ruling by Judge Paul Niemeyer and Judge Roger Gregory, ruled that Levison should have brought his claims that the government allegedly exceeded its authority under the US “pen register” and “trap and trace” statutes prior to the district judge holding him in contempt of court last year.

Agee explained, "Lavabit proposes that we hear its challenge to the Pen/Trap Order because Lavabit views the case as a matter of 'immense public concern.'" (Reply Br. 6.) Yet there exists a perhaps greater "public interest in bringing litigation to an end after fair opportunity has been afforded to present all issues of law and fact.”

Agee concluded that “Lavabit neither ‘plainly’ nor ‘properly’ identified these issues for the district court.” He continued, “We decline to hear Lavabit’s new arguments merely because Lavabit believes them to be important.”

Attorney Brian Hauss from the ACLU, which filed an amicus brief in the case, told Ars:

The court focused its decision on procedural aspects of the case unrelated to the merits of Lavabit’s claims. On the merits, we believe it’s clear that there are limits on the government’s power to coerce innocent service providers into its surveillance activities. The government exceeded those limits when it asked Lavabit to blow up its business—and undermine the encryption technology that ensures our collective cybersecurity—to get information that Lavabit itself offered to provide.

Levison did not immediately comment on the contempt order, which carries a fine of thousands of dollars.

UPDATE 1:20pm CT: Levison told Ars: "I haven't read the court's opinion, nor sought advice from lawyers on any possible legal strategy, so that is still pending. My focus as of late has been on building a technological solution [in the form of the DarkMail Alliance]; which would take the decision away from the will of man."
http://arstechnica.com/tech-policy/2...-in-tiny-font/





Despite Lavabit Contempt Order, e-Mail Privacy Stalled in Congress

We don't need no stinking warrants. Hand over your e-mail.
David Kravets

A federal appeals court is holding the operator of a now-defunct e-mail service in contempt because he refused to abide by a court order to turn over the crypto keys that would have exposed Lavabit's 400,000 customers to the government's prying eyes.

Troubling as that Wednesday decision by the Fourth US Circuit Court of Appeals may be, it is equally troubling that Congress has essentially punted on reforming the Electronic Communications Privacy Act, the law governing e-mail privacy.

That has led one of the leading lobbyists on the matter to declare a defeat of sorts.

"It's become clear to us in the course of a year and a half, we're not going to see comprehensive ECPA reform at this time," lobbyist Jim Dempsey, a vice president at the Center for Democracy & Technology, said in a telephone interview.

As it now stands, the Reagan-era law allows the cops to get your e-mail or other cloud-stored content without a warrant, so long as it's been stored on a third party's servers for at least six months. That law, combined with others, also allows the authorities to obtain cell-site data without a warrant. (Court rulings on these topics are mixed, and some key e-mail providers, including Google and Microsoft, say they demand warrants despite the law.)

All the while, gridlock and fear in Congress are keeping lawmakers from adopting even watered-down reform packages.

That means lawmakers cannot bring themselves to update a law Reagan signed almost three decades ago, when CompuServe was king and e-mail was stored on servers only briefly, until recipients downloaded it with their own software. The only clouds available at that time were those in the sky. Gmail was a figment of science fiction, and e-mail left on servers was considered abandoned.

But that original law that protected e-mail has been turned on its head, as most e-mail users store their communications in the cloud.

Consider that the Senate Judiciary Committee passed a watered down reform measure last year requiring the authorities to obtain a probable-cause warrant to acquire cloud-based data—the same standard required to search the same material if it was on a hard drive in your house.

"Just because your emails are on your computer, must not mean they have any less protection than if they were printed on your desk," said Mark Jaycox, an Electronic Frontier Foundation legislative analyst.

But American politics has created the 8th Wonder of the World—the legislative hold. In this case, an anonymous lawmaker or lawmakers have blocked the measure from going before the full Senate for a yes-or-no vote. And that's despite the unheard-of announcement by the Justice Department that it supports the package to enhance the public's privacy protections.

And this is the weakened version of legislation that once also required warrants to track a person's movements via cell-tower pings from their mobile phone.

Dempsey is lobbying for the privacy changes along with several groups, including Digital Due Process, whose membership includes the biggest names in tech, from Adobe to Twitter. He said the Securities and Exchange Commission's opposition to the warrant requirement has rattled some lawmakers.

Mary Jo White, the SEC chief, doesn't support the changeover and wants to keep it easy for investigators to access cloud communications.

"A few members of Congress are concerned," Dempsey said.

Yet 200 House members think it's a good idea to shore up the warrant requirement, at least insofar as e-mail and cloud-stored content is concerned.

Rep. Kevin Yoder (R-KS) proposed legislation nearly a year ago demanding a warrant. Some 200 members have signed onto the package, an unprecedented number. Yet it hasn't gotten a vote before the House Judiciary Committee, the body that sends it to the House floor.

“As the way we communicate with each other has dramatically changed over the past twenty years, our electronic communications laws have not kept pace," Yoder said.

House Judiciary Committee Chairman Bob Goodlatte (R-VA) talks about reforming the law.

"ECPA reform must be undertaken so that despite the evolution of technology and its use in the world, the constitutional protections reinforced by ECPA will endure," he said a year ago during a committee hearing.

Goodlatte's office did not return calls seeking comment. But Dempsey, Jaycox, and others suggested that Goodlatte is mulling the introduction of his own package.

"So until Goodlatte decides to publish his own bill or take up Yoder, the Yoder bill is left in limbo," Jaycox wrote in an e-mail.

While Congress twiddles its thumbs, Ladar Levison, the Lavabit operator whose contempt citation was upheld Wednesday, unplugged his e-mail service rather than supply the government the keys to its 400,000 accounts.

Levison argued on appeal that federal law did not give the government the power to demand the keys to his e-mail kingdom. The government argued the other way and convinced a federal judge last year to order Levison to provide the code.

However, the appeals court ruled on a technicality [PDF], saying Wednesday that Levison did not challenge the order in the trial court, and, hence, his opposition to it at the appellate level was improper.

All of which means the same type of courtroom legal tussle—which perhaps entails bigger privacy ramifications than reforming the Electronic Communications Privacy Act—is likely to resurface with another e-mail provider. And the Legislative Branch will remain on the sidelines, crippled by its own gridlock.

"In the Lavabit case, at the appellate level, both sides argued that the statute was already clear enough—clear enough that it did require disclosure of the keys in the government's view and clear enough that it did not require disclosure of the keys in Lavabit’s view," Dempsey said. "Both arguments are now available in future cases, should they arise, and there is no call (in Congress) for clarifying the statute one way or the other."
http://arstechnica.com/tech-policy/2...d-in-congress/





It’s Time to Encrypt the Entire Internet
Klint Finley

The Heartbleed bug crushed our faith in the secure web, but a world without the encryption software that Heartbleed exploited would be even worse. In fact, it’s time for the web to take a good hard look at a new idea: encryption everywhere.

Most major websites use either the SSL or TLS protocol to protect your password or credit card information as it travels between your browser and their servers. Whenever you see that a site is using HTTPS, as opposed to HTTP, you know that SSL/TLS is being used. But only a few sites — like Facebook and Gmail — actually use HTTPS to protect all of their traffic as opposed to just passwords and payment details.

Many security experts — including Google’s in-house search guru, Matt Cutts — think it’s time to bring this style of encryption to the entire web. That means secure connections to everything from your bank site to Wired.com to the online menu at your local pizza parlor.

Cutts runs Google’s web spam team. He helps the company tweak its search engine algorithms to prioritize certain sites over others. For example, the search engine prioritizes sites that load quickly, and penalizes sites that copy — or “scrape” — text from others.

If Cutts had his way, Google would prioritize sites that use HTTPS over those that don’t, he told blogger Barry Schwartz at a conference earlier this year. The change, if it were ever implemented, would likely spur an HTTPS stampede as web sites competed for better search rankings.

Cutts, who didn’t respond to our request for comment, told Schwartz that it’s a controversial idea, and it faces some opposition within Google. A Google spokesperson would only tell us that the company has nothing to announce at this time. So this change won’t happen overnight.

Dump the Plain Text Internet

White hat hacker Moxie Marlinspike knows as well as anyone how insecure SSL/TLS can be. A former Twitter engineer, he's uncovered multiple critical bugs in the protocols over the course of his career and has proposed an alternative way of handling trust and verification in the protocol. But he still thinks that using HTTPS in as many places as possible would be a good thing. “I think there’s value to making network traffic as opaque as possible, even for static content,” he says. “Ideally we would replace plain text on the internet entirely.”

When you use HTTPS, the data is encrypted so that, in theory, only you and the server you’re communicating with can read the contents of the messages passing back and forth between your computer and the server.

Most major websites only use HTTPS to protect your password when you login, or your credit card information when you make a purchase. But that started to change in 2010 when software developer Eric Butler released a free tool called FireSheep to show just how easy it was to temporarily take control of someone else’s account over a shared network — such as a public Wi-Fi connection.

Butler agrees that more use of HTTPS would be a good thing, pointing out that using HTTP makes it easier for governments or criminals to spy on what internet users are doing online. And Micah Lee, a technologist for The Intercept, points out that there are many situations in which it makes sense to use HTTPS besides just protecting passwords or other sensitive information.

For example, HTTPS doesn’t just encrypt the information passing between a server and your computer: It also verifies that the content you’re downloading is coming from the people you expect it to be coming from — again, in theory. That’s something that a regular HTTP connection can’t do.
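
That authenticity check is visible even in a few lines of code. In Python, for instance, the default TLS context refuses to complete the handshake unless the server presents a certificate chain that validates and matches the hostname. A sketch, with www.wired.com used purely as an example host:

    import socket
    import ssl

    def fetch_certificate(hostname: str, port: int = 443) -> dict:
        context = ssl.create_default_context()      # certificate and hostname checks on by default
        with socket.create_connection((hostname, port), timeout=10) as sock:
            # The handshake below raises an SSLError if the chain or hostname is invalid.
            with context.wrap_socket(sock, server_hostname=hostname) as tls:
                return tls.getpeercert()

    print(fetch_certificate("www.wired.com")["subject"])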

“Any sort of attacks that involve tricking the victim into connecting to the attacker’s server instead of the real server gets halted by HTTPS,” Lee said via email. “And this is really important, even for non-secret content, because of integrity: you really don’t want attackers modifying the content of websites you’re visiting without your knowledge.”

For example, a country that doesn’t want its citizens getting certain information from Wikipedia can set up a system that feeds users fake Wikipedia pages. “Without HTTPS, censorship isn’t just possible,” Lee says. “It’s simple for powerful attackers like governments, and it’s impossible for ordinary users to detect.”

There are other ways that a rogue government or criminal hacker could cause problems by replacing insecure content with their own fake pages. Lee points out that many journalists post their PGP encryption keys on their websites using only HTTP. An attack could show a potential whistleblower a fake page with a fake encryption key, causing them to turn incriminating evidence over to, for example, the government or their employer.

One of the most dangerous possibilities, however, is that hackers could replace software downloads with malware. “Websites that publish software have no business ever using HTTP,” Lee says. “They should always use HTTPS. If they don’t, they’re putting software users at risk.”
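
One common complement is for publishers to serve both the download and a cryptographic digest of it over an authenticated HTTPS channel, so users or install scripts can verify what actually arrived. A sketch of that check follows; the URL and expected SHA-256 value are placeholders, not real project data:

    import hashlib
    import urllib.request

    URL = "https://example.org/tool-1.0.tar.gz"   # placeholder download location
    EXPECTED_SHA256 = "0" * 64                    # placeholder digest from the publisher's site

    def sha256_of(url: str) -> str:
        digest = hashlib.sha256()
        with urllib.request.urlopen(url) as resp:
            for chunk in iter(lambda: resp.read(8192), b""):
                digest.update(chunk)
        return digest.hexdigest()

    actual = sha256_of(URL)
    print("OK" if actual == EXPECTED_SHA256 else f"MISMATCH: got {actual}")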

The Argument Against Total SSL

But if HTTPS is so great, then why don’t all websites use it already? There are several disadvantages to using HTTPS everywhere, the World Wide Web Consortium’s HTTPS expert Yves Lafon told us in 2011.

The first is the increased cost. You have to purchase TLS certificates from one of several certificate authorities, which can cost anywhere from $10 a year to about $1,000 a year, depending on the type of certificate you purchase and the level of identity verification it provides. Another issue is that HTTPS increases server resource consumption and can slow sites down. But Marlinspike and Butler say the costs and resource overhead are actually greatly overestimated.

An issue for smaller sites is that it’s historically been hard to set up unique certificates on sites that use cheap shared hosting. Also, sites that used content delivery networks — or CDNs — to speed up their responsiveness also frequently faced challenges when implementing SSL. Both of these issues have been largely resolved today, though the costs, performance and complexity varies from host to host.

But even if the entire web isn’t ready to switch completely to HTTPS, there are plenty of reasons that more sites should start using HTTPS by default — especially sites that provide public information and software. And given how far we’ve already come since the days of FireSheep, we can expect HTTPS to continue to spread, even if Google doesn’t start prioritizing sites that use it.
http://www.wired.com/2014/04/https/

















Until next week,

- js.



















Current Week In Review





Recent WiRs -

April 12th, April 5th, March 29th, March 22nd

Jack Spratts' Week In Review is published every Friday. Submit letters, articles, press releases, comments, questions etc. in plain text English to jackspratts (at) lycos (dot) com. Submission deadlines are Thursdays @ 1400 UTC. Please include contact info. The right to publish all remarks is reserved.


"The First Amendment rests on the assumption that the widest possible dissemination of information from diverse and antagonistic sources is essential to the welfare of the public."
- Hugo Black