P2P-Zone  

16-01-19, 07:34 AM   #1
JackSpratts
 
 
Join Date: May 2001
Location: New England
Posts: 10,013
Peer-To-Peer News - The Week In Review - January 19th, ’19

Since 2002


"Who dropped a glass on the floor?" – Thomas Koritke


January 19th, 2019




Ajit Pai Refuses to Brief Lawmakers Over Phone-Tracking Scandal, Dubiously Blames Shutdown
Dell Cameron

Saying there is no legal reason for Ajit Pai to rebuff an invitation to brief lawmakers during the government shutdown, the chairman of the House Energy and Commerce Committee on Monday criticized the FCC head for refusing to appear for a briefing on the ongoing controversy over Americans’ real-time location data being disclosed to unauthorized third parties.

In a call on Monday, Pai’s staff told Democrats that he would not appear to discuss his agency’s progress in tackling the issue due to the shutdown. His presence was requested last week for an “emergency briefing” by the committee’s chair, Rep. Frank Pallone, Jr., who, citing a threat to law enforcement, military personnel, and domestic abuse victims, said the matter could not wait for the shutdown to be resolved.

Significant portions of the federal government have been closed since Dec. 22, as President Trump has refused to sign any spending bill that doesn’t include billions of dollars to construct a wall along the U.S.-Mexico border. The House of Representatives and Senate have both passed bills excluding the money. After FCC funds dried up on Jan. 3, the commission said it would cease all work irrelevant to “the protection of life and property.”

Attention spiked over the phone-tracking issue last week after Motherboard revealed that one of its reporters managed to acquire the location of a cellphone in Queens, New York, through a $300 back-alley deal. The report followed a New York Times story last April that disclosed how phone-location data was being funneled by “middleman” companies from carriers such as AT&T and T-Mobile to law enforcement officials who would otherwise require a warrant to obtain it.

Motherboard security reporter Joseph Cox wrote that he had acquired phone-location data from a source in the bail bond industry during the course of an undercover investigation and that the sensitive data was being routinely acquired by bond agents (read “bounty hunters”) without a subpoena or prior approval from the courts.

In a letter to the FCC chairman last week, Pallone said it was paramount his committee investigate the matter at once and that it could not wait “until President Trump decides to reopen the government.” However, committee members said on Monday that Pai had declined to brief them citing the shutdown, while asserting (in Pallone’s words) that the matter was “not a threat to the safety of human life or property.”

Neither Pai nor his chief of staff, Matthew Berry, placed the call notifying Pallone’s office of his refusal, according to a senior Democratic aide, who said the news came instead from a lower-level staffer.

Pallone responded to Pai’s decision in a statement, saying, “There’s nothing in the law that should stop the Chairman personally from meeting about this serious threat that could allow criminals to track the location of police officers on patrol, victims of domestic abuse, or foreign adversaries to track military personnel on American soil.”

Pai’s office did not immediately respond to a request for comment.

In an email to Gizmodo, Sen. Ron Wyden, who raised questions about the extralegal sale of location data to correctional officers last spring, criticized Pai for having “enough time to tweet cat videos and tired memes,” while refusing “to brief Congress about a real threat to every American’s security.”

“It’s a new low for someone who has spent his tenure at the FCC refusing to do his job and stand up for American consumers,” added Wyden.

In his letter to Pai last week, Pallone wrote:

Bad actors can use location information to track individuals’ physical movements without their knowledge or consent. If recent reports detailing the cheap, accurate, and easy accessibility of legally protected, real-time location data are true, we must work expeditiously to address these public safety concerns. If we don’t, the privacy and security of everyone who subscribes to wireless phone service from certain carriers—including government officials, military personnel, domestic violence victims, and law enforcement officials—may be compromised.

Pallone noted that one of Pai’s colleagues—Commissioner Jessica Rosenworcel—had offered to make herself available to the committee, though she lacks the authority to direct agency resources.

“This is an issue of personal and national security,” Rosenworcel, a Democrat, told Gizmodo by phone. “It needs an FCC investigation, the public deserves answers. There’s absolutely no excuse for delay.” She also concurred with Pallone’s assessment that nothing in the law prevents Pai from briefing the committee during a shutdown.

“What strikes me most, having spent some time studying this, is that this is a data ecosystem with no oversight,” she said. It’s also very troubling, she said, that wireless carriers, which have legitimate use for location data, have so easily turned it over to “shady middlemen,” who seem to have little compunction about how it used, or by whom.

“It’s not clear to me that I ever consented to that happening, and I bet you didn’t either,” she said, adding, “The FCC should be investigating from top to bottom, we should be auditing to identify every one of those third parties that had access to that information, and we should be figuring out if consumers ever gave their consent for this to occur.”

T-Mobile, Sprint, and AT&T have each said publicly that they plan to terminate all contracts with location aggregators—the companies, like Zumigo, which have been implicated in handing location data over to unauthorized parties (including the bond agents). T-Mobile, whose underhanded data practices were central to the Motherboard investigation, has said it will finish terminating its contracts with the location companies in March. (It first claimed to be doing so last May.)

The nation’s other major wireless carrier, Verizon, whose data was not available for purchase in Motherboard’s story, told Gizmodo its ties to the companies were all cut last year.

“As you’re most likely aware, Verizon is not among the companies cited in recent media accounts regarding issues with location tracking. We have followed through on our commitment to terminate aggregation arrangements and provide location information only with the express consent of our customers,” a Verizon spokesperson said.

The company added that it had maintained prior arrangements with four roadside assistance companies during the winter “for public safety reasons,” but that those companies “agreed to transition out of the existing arrangements by the end of March.”
https://gizmodo.com/ajit-pai-refuses...ing-1831750774





Democrats Aren’t Buying a Proposal for Big Tech to Write its Own Privacy Rules

‘Big tech cannot be trusted to write its own rules’
Makena Kelly

For the past year, discussions involving data privacy have heated up in Congress, and new federal legislation now seems inevitable. Today, a leading technology policy think tank, supported by Google, Amazon, and Facebook, proposed a “grand bargain” with lawmakers, arguing that any new federal data privacy bill should preempt state privacy laws and repeal the sector-specific federal ones entirely.

The Information Technology and Innovation Foundation’s (ITIF) proposal lays out a few basic characteristics for legislation that the industry has frequently discussed in the past, like requiring more transparency, data interoperability, and users to opt into the collection of sensitive personal data. All 50 states have their own laws when it comes to notifying users after a data breach, and ITIF asks for a single breach standard in order to simplify compliance. It also calls to expand the Federal Trade Commission’s authority to fine companies that violate the data privacy law, something industry leaders have asked for in the past.

But the “bargain” would also preempt state laws like California’s new privacy act, and repeal every other existing piece of federal privacy legislation, including landmark laws like the Health Insurance Portability and Accountability Act (HIPAA) and the Family Educational Rights and Privacy Act (FERPA). Every sector- or issue-specific privacy law would be removed, and state and local lawmakers would be unable to draft stricter, more specific regulations in the future.

The Children’s Online Privacy Protection Act (COPPA) would be one of the repealed laws. It was authored by Sen. Ed Markey (D-MA) in the late ‘90s, and it was one of the first pieces of legislation governing the collection of data. The law imposes requirements on companies when it comes to collecting data on children under 13 years of age, which has become a sticking point for a number of tech companies. Both Google and Facebook have been sued multiple times for violating COPPA, and the law is one of the main reasons many web services cut off access at age 13.

Markey, who oversees tech regulation as part of the Senate Commerce Committee, says throwing out COPPA is a step too far.

“As Congress works to provide the American people with a comprehensive federal privacy law, we should build upon—not dismantle—existing safeguards,” Markey said, responding to ITIF’s proposal. “Getting rid of COPPA is literally like throwing the baby out with the bath water.”

But tech groups like ITIF argue that a “patchwork” of privacy legislation would stifle innovation and increase service prices for consumers. “Privacy regulations aren’t free—they create costs for consumers and businesses, and if done badly, they could undermine the thriving U.S. digital economy,” said ITIF senior policy analyst Alan McQuinn, co-author of the report. “To avoid throwing a wrench into the digital economy and imposing expensive compliance burdens on businesses across all sectors, any data privacy regulations should create rules that facilitate data collection, use, and sharing while also empowering consumers to make informed choices about their data privacy.”

Sen. Richard Blumenthal (D-CT) has worked alongside Markey in the past, calling for the Federal Trade Commission to investigate companies like Google for potential COPPA violations. To Blumenthal, the think tank’s latest proposal is self-serving. “If Big Tech thinks this is a reasonable framework for privacy legislation, they should be embarrassed,” Blumenthal said. “This proposal would protect no one – it is only a grand bargain for the companies who regularly exploit consumer data for private gain and seek to evade transparency and accountability.”

“Big tech cannot be trusted to write its own rules – a reality this proposal only underscores,” Blumenthal said. “I look forward to rolling out bipartisan privacy legislation that does in fact ‘maximize consumer privacy,’ and puts consumers first.”
https://www.theverge.com/2019/1/14/1...pa-coppa-hipaa





Feds Can't Force You To Unlock Your iPhone With Finger Or Face, Judge Rules
Thomas Brewster

A California judge has ruled that American cops can’t force people to unlock a mobile phone with their face or finger. The ruling goes further to protect people’s private lives from government searches than any before and is being hailed as a potentially landmark decision.

Previously, U.S. judges had ruled that police were allowed to force unlock devices like Apple’s iPhone with biometrics, such as fingerprints, faces or irises. That was despite the fact that the feds weren’t permitted to force a suspect to divulge a passcode. But according to a ruling uncovered by Forbes, all logins are equal.

The order came from the U.S. District Court for the Northern District of California in the denial of a search warrant for an unspecified property in Oakland. The warrant was filed as part of an investigation into a Facebook extortion crime, in which a victim was asked to pay up or have an “embarrassing” video of them publicly released. The cops had some suspects in mind and wanted to raid their property. In doing so, the feds also wanted to open up any phone on the premises via facial recognition, a fingerprint or an iris.

While the judge agreed that investigators had shown probable cause to search the property, they didn’t have the right to open all devices inside by forcing unlocks with biometric features.

On the one hand, magistrate judge Kandis Westmore ruled the request was “overbroad” as it was “neither limited to a particular person nor a particular device.”

But in a more significant part of the ruling, Judge Westmore declared that the government did not have the right, even with a warrant, to force suspects to incriminate themselves by unlocking their devices with their biological features. Previously, courts had decided biometric features, unlike passcodes, were not “testimonial.” That was because a suspect would have to willingly and verbally give up a passcode, which is not the case with biometrics. A password was therefore deemed testimony, but body parts were not, and so not granted Fifth Amendment protections against self-incrimination.

That created a paradox: How could a passcode be treated differently to a finger or face, when any of the three could be used to unlock a device and expose a user’s private life?

And that’s just what Westmore focused on in her ruling. Declaring that “technology is outpacing the law,” the judge wrote that fingerprints and face scans were not the same as “physical evidence” when considered in a context where those body features would be used to unlock a phone.

“If a person cannot be compelled to provide a passcode because it is a testimonial communication, a person cannot be compelled to provide one’s finger, thumb, iris, face, or other biometric feature to unlock that same device,” the judge wrote.

“The undersigned finds that a biometric feature is analogous to the nonverbal, physiological responses elicited during a polygraph test, which are used to determine guilt or innocence, and are considered testimonial.”

There were other ways the government could get access to relevant data in the Facebook extortion case “that do not trample on the Fifth Amendment,” Westmore added. They could, for instance, ask Facebook to provide Messenger communications, she suggested. Facebook has been willing to hand over such messages in a significant number of previous cases Forbes has reviewed.

Law finally catching up with tech?

Over recent years, the government has drawn criticism for its smartphone searches. In 2016, Forbes uncovered a search warrant not dissimilar to the one in California. Again in the Golden State, the feds wanted to go onto a premises and force unlock devices with fingerprints, regardless of what phones or who was inside.

Andrew Crocker, senior staff attorney at the digital rights nonprofit Electronic Frontier Foundation, said the latest California ruling went a step further than he’d seen other courts go. In particular, Westmore observed alphanumeric passcodes and biometrics served the same purpose in unlocking phones.

“While that’s a fairly novel conclusion, it’s important that courts are beginning to look at these issues on their own terms,” Crocker told Forbes. “In its recent decisions, the Supreme Court has made clear that digital searches raise serious privacy concerns that did not exist in the age of physical searches—a full forensic search of a cellphone reveals far more than a patdown of a suspect’s pockets during an arrest for example.”

The magistrate judge decision could, of course, be overturned by a district court judge, as happened in Illinois in 2017 with a similar ruling. The best advice for anyone concerned about government overreach into their smartphones: Stick to a strong alphanumeric passcode that you won’t be compelled to disclose.
https://www.forbes.com/sites/thomasb...e-judge-rules/





Nielsen: 16M U.S. Homes Now Get TV Over-the-Air, a 48% Increase Over Past 8 Years
Sarah Perez

The number of U.S. households without a traditional cable or satellite TV subscription that instead receive broadcast stations using a digital antenna has jumped by nearly 50 percent over the past 8 years to reach 16 million homes, according to a new report from Nielsen. Today, 14 percent of all U.S. TV households are watching television over the air, it found.

The measurement firm says there are basically two camps among this group of cord cutters.

One, which tends to consist of older viewers with a median age of 55, exclusively watches TV via their antenna – they don’t subscribe to any streaming service.

This group, totalling 6.6 million homes, tends to be more diverse and have a smaller median income – which makes sense. For them, cord cutting may be more of a cost-saving tool, rather than a way to combine free content with other paid services to create a personalized TV experience.

The other group, totalling 9.4 million homes, has at least one subscription video service, like Netflix, Hulu, or Amazon Prime Video, for example. They tend to be younger, with a median age of 36 – as well as more affluent, and more device-connected, says Nielsen.

Because they’re spending more time on devices doing other things – perhaps gaming or using social networks – they consume less traditional media. That impacts the time spent watching TV.

The group of cord cutters watching over-the-air TV who don’t have access to a subscription video service watches over 6 hours per day. That’s 2 hours more than those with a subscription service, the study found.

The group using subscription services are more active on social media, too, likely as a result of their age and their numerous devices. They spend an hour per day, on average, using social media – 17 minutes more than the group without subscription video.

But both groups tend to watch the majority of “TV” content on their television. Despite the increased use of devices like smartphones and tablets, it seems that TV viewing continues to largely take place on the big screen.

Also of note, there’s a small but growing subgroup among the cord cutters who have subscription services who additionally have access to a virtual provider. These are the streaming services offering live TV – like YouTube TV, Hulu with Live TV, PlayStation Vue, or Sling TV. This group has grown to over 1.3 million homes as of May 2018, Nielsen claims. (Keep in mind Nielsen’s numbers are counting TV households in the U.S., not individual user accounts to these services.)

The full report dug deeper into this third segment, and found they tend to be 56% more likely to have a college degree, 19% more likely to have children, and 95% more likely to have an internet connected device, compared with an average home. They also watch slightly more TV than the other “plus SVOD (subscription video on demand)” group at 3 hours, 27 minutes per day, compared with 3 hours, 22 minutes, Nielsen says.
https://techcrunch.com/2019/01/15/ni...-past-8-years/





After GDPR, The New York Times Cut off Ad Exchanges in Europe — and Kept Growing Ad Revenue
Jessica Davies

When the General Data Protection Regulation arrived last year, The New York Times didn’t take any chances.

The publisher blocked all open-exchange ad buying on its European pages, followed swiftly by behavioral targeting. Instead, NYT International focused on contextual and geographical targeting for programmatic guaranteed and private marketplace deals and has not seen ad revenues drop as a result, according to Jean-Christophe Demarta, svp for global advertising at New York Times International.

Currently, all the ads running on European pages are direct-sold. Although the publisher doesn’t break out exact revenues for Europe, Demarta said that digital advertising revenue has increased significantly since last May and that has continued into early 2019.

“The fact that we are no longer offering behavioral targeting options in Europe does not seem to be in the way of what advertisers want to do with us,” he said. “The desirability of a brand may be stronger than the targeting capabilities. We have not been impacted from a revenue standpoint, and, on the contrary, our digital advertising business continues to grow nicely.”

The NYT briefly tested reintroducing open-exchange programmatic ad buying last fall but didn’t pursue it. “When we weighed all considerations, it was decided not to continue with it,” added Demarta.

The last-minute scramble to prepare for GDPR’s arrival last May led to some U.S. publishers taking a more extreme approach and either blocking pages entirely in Europe or pulling advertising altogether. USA Today pulled all advertising in Europe — a strategy it still sticks to. In total, more than 1,000 U.S. websites blocked access in Europe last May — an extreme but understandable response given the eye-watering GDPR penalties that can be levied should they get it wrong. The Los Angeles Times has started making some of its content available in certain countries like the U.K. and France, though it still carries a notice on its site warning that it is unavailable in most European countries.

That more extreme approach may fly for U.S. publishers whose European revenue and traffic is negligible compared to the overall business. However, for publishers with meaningful revenue coming from Europe, that approach is unsustainable.

The New York Times has 2.9 million paying digital subscribers globally, and 15 percent of the publisher’s digital news subscribers are from Europe. Digital advertising in Europe also remains an important revenue stream for the publisher. The publisher’s reader-revenue business model means it fiercely guards its readers’ user experience. Rather than bombard readers with consent notices or risk a clunky consent user experience, it decided to drop behavioral advertising entirely.

Business Insider also didn’t want to cut off a healthy European ad revenue stream. The publisher still runs personalized ads in Europe but only to those who have given consent for their data to be used for such purposes. Opt-in rates are high, so ad revenues haven’t been affected, according to Marc Boswell, global svp of revenue, operations and client services at Business Insider.

BI’s European ad revenue accounts for approximately 10 percent of its total ad revenue, a large enough slice to ensure blocking advertising was never an option, added Boswell. The publisher started preparing for GDPR in January 2018, and used Axel Springer’s open-source CMP from the start.

Those who have developed a more sophisticated approach to GDPR, rather than the blunter tactic of blocking all visitors from Europe, will be in a stronger position for the arrival of pending U.S. privacy laws, such as the California Consumer Privacy Act, experts believe. “GDPR is just the beginning,” said Brian Kane, chief operations officer and co-founder of Sourcepoint. “Whether it is CCPA or other state-based regulation in the U.S. or [data privacy] trends in other countries like Canada and Japan, publishers will be prompted to lean in. They can no longer afford to view it as ‘well, it’s only a small percentage of my traffic being affected’ — publishers will have to do something.”

The scramble that preceded GDPR’s arrival last May isn’t likely to be repeated when it comes to preparing for the U.S. laws, according to industry executives. The CCPA doesn’t kick in until 2020, but three U.S. media conglomerates are already actively seeking consent-management platform partners, according to sources with knowledge of the situation. CMPs store the information on which users have given consent to be tracked and served personalized ads, and pass that information back to all the publisher’s programmatic advertising partners.

“We’re seeing renewed interest from large U.S. publishers in revising their consent strategies,” said Kane. “Similar to what we saw in the U.K., there are many publishers who had a ‘day one’ solution for GDPR and are now wanting to run more robust processes for the long term, both for GDPR as well as the U.S. privacy laws.”

So far, the U.K. regulator, the Information Commissioner’s Office, has favored the carrot over the stick, particularly when it comes to leveraging its jurisdiction with U.S. publishers. The Washington Post went its own route and developed a paid subscription offer for those who don’t want to give consent to be tracked. The publisher received a mild wrist slap from the ICO in the form of a warning letter, but the regulator isn’t likely to enforce the matter.

“If you’d asked me a year and a half ago if I was concerned about the California privacy law, I’d have said yes, very,” said Boswell. “But since we have gone through GDPR, that’s no longer the case. There are minor differences between the policies, but we will likely apply a modified version of the same framework.”
https://digiday.com/media/new-york-t...pe-ad-revenue/





EU Cancels 'Final' Negotiations On EU Copyright Directive As It Becomes Clear There Isn't Enough Support
Mike Masnick

So, this is certainly unexpected. Just hours after we pointed out that even all of the lobbyists who had written/pushed for Article 13 in the EU Copyright Directive were now abandoning their support for it (basically because the EU was considering making it just slightly less awful), it appears that Monday's negotiations have been called off entirely:

BREAKING: Council has failed to find an agreement on its #copyright position today. This doesn’t mean that #Article11 and #Article13 are dead, but their adoption has just become a lot less likely. Let’s keep up the pressure now! https://t.co/DEYBhuRyGz #SaveYourInternet

— Julia Reda (@Senficon) January 18, 2019


Apparently multiple countries -- including Germany, Italy, the Netherlands and Poland -- made it clear they would not support the latest text put forth by Romania, and therefore would have blocked it from moving forward. Monday's negotiations were supposed to have been the "final" negotiations (after the previous "final" negotiations that didn't accomplish much) around a "compromise" bill that then would have gone out to be voted on by the EU Council, the EU Commission and the EU Parliament in the next few months. However, with the news of all those countries (via the EU Council) deciding to vote against the proposal, it effectively blocks it for now.

MEP Julia Reda now has the full breakdown of the votes, noting that 11 countries voted against the "compromise" text: Germany, Belgium, the Netherlands, Finland, Slovenia, Italy, Poland, Sweden, Croatia, Luxembourg and Portugal. That's... a pretty big list. Reda points out that most of those countries were concerned about the impact on users' rights (Portugal and Croatia appear to be outliers). That's pretty big -- as it means that any new text (if there is one) should move in a better direction, not worse.

As Reda notes, this does not mean that the Copyright Directive or Article 13 are dead. They could certainly be revived with new negotiations (and that could happen soon). But, it certainly makes the path forward a lot more difficult. Throughout all of this, as we've seen in the past, the legacy copyright players plowed forward, accepting no compromise and basically going for broke as fast as they could, in the hopes that no one would stop them. They've hit something of a stumbling block here. It won't stop them from still trying, but for now this is good news. The next step is making sure Article 13 is truly dead and cannot come back. The EU has done a big thing badly in even letting things get this far. Now let's hope they fix this mess by dumping Articles 11 and 13.
https://www.techdirt.com/articles/20...-support.shtml





The (Almost) Secret Algorithm Researchers Used to Break Thousands of RSA Keys
William Kuszmaul

RSA encryption allows for anyone to send me messages that only I can decode. To set this up, I select two large random primes p and q (each of which is hundreds of bits long), and release their product x = p \cdot q online for everyone to see; x is known as my public key. In addition, I pick some number e which shares no factors with p-1 or q-1 and release it online as well.

The beauty of RSA encryption is that using only the information I publicly released, anyone can encode a message they want to send me. But without knowing the values of p and q, nobody but me can decode the message. And even though everyone knows my public key x = p \cdot q, that doesn’t give them any efficient way to find values for p or q. In fact, even factoring a 232-digit number took a group of researchers more than 1,500 years of computing time (distributed among hundreds of computers).
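The setup above can be made concrete with a toy sketch in Python. The primes here are deliberately tiny for readability (real keys use primes hundreds of bits long), and the variable names are mine, not the post's:

```python
# Toy RSA keypair, following the post's setup: public modulus x = p * q and
# public exponent e sharing no factors with p - 1 or q - 1.
# Tiny primes for illustration only; real keys use primes hundreds of bits long.
p, q = 61, 53
x = p * q                          # public key: x = 3233
phi = (p - 1) * (q - 1)            # only computable if you know p and q
e = 17                             # shares no factors with p - 1 = 60 or q - 1 = 52
d = pow(e, -1, phi)                # private exponent (Python 3.8+ modular inverse)

message = 42
ciphertext = pow(message, e, x)    # anyone can encrypt using only (x, e)
decrypted = pow(ciphertext, d, x)  # decryption requires d, and hence p and q
assert decrypted == message
```

The private exponent d is derived from phi = (p-1)(q-1), which is exactly why publishing x alone gives attackers nothing unless they can factor it.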

On the surface, RSA encryption seems uncrackable. And it might be too, except for one small problem. Almost everyone uses the same random-prime-number generators.

A few years ago, this gave researchers an idea. Suppose Bob and Alice both post public keys online. But since they both used the same program to generate random prime numbers, there’s a higher-than-random chance that their public keys share a prime factor. Factoring Bob’s or Alice’s public keys individually would be nearly impossible. But finding any common factors between them is much easier. In fact, the time needed to compute the largest common divisor between two numbers is close to proportional to the number of digits in the two numbers. Once I identify the common prime factor between Bob’s and Alice’s keys, I can factor it out to obtain the prime factorization of both keys. In turn, I can decode any messages sent to either Bob or Alice.
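The two-key version of the attack is only a few lines. A hypothetical illustration in Python, with toy-sized primes standing in for real hundred-digit ones:

```python
from math import gcd

# Hypothetical scenario: Bob's and Alice's moduli came from the same bad RNG
# and share the prime p. Toy-sized primes for illustration only.
p = 1000003             # the shared prime
bob_key = p * 1000033
alice_key = p * 1000037

# Factoring either key alone is hard; their common factor falls out almost
# instantly, since GCD time is close to proportional to the number of digits.
shared = gcd(bob_key, alice_key)
assert shared == p

# Dividing out the shared prime completes both factorizations.
bob_other = bob_key // shared       # Bob's second prime
alice_other = alice_key // shared   # Alice's second prime
```

With both primes of each key in hand, the attacker can reconstruct both private keys exactly as in normal RSA key generation.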

Armed with this idea, the researchers scanned the web and collected 6.2 million actual public keys. They then computed the largest common divisor between pairs of keys, cracking a key whenever it shared a prime factor with any other key. All in all, they were able to break 12,934 keys. In other words, if used carelessly, RSA encryption provides less than 99.8% security.

At first glance this seems like the whole story. Reading through their paper more closely, however, reveals something strange. According to the authors, they were able to run the entire computation in a matter of hours on a single core. But a back-of-the-envelope calculation suggests that it should take years to compute GCD’s between 36 trillion pairs of keys, not hours.

So how did they do it? The authors hint in a footnote that at the heart of their computation is an asymptotically fast algorithm, allowing them to bring the running time of the computation down to nearly linear; but the actual description of the algorithm is kept a secret from the reader, perhaps to guard against malicious use. Within just months of the paper’s publication, though, follow-up papers had already discussed various approaches in detail, both presenting fast algorithms (see this paper and this paper), and even showing how to use GPUs to make the brute-force approach viable (see this paper).

There’s probably a lesson here about not bragging about things if you want them to stay secret. But for this post I’m not interested in lessons. I’m interested in algorithms. And this one turns out to be both relatively simple and quite fun.

Algorithm Prerequisites: Our algorithm will deal with integers having an asymptotically large number of digits. Consequently, we cannot treat addition and multiplication as constant-time operations.

For n-bit integers, addition takes O(n) time. Using long multiplication, multiplication would seem to take O(n^2) time. However, it turns out there is an algorithm which runs in time O(n \log n \log \log n).

Computing the GCD naively using the Euclidean algorithm would take O(n^2 \log n \log \log n) time. Once again, however, researchers have found a better algorithm, running in time O(n \log^2 n \log \log n).

Fortunately, all of these algorithms are already implemented for us in GMP, the GNU multiple-precision arithmetic library for C and C++. For the rest of the post we will use Big-O-Tilde notation, a variant of Big-O notation that ignores logarithmic factors. For example, while GCD computation takes time O(n \log^2 n \log \log n), in Big-O-Tilde notation we write that it takes time \widetilde{\text{O}}(n).

Transforming the Problem: Denote the set of public RSA keys by k_1, \ldots, k_n, where each key is the product of two large prime numbers (each roughly a hundred digits long). Note that n is the total number of keys. Rather than computing the GCD of each pair of keys, we can instead compute, for each key k_i, the GCD of k_i and the product of all the other keys, \prod_{t \neq i} k_t. If a key k_i shares exactly one prime factor with other keys, then this provides that prime factor. If both prime factors of k_i are shared with other keys, however, then the computation will fail to extract the individual prime factors. This case is probably rare enough that it’s not worth worrying about.
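The transformed problem can be sketched directly (a toy Python example with hypothetical tiny moduli; computing the full product naively like this is itself too slow at scale, which is the problem the recursion solves):

```python
from math import gcd, prod

# Hypothetical toy moduli; the first two share the prime factor 101.
keys = [101 * 103, 101 * 107, 109 * 113]

total = prod(keys)
for k in keys:
    s = gcd(k, total // k)  # GCD of this key with the product of all the others
    print(s)
# The first two keys yield 101 (the shared prime); the third yields 1.
```

A result greater than 1 for key k_i immediately gives a nontrivial factor of k_i.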

The Algorithm: The algorithm has a slightly unusual recursive structure in that the recursion occurs in the middle of the algorithm rather than at the end.

At the beginning of the algorithm, all we have is the keys,

k_1,
k_2,
k_3, \cdots

The first step of the algorithm is to pair off the keys and compute their products,

j_1 = k_1 \cdot k_2,
j_2 = k_3 \cdot k_4,
j_3 = k_5 \cdot k_6, \cdots

Next we recurse on the sequence of numbers j_1, \ldots, j_{n / 2}, in order to compute

r_1 = GCD(j_1, \prod_{t \neq 1} j_t),
r_2 = GCD(j_2, \prod_{t \neq 2} j_t),
r_3 = GCD(j_3, \prod_{t \neq 3} j_t), \cdots

Our goal is to compute s_i = GCD(k_i, \prod_{t \neq i} k_t) for each key k_i. The key insight is that when i is odd, s_i can be expressed as
s_i = GCD(k_i, r_{(i + 1) / 2} \cdot k_{i + 1}),
and that when i is even, s_i can be expressed as
s_i = GCD(k_i, r_{i / 2} \cdot k_{i - 1}).
To see why this is the case, one can verify that the term on the right side of the GCD is guaranteed to be a multiple of GCD(k_i, \prod_{t \neq i} k_t), while also being a divisor of \prod_{t \neq i} k_t. This, in turn, implies that the GCD-computation will evaluate to exactly GCD(k_i, \prod_{t \neq i} k_t), as desired.

Computing each of the s_i's in terms of the r_i's and k_i's completes the algorithm.
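The recursion above can be sketched in Python (a minimal illustration with hypothetical toy keys; a serious implementation would use GMP's fast arithmetic as discussed earlier). Indices here are 0-based, so the odd/even cases are swapped relative to the 1-indexed formulas, and an odd-length list carries its last key through unpaired.

```python
from math import gcd

def batch_gcd(keys):
    """Return s_i = gcd(k_i, product of all the other keys) for each key."""
    n = len(keys)
    if n == 1:
        return [1]  # the empty product of "others" is 1
    # Pair off the keys and compute the products j_t of each pair;
    # an odd leftover key passes through on its own.
    pairs = [keys[i] * keys[i + 1] for i in range(0, n - 1, 2)]
    if n % 2 == 1:
        pairs.append(keys[-1])
    # Recurse: r[t] = gcd(pairs[t], product of all the other pairs).
    r = batch_gcd(pairs)
    s = []
    for i in range(n):
        if i % 2 == 0 and i + 1 < n:   # left member of a pair
            s.append(gcd(keys[i], r[i // 2] * keys[i + 1]))
        elif i % 2 == 1:               # right member of a pair
            s.append(gcd(keys[i], r[i // 2] * keys[i - 1]))
        else:                          # unpaired final key: r[-1] already equals s_i
            s.append(r[-1])
    return s

# Hypothetical toy keys; the first two share the prime factor 101.
keys = [101 * 103, 101 * 107, 109 * 113]
print(batch_gcd(keys))  # [101, 101, 1]
```

Each level of recursion halves the number of integers while keeping the total bit-length roughly fixed, which is what the running-time analysis below exploits.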

Bounding the Running Time: Let m denote the total number of bits needed to write down k_1, k_2, \ldots, k_n. Each time the algorithm recurses, the total number of bits in the input being recursed on is guaranteed to be no more than at the previous level of recursion; this is because the new inputs are products of pairs of elements from the old input.

Therefore each of the O(\log n) levels of recursion acts on an input of total size O(m) bits. Moreover, the arithmetic operations within each level of recursion take total time at most \widetilde{\text{O}}(m). Thus the total running time of the algorithm is also \widetilde{\text{O}}(m) (since the O(\log n) recursion levels can be absorbed into the Big-O-Tilde notation).

If we unwrap the running time into standard Big-O notation, we get
O(m \log^3 m \log \log m).

Is it practical? At first glance, the pile of logarithmic factors might seem like a deal breaker for this algorithm. In practice, though, the performance is quite reasonable: this paper found that the algorithm takes roughly 7.65 seconds per thousand keys, meaning it would take a little more than 13 hours to run on 6.2 million keys.

One of the log factors can be removed using a slightly more clever variant of the algorithm, which avoids GCD computations at all but the first level of recursion (see this paper). The improved algorithm takes about 4.5 seconds per thousand keys, resulting in a total running time of about 7.5 hours to handle 6.2 million keys.

So there we go. A computation that should have taken years is reduced to a matter of hours. And all it took was a bit of clever recursion.
https://algorithmsoup.wordpress.com/...rt-1-the-hack/





The Super-Secure Quantum Cable Hiding in the Holland Tunnel
Jeremy Kahn

Commuters inching through rush-hour traffic in the Holland Tunnel between Lower Manhattan and New Jersey don’t know it, but a technology likely to be the future of communication is being tested right outside their car windows. Running through the tunnel is a fiber-optic cable that harnesses the power of quantum mechanics to protect critical banking data from potential spies.

The cable’s trick is a technology called quantum key distribution, or QKD. Any half-decent intelligence agency can physically tap normal fiber optics and intercept whatever messages the networks are carrying: They bend the cable with a small clamp, then use a specialized piece of hardware to split the beam of light that carries digital ones and zeros through the line. The people communicating have no way of knowing someone is eavesdropping, because they’re still getting their messages without any perceptible delay.

QKD solves this problem by taking advantage of the quantum physics notion that light—normally thought of as a wave—can also behave like a particle. At each end of the fiber-optic line, QKD systems, which from the outside look like the generic black-box servers you might find in any data center, use lasers to fire data in weak pulses of light, each just a little bigger than a single photon. If any of the pulses’ paths are interrupted and they don’t arrive at the endpoint at the expected nanosecond, the sender and receiver know their communication has been compromised.

“Financial firms see this as a differentiator,” says John Prisco, chief executive officer of Quantum Xchange, the company that’s been operating the cable in the Holland Tunnel since the fall. Prisco says several large banks and asset management firms are testing his gear, but he declined to name them, citing nondisclosure agreements. The companies are considering using QKD to guard their most sensitive secrets, he says, including trading algorithms and customer settlement accounts. Quantum Xchange, based in Bethesda, Md., says it hopes to stretch its cables from Boston to Washington, D.C., and is also promoting them to U.S. government agencies.

Estimates of the annual QKD market range from $50 million to $500 million, but market researcher Global Industry Analysts Inc. says demand for QKD and related technologies may reach $2 billion by 2024. The Chinese government has created a 1,240-mile QKD-protected link between Beijing and Shanghai. It’s also demonstrated the ability to use QKD to transmit and receive messages from a satellite. And a half-dozen QKD startups are pitching other kinds of clients. Qubitekk Inc., a startup in Southern California, has a U.S. Department of Energy contract for a pilot project to secure the communications that help operate power stations. Telecommunications giants including the U.K.’s BT Group Plc and Japan’s NTT Corp. say they’re considering whether to build the protection into their network infrastructure.

Why bother when most network traffic is already encrypted? Encryption is worthless if an attacker manages to get the digital keys used to encode and decode messages. Each key is usually extra-encrypted, but documents disclosed by former National Security Agency contractor Edward Snowden in 2013 showed that the U.S. government, which hoovers up most of the world’s internet traffic, can also break those tougher codes. Exactly how the NSA accomplishes this isn’t widely known. (One suspicion is that while keys are supposed to be based on multiplying two random large prime numbers together, many systems use a relatively small subset of primes, making it much easier for a computer to guess the key.)

Quantum computers are another potential threat to conventional encryption. Like QKD systems, these machines use quantum physics principles to process information and may one day achieve processing power far beyond that of conventional computers. When that happens—in the next 3 to 15 years, depending on whose estimate is right—quantum computers will give almost any user the code-breaking powers of today’s NSA. In 2016 the NSA warned companies that do business with the U.S. government that their next generation of encryption systems would have to be resistant to attacks by quantum computers.

QKD has limits. It can protect data only in transit, not when it’s at rest, stored in data centers or on hard drives. And because fiber-optic cabling itself absorbs some light, a single photon can travel only so far. Scientists have pushed the boundary ever outward, as far as 260 miles in lab experiments. Yet for high-speed transmissions under real-world conditions, the record is just 60 miles. Farther transmissions require a series of “trusted nodes,” relays that are themselves vulnerable to hackers or physical tapping. China uses armed guards to secure the nodes in its 1,240-mile QKD network, says Anthony Lawrence, a former NSA network security expert and briefing officer who now runs cybersecurity startup Vor Technology LLC.

One sure way to avoid these security and distance issues is simply to cut the cord. British startup Kets Quantum Security Ltd. is working with Airbus SE on using QKD to secure communications between a drone and its operator on the ground. And satellite relays will eventually be able to transmit quantum-encrypted signals almost anywhere on Earth, predicts Lawrence, who’s working to commercialize QKD. For the moment, though, the signals are stuck in the Holland Tunnel.
https://www.bloombergquint.com/busin...holland-tunnel





A New Cryptojacking Tactic that Involves Wikipedia and Downloaded Movie Files has Been Discovered
Samuel Tan

A new method of cryptojacking has been discovered. The tactic uses a downloadable movie file as bait, then mines cryptocurrency in the background without your knowledge. In addition, the malware attempts to steal cryptocurrency through fake Wikipedia donation appeals, according to security researcher ‘@0xffff0800’.

A few days back we published an article, “New research shows that Cryptojacking is responsible for more than 4% of Monero’s supply”.

In that article, we briefly went over the process of cryptojacking and how it works. To keep it short: hackers install malware on your computer that secretly mines cryptocurrency.

This is not a small issue by any means, and it is continually growing. McAfee Labs – one of the most well-known cybersecurity companies in the world – recently released a report highlighting how hackers often spread cryptojacking malware through chat platforms such as Slack and Discord. The report also pointed out that crypto-mining malware grew by an astonishing 4,000% throughout 2018.

Back to the new cryptojacking tactic that involves Wikipedia

The tactic was discovered by a security researcher who remains anonymous but runs the Twitter account @0xffff0800, which is dedicated entirely to technology and cybersecurity.

The malware launches a PowerShell command, which then inserts malicious code into the Firefox browser. The attack is distributed through movie torrent files and targets Windows computers in particular. The point of the attack is to look for any Bitcoin or Ethereum addresses the user might have; the virus then replaces the victims’ addresses with the hacker’s own wallet address.

Then there is the donation scam involving Wikipedia that cryptocurrency cybercriminals are now using. The same virus injects code that adds a fake donation banner to Wikipedia pages. While Wikipedia does accept donations, the cryptocurrency addresses shown in the fake banner belong to the attacker’s wallets.

Most people have seen the banner before. If you go to Wikipedia’s home page now, you should be able to see the genuine banner as well, but be careful to ensure that you are actually on Wikipedia’s true home page.
https://cryptomenow.com/a-new-crypto...en-discovered/





To Save the Sound of a Stradivarius, a Whole City Must Keep Quiet
Max Paradiso

Florencia Rastelli was mortified. As an expert barista, she had never spilled a single cup of coffee, she said. But last Monday, as she wiped the counter at Chiave di Bacco, the cafe where she works, she knocked over a glass and it shattered loudly on the floor.

The customers all stood still, petrified, Ms. Rastelli recalled. “I was like: Of all days, this one,” she said. “Even a police officer popped in and asked me to keep it down. I was so embarrassed.”

The people of Cremona are unusually sensitive to noise right now. The police have cordoned off streets in the usually bustling city center and traffic has been diverted. During a recent news conference, the city’s mayor, Gianluca Galimberti, implored Cremona’s citizens to avoid any sudden and unnecessary sounds.

Cremona is home to the workshops of some of the world’s finest instrument makers, including Antonio Stradivari, who in the 17th and 18th centuries produced some of the finest violins and cellos ever made. The city is getting behind an ambitious project to digitally record the sounds of the Stradivarius instruments for posterity, as well as others by Amati and Guarneri del Gesù, two other famous Cremona craftsmen. And that means being quiet.

A Stradivarius violin, viola or cello represents the pinnacle of sound engineering, and nobody has been able to replicate their unique tones.

Fausto Cacciatori, the curator of Cremona’s Museo del Violino, a museum devoted to musical instruments that is assisting with the project, said that each Stradivarius had “its own personality.” But, he added, their distinctive sounds “will inevitably change,” and could even be lost within just a few decades.

“It’s part of their life cycle,” Mr. Cacciatori said. “We preserve and restore them, but after they reach a certain age, they become too fragile to be played and they ‘go to sleep,’ so to speak.”

So that future generations won’t miss out on hearing the instruments, three sound engineers are producing the “Stradivarius Sound Bank” — a database storing all the possible tones that four instruments selected from the Museo del Violino’s collection can produce.

One of the engineers, Mattia Bersani, said that the sounds in the database could be manipulated with software to produce new recordings when the tone of the original instruments degraded. Musicians of the future would be able to “record a sonata with an instrument that will no longer function,” he said.

“This will allow my grandchildren to hear what a Strad sounded like,” said Leonardo Tedeschi, a former D.J. who came up with the idea for the project. “We are making immortal the finest instrument ever crafted.”

Throughout January, four musicians playing two violins, a viola and a cello will work through hundreds of scales and arpeggios, using different techniques with their bows, or plucking the strings. Thirty-two ultrasensitive microphones set up in the museum’s auditorium will capture the sounds.

Different Notes, Different Sounds

To preserve the instruments’ sounds, musicians have to play all the possible notes, and then record them in different combinations and using different techniques.

“It’ll be physically and mentally challenging for them,” said Thomas Koritke, a sound engineer from Hamburg, Germany, who is leading the project. “They’ll have to play hundreds of thousands of individual notes and transitions for eight hours a day, six days a week, for more than a month.”

Organizing the project had also taken a long time, Mr. Koritke added. “It took us a few years to convince the museum to let us use 500-year-old stringed instruments,” he said. Then they had to find top musicians who knew the instruments inside out. Then the acoustics of the auditorium, which was designed around the sound of the instruments, had to be studied, as well.

In 2017, the engineers thought their project was finally ready to get underway. But a soundcheck revealed a major flaw.

“The streets around the auditorium are all made of cobblestone, an auditory nightmare,” Mr. Tedeschi said. The sound of a car engine, or a woman walking in high heels, produces vibrations that run underground and reverberate in the microphones, making the recording worthless, he explained. “It was either shutting down the entire area or having the project not seeing the light of day,” Mr. Tedeschi said.

Luckily for the engineers, Cremona’s mayor is also the president of the Stradivarius Foundation, the municipal body that owns the Museo del Violino. He allowed the streets around the museum to be closed for five weeks, and appealed to people in the city to keep it down.

“We are the only city in the world that preserves both the instruments and their voices,” Mr. Galimberti said. “This is an extraordinary project that looks at the future, and I’m sure people from Cremona will understand that closing the area was inevitable.”

On Jan. 7, the police cordoned off the streets. The auditorium’s ventilation and elevators were turned off. Every light bulb in the concert hall was unscrewed to eliminate a faint buzzing sound.

Upstairs in the museum, Mr. Cacciatori put on a pair of velvet gloves and took a 1615 Amati viola from its glass display case. He inspected it thoroughly, and then a security guard escorted him and the instrument down two flights of stairs to the auditorium.

The curator handed the instrument to Wim Janssen, a Dutch viola player, who walked to the center of the stage.

He sat on a chair in the semidarkness under a cluster of microphones. The three engineers left the hall, and took their seats in a soundproofed room beneath the hall filled with speakers and computer screens, servers and cables.

Mr. Janssen wore an earpiece, through which Mr. Koritke relayed instructions.

“Go,” Mr. Koritke whispered.

The violist played a C-major scale as the recording team watched graphs on their screens responding to the crisp sound of the instrument. Mr. Tedeschi grinned in satisfaction.

Then it happened, and they froze. “Stop for a moment, please,” Mr. Koritke said, and the violist held his position.

The engineers rewound the recording, and played it again.

Mr. Koritke heard the problem, loud and clear: “Who dropped a glass on the floor?”
https://www.nytimes.com/2019/01/17/a...g-cremona.html





Court Rejects FCC Request to Delay Net Neutrality Case
Harper Neidig

A federal appeals court denied the Federal Communications Commission’s request to postpone oral arguments in a court battle over the agency’s decision to repeal its net neutrality rules.

The FCC had asked for the hearing to be postponed since the commission’s workforce has largely been furloughed due to the partial government shutdown.

The hearing remains set for February 1.

After the FCC repealed the rules requiring internet service providers to treat all web traffic equally in December of 2017, a coalition of consumer groups and state attorneys general sued to reverse the move, arguing that the agency failed to justify it.

The FCC asked the three-judge panel from the D.C. Circuit Court of Appeals to delay oral arguments out of “an abundance of caution” due to its lapse of funding.

Net neutrality groups opposed the motion, arguing that there is an urgent need to settle the legal questions surrounding the FCC’s order.

“Due to the FCC’s misguided and unlawful repeal of the network neutrality rules, consumers are at risk of substantial harm from Internet Service Providers (“ISPs”), which may now interfere with access to lawful Internet content without the restraint of the net neutrality rules,” the trade group Incompas wrote in a filing this week.
https://thehill.com/policy/technolog...eutrality-case

















Until next week,

- js.



















Current Week In Review





Recent WiRs -

January 12th, January 5th, December 29th, December 22nd

Jack Spratts' Week In Review is published every Friday. Submit letters, articles, press releases, comments, questions etc. in plain text English to jackspratts (at) lycos (dot) com. Submission deadlines are Thursdays @ 1400 UTC. Please include contact info. The right to publish all remarks is reserved.


"The First Amendment rests on the assumption that the widest possible dissemination of information from diverse and antagonistic sources is essential to the welfare of the public."
- Hugo Black
__________________
Thanks For Sharing