Sandra Finley

Nov 18, 2018
 

Bruce Schneier is an internationally renowned security technologist. He teaches at the Harvard Kennedy School, and serves as special advisor to IBM Security. His new book is called Click Here to Kill Everybody: Security and Survival in a Hyper-Connected World.

 

Years ago I contacted Bruce Schneier because of Lockheed Martin’s role at Statistics Canada (the census bureau). The name “Schneier” appears whenever there are questions about the security of databases.

A skim of his November Newsletter (free subscription) reflects the Privacy issue:

2018-11-15    Nov issue of Crypto-Gram, by Bruce Schneier. A free monthly newsletter providing summaries, analyses, insights, and commentaries on security: computer and otherwise.

Schneier is a quoted authority in these postings:

2015-03-17   Bill C-51, Elephant in the Room, the U.S.A. (includes info re “Five Eyes”, FVEY)

2014-04-09   CRA halts e-filing amid fears of global data breach

E-voting in Canada: Online Voting and Hostile Deployment Environments by Christopher Parsons

2011-09-03   Election fraud in the U.S., “Murder, Spies & Voting Lies”. E-voting in Canada.

– – – – – – – – – – –

With thanks to “Wired”:

Surveillance Kills Freedom By Killing Experimentation, by Bruce Schneier

Excerpted from the upcoming issue of McSweeney’s, “The End of Trust,” a collection featuring more than 30 writers investigating surveillance, technology, and privacy.

In my book Data and Goliath, I write about the value of privacy. I talk about how it is essential for political liberty and justice, and for commercial fairness and equality. I talk about how it increases personal freedom and individual autonomy, and how the lack of it makes us all less secure. But this is probably the most important argument as to why society as a whole must protect privacy: it allows society to progress.

We know that surveillance has a chilling effect on freedom. People change their behavior when they live their lives under surveillance. They are less likely to speak freely and act individually. They self-censor. They become conformist. This is obviously true for government surveillance, but is true for corporate surveillance as well. We simply aren’t as willing to be our individual selves when others are watching.

 

Let’s take an example: hearing that parents and children are being separated as they cross the U.S. border, you want to learn more. You visit the website of an international immigrants’ rights group, a fact that is available to the government through mass internet surveillance. You sign up for the group’s mailing list, another fact that is potentially available to the government. The group then calls or emails to invite you to a local meeting. Same. Your license plates can be collected as you drive to the meeting; your face can be scanned and identified as you walk into and out of the meeting. If instead of visiting the website you visit the group’s Facebook page, Facebook knows that you did and that feeds into its profile of you, available to advertisers and political activists alike. Ditto if you like their page, share a link with your friends, or just post about the issue.

Maybe you are an immigrant yourself, documented or not. Or maybe some of your family is. Or maybe you have friends or coworkers who are. How likely are you to get involved if you know that your interest and concern can be gathered and used by government and corporate actors? What if the issue you are interested in is pro- or anti-gun control, anti-police violence or in support of the police? Does that make a difference?

Maybe the issue doesn’t matter, and you would never be afraid to be identified and tracked based on your political or social interests. But even if you are so fearless, you probably know someone who has more to lose, and thus more to fear, from their personal, sexual, or political beliefs being exposed.

This isn’t just hypothetical. In the months and years after the 9/11 terrorist attacks, many of us censored what we spoke about on social media or what we searched on the internet. We know from a 2013 PEN study that writers in the United States self-censored their browsing habits out of fear the government was watching. And this isn’t exclusively an American event; internet self-censorship is prevalent across the globe, China being a prime example.

Ultimately, this fear stagnates society in two ways. The first is that the presence of surveillance means society cannot experiment with new things without fear of reprisal, and that means those experiments—if found to be inoffensive or even essential to society—cannot slowly become commonplace, moral, and then legal. If surveillance nips that process in the bud, change never happens. All social progress—from ending slavery to fighting for women’s rights—began as ideas that were, quite literally, dangerous to assert. Yet without the ability to safely develop, discuss, and eventually act on those assertions, our society would not have been able to further its democratic values in the way that it has.

Consider the decades-long fight for gay rights around the world. Within our lifetimes we have made enormous strides to combat homophobia and increase acceptance of queer folks’ right to marry. Queer relationships slowly progressed from being viewed as immoral and illegal, to being viewed as somewhat moral and tolerated, to finally being accepted as moral and legal.

In the end it was the public nature of those activities that eventually slayed the bigoted beast, but the ability to act in private was essential in the beginning for the early experimentation, community building, and organizing.

Marijuana legalization is going through the same process: it’s currently sitting between somewhat moral, and—depending on the state or country in question—tolerated and legal. But, again, for this to have happened, someone decades ago had to try pot and realize that it wasn’t really harmful, either to themselves or to those around them. Then it had to become a counterculture, and finally a social and political movement. If pervasive surveillance meant that those early pot smokers would have been arrested for doing something illegal, the movement would have been squashed before inception. Of course the story is more complicated than that, but the ability for members of society to privately smoke weed was essential for putting it on the path to legalization.

We don’t yet know which subversive ideas and illegal acts of today will become political causes and positive social change tomorrow, but they’re around. And they require privacy to germinate. Take away that privacy, and we’ll have a much harder time breaking down our inherited moral assumptions.

The second way surveillance hurts our democratic values is that it encourages society to make more things illegal. Consider the things you do—the different things each of us does—that portions of society find immoral. Not just recreational drugs and gay sex, but gambling, dancing, public displays of affection. All of us do things that are deemed immoral by some groups, but are not illegal because they don’t harm anyone. But it’s important that these things can be done out of the disapproving gaze of those who would otherwise rally against such practices.

If there is no privacy, there will be pressure to change. Some people will recognize that their morality isn’t necessarily the morality of everyone—and that that’s okay. But others will start demanding legislative change, or using less legal and more violent means, to force others to match their idea of morality.

It’s easy to imagine the more conservative (in the small-c sense, not in the sense of the named political party) among us getting enough power to make illegal what they would otherwise be forced to witness. In this way, privacy helps protect the rights of the minority from the tyranny of the majority.

This is how we got Prohibition in the 1920s, and if we had had today’s surveillance capabilities in the 1920s it would have been far more effectively enforced. Recipes for making your own spirits would have been much harder to distribute. Speakeasies would have been impossible to keep secret. The criminal trade in illegal alcohol would also have been more effectively suppressed. There would have been less discussion about the harms of Prohibition, less “what if we didn’t…” thinking. Political organizing might have been difficult. In that world, the law might have stuck to this day.

China serves as a cautionary tale. The country has long been a world leader in the ubiquitous surveillance of its citizens, with the goal not of crime prevention but of social control. They are about to further enhance their system, giving every citizen a “social credit” rating. The details are yet unclear, but the general concept is that people will be rated based on their activities, both online and off. Their political comments, their friends and associates, and everything else will be assessed and scored. Those who are conforming, obedient, and apolitical will be given high scores. People without those scores will be denied privileges like access to certain schools and foreign travel. If the program is half as far-reaching as early reports indicate, the subsequent pressure to conform will be enormous. This social surveillance system is precisely the sort of surveillance designed to maintain the status quo.

For social norms to change, people need to deviate from these inherited norms. People need the space to try alternate ways of living without risking arrest or social ostracization. People need to be able to read critiques of those norms without anyone’s knowledge, discuss them without their opinions being recorded, and write about their experiences without their names attached to their words. People need to be able to do things that others find distasteful, or even immoral. The minority needs protection from the tyranny of the majority.

Privacy makes all of this possible. Privacy encourages social progress by giving the few room to experiment free from the watchful eye of the many. Even if you are not personally chilled by ubiquitous surveillance, the society you live in is, and the personal costs are unequivocal.

From The End of Trust (McSweeney’s 54), out November 20th, a collection featuring over thirty writers investigating surveillance, technology, and privacy, with special advisors The Electronic Frontier Foundation. Wired readers can take 10% off the issue, or a full subscription, with the code WIRED.

Nov 18, 2018
 

From: Bruce Schneier
Sent: November 15, 2018

Crypto-Gram
November 15, 2018

by Bruce Schneier
CTO, IBM Resilient
schneier   AT   schneier.com
https://www.schneier.com

A free monthly newsletter providing summaries, analyses, insights, and commentaries on security: computer and otherwise.

For back issues, or to subscribe, visit Crypto-Gram’s web page.

Read this issue on the web

These same essays and news items appear in the Schneier on Security blog, along with a lively and intelligent comment section. An RSS feed is available.

** *** ***** ******* *********** *************

In this issue:

  1. How DNA Databases Violate Everyone’s Privacy
  2. Privacy for Tigers
  3. Government Perspective on Supply Chain Security
  4. West Virginia Using Internet Voting
  5. Are the Police Using Smart-Home IoT Devices to Spy on People?
  6. On Disguise
  7. China’s Hacking of the Border Gateway Protocol
  8. Android Ad-Fraud Scheme
  9. Detecting Fake Videos
  10. Security Vulnerability in Internet-Connected Construction Cranes
  11. More on the Supermicro Spying Story
  12. Cell Phone Security and Heads of State
  13. ID Systems Throughout the 50 States
  14. Was the Triton Malware Attack Russian in Origin?
  15. Buying Used Voting Machines on eBay
  16. How to Punish Cybercriminals
  17. Troy Hunt on Passwords
  18. Security of Solid-State-Drive Encryption
  19. Consumer Reports Reviews Wireless Home-Security Cameras
  20. iOS 12.1 Vulnerability
  21. Privacy and Security of Data at Universities
  22. The Pentagon Is Publishing Foreign Nation-State Malware
  23. Hiding Secret Messages in Fingerprints
  24. New IoT Security Regulations
  25. Oracle and “Responsible Disclosure”
  26. More Spectre/Meltdown-Like Attacks
  27. Upcoming Speaking Engagements

** *** ***** ******* *********** *************

How DNA Databases Violate Everyone’s Privacy

[2018.10.15] If you’re an American of European descent, there’s a 60% chance you can be uniquely identified by public information in DNA databases. This is not information that you have made public; this is information your relatives have made public.

Research paper:

“Identity inference of genomic data using long-range familial searches.”

Abstract: Consumer genomics databases have reached the scale of millions of individuals. Recently, law enforcement authorities have exploited some of these databases to identify suspects via distant familial relatives. Using genomic data of 1.28 million individuals tested with consumer genomics, we investigated the power of this technique. We project that about 60% of the searches for individuals of European-descent will result in a third cousin or closer match, which can allow their identification using demographic identifiers. Moreover, the technique could implicate nearly any US-individual of European-descent in the near future. We demonstrate that the technique can also identify research participants of a public sequencing project. Based on these results, we propose a potential mitigation strategy and policy implications to human subject research.

A good news article.

** *** ***** ******* *********** *************

Privacy for Tigers

[2018.10.16] Ross Anderson has some new work:

As mobile phone masts went up across the world’s jungles, savannas and mountains, so did poaching. Wildlife crime syndicates can not only coordinate better but can mine growing public data sets, often of geotagged images. Privacy matters for tigers, for snow leopards, for elephants and rhinos — and even for tortoises and sharks. Animal data protection laws, where they exist at all, are oblivious to these new threats, and no-one seems to have started to think seriously about information security.

Video here.

** *** ***** ******* *********** *************

Government Perspective on Supply Chain Security

[2018.10.18] This is an interesting interview with a former NSA employee about supply chain security. I consider this to be an insurmountable problem right now.

** *** ***** ******* *********** *************

West Virginia Using Internet Voting

[2018.10.19] This is crazy (and dangerous). West Virginia is allowing people to vote via a smart-phone app. Even crazier, the app uses blockchain — presumably because they have no idea what the security issues with voting actually are.

** *** ***** ******* *********** *************

Are the Police Using Smart-Home IoT Devices to Spy on People?

[2018.10.22] IoT devices are surveillance devices, and manufacturers generally use them to collect data on their customers. Surveillance is still the business model of the Internet, and this data is used against the customers’ interests: either by the device manufacturer or by some third party the manufacturer sells the data to. Of course, this data can be used by the police as well; the purpose depends on the country.

None of this is new, and much of it was discussed in my book Data and Goliath. What is common is for Internet companies to publish “transparency reports” that give at least general information about how police are using that data. IoT companies don’t publish those reports.

TechCrunch asked a bunch of companies about this, and basically found that no one is talking.

Boing Boing post.

** *** ***** ******* *********** *************

On Disguise

[2018.10.23] The former CIA Chief of Disguise has a fascinating video about her work.

** *** ***** ******* *********** *************

China’s Hacking of the Border Gateway Protocol

[2018.10.24] This is a long — and somewhat technical — paper by Chris C. Demchak and Yuval Shavitt about China’s repeated hacking of the Internet Border Gateway Protocol (BGP): “China’s Maxim — Leave No Access Point Unexploited: The Hidden Story of China Telecom’s BGP Hijacking.”

BGP hacking is how large intelligence agencies manipulate Internet routing to make certain traffic easier to intercept. The NSA calls it “network shaping” or “traffic shaping.” Here’s a document from the Snowden archives outlining how the technique works with Yemen.

EDITED TO ADD (10/27): Boing Boing post.

** *** ***** ******* *********** *************

Android Ad-Fraud Scheme

[2018.10.25] BuzzFeed is reporting on a scheme where fraudsters buy legitimate Android apps, track users’ behavior in order to mimic it in a way that evades bot detectors, and then use bots to perpetuate an ad-fraud scheme.

After being provided with a list of the apps and websites connected to the scheme, Google investigated and found that dozens of the apps used its mobile advertising network. Its independent analysis confirmed the presence of a botnet driving traffic to websites and apps in the scheme. Google has removed more than 30 apps from the Play store, and terminated multiple publisher accounts with its ad networks. Google said that prior to being contacted by BuzzFeed News it had previously removed 10 apps in the scheme and blocked many of the websites. It continues to investigate, and published a blog post to detail its findings.

The company estimates this operation stole close to $10 million from advertisers who used Google’s ad network to place ads in the affected websites and apps. It said the vast majority of ads being placed in these apps and websites came via other major ad networks.

Lots of details in both the BuzzFeed and the Google links.

The Internet advertising industry is rife with fraud, at all levels. This is just one scheme among many.

** *** ***** ******* *********** *************

Detecting Fake Videos

[2018.10.26] This story nicely illustrates the arms race between technologies to create fake videos and technologies to detect fake videos:

These fakes, while convincing if you watch a few seconds on a phone screen, aren’t perfect (yet). They contain tells, like creepily ever-open eyes, from flaws in their creation process. In looking into DeepFake’s guts, Lyu realized that the images that the program learned from didn’t include many with closed eyes (after all, you wouldn’t keep a selfie where you were blinking, would you?). “This becomes a bias,” he says. The neural network doesn’t get blinking. Programs also might miss other “physiological signals intrinsic to human beings,” says Lyu’s paper on the phenomenon, such as breathing at a normal rate, or having a pulse. (Autonomic signs of constant existential distress are not listed.) While this research focused specifically on videos created with this particular software, it is a truth universally acknowledged that even a large set of snapshots might not adequately capture the physical human experience, and so any software trained on those images may be found lacking.

Lyu’s blinking revelation revealed a lot of fakes. But a few weeks after his team put a draft of their paper online, they got anonymous emails with links to deeply faked YouTube videos whose stars opened and closed their eyes more normally. The fake content creators had evolved.
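
To make the “blinking” cue concrete, here is a minimal sketch of the eye-aspect-ratio heuristic commonly used for blink detection (my own simplification, not Lyu’s pipeline). It assumes per-frame eye landmarks have already been extracted by some face-landmark tool, and simply flags long clips in which the eyes never close.

```python
# Rough sketch of a blink-based "liveness" cue in the spirit of the work
# described above: if the eye aspect ratio (EAR) never drops, the subject
# never blinks, which is suspicious for a synthesized face.
# Landmark extraction is assumed to have happened already; this is not
# the researchers' actual pipeline.
import numpy as np

def eye_aspect_ratio(eye: np.ndarray) -> float:
    """eye: 6 (x, y) landmarks ordered around the eye contour."""
    a = np.linalg.norm(eye[1] - eye[5])   # vertical distance 1
    b = np.linalg.norm(eye[2] - eye[4])   # vertical distance 2
    c = np.linalg.norm(eye[0] - eye[3])   # horizontal distance
    return (a + b) / (2.0 * c)

def count_blinks(per_frame_eyes, closed_thresh=0.2, min_frames=2) -> int:
    """per_frame_eyes: sequence of 6x2 arrays, one per video frame."""
    blinks, run = 0, 0
    for eye in per_frame_eyes:
        if eye_aspect_ratio(np.asarray(eye)) < closed_thresh:
            run += 1
        else:
            if run >= min_frames:      # eye stayed closed long enough to count
                blinks += 1
            run = 0
    return blinks

def looks_synthetic(per_frame_eyes, fps=30.0) -> bool:
    # People blink roughly every 2-10 seconds; zero blinks over a long
    # clip is a weak (and, as noted above, easily defeated) fake signal.
    duration = len(per_frame_eyes) / fps
    return duration > 10.0 and count_blinks(per_frame_eyes) == 0
```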

I don’t know who will win this arms race, if there ever will be a winner. But the problem with fake videos goes deeper: they affect people even if they are later told that they are fake, and there always will be people that will believe they are real, despite any evidence to the contrary.

** *** ***** ******* *********** *************

Security Vulnerability in Internet-Connected Construction Cranes

[2018.10.29] This seems bad:

The F25 software was found to contain a capture replay vulnerability — basically an attacker would be able to eavesdrop on radio transmissions between the crane and the controller, and then send their own spoofed commands over the air to seize control of the crane.

“These devices use fixed codes that are reproducible by sniffing and re-transmission,” US-CERT explained.

“This can lead to unauthorized replay of a command, spoofing of an arbitrary message, or keeping the controlled load in a permanent ‘stop’ state.”

Here’s the CERT advisory.
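
To make the advisory’s point concrete, here is a toy sketch (hypothetical framing, not the actual F25 radio protocol) contrasting a fixed-code receiver, which accepts any sniffed-and-replayed frame, with a rolling-code receiver that authenticates a monotonically increasing counter and so rejects replays.

```python
# Toy illustration of why fixed radio codes are replayable, and how a
# rolling code with an authenticated counter defeats replay. Hypothetical
# names and framing only -- this is not the actual F25 protocol.
import hmac, hashlib

SECRET = b"shared-device-key"   # provisioned into both remote and receiver

def fixed_code_receiver(frame: bytes) -> bool:
    # Accepts any frame matching a known command, so a sniffed frame can
    # simply be re-transmitted and will be accepted again.
    return frame in {b"STOP", b"LIFT", b"LOWER"}

def make_rolling_frame(command: bytes, counter: int) -> bytes:
    msg = command + b"|" + str(counter).encode()
    tag = hmac.new(SECRET, msg, hashlib.sha256).hexdigest().encode()
    return msg + b"|" + tag

def rolling_code_receiver(frame: bytes, last_counter: int):
    msg, _, tag = frame.rpartition(b"|")
    expected = hmac.new(SECRET, msg, hashlib.sha256).hexdigest().encode()
    if not hmac.compare_digest(expected, tag):
        return False, last_counter          # forged or corrupted frame
    command, _, counter = msg.partition(b"|")
    if int(counter) <= last_counter:
        return False, last_counter          # counter not fresh: replay, reject
    return True, int(counter)

captured = b"STOP"
print(fixed_code_receiver(captured))        # True -- replayed frame accepted

frame = make_rolling_frame(b"STOP", 42)
ok, state = rolling_code_receiver(frame, last_counter=41)       # accepted
ok_again, _ = rolling_code_receiver(frame, last_counter=state)  # rejected
print(ok, ok_again)                         # True False
```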

** *** ***** ******* *********** *************

More on the Supermicro Spying Story

[2018.10.29] I’ve blogged twice about the Bloomberg story that China bugged Supermicro networking equipment destined for the US. We still don’t know if the story is true, although I am increasingly skeptical because no corroborating evidence has emerged.

We don’t know anything more, but this is the most comprehensive rebuttal of the story I have read.

** *** ***** ******* *********** *************

Cell Phone Security and Heads of State

[2018.10.30] Earlier this week, the New York Times reported that the Russians and the Chinese were eavesdropping on President Donald Trump’s personal cell phone and using the information gleaned to better influence his behavior. This should surprise no one. Security experts have been talking about the potential security vulnerabilities in Trump’s cell phone use since he became president. And President Barack Obama bristled at — but acquiesced to — the security rules prohibiting him from using a “regular” cell phone throughout his presidency.

Three broader questions obviously emerge from the story. Who else is listening in on Trump’s cell phone calls? What about the cell phones of other world leaders and senior government officials? And — most personal of all — what about my cell phone calls?

There are two basic places to eavesdrop on pretty much any communications system: at the end points and during transmission. This means that a cell phone attacker can either compromise one of the two phones or eavesdrop on the cellular network. Both approaches have their benefits and drawbacks. The NSA seems to prefer bulk eavesdropping on the planet’s major communications links and then picking out individuals of interest. In 2016, WikiLeaks published a series of classified documents listing “target selectors”: phone numbers the NSA searches for and records. These included senior government officials of Germany — among them Chancellor Angela Merkel — France, Japan, and other countries.

Other countries don’t have the same worldwide reach that the NSA has, and must use other methods to intercept cell phone calls. We don’t know details of which countries do what, but we know a lot about the vulnerabilities. Insecurities in the phone network itself are so easily exploited that 60 Minutes eavesdropped on a US congressman’s phone live on camera in 2016. Back in 2005, unknown attackers targeted the cell phones of many Greek politicians by hacking the country’s phone network and turning on an already-installed eavesdropping capability. The NSA even implanted eavesdropping capabilities in networking equipment destined for the Syrian Telephone Company.

Alternatively, an attacker could intercept the radio signals between a cell phone and a tower. Encryption ranges from very weak to possibly strong, depending on which flavor the system uses. Don’t think the attacker has to put his eavesdropping antenna on the White House lawn; the Russian Embassy is close enough.

The other way to eavesdrop on a cell phone is by hacking the phone itself. This is the technique favored by countries with less sophisticated intelligence capabilities. In 2017, the public-interest forensics group Citizen Lab uncovered an extensive eavesdropping campaign against Mexican lawyers, journalists, and opposition politicians — presumably run by the government. Just last month, the same group found eavesdropping capabilities in products from the Israeli cyberweapons manufacturer NSO Group operating in Algeria, Bangladesh, Greece, India, Kazakhstan, Latvia, South Africa — 45 countries in all.

These attacks generally involve downloading malware onto a smartphone that then records calls, text messages, and other user activities, and forwards them to some central controller. Here, it matters which phone is being targeted. iPhones are harder to hack, which is reflected in the prices companies pay for new exploit capabilities. In 2016, the vulnerability broker Zerodium offered $1.5 million for an unknown iOS exploit and only $200K for a similar Android exploit. Earlier this year, a new Dubai start-up announced even higher prices. These vulnerabilities are resold to governments and cyberweapons manufacturers.

Some of the price difference is due to the ways the two operating systems are designed and used. Apple has much more control over the software on an iPhone than Google does on an Android phone. Also, Android phones are generally designed, built, and sold by third parties, which means they are much less likely to get timely security updates. This is changing. Google now has its own phone — Pixel — that gets security updates quickly and regularly, and Google is now trying to pressure Android-phone manufacturers to update their phones more regularly. (President Trump reportedly uses an iPhone.)

Another way to hack a cell phone is to install a backdoor during the design process. This is a real fear; earlier this year, US intelligence officials warned that phones made by the Chinese companies ZTE and Huawei might be compromised by that government, and the Pentagon ordered stores on military bases to stop selling them. This is why China’s recommendation that if Trump wanted security, he should use a Huawei phone, was an amusing bit of trolling.

Given the wealth of insecurities and the array of eavesdropping techniques, it’s safe to say that lots of countries are spying on the phones of both foreign officials and their own citizens. Many of these techniques are within the capabilities of criminal groups, terrorist organizations, and hackers. If I were guessing, I’d say that the major international powers like China and Russia are using the more passive interception techniques to spy on Trump, and that the smaller countries are too scared of getting caught to try to plant malware on his phone.

It’s safe to say that President Trump is not the only one being targeted; so are members of Congress, judges, and other senior officials — especially because no one is trying to tell any of them to stop using their cell phones (although cell phones still are not allowed on either the House or the Senate floor).

As for the rest of us, it depends on how interesting we are. It’s easy to imagine a criminal group eavesdropping on a CEO’s phone to gain an advantage in the stock market, or a country doing the same thing for an advantage in a trade negotiation. We’ve seen governments use these tools against dissidents, reporters, and other political enemies. The Chinese and Russian governments are already targeting the US power grid; it makes sense for them to target the phones of those in charge of that grid.

Unfortunately, there’s not much you can do to improve the security of your cell phone. Unlike computer networks, for which you can buy antivirus software, network firewalls, and the like, your phone is largely controlled by others. You’re at the mercy of the company that makes your phone, the company that provides your cellular service, and the communications protocols developed when none of this was a problem. If one of those companies doesn’t want to bother with security, you’re vulnerable.

This is why the current debate about phone privacy, with the FBI on one side wanting the ability to eavesdrop on communications and unlock devices, and users on the other side wanting secure devices, is so important. Yes, there are security benefits to the FBI being able to use this information to help solve crimes, but there are far greater benefits to the phones and networks being so secure that all the potential eavesdroppers — including the FBI — can’t access them. We can give law enforcement other forensics tools, but we must keep foreign governments, criminal groups, terrorists, and everyone else out of everyone’s phones. The president may be taking heat for his love of his insecure phone, but each of us is using just as insecure a phone. And for a surprising number of us, making those phones more private is a matter of national security.

This essay previously appeared in the Atlantic.

EDITED TO ADD: Steven Bellovin and Susan Landau have a good essay on the same topic, as does Wired. Slashdot post.

** *** ***** ******* *********** *************

ID Systems Throughout the 50 States

[2018.10.31] Jim Harper at CATO has a good survey of state ID systems in the US.

** *** ***** ******* *********** *************

Was the Triton Malware Attack Russian in Origin?

[2018.10.31] The conventional story is that Iran targeted Saudi Arabia with Triton in 2017. New research from FireEye indicates that it might have been Russia.

I don’t know. FireEye likes to attribute all sorts of things to Russia, but the evidence here looks pretty good.

** *** ***** ******* *********** *************

Buying Used Voting Machines on eBay

[2018.11.01] This is not surprising:

This year, I bought two more machines to see if security had improved. To my dismay, I discovered that the newer model machines — those that were used in the 2016 election — are running Windows CE and have USB ports, along with other components, that make them even easier to exploit than the older ones. Our voting machines, billed as “next generation,” and still in use today, are worse than they were before — dispersed, disorganized, and susceptible to manipulation.

Cory Doctorow’s comment is correct:

Voting machines are terrible in every way: the companies that make them lie like crazy about their security, insist on insecure designs, and produce machines that are so insecure that it’s easier to hack a voting machine than it is to use it to vote.

I blame both the secrecy of the industry and the ignorance of most voting officials. And it’s not getting better.

** *** ***** ******* *********** *************

How to Punish Cybercriminals

[2018.11.02] Interesting policy paper by Third Way: “To Catch a Hacker: Toward a comprehensive strategy to identify, pursue, and punish malicious cyber actors”:

In this paper, we argue that the United States currently lacks a comprehensive overarching strategic approach to identify, stop and punish cyberattackers. We show that:

  • There is a burgeoning cybercrime wave: A rising and often unseen crime wave is mushrooming in America. There are approximately 300,000 reported malicious cyber incidents per year, including up to 194,000 that could credibly be called individual or system-wide breaches or attempted breaches. This is likely a vast undercount since many victims don’t report break-ins to begin with. Attacks cost the US economy anywhere from $57 billion to $109 billion annually and these costs are increasing.
  • There is a stunning cyber enforcement gap: Our analysis of publicly available data shows that cybercriminals can operate with near impunity compared to their real-world counterparts. We estimate that cyber enforcement efforts are so scattered that less than 1% of malicious cyber incidents see an enforcement action taken against the attackers.
  • There is no comprehensive US cyber enforcement strategy aimed at the human attacker: Despite the recent release of a National Cyber Strategy, the United States still lacks a comprehensive strategic approach to how it identifies, pursues, and punishes malicious human cyberattackers and the organizations and countries often behind them. We believe that the United States is as far from this human attacker strategy as the nation was toward a strategic approach to countering terrorism in the weeks and months before 9/11.

In order to close the cyber enforcement gap, we argue for a comprehensive enforcement strategy that makes a fundamental rebalance in US cybersecurity policies: from a heavy focus on building better cyber defenses against intrusion to also waging a more robust effort at going after human attackers. We call for ten US policy actions that could form the contours of a comprehensive enforcement strategy to better identify, pursue and bring to justice malicious cyber actors that include building up law enforcement, enhancing diplomatic efforts, and developing a measurable strategic plan to do so.

** *** ***** ******* *********** *************

Troy Hunt on Passwords

[2018.11.05] Troy Hunt has a good essay about why passwords are here to stay, despite all their security problems:

This is why passwords aren’t going anywhere in the foreseeable future and why [insert thing here] isn’t going to kill them. No amount of focusing on how bad passwords are or how many accounts have been breached or what it costs when people can’t access their accounts is going to change that. Nor will the technical prowess of [insert thing here] change the discussion because it simply can’t compete with passwords on that one metric organisations are so focused on: usability. Sure, there’ll be edge cases and certainly there remain scenarios where higher-friction can be justified due to either the nature of the asset being protected or the demographic of the audience, but you’re not about to see your everyday e-commerce, social media or even banking sites changing en masse.

He rightly points out that biometric authentication systems — like Apple’s Face ID and fingerprint authentication — augment passwords rather than replace them. And I want to add that good two-factor systems, like Duo, also augment passwords rather than replace them.

Hacker News thread.

** *** ***** ******* *********** *************

Security of Solid-State-Drive Encryption

[2018.11.06] Interesting research: “Self-encrypting deception: weaknesses in the encryption of solid state drives (SSDs)”:

Abstract: We have analyzed the hardware full-disk encryption of several SSDs by reverse engineering their firmware. In theory, the security guarantees offered by hardware encryption are similar to or better than software implementations. In reality, we found that many hardware implementations have critical security weaknesses, for many models allowing for complete recovery of the data without knowledge of any secret. BitLocker, the encryption software built into Microsoft Windows, will rely exclusively on hardware full-disk encryption if the SSD advertises support for it. Thus, for these drives, data protected by BitLocker is also compromised. This challenges the view that hardware encryption is preferable over software encryption. We conclude that one should not rely solely on hardware encryption offered by SSDs.

EDITED TO ADD: The NSA is known to attack firmware of SSDs.

EDITED TO ADD (11/13): CERT advisory. And older research.

** *** ***** ******* *********** *************

Consumer Reports Reviews Wireless Home-Security Cameras

[2018.11.07] Consumer Reports is starting to evaluate the security of IoT devices. As part of that, it’s reviewing wireless home-security cameras.

It found significant security vulnerabilities in D-Link cameras:

In contrast, D-Link doesn’t store video from the DCS-2630L in the cloud. Instead, the camera has its own, onboard web server, which can deliver video to the user in different ways.

Users can view the video using an app, mydlink Lite. The video is encrypted, and it travels from the camera through D-Link’s corporate servers, and ultimately to the user’s phone. Users can also access the same encrypted video feed through a company web page, mydlink.com. Those are both secure methods of accessing the video.

But the D-Link camera also lets you bypass the D-Link corporate servers and access the video directly through a web browser on a laptop or other device. If you do this, the web server on the camera doesn’t encrypt the video.

If you set up this kind of remote access, the camera and unencrypted video are open to the web. They could be discovered by anyone who finds or guesses the camera’s IP address — and if you haven’t set a strong password, a hacker might find it easy to gain access.

The real news is that Consumer Reports is able to put pressure on device manufacturers:

In response to a Consumer Reports query, D-Link said that security would be tightened through updates this fall. Consumer Reports will evaluate those updates once they are available.

This is the sort of sustained pressure we need on IoT device manufacturers.

Boing Boing link.

EDITED TO ADD (11/13): In related news, the US Federal Trade Commission is suing D-Link because their routers are so insecure. The lawsuit was filed in January 2017.

** *** ***** ******* *********** *************

iOS 12.1 Vulnerability

[2018.11.08] This is really just to point out that computer security is really hard:

Almost as soon as Apple released iOS 12.1 on Tuesday, a Spanish security researcher discovered a bug that exploits group FaceTime calls to give anyone access to an iPhone user’s contact information with no need for a passcode.

[…]

A bad actor would need physical access to the phone that they are targeting and has a few options for viewing the victim’s contact information. They would need to either call the phone from another iPhone or have the phone call itself. Once the call connects they would need to:

  • Select the Facetime icon
  • Select “Add Person”
  • Select the plus icon
  • Scroll through the contacts and use 3D touch on a name to view all contact information that’s stored.

Making the phone call itself without entering a passcode can be accomplished either by telling Siri the phone number or, if the attacker doesn’t know the number, by saying “call my phone.” We tested this with both the owner’s voice and a stranger’s voice; in both cases, Siri initiated the call.

** *** ***** ******* *********** *************

Privacy and Security of Data at Universities

[2018.11.09] Interesting paper: “Open Data, Grey Data, and Stewardship: Universities at the Privacy Frontier,” by Christine Borgman:

Abstract: As universities recognize the inherent value in the data they collect and hold, they encounter unforeseen challenges in stewarding those data in ways that balance accountability, transparency, and protection of privacy, academic freedom, and intellectual property. Two parallel developments in academic data collection are converging: (1) open access requirements, whereby researchers must provide access to their data as a condition of obtaining grant funding or publishing results in journals; and (2) the vast accumulation of “grey data” about individuals in their daily activities of research, teaching, learning, services, and administration. The boundaries between research and grey data are blurring, making it more difficult to assess the risks and responsibilities associated with any data collection. Many sets of data, both research and grey, fall outside privacy regulations such as HIPAA, FERPA, and PII. Universities are exploiting these data for research, learning analytics, faculty evaluation, strategic decisions, and other sensitive matters. Commercial entities are besieging universities with requests for access to data or for partnerships to mine them. The privacy frontier facing research universities spans open access practices, uses and misuses of data, public records requests, cyber risk, and curating data for privacy protection. This Article explores the competing values inherent in data stewardship and makes recommendations for practice by drawing on the pioneering work of the University of California in privacy and information security, data governance, and cyber risk.

** *** ***** ******* *********** *************

The Pentagon Is Publishing Foreign Nation-State Malware

[2018.11.09] This is a new thing:

The Pentagon has suddenly started uploading malware samples from APTs and other nation-state sources to the website VirusTotal, which is essentially a malware zoo that’s used by security pros and antivirus/malware detection engines to gain a better understanding of the threat landscape.

This feels like an example of the US’s new strategy of actively harassing foreign government actors. By making their malware public, the US is forcing them to continually find and use new vulnerabilities.

EDITED TO ADD (11/13): This is another good article. And here is some background on the malware.

** *** ***** ******* *********** *************

Hiding Secret Messages in Fingerprints

[2018.11.12] This is a fun steganographic application: hiding a message in a fingerprint image.

Can’t see any real use for it, but that’s okay.
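
For readers unfamiliar with steganography, here is a toy least-significant-bit example (generic LSB hiding in a grayscale image, not the fingerprint-specific technique in the linked work):

```python
# Toy least-significant-bit (LSB) steganography: hide a short message in
# the low bit of each pixel of a grayscale image. This is the generic
# textbook technique, not the fingerprint-specific method linked above.
import numpy as np

def embed(image: np.ndarray, message: bytes) -> np.ndarray:
    bits = np.unpackbits(np.frombuffer(message, dtype=np.uint8))
    flat = image.flatten()                   # flatten() returns a copy
    if len(bits) > flat.size:
        raise ValueError("message too long for this image")
    flat[: len(bits)] = (flat[: len(bits)] & 0xFE) | bits   # overwrite low bit
    return flat.reshape(image.shape)

def extract(image: np.ndarray, n_bytes: int) -> bytes:
    bits = image.flatten()[: n_bytes * 8] & 1
    return np.packbits(bits).tobytes()

# Round-trip on a random 8-bit "image"
img = np.random.randint(0, 256, size=(64, 64), dtype=np.uint8)
secret = b"meet at noon"
stego = embed(img, secret)
assert extract(stego, len(secret)) == secret
```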

** *** ***** ******* *********** *************

New IoT Security Regulations

[2018.11.13] Due to ever-evolving technological advances, manufacturers are connecting consumer goods — from toys to light bulbs to major appliances — to the Internet at breakneck speeds. This is the Internet of Things, and it’s a security nightmare.

The Internet of Things fuses products with communications technology to make daily life more effortless. Think Amazon’s Alexa, which not only answers questions and plays music but allows you to control your home’s lights and thermostat. Or the current generation of implanted pacemakers, which can both receive commands and send information to doctors over the Internet.

But like nearly all innovation, there are risks involved. And for products born out of the Internet of Things, this means the risk of having personal information stolen or devices being overtaken and controlled remotely. For devices that affect the world in a direct physical manner — cars, pacemakers, thermostats — the risks include loss of life and property.

By developing more advanced security features and building them into these products, hacks can be avoided. The problem is that there is no monetary incentive for companies to invest in the cybersecurity measures needed to keep their products secure. Consumers will buy products without proper security features, unaware that their information is vulnerable. And current liability laws make it hard to hold companies accountable for shoddy software security.

It falls upon lawmakers to create laws that protect consumers. While the US government is largely absent in this area of consumer protection, the state of California has recently stepped in and started regulating the Internet of Things, or “IoT” devices sold in the state — and the effects will soon be felt worldwide.

California’s new SB 327 law, which will take effect in January 2020, requires all “connected devices” to have a “reasonable security feature.” The good news is that the term “connected devices” is broadly defined to include just about everything connected to the Internet. The not-so-good news is that “reasonable security” remains defined such that companies trying to avoid compliance can argue that the law is unenforceable.

The legislation requires that security features must be able to protect the device and the information on it from a variety of threats and be appropriate to both the nature of the device and the information it collects. California’s attorney general will interpret the law and define the specifics, which will surely be the subject of much lobbying by tech companies.

There’s just one specific in the law that’s not subject to the attorney general’s interpretation: default passwords are not allowed. This is a good thing; they are a terrible security practice. But it’s just one of dozens of awful “security” measures commonly found in IoT devices.
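
As a minimal illustration of the alternative to shared factory defaults (my own sketch; SB 327 does not prescribe any particular mechanism), a manufacturer can generate a unique random credential per device at provisioning time:

```python
# Sketch of per-device credential provisioning -- the usual alternative to
# a shared factory default password. Illustrative only; the California law
# does not mandate any particular mechanism.
import secrets, string

ALPHABET = string.ascii_letters + string.digits

def provision_device(serial_number: str, length: int = 16) -> dict:
    # A fresh random password per unit; never derived from the serial
    # number or MAC address, which an attacker can read off the device.
    password = "".join(secrets.choice(ALPHABET) for _ in range(length))
    return {"serial": serial_number, "initial_password": password}

# Printed on the unit's label during manufacturing, e.g.:
print(provision_device("SN-000123"))
```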

This law is not a panacea. But we have to start somewhere, and it is a start.

Though the legislation covers only the state of California, its effects will reach much further. All of us — in the United States or elsewhere — are likely to benefit because of the way software is written and sold.

Automobile manufacturers sell their cars worldwide, but they are customized for local markets. The car you buy in the United States is different from the same model sold in Mexico, because the local environmental laws are not the same and manufacturers optimize engines based on where the product will be sold. The economics of building and selling automobiles easily allows for this differentiation.

But software is different. Once California forces minimum security standards on IoT devices, manufacturers will have to rewrite their software to comply. At that point, it won’t make sense to have two versions: one for California and another for everywhere else. It’s much easier to maintain the single, more secure version and sell it everywhere.

The European General Data Protection Regulation (GDPR), which implemented the annoying warnings and agreements that pop up on websites, is another example of a law that extends well beyond physical borders. You might have noticed an increase in websites that force you to acknowledge you’ve read and agreed to the website’s privacy policies. This is because it is tricky to differentiate between users who are subject to the protections of the GDPR — people physically in the European Union, and EU citizens wherever they are — and those who are not. It’s easier to extend the protection to everyone.

Once this kind of sorting is possible, companies will, in all likelihood, return to their profitable surveillance capitalism practices on those who are still fair game. Surveillance is still the primary business model of the Internet, and companies want to spy on us and our activities as much as they can so they can sell us more things and monetize what they know about our behavior.

Insecurity is profitable only if you can get away with it worldwide. Once you can’t, you might as well make a virtue out of necessity. So everyone will benefit from the California regulation, as they would from similar security regulations enacted in any market around the world large enough to matter, just like everyone will benefit from the portion of GDPR compliance that involves data security.

Most importantly, laws like these spur innovations in cybersecurity. Right now, we have a market failure. Because the courts have traditionally not held software manufacturers liable for vulnerabilities, and because consumers don’t have the expertise to differentiate between a secure product and an insecure one, manufacturers have prioritized low prices, getting devices out on the market quickly and additional features over security.

But once a government steps in and imposes more stringent security regulations, companies have an incentive to meet those standards as quickly, cheaply, and effectively as possible. This means more security innovation, because now there’s a market for new ideas and new products. We’ve seen this pattern again and again in safety and security engineering, and we’ll see it with the Internet of Things as well.

IoT devices are more dangerous than our traditional computers because they sense the world around us, and affect that world in a direct physical manner. Increasing the cybersecurity of these devices is paramount, and it’s heartening to see both individual states and the European Union step in where the US federal government is abdicating responsibility. But we need more, and soon.

This essay previously appeared on CNN.com.

** *** ***** ******* *********** *************

Oracle and “Responsible Disclosure”

[2018.11.14] I’ve been writing about “responsible disclosure” for over a decade; here’s an essay from 2007. Basically, it’s a tacit agreement between researchers and software vendors. Researchers agree to withhold their work until software companies fix the vulnerabilities, and software vendors agree not to harass researchers and fix the vulnerabilities quickly.

When that agreement breaks down, things go bad quickly. This story is about a researcher who published an Oracle zero-day because Oracle has a history of harassing researchers and ignoring vulnerabilities.

Software vendors might not like responsible disclosure, but it’s the best solution we have. Making it illegal to publish vulnerabilities without the vendor’s consent means that they won’t get fixed quickly — and everyone will be less secure. It also means less security research.

This will become even more critical with software that affects the world in a direct physical manner, like cars and airplanes. Responsible disclosure makes us safer, but it only works if software vendors take the vulnerabilities seriously and fix them quickly. Without any regulations that enforce that, the threat of disclosure is the only incentive we can impose on software vendors.

** *** ***** ******* *********** *************

More Spectre/Meltdown-Like Attacks

[2018.11.14] Back in January, we learned about a class of vulnerabilities against microprocessors that leverages various performance and efficiency shortcuts for attack. I wrote that the first two attacks would be just the start:

It shouldn’t be surprising that microprocessor designers have been building insecure hardware for 20 years. What’s surprising is that it took 20 years to discover it. In their rush to make computers faster, they weren’t thinking about security. They didn’t have the expertise to find these vulnerabilities. And those who did were too busy finding normal software vulnerabilities to examine microprocessors. Security researchers are starting to look more closely at these systems, so expect to hear about more vulnerabilities along these lines.

Spectre and Meltdown are pretty catastrophic vulnerabilities, but they only affect the confidentiality of data. Now that they — and the research into the Intel ME vulnerability — have shown researchers where to look, more is coming — and what they’ll find will be worse than either Spectre or Meltdown. There will be vulnerabilities that will allow attackers to manipulate or delete data across processes, potentially fatal in the computers controlling our cars or implanted medical devices. These will be similarly impossible to fix, and the only strategy will be to throw our devices away and buy new ones.

We saw several variants over the year. And now researchers have discovered seven more.

Researchers say they’ve discovered the seven new CPU attacks while performing “a sound and extensible systematization of transient execution attacks” — a catch-all term the research team used to describe attacks on the various internal mechanisms that a CPU uses to process data, such as the speculative execution process, the CPU’s internal caches, and other internal execution stages.

The research team says they’ve successfully demonstrated all seven attacks with proof-of-concept code. Experiments to confirm six other Meltdown-attacks did not succeed, according to a graph published by researchers.

Microprocessor designers have spent the year rethinking the security of their architectures. My guess is that they have a lot more rethinking to do.

** *** ***** ******* *********** *************

Upcoming Speaking Engagements

[2018.11.14] This is a current list of where and when I am scheduled to speak:

The list is maintained on this page.

** *** ***** ******* *********** *************

Since 1998, CRYPTO-GRAM has been a free monthly newsletter providing summaries, analyses, insights, and commentaries on security technology. To subscribe, or to read back issues, see Crypto-Gram’s web page.

You can also read these articles on my blog, Schneier on Security.

Please feel free to forward CRYPTO-GRAM, in whole or in part, to colleagues and friends who will find it valuable. Permission is also granted to reprint CRYPTO-GRAM, as long as it is reprinted in its entirety.

Bruce Schneier is an internationally renowned security technologist, called a security guru by the Economist. He is the author of 14 books — including the New York Times best-seller Data and Goliath: The Hidden Battles to Collect Your Data and Control Your World — as well as hundreds of articles, essays, and academic papers. His newsletter and blog are read by over 250,000 people. Schneier is a fellow at the Berkman Klein Center for Internet and Society at Harvard University; a Lecturer in Public Policy at the Harvard Kennedy School; a board member of the Electronic Frontier Foundation, AccessNow, and the Tor Project; and an advisory board member of EPIC and VerifiedVoting.org. He is also a special advisor to IBM Security and the CTO of IBM Resilient.

Crypto-Gram is a personal newsletter. Opinions expressed are not necessarily those of IBM, IBM Security, or IBM Resilient.

Copyright © 2018 by Bruce Schneier.

** *** ***** ******* *********** *************

Mailing list hosting graciously provided by MailChimp. Sent without web bugs or link tracking.

Bruce Schneier · Harvard Kennedy School · 1 Brattle Square · Cambridge, MA 02138 · USA

Nov 17, 2018
 

Julian Assange has been charged “under seal” in the US. That means no details of the charge, or even the charge itself, are meant to be known by the public.

Vanity Fair, Excerpt:

According to The Washington Post, an August 22 filing in an unrelated case mentions Assange twice by name. Arguing that a case involving a man accused of coercing a minor for sex should be kept sealed, Assistant U.S. Attorney Kellen Dwyer, who is also working on a long-standing case against WikiLeaks, wrote that both the charges and the arrest warrant “would need to remain sealed until Assange is arrested in connection with the charges in the criminal complaint and can therefore no longer evade or avoid arrest and extradition in this matter.” Elsewhere in the filing, Dwyer wrote that “due to the sophistication of the defendant and the publicity surrounding the case, no other procedure is likely to keep confidential the fact that Assange has been charged.” Seamus Hughes, a terrorism expert at the George Washington University, first noted both mentions. “To be clear, seems Freudian, it’s for a different completely unrelated case, every other page is not related to him,” he wrote on Twitter. The office “just appears to have Assange on the mind when filing motions to seal and used his name.”

Exactly what charges Assange is facing remains unclear. In the past, prosecutors have considered conspiracy, violating the Espionage Act, and theft of government property. During the Obama administration, the Justice Department held back on going after Assange amid concerns that doing so was similar to prosecuting a news outlet. (Charging someone for publishing accurate information, Assange’s lawyer Barry Pollack told The Guardian on Thursday, is “a dangerous path for a democracy to take.”) The recently ousted Jeff Sessions, however, took a more Draconian stance on government leaks, and prosecutors were reportedly told over the summer that they could start compiling a complaint. So far, the D.O.J. has not offered further details. “That was not the intended name for this filing,” Joshua Stueve, a spokesman for the United States Attorney’s Office for the Eastern District of Virginia, told The New York Times, explaining that “the court filing was made in error.”

Whether Assange will be charged as part of the Russia probe is also unknown, though it seems likely. Presumably, the mention of Assange’s name in legal documents has spooked Trumpworld, which is already on edge in anticipation of the next Mueller bombshell. According to Politico, the White House suspects more indictments are imminent, potentially targeting a cabal of Trump family members and associates for their connections to WikiLeaks. On Wednesday, the special counsel delivered a one-page motion to a Washington judge stating that former Trump campaign deputy chairman Rick Gates, who pleaded guilty to conspiracy against the U.S. and making a false statement in a federal investigation, “continues to cooperate with respect to several ongoing investigations.” Then, on Thursday, Mueller’s office and Paul Manafort’s lawyers jointly requested a 10-day extension to file a report pertaining to the former campaign chairman’s sentencing.

Trump allies are feeling the pressure. Conspiracy theorist and commentator Jerome Corsi, a Stone ally, has said he expects to be indicted for perjury, and told The Guardian that Mueller’s team grilled him on Assange and Brexiteer Nigel Farage, the latter of whom has links to both WikiLeaks and Trump. Donald Trump Jr., too, is said to be bracing for a legal showdown—as three sources recently told my colleague Gabriel Sherman, the president’s eldest son has “been telling friends he is worried about being indicted as early as this week.” (His lawyer, Alan Futerfas, denied this, saying in a statement, “Don never said any such thing, and there is absolutely no truth to these rumors.”)

As paranoia, media scrutiny, and the hashtag #indictmentpalooza pick up, the president, who has been working with lawyers on written answers to a series of Mueller’s questions, also appears to be on tenterhooks. “The inner workings of the Mueller investigation are a total mess. They have found no collusion and have gone absolutely nuts. They are screaming and shouting at people, horribly threatening them to come up with the answers they want,” he wrote on Twitter Thursday, ending an almost two-month hiatus of attacks on the Russia probe. “They are a disgrace to our Nation and don’t care how many lives [they] ruin.”

Nov 16, 2018
 
The light went on.  I remembered what “under the Law” means in the Office of the Privacy Commissioner.

(NOTE:  List of RELATED postings at bottom)

 

There’s a BLIND SPOT that creates confusion.  And down the garden path we go.

The word PRIVACY, used in two different contexts, has two different meanings.
If you do not understand that, your expectations of where the path leads will likely be wrong.

Let me explain it this way:

You file an FOI request (Freedom of Information – – a request for access to documents held by Government and public sector institutions) with, for example, the Saskatchewan Information and Privacy Commissioner.

“Privacy” in that context is about our access to information about ourselves that is held by government or public sector institutions.

Whereas,

“Privacy” in the context of the Charter Right to Privacy of Personal Information is about Constitutional Law that prohibits “the state” (the Government) from amassing detailed personal information on individuals in the society.

 

 

I could not figure out:  why aren’t people and the media raising the Charter Right to Privacy of Personal Information?   (StatsCan plans to get personal data through enforced “collaborations” with the private sector – – something it feels it has to do because of Canadians’ non-compliance with handing personal information over to StatsCan directly.)

Non-compliance rates have been exposed (2013-10   Lockheed Martin Census: StatsCan math is wrong on non-compliance. It’s 11%, not 2%. Under oath at the trial of Audrey Tobias).  In truth, non-compliance is higher than that.  One year, for example, 12,000 Canadians gave their “Religion” as “Jedi” (from Star Wars)!

 

But never mind – – “The Privacy Commissioner is investigating“, so we can relax; it’s a slam-dunk.  Except it’s not.  And it’s potentially dangerous if the Privacy Commissioner issues a report saying StatsCan’s operations are within the Law.

 

The Privacy Commissioner operates “under the Law“.  There’s the nub of it.  WHICH Laws?  The “Privacy Act” and one other (PIPEDA).  I skimmed the Privacy Act – – it’s mostly about access to personal information held in Government Departments.

The Privacy Commissioner more or less frames the debate in media interviews:

 – the Charter Right to Privacy of Personal Information does not fall within his terms of reference.

 

I think that’s part of why you don’t see it in the media coverage.

Citizens are the ones who have to insert the Charter Right into the debate.   Scroll down to the bottom:

A SMALL, IMPORTANT  ACTION

“REPORT A CONCERN”

ON-LINE, TO THE PRIVACY COMMISSIONER

 

Large numbers of people balk at StatsCan’s efforts to obtain personal information (and have for years).

2018-11-13  Canadians strongly oppose Statscan’s plan to obtain the banking records of 500,000 households: poll. Globe & Mail.

 

Another reason that the Charter Right is largely omitted from the public debate:  many Canadians do not know, or are not clear, that such a Right exists.  I doubt that it’s taught in schools.

LINKS

You might think that an investigation by the Privacy Commissioner will surely put an end to StatsCan’s plan to demand personal banking information from the Banks, until you realize

 “under the Law” means “under the 2 Acts of Parliament that govern the operations of the Privacy Commissioner”.   

 https://www.priv.gc.ca/en/

About the OPC

The Privacy Commissioner of Canada is an Agent of Parliament whose mission is to protect and promote privacy rights. The Office of the Privacy Commissioner of Canada (OPC) oversees compliance with the Privacy Act, which covers the personal information-handling practices of federal government departments and agencies, and the Personal Information Protection and Electronic Documents Act (PIPEDA), Canada’s federal private-sector privacy law.

The Charter Right to Privacy of Personal Information is NOT a Law administered by the Privacy Commissioner.

I only skimmed the Privacy Act.     This phrase is representative:

  • 8(1) A request for access to personal information

 

WHY are the media reports saying nothing about the Charter Right to Privacy of Personal Information in this debate?

There’s a blind spot created by using the same word to describe two different things.

– – – – – – –

Links from the website for the Privacy Commissioner:

  Privacy Commissioner launches investigation into Statistics Canada

Announcement – October 31, 2018

One might think this (an investigation) will solve the problem – – that StatsCan will not be allowed to order the banks to hand over people’s banking data.

 

– – – – – – –

From    Commissioner shares his views on the collection of financial information by Statistics Canada

Announcement – November 8, 2018

EXCERPTS

Indeed, Statistics Canada regularly consults us on the privacy implications of many of their initiatives; it (StatsCan) is always open to a dialogue and often accepts our recommendations.

 

After having received complaints related to Statistics Canada and its collection of personal information from private sector organizations, I have opened an investigation. 

 

I am at liberty to tell you that I have received 52 complaints on this matter as of this morning.

 

A SMALL, IMPORTANT  ACTION

“REPORT A CONCERN”

ON-LINE, TO THE PRIVACY COMMISSIONER

 

I don’t know if our network is large enough to insert the Charter Right into the debate.   But let’s give it a try.   In a small office, large volumes are hard to handle.  Quality of input, not quantity, counts!

I talked with a woman at the Office of the Privacy Commissioner, who recommended the following (as I understand it, the Commissioner’s Report will incorporate the Concerns we submit):

Go to:

https://services.priv.gc.ca/q-s/allez-go/eng/fb408134-7bd4-48cb-87a2-4ac6210dee51

Privacy Commissioner of Canada

Report a Concern

Notice:     This form is intended for individuals who want to share their comments with us but do not require a response from our Office.

(Maximum 4096 characters)

Tell the Privacy Commissioner your thoughts.   These might be helpful – – some ideas you can copy and change to suit yourself:

  • Canadians have a Charter Right to Privacy of Personal Information.   OR

 

  • The Charter Right should be in your Report on StatsCan and the Banks.   OR

 

  •   This is what our Charter Right says:  “In fostering the underlying values of dignity, integrity and autonomy, it is fitting that s. 8 of the Charter should seek to protect a biographical core of personal information which individuals in a free and democratic society would wish to maintain and control from dissemination to the state.”     OR

 

  • The Charter Right to Privacy of Personal Information says that StatsCan can’t do what it’s trying to do.  Constitutional Law is higher than the Privacy Act and all other Acts of Parliament.  OR

 

  • Under the Law, if StatsCan wants to get the personal information of citizens from the Banks, it has to make a successful application to the Courts for a Section 1 over-ride.  In Court, StatsCan has to satisfy the criteria set out for the Government to take away a Charter Right of all citizens.

(2010-12-23    Charter of Rights and Freedoms, Section 8 Privacy – Case Law: The Queen Vs Plant protects a “biographical core of personal information” from the state.  Oakes Test to override.)

Don’t forget the “COMMENTS” below.

RELATED POSTINGS

2018-11-16  the BLIND SPOT in Privacy Commissioner’s investigation of StatsCan (getting personal data from the private sector)

2018-11-13   Canadians strongly oppose Statscan’s plan to obtain the banking records of 500,000 households: poll. Globe & Mail.

2018-11-13  Blind men describing elephant: Reply to “I wish I could persuade you that everyone gains from what is being proposed” by StatsCan (collection of data from Banks)

2018-11-12    My reply to “StatsCan plan to scoop customer spending data from banks”

2018-11-11  The law that lets Europeans take back their data from big tech companies, CBS 60 Minutes.

2018-11-08  Senator ‘repelled’ by StatsCan plan to scoop customer spending data from banks, IT World Canada

2018-11-06   News Release from Senate of Canada: Senate committee to probe Statistics Canada’s request for Canadians’ banking data

2016-08-23  MK Ultra: CIA mind control program in Canada (1980) – The Fifth Estate

Nov 16, 2018
 

(NOTE:  list of RELATED postings at bottom)

by  Bill Curry

 

Canadians strongly oppose Statistics Canada’s plan to obtain personal banking records – and most would not consent to participating, according to a new Nanos Research survey.

 

The survey suggests the federal government is on the wrong side of public opinion in its defence of the plan, with 74 per cent of respondents either opposing, or somewhat opposing, Statscan accessing those records without permission. Prime Minister Justin Trudeau and his cabinet repeatedly defended it this month in the House of Commons in response to criticism from opposition MPs.

 

The statistics agency has warned in recent years that its traditional survey methods are becoming less reliable due to declining participation rates. As a result, it is exploring new ways of collecting data by working with private-sector companies.

 

The plan quickly moved from theory to practice in recent weeks when the agency sent letters to nine Canadian banks informing them it would compel them to hand over the personal banking records of 500,000 Canadian households in January.

 

The nine banks – BMO, CIBC, Canadian Western Bank, HSBC, Laurentian Bank, National Bank, RBC, Scotiabank and Toronto-Dominion Bank – were caught by surprise. The Canadian Bankers Association expressed concern with the plan and recently said it is considering legal options.

 

The letters triggered an investigation by the federal Privacy Commissioner, and Statscan said it will not proceed until that investigation is complete.

 

The agency wants the banking records in order to improve the speed and accuracy of its reporting in areas such as spending trends and inflation. It says individual names will be removed and no identifiable information will be released to the public.

 

Still, the Nanos survey found that 57 per cent of Canadians would not consent to having their personal banking data shared with Statscan. Only 30 per cent said they would consent, while 13 per cent said they were unsure. Older Canadians were more likely to oppose the plan.

 

In response to a related question, 64 per cent said protecting the privacy of financial data is more important than helping Statscan better understand consumer behaviour and trends.

 

Pollster Nik Nanos said Canadians clearly want to have control over their private information.

 

“Part of the issue here is the lack of consent – that there’s not even an option to give consent, at least as it’s currently being proposed,” he said. “If they want to listen to Canadians, the government should moderate its position. It’s pretty clear that there’s a significant proportion of Canadians that are uncomfortable with financial data being shared – along with their personal information – by the banks to Statistics Canada.”

 

Chief statistician Anil Arora has said the project would not produce high-quality data if it only collected the banking records of individuals who agree to participate.

 

“We know that when you have a consent-based or a voluntary model, you are making some very significant quality trade-offs,” he told The Globe and Mail in an interview this month.

 

Mr. Arora compared the issue to the debate over whether compliance with the long-form census should be mandatory or voluntary: “Because those that say, ‘Yes’ and fill out the form don’t look anything like those that say, ‘No, I won’t fill it out.'”

 

The survey of 1,000 Canadians was commissioned by Nanos Research and took place from Nov. 3-7. A survey of that size is considered accurate to within plus or minus 3.1 percentage points, 19 times out of 20.
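For readers who wonder where the “plus or minus 3.1 percentage points, 19 times out of 20” figure comes from, here is a minimal sketch of the standard margin-of-error calculation for a simple random sample of 1,000.  It assumes the conventional worst-case proportion of 50 per cent and a 95 per cent confidence level; it is illustrative only, not Nanos Research’s published methodology.

```python
# Illustrative margin-of-error calculation for a survey of n = 1,000.
import math

n = 1000   # sample size reported for the Nanos survey
p = 0.5    # worst-case proportion (maximizes the margin of error)
z = 1.96   # z-score for 95% confidence ("19 times out of 20")

margin = z * math.sqrt(p * (1 - p) / n)
print(f"Margin of error: +/- {margin * 100:.1f} percentage points")  # ~3.1
```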

 

It also found that when it comes to protecting their personal financial information, Canadians rank banks ahead of Statistics Canada: Sixty-nine per cent said they would trust or somewhat trust banks to protect their personal information, while 65 per cent said the same about the agency.

 

Canadians have even less faith in credit-card companies: Only 47 per cent of survey respondents said they would trust or somewhat trust them to protect their personal information.

= = = = = = = = = =

RELATED POSTINGS

2018-11-16  the BLIND SPOT in Privacy Commissioner’s investigation of StatsCan (getting personal data from the private sector)

2018-11-13   Canadians strongly oppose Statscan’s plan to obtain the banking records of 500,000 households: poll. Globe & Mail.

2018-11-13  Blind men describing elephant: Reply to “I wish I could persuade you that everyone gains from what is being proposed” by StatsCan (collection of data from Banks)

2018-11-12    My reply to “StatsCan plan to scoop customer spending data from banks”

2018-11-11  The law that lets Europeans take back their data from big tech companies, CBS 60 Minutes.

2018-11-08  Senator ‘repelled’ by StatsCan plan to scoop customer spending data from banks, IT World Canada

2018-11-06   News Release from Senate of Canada: Senate committee to probe Statistics Canada’s request for Canadians’ banking data

2016-08-23  MK Ultra: CIA mind control program in Canada (1980) – The Fifth Estate

Nov 15, 2018
 

= = = = = = = = = =  = = = = = = =

RELATED POSTINGS

2018-11-16  the BLIND SPOT in Privacy Commissioner’s investigation of StatsCan (getting personal data from the private sector)

2018-11-13   Canadians strongly oppose Statscan’s plan to obtain the banking records of 500,000 households: poll. Globe & Mail.

2018-11-13  Blind men describing elephant: Reply to “I wish I could persuade you that everyone gains from what is being proposed” by StatsCan (collection of data from Banks)

2018-11-12    My reply to “StatsCan plan to scoop customer spending data from banks”

2018-11-11  The law that lets Europeans take back their data from big tech companies, CBS 60 Minutes.

2018-11-08  Senator ‘repelled’ by StatsCan plan to scoop customer spending data from banks, IT World Canada

2018-11-06   News Release from Senate of Canada: Senate committee to probe Statistics Canada’s request for Canadians’ banking data

2016-08-23  MK Ultra: CIA mind control program in Canada (1980) – The Fifth Estate

Nov 13, 2018
 

(NOTE:  list of RELATED postings at bottom)

Further to:    My reply to “StatsCan plan to scoop customer spending data from banks”

I received this:

I circulated on Facebook the petition against this sharing of personal data from financial institutions.  One response made it clear that people trust Stats Canada way more than is warranted, based on their track record.

And this:

This is one question (StatsCan) I do know a lot about, and I wish I could persuade you that everyone gains from what is being proposed; this is not the banks passing on your data to credit agencies, who then leaked your records, but StatsCan, with an unblemished record for assembling the facts we need to share while protecting your privacy better than any other firm or agency with which you share your data.

in hopes of triggering second thoughts, and with best wishes,

 

I conclude:   we diverge on whether or not to trust StatsCan.

 

My “Second thoughts” are really “More thoughts”.

Hi!  . . .

 

I appreciate your overture, and your experience with StatsCan.

My concern is not the work that StatsCan does.  My concern is the amount of detailed personal information they are amassing.

 

First, you may not be aware:

It is not the case that StatsCan has an unblemished record.  They have communications specialists.

The first 2 examples are CBC reports.

Some documents were left on subway cars, some mailed to wrong addresses, etc.

There are more serious allegations about the hacking of Statistics Canada’s secure servers.  StatsCan says no harm was done.

 

Beyond that:

I think we two are the blind men describing the elephant.  The one at the front describes a thick tail.  The one at the back describes a wispy thin tail, on the same animal.

You are feeling a different part of StatsCan than I have experienced, starting back in 2003 (which is to say that my experience is not insignificant, nor is yours).

An advantage for me:  people in the network forward information I would never see by myself.   I cross-examined the head of Census operations at my trial.  I have been in direct contact, over the years, with other people who were prosecuted.

To me,  it is written in large letters, across the side of the elephant, not visible to the blind:  surveillance.  Edward Snowden wrote that.  That is what you get with the parties that StatsCan works with.  I won’t repeat the evidence – – like the travel expense claims from the StatsCan website.

 

I have a long interest in Nazi Europe.  It seems so preposterous:  how could people have been so obtuse as to let it happen?  One example:

It is documented:  people in the field were telling their leaders long before it happened – – intelligence showed that the Nazis were going to go through the Ardennes Woods to invade France; you may know the story.  But it was pooh-poohed:

Allied generals in World War II felt the region was impenetrable to massed vehicular traffic and especially armor, so the area was effectively “all but undefended” during the war, leading to the German Army twice using the region as an invasion route into Northern France and Southern Belgium via Luxembourg in the Battle of France and the later Battle of the Bulge.

Anyhow, there you go.   Sorry, I am convinced that we need the Charter Right to Privacy more than StatsCan & Lockheed Martin need all our personal information.     If you have not read “IBM and the Holocaust”, you really should.   Mechanized census files, very detailed.  What they had then – – Hollerith machines and punch cards – – is nothing compared with what they have today.

 

From my “tail” of the elephant,  StatsCan is not trustworthy.  And the last people on Earth I would trust:  the American imperialists, the military-industrial-congressional complex.  It’s an Ardennes Woods story.

 

ILLUSION IS THE FIRST OF ALL PLEASURES (VOLTAIRE).

THE WORLD WANTS TO BE DECEIVED (HITLER) 

(Quote from:  Illusion and Denial that the invader is on the way (Ardennes, Sarajevo). Canada?)

It’s not a conspiracy theory.  It’s just piecing together the intelligence collected, starting in 2003.  I have had the luxury of TIME to spend on the topic.  It’s a priority for me – – it would have been for you, too, if you had received a Summons to Court, as I did.  You don’t have to be a genius to figure it out, you just need time and a supporting network working with you.

Best wishes,

Sandra

– – – – – – – – –

RE:  StatsCan’s plan to obtain spending data from Banks

I am concerned by a lack of emphasis and informed dialogue on:

  • The Charter Right to Privacy of Personal Information
  • WHY we have the Charter Right
  • The Rule of Law
  • What happens in a corporatized civil service

My reply to “StatsCan plan to scoop customer spending data from banks” is posted at:

http://sandrafinley.ca/?p=22835   

It elaborates on the topics of concern.  I am hoping you will peruse it and, if it suits, share it with others.

Thank-you for your consideration, and all that you do in service to a better Canada for everyone.

Sandra Finley

= = = = = = = = = = =

RELATED POSTINGS

2018-11-16  the BLIND SPOT in Privacy Commissioner’s investigation of StatsCan (getting personal data from the private sector)

2018-11-13   Canadians strongly oppose Statscan’s plan to obtain the banking records of 500,000 households: poll. Globe & Mail.

2018-11-13  Blind men describing elephant: Reply to “I wish I could persuade you that everyone gains from what is being proposed” by StatsCan (collection of data from Banks)

2018-11-12    My reply to “StatsCan plan to scoop customer spending data from banks”

2018-11-11  The law that lets Europeans take back their data from big tech companies, CBS 60 Minutes.

2018-11-08  Senator ‘repelled’ by StatsCan plan to scoop customer spending data from banks, IT World Canada

2018-11-06   News Release from Senate of Canada: Senate committee to probe Statistics Canada’s request for Canadians’ banking data

2016-08-23  MK Ultra: CIA mind control program in Canada (1980) – The Fifth Estate

 

Nov 12, 2018
 

(NOTE:  list of RELATED postings at bottom)

Tech companies’ reign over users’ personal data has run largely unchecked in the age of the internet. Europe is seeking to end that with a new law

by  Steve Kroft

This has not been a great year for big tech, on Wall Street or in Washington. For decades, companies like Google, Facebook, and Amazon have made vast sums monetizing the personal information of their users with almost no oversight or regulation. They are still making vast sums of money, but public attitudes about their size and power, and their ability, or willingness, to police themselves are being called into question. A consensus is developing that something has to change, and once again the impetus is coming from Europe, which is becoming the world’s leader in internet privacy and data protection. With a 31-year-old lawyer as the catalyst, the European Parliament has enacted a tough new law that has Silicon Valley scrambling to comply, and pressuring lawmakers here to do something about protecting your data.

Seven times this year big tech has been called on the carpet to answer for data breaches, fake news, political meddling on the internet, and the endless amounts of personal information being gathered on Americans.

Sen. John Kennedy: I don’t want to vote to have to regulate Facebook, but by God I will.

Sen. Mark Warner: The era of the Wild West in social media is coming to an end.

Sen. John Thune: The question is no longer whether we need a federal law to protect consumers’ privacy, the question is what shape will that law take?

In Europe, they already have a law in place. After levying multi-billion dollar fines against Google for anti-competitive behavior, the European Union enacted the world’s most ambitious internet privacy law, even winning support from the CEO of the biggest tech company in America, Apple’s Tim Cook.


Apple Inc. CEO Tim Cook has spoken in support of the GDPR   Reuters

 

Tim Cook: This is surveillance. And these stockpiles of personal data serve only to enrich the companies that collect them.

 

Speaking in Brussels, Cook did not say which companies he was talking about but Apple wasn’t one of them. Its business model is making and selling phones and computers, not marketing personal information for advertising like Google and Facebook.

 

Tim Cook: Our own information from the everyday to the deeply personal, is being weaponized against us with military efficiency. It is time for the rest of the world, including my home country, to follow your lead.


Most people would agree that the point man in Europe has been a spikey-haired 31-year-old Viennese lawyer named Max Schrems who has been inflicting misery in Silicon Valley for the past seven years. He not only brought international attention to the issue of data privacy, he brought big tech lawyers into court. In the information age, he says data is the most important commodity. The question is who does it belong to.

 

Steve Kroft: Who owns your data?

 

Max Schrems: The legislation here says it’s you that your data belongs to.

 

Steve Kroft: You should have control over it.

 

Max Schrems: You should have control over that. However, in an environment where there is no such law, basically, whoever factually has the power over it, which is usually the big tech company, owns it, in that sense.

 

Max Schrems was a major force in drafting the General Data Protection Regulation or GDPR. It became law in May, after a long battle with big tech, and every company that does business in Europe, including the most powerful ones in America, must comply. It was designed specifically to ensure that consumers, not tech companies, have control over the collection and use of their own personal information.

 

Steve Kroft: What kind of new rights does this law give European citizens that people in the United States might not have?

 

Max Schrems: The default under the European system is you’re not allowed to use someone else’s data unless you have a justification. And the result of that is that you have rights, like a right that– you walk up to a company and say, “Delete everything you have about me.” You have a right to access. So you can say, “I want to have a copy of everything you have about me.” And all of these little elements in the law, overall, are meant to give you that power over your data that in an information society we should probably have.


Max Schrems speaks to correspondent Steve Kroft    CBS News

And right now in the United States you have none of those legal rights.

 

Jeffrey Chester: Americans have no control today about the information that’s collected about them every second of their lives.

 

Jeff Chester is the executive director of the Center for Digital Democracy. He has been a major voice on digital privacy for two decades, and says the only Americans guaranteed privacy on the internet are children under 13. He says there are some limitations on some specific medical and financial information, but the internet has rendered them obsolete.

 

Jeffrey Chester: There are no rules, there’s not a government agency really protecting them. Any– the companies can do whatever they want in terms of gathering our information and using it in any way they see fit.

 

Steve Kroft: How did the big tech companies come to collect all this information?

 

Jeffrey Chester: No one ever told them they couldn’t collect it all. There’ve been no limits at all ever established.

 

Steve Kroft: And that’s what’s going on with GDPR, somebody saying, “You can’t?”

 

Jeffrey Chester: That’s exactly right. GDPR says you can’t collect it without permission.

 

The big tech companies have always argued that consumers have given them permission to take their personal data in exchange for using the product. It’s buried in the fine print on those long impenetrable online privacy agreements that you have to click on. Max Schrems says it’s not free choice but constitutes coercion under the new European law.

 

On the day it was enacted Schrems’ nonprofit group “None of Your Business” took action against Facebook and Google for allegedly violating European privacy laws.

 

Max Schrems: It’s this take it or leave it approach. You know it whenever you open an app it says, “agree, or don’t use the app” and your choice is basically not existent because either you go offline – or you have to agree.

 

Schrems cited the example of Google’s Android operating system, the software which runs up to 80% of the world’s smartphones. But to use one, you must first activate it and give Google consent to collect your personal data on all of its products.

 

Max Schrems: You paid $1,000 right now and you’re not allowed to use your $1,000 phone unless you agree that all the data goes to someone else. And that is basically forced consent.

 

Steve Kroft: The tech companies say, “Look, you, the user, you gave us permission to take this information to use it the way we wanted to. You agreed to it.”

 

Max Schrems:  And that–

 

Steve Kroft:  “You signed on. You made the deal.”

 

Max Schrems: The individual doesn’t have the power, the time, the legal expertise to understand any of that. And then you’re sitting at home at your desk and have the option to only say yes. This is not what any reasonable person would consider a fair deal.

 

Schrems has been waging this battle since 2011 when he spent a semester in California at Santa Clara University School of Law. A lawyer from Facebook told his class that big tech didn’t pay any attention to European privacy laws because they were rarely enforced and that the fines were very small.

 

Max Schrems: it was obviously the case that ignoring European privacy laws was the much cheaper option. The maximum penalty, for example, in Austria was 20,000 euros. So just a lawyer telling you how to comply with the law was more expensive than breaking it.

 

At the time most people had no idea how much personal information was being collected on them, so when the 23-year-old Schrems returned to Austria he decided to ask Facebook if he could see what they had collected on him. By mistake or miracle, someone at Facebook sent him this stack of information, lifting the veil on the extent of the company’s interest in him.

 

Max Schrems: And after a while I got a PDF file with 1,200 pages after using Facebook for three years and I’m not a heavy user or anything like that.


Max Schrems   CBS News

Facebook had created a dossier of Max’s life. That included his location history, events he attended, all of his contact information and his private Facebook messages, even the ones he thought he had deleted.

 

Steve Kroft:  So these were personal conversations you had that you thought were between yourself and the other person?

 

Max Schrems: Yeah.

 

Steve Kroft: And they’re all here?

 

Max Schrems: They’re all here, and they’re basically undeletable.

 

It created a huge stir at the time, but it’s nothing compared to what’s being gathered now. Today, Facebook collects information on people who don’t even have an account. Google’s Android software knows whether the user is walking, running, or riding in a car. And Amazon has patented algorithms that could be used on its Echo smart speaker to listen in on continuous conversations, and even read the mood of people in the room.

 

Max Schrems: The reality is that this industry is so fast-moving right now, even if you have perfect enforcement mechanisms, usually they will get away with it. Unless there is a serious penalty.

 

Today, if one of the big tech companies chooses to ignore Europe’s new data protection law, it could cost them 4 percent of their global revenues, which for the biggest companies would mean billions of dollars.
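To put a rough number on that, here is a minimal, illustrative calculation.  The $100 billion revenue figure is a hypothetical assumption, not taken from the report; it simply shows how quickly a 4 percent ceiling reaches into the billions.

```python
# Illustrative only: maximum fine at 4% of a hypothetical $100B in global revenue.
global_revenue = 100_000_000_000   # hypothetical annual revenue, in dollars
max_fine_rate = 0.04               # ceiling cited in the report: 4% of global revenue

max_fine = global_revenue * max_fine_rate
print(f"Maximum fine: ${max_fine / 1e9:.1f} billion")  # $4.0 billion
```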

 

Those decisions will likely be made here in Dublin, the busiest of Europe’s 28 data protection centers, and the place where most American tech companies have their European headquarters. They flocked here years ago because of Ireland’s low corporate taxes and its reputation for relaxed regulation.

 

Ireland’s data protection commissioner Helen Dixon says it’s not going to be business as usual.

 

Helen Dixon: U.S. internet companies have no doubt that this law is serious, it has serious bite. And all of them are eager to avoid any engagement with that.

 

Steve Kroft: How would you describe your relationship with these companies right now? Is the relationship cooperative or contentious?

 

Helen Dixon: It’s all of those things in any one week.


Helen Dixon    CBS News

Dixon says tech companies are spending tens of millions of dollars hiring lawyers, compliance officers and engineers to make sure they are operating within the law. The data protection authorities have only a few thousand employees in Europe to police some of the most powerful companies in the world, but they have subpoena power, can conduct raids, and even shut down operations.

 

Steve Kroft:  You think the big tech companies, the people in Silicon Valley are taking this seriously?

 

Eoin O’Dell: I think they have to.

 

Eoin O’Dell is a law professor at Trinity College in Dublin and a leading expert on European privacy law. He says Europe has now established an international standard for internet privacy, and companies like Facebook, Google and Amazon are not about to retreat from a $17 trillion market.

 

Eoin O’Dell: We have safety standards in cars, but that hasn’t stopped us driving cars. We have emissions standards for – for the gas in the cars, but that hasn’t stopped us using the gas in the cars.  The data companies are – going to comply in the same way as the – car companies have complied.

 

Steve Kroft: To stay in business.

 

Eoin O’Dell: To stay in business.

 

Since the European privacy law was passed, at least ten other countries have adopted similar rules. So has the state of California.

 

Perhaps sensing the inevitable, Facebook, Twitter, Google and Amazon are now saying they could support a U.S. privacy law if they were given considerable input. The Internet Association, which lobbies for big tech, and its president Michael Beckerman say they would support giving Americans reasonable access to their information and some privacy rights now enjoyed by the Europeans.


Michael Beckerman    CBS News

Steve Kroft: From your point of view, who owns the data that’s collected?

 

Michael Beckerman: I think individuals should have complete control over their information. You should have access to it, both how you’re giving it in the online world and offline world, and full transparency on who has the information and what you’re getting for it.

 

Steve Kroft: But who owns it?

 

Michael Beckerman: People should have control over it. I don’t view it as an ownership, you know, the way you’re– the way you’re asking. But I think the individual–

 

Steve Kroft: The Europeans do, the Europeans say it’s a right. You own your information. You have a right–

 

Michael Beckerman: We have–

 

Steve Kroft: –to go to the companies and say, “I want this information.”

 

Michael Beckerman: Under the law that we’re pushing, and the rules that we’re pushing, and what our companies already do, people can download the information– their personal information that they’ve shared with the sites, and delete it if they want, and cancel their accounts.

 

Privacy advocate Jeff Chester says the industry wants people to believe that it’s cooperating and open to change, but that it won’t do anything until it’s forced to by law.

 

Jeffrey Chester: This is simply a bait and switch in terms of protecting privacy in America today. The companies have no intention of supporting a privacy law that actually would stop them from collecting our information and give Americans the same rights the Europeans now have.

 

Produced by Maria Gavrilovic. Associate producer, Alex Ortiz.

© 2018 CBS Interactive Inc. All Rights Reserved.

Nov 12, 2018
 

UPDATES:     (note – list of “RELATED” postings at bottom)

2018-11-16  the BLIND SPOT in Privacy Commissioner’s investigation of StatsCan (getting personal data from the private sector)

2018-11-13   Canadians strongly oppose Statscan’s plan to obtain the banking records of 500,000 households: poll. Globe & Mail.

– – – – – – – – – – – –

MY REPLY

Dear  Senator,

RE:

– – – – – – – – – – –

I am grateful to the Senate for probing the plans of StatsCan.

I wish to submit the following, pertinent to your deliberations.

– – – – – – – – – –

  1. THE MOST IMPORTANT ARGUMENT:  see   The role of mechanized census data in Nazi Europe.   (A characteristic of totalitarian police states:  detailed files of personal information on everyone.)

Mankind barely noticed when the concept of massively organized information quietly emerged to become a means of social control, a weapon of war, and a roadmap for group destruction.  …     IBM and the Holocaust, by Edwin Black

– – – – – – – – – –

2.  The effort by StatsCan to get personal data from banks (no matter how they rationalize it) contravenes our Charter Right to Privacy.

THE CHARTER RIGHT:

“In fostering the underlying values of dignity, integrity and autonomy, it is fitting that s. 8 of the Charter should seek to protect a biographical core of personal information which individuals in a free and democratic society would wish to maintain and control from dissemination to the state.” 

Please see

2010-12-23    Charter of Rights and Freedoms, Section 8 Privacy – Case Law: The Queen Vs Plant protects a “biographical core of personal information” from the state.  Oakes Test to override.

– – – – – – –

3.  The RULE OF LAW  is important to me;  it is not important to StatsCan.

Please see   Democracy: Significance of the Rule of Law

StatsCan claims that the Statistics Act gives it authority to take away citizens’ Charter Right to Privacy of Personal Information.  Most people know that Rights provided under Constitutional Law cannot be taken away by a regular Act of Parliament.  Under Constitutional Law, in order for the Government to take away a Charter Right, it has to meet the criteria set out in the Oakes Test.  As far as I know, StatsCan / the Justice Dept have not applied to the Courts to see if they can meet the criteria, so the Charter Right stands.  StatsCan’s assertions to citizens that the Statistics Act gives it authority to take away Charter Rights are bogus.

Further, StatsCan continues to tell citizens that “it is the Law”, under threat of prosecution, that they have to fill in (for example) surveys, when the Statistics Act says that participation in surveys is NOT mandatory (the sanctions for census non-compliance do not apply).   StatsCan uses a serious lie to intimidate and coerce citizens into providing information protected by the Charter Right to Privacy of Personal Information.

– – – – – –

4.  The debacle at StatsCan since 2003 has spawned summaries, e.g.:   ARE STATSCAN “SURVEYS” MANDATORY?

Because of the continuous data collection on individuals being done through StatsCan Surveys, this is the most-used page on my blog.  Blog statistics show the number of “hits” on this posting, as at Nov 11, 2018, was 75,448.

NOTE:   I have no illusions, mine is a small blog – – these statistics, dependent on software programming, are inflated.  I do not know by how much and in what ways.

What the numbers do tell you:  the actions of Statistics Canada have sent large numbers of people scurrying to find information.

People do a web search – –  “Are StatsCan surveys mandatory”,  “Do I have to fill in a StatsCan survey?”,  “I am being harassed by StatsCan”,  and so on.   They end up on the above posting, which by now contains comprehensive information.

Lockheed Martin, War Economy, StatsCan, Charter Right Privacy, Trial   shows   68,460 hits, as of Nov 11, 2018.

Maybe the story will be:

Canadians noticed when massively organized information was emerging.  They . . .

– – – – – –

5.   Integrity at StatsCan.    Anil Arora is currently the Chief Statistician of Canada; he testified at the recent Senate Hearing into StatsCan’s plan to get spending data directly from the banks.

Who is the driver behind the wheel?

From    2016-03-18    Does Lockheed Martin Corp have a role in the 2016 Census?    (Starting at least as early as 2003, Lockheed Martin Corp was at StatsCan.)

The above posting addresses collaboration by specified countries (including Canada) on:  data bases on citizens that are housed in their statistics departments (or census bureaux), under the Steerage of Lockheed Martin Corp.

EXCERPT FROM THE POSTING

StatsCan CREDIBILITY GAP

Edward Snowden and Glenn Greenwald did a good job of explaining that under the auspices of the NSA, backdoor entry to data bases is established if American “security” forces cannot obtain legal front door access.  Lockheed Martin is a contractor to the NSA. Both entities are surveillance specialists; both see themselves as being outside the rule of law.  The data base at StatsCan will contain the on-going collection of data through censuses AND surveys. Your name is on your file (established during the cross-examination of the StatsCan witness, Anil Arora, at my trial).  All in all, EVEN IF Lockheed Martin is “out” (it isn’t), a backdoor entry to the data base will be in place.

NOTE:

The interest of the Americans in obtaining access to information on ALL Canadians is known through mainstream media reports:

2008-11-01   Ottawa Citizen, “American officials are pressuring the Federal Government to supply them with information on Canadians ..

The means?

It’s pretty well spelt out by Lockheed Martin:

2006-09-13   Maclean’s Magazine interview, President of the Americas for Lockheed Martin, Ron Covais

The interview with the Lockheed Martin CEO spells out how the corporations will, and do, circumvent democracy.  It is important for Canadians to understand, if we are to successfully reclaim our democracy.

Lockheed’s position at StatsCan was in place by the time of this interview, had been for about three years.  Covais:

We’ve decided not to recommend any things that would require legislative changes because we won’t get anywhere .. The guidance from the ministers was “Tell us what we need to do and we’ll make it happen”, recalls Covais who chairs the U.S. section of the Council.. the future of North America decided.. not in a sweeping trade agreement on which elections will turn, but by the accretion of hundreds of incremental changes implemented by executive agencies, bureaucrats, and regulators ..

StatsCan is an “agency” of the Government.

 

AMONG THE LIES TOLD BY STATSCAN:   (I would not use the word “lies” if I did not have a documented history to support the statement.)

According to the actual numbers provided by Yves Beland, StatsCan Director of Census Operations, at the Audrey Tobias trial, the non-compliance rate for the 2011 Census is 11%, not the 2% figure StatsCan claims and quotes to the media.   (Under oath, the numbers given were 13 million returns out of 14.6 million.   StatsCan says that is 98% compliance.  It is 89% compliance.  Do the math.)
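A minimal sketch of that arithmetic, using the two figures cited from the trial testimony (13 million returns out of 14.6 million), shows where the 89% and 11% come from:

```python
# Compliance arithmetic from the figures given under oath at the Tobias trial.
returned = 13_000_000     # census forms returned
households = 14_600_000   # households that should have returned a form

compliance = returned / households   # ~0.890
non_compliance = 1 - compliance      # ~0.110

print(f"Compliance:     {compliance:.1%}")      # ~89.0%, not 98%
print(f"Non-compliance: {non_compliance:.1%}")  # ~11.0%, not 2%
```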

There are the blatant lies told to Canadians about what the Law is  (item #3).

 

Try this one on:

MAYBE Lockheed Martin (U.S. weapons, surveillance, war) is OUT of the Canadian Census?  (Tobias, Stegenga & other trials)

StatsCan said under oath at the trial of Audrey Tobias (over the 2011 Census, trial in 2013):  Lockheed Martin’s role in the Census would be ended within two years (the next Census in 2016).    The decision had been made by the time of the Tobias trial in October 2013.

And yet, they went ahead and prosecuted:

  • Audrey Tobias, 89 years old
  • Janet Churnin, 79 years  old  and
  • (Karen) Eve Stegenga,  a self-employed yoga instructor, 37 years old.

all of whom had refused to fill in a Census form because of the role of Lockheed Martin Corp at StatsCan. 

The three women are obviously a threat to other people in their communities, hence deserving of prosecution (under threat of fines and jail time).

What did the judges think?

  • (Karen) Eve Stegenga received a conditional discharge (July 17, 2014).  She was assigned 25 hours of community service.
  • Janet Churnin received a conditional discharge (December 2013).  50 hours of community service.
  • Audrey Tobias was found not guilty (October 2013).

Community service is routine in the lives of every one of these women.   The sentences were not a hardship and did not have a deterrent effect.

Prosecution Services wanted a $250.00 fine, community service, and probation for Eve Stegenga.  The intent was deterrence for other Canadians.   Why they continued to prosecute the Lockheed cases is beyond me.   If it’s just to show who is Lord of the Manor, well, they may want to re-assess.   The Judges are not upholding their lordly status (two discharges and one not guilty – – speaking only of the 2011 Census, and only of these 3 prosecutions, known without a “Freedom of Information” request).

The cost of any one of these trials is very high – preparation, consultations, judges, prosecutors, court workers, facility costs, opportunity costs (the money could have been put to better use).

As mentioned, by the 2011 Census, non-compliance was 11%  (not the 2% reported by StatsCan).   Hopefully, after the Stegenga case, the Justice Dept “gets it”:  the collective conscience of Canadians is strong.   They are not going to obtain compliance by using the threat of prosecution, fines and jail time.  Not by operating outside the Law.  And not when people know that the Government should not be building detailed files of personal information on people.  Whether they know it intuitively, through remembrance of history, or through rational faculties,  it is the same.

So now, it appears that StatsCan (or whoever is behind the steering wheel) plans to get what it wants another way.  Be damned the Charter Right to Privacy of Personal Information, and be damned the fact that detailed files on citizens are a hallmark of police states.

The role of mechanized census data in Nazi Europe.   (IBM)

– – – – – – – – –

Arora is good at word-smithing (misleading).

2010-01-17  StatsCan witness, Anil Arora, under oath says those who didn’t comply were referred for prosecution; 64 people in all of Canada were charged.

 

I appreciate the opportunity to fill in gaps in the decision process regarding StatsCan and its ill-conceived plans.   Thank-you.

/Sandra Finley

 

APPENDED   The following will be information overload for some, and “of interest” to others:

 

Do Canadians have a CHARTER RIGHT TO PRIVACY of personal information?  . . . in theory, in political and legal rhetoric, the answer is “yes”.    But see the  LEGAL ARGUMENT.   In practice, the answer is “only if citizens stand up and fight to keep it”.

Do not rely on Governments and the Courts to defend the Charter Right.  The  LEGAL ARGUMENT discusses what the law says and how it has been applied by the Courts in this instance.

A short posting:  The Oakes Test to over-ride Charter Rights.  How Prosecutors get around it.

  • From: Howard Solomon
    Sent: November 8, 2018 7:45 PM

Hi Sandra

StatsCan appears to be relying on Section 13 of the Statistics Act, which apparently gives it the right to any document held by a business. I don’t know if the Charter over-rides it. It may have to be tested in court. Privacy Commissioner Therrien has launched an investigation, which gives him the authority to look into whether what StatsCan wants complies with the law.

 

  • (Sandra speaking)  Another thing that causes me great concern:  Lockheed Martin is in the business of international surveillance.  Travel expense claims found on the StatsCan website show that in 2009 and 2010, Peter Morrison, Assistant Chief Statistician, traveled to meetings of the International Census Forum, for which Lockheed Martin provides the “steerage”.  The idea that Lockheed Martin no longer has a presence in the data collection at StatsCan is highly suspect.   The question is addressed in  2016-03-18  Does Lockheed Martin Corp have a role in the 2016 Census?

BTW:  Edwin Black’s book “IBM and the Holocaust” is to me an interesting and compelling story.  There’s a bit about it in  The role of mechanized census data in Nazi Europe.  Between it, what is known about the Stasi files on citizens in East Germany, and other police states, Canadians would have to be ignorant and stupid not to stand up against what StatsCan is trying to accomplish.   We have an eloquently stated Charter Right to Privacy of Personal Information, FOR A REASON.

People need to understand that there’s a difference between a business having your personal information, and a Government.   A Government, through the laws it enacts, through what laws it chooses to enforce, through the Police and the Courts, may have enormous power over the individual.  That power is not always benignly and justly exercised.   Reassurances from politicians and bureaucrats don’t cut the mustard.  It’s like the sweet-talking guy taking the sweet young thing, newly met, for a ride in his car.  She trusts him, stupidly, and gets raped.  I’d like to think I’m not that stupid, and collectively Canadians aren’t that gullible.   The why’s and wherefore’s of the Charter Right to Privacy of Personal Information, the importance of it, are not well enough known by Canadians.

= = = = = = = = = = =

RELATED POSTINGS

2018-11-16  the BLIND SPOT in Privacy Commissioner’s investigation of StatsCan (getting personal data from the private sector)

2018-11-13   Canadians strongly oppose Statscan’s plan to obtain the banking records of 500,000 households: poll. Globe & Mail.

2018-11-13  Blind men describing elephant: Reply to “I wish I could persuade you that everyone gains from what is being proposed” by StatsCan (collection of data from Banks)

2018-11-12    My reply to “StatsCan plan to scoop customer spending data from banks”

2018-11-11  The law that lets Europeans take back their data from big tech companies, CBS 60 Minutes.

2018-11-08  Senator ‘repelled’ by StatsCan plan to scoop customer spending data from banks, IT World Canada

2018-11-06   News Release from Senate of Canada: Senate committee to probe Statistics Canada’s request for Canadians’ banking data

2016-08-23  MK Ultra: CIA mind control program in Canada (1980) – The Fifth Estate

Nov 11, 2018
 

(NOTE:  List of RELATED postings at bottom)

 

https://sencanada.ca/en/newsroom/banc-senate-committee-to-probe-statistics-canada-request-for-canadians-banking-data/

News Release

Senate committee to probe Statistics Canada’s request for Canadians’ banking data

November 6, 2018


Ottawa – Following confirmation that Statistics Canada intends to compel major financial institutions to provide detailed customer banking information, the Senate Committee on Banking, Trade and Commerce announced Monday that it will hold at least one hearing to hear views on whether such information should be required by Statistics Canada.

A recent Global News report revealed that Statistics Canada is seeking full details of every banking transaction made by 500,000 Canadians over a designated period, without their consent. Statistics Canada told Global that responses to its surveys are low; the agency said the data will be used to track household spending and consumer trends, and that the data collected will be made anonymous. Former Chief Statistician Wayne Smith told media he believes the agency may have overreached.

The committee intends to invite the minister responsible for Statistics Canada, Navdeep Bains, Privacy Commissioner Daniel Therrien, Statistics Canada Chief Statistician Anil Arora, and representatives from the Canadian Bankers Association, among others, to answer the committee’s questions.

The revelation of Statistics Canada’s request comes as the committee completed its cyber security study, cyber.assault: it should keep you up at night. The report delves into issues pertaining to the protection of personal information. It also made recommendations to give greater powers to the Office of the Privacy Commissioner of Canada to ensure that businesses comply with relevant privacy legislation, and that federal departments and agencies be required to report data breaches to the Privacy Commissioner.

The committee’s first hearing on this matter is expected to take place on November 8, 2018.

Quick Facts

  • According to the Global News story, Statistics Canada is asking for banks to provide 500,000 Canadians’ financial transaction data, with a new sample of Canadians to be chosen every year.
  • In that story, the Office of the Privacy Commissioner of Canada revealed it had already been in contact with Statistics Canada after businesses expressed concerns over its request for customer data. In a statement released on October 31, Statistics Canada said it has invited the Privacy Commissioner to provide “additional suggestions” to protect Canadians’ personal information.
  • Section 13 of the Statistics Act says the person in charge of records “in any department … corporation, business or organization” must provide access to that information to a person authorized by the Chief Statistician.

Quotes

“It makes me uncomfortable to think that banks may be forced to turn over every single financial transaction a person makes. While I don’t question the good intentions of the dedicated professionals at Statistics Canada, I would like to have more than just assurances that the intimate, personal details of Canadians’ lives will be protected.”

– Senator Doug Black, QC, Chair of the committee.

“Our latest report on cyber security shows just how vulnerable we can be to data theft. I want to know more about the rationale for Statistics Canada’s request and what security measures will be put in place to protect Canadians’ data and privacy.”

– Senator Carolyn Stewart Olsen, Deputy Chair of the committee.

Associated Links

 

For more information, please contact:

Sonia Noreau
Public Relations Officer
Communications Directorate
Senate of Canada
613-614-1180 | sonia.noreau@sen.parl.gc.ca