
Submission of the Office of the Privacy Commissioner of Canada on Bill C-11, the Digital Charter Implementation Act, 2020


May 11, 2021

Mr. Chris Warkentin, M.P.
Chair, Standing Committee on Access to Information, Privacy and Ethics
Sixth Floor, 131 Queen Street
House of Commons
Ottawa ON K1A 0A6

Dear Mr. Chair:

Subject: Submission on C-11

Further to my appearance before you on May 10, 2021, please find enclosed our submission on Bill C-11, the Digital Charter Implementation Act, 2020.  I hope these materials will assist your deliberations on this important piece of privacy legislation.

As I indicated when I appeared before you in the context of the Main Estimates and your study of Facial Recognition Technology, I believe that C-11 represents a step back overall from our current law and needs significant changes if confidence in the digital economy is to be restored. My submission outlines numerous enhancements that are required to help ensure that organizations can responsibly innovate in a manner that recognizes and protects the privacy rights of Canadians.

My opening message provides an overview of our position, while the rest of the document contains a detailed analysis of the bill and recommendations that we believe are necessary.

I am also including an analysis paper prepared for my Office by Dr. Teresa Scassa on the problems with how Bill C-11 would address the issue of trans-border transfers of personal information. It identifies key provisions of C-11 that relate to trans-border data transfers, critically analyzes the extent to which these provisions would substantively protect privacy and offers a series of recommendations for improvement.

I hope these materials will be useful for the Committee.  I remain available to meet with Parliament on this important Bill at its convenience.

Sincerely,

(Original signed by)

Daniel Therrien
Commissioner

encl. (1)

c.c.: The Honourable François-Philippe Champagne, P.C., M.P.
Minister of Innovation, Science and Industry

Ms. Miriam Burke
Clerk of the Committee


Commissioner’s message

Bill C-11, which enacts the Consumer Privacy Protection Act (CPPA) and the Personal Information and Data Protection Tribunal Act (PIDPTA), is an important and concrete step toward privacy law reform in Canada. Arising from the 2019 Digital Charter, and following years of Parliamentary studies, Bill C-11 represents a serious effort to realize the reform that virtually all – from Parliamentarians, to industry, privacy advocates, and everyday Canadians – have recognized is badly needed. It was an ambitious endeavour to completely restructure the existing Act. We are pleased to see that the law reform process appears to be truly underway.

The Bill completely rewrites the current law and seeks to address several of the privacy concerns that arise in a modern digital economy. It promises more control for individuals and much heavier penalties for organizations that violate privacy, while offering companies a legal environment in which they can innovate and prosper.

We agree that a modern law should both achieve better privacy protection and encourage responsible economic activity, which, in a digital age, relies on the collection and analysis of personal information. However, despite its ambitious goals, our view is that in its current state, the Bill would represent a step back overall for privacy protection. This outcome can be reversed, and the Bill could become a strong piece of legislation that effectively protects the privacy rights of Canadians, with a number of important amendments under three themes:

  • a better articulation of the weight of privacy rights and commercial interests;
  • specific rights and obligations;
  • access to quick and effective remedies and the role of the OPC.

Why do I say that the Bill as drafted would represent a step back? In general terms, because the Bill, although seeking to address most of the privacy issues relevant in a modern digital economy, does so in ways that are frequently misaligned with, and less protective than, the laws of other jurisdictions. Our recommendations would lead to greater alignment.

More specifically, I say the Bill as drafted would be a step back overall because the provisions meant to give individuals more control give them less; because the increased flexibility given to organizations to use personal information without consent does not come with the additional accountability one would expect; because administrative penalties would not apply to the most frequent and important violations, those relevant to consent and exceptions to consent; and because my Office would not have the tools required to manage its workload and prioritize the activities that are most effective in protecting Canadians. In fact, the OPC would work under a system of checks and balances (including a new administrative appeal) that would unnecessarily stand in the way of quick and effective remedies for consumers.

Poll after poll suggests there is currently a trust deficit in the digital economy. Improving trust is one of the objectives of the Digital Charter that Bill C-11 seeks to implement. After years of self-regulation, or permissive regulation, polls also suggest that restoring trust requires more regulation (objective and knowable standards adopted democratically) and oversight (application of these standards by democratically appointed institutions). The regulation required is sensible legislation that allows responsible innovation that serves the public interest and is likely to foster trust, but that prohibits using technology in ways that are incompatible with our rights and values.

Oddly, the government’s narrative in presenting the Bill, while positive in many respects, focused on the need for “certainty” and “flexibility” for businesses and the need for “checks and balances” on the regulator. Unfortunately, it appears this was not a slip of the tongue, as we see that philosophy reflected in several provisions of the Bill.

The OPC welcomes transparency and accountability for its actions, and we agree businesses need some level of certainty and flexibility, within the law. But the focus on checks and balances for the regulator and more certainty and greater flexibility for businesses seems misplaced. It leads to the flaws identified earlier and to an imbalance in the law on the importance of rights and commercial interests.

Better articulation of the weight of rights and commercial interests

Digital technologies are at the heart of the fourth industrial revolution and modern economies. As we have seen in the current pandemic, they can serve the public interest. This includes economic prosperity.

For both good and bad, these technologies are disruptive. They have been shown to pose major risks for privacy and other rights. Data breaches have become routine. There is increasing talk of surveillance capitalism – this, a few years after the Snowden revelations of state surveillance. Biometrics heightens those risks. More recently, the Cambridge Analytica scandal highlighted the risks for democracy. Artificial intelligence brings risks to equality rights. And on and on.

Ultimately, it is up to parliamentarians, as elected representatives of the population, to decide how much weight to give to privacy rights and the interests of commercial enterprises.

My Office has argued for a modernization of laws that would give organizations greater flexibility to use personal information without consent for responsible innovation and socially beneficial purposes, but within a legal framework that would entrench privacy as a human right and as an essential element for the exercise of other fundamental rights.

The Bill maintains that privacy and commercial interests are competing interests that must be balanced. In fact, the Bill arguably gives more weight to commercial interests than the current law by adding new commercial factors to be considered in the balance, without adding any reference to the lessons of the past twenty years on technology’s disruption of rights.

The courts have held that PIPEDA’s purpose clause, without the new commercial factors added in Bill C-11, means privacy rights must be “reconciled” with commercial interests. This is a reasonable interpretation of the direction given to courts and the regulator by Parliament when it enacted PIPEDA in 2000.

Parliamentarians now have a chance to confirm or amend this direction. There is no dispute that the CPPA should both promote rights and commercial interests. The question is what weight to give to each.

In my view, it would be normal and fair for commercial activities to be permitted within a rights framework, rather than placing rights and commercial interests on the same footing. Generally, it is possible to concurrently achieve both commercial objectives and privacy protection. However, when there is a conflict, I believe rights should prevail. The recent Clearview matter is a good example of that principle.

To adopt a rights-based approach would also send a powerful message as to who we are and what we aspire to be as a country. The Canadian Charter of Rights and Freedoms is an integral part of our character and Canada is a signatory to international instruments that recognize privacy as a human right. We are a bijural country, in which the common law and civil law systems coexist in harmony. In Quebec, existing privacy laws seek to implement the right to privacy protected in the Civil Code and the Quebec Charter of Human Rights and Freedoms. Bill 64 would further protect privacy as a human right. Adopting a rights-based approach in the CPPA, including some elements of Bill 64’s provisions, would reflect Canada’s bijural nature.

Canada also aspires to be a global leader in privacy and it has a rich tradition of mediating differences on the world stage. Adopting a rights-based approach, while maintaining the principles-based and not overly prescriptive approach of our private sector privacy law, would situate Canada as a leader showing the way in defining privacy laws that reflect various approaches and are interoperable.

Our detailed submissions comment further on this and include a new preamble and amendments to sections 5, 12 and 13 of the proposed CPPA.

Specific rights and obligations

Again, I refer you to our detailed submissions for a fuller analysis of this theme. Let me now focus on consent, exceptions thereto and accountability.

(i) Valid vs meaningful consent

The Bill seeks to give consumers more control over their personal information. It does this by prescribing elements that must appear in a privacy notice, in plain language. This is similar to the approach taken in our 2018 Guidelines for obtaining meaningful consent, but with an important omission. Bill C-11 leaves out a crucial aspect of meaningful consent under the current law (s. 6.1 of PIPEDA): “the consent of an individual is only valid if it is reasonable to expect that an individual to whom the organization’s activities are directed would understand the nature, purpose and consequences of the collection, use or disclosure of the personal information to which they are consenting.” (my emphasis)

By prescribing elements of information to appear in privacy notices without maintaining the requirement that consumers must be likely to understand what they are asked to consent to, the CPPA would give individuals less control, not more.

This is exacerbated by the open-ended nature of the purposes for which organizations may seek consent. PIPEDA currently requires that purposes be “explicitly specified” and be legitimate. This is consistent with the laws of most other jurisdictions, which prescribe that purposes must be defined “explicitly”, or even “as explicitly as possible”. This limitation, which helps consumers understand what they are consenting to, is also omitted from Bill C-11. As a result, organizations could conceivably seek consent for vague and mysterious purposes, such as “improving your consumer experience”.

Finally on this point, while s. 15(4) of the CPPA would make express consent the rule, this provision would allow an organization to rely on implied consent where it “establishes” (in French, “conclure” or concludes) that this would be appropriate, in light of certain factors. Bill C-11 therefore seems to give deference to an organization’s conclusion that implied consent is appropriate, as opposed to prescribing an objective assessment of the relevant factors. This is another manifestation of the philosophy where businesses would be given certainty and flexibility, rather than be subject to objective standards and oversight. A simple amendment, striking the words “the organization establishes that” in s. 15(4), would solve the problem and would fully implement the recommendation made by the ETHI Committee in 2018 (Footnote 1).

(ii) Exceptions to consent and accountability

The CPPA would add important new exceptions to consent. We think this is appropriate in a modern privacy law.

Among the lessons of the past twenty years is that privacy protection cannot hinge on consent alone. Simply put, it is neither realistic nor reasonable to ask individuals to consent to all possible uses of their data in today’s complex information economy. The power dynamic is too uneven.

In fact, consent can be used to legitimize uses that, objectively, are completely unreasonable and contrary to our rights and values. In these circumstances, consent rules do not protect privacy but contribute to its violation. This also leads to the trust deficit affecting the digital economy.

Several of the new exceptions to consent brought by Bill C-11 are reasonable. We have two main concerns: some exceptions are unreasonably broad; and the Bill fails to associate greater authority to use personal information with greater accountability by organizations for how they rely on these broader permissions.

Paragraphs 18(2)(b) and (e) of the CPPA are too broad. The first can likely be narrowed, but the second should be repealed. We can find no reasonable justification for an exception to consent based on the impracticability of obtaining consent. This would make the rule (consent) completely hollow. There may be some specific activities (for instance, those of search engines) that should be permitted because of the usefulness of their service, even though consent may be impracticable. Or, as recommended in our recent paper on artificial intelligence, the CPPA could include a consent exception for “legitimate business purposes”, but only within a rights-based privacy law.

With Bill C-11, organizations would have much wider permission to collect, use and disclose the personal information of consumers, without consent. Or, put differently, the Bill recognizes that consent is often a fiction and tries to find ways to allow but regulate modern business operations that “(rely) on the analysis, circulation and exchange of personal information” (s. 5), where consent is neither reasonable nor realistic.

Creating newer and broader exceptions to consent means that the law would place less weight on individual control as a means to protect privacy. This form of protection should be replaced by others. Greater permission to use data should come with greater accountability for organizations. There is a consensus on this point in the privacy community, even among industry representatives.

Yet the CPPA would not enhance PIPEDA’s principle of accountability; it would arguably weaken it, in part by defining accountability in descriptive rather than normative terms. Accountability would not be translated, as in other laws and the OPC guidelines, into policies and procedures that ensure (normative goal) compliance with the law, but rather into whatever policies and procedures an organization decides to put in place (descriptive) to fulfil its obligations. Again, certainty and flexibility for businesses, rather than standards and oversight.

We have argued for some time that in the current digital economy, based on complex technologies and business models which are difficult if not impossible to understand for consumers, the OPC as expert regulator should have the authority to proactively inspect, audit or investigate business practices to verify compliance with the law, without prior evidence or grounds that the law has been violated. This ability to “look under the hood” of these complex technologies and business models, not arbitrarily but based on our expert assessment of privacy risks, and subject to judicial review, is in our view a necessary element of a modern privacy law.

These provisions exist in the privacy laws of Quebec and Alberta and in those of several foreign jurisdictions, including common law countries such as the United Kingdom, Australia and Ireland, and are proposed by the Department of Justice in its latest consultation paper on Privacy Act reform. They would ensure that organizations are held accountable for the way in which they use the increased flexibility to collect, use and disclose the personal information of consumers. For instance, they would ensure that automated decision-making systems and artificial intelligence are developed and applied in a privacy-compliant manner. They would also help address the concerns of Canadians that underlie the deficit of trust in the digital economy.

Finally, the CPPA’s provisions on accountability should explicitly include a requirement that organizations apply Privacy by Design, as recommended in ETHI’s 2018 report, and that privacy impact assessments (PIAs) be prepared for higher-risk activities. Requiring PIAs for all activities involving personal information would create an excessive burden on organizations, particularly SMEs. But Privacy by Design and PIAs are important because they protect privacy proactively. Compliance with the law cannot rest only on investigations and penalties. Proactive strategies are equally important, and in our view more so, in achieving ongoing compliance and respect for the rights of consumers.

Access to quick and effective remedies and the role of the OPC

The CPPA would give the OPC order-making powers and allow the OPC to recommend the imposition of very large penalties on organizations that violate the law, but these provisions are subject to limitations and conditions such that consumers would not have access to quick and effective remedies. To achieve this objective would require important amendments to the Bill.

(i) Limits on violations subject to administrative penalties

The most striking limitation on penalties is found in s. 93(1) of the CPPA, which lists only a small number of violations as subject to administrative penalties. This list does not include obligations related to the form or validity of consent, nor the numerous exceptions to consent, which are at the core of protecting personal information. Nor does it include violations of the principle of accountability, which is supposed to be an important counterbalance to the increased flexibility given to organizations in the processing of data.

Only criminal penalties would be available for violations of these rights and obligations, following a process that in our view would take seven (7) years on average. This process would include an order made by the OPC and a refusal to comply by the organization. With the amendments we recommend, the process could take fewer than two (2) years. Notably, we recommend that most if not all violations of the CPPA be subject to administrative penalties, following a notice by the OPC giving the organization a last opportunity to comply with the law. Criminal sanctions would be reserved for the most egregious violations.

(ii) The Personal Information and Data Protection Tribunal

Among the checks and balances imposed on the OPC would be the creation of an additional layer of appeal in the form of the Tribunal. According to the government, this would ensure both fairness to organizations and access to quick and effective remedies for consumers.

To reiterate, the OPC welcomes accountability for its actions. We respectfully suggest that the new Tribunal is both unnecessary to achieve greater accountability and fairness (a role already fulfilled by the Federal Court), and counter-productive in achieving quick and effective remedies. We recommend that this new layer not be added to a process that can already be quite long. However, should Parliament decide that the new Tribunal would add value, we recommend that its composition be strengthened and that appeals from its decisions go directly to the Federal Court of Appeal.

While our submissions elaborate on our analysis of this issue, I wish to emphasize a few points here. First, such an administrative layer between the privacy regulator and the courts does not exist in other jurisdictions. Second, the experience of these jurisdictions, including some Canadian provinces, shows that effective structures can be created within data protection authorities to enhance fairness through the separation of enforcement and adjudicative functions. Third, the OPC is already subject to judicial review, and only once in its almost 40-year history has a decision it made been found not compliant with natural justice.

Fourth, and probably most important, the fact that the OPC would not be authorized to impose administrative penalties, and that its orders would be subject to appeal to another administrative structure before reaching the courts, would reduce the incentive that organizations have, under the model in place in other jurisdictions, to come to a quick agreement with the regulator. In those jurisdictions, where the data protection authority is the final administrative adjudicator and can impose financial penalties, organizations have an interest in coming to a negotiated settlement when, during an investigation, it appears likely a violation will be found and a penalty may be imposed. Unfortunately, the creation of the Tribunal would likely incentivize organizations to “play things out” through the judicial process rather than seek a negotiated settlement with the OPC, thus depriving consumers of quick and effective remedies. Sadly, but truly, justice delayed is justice denied.

(iii) Giving the regulator tools to be effective in protecting consumers

Bill C-11 would impose several new responsibilities on the OPC, including the obligation to review codes of practice and certification programs and to advise individual organizations on their privacy management programs. We welcome the opportunity to work with businesses in these ways to ensure their activities comply with the law. However, adding new responsibilities to an already overflowing plate means the OPC would not be able to prioritize its activities, based on its expert knowledge of evolving privacy risks, to focus on what is likely most harmful to consumers.

The issue here is not primarily money, although in our view additional resources will be required. The issue is whether the OPC should have the legal discretion to manage its caseload, respond to the requests of organizations and complaints of consumers in the most effective and efficient way possible, and reserve a portion of its time for activities it initiates, based on its assessment of risks for Canadians.

An effective regulator is one that prioritizes its activities based on risk. No regulator has enough resources to handle all the requests it receives from citizens and regulated entities. Yet Bill C-11 adds responsibilities, including the obligation to decide complaints before consumers may file a private right of action, imposes strict time limits to complete our activities, and adds no discretion to manage our caseload. This is not only untenable for us as an organization; it would also deprive us of a central tool to ensure we can be effective in protecting Canadians.

We therefore make a number of recommendations under this theme, to ensure we can both be responsive, to the extent our resources allow, to individual requests made by complainants and organizations, and effective as a regulator for all Canadians.

Conclusion

The past few years have opened our eyes to the exciting benefits and worrying risks that new technologies pose to our values and to our rights. The issues we face are complex but the path forward is clear. As a society, we must project our values into the laws that regulate the digital space. Our citizens expect nothing less from their public institutions. It is on this condition that confidence in the digital economy, damaged by numerous scandals, will return.

