Submission on Bill C-11, the Digital Charter Implementation Act, 2020 to the House of Commons Standing Committee on Access to Information, Privacy and Ethics
Letter to Chris Warkentin, M.P. and Chair of the Standing Committee on Access to Information, Privacy and Ethics
Dear Mr. Chair:
I am pleased to share with you and your colleagues a submission from the Business Council of Canada on Bill C-11, the Digital Charter Implementation Act, 2020.
We recognize that Parliament has yet to determine which committee of the House of Commons will study Bill C-11. However, considering the submissions of other groups to the Standing Committee on Access to Information, Privacy and Ethics, we felt it was important to share our perspective on this important bill.
Our submission proposes targeted amendments to Bill C-11. In our view, such changes would ensure that the bill achieves the Government’s objective of protecting consumers while enabling the responsible collection, use, and disclosure of personal information for legitimate commercial purposes.
We trust that our submission will assist you in your deliberations.
c.c. Ms. Sherry Romanado, M.P., Chair, Standing Committee on Industry, Science and Technology
Submission on Bill C-11, the Digital Charter Implementation Act, 2020
The Business Council of Canada welcomes the opportunity to participate in Parliament’s review of Bill C-11, the Digital Charter Implementation Act, 2020.
This submission proposes targeted amendments to Bill C-11. In our view, such changes would ensure that the bill achieves the Government’s objective of protecting consumers while enabling the responsible collection, use, and disclosure of personal information for legitimate commercial purposes.
The emerging data-driven economy offers tremendous opportunities to improve Canadians’ quality of life and to strengthen our country’s ability to compete globally. Canadian companies are up to the challenge. In 2018, Statistics Canada estimated that Canadian organizations invested as much as $40 billion in data, databases, and data science.
At the same time, business leaders recognize that Canada will not develop a healthy and innovative data-driven economy without a solid foundation of consumer trust and confidence. This is why employers and entrepreneurs support the adoption of a privacy framework that maintains high levels of consumer protection.
Canada was an early leader in protecting consumers’ privacy. But the world has changed since the drafting of its current consumer privacy law, the Personal Information Protection and Electronic Documents Act (“PIPEDA”), in the late 1990s. As this Committee has repeatedly made clear, there is room to modernize and strengthen PIPEDA in ways that better protect consumers’ personal information while enabling greater innovation.
The Government’s response to PIPEDA’s shortcomings is Bill C-11, the Digital Charter Implementation Act, 2020. Through Bill C-11, the Government intends to replace PIPEDA’s privacy provisions with two modernized statutes: the Consumer Privacy Protection Act (“CPPA”) and the Personal Information and Data Protection Tribunal Act (“PIDPTA”).
Targeted amendments are needed to promote greater productivity, innovation, and economic growth:
Bill C-11 represents a significant improvement over PIPEDA. Bill C-11 addresses many of PIPEDA’s weaknesses, including through the incorporation of measures expressly recommended by this Committee.
Committee recommendations that became part of Bill C-11 include stronger oversight and enforcement powers; additional transparency obligations; and a new data portability right, which will unlock unrealized economic potential by enhancing consumer choice.
The Business Council believes strongly that Bill C-11 represents a responsible approach to privacy regulation. At the same time, we believe that the bill requires targeted, but important changes to support greater productivity, innovation, and economic growth. We urge parliamentarians to address the following concerns:
1. The CPPA should permit the responsible use of anonymized information:
The CPPA treats all de-identified information, including anonymized information, as being subject to the CPPA’s strict privacy requirements unless the use of that information falls within a narrowly defined exception.
By applying the CPPA to anonymized information in this highly restrictive manner, the CPPA prohibits the use of such information for a wide variety of legitimate purposes that are critical to everyday business operations. The CPPA’s treatment of anonymized information also makes the CPPA inconsistent with provincial privacy legislation and the privacy laws of Canada’s most important trading partners. In turn, the CPPA fragments Canada’s privacy framework and could drive legitimate business activities to trading partners where the use of anonymized information is less restricted.
Business leaders fully understand the need to protect consumers’ privacy. Consumer trust and confidence are the bedrock of a dynamic data-driven economy. However, they question the need for such a restrictive approach given the low privacy risks associated with the use of anonymized information and the stiff penalties that organizations would face for identifying an individual using anonymized information.
2. The CPPA’s algorithmic transparency obligations are unduly broad:
The CPPA requires organizations to provide a “general account” of their use of “automated decision systems” to make predictions, recommendations, or decisions about consumers that could have significant impacts on them.
In addition, the CPPA requires that if an organization uses an “automated decision system,” the organization must, on request, provide an “explanation” of the prediction, recommendation, or decision and how the personal information used to make the prediction, recommendation, or decision was obtained.
The CPPA defines an “automated decision system” to include “any technology that assists or replaces the judgement of human decision-makers using techniques such as rules-based systems, regression analysis, predictive analytics, machine learning, deep learning and neural nets.”
The CPPA’s algorithmic transparency obligations are drafted far too broadly, departing significantly from the standards set forth in the privacy laws of Canada’s major trading partners. For instance, the CPPA’s transparency obligations contain none of the fair and reasonable limitations imposed by the European Union’s General Data Protection Regulation (“GDPR”). The scope of the GDPR’s algorithmic transparency rules is limited to “decisions” made “solely” by an automated decision system producing “legal effects” or “similarly significant” impacts on consumers. Further, the GDPR requires only the disclosure of information about the “existence of automated decision making” and meaningful information “about the logic involved, as well as the significance and envisaged consequences” of the automated decision for consumers.
Without similar constraints, the CPPA’s transparency obligations are likely to extend far beyond the reasonable compliance capabilities of most organizations. The CPPA’s obligations could also have a wide range of unintended consequences. This could include organizations providing overly complex and lengthy explanations that consumers might find difficult to understand. Such rules could also reduce the incentive for companies to develop and adopt artificial intelligence systems if they believe that doing so could result in a requirement to disclose information of a commercial or proprietary nature, such as trade secrets.
3. Parliament can achieve compliance without the CPPA containing a private right of action:
The CPPA establishes a private right of action for consumers who are “affected” by an organization’s act or omission that constitutes a contravention or offence of the CPPA.
Business leaders question the need for a private right of action when Bill C-11 already creates powerful compliance incentives. Indeed, the CPPA sets out one of the world’s most robust and comprehensive oversight and enforcement regimes. This includes new order-making powers for the Privacy Commissioner as well as some of the world’s highest fines and penalties for non-compliance.
Establishing a private right of action is therefore unlikely to encourage further compliance. Instead, it could undermine Canadian innovation and entrepreneurship. At its worst, the right could expose even the most privacy-conscious companies to “company-ending” lawsuits. At its best, the liability risk and uncertainty created by the right could discourage companies from engaging in legitimate commercial activities. In fact, the right, as currently drafted, is likely to result in a surge of nuisance lawsuits brought by consumers claiming that they have been affected by an organization’s conduct, irrespective of whether they suffered actual loss or injury.
4. The CPPA should encourage private sector solutions to social challenges:
The CPPA would permit organizations to disclose de-identified information without consumers’ knowledge or consent to public sector institutions for “socially beneficial purposes.”
Restricting the use of de-identified information to public institutions is a missed opportunity. The CPPA’s current approach ignores the importance of industry-led solutions to social challenges and exaggerates the risk of re-identification. It also fails to recognize the role that data trusts can play in addressing residual privacy risks that might arise through private sector entities’ use of de-identified information.
In our view, the CPPA should support the disclosure of de-identified information to private sector institutions that are engaged in tackling Canada’s greatest social challenges, such as climate change or the next public health crisis. If Parliament deems it in the public interest to maintain control over the disclosure of such information to private sector institutions, the Privacy Commissioner could be granted a right to disallow individual disclosures. The Privacy Commissioner could be authorized to exercise this right if the Privacy Commissioner reasonably determined that the disclosure could adversely affect a consumer’s privacy rights.
5. It is unreasonable to require that organizations de-identify personal information that they use and disclose for prospective business transactions:
PIPEDA places limits on the use and disclosure of personal information for prospective business transactions. These safeguards have worked well. The CPPA adds unnecessarily to these obligations by requiring that such personal information be de-identified before it is used or disclosed until the transaction is completed.
This requirement would impose an unreasonable burden on organizations and could harm future business transactions. In practice, organizations would be required to determine, for each document that might be disclosed, whether information in that document could re-identify a consumer if combined with other internally or externally available data.
The de-identification requirement for prospective business transactions should therefore be removed and the current framework for prospective business transactions under PIPEDA should remain intact.
6. The Personal Information and Data Protection Tribunal requires a greater proportion of members with specialized knowledge:
The PIDPTA would create a tribunal composed of three to six members. Only one of these people would be required to have expertise in the field of information and privacy law. This is concerning. A greater proportion of the tribunal’s membership should be specialists given the growing complexity of information and privacy law and the tribunal’s important oversight and enforcement functions.
7. The CPPA’s punitive fines would hamstring businesses and innovators:
Enhanced oversight and enforcement of privacy rules are essential to establishing a data-driven economy that inspires consumer trust and encourages technological adoption. However, the CPPA’s financial levies are excessive, to the point where they would undermine valuable innovation and legitimate commercial activity.
For contraventions of the CPPA, organizations could face administrative monetary penalties as high as $10 million or 3% of their global revenue. If convicted of an offence, organizations could also incur fines as high as $25 million or 5% of their global revenue.
Such levies are unduly punitive and fail to align with the fines found in the privacy laws of Canada’s major trading partners. The European Union caps levies for less serious violations of the GDPR at €10 million or 2% of global revenues. Levies for more serious violations are limited to €20 million or 4% of global revenues.
The levies set out in the CPPA would increase Canadian businesses’ compliance costs relative to their international peers. They would also cause organizations to become unduly risk averse at a time when one of the objectives of public policy should be to promote private sector initiative and innovation.
To ensure that the magnitude of penalties and fines imposed under the CPPA is reasonable and appropriate in the circumstances, they should be brought in line with the levies set out in the privacy laws of Canada’s major trading partners. The tribunal should also be required to consider a non-exhaustive list of considerations, including any mitigating factors, when determining the amount of a penalty or fine.
8. The CPPA’s cross-border disclosure obligations are unduly burdensome:
The CPPA requires organizations to disclose if they carry out any international or interprovincial cross-border data transfers or disclosures “that may have reasonably foreseeable privacy implications.”
On its face, this requirement may appear reasonable. However, to meet this requirement, organizations must first evaluate the privacy implications of each of their cross-border data transfers and disclosures.
A typical Canadian company transfers or discloses the personal information of its customers, employees, and suppliers hundreds of times every day to multiple service providers scattered across the globe. Given the frequency with which businesses engage in cross-border transfers and disclosures during everyday operations, performing such resource-intensive analysis would significantly increase the cost of doing business in Canada.
9. Businesses should be given adequate time to adjust to privacy law changes:
Parliament should fix the date for Bill C-11’s coming into force to a day 24 months following Royal Assent. This will provide businesses with adequate time to bring their practices into compliance with the law and avoid unduly burdening businesses with new obligations during the COVID-19 pandemic. This implementation period is consistent with the period adopted by Canada’s major trading partners, including the European Union.
10. The CPPA should “grandfather in” PIPEDA-compliant consents:
The CPPA should include a “grandfathering” clause so that PIPEDA-compliant consents obtained prior to the coming into force of the CPPA continue to be valid. Without sufficient grandfathering, organizations will experience significant difficulties in being able to continue to serve consumers, and consumers will be inundated with re-consenting campaigns.
Privacy modernization should move forward in a timely fashion:
We recognize that COVID-19 has imposed tremendous pressures on policymakers and elected officials. At the same time, we believe strongly that Canadians cannot afford to see Bill C-11 die on the Order Paper.
Canadian workers and businesses depend on access to foreign markets. In less than a year’s time, Canadian firms could be locked out of the European Union if the protections put in place by Bill C-11 are not enacted and deemed “adequate” by the European Commission. The resulting loss of access to Canada’s second-largest export market would put jobs at risk and could encourage the flight of Canadian investment, talent, and ideas.
Canadian businesses are equally worried about internal trade barriers. If Parliament abdicates its leadership role over the privacy domain, provinces may be forced to enact or amend their own privacy laws. The result would be a “spaghetti bowl” of inconsistent privacy regulations across the country. Incompatible privacy rules would create confusion and security gaps for consumers, undermine business certainty and confidence for investors, and create prohibitive compliance obligations for employers and entrepreneurs.
Business leaders stand ready to support parliamentarians:
Canada’s business leaders share your goal of a healthy and innovative data-driven economy. Since the launch of our Data Driven initiative in 2018, we have consulted entrepreneurs, innovators, and policy experts across the country to advance policies that protect consumers, promote innovation, and strengthen Canadians’ trust and confidence in the emerging digital economy.
As Bill C-11 works its way through Parliament, we will continue to partner with policy makers and elected officials to ensure that Canada addresses the challenges and realizes the opportunities of our increasingly data-driven world.
Thank you for your service to Canadians and for your commitment to privacy modernization.
February 18, 2022