The Canadian Government Undertakes A Second Effort At Comprehensive Reform To Federal Privacy Law – Privacy Protection – Canada

On June 16, 2022, the Canadian government tabled a second
attempt to reform Canadian privacy law in Bill C-27, the Digital Charter
Implementation Act, 2022. 1 While 2020's Bill C-11 2 sought to enact
comprehensive reform of the federal private-sector privacy law,
many criticized it as too timid an attempt. The former
Privacy Commissioner of Canada and the Ontario government were particularly critical of Bill C-11, with
former Commissioner Therrien calling it "a step back overall
from our current law" and "need[ing] significant changes" to
restore confidence in the digital economy. Nonetheless, apart from
the new artificial intelligence framework, Bill C-27 is derived
from Bill C-11, though with much-needed clarifications, and will
likely face some of the same criticism from those who took issue
with Bill C-11 (see a comparison between Bill C-27 and Bill C-11 here).

Like its predecessor, the 2022 Act proposes to:

  • Enact the Consumer Privacy Protection Act (CPPA) to
    replace Part 1 of the Personal Information Protection and
    Electronic Documents Act
    (PIPEDA), which is the part of PIPEDA
    that addresses privacy in the private sector; and

  • Enact the Personal Information and Data Protection Tribunal Act
    (Tribunal Act) establishing the Personal Information and
    Data Protection Tribunal (Tribunal), which would hear
    recommendations of and appeals from decisions of the Privacy
    Commissioner of Canada (Commissioner).

In addition, the 2022 Act would:

  • Enact the Artificial Intelligence and Data Act (AIDA)
    to regulate "artificial intelligence systems" and the
    processing of data in connection with artificial intelligence.

Significantly, the preamble to Bill C-27 states that the
protection of privacy interests "is essential to individual
autonomy and dignity and to the full enjoyment of fundamental
rights and freedoms in Canada."

  • As in the former reform effort, the new CPPA retains the
    principles-based approach of PIPEDA but integrates and adds to
    those principles directly in the act rather than setting them out
    in a schedule as PIPEDA does.

Relative to the version set out in Bill C-11, the 2022
iteration of the CPPA also:

  • Deems the personal information of minors to be sensitive
    personal information and provides additional protections for the
    personal information of minors;

  • Introduces a "legitimate interest" exception to
    consent, together with revisions to Bill C-11's "business
    activities" exception;

  • Clarifies that the "manner" of collecting, using, and
    disclosing personal information, in addition to the purpose for
    doing so, must be appropriate in the circumstances, regardless of
    whether consent is required under the CPPA;

  • Introduces a definition of "anonymize" and clarifies
    that de-identified information is personal information subject to
    the CPPA (with exceptions) and that anonymized information is not;

  • Provides that retention periods must take into account the sensitivity
    of personal information and that security measures include
    reasonable authentication measures;

  • Limits the requirement to provide explanations of automated
    decision-making to situations where it "could have a significant
    impact on individuals";

  • Expands the circumstances where de-identified information may be used;

  • Modifies the right of disposal to apply to the personal
    information an organization "controls" rather than
    personal information it "has collected from individuals"; and

  • Expands the circumstances under which the Commissioner may
    recommend that a penalty be imposed.

The CPPA in Bill C-27 retains the regulatory tools to address
compliance and the much more severe remedies for non-compliance
introduced in 2020's Bill C-11:

  • New powers for the Commissioner, including audit and
    order-making powers;

  • The potential for the Commissioner to recommend, and for the
    Tribunal to impose, penalties of up to the greater of $10 million or
    3% of an organization's annual global revenues;

  • Significantly expanded offences, with fines of up to the greater of
    $25 million or 5% of annual global revenues; and

  • A private right of action to enable recourse to the courts in
    certain circumstances.

Bill C-27's most notable divergence from the 2020 reform
effort is the new AIDA, which regulates the design, development,
and use of AI systems. AIDA sets out positive requirements for AI
systems as well as monetary penalties and criminal prohibitions on
certain unlawful or fraudulent conduct in respect of AI systems.

Like the European Union's proposed Artificial Intelligence Act, AIDA is
risk-based and focuses on mitigating the risks of harm and bias in
the use of "high-impact" AI systems. However, AIDA is not
as prescriptive as the EU's proposed regulation, which sets out a more
detailed methodology for classifying "high-risk" AI
systems and expressly prohibits a broader range of harmful AI
practices, such as certain uses of biometric identification systems
by law enforcement. Still, both proposed laws aim to regulate AI in
a balanced manner that protects against individual harm without being
overly restrictive of technological development.

Bill C-27 is part of a trend toward stronger privacy rules
worldwide, the best-known example being the
adoption of the EU's General Data Protection Regulation
(Regulation (EU) 2016/679) (GDPR), with the trend reaching
Canada more recently through Québec's revised privacy law
(click here to read more about those changes). That said, there is
no guarantee that Bill C-27 will pass in its current form, and it
is very likely that amendments will be introduced when the bill is
considered by the relevant committee of the House of Commons.
Indeed, Bill C-27 introduces an entirely new artificial
intelligence framework and is susceptible to many of the same
criticisms as Bill C-11.

Like Bill C-11 before it, the Act is certain to attract very
strong attention from domestic and foreign organizations that
collect information about Canadians and are subject to Canadian
privacy law, or that operate artificial intelligence systems or
process data used in those systems, particularly in light of the
impacts of the COVID pandemic and the additional compliance costs
and risks of material liability that Bill C-27 represents.
Organizations and trade associations should consider the impact of
the Act and its evolution as it progresses through Parliament and
be prepared to propose improvements and to address any unintended
consequences of its reforms. Our Fasken privacy team will keep you
updated.

A high-level summary of key features of AIDA is set out below.
We have also updated our summary of the CPPA and the role of the
Tribunal based on the revisions in Bill C-27.

The Revised Consumer Privacy Protection Act (CPPA)


Like PIPEDA, the CPPA is consent-based, but it expands the
requirements for obtaining consent and the applicable exceptions to
consent.

Under the CPPA, to obtain valid consent, organizations must
notify individuals, in plain language, of the type of personal
information they collect, use, and disclose, and of the purposes,
manner, and consequences of such collection, use, and disclosure
before or at the time of collection. 3 Bill C-27
clarifies the plain-language requirement: organizations must
provide information in language that an individual would
reasonably be expected to understand. 4 Organizations
must also identify any third parties to whom personal information
will be disclosed.

The expanded exceptions to consent include:

  • The collection or use of personal information for certain
    business activities, including an activity required to
    provide a product or service to an individual, an activity that is
    necessary for the organization's information system or network
    security, or for the safety of a product or service that the
    organization provides, or any other prescribed activity, in each
    case provided the individual would expect the collection or use and
    it is not for the purpose of influencing the behaviour or
    decisions of the individual.

  • The collection and use of personal information for a
    legitimate interest "that outweighs any potential
    adverse effect on the individual," provided the individual
    would expect the collection or use and it is not for the purpose
    of influencing the behaviour or decisions of the individual. The
    use of the legitimate interest exception is subject to conditions,
    including that the organization identify any potential adverse
    effect on the individual and take reasonable measures to reduce or
    mitigate those effects. Organizations must also keep records with
    respect to the foregoing. 5

  • Public interest purposes as set out in the CPPA.

  • Transfers of personal information to service providers.

  • De-identifying personal information.

Finally, the CPPA requires express consent for certain
activities, namely the business activities listed above
and activities related to legitimate interests, to the extent that
those activities cannot benefit from the exceptions to consent for
those activities (e.g., the activities are for the purpose of
influencing behaviour or would not be expected by a reasonable
person).

Policies and Practices

The CPPA would require organizations, as part of their governance
framework, to consider the sensitivity of personal information when
determining its retention period. 6 Physical,
organizational, and technological security safeguards must include
reasonable measures to authenticate the identity of the individual.
7 The CPPA requires organizations to set out in plain
language their policies and practices regarding the protection of
personal information, which must be readily available and
indicate:

  • What type of personal information the organization collects;

  • How the organization uses it and how it applies the exceptions
    to the requirement to obtain an individual's consent, including
    where it invokes a "legitimate interest";

  • Whether it uses automated decision-making about individuals
    that could have a significant impact on them;

  • Whether it transfers personal information outside Canada or
    interprovincially in a manner that may have foreseeable privacy
    implications;

  • The retention periods applicable to sensitive personal
    information;

  • How an individual can make a request for access or disposal; and

  • Who to contact to file a complaint. 8

Transfers of Personal Information and Service Providers

Where an organization transfers personal information to a
service provider, it must ensure (by contract or otherwise) an
equivalent level of protection of the personal information to that
which the organization is required to provide under the CPPA. Service
providers must safeguard personal information and provide notice of
any breach of security safeguards to the organization that controls
the personal information. Otherwise, provided that a service
provider only uses the transferred personal information for the
purposes for which it was transferred, service providers are exempt
from the obligations of the CPPA with respect to the transferred
personal information.

An organization's readily available policies must include a
description of interprovincial and international transfers of
personal information and the privacy implications of those
transfers.
Enhanced Individual Rights: Disposal of Personal
Information and Mobility

The CPPA integrates PIPEDA's second privacy principle,
Identifying Purposes, directly into the body of the
legislation. The CPPA in Bill C-27 reinforces this principle by
clarifying that an organization may collect, use, or disclose
personal information only in a manner and for purposes
that a reasonable person would consider appropriate in the
circumstances, whether or not consent is required.

In addition, the CPPA requires an organization, on written
request, to dispose of personal information under its control as
soon as feasible if:

  • The information was collected, used, or disclosed in
    contravention of the Act;

  • The individual has withdrawn their consent to the collection,
    use, or disclosure of their information; or

  • The information is no longer necessary to provide a product or
    service requested by the individual. 10

Relative to Bill C-11, Bill C-27's CPPA provides for an
expanded list of exceptions where an organization may refuse
disposal. 11 Where it disposes of personal information,
an organization must also inform any service provider to which it
has transferred the information and ensure that the service
provider disposes of the personal information as well.

The CPPA also permits an individual to request that an
organization disclose their personal information to another
designated organization where both organizations are subject to a
data mobility framework, 13 similar to the European
Union's data portability right.


Minors

The CPPA treats the personal information of minors as sensitive
information and imposes heightened protection for the handling of
such personal information. In addition, the new CPPA now permits
parents to act on behalf of their children to protect their
privacy.

Automated Decision Systems

Independently of AIDA, the CPPA also addresses the impacts
on privacy rights and personal information protection in relation
to automated decision systems that replace the judgement of a human
decision maker. Compared to Bill C-11, the scope of the automated
decision system provisions has been limited to systems that may
have a significant impact on individuals, a change that
organizations will certainly welcome. Organizations that use
automated decision systems to make a decision that could have a
significant impact on an individual must, on request by the
individual, explain:

  • The type of personal information that was used to make the
    prediction, recommendation, or decision;

  • The source of the information; and

  • The reasons or principal factors that led to the prediction,
    recommendation, or decision. 14

De-Identification and Anonymization

Like Québec's new privacy law, the CPPA distinguishes
between anonymized and de-identified personal information. It
defines "anonymize" as irreversibly and permanently
modifying personal information, in accordance with generally
accepted best practices, to ensure that no individual can be
identified from the information, whether directly or indirectly, by
any means. 15 The CPPA does not apply to anonymized
information. 16

The CPPA defines "de-identify" as modifying personal
information so that an individual cannot be directly identified
from it, though a risk of the individual being identified
indirectly remains. The CPPA clarifies that de-identified
information is personal information except in certain cases, most
notably in connection with research, business transactions, and
certain rights of individuals.

Under Bill C-27, the CPPA expands the situations in which
organizations may re-identify an individual using de-identified
information, 17 including by granting the Commissioner
the power to authorize re-identification where it is clearly in the
interests of the individual concerned. 18

New Commissioner Powers and the Personal Information and Data
Protection Tribunal

The Act represents a significant departure from the current
enforcement model in Canada and would present materially greater
legal and reputational risks to organizations in relation to
non-compliance with federal privacy law. Currently, under PIPEDA,
the Commissioner has no power to make orders or to recommend
monetary penalties. The CPPA provides for both, and organizations
subject to Canadian privacy law will face a range of potential
sanctions, including significant monetary penalties and awards, and
a private right of action.

The Tribunal Act establishes the Personal Information and Data
Protection Tribunal and grants the Tribunal jurisdiction over the
penalties that may be imposed under the CPPA. The Tribunal must
provide a decision, with written reasons, to all parties to a
proceeding, and must make its decisions and the reasons for them
publicly available. Bill C-27 makes relatively few changes to the
Tribunal Act, but those changes go to the core of the Tribunal's
authority: in Bill C-27, the Tribunal has the powers of a
superior court of record instead of the powers of a commissioner
under the Inquiries Act.

In addition, the CPPA expands the Commissioner's powers to
conduct inquiries and to impose penalties through recommendations to
the Tribunal. 19

Where the Commissioner finds it appropriate to make an order
instead of recommending a penalty, it may do so directly. It can
order an organization to:

  • Take measures to comply with the CPPA;

  • Stop doing something that is in contravention of the CPPA;

  • Comply with the terms of a compliance agreement entered into by
    the organization; or

  • Make public any measures taken or proposed to be taken to
    correct the policies, practices, or procedures that the
    organization has put in place to fulfill its obligations under the
    CPPA. 20

Penalties and Fines

The CPPA provides for penalties and expanded fines for
non-compliance with certain of its provisions. Where the
Commissioner has conducted an inquiry 21 and has
determined that an organization has contravened certain sections of
the CPPA (which are expanded under Bill C-27), including those
related to maintaining a privacy management program, transfers of
personal information to service providers, consent, limiting the
collection, use, and disclosure of personal information, the
retention and disposal of personal information, and security
safeguards, the Commissioner may make a recommendation to the
Tribunal to impose a penalty.

We have summarized the possible penalty and fine amounts in the
table below:

Situation: Monetary penalties
Upon the Commissioner's recommendation, the Tribunal may impose a
penalty.
Amount: The greater of $10 million or 3% of the organization's
annual gross global revenue.

Situation: Fines
Where an organization knowingly contravenes provisions relating to:

  • reporting of breaches of security safeguards;

  • maintaining records of breaches of security safeguards;

  • retaining information subject to an access request;

  • using de-identified information to identify an individual;

  • whistleblower protections; or

  • obstructing a Commissioner investigation or inquiry;

it is liable to a fine.
Amount: On indictment, the greater of $25 million or 5% of the
organization's annual gross global revenue; on summary conviction,
the greater of $20 million or 4% of annual gross global revenue.
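To illustrate the "greater of" structure of these caps, here is a minimal arithmetic sketch. The dollar and percentage values come from the amounts above; the revenue figure is purely hypothetical:

```python
def cap(fixed_amount: float, pct: float, revenue: float) -> float:
    """Greater-of cap: the larger of a fixed dollar amount or a
    percentage of annual gross global revenue."""
    return max(fixed_amount, pct * revenue)

# Hypothetical organization with $2 billion in annual gross global revenue
revenue = 2_000_000_000

admin_penalty_cap = cap(10_000_000, 0.03, revenue)    # CPPA administrative penalty
indictment_fine_cap = cap(25_000_000, 0.05, revenue)  # fine on indictment
summary_fine_cap = cap(20_000_000, 0.04, revenue)     # fine on summary conviction

print(admin_penalty_cap)    # 60000000.0
print(indictment_fine_cap)  # 100000000.0
print(summary_fine_cap)     # 80000000.0
```

Note that for organizations whose revenue-based figure falls below the fixed amount, the fixed amount is the operative maximum, so small organizations still face the $10 million, $25 million, or $20 million ceilings.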

Private Right of Action


The CPPA introduces a private right of action against
organizations for damages for loss or injury where the Commissioner
(where the finding is not subject to appeal to the Tribunal)
or the Tribunal has found that an organization has contravened the
CPPA. The private right of action would allow individuals to seek
financial relief from the Federal Court or a provincial superior court
for various violations of the CPPA. 22

The Artificial Intelligence and Data Act (AIDA)

AIDA defines "artificial intelligence system" as a
technological system that, autonomously or partly autonomously,
processes data related to human activities by using a
genetic algorithm, a neural network, machine learning, or another
technique in order to generate content or make decisions,
recommendations, or predictions. AIDA contemplates future
regulations that will set out criteria to define
"high-impact" AI systems.

AIDA regulates the following activities:

a) processing or making available for use any data relating to
human activities for the purpose of designing, developing, or using
an AI system; and

b) designing, developing, or making available for use an AI system,
or managing its operations.

Requirements of AI Systems

AIDA imposes the following requirements on persons who manage or
are responsible for high-impact AI systems, or who otherwise engage
in the regulated activities listed above:

  • Anonymized Data: Persons who make anonymized
    data available for use in AI systems must establish measures with
    respect to the manner in which data is anonymized and the use or
    management of anonymized data.

  • Assessments: Persons responsible for an AI
    system must assess whether it is a high-impact system (in
    accordance with future regulations).

  • Risk Mitigation Measures: Persons responsible
    for high-impact systems must establish measures to identify,
    assess, and mitigate the risks of harm or biased output that could
    result from the use of the system, and must monitor such measures to
    ensure their effectiveness.

  • Record Keeping: Records must be kept that
    describe the risk mitigation measures established and the reasons
    supporting the assessment of whether or not an AI system is a
    high-impact system. The Minister of Innovation, Science and
    Industry has broad powers to compel disclosure of such
    records.

  • Public AI Statement: Persons who manage or
    make available high-impact AI systems must publish on a
    publicly available website a plain-language description of the
    system that explains (a) how the system is used or intended to be
    used, (b) the types of content that it generates and the decisions,
    recommendations, or predictions that it makes (or is intended to
    generate or make), (c) the risk mitigation measures established in
    respect of it, and (d) any other information required by
    regulation.

  • Reporting Obligations: Persons who are
    responsible for high-impact AI systems must notify the Minister if
    use of the system results or is likely to result in material harm
    (meaning physical or psychological harm to an individual, damage to
    an individual's property, or economic loss to an individual).

The Minister has broad audit rights and order-making powers with
respect to these requirements. A person who contravenes any of the
requirements is liable to a maximum fine of the greater of $10
million and 3% of the person's gross global revenues in the
preceding year or, in the case of an individual, a fine in the
discretion of the court.

Monetary Penalties & Criminal Prohibitions

AIDA establishes an administrative monetary penalty regime for
violations of the Act, the specifics of which will be set out in
future regulations.

AIDA also makes certain prohibited activities criminal offences:

  • Unlawful use of personal information in AI
    systems: Personal information used in connection with AI
    systems must be lawfully created or obtained, including in
    accordance with the provisions of the CPPA. AIDA makes it an
    offence for a person to possess or use personal information for the
    purpose of designing, developing, using, or making available for use
    an AI system if the person knows or believes that the personal
    information was obtained or derived as a result of the commission
    of an offence under Canadian law (or an act or omission committed
    outside of Canada that, had it occurred in Canada, would
    constitute an offence under Canadian law).

  • AI systems that cause harm or economic loss:
    Under AIDA, it is an offence to make an AI system available for use
    if it is likely to cause serious physical or psychological harm to
    an individual or substantial damage to an individual's
    property, and the system causes that harm or damage. It is also an
    offence to make an AI system available for use with the intent to
    defraud the public and to cause substantial economic loss to an
    individual, and the system causes that loss.

A person who commits either of these offences is liable to a
maximum fine of the greater of $25 million and 5% of the
person's gross global revenues in the preceding year or, in the
case of an individual, a fine at the discretion of the court and/or
a term of imprisonment of up to five years less a day.

Administration of AIDA under the New Artificial
Intelligence and Data Commissioner

The Minister may designate a senior official within the Ministry
as the Artificial Intelligence and Data Commissioner. The
Commissioner will assist in the administration and enforcement of
AIDA and may be delegated any of the powers and duties conferred on
the Minister, including the following:

  • Promote public awareness of AIDA and provide education;

  • Make recommendations and prepare reports on measures to
    facilitate compliance with the requirements of AIDA;

  • Establish guidelines with respect to compliance with the
    requirements of AIDA; and

  • Establish an advisory committee to provide advice and assistance
    on any matters related to the requirements of AIDA, and publish such
    advice online.


1. Bill C-27, An Act to enact the Consumer Privacy
Protection Act, the Personal Information and Data Protection
Tribunal Act and the Artificial Intelligence and Data Act and to
make consequential and related amendments to other Acts
, 44th
Parl., 1st Sess., 70-71 Elizabeth II, 2021-2022 (First Reading).

2. Bill C-11, An Act to enact the Consumer Privacy
Protection Act and the Personal Information and Data Protection
Tribunal Act and to make consequential and related amendments to
other Acts
, 43rd Parl., 2nd Sess., 69 Elizabeth II, 2020
(First Reading).

3. Ibid., s. 15(3).

4. Ibid., s. 15(4).

5. Ibid., s. 18(2)(3).

6. Ibid., s. 53(2).

7. Ibid., s. 57(3).

8. Ibid., s. 62(2).

9. Ibid., s. 12(1).

10. Ibid., s. 55(1).

11. Ibid., s. 55(2).

12. Ibid., s. 55(4).

13. Ibid., s. 72.

14. Ibid., s. 63(4).

15. Ibid., s. 2(1).

16. Ibid., s. 6(5).

17. Ibid., s. 75.

18. Ibid., s. 116.

19. Ibid., s. 94.

20. Ibid., s. 93.

21. Ibid., s. 89 or 90.

22. Ibid., s. 107(1).

23. AIDA, ss. 6-12.

The content of this article is intended to provide a general
guide to the subject matter. Specialist advice should be sought
about your specific circumstances.
