TMT 2023

Last Updated January 29, 2023

Canada

Law and Practice

Authors



Fasken is a leading international business law and litigation firm. With over 700 lawyers, the firm has offices in Vancouver, Calgary, Toronto, Ottawa, Montreal, Quebec City, London and Johannesburg. Its TMT practice groups are routinely recognised as top band/tier in Canada, and it advises companies on most major consultation or policy proceedings held by Canada’s regulators. Fasken’s TMT industry group draws upon the firm’s expertise in a broad range of practice areas to advise participants in the TMT industry on important legal issues relating to: technology developments and contracts; regulatory communications (including broadcasting, telecommunications, spectrum and wireless, copyright, antitrust/competition and marketing, privacy, data security and trade); entertainment and media (including production, distribution, video games, online entertainment and online gaming); business law (including foreign ownership restrictions, start-ups, debt and equity financing, public-private partnerships, specialised commercial agreements, mergers and acquisitions, labour, employment and human rights, intellectual property and tax); and dispute resolution (including public law advocacy, commercial litigation and mediation and arbitration).

There are no laws that specifically address the metaverse and associated services and products in Canada. As a result, legal considerations regarding the metaverse arise from the application of existing laws of general application to metaverse services and products, such as private sector privacy laws, consumer protection laws, competition laws, and copyright and other intellectual property laws. The following is an illustrative list of such considerations.

Privacy and Cybersecurity

Canadian private sector privacy laws apply to metaverse products and services. An individual’s activity in the metaverse can be logged and recorded far more comprehensively than activity in the physical world. Canada’s privacy laws remain consent-based. Potential reforms to the federal law, and the coming into force of reforms in the province of Quebec, will increase the transparency and other requirements attached to that consent and will empower regulators to impose significant penalties (though, in the case of the federal reform, new and broader exceptions to consent may assist the development of the metaverse).

In addition, the coming into force of the new Quebec law will bring new requirements regarding individual profiling and the collection of personal information through technological means. Organisations operating in the metaverse will have to carefully design their processes for obtaining consent, and ensure that their processing of personal information remains within the bounds of that consent and the other requirements of Canadian privacy laws.

Responsibility for Cyber-attacks and Security Breaches

To the extent that organisations collect a greater volume of potentially more sensitive personal information in the metaverse, the risks arising from cyber-attacks and security incidents will increase. The requirement under Canadian privacy laws to establish and maintain appropriate safeguards for personal information will become more demanding, and it will be even more critical for organisations to identify, contain and remediate incidents. It may also be difficult to determine where responsibilities lie in respect of breach notification to users, the Office of the Privacy Commissioner of Canada (OPC) or other applicable regulators.

Artificial Intelligence Regulations

Many human interactions within the metaverse will be enabled by artificial intelligence. Seamless, AI-driven human/system interaction (particularly any AI interpreting or mimicking human behaviour) may fall within future artificial intelligence legislation and regulations as well as automated decision-making provisions in revised privacy laws. These points are discussed in 4. Artificial Intelligence and Big Data.

Consumer Protection and Civil Liability

Most Canadian provinces have prescriptive consumer protection laws that regulate marketing practices and consumer contracts. For example, organisations operating in the metaverse must comply with restrictive requirements governing internet contracts, even though interactions in the metaverse may be more akin to in-person interactions. Further, the application in the metaverse of tort law and other common law and statutory rules that govern in-person public places has not yet been tested.

Commercial Contracting

Metaverse industry participants need to co-operate in order to create a seamless experience for consumers and users moving through different metaverse platforms. Common intellectual property sharing agreements and confidentiality clauses will need to account for these novel circumstances, and metaverse businesses will need to consider how to apportion responsibility for privacy and security risks in contracts.

In Canada, digital services and digital markets are not subject to a specific regulatory regime akin to the European Union’s Digital Services Act and Digital Markets Act. Instead, any legal considerations regarding digital markets arise from the application of general laws to digital services and products, such as communications laws, consumer protection laws, competition laws and private sector privacy laws.

In addition, the Canadian government is proposing to regulate online platforms and digital streaming services as “online undertakings”, under a bill currently being considered by Parliament that would revise Canada’s broadcasting laws. It remains unclear at this time what conditions Canada’s broadcasting and telecommunications regulator – the Canadian Radio-television and Telecommunications Commission (CRTC) – will place on these digital services, but they will likely be required to contribute to the development of Canadian programming through monetary and non-monetary contributions and investments in Canada.

The bill would also allow the CRTC to require online streaming services to carry Canadian programming services and to regulate the discoverability of Canadian programming on online services. The amendments would also provide the CRTC with the authority to impose penalties for violations of certain provisions of the Broadcasting Act and provide the CRTC with explicit information-gathering powers about these services and platforms. At the time of writing, the amendments have passed third reading in the House of Commons and the Senate, and are expected to become law in early 2023.

Digital assets such as cryptographic tokens and NFTs can be classified by securities regulators as securities. As a result, companies seeking to trade in digital assets may be subject to Canada’s provincial securities regulations. In particular, they will need to consider whether their assets meet provincial tests for a “security” and, if so, comply with the attendant prospectus requirements.

Any business seeking to trade in digital services or markets in Canada should consult legal experts to determine whether their activity may be subject to existing generally applicable laws and regulations.

Laws and Regulations

No private sector laws of general application focus specifically on the provision of cloud services in Canada, but Canadian laws of general application and certain sector-specific regulations apply to the provision and use of cloud services. Applicable laws include private sector privacy laws and regulations that impose industry-specific requirements, such as:

  • requirements governing the use of cloud services by federally regulated financial institutions;
  • requirements that certain records of federally regulated financial institutions be located in Canada; and
  • laws regarding personal health information.

Regulated Industries

The Office of the Superintendent of Financial Institutions (OSFI) is the Canadian federal regulator that supervises and regulates federally regulated banks and insurers, trust and loan companies and private pension plans subject to federal oversight. OSFI has issued Guideline B-10, Outsourcing of Business Activities, Functions and Processes, which specifies certain OSFI expectations for federally regulated financial institutions (FRFIs) that outsource one or more of their business activities to a service provider. The Guideline applies to all outsourcing arrangements, including cloud services. Under the Guideline, FRFIs are expected to:

  • evaluate the risks associated with all cloud service arrangements;
  • develop a process for determining the materiality of cloud service arrangements; and
  • implement a programme for managing and monitoring risks, commensurate with the materiality of the cloud service arrangements.

The Guideline also contains a list of specific terms that OSFI expects an FRFI to address in a cloud service contract. While Guideline B-10 is directed to federal entities, it has also been voluntarily adopted by many provincially regulated entities in the financial sector. In 2012, OSFI released a memorandum that confirmed that Guideline B-10 applies to cloud computing and that FRFIs should pay particular attention to the following in connection with cloud services:

  • confidentiality and security;
  • contingency planning;
  • location of records;
  • audit and access rights;
  • subcontractors; and
  • monitoring material outsourcing arrangements.

In April 2022, OSFI released for comment an updated Draft Guideline B-10, which sets out enhanced expectations for FRFIs in managing an expanded scope of third-party risks and places a greater emphasis on governance and risk management plans, and on specific outcomes and principles. The Draft Guideline expands its application to “third-party arrangements”, which include any business or strategic arrangement with external entities. Examples of arrangements that would be subject to the Draft Guideline include:

  • use of independent professionals, brokers, and utilities;
  • use of financial market infrastructure;
  • other relationships involving the provision of services for the storage, use or exchange of data; and
  • generally, any outsourced activities, functions, and services.

The Draft Guideline replaces the “materiality” threshold in the current guideline with a new “risk-based approach”, which requires a more comprehensive risk management framework that accounts for the level of risk and the “criticality” associated with individual third-party arrangements. A final, revised Guideline B-10 is expected to be issued in autumn 2023.

Under the Bank Act, the Trust and Loan Companies Act, the Insurance Companies Act and the Cooperative Credit Associations Act, certain records of federally regulated financial organisations carrying on business in Canada must be maintained in Canada. In addition, an FRFI is expected to ensure that OSFI can access, in Canada, any records necessary to enable OSFI to fulfil its mandate.

In addition to Guideline B-10, OSFI has also released an advisory on Technology and Cybersecurity Incident Reporting, which sets out OSFI’s expectations in relation to the immediate and ongoing reporting of cybersecurity incidents, which FRFIs should account for in their agreements with cloud providers. These expectations are in addition to the mandatory breach notification requirements under Canadian privacy laws.

In July 2022, OSFI released the final Guideline B-13, Technology and Cyber Risk Management, which is intended to complement existing guidelines, including Guideline B-10. Guideline B-13 is to be read and implemented from a risk-based perspective that allows FRFIs to compete effectively and take full advantage of digital innovation while maintaining sound technology risk management. Guideline B-13 provides FRFIs with technologically neutral guidance to produce key “outcomes” in three domains:

  • Technology and Cyber Governance and Risk Management;
  • Technology Operations; and
  • Cyber Security.

OSFI determined that Guideline B-13 will become effective on 1 January 2024 in order to provide FRFIs sufficient time to self-assess and ensure compliance.

Personal Information Processing in Canada – Overview

A comprehensive review of all privacy obligations in Canada is beyond the scope of this summary. However, generally the following applies.

  • A private sector organisation that uses third-party cloud services to collect, use, store or disclose personal information of individuals on its behalf will be subject to Canadian privacy legislation if the organisation is based in Canada or has a real and substantial connection to Canada.
  • A service provider that provides cloud services and that collects, uses, stores or discloses personal information of individuals will be subject to Canadian privacy legislation if the service provider is based in Canada or has a real and substantial connection to Canada. In addition, Canadian customers of cloud services will seek to have their agreements for cloud services address the requirements of Canadian privacy laws, including on transfers of personal information outside Canada.

In Canada, privacy and personal information are regulated by both federal and provincial legislation.

The Personal Information Protection and Electronic Documents Act (PIPEDA) is the federal privacy law for private sector organisations. The OPC oversees compliance with PIPEDA and has issued a number of guidelines and case summaries that provide non-binding guidance on its interpretation of PIPEDA’s obligations. As of January 2023, PIPEDA continues to benefit from an adequacy decision of the European Commission, which recognises that personal data transferred from the EU to Canada receives protection essentially equivalent to that of the General Data Protection Regulation (GDPR) and can therefore be transferred without additional data protection safeguards such as standard contractual clauses.

PIPEDA applies in all provinces and territories in Canada to organisations engaged in commercial activities, except where a province or territory has enacted substantially similar private sector legislation and is subject to an exemption under PIPEDA. Even in exempt provinces, PIPEDA continues to apply to federal “works, undertakings, and businesses” (such as airlines, banks and telecommunications companies) and to the interprovincial or international processing of personal information. British Columbia, Alberta and Quebec have their own legislation that regulates the collection, use and disclosure of personal information by private sector organisations in those provinces. In addition, most provinces have legislation that regulates the collection, use and disclosure of personal health information, and there are federal and provincial privacy laws that apply to the public sector.

In September 2021, Quebec’s legislature passed comprehensive reforms to the province’s privacy laws, the bulk of which will come into force in September 2023 (though note that certain requirements are effective as of September 2022, notably new mandatory breach notification requirements and obligations regarding the individual responsible for compliance with the Quebec law).

Quebec’s revised law is the first Canadian private sector privacy law to specifically address de-identification and anonymisation, automated decision-making, technology-based profiling, and data portability. Importantly, it will introduce significant administrative penalties and fines, with administrative penalties of up to the greater of CAD10 million or 2% of worldwide “turnover” and fines of up to the greater of CAD25 million or 4% of worldwide turnover, as well as a private right of action. The revised Quebec law will also require organisations to undertake privacy impact assessments prior to transferring personal information outside of Quebec.

In June 2022, the Canadian government tabled Bill C-27, the Digital Charter Implementation Act, 2022 – legislation that would, among other things, enact the Consumer Privacy Protection Act (CPPA) to replace the privacy provisions of PIPEDA. The proposed CPPA retains the principles-based and consent-based approach of PIPEDA. Among other things, the CPPA would:

  • provide new regulatory tools and remedies for non-compliance, including significant monetary penalties;
  • introduce a plain language standard for obtaining consent, with expanded transparency requirements;
  • introduce broader “legitimate interest” and “business activities” exceptions to consent;
  • clarify that the “manner” of collecting, using, and disclosing personal information, in addition to the purpose for doing so, must be appropriate in the circumstances, regardless of whether consent is required;
  • create new individual rights of disposal and mobility;
  • clarify the obligations of service providers when personal information is transferred to them for processing;
  • provide for transparency requirements for automated decision-making systems and for transfers or disclosure of personal information; and
  • define and create standards for the de-identification and anonymisation of personal information.

Bill C-27 would also create a new Personal Information and Data Protection Tribunal that would consider decisions and recommendations of the federal privacy commissioner.

The CPPA would allow the federal privacy commissioner to recommend, and for the Personal Information and Data Protection Tribunal to impose, penalties up to the greater of either CAD10 million or 3% of an organisation’s annual global revenues. It would also provide for significantly expanded offences with fines up to the greater of either CAD25 million or 5% of annual global revenues, and a private right of action to permit recourse to the courts in certain circumstances.
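
Both the proposed federal regime and the Quebec regime described above cap penalties at the greater of a fixed dollar amount or a percentage of worldwide revenue, so the maximum exposure scales with the size of the organisation. The short Python sketch below is purely illustrative: the caps and percentages are taken from the figures above, while the revenue figure is a hypothetical assumption.

# Illustrative sketch only: shows how a "greater of a fixed cap or a share of
# worldwide revenue" penalty ceiling is calculated. The revenue figure is hypothetical.

def maximum_penalty(fixed_cap_cad: float, revenue_share: float, global_revenue_cad: float) -> float:
    """Return the penalty ceiling: the greater of a fixed cap or a share of worldwide revenue."""
    return max(fixed_cap_cad, revenue_share * global_revenue_cad)

revenue = 2_000_000_000  # assumed CAD2 billion in annual worldwide revenue (hypothetical)

# Proposed CPPA administrative penalty: greater of CAD10 million or 3% of global revenue.
print(maximum_penalty(10_000_000, 0.03, revenue))  # prints 60000000.0 (CAD60 million)

# Proposed CPPA offence fine: greater of CAD25 million or 5% of global revenue.
print(maximum_penalty(25_000_000, 0.05, revenue))  # prints 100000000.0 (CAD100 million)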

Virtually every aspect of privacy legislation can have some impact on the provision or use of cloud services.

The Personal Information Protection and Electronic Documents Act (PIPEDA)

Under PIPEDA, personal information means information about an identifiable individual. PIPEDA provides that an organisation is responsible for personal information in its control, or in its possession or custody, including information that has been transferred to a third party for processing. An organisation that transfers personal information to a cloud service provider remains primarily responsible for that personal information, and will want to ensure that the cloud services contract contains appropriate provisions to address the organisation’s responsibilities in relation to the personal information transferred to and processed by the cloud service provider.

OPC guidance clarifies that an organisation must take all reasonable steps to protect personal information from unauthorised uses and disclosures while it is in the hands of the third-party processor, regardless of whether the information is processed in Canada or in a foreign country. An organisation must be satisfied that the third party has policies and processes in place, including training for its staff, effective security measures and audit rights, to ensure that the information in its care is properly safeguarded at all times.

PIPEDA requires that personal information be protected by security safeguards appropriate to the sensitivity of the information. The security safeguards must protect personal information against loss and theft, as well as unauthorised access, disclosure, copying, use or modification. The nature of the safeguards will vary depending on the sensitivity of the information that has been collected, the amount, distribution, and format of the information, and the method of storage. More sensitive information should be safeguarded by a higher level of protection, particularly where large volumes of information are involved. The methods of protection should include physical, organisational, and technical measures.

PIPEDA case summaries provide non-binding guidance on the OPC’s interpretation of these obligations.

An organisation will want to address the detail of a service provider’s security safeguards in the cloud services contract. When it investigates security breaches, the OPC will closely examine the safeguards in place at the time of the breach and the contractual requirements to implement and maintain such safeguards. The cloud provider’s obligations in the case of a breach of security safeguards should be included in the cloud services contract.

In June 2022, following an unsuccessful attempt to pass similar legislation in 2020, the Canadian government reintroduced proposed legislation to replace the personal information provisions of PIPEDA with a new law, the Consumer Privacy Protection Act.

Cross-border Transfers

PIPEDA does not prohibit an organisation from transferring personal information to an organisation outside Canada for processing. However, the OPC expects organisations to assess the risks to the integrity, security and confidentiality of personal information when it is transferred to third-party service providers operating outside Canada. The OPC also expects organisations to advise their customers that their personal information may be sent to another jurisdiction for processing and that, while the information is in another jurisdiction, it may be accessed by the courts, law enforcement and national security authorities of that jurisdiction.

Alberta’s private sector privacy law requires organisations that transfer personal information outside Canada to maintain policies, and to provide information about those policies on request, regarding:

  • the countries outside Canada in which the collection, use, disclosure or storage may occur; and
  • the purposes for which the service provider outside Canada has been authorised to collect, use or disclose personal information.

Beginning in September 2023, Quebec’s privacy law will require organisations, before personal information is transferred out of Quebec, to conduct a privacy impact assessment and to determine whether the transferred information will receive protection in accordance with “generally accepted best practices respecting the protection of personal information” in the receiving jurisdiction.

Big Data

Big data initiatives in Canada must balance the need to maximise the value of large data sets with the requirements of Canadian privacy laws. Holding large amounts of personal information can lead to issues surrounding consent, transparency, accountability, and the requirement to limit the collection of personal information to that needed for the purposes identified by the collecting organisation.

Additionally, holding large volumes of personal information requires organisations to implement more stringent safeguards in order for them to be considered appropriate under Canadian privacy laws. Holding greater amounts of personal information about a greater number of individuals also increases the risks of a class action in the event of a data breach, and the liability that would result from a breach.

To limit these risks, organisations are increasingly using anonymised and (in the case of machine learning) synthetic data. Anonymised and synthetic data, where there is no “serious possibility” that the information, alone or in combination with other information, can be used to identify an individual (ie, be re-identified), are not personal information and are thus not subject to existing Canadian privacy laws. However, the potential for re-identification increases as data sets grow and other data sets become available for matching, and the statistical and other methods that can re-identify data are becoming increasingly sophisticated.

Canada’s federal privacy law does not currently include a definition of de-identified or anonymised data, or define what it means to de-identify or anonymise data. Recent revisions to Quebec’s privacy law, effective September 2023, introduce anonymisation and de-identification as separate concepts. Under the new Quebec law, anonymised information is information that:

  • no longer allows an individual to be identified directly or indirectly where “it is at all times reasonable to expect in the circumstances” that it will not identify an individual; and
  • is anonymised in accordance with “generally accepted best practices”.

De-identification is a less stringent standard: de-identified information is information that no longer allows an individual to be identified directly. A business handling de-identified information will be required to take “reasonable steps to avoid re-identification”. Proposed revisions to the federal private sector privacy law being considered by Parliament would create a similar distinction between de-identified and anonymised information.

Artificial Intelligence

As mentioned in 3.1 Highly Regulated Industries and Data Protection, in June 2022 the Canadian government tabled Bill C-27, the Digital Charter Implementation Act, 2022. This legislation would, among other things, enact the Artificial Intelligence and Data Act (AIDA) as well as comprehensive reforms to Canada’s federal private sector privacy law regime. Bill C-27 remains under consideration by the House of Commons at this time (January 2023).

If passed, AIDA would regulate the design, development and use of AI systems in the private sector, with a focus on mitigating the risks of harm and bias in the use of “high-impact” AI systems. AIDA would set out positive requirements for AI systems, as well as monetary penalties and new criminal offences for certain unlawful or fraudulent conduct in respect of AI systems.

Some of the issues applicable to big data also apply to AI and machine learning (ML), as both typically rely on large data sets. Canadian privacy law requirements regarding consent, openness, and transparency are areas of concern with regard to AI and ML. Meaningful consent and transparency require that organisations identify the purposes of the collection and use of personal information. This is more difficult in the case of AI and ML, since those purposes can evolve over time as ML algorithms and models make discoveries and predictions based on data.

Beginning in September 2023, Quebec’s privacy law will require that individuals be informed when their personal information is used to make a decision about them using automated processing, and that individuals must be offered the opportunity to make observations about the decision.

Organisations that use ML might also encounter issues with Canadian intellectual property laws. Canadian copyright law does not protect databases where the creation of a database or other compilation is not an exercise of “skill and judgement”, and it does not protect individual data elements removed from a database or other compilation (for example, where those data elements are mere facts such as street addresses).

Furthermore, an ML algorithm or model is not considered an “inventor” under the Patent Act or an “author” under the Copyright Act. For Canadian copyright law, the choice of the ML algorithm, training data, and the conduct of the training would have to be an exercise of skill and judgement for the ML model and its output to be potentially considered an original work eligible for copyright protection (subject to the output or model otherwise being the proper subject matter of copyright).

There are no laws that specifically address internet of things (IoT) services and devices in Canada. As a result, any legal considerations regarding the IoT arise from the application of general laws to IoT services and devices.

Canada’s private sector privacy laws apply where IoT devices and services used by individuals allow organisations to collect personal information. In Alberta, British Columbia (BC) and Quebec, provincial private sector privacy laws also apply to the use of IoT devices in the workplace, while the federal law (PIPEDA) applies to federally regulated workplaces (ie, federal works, undertakings and businesses) across Canada. The Alberta, BC and federal privacy laws are notice-based with respect to employee personal information and the employee-employer relationship: employers must provide notice to employees of the use of IoT devices that collect employee personal information and of the subsequent use and disclosure of that information.

Even where employers give notice, however, the processing of personal information must also be for purposes a reasonable person would consider appropriate in the circumstances. Thus, the use of IoT devices to collect employee personal information for inappropriate purposes, for example location tracking or video surveillance where less intrusive measures could be used, may still run afoul of Canadian laws even if employees were provided notice of the tracking.

Subject to limited exceptions, Canada’s private sector privacy laws are consent-based. From a privacy perspective, the IoT poses a challenge for obtaining meaningful consent, as it allows passive information collection that may be less obvious to individuals and more difficult to explain. Transparency is particularly important if the IoT service provider contemplates secondary uses of personal information (ie, uses beyond providing the services), such as marketing or advertising. The ability of IoT devices to collect large amounts of data must also be weighed against requirements to limit the collection of personal information.

The OPC has released guidance targeted towards manufacturers of IoT devices. In particular, the guidance recommends as a best practice that organisations perform a privacy impact assessment before releasing IoT products.

Beginning in September 2023, Quebec law will require organisations that use technologies that can identify, locate, or profile individuals to have this functionality off by default. IoT devices and services will have to comply with this requirement.

IoT devices and services are also seeing growing use in the healthcare sector. Most Canadian provinces have enacted health privacy legislation regulating the use of personal health information by healthcare providers. Depending on the province, health privacy legislation may apply to the healthcare provider, or to both the healthcare provider and its service provider.

Audio-Visual Services

All traditional audio-visual services (television, radio, cable, etc) operating in Canada must either be licensed by the CRTC under the Broadcasting Act or fall under an exemption from licensing. The CRTC issues licences for terms not exceeding seven years and makes those licences subject to conditions, related to the circumstances of the licensee, that it deems appropriate for the implementation of Canada’s broadcasting policy. Licensees are generally subject to a variety of Canadian content, programme expenditure and/or contribution obligations.

Television and radio stations that use radio spectrum are also required to obtain authorisation from the Department of Innovation, Science and Economic Development Canada (ISED) in accordance with the Radiocommunication Act. Applications to obtain a broadcasting licence must be filed with the CRTC, and the CRTC is required to hold a public hearing to consider the application. The process typically takes between eight and 18 months to conclude.

In order to be eligible to hold a broadcasting licence, a company must be owned and effectively controlled by Canadians. Broadcasting licensees are generally required to pay two types of licence fees (Part 1 and Part 2 fees) under the Broadcasting Licence Fee Regulations:

  • the Part 1 fee is a licensee’s pro rata share of the annual cost of the CRTC’s operations; and
  • the Part 2 fee is established by the Canadian government using a complex formula and paid on a pro rata basis by each licensee.

The CRTC also has the authority to exempt classes of broadcasting undertakings from holding a licence, and has exercised this authority in a number of circumstances, including with respect to small satellite-to-cable (discretionary) services and small cable distributors. Exemption orders issued by the CRTC contain terms and conditions that apply to an entire class of broadcasting undertaking and do not require a company to pay any licence fee or to obtain any further authorisation from the CRTC.

Online Video-Sharing Platform Services

Individuals and companies that operate online video-sharing platforms (including platforms for user-generated content) and other online streaming services in Canada do so in accordance with a CRTC exemption order known as the exemption order for digital media broadcasting undertakings. To operate under this exemption order, an online service must comply with minimal obligations, which include a prohibition on granting undue preferences or disadvantages and a requirement to submit to the CRTC’s dispute resolution process. There are no licence fees or Canadian ownership and control requirements applicable to these services.

In February 2022, the Canadian government introduced legislation in Parliament to amend the Broadcasting Act (Bill C-11, the Online Streaming Act). Bill C-11 would empower the CRTC to impose conditions on online video-sharing platforms, which the legislation calls “online undertakings”, giving the CRTC express legislative authority to regulate the distribution of programming by digital streaming services and online communications platforms, notably through financial and non-financial measures to support the production and exhibition of Canadian content.

Legislative debate around Bill C-11 indicates that user-generated content will not be regulated, but the platforms that share it may be. Among other reforms, Bill C-11 would also provide the CRTC with the authority to impose administrative monetary penalties (AMPs) for violations of certain provisions of the Act and provide the CRTC with more explicit information-gathering powers.

At the time of writing, Bill C-11 has passed third reading in the House of Commons and the Senate and is expected to become law in early 2023.

Telecommunications

The Telecommunications Act regulates telecommunications common carriers and telecommunications service providers. It does not regulate technologies.

The Telecommunications Act defines a telecommunications common carrier as a person who owns or operates a “transmission facility” used by that person or another person to provide telecommunications services to the public for compensation. A transmission facility means “any wire, cable, radio, optical or other electromagnetic system, or any similar technical system for the transmission of intelligence between network termination points, but does not include an exempt transmission apparatus”.

A telecommunications service means a service provided by means of telecommunications facilities, which in turn is broadly defined to include any facility or thing that is used or capable of being used for telecommunications or for any operation directly connected with telecommunications, including a transmission facility. A telecommunications service provider (TSP) is defined as “a person who provides basic telecommunications services” and includes telecommunications service resellers.

Telecommunications regulation in Canada is therefore technology-agnostic and there are no restrictions on the use of new technologies by carriers or service providers. Certain services are, however, subject to registration and other regulatory requirements. For example:

  • non-dominant carriers and resellers must register with the CRTC, and service providers that carry traffic internationally must obtain a Basic International Telecommunications Services (BITS) licence;
  • providers of Voice over Internet Protocol (VoIP) services must obtain approval of their 911 emergency calling arrangements; and
  • Competitive Local Exchange Carriers (CLECs) must obtain approval of their interconnection arrangements with other carriers, and are subject to the mandated provision of certain services to persons with disabilities and to privacy and consumer protection provisions, all of which have been standardised.

The CRTC does not charge for registering as a TSP, but it operates a contribution fund to which carriers and TSPs are required to contribute based on a percentage of their Canadian telecommunications revenues once they are generating CAD10 million or more in revenues. Money from this fund is used to finance video relay services and the extension of broadband facilities to rural and remote parts of Canada.

VoIP service providers need to register with the CRTC as a carrier or reseller depending on whether they own transmission facilities. VoIP service providers also require a BITS licence, which entails an application to the CRTC. No fees are applicable for these registrations, approvals or licences other than contribution to the fund referenced above. VoIP service providers that provide access or egress to or from the public switched telephone network (PSTN), and that use North American Numbering Plan (NANP) telephone numbers to route calls, require CRTC approval of their 911 emergency services before providing service in Canada.

The provision of instant messaging is regulated if it involves the use of transmission facilities owned or leased by a carrier or TSP providing the messaging service. Registration as a reseller or non-dominant carrier will be required. A BITS licence will also be required. No fees are applicable other than contribution to the fund referenced above. The provision of an app over the public internet without transmission services is generally not regulated.

Radiocommunications

The Radiocommunication Act regulates spectrum, and the Minister of Innovation, Science and Industry (the “Minister”) is empowered to issue radio or spectrum licences, or to exempt frequencies from the requirement for a licence. The Minister, who oversees ISED, is empowered to charge fees for radio or spectrum licences or to hold competitive bidding auctions.

Radio apparatus must meet ISED standards and certification requirements before they can be marketed, sold, offered for sale or imported into Canada. Certifications from specified countries can be used as the basis for obtaining Canadian certification, but certification from a foreign regulator such as the US Federal Communications Commission (FCC) does not itself authorise the marketing, sale, offering for sale or importation of radio apparatus in Canada.

As with cloud services, there are no private sector laws of general application focused primarily on the provision of IT services to the private sector in Canada, but other Canadian laws will apply to the provision of such services.

Applicable laws include those relating to the processing and protection of personal information (in both the public and private sectors) and industry-specific regulations, such as requirements governing the use of IT services (and outsourcing generally) by federally regulated financial institutions (FRFIs), requirements that certain records of FRFIs be located in Canada, and laws regarding personal health information that apply to healthcare service providers.

All Canadian provinces and territories other than Quebec operate under a common law regime, and the law with respect to contracts is broadly similar to that of other common law jurisdictions such as the United States and the United Kingdom, subject to Canadian jurisprudence and provincial laws concerning contracts. Quebec is a civil law jurisdiction governed by the Civil Code of Quebec, including provisions of the Code that address service contracts.

The issues discussed in 3. Cloud Computing regarding cloud services also apply to IT services.

Provincial Laws and Regulations Governing the Use of Electronic Signatures

The use of electronic signatures is governed by provincial statute and regulation, and the requirements and conditions for their use vary considerably from province to province; as such, this summary is not intended to be comprehensive. As an example, Ontario’s Electronic Commerce Act (the “Ontario ECA”) is generally permissive of the use of electronic signatures, which it defines as “electronic information that a person creates or adopts in order to sign a document and that is in, attached to or associated with the document”.

The Ontario ECA (like other provincial statutes and regulations) nevertheless forbids the use of electronic signatures for particular types of documents; in the case of the Ontario ECA, these include wills, powers of attorney and documents that constitute negotiable instruments. The Ontario ECA also requires an electronic signature to meet any prescribed technology standards or requirements.

The Ontario ECA does not apply to the use of biometric information as an electronic signature or digital personal identifier, although biometric identifiers can be used if all parties to a transaction agree to their use.

Federal Laws and Regulations Related to Financial Institutions

Federal laws and regulations may also apply to financial institutions, or to other entities that provide services involving the transfer of money, and may require them to identify their customers or users and to validate their government-issued identification.

For example, the Proceeds of Crime (Money Laundering) and Terrorist Financing Regulations prescribe certain acceptable methods of identity verification that a reporting entity (generally, an entity involved in providing financial services) must use when required to verify the identity of an individual, corporation or entity other than a corporation. These methods include the validation of government-issued photo identification, reliance on the individual’s credit file, or a dual-process method (where information must come from two different reliable sources).

Guidance issued by the Financial Transactions and Reports Analysis Centre of Canada (FINTRAC), Canada’s financial intelligence unit, contemplates the use of digital identification and verification services in the context of the government-issued photo identification method of identity verification when an individual is not physically present. At this time, FINTRAC guidance provides few details with respect to the specifications or technical requirements for such services, other than that such services must be able to determine that a government-issued photo identification document is authentic, valid and current.

Fasken

Bay Adelaide Centre
333 Bay Street, Suite 2400
P.O. Box 20
Toronto, ON
M5H 2T6
Canada

+1 416 366 8381

+1 416 364 7813

toronto@fasken.com
www.fasken.com