Legal Background
The basis of data protection can be found in the Hungarian constitution, which states that everybody has the right to privacy and that an independent authority shall oversee the protection of personal data.
The major laws in the data protection field are Act CXII of 2011 on the Right of Informational Self-Determination and on Freedom of Information (Data Protection Act) and the General Data Protection Regulation (GDPR).
To implement the GDPR and Directive (EU) 2016/680 of the European Parliament and of the Council (Law Enforcement Directive), the Data Protection Act was completely amended in July 2018, and now contains three groups of provisions:
Enforcement Environment
The most important administrative sanctions are fines, which may go up to EUR20 million or 4% of annual turnover, whichever is higher. In addition to the administrative GDPR sanctions, Hungarian laws also provide other types of sanctions, as follows:
The authority responsible for monitoring the application and enforcement of Hungarian data protection laws is the National Authority for Data Protection and Freedom of Information (the Authority).
Within its advisory powers, the Authority is particularly active in advising lawmakers on legislative measures in the data protection area. It also issues recommendations to controllers and the general public from time to time, although it has highlighted several times that the European Data Protection Board (EDPB) or its predecessor, the Article 29 Working Party (WP29), is the main body entrusted with interpreting the GDPR.
The Authority also has various authorisation powers in line with the GDPR, but these powers are rarely used, as they are rather specific (such as the approval of binding corporate rules or the approval of codes of conduct or certifications).
Apart from fines, the Authority may impose several other corrective measures, with the following being particularly common:
The Authority may conduct audits and has wide investigatory powers. Investigations are usually initiated by the complainant, but the Authority may also initiate them ex officio.
In its enforcement framework, the Authority has two main kinds of procedures.
Both procedures may be launched ex officio or based on a complaint (but in the administrative procedure only the data subject concerned can file a complaint).
In general, the Authority has a broad selection of investigatory powers, including making on-site visits and accessing equipment used in the course of the data processing. The Authority usually provides very short deadlines for controllers to present the GDPR-compliant documentation, so the GDPR’s accountability principle must be taken seriously.
Controllers and processors may challenge the Authority's decision on the merits before the Budapest-Capital Regional Court. Such a legal remedy does not in itself have suspensive effect.
As Hungary is part of the EU, the Hungarian lawmakers decided on several GDPR implementation packages to bring Hungarian law in line with the GDPR. The Authority also confirmed that the GDPR shall prevail if there is any direct conflict between it and the Hungarian privacy rules.
Moreover, the Authority follows the guidelines, opinions and similar soft law issued by the EDPB, and respects that the EDPB is the main body for interpreting the GDPR.
In cross-border proceedings, the Authority also co-operates with other member states: it suspends the proceeding until the lead supervisory authority makes its decision based on the GDPR's one-stop shop rule.
The future Hungarian landscape of e-privacy and big data largely depends on the final adoption of the widely debated ePrivacy Regulation and the Data Governance Act.
Although the Authority is rather strict, privacy awareness in Hungary is still in its infancy, so the role of NGOs and self-regulatory organisations remains marginal.
The one NGO that aims to tackle this is MADAT (Hungarian Association for Privacy Awareness). There are also other NGOs that generally assist in the promotion and enforcement of human rights, such as TASZ (Hungarian Civil Liberties Union), and as part of this work represent clients whose right to privacy has been violated. TASZ also regularly shares privacy-related educational materials on its website.
In certain sectors, such as marketing, there are organisations that also cover sector-specific areas of privacy; for example, IAB Hungary and the Hungarian Data & Marketing Association share research and news on the topic of online marketing.
The Authority is one of the strictest authorities in the context of GDPR interpretation. It has an especially granular approach to purpose specification and data minimisation (see 2.1 Omnibus Laws and General Requirements). This has made business management difficult for international companies that want to use one uniform privacy policy across different countries, as this is hardly possible in Hungary due to the Authority's local expectations.
It remains to be seen whether the international EDPB practice will bring any change in this context. The Authority recognises that the EDPB is solely authorised to interpret the GDPR, but the Authority may continue its old practice in any matter that is not explicitly regulated by the EDPB.
On the other hand, the GDPR enforcement practice has not been aggressive so far, as the Authority’s fines have been rather low compared to the upper limit of GDPR fines (see 2.5 Enforcement and Litigation).
As Hungary adopted the so-called GDPR Omnibus Act in April 2019, amending 86 sectoral acts across dozens of sectors (including finance, healthcare and online marketing), there were no particular developments at a legislative level in 2021.
The Authority has remained active and made 150 decisions public since the GDPR entered into force. It has also organised three annual DPA online conferences and published training videos concerning hot topics such as COVID-19 issues (eg, body temperature screening and digital education in a GDPR-compliant manner), CCTV operation, data breaches and the Schrems II judgment.
At a case law level, the relatively recent Supreme Court judgment in case BH2019/272 is still widely debated in the Hungarian privacy world, as it adopted a much more flexible interpretation of personal data than has been common in Hungarian data protection practice.
Namely, the Authority interprets the notion of personal data broadly, by using the "absolute approach", according to which data remains personal as long as the data subject remains identifiable by the controller or any other person.
In contrast, the Supreme Court used the "relative" approach, narrowing the question down to whether the data subject is identifiable by the controller itself. The Supreme Court found that an organisation is not a data controller and does not process "personal data" if it processes pseudonymised medical data without the identification key.
In 2021, the Supreme Court delivered another significant judgment, in which it declared that the Authority is not entitled to impose fines if it significantly exceeds its own statutory time limit for decision-making (in that case, it had decided in 604 days instead of 120 days). Hopefully, this will influence the Authority to complete its procedures within the time limit set by law.
One of the key future issues is whether the Authority will keep its own former practice or rely more on the interpretation of the EDPB. It is interesting that privacy notices are no longer so prevalent in the Authority's focus, although they were one of its enforcement priorities for years. It seems that the Authority has tried to avoid any confrontation between its old practice and the EDPB/WP29 guidelines, and has focused more on areas that dictate a simpler or more uniform logic, such as data breaches and the handling of data subject rights requests. On the other hand, as long as there are no specific EDPB guidelines, the Authority may rely only on its own practice.
The other key issue is whether or not the Authority accepts the Supreme Court’s more flexible interpretation of personal data. It is of pivotal importance to clarify this as there is currently a legal uncertainty regarding when an organisation falls under the scope of the GDPR (see 1.7 Key Developments).
Finally, the Schrems II decision raises a lot of issues in Hungary as well, as no proper standards have been developed regarding how to assess the adequacy of the data-importing country in a practical, smooth, cost-efficient and safe manner (see 4.2 Mechanisms or Derogations that Apply to International Data Transfers).
Data Protection Officers
The rules to appoint a data protection officer (DPO) in Hungary are the same as anywhere else in the EU. Appointing a DPO is necessary if:
The responsibilities of the DPO stem directly from the GDPR, with the primary responsibilities being as follows:
The DPO's appointment must be notified to the Authority via its website (naih.hu/adatvedelmi-tisztviselo-bejelento-rendszer).
The Authority has issued numerous explanatory guidelines about appointing DPOs, but they do not usually contain new information compared to the international WP29 Guideline No 243 on DPOs, upon which organisations should rely when deciding whether or not to appoint DPOs.
Legal Bases for Data Processing
Personal data may be processed only if there is adequate legal ground to do so. The GDPR recognises the following six legal grounds:
If the data processing involves sensitive personal data, additional conditions must be met (please see 2.2 Sectoral and Special Issues).
Privacy by Design and Default
Even in the pre-GDPR era, the Authority considered it important to examine whether companies integrated the basic data protection principles into their processes, so privacy by design and default are not completely new concepts in Hungary.
Privacy by design means that, even in the early stages (eg, when decisions are made) and throughout the entire cycle of the processing, controllers must use appropriate technical and organisational measures to implement the basic data protection principles and address key privacy concerns.
Privacy by default requires controllers to integrate appropriate measures so that data processing by default is limited to an “as-needed” basis in the context of the amount of personal data collected, the duration of processing and access rights.
Privacy Impact Assessments
Controllers must carry out privacy impact assessments (PIAs) for high-risk processing operations. The Authority has not issued any specific guidelines about PIAs, but the international WP29 guideline on this topic (No 248) is relevant in Hungary as well.
In 2019, the Authority published the list of data processing operations that require a prior PIA to be carried out (the PIA list). Several processing activities that include the use of emerging technologies were listed in the document (please see 5.1 Addressing Current Issues in Law).
Controllers may freely decide on the PIA methodology they wish to use; however, the Authority recommends using the Hungarian version of the French Data Protection Authority’s PIA software, which it has published on its website.
Privacy Policies
Controllers are obliged to provide thorough information to data subjects about the use of their personal data. The provision of information differs if it is collected from the data subject directly (in this case, Article 13 of the GDPR applies) or if the data is obtained from someone else – or even created by the controller (Article 14 of the GDPR applies).
In 2016, the Authority issued a guideline on how the controller should prepare external privacy policies. The bottom line is that the information must be provided on a purpose level. Broad data processing purposes such as “HR management” are not acceptable in the eyes of the Authority – the purposes must be specified in a way that enables only one interpretation (such as “recruitment”). After the purpose is specified, the Authority also expects controllers to display in the privacy notice all the relevant circumstances of the data processing for the given purpose.
As for internal policies, the Authority concludes that a lack of internal policies does not automatically lead to GDPR sanctions, but the controller must implement adequate technical and organisational measures to prove compliance with the GDPR. The controller must decide on its own what measures to take, but such measures may include the preparation of internal policies as well.
Data Subject Rights
The GDPR gives several rights to data subjects to guarantee that they retain control over their personal data. The Authority requires data controllers not only to inform the individuals about the following rights, but also to give meaningful information about what the right means, and in what situations and how it can be exercised.
The Authority has been very active in enforcing data subjects' rights and takes the 30-day response deadline seriously. It is therefore important for controllers to implement adequate policies on handling data subject requests, as well as technical and organisational measures so that such requests can be fulfilled easily (eg, software that locates a data subject's personal data across different records).
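By way of illustration only, the short Python sketch below shows the kind of simple tooling that can support this: a helper that locates a data subject's records across several hypothetical internal systems so that access or erasure requests can be answered within the deadline. The system names and record layout are invented for the example and are not a statement of how any controller's systems are organised.

```python
# Hypothetical sketch: locating a data subject's records across several
# internal systems to support access/erasure requests. The system names
# and record layout are illustrative only.

from typing import Dict, List

# Each "system" is modelled here as a list of records (dicts) keyed by email.
SYSTEMS: Dict[str, List[dict]] = {
    "crm": [{"email": "anna@example.com", "name": "Anna", "phone": "+36 30 000 0000"}],
    "newsletter": [{"email": "anna@example.com", "consent_date": "2023-01-10"}],
    "webshop_orders": [{"email": "bela@example.com", "order_id": 1042}],
}


def locate_personal_data(subject_email: str) -> Dict[str, List[dict]]:
    """Return every record, grouped by system, that belongs to the data subject."""
    hits: Dict[str, List[dict]] = {}
    for system_name, records in SYSTEMS.items():
        matching = [r for r in records if r.get("email") == subject_email]
        if matching:
            hits[system_name] = matching
    return hits


if __name__ == "__main__":
    # Example: collecting everything held about one data subject for an access request
    print(locate_personal_data("anna@example.com"))
```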
Anonymised, De-identified and Pseudonymised Personal Data
The GDPR and the Data Protection Act only apply to personal data that allows the direct or indirect identification of a person. Anonymisation means that such connection is lost forever, and therefore anonymised personal data does not fall under the scope of these laws.
De-identification and pseudonymisation are good methods by which to adhere to data security, but they do not eliminate the possibility of reconnecting the data with the person, so de-identified and pseudonymised personal data as a general rule remain within the scope of the GDPR.
In the Authority’s view, the data remains personal as long as the data subject is identifiable, and it is not necessarily relevant whether the controller itself can identify the data subject (please see 1.7 Key Developments).
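As a minimal illustration of why pseudonymised data generally remains personal data, the sketch below replaces a direct identifier with a keyed hash: anyone holding the (separately stored) key or a lookup table can re-link the records to the individual. The key value and record content are invented for the example.

```python
# Minimal illustration of pseudonymisation: a direct identifier is replaced
# with a keyed hash, and the key is stored separately. Anyone holding the key
# can recompute the pseudonym and re-link the record to the person, which is
# why such data generally remains "personal data" under the GDPR.

import hashlib
import hmac

# In practice the key would be kept in a separate, access-controlled location.
PSEUDONYMISATION_KEY = b"keep-this-key-separately"  # illustrative value only


def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier (eg, an email address) with a stable pseudonym."""
    return hmac.new(PSEUDONYMISATION_KEY, identifier.encode(), hashlib.sha256).hexdigest()


record = {"patient_id": pseudonymise("anna@example.com"), "diagnosis": "J06.9"}
# Same input + same key -> same pseudonym, so re-identification remains possible.
print(record["patient_id"])
```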
Use of New Technology
The GDPR addresses new technology such as profiling, automated decision-making, online tracking, big data and AI (please see 5.1 Addressing Current Issues in Law).
Breach of Personal Rights
The Data Protection Act authorises data subjects to bring private actions against data controllers or processors for breaches of privacy laws. They may claim both pecuniary and non-pecuniary damages in front of the court.
In the case of non-pecuniary damages, it is enough to prove that the privacy right of the data subject has been violated; beyond this, no proof of any non-pecuniary disadvantage has to be provided.
The GDPR covers three types of personal data:
Sensitive personal data includes data relating to racial or ethnic origin, political opinion, religious or philosophical beliefs, trade union membership, genetic data or biometric data (for the purpose of uniquely identifying a natural person), health data and data about the sex life or sexual orientation of an individual.
Under the Data Protection Act, personal data relating to criminal convictions and offences may be processed – unless the law states otherwise – on the legal basis applicable to special categories of personal data.
In the case of sensitive personal data and personal data relating to criminal convictions and offences, the Authority expects controllers to check whether any additional special condition is fulfilled under Article 9 of the GDPR (apart from the basic six legal grounds under Article 6 of the GDPR). If the controller is unable to demonstrate proper legal grounds this way, the data processing is prohibited.
Hungary also regulates sector-specific data (which is not necessarily sensitive personal data), for which special rules apply under the sector-specific acts. The controller may usually invoke Article 6(1)(c) of the GDPR if these acts determine the permitted scope of data processing, including the categories of personal data, the purposes and conditions of the processing, the persons authorised to process the personal data and the duration of processing.
Financial Data
Financial data is regulated in various Hungarian financial acts and comes under professional secrecy rules (such as insurance secrets, bank secrets and securities secrets). The Hungarian financial acts provide detailed rules on confidentiality and the disclosure of financial secrets.
Health Data
Health data is sensitive personal data. In Hungary, Act XLVII of 1997 on the Processing and Protection of Health Care Data and Associated Personal Data (the Health Data Act) provides detailed rules on processing health data. In general, health data can be processed only for a given purpose authorised by the Health Data Act, or if the patient gives explicit consent. The recent amendment of the Health Data Act (which followed the creation of the National eHealth Infrastructure) allows patients direct access to their health data.
Communications Data
Electronic communications data is regulated in detail in Act C of 2003 on Electronic Communications (the Electronic Communications Act). Electronic service providers may process electronic communication data for the purposes set out in the Electronic Communications Act (such as billing) to the extent it is necessary, and in line with privacy by design they must implement appropriate measures to prevent accidental or illegitimate interception of communication.
Voice Telephony and Text Messaging
Traditional voice telephony is regulated in various sector-specific laws (such as consumer and financial regulations), which provide rules on recording calls. In voice-to-voice (non-automated) calls, the user may be called for direct marketing, information, public opinion polling or market research only if they have not objected to such communication. The Authority has also provided guidelines on how controllers should provide the necessary privacy information to users over the phone. Voice over Internet Protocol (VoIP) is not explicitly regulated by Hungarian data protection laws, but the future ePrivacy Regulation is expected to set rules in this context.
Text messaging as a form of electronic communication is primarily relevant in terms of Hungarian anti-spam laws, according to which users may not receive electronic advertisements without providing prior consent.
Children's or Student Data
Children’s data in the context of information society services may be processed based on the child's consent if the child is at least 16 years old; if the child is under 16, parental consent is required. In the context of offline services, Act V of 2013 on the Civil Code is relevant, which provides rules on the legal capacity of children.
The processing of the data of pupils and students that is used for assessment is included in the PIA list as an activity that results in higher risk.
Employment Data
Employment data is regulated by the Labour Code, which limits the scope of data that may be processed by the employer by providing that an employee may only be requested to disclose information that is necessary for the establishment, performance or termination of the employment relationship, or for exercising claims arising from the Labour Code.
Internet, Streaming and Video Issues
Regarding internet, streaming and video issues, the only processing of personal data on which the Authority has provided a statement since the GDPR came into force is the use of cookies. The Authority’s view is in line with the European practice and WP29's Opinion 04/2012 on Cookie Consent Exemption. In June 2018, it stated that cookies may be set on the device of the user based on:
If the processing is based on consent, the website’s manager needs to provide sufficient information about the cookies used. This information must include:
Social Media
Social media is not explicitly regulated in Hungary by sector-specific data protection rules. In January 2021, the Authority issued a statement on the lawful use of social media, providing basic rules for website operators who use embedded social media modules (eg, tracking pixels) on their websites (including providing proper privacy notices and consent mechanisms in the context of social media modules).
Search Engines
Search engines are also not explicitly regulated by sector-specific data protection rules. The Authority issued a guideline on how to handle right to be forgotten (RTBF) cases in line with the Costeja Judgment No C-131/12 of the CJEU and the WP29 guideline No 225 on implementing the judgment. There is also Hungarian case law on interpreting the scope of RTBF rules and the delisting criteria.
Online Platform Content
Online platform content (such as hate speech, disinformation and terrorist propaganda) is not specifically regulated in Hungary and usually does not have Hungarian data protection relevance. Online platform providers usually qualify as intermediary service providers and thus must remove any illegal content upon being notified of such by users, within the deadline set by Hungarian laws.
As of January 2022, the Authority may order platform providers to remove certain online content if it has data protection relevance (eg, if the online content seriously violates the privacy right of children or if it relates to special categories of personal data or data relating to criminal convictions).
Act XLVIII of 2008 on Business Advertising Activity still provides that explicit consent is required from the individual for unsolicited electronic marketing communication (via email, fax or SMS), regardless of whether said individual is a recipient in a B2B or B2C context. However, based on the GDPR, the Authority has recognised that legitimate interests may be a proper legal ground for electronic marketing in existing client relationships.
Automated marketing calls are only permitted based on the explicit consent of the user. Non-automated (voice-to-voice) marketing calls are permitted only if the user has not objected to such calls (ie, there is no "§" or similar objection mark next to the user's entry in the applicable phone directory).
Hungarian law does not have specific rules on behavioural advertising; thus, the GDPR rules apply. As a strict rule, tracking cookies that allow behavioural advertising may only be set on the user’s device based on prior explicit consent, and the website operator must have full knowledge of the third-party cookies used on its website. In the context of social media marketing, the Authority has made it clear that website operators must use inactive social plug-ins, so that the user may control what information will be transferred from the website to social media.
The Authority has issued a guideline in the workplace privacy context (the Workplace Guideline), which includes the basic principles of data processing and numerous special issues, including recruitment, employee monitoring and whistle-blowing operations.
The Authority places high importance on the basic data protection principles in the workplace environment. Workplace data processing purposes must be well specified, and only data that is strictly necessary for the employment relationship can be processed. In general, consent is not a proper legal ground for data processing, due to the subordinate relationship between the employer and the employee. The most common legal grounds used by employers are either legal obligation (under Article 6(1)(c) of the GDPR) or their legitimate interest (when the data processing purpose may not be connected with a specific legal obligation).
The employer has the right to monitor workplace communication, but certain guarantees must be provided. According to the Workplace Guideline, the following are the most important:
The Authority has not addressed new emerging technological means (like threat detection, e-discovery or loss prevention programmes), but the above guarantees may apply mutatis mutandis. Also, WP29 Opinion No 2/2017 on data processing at work is relevant in the Hungarian context as well.
In general, the employer must consult with the works council 15 days before implementing any workplace privacy or employee monitoring measure.
Hungary also has a specific whistle-blowing act that provides detailed rules on how the employer may lawfully operate a whistle-blowing system, including rules on the subject matter of the reports, access to reports, confidentiality rules, limitations on complainants and procedural rules.
In Hungarian civil and administrative procedure, no specific standard applies to alleging violations of data protection laws, but the rules on evidence set out in the applicable procedural acts must be respected.
The Authority usually collects evidence by asking the controllers to provide the relevant information and documents. Based on the GDPR "super principle" of accountability, the burden of proof is on the controller to demonstrate compliance with data protection laws.
The GDPR enforcement practice has not been aggressive so far, with fines being rather low compared to the upper limit of GDPR fines.
However, in May 2020, the Authority imposed its record GDPR fine of HUF100 million (close to EUR300,000) for a security breach at a Hungarian telco company, where an ethical hacker reported a security vulnerability to the telco company. Even though there was no actual theft or leak of personal data, apart from the access by the ethical hacker, the Authority still imposed a high fine, as it considered that the vulnerability of the database was high (as it could potentially have resulted in identity theft or the misuse of personal data). This leading case shows that there is some shift in enforcement focus from traditional GDPR issues to data breach management and cybersecurity.
Private Litigation
The Hungarian courts do not have specific standards on alleging violations but the applicable rules on evidence set out in procedural acts must be respected. Apart from written evidence, witnesses and expert opinions are often used in litigation.
The Data Protection Act authorises individuals to bring private actions against data controllers or processors for breaches of data protection laws. Class actions, however, are not allowed.
In line with the GDPR, the Data Protection Act states that the burden of proof in litigation to demonstrate compliance with data protection laws lies with the controller/processor involved as a defendant.
Both damages and injunctive relief may be obtained through the courts. In an ongoing case, a Hungarian court recently issued a preliminary injunction requiring a newspaper to remove the names of the owners of a Hungarian company from the online version of its list of the richest Hungarians, due to privacy concerns.
In 2017, Act XC of 2017 on the Criminal Procedure (New Criminal Procedure Code) came into effect, which included completely new rules on law enforcement’s access to data and surveillance.
As a general rule, law enforcement authorities may collect information without prior official approval, except where the collection of information would be more privacy-intrusive. For example, law enforcement authorities may file information requests to service providers, but they need the prior approval of the public prosecutor if the information request is directed to financial organisations or electronic communication network service providers. Similarly, law enforcement authorities may only pursue certain covert surveillance activities (eg, covert surveillance of information systems, covert searches, covert surveillance of a specific location, opening mail or other closed packages, and interception) based on prior judicial approval.
Even when the law enforcement agency is authorised to unilaterally pursue covert surveillance activity, it is not sufficient to purely refer to “law enforcement/prosecution purposes”. The New Criminal Procedure Code made it clear that collecting secret information via concealed devices is possible only if:
The New Criminal Procedure Code also provides numerous other safeguards by – among others – specifying who may access the data, when the access is possible, what measures may be taken and when the collected data must be erased.
Moreover, following the pattern of the Law Enforcement Directive, the Data Protection Act itself provides detailed rules on how the law enforcement authorities must process the personal data, including rules on privacy by default, data subject rights, data security measures and logging (to make the activities of law enforcement traceable). In this area, the GDPR is not applicable, but the Authority may still impose a fine on the relevant authorities, based on a breach of the Data Protection Act. Such fine is capped at HUF20 million (approximately EUR59,000).
Access to data for national security purposes is regulated in detail in Act CCXV of 1995 on National Security Services (National Security Services Act).
The national security agencies have wider access to data than law enforcement agencies, and have particularly wide access to certain service providers' records and governmental records. On the other hand, certain covert surveillance activities that are more privacy-intrusive (such as covert surveillance in closed areas or covert surveillance of an information system) are subject to prior judicial approval.
The National Security Services Act provides that collecting secret information is possible only if the information required to perform the national security tasks set out in the National Security Services Act cannot be obtained otherwise. However, unlike the New Criminal Procedure Code, the National Security Services Act does not require a balancing test to examine whether the national security purpose disproportionately restricts personality rights.
Overall, national security agencies have wider power and may collect information based on more flexible rules than law enforcement agencies.
The Data Protection Act was amended in July 2018 to regulate data processing for national security purposes. The GDPR is not applicable in this area, but the Authority still supervises whether the national security agency complies with the provisions of the Data Protection Act, and may impose a fine of up to HUF20 million (approximately EUR59,000).
As with law enforcement agencies, the Data Protection Act provides detailed rules on how the national security agencies must process personal data, though some rules are more flexible for them (eg, data breaches must be notified to the Authority only once the national security interest has ceased to exist, and rules on electronic logging are less rigid).
Hungarian organisations may invoke a foreign EU member state authority’s access request as a legitimate basis to collect and transfer personal data, as long as its authority is properly granted in the respective member state’s law.
Hungarian organisations may transfer personal data to non-EU authorities only if the GDPR conditions on international transfers are met (please see 4. International Considerations) – ie, the transfer is based on an adequate level of data protection. This means that, in most cases, a direct request of a non-EU authority is not in itself a legitimate basis upon which to collect and transfer personal data (as most of these authorities could not provide an adequate level of data protection). In such cases, based on Article 48 of the GDPR, Hungarian organisations should generally refuse direct requests of non-EU authorities and refer to existing mutual legal assistance treaties (if there is such an agreement between Hungary and the given state).
In March 2018, the CLOUD Act was adopted by the US Congress to enable US law enforcement agencies to request direct access to electronic data in a cross-border setting. In this context, the EDPB took the position that the CLOUD Act is contrary to the GDPR, and reaffirmed its position that direct requests from US agencies (like other non-EU authorities) are not in themselves legitimate bases for collecting and transferring personal data.
Hungary does not have a CLOUD Act agreement with the USA, and the legal uncertainty will remain as long as the USA and the EU do not reach an international agreement on access requests.
In its annual report for 2018, the Authority emphasised that the regulation of intelligence and its actual practice has always been among its top priorities, especially as data subjects are hardly in a position to enforce any of their data subject rights, due to the secrecy of the interventions. The intelligence services have received criticism in Hungary; when the Snowden case was a hot topic, it was even argued in Hungarian privacy circles that Hungarian intelligence raises very similar issues to those in the US.
This topic has now become hotter than ever in Hungary. In summer 2021, the Pegasus scandal came to light in Hungary, in which the Hungarian intelligence services allegedly targeted journalists’ mobile phones with the Israeli Pegasus spyware based on the “national security interest”. The Authority inspected the use of Pegasus and found that the intelligence service acted in line with the National Security Services Act. However, some privacy activists consider that the Pegasus scandal shows that the National Security Services Act gives overly broad statutory power to Hungarian intelligence, without proper privacy guarantees.
Transfers within the European Economic Area (EEA) are permitted and shall be treated the same way as transfers within Hungary. The same applies to adequate countries (ie, countries deemed by the EU Commission to provide an adequate level of data protection based on their higher privacy standards).
Transfers of personal information to any entity outside the EEA or adequate countries are possible only if some additional mechanisms are fulfilled (please see 4.2 Mechanisms or Derogations that Apply to International Data Transfers).
International data transfers (outside the EEA and adequate countries) are permitted only if certain additional mechanisms are fulfilled. These additional mechanisms should be examined in the following order.
First, the transferor must examine whether additional privacy safeguards are taken to achieve an adequate level of privacy protection, including:
In light of the Schrems II decision, it must be emphasised that relying on the mere documentation of these safeguards (eg, just signing the standard contractual clauses (SCCs)) is not sufficient. Rather, it must be factually assessed and documented that the guarantees provided by the safeguards can indeed be fulfilled in practice. This includes a legal assessment of the data-importing jurisdiction (such as whether access to personal data by public authorities is proportionate and whether there is any effective remedy available to data subjects) and the implementation of supplementary measures (such as encryption) to ensure compliance with the EU level of protection of personal data.
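Purely as an illustration of one commonly discussed supplementary measure (and not a suggestion that encryption alone makes a transfer lawful), the sketch below encrypts the data before transfer while the decryption key remains with the exporter in the EEA. It assumes the third-party Python "cryptography" package; the payload is invented.

```python
# Illustrative only: encrypt data before transfer and keep the decryption key
# with the exporter in the EEA, so the importer (and authorities in the third
# country) cannot read the personal data without the exporter's co-operation.
# Requires the third-party "cryptography" package (pip install cryptography).

from cryptography.fernet import Fernet

# Generated and retained by the EEA exporter; never sent to the importer.
exporter_key = Fernet.generate_key()
cipher = Fernet(exporter_key)

payload = b'{"name": "Anna", "email": "anna@example.com"}'
encrypted_for_transfer = cipher.encrypt(payload)  # this ciphertext is what leaves the EEA

# Only a party holding exporter_key can recover the personal data.
assert cipher.decrypt(encrypted_for_transfer) == payload
```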
If none of the above special safeguards are taken, the data subject must give their informed consent specifically for the given international transfer.
In the absence of the above safeguards and consent, international transfers are still permitted if any special derogation rules apply under Article 49 of the GDPR, including when the transfer is necessary for:
Finally, in the absence of derogation rules, an international transfer is still permitted if the following conditions are met:
Transfers within the EEA/adequate countries are permitted. Transfers to countries outside the EEA/adequate countries are also permitted if the additional mechanisms set out in 4.2 Mechanisms or Derogations that Apply to International Data Transfers are followed; in general in such cases, no further notification to or approval from the Authority is required.
Certain public records within the scope of national data assets (such as land registry records, ID records, company registers, criminal records and close to 30 other public records) may be maintained only in-country. Data processing activities for certain public bodies (such as government offices or Ministries) may also be maintained in-country only.
Similarly, online betting service providers must maintain their servers in Hungary. This means that the data located on their servers must be maintained in-country as well.
In both cases, data may not be transferred outside of Hungary.
There is no Hungarian law that would explicitly require the sharing of software code, algorithms or similar technical details with any authority. However, Hungarian authorities may ask for such information if it is relevant to the given official process. For example, if the open-source code of a piece of software had vulnerabilities that led to a data breach, the Authority may require the controller to share information about such software.
The following limitations may apply to transfers of data in connection with foreign official access requests.
Please see 4.6 Limitations and Considerations.
Emerging technology was a focus of the GDPR when it was being drafted, which led it to address various technology-related legal issues, including automated decision-making and the processing of biometric data. Aside from this, the long-awaited ePrivacy Regulation is expected to tackle issues related to electronic communications, including the hot topics of cookies, direct marketing using online communications and behavioural advertising (profiling) via tracking.
Big Data Analytics
The Authority has not issued specific guidelines on big data. However, the PIA list includes the processing activity of combining data from various sources for matching and comparison purposes, which is a classic use of big data.
Automated Decision-Making
Since the GDPR came into force in 2018, companies have needed to follow strict rules when they decide to use automated decision-making (including profiling). As a rule of thumb, individuals have the right not to be subject to decisions based solely on automated means. In addition, individuals have the right to receive thorough information about the logic used to make decisions based on their personal data.
The Authority has also added automated decision-making that has a legal effect or another significant effect on individuals to the PIA list.
Profiling
Profiling is any form of the automated processing of personal data where personal data is used to evaluate certain personal aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person's performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements. According to this definition, profiling has three core elements:
If profiling is used, the data controller should make sure to adhere to the general data protection principles. For example, the profiling must be visible, fair, transparent and in line with data minimisation. The controller should consider using aggregated or anonymised data if it cannot justify the collection of data otherwise.
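As a loose illustration of the data minimisation point above, the sketch below retains only aggregated counts per content category instead of building per-person interest profiles; the event data is invented for the example and is not drawn from any real system.

```python
# Minimal sketch of data minimisation through aggregation: instead of keeping a
# per-person interest profile, only aggregated counts per category are retained.
# The events below are invented example data.

from collections import Counter

page_view_events = [
    {"user_id": "u1", "category": "sports"},
    {"user_id": "u2", "category": "sports"},
    {"user_id": "u1", "category": "finance"},
]

# Aggregated view: how popular each category is, with no per-user profile kept.
category_counts = Counter(event["category"] for event in page_view_events)
print(category_counts)  # Counter({'sports': 2, 'finance': 1})
```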
The Authority has also added profiling activities to the PIA list, such as profiling to assess solvency, profiling using the data of children, and profiling by way of large-scale and systematic evaluation of personal data.
Artificial Intelligence
Artificial intelligence is not yet regulated by law in Hungary, but the government has set up the Coalition on Artificial Intelligence, which aims to create the legal background for using such technology. AI is also considered a hot topic on the European level, with the European Commission presenting the AI Act in April 2021, highlighting the necessity for a European regulatory framework to unify the safeguards provided by member states.
IoT
IoT is not specifically regulated in Hungary, but in 2019 the Authority issued guidance on using a form of IoT: smart energy meters. In this guidance, the Authority followed the logic of the former international soft-law maker, WP29, which also issued an opinion on smart meters.
Furthermore, the PIA list states that a data protection impact assessment needs to be carried out if a public utilities provider uses smart meters that send consumption information via a network.
Biometric Data
Biometric data is considered a special category of personal data. This means that, aside from having a lawful legal ground for its use, a further condition must also apply in order for the data processing to be lawful (see 2.2 Sectoral and Special Issues). In 2016, the Authority discussed the use of biometric systems in the employment context. Although the guideline was issued before the GDPR came into force, it is expected that the Authority will continue to rely on it in the future. The guideline highlighted that four core issues need to be considered before implementing such a system:
The Authority also addressed the use of biometric information in an employment context in 2019. In its statement, it concluded that there may be situations where using biometric information can be lawful (eg, in a research lab where employees also work with deadly viruses); however, in general, using biometric systems to monitor employees is neither essential nor the least intrusive method, and therefore is not likely to be lawful.
The PIA list also includes two types of processing biometric information:
Facial Recognition
Facial recognition is a use of biometric data, so the same rules apply as for biometric data.
Geolocation
Both the former WP29 and the Hungarian Authority issued opinions on the use of geolocation data to monitor employees through data sent by in-built GPS in company cars. Such geolocation information can be used to track the vehicles based on the employer’s legitimate interest if there is a compelling reason to do so (eg, organising routes for couriers, tracking vehicles that transport goods of great value), but it may not be used to monitor employees outside of their working hours. Thus, employers should allow employees to turn off GPS tracking when they are not working if they are allowed to use the vehicle outside of work. The employees should receive thorough information about GPS tracking, and should also be allowed to object to it.
The PIA list includes geolocation data as a factor that indicates a higher risk for individuals if it is used to monitor or create profiles on people.
Drones
The Hungarian drone law came into effect in January 2021. Although it includes various rules on the operation of drones, it does not include privacy-related provisions; however, the monitoring or recording of another person's property via unauthorised drone use has become a criminal offence. The PIA list also contains the operation of drones flown above public places or areas open to the public. Aside from the list, the Authority issued a thorough recommendation about the use of drones in 2014.
Digital governance or fair data practice review boards are not established in Hungary.
There is no recently published, available enforcement decision of the Authority in the emerging digital and technology area, nor has Hungarian court practice recently tackled this area.
The Authority recognises parties' legitimate interest to transfer client databases in corporate transactions, but otherwise it has not yet issued any guidance on how to conduct due diligence in such transactions in a privacy-friendly manner (such as using data anonymisation techniques).
It has become clear in the Hungarian M&A market that the GDPR increases the buyer’s risk in the course of acquisition, which must be properly addressed (such as auditing the seller’s GDPR practice and including representations and warranties for data protection and cybersecurity).
In line with the NIS Directive, certain digital service providers (such as online marketplace providers, search engine providers and cloud-based IT service providers) must notify the Hungarian computer security incident response team, the Special Service for National Security, of cybersecurity incidents that have a substantial effect on their operations within the EU.
Cybersecurity has become a burning issue in Hungary, in light of the Authority’s recent cybersecurity decision in which it imposed a record fine (please see 2.5 Enforcement and Litigation). The key take-away for companies is that they must pay enough attention not just to having information security policies, but also to implementing them and regularly testing the effectiveness of the applied security measures.
Although only the general GDPR cybersecurity rule applies for most companies (without detailed Hungarian cybersecurity regulation), this is still a challenging area as Hungarian security awareness is low and the burden of deciding on the right measures lies entirely with the companies.
The Information Security Act applies for certain organisations (eg, critical service providers), and imposes additional requirements such as the classification of security breaches, the appointment of a security officer, a ban on data transfers outside the EEA, and logging or reporting security breaches (even if such breaches do not involve personal data).
Kernstok Károly tér 8
1126 Budapest
Hungary
+36 1 501 9900
office@vjt-partners.com www.vjt-partners.com

Data Breach Management: A Hot Topic in Hungary
In May 2020, the Hungarian Data Protection Authority (the Authority) issued a record fine of HUF100 million (close to EUR300,000) to DIGI, a major Hungarian telco company. This is a milestone in the Authority’s enforcement practice, as the highest fine that it had previously imposed in a traditional GDPR enforcement case was HUF30 million.
The DIGI case shows that there is a clear shift in enforcement focus away from traditional GDPR issues towards cybersecurity and data breach management. For example, in 2019 the Authority completed more than 50 administrative proceedings based on data breaches, which is more than 25% of its total number of proceedings. There has also been a rise in data breach notifications, with 781 in 2020, which is 1.5 times more than in 2019.
The trends show that data breach management is becoming a hot topic in Hungary, so it is worth examining in the Hungarian context.
Notification of the Authority
The Authority sets a very low threshold on notification. In general, data controllers must file a notification with the Authority each time there is a reasonable certainty that a data breach has occurred, if such breach could have any adverse effect on data subjects (even a minor one).
Article 33 of the GDPR states that it is not necessary to notify the supervisory authority about a data breach when “it is unlikely to result in a risk to the rights and freedoms of data subjects.”
Based on this exemption rule, a more nuanced position can be taken in the context of the “likelihood of the adverse effect”. The following categories are used for differentiation:
Based on Article 33 of the GDPR, it can be argued that “not occurred” and “not likely” are not reportable categories. However, the Authority adopts a black-and-white approach, whereby anything beyond “not occurred” is reportable in practice.
The Authority takes the position that the data controller must fully exclude the possibility that an adverse effect occurred to exempt itself from notification. In its 2020 Annual Report, the Authority provides the following examples for exemption:
A data breach must be notified either via post or online. The notification form is very detailed (the Word version runs to 26 pages). Although this could help controllers to approach the risk assessment better, it also makes meeting the 72-hour notification deadline more difficult (even if the controller makes the notification in phases).
Risk Assessment
Data controllers are free to choose their own risk assessment method, but the Authority especially recommends that they assess the following criteria:
Data controllers must select from four levels of severity of data breach in the Authority’s notification form (ie, low, medium, high and very high), and are recommended to apply the European Union Agency for Network and Information Security (ENISA) methodology, which uses the following formula to identify these four levels:
Based on this calculation, the severity of data breach can be low (below 2), medium (between 2 and 3), high (between 3 and 4) and very high (above 4).
Depending on the result of the risk assessment, the data controller may decide to notify data subjects (this is highly recommended if the ENISA score is above 3). Furthermore, based on the gathered inputs and risk assessment, the data controller may decide on the necessary steps to be taken to mitigate the effects of the data breach.
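By way of a hedged illustration only, the short sketch below assumes the commonly cited form of the ENISA severity formula, SE = DPC x EI + CB (data processing context multiplied by ease of identification, plus circumstances of breach), and maps the resulting score to the four bands and to the "above 3" recommendation mentioned above. The input values are placeholders for the example, not the official ENISA scoring tables.

```python
# Hedged sketch of an ENISA-style severity calculation, assuming the commonly
# cited formula SE = DPC x EI + CB (data processing context x ease of
# identification + circumstances of breach). Input values are illustrative.

def breach_severity(dpc: float, ei: float, cb: float) -> float:
    """Severity score under the assumed formula SE = DPC * EI + CB."""
    return dpc * ei + cb


def severity_band(score: float) -> str:
    """Map the score to the four bands used in the Authority's notification form."""
    if score < 2:
        return "low"
    if score < 3:
        return "medium"
    if score < 4:
        return "high"
    return "very high"


# Example: simple contact data (DPC=1), full identification possible (EI=1),
# data exposed publicly (CB=+1) -- values chosen purely for illustration.
score = breach_severity(dpc=1, ei=1, cb=1)
print(score, severity_band(score))          # 2 medium
print("notify data subjects:", score > 3)   # notification recommended above 3
```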
Notification of the Data Subjects
If the data controller finds that there is a high risk to data subjects, based on risk assessment, the data controller must notify them as well. Prompt notification is highly recommended, especially when the mitigation measure itself requires prompt action from data subjects (eg, a swift change of log-in data).
In this context, once the data controller notifies the Authority about the data breach, it shall also notify the Authority whether:
Whichever decision the controller makes, they must be able to demonstrate that the decision was adequate based on the “super principle of accountability” under the GDPR.
Data Breach Management
There is no clear guidance on how to run a proper data breach management process; the Authority always assesses this on a case-by-case basis. The bottom line is that the data controller must do everything in its power to remedy the data breach – ie, mitigate the effects of the data breach and correct its processes to prevent future breaches.
Nevertheless, the Authority recommends that data controllers take the following steps in the course of data breach management:
Authority’s Enforcement Framework
In its enforcement framework, the Authority first conducts an investigation as a preliminary phase of the process to collect evidence. In this phase, the Authority examines the following in particular:
The Authority will complete the preliminary investigation within 60 days, after which it will either close the case and declare a lack of breach of data protection rules or launch an administrative procedure (this is the main proceeding in which the Authority may impose fines).
Based on the GDPR's one-stop shop rule, in an EU cross-border data breach, the Authority usually suspends the proceeding in order to identify the lead supervisory authority. Once the lead supervisory authority takes over the case, the Authority terminates its proceeding.
The DIGI Case and its Lessons
In the DIGI case, the Authority imposed its record GDPR fine of HUF100 million (close to EUR300,000) for a security breach of a Hungarian telco company where an ethical hacker reported a security vulnerability to the telco company, DIGI.
DIGI had not fixed a vulnerability in the open-source content management system of its publicly available website, which had been known for more than nine years and could have been detected and fixed using the appropriate tools. The ethical hacker exploited this vulnerability and accessed DIGI’s database containing the personal data of around 322,000 data subjects.
DIGI reported the personal data breach and terminated the vulnerability by making the necessary installations and deleting the database.
There was no actual theft or leak of personal data, apart from the access by the ethical hacker, but the Authority still imposed a high fine, as it considered that the vulnerability of the database was high (as it could potentially have resulted in identity theft or the misuse of personal data).
It is a very instructive case, as it shows that it is not enough to have an information security policy in place: data controllers must actually implement adequate security measures. In the absence of adequate security measures, the Authority can impose a high fine even if the data controller managed the data breach adequately. Of course, data controllers might also consider not reporting a breach to the Authority, but failure to report would be a primary aggravating factor in the context of a GDPR fine if the Authority were to discover the data breach.
Conclusion
Data breaches have become the Authority’s enforcement priority. The Authority has the strictest enforcement policy in this field (especially in light of the possible fines), so businesses are highly recommended to spend more time working out proper security measures and data breach management. Also, despite the uniform GDPR rules, there are several local Hungarian particularities, which also makes this area very important.
Kernstok Károly tér 8
1126 Budapest
Hungary
+36 1 501 9900
office@vjt-partners.com www.vjt-partners.com