
What is the daily routine of a Data Protection Officer like?

The Data Protection Officer is an independent supervisory authority. Independence means that we are not part of the cantonal administration but are attached to the Cantonal Council, which also acts as the appointing body for the Data Protection Officer. The tasks and activities of a Data Protection Officer can be divided into three main areas: informing, advising, and monitoring.

Informing: Our role includes raising awareness among employees of public bodies in the Canton of Zurich and the general public about their fundamental right to privacy. We achieve this through events such as seminars, data protection video competitions, and panel discussions. Our primary communication tool is our website, and soon we will expand our presence on social media.

Advising: We provide guidance to public bodies on legal and technical matters. This ranges from brief telephone consultations to long-term project support spanning months or even years.

Monitoring: We conduct on-site audits to verify compliance with data protection regulations by public bodies. During these audits, we conduct interviews and document our findings along with recommended actions in a report. Public bodies are given time to implement necessary measures, and we verify compliance after the deadline.

In performing these tasks, our team members collaborate across disciplines: legal experts work closely with IT professionals.

“An absolute classic among reported incidents is the sending of documents such as tax assessments, invoices, or medical reports to the wrong person. In most cases, incorrectly stored master data is the cause of such misdeliveries.”

– Dr. iur. Dominika Blonski

What data protection incidents are currently prevalent?

An absolute classic among reported incidents is the sending of documents such as tax assessments, invoices, or medical reports to the wrong person. In most cases, incorrectly stored master data is the cause of such misdeliveries. When an incident is reported, we assess which measures the public body must take to prevent similar incidents in the future and advise it accordingly. Under the cantonal data protection law, no fines are imposed on public bodies for such incidents.


Have data protection issues increased since more people are working from home or taking workations?

Working remotely or from home does indeed involve certain risks, but these can be mitigated with specific measures. For example, it is important to separate business and personal data on laptops and smartphones, to keep all devices up to date, to refrain from sharing passwords, to dispose of business-related paper documents securely rather than in regular recycling bins, and to take care during phone and video calls that no one can overhear the conversation. Screens should also be shielded from unauthorized viewing.


How do you assess the increasing use of AI? Is more regulation needed? Does data protection need to be adapted?

The use of AI involves data processing at several stages: when systems are trained, when prompts are entered, and when user data is collected as so-called incidental data. All of these processing activities fall under data protection law and must therefore adhere to data protection principles. As far as possible, AI should be used in compliance with these rules. However, when AI goes further, for example by making decisions autonomously or issuing tax assessments on its own, the question arises whether the existing legal framework is adequate. At the very least, regulation should ensure transparency about the use of AI for affected individuals and give them the right to demand human review of decisions made by AI.

Additionally, AI often involves products from external service providers, typically operated in the cloud. Such outsourcing raises data protection issues of its own, notably compliance with the rules on outsourcing and commissioned data processing. It must be ensured that the processing complies with data protection law, especially since U.S.-based providers may be subject to the CLOUD Act, which can give U.S. authorities access to the data. Given the inherent risks of AI projects, a prior assessment should be conducted with the data protection authority before implementation. The Data Protection Officer examines whether the project meets legal and technical requirements and advises on how to implement it in a data protection-compliant manner.

“If AI autonomously makes decisions, such as issuing tax assessments, it must be regulated to ensure transparency for affected individuals. They should have the right to request that decisions made by AI be reviewed by a human.”

– Dr. iur. Dominika Blonski

In many cases, I can decide for myself how and with whom I share my data. New technologies such as facial recognition or analyses on social media allow inferences about behavior or psychological state. Increasingly, data is collected without the knowledge of the individuals affected, bypassing them entirely. How do you see this development? Is data sovereignty ultimately a futile struggle? Do we need a different awareness of data – that emotions, for instance, are personal data that concern no one else?

When private entities, such as companies, collect data, they do so on the basis of consent from the affected individuals. When using social media, users consent to various analyses by accepting the terms and conditions; this is a matter of private law. Facial recognition, however, illustrates a loss of control over one's own data, since biometric data can be collected unnoticed. This raises the question of whether the existing legal framework is sufficient. Legislators are obligated to ensure that fundamental rights are protected among private parties as well. Even when public authorities use facial recognition, they may only do so under conditions the constitution permits: its use must be authorized by legislation and must be proportionate, meaning it is only permissible if no less intrusive means would achieve the same purpose. Blanket video surveillance of streets with integrated facial recognition would likely not be proportionate.

Above all, however, there must be a societal discussion about how we want to deal with this issue. Do we find it acceptable to be completely transparent, or do we see limits? These considerations should feed into the legislative process. But first, this public debate needs to take place.


“If data can be collected unnoticed, as with facial recognition, the question is whether the legal framework is sufficient.”

– Dr. iur. Dominika Blonski

How can people be made more aware of data protection issues? Should data protection and handling be taught as a school subject?

Educating and raising awareness among young people about data protection is certainly crucial. In collaboration with the Pädagogische Hochschule Zürich, we have developed educational materials for all three school cycles that can be used in classrooms.


The current data protection law of the Canton of Zurich is undergoing revision. What changes can be expected?

The total revision of the Data Protection Act of the Canton of Zurich (IDG) aims to strengthen the principle of transparency by introducing a Public Access Commissioner, a role that will be entrusted to the Data Protection Officer. This change ensures that municipalities also have a contact point for questions regarding public access, which is currently not the case. Additionally, a provision on open government data will be introduced to make certain data available for broader public use.

Public authorities will be required to maintain a publicly accessible directory of the algorithmic decision-making systems they use, particularly in the area of artificial intelligence. Furthermore, a framework will be established that allows pilot projects under strict conditions, enabling the processing of special categories of personal data before a formal legal basis has been enacted through regulation.