
The EU’s AI Act: What does it mean for Switzerland?

The enormous impact that AI is already having on various areas of our lives has prompted the European Union (EU) to implement a legal framework to regulate artificial intelligence more closely. There is a consensus in Brussels that it is essential to develop a clear and comprehensive strategy to harness the potential of AI while ensuring that any use of AI is in line with the EU’s ethical principles and fundamental values.

Political agreement on the AI Act was reached in December 2023. A two-year implementation phase will now follow before the law applies, during which the EU Commission envisages voluntary compliance with the AI Act on the part of companies. The new rules will become final and legally binding in 2026 at the earliest.

The AI Act marks a significant milestone in the global regulation of artificial intelligence and makes Europe the first continent to regulate the use of AI comprehensively. The regulation aims to ensure that Europeans can trust what AI produces and that AI systems do not harm people. The Union’s existing fundamental rights and values are to be safeguarded and the effective enforcement of applicable law strengthened.

But what does this mean for Switzerland and the FADP?
Overall, it is likely that the EU’s AI Act will also encourage the Swiss legislature to rethink and adapt its own regulations and strategies in the field of AI in order to facilitate cooperation with EU member states. However, the exact impact will depend on the specific agreements and implementation mechanisms negotiated between Switzerland and the EU.

One area that will be affected is trade and market regulation. The AI Act introduces new regulations and standards for AI applications. If Switzerland wants to maintain its intensive trade relations with the EU and sell AI services and products on the EU market, it will have to fulfil the requirements and provisions of the AI Act. Many Swiss companies that develop or offer AI technologies could therefore be affected in the future.

Data protection legislation will also be affected. The AI Act lays down rules for the exchange of data used for the operation of AI systems. If Switzerland wishes to exchange data with EU member states, the data protection and data security requirements of the AI Act may have to be met. This may have an impact on companies and organisations that work with data across borders.

The AI Act also creates uniform standards for the use of AI technologies. Switzerland could participate in these efforts and benefit from the joint development of standards and best practices. This would improve the interoperability of AI systems and promote the exchange of expertise and experience. In addition, the AI Act will probably have international reach: it sets standards for dealing with AI that can serve as a model for other countries and regions. Switzerland, which is very advanced in the research and development of AI systems, can benefit from this global trend and strengthen its position as a centre of innovation.

In the middle of the year, the Social Democratic parliamentary group in Switzerland had already called on the Federal Council to lay the foundations for a law that is essentially similar to the AI Act, or at least as compatible with it as possible. In a statement, the Federal Council announced that it was closely monitoring the development of the AI Act and was essentially pursuing a digitalisation policy similar to that of the EU. This also means that the aim is to create an AI law that harmonises as well as possible with European legislation. Nevertheless, the final result in the EU had to be awaited first. Now that it is available, it can be assumed that the Federal Council will soon take action as well.

In conclusion, it can be said that the EU’s AI Act will affect Switzerland in various areas, particularly with regard to trade, data exchange, cooperation and international standards. However, the exact impact will depend on the implementation and interpretation of the AI Act by Swiss policymakers, and it will be important to closely monitor developments in this area.

Source: German-Swiss Chamber of Commerce / Beat Singenberger

How small and medium-sized companies can easily protect themselves against data protection mishaps

Small and medium-sized companies generally have very limited resources for data protection and usually cannot assign dedicated specialists to the topic. But how can you still protect yourself against expensive legal disputes and violations of the Swiss Data Protection Act (FADP) and the GDPR?

Incorrectly installed video cameras in guest rooms, unencrypted customer databases or improperly disposed-of address data: violations of data protection law (FADP and GDPR) happen quickly in companies.

FADP violations and warnings

Almost all companies are obliged to comply with the FADP. However, many find the rules complicated and difficult to understand, which can lead to costly and unpleasant infringements and give rise to warnings. Surveys also show that many companies have not yet fully implemented the FADP and the GDPR, so security gaps remain here too. And even for companies that are largely compliant with data protection regulations, a residual risk remains due to legal uncertainties.

High costs possible

Violations of the FADP can be expensive, not least under criminal law. Although high fines are rather rare for small companies, costs for lawyers, courts, expert opinions or even compensation and recourse claims can threaten their existence in the event of a legal dispute.

Protection for small companies

We will show you what options are available to protect yourself and take the necessary precautions.

Claim for damages by an applicant

A company does not have much time to respond to data protection requests. If a former applicant wants to know whether their data is still stored, the request must be answered without undue delay.

Under the FADP and the GDPR, a request for information addressed to an employer must be answered promptly, and in any case within one month. The Duisburg Labour Court has ruled that this in no way means that the employer always has a full month.
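
In practice, it helps to log the date on which such a request is received and to derive the statutory deadline from it automatically. The following TypeScript sketch illustrates the idea; the function name and the month-end handling are our own simplifying assumptions and, of course, no substitute for legal advice.

```typescript
// Minimal sketch: compute the Art. 12(3) GDPR response deadline
// ("within one month of receipt of the request").
// The month-end handling below is a simplifying assumption.
function responseDeadline(receivedAt: Date): Date {
  const deadline = new Date(receivedAt);
  deadline.setMonth(deadline.getMonth() + 1);
  // If the target month is shorter (e.g. 31 January + 1 month),
  // fall back to the last day of that month instead of overflowing.
  if (deadline.getDate() !== receivedAt.getDate()) {
    deadline.setDate(0); // day 0 = last day of the previous month
  }
  return deadline;
}

// Example: a request received on 31 January 2024 must be answered
// by 29 February 2024 at the latest.
console.log(responseDeadline(new Date(2024, 0, 31)).toDateString());
```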

A service provider had advertised a position for a receivables management clerk. An applicant submitted his personal documents but received a rejection. More than six years later, the applicant contacted the service provider again by email and requested information under the GDPR as to whether and what personal data about him was stored. He set the company a deadline of 30 days to respond.

The company had still not responded one day after the deadline set by the applicant had expired. The applicant therefore contacted the company again by email and reminded it of his request. The company then provided the former applicant with negative information, stating that it had not stored any of his data.

The former applicant then asked the company to explain why it had not provided this information earlier. The company replied that, with regard to Article 12 GDPR, it had provided the information in due time. The former applicant took a different view: in his opinion, the company had violated Article 12 GDPR by providing the information late. He demanded that the company pay monetary compensation of EUR 1,000. The company did not comply with this request, so the former applicant brought an action before the labour court and demanded compensation of EUR 2,000 for the alleged violation of the GDPR.

The labour court awarded the former applicant compensation in the amount of EUR 750. According to Art. 12 para. 3 GDPR, the controller must provide the data subject with information without undue delay and in any case within one month of receipt of the request. The company had failed to fulfil this obligation by replying too late.

How do I delete data in accordance with the FADP or GDPR?

Under data protection law, companies are required to securely delete personal data of customers as well as of employees. This calls for a concept according to which deletions can be carried out in a legally compliant way. Since various applications and databases are often involved, the concept should define uniform, GDPR-compliant processes.

An essential component of the GDPR is the principle of storage limitation, according to which personal data may only be stored for as long as it is needed for the intended purpose. Together with the principle of data minimisation, this results in the need for a deletion concept.

A well thought-out deletion concept reduces the risk of data protection breaches: the less data there is, the lower the risk of misuse by unauthorised persons.

A clear deletion concept also creates trust: customers and users see that their data is not being stored unnecessarily. Deleting superfluous data increases the efficiency of databases and storage systems and thus reduces costs. The concept must define which data is stored, for what purpose and for how long; a sketch of such a retention schedule follows below.
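
What this can look like in practice is sketched below: a simple retention schedule per data category, plus a check that flags records whose retention period has expired. The data categories, retention periods and function names are purely illustrative assumptions; the actual periods must be derived from your own legal bases and contracts.

```typescript
// Illustrative sketch of a deletion concept: which data, for what purpose,
// and for how long. Categories and periods are placeholder assumptions.
interface RetentionRule {
  category: string;        // e.g. "applicant data", "customer invoices"
  purpose: string;         // why the data is stored
  retentionMonths: number; // how long it may be kept
}

const retentionSchedule: RetentionRule[] = [
  { category: "applicant data", purpose: "recruiting process", retentionMonths: 6 },
  { category: "customer invoices", purpose: "statutory bookkeeping duties", retentionMonths: 120 },
  { category: "newsletter addresses", purpose: "marketing consent", retentionMonths: 24 },
];

interface StoredRecord {
  category: string;
  storedAt: Date;
}

// Returns the records that are due for deletion under the schedule above.
function dueForDeletion(records: StoredRecord[], now: Date = new Date()): StoredRecord[] {
  return records.filter((rec) => {
    const rule = retentionSchedule.find((r) => r.category === rec.category);
    if (!rule) return false; // no rule defined: review manually rather than delete blindly
    const expiry = new Date(rec.storedAt);
    expiry.setMonth(expiry.getMonth() + rule.retentionMonths);
    return expiry < now;
  });
}
```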

A deletion concept is not only a legal requirement, but also demonstrates responsible data management practice. It protects the privacy of the data subjects and helps to use resources more efficiently and reduce the risk of data breaches.

We are happy to support you in this!

Is Microsoft 365 data protection compliant?

In Switzerland as well as in Germany and France, Microsoft 365 in particular is currently being criticised by data protection authorities. Last November, for example, the French Ministry of Education banned the use of the free Microsoft 365 offerings in schools on the grounds that they would violate the European General Data Protection Regulation (EU GDPR).

In Germany, too, the conference of the independent data protection supervisory authorities of the federal and state governments came to the conclusion that Microsoft 365 cannot, in principle, be considered compliant with data protection law. Microsoft sees it differently: in its view, the authorities’ criticism is no longer about data protection but elevates “data protection to a dogmatic end in itself”.

In Switzerland, meanwhile, the Zurich cantonal government caused a stir: in April 2022, the government council approved Microsoft 365 for the cantonal administration. Dominika Blonski, the data protection commissioner of the canton of Zurich, had signed off on the decision, but makes it clear that despite this decision the cantonal administration does not have a free pass to use Microsoft 365. Sensitive data must not under any circumstances be exposed to unlawful access by other authorities, and the Microsoft cloud cannot guarantee that, Blonski says in an interview. More generally, the problem lies in the fact that Microsoft dominates the market for office products, says the Zurich data protection commissioner, adding: “Switzerland also has to discuss how such dependencies can be broken as part of the discussions on digital sovereignty.”

However, SMEs should not only pay attention to the right settings, but also to the service contract with Microsoft. Furthermore, according to Korostylev, it is advisable for SMEs in particular to have a cloud exit strategy up their sleeve and to bear in mind that Microsoft intends to offer exclusively cloud-based delivery models from 2025.

Privacy Shield 2.0 – Trial No. 3 with an unclear outcome

The Data Privacy Framework (also known as Privacy Shield 2.0) is the third attempt at a data protection agreement between the EU and the US, following the Safe Harbor scheme and the Privacy Shield. The new framework is intended to ensure that personal data transferred from the EU to the USA is protected in line with the EU General Data Protection Regulation (GDPR).

The agreement became necessary because the European Court of Justice (ECJ) had declared the previous agreements invalid in the so-called “Schrems judgements”. In July 2023, the EU Commission adopted an adequacy decision for the Data Privacy Framework, under which the US commits to improving the level of protection for personal data transferred from the EU to the US. With the adequacy decision of 10 July 2023, the Commission declares this level of protection to be adequate and in line with the GDPR’s requirements for data transfers to a third country.

This is intended to end the legal uncertainty that has prevailed since the Schrems rulings. The adequacy decision serves as the basis for the transfer of personal data to certified companies or organisations in the USA from the time of its adoption by the EU Commission. The transfer of personal data to a US company that participates in the DPF and is certified no longer requires any further security measures or additional standard contractual clauses. Whether the new agreement will actually withstand the expected legal scrutiny by the European Court of Justice is not yet foreseeable.

From Switzerland’s point of view, the question is whether it can join the agreement between the EU and the USA with a parallel solution – as it did with “Safe Harbor” and “Privacy Shield”.

Google Analytics – can I still use it?

The use of Google Analytics for the evaluation of access and user behaviour on websites continues to enjoy great popularity. But what about data protection? Can I continue to use this analysis tool without any problems?

The answer is: No!

After the ruling of the European Court of Justice in July 2020 on the invalidity of the Privacy Shield, the data protection organisation European Centre for Digital Rights (noyb), founded by Max Schrems, filed more than 100 complaints. The first decisions already made it clear that the use of Google Analytics in the EU is unlawful. Subsequently, the data protection authorities of Austria, France, the Netherlands and Sweden found the use of Google Analytics on websites to be in breach of the GDPR’s provisions on third-country transfers. Similar decisions by other authorities are expected to follow.

The authorities see above all a violation of the general principles of data transfer under Art. 44 GDPR, since Google’s analytics programme transfers personal user information to the parent company in the USA.
For data protection-compliant use of Google Analytics, you have to take various measures and make adjustments (a short technical sketch follows after the list):

  • First, conclude a data processing agreement with Google.
  • Adjust the Google Analytics code so that IP addresses are only collected in anonymised form.
  • Adapt your privacy policy: it must clearly explain how Google Analytics affects data protection.
  • Include an opt-out with which visitors to your site can object to data collection by Google.
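
For the second and fourth points, a minimal sketch is shown below. It assumes a Universal Analytics property loaded via gtag.js (for newer GA4 properties, Google states that IP addresses are anonymised by default); the property ID and the function name are placeholders.

```typescript
// Sketch only: IP anonymisation and a simple opt-out for Google Analytics
// loaded via gtag.js. "UA-XXXXXXX-Y" is a placeholder property ID.
declare function gtag(...args: unknown[]): void;

const GA_PROPERTY_ID = "UA-XXXXXXX-Y";

// Collect IP addresses in anonymised (truncated) form only.
gtag("config", GA_PROPERTY_ID, { anonymize_ip: true });

// Opt-out: the documented "ga-disable-<PROPERTY_ID>" window flag disables
// tracking for this visitor; persisting it in a cookie keeps the choice
// across page reloads (e.g. wired to a link in the privacy policy).
function optOutOfGoogleAnalytics(): void {
  (window as unknown as Record<string, unknown>)[`ga-disable-${GA_PROPERTY_ID}`] = true;
  document.cookie =
    `ga-disable-${GA_PROPERTY_ID}=true; expires=Thu, 31 Dec 2099 23:59:59 UTC; path=/`;
}
```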

We will be happy to support you with this implementation.

EU fines now with uniform rules

It is well known that violations of the European General Data Protection Regulation (GDPR) are punished with fines by the data protection authorities of the EU member states: fines can amount to up to EUR 20 million or up to 4 percent of global annual turnover, whichever is higher.

Up to now, it has been up to the competent national data protection authority to determine the amount of a fine: each EU member state decided for itself how far the maximum amounts provided for by the GDPR are exhausted. There are now new rules for the assessment of fines: the European Data Protection Board (EDPB) has adopted final guidelines on their calculation.

At its meeting on 24 May 2023, the European Data Protection Board (EDPB) adopted Guidelines 04/2022 on the calculation of administrative fines under the GDPR, following a public consultation.

The guidelines now provide data protection supervisory authorities with uniform standards and a harmonised framework for determining fines. However, the harmonisation only relates to the basis for calculating the fines; the final amount will continue to be set individually by the respective national supervisory authority, owing to the adjustment options built into the guidelines’ model.

The guidelines provide for a five-step assessment procedure that takes into account in particular the nature and gravity of the infringements and the turnover of the undertakings concerned:

Step 1: Sanctionable acts
The supervisory authorities examine whether the case at hand involves sanctionable acts and to what extent these have led to violations of the GDPR. In particular, it will be examined whether one or more acts subject to a fine have been committed.


Step 2: Determining the starting amount
The starting amount for the fine calculation is determined from three factors: The type of infringement (a), the gravity of the infringement (b) and the turnover of the company (c).

Type of infringement (Art. 83 (4) – (6) GDPR)
Violations of Art. 83(4) of the GDPR may be punished by a fine of up to EUR 10 million or, in the case of a company, up to 2% of its total annual worldwide turnover in the preceding business year. Violations of Article 83 (5) and (6) of the GDPR may be punished with a fine of up to EUR 20 million or, in the case of a company, up to 4% of its total annual worldwide turnover in the preceding business year. This results in the statutory maximum amounts that a fine may not exceed in each case.

Severity of the breach
The criteria listed in Art. 83 (2) GDPR are used to determine the gravity of the breach. The determination must result in a severity level in order to be able to determine the starting amount as a percentage of the statutory maximum amount:

  • Low severity: starting amount between 0 and 10% of the statutory maximum.
  • Medium severity: starting amount between 10 and 20% of the statutory maximum.
  • High severity: starting amount between 20 and 100% of the statutory maximum.

The turnover of the company
Based on the company’s annual turnover, the starting amount determined above is adjusted further: it can be reduced to between 0.2% and 50% of the amount initially calculated.

Step 3: Determination of aggravating or mitigating circumstances
The supervisory authorities identify aggravating or mitigating circumstances that may increase or decrease the amount determined in Step 2. These include, for example, the behaviour of the controller (willingness to cooperate, countermeasures) and whether there have been previous breaches of the GDPR. The increase or decrease is applied individually by the supervisory authority.

Step 4: Determining the upper limit
The fine calculated so far is again compared with the statutory maximum amounts of Art. 83 (4)–(6) GDPR. It is also decided whether the static (EUR 10 or 20 million) or the dynamic (2% or 4% of annual turnover) upper limit applies to the fine assessment; according to Article 83 (4) and (5) GDPR, the higher of the two amounts is decisive.

Step 5: Possible readjustments
In the final step of the fine assessment, the supervisory authorities evaluate the determined fine pursuant to Art. 83 (1) GDPR with regard to effectiveness, proportionality and deterrence in order to be able to make any readjustments.
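
To make the interplay of the five steps more tangible, here is a simplified calculation sketch in TypeScript. It follows the percentages described above but reduces the authorities’ discretion to a handful of parameters; the function and parameter names are our own, and the result is in no way a prediction of an actual fine.

```typescript
// Simplified sketch of the five-step model described above. All
// discretionary choices are reduced to plain parameters; the actual
// assessment remains with the supervisory authorities.

type InfringementType = "art83_4" | "art83_5_6"; // Step 1: which acts are sanctionable
type Severity = "low" | "medium" | "high";

interface FineInput {
  infringement: InfringementType;
  severity: Severity;
  severityFraction: number;    // position within the severity band, 0..1
  annualTurnoverEUR: number;   // worldwide annual turnover
  turnoverAdjustment: number;  // Step 2c: 0.002..1 (0.2%..100% of the starting amount)
  circumstancesFactor: number; // Step 3: e.g. 0.8 mitigating, 1.2 aggravating
}

// Step 4: static cap (EUR 10m / 20m) or dynamic cap (2% / 4% of turnover),
// whichever is higher.
function statutoryMaximum(input: FineInput): number {
  const staticCap = input.infringement === "art83_4" ? 10_000_000 : 20_000_000;
  const dynamicCap =
    (input.infringement === "art83_4" ? 0.02 : 0.04) * input.annualTurnoverEUR;
  return Math.max(staticCap, dynamicCap);
}

function estimateFine(input: FineInput): number {
  const max = statutoryMaximum(input);

  // Step 2a/2b: starting amount as a percentage of the statutory maximum,
  // depending on the severity band (0-10%, 10-20%, 20-100%).
  const bands: Record<Severity, [number, number]> = {
    low: [0.0, 0.1],
    medium: [0.1, 0.2],
    high: [0.2, 1.0],
  };
  const [lo, hi] = bands[input.severity];
  let amount = max * (lo + (hi - lo) * input.severityFraction);

  // Step 2c: adjustment based on the company's turnover.
  amount *= input.turnoverAdjustment;

  // Step 3: aggravating or mitigating circumstances.
  amount *= input.circumstancesFactor;

  // Step 4: the result must never exceed the statutory maximum.
  // Step 5 (effectiveness, proportionality, deterrence) remains a
  // discretionary review by the authority and is not modelled here.
  return Math.min(amount, max);
}

// Purely illustrative example: a medium-severity Art. 83(5) infringement by a
// company with EUR 50 million turnover and a strong turnover-based reduction.
console.log(
  estimateFine({
    infringement: "art83_5_6",
    severity: "medium",
    severityFraction: 0.5,
    annualTurnoverEUR: 50_000_000,
    turnoverAdjustment: 0.1,
    circumstancesFactor: 1.0,
  }),
); // prints 300000
```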

Training for more data protection and security competence

With our efficient online learning modules in the areas of data protection management and information security, you can increase data protection and security competence in your company. In addition to our customised on-site training courses, we offer a web-based basic training course that conveys the essential knowledge and concepts. Learners are immersed in a colourful, engaging presentation of the topic and absorb the material step by step in a playful way. On request, we can also prepare learning content tailored specifically to you and your requirements. Our goal: training should no longer feel like training, it should inspire!

Spotify also fined: EUR 5 million

The Swedish data protection authority only took action against the company after a delay of four years, and only after being compelled to do so by a court. The allegation: Spotify failed to respond properly to data access requests.

The Swedish data protection authority has ordered Spotify to pay the equivalent of around 5.03 million euros in fines. According to the authority, Spotify violated Article 15 of the General Data Protection Regulation (GDPR). In the specific case, the issue was how Spotify handled personal data and how customer access to this data was regulated.

The Integritetsskyddsmyndigheten (IMY) found that although Spotify provided users with personal data upon request, it “did not provide clear enough information about how this data was used by the company.” Spotify needs to be more transparent about “how and for what purposes users’ personal data is processed,” the agency demands.

According to IMY, this lack of transparency and comprehensibility meant that “it was difficult for individuals to understand how their own personal data was processed.” As a result, Spotify made it difficult for customers to verify whether the handling of their own personal data was lawful.