Talking HealthTech: 352 – Complying with European laws for AI powered healthcare solutions. Miklos Zorkoczy, Legalai§e Zorkóczy Law Office


Source: talkinghealthtech.com

Provided by:
Talking HealthTech

Published on:
26 May 2023


Meet Miklos Zorkoczy

Miklos is a lawyer committed to supporting the advancement of legal tech and HealthTech innovation in Europe. His dedication to the field is evident in his decision to learn Python, which has given him a deeper understanding of how AI can be designed to be more ethical and trustworthy. As a lecturer at a legal tech skills centre based in Hungary, Miklos shares his expertise and insights with students to help them develop the skills needed for successful careers in this field. He is also a valued member of a legal tech and HealthTech project at a startup academy in Hungary, and a Talking HealthTech THT+ Member.

Interest in Python

Miklos explains that he became interested in learning Python because of its relevance to his work as a lawyer in the international field of AI. As AI is now commonly used in medical devices and legal tech tools, it has become essential for lawyers to be well-versed in the legal aspects of AI and technology. Even lawyers who practise in a single jurisdiction may face international problems relating to AI tools. With this knowledge, they can work anywhere in the world and be prepared to handle these types of legal issues.

In the Legal Space of AI

Miklos explained that the lack of a clear legal definition of AI presents both challenges and opportunities. However, a draft regulation is currently being developed in the EU, which will provide a definition of AI and establish guidelines for compliance. To export an AI solution to the EU, it is necessary to comply with community laws, such as the General Data Protection Regulation, as well as the laws of each member state. Obtaining a CE certificate is also required to conduct commercial activity in the single European market. Compliance with EU and local laws is necessary before rolling out any AI-related activity in the European market.


AI Legislation

The AI legislation in Europe takes a comprehensive, risk-based approach, and healthcare is one of the areas it will affect. Medical solutions will be treated as high-risk activities and will require a risk management system covering training, validation and testing datasets, technical documentation, event recording, accountability, transparency, human oversight, and compliance with cybersecurity requirements. Providers must also operate a quality management system and undergo a conformity assessment before registering with the relevant authority. The good news is that there will be AI regulatory sandboxes for startups that want to develop and test innovative AI systems in a controlled environment before entering the market. However, obtaining quality data for AI systems in healthcare remains a challenge.

Health Infrastructure with AI

Hungary's national eHealth infrastructure gives startups access to a centralised healthcare data management system. This can be a significant advantage compared with larger countries that are divided into regions or smaller healthcare units, as a single system makes it easier for startups to access data and improve healthcare activity. The new AI legislation in Europe can also be a great opportunity for startups, providing structure, resources, and definitions for the development and testing of innovative AI systems and creating more trust in the process.

Considering global regulations is crucial, as seen with GDPR, which affected organisations worldwide, and the forthcoming AI regulations may similarly be adopted in other countries. While both GDPR and the AI regulations share the principle of transparency, what transparency means for AI is still open to interpretation. The AI law will rest on similar principles, but questions remain about how they will work in practice; addressing transparency, accountability, and explainability is necessary for effective AI regulation.

EU Certification 

Organisations outside of Europe looking to enter the market must obtain a CE mark and register their product in an EU registry. The CE mark is a certificate of conformity required to sell products in the EU market. The registration process begins with obtaining a local licence in the relevant EU country, after which the organisation can expand into other EU countries. These steps are crucial for any organisation looking to enter the European market.


Challenge in Data Collection

The future of data protection and bias-free datasets is a crucial topic for many organisations. The EU protects personal data by law, whereas in some other countries data is treated as a common asset that can be sold commercially, with individuals retaining the right to consent or decline. The conversation emphasises the importance of obtaining unbiased datasets to train machine learning systems properly. However, startups face challenges in obtaining data because of strict data protection laws, so the key question worldwide is how to access the data needed to train systems effectively.
