On November 26th, the second edition of the Digital Society conference took place. Over 300 citizens, researchers, policy advisors, civil society organizations, industry representatives and entrepreneurs came together to connect and work towards a common goal: creating a better and more inclusive digital society. During the day, several workshops were organized by the Digital Society program lines. Health & Well-being co-organized three workshops, designed to discuss digital literacy, cybersecurity and the responsible use of artificial intelligence in digital healthcare. Read on to learn more about these topics and the contents of our workshops.

Cybersecurity and the use of connected medical devices – Health IoT – A workshop by the ‘Safety and Security’ and ‘Health & Well-being’ groups

The number of smart devices connected to the Internet is growing in healthcare organizations. Devices such as pacemakers, insulin pumps, wearables, and other at-home medical devices can increase the quality of healthcare by allowing doctors to monitor their patients remotely and adjust the function of the medical device if necessary. Besides their smart features, however, they also come with potential cybersecurity risks. For this workshop, we invited cybersecurity specialist Carlos Hernandez Ganan (TU Delft), medical doctor and researcher Marilou Jansen (Amsterdam UMC) and patient representative Stef Smits (region manager of the Dutch Diabetes Association) to learn more about their viewpoints on the use of smart medical devices in current practice. How do they perceive the level of risk to people’s lives and to the quality of healthcare services that connected medical devices could introduce? What do they see as the advantages or disadvantages of connected medical devices, and how do they see the future? Who is responsible if something goes wrong?

Connected devices may bring major benefits to clinical practice, but they may also introduce various types of risks. Doctors will, of course, consider the physical risks. In addition, the way people handle information may lead to negative side effects. For example, patients using a smart device may believe that they are being monitored constantly and, while waiting for their doctor to respond, fail to seek medical help when they should. Connected devices have an important signalling function, for example when the glucose level in the blood of a patient with diabetes falls below a certain threshold. There is, however, always a risk in setting thresholds, because the optimal threshold may vary between patients. It is therefore important to facilitate an infrastructure in which information from different sources, including smart devices, can be combined easily. Doctors will trust their own judgment more than a device and will weigh the options before making a final decision. Naturally, it will take time before doctors and patients fully trust smart devices.

Take-home messages about cybersecurity and connected devices

Society should pay more attention to the positive aspects of the digitalization of healthcare. The introduction of connected devices could lead to life-changing possibilities and give patients more control over their health status. Therefore, users should be more involved in the design process of medical devices.

In general, the cybersecurity risks in healthcare are currently low. Right now, the value of connected medical devices to cybercriminals is perceived as low. However, the market for smart devices is growing, and with it the risk that people become victims of hacking, blackmail, or collateral damage from cyberattacks. To prevent people from becoming victims of cyber threats, we should be prepared and carefully think through future scenarios. Patients cannot be held responsible for cybersecurity risks; we all share the responsibility for safe and secure healthcare.

The responsible use of Artificial Intelligence for clinical decision making – A workshop by the ‘Responsible Data Science’ and ‘Health & Well-being’ groups

At the start of this workshop, a brief introduction to the use of Artificial Intelligence in clinical decision making was provided. It covered the challenges of clinical decision support systems, bias in medicine (e.g. gender and racial biases), examples of (ir)responsible use of AI, and potential ethical issues such as not knowing exactly how a (self-learning) algorithm works (also referred to as the “black box”). Next, the approximately 25 participants were split into five stakeholder groups: (1) patients, (2) companies, (3) healthcare providers, (4) policy makers and (5) computer scientists. Each group discussed the following questions from its stakeholder’s perspective: What does the responsible use of AI mean for this stakeholder? What are this stakeholder’s needs? What issues or concerns do we foresee? What are potential solutions to make the use of AI in a clinical setting responsible? Finally, the results of the individual groups were shared with the rest of the participants in short pitches.

A few key messages from this workshop were the importance of trust, of co-creation and connection between stakeholders, of education (for example, for doctors to understand the advantages and limitations of an AI system, and for data scientists to gain knowledge of the clinical application and its impact), of responsibility, transparency, privacy and explainability (the ability to explain what information the algorithm uses to support its prediction), and the need for safety standards for the use of AI in clinical practice.
A report of one participant’s experience and a video impression can be found via the links.

Digital literacy in healthcare: a user-centred perspective on the implementation of e-health applications – A workshop by the ‘Citizenship & Democracy’ and ‘Health & Well-being’ groups

Around 2.5 million people in the Netherlands have low literacy skills and 22 percent can be defined as digital starters. To create an inclusive digital society, it is thus crucial to take into account digital literacy when developing and implementing health apps.

Together with around 30 participants, ranging from eHealth developers and researchers to healthcare professionals, we discussed what it takes to develop inclusive health apps and how to ensure that everybody can benefit from advances in digital healthcare solutions.

Photo: Brigit Klever

Joëlle Swart, postdoctoral researcher at the University of Groningen, opened the workshop by explaining the importance of digital literacy for citizenship and inclusion in a digital society, and for preventing a digital divide. Eveline van Dorp, senior project leader at Pharos (expertise center on health inequalities) and involved in the eHealth4all program, shared her expertise with the participants. Willeke van Dijk, PhD candidate at VU University, introduced the ‘Samen met Eva’ (‘Together with Eva’) app, aimed at supporting pregnant women in quitting smoking. In subgroups, we discussed the needs of digitally illiterate pregnant women and the potential barriers and solutions regarding the use of an app like ‘Samen met Eva’. Starting from the moment that the caregiver introduces the technology, the workshop followed the journey of the patient until the successful completion of the treatment. During the wrap-up, Eveline shared her insights into how to meet the needs of low-literate patients, how to lower the barriers they perceive, and how to incorporate these insights in the design and implementation of health apps.