Just a few weeks ago, news went viral that several students from a secondary school in Puertollano (Ciudad Real) were being investigated for creating and distributing sexually explicit photomontages featuring students and teachers from the school, using generative artificial intelligence (AI) tools to produce fake, explicit images from real photos taken from social networks. In total, the National Police has identified 61 victims, at least 22 of them minors.
Events like this only heighten concern about how artificial intelligence is being used, and worse, about how minors are using it within schools themselves, because cases like this show that neither teachers nor students are free from the risks associated with AI.
Given how worrying the subject is, we contacted Atico34, a leading firm in legal advice on AI and the LOPD, with the aim of shedding some light on the risks that generative AI poses for the protection of minors’ data and on what educational centers can do to improve educational cybersecurity.
What risks does generative AI pose for minors’ personal data?
Generative AI can have a particularly serious impact on the protection of minors’ data, both because of the nature of the data involved and because of minors’ legal and social vulnerability.
One of the greatest threats is the creation of manipulated images, videos or audio (deepfakes) of real minors, using photos from social networks, school recordings, etc. As Atico34 points out, this amounts to “a violation of the right to one’s own image and privacy, even more so when it involves the unauthorized dissemination of sexualized or humiliating content, or when it is used for practices that can have serious psychological and social consequences, such as cyberbullying, extortion or social exclusion at school.”
The use of AI tools such as Grok or ChatGPT in educational environments raises serious risks for the protection of minors’ data, from exposure to inappropriate content to the improper use of their personal data.
Another potential risk is the unauthorized use of personal data to train systems or models. These systems could be trained on data collected without consent, including photographs that are public or that have been shared by third parties without the knowledge of the child or their parents.
Atico34 clarifies that “using data without consent violates the GDPR principles of lawfulness and transparency. We must not forget that the processing of a minor’s data can only be based on their own consent when they are over 14; otherwise, the consent of the holders of parental authority or guardianship is required. In addition, the processing must be clear, informed, and comply with the GDPR and the LOPDGDD.”
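As a minimal illustration of how an educational platform might encode this age-of-consent rule, the sketch below checks whether a student’s own consent suffices or whether parental consent is required. It is a hypothetical example of ours, under the LOPDGDD’s 14-year threshold; the names (Student, consent_basis) do not come from Atico34 or any real platform.

```python
# Illustrative sketch of the Spanish age-of-consent rule for data processing
# (Art. 7 LOPDGDD: minors may consent from age 14). Hypothetical names only.
from dataclasses import dataclass

CONSENT_AGE_SPAIN = 14  # LOPDGDD threshold; other EU states set it between 13 and 16

@dataclass
class Student:
    name: str
    age: int
    own_consent: bool = False
    parental_consent: bool = False

def consent_basis(student: Student) -> str:
    """Return which consent legitimizes processing, or flag it as missing."""
    if student.age >= CONSENT_AGE_SPAIN:
        return "own consent" if student.own_consent else "missing: student consent required"
    return "parental consent" if student.parental_consent else "missing: parental/guardian consent required"

if __name__ == "__main__":
    print(consent_basis(Student("Ana", 13, parental_consent=True)))  # parental consent
    print(consent_basis(Student("Luis", 15)))                        # missing: student consent required
```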
The problem grows when it becomes evident that some of the AI tools most used by minors today, Grok among them, do not offer the necessary restrictions. For example, Elon Musk himself states that his tool can be used to start romantic or sexual relationships, and it even offers very explicit sexual suggestions without checking whether the user is a minor.
Generative AI also raises many other doubts. For example, once content is generated and spread with AI, it is practically impossible to remove it completely, which clashes with the right to be forgotten and amounts to a loss of control over one’s data. In addition, generative technologies involve increasingly complex and hard-to-audit models, which hinders basic aspects of data protection law, such as determining who is the data controller or applying the principle of proactive responsibility.
What can educational centers do to improve educational cybersecurity?
Educational centers play a crucial role in protecting the personal data of teachers, students and families, especially in an increasingly digitized context and in the face of risks such as the misuse of generative AI, social networks, cyberbullying or data leaks.
First, they must establish a data protection and digital safety plan in line with the GDPR and the LOPDGDD, and use it to create and maintain clear protocols on the safe use of devices, networks, platforms and educational applications. Atico34 stresses the importance of the Data Protection Officer (DPO) and reminds us that “appointing a data protection officer is mandatory by law for public centers and more than recommended for private ones, since it is precisely this figure who is responsible for ensuring compliance with data protection regulations in educational centers.”
The problem is that executing this data protection plan or policy for educational centers requires knowledge, as well as technical and human resources. For example, it involves using verified, GDPR-compliant educational platforms, or implementing infrastructure that makes it possible to secure Wi-Fi networks with strong encryption, keep operating systems and antivirus software up to date, apply two-factor authentication, make backups and manage access controls by user profile, as in the sketch below.
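As a simple illustration of the last of those measures, managing access by profile, the following sketch shows a deny-by-default permission check. The roles and permissions are hypothetical examples of ours, not taken from any product or from Atico34.

```python
# Illustrative sketch of profile-based access control for a school platform.
# Roles and permissions are hypothetical; real platforms define their own.
ROLE_PERMISSIONS = {
    "admin":   {"read_grades", "write_grades", "manage_users", "export_data"},
    "teacher": {"read_grades", "write_grades"},
    "student": {"read_own_grades"},
    "family":  {"read_own_grades"},
}

def is_allowed(role: str, action: str) -> bool:
    """Deny by default: only explicitly granted actions are permitted."""
    return action in ROLE_PERMISSIONS.get(role, set())

if __name__ == "__main__":
    print(is_allowed("teacher", "export_data"))      # False: export reserved for admins
    print(is_allowed("student", "read_own_grades"))  # True
```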
Continuous training for teachers and staff is also essential. Ideally, teachers would be trained in good digital practices, in detecting threats such as phishing or malware, in the safe management of virtual classroom platforms and online assessments, and so on. But, to be fair, wouldn’t this place too much responsibility on the teachers themselves? Are teachers also expected to become experts in data protection or AI? The reality is that teachers must contribute a minimum but, in general, centers need external help.
The importance of LOPD consultants for data protection in educational centers
The main advantage offered by a legaltech firm such as ATICO34 Group for data protection in educational centers is that “we have the infrastructure, tools and legal team needed to guarantee compliance with both data protection and artificial intelligence regulations, as well as to train teachers and students in this field,” the consultancy points out.
Another decisive factor, as already mentioned, is the mandatory Data Protection Officer. If there is no internal DPO, regulations force centers to hire the services of an external data protection company that can provide this professional profile.
By way of conclusion, and returning to the original theme of the article, Atico34 does not want to miss the opportunity to highlight that “both educational centers and companies in the sector find themselves in a scenario of change, full of risks and opportunities, and faced with the unstoppable boom of artificial intelligence there is no alternative but to adapt to it, which in our case means evolving towards a professional profile much better adapted to new technologies and the demands of the new regulations.”