Realistically, what are the risks of not being GDPR compliant?
Do companies actually care about being GDPR compliant? Or rather, do they care enough to actually spend the time and effort needed to be compliant?
Hello, I’m about to complete my Master’s degree in Digital Law in France (M2) and I want to become a Data Protection Officer. I’m also planning to obtain the IAPP certification (CIPP/E).
My question is the following:
Since salaries in France are relatively low, would it be better for me to move to Luxembourg, where it seems easier to find a job with a French degree, or to the Netherlands or would you recommend another country?
In the case of choosing the Netherlands, is it realistic to get hired in this field with a French degree, or would it be preferable to pursue an LLM at a Dutch university first?
Hi all, looking for general experiences/opinions on a Subject Access Request to a school in England.
A search reportedly identified around 330+ documents relating to my child/family, but only around 30 files were ultimately disclosed. The ICO later told me the school had provided an appropriate response, emphasising that SAR rights are to personal data within documents, not necessarily full documents.
I was also told many items were considered not relevant / not disclosable.
My question is: is this a common outcome with school SARs?
Have others experienced large search-result numbers being reduced substantially after review? How is “relevance” usually interpreted in practice?
Not looking to name the organisation or restart a complaint — just trying to understand whether this is standard practice or something others have also found frustrating.
Thanks
Hello everyone!
I am not selling anything; I’m just here for advice because I’m not sure how to approach a GDPR issue regarding my future business idea.
I am based in the EU, and I’ve recently built an automation that scrapes public information from public sources about small businesses that do not have a website.
My automation reads the data, uses AI to create a website, and deploys a demo version to static web hosting. I’m planning to use this pre-made website as a hook to gain customers. As a new business, we are trying to give people something tangible they can see with their own eyes to build trust.
We plan on sending cold emails and SMS messages telling them we noticed they don't have a website, so we built one for them, and it will cost 200 euros. If no answer is received or they don’t want the website, the demo will be deleted within a maximum of 14 days due to a lack of response, or immediately upon their request.
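The 14-day deletion promise above is itself a storage-limitation commitment, so it helps to automate it rather than rely on memory. A minimal sketch of that cleanup logic, assuming deployment dates and response status are tracked somewhere (all names here are hypothetical, not from any real system):

```python
from datetime import date, timedelta

RETENTION_DAYS = 14  # delete demos with no response after this many days

def expired_demos(registry, today=None):
    """Return ids of demos past the retention window with no response.

    registry maps demo id -> (deploy date, prospect responded?).
    """
    today = today or date.today()
    cutoff = today - timedelta(days=RETENTION_DAYS)
    return [demo_id
            for demo_id, (deployed, responded) in registry.items()
            if not responded and deployed <= cutoff]
```

A scheduled job could call this daily and tear down each returned demo, with a separate immediate-delete path for explicit requests.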
However, I have some concerns regarding GDPR.
Hearing from people who have navigated this before would be incredibly helpful.
Thank you in advance! Any insight or knowledge you can share would be much appreciated. :)
As in the title, I got an email from NatWest saying they're updating how they handle biometric data from May. I ran it through Claude to make sure I wasn't being an idiot, and now I'm slightly concerned.
Currently they need your consent to use your face/voice data. From May they're moving to a legitimate interests basis (the one you should be unticking when a website asks for your consent), which means they've decided it's fine without needing your say-so. You can still object, but they don't have to listen.
I'm probably overreacting but it's biometrics, not a password. Feels like a pretty significant thing to bury in a routine email.
Is this standard practice in Ts & Cs or worth actually doing something about?
My company offers a storage and backup-as-a-service solution to other B2B companies. As part of the service we offer a managed-service wraparound: a UK service desk.
We have a new client in the USA. We'll be using a US data centre for the actual backup, but the UK service desk will still manage tickets.
Our US client has no UK/EEA subsidiaries or affiliates, nor any customers in the UK/EEA. Does the UK GDPR apply to the service desk element even though the staff raising tickets are solely in the USA?
So confused, please help!
Good evening,
After the European Commission's data controller office failed to respond within 30 days to a right-to-erasure request, in breach of Art. 12 of GDPR 679/2016, I contacted the Italian data protection authority (the Garante, GPDP), which sent me the links to file a complaint via the EDPS, since the matter may fall outside the Garante's competence.
Has anyone ever filed a complaint like this? How did it go for you, how long did the response take, and what are the risks?
[I'm now expecting yesterday's paladins again, the ones who act as defenders of the ETS and of "good people", for whom rights belong only to those who deserve them and who don't want mistakes to be erased. Anyone taking that attitude will be reported to Reddit and to the mods of this subreddit.]
[Do so also out of respect for other curious people who will one day find this post simply by searching on Google, because they have the same doubts and don't want to land in pointless squabbles in the comments.]
So TikTok just rolled out a new privacy toggle: "Allow AI to remix content." This feature is reportedly being turned on by default, and if you want to opt out, you currently have to do it manually on every individual video (there is no account-wide "off" switch yet).
From what I’ve seen from some (very angry, if I may add) content creators, this allows TikTok’s AI models to use our footage as reference data to generate new content, including branded ads.
I’m curious from a GDPR perspective, is this not a major violation? If this feature allows them to use our likeness to generate new synthetic content, doesn’t that require explicit, informed opt-in rather than a hidden, retroactive opt-out? Or is there a loophole 😬
After Schrems II, I kept seeing the same thing: European freelancers and small businesses routing signed contracts through DocuSign or HelloSign, assuming a "GDPR-compliant" badge means their data is protected.
It doesn't. Not fully.
The ECJ in Schrems II said explicitly that US surveillance law (FISA Section 702) means Standard Contractual Clauses alone can't provide adequate protection for data transferred to the US. The data is still accessible to US intelligence agencies. An SCC doesn't change the law of the country where the servers sit.
To actually comply, you'd need to:
- Sign a DPA with the vendor
- Rely on SCCs as your transfer mechanism
- Conduct a Transfer Impact Assessment
- Document all of it
For a 3-person accounting firm. For a solo freelancer. For a small law practice.
I wrote a longer piece breaking down exactly what Schrems II changed for document signing workflows and what a genuinely compliant setup looks like: [link to article on swipesign.xyz]
Happy to answer questions here — this is a topic a lot of compliance folks have quietly avoided because it's uncomfortable.
I had surgery at a private hospital (self-pay) in the UK over 8 years ago. The hospital's privacy policy is vague: "we'll keep medical records as long as necessary for regulatory and legal reasons".
I understand the minimum recommended retention period is 8 years, but beyond that they can keep the records for as long as they want. However, GDPR also requires them to keep the data only for as long as necessary.
So I find it hard to understand how they decide the "as long as necessary" retention period. Does the hospital unilaterally decide this? Is it legally possible for me to force them to delete it after 8 years?
The Commission says its new EU age verification app is ready.
In the press conference, Von der Leyen says you’d set it up with a passport or ID card, then use it to prove your age online without revealing anything else. She also says it’s anonymous, users can’t be tracked, and the app will be open source.
Posting here because that raises some obvious GDPR/privacy questions.
How anonymous is it? We should probably start digging!
Hey r/gdpr,
I ran into the problem of calculating GDPR fine ranges while working on my dissertation — I needed a way to estimate fine ranges for my research, and realized there wasn't really a good tool out there that properly followed the official methodology. So I ended up building one, and figured I'd share it here in case it's useful to anyone else: https://bussgeldrechner-dsgvo.de/en/
It's a GDPR fine calculator that estimates a realistic range for potential fines based on the official EDPB Guidelines 04/2022 on the calculation of administrative fines (not just the "up to €20M or 4%" headline number everyone already knows).
There were a few things I tried to get right.
Obvious disclaimer: it's an approximation. Supervisory authorities aren't bound by it and the real calculation involves a lot of case-specific judgment. But I found that most "GDPR fine calculators" out there either oversimplify wildly or are basically lead-gen forms for law firms, so I wanted something that actually follows the EDPB method and is free to use.
Happy to hear feedback — especially if you spot edge cases where the logic doesn't match how you'd expect a DPA to reason. Hope it's useful for some of you!
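For context, the skeleton of the EDPB method looks roughly like this: first determine the legal maximum under Art. 83(4)/(5), then pick a starting range as a percentage of that maximum based on the seriousness of the infringement. A simplified sketch (function names are mine, and the real guidelines add turnover-based adjustment of the starting amount, aggravating/mitigating factors, and a final proportionality check):

```python
def legal_maximum(turnover_eur: float, severe: bool) -> float:
    """Art. 83(4)/(5): the higher of a fixed cap or a % of worldwide annual turnover."""
    fixed, pct = (20_000_000, 0.04) if severe else (10_000_000, 0.02)
    return max(fixed, turnover_eur * pct)

# Starting-point range as a fraction of the legal maximum (Guidelines 04/2022)
SERIOUSNESS_BANDS = {
    "low":    (0.0, 0.10),
    "medium": (0.10, 0.20),
    "high":   (0.20, 1.00),
}

def starting_range(turnover_eur: float, severe: bool, seriousness: str):
    """Indicative starting range for a fine, before further adjustments."""
    lo, hi = SERIOUSNESS_BANDS[seriousness]
    cap = legal_maximum(turnover_eur, severe)
    return (lo * cap, hi * cap)
```

So for a company with €1bn turnover and a medium-seriousness Art. 83(5) infringement, the starting range lands at 10-20% of the €40M legal maximum, before any adjustment.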
My used car (BMW iDrive 6) contains the details of a number of contacts. When I clicked on one contact, it contained details such as iCloud account and passwords, Mastercard passwords, revenue logins, home security system passwords, etc.
Firstly, what should I do? I've heard people talk about contacting the dealer to alert them to this issue, but I'd appreciate any information.
Secondly, how does something like this happen? How can the car have all of these contacts' personal details? Is there anything I should do to prevent this from happening to me?
(I’m not entirely sure if this belongs to the subreddit but I’m happy to remove it.)
Hi all, I need to delete a Satispay account because I don't use it. Their process is basically to tell customer service and wait for them to do it. It's been more than a working week with no reply from them, despite multiple contacts.
I've heard it's a GDPR violation for account deletion not to be as easy as signing in, but I'm not sure which section of the regulation actually states this.
I will wait some more, but if they don't do anything, what are my options?
I live in Italy.
Thank you all
I'm very confused about the process of having Meta delete my data. Do I manually delete my account first and then submit a GDPR data deletion request? LLMs tell me to do this, but also to expect to have to send a photo of my ID to Meta for identity verification once I submit the request, since regular account verification won't work after manually deleting my account…
Alternatively, if I submit a GDPR data deletion request before (or instead of) manually deleting my account, might my account remain even after my other data is deleted?
What is the correct flow here?
Hi everyone, I’m looking for a technical/compliance discussion regarding a complex DSAR scenario.
The Context: A patient is undergoing SOT (Supportive Oligonucleotide Technique) therapy with a laboratory (RGCC International, with HQ in Switzerland, processing in Greece). This is a "personalized" therapy where an miRNA preparation is created based specifically on the patient's own Circulating Tumor Cells (CTCs).
The patient is also developing a personalized neoantigen cancer vaccine with a separate team. For clinical safety and treatment coordination, the vaccine development team needs to know the genetic targets of the SOT therapy (the biomarkers/genes being silenced).
The Conflict: The lab has declined to disclose the specific gene names or targets, citing the miRNA sequence as a proprietary "trade secret."
The Technical Question: In the context of personalized medicine—where the "product" is derived entirely from the patient’s own unique biological data—how is the balance typically struck between Article 15 (Right of Access) and Article 15(4) (Rights of others/Trade Secrets)?
Personal Note: I submitted a formal DSAR today, but I haven't had any engagement from the lab for over two weeks on my initial inquiry for the data. For a late-stage cancer patient, every day is critical. Navigating this administrative "black hole" while fighting the disease is incredibly taxing, and I'm trying to understand the regulatory landscape to ensure we get the data needed for the vaccine in time.
Thanks for any info you could share on this matter.
For anyone managing both GDPR and EU AI Act compliance, the classification question keeps coming up: which AI Act obligations actually apply to your system, and how do they interact with your existing GDPR program?
The biggest confusion I'm seeing is around the provider vs deployer distinction. Most companies using third-party AI models assume they're deployers with lighter obligations. In practice, embedding an LLM into your product often makes you a provider under the Act with much heavier requirements (Articles 9-17 vs Article 26).
We built a free tool that runs through the full classification logic: prohibited practices under Article 5, Annex III high-risk check, Article 6(3) exemption analysis, and GPAI provider detection. Outputs a PDF with the specific articles applying to your system and penalty exposure.
aguardic.com/compliance/eu-ai-act/roadmap
No signup, no email gate. Built it because the classification step is where most teams get stuck before they can even start mapping obligations to their existing GDPR controls.
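The triage order the Act imposes can be sketched as a short decision flow. This is a deliberately simplified illustration (the field names are mine, and it ignores the provider/deployer question and GPAI entirely):

```python
def classify(system: dict) -> str:
    """Very simplified AI Act triage, in the order the Act applies:
    prohibited (Art. 5) -> high-risk (Annex III, unless Art. 6(3) exempt) -> everything else."""
    if system.get("prohibited_practice"):
        # Art. 5, e.g. social scoring: no amount of controls makes this lawful
        return "prohibited"
    if system.get("annex_iii_use_case"):
        if system.get("narrow_procedural_task"):
            # one of the Art. 6(3) exemption grounds; must be documented
            return "not high-risk (Art. 6(3) exemption)"
        return "high-risk"
    return "minimal/limited risk"
```

The real analysis is far more granular, but getting this ordering wrong is exactly where the provider-vs-deployer confusion tends to start.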
Would value input from anyone navigating both frameworks simultaneously.
We rolled out Microsoft 365 Copilot Chat (the standalone version) over a year ago. Since then, new features keep appearing (Outlook integration, meeting summaries, Glance Cards), and nobody has formally assessed the GDPR implications of each one.
We have a DPA with Microsoft, but I'm not confident it covers the Bing web-grounding exception, or that most people realise Anthropic models are explicitly excluded from the EU Data Boundary.
Curious how others are handling this. Do you do a fresh DPIA for each new feature rollout? Do you have a standing AI policy that covers it? Or are most orgs just hoping for the best?
Would also be interested if anyone has put together decent documentation for this. Everything I've found online is either too generic, not AI specific, or written for lawyers, not for the person actually doing the work.
Hello, I need some help. I recently created an account with a CV software tool, which proved to be pretty useless.
There's no account-delete button anywhere, and after searching for ten minutes I found an email address for privacy concerns.
I have now written them three emails asking them to delete my account and all data associated with it, and every time I get the same response stating that I'm on the free plan and that I'm not being charged any money.
I have reminded them that they must delete my data upon request, but the response was the same. What do I do?