UK GDPR SAR and Article 22 for gig workers
Summary
Gig workers in the UK can still use data protection law in 2025 to 26 to demand their data, challenge platform evidence and force a human review of serious automated decisions, but the Data (Use and Access) Act 2025 has made Article 22 weaker than it used to be. A strong Subject Access Request under Article 15 and a separate automated decision challenge are now two linked but different tools, and both matter when Uber, Deliveroo or Amazon Flex deactivate someone using algorithmic flags or opaque "risk" systems. Platforms still have to respond to a valid access request within one month in most cases, give key information about the data and logic used, and provide a way to contest significant automated decisions and get human intervention.
Key facts (UK 2025 to 26)
Under UK GDPR Article 15, a worker can ask a platform for confirmation that it processes their personal data, a copy of that data, and extra information including the purposes, categories, recipients, retention periods, source of the data, and information about automated decision-making.
The ICO says there are no formal wording requirements for a valid Subject Access Request, so a worker can make one verbally or in writing, including by email or social media, as long as it is clear they want their personal data.
The normal response deadline is one month from receipt of the request. The ICO's post-DUAA guidance says organisations can now "stop the clock" in some cases if they reasonably need more information to identify the person or clarify the request.
The Data (Use and Access) Act 2025 changed the old Article 22 system. GOV.UK and the ICO now say significant automated decisions are allowed more widely, but platforms must still give safeguards including information about the decision, a route to make representations, a route to obtain human intervention and a route to contest the decision.
The ICO's guidance says a decision is "solely automated" only if there is no meaningful human involvement. A token rubber stamp by a human is not enough.
The ICO's 2026 messaging on automated decisions in recruitment shows it is actively warning organisations that they must build in safeguards and not rely on automation without proper review, which matters directly for gig platforms using similar management systems.
Subject Access Requests are now subject to a "reasonable and proportionate search" standard after the Data (Use and Access) Act 2025, which may let platforms argue for narrower searches than before, so workers need to be precise about what they want.
If a platform stalls, refuses, redacts too much, or ignores a significant automated decision challenge, the worker can complain to the ICO and may also combine that with ACAS early conciliation or tribunal action where status, discrimination or detriment issues overlap.
Legislation, case law, regulation
UK GDPR Article 15, right of access, which gives the right to confirmation, a copy of personal data and supplementary information, including information about automated decision-making.
UK GDPR automated decision-making provisions as amended by the Data (Use and Access) Act 2025, which replaced the old, stricter Article 22 framework with a broader permission model plus safeguards, now set out in replacement automated decision-making provisions inserted into the UK GDPR by the Act.
Data Protection Act 2018, which works alongside UK GDPR and remains part of the UK data protection framework for enforcement, exemptions and remedies.
ICO guidance, "A guide to subject access", updated September 2025, which sets out the one month response rule, identity checks and what organisations must provide.
ICO guidance, "Rights related to automated decision making including profiling", updated September 2025 and under review after the Data (Use and Access) Act 2025, which says organisations must provide information, let people make representations, let them obtain human intervention and let them contest the outcome.
GOV.UK guidance, "Data (Use and Access) Act 2025: data protection and privacy changes", which confirms that solely automated significant decisions are now allowed more widely, but only if safeguards are in place.
ICO employment and AI fairness materials, including Article 22 fairness guidance and 2026 statements on automated hiring, which show the ICO expects meaningful human review and risk assessments where automated systems affect work opportunities.
Uber BV v Aslam [2021] UKSC 5, which is not a data protection case, but matters because gig workers often use data rights to get the evidence they need for worker status, unfair treatment and deactivation claims.
How it actually works
From a worker's point of view, there are two separate but connected problems. First, you want to see what data the platform actually holds about you. Second, you want to challenge a decision that may have been made by an algorithm, or by a human who just rubber-stamped an algorithm. That is why GigKiln should generate two letters, not one.
The first letter is the basic Subject Access Request under Article 15. This asks for your personal data and the key background information the law says you are entitled to. On gig platforms, that often means your trip or order history, GPS data, messages, complaint logs, fraud flags, facial recognition results, document verification records, internal notes, safety reports, earnings records, performance scores, risk profiles and communications about deactivation or throttling. Platforms do not have to give you other people's personal data or legally privileged material, but they cannot just send a thin export of trip receipts and pretend that is the whole file.
A good Subject Access Request should be specific. Because the law now talks about "reasonable and proportionate" searches and allows some stop-the-clock behaviour where more information is needed, workers should name the account, the email address, the phone number, the platform role, the relevant time period, and the exact categories of data they want. If you just say "send me everything", the platform may drag its feet or send an incomplete pack. If you say "send me all complaint logs, fraud or trust flags, document verification logs, internal account notes, GPS records, facial recognition records, account risk scores and records of any automated decision-making used in relation to my deactivation between 1 January 2025 and 31 March 2026", you are in a stronger position.
The second letter is the automated decision challenge. Before the Data (Use and Access) Act 2025, workers often relied on the old Article 22 as a broad right not to be subject to solely automated significant decisions. Since June 2025 that is no longer the law in the same way. GOV.UK and the ICO now say platforms can make significant automated decisions more widely, but they must build in safeguards. Those safeguards are the practical rights your tool should target: tell me what happened, tell me what data was used, let me make representations, give me human intervention, and let me contest the decision.
For gig workers, the classic trigger is deactivation, suspension, fraud flagging, route throttling, block allocation changes or safety scoring. If Uber, Deliveroo or Amazon Flex says "your account has been deactivated" but gives no usable explanation, and you suspect a fraud score, rating model, identity check or automated complaint system drove the result, the worker should send the automated decision challenge straight away. The letter should say that the decision has legal or similarly significant effects because it cuts off income, and should demand confirmation of whether the decision was solely automated or involved meaningful human review.
If the platform claims there was human involvement, the worker should push on what that actually means. The ICO's position is that meaningful human involvement means a person who carefully analyses the decision and all the relevant inputs and can genuinely change the outcome. A support worker clicking "uphold" after reading an auto summary is unlikely to be meaningful human review. That matters because many platform "appeals" look human on the surface but are still heavily automated underneath.
If the platform stalls or refuses, there are three common next steps. First, reply and point out that the one month deadline is running, ask them to justify any stop-the-clock pause, and narrow the request if that will get the data moving. Second, complain to the ICO with copies of your request, proof of delivery and the platform's replies or silence. Third, if the data issue links to deactivation, discrimination, trade union activity or worker status, speak to a union or firm like Leigh Day, Bates Wells or Farore Law and think about ACAS early conciliation before the three-months-minus-one-day deadline.
The ICO's enforcement record against gig platforms is patchy. The ICO has produced substantial guidance on AI, employment and automated decision-making, and it has looked at labour platform issues in wider work on algorithmic management. However, its direct public enforcement against UK gig platforms for automated firing or deactivation remains limited, and much of the pressure has come from litigation, campaigns and complaints rather than headline fines. That means workers often need to be persistent and treat SARs as evidence tools, not magic bullets.
Template letter 1, basic SAR
Purpose: get the full data file and force the platform to confirm what it is holding.
Suggested structure:
Your identity details, account email, phone number, worker ID, platform name.
Clear statement: "This is a Subject Access Request under Article 15 UK GDPR and the Data Protection Act 2018."
The date range.
Specific categories requested, for example account notes, complaints, fraud flags, GPS/location data, message logs, facial recognition checks, document verification results, performance scores, ratings, risk scores, earnings records, suspension and deactivation records.
Request for supplementary information, including purposes of processing, recipients, retention periods, source of data, and information about any automated decision-making and profiling.
Request for the response in electronic form.
Reminder of the one month deadline.
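The suggested structure above can be sketched as a small generator. This is an illustration only: the `SarDetails` fields, `build_sar_letter` helper and letter wording are assumptions about how a GigKiln-style tool might work, not legal drafting or a real GigKiln API.

```python
from dataclasses import dataclass, field
from datetime import date


@dataclass
class SarDetails:
    """Details the worker supplies; field names are illustrative."""
    platform: str
    full_name: str
    account_email: str
    phone: str
    worker_id: str
    date_from: date
    date_to: date
    categories: list[str] = field(default_factory=list)


def build_sar_letter(d: SarDetails) -> str:
    """Assemble an Article 15 request following the structure in this guide:
    identity details, a clear SAR statement, the date range, specific data
    categories, supplementary information, format and deadline reminder."""
    cats = "\n".join(f"- {c}" for c in d.categories)
    return (
        f"To: {d.platform} Data Protection Team\n\n"
        "This is a Subject Access Request under Article 15 UK GDPR and the "
        "Data Protection Act 2018.\n\n"
        f"Name: {d.full_name}\nAccount email: {d.account_email}\n"
        f"Phone: {d.phone}\nWorker ID: {d.worker_id}\n\n"
        "Please provide all personal data you hold about me for the period "
        f"{d.date_from:%d %B %Y} to {d.date_to:%d %B %Y}, including:\n{cats}\n\n"
        "Please also provide the supplementary Article 15 information: the "
        "purposes of processing, recipients, retention periods, the source of "
        "the data, and information about any automated decision-making and "
        "profiling.\n\n"
        "Please respond in electronic form within one month of receipt."
    )
```

Asking the worker for a concrete date range and category list up front also reduces the platform's room to argue that a "reasonable and proportionate" search justifies a narrow response.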
Template letter 2, escalated automated decision challenge
Purpose: force a real explanation and human review where a platform used automation to suspend, throttle or deactivate a worker.
Suggested structure:
Reference the decision and date, for example deactivation on 12 February 2026.
State that the decision has legal or similarly significant effects because it removed access to paid work.
Ask whether the decision was solely automated or involved meaningful human involvement.
If automation was involved, demand the safeguards required by UK GDPR as amended, including information about the decision, a way to make representations, human intervention, and a route to contest the decision.
Ask for the categories of data, rules, scores, profiling indicators and complaint inputs used.
Ask for the name or function of the human reviewer, and what material they actually reviewed, if the platform claims there was human review.
Attach a short representation explaining why the decision is wrong, incomplete or unfair.
State that if the platform does not comply, you will complain to the ICO and rely on the correspondence in any ACAS or tribunal process.
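The second template can be sketched the same way. Again, the function name, parameters and wording are assumptions for illustration, not a real GigKiln interface or legal drafting.

```python
def build_adm_challenge(platform: str, decision: str, decision_date: str,
                        representation: str) -> str:
    """Assemble an automated decision challenge following the structure in
    this guide: reference the decision, assert significant effects, demand
    the post-DUAA safeguards, probe any claimed human review, and attach
    the worker's representations."""
    return (
        f"To: {platform} Data Protection Team\n\n"
        f"I am writing about the {decision} on {decision_date}. This decision "
        "has legal or similarly significant effects because it removed my "
        "access to paid work.\n\n"
        "Please confirm whether the decision was solely automated or involved "
        "meaningful human involvement. If automation was involved, I require "
        "the safeguards under UK GDPR as amended by the Data (Use and Access) "
        "Act 2025: information about the decision, the opportunity to make "
        "representations, human intervention, and a route to contest the "
        "decision.\n\n"
        "Please state the categories of data, rules, scores, profiling "
        "indicators and complaint inputs used. If a human reviewed the "
        "decision, please state their name or function and what material "
        "they actually reviewed.\n\n"
        f"My representations: {representation}\n\n"
        "If you do not comply, I will complain to the ICO and rely on this "
        "correspondence in any ACAS or tribunal process."
    )
```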
Worked example
Take a 22-year-old Uber driver in Glasgow. In February 2026 Uber deactivates his account after saying there were "fraud concerns" linked to repeated trip anomalies. He has around £42,000 turnover in the 2025 to 26 tax year and expects about £8,000 in allowable expenses, so losing the app suddenly is a serious income hit. He believes Uber's systems have confused him with account sharing or route manipulation, but Uber has told him almost nothing.
He sends two separate emails. The first is a Subject Access Request. He gives his full name, account email, phone number, driver ID and the date range of 1 January 2025 to 1 March 2026. He asks for account notes, GPS records, complaint logs, fraud or trust flags, facial recognition checks, document verification logs, support tickets, internal escalation notes, ratings records and all records of suspension or deactivation. He also asks for the supplementary Article 15 information, including the source of any complaints and information about automated decision-making.
The second email is an automated decision challenge. He states that the deactivation on 18 February 2026 had a similarly significant effect because it cut off his access to paid work. He asks Uber to confirm whether the decision was solely automated or involved meaningful human review, and if a human was involved, who reviewed it and what evidence they looked at. He asks for human intervention, a chance to make representations and a full explanation of the data or profiling factors used.
If Uber replies with a partial data pack but no fraud logic and says "a specialist team reviewed your case", he replies again asking them to explain what that "review" involved and why it counts as meaningful human involvement. If one month passes without proper compliance, he files an ICO complaint attaching his original requests and Uber's vague replies. At the same time, because the deactivation date matters for employment rights, he contacts a union and thinks about ACAS early conciliation before the three-months-minus-one-day deadline from the deactivation date expires. The SAR does not fix the problem by itself, but it can force out the internal notes and data points that later help him attack the deactivation properly.
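The two deadlines in this example can be computed with simple calendar-month arithmetic. This is a rough sketch: it models only the basic one-month SAR rule (not DUAA stop-the-clock pauses) and the usual three-months-minus-one-day tribunal limit (not ACAS early conciliation extensions), and real limitation dates always depend on the facts.

```python
import calendar
from datetime import date, timedelta


def add_calendar_months(start: date, months: int) -> date:
    """Add whole calendar months, clamping to the month end when the day
    does not exist (e.g. 31 January + 1 month lands on 28/29 February)."""
    month_index = start.month - 1 + months
    year = start.year + month_index // 12
    month = month_index % 12 + 1
    day = min(start.day, calendar.monthrange(year, month)[1])
    return date(year, month, day)


def sar_response_deadline(received: date) -> date:
    """One calendar month from receipt of the request: the basic rule,
    before any stop-the-clock pause."""
    return add_calendar_months(received, 1)


def acas_limitation_date(act_date: date) -> date:
    """Three months minus one day from the act complained of: the usual
    tribunal time limit, before any early conciliation extension."""
    return add_calendar_months(act_date, 3) - timedelta(days=1)
```

For the 18 February 2026 deactivation above, this gives a tribunal limitation date of 17 May 2026, which is why the guide stresses starting ACAS early conciliation promptly.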
What Reddit, TikTok and forums get wrong
Misinformation: "A SAR has to use legal magic words or the platform can ignore it." This shows up in forum advice telling workers to copy rigid solicitor wording. Correction: the ICO says there are no formal wording requirements for a valid Subject Access Request, and a worker can make one verbally or in writing as long as it is clear they want their personal data.
Misinformation: "Article 22 still means platforms are banned from using automated decisions on you full stop." This is common in older TikTok clips and campaign material that pre-date the Data (Use and Access) Act 2025. Correction: GOV.UK and the ICO now say the law is more permissive after the 2025 changes, so solely automated significant decisions can be used more widely, but organisations must still give safeguards including information, human intervention, representations and a chance to contest the result.
Misinformation: "If a support agent sends a template email, that proves there was human review and you cannot challenge it as automated." This appears in Reddit and Facebook discussions after deactivations. Correction: the ICO says human involvement must be meaningful, with a person carefully analysing the decision and being able to alter the outcome. A superficial rubber stamp is not enough.
Misinformation: "Platforms always have 30 days and can never pause the clock." Correction: the one month rule is still the basic rule, but post-DUAA guidance allows organisations in some cases to pause the clock if they reasonably need more information, for example to confirm identity or clarify scope. That is why workers should make requests precise and provide identifying details up front.
Action steps for the reader
Send a Subject Access Request as soon as a platform suspends, throttles or deactivates you, and include your full identifying details and a precise list of the data you want.
Send a separate automated decision challenge if the platform's action cut off your work or income and you suspect scoring, profiling, facial recognition, fraud flags or other automation.
In the automated decision letter, ask directly whether there was meaningful human involvement and, if so, what the human actually reviewed.
Keep proof of sending, screenshots of the request and every platform reply, because you may need them for the ICO, ACAS or a tribunal.
If the platform asks for more information, reply quickly but do not let them turn that into endless delay; ask them to explain any stop-the-clock pause clearly.
Complain to the ICO if the platform ignores you, refuses without proper reasons, or gives a thin response that dodges the key automated decision points.
If the data issue links to deactivation, discrimination or worker status, speak to a union like IWGB, ADCU or GMB, or to solicitors such as Leigh Day, Bates Wells or Farore Law, before the ACAS three-months-minus-one-day deadline runs out.
Related tools GigKiln should build
Subject Access Request generator that asks platform, account details, date range and data categories, then produces a clean Article 15 request.
Automated decision challenge generator that asks what happened, when, and why the worker thinks automation was involved, then outputs a platform-specific challenge letter.
ICO complaint builder that turns a failed SAR timeline into a complaint-ready summary with attachments checklist.
Deactivation evidence organiser that helps a worker match platform decisions to the data categories they should request.
Human review test checker that helps a worker assess whether the platform's so-called review was probably meaningful or just a rubber stamp.
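The human review test checker in the list above could start as little more than a checklist. The questions below paraphrase the "meaningful human involvement" points discussed in this guide; the all-or-nothing scoring is a heuristic assumption for a first prototype, not an ICO-endorsed test.

```python
# Checklist questions paraphrasing the meaningful-human-involvement points
# in this guide; wording and scoring are assumptions, not an official test.
CHECKS = [
    "Did an identifiable person, not just 'a team', review the decision?",
    "Did the reviewer see the underlying data, not only an auto summary?",
    "Did the reviewer have real authority to change the outcome?",
    "Did the reviewer consider the worker's representations?",
]


def review_looks_meaningful(answers: list[bool]) -> bool:
    """Heuristic: treat the review as probably meaningful only if every
    check passes; any 'no' suggests a rubber stamp worth challenging."""
    return len(answers) == len(CHECKS) and all(answers)
```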
Related guides
"How to send a Subject Access Request to Uber, Deliveroo or Amazon Flex in 2025 to 26".
"What Article 22 and the Data (Use and Access) Act 2025 now mean for gig workers".
"How to challenge algorithmic deactivation on gig platforms".
"When to complain to the ICO after a platform ignores your data request".
"Using data requests to support an Uber, Deliveroo or Amazon Flex appeal".
Sources
ICO, "A guide to subject access", updated 9 September 2025, accessed 19 April 2026.
ICO, "Subject access request Q and As for employers", updated 24 November 2024 and marked under review after the Data (Use and Access) Act 2025, accessed 19 April 2026.
ICO, "Rights related to automated decision making including profiling", updated 21 September 2025, accessed 19 April 2026.
GOV.UK, "Data (Use and Access) Act 2025: data protection and privacy changes", published 26 June 2025, accessed 19 April 2026.
ICO, "The Data Use and Access Act 2025, what does it mean for data protection law?", published 29 June 2025, accessed 19 April 2026.
ICO, "Automated decisions can streamline the hiring process", published 30 March 2026, accessed 19 April 2026.
ICO, "What is the impact of Article 22 of the UK GDPR on fairness?", updated 3 December 2024, accessed 19 April 2026.
Supreme Court of the United Kingdom, "Uber BV and others v Aslam and others, UKSC 2019/0029", accessed 19 April 2026.
Arthur Cox, "UK Supreme Court rules Uber drivers entitled to workers' rights", 28 February 2021, accessed 19 April 2026.
International Labour Law Journal, "Uber BV v Aslam: work relations cannot safely be left to …", 30 December 2022, accessed 19 April 2026.