Apple has introduced age verification measures in the UK that will require iPhone and iPad users to confirm they are adults before accessing certain services, including 18-plus apps. The change arrives with the iOS 26.4 update and is being implemented in response to legal requirements in certain regions, including the UK.
According to Apple, users may be prompted to confirm that they are adults when creating a new Apple Account or while using specific services. This requirement applies to actions such as downloading apps or changing certain settings linked to their Apple Account.
Apple Age Verification UK: How Users Confirm Age
As part of the UK rollout of Apple age verification, users can confirm their age through multiple methods. Apple may use existing account information, such as whether a credit card is already linked to the account or how long the account has been active, to help determine whether a user is an adult.
Users also have the option to add a credit card to confirm their age or scan a government-issued ID, such as a driver’s license or national ID. Apple has stated that credit card details or ID documents are not stored unless users choose to save them for other purposes, such as adding a payment method.
To complete the process, users must update their device to the latest software version and follow prompts in the Settings app. If they choose not to confirm immediately, they will continue to see a notification in Settings prompting them to complete the process later.
If verification cannot be completed on the device, Apple requires users to use approved methods such as a driver’s license, national ID, or a credit card. Debit cards, gift cards, and passports are not supported, although a Digital ID in Apple Wallet created using a U.S. passport may be accepted in some cases.

Impact on Child Online Accounts
The age verification changes also affect how minors in the UK use Apple services. In the UK, children under 13 cannot create an Apple Account without parental consent and must be part of a Family Sharing group. In such cases, a parent or guardian who has confirmed their age may be required to approve certain actions, including app downloads or changes to safety settings.
Depending on the region, some features may not be available to users until they turn 18. Apple has also noted that age requirements for child accounts vary across countries, with thresholds ranging from under 13 in most regions to higher limits in others.
Regulatory Push on Child Online Safety
The UK rollout of Apple age verification comes as regulators increase scrutiny of how platforms enforce age restrictions. The Information Commissioner’s Office (ICO) and Ofcom have asked major platforms to outline how they plan to strengthen child safety protections, particularly in preventing children under 13 from accessing services meant for older users.
The UK government is also considering additional measures, including potential restrictions on social media use for younger users and pilot programs to test new regulatory approaches. Several European countries have announced or are considering similar steps.
Ofcom has stated that many platforms are not effectively enforcing minimum age requirements, with children continuing to access services despite age restrictions. The regulator has called on companies to implement stronger measures, including effective age checks, improved protections against grooming, safer content feeds, and proper assessment of new product features before they are introduced.
Dame Melanie Dawes, Ofcom Chief Executive, said: “These online services are household names, but they’re failing to put children’s safety at the heart of their products. There is a gap between what tech companies promise in private, and what they’re doing publicly to keep children safe on their platforms.
“Without the right protections, like effective age checks, children have been routinely exposed to risks they didn’t choose, on services they can’t realistically avoid. That must now change quickly, or Ofcom will act.”
Growing Focus on Enforcement
The Apple age verification measures align with broader enforcement efforts under the UK’s online safety framework. Ofcom has written to major platforms, including Facebook, Instagram, Roblox, Snapchat, TikTok, and YouTube, requiring them to demonstrate how they will enforce minimum age rules and improve child safety protections.
Platforms have been given deadlines to respond, after which Ofcom will assess their actions and determine whether further regulatory steps are necessary. The regulator has also indicated it is prepared to take enforcement action if companies fail to meet expectations.
The introduction of age verification at the device and account level reflects increasing emphasis on ensuring that age restrictions are applied more consistently across digital services, particularly where children may be exposed to adult content or features.