Exploring digital ageism: Executive Summary

Introduction

OB3 Research was commissioned by the Older People’s Commissioner for Wales to undertake a short research project to explore digital ageism in relation to older people aged 60 and over.

Digital ageism refers to stereotyping, prejudice or disadvantage directed at people on the basis of age in digital contexts. It operates at structural, institutional and individual levels, often intersecting with other inequalities such as gender, disability, ethnicity and socio-economic status.

This literature review explores digital ageism as a systemic issue that shapes older people’s participation across all aspects of life in an increasingly digital society.

Key Findings

Policy and governance frameworks

Digital ageism is reinforced by gaps in governance and regulation. While international bodies such as the United Nations (UN) and the World Health Organization (WHO) increasingly frame digital inclusion as a human rights issue, age remains less visible in digital rights and data protection laws than characteristics such as gender or disability. In the UK, fragmented strategies and the move towards “digital by default” services risk embedding ageist assumptions about users’ digital capability.

The growing use of AI highlights further risks, as older adults are underrepresented in datasets and rarely considered in algorithmic impact assessments. Although regulators such as the Information Commissioner’s Office (ICO) have begun to acknowledge age bias, efforts are limited, leaving scope for systemic digital ageism to become normalised in governance and service delivery.

Design of digital services

The design and development of digital systems often neglect the needs of older users. Products and services are shaped around younger “digital natives,” resulting in complex interfaces, inaccessible features and limited adaptability. Older adults are rarely involved meaningfully in co-design or usability testing; where engagement occurs it is often tokenistic and late in the process. “Compassionate ageism” (well-meaning assumptions that older people require simplified tools) can further limit autonomy and reinforce stereotypes.

Algorithms and artificial intelligence (AI)

AI is increasingly used in healthcare, employment and public services but frequently embeds bias against older adults. Underrepresentation in datasets and the predominance of youth-centric design assumptions produce discriminatory outcomes, such as reduced diagnostic accuracy in healthcare or exclusionary recruitment practices. These biases are often hidden in opaque decision-making systems, limiting accountability. While AI has potential to enhance independence and wellbeing, inclusive development practices that involve older adults remain rare.

Intersectionality

Digital ageism intersects with other inequalities, shaping diverse experiences among older people. Older women often face a “double disadvantage” through both ageism and sexism, with lower incomes and caring responsibilities limiting digital access. Ethnic minority communities may face language and cultural barriers, while disabled and rural older people encounter accessibility and connectivity challenges. Intersectional approaches are essential to addressing these compounded exclusions.

Employment

Older workers face systemic barriers in recruitment, training and workplace digitalisation. References to “digital natives” in job descriptions and reliance on AI recruitment systems replicate existing ageist biases. Many older employees lack access to tailored digital training, while workplace cultures often stereotype them as resistant to change. These factors contribute to exclusion and early exit from the workforce. In contrast, age-inclusive approaches such as intergenerational learning and co-designed training improve engagement, retention and productivity.

Social participation and leisure

Digital platforms provide important opportunities for connection and cultural engagement but are frequently designed with younger audiences in mind. Older people encounter usability barriers, lower visibility in algorithms and age-related abuse online. Exclusion from digital leisure and cultural participation reduces opportunities for connection, wellbeing and identity. The COVID-19 pandemic further highlighted these gaps, with many older people unable to engage in digital social activities.

Health, social care and wellbeing

Digital health services, from online booking systems to telehealth consultations, are often designed around assumptions of high digital literacy and mobile access, excluding many older adults. Surveillance technologies in care settings risk reinforcing paternalistic practices and reducing dignity.

In healthcare AI, underrepresentation of older adults in datasets undermines diagnostic accuracy and safety. Digital exclusion contributes to isolation, poorer mental health and reduced trust in services, while inclusive design and multiple access pathways can mitigate risks.

Lifelong learning

Digital skills and confidence are critical to inclusion and empowerment in later life. However, older learners face structural, financial and attitudinal barriers to participation, and provision often overlooks their needs, motivations and learning styles. Stereotypes portraying older people as incapable of learning reinforce exclusion. Evidence shows that tailored and empowering training, particularly when delivered through peer support or intergenerational models, builds confidence, autonomy and wider civic and cultural participation.

Conclusions

The evidence demonstrates that digital ageism is not a marginal concern but a cross-cutting issue embedded in governance, design and practice. It shapes access to services, employment, health, learning and social participation, and it intersects with wider inequalities.

While examples of good practice exist, few interventions explicitly target digital ageism, leaving significant scope for innovation and action. Addressing it will be essential for ensuring fairness, rights and inclusion for older people in Wales as society becomes increasingly digital.

Recommendations

For the Older People’s Commissioner for Wales:

  1. Raise awareness of ‘compassionate ageism’: lead public awareness and policy guidance on avoiding patronising assumptions about older people’s technology use, taking all opportunities to urge designers and service managers to respect older users’ capabilities and preferences.
  2. Advocate for age in digital rights and policy frameworks: encourage Welsh Government, the UK Government and regulators (e.g. ICO) to explicitly recognise age in data protection, privacy and AI ethics legislation. The Commissioner should also engage and collaborate with the Equality and Human Rights Commission (EHRC) Wales to ensure that age discrimination in digital settings (for example, in AI deployment) is monitored, reported and, where necessary, challenged, drawing on EHRC’s regulatory powers.

For Welsh Government and public services:

  1. Empower older people about their digital rights: building on the rights-based approach in Age friendly Wales: our strategy for an ageing society, the Welsh Government should consider how to incorporate measures that ensure older adults understand data privacy, consent and service entitlements. This could form part of future Welsh Government-funded work on digital inclusion, delivered via workshops, factsheets and partnership events, and could include training community advocates or ‘digital champions’ to advise peers.
  2. Promote co-design with older adults in digital public services: the Welsh Government and digital public service teams (e.g. Centre for Digital Public Services (CDPS)) should seek to involve diverse older users in designing and testing all digital services. Existing standards, such as the CDPS Digital Service Standard and Welsh Government guidance, which already insist on inclusive design and offline alternatives, should be promoted widely. Public bodies (such as health boards and local authorities) should adopt similar guidelines.
  3. Enshrine age in procurement and policy standards: Welsh Government should require that all publicly‑commissioned digital products and services meet age-inclusive accessibility and usability standards. This could be done by ensuring that the Social Partnership and Public Procurement (Wales) Act frameworks explicitly consider older users. The Strategic Equality and Human Rights plan and its associated action plans (disability, race, LGBTQ+, gender) should explicitly include digital ageism (algorithmic age bias, access issues, etc.). For example, when Welsh Government refreshes the Advancing Gender Equality Plan or drafts the Disabled People’s Rights Plan, it should ensure that the intersection of age and technology is considered. The Welsh Government should further ensure that the Strategic AI Advisory Group and the Office for AI in Wales include a remit to assess age equality impacts in the design, procurement, deployment and regulation of AI systems in public services.
  4. Embed an intersectional, bilingual approach: all Welsh digital policies must account for and recognise the diversity of older people’s experiences – including gender, disability, ethnicity, income and rurality. It is also imperative that the Welsh language is recognised and embedded as a language of AI, technology and digital services.
  5. Improve digital health and care inclusion: Welsh Government and NHS bodies must ensure that new telehealth tools are co-designed with older users so that moves towards further digitisation and increased use of AI avoid inbuilt bias. Staff training in age-inclusive digital communication should be mandated – and organisations such as Digital Health and Care Wales, Healthcare Inspectorate Wales and Care Inspectorate Wales could set and monitor standards in this area.
  6. Ensure Fair Work practices in AI-driven recruitment: public bodies in Wales should review their recruitment and training practices in light of the increasing use of AI and digital tools. This includes auditing recruitment platforms and algorithms for potential age bias, and publishing results transparently. They should also provide ongoing, role-specific digital training and upskilling opportunities, co-designed with staff of different ages, to ensure that older workers are not excluded from advancement or retention opportunities. Welsh Government could promote this through the Fair Work agenda and public procurement requirements, encouraging private and third-sector employers to follow suit.
  7. Leverage Age-Friendly community networks: where not already underway, Age-Friendly Communities coordinators in each Welsh authority could also promote digital inclusion locally. Age-friendly partnerships in local authorities could host digital cafés, ensure community centres have internet access, and include tech literacy in social prescribing. They could encourage, support and highlight intergenerational initiatives that bring younger and older people together to share digital knowledge, helping to reduce stereotypes and strengthen community bonds.

For the tech sector and digital service designers:

  1. Design products inclusively with older users: technology companies and digital designers should involve older people in testing and co-design, ensuring interfaces avoid ageist stereotypes. Inclusive design could offer optional features such as adjustable text size, voice assistance or simplified modes, but should never assume that all older users need simplistic solutions. Welsh-language support and clear privacy controls should also be provided.
  2. Audit and mitigate algorithmic bias: when developing AI and digital tools (e.g. recruitment platforms, credit scoring), public bodies and companies alike must test for age bias and report on fairness. Where bias is found, they should refine or remove offending algorithms. This follows the same logic as audits for gender/race biases. Industry bodies (such as the UK’s AI Safety Institute) should issue standards that place responsibility on the tech sector.
  3. Promote older people’s representation: digital media and online platforms should feature and hire older people and support content that reflects their lives. For instance, streaming services or social media campaigns can challenge stereotypes by showcasing older models, voices and stories.

 
