Microsoft, Apple and Google: how inclusive are they?

Barriers to digital use are caused by a complex web of intersecting factors. Gender identity is just one of them. Age, education, socioeconomic status, race, physical and cognitive disadvantage all have a role to play. Focusing on one dimension of inclusion does not account for all the complexities. That’s the conclusion two researchers came to after looking at Microsoft, Apple, and Google websites.

The big software industry players have enthusiastically promoted their commitment to inclusive design. But how inclusive are they?

A Microsoft office building with the Microsoft logo displayed on the front.

A paper discussing aspects of their inclusive practice from a gender perspective reveals that it is only a partial response. That’s because gender intersects with many other identities such as age, capability, and ethnicity.

“Regardless of efforts to promote inclusivity mainly in terms of gender, the intersectionality of identities is frequently overlooked and ignored in design.”

A pair of hands wearing red nail polish is holding an iPad with a Google search screen visible.

Microsoft, Apple, Google and Meta

The paper covers the issues of intersectionality and imagery, followed by tech industry case studies. Here is a brief overview:

Microsoft’s manual does a great job of explaining why inclusive design is needed, covering both theoretical and practical aspects. However, it is focused on disability, and the images maintain the male/female binary.

Apple’s site has a detailed description of what the company thinks about inclusive design. It goes further than Microsoft on accessibility and introduces language use and stereotypes. However, Apple does not provide designers with many practical tools for doing universal design.

Google acknowledges the need for equity and inclusion in their products by giving voice to the most underrepresented groups throughout the production process. They have a list of diversity segments for designers to consider.

Meta has implemented several key strategies in its design process to create inclusive products. The company is evolving to recognise the importance of accessible and inclusive products for all users. Meta conducts user research on a regular basis to gather feedback for improvements.

Guidelines for gender inclusion

The research resulted in guidelines for designing gender-inclusive tools in technology. They are applicable to both academia and industry. They are also useful for anyone responsible for the look, feel and accessibility of their organisation’s website and digital products.

  • Consider intersectionality: avoid simplifying people to one-dimensional characters. People have complex identities, which go beyond belonging to one specific gender, race, or sexuality group.
  • Avoid propagating stereotypes: attaching typical looks, occupations, and traits to a person based on their gender, race, or sexuality, contributes to social stereotypes that aggravate misogyny, racism, and homophobia.
  • Overcome the gender-binary: avoid producing text and images that reinforce the gender binary and social stereotypes, regarding appearance, jobs, preferences, or skills.
  • Make your text, tone, and imagery consistent and inclusive: it is necessary to maintain efforts for inclusivity throughout your copy, visuals, communication, and products.
  • Show the diversity of each community: make a conscious effort to illustrate how multi-colored communities are, instead of simplifying them to stereotypes.
  • Involve people with that particular identity: diversity and inclusion should be taken seriously. There’s no better person than the one with that particular identity to tell you about their concerns and challenges.
  • Avoid concentrating on a single mode of communication: adapt your copy, images, and communication to different languages, cultures, and levels of complexity.
  • Provide training in Diversity, Equity, and Inclusion: help your business or organization by providing constant training and mentorship.

The title of the paper is, Gender Inclusive Design in Technology: Case Studies and Guidelines, from the ResearchGate website.

From the abstract

The need for inclusion stems from the fact that the composition of the IT sector reflects a workforce that is not diverse enough. This can result in blind spots in the design process, leading to exclusionary user experiences. The idea of inclusive design is becoming more prevalent. In fact, it is becoming a general expectation to create software that is useful for and used by more people.

With a focus on intersectionality, inclusive user experience (UX) seeks to actively and consciously integrate minority, vulnerable, and understudied user groups in the design.

UX based on inclusive design aims to overcome social disadvantages in all their intersectional complexity. These arise from gender, sexual orientation, age, education, dis/ability, socioeconomic status, and race/ethnicity, among others. At the same time, gender-inclusive design has challenges and limitations: the idea of gender inclusion in design is not yet a reality.

Our research investigates academic literature, as well as tech industry practices, like the websites of Microsoft, Apple, Google, and Meta. Our analysis shows that intersectionality suffers even when inclusivity is considered. We also offer guidelines for factors that might be explored for a more inclusive design.

Overcoming bias in AI

Artificial Intelligence (AI) is entering our everyday lives with increased speed and sometimes without our knowledge. But it is only as good as the data it is fed, and the worry about bias is a concern for marginalised groups. AI has the potential to enhance life for everyone, but that requires overcoming bias in AI development. In his article, Christopher Land argues for more advocacy and transparency in AI.

The power of machine learning comes from pattern recognition within vast quantities of data. Using statistics, AI reveals new patterns and associations that human developers might miss or lack the processing power to uncover.

A background of computer code with a female face overlaid. Overcoming AI bias.

Designing for the average is fraught with problems. Statistical averages do not translate into some kind of average human, because statistics don’t measure human diversity. That’s why AI processes risk leaving some people behind. And gathering more representative data raises privacy issues of its own.

AI shows great promise with robot assistants that help people with disability and older people with everyday tasks. AI imaging and recognition tools help nonvisual users understand video and pictures.

Christopher Land outlines how AI and machine learning work, and how bias is introduced into AI systems if not prevented. He also has some recommendations for strengthening legal protections for people with disability. The paper is not technical. Rather, it explains clearly how AI works, where it’s used, and what needs to be done.

The title of the article is, Disability Bias & New Frontiers in Artificial Intelligence. The “Black Box” issue is explained and the need for a “Glass Box” is presented.

From the abstract

Bias in artificial intelligence (AI) systems can cause discrimination against marginalized groups, including people with disabilities. This discrimination is most often unintentional and due to a lack of training and awareness of how to build inclusive systems.

This paper has two main objectives: 1) provide an overview of AI systems and machine learning, including disability bias, for accessibility professionals and related non-development roles; and 2) discuss methods for building accessible AI systems inclusively to mitigate bias.

Worldwide progress on establishing legal protection against AI bias is provided, with recommendations on strengthening laws to protect people with disabilities from discrimination by AI systems. When built accessibly, AI systems can promote fairness and enhance the lives of everyone, in unprecedented ways.

Diversity and inclusion in AI

An Australian book chapter takes a comprehensive and practical approach to how equity and inclusion should be considered throughout development. This should be done at both governance and development levels by applying inclusive design and human-centred design to the AI ‘ecosystem’.

The title of the chapter is Diversity and Inclusion in Artificial Intelligence.

Older people, ageism and digital design

Do stereotypes of older people affect how digital technology is designed? A team of researchers found that ageism has the potential to influence design in negative ways. However, co-design partnerships not only overcame the effect of ageism, they were also likely to produce technologies that are needed, wanted and used.

Ageism can have a detrimental role in how digital technologies are designed. Participating with older people in the design process has the additional benefit of countering stereotypes.

Image shows a group of older people on a desert camping expedition.

Photograph on a sand dune of 18 passengers and 4 drivers

Older people said the “ultimate partnership” in co-designing is to be involved from the beginning through to the end of the design process. Sharing control over design decisions was an important part of the process. They are more than informants – they are equals who make valuable contributions.

The researchers noted that although this vision of co-design is shared by designers, it is not always the case in practice.

Image shows older people working together on a workshop question.

Older people sit at round tables discussing questions. There are four round tables shown in this picture.

Older people in the study also said that ageism emerges in implicit and explicit language about ageing. And ageist images can influence the design process. Consequently, the researchers say it is important to view the diversity of older people.

It’s about co-design

How and when to involve older people in digital design is also important. Understanding co-design with older people has the potential for avoiding insufficient prototyping, biases and errors in the design process.

The title of the article is, An “ultimate partnership”: Older persons’ perspectives on age-stereotypes and intergenerational interaction in co-designing digital technologies.

From the abstract

There is a gap between the ideal of involving older persons throughout the design process of digital technology, and actual practice.

Twenty-one older people participated in three focus groups. Results showed ageism was experienced by participants in their daily lives and interactions with the designers during the design process. Negative images of ageing were pointed out as a potential influencing factor on design decisions. Nevertheless, positive experiences of inclusive design pointed out the importance of “partnership” in the design process.

Participants defined the “ultimate partnership” in co-designing as processes in which they were involved from the beginning, iteratively, in a participatory approach. Such processes were perceived as leading to successful design outcomes, which they would like to use, and reduced intergenerational tension.

Computing students show age and gender bias

As we know, bias sits in all of us whether we realise it or not. Some biases we are aware of, such as liking certain people and things. But what about those biases we are not aware of that sit silently in the background of our thoughts and ideas? And how much do they impact on the things we say and do? Age and gender bias has an impact on how things are designed.

Older people are thought less likely to use desktops, laptops and smartphones, and to have less expertise with them. Women, both young and old, are thought to have some experience with devices, but less expertise than men.

A young woman wearing a black beanie sits in front of her laptop. Behind her are icons of cog wheels indicating technology. Age and gender bias in computer science.

A group of researchers in Austria wanted to find out the perceptions of computing science students about age and gender. That’s because they are going to be designing the digital technology in the future. In a nutshell, they found several biases.

Age and gender bias

Students (aged around 21 years) started to see people as old at the average age of 57 years – several years younger than their grandparents. Older people are thought to be less likely to have experience with all types of devices. While younger people are thought to want aesthetic designs, older people are thought to need error tolerant systems.

The bias between the genders was smaller than that of age. Women were seen as less likely to use a desktop than men and to have less expertise overall. This fits the continuing stereotype that computing is ordinary for men and exceptional for women. Consequently, older women were seen as less capable in using computers and laptops.

The article has a lot of statistical analysis, but the key points are in the findings, discussion and conclusions. The information is useful for teachers, and the authors recommend designing with users as a way to overcome bias. And, of course, more women should be encouraged into the computing sciences.

The title of the study is, How Age and Gender Affect the Opinions of Computing Students Regarding Computer Usage and Design Needs.

From the abstract

This study aimed to understand young computing science students’ perceptions of women’s and older people’s computer literacy, and how this may affect the design of computer-based systems. Based on photos, participants were asked how likely the person would be to use desktop computers, laptops and smartphones. They were also asked how much expertise they thought the person would have with each technology. We asked what design aspects would be important and whether they thought an adapted technology would be required.

The results draw on 200 questionnaires from students in the first year of their Information and Communications Technology (ICT) studies at an Austrian university of applied sciences. Quantitative methods were used to determine if perceptions varied significantly based on the age and gender of the people depicted.

Qualitative analysis was used to evaluate the design aspects mentioned. The results show that there are biases against both older people and women with respect to their perceived expertise with computers. This is also reflected in the design aspects thought to be important for the different cohorts. This is crucial as future systems will be designed by the participants. Their biases may influence whether future systems meet the needs and wishes of all groups or increase the digital divide.

Gender and Mobile Apps

An academic paper, A Study of Gender Discussions in Mobile Apps, provides some insights into gender bias in app development.

From the abstract

Mobile software apps are one of the digital technologies that our modern life heavily depends on. A key issue in the development of apps is how to design gender-inclusive apps. Apps that do not consider gender inclusion, diversity, and equality in their design can create barriers for their diverse users.

There have been some efforts to develop gender-inclusive apps, but a lack of understanding of user perspectives on gender may prevent app developers and owners from identifying issues related to gender and proposing solutions for improvement.

Users express many different opinions about apps in their reviews, from sharing their experiences, and reporting bugs, to requesting new features. In this study, we aim at unpacking gender discussions about apps from the user perspective by analysing app reviews.

We first develop and evaluate several Machine Learning (ML) and Deep Learning (DL) classifiers that automatically detect gender reviews. We apply our ML and DL classifiers to a manually constructed dataset of 1,440 app reviews from the Google App Store, comprising 620 gender reviews and 820 non-gender reviews.

Our best classifier achieves an F1-score of 90.77%. Second, our qualitative analysis of a randomly selected 388 out of 620 gender reviews shows that gender discussions in app reviews revolve around six topics: App Features, Appearance, Content, Company Policy and Censorship, Advertisement, and Community. Finally, we provide some practical implications and recommendations for developing gender-inclusive apps.
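The F1-score the authors report balances precision (how many detected gender reviews really are gender reviews) and recall (how many gender reviews were found). A minimal sketch of the calculation – the confusion-matrix counts below are illustrative, not taken from the paper:

```python
def f1_score(tp: int, fp: int, fn: int) -> float:
    """F1 is the harmonic mean of precision and recall.

    tp: true positives  (gender reviews correctly detected)
    fp: false positives (non-gender reviews flagged as gender)
    fn: false negatives (gender reviews the classifier missed)
    """
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Hypothetical counts for a classifier evaluated on a labelled test set:
score = f1_score(tp=560, fp=55, fn=60)
print(f"F1 = {score:.2%}")
```

Because it is a harmonic mean, F1 is dragged down by whichever of precision or recall is weaker, which is why it is the usual summary metric for imbalanced review datasets like this one.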

Good colour contrast for websites

How did you choose the colours for your last website update? Did you choose colours based on your brand logo and text, or did you use the Web Content Accessibility Guidelines (WCAG) algorithm? And can the WCAG algorithm guarantee legible colour contrast for websites? Research from the University of Cambridge says it can’t. So they have developed an alternative algorithm for good colour contrast for websites.

Five different coloured ovals with both black and white text for comparison. Human perception is better for good colour contrast for websites.
Examples of black and white text for comparison

The Accessible Perceptual Contrast Algorithm, available since January 2022, proposes that the legibility of website text is better predicted by perceived difference than by a mathematical contrast ratio. White text on strongly coloured backgrounds was preferred over black text in almost all cases in the study.

In the examples above, the black text passes the WCAG contrast ratio while the white text fails. Under the Accessible Perceptual Contrast Algorithm, the white text passes and the black text fails.
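For reference, the WCAG 2.x contrast ratio being challenged here is a simple formula over relative luminance. A minimal sketch of that calculation – the mid-blue example colour is illustrative, not taken from the Cambridge study:

```python
def relative_luminance(rgb):
    """Relative luminance of an sRGB colour, per the WCAG 2.x definition."""
    def linearise(c):
        c = c / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (linearise(v) for v in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG 2.x contrast ratio: (L1 + 0.05) / (L2 + 0.05), lighter colour first."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# An illustrative mid-blue background, compared against black and white text:
blue = (50, 100, 200)
print(f"black on blue: {contrast_ratio((0, 0, 0), blue):.2f}:1")
print(f"white on blue: {contrast_ratio((255, 255, 255), blue):.2f}:1")
```

WCAG 2.x requires at least 4.5:1 for normal body text (AA). The formula treats foreground and background symmetrically, which is exactly the property perceptual research questions: human legibility is not symmetric in that way.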

Sam Waller explains this more fully in his article, Does the contrast ratio actually predict the legibility of website text? 

As a result of this work, an early working draft of WCAG 3 proposes using the new method for calculating contrast. 

This is important information for choosing brand logos and text so it isn’t just something web designers should know. Many website designs are guided by brand colours so choose carefully. This information is also important for product labelling especially for online shopping. 


Tech and older adults

The stereotype of grandchildren helping grandparents with their phone or remote controller is often perpetuated by older people themselves. Skill in using phones and websites depends on the reasons for using them. Younger people can have different interests from older adults meaning they use different apps and software. This doesn’t mean tech and older adults don’t belong together.

“Grandma cannot use her phone because it was not designed for her. Ubiquitous mass-market tools should not present obvious and avoidable hurdles to everyday users.” Robert Schumacher.

A smartphone with graphics depicting a design problem being fixed.

The stereotype is not based in evidence, and it might not be the tech that’s the barrier – poor vision or hand dexterity can also cause problems with using phones and computers.

It’s about mental models

According to Schumacher, the main difference between younger and older generations is when their mental models of how things work were formed. He explains how these mental models can widen any gap in understanding of how things work:

“Every generation has its own mental models of the world. In Tom Standage’s (1998) fascinating book The Victorian Internet, he provides several examples of how emerging technology scrambled everyone’s way of thinking. Take the introduction of the telegraph. In one example, a mother brought a bowl of sauerkraut to the telegraph office, insisting that they send it (across the wire) to her son on the battlefront. This mother, with good intentions, mixed up the atoms and the bits, which perhaps is understandable if you do not have a suitable mental model.”

A telegraph pole with rows of wires attached.

Using this story we can see that many older adults will have mental models and beliefs of how things work. But they don’t align with the design models and systems. People who began working with computers and DOS systems in the 1980s have grown up with the evolution of technology. Their mental model has adjusted with each new development. But not everyone worked in an office with a computer.

Barriers for everyone

CAPTCHA requires good visual acuity, and multi-factor authentication (MFA) requires people to juggle more than one device and accounts. And access to healthcare is moving online with MFA requirements.

Electronic ticketing now requires a mobile app. The assumption is that the process is intuitive despite its many steps: downloading the app, receiving the tickets by email, transferring them to the app, then retrieving and scanning them at the gate. And access to a streaming service often requires a QR code – something else to learn.

Schumacher’s article discusses more on this topic and how to remedy the situation. More testing with older adults is essential. Their mental models aren’t the same as the developer’s – what’s intuitive for a seasoned tech user is not intuitive for everyone. However, it doesn’t mean older people are averse to using technology or too “stuck in their ways” to learn.

The title of the article is, Gran Got Tech: Inclusivity and Older Adults, published in the Journal of User Experience.

Mental models and autonomous vehicles

The concept of designing tech from the perspective of mental models is a factor in a research project for autonomous vehicles. As concepts evolve, eventually the need to design in-vehicle interfaces will be minimal with presets for each rider. In the meantime, touchscreens and audio controls will still be needed. These need to be co-designed with users to develop prototypes.

The title of a research paper on this topic is Designing Interaction with Autonomous Vehicles for Older Passengers.

The Metaverse: inclusive and accessible?

The concept of the Metaverse is a continuous online 3D universe that combines multiple virtual spaces. It’s the next step on from the internet. It means users can work, meet, game and socialize in these 3D spaces. We are not quite there yet, but some platforms have metaverse-like elements. Video games and Virtual Reality are two examples. So, we need to keep a careful watch on developments to make sure the Metaverse is inclusive and accessible.

Another term for the Metaverse is digital immersive environments. It sounds science fiction, but this fiction is becoming a fact. Someone is designing these environments, but are they considering equity, diversity and inclusion? Zallio and Clarkson decided to tackle this issue and did some research on where the industry is heading.

Several companies are involved in the development of digital immersive environments. So before they get too far in development it’s important to define some principles for the design of a good Metaverse. Zallio and Clarkson came up with ten principles that embrace inclusion, diversity, equity, accessibility and safety.

10 Principles for designing a good Metaverse

  1. is open and accessible
  2. is honest and understandable
  3. is safe and secure
  4. is driven by social equity and inclusion
  5. is sustainable
  6. values privacy, ethics and integrity
  7. guarantees data protection and ownership
  8. empowers diversity through self-expression
  9. innovates responsibly
  10. complements the physical world
A young woman is wearing a pair of virtual reality goggles and looking towards the sky.

Their paper is insightful and provides some important areas for discussion and research. We need developers to consider the essentials of inclusion, diversity and accessibility. Zallio and Clarkson advise that designers can learn from the past to reduce pitfalls in the future. As the Sustainable Development Goals say, “leave no-one behind”.

Diagram showing the 10 principles for designing a good Metaverse.

The title of the paper is Designing the Metaverse: A study on Inclusion, Diversity, Equity, Accessibility and Safety for digital immersive environments.

Synopsis of the paper

1. The Metaverse appears as the next big opportunity in the consumer electronics scenario.

2. Several companies are involved with its development.

3. It is extremely important to define principles and practices to design a good Metaverse.

4. Qualitative research pointed to challenges and opportunities in designing a safe, inclusive, accessible Metaverse that guarantees equity and diversity.

5. Ten principles for designing a good Metaverse embrace inclusion, diversity, equity, accessibility and safety.

From the abstract

The Metaverse is shaping a new way for people to interact and socialise. By 2026, a quarter of the population will spend at least an hour a day in the Metaverse. This requires consideration of challenges and opportunities that will influence the design of the Metaverse.

A study was carried out with industry experts to explore the social impact of the Metaverse through the lens of Inclusion, Diversity, Equity, Accessibility and Safety (IDEAS). The goal was to identify directions business has to undertake.

The results indicated the nature of future research questions and analysis to define a first manifesto for Inclusion, Diversity, Equity, Accessibility and Safety in the Metaverse.

This manifesto is a starting point to develop a narrative, brainstorm questions and eventually provide answers for designing a Metaverse as a place for people that does not substitute the physical world but complements it.

City Access Map

CITY ACCESS MAP is a web application that shows how cities across the world are doing in terms of accessibility. It’s open source and covers any urban area with more than 100,000 residents, computing walking accessibility down to the block level. It’s a tool for almost anyone with an interest in cities where services are within a 15-minute walk.
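The underlying idea of 15-minute walkability can be sketched very simply: at a typical walking speed of about 5 km/h, 15 minutes covers roughly 1.25 km, so a block scores well if services fall within that reach. The sketch below uses straight-line (haversine) distance for illustration; the real tool works over the street network, and the coordinates here are made up:

```python
from math import radians, sin, cos, asin, sqrt

WALK_SPEED_KMH = 5.0                    # assumed average walking speed
REACH_KM = WALK_SPEED_KMH * 15 / 60     # ~1.25 km reachable in 15 minutes

def haversine_km(a, b):
    """Great-circle distance in km between two (lat, lon) points in degrees."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))     # 6371 km = mean Earth radius

def services_within_reach(block_centroid, services):
    """Count service locations within a straight-line 15-minute walk of a block."""
    return sum(1 for s in services if haversine_km(block_centroid, s) <= REACH_KM)

# Hypothetical block centroid and two nearby services:
block = (51.5074, -0.1278)
shops = [(51.5101, -0.1340), (51.5500, -0.2000)]
print(services_within_reach(block, shops))
```

Real accessibility scoring also weights service types and population, but the distance cut-off above is the core of the “15-minute city” measure the map visualises.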

A city view of the city access map. Short walking distances are shown in yellow and orange and long distances in purple.
A close up view of a city on the CITY ACCESS MAP

The CITY ACCESS MAP is interactive and shows the differences in cities across the globe. For example, it shows that Bogota, Colombia is one of the most accessible cities. Orlando, USA, on the other hand, is one of the least accessible. France is generally accessible, with many cities reaching high levels of accessibility.

Australia is represented by Brisbane, Sydney, Melbourne and Adelaide. Searching by city brings up a close-up view of the suburbs. In Sydney, it shows good accessibility in and around the CBD. However, as expected, accessibility reduces considerably as you move to the outer suburbs.

It should be noted that the term “accessibility” here mainly refers to access to services rather than an accessible built environment. The tool is worth investigating for planners and administrators in any field. If nothing else, it is interesting to see how countries compare.

For IT people wanting to know the detail of the map design there is more information in a separate section. You can download processed data for any city in the application.

The scientific research is also available and you can contribute to the project by contacting Leonardo Nicoletti.

Co-designing for the digital world

If you want to create something really useful for intended users, asking them to participate in the design process is a good way to go. And that means the design of anything – guides and toolkits included. From Ireland comes a toolkit for co-designing for the digital world where participants are people with intellectual disability.

A collage of faces from around the world and pictures of smartphones. co-design for the digital world.

A series of iterative workshops involving people with intellectual disability formed the foundation of an accessible design toolkit.

Co-design is important in the area of digital design and computer interaction. However, projects that claim to be user-centred often become technology led rather than user driven. A university in Ireland teamed up with a community service that supports people with intellectual disability. With the guidance of researchers, computer science students and community service users engaged in a co-creation process from which a toolkit was developed.

The collaboration highlighted the need for accessible design resources and training materials for both students and users. While there are many resources on co-design processes, and design thinking, few address people with intellectual disability. Those that do exist are not accessible or suitable for people with intellectual disability.

The toolkit is about co-designing with people with intellectual disability. Two overarching principles emerged. Use simple English with short sentences and simpler grammatical structures. Provide visual aids – icons and images – to overcome literacy limitations.

The paper explains the co-creation process in detail. The authors call the users co-designers, which is confusing because co-design usually means all participants including designers.

The complex process of consent to participate had to be resolved with the users. Another difficulty was encouraging participants to speak up about design flaws or issues.

The title of the paper is, An Inclusive Co-Design Toolkit for the Creation of Accessible Digital Tools.

From the abstract

Existing toolkits and resources to support co-design are not always accessible to designers and co-designers with disabilities. We present a study of a co-design process, where computer science students worked with service users with intellectual disabilities. The aim was to create digital applications together.

A series of co-design focus group sessions were conducted with service users previously involved in a co-design collaboration. The information from these sessions was used to devise an accessible design toolkit. This toolkit is intended to generate a sustainable resource to be reused in the student programme at TU Dublin but also in the wider community of inclusive design.

Editor’s comment: Most guides and toolkits are based on well-researched evidence, but the value of the evidence is sometimes lost in technicalities or too many words. A co-design process will seek out the key information that guideline users want and need.

Human centred design is for humans

A man in a blue shirt is wearing virtual reality goggles and holding a controller in his hand, exploring human centred design.

Almost all designing is done for human use. Even designs for animals and plants link back to human use or human benefit in one way or another. So based on that premise, all design is human centred design. Well, that is one way to define it.

In the world of technology the key question for human centred design approach is, what are people doing? Not, what can the technology do? Dr Peter Schumacher reminds us in a video below that humans were creating things for themselves in the Stone Age. However, digital technology isn’t crafted by hand like a stone axe where the maker is often the user. With artificial intelligence, technology can do amazing things. But are they the things that humans want?

Based on the Stone Age definition, Schumacher says human centred design is something we’ve been doing forever. From the moment we could, we’ve been making things we could use. We were creating technology for a purpose, for something we wanted to do. But technology, and the way we interact with it, is getting more complicated.

We are at a point where technology and the environment we live in is creating problems. Schumacher says that finding out what’s going on with humans and their relationship to technology is the key focus. It’s putting human back into the centre of the technology rather than saying what the technology can do.

So, you start with the question, “what are people doing?” This is followed by “what does the technology need to do in that context?” Then there is the question of what’s the right technology?

Human Centred Design Group

The Human Centred Design Lab at the University of South Australia is focused on developing methods for the design of objects, environments, and systems for human use. They take a collaborative and trans-disciplinary approach to create products for human use. The Human Centred Design Group is a research centre for interactive and virtual environments. The video below explains a bit more, and they invite collaboration.

Also see their Design Clinic, which brings together researchers, industry and end users for co-design in healthcare. The Design Clinic can capture ideas and designs for new products, systems and services vital to the health and wellbeing sectors.

