Computing students show age and gender bias

As we know, bias sits in all of us whether we realise it or not. Some biases we are aware of, such as liking certain people and things. But what about the biases we are not aware of, sitting silently in the background of our thoughts and ideas? How much do they affect the things we say and do? Age and gender bias has an impact on how things are designed.

Older people are thought less likely to use desktops, laptops and smartphones, and to have less expertise with them. Women, both young and old, are thought to have some experience with devices, but less expertise than men.

A young woman wearing a black beanie sits in front of her laptop. Behind her are cog-wheel icons representing technology.

A group of researchers in Austria wanted to find out how computing science students perceive age and gender, because those students will be designing the digital technology of the future. In a nutshell, they found several biases.

Age and gender bias

Students (around 21 years old) started to see people as old at an average age of 57, several years younger than their own grandparents. Older people are thought to be less likely to have experience with all types of devices. While younger people are thought to want aesthetic designs, older people are thought to need error-tolerant systems.

The bias between the genders was smaller than the bias around age. Women were seen as less likely than men to use a desktop, and to have less expertise than men overall. This fits the continuing stereotype that computing is ordinary for men and exceptional for women. Consequently, older women were seen as less capable of using computers and laptops.

The article has a lot of statistical analysis, but the key points are in the findings, discussion and conclusions. The information is useful for teachers, and the authors recommend designing with users as a way to overcome bias. And, of course, more women should be encouraged into the computing sciences.

The title of the study is How Age and Gender Affect the Opinions of Computing Students Regarding Computer Usage and Design Needs.

From the abstract

This study aimed to understand the perceptions of young computing science students about the computer literacy of women and older people, and how this may affect the design of computer-based systems. Based on photos, participants were asked how likely the person depicted would be to use desktop computers, laptops and smartphones, and how much expertise they thought that person would have with each technology. We asked what design aspects would be important and whether they thought an adapted technology would be required.

The results draw on 200 questionnaires from students in the first year of their Information and Communications Technology (ICT) studies at an Austrian university of applied sciences. Quantitative methods were used to determine if perceptions varied significantly based on the age and gender of the people depicted.

Qualitative analysis was used to evaluate the design aspects mentioned. The results show that there are biases against both older people and women with respect to their perceived expertise with computers. This is also reflected in the design aspects thought to be important for the different cohorts. This is crucial because future systems will be designed by the participants, and their biases may influence whether those systems meet the needs and wishes of all groups or widen the digital divide.

Gender and mobile apps

An academic paper, A Study of Gender Discussions in Mobile Apps, provides some insights into gender bias in app development.

From the abstract

Mobile software apps are one of the digital technologies that our modern life heavily depends on. A key issue in the development of apps is how to design gender-inclusive apps. Apps that do not consider gender inclusion, diversity, and equality in their design can create barriers for their diverse users.

There have been some efforts to develop gender-inclusive apps, but a lack of understanding of user perspectives on gender may prevent app developers and owners from identifying issues related to gender and proposing solutions for improvement.

Users express many different opinions about apps in their reviews, from sharing their experiences and reporting bugs to requesting new features. In this study, we aim to unpack gender discussions about apps from the user perspective by analysing app reviews.

We first develop and evaluate several Machine Learning (ML) and Deep Learning (DL) classifiers that automatically detect gender reviews. We apply our ML and DL classifiers to a manually constructed dataset of 1,440 app reviews from the Google App Store, comprising 620 gender reviews and 820 non-gender reviews.

Our best classifier achieves an F1-score of 90.77%. Second, our qualitative analysis of a randomly selected 388 out of 620 gender reviews shows that gender discussions in app reviews revolve around six topics: App Features, Appearance, Content, Company Policy and Censorship, Advertisement, and Community. Finally, we provide some practical implications and recommendations for developing gender-inclusive apps.
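For readers curious about what this kind of classification looks like in practice, here is a minimal sketch of the general idea: training a text classifier to separate gender-related reviews from other reviews and scoring it with F1. The pipeline (scikit-learn's TF-IDF features with logistic regression) and the example reviews are illustrative assumptions, not the authors' actual models or data.

```python
# Minimal sketch (not the paper's pipeline) of classifying app reviews
# as gender-related (1) or not (0) and reporting an F1-score.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

# Hypothetical labelled reviews, invented for illustration only.
reviews = [
    "The avatars only let me pick male characters",
    "Please add a non-binary option to the profile settings",
    "The app keeps crashing when I open the camera",
    "Great app, but the ads are far too frequent",
    "Why are all the default profile icons men in suits?",
    "Battery drain is terrible after the last update",
]
labels = [1, 1, 0, 0, 1, 0]

X_train, X_test, y_train, y_test = train_test_split(
    reviews, labels, test_size=0.33, random_state=42, stratify=labels
)

# TF-IDF turns each review into a weighted bag-of-words vector;
# logistic regression then learns a linear decision boundary over it.
clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(X_train, y_train)

# F1 is the harmonic mean of precision and recall, the metric the paper reports.
print("F1:", f1_score(y_test, clf.predict(X_test)))
```

In a real study the dataset would be far larger and the models more varied, but the workflow of labelling reviews, training a classifier and evaluating it with F1 follows this general shape.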
