AI, the web and picture descriptions

[Image: a laptop computer on a desk showing several pictures]

People who rely on screen readers or Braille displays want to use the internet in the same way as everyone else. When it comes to images, that depends on authors remembering to use the alt-text feature to describe what they upload. While many big websites do include alt-text, smaller ones often don’t – and there are a lot of bloggers. Social media is the biggest culprit of all. Don’t rely on AI to fix your picture descriptions.

The applications designed to describe images automatically don’t yet do it accurately. The Chrome accessibility team at Google are working on this: the aim is to use machine learning to generate sensible descriptions for the millions of images that still lack them. The FastCo website has more information.

Applying alt-text to all your images means that screen readers read out a sensible description rather than the image filename or a string of upload code. The added advantage is that when an image fails to load on, say, a phone, that text appears in its place. Axess Lab has lots more information in a very readable format, and their website is a good example of how to write and design your posts for maximum readership.
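
If you write your own HTML, adding a description is a single attribute. Here is a minimal sketch – the filename and wording are invented for illustration:

    <!-- The alt text is what a screen reader announces, and what the
         browser shows if the image file fails to load. -->
    <img src="laptop-on-desk.jpg"
         alt="A laptop computer on a desk showing several pictures">

    <!-- A purely decorative image should get an empty alt attribute so
         screen readers skip it instead of reading out the filename. -->
    <img src="divider.png" alt="">

Most blogging platforms and social networks offer an alt-text or image-description field that does the same job – the platform adds the attribute for you.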