Even before a tl;dr (as if I do them anyhow), I am about as far from a web accessibility expert as my dog.

But I’ve taken a curious interest since summer 2018 when Twitter made a big deal of announcing the addition of options to add alternative text to tweeted images. Fire off a few PR statements, get the stories echoed by the pious outlets of EngadgetWiredTechCrunchETC, and call it done. Never mind that adding alt text to your images means first turning ON an option buried 15th in a list of preferences.

In my own informal research (meaning I scroll through my timeline and count how many images tweeted have alt text) I’d say the activity rate in doing this is about 0.05% of tweets. But it’s changed my approach to not just tweeting but also blogging, to try to up my game as much as possible in not only using alt texts, but also making them useful.
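(For anyone curious enough to go past eyeballing, here is a rough sketch of how that count could be automated with Python and BeautifulSoup. It assumes you have saved a copy of a timeline page as timeline.html, and it guesses that a bare placeholder like “Image” means no real alt text was written; none of this is anything Twitter documents.)

```python
# Rough sketch: count images carrying real alt text in a saved timeline page.
# Assumptions: "timeline.html" is a saved copy of the page, and a bare
# placeholder such as "Image" counts as having no alt text.
from bs4 import BeautifulSoup

with open("timeline.html", encoding="utf-8") as f:
    soup = BeautifulSoup(f, "html.parser")

images = soup.find_all("img")
described = [
    img for img in images
    if img.get("alt") and img["alt"].strip().lower() not in ("", "image")
]

print(f"{len(described)} of {len(images)} images carry descriptive alt text")
if images:
    print(f"That is about {100 * len(described) / len(images):.2f}%")
```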

But not to be outdone, Instagram is in the game too. In their announcement of Creating a More Accessible Instagram, they proclaim:

We are introducing two new improvements to make it easier for people with visual impairments to use Instagram. With more than 285 million people in the world who have visual impairments, we know there are many people who could benefit from a more accessible Instagram.

Creating a More Accessible Instagram

I sure hoped they talked to a few of those 285 million people…

First, we’re introducing automatic alternative text so you can hear descriptions of photos through your screen reader when you use Feed, Explore and Profile. This feature uses object recognition technology to generate a description of photos for screen readers so you can hear a list of items that photos may contain as you browse the app.

same source!

I wonder how useful this really is to people who are visually impaired.

Imagine your IG experience as a robo voice reading below (if you think I made this up read Bored Panda’s Somebody Is Showing How Instagram Photos Are All Starting To Look The Same And It’s Pretty Freaky)

Note that “first” in the Instagram announcement is the AI solution. Following it is poor John Henry, hammering out alt text by hand, one photo at a time:

Next, we’re introducing custom alternative text so you can add a richer description of your photos when you upload a photo. People using screen readers will be able to hear this description.

Yup, from Creating a More Accessible Instagram

I wanted to know the experience. It was a few days before my app updated with the feature.

At least unlike Twitter, you do not have to choose to do the right thing and enable the alt text description field through some arcane option (and Instagram has like 70 different preference settings). It’s there, very obvious, before you click the Share button, closest to where the user’s eye ends up.

My, to add accessible text, I click Advanced Settings. Seriously intuitive!

Another editing box is where I can add my alt text.

adding the alt text to an Instagram photo

And here is the posted photo, as if you could tell there was alt text!

But if you have resilience you can dig down deep in the source code (that is, if you actually can see web source code any more or care to look)

The source HTML of the img tag does show the text I wrote above inside the alt= attribute, so it technically works
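(If View Source is not your idea of fun, the same check can be sketched in a few lines of Python with BeautifulSoup, again run against a saved copy of the post page. The file name post.html is just a placeholder, and Instagram’s markup can change whenever they feel like it.)

```python
# Minimal sketch: print the alt attribute of every img in a saved post page.
# "post.html" is a placeholder name; an empty string means no alt text at all.
from bs4 import BeautifulSoup

with open("post.html", encoding="utf-8") as f:
    soup = BeautifulSoup(f, "html.parser")

for img in soup.find_all("img"):
    print(repr(img.get("alt", "")))
```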

It’s almost easier to add alt text to an existing photo. Just open the top right … menu on your own photo:

Opening the menu to edit your own photo

Then clicking the EDIT link in the menu

The menu provides the link to Edit the photo

And here you can see a more “accessible” place and interface for adding Alt text!

The location to add alt text is much clearer on an existing photo

I rummaged around a few of my photos, and I see some of the automated ones, such as:

This photo has one (again buried in source HTML)

How helpful is that?

Other times, Instagram seems to just insert my caption as the alt= text, which is not always useful either. Easy to code, maybe…

But what is it like for someone to navigate Instagram this way? What good is it for me as a sighted critic, or their sighted engineers, to design an interface for it?

I tried a bit using VoiceOver on my iOS device and nearly went berserk trying to even get the app open, much less have it read me the image alt text. Just turning off the feature led to howling.

So I activated Voice Assist on my laptop, and spent about 10 minutes (mostly due to my own inexperience in navigating) trying to get to the image above with the alt text I added via the Instagram app.

Listen as VoiceOver reads my Instagram photo’s alt text.

So, yes, on paper and in PR statements, Instagram has added accessibility features. Does that really make it accessible to people without vision? I cannot speak for them. Is it meaningful if it’s an obstacle course of an interface for people to add alt text, or do we rush on by and leave it to the AI?

To me, that is likely what they are after; are any of us who add alt text manually just working to verify their AI?

The best outcome for me in looking through this stuff is finding Veronica with Four Eyes, someone who really does have a visual challenge, and who writes really helpful guides to accessibility (as opposed to PR statements), like

Because accessibility is much more than adding a few features….


Featured Image: I did some re-writing at the top of an image of the Empress of Britain, a public domain image from the Library of Congress. This is my modified version:

Original image https://www.loc.gov/resource/ggbain.27873