A meandering romp through our relationships with our tools, and what it means to develop and use technologies that are, as Audrey Watters writes, "habitable, sustainable, and healthy."
It is so often uttered in the edtech field, and I did my share of uttering in the early years: "Technology is Just a Tool."
It’s. Almost. As. Frequent. As. Digital Natives.
I believe we do this to try to remove some fear about technology taking over what we control, like in those old B sci-fi movies. It's a means of asserting our notion that Pedagogy is Primary (like, first). Even the once wunderkind figurehead and now monumental grant provider stands up to declare it. Like 329,999 other Google hits.
What we are doing, then, is distancing ourselves from Technology, asserting our dominance over it.
Like most things that involve people, this is waaaaay too simplistic. Saying something is just a tool completely ignores our relationships with tools once we put them in our hands. I've tried to play this out as a more complex relationship. It's like telling Monet "It's just a brush."
Or telling Jimi to get over it, put down the guitar, and focus on getting a job.
I get smiles, but I am not sure the point is hitting home. Or maybe I have not made it quite as clear as it is to me.
A tool, be it a wrench, a mobile app, or a piece of blogging software, is "just a tool" only when it is sitting in the box. But when we, as [I hope] sentient, thinking, motivated, angry, creative, loving, curious creatures pick it up, or open it up, the tool and we enter some sort of new relationship space. It's no longer just a tool, and my hand is no longer just a hand.
This is not my idea; it goes back to Marshall McLuhan’s ideas of technology (in his broad sweep, including media) as extensions of us:
It is the persistent theme of this book that all technologies are extensions of our physical and nervous systems to increase power and speed. Any extension, whether of skin, hand, or foot, affects the whole psychic and social complex.
What McLuhan experts note (and here is my feeble paraphrasing) is that for most of our history with tools/technologies, the extension has been one of our bodies: wheels extend what our feet can do for motion; radio and the telephone extend our voices; TV and the camera extend our eyes. But our relatively more recent (in terms of human history) tools of computers and the internet offer an extension of the mind.
Perhaps the most vivid explanation of this was provided by Gardner Campbell, who gave a presentation back when I was at NMC and we were doing virtual conferences in Second Life (go ahead and snicker). I found a paraphrasing of it in Robyn Heyden's blog (google is extending my memory) that also links to a podcast Gardner and I did about McLuhan's essay as part of his 2010 New Media Faculty Development Seminar.
Gardner used this perfectly simple and powerful example to explain what McLuhan means by “tools as extensions of ourselves”. Here goes.
Pick up a hammer
"If you pick up a hammer, and hold it in your hand, what do you have?" Gardner asks. In trying to answer that question, we immediately jump to capabilities ("you can build a house" or "now you need a nail"). But Gardner urges us instead to think in terms of the most basic, the most obvious thing. You have a hammer in your hand. Simple. And then, he says, McLuhan goes further. What McLuhan would say is that you don't have a hammer in your hand; what you now have is a "hammerhand." You've changed the hammer. And you've changed your hand. A new union, one that neither was before you picked up the hammer.
Technology is not just a tool when we have it in our hands. A new union. So why do we keep dismissing the hammer as "just a tool"?
What happens when we do not make such a commotion about tools? Out of the accidental serendipity engine that twitter can be, when not a steaming pool of (fill in the blank), came a brilliant link via someone I had not tweet-crossed before, George Station (@harmonygritz, thanks!). This was amongst the tweets of the first day after the return of No Indictment in Ferguson.
— George Station (@harmonygritz) November 25, 2014
This led me to a video that got me thinking, perhaps not what George was suggesting, but that's how links and ideas go.
Mark Anthony Neal opens his talk on race relations on twitter by going back and highlighting Black people themselves as "OTs," or the Original Technology, being the engine of cotton production, their social media being field songs. Neal moves through more history to 1960 and asks, perhaps whimsically, "What if the Greensboro 4 Had Twitter?" while noting that the social media technology of the day was the mimeograph, and how it was used to send a message viral.
What touched off for me in the examples Neal shares is that Black people never made a big deal of the technologies themselves. They used what was available to advance their purposes. If there had been venture capitalists backing the Greensboro 4 (a horrible thought), they might have talked more about How The Mimeograph Will Change Everything. Was there dissent among protestors, "The Mimeograph, it's just a tool!"? Or did they just use the thing?
Perhaps this is a reach, but again I am hearing in Neal's retelling how technologies were used as extensions of people, as something they had a close relationship with, and how as a medium they affected "the whole psychic and social complex."
Tools in our hands, different from tools in the box. Not even tools anymore.
And this brings me to the brilliant talk Audrey Watters recently delivered twice, Men Explain Technology to Me: On Gender, Ed-Tech, and the Refusal to Be Silent. I was glad I caught the version she did online for students in Alec Couros's EC&I831 class.
She cites a problem with technology that has been there all along but moved to front stage in 2014:
There’s a problem with the Internet. Largely designed by men from the developed world, it is built for men of the developed world. Men of science. Men of industry. Military men. Venture capitalists.
Harassment — of women, people of color, and other marginalized groups — is pervasive online. It’s a reflection of offline harassment, to be sure. But there are mechanics of the Internet — its architecture, affordances, infrastructure, its culture — that can alter, even exacerbate what that harassment looks like and how it is experienced.
The design of this Internet tool has a bias built in? Is it the design, or is it an extension of power, of privilege? I've not thought of it this way before. And yet the internet is not "just a tool," given the life-threatening actions we've seen play out. It's a weapon.
It’s been there all along.
Theory and concepts and issues (and inherent biases) are critically important, but I am also someone who likes to dabble in the concrete, the manifestations, the stuff. And so the part I've been mentally wrestling with, and not that I in any way expect Audrey to Womansplain it (it's rather difficult), is: what are these better designs that provide equitable mechanics? What do they look like?
That gets, just a bit, at what I think we can do in order to make education technology habitable, sustainable, and healthy. We have to rethink the technology. And not simply as some nostalgia for a “Web we lost,” for example, but as a move forward to a Web we’ve yet to ever see. One that is inclusive and equitable. Perhaps education needs reminding of this: we don’t have to adopt tools that serve business goals or administrative purposes, particularly when they are to the detriment of scholarship and/or student agency — technologies that surveil and control and restrict, for example, under the guise of “safety” — that gets trotted out from time to time — but that have never ever been about students’ needs at all. We don’t have to accept that technology needs to extract value from us. We don’t have to accept that technology puts us at risk. We don’t have to accept that the architecture, the infrastructure of these tools make it easy for harassment to occur without any consequences. We can build different and better technologies. And we can build them with and for communities, communities of scholars and communities of learners. We don’t have to be paternalistic as we do so. We don’t have to “protect students from the Internet,” and rehash all the arguments about stranger danger and predators and pedophiles. But we should recognize that if we want education to be online, if we want education to be immersed in technologies, information, and networks, that we can’t really throw students out there alone. We need to be braver and more compassionate and we need to build that into ed-tech. Like Blockbot, this should be a collaborative effort, one that blends our cultural values with technology we build.
Right on. I am as guilty as the next of longingly looking back at my web nostalgia, but, bingo: move forward to a Web we've not seen before. Yes. Forward. I am here trying to consider what technological designs would make web "technology habitable, sustainable, and healthy."
Part of the way seems to lie in the IndieWeb movement, the Reclaim one, and the shining example Audrey cites from UMW Domains, all of which move the locus of control over who we are on the web (our extensions) to something we manage ourselves. And maybe people (a few) are starting to look more sideways at the data we give to web sites.
I wondered a bit at her example of The Block Bot, a tool designed to give people the agency not to have to hear the hate messages that come via twitter. Obviously it is a tool to help victims of online harassment. But there's much more. More than just a tool. At first I wondered how the design of The Block Bot fit into the Equitable Web; my misplaced assertion was that it was social in the sense of being a system for victims to share and network information about abusers. It reminded me of the rallying of people to use internet technologies to battle the scum of Romance Scams.
But there’s more.
I needed to dig in. And so I learned that what is fascinating about The Block Bot is that they have created a way, at the individual level (agency), to report abusive behavior, share it, and then put in place a barrier level of blocking that each person chooses. But the real difference is that the blocking has no impact or effect on the twitter system itself; it is engaged only for the individual. It operates at a layer between the Twitter ecosystem and individuals.
The Block Bot is a Twitter application to automatically block the nastiest of these people. Once installed, it works in the background, fetching the names of those to be blocked from a central server, and discreetly blocking them.
The Block Bot can be used anonymously, and makes no change whatsoever to your Twitter profile. The blocks are made silently, and (from the point of view of the person being blocked) are indistinguishable from ordinary blocks.
As one digs in more via the FAQ, it is obviously evolving but also imbued with social constructs. This is what systems with public APIs afford us: to extend ourselves, and the system itself, beyond what the original designers could have created.
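To make that layered design concrete, here is a minimal sketch of how a Block Bot-style shared blocklist might compute blocks for one subscriber. Everything here is an assumption for illustration: the function name, the `(account, level)` list shape, and the tier numbering are hypothetical, not The Block Bot's actual code or API.

```python
# Hypothetical sketch of a Block Bot-style layer: a shared blocklist
# lives on a central server; each subscriber chooses a severity tier,
# and blocks are applied only in that subscriber's own account.
# Nothing changes in the wider Twitter system.

def blocks_for_user(shared_list, chosen_level, already_blocked):
    """Return the accounts this one subscriber should block.

    shared_list: (account, level) pairs fetched from the central server,
                 where level 1 is the most severe tier
    chosen_level: the barrier level this subscriber opted into
    already_blocked: accounts the subscriber has blocked on their own
    """
    return sorted(
        account
        for account, level in shared_list
        if level <= chosen_level and account not in already_blocked
    )
```

Because the function only computes a per-subscriber list, the blocking stays at a layer between the individual and the platform, which is exactly the design point: the shared list extends each person's reach without touching Twitter's global state.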
It is a Hammerhand.
What is telling, though, still rears its head in the ugly comments on The Block Bot FAQ. Open comments can sometimes be a Bag of Internet Gold but are more likely in 2014 to be an Evil Cesspool. Is this in the design, or in the extension of our best/worst behavior? Or both? Is there a clean division between us and the tool?
This is about where I need to Insert Some Grand Conclusion.
Not going to happen.
These are some threads I am working through: our complex relationships with tools, and what new design approaches can help us create fair spaces for those relationships.
Hammers or what?
And yes, here at the end of the post, I realize that a hammer might be an awful, violent choice of tool as a metaphor.
Just ask any nail.