creative commons licensed ( BY-NC ) flickr photo shared by QuestionMark

Has anyone jumped any chasms lately? Once again, the associative trails in my cerebral memex fired off this morning. Via a tweet from Karen Fasimpaur, I watched Wes Fryer’s metaphor-rich keynote Igniting Innovation in Teaching and Learning.

At 2:41 Wes introduced the Technology Adoption Curve, the notion based on Everett Rogers’ Diffusion of Innovations work that there are different groups of people in terms of the way they adopt new technologies.

It got me thinking about a writer from the early or mid 1990s. I was pretty sure he worked with or was affiliated with IBM, and maybe his name was William. He had written a lot about how this applied to instructional technology, but what he spoke of, and what was missing from the image that Wes showed, was the “chasm” between the innovators/early adopters and the larger number of people in the Early Majority.

From 7 Storytelling Reasons Why Innovation Fails; Google search says it’s licensed for reuse, but I can’t find the license.

My first searches failed:

crossing the chasm rogers innovation "William"

crossing the chasm rogers innovation "education"

Then I went into Google Scholar seeking the reference to Rogers’ work as a citation:

"citations" "Diffusion of innovations"

And got it when I limited the scope of search between 1993 and 1998. Bingo! Here is the winner:

Geoghegan, Willian (1994) Whatever Happened to Instructional Technology? Paper presented at the 22nd Annual Conference of the International Business Schools Computing Association

The paper is still available as a Word doc download from University of Southampton ePrints Soton.

And hah, they have a typo in his name; it’s William H. Geoghegan from IBM Academic Consulting. I actually remember co-presenting with him at a conference, maybe in Vancouver, in the late 1990s.

This paper is from 1994 and quite revealing and relevant (in some ways, sadly) 20 freaking years later. The abstract:

During the last decade and a half, American higher education has invested about 70 Billion Dollars in information technology goods and services, as much as 20 Billion Dollars of which has gone to the support of teaching and learning. But despite the size of this investment in instructional technology, numerous examples of innovative and successful instructional applications, and a growing comfort level with technology among both faculty and students, instructional technology has not been widely adopted by faculty, nor has it become deeply integrated into the curriculum. By some estimates, no more than five percent of faculty utilize information technology in their teaching as anything more than a “high tech” substitute for blackboard and chalk, overhead projectors, and photocopied handouts. Promising innovations rarely propagate beyond the innovators themselves. This paper examines the broad range of factors that underlie the failure of instructional technology to penetrate the curriculum more widely than it has. Particular attention is paid to the social barriers that impede the diffusion and adoption of promising innovations in instructional technology, and to the unintended manner in which well-meaning efforts to support the development and diffusion of instructional technology by IT support organizations and technology vendors have frequently undermined adoption by mainstream faculty.

The numbers and technologies have changed, but I’m seeing the same issues Geoghegan described in 1994 as not changing much in 2014. It fits what Brian Lamb and Jim Groom pitched in their Reclaiming Innovation Educause Review article as what has been missed in all of the money spent on vendor products and systems — investing in people.

At this point I am pretty much going to grab quotes from the 1994 article and add my sarcastic comments. It’s illuminating and eerie.

The advent of digital computers on college campuses more than three decades ago brought with it a growing belief that this new technology would soon produce fundamental changes in the practice, if not the very nature, of teaching and learning in American higher education. It would foster a revolution where learning would be paced to a student’s needs and abilities, where faculty would act as mentors rather than “talking heads” at the front of an auditorium, where learning would take place through exploration and discovery, and where universal educational access, transcending barriers of time and space, would become the norm. This vision of a pedagogical utopia has been in circulation for at least three decades, enjoying a sort of perpetual imminence that renews itself with each passing generation of technology.

And then there were MOOCs. And the promise of a “pedagogical utopia”.

But there’s a problem. Despite massive technology expenditures over the last decade or so, the widespread availability of substantial computing power at increasingly reasonable prices, and a growing “comfort level” with this technology among college and university faculty, information technology is not being integrated into the teaching and learning process nearly as much as people have regularly predicted since it arrived on the educational scene three or four decades ago. There are many isolated pockets of successful technology implementations. But it is an unfortunate fact that these individual successes, as important and as encouraging as they might be, have been slow to propagate beyond their initiators; and they have by no means brought about the technologically inspired revolution in teaching and learning so long anticipated by instructional technology advocates.

Like the people who describe the LMS as being helpful for faculty to whom the open web is “too complicated” (a future post is brewing there, because that is a crap-filled point of view).

Geoghegan cites data from way back in 1994 showing that faculty access to technology even then was not a problem, and scuttles the notion that it’s a fear issue (my emphasis added):

The instructional technology problem, in other words, is not simply a matter of technology being unavailable to faculty. It is not attributable to faculty discomfort with the technology itself, nor to faculty disenchantment with the potential benefits of information technology to instruction. In fact, the best evidence we have available today suggests that desktop computing is being widely used by faculty and, more importantly, that it is being used in support of teaching. The problem is that this support is for the most part logistical in nature: preparation of lecture notes, handouts, overhead transparencies, and other types of printed and display material that substitute for the products of yesterday’s blackboard and typewriter technologies. Such usage may enhance faculty productivity, and it may even help student learning (by substituting neatly printed transparencies for blackboard scribbles, if nothing else); but it does little or nothing to exploit the real value of the technology as an aid to illustration and explanation, as a tool that can assist in analysis and synthesis of information, as an aid to visualization, as a means of access to sources of information that might otherwise be unavailable, and as a vehicle to enable and encourage active, exploratory learning on the part of the student. The technology is being used logistically, in other words, but it is only occasionally being utilized as a medium of delivery, and to even a lesser extent do we find it deeply woven into the actual fabric of instruction.

Technology is used mainly as a logistical tool (cough, LMS). Same as it ever was.

And after dismissing the other possible reasons for the lack of deeper adoption of technology (equipment and facilities, institutional support, and unrealistic expectations), he brings it home to a human factor:

I would argue nevertheless that one of the most basic reasons underlying the limited use of instructional technology is our failure to recognize and deal with the social and psychological dimensions of technological innovation and diffusion: the constellations of academic and professional goals, interests, and needs, technology interests, patterns of work, sources of support, social networks, etc., that play a determining role in faculty willingness to adopt and utilize technology in the classroom. The model that we have most commonly used for supporting the development of instructional technology – with its focus on technical support for technically “literate” faculty who often have strong track records of success in this area – may be well suited to the characteristics and needs of technologists, of technically inclined faculty innovators, and even technology vendors. But it is ill-adapted to the interests and needs of mainstream instructional faculty, whose concerns lie more with the teaching, research, and administrative tasks they have to address than with technologies that, at best, may assist in addressing them. The mismatch, in fact, may be so great in many circumstances as to alienate mainstream faculty from the more technically inclined early adopters, opening a gap between the two so great as to reduce or eliminate the likelihood of mainstream faculty actually adopting instructional technology for their own classroom use.

And this was the main part of the paper I remembered: most of the support approaches are focused on the Rogers groups of innovators and early adopters, rather than a multi-pronged strategy for the other groups on the other side of the chasm. And therefore the adoption never diffuses much beyond the leading edge 15%:

The differences between the two groups are extensive, and their importance is magnified in the context of changes that have the potential to make radical alterations in the teaching and learning process. The appeal of instructional technology to the early adopter will be very different from its appeal to a member of the early majority, despite the fact that both may recognize its potential benefits to teaching and learning; and the two are likely to have very different criteria for deciding whether or not to adopt a technology based innovation when it becomes feasible to do so.
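That “leading edge 15%” is just arithmetic on Rogers’ commonly cited adopter categories (a well-known breakdown from Diffusion of Innovations, not figures taken from Geoghegan’s paper); a minimal sketch of where the chasm falls:

```python
# Rogers' commonly cited adopter categories from Diffusion of Innovations,
# as percentages of the total adopting population.
categories = {
    "innovators": 2.5,
    "early adopters": 13.5,
    "early majority": 34.0,
    "late majority": 34.0,
    "laggards": 16.0,
}

# Moore's "chasm" sits between the early adopters and the early majority,
# so the leading edge is just the first two categories combined.
leading_edge = categories["innovators"] + categories["early adopters"]
beyond_chasm = 100 - leading_edge

print(f"leading edge: {leading_edge}%")     # 16.0% -- the ~15% ceiling
print(f"beyond the chasm: {beyond_chasm}%") # 84.0% -- everyone else
```

In other words, designing support only for the first two groups caps diffusion at roughly a sixth of the population, no matter how good the technology is.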

And likely the current approach still appeals mainly to the leading edge —

This gap is so significant in the case of instructional technology that it has so far stymied almost all efforts to bridge it. It has left us in a situation in which the early market seems to have approached saturation in its use of instructional technology; but in which mainstream adoptions are relatively few and far between. This failure to penetrate the mainstream did not happen in regard to technology use in general; the use of personal computers and workstations for personal productivity (especially word processing) is becoming almost universal in higher education. But despite the longer history of instructional technology, it seems to have stalled in its progress where other applications of similar technology have not. What is it about instructional technology as an innovation, or about the way it has been supported and “marketed” by its proponents, that has prevented its bridging the gap?

He cites four reasons why the diffusion of educational technology has not jumped much beyond the gap. Note that it’s not just the presence of technology for a wide group of people; their offices, classrooms, and pockets are full of technology. The missing diffusion is into the pedagogy.

The first is “Ignorance of the gap”:

We seem to have assumed a sort of homogeneity (in quality if not degree) of faculty willingness to experiment with and use instructional technology, thereby ruling out the possibility of recognizing qualitatively distinct subgroups with different attitudes toward technology and its use in instruction. From this perspective, some faculty simply have a higher degree of resistance to instructional technology than others; and stronger arguments, or greater incentives, or more support, is all that is needed to bring them around (as opposed to different arguments, or different incentives, or different modes of support).

Second is the “Technologists Alliance”:

The last decade has seen the formation of an alliance between “technologist” populations concerned with instructional computing. Those involved include faculty innovators and early adopters, campus IT support organizations, and information technology vendors with products for the instructional market. Ironically, while this alliance has fostered development of many instructional applications that clearly illustrate the benefits that technology can bring to teaching and learning, it has also unknowingly worked to prevent the dissemination of these benefits into the much larger mainstream population.

This leads to the third, “Alienation of the Mainstream”:

Moore also points out that the “overall disruptiveness” of early adopter visionaries can alienate and anger the mainstream (Moore 1991:59). The early adopters’ high visibility projects can soak up instructional improvement funds, leaving little or nothing for those with more modest technology-based improvements to propose; and their willingness to work in a support vacuum ignores the needs of mainstream faculty who may find themselves left with responsibility for the former’s projects after the developer has moved on to other things. And, finally, the type of discontinuous change favored by the early adopter has a tendency to produce disruptive side-effects that magnify the overall cost of adoption.

Ahem. Disruption. Totally appealing to 15%.

And finally, “Lack of a Compelling Reason to Adopt”:

in order to bridge the gap, one must first establish a beachhead on the further side, best done by defining an application of the technology that is of absolutely compelling value in pragmatic, mainstream terms. This is not the so-called “killer app” of high tech legend, but rather an application of instructional technology that offers value substantially in excess of the costs of adoption. The application will be one that performs an existing important task, or solves an existing problem in a markedly better way; or it will be one that enables something new to be done in a way that contributes significantly to instructional effectiveness.

And in 2014, we all know a compelling reason is 100,000 students or harnessing big data.

Yeah.

And in the closing, where we still stand today:

Along the same lines, let me suggest that the technologically driven revolution in teaching and learning that we have sought for so long is probably nothing more than a chimera. Revolutions in teaching, or in anything else for that matter, are created by revolutionaries, not by their hardware; though good hardware properly employed can certainly help them succeed. But no revolution, no matter how well financed and equipped, and no matter how good the motivating ideas, will be successful if the revolutionaries and their supporters fail to convince a significant proportion of the general populace to follow them past the barricades. Absent that, we have nothing but a failed revolution: some interesting ideas, perhaps, and some quaint examples of what might have been, but no revolution.

Long live the revolutionaries, wherever they may be lurking. Viva the — whatever.

Grab a copy of the paper and tell me how far we have come on this since 1994.

creative commons licensed ( BY-SA ) flickr photo shared by Symic

The post "Whatever Happened to Whatever Happened to Instructional Technology?" was originally squeezed out of the bottom of an old rusted tube of toothpaste at CogDogBlog (http://cogdogblog.com/2014/10/whatever-happened-to-instructional-technology/) on October 20, 2014.
