Jim Groom’s explorations for The Internet Course he is co-teaching with Paul Bond, and Paul’s post on Gopher, got me sorting the neurons and combing through the archives of my early 1990s days as an Instructional Technologist at the Maricopa Community Colleges.
Actually I started as a “Programmer Analyst / Instructional Systems”. I found on an old CD a QuickTime Video (at a whopping standard resolution then of 160×120 pixels) of some kid who looks like me introducing himself.
It was a multimedia CD-ROM (authored in Macromedia Director) I made to introduce the people and resources of where I worked, the Maricopa Center for Learning & Instruction. I guess I was thinking metaphorically then; the title was “The House of MCLI”, where the navigation was a blueprint of our office.
(I can only run the CD if I break out my 2002 iBook.) But in the directory of media was an image of the Gopher server I ran there, likely from 1992-1994:

Not the greatest photo. I might have used the first QuickTake 100 camera.
When I started and was trying to learn things like programming in HyperCard, I made a lot of use of free resources I found at places like the Info_mac archives at Stanford, on a machine I still remember was called “sumex-aim”. So one of the first things I started as a resource-sharing effort at the central office where I worked (Maricopa is a 10-college system spread over the sprawling Phoenix metro area) was not even an internet server, but an AppleTalk server on our college-wide network. I collected free programs and HyperCard stacks I was finding via FTP and hearing about on listservs (that was like social media via email).
But being a large system, we had staff, faculty, and students using both Macs and PCs… and that was one of the reasons I got interested in Gopher. At that time our email was still mainframe-based on a VAX, and from that interface (I was using NCSA Telnet to connect), which was text-only and monochrome, one could connect to a Gopher server.
You could start Gopher at any server (the default was the University of Minnesota, where it was created; get it? Gophers?). You got a menu list of directories that you could arrow down and select, and then see another list of choices. The items could be documents or images (I am not sure about sound or video). But you could also create what looked like a directory but was really an alias that could connect you to another Gopher server, or even to a folder down in the directory of another Gopher server. So by navigating up and down the tunnels, you could scurry around the internet and download files.
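The menu format behind those tunnels was charmingly simple, and it is easy to sketch. This is an illustrative Python parser of the Gopher menu line format, not anything I actually ran in 1993; the display strings and selectors below are made-up examples (though the University of Minnesota host was real).

```python
def parse_menu_line(line):
    """Parse one tab-delimited line of a Gopher menu.

    The first character is the item type ("0" = text file,
    "1" = directory), followed by the display string, then
    tab-separated selector, host, and port.
    """
    item_type = line[0]
    display, selector, host, port = line[1:].split("\t")
    return {"type": item_type, "display": display,
            "selector": selector, "host": host, "port": int(port)}

# The "alias" trick described above: this entry renders in the client
# as a local-looking folder, but its host field points at an entirely
# different server on the internet.
remote = parse_menu_line(
    "1Other Gopher Servers\t/others\tgopher.micro.umn.edu\t70")
print(remote["host"])
```

That last field pair is the whole trick: because every menu item carries its own host and port, a “folder” on my little Mac could drop you onto a machine in Minnesota.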
I think I used a graphical Mac client called TurboGopher, where you could click on folders and files; it made it look like you were navigating the Mac Finder, but it was the internet.
That was a lot better than a single file server limited to one organization’s network.

In hindsight, crude as it was, Gopher offered some of the attributes we’ve come to know on the open web. First of all, anything I put on my server could be accessed, and connected to directly, by anyone else on the internet.
So on that machine (I think it was a Mac II), the server directory was a directory on the Mac, and I could create folders of shareware, reports, or make a directory jump to another gopher server.
My office published a newsletter we sent out (on paper); by 1993 I was putting the Labyrinth (later the Labyrinth-Forum) on our little web server. I have all the back issues, but they are no longer on the Maricopa server I started, so I have begun putting some of that old stuff on my own server. Here is one from February 1993, “Tools for the Internet: Gopher”, and what my little server looked like:
The top item is a plain text document, the next some more how-to’s (and you can see how we designed the interface by putting “..” in front of directories). “Into the Internet” was likely software tools like Telnet and (later) Mosaic; I must have been making text versions of our newsletter (and at some point HyperCard/ToolBook ones); then more directories related to projects, a few shareware folders (Mac and Windows), and the last one was where I made connections to other Gopher servers.

Yeah, it looks crude, but it was half a step into a useful, navigable, connected space on the internet.
The article content:
Gopher is a menu-driven tool for “prowling” the Internet. At the selection menu item, you transfer to other Gopher servers, numerous library catalogs, archives of software to download, collections of journal articles, databases such as ERIC or WAIS, and more. Gopher takes you from one Internet site to the next and back. This friendly rodent orchestrates all of the transmission details.
Client applications for Gopher include TurboGopher for the Macintosh, “DOSgofer” for IBM/PC, and “Gopher v1.3.2” for the NeXT. These are available from the MCLI Public file server, in the “Into the Internet” folder. Look in the MCLI Public file server for more information on other Internet tools such as WAIS (Wide Area Information Servers) and WWW (World Wide Web).
Wow, I forgot about WAIS, which I think was an internet service to search an indexed archive of published papers. And yes, in February 1993, I was referring to the web, which in those days you always had to spell out with explanatory details like “World Wide Web”.
I did look a bit at the WWW via the VAX interface and a text-based browser in 1992, but it did not click for me until 1993, when a colleague named Jim Walters from Phoenix College handed me a floppy disk labeled “Mosaic”. He did not try to tell me what it was; he just said, “Try it”.

The rest is my history, learning HTML from the NCSA tutorial. I kept for years a ratty copy of my printout from it, the foundation of my own Writing HTML Tutorial.
The thing about those days was that our leaders at Maricopa had foreseen the value of both a system-wide network and of microcomputers (the old name for what we now just call a computer; “micro” for being much smaller than a mainframe VAX).
I had found out that every microcomputer in our office had a dedicated IP address (this must have been before DHCP), and that I could send a message to our IT department to create a human-readable name for each machine. Thus my machine was topaz.mcli.dist.maricopa.edu, and my colleagues had ones like ruby.mcli.dist.maricopa.edu, garnet.mcli.dist.maricopa.edu, etc. I was not fond of my Windows PC, which I named pyrite.mcli.dist.maricopa.edu
So in naming all the office machines I put my Geology degrees to work. For the servers I used formations in the Grand Canyon, so the Gopher server was tapeats.mcli.dist.maricopa.edu and my first web server was hakatai.mcli.dist.maricopa.edu

But the thing was, all I had to do was ask IT for a name for a machine, and by plugging it into the network, I could make those machines servers. In those days I did not have to ask for a server; I just turned a Mac SE/30 into the first web server I ran.
We had some visionary leaders then. When I first started, my office was publishing a new report on the state of instructional technology in the Maricopa system, authored by Scottsdale Community College faculty member Alan Jacobs, called “It’s a River Not a Lake”. I made a web version of the report and have just rekindled it on my own server.
The metaphor suggests that technology was not just one thing coming at us like a river’s flow; we would be engulfed by it all around us:
It’s a river; not a lake, this experience of ours with technology. While the examples so far have reflected the stream of changes in computing, those who use other technologies will recognize the same dynamics. Consumers of recorded music have been part of a media roller coaster ride: LP record/8-track/cassette tape/CD-ROM/ and perhaps digital tape. Have you tried to buy a new LP recently? Producers of music (many of them are artists and performers) have seen a two-decade change in digitally-synthesized and digitally-recorded sound, and in the control and manipulation of those sounds. Recorded music is being transformed into a genuinely different art form than live music.
It’s a river; not a lake. Our behavior, then, and our decisions need to reflect that reality. Namely, we should expect continual revision of the software tools we use. They are not onetime purchases with a onetime learning component. Rather, the software represents a continual cost in both time for learning and expense of upgrades.
The innovations identified in 1993 are interesting, especially for gazing backward at how the future was projected… there was interest in the MIT Athena Project, but it was deemed too expensive at the time:
The Athena Project from MIT points the way to a possible future, namely the distributed client/server model. In Athena many high-end workstations are networked together with some server computers. While most applications run on the workstations themselves, each of the server computers has specialized services to deliver to the workstations: handling printing requests and authorization, file storage, and user authentication, to name a few.
With Athena a user can sit at any workstation and have access to his/her personal files and have access to any available software program. Thus, Athena combines features that are common in mainframe computing, without a mainframe computer, with features that personal computer users have enjoyed, like local processing and a graphic user interface.
Because Athena is a network, inter-user communication is available both as an instant message and as E-mail.
Athena runs on workstations and servers that themselves run on the UNIX operating system. Much software currently available in MS DOS, Windows or Mac OS is not available under UNIX, so the current software selection is limited. Of course, as third party software is developed for UNIX, the number of options will increase.
In many ways Athena was ahead of its time. And Maricopa did make an early investment in microcomputers; there were Faculty Instructional Computing Literacy projects in the late 1980s, referred to by something like FICK-LIP, but I forget the exact acronym.
This 1993 report was actually a follow-up to the 1986 Master Plan for Instructional Computing, way before my time. I understood that in the mid 1980s, Maricopa made an early entry into system computing by establishing mainframe VAXes at every college, connected and synced to the central office. The first intent was to digitize administrative systems: HR, student records. I know when I started, faculty talked about some late-1980s experimentation with VAX-based tools for teaching and learning.

I don’t have that report, but there is an ERIC record:
Executive summaries are provided of a variety of projects undertaken by the Maricopa Community Colleges related to the use of technology, and to the staff and students in the district. Reports are presented for the following activities and plans: (1) Maricopa Community Colleges/Digital Equipment Corporation/Information Associates Partnership, a project to enhance the district’s management information system; (2) Telecommunications Improvement Project, Phase I: Needs Assessment, a project designed to review the district’s existing telecommunications system and to make recommendations for improvement; (3) Instructional Computing Master Plan, which offers recommendations for enabling the instructor to do instructional computing, provide programmer support, and exert leadership; (4) Library/Media Automation Project, for automating cataloging, public access, circulation, acquisitions, and media booking functions; (5) Center for Instructional Technology Strategic Master Plan, which outlines the center’s plans for increasing college access to its resources and support services; and (6) Information Technologies Services Strategic Master Plan, which includes general directions for improving the use of various educational and information technologies. Finally, a list of development projects undertaken by the college between 1983 and 1986 is presented.
That number 5 was critical for my present future: the “Center for Instructional Technology Strategic Master Plan” was what my director Naomi Story wrote in the very early 1990s as a proposal to establish the center that eventually hired a 27-year-old, green, mullet-headed Geology graduate school dropout.
We did have powerful leadership: Chancellor Paul Elsner (1977-1999) was a true visionary who led the greatest growth in the system and the investment not only in technology, but in the staff to support it. I barely remember a talk where he did some grand comparison of our then-current age to the 1939 World’s Fair. He also would give out copies of a four-act play that he wrote as a metaphor for everything he believed in for the system.
And Vice Chancellor Alfredo de los Santos was a huge influence on me; he was actually the last interview I had to get my job in 1992. He had this kind of country-boy Mexican American personality that hid how smart he was, and his gift was asking his famous “stoopid questions”. It was a fascinating leadership style.

I got to have breakfast with him a few months back; Alfredo is still as sharp and likes to laugh as much as he did back in the day. But he was also one tough SOB.
I found on my CD-ROM something I made as a joke for him, and it shows how crude my graphic design skills were then; I had made a little multimedia app tribute to Alfredo, riffing off of From Alice to Ocean, an amazing book of photography we had in our office (and you can buy that one used for $0.24).
As you can see, in 1994 I was priming myself for ds106.
I do remember the books Jim cited, ones that Andy Rush still had and had shared: “Entering the World-Wide Web: A Guide To Cyberspace” by Kevin Hughes and Brendan Kehoe’s Zen and the Art of the Internet. It’s fascinating that the latter was written when Kehoe was a student at Widener University AND that he negotiated a publisher’s deal “to ensure that the original edition of the book would remain free-of-charge in the internet for everyone to access.”

I am pretty sure Zen was available in plain text, distributed via newsgroups, listservs, FTP, Gopher, etc.
One of the earliest web books I remember was John December and Neil Randall’s The World Wide Web Unleashed; and look, you can get the second, 1996 edition for 57 cents (I just bought one; $3.99 for shipping).

I know it because at Maricopa I had gotten a free, autographed copy of the 1994 edition (October, I believe)… because they included one of my early web sites in it.
That site was “Community College Web”, a directory of community colleges with web sites. The first version was just a plain HTML file, and then with some help from a student programmer (Hi Derek!) we were able to make the site searchable using some sort of unix indexing engine on flat text files.

The page for my site (I had the first web server in Maricopa) shows how fancy web design was in 1994.

But there you go: early on, you make something, share it online, and people start giving you free books.
The book article and screenshot show I was still using a Gopher server… I had a good hunch about the web, but had not jumped in and dropped everything else. I think “MariMUSE” was some sort of MUD or MOO run at Phoenix College.
My last scanned find from my paper files was an article from June 12, 1995 in the Arizona Republic (scanned to a PDF) called “Home, Home on the Web: Build your own Net Page, Impress Other Surfers”, and it details some basic HTML. I saved it because it mentions my Writing HTML tutorial (although it linked to a mirror copy in Taiwan?).
I like the part that fills you in after all of the HTML details…
As to the second part of our second question (How do you get your page on the web so others can see it?), here are two options.
You can spend thousands of dollars setting up a web server. Or you can ask whether your internet provider will post the HTML page for you. Some providers will post pages free for their clients.
This last little bit I had posted a while back. In 1996 I was excited that one of my sites was included in a magazine, once called “The Net”, that was full of all these exciting new web sites. Every site was new then, and worth putting into print. My excitement waned a little when I saw my mention was in a special cover issue:
Yep, the Online sex issue. I still have that baby wrapped in cellophane.
And that is way more than enough nostalgia.
The post "Old Gold Days at Maricopa" was originally emerged from the primordial ooze and first walked on land at CogDogBlog (http://cogdogblog.com/2014/02/old-gold-days-at-maricopa/) on February 13, 2014.