Accessing Digital Literacy

Sarah Hildebrand

 

The first IBM PC, the 5150, went to market on August 11, 1981, at a price of $3,280. The machine weighed 11 kg, stood 15 cm tall, had a small black-and-green 11.5-inch screen, and ran Microsoft’s MS-DOS software – source: http://www.dailymail.co.uk/sciencetech/article-2591182/Dumb-users-Bill-Shake-Speare-The-jokes-Microsofts-programmers-hid-firms-MS-DOS-software-revealed.html

 

Born in the late 1980s, I am unmistakably a millennial. I grew up on the cusp of dumb-to-smart phones and a world increasingly tethered by the digital ether. I feel comfortable navigating online platforms, and a little anxious when I’m out of range of a cell tower.

Yet, my formative years were comparatively low-tech. I still remember floppy disks, MS-DOS, and dial-up. My first laptop was too heavy to be transportable and generated enough heat to burn skin. I had a Computer Applications course in high school that was really just a typing class. 

It wasn’t until college that my classrooms became “smart,” or at least technology-compatible. And even with each room outfitted with its own computer and projection system, very few professors utilized them with confidence. PowerPoint became a standard part of student presentations, and a couple of intrepid teachers let us experiment with video and sound recording equipment, but all written assignments had to be printed and stapled. My classmates and I built most of our digital literacy independently, surfing the web after (or eventually during) class.

Flash forward a decade, and institutional initiatives to digitize learning and outfit classrooms with the most up-to-date technologies have only increased. While some faculty have happily jumped on the bandwagon, using computers to enhance their pedagogy and craft innovative assignments—anything from podcasts to website design to collaborative writing—others have been less sold on the break from pen-and-paper-based learning. These scholars invoke research about the benefits of handwritten notes as evidence that laptops should be universally banned from the classroom, argue that computers are a gateway to distraction, and hypothesize that smartphones have perhaps just made us dumber.

Rebuttals to many of these arguments have already been made. They rightly expose how the uniform banning of technology is an ableist pedagogical move, discriminating against students who may not be able to take handwritten notes or otherwise outing them by making them an exception to the rule. They acknowledge that computers are far from the only distraction within our classrooms, asking us to rethink teaching practices that fall short of engaging our audience. 

However, while the myth that technology might somehow universally detract from learning has been largely debunked, the underexplored flipside of this is that requiring students to have and utilize technology can also create educational barriers. 

iPad in education classroom – source: http://rachelseciblog.blogspot.com/2015/10/tablets-in-classroom.html

 

Nowadays, we teachers often expect students to enter our classrooms with a baseline set of computer skills. We generally assume students will know how to log on to the school’s Wi-Fi and into our course websites—that they will have email addresses and be able to type. In fact, for those of us who didn’t grow up with the internet, or even a computer, we often trust (or fear) that our students know way more about technology than we do.

However, while at times this may be true, hidden behind this generalization is another set of assumptions about access and accessibility. We assume that our students have internet connections at home and laptops readily at hand, and that they are generally computer-literate before entering our classrooms. We stereotype millennials, and especially those who’ve come after, as permanently plugged-in and probably hoarding some enviable computer-coding skills that we ourselves missed the boat on.

Yet many students at CUNY are far from at home in the digital world. Rarely do more than a handful of my students own laptops. Some can only access the web on campus. And even students who are active on the internet are often less digitally literate than one might expect.

Whenever I have library sessions with my students geared toward conducting academic research, the most common complaint I receive is that the instructor is moving too fast as they navigate the scholarly databases. My students can’t remember where to click or how to get the results they want from educational technology, even though they spend hours browsing Facebook, Instagram, or Twitter. They can conduct a cursory Google search, but would be hard-pressed to explain Boolean logic.

And while that may prove a difficult test for many of us, more recently I’ve come to realize that even uploading an assignment through our course website or posting to an online forum is no small task for many students. An age-based bias often haunts our perceptions of who will or won’t be computer-literate in our classrooms, but even those who fit the millennial demographic aren’t necessarily well-versed in navigating educational technology. Blackboard is admittedly a poor platform—often awkwardly laid out and far from intuitive—but guiding some of my students through its Discussion Board feature has made me realize that many are unfamiliar with even basic website navigation.

Similar issues of access extend to contingent faculty. Adjuncts often can’t afford to purchase the educational technology associated with more “innovative” forms of pedagogy, and I’ve yet to teach a course at Lehman College in a room with ready access to a computer. Instead, obtaining technology is a multistep process of reserving equipment online, hoping it’s available, picking it up from the media center, hoping it’s functional, and lugging it back and forth from my classroom.

Accessing technology is a hassle. For adjuncts whose time is stretched thin by commutes to different campuses, it’s difficult to determine how to acquire technology in the first place, let alone find the time to shuttle equipment between buildings. As a Graduate Teaching Fellow, it took me a semester and a half just to get my official college email address set up. And when I finally figured out how to borrow media equipment, I was often thwarted by broken cables and cracked screens—one computer was literally held together with tape. I came to expect that, any time I wanted to integrate technology into my lesson, there was a 50/50 chance it would work. During a course observation, I once resorted to drawing pictures on the chalkboard while I waited for IT to bring me a new VGA cable. Luckily, both my students and the observer shared laughs over my lack of artistic ability, but the experience still left me reluctant to use technology in any high-stakes way.

And although, yes, there is always at least one student who knows way more about technology than I do and will readily volunteer to set up the projector while I start the lesson, often the majority of the class is just as out of the loop as I sometimes am. I baffled many students the other day when I used an Ethernet cord to quick-fix a problem with the school’s Wi-Fi connection. They had no idea the internet could come from a wall rather than thin air.

While it’s no longer an anxiety-inducing experience to check out equipment—Lehman has definitely updated its gear over the past few years, and I feel more adept at switching to a back-up plan—these experiences have raised a whole new set of questions. Not “Will the technology work?” but “Will my students be able to work the technology?”

Technology turns over so quickly that there is no universal platform. When I do manage to score access to computers for my class, my lessons sometimes get hijacked by crash courses in digital literacy. Now, before I design a computer-based project, I ask: Will my students be able to complete the assignment unaided? How much class time will I need to devote to explanation? Is digital literacy a core component of my course? And, perhaps more importantly, should it be?

CUNY Lehman launches virtual reality lab in the Bronx to teach students how to build in virtual reality – source: https://www.gearbrain.com/cuny-lehman-vr-lab-program-2440531284.html

 

Nowhere in any of the courses I’ve taught has the ability to use technology been mentioned in the standardized “course objectives” section of my syllabus, yet it’s a skill we expect all students to have magically acquired by graduation, or often to have brought with them as freshmen. When are students supposed to learn this skill? From whom?

I have never banned computers from my classroom. (Confession: I initially held a grudge against cellphones, but soon realized this was a discriminatory, classist move, too.) But I also have no great anecdote about how I revolutionized the learning process with educational technology. Instead, the ways my students and I employ computers are extremely pragmatic, perhaps even mundane. I use open educational resources to cut costs—all course readings are posted to our website. I frequently ask my students to Google vocabulary words, hoping less to make them digitally literate than to help them develop certain habits of mind—though digital literacy is a welcome byproduct.

Sometimes I find I am using less technology in my classroom than I’d like, or than I feel I should. But it’s not because I am lazy, a luddite, or in any way angsty about our increasingly digital world. I am simply torn. I wonder how I can ensure the success of twenty-five students with varying access to technology. Do I require a digital project that might put some of them at a disadvantage? Do I sacrifice other course content to make room for technology, which I was not told to teach but also told not to teach without?

The internet is not as ubiquitous as it may appear. As academics, most of us can hardly imagine life without email—as much as we might want to. But it’s worth remembering that this makes us part of a privileged class. And while my students are brilliant and up to the task, any language takes time to learn.
