Learning / Design / Technology

A sign says "one way". According to whom?

The job title “Instructional Designer” has become so commonplace that the two words are nearly inseparable. Of course we want instruction to be designed. Of course we need designers to shape and create instruction. Indeed, our public school systems expect teachers to create lesson plans for each period of each day. College instructors create course syllabi with daily class objectives that align with that semester’s student learning outcomes. We praise Understanding by Design.

However, in The Children’s Machine, Seymour Papert shares an anecdote in which a brilliant teacher runs an inspirational class and gets reprimanded for not having a lesson plan (pp. 59–60). Papert then uses this “account of a system defeating its own purposes in the attempt to enforce them” to illustrate how overly detailed plans and lockstep curriculum designs treat students as tools to be programmed, quantifying and sequencing every concept to be learned and eliminating the role of curiosity.

Who’s in Control?

The more carefully we attempt to design students’ experiences, the closer we come to Audrey Watters’ mantra of “Command. Control. Intelligence.”—and the less we expect students to contribute to their own learning. How will students manage to learn anything new after they graduate? Who will design their learning path then? Less sarcastically, how can we help students see that real learning happens as the result of curiosity more than design? How might we put them in control of their own learning processes?

Beware those who say “ed-tech” here. For plenty of folks, the answers to my questions involve a digital tool of some sort. According to them, some app or site or plugin can train students in a fail-safe process for controlling their own learning. To these people, ed-tech can solve any trouble we have getting students to learn.

Questioning Design

But who designs ed-tech, and what principles and pedagogies do they encode into it? Critical digital pedagogy challenges us to think carefully before adopting a technology in our classes, to consider what we might lose in exchange for use of any given tool. In today’s activities, we’ll use an iteration of Jesse Stommel’s Critically Evaluating Digital Tools. This exercise guides an investigation of how corporate policies and managerial politics influence the tools we use in our classes. Through this exercise, you’ll look at common tools through uncommon lenses and see how not questioning our tech is a questionable practice.

Today, consider how rarely we challenge the dominance of technology and design in our planning decisions, and remember that we started the week with an emphasis on the human side of teaching. As you go about your day, ask yourself these questions:

  1. How do we form outcomes for online courses?
  2. Whose voices contribute to the design of our courses?

Considering the origins of outcomes and the relative say students have over their learning might help shine some light on inherent inequities of ed-tech and the problems of assuming access or equity.

Redirecting Focus; Aligning Intentions

Earlier this week, I shared with participants a list of platforms I chose to use for this week’s course. In that document, I included a brief justification for each platform I selected. I did that with today’s Critically Evaluating Digital Tools exercise (derived from Jesse Stommel’s version) in mind. We should be able to articulate why the tools we use meet our needs and what they expect from students. Issues of accessibility, data management, privacy, and cost all factor into these decisions. For a great introduction to the complexity of choosing appropriate platforms for education—and the way platforms shape our thinking—check out my conversation with Chris Gilliard on the “Platforms” episode of HybridPod.

Just like our platform choices should have purpose and intentionality behind them, so should our project options. This week, “the usual” wasn’t cutting it.

The Way We’ve Always Done Things

Traditionally, #CritPrax participants spend the five days of DHSI brainstorming, designing, building, deploying, and assessing a brand-new open online course. It’s an insane goal, and the irrationality makes it enticing. Traditionally, CritPrax participants also spend four or more hours together each day sharing space, ideas, and conversations. This year’s different. Like its predecessor, 2021 is anything but traditional, and this year’s DHSI workshops don’t involve the same intensity of sharing time and space. Yesterday, I discussed how the demands of our situations often mean we have far less attention to dedicate to this workshop than we would if we were together in person. As a result, the traditional goal of creating an open online course now seems impractical.

Our project goals need to adapt to our situation.

A New Approach

In Tuesday evening’s Jam Session, we discussed how we might re-imagine the main deliverable for this course to better suit the demands of this week’s situation. Rather than building a course—however we might define that—we’re going to aim for a short document, perhaps something like a white paper or a research note. This document would target educators and administrators, engaging them in discussion about how to select technologies for a class or institution. Here, our goal would be to help shape thinking. (Isn’t that what courses do?)

While our project aim may be less ambitious, it is no less significant. Now, our goal is clarity. Simplicity. Guidance. Rather than building a course that demands significant chunks of time from visitors and creators alike, we’re keeping it simple. Today, we’ll identify key concepts of interest to each participant and find ways to weave them together into a collaborative document we’ll share at the end of the week.

Stay tuned!

Control and Design

As you interact with your work today, as you interface with various tools and platforms, keep asking yourself who has control over each space. Do users control their data? Does the platform control what ideas are acceptable? Who controls access to information? Who determines what others see? Then, consider how that level of control affects student agency.

For instance, use of social-media platforms can help connect students with a broad public audience. At first glance, this seems like a great opportunity. But then we have to consider the algorithms that determine how likely a student’s post is to be seen by others. Suddenly, students are writing for the algorithm more than for the human audience. In fact, as I write these words, I’m intentionally using transition words taken from a list in order to make this post more appealing to search engines. As a result, my language is being shaped by an algorithm as well as by my readers.
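To make that dynamic concrete, here’s a toy sketch of an engagement-ranking score. To be clear: this is my invention for illustration only. Real platforms’ ranking systems are proprietary and vastly more complex, and every signal and weight below is made up.

```python
# Hypothetical feed-ranking sketch. Every signal and weight here is
# invented for illustration; no real platform works exactly this way.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    author_followers: int
    early_likes: int        # engagement in the first hour
    has_trending_tag: bool

def visibility_score(post: Post) -> float:
    """Estimate how widely a hypothetical feed would circulate a post."""
    score = 0.5 * post.early_likes          # early engagement dominates
    score += 0.001 * post.author_followers  # audience size matters far less
    if post.has_trending_tag:
        score *= 1.5                        # trending topics get a multiplier
    # Note what's absent: nothing here measures whether the writing is
    # thoughtful, accurate, or valuable to a human reader.
    return score

posts = [
    Post("A careful reflection on pedagogy", 5000, 3, False),
    Post("Hot take!!! #trending", 200, 40, True),
]
for p in sorted(posts, key=visibility_score, reverse=True):
    print(f"{visibility_score(p):6.1f}  {p.text}")
```

In this toy model, the careful post from the well-followed author loses to the hot take. A student who internalizes that lesson starts writing hot takes.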

Teaching Machines, not Students

Can writing in online spaces be free of the influence of algorithms? I’m not so sure. If you use Google Docs, you’ve likely seen text predictions that suggest entire phrases that Google’s natural-language processing AI thinks you might be about to type. In other words, Google’s systems watch what you type as you type it, and they predict what you might say next. The system works so well because of the access Google has to everything people write in Google Docs and all the email that flows through Gmail servers. When we share written information with Google, we provide the content that teaches the machine. For those of us who teach for a living, donating our services in this way should unsettle us.
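To see the underlying principle in miniature, here’s a deliberately tiny sketch. Google’s actual system relies on large neural language models, so this bigram counter is nothing like its real machinery; it only illustrates the point that whatever text a system can read becomes the data that teaches it what people tend to say next.

```python
# Minimal next-word prediction from a corpus of user text.
# A toy bigram model, not Google's method; it only shows how the text
# users type becomes the training data behind the predictions.
from collections import Counter, defaultdict

def train(corpus: str) -> dict:
    """Count which word tends to follow each word in the corpus."""
    words = corpus.lower().split()
    following = defaultdict(Counter)
    for current, nxt in zip(words, words[1:]):
        following[current][nxt] += 1
    return following

def predict(model: dict, word: str) -> str | None:
    """Suggest the continuation seen most often during training."""
    counts = model.get(word.lower())
    return counts.most_common(1)[0][0] if counts else None

# The corpus stands in for everything users have ever typed into the system.
model = train("thank you for your time thank you for your patience")
print(predict(model, "thank"))  # -> "you"
print(predict(model, "for"))    # -> "your"
```

Every document we draft makes the predictions a little sharper. We are, quite literally, donating our teaching.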

And that’s the kind of thinking we want to address in our collaborative document this week. Starting with the idea of surveillance, we want to examine issues such as:

  • information privacy
  • user control
  • capitalism (why “free” tools are free to use)
  • and more!

We’ll shape our document today with the goal of publishing it Friday. There’s plenty for us to do (and think about)! Meanwhile, for those auditing this week’s course, here’s Wednesday’s activity checklist.