Humane Q&A: Responses to ALT-C Keynote

*Image: A bandanna-wearing dog tilts its head, awaiting responses to its curiosity.*

My keynote about Humane Education at the ALT Conference 2024 generated a number of questions in the Q&A forum that I did not get to address while on stage. Below, I present the responses those questions prompted.

Do you have any favourite tools to use while teaching?

Movable desks. We often forget that analogue, physical tools hold as much power as our spiffy digital ones, and the one tool I cannot run a class without is modular furniture.

Business-department buildings in the U.S. are notorious for using long, narrow, tiered tables with permanently affixed chairs that might pivot but cannot be relocated. The design of the room tells everyone in it that the only person of value stands behind the podium, and everyone else merely prevents the chairs from returning to their default state.

The greatest resource in any class is the experience and energy of the learners, and furniture that allows students to collaborate unlocks the generative power of any classroom.

As for my favorite self-empowering tool, I bring an old set-top box with me to my classes. I have turned off every app, feature, and gizmo on that device, and I’ve set the casting receiver to require a password. (If brand-specific language helps here, I use a second-generation Apple TV, removed all apps, and use AirPlay. But to the point of agnosticism, this same approach works in an Android/Chromecast ecosystem.)

When I enter a classroom, I connect the set-top device to the room’s projector, then stream from my tablet’s presentation app. This setup lets me wander around the room and engage directly with students (instead of staying behind a podium) while putting content on the screen.

Any time I need to demo something, I grab my laptop and stream from there. I can sit with the students and show them what’s on my screen while being *with* them, both physically and intellectually. Any time I’ve been observed by my colleagues, they do a double-take and react as though I’m a magician.

Each of the tools above is a simple solution that fades into the background and puts human connection at the forefront of the classroom experience. To me, they exemplify the best of Humane Technology.

What one question should we be asking tech companies, which we’re not, before using their tech?

Two questions shine a light on important issues:

  1. Where does the tech’s money come from? This means both following the funding sources and identifying the product being sold. Google Search, for example, gets money from advertisers based on the specifics of the user profiles the company allows advertisers to target. Understanding the kinds of profiling involved can help users understand how their data and behaviors are subject to surveillance and influence by those advertisers.
  2. Beyond the features and benefits, what does the tech actually do? Focus here on functionality, not marketing. For instance, what WordPress actually does is construct webpages dynamically on demand: each request pulls post and page content from the tables of a MySQL database, then assembles it using PHP code from plugins and theme templates. Understanding those pieces helps users know what’s going on—both when things work and, importantly, when they start to break. (The sketch below makes the dynamic-construction part concrete.)
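
To illustrate what “constructing pages dynamically” means, here’s a minimal Python sketch. It is a toy stand-in, not WordPress’s actual code: sqlite3 substitutes for MySQL, and an f-string substitutes for PHP theme templates, but the principle is the same: no page exists as a file; each request triggers a fresh database lookup.

```python
import sqlite3

# Toy stand-in for a WordPress database: a single table of posts.
# (Real WordPress spreads content across many MySQL tables,
# such as wp_posts and wp_options.)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE posts (slug TEXT PRIMARY KEY, title TEXT, body TEXT)")
conn.execute(
    "INSERT INTO posts VALUES (?, ?, ?)",
    ("humane-qa", "Humane Q&A", "Responses to questions from the keynote."),
)

def render_page(slug: str) -> str:
    """Build a page's HTML on demand: nothing exists as a static
    file; each request triggers a database lookup."""
    row = conn.execute(
        "SELECT title, body FROM posts WHERE slug = ?", (slug,)
    ).fetchone()
    if row is None:
        return "<h1>404: Not Found</h1>"
    title, body = row
    # The "theme" here is a bare template; WordPress would run
    # PHP template files and plugin hooks at this step instead.
    return (
        f"<html><head><title>{title}</title></head>"
        f"<body><article><h1>{title}</h1><p>{body}</p></article></body></html>"
    )

print(render_page("humane-qa"))     # renders the page
print(render_page("no-such-post"))  # falls through to a 404
```

When a WordPress page breaks, it’s usually one of those steps failing: the database query, the theme template, or a plugin hook. Knowing the pieces is what turns a mysterious error into something diagnosable.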

Aren’t edtech professionals also disempowered by edtech?

I’m not the best person to answer this, as I’m not in that community and can’t speak to the situation with any meaningful experience. Everyone I’ve chatted with who works in the industry views edtech as an opportunity rather than a threat, and from what I’ve seen, optimism in the edtech sector is pervasive. Thus, my gut says no, but I suspect the person asking this question disagrees and could provide better insights than I can.

[Related to your] call for review boards for tech adoption: we do have to do data-protection impact assessments for new technologies, but maybe more could be done here to consider ethical use of these technologies alongside the privacy and security aspects.

I fully agree. I’m encouraged to hear you have data-protection reviews, and I’m envious of the UK GDPR and DPA. However, those only address the use of student/user data; they don’t consider the effect of the technology on the user or compensation for the user’s labor.

For instance, the value of Turnitin comes from its massive database of student-generated materials. Many teachers force students to donate their labor to that database out of a default sense of distrust. The database of student work could be the most secure vault in the world, and we could remove all personally identifying information, but we’re still mandating the donation of student labor to a for-profit company. That approach would pass a data-protection impact assessment with flying colors, but it’s still unethical.

My call for an edtech IRB is for a review that looks beyond data protection (a real and valid consideration) to the impact of the tool’s use on students’ psyches and wellbeing.

What would you want to say to educators not feeling comfortable with putting their students in the driver’s seat?

I read this question after reading the Q&A forum for the student-panel keynote the day after my talk. In that set of questions, someone said something to the effect of, “You students said you don’t like something, but isn’t there a benefit of pushing you out of your comfort zone?”

That same mentality applies here, and I find that teachers often feel too comfortable in their positions. We don’t often expect our professional experience to push us outside our comfort zone.

Yet the more comfortable and settled teachers get, the less responsive they become to the needs of their students in the moment.

Any time I’ve tried giving students more control over their learning and the direction of a class, my frustrations have come from being too timid, not from being too bold. My suggestion to folks uncomfortable with letting students drive is this: If we engage students as mature, responsible adults, we can negotiate situations to mutual benefit. (Recall the “benefit to all while protecting the minority” bit I presented in my talk.) 

When we give students control of their learning, we need to help guide them to prevent foreseeable problems while allowing space for experimentation and learning through challenge/struggle/mistake/etc.

Fundamentally, I assert that because we cannot think on a student’s behalf, we cannot presume to understand a student’s path through learning. And because they’re the ones doing the work of learning, we need to trust them to know themselves better than we do. (We know the content/discipline better than they do, though, so effective student-directed learning is actually a partnership.)

How do we get tutors to be confident in the tech first in order to enable the humane approach?

The harsh, blunt angle: I question why the tutors are using tech they’re not comfortable with. If it’s being imposed by the department or institution, that policy warrants scrutiny. If they’re uncomfortable with it due to inexperience, that’s on them, and they should learn the tech before implementing it.

The compassionate, optimistic angle: There’s tremendous benefit for students in watching a professional adult learn new tech. We can model the way we explore and learn new tech as a demonstration of process—but students need to know that’s what’s happening. There’s value in demonstrating how to learn a tool just as there’s value in demonstrating how to use a tool.

You state that edtech is disempowering students, but would you not say that the current education format is also disempowering students?

I’m afraid I can’t discern what “current education format” is under scrutiny in this question. That said, I’ll respond by saying edtech doesn’t have to be digital or online—there’s a reason I put a #2 pencil on one of my slides, and my response above about desk arrangement offers another example.

While listening to the student-panel keynote the day after my talk, I was struck by how often the panelists mentioned lectures, recorded or otherwise. That’s an education format that’s both thousands of years old and deeply problematic. In a modern world with nearly universal access to nearly infinite books and videos on nearly any subject imaginable, why are teachers still using class time to lecture? The greatest, most distinctive resource in a class(room) is the unique collection of minds and experience it offers. The learning experience should use that as its primary resource, not a lecture from the faculty.

Doesn’t big tech disempower everyone? How can we bring it back down to a user / individual level?

In one sense, big tech empowers people by granting greater abilities and opportunities. Therein lies its seduction. But it tends to reproduce, if not strengthen, existing inequities, granting access and ability to those who already have confidence and resources.

Self-consciously writing as a middle-aged white man with a beard in a tenure-track position, there’s only so much I can say about how big tech can be wrangled to benefit everyone—my view of it and experience with it are colored by my position of privilege. The best way to bring big tech’s benefits to an individual/user level is to put authority over big tech in the hands of people who have the best interests of the individual user at heart. Since that’s not how American capitalism works, I’m afraid I don’t have a ready solution, beyond “ask marginalized people what they need.”

What would you say to awarding bodies to enable more options of assessment?

Adding to my answer above: awarding bodies need to ask themselves what about their criteria implicitly benefits a certain type of person.

A good first step is to make explicit the hidden curriculum. So much of education—particularly higher education—is predicated on cultural norms or familial experiences that our institutions do a poor job of acknowledging, to say nothing of teaching. 

Many institutions (and awarding bodies) were built by and for straight white men of privilege, so they assume a certain cultural context for those they recognize. This connects to my point about transparency in my talk: Just like we need to be honest about what our tech does, we also need to be honest about what we expect from our people.

A small example from my experience can, I think, transfer to outside situations: Many classes list first-year writing as a prerequisite, but there’s nothing in the content of FYW that’s relevant to the courses with the pre-req. What many classes actually require is sophomore standing. Course prerequisites and award requirements all need to be honest about their real expectations, not just the expectations that have emerged through socio-cultural momentum.

Should some of the challenges you lay at the door of edtech actually be posed to the uni administration?

Yes! I argue that the responsibility lies with all of us who are in any way involved with edtech decision-making. My proposal for an IRB for tech adoptions absolutely goes directly to administration, as they would need to form that panel and establish the principles by which it assesses potential tech adoptions.

Do you think it’s important that Ed Tech Professionals understand the needs of the discipline they are trying to support?

Absolutely. Otherwise, they won’t know what support to offer. This likely requires open, frequent communication between representatives of both the edtech provider and a disciplinary representative.

You challenged us to care for each other virtually. What’s a good starting point?

Ask people how they feel—and then respond appropriately with empathy. This even works in asynchronous environments, in which a deliberate review of state-of-mind allows space for empathy, compassion, and a sense of awareness. Jakob Gowell created an Affect Tracking activity for an online class we co-facilitated in 2022 that demonstrates this approach.

How do you, as a tutor, stay up to date with the real world and its digital needs and usage?

This should coincide with the process of staying up-to-date on current thinking in the field, as newer methodologies or platforms tend to emerge in the literature (scholarly, trade, or otherwise) of a discourse community.

Your thoughts on Distance Learning are really interesting. I am wondering if you have practical examples of what a successful programme would look like, if we are remembering that people are in physical locations.

Successful education is education that has application for the learner’s future. As such, a successful distance program helps learners see relevance in their own contexts, including their community, environment, circumstance, and location. The more a program draws from, or applies to, the context a student brings to the course, the more meaningful that program will seem to the student.

How does ALT reconcile Chris’s message of data transparency and being humane with our learners with Anthology being the conference headline sponsor, a self-proclaimed data insights company which aims to turn our learners into data points?

I obviously cannot answer on behalf of ALT, and I support presenting this challenge to the organization for accountability and discussion. That said, it is possible for a company like Anthology to be transparent about what they do to/with student data points. Connecting this with a question above, it becomes the responsibility of administration to monitor, evaluate, and communicate those actions with (or on behalf of) students.

To be clear, if a company’s product works by using student-generated data, that in and of itself is not problematic. If the company doesn’t disclose what data it collects and how that data is used, shared, stored, and/or sold, that’s where trouble starts.

For instance, I’m a huge fan of Instructure’s Canvas product. I think it’s hands-down the best LMS available, particularly due to its ability to integrate with a number of external workflows and the ease with which it permits open publication of course shells/content. However, I struggle to accept the Canvas Data product, which takes data harvested from students’ use of the platform and sells access to that data and its analytics back to the institution. I question how much students know about the tracking and surveillance the Canvas platform subjects them to and the insights institutions draw from that information. 

If students know the ways they’re being surveilled, have the ability to opt out of that surveillance, and know what decisions are made as a result, that’s humane transparency. I find such transparency frustratingly rare.

Wouldn’t you agree that using current professional-life technology in the classroom (e.g. Teams, Slack, etc.) would be useless when we know tech tools have a realistic lifespan of 5-10 years? E.g. Who uses Skype anymore?

No, I absolutely do not agree. Those who learned to use Skype gained transferable skills when the majority of organizations migrated to Zoom or Google Meet. We just need to keep the principles of agnosticism and robustness in mind. Teaching students (or employees, for that matter) how to use video-conferencing tools is great. Teaching them that Skype is the only tool that can do such things causes frustrations down the line. Teaching folks how to engage in a collaboration platform like Slack or Teams is great, so long as it’s with the understanding that the platform being used is one option among many that offer similar functionality. Bonus points if the instructor/institution explains why they selected their preferred platform, adding transparency to the mix.