Ghost in the Ether: My Decision to (Sort of) Leave Facebook

It was a dark and stormy night… No, really. It was a dark and stormy night, and some photographer snapped a pic of it.

I’m slow to follow online trends. Part of it is skepticism. I don’t like creating accounts with new providers only to have them sell their user data to the highest bidder. Part of it is exhaustion. With the rapid rise and fall of massively popular platforms (think Vine, for instance), I hesitate to buy into something until I’m fairly convinced it’s stable. And when the trend is to abandon ship, I’m slow to follow that one too. After dozens of outlets and colleagues extolled the virtues of leaving Facebook, I finally decided to make my own exit. I waited until June 2020 to do what I should have done long ago.

Back in 2013, Estee Beck did what was then almost scandalous: she broke up with Facebook. At the time, many folks our age had fallen into a comfortable routine with Facebook, making consistent posts, keeping tabs on our friends, and sharing professional publications and advice. But not Estee. She saw the warning signs of privacy erosion and online surveillance early on. Her threshold for departure was much lower than others', though I'm not sure whether lower or wiser or more prescient best describes her limits. Her exit happened years before the Cambridge Analytica scandal brought privacy violations and data-mining to the forefront of public perception. At least for a little bit.

Today, by contrast, people seem rather ambivalent about the platform. It's been a while since its last major PR crisis. That ambivalence will last only until the next one, when we hear yet again about Facebook's failure to protect the privacy of its assets — I mean, users.

Selling Identity

That last point finally got me to make up my mind: I grew weary of creating content for the platform. I essentially worked for Facebook, not only for free but at my own expense. Everything I wrote or posted was used to build an ever-more-complex profile of my interests, allowing marketing and political agencies to target me with uncanny specificity. In other words, by providing content to Facebook, I helped them learn how to manipulate my attention and decisions.

Meanwhile, I was under the mistaken impression that I was doing those things to my friends. I thought I could use Facebook to draw people's attention to issues I find important. I thought I could influence people's thinking (or teaching, or voting) by posting things that would change perspectives. Instead, I contributed to an echo chamber, rarely saying anything that came as a surprise to my friends. Those who didn't take an interest in my posts saw less and less of me as Facebook's algorithm noted their lack of engagement with my content. If I didn't create content good enough for Facebook's algorithm, it diminished my audience and influence, shrinking my own echo chamber and including me in fewer of others'. Facebook's algorithm punishes users for donating substandard content to the platform.

Facebook Audiences

Politics

Two things highlighted the situation for me. The first was politics: any time I posted anything remotely political, one friend would launch into a bit of a rant in the comments, invariably agreeing with the heart of my post but pointing out how I'd only scratched the surface of the real issue or conspiracy or root cause. Nik's a well-meaning keyboard warrior with an ability to see contextual influences where others see isolated incidents. My political posts would often draw another friend, John, who enjoyed poking the bear. If I said anything that leaned extreme, John would chime in with commentary, often infused with sarcasm. We'd get into a bit of a debate, and after a couple hours we'd get to the core of our positions, find agreement, and move along.

Lather, rinse, repeat.

Teaching

The other thing that highlighted my relationship with Facebook as a platform content creator came from posts I made thinking they were innocuous and of interest to my professional connections. When Hybrid Pedagogy published a new article I liked — and especially if we published a podcast episode I made — I would share the piece on Facebook. And nobody would engage. It was like posting into a vacuum, despite my conviction that half my list would become better teachers if they'd pay attention.

It wasn’t until the third or fourth time that John (yes, the same one) referred to “that pet-a-goggy stuff” that I finally got his point: I insisted on posting professional content on a profile largely made up of friends who are not teachers. They simply did not care that I published a podcast episode or posted someone’s excellent article to the journal. Pedagogy isn’t part of their world, so no matter how exciting the pedagogical views of the articles/episodes were, they simply wouldn’t connect. I kept shooting myself in the proverbial foot by posting content irrelevant to the audience.

How to Train Facebook

Now some might kick into solution mode here and recommend making lists in Facebook to better target my messaging. Using lists, I could post political stuff such that one group of friends could see it while keeping it away from others, like my ultra-conservative uncle who wants the glory days of Reagan to return. I could post pedagogical stuff so my teacher-friends could see it but spare my non-teacher friends the interruption. I could post local commentary for my friends in Orlando or Tampa. Or I could post about LGBTQ+ issues to friends in, or supportive of, that community. And so on.

In short, I helped define my own echo chambers, and I taught the algorithm about my friends. It’s a simple process:

  1. Create a group around a shared interest (say, teaching)
  2. Use that group any time I post about that interest
  3. Because we have a shared interest, folks in that group respond more favorably than average to my posts. (Teachers are more likely to respond to teaching-related content.)
  4. The algorithm tracks the successful engagement of my posts, noting that teaching-related content resonates with everyone on that list. The platform then adds the “teaching” interest to every profile on my list. I taught the platform that we have that characteristic in common.

Lather, rinse, repeat.
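If it helps to see that loop concretely, here's a toy sketch in Python. Everything in it is my invention — the names, the numbers, the threshold — since Facebook's actual algorithm is proprietary. The point is just the mechanism: a group's engagement teaches the platform something about every member of that group.

```python
# Toy model of the feedback loop: if a hand-picked group engages with
# posts about a shared interest, the platform tags every member of the
# group with that interest. All names and numbers are hypothetical.

def tag_shared_interests(profiles, group, interest, engagement_rate, threshold=0.5):
    """Add `interest` to every group member's profile when the group's
    engagement with interest-tagged posts clears the threshold."""
    if engagement_rate > threshold:
        for member in group:
            profiles.setdefault(member, set()).add(interest)
    return profiles

profiles = {}
teacher_group = ["Ana", "Ben", "Cleo"]

# My "teaching" posts to this list get high engagement...
tag_shared_interests(profiles, teacher_group, "teaching", engagement_rate=0.8)

# ...so everyone on the list is now profiled as interested in teaching.
print(profiles["Ana"])  # {'teaching'}
```

In other words, the list I curated by hand becomes training data: I did the labeling work, and the platform keeps the labels.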

Feed the Algorithm

Here’s a far older (and far more innocuous) example: Years ago, before streaming music services like Pandora, Spotify, and Apple Music became popular, most folks kept their own libraries of music housed locally on their own computers. Many of us built those libraries from CDs we owned and a few songs/albums we downloaded here and there. And most of us made playlists from songs we thought went together well, for whatever reason. Maybe one set of songs had similar beats. Perhaps another worked great for road trips. And maybe some of us had fun collecting the songs used in Apple product advertising. I’ll come back to that last one in a minute.

Whatever the reasons behind each playlist, we determined that certain songs belonged together. If enough people agree that two songs belong together, others might want to know that, too. Apple realized that, if they could gather those connections, they could help people discover related music essentially through crowd-sourcing. To achieve this goal, Apple created iTunes Genius.

The principle behind Genius was, well, genius. It was also simple. Apple asked millions of users worldwide to contribute information about their playlists. In this way, Apple learned about the connections people make between songs. Armed with that information, Apple could connect users’ music in ways so appropriate it seemed magical. The larger someone’s collection of songs, the more likely they were to forget about some, and the easier it was for Genius to make matches. It became a neat way to make large music libraries feel fresh and surprising again. (It also undoubtedly formed the foundation of Apple Music recommendations, but that’s getting ahead of the story.)
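The crowd-sourcing idea itself fits in a few lines of Python. This is only a sketch under my own assumptions — the playlists and song titles are made up, and Apple's real matching was surely far more sophisticated — but counting how often two songs share a playlist captures the gist:

```python
# A minimal sketch of crowd-sourced song matching: count how often two
# songs appear together in users' playlists, then recommend the songs
# that co-occur most with a seed track. Playlists here are invented.
from collections import Counter
from itertools import combinations

playlists = [
    ["Eple", "Sofa Rockers", "Walkie-Talkie Man"],  # an "Apple ads" list
    ["Eple", "Sofa Rockers"],
    ["Road Trip Anthem", "Highway Song"],
]

# Tally every pair of songs that shares a playlist.
co_occurs = Counter()
for pl in playlists:
    for a, b in combinations(sorted(set(pl)), 2):
        co_occurs[(a, b)] += 1

def related(song, n=3):
    """Return up to n songs most often playlisted alongside `song`."""
    scores = Counter()
    for (a, b), count in co_occurs.items():
        if a == song:
            scores[b] += count
        elif b == song:
            scores[a] += count
    return [s for s, _ in scores.most_common(n)]

print(related("Eple"))  # ['Sofa Rockers', 'Walkie-Talkie Man']
```

No one user has to explain why two songs belong together; the pattern emerges from enough people making the same connection.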

The Creepiness Factor

Remember that playlist of Apple product advertising I mentioned? This one cracks me up. I used to work for Apple. For a while I was responsible for setting up people’s computers for them, installing software they’d purchased or transferring data from a PC to a new Mac. I was the first person to turn on these folks’ new computers. Back then, every time someone upgraded their OS, the installer started with pleasant music in the background. That meant I heard the initial setup music a lot. I mean, a lot — often a dozen times a shift.

One day, I got the song from the most recent update stuck in my head. I asked iTunes Genius to play songs related to it, hoping to remove the earworm. iTunes played the song I started with, from the most recent software release. Then it played the song from two releases prior. Then it played the song from the “Pods Unite” ad featuring a VW Beetle and an iPod. And then “Walkie-Talkie Man,” used in an early ad for the iPod. This went on for over half an hour. I recognized every song as being from Apple marketing. I’d been offered a docent-curated Apple Aural History tour. People apparently made playlists of songs used by Apple. Enough other people connected “Eple” and “Sofa Rockers” that, despite being unrelated songs, we decided they belong together. That association exists thanks to choices made by software teams building installers.

One day, I got the song from the most recent update stuck in my head. I asked iTunes Genius to play songs related to it, hoping to remove the earworm. iTunes played the song I started with, from the most recent software release. Then it played the song from two releases prior. Then it played the song from the “Pods Unite” ad featuring a VW Beetle and an iPod. And then “Walkie-Talkie Man,” used in an early ad for the iPod. This went on for over half an hour. I recognized every song as being from Apple marketing. I’d been offered a docent-curated Apple Aural History tour. People apparently made playlists of songs used by Apple. Enough other people connected “Eple” and “Sofa Rockers” that, despite being unrelated songs, we decided they belong together. That association exists thanks to choices made by software teams building installers.

Political Influence

While music preferences are silly and help set a mood, political preferences are more consequential and help establish governmental authority. My use of Facebook helped it target and influence political views just as my use of iTunes helped Apple share musical affinities. My constant posting on Facebook contributed to the problem, every time, just by virtue of how the platform operates.

So I stopped.

I quit posting stuff, quit scrolling endlessly. I only visited Facebook when I had something specific to look up — and that happened rarely. After a week or two, I found I missed nothing. The FOMO I thought for sure would kick in and get me running back for more? It never appeared. The lack of social connection I thought for sure would compound the isolation of pandemic-induced quarantine? I never felt it. In fact, keeping myself off Facebook helped me look for more meaningful, longer-form means of communicating. By removing the lowest common denominator, I’ve forced myself to look for more valuable sources of news, ideas, and connections.

I’m connected with a number of friends solely via Facebook. I worried that deleting or deactivating my account might have consequences I wasn’t prepared to accept. Instead, I’ve deleted all my photos, videos, check-ins, and posts, leaving my account a shell of what it had been. I’ll share a link to this post as my only content and leave it at that. Emphasis on “leave”.

In my next post, I plan to share how that departure, and the search for better sources it required, had unintended and much-needed consequences beyond my social life.
