The almighty internet algorithm, we know it's out there, dishing out content it thinks we care about most, but how does it really work? As communications pros, it's easy to feel powerless in the face of tech giants who get to pick and choose what lights up our screens. But there's a way to become active participants in these systems, rather than just passive users.
In this episode, Cameron Hickey, CEO of the National Conference on Citizenship, walks us through the payoffs and pitfalls of content filtering. We also have algorithmic puppet master extraordinaire Gretchen Andrew, who shares how she tricked Google's search engine into showing people her art instead of presidential election results.
Join host Brian Tomlinson as he guides us through practical tips on how we can work WITH algorithms to bridge the gap between humans and computers.
Gretchen Andrew: I very much wanted to show people that part of what makes manipulating algorithms easy is not actually an understanding of computer science, but an understanding of human nature.
Brian Tomlinson: You're listening to Infernal Communication, brought to you by Staffbase, and I'm your host Brian Tomlinson. When we open our social media platforms, they all look similar in many ways, but the content varies from device to device and person to person. For me, for example, my Instagram feed is laden with football memes (that's soccer for you, Yanks), coffee art, books, stoic quotes.
Motivational Speaker: The best answer to anger is silence.
Brian Tomlinson: And motivational videos.
Motivational Speaker 2: What inspires you?
Brian Tomlinson: But what I see on Instagram, TikTok, Spotify and so on might be different from what you see, or from what my neighbors, my boss, or our extended family see. This phenomenon of content filtering based on our interests is not a new concept, and so much of what we hear on the news about recommendation algorithms can sometimes feel like a doomsday warning.
TV Anchor: How often do you make decisions based solely on computer algorithms?
Brian Tomlinson: And the alarm bells are for good reason. Recommendation algorithms have become a part of the norm. Tech giants like Meta and Google have complete control over the content that is served to us, and there's not a lot of clarity on how exactly they're doing it.
Rapper: Hold up, don't scroll. Let me ask you something first. Can someone please explain how this algorithm works? I got one viral video, but now it's old news, and now every one I post gets like 100 views.
Brian Tomlinson: So what is there to do about it? Who are we, the social media users, in comparison to major tech companies? Is there a way we can influence how we participate in these algorithms? In today's episode, we're taking a look at how these systems control what we see and what, if anything, we can do about it. We're going to speak to an artist who tricked Google into showing people her art rather than the results of the 2020 US presidential election. But to kick things off, we spoke with Cameron Hickey. He's the CEO and director of a non-profit called the National Conference on Citizenship.
Cameron Hickey: I've spent the last five years doing work in the space specifically of looking at online communications, particularly around mis- and disinformation.
Brian Tomlinson: Cameron has spent a lot of time analyzing how information circulates on the internet, so he's going to clear up some of the basics for us, like what even is an algorithm really?
Cameron Hickey: Algorithms can be incredibly simple. It's important to note that long division is essentially an algorithm, right? It is a process by which you are able to process information and do something with it. It can be a set of instructions. These days, when we refer to algorithms, we are thinking about things like the newsfeed algorithm or a recommendation algorithm. It is a much more complex and sophisticated set of instructions, or code, or things that integrate machine learning, which is what we sometimes mean when we say AI. It's software that takes some information and processes it and gives you some kind of output that is more valuable in some way. So the newsfeed algorithm on Facebook or the feed algorithm on Twitter, etc., are bits of software that are making unique decisions, often for you, that determine what kinds of things you see.
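Cameron's description, software that takes information in and ranks what you see, can be sketched as a toy feed ranker. Every name, field, and weight below is invented for illustration; this is not any platform's actual code.

```python
# Toy illustration of a feed-ranking "algorithm": score each post by
# how many of its topics match the user's interests, then sort.

def rank_feed(posts, interests):
    """Return posts ordered from most to least relevant to the user."""
    def score(post):
        # Count how many of the post's topics the user cares about.
        return sum(1 for topic in post["topics"] if topic in interests)
    return sorted(posts, key=score, reverse=True)

posts = [
    {"id": 1, "topics": {"politics"}},
    {"id": 2, "topics": {"football", "memes"}},
    {"id": 3, "topics": {"coffee", "books"}},
]
feed = rank_feed(posts, interests={"football", "memes", "books"})
print([p["id"] for p in feed])  # prints [2, 3, 1]
```

Real systems layer machine learning and thousands of signals on top, but the core shape is the same: input in, ranked output out.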
Brian Tomlinson: Sure. So simple. These algorithms are constantly working behind the scenes and it's no surprise that they're a pervasive part of the way things spread online.
Cameron Hickey: So algorithms bring, in general, I would say, two kinds of goods, right? On the one side, they can create profits. They in fact create billions and billions of dollars in profits. On the other side, they can create efficiency for us as individuals. So something that was long and tedious or not very meaningful can become faster, more efficient, and much more meaningful. I would argue that when I go on Amazon.com and it recommends other products to me based on the things that I've shopped for in the past or based on the product that I'm looking at right now, it's actually helping me do whatever I was going to do more efficiently.
Brian Tomlinson: With so much digital content, it does make sense to have some kind of filtering mechanism in place. We'll never be able to see everything that's out there on the internet. So in a way, algorithms work like the bouncers at a nightclub, dictating who gets let in and who has to stand outside for a little longer.
Bouncer: Brian, what have I told you? Back of the line.
Brian Tomlinson: That's a powerful position, but where's the oversight?
Cameron Hickey: I think most commonly when we think about algorithms, the risks are about biases so that they are reinforcing problems that might already exist and exacerbating them. That can take lots of different forms. Right. So if we think about an algorithm that's designed to sort good resumes from bad ones when you're trying to hire someone, that kind of an algorithm might reinforce pre-existing biases that might favor men, that might favor white people, that might favor people from a particular geographic area as opposed to another. Right. And therefore discriminate against other people, and those would be biases that are latent in society that got reinforced.
In social media I think for a long time, what we've been concerned about is that algorithms are trained to show you content that matches what you're interested in or that is engineered based on some other external value. So in the case of platforms, that value might be how much time you spend on the platform.
And so those algorithms, when they're engineered to show you more stuff like the stuff you like or show you more stuff that's going to keep you on the platform longer, the risk is that the things that it's showing you might be harmful in some way. Similarly, people have referred to the rabbit holes that exist on social media. That's kind of a version of that. I think sometimes it can just be wasteful, like there's a more marginal scale problem, which is that we shouldn't actually be sitting and focusing on screens all day long. And by making the platforms very addictive, we might be doing that as opposed to anything else, including getting exercise, interacting with one another, etc.
Brian Tomlinson: So algorithms are designed to keep us scrolling. We know about this. Some places are starting to implement new regulation to try to combat these tactics that keep people so hooked. The European Union, for example, recently passed the Digital Services Act. It's putting in rules to rein in what these tech giants are allowed to control and gives users more agency over what they see because who wants to be stuck in an echo chamber when there's a whole world of interesting content out there?
Gretchen Andrew: It's almost like we'd get stuck living in our teenage bedroom for the rest of our lives.
Brian Tomlinson: That's Gretchen Andrew. She used to work in Silicon Valley. With a passion for technology, she thought she had landed her dream job when she was hired by Google, but she soon realized it wasn't everything she hoped for.
Gretchen Andrew: I felt stuck in a system where these tools of both creativity and knowledge that I had experienced growing up were being used in ways that I felt were just very limited compared to what their human potential was. And so I started to think, well, if I am not happy in this fabulous tech job, I probably won't be happy in a job.
Brian Tomlinson: So she quit and embarked on an unexpected mission.
Gretchen Andrew: I decided to put my belief behind the technology instead of the tech companies and posed it as a challenge. I challenged myself and the internet to make me into something that I wasn't yet. So I said, internet make me into an artist.
Brian Tomlinson: So with no artistic training, Gretchen set out to see if she could, as she puts it, "Use her faith in technology to determine her fate as an artist," but she still had to start at zero in some ways.
Gretchen Andrew: This began with online classes, with MoMA online and Stanford online, YouTube videos, how to paint clouds, how to draw hands, Googling, how to write an artist's statement. And through that, really got to experience both the limitations and the transformative power of technology and also use it as a narrative form of becoming.
Brian Tomlinson: In a way, her creative exploration and this experiment were like a piece of performance art, the journey and the destination all wrapped up into one. Gretchen used technology to become an artist, then she used an algorithm to display her art. Her work is extremely creative and has had a lot of success.
Gretchen Andrew: All of my work is both handcrafted and digitally crafted. So for example, I have a series called Affirmation Ads where I made a series of paintings that have the positive message embedded into them. Written inside the oil painting it will say, I already have everything I need to be beautiful. And then people could go to my website or go to my exhibition and pick their favorite painting, which I would then set up to follow them around the internet in the same way an advertisement often stalks us around the internet and follows us around the internet. This instead let people pick a positive message, a painting that they chose, and it would pop up in their Instagram feed and on their Facebook feed and on websites when they were reading the news. And it was a way of treating our digital lives the same way we treat our living rooms as a place that we can choose how it makes us feel, and that we have a sense of power and mastery over.
Brian Tomlinson: Imagine instead of being bombarded with ads for weight loss drugs, dating apps, or male enhancement pills, you can choose instead to see art, art of your choosing.
Gretchen Andrew: We would never let 60% of our living room have advertisements in it, but that's what we allow for our Instagram feed. And most of the time when we're sitting on our sofas at home, we're on our phones anyway. So really, the Affirmation Ad series in particular takes that algorithm of tracking and targeting that e-commerce companies so often use to make us insecure, to feel like we need something to be more complete, and turns the intent of that algorithm on its head.
Brian Tomlinson: This isn't the first time Gretchen upended an algorithm. During the 2020 US presidential election, Gretchen developed what she calls vision boards. They're pink, glittery, and would totally fit in with Barbie's Dreamhouse decor.
Gretchen Andrew: And so when you were searching for certain key terms on Google looking for the election results, what you got instead of election results were my vision boards about what I wanted the next American president to value and be and believe. And so by showing how easily me as an individual artist without my own troll farm-
Brian Tomlinson: Just FYI troll farms are organized groups that intentionally spread deceptive information.
Gretchen Andrew: How easily I could change what came to the top of Google for this major political, democratic, institutional moment in history. If I can change that, I wanted people to both be impressed with me personally and also be extremely terrified that this happens every day in ways that we don't see.
Brian Tomlinson: So not only is Gretchen's art getting attention, she's also getting people to pay attention to how the system can be messed with and how easy it is to do.
Gretchen Andrew: There's a lot of power in things that we don't tend to give a lot of authority to, femininity, and glitter, and pink. And one of the reasons why the series was so successful is because when we see Barbie shoes and balloons and champagne and princess stuff in a place of power and authority, it looks wrong and it causes us to question both how it got there and why that looks wrong. I very much wanted to show people that part of what makes manipulating algorithms easy is not actually an understanding of computer science, but an understanding of human nature.
The way that I'm able to manipulate algorithms, especially in this vision board series, relies on a difference between how people think and how computers think. The way that Google works and most algorithms work is they only deal in relevance, which is to say the only way to measure a relationship between two things is how relevant it is to each other. So for example, there's no such thing as hope, and love, and fear, and desire. There's only can I build an association between these two things. So when I say to you that I, as an artist someday would really love for one of my paintings to sell for a contemporary art auction record, you know that I'm talking about hope, about a desire, but Google only knows that I'm relevant to the auction record. It only knows that my vision boards are relevant to the presidential election. And so by compressing all of human relationships and experience into relevance, we lose that nuance, but that nuance that is symbolism and metaphor and desire is what makes us human. And I find it so extremely poetic that these algorithms can't parse desire, they can only parse relevance.
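Gretchen's point about relevance can be made concrete with a toy sketch. One of the simplest ways software measures "relatedness" between two texts is plain word overlap (Jaccard similarity here). This is purely illustrative, not Google's actual ranking; the phrases are stand-ins for her example.

```python
# Toy sketch of "relevance" as pure association: two texts are related
# in proportion to the words they share. The score has no concept of
# hope, desire, or intent, only overlap.

def relevance(text_a, text_b):
    a, b = set(text_a.lower().split()), set(text_b.lower().split())
    return len(a & b) / len(a | b)  # Jaccard similarity

statement = "my vision boards for the presidential election"
query = "presidential election results"
unrelated = "how to paint clouds"

print(relevance(statement, query) > relevance(statement, unrelated))  # prints True
```

The scorer associates the vision boards with the election simply because the words co-occur; whether the statement expresses hope, satire, or fact is invisible to it, which is exactly the gap Gretchen exploited.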
Brian Tomlinson: Okay, so confession time. I love poetry, but I have to admit, I have never stopped to think about the poetry of computer algorithms before, but there's something really important about this distinction that Gretchen is pointing out. Algorithms are great at connecting things together, but understanding what those things mean, giving them context, that's where the people come in. That's what makes us human.
Gretchen Andrew: It really is important to acknowledge what I'm good at as a person, what you're good at as a person as being different than what algorithms are great at. If you have something real to say to a real person, you can't just have a conversation in the town square anymore. Your customers might not be there, your audience might not be there, your community might not be where you physically are. And the ability to expand beyond our immediate geography is one of the greatest benefits of these technologies.
Brian Tomlinson: Just a bit of context for this next bit. In a place called Marfa, Texas, there's an art installation of a fake Prada storefront. It looks chic and expensive, but it's in the middle of nowhere.
Gretchen Andrew: So I think of it like somebody on a street corner yelling or preaching some religious message of like end times or whatever as being on one side, and then the Prada store in Texas being the other side, where with one, they're getting their message heard, but it's not to the right person. And with the other, there's this beautiful, gorgeous thing, but it's humorously built in a place where no one will ever see it. And the truth is that to communicate today, we need to be having conversations with algorithms and with people simultaneously. And doing that in a way in which the communication to the person still gets through is, I think, a very interesting specialization that requires those very classic communication and writing skills while also acknowledging that unless you please that algorithm, it's not going to matter how clear your message is.
Brian Tomlinson: As Gretchen categorizes it, algorithms are good at helping us get noticed, while people help us be understood. So knowing how to work with the algorithm is essential to getting our message across. On top of being an artist, Gretchen also runs a digital marketing agency, building on everything she's learned while exploring algorithms. Here are some of her pro tips on how to get noticed on the internet.
Gretchen Andrew: When I think about trying to get noticed, especially on an algorithm like search within Google for example, I think of it like an iceberg. There is a 10% part of the communication that is extraordinarily important person to person. It has to be clear, it has to have emotion in it. Now, what ends up happening is we're thinking too deeply into the rest of that iceberg. And when we write the caption of a photograph or the headline, we're simultaneously having both of these conversations in a way that often ends up having neither.
So if we think of it going back to the iceberg, having something real to say to a real person, a lot of focus and care and clarity should go into that top 10%. And I'm talking in a literal top 10% of a page or an email.
Whereas as that iceberg goes deeper and deeper, this is a little bit of a simplified way of thinking about it, but it can get deeper and you can sort of descend into the way of addressing the computer and the algorithm. There are also specific areas for talking to the algorithm, to pleasing Google that people tend to neglect. Alt texts, and metadata, and schema data, for example, were designed so that there was a conversation from data to algorithm, data to algorithm, and human comprehension didn't need to be involved.
Brian Tomlinson: Alt text, metadata, and schema data refer to things like descriptions, keywords, and labels you assign to digital elements like the images on a website, all helping the internet make associations and tie your material to other relevant things. So you need to pair all of this with your message.
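The machine-facing layers Brian describes look something like this in practice: an image's alt attribute plus schema.org structured data embedded as JSON-LD. The specific titles and values below are invented for illustration; `VisualArtwork` is a real schema.org type.

```python
# Sketch of the "conversation with the algorithm": alt text on an image
# and a schema.org JSON-LD block, both generated here as HTML strings.
import json

alt_text = "Pink glitter vision board for the 2020 US presidential election"
img_tag = f'<img src="vision-board.jpg" alt="{alt_text}">'

schema = {
    "@context": "https://schema.org",
    "@type": "VisualArtwork",
    "name": "Vision Board (2020)",
    "creator": {"@type": "Person", "name": "Gretchen Andrew"},
    "artMedium": "Mixed media",
}
json_ld = f'<script type="application/ld+json">{json.dumps(schema)}</script>'

print(img_tag)
print(json_ld)
```

A person reading the page never sees any of this, but a crawler uses it to decide what the page is relevant to, which is exactly the data-to-algorithm conversation Gretchen describes.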
Gretchen Andrew: The more that you can acknowledge when you're doing one and do both, but also put a lot of care into that top 10% of the iceberg. And what I think that really should be doing is not outsourcing compassion to the algorithm, but freeing up that 10% to be very creative and to be very personal.
Brian Tomlinson: Okay. So the top 10% is vital. Put your heart and soul into it. But what about the other 90%? Let's dive into that data a bit more.
Gretchen Andrew: I like to use artificial intelligence in the very deep parts of it. Say for example, I'm launching a new art series, or I have an exhibition coming up. I might go to AI and be like, "Okay, AI, what are some frequently asked questions for an art exhibition?" I would tailor that to be true and relevant, but I would have the bulk of it written by AI and then I might say, "Okay, AI write directions to MoMA for me." And you could have a section that's directions to MoMA, and there's some chance that this information could be relevant to someone. The information does still need to be true, but it's not the amount of care that would be taken in that first 10%. Most people don't read an entire article or email anyway.
Brian Tomlinson: So it's all about knowing how algorithms work so that we can use our skills as people and capitalize on the benefits that algorithms bring. To bridge the gap between people and computers, let's bring back Cameron Hickey from earlier in the episode. How would he leverage the almighty algorithm?
Cameron Hickey: I think that a lot of what we've learned from monitoring and tracking social media is that things are constantly in motion. So what was relevant a month ago is old news today, and what is the way people are going to be speaking tomorrow is something we're just identifying right now this minute. And so communications that are effective are ones that participate in the conversation.
And so I think if everyone is doing a dance move on TikTok, then it will be smart to use that sort of meme as the entry point for the thing that you want to talk about because people will already be familiar with the foundation. Right. Now, that's a really cheap, cheesy one. Right. But if we're thinking about politics and we want to participate in the conversation that people are having, the way that we talk about the issue that we care about has to be connected to the discourse that's already in motion today on social media. So paying close attention to it and reframing what you want to talk about in that context, I think is critical for it to actually get any oxygen.
Brian Tomlinson: Okay, so let's go back to the doom and gloom. Unfortunately, as internet users, there's only so much we can do about the way that these systems are built. So things like biases and echo chambers won't change much until more legislation like the Digital Services Act is put forward. But what does Gretchen think? Her concerns are a little different than what we're used to hearing on the news.
Gretchen Andrew: One of my greatest fears around internet algorithms is not some kind of robot apocalypse, but actually that life will become boring. Algorithms have the effect of normalizing and controlling information. When information comes in a particular format, they get happy and they reward that particular format. And so my hope is that they become so boring that we get over it and we depend less on them. I'm always really surprised when I get talking to people about how afraid they are of algorithms. Honestly, it's just a fancy way of saying rules. And as people going to high school, or having jobs, or driving on the road, we all understand certain rules and the way to understand algorithms is the same, that there are sets of rules, things happen and there are consequences as a result.
It also is a good way to think about how a lot of times there are rules and then there are ways around the rules. There are loopholes, there are maybe a set of tax laws that have one purpose, but by studying them, a certain segment of the population can learn how to make them do unintended things. And this is both terrifying and creatively fun to explore. Don't be afraid of algorithms. They're just rules. And like all rules, there are ways to get around them.
Brian Tomlinson: Well, I love breaking the rules, so don't be scared, but know that algorithms exist, and knowing is half the battle. As communicators, let's let AI help us with our writing, our planning, and our creativity. Leverage it to put your message out there and leave a mark on the world, but always leave a little room for the human touch. Meta, Google, Microsoft, Amazon, these corporations carry a lot of power that controls what we see, and consequently, there's a lot out there that we don't hear about. To keep your online life exciting, have confidence in what makes your message unique, and personal, and authentic, and use that in tandem with strategies to boost its popularity.
We know algorithms aren't going anywhere. So it's in our best interest to know the rules of the game. Today, our guests were Cameron Hickey, CEO of the National Conference on Citizenship and artist slash algorithm extraordinaire, Gretchen Andrew. I'm Brian Tomlinson, and this is Infernal Communication, brought to you by Staffbase with production support from JAR Audio. Join us next time where we dive into the ever-evolving rules of grammar and how our quirks are something to celebrate, not harp on.
Anne Curzan: People will say, oh, if you don't know Y-O-U-R versus Y-O-U, apostrophe, R-E, then I'm not going on a date with you. Or it's with an apostrophe versus its without an apostrophe. Let's all be honest about this. All of us have screwed up it's and its.
Brian Tomlinson: If you enjoyed the show, tell your bartender, your hairdresser, even that weird cousin who is now an amateur beekeeper. You can also follow us on your podcast app and leave a review. We would love to hear your thoughts. Until next time.