How to Design for Your Algorithmic Self
Once a year, I check in with some of my friends on Facebook. Really? Once a year? You are probably thinking I'm some kind of social media holdout. Or maybe I'm trying to be mindful. Maybe I'm just doing it wrong.
It's just that certain people never appear in my feed.
My husband, for instance. We haven't gotten Facebook married, so maybe that's why. He usually tells me, or sometimes texts me from the other room, to like his posts. Which I do, of course. Well, I usually do. Like them, that is.
The casual friend from my town. I see her at the grocery store sometimes, but never on Facebook. It's kind of a relief actually. In a small town like mine I feel like I know everything about everyone else I see in the grocery store. "Her dog was lost and then found." "He's bummed about the lack of snow for skiing." And so on.
Or a friend from high school. I think it may be that he is rarely on Facebook. I can't really be sure. I confess that I forget about him until a post appears once in a blue moon. Or it's his birthday.
Once a year, I am reminded of all my friends when it's their birthday. Once a year, I make a pilgrimage to the land of a lost friend. I look at her timeline. I wish her happy birthday and sometimes try to spark a conversation. My friend doesn't know that she's been lost. This is as much for me as for her.
Then, I dream, a little wistfully, of seeing more variety in my feed. I know there are steps you can take. You can prioritize who you want to see first. But how to choose? I want them all, really. So I do nothing and the algorithm decides for me.
Gaming the Algorithm
As a design researcher, I interview people, observe what they do, and review diary entries to understand their experience. Lately, people describe doing something they call gaming the algorithm.
Gaming the algorithm means first inventing a story about how the algorithm works. Then, based on that story, behaving differently to create a new algorithmic self. The hope is that a new experience will emerge that aligns with our version of our self.
People decide to click on some posts more, some posts less. They like things strategically. They instinctively open up private windows. They follow and unfollow. They block ads. They create different profiles.
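To make that folk theory concrete, here is a toy sketch in Python (with invented friends, actions, and weights, not any platform's real ranking code) of a feed ordered by past engagement, where deliberately liking someone's posts nudges them back into view:

```python
# A toy sketch of the mental model behind "gaming the algorithm":
# a feed ranked by weighted past engagement. Names, actions, and
# weights are invented for illustration only.

from collections import defaultdict

ACTION_WEIGHTS = {"like": 2.0, "click": 1.0, "comment": 3.0}  # made-up weights

# Hypothetical engagement log: (friend, action) pairs the platform has observed.
engagement_log = [
    ("husband", "like"), ("husband", "click"),
    ("college_friend", "like"),
    ("high_school_friend", "click"),
]

def affinity_scores(log):
    """Sum weighted past actions per friend -- a crude 'affinity' signal."""
    scores = defaultdict(float)
    for friend, action in log:
        scores[friend] += ACTION_WEIGHTS.get(action, 0.0)
    return scores

def rank_feed(friends, log):
    """Order friends by affinity; those you never engage with sink to the bottom."""
    scores = affinity_scores(log)
    return sorted(friends, key=lambda f: scores[f], reverse=True)

friends = ["husband", "college_friend", "high_school_friend", "lost_friend"]
print(rank_feed(friends, engagement_log))
# ['husband', 'college_friend', 'high_school_friend', 'lost_friend']

# "Gaming" it: deliberately like the lost friend's posts and she climbs the ranking.
engagement_log += [("lost_friend", "like"), ("lost_friend", "like")]
print(rank_feed(friends, engagement_log))
# ['lost_friend', 'husband', 'college_friend', 'high_school_friend']
```

The folk theory, in other words, is that behavior is the input and the feed is the output, so new behavior should produce a new feed.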
Gaming the algorithm rarely involves settings. We have new options to participate in our own algorithms, and those options are easier to find. Still, we are reluctant to spend time curating. It may be irrational. We care, but we don't Facebook care.
Gaming the algorithm is a story about being highly aware of an existing algorithmic shadow self, and about subverting it to create a truer representation. Usually, the story ends with giving up.
So I guess I game the algorithm by paying attention to birthdays instead. The thing about birthdays is that, perhaps because most people don't opt out, it's a way to see almost everyone. It's not algorithmically determined.
Is there room for an experience that is not (or just less) algorithmically optimized?
Warning, Not Algorithmically Optimized
Twitter, and the impending demise of Twitter, have been linked to its chronological approach. What once made Twitter up-to-the-minute, real-time, drink-from-the-firehose now seems too time consuming.
Lists on Twitter, like Facebook settings, take time and effort to make. And, unlike Facebook settings, they take time and effort to find and review. Follow too many smart, talented, funny people and you are in danger of information overload.
And yet, selective presentation like Twitter's While You Were Away or Moments is still not quite right. Searching on a hashtag and seeing a chronological listing of all the posts from someone you follow is, well, kind of liberating. It exposes us to different people, and different points of view. We want that too sometimes.
Designers around the world are trying, optimistically and earnestly, to solve this problem of too many choices. One way is with anticipatory design. Our internetted things will know us so well, based on our past behaviors and on physiological cues about our emotions, that our devices will anticipate what we need.
Cool, when it comes to getting my coffeemaker to start up at my usual wake-up time. Well, it would be cool if I didn't make coffee with a French press. And if I didn't wake up at a different time every morning, because of kids and dogs and even weather (a fear of heavy rain at night). And if I never decided to skip coffee because I'm cutting down on coffee (unlikely). And if I didn't need to download and set up and obsessively interact with an app instead of just touching a button on my coffeemaker, which, by the way, I don't have.
Maybe not so cool when it comes to people and points of view. Research shows that Facebook can make us polarized and predictable by narrowing what we see, after all. When do we want to be freed from the burden of choice? When is it OK for us to be confronted with, and actually make, choices?
Liberation by Algorithm
The dream is liberation by algorithm. There's that vision of coming home from a stressful day at work in a crappy mood and having our house decide on lighting, and warmth, and what movie we should watch. That vision of the algorithmically determined future where humans wander from one optimized choice to another in a pristine white and glass modern home.
The problem, with due apologies to Marie Kondo, is that people like a little mess. People like a little mess, especially when people are the mess. Whether it is city blocks, consequential strangers, or real people in all their glory in reviews, people like other humans.
We don't want so many choices, but we bristle at one choice. Why? Because we change a little every day. We change our minds. We get interested in new things. We are complicated. We are a little conflicted perhaps. Algorithms have a hard time with that.
Humanizing the Algorithm
So even though algorithms know our behaviors perhaps better than we know ourselves, we need to humanize the algorithm. Even though the algorithms can read our micro-expressions and extrapolate emotions (as long as we get expressive with our screens and keep putting screens on things), we add one part human to the mix.
Netflix has film buffs as taggers. Amazon has a cadre of citizen reviewers. Apple Music has DJs. Artificially intelligent app assistants, like Pana or Google Translate or Rise, have their own human assistants.
A human presence in the background keeps the algorithm from being a little dumb about humans. It makes the experience feel considerate. But it still makes a lot of assumptions based on our algorithmic selves.
Revealing the Algorithm
Rather than inadvertently prompting people to create new mythologies about how technology works, we could design to reveal the algorithm. Not just adding human input to computer algorithms, but making the algorithm present.
What if you could play with the same algorithm that Facebook uses to analyze the emotional timbre of posts? You can. What if you could see yourself through the ads you encounter? You can do that too.
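Those tools aren't reproduced here, but as a rough stand-in for the first one, here is a minimal sketch that scores the emotional timbre of a few posts with the open-source VADER analyzer from NLTK (an assumption for illustration, not Facebook's actual model):

```python
# A minimal sketch of "playing with" post sentiment yourself, using the
# open-source VADER analyzer from NLTK as a stand-in for whatever a
# platform actually runs on your posts.

import nltk
from nltk.sentiment.vader import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-time lexicon download

posts = [
    "Her dog was lost and then found!",
    "Bummed about the lack of snow for skiing.",
    "Happy birthday! Hope it's a great one.",
]

sia = SentimentIntensityAnalyzer()
for post in posts:
    scores = sia.polarity_scores(post)  # pos/neu/neg plus a compound score
    print(f"{scores['compound']:+.2f}  {post}")
```

Playing with even a crude scorer like this is one way of making the algorithm present: you can see the guesses it makes about you, and how easily it could misread you.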
What if you knew your algorithmic self, and like that human who assists the artificial intelligence assistant, you could reflect on it and respond to it and adapt it to express who you are and who you want to be? Call it an algorithmic angel. Or heartificial intelligence. Maybe that's coming.
Ultimately, we want to see our self as the algorithm sees us, so that we have more say in how the algorithm defines us. Transparent terms and conditions would be good, but not good enough. Owning our data is a step forward. Design thinking will get us even further.
When I'm working with teams on designing with algorithms, we practice a new kind of empathy. We role-play the algorithm. We make personas for the algorithmic shadow of real people. We analyze the algorithmic view of a person: the behaviors that go in as input, and the interface changes and content that come out as output.
Then we listen to real people and their perspective on their algorithmic doppelgänger. We listen for the stories they tell to explain how the technology works. We listen, not as devices would, but as humans do.
Maybe it will lead to a focus on human well-being, rather than a singular focus on commercial optimization. Empathy 2.0.
When it comes to the algorithm we are all asking ourselves the same question. What is behind the curtain? It may well turn out to be you.
So, anyway, happy birthday.
**********
If you liked this post, I'd like it if you Medium liked it with a ❤
This year, I'm imagining what a new positive technology would look like. I hope you will follow along here and on Twitter to join the discussion.
Tags: artificial-intelligence, design, tech
Source: https://hackernoon.com/how-to-design-for-your-algorithmic-self-757c7df601e2