On the problem of our flawed human algorithms.

There's a feature in the Waking Up meditation app called Moment. If you turn Moment on, you will get one or more notifications each day in the time window you set. The notifications encourage you to "take a moment" to listen to a brief (usually around a minute or less) audio recording aimed at pulling you out of your habitual mental circus and into basic awareness. It's the only notification I have turned on on my phone, other than email and texts. And that's because it is, in a way, the anti-notification. It arrives to make you more present, less distracted, less of a mental circus animal. After you've listened to the Moment, you can share it or replay it for a few days. But eventually it evaporates. And when it does, a message in its place informs you, "This moment has passed."

Very often, my instinct is to share the Moment. Almost as often, I don't. But today I'm breaking protocol. The Moment below is one I've been contemplating and returning to for some days. It's not that it's better than all the other ones. At least not necessarily. It just happened to reach me in a moment of my own that gave it a particular relevance and resonance. But the reason for that is more societal than personal. To be clear, it is also personal. Like everyone, I'm an inherently biased and fallible processor of the world. Nothing that reaches me is pure. Even if it arrives that way, and I'm not sure that anything actually can, it still immediately gets filtered through everything I am now and ever was. So I do of course derive personal meaning from this. But it's the larger, capital-T Truth that thrills and intrigues me. And when I returned to it again today and found that it had yet to evaporate, I decided to transcribe it before it did. So here's that moment:

There's been a lot said of late about how various algorithms rule human attention at this point. Many media and social media platforms have put immense resources into artificial intelligence designed to exploit human attention by appealing to whatever emotions offer the most leverage: desire, fear, outrage. But your mind has an algorithm, too. Your patterns of thought, and the emotions and behaviors they produce, are largely the product of conditioning. And on some basic level, it's similar to what's happening online. You get more of what you click on. So if you don't like the way your life is feeling right now, chances are you need to pay attention to different things. And the good news is, you can actually do that.

It's become easy to place some blame on actual computer algorithms for certain societal problems; the seemingly intractable polarization we're up against is the first and biggest one that comes to my mind. And the fact that this has become obvious to many of us is a good thing. Because it means we're more aware of a problem and one of its causes. And this opens up opportunities to fix it.

It's much harder to put some of the blame on our own minds, to even notice the intrinsically flawed stories and constructions taking shape inside of us, mobilizing and manifesting in ways we don't always see. But it's what we must learn to do. Because if we can better see these processing errors happening in us, then we might also start to see them with more empathy and commonality when they happen in others. And if taking steps to solve our problems and bridge our divisions is a goal, this might better inform what we do and don't choose to "click on."