Louis Arge

A Buddhist, Sam Bankman-Fried, and an Economist Walk Into a Bar

This is a fictional dialogue. No real conversation took place. The views attributed to real people are inspired by their public statements but are ultimately the author's construction.


The economist says to the other two, "I have a fun personality test for you."

Sam Bankman-Fried says, "Do tell."

The economist says, "Well, imagine I have a coin. We flip it for your life savings. If it's heads, I get your life savings. If it's tails, I give you 2.01 times your life savings back."

Without a moment's hesitation, SBF blurts out "I'll take it", as though acting on an opportunity that might vanish if he doesn't grab it fast enough.

A stranger interjects from across the bar. "You'd risk all your life savings on a coin flip? That's a 50/50 chance. You might just walk out of here without a home to go back to!"

Sam Bankman-Fried says to the stranger, "Okay, what if it was a six-sided die, and anything but a 1 gets you back twice your life savings? And if you land a 1, I only take half."

"I don't gamble," the stranger says, and retreats back to his own group. Sam Bankman-Fried does the same.

The economist says, "Well, Sam, you would be an η = 0 personality type."

Sam Bankman-Fried asks, "And that stranger over there?"

"He's probably around an η = 5, if I were to guess."


η (eta), formally the elasticity of marginal utility of consumption:

$$\eta = -\frac{c \, U''(c)}{U'(c)}$$

In plain English: if your consumption rises by 1%, how much (in percent) does the value of the next dollar fall?
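A quick numerical sanity check of that definition, assuming the standard CRRA (isoelastic) utility function U(c) = c^(1−η)/(1−η), which has constant η by construction (all function names below are illustrative, not from the dialogue):

```python
# Numerical check: CRRA utility U(c) = c**(1 - eta) / (1 - eta)
# has elasticity of marginal utility equal to eta at every c.

def marginal_utility(c, eta):
    # U'(c) = c**(-eta)
    return c ** (-eta)

def elasticity(c, eta, h=1e-6):
    # eta = -c * U''(c) / U'(c), with U''(c) via central finite difference
    u2 = (marginal_utility(c + h, eta) - marginal_utility(c - h, eta)) / (2 * h)
    return -c * u2 / marginal_utility(c, eta)

for eta in (0.5, 2.5, 5.0):
    assert abs(elasticity(100.0, eta) - eta) < 1e-4
```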


"What's η?" the Buddhist asks, ever so slightly breaking his calm, contemplative expression for the first time since they sat down.

"It's a number that measures how quickly the marginal utility of consumption falls — how much more you enjoy your first dollar compared to every dollar after that.

"At η = 0, there's no difference. Your millionth dollar is exactly as exciting as your first. That's Sam. You already watched him take the coin flip.

"At η = 30, it's the opposite extreme. For every 1% your consumption increases, you care 30% less about the next dollar. Utility flattens almost immediately past subsistence. But losing what you have is catastrophic — the pain of halving your income is roughly half a billion times worse than the pleasure of doubling it. That's not an exaggeration; that's what the math says. An η = 30 person puts all their money in bonds and would never consider the S&P 500. A chance to retire twice as wealthy is nothing compared to the terror of losing even 10% on the way there."
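The economist's asymmetry claim can be checked directly. A sketch under the same CRRA assumption as the aside above: at η = 30, the pain of halving consumption exceeds the pleasure of doubling it by a factor of 2^29, roughly half a billion.

```python
# Loss/gain asymmetry at eta = 30, assuming CRRA utility (illustrative).

def utility(c, eta):
    # CRRA: U(c) = c**(1 - eta) / (1 - eta), valid for eta != 1
    return c ** (1 - eta) / (1 - eta)

eta = 30.0
c = 1.0  # current consumption, normalized
gain = utility(2 * c, eta) - utility(c, eta)  # pleasure of doubling
loss = utility(c, eta) - utility(c / 2, eta)  # pain of halving
ratio = loss / gain
assert abs(ratio / 2 ** 29 - 1) < 1e-9  # ~5.4e8: roughly half a billion
```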

"Sounds like my parents," the Buddhist chuckles wistfully. "They have everything they need, but they grasp onto it so tightly that they can't enjoy life."

"And let me guess," the economist says. "They vote Labour?"

The Buddhist blinks. "How did you know that?"

"It's not just about how they feel about their own money. η has a second face. If the value of each dollar drops steeply as you get richer — which is what high η means — then it follows that a dollar in the hands of a poor person is worth enormously more than a dollar in the hands of a rich person. To a high-η person, the existence of a billionaire while someone else can't feed their children represents a staggering, almost incomprehensible misallocation of wellbeing. At η ≈ 2.5, a dollar to someone earning $10,000 a year is worth about a hundred thousand times more than a dollar to someone earning a million. At η = 5, it's about ten billion times more. High-η people tend to be socialists. It's not just that they're afraid of losing what they have — it's that they can't look at inequality without feeling like something has gone badly, mathematically wrong."
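The quoted ratios follow from the same CRRA assumption: the marginal value of a dollar is c^(−η), so the value ratio between two income levels is (income ratio)^η. A sketch of the two figures (function name illustrative):

```python
# How much more a marginal dollar is worth to a poor person than to a rich one,
# assuming CRRA utility, where U'(c) = c**(-eta).

def dollar_value_ratio(low_income, high_income, eta):
    return (high_income / low_income) ** eta

# $10,000/year vs $1,000,000/year is a 100x income gap.
assert abs(dollar_value_ratio(1e4, 1e6, 2.5) / 1e5 - 1) < 1e-9   # ~a hundred thousand
assert abs(dollar_value_ratio(1e4, 1e6, 5.0) / 1e10 - 1) < 1e-9  # ~ten billion
```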

SBF has been quiet through the economist's explanation, but now he leans forward. "See, that's the thing. I am an η = 0. But not because I don't care about those kids. It's because I care about them."

The economist raises an eyebrow.

"I'm an effective altruist. I believe in earning as much money as I can so I can give it away. If I lose a billion dollars, it won't have a big impact on my life, but if I make another 1.01 billion on that coin flip, that'd matter a lot for the kids currently dying of polio."

The economist tilts his head. "That's fascinating, Sam, because listen to what you just said. The reason you want to give the money away is that you believe a dollar matters more to a dying child than to a billionaire. That's a high-η argument — maybe the highest-η argument there is. You just heard me explain why high η makes someone a socialist. Well, it's the same math that makes you an effective altruist. The whole moral engine of EA runs on the idea that the curve is steep — that money in the hands of the global poor buys incomparably more wellbeing than money in the hands of the rich." He pauses. "But then you turn around and manage that money like η = 0. Risk-neutral. Double-or-nothing. Flip the coin. It's a strange contradiction — the heart of a socialist powering the gambling strategy of a Wall Street trader."


The Buddhist, who has been sitting quietly through all of this, finally sets down his tea.

"There is no contradiction," he says.

The economist and SBF both turn to look at him.

"I think it's worthwhile to realize that there aren't a bunch of different buckets of utility. It's not like 'my utility' is coherently separable from 'your utility'. It's all just utility that adds up in God's summation in the end."

"What makes you say that?"

"Well, how could it be otherwise? What is it about the string of conscious moments that makes up your life that puts them all in one bucket, while the string of experiences that makes up my life goes in a separate one?"

The economist thinks for a while. Then he says, "Causal access."

The Buddhist waits.

"Right now, I have a slight headache. I know this because I can feel it. You don't know this — or at least, you didn't until I told you. And even now that I've told you, you know about my headache, but you don't feel it. That asymmetry isn't a philosophical assumption. It's the most basic fact of experience. It's more certain than anything else I know."

He taps the bar.

"My experiences are in the same 'bucket' because they're causally bound together by this." He gestures at his own head. "My memory of five minutes ago feeds into my experience right now. My pain modulates my attention. My attention shapes my next thought. It's a dense, closed causal loop. Your experiences run on a separate loop. Information between us is sparse — we need language to bridge it, and even then, it's lossy."

The Buddhist nods slowly, as if he expected this answer.

"That's a good answer. Causal density. I want to take it seriously. But I think if you follow it carefully, it actually leads to my position, not yours."

The economist crosses his arms.

"You said your experiences belong together because they're bound by a dense causal loop. Memory feeds into attention, attention feeds into the next thought, and so on. And you said my experiences run on a separate loop, and the bridge between us is sparse and lossy. Yes?"

"Yes."

"So the boundary between 'my' experiences and 'your' experiences is defined by the density of causal connection. Not by some bright metaphysical line — just by how much information flows."

"Right. And the density difference is enormous. It's not a close call."

"Today, sure. But let's follow your criterion. You at age four and you right now — how dense is that causal connection? You share almost no atoms. You share very few memories. Your values are different, your personality is different. If I showed you a transcript of your thoughts at age four, you would barely recognize them. The information bandwidth between you-at-four and you-right-now is... honestly, it might be lower than the bandwidth between you and me in this conversation. We're exchanging complex ideas in real time. Four-year-old you couldn't even understand the words we're using. And it gets worse. You go under general anesthesia. The causal loop breaks. There is no dense information flow between pre-anesthesia you and post-anesthesia you. The person who wakes up is rebooted, and is modelling anew what it's like to be you, in much the same way that I'm modelling what it's like to be you right now. The thread is cut. By your own criterion, those should be different buckets. But you wake up and say 'I'm still me' without hesitation. Why? Not because of causal continuity — that was interrupted. You say it out of habit. Out of a story you tell. The body is the same, the name is the same, so you assume continuity. But the thing you said mattered — the dense causal loop — wasn't there."

The economist contemplates for a moment. "That doesn't mean there's no boundary. It just means it's fuzzy."

"Agreed. I'm not trying to convince you that the connection between current you and you one minute into the future isn't vastly stronger than most other connections you'd find between two minds in spacetime. What I am trying to convince you of is that the causal loop between current you and the people immediately around you is much stronger than the causal loop between you and yourself a decade from now. And so by that metric, I'm more you than you from 2036 is you."

The economist nods.

"You could draw an analogy to an ocean: a given spot of water is most connected to itself immediately in the future. But it's also connected to the water around it, and it's less strongly connected to the water in the same spot far into the future.

"When we see these kinds of phenomena in nature, we treat them as wholes. We talk about 'oceans' as the main event, not streams of turbulent water, even though they do have unique local self-connectivity.

"Anyway, you, I, and Sam are all part of the same ocean, even though we may be far away right now. Every drop of suffering, every toxin in that water, contributes to the same overall pollution. You don't get to pick and choose which drops of suffering are part of your ocean, because you will eventually come into contact with them all.

"You described Sam as a strange chimera. A contradiction. But I think he's the most coherent person at this bar. He's just selfish."

SBF laughs uncomfortably.

"No, I'm serious. You're not an altruist, Sam. You never were. You're obsessed with everyone else's suffering because it's happening in your ocean. And you want to fix it for the same reason anyone takes an aspirin — not because of some abstract moral calculation about the diminishing marginal utility of consumption, but because you're in pain and you want it to stop."

The economist, now grinning a bewildered smile, continues the Buddhist's chain of thought. "And the reason you'll flip the coin to do it — the reason you're risk-neutral — is that the pain is infinite. You're not going to run out of suffering to address. Every dollar you deploy finds an equally desperate margin. The curve doesn't flatten because the ocean doesn't have a floor.

"I thought we needed two different η values to explain you, one for your ethical worldview and one for your strategy. But we only need one. η = 0. A single consciousness, attending to infinite pain, with no point of diminishing returns."

The table goes silent.

After a few moments, the Buddhist notes, "Of course, every Bodhisattva understands that no matter how good the cause is, we should abstain from murder, sexual misconduct, intoxication, lying, and theft in our pursuit of it. Being a utilitarian without principles always backfires."

SBF goes quiet.