Transcript for #361 — Sam Bankman-Fried & Effective Altruism
SPEAKER_00
00:21 - 06:34
Welcome to the Making Sense podcast. This is Sam Harris. Okay, just a little housekeeping here. Over at Waking Up, we just introduced playlists, which has been our most requested feature. It took a while to do, but it seems like a very auspicious change. You can create your own retreats. You can create playlists for any purpose. Despite the name of the app, there's a lot of content there that is very good for sleep, so you could create a sleep playlist. Many of us fall asleep to audio these days. So thanks to the team over at Waking Up for producing that feature, among many others. The app is continually improving. And what else? If you haven't seen Coleman Hughes on The View promoting his book, that is worth finding on YouTube. Coleman was recently on the podcast discussing the book, The End of Race Politics, and he went on The View to promote it. As you might expect, he was bombarded with a level of moral and political confusion that is genuinely hard to deal with in a confined space when one is short on time. And I have to say, he really did a perfect job. I mean, it was absolutely masterful. So it's worth watching, in case there was any doubt in your mind about Coleman's talents. If ever there were a commercial for the equanimity that can be achieved through mindfulness, that was it. So bravo, Coleman. In the last housekeeping, I acknowledged the death of Danny Kahneman. I went back and listened to my podcast with him; I recorded that event at the Beacon Theater in New York about five years ago. I was pleasantly surprised. It's often the case that live events don't translate into the best podcasts, but I really thought this was a great conversation, and Danny was really worth listening to there. So that was episode 150, if you want to revisit it. I really enjoyed hearing it again. Okay. Today I'm talking to Will MacAskill. Will is an associate professor in philosophy and a research fellow at the Global Priorities Institute at Oxford University. He is one of the primary voices in a philanthropic movement known as effective altruism, and the co-founder of three nonprofits based on EA principles: Giving What We Can, 80,000 Hours, and the Centre for Effective Altruism. He is also the author of several books, including Doing Good Better: Effective Altruism and a Radical New Way to Make a Difference, and most recently What We Owe the Future. However, today we don't talk much philosophy. Rather, we do a post-mortem on the career of Sam Bankman-Fried and the implosion of FTX, and look at the effect that it's had on the effective altruism movement. When we recorded last week, Sam had not yet been sentenced, but he has since been sentenced to 25 years in prison, which is not as much as he could have gotten, but certainly more than the minimum. I must say that strikes me as too long a sentence. You'll hear Will and me struggle to form a theory of mind of Sam in this podcast; we discuss the possibilities at some length. But when you look at some of the people who don't get 25 years in prison for the malicious things they do, I don't know, it does not strike me as a fair sentence. Perhaps I'll talk about that more some other time. Will and I also talk about the effect that this fiasco has had on effective altruism, the character of the EA community, and potential problems with longtermism. We have a brief sidebar discussion on AI risk. We discuss the effects of the FTX collapse on Will personally, and other topics. There's no paywall for this one.
As I thought everyone should hear what Will has to say on the topic. As you'll hear, despite the size of the crater that Sam Bankman-Fried left on this landscape, I consider the principles of effective altruism untouched. And while I've always considered myself a peripheral member of the community, you'll hear me discuss the misgivings I have with it once again here; I've discussed them on previous podcasts as well. I just think the backlash against EA is thoroughly wrong-headed, and Will and I talk about that. As always, if you want to support the podcast, there's one way to do that: you can subscribe at SamHarris.org. And if you can't afford a subscription, you can request one for free. Occasionally I hear rumors that someone has requested a free subscription and didn't get one. That should never happen. So check your spam folder; something has gone wrong. Just request again if that happens to you. We don't decline any of those requests. And now I bring you Will MacAskill. I am back here with Will MacAskill. Will, thanks for joining me again. Thanks for having me on. So we have a lot to talk about. I've been wanting to do a post-mortem with you on the Sam Bankman-Fried FTX catastrophe. I don't think that's putting it too strongly, at least in EA circles. So we're going to talk about what happened there, your perception of it, and what it has done to the optics around effective altruism, and perhaps to effective altruism itself. Where should we start here? Perhaps you could summarize what Sam Bankman-Fried's position was in the EA community before the wheels came so fully off. Where did you meet him? He certainly seemed like a promising young man who was going to do great things. I mean, perhaps we should take it from the top.
SPEAKER_02
06:34 - 10:59
Sure, I'm happy to. And yeah, from my perspective he did seem like a promising young man, even though that's very much not how it turned out. So I first met Sam all the way back in 2012. I was giving talks for a new organization I'd set up, 80,000 Hours, which is about how you can do good with your career. And I was going around college campuses speaking about this. I can't remember who, but someone put me and Sam in touch. I think he had been quite active on a forum for people who are interested in utilitarian philosophy, and ideas like earning to give had been discussed on that forum. And as a result, we met up for lunch and he came to my talk. He was interested in a number of different career paths at the time, so politics was one, and earning to give was another. Perhaps we should remind people what earning to give means, because it is really the proper framing for everything Sam was up to. Sure. So earning to give is the idea that rather than, say, directly working for a charity, you could deliberately take a career that was higher paying, something you'd perhaps be particularly good at, in order to donate a significant fraction of your earnings, where, depending on how much you made, that might be 50% or more. And the core idea was that, well, you could, say, become a doctor in the developing world and do a huge amount of good by doing that, or you could earn more and donate enough to pay for many doctors working on the same cause, and thereby perhaps do even more good again. And this was one of the things that I was talking about at the time. And he found the ideas compelling. You know, we discussed it back and forth at the time. I next met him something like six months later at a vegan conference. We hadn't been much in touch in that period, but then he told me that he'd gotten an internship at Jane Street, which is this quantitative trading firm. And, you know, that was very impressive. I thought he seemed just this very autonomous, very morally motivated person. Animal welfare was his main focus at the time. He said later that he'd also asked some animal welfare organizations: would they rather have his time and have him work for them, or would they rather that he go make money in order to donate it to them? And they said, we'd rather have the money. And so he went and did that at Jane Street, but then subsequently left and set up a trading firm called Alameda Research, which was a cryptocurrency trading firm. And then a couple of years later, in 2019, an exchange, as in a platform where others could trade cryptocurrency, called FTX. Those seemed to be incredibly successful. So by the end of 2021, he was worth tens of billions of dollars; the company FTX was worth $40 billion. And he seemed to be living up to his claims. He was saying he was going to donate essentially everything, 99% of his wealth, something like that. And through the course of 2022, he had actually started making those donations too; he had donated well north of $100 million. But then, as it turned out, in November, the company was not all that it seemed. There was what you could call a run on the bank, except it wasn't a bank. So there was a loss of confidence in FTX. A lot of people started withdrawing their money, but the money that customers had deposited at an exchange, which should have been there, was not there. And that should not have been possible.
It must have been the case, therefore, that Sam and the others leading FTX had misappropriated that money in some way, all the while saying that the assets were perfectly safe, that they were not invested. That led to the complete collapse of FTX. Three other people who were high up at FTX or Alameda, namely Caroline Ellison, Gary Wang, and Nishad Singh, all pleaded guilty to fraud a couple of months after the collapse. Sam did not plead guilty, but there was a trial at the end of last year, and he was found guilty.
SPEAKER_00
10:59 - 11:10
And what is his current state? He's in jail as he awaits an appeal? What's the up-to-the-minute information on his progress through the criminal justice system?
SPEAKER_02
11:11 - 11:17
Yes, so he's in jail and he's awaiting sentencing, which will happen next week, I think.
SPEAKER_00
11:17 - 13:46
So I guess one thing we should talk about is some theory of mind about Sam, what his intentions actually were, insofar as we can guess about them. Because there are really two alternate pictures here, which give a very different ethical sense of him as a person and of the situation so many people were in, in giving him their trust. I mean, perhaps we can just jump there. Do you think this was a conscious fraud? Perhaps there are other variants of this, but I'll give you the two that come to mind for me. Either this was a conscious fraud, where he was quite cynically using the concepts of effective altruism, but his heart was never really in that place, and he was just trying to get fantastically wealthy and famous, misappropriating people's funds to that end, and it all blew up because of bad luck on some level. So he was a kind of Bernie Madoff-style character running something like a Ponzi scheme, or some unethical variant of misappropriating people's funds. Or, alternately, and I think quite differently, he was somebody who, based on what he believed about the actual ethics of the situation and, you know, probability theory, was taking risks that he shouldn't have taken, obviously, in the end, given the outcome, but that may well have paid off. And he was taking these risks because he wanted to do the maximum amount of good in the world with as many of the resources available as he could get his hands on. He was just placing, in the end, some silly bets, bets that he was allowed to place in a totally unregulated space. And it catastrophically failed, but it was by no means guaranteed to fail. And he was on some level a good guy who was ruled by some bad, or at least unrealistic, expectations of just how many times you can play a game of roulette and win. Perhaps there's some middle position between those two cases, but what's your sense of his actual intentions throughout this whole time?
SPEAKER_02
13:47 - 15:01
Yeah, so this is something that I've now spent many months over the last year and a half really trying to understand. I didn't know about the fraud or have suspicions about the fraud at the time. So my understanding of things here is really me trying to piece together the story on the basis of all that's come out as a result of the trial and media coverage over the last year and a half. One thing I'll say before we talk about this: it's very easy, once you start trying to inhabit someone's mental state, to start saying things where it sounds like you're defending the person or something. And so I just want to be clear on just how bad and how harmful what happened was. You know, a million people lost money. The scale of this is just unbelievable. And actually, recently, the prosecution released Twitter messages that Sam had received during the collapse, and they're really heartbreaking to read. One is from a Ukrainian man who had fled Ukraine and, in order to get his money out, put the money on FTX. Another person, who was going to be made homeless, had four children he needed to feed.
SPEAKER_00
15:02 - 15:14
And so on that point, Will, let's just linger for a second. Yeah. I had heard that a lot of the money was getting recovered. Do you know where that process is and how much has been recovered? Yeah.
SPEAKER_02
15:14 - 15:33
So as it turns out, all customers will receive all of the money they had on the exchange, as measured in terms of the value of what they put on the exchange in November 2022.
SPEAKER_00
15:33 - 15:40
So the value as of that date, as opposed to the amount they initially put in at whatever date.
SPEAKER_02
15:41 - 16:32
Yes, but also as opposed to the amount today. So often people were, say, putting Bitcoin on the exchange, and Bitcoin is now worth more than it was then. The standard narrative has been that this happened because crypto has risen and a particular investment that was made has done well. My best understanding is that that's not quite accurate. That has helped, but even putting it to the side, even as of September last year, before crypto prices had risen, customers would have been made whole. So the issue was not that money was taken and then just lost, in the sense of spent or lost on bad trades or something. Instead, the money was illegally taken and invested into assets that couldn't be liquidated quickly.
SPEAKER_00
16:33 - 16:54
So already we're, I think, a far distance from someone like Bernie Madoff, right, who, whatever the actual origins of his behavior, whether he was ever a legitimate investor, for the longest time was making only sham investments and just lying to everyone in sight and running a proper Ponzi scheme.
SPEAKER_02
16:54 - 20:20
Yeah, that's right. Bernie Madoff was committing the fraud for about eight years. And any time a client of his wanted to withdraw money, he would use other clients' money in order to give it back, to maintain the fiction that the funds that should have been there for the customers were there. Whereas with Alameda and FTX, this is one of the things that's so bizarre about the whole story, and even tragic: the companies themselves were making money, in fact large amounts of money. So in that sense, they were not Ponzi schemes. But the customer assets held on FTX that should have been there, should have been bank-vaulted, separate from everything, got used by Alameda Research, the trading firm, in a way that should not have been possible, should not have even been possible. Right. And so you were asking how to interpret the story, and you gave two interpretations. One was that effective altruism was a sham, his commitment to it was a sham, and he was just in it for his own power and greed. The second was that it was some carefully calculated bet that was illegal, had good intentions, perhaps twisted intentions, but didn't pay off. My personal take is that it was neither of those things. And obviously, I'll caveat: I've followed this a lot, because I really wanted to dissolve the confusion in my mind about what happened, but I'm not an expert in this, and it's extremely complicated. But I think there are a few pieces of evidence for thinking that this just wasn't a rational or calculated decision. No matter what utility function Sam and the others were following, it did not make sense as an action. One piece of evidence is actually just learning more about other white-collar crimes, Bernie Madoff being one example, but the Enron scandal and many others too. There's this Harvard business professor, Eugene Soltes, who's written this really excellent book called Why They Do It, about white-collar crime. And he argues, on the basis of interviews with many of the most famous white-collar criminals, quite strongly against the idea that these crimes are the result of some sort of careful cost-benefit analysis. In part because the cost-benefit analysis just does not make sense: often these are actually financially quite successful people who have not that much to gain, but everything to lose. But then secondly, looking at how the decisions actually get made, the word he often uses is mindless. You know, people aren't even paying attention. I think this was true of the former head of McKinsey, who gets off a call with the board of Goldman Sachs, I think, and immediately, 23 seconds later, calls a friend to tell him about what happened in the board meeting, in a way that was illegal insider trading. This was not a carefully calculated decision; it was irrationality, a failure of intuition rather than of reasoning. And that's my best guess at what happened here as well.
SPEAKER_00
20:20 - 23:02
Well, let me lay out a few points that could certainly pull some people in the direction of more calculation and less impetuosity than that. Because I think both within EA circles and more widely, and certainly, it seems, within Sam's brain, there were some ideas that will strike people as strange but are nonetheless difficult to refute rationally or logically. Which is to say, it all hinges on a topic that you and I, I believe, have discussed before on the podcast, which comes up in any discussion of what's called longtermism, which I think we'll get to. It comes down to how to rationally integrate any notion of probability, especially probabilities where one side of the decision tree represents some extraordinarily large possible gains. So I believe Sam at one point was accused of believing, or may have said, something along these lines; I forget whether it was in positive terms, or in negative terms of avoiding, you know, extinction. But it sounds like he was willing to just toss a coin endlessly, with the risk of ruin on one side and a sufficiently large expected value of a positive outcome on the other. It's just like, if you have a chance to win a million dollars on one side and lose $100,000 on the other, you should just keep tossing that coin, because your expected value is 50% of a million on one side and 50% of losing $100,000 on the other. So your expected value is, you know, $450,000 every time you toss that coin. But of course, if you only have $100,000 to lose, you toss the coin and you can lose everything on your first toss, right? And so he just seems to be someone who was looking at the expected-value proposition somewhat naively, and looking at it with everyone else's money on the line. Or at least certain things I've heard said of him, or by him, suggested that was the case. So perhaps bring in your beliefs about how one should think about probability, and the kind of ends-justify-the-means thinking that many people believe has corrupted EA more generally, with Sam as the ultimate instance of a cautionary tale there.
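The coin-toss arithmetic here can be made concrete with a minimal simulation, a sketch only; the $100,000 bankroll, the fifty-fifty odds, and the stopping rule are illustrative assumptions drawn from the example above, not anything actually modeled at FTX:

```python
import random

def play_until_ruin(bankroll=100_000, stake=100_000, prize=1_000_000, tosses=1_000):
    """Repeatedly take the positive-EV coin toss; return True if ruined."""
    for _ in range(tosses):
        if bankroll < stake:        # cannot cover the next possible loss: ruin
            return True
        bankroll += prize if random.random() < 0.5 else -stake
    return False

trials = 100_000
ruined = sum(play_until_ruin() for _ in range(trials))
# Roughly 50% of runs end in ruin (essentially the chance of losing the very
# first toss), even though each toss has an expected value of
# 0.5 * 1,000,000 - 0.5 * 100,000 = +$450,000.
print(f"ruined in {ruined / trials:.1%} of runs")
```

The point of the sketch is the risk-of-ruin asymmetry: a strictly positive expected value per bet is compatible with a near-coin-flip chance of losing everything when the bankroll can only absorb one loss.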
SPEAKER_02
23:02 - 27:54
Sure. So one thing that's just absolutely true: Sam seemed unusually risk-tolerant, and was unusually risk-tolerant. And at the outset, when the collapse happened, I absolutely was worried that perhaps what had gone on was some sort of carefully calculated fraud, a carefully calculated willingness to break the law in the service of what he thought was best. You know, I was worried that maybe there would come out a spreadsheet that, you know, did a little cost-benefit analysis of the fraud, clear for all to see. I think there are just good reasons for thinking that's not what happened. I'll discuss that first, and then let's come back to the important points about attitudes to risk and ends-justify-the-means reasoning. But just briefly, on why I think that's not what happened: one is that the overall plan makes so little sense if it was a long con, a con from the very start. If that was the case, then why would they be trying so hard to get regulated? Why would they be so incredibly public about what they were doing, in fact fairly actively courting press attention, or having Michael Lewis, one of the world's leading financial writers, following them around with access? And why would they be associating themselves with EA so much, if that wasn't what they actually cared about? And then the second thing is that it's just absolutely agreed by everyone that the companies were a shambles. There was not even the most basic accounting, not the most basic corporate controls. In June of 2022, there was a meeting between the higher-ups, where they were very worried because it looked like there was a $16 billion loan from FTX to Alameda, and they thought that was really bad. It turns out that was a bug in the code, and actually it was only an $8 billion loan, and they were apparently relieved at the time. So they didn't know what their assets were to within $10 billion. And in fact, it was at that time that they discovered that they had been double-counting $8 billion. So one way customers could put money on the FTX exchange was by sending money to Alameda, which Alameda then should have given to FTX. And it seems that in June of that year, they realized that had not been happening. Alameda had thought that money was legitimately within Alameda, I guess, and FTX had thought that money was there too. Now, I'm not going to claim I know that this wasn't conveniently overlooked or something, but at least Caroline, on the stand, testified that she didn't know whether Sam knew, prior to that point in time, that that money was in Alameda when it should have been in FTX. And that's the way almost all the money flowed from FTX to Alameda. There was also a little lending program, which was really focused on by the prosecution, but we can go into that in more detail as well; I think that's actually not where the action was in terms of how the money moved. And if you really get into the heads of the people at this time: okay, let's now suppose they're the most ruthless consequentialists ever, and they want to just make as much money as possible. Let's say it's just dollars, and they're not risk-averse at all. That wasn't even true, but let's assume it. Why on earth would they take that money and then invest $5.5 billion of it into illiquid venture investments, companies basically?
That obviously imposed enormous risks on them, and the gains were really quite small compared to the potential loss, not only of everything in the company, but also the huge harm it would do to the rest of the world and to the effective altruism movement itself. It really just makes no sense from a utilitarian perspective at all. And that's why, when I try to inhabit this mode of them making carefully calculated, consequentialist decisions, there are too many facts that seem in tension with that, or inconsistent with that, for that to be my best guess at what happened.
SPEAKER_00
27:55 - 28:11
So what do you think was actually going on? Do you ascribe this in the end to some version of incompetence, combined with a dopaminergic attachment to just winning at some kind of gambling task?
SPEAKER_02
28:11 - 30:45
Yeah, I mean, I see the deep vice that ultimately drove this all as hubris. They were not very experienced. They were very smart. They grew a company in what was, for a while, an impressive way. And Sam in particular, I just think, thought he was smarter than everyone else. And this was something that I didn't like about Sam and noticed at the time. He would not be convinced of something just because other people believed it; even if everyone else believed X and he believed Y, that wouldn't give him any pause for doubt. And so I think he got kind of corrupted by his own success. I think he felt like he had made these bets that had paid off, in spite of people being skeptical, time and again. And so he just thought he was smarter. And that means very basic things got neglected, like having good accounting, having the kind of adult professionals come in who could do risk management and actually point out what on earth was going on with where different stashes of money were. Because this is another thing that shows just how insane the whole venture was: at the time of collapse, they just didn't know where most of their assets were. You know, there would be hundreds of millions of dollars in a bank account somewhere, and they wouldn't even know it existed. The bank would have to call them to tell them, by the way, you've got these assets on hand. Again, if it was a carefully calculated ploy, you would want to know where all of your assets were, in case there was a mass withdrawal of customer deposits. And so I think, yeah, it was hubris, and also not a risk calculation but an attitude to risk. Many people, I think, when they're in the position of quite rapidly running a multi-billion-dollar company, would think, holy shit, I should really get some experienced professionals in here, and would be quite worried: have I attended to everything? Have I got this company under control? That was not at all how Sam and the others were thinking. And then the final thing to say is that this isn't saying that they didn't commit fraud from June onwards. After this hole had been discovered, I think it becomes pretty clear that they lied in various ways to try to get out of the position that they'd put themselves in. I think there are also other cases of things that seem clearly fraud, though they are not of the kind of $8 billion scale.
SPEAKER_00
30:47 - 31:06
And this was fraud, you think, to conceal the hole in the boat that was putting everything at risk? Or was this fraud even when things appeared to be going well and there was no risk of oblivion evident? What was the nature of the fraud, do you think?
SPEAKER_02
31:07 - 32:47
Yeah, I mean, again, flagging that there's probably a lot I'm saying that's wrong, because this is complex, I'm not confident, and there are lots of different stories. My guess is both. In the trial, one thing that came up, and I'm surprised it didn't get more attention, was that FTX advertised that it had an insurance fund. So FTX had a liquidation engine, basically a bit of technology that meant that even if a customer was borrowing funds on the exchange, in order to make basically a bet on the exchange with borrowed money... You know, on other exchanges, you could easily go negative by doing that, and that would mean other users would have to pay to cover that loss. FTX had this automated liquidation engine that was quite well respected. But they said, even if that fails, there's this insurance fund that will cover any losses. However, the number that was advertised on the website seems to have been created just by a random number generator. So that seems like really quite clear fraud. It hasn't been discussed very much, but it seems totally inexcusable, and it applied even when the going was good. But then the big fraud, the $8 billion, it seemed like that really started kicking in from June of 2022 onwards. Though I'll also say, and we can talk about my interactions with the people there, that looking back, it seems to me they did not know just how bad the situation they were in was.
SPEAKER_00
32:47 - 36:39
But yeah. Well, let's talk about your interactions, and let's focus on Sam to start. I mean, you bring up the vice of hubris. I only spoke to him once, I believe; it's possible I had a call with him before I did a podcast with him. He was on the podcast once, and this was very much in the moment when he was the darling of the EA community. I think he was described as the youngest self-made person to reach something like $30 billion. I think he was 29 and had $29 billion, or something, at that time. I spoke to him, and again, the purpose of all of this earning was to do the maximum amount of good he could do in the world. He was just earning to give as far as the eye could see. I didn't really encounter his now famous arrogance in my discussion with him. He just seemed smart and well-intentioned. And I knew nothing about the details of FTX apart from what he told me. And, you know, I think anyone in our position of talking to him about his business could be forgiven for not immediately seeing the fraudulence of it, or the potential fraudulence of it, given that he had people invest with him, quite sophisticated venture investors, early on, and they didn't detect the problem, right? And we were not in the position of investing with him. But in the aftermath, there are details of just how he behaved with people that struck me as arrogant to the point of insanity, really. I mean, in these investor calls, apparently, while describing his business and soliciting, I think, hundreds of millions of dollars at a minimum from firms like Sequoia, he is simultaneously playing video games. And this was celebrated as this delightful affectation. But, I mean, clearly, he is someone who thinks he need not give people 100% of his attention because he's got so much bandwidth; he can just play video games while having these important conversations. And there were some things in Michael Lewis's book that revealed, or at least seemed to reveal, that he was quite a strange person. Someone who claimed, on his own account, at least to Lewis, that he didn't know what people meant when they said they experienced the feeling of love, right? So he's neuro-atypical at a minimum. And perhaps I just offer this to you as a series of impressions: how peculiar a person is he, and shouldn't there have been more red flags earlier on, in terms of his ethical integrity, or just his capacity for ethical integrity? Because if someone tells me that they have no idea what anyone means when they say they love other people, that is an enormous red flag. I mean, it's something that I would feel compassion for; the person is obviously missing something. But as far as collaborating with this person or putting trust in them, it's an enormous red flag. And I don't know at what point he told Lewis that. What was your impression, or what is your impression, of Sam as a person? And in retrospect, were there signs of his ethical unreliability far earlier than when the emergency actually occurred?
SPEAKER_02
36:39 - 40:34
Sure, so there's a lot to say here. Briefly, on the not feeling love: my descriptions of Sam and feelings about Sam are quite variegated. His inability to feel love wasn't something that seemed striking or notable to me. After the Michael Lewis book, lots of things came out; it seemed like he just had emotional flatness across the board, and whether that's a result of depression or ADHD or autism is not really that clear to me. But that wasn't something that seemed obvious at the time, at least. I guess I interact with people who are relatively emotionally flat quite a lot. I certainly wouldn't have said he's a very emotional person. He did seem like a very thoughtful and incredibly morally motivated person, all the way back to 2012. I mean, his main concern was the plight of nonhuman animals on factory farms for most of that time. It's kind of an unusual thing to care about if you're some sort of psychopath or something like that. You know, when I first reached out to Sam after FTX had been so successful, I talked to him about: okay, you've started this company, it's a crypto company, isn't crypto pretty sketchy, how much have you thought about risks to the company, and so on. And there was a narrative that came from him, and then was echoed and emphasized by Nishad Singh, who in my experience was really a crucial part of the story, a really crucial part of my interface with that world. And the story I got told was that FTX was trying, quite self-consciously, to be much more ethical than the standard crypto exchange, or anything going on in the crypto world. And there were two reasons why they needed to do that, even putting aside the intrinsic desire to act ethically. One is that they were trying to get regulated. So they were very actively courting regulation in the US. They were these center-left people; they were not the libertarians that normally populate crypto. They thought that was how they could get the edge over their competitors, by being much more open, and open to regulation. And secondly, because they planned to give the proceeds away, they knew that they would face a higher bar of criticism. And that claim got made to me over and over again, and not just by Sam. Sam was very busy, so I spoke to him a number of times, half a dozen times or something one-on-one, more times in group settings. But I talked to Nishad, and Nishad really came across, and this is the thing that maybe breaks my heart the most about the whole story, as this incredibly thoughtful, morally motivated, careful, just kind person. And I would ask him: okay, so why are you in the Bahamas? And there would be an answer, which is that that's where they were able to get licensed. Or: why is your apartment so nice? And he would say, well, you can't really get mid-level property in the Bahamas; we just need somewhere that we can create a campus feel. And so, yeah, it is nicer than we'd like; hopefully we can move over time to something a bit less nice. And the same with other ethical issues in crypto, which we can go into. Over and over again, he was painting that picture.
And something that was just so hurtful and confusing is: was he lying to me that whole time? Was that just all false? Or was he just a gullible fool?
SPEAKER_00
40:34 - 40:50
I haven't followed the trial in sufficient detail to know what his role was revealed to be in all of this. Where is he now? Has he been prosecuted? And what do you think of his actual intentions at this point?
SPEAKER_02
40:51 - 40:56
Yeah, so Nishad pleaded guilty, I think to fraud among other things, and he testified.
SPEAKER_00
40:56 - 41:21
In these pleadings, do you think this was just a kind of classically perverse prisoner's dilemma situation, where you have people, given the shadow of prosecution and prison hanging over them, willing to testify to things that the government wants to hear, but which are not strictly true? What's your theory of mind for the people who pled guilty at this point?
SPEAKER_02
41:22 - 42:49
Yeah, I mean, again, this is something that comes up in Eugene Soltes's book. He talks about a very strange aspect of the US legal system, something that doesn't happen in the UK, where the government will reward people, literally with their lives, for going on the stand; the people who testify probably will get no jail time. They will reward people in that way for going on the stand and testifying. And that just does mean that, while they can't tell lies, or at least not verifiable lies, there are very strong incentives to present things in a certain way. And again, I don't want this to sound more defensive of Sam than I intend. But the Soltes book talks about some people who would rehearse with their lawyers for hours and hours and hours in order to, you know, deliver an appropriate confession and so on. And so the view that Michael Lewis took is that people on the stand will have said true things throughout, but the tone of it is maybe a little different than it really was, where there was a lot of the co-conspirators talking about how bad they felt, how they knew what they were doing was wrong at the time, how they were very torn up. That, you know, seems quite inconsistent with my experience of them, but maybe they were just incredible liars.
SPEAKER_00
42:50 - 43:50
One question about that. So "they knew what they were doing was wrong" could mean many things. It could mean that they knew they were taking risks with people's funds that were unconscionable, given the possibility of losing money that customers thought was safely on the exchange. But that's not the same thing as stealing money and misappropriating it in a way that is purely selfish. It's not like: we took money that was not ours, and we bought luxury condominiums in the Bahamas with it and hoped no one would notice, right? That's one style of fraud. So you tell me, is it possible that they thought they were going to wager this money on other real investments, however shady some of these crypto properties were, but they actually expected enormous returns as a result of that misappropriation, and that money would come back safely into FTX, and no one would lose anything in the end?
SPEAKER_02
43:51 - 45:40
Yeah, I mean, in terms of how things seem to me, I just think they didn't think the company was at risk, not at serious risk. And there are a few reasons why. I mean, one, and this is why I was so confused this whole time: you know, I visited the Bahamas a number of times in 2022, and I never saw any kind of change in attitude from them over that time. You would really think that if you were engaged in this major fraud, something would seep out, some sort of flags. Maybe I'm a fool, but I did not see that. And in fact, in September, on my last trip to the Bahamas, I heard from Michael Lewis that Sam had been courting funding for FTX from Saudi Arabia and other places in the Middle East. And I do not love the idea of taking money from Saudi Arabia; I have issues with that. And it also just struck me as kind of odd, and I was aware there was a crypto downturn. So I talked to Nishad, and I asked: look, is there anything up with the company? Are you in trouble? And he says no. And we talked about this for some time; it was not a passing comment. And that, by any account, is past the point when he allegedly had learned about the huge hole that the company faced. Similarly, Michael Lewis, around the same time, asked both Caroline and Nishad a kind of fun question: what could go wrong with the company? If this all goes to zero, what happened? And again, there was no indication of stress upon hearing that question. They had fun with it. They were like, oh, maybe crypto is just a lot of hot air and everyone gets turned off, or maybe Sam gets kidnapped; that was kind of one of the big worries. But nothing leaked out there.
SPEAKER_00
45:41 - 45:54
So given his guilty plea and his testimony, what's your belief about your conversation with Nishad at that point? Do you think he was unaware of the risk, or do you think he was lying to you?
SPEAKER_02
45:54 - 46:33
So I think another thing that came out during the trial, though I'm not sure if it was admissible as evidence, was that Nishad commented to the government, kind of immediately upon pleading guilty, that in that period he still thought that FTX would last for years. And, in terms of giving an indication of what Nishad's personality was like: when the collapse happened, he had to be watched, because he was on the verge of suicide. He was so distraught about what happened to the customers, and I think he was really quite close to taking his own life.
SPEAKER_00
46:33 - 47:05
So then what sort of fraud was he pleading guilty to, if he's the kind of person who's suicidal when the wheels come off, as though he had no idea that this was in the cards, right? I mean, do you think he was just responding to all of the opprobrium and disgrace aimed his way in the aftermath, or do you think he was actually, fundamentally surprised by the risk that was being run? And if the latter, in what sense did he claim to be culpable for a conscious fraud?
SPEAKER_02
47:05 - 48:36
Yeah, I mean, I don't know whether Nishad at this time was ignorant, as in really did not know the risks they were running, or whether he was just delusional. Again, this is a thing that Soltes talks about: the capacity for humans to create a narrative in which they're still the good guys. Perhaps he thought that, yes, this was bad, but they would get out of it. And so it was a little bit bad, but it would be fine. Maybe that was there too. One thing that he does is buy this three-and-a-half-million-dollar property for himself in October. Again, that's just not the action of someone who, as he said on the stand, was so distressed and was talking to Sam about all this. So all of these things seem possible to me. As for him pleading guilty: whichever of these is true, I think it would make sense to plead guilty if you're in that situation, where there are huge costs to going to jail. And plausibly he just thought: look, yes, I knew what I was doing was bad, but I thought it was only a little bit bad. Actually, I was wrong; it was very bad, extremely bad, and I'm willing to face up and take the hit I should get. There are various possible explanations there.
SPEAKER_00
48:36 - 49:30
Right. There was a lot made, I don't know if this appeared in the trial, but in the court of public opinion, there was a lot made of a text exchange that Sam had with somebody, I think it was a journalist or a quasi-journalist, in the immediate aftermath of the scandal, where he seemed to admit that all the effective altruism lip service was just that; it was just the thing you say to liberals to make them feel good. I forget the actual language, but it seemed like it was copping to the fact that that part was always a ruse. Honestly, when I read those texts, I didn't know how to interpret them, but it was not obvious to me that they were the smoking gun they appeared to be in so many minds that were ready to bury effective altruism as a scam. Do you know the thread I'm referring to, and do you have a view?
SPEAKER_02
49:30 - 50:18
Yeah, I know the threads. And in a way, from the perspective of the brand of effective altruism, maybe it would have been better if it had actually all been a big con. But no, I think he believed in these ideas. I think there he was referring to what you might call corporate ethics that are really just PR. So, you know, companies will often make these big charitable donations to their local community and so on, and everyone knows this is marketing. And I guess I don't really know the details, but presumably FTX was doing stuff like that in the same way other companies do. And my interpretation of those texts is that that's what he was referring to.
SPEAKER_00
50:19 - 51:54
That actually reminds me of the one concern I did have about Sam before the scandal broke. I don't know if this was contemporaneous with my conversation with him on the podcast, but I just remember thinking this when I heard how much money he had given away, and you referenced it earlier: it's something north of a hundred million dollars. I'm always quick to do the math on that, and I recognize what a paltry sum that actually is if you have $30 billion. It's an enormous amount of money out in the real world, where people are grateful for whatever you give them, but it's analogous to somebody who has $30 million giving $100,000 away. It's not a sacrifice. It's a rounding error on their actual wealth, and it's certainly not the sort of thing that I would expect of someone for whom the whole point of becoming fantastically wealthy is to give all of it away. And I forget if I asked him a question about the pace of his giving during that podcast, but I know that some people think, well, the best thing for me to do is to use these assets to make more assets in the meantime, and then I'll give it all away later on. Given the urgency of so many causes, and given the real opportunity to save lives and mitigate enormous suffering every day of the week, starting now, my spidey sense tingles when I hear a fantastically wealthy person deferring their giving to the far future. And so I'm wondering what you think of that.
SPEAKER_02
51:55 - 53:36
Sure, yeah. I think that wasn't really an issue, for a couple of reasons. One is just that his net worth was basically entirely in FTX, and there's no easy way of converting that. If you're in a startup and all your wealth is in the equity of that startup, there's not really any way of converting that wealth into money, the sort of thing that you could donate. You basically have to keep building the company until you can have an exit, so get acquired or sell the company, and then you can become more liquid. And the second thing is that, at least relative to other business people, he was very unusual in wanting to give more and give quickly. So, I mean, I advised on the setup of his foundation, and it got a lot of criticism for scaling up its giving too quickly. Going from zero to one or two hundred million in a year is a very big scale-up, and it's actually just quite hard to do. And so, if you were asking me: yes, it's a tiny fraction, and I agree with the point in general that when someone with tens of billions gives $100 million to something, that is just really not very much at all, especially once they've had that money for decades and could really have distributed it. But in that case, the way it seemed to me at the time, and I guess still does, was basically consistent with someone trying to scale up their giving as fast as they can. And in fact, in a way where, plausibly, he should have been paying more attention to the business and not getting distracted by other things.
SPEAKER_00
53:38 - 54:22
So what, if anything, does this say about effective altruism? I mean, there's an additional question here: what has been the effect, as you perceive it, on EA and the public perception of it, and on the fundraising toward good causes? Has it forced a rethinking of any principles of effective altruism, whether it's earning to give or a focus on longtermism, which we haven't talked about here yet, but which I've discussed before? How large a crater has this left, and what has been touched by it? And is there any good to come out of this? Just give me the picture, as you see it, of EA at the moment.
SPEAKER_02
54:23 - 57:11
Yeah, I mean, huge harm, huge harm to EA. At the time of the collapse, I put it at 20% or something that the EA movement would just die, that this was a killer blow. And obviously, in terms of the hit to the brand, so many people think ill of EA now, or are critical of EA now, in all sorts of different ways, driven by this. In a way that's not surprising; it was this horrific, horrific thing. I don't think it happened because of EA. I think it happened in spite of EA. I think EA leaders and communicators have been very consistent on the idea that the ends do not justify the means, really since the start. And really this goes back, essentially, to John Stuart Mill. And actually, even Sam knew this. So again, as part of just trying to figure out what happened, I did some Facebook archeology. There's an essay by Eliezer Yudkowsky called "Ends Don't Justify Means (Among Humans)," basically making the classic point that you are not a reliable calculator; even if you were a 100% consequentialist, which I don't think you should be, you should even so follow the heuristics that are tried and true, including heuristics not to violate side constraints. And this was shared on a kind of discussion group, well before FTX, and Sam's response was: why are you even sharing this? This is obvious; everyone already knows this. So, yeah, this was, in my view, in spite of EA, not because of it. But yes, the damage is huge. There's internal damage as well: morale was very, very low, trust was very low, the thought being, well, if Sam and the others did this, then who knows what other people are like. And there has been an enormous amount of self-reflection and self-scrutiny, whether that's because of the catastrophe itself, or just because, if there's any point in time for self-reflection, it's in the aftermath of something like that. And so there's a whole bunch of things that have changed over the last year and a half. Not in terms of the principles, because, you know, what is effective altruism? It's the idea of using evidence and reason to try to make the world better. That principle is still good. I still would love people to increase the amount by which they are benevolent toward others, and increase the amount by which they think extremely carefully and are really quite intense about trying to figure out how they can have more positive impact with their money or with their time. That's just as true as ever, and the actions of one person in no way undermine that. I mean, take any ideology, take any moral view that you can imagine: you will find advocates of that ideology who are utterly repugnant.
SPEAKER_00
57:11 - 57:15
Yeah, this is the "Hitler was a vegetarian" principle. Exactly.
SPEAKER_02
57:15 - 59:14
And Sam was too. So vegetarians are having a bad time. Yeah, exactly. Exactly. There have been a lot of changes to the institutions within effective altruism. It has essentially entirely new leadership now, at least on the organizational side: the Centre for Effective Altruism, Open Philanthropy, 80,000 Hours. And the boards of at least some of these organizations have been really quite refreshed. This is partly just because a lot of people had to work exceptionally hard as a result of the fallout and got really quite burned out. In my own case, you know, I've stepped back from the boards of all of these main organizations, and I won't do that again for quite a while. One thing was just that I wasn't able to talk about this stuff in the way I really wanted to for over a year. I spent months writing blog posts and rewriting them and having them knocked back, because there was an investigation being held by Effective Ventures, one of the charities, and the law firm conducting it didn't want me to speak while that was ongoing. But also because I think a healthier effective altruism movement is more decentralized than it was. There was an issue, when the collapse happened, that I was in the roles of being on the board of the charity, also of being, as much as anyone was, a spokesperson for EA, but also of having advised Sam on the creation of the foundation. And that meant I wasn't able to offer guidance and reassurance to the community at that time of crisis in the way that I really wanted to, and wish I'd been able to. And so I do think a healthier EA movement has greater decentralization in that way. And there are some other things happening in that direction too. So various organizations are separating; projects are separating out legally and becoming their own entities.
SPEAKER_00
59:15 - 01:03:17
In the aftermath, I was certainly unhappy to see so many people eager to dance on the grave of effective altruism. In the worst cases, these are people who are quite wealthy and cynical and simply looking for an excuse to judge the actual good intentions and real altruism of others as patently false: it was never there, everyone's just in it for themselves, and therefore I, rich Ayn Randian type, should feel a completely clear conscience in being merely selfish, right? It's all a scam. And I just think that's an odious worldview, and a false one. It's not that everyone is just in it for themselves. It's not all just virtue signaling. There are real goods in the world that can be accomplished. There are real harms that can be averted. And being merely selfish really is a character flaw; it is possible to be a much better person than that, and we should aspire to that. And I say this as someone who has, to some degree, always been somewhat critical, or at least leery, of EA as a movement and as a community. I think I'm one of the larger contributors to it, personally, in just how much money I give to EA-aligned charities, and how much I have spread the word about it and inspired others to take the pledge and also give money to GiveWell and similar organizations. But, you know, I've spoken to you about this, and I've said as much on this podcast and elsewhere: as a movement, it has always struck me as too online, and for some reason attractive to, in the most comedic case, neuro-atypical people who are committed to polyamory, right? I mean, there are Silicon Valley cult-like dynamics that I've detected, if not in the center of the movement, certainly at its fringe, that I think are evident to some degree in the life of Sam Bankman-Fried, too. And we haven't talked about just how they were living in the Bahamas, but there are certainly some colorful anecdotes there. It just seems to me that there's a culture that I haven't wanted to endorse without a caveat. And yet the principles that I've learned from my conversations with you and from reading books like your own and Toby Ord's book, The Precipice: the ideas about existential risk; about actually becoming rational about the real effects of efforts to do good, rather than the imagined or hoped-for effects; about divorcing a rational understanding of mitigating human suffering and risk of harm from the good feelings we get around specific stories and specific triggers of empathy; and about performing conceptual surgery on all of that so that one can actually do what one wants to do in a clear-headed way, guided by compassion and a rational understanding of the effects one can have on the world. We've talked about many of these issues before in previous conversations, and I think all of that still stands. None of that was wrong, and none of it is shown to be wrong by the example of Sam Bankman-Fried. And so I mourn any loss that those ideas have suffered in public perception because of this. So do with that what you will, but that's where I've netted out at this point.
SPEAKER_02
01:03:18 - 01:06:12
Yeah, I mean, I think that's part of the tragedy of the whole thing. You know, Giving What We Can has over 9,000 people pledging to give at least 10% of their income to highly cost-effective charities, and it's aiming for 10,000 people this year. Those people are generally living normal middle-class lives, or maybe they're wealthier. In what way do the actions of Sam and the others invalidate that? And the answer is: not at all. That is just as important as ever. And one of the things that's so sad is that maybe fewer people will be inclined to do it, not for any good rational reasons, but just because of the bad odor that surrounds the idea now. And that's just a real tragedy, because I still think donating a fraction of your income to causes that effectively help other people is a really good way to live. You talk about the online, cult-like, shot-through-with-Asperger's side of the movement. I do want to say that the EA movement is many things, and of course you can endorse the ideas without endorsing anything to do with the movement. But I definitely worry that there is a segment that is extremely online and perhaps unusually weird in its culture or something. And it's a bit of a shame, I think, if people get the impression that that's what everyone within the EA movement is like, on the basis of whoever is loudest on the internet. And, you know, people can be poly if they want; I have no moral objection to that at all, it's a fine way to live. People can have all sorts of weird beliefs too, and maybe some of them are correct. I think AI risk was extremely weird for many years, and now people are taking it really seriously. So I think that's important. But I think the vast majority of people within the effective altruism movement are pretty normal people: people who care a lot, who are willing to put their money or their time where their mouth is, and who, because they care, are really willing to think things through and go where the arguments and the evidence lead them. And, you know, I'm not someone who's naturally on the internet all the time. I find Twitter and internet forums quite off-putting. And when I meet people in person who are engaged in the project of effective altruism, it feels very, very different than it does if you're just hanging out on Twitter or on some of the forums online.
SPEAKER_00
01:06:14 - 01:07:30
So is there anything that has been rethought at the level of the ideas? I mean, the one other issue here, which I don't think played an enormous role in the coverage of the FTX collapse, but which has come under some scrutiny and become a kind of ideological cause for concern, is the emphasis on longtermism, which you brought out at book length in your last book. Was that part of the problem here, and is there any rethink? Because it certainly brings in this issue of probability calculus that turns our decisions into a series of trolley problems, wherein ends-justify-the-means thinking at least becomes tempting. Which is to say that if you thought a decision you made had implications for the survival of humanity, not just in the near term but out into an endless future, where trillions upon trillions of lives are at stake and hang in the balance, well, then there's a lot you might do if you really took those numbers seriously, right? Is there anything that you have been forced to revise your thinking on as a result of this?
SPEAKER_02
01:07:31 - 01:08:51
Yeah, so, I mean, I really think longtermism wasn't at play. I mean, again, as I've said, what happened at FTX was not the result of some rational calculation in pursuit of some end. I think it looks dumb and immoral from any perspective. I also just think, if your concern is with the hundreds of millions of people in extreme poverty, or the tens of billions of animals suffering in factory farms, the scale of those problems is more than enough for the same kinds of worries to arise. And in fact we have seen, on the fringes of the animal welfare movement, people taking violent actions in pursuit of what they regarded as the greater good. Longtermism, if anything, actually cuts against it, because the argument that you should be willing to take more risk if you're using your money philanthropically than if you're just spending the money on yourself applies much less strongly in the case of longtermism than it does for global health and development, for example. Because, you know, if I have $5 to spend, that can buy a bednet. If I have a billion and five dollars to spend, the final $5 still buys a bednet. Global health and development can just absorb huge amounts of money without the cost-effectiveness going down very much.
SPEAKER_00
01:08:52 - 01:09:29
Just to follow up with a further point along those lines: my concern with longtermism has been the way in which it can seem to devalue the opportunity to alleviate present harms and present suffering. Because you can tell yourself a story that the interests of the one billion people suffering now are infinitesimal compared to those of the trillions upon trillions who may yet exist if we play our cards right. So it's an argument for perhaps overlooking the immediate suffering of the present out of a concern for the unrealized suffering of the future.
SPEAKER_02
01:09:29 - 01:12:03
Yeah, and I think that's why, in What We Owe the Future, I was very careful to defend only what I call the weak form of longtermism: that positively impacting the long-term future is a moral priority of our time. Not claiming it's the only one, nor claiming it's of overwhelming importance either. I think we should be uncertain about this. In the book I suggest a goal, a way of operationalizing that: rich countries putting at least 1% of their resources toward issues that distinctively impact future generations, because at the moment it's close to 0%. And I do think the mode of operating in which you think, oh, a present catastrophe is nothing compared to the unparalleled good that may come in a trillion years' time, is a very bad way of thinking, even from a purely long-term perspective. It doesn't have a good track record, and it's really not how I would want people to think. There has been a different line of criticism that I got from within EA, from the publication of What We Owe the Future onward, that I think has had a lot of merit. That line of criticism was that I was misunderstanding how near-term the risks we were talking about actually were. So in particular the risk from AI: the risks we face from really advanced artificial intelligence, even artificial general intelligence, are coming in the next decade, or at most the next couple of decades. And secondly, the scale of the problems posed by technological developments like AI is so great that you don't need to think of future generations at all. Even if you just care about the 8 billion people alive today, the size of the risks we are imposing on them via these technologies is more than enough for this to be one of the top problems the world should be facing. And over the last few years since the publication of the book, I just think that perspective has been getting more and more vindicated. So I am now much more worried about really very advanced, very fast advances in AI, and worried about it as a very near-term thing, as in literally the next five or six years, or the next decade, much more worried about risk from that than I was even just a few years ago.
SPEAKER_00
01:12:04 - 01:12:39
This is a sidebar conversation, but it's interesting. Are your concerns mostly around the prospect of unaligned AGI, or are they the more piecemeal, nearer-term, and actually already-present concerns around the misuse of AI, at whatever capacity it exists, to essentially render societies ungovernable, and the more or less guaranteed malicious use at scale that becomes quite harmful? To what degree are you focused on one versus the other?
SPEAKER_02
01:12:39 - 01:16:09
I want to say I'm focused on both, but also on other things too. So misalignment risk: I think it's real, I think it's serious, and I think we should be working on it much more than we currently are. I am an optimist about it, though. As in, I think very probably it will either turn out not to be an issue, because it's just really quite easy to make advanced AI systems that do what we want them to; or we'll put in a big effort and be able to solve the problem; or we'll notice that the problem has not been solved and we'll actually hold back, put in regulations and other controls for long enough to give us time to solve it. But there should still be more work on it. However, I think that AI will pose an enormous array of challenges that haven't really been appreciated. And the reason I think this is that I find it increasingly plausible that AI will lead to much accelerated rates of technological progress. So here's a kind of thought experiment. Take all the technologies and intellectual developments that you might expect to happen over the coming five centuries, everything that might happen there, and now imagine all of that happens in the course of three years. Would we expect that to go well? So in that period of time, okay, we're developing new weapons of mass destruction. We now have an automated army, an automated police force, so that in principle all military power could be controlled by a single person. We have created beings that plausibly have moral status themselves: what economic rights, welfare rights, political rights should they have? Potentially, I mean, you talked about misinformation, but potentially now we have superhuman persuasive abilities, far, far better than even teams of the best, most charismatic lawyers or politicians, deployed in the most targeted ways possible. And I think more challenges too: over this period we'll probably also have new conceptual and intellectual insights, radically changing the game board for us. And all of that might be happening over the course of a very short period of time. Why might it happen in such a short period of time? Well, the classic argument goes back to I. J. Good in the 1960s, which is that once you've got to the point in time when AI can build better AI, you've got this tight feedback loop, because once you've built the better AI, that can help you build even better AI, and so on. And that argument has been subject to really quite intense scrutiny over the last few years, building it into leading growth models, really looking at the input-output curves in existing ML development, for how much of a gain you get for an increase in input. And it really looks like the argument checks out. And that means it's not long from the point in time that you've got your first AI that can significantly help you with AI research to having trillions upon trillions of AI scientists driving progress forward in all sorts of scientific domains. And that's just a really quite dizzying prospect. I think misinformation is one of the challenges we'll have to face, but it's really like facing all of technological progress at once, and doing it in an incredibly short period of time, such that I think the default outcome is not that we handle it well.
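The feedback-loop argument sketched above can be made concrete with a minimal toy model: research input feeds back into AI capability, so the growth rate itself grows over time. This is only an illustrative sketch of the qualitative dynamic; the parameter values (feedback strength, gain per unit of research input) are arbitrary assumptions, not estimates from the episode or from the growth-model literature it mentions.

```python
# Toy model of the I. J. Good feedback loop: once AI capability
# contributes to AI research, capability gains compound.
# All numbers below are arbitrary illustrative assumptions.

def simulate(years: int = 10, capability: float = 1.0,
             human_effort: float = 1.0, feedback: float = 0.5) -> None:
    """Each year, research input = human effort plus a fraction of
    current AI capability; capability grows in proportion to input."""
    for year in range(1, years + 1):
        research_input = human_effort + feedback * capability
        capability *= 1.0 + 0.1 * research_input  # 10% gain per unit of input
        print(f"year {year:2d}: capability = {capability:10.2f}")

if __name__ == "__main__":
    simulate()
```

Running it shows the curve bending upward far faster than a fixed-rate exponential; setting `feedback=0.0` recovers an ordinary constant-rate exponential, which is exactly the contrast the "tight feedback loop" intuition turns on.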
SPEAKER_00
01:16:09 - 01:16:40
Handling it well or not, I think we've just birthed another topic for a future podcast. There's a lot to talk about there. Okay, so finally, where has all of this controversy and confusion landed for you? Where does it leave you personally, and in terms of what you're now doing? And what is your view, optimistic, pessimistic, or otherwise, about EA going forward?
SPEAKER_02
01:16:40 - 01:17:32
Yeah, I mean, the collapse was extremely hard for me. There's just no doubt at all that it's been the hardest year of my life, for so many reasons: the horror of the harms that were caused, the incredible damage it did to organizations and people that I loved. And I found that very tough to deal with. I was in really quite a dark place, for the first time in my life actually, this chunk of time where I kind of lost the feeling of moral motivation. I didn't really know if I could keep going. So I did actually even think about stepping back, really just giving up on EA as a project in my life, because it just felt kind of tainted. And that was weird, not having that motivation.
SPEAKER_00
01:17:32 - 01:17:59
Well, what would have produced that effect? Is it just the public perception of EA becoming so negative? Is it, practically speaking, fewer funds going into EA organizations that need those funds? Was it funds getting clawed back, because Sam Bankman-Fried or FTX had given those funds and now they'd become legally challenged? What was the actual impact?
SPEAKER_02
01:18:00 - 01:18:44
Yeah, I mean, the reason for me thinking, you know, maybe I was just going to give up was nothing practical like that. It was psychological. I really felt like I'd been punched or stabbed or something. You know, I felt like I'd been building this thing for 15 years, and I'd worked unsustainably hard on it. I was tired. And it had just been blown up in one fell swoop, blown up by Sam's actions. And it was just hard to then think, okay, I'm going to get back up off the ground and go into the shallow pond and rescue another child.
SPEAKER_00
01:18:44 - 01:18:55
When you say it's been blown up, though, you're talking essentially about the brand damage to EA. Yes. With which you have been so closely associated, as really one of its progenitors.
SPEAKER_02
01:18:56 - 01:22:06
Exactly, and I was talking about what was going on in my mind, not about what in fact happened. So we can talk about what the hit was to EA. Yeah, huge brand damage; I think people were definitely keen to disassociate themselves. In my own case, I actually got surprisingly little in the way of being cancelled, but it definitely made it harder for people, especially in the aftermath, to go out and advocate. But I'm now feeling much more optimistic. The year afterwards was really tough; morale was low, and I think it was just harder for people associated with EA to do the work they wanted to do. But a few months ago there was a conference among many of the leaders of the core EA organizations, and I honestly found it really inspiring, because for so many people the principles are still there, and the problems in the world are still there. Sam's actions do not change in any way how serious the problems we're facing are, how important it is to take action, and how important it is to ensure that our actions have as big an impact as possible. So there really was a sense of people rolling up their sleeves and wanting to get back to it. And then in terms of the brand, I think once you get off Twitter, people again are quite understanding. People understand that the actions of one person don't reflect on an entire movement. And ultimately, not that many people have heard of either FTX or effective altruism, so there's still plenty of room to grow. And so I think we're starting to see things change again. And, you know, it's made me reflect on the good things that the effective altruism movement is accomplishing and has continued to accomplish. Just recently, GiveWell passed over $2 billion moved to its top recommended charities. It's just insane for me to think that it could be that amount, compared to where I was 15 years ago. That's hundreds of thousands of people who are alive who would otherwise be dead. I live in Oxford, which has a population of 120,000 people. If I imagine a nuclear bomb going off and killing everyone in Oxford, that would be world news for months. And if a group of altruistically motivated people had managed to come together and prevent that plot, saving the city of Oxford, that would be huge news; people would write about it for decades. And yet that's exactly what has happened, not in nearly as dramatic a way perhaps, but several times over. And so now I think we're getting back to basics, back to the core principles of effective altruism, back to the idea that all people have moral claims upon us, that we in rich countries have an enormous power to do good, and that if we devote some of our money or some of our time, or our careers, to those big problems, we can make a truly enormous difference. That message is as inspiring to me as ever.
SPEAKER_00
01:22:07 - 01:23:43
Yeah, I certainly could say the same for myself. As I've described, I've always viewed myself as on the periphery of the movement, but increasingly committed to its principles. And those principles have been learned directly from you more than from many other sources. As I said in a blurb for your most recent book, no living philosopher has had a greater impact on my ethics than Will MacAskill, and much of the good I now do, in particular by systematically giving money to effective charities, is the direct result of his influence. That remains true, and I remain immensely grateful for that influence. So you should not read into the noise resulting from what Sam Bankman-Fried did any lesson that detracts from all the good you have done, that many of us recognize you have done, and all the good you have inspired other people to do. It's extraordinarily rare to see abstruse philosophical reasoning result in so much tangible good and such an obvious increase in well-being in those whose lives have been touched by it. So keep your chin up and get back to your ultimately necessary work; it really remains inspiring. And while you may no longer be the youngest philosopher I talk to, you still are nearly so, so just keep going. That's my advice.
SPEAKER_02
01:23:43 - 01:24:40
Well, yeah, thank you so much, Sam. It really means a lot that you feel that way. Especially because your advocacy has just been unreal in terms of the impact it's had. I was saying that Giving What We Can has over 9,000 members now; over 1,000 of them cite you and this podcast as the reason they have taken the pledge. That's over $300 million of pledged donations. And I guess it just goes to show that the listeners of this podcast are people who take good ideas seriously. You might think people who listen are just interested in ideas for their own sake, because they find them intellectually engaging. But no, actually, people are willing to put those ideas into practice and do something like take the Giving What We Can pledge. And that's just amazing to see.
SPEAKER_00
01:24:40 - 01:25:05
That's great. Well, Will, we have another podcast teed up. Let me know when our robot overlords start making increasingly ominous noises, such that they're now unignorable, and we will have a podcast talking about all the pressing concerns that AI has birthed. Because, yeah, I share your fears here. It's been a great conversation.
SPEAKER_02
01:25:05 - 01:25:06
Sure, I'm looking forward to it.