A Guide For (the) Perplexed Philosophers: Jesse Singal and The Rationalist Subculture

In the rapidly diminishing wake of the latest, relatively anodyne, Jesse Singal controversy, and as one of probably relatively few people in the online academic philosophy community obsessively online enough to have taken a deep and extended look at online “rationalism” (not Spinoza fans, but we’ll get to that), writing the following piece doesn’t seem the worst way to spend some time. I’m not handing out all my receipts on a half-decade’s worth of engagement with the rationalist phenomenon, so rationalists and sympathisers should get used to the idea that much of this is made up of well-informed impressions: this isn’t my special subject, I’m just oddly enmired in it.

Singal’s new article for The Atlantic is a combination feature/opinion piece (who can tell the difference these days?) which cites three authors to make the case for a scientific study of conversational disagreement. Of interest here is that two of those citations are to blogs. Both of those blogs are written by members of the “rationalist community”, and this is worth keeping in mind, because that community forms a subculture, with its own particular ideas and biases.

Especially for “low-decouplers” like myself, who can’t help but draw connections between things where such connections exist, it should be of interest because the way we talk about such authors reflects on our own self-recognition. Are we the sort of people who take things at surface level, or take people just at their word? Or are we the sort of people who reflect closely on the reasons other people have for their arguments and claims, and compare them with how we want to think about our own arguments and claims?

The point here is just to outline some basics of the subculture and how it operates. What follows are some observations that should give the Daily Nous pause before encouraging philosophers to be less circumspect about rationalist auto-didacts.

I want to make it clear before I go on that I don’t like these people: I think their approach is wrongheaded and, more privately, I think that a lot of harm has been done by this subculture (indeed I’ll claim to have seen it). But I’m leaving out all but my most anodyne negativity here; I have evidence and arguments about the rest for another time.

NB: in what follows I may have over-emphasised the extent to which this is a specifically “online” subculture, and as a commenter reminded me, this isn’t quite true. A lot of this emerges from offline interactions as well, often by people gravitating towards the same online sources and getting to know each other. Many of these people have dated each other and lived in the same houses (indeed there’s a whole phenomenon of “group houses”, where these like-minded individuals move in with each other to share ideas and share the rent). But I don’t want to over-emphasise that aspect either: I’m not party to any of the offline goings-on and don’t want to comment on them unless I’ve seen them reported online. Moreover, to the extent that my perceptions here are negative, they don’t necessarily apply to the people I don’t know about who are part of those offline interactions, and I don’t want to speak for them.

So:

Online “rationalism”, a guide for perplexed philosophers.

This “rationalism” is not the Rationalism of Early Modern Philosophy; it is a primarily online intellectual movement of sorts, bringing together like-minded individuals whose stated intent is to erase irrational thinking from their own mindset. This could be thinking about how to order economies, or about the most accurate interpretation of Quantum Mechanics, or just about how to live one’s life. On that last front, we can see that lots of “rationalists” are very interested in Effective Altruism. But that stated intent makes for a vacuous definition. In reality it’s a subculture, with all of the sociological clutter that comes with being not just a seminar room but a full-fledged subculture. In particular, online rationalism revolves around a set of key texts, a (broadly) shared set of ethical principles, and a set of friendships.

Rationalists tend to lean libertarian or at least pro-capitalist, tend to read the same websites, and tend to agree about what counts as a good or bad argument. Rationalist discourse is deliberately heavy on parables and literary excursions, and tends to savour contrarianism (these are all self-admitted traits). In spite of what the name might suggest, such discourse is, in my experience, often pretty low on data-driven analysis (Massimo Pigliucci agrees with me in at least one important case: search for his bit on Eliezer Yudkowsky – a key figure – and Bayes).

This is all of a piece, in ways that’ll be laid out later, but the basic idea is that rationalism provides a model of communication and thinking which is opposed to the rule of the mob and to mob-thinking. The parables, for example, are supposed to be illustrative of the fundamental data-driven principles (they do often have some data, it must be admitted). Meanwhile, civil agreement and civil disagreement are both considered good, and since a lot of the people involved share very similar ideas, they must – the reasoning goes – be onto something.

Origins.

The modern origins of the movement lie in two blogs, plus one later blog: OvercomingBias (now run by Robin Hanson, but originally run jointly by Hanson and Eliezer Yudkowsky) and LessWrong (which Yudkowsky left OvercomingBias to set up). SlateStarCodex is the third, set up under the influence of LessWrong by the LessWrong user “Yvain”, now “Scott Alexander” – also an alias. SlateStarCodex is arguably the centre around which most rationalist discourse orbits today, probably because Alexander’s mode of thinking is less eccentric and his writing more demotic, though not much less wordy, than Yudkowsky’s.

In order: Robin Hanson is an economist; Eliezer Yudkowsky is supposedly a machine-learning theorist; and Scott Alexander is a psychiatrist, and probably the most wide-ranging of the three in his interests. Hanson isn’t what I would call especially central to the subculture in the way the latter two are. A further figure, more active offline, whom many will have heard of and who is associated with the subculture, is Julia Galef. But again, Galef isn’t a central player in the specific subculture I’m discussing here.

Rationalism, especially in its blogging, reddit, and tumblr incarnations – where it has found its biggest foothold – should be seen as spiralling out in orbit around these particular websites and their central texts. For LessWrong these are “The Sequences”, an enormous and almost incomprehensibly eccentric corpus of thoughts on philosophy of science (especially physics), ethics, philosophy of personal identity, probability theory, psychology, decision theory, and artificial intelligence. For SlateStarCodex the central texts are less well-defined, but one will quickly find a few pieces cited far more often than others, held up as especially good.

To note something: I don’t want to use the phrase “insight porn” but I just did.

A brief precis of each of these texts or sets of texts:

Yudkowsky’s “The Sequences”:

At one time it was not unusual to find oneself ordered, on attempting to examine some particular of rationalist thought which one found perplexing, to “just read The Sequences”. This is a daunting task, as they run to thousands of pages, subdivided and subdivided again into a corpus of Yudkowsky’s thoughts on any of the aforementioned subjects. But there are three main themes:

  1. Reductionism: Yudkowsky repeatedly insists that the only viable position on any one subject must be reductionist. Quantum Mechanics reduces to the Many Worlds Interpretation; personal identity reduces to physicalism; scientific understanding reduces to Bayesianism (some have claimed that this last one is only 99% of the truth, but you get the idea; a one-line gloss of Bayes’ theorem follows this list, for the unfamiliar).
  2. AI Risk: Yudkowsky claims in rather messianic style that he was put here on Earth in order to ensure that when an Artificial Intelligence comes about (which he is sure it will) it doesn’t do something nasty more or less by accident, which he says is a major risk.
  3. Finally, to be a bit cheeky, Yudkowsky is always right: He is powerfully eccentric, and extremely strident in his exposition, brooking no disagreement on more or less any matter. People who disagree with Yudkowsky simply don’t get it. People who agree with Yudkowsky automatically do. It’s not exactly a new story.
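For readers to whom “Bayesianism” is just a word: the claim glossed in point 1 is, roughly, that ideal belief revision consists in probabilistic updating via Bayes’ theorem. As a minimal statement of that theorem (mine, not Yudkowsky’s own exposition):

$$P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E)}$$

where $H$ is a hypothesis and $E$ is some evidence: your confidence in $H$ after seeing $E$ should be your prior confidence in $H$, weighted by how well $H$ predicts $E$ relative to everything else that might have produced it.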

Scott Alexander’s “SlateStarCodex”:

This is a more traditional blog, in which Alexander mixes his personal thoughts of the day and long disquisitions on politics with long statistical analyses of results in psychiatry. In contrast to Yudkowsky, Alexander is much more closely engaged with politics. For Yudkowsky and LessWrong, “politics is the mind-killer”, and overt discussion of political matters is discouraged on the grounds that it undermines rationality (which is what “mind-killer” is supposed to mean). Alexander appears to have kept this in mind, but he is much less strict about it, and explicitly sets up his stance as opposing – with argument – the purported irrational effects of politics on the mind. It should be noted too that Alexander borrows a lot from LessWrong and endorses concerns about machine intelligence similar to Yudkowsky’s, though that isn’t his focus.

Probably Alexander’s most popular achievement within the subculture is the idea that it is possible to exist in a “grey tribe” between the putative “red” and “blue” tribes. In this conceptualisation, there are two tribes, red and blue, which dominate political discourse. The grey tribe is the tribe most capable of rational deliberation, because it doesn’t have the tribal commitments of the other two. Alexander belongs to this tribe, but it’s a broad church: indeed the whole point is that its only really defining feature is its willingness to entertain ideas that are anathema to the dominant tribes. I’m suspicious of this set-up, personally, and I think you should be too, especially when it comes to empirically examining how argumentation really works in the real world, and whether there really are three such tribes, even coarsely defined.

Moreover, I’m not generally impressed by contrarianism or “meta-contrarianism” as parsimonious concepts (look up the latter for yourself; it’s all there on Google!).

His political posts (the ones we’re interested in here) fall into three main categories:

  1. Searching disquisitions on the nature of politics and the difficulty of solving political problems (Meditations on Moloch).
  2. Extended critiques of political movements which he thinks are wrong (Reactionary Philosophy…; Libertarian FAQ).
  3. Quite odd rants or quasi-rants about feminism and left-wing politics (Untitled; You Are Still Crying Wolf).

Alexander deserves a final special mention after that list before I go on:

The basic principles Alexander’s political writings adhere to, or start from, and which are immanent throughout the subculture, are: that politics should be about overcoming tribalism; that rationality of his particular brand is how to do so; and that an extreme, sometimes contradictory, and rather overweening insistence on “civility” is a fundamental ideal.

Alexander believes that there is an ongoing “Culture War” which is destroying our ability to reason our way to a better society.

On this two things should be said.

First, Alexander has “confessed” (his own wording, I believe) on more than one occasion that this insistence on civility is not just a matter of the usual objective concerns about affording each other deserved respect, but is a deeply personal matter for him, as he suffers extreme anxiety in the face of any kind of conflict.

Second, make of it what you will, but both the comments section of his website and the “Culture War Thread” on his subreddit permitted explicitly fascist and white nationalist discourse so long as users were civil with each other in discussing it (indeed, both kinds of discourse became quite popular in both venues).

NB: Alexander has since moved the “Culture War Thread” away from the subreddit, but has explicitly advocated for its new home, with which he is not directly associated.

Why does this matter? We were talking about Singal…

Well, to begin with it’s worth pointing out that Singal’s main citation, the blogger John Nerst, cites Scott Alexander more than favourably, thus:

“Scott Alexander explores the issue often and brilliantly. There is too much good erisology in his writings to mention it all, and he has had a great influence on me, but the very best is probably his great trilogy of political tribalism: I Can Tolerate Anything But the Outgroup, Five Case Studies of Politicization and Ethnic Tension and Meaningless Arguments, and what could be seen as its two-part prelude: Against Bravery Debates, and All Debates are Bravery Debates. If I were to add a postlude, it would be The Toxoplasma of Rage although parts of it may be overstated.”

And while this appears amongst other citations, those citations also fall squarely within the bounds of what one would expect to see discussed in the rationalist subculture:

“The LessWrong sequence A Human’s Guide to Words is what you get when you distill a half-century of post-Wittgenstein philosophy mashed up with cognitive science, computing and mathematics. It makes great erisology writing. In general, the literature on cognitive biases has strong relevance for erisology. Most social constructionist writing could be too, if it was less about politics and more about cognition.”

There’s a reference to Hacking, sure, but not on the subject of politics or erisology. The reference to Wittgenstein is a sentence long, and the citations to the sociobiology debate are given just as examples of how erisology can explain things, not of how erisological thinking plays out productively (evolutionary psychology and sociobiology are favourite topics of the rationalist subculture). Finally, Jonathan Haidt also gets a look in (another very popular choice amongst this subculture) for the anti-tribalist political approach he takes on the basis of psychology.

None of this is to impugn Nerst for thinking about this. But if you add up the citations, you get a majority of references to texts from key blogs within the subculture; and if you add up the citations given for specifically erisological thinking, there is only one citation to a work which isn’t one of those texts.

Moreover, once you’ve been immersed in this stuff for a while, it does get a bit tiring seeing the same combination of recommendations: links back to the same blogs and blogposts, mixed in with bestselling popular psychology books and bestselling biology books. This isn’t to say that people shouldn’t read those books; it’s just to say that there’s a lack of diversity apparent to me in what is getting read, and therefore in what is getting thought about. And then the issue is that canonical interpretations of the wider texts are mainly drawn from the blog posts in which they are cited.

That’s rather a big deal when it comes to somebody writing for one of America’s most important intellectual publications.

An example: on an entirely different part of the internet you can find Scott Alexander’s joke about Marx’s “planet-sized ghost” (a dodgy extrapolation he made from the Very Short Introduction to Marx by Peter Singer) repeated as a truism as if it were what Marx was getting at. Very odd. This is surely an issue in which communications theorists would be interested?

A Bigger Picture

In which I begin to look rather like an antidisestablishmentarian…

What’s gone unspecified so far is how the discourse norms of the subculture at large are worthy of concern. All I’ve done thus far is point out a few key points about sacred texts, and show how one of Singal’s cited bloggers exhibits some of the insularity I’ve so far only vaguely ascribed to the subculture; it’s an important point to expand on with some broader detail. To wit, in both my and other people’s experience: online rationalists just don’t like talking to or about people outside the subculture, at least not people with whom they may disagree.

This really cuts to the heart of the issue with Weinberg’s uncritical presentation of the blogger Singal is referencing. It isn’t just that this one guy happens to be playing round robin with other rationalist bloggers; it’s that round robin is the name of the whole game. Indeed, one way to characterise the rationalist subculture is that its members are trying to think for themselves, and amongst themselves, about how to solve problems, and it’s sometimes explicit that people within the subculture think other people just aren’t up to the task.

In a profile of Yudkowsky in The New Yorker from a few years ago, he was asked if there was any value in somebody going through academic channels to work on the problems they’re interested in. His response was, in his own words, “visceral”. This sort of response is quite common.

Yudkowsky himself never completed high school, and he and his followers count his achievements as evidence against the idea that academic education is a positive.

Scott Alexander caught some flak, as I mentioned, when he wrote a piece eviscerating Marx on the mere basis of Peter Singer’s Very Short Introduction. Gwern, an intellectual hero within rationalism, and something of a Divine Innocent character in my view, likes to bypass the medical establishment to perform self-experiments. There are many such examples, all united by a deliberate and explicit suspicion of the academic establishment, or at least whichever bit of the academic establishment the rationalist in question is suspicious of.

Perhaps the most infamous is MetaMed, which as with all of this stuff you can easily research yourself.

Briefly, MetaMed was a now-disgraced attempt, headed up by LessWrong fan Michael Vassar, to build an alternative to normal medical practice using the tools of rationality written up in Yudkowsky’s Sequences. Vassar infamously claimed that no medical system but his would function, anywhere in the world, and Alexander was a fan of the project, excited by its novelty. Like Theranos, though more quickly, the scheme fell apart when its anti-establishment credentials proved insufficient to deliver on its promise.

To finish up on this subject, and to return to the issue of communication theory: one of the clearest and most common statements of this anti-establishment view concerns the humanities, in which subjects like rhetoric are situated. Distrust of, and outright spleen towards, humanistic methods is rife within the subculture, which I suppose is natural given its origins in an attempt to formalise a method for rationality. A preference for what many would label “scientism” would roughly sum it up, in spite of the apparent paradox that so much of this stuff relies on narrative thinking and parable.

Speaking of paradoxes: I wouldn’t want people to think me mistaken in saying that opposition to the establishment characterises the movement when, going out to look for themselves, they discover that rationalists within the subculture do in fact cite plenty of ex cathedra statistical evidence for their arguments. It’s true: they do. But, as with MetaMed, the point is to synthesise this evidence for the rationalists’ own purposes, and it should not be taken as evidence of implicit trust in establishment academia.

Postscript:

It’s all just something to think about when we say things like “at least they’re thinking about interesting things” or when we read a blog which seems insightful. Pace the rationalists themselves, it’s easy to say or assent to things when we don’t look as hard as we should. And my concern here is that crediting a community of bloggers without examining what they’re up to is just not the right way to go.

Of course, as I said, this isn’t motivated just by the relatively anodyne observations I’ve made here. The things that really motivate me, and which I’ve largely refrained from mentioning, are for another, angrier night, perhaps – if I were to go there at all. But it should be warning enough to find Daily Nous speaking so encouragingly of a writer who appears more or less exclusively to cite other people in his own auto-didactic community.

Consider Singal’s other linkee: Sarah Constantin.

The blog post Singal cites is Constantin’s own lengthy disquisition on avoiding bias and the people who do and don’t do it. Having been semi-immersed in the subculture for several years now, and having heard a few things about Constantin, I’ll say frankly that it reads to me like she’s edging towards an argument that it’s good for some people to be manipulated or even exploited, given their irrationality compared with others. But I’ll leave that up for people to consider themselves, if they’re willing to do the research. Hence this is a guide for the perplexed: to how a “low-decoupler” might go about trying to work out what the hell is going on here.

And really what I’d most like to encourage is that people really do check things for themselves, because there’s a lot under the surface of things before we get to “well at least they’re reading”.

The upshot is: two of the three people Singal cited come from a tiny internet subculture of auto-didactic techies and Silicon Valley types. Make of that what you will.

Editor’s (my) note: I incorrectly characterised Singal’s article as citing only one academic against two non-academics. This isn’t the case: Singal’s article discusses two academics, one of whom appears, in her interview with Singal, to disapprove of his position more than the other does. Singal only links to the latter.

And as another note: here are the original articles under discussion:

http://dailynous.com/2019/04/08/dont-roll-eyes-guy-recently-invented-philosophy/

https://www.theatlantic.com/ideas/archive/2019/04/erisology-the-science-of-arguing-about-everything/586534/

Third note: I need an editor. The original post referred to Constantin as an “interviewee”, which was from an early draft and somehow slipped through; this is now corrected to “linkee”.

Finally: the Yudkowsky profile appeared in The New Yorker, rather than the New York Times. Thanks to the sleuths who have hounded me over this apparently far-from-inconsequential detail for days.


9 thoughts on “A Guide For (the) Perplexed Philosophers: Jesse Singal and The Rationalist Subculture”

  1. I really liked this article, but some people just stumbling onto it would perhaps want some additional context, or at least a link to the Atlantic piece in question!

  2. Your description of the rationality subculture is surprisingly fair considering you don’t like it, so hats off in that regard.

    I still feel it has a harshly negative spin. E.g., Singal’s article is about Nerst, who quotes Constantin talking about the work of the academically-certified non-blogger psychologist Stanovich. This isn’t round-robin, it’s a pipeline. The ideas of someone in academia made it to the Atlantic for regular people like me to consider.

    Apparently there’s a field called “rhetoric” which already has a concept called “bracketing” that describes this? Cool, tell me more! But nothing pipelined that concept to me, and after googling the term, there are just some academic papers using it, and I still don’t know what it is. I’d love to see a non-STEM perspective on this, but they’re not putting it out there.

    NB: Just a point of order, Sarah Constantin was not an interviewee in Singal’s piece as you stated; he just cited her.

    1. I don’t really understand why it’s surprising, but thanks.

      That Stanovich’s ideas made their way to you is one thing, but that’s just not the point. If Stanovich’s ideas made their way to you, then one would hope that, by uncovering those academic papers, the ideas in them would reach you too. If you want more sources, here’s my friend Nathan Oseroff-Spicer discussing his own experiences with the field:

      The point is, in the end, that we find out about such things via networks, and those networks affect the things we come across.

      I flagged up the way in which the rationality community comes across things for those who don’t know.

  3. “To wit, in both my and other people’s experience: online rationalists just don’t like talking to or about people outside the subculture, at least not people with whom they may disagree.”

    Many of Scott Alexander’s best-known, most popular posts engage directly with people outside the rationalist subculture.

    “In a profile of Yudkowsky in the New York Times from a few years ago, he was asked if there was any value to somebody going through academic channels to work on the problems they’re interested in. His response was, in his own words “visceral”. This sort of response becomes quite common.”

    I looked for this and couldn’t find it. Can you please provide a direct quote with a source?

    In any case, I don’t believe this is a common response at all. Eliezer might have espoused this view once or twice many years ago, but many people in the rationalist community have an academic affiliation. Look at the MIRI team page for instance: https://intelligence.org/team/

    There’s definitely some criticism of academia in the community, but academics also criticize academia.

    “Yudkowsky himself never completed high school, and he and his followers count his achievements against the idea that academic education is a positive.”

    Again can you provide a specific source for this claim? A much better way to make this argument would be to cite the popularity of Bryan Caplan’s book *The Case against Education*. Caplan doesn’t think academic education isn’t useful, just that it’s inefficient.

    “Scott Alexander caught some flak, as I mentioned, when he wrote a piece eviscerating Marx on the mere basis of Peter Singer’s Very Short Introduction. Gwern, an intellectual hero within rationalism, and something of a Divine Innocent character in my view, likes to bypass the medical establishment to perform self-experiments. There are many such examples, all united by a deliberate and explicit suspicion of the academic establishment, or at least whichever bit of the academic establishment the rationalist in question is suspicious of.”

    A guy reviews a book on his blog: “deliberate and explicit suspicion of the academic establishment”? Another guy whose website is peppered with citations to academic papers performs self-experiments and collects data Quantified Self style. Again, “deliberate and explicit suspicion of the academic establishment”? Sounds to me like “deliberate and explicit suspicion of the rationalist community” on your part 🙂

    “But, as with MetaMed, the point is to synthesise this evidence for the rationalists’ own purposes, and should not be taken as evidence of implicit trust in establishment academia.”

    Heavens, we wouldn’t want people to analyze data *for their own purposes*, would we now 🙂

    1. Most of the responses I’ve had to this article treat it as if I’m putting myself forward as the final authority on these things, and neglect the early sentence in which I point out that this is about putting forward my well-informed impressions: those impressions are well-informed and you’re gonna have to get used to that fact. I even state outright that I’m not gonna do my angry thing and hand out the receipts I have for all the awful shit I think this subculture gets up to. People have called that “darkly hinting” or have, like you, asked me to provide sources for stuff that isn’t that hard to find out.

      As I stated in my original mission statement: this is a rough guide to the subculture written by somebody holding himself back from total dyspepsia, on the grounds that he thought it was an important guide for people reading a specific article in The Atlantic by Jesse Singal.

      To make things absolutely clear: this is not a takedown of the rationalist subculture, it isn’t intended as a takedown of the rationalist subculture, and if I wanted to write that article I would (and maybe will, but then again a friend of mine is writing a book on the subject already).

      If you want a source for the Yudkowsky quote, or for the fact that he didn’t finish high school, I don’t know why, or how that would change anything: Yudkowsky is well-known for voicing such opinions and famously did not finish high school.

  4. >The point is, in the end, that we find out about such things via networks, and those networks affect the things we come across.

    This is definitely true. I guess I was pushing back against the claim that this particular network only references itself, while the example in question has a clear link to an established psychologist. Meanwhile, there’s supposedly a related term, “bracketing”, in sociology, which rationalists are unaware of because of their disdain for the humanities & academia. Maybe, but I think it’s because the field itself doesn’t use the concept all that much. I’d be curious how many of the 100 people who liked the following tweet were previously familiar with the concept:

    Or they’re just too busy publishing serious work to spend time blogging, I don’t know. This is just one example I think is illustrative of a common pattern, but maybe I’ve cherry picked it and it proves nothing.

  5. “People who disagree with Yudkowsky simply don’t get it. People who disagree with Yudkowsky automatically do.”

    Did you mean to type disagree twice? It seems like the second ‘disagree’ should be ‘agree’.
