David_Moss

I am the Principal Research Manager at Rethink Priorities, working on, among other things, the EA Survey, the Local Groups Survey, and a number of studies in moral psychology, focusing on animal welfare, population ethics, and moral weights.

In my academic work, I'm a Research Fellow working on a project on 'epistemic insight' (mixing philosophy, empirical study and policy work) and moral psychology studies, mostly concerned either with effective altruism or metaethics.

I've previously worked for Charity Science in a number of roles and was formerly a trustee of EA London.


Comments

EA Survey 2020: Community Information

Thanks!
 

Do you have more information about how personal/family finance as a bottleneck for impact is to be understood?

Unfortunately, the majority of people's open comments didn't provide more detail beyond something like "financial constraints" or "low income." Among the minority of comments which did offer more detail, the specific thing most often mentioned was simply that people could donate more if they had more money. Freedom to explore different options, switch career, or spend more time on high-impact work, and stress related to money, were each mentioned by only a couple of people.

Foresight for Governments: Singapore’s Long-termist Policy Tools & Lessons

Yet, if Dominic Cummings’ word is anything to go by, the UK government still has a long way to go in terms of long-term policymaking. 

 

Apologies if you already linked to this and I missed it, but Dominic Cummings is also writing a series about Singapore right now: https://dominiccummings.substack.com/p/high-performance-startup-government

How are resources in EA allocated across issues?

I think the figures for highly engaged EAs working in Mental Health, drawn from EA Survey data, will be somewhat inflated by people who are working in mental health, but not in an EA-relevant sense, e.g. as a psychologist. This is less of a concern for more distinctively EA cause areas, of course.

Among people who, in EAS 2019, said they were currently working for an EA org, the normalised figures were only ~5% for Mental Health and ~2% for Climate Change (which, interestingly, is a bit closer to Ben's overall estimates for the resources going to those areas). Also, as Ben noted, people could select multiple causes, and although the 'normalisation' accounts for this, it doesn't change the fact that these figures might include respondents who aren't solely working on Mental Health or Climate Change, but could be generalists whose work somewhat involved considering these areas.
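For concreteness, the 'normalisation' for multi-select cause responses can be sketched as below. This is an illustrative reconstruction of the general idea, not Rethink Priorities' actual analysis code, and the respondent data is entirely made up:

```python
from collections import Counter

def normalised_cause_shares(responses):
    """Each respondent may select several causes; weight each
    selection by 1/k (k = number of causes that respondent
    selected), so every respondent contributes one 'unit' in
    total and the resulting shares sum to 1."""
    totals = Counter()
    for causes in responses:
        if not causes:
            continue
        weight = 1 / len(causes)
        for cause in causes:
            totals[cause] += weight
    n = sum(1 for causes in responses if causes)
    return {cause: total / n for cause, total in totals.items()}

# Hypothetical respondents (not real survey data):
responses = [
    ["Mental Health"],
    ["Mental Health", "Climate Change"],
    ["Climate Change"],
    ["Global Poverty", "Climate Change"],
]
shares = normalised_cause_shares(responses)
```

As the comment above notes, this weighting accounts for multiple selections but can't distinguish specialists from generalists whose work merely touches an area.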

EA Survey 2020: Cause Prioritization

If it seems worth it (i.e., more people than me care!), you could potentially add a closed-ended 'other potential cause areas' item. These options could be generated from the most popular options in the prior year's open-ended responses. E.g., you could have IIDM and S-risk as closed-ended 'other' options for that question next year.

 

Yeh, that seems like it could be useful. It's helpful to know what kinds of things people find valuable, since space in the survey is always very tight.

EA Survey 2020: Cause Prioritization

I agree it's quite possible that part of this observed positive association between engagement and longtermism (and meta) and negative association with neartermism is driven by people who are less sympathetic to longtermism leaving the community. There is some evidence that this is a factor in general. In our 2019 Community Information post, we reported that differing cause preferences were the second most commonly cited reason for respondents' level of interest in EA decreasing over the last 12 months. This was also among the most commonly cited factors in our, as yet unpublished, 2020 data. There is also some evidence from that, as yet unpublished, post that support for longtermism is associated with higher satisfaction with the EA community, though I think that relationship still requires more research.

Dealing with differential attrition (i.e. different groups dropping out of the survey/community at different rates) is a perennial problem. We may be able to tackle this more as we get more data tracked across years (anything to do with engagement is very limited right now, as we only have two years of engagement data). One possible route: in 2019, we asked respondents whether they had changed cause prioritisation since they joined the community and, if so, which causes they switched from. A majority of those who had switched did so from Global Poverty (57%), and most seemed to be switching into prioritising the Long Term Future. It may be possible to estimate what proportion of neartermist respondents should be expected to switch to longtermism over time (assuming no dropout), then compare that with actual changes in the percentage of neartermists across time, and see whether we observe fewer neartermists within cohorts across surveys than we'd expect given the estimated conversion rate. But there are lots of complexities here, some of which we discuss in more detail in later posts on satisfaction and engagement.
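The expected-vs-observed comparison described above can be sketched roughly as follows. This is a minimal sketch under simplifying assumptions (a constant annual switch rate, no other cause changes); the rate and cohort shares are hypothetical, not estimates from the survey:

```python
def expected_neartermist_share(initial_share, switch_rate, years):
    """Project what fraction of a cohort should still prioritise
    neartermist causes after `years`, assuming a constant annual
    probability `switch_rate` of switching to longtermism and
    no dropout from the survey/community."""
    return initial_share * (1 - switch_rate) ** years

# Hypothetical cohort: 60% neartermist at joining, an assumed
# 10%/year switch rate, observed again 4 years later.
expected = expected_neartermist_share(0.60, 0.10, 4)
observed = 0.30  # made-up observed share from a later survey

# If observed falls well below expected, differential attrition
# (neartermists dropping out) is one candidate explanation.
gap = expected - observed
```

A real analysis would also need to propagate uncertainty in the estimated switch rate and handle respondents who switch between multiple causes.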

A couple of perhaps weakly suggestive observations: within the 2020 data, (i) engagement is more clearly associated with cause prioritisation than time in EA is, and (ii) we also observe more engaged EAs to be more longtermist (or meta) and less neartermist even within cohorts (i.e. among EAs who reported joining in the same year). Looking within different engagement levels, below, the relationship between cause prioritisation and time in EA is comparatively flat. An interesting exception is neartermism among those reporting the highest engagement, where it drops dramatically among the most recent cohorts (2016-2020): those who have been in EA longer are visibly less neartermist, which is roughly the pattern I would expect to see were neartermists dropping out (though it would be odd if that was only occurring among the most engaged).

EA Survey 2020: Cause Prioritization

I can't speak for others, but I don't think there's any specific theoretical conception of the categories beyond their formal specification ('EA movement building' and 'Meta (other than EA movement building)'). Other people might have different substantive views about what does or does not count as EA movement building, specifically.

I think the pattern of results this year, when we split out these options, suggests that most respondents understood our historical "Meta" category to refer primarily to EA movement building. As noted, EA movement building received much higher support this year than Meta other than EA movement building; EA movement building also received similar levels of support to "Meta" in previous years; and while the two new categories were quite well correlated, only ~12% of respondents rated Meta (other than EA movement building) higher than EA movement building (44% rated movement building higher, and 43% rated them exactly the same).

I think this suggests either that we could have just kept the Meta category as in previous years or that in future years we could consider dropping Meta other than movement building as a category (though, in general, it is strongly preferable not to change categories across years).

EA Survey 2020: Cause Prioritization

In future, I'd like to see changes in the 'other causes' over time and across engagement level, if possible. For instance, it would be interesting to see if causes such as IIDM or S-risk are becoming more or less popular over time, or are mainly being suggested by new or experienced EAs.

 

Yeh, I agree that would be interesting. Unfortunately, if we were basing it on open comment "Other" responses, it would be extremely noisy due to low n, as well as some subjectivity in identifying categories. (Fwiw, it seemed like people mentioning S-risk were almost exclusively highly engaged, which is basically what I'd expect, since I think it requires some significant level of engagement before people would usually be exposed to these ideas.)

I think that it would be very interesting if we could compare the EA community's results on this survey against a sample of people who don't identify as EAs, and people who identify as being in one or more 'activist groups' (e.g., vegan/climate etc.), and explore the extent of our similarities and differences in values (and how these are changing over time).

I agree this would be interesting. I'm particularly interested in examining differences in attitudes between EA and non-EA audiences. Examining differences in cause ratings directly might be more challenging due to the conceptual gap between the EA understanding of certain causes and the general population's (who may not even be familiar with what some of these terms mean). I think surveying more general populations on their support for different things (e.g. longtermist interventions, suitably explained) and observing changes in these across time would be valuable, though. Another way to examine differences in cause prioritisation would be to look at differences in the charitable portfolios of the EA community vs wider donors, since that aggregate data is more widely available.

EA Survey 2020: Cause Prioritization

The context here was that we've always asked about "Meta" since the first surveys, but this year an org was extremely keen that we ask explicitly about "EA movement building" and separate out Meta that was not movement building.

In future years, we could well move back to just asking about Meta, or just ask about movement building, given that Meta other than EA movement building both received relatively low support and was fairly well correlated with movement building.

Research into people's willingness to change cause *areas*?

I think there's definitely something to this.

As is suggested by this report, even donors who are very proactive are often barely reflecting about where they should give at all. They are also often thinking about the charity sector in terms of very coarse-grained categories (e.g. my country vs international charities, people vs animal charities). On the other hand, they often are making sense of their donations in terms of causes and an implicit hierarchy of causes (including particular personal commitments, such as to heart disease because a family member died from it, and so on). They also view charitable donation as highly personal and subjective (e.g. a matter of personal choice) [there is some evidence for this here and in unpublished work by me and my academic colleagues].

I think the overall picture this suggests is that people are sometimes thinking in terms of causes, but rarely explicitly deliberating about the optimal cause or set of causes.

To address the original question: I think this suggests that trying to get people to "change causes" by giving them reasons as to why certain causes are best may be ineffective in most cases, as people rarely deliberate about what cause is best and may not even be aiming to select the best cause. On the other hand, as many donors give fairly promiscuously or indiscriminately to charities across different cause areas, it's plausible you could get them to support different causes just by making them salient and appealing.

EA Survey 2020: How People Get Involved in EA

Thanks!

Incidentally, your comment just now prompted me to look at the cross-year, cross-cohort data for this. Here we can see that in EAS 2019, there was a peak in podcast recruitment closer to 2016 (based on when people in EAS 2019 reported getting involved in EA). Comparing EAS 2019 to EAS 2020 data, we can see signs of dropoff among podcast recruits who joined ~2014-2017 (and we can also see the big spike in 2020).
 

These figures are most instructive when compared to the figures for other recruiters (since the percentage of a cohort recruited by a given source is inherently a share relative to other recruiters, i.e. if one percentage drops between EAS 2019 and EAS 2020, another has to go up).
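The compositional point in parentheses can be illustrated with a toy calculation (the counts are invented for illustration, not survey data):

```python
def recruitment_shares(counts):
    """Convert per-source recruit counts for a cohort into shares.
    Because shares sum to 1, a fall in one source's share between
    two surveys mechanically implies a rise in the others'."""
    total = sum(counts.values())
    return {source: c / total for source, c in counts.items()}

# The same hypothetical 2016 cohort as seen in two surveys:
eas_2019 = recruitment_shares(
    {"Podcast": 30, "Personal contact": 50, "Other": 20})
eas_2020 = recruitment_shares(
    {"Podcast": 15, "Personal contact": 45, "Other": 20})

# Podcast recruits drop out faster here, so even though the
# personal-contact count also fell, its *share* goes up.
```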

Comparing personal contact recruits, we can see steadier figures across EAS 2019 and EAS 2020, suggesting less dropoff. (Note that the figures for the earliest cohorts are very noisy, since there are small numbers of respondents from those cohorts in these surveys.)
