EA - Why Neuron Counts Shouldn't Be Used as Proxies for Moral Weight by Adam Shriver
The Nonlinear Library: EA Forum - A podcast by The Nonlinear Fund
Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: Why Neuron Counts Shouldn't Be Used as Proxies for Moral Weight, published by Adam Shriver on November 28, 2022 on The Effective Altruism Forum.

Key Takeaways

Several influential EAs have suggested using neuron counts as rough proxies for animals' relative moral weights. We challenge this suggestion.

We take the following ideas to be the strongest reasons in favor of a neuron count proxy:
- neuron counts are correlated with intelligence, and intelligence is correlated with moral weight;
- additional neurons result in "more consciousness" or "more valenced consciousness"; and
- increasing numbers of neurons are required to reach thresholds of minimal information capacity required for morally relevant cognitive abilities.

However:
- in regards to intelligence, we can question both the extent to which more neurons are correlated with intelligence and whether more intelligence in fact predicts greater moral weight;
- many ways of arguing that more neurons result in more valenced consciousness seem incompatible with our current understanding of how the brain is likely to work; and
- there is no straightforward empirical evidence or compelling conceptual argument indicating that relative differences in neuron counts within or between species reliably predict welfare-relevant functional capacities.

Overall, we suggest that neuron counts should not be used as a sole proxy for moral weight, but cannot be dismissed entirely. Rather, neuron counts should be combined with other metrics in an overall weighted score that includes information about whether different species have welfare-relevant capacities.

Introduction

This is the fourth post in the Moral Weight Project Sequence.
The aim of the sequence is to provide an overview of the research that Rethink Priorities conducted between May 2021 and October 2022 on interspecific cause prioritization—i.e., making resource allocation decisions across species. The aim of this post is to summarize our full report on the use of neuron counts as proxies for moral weights. The full report can be found here and includes more extensive arguments and evidence.

Motivations for the Report

Can the number of neurons an organism possesses, or some related measure, be used as a proxy for deciding how much weight to give that organism in moral decisions? Several influential EAs have suggested that the answer is "Yes" in cases that involve aggregating the welfare of members of different species (Tomasik 2013, MacAskill 2022, Alexander 2021, Budolfson & Spears 2020).

For the purposes of aggregating and comparing welfare across species, neuron counts are proposed as multipliers for cross-species comparisons of welfare. In general, the idea goes, as the number of neurons an organism possesses increases, so too does some morally relevant property related to the organism's welfare. Generally, the morally relevant properties are assumed to increase linearly with an increase in neurons, though other scaling functions are possible.

Scott Alexander of Slate Star Codex has a passage illustrating how weighting by neuron count might work:

"Might cows be 'more conscious' in a way that makes their suffering matter more than chickens? Hard to tell. But if we expect this to scale with neuron number, we find cows have 6x as many cortical neurons as chickens, and most people think of them as about 10x more morally valuable. If we massively round up and think of a cow as morally equivalent to 20 chickens, switching from an all-chicken diet to an all-beef diet saves 60 chicken-equivalents per year." (2021)

This methodology has important implications for assigning moral weight.
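The bookkeeping behind Alexander's style of weighting can be sketched in a few lines. This is only an illustration of the linear-multiplier idea: the per-year consumption figures below are hypothetical placeholders (the quoted passage does not supply them), and the function name is our own.

```python
# A minimal sketch of neuron-count weighting as a linear multiplier.
# The cow:chicken weight of 20 is Alexander's "massively rounded up"
# figure; the diet figures are hypothetical, for illustration only.

def chicken_equivalents(n_animals: float, weight: float) -> float:
    """Convert a count of animals into chicken-equivalents under a
    given per-animal moral weight (chicken = 1 by convention)."""
    return n_animals * weight

COW_WEIGHT = 20          # rounded-up cow:chicken ratio from the passage
chickens_per_year = 25   # hypothetical all-chicken diet
cows_per_year = 1        # hypothetical all-beef diet

chicken_diet_cost = chicken_equivalents(chickens_per_year, 1)
beef_diet_cost = chicken_equivalents(cows_per_year, COW_WEIGHT)
savings = chicken_diet_cost - beef_diet_cost
print(savings)  # 5.0 chicken-equivalents per year under these assumptions
```

With different (and, in the original post, larger) consumption figures, the same arithmetic yields Alexander's "60 chicken-equivalents per year"; the point is only that the comparison reduces to multiplication by the assumed weight.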
For example, the average number of neurons in a human (86,000,000,000) is 390 times greater than the average number of neurons in a chicken (220,000,000), so we would treat the welfare units of humans as 39...
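The ratio in the example above can be checked directly from the quoted neuron counts:

```python
# Neuron counts as quoted in the text above.
human_neurons = 86_000_000_000    # average human
chicken_neurons = 220_000_000     # average chicken

# Under a linear neuron-count multiplier, one human's welfare would be
# weighted roughly 390x one chicken's.
ratio = human_neurons / chicken_neurons  # ~390.9, which the post rounds to 390
```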
