In any practical context, metaethics doesn’t matter. All the major standard metaethical systems (consequentialism, deontology, virtue ethics) cash out in pretty much the same ways. Object-level moral differences between one person and another are much likelier to spring from questions like “what are your guiding rules?” or “what are the factors in your calculus?” or “which virtues do you most favor?” than from metaethical differences. It’s impossible for anyone to apply any of those systems purely without a heavy (often-unacknowledged) admixture of the other two, due to the limitations of human hardware. Hell, each of those systems can sandbox and encompass the other two with no difficulty.
…it’s telling that Eliezer Yudkowsky, the self-appointed high priest of consequentialism, wrote a Manual of Morality that consists largely of (a) important rules to follow and (b) important virtues to cultivate.
Which is all to say: please stop using metaethics as a basis for tribalist identitarianism. It really doesn’t make any sense.
I sympathize with the last line, but this has some problems.
There’s a stark gap between “metaethics doesn’t matter” and “by and large most people don’t have different metaethics” (and between that and “… most groups don’t have different metaethics.”)
If you take the model (intuitive to begin with, but greatly expanded by Scott today) that most people start from vague practical intuitions, and then backsolve for terminal values behind them, and then make future decisions based on those frankly made up terminal values, then it’s easy to see that they do the same for metaethics too. And just because it’s made up doesn’t mean it’s not very real to them. Someone can just have a life path that led them to fundamentally feel “deontology is the way to make decisions” and they really will fight to the death for that.
On a group level, this usually doesn’t get discussed and all of the group’s morality becomes some vague hodge-podge of things that sound good to everyone, so you can sometimes get very surprising splits in otherwise similar tribal members about whether their foundational virtue was say, compassion or justice. Splits that it becomes nigh impossible to move either person off of. I can think of no way in which this doesn’t matter.
And even though on the group level it’s fuzzier, because everything on the group level is fuzzier, it’s not hard to say “the stories that form the core of the culture of this tribe have a certain metaethical consistency, which is at odds with the stories that form the core of the culture of that tribe.” Now you can always, always point out individual exceptions to that sort of folk-and-myth sociology, but what of it? We either think it’s meaningful to aggregate those sorts of cultural traditions across people, or we don’t, and metaethics isn’t particularly harder to generalize about than, say, “what sort of man feminists want.”
***
(You can take the Marxist route and say “all ideologies are just offspring of social pressure and material interests,” which might even be true, but is fairly useless once those belief systems have been formed, and people will (sometimes) very clearly violate social pressure and material interests to stay loyal to the belief system they made up to justify the prior.)
It is very true that different people have importantly different moral intuitions and moral practices, but this is – in theory, and mostly in practice – totally orthogonal to any consideration of metaethics. Compassion versus justice is not a metaethical issue, it’s a virtue-prioritization issue within virtue ethics. “I care about small hard-to-track benefits that accrue to people far away” is not a consequentialist position, it’s an object-level position; and, indeed, many of the people who espouse it do so on the grounds that thinking in such a way is virtuous. You can get to any single position through the right deontic rules or the right consequentialist values or the right favored virtues.
C v J was just an example. I don’t see purely metaethical considerations being any different.
I have of late taken a soft deontological position. Consequentialism suffers from our inability to predict a chaotic universe, and it’s important to do things that you think are good in and of themselves, not based on distant benefits. You may call this “a shorthand and practical limitation on consequentialism,” but as actually practiced it really feels like a metaethical foundation.
When I hear people say “deterrence” my skin crawls. You don’t know shit about the effects of locking up that kid for 5 years on the broader incidence of hate crimes. I am filled with passionate loathing for consequentialist thinking that dismisses immediate harms for extremely hazy and unpredictable future results, loathing I can rarely express in polite society. I don’t express it because I don’t think someone can really be convinced without getting to my most fundamental beliefs (everything you trust is unreliable and chaotic.)
That certainly feels like a metaethical difference from my interlocutor, and is an important part of predicting me.
Similarly, I think many people (both in Southern honor culture and in modern progressive culture) have a system of thinking and moral action that looks a lot more like virtue ethics than like consequentialism (even if they would vary in how they explain it).
“Well, if we put together a coalition to run this unpleasant potentially-dangerous person out of town on a rail, the concrete immediate easily-predictable outcome is that…the unpleasant potentially-dangerous person won’t be around anymore. Which sounds great. Sure, there might be some kind of fuzzy second- or third-order effect with consequences for local norms, but – that sounds like something we can’t actually understand, in an unreliable and chaotic universe.”
Also, once upon a time we had a conversation about keeping congressmen’s kids out of jail for drug use.
I suppose I can’t know for sure how your thinking looks from the inside, but I would be very surprised if you actually hewed to this alleged deontic principle in a consistent way.