by Case Greenfield, June 12, 2022
We sincerely believe that we do the universally right thing, but secretly we serve our own interests over those of others whenever the two conflict. This is called Error Theory.
This week, I learned about the so-called ‘Error Theory’ of Ethics. It says that “ordinary moral claims presuppose that there are objective (universal) moral values, but there are no such things. Hence, the practice of morality is founded upon a metaphysical error.”
In my words: ethical standards are always, at least partly, subjective, serving the interests of the person or group who support the standard more than the interests of other people or groups. We sincerely believe that we do the universally right thing, but secretly (consciously, unconsciously, or something in between) we serve our own interests at the cost of the conflicting interests of others.
Recently, I wrote about this ‘grey area‘, the sliding scale, which is basically the same thing.
The Error Theory and the Grey Area are great examples of what I like to call Mind Models, the realities that we create to shape ourselves. You see the Error Theory in action everywhere: from individuals who enrich themselves at someone else’s expense and somehow feel it is ethically the right thing to do, to companies that knowingly sell bad or overpriced products and justify it with made-up ‘for-good’ arguments, up to dictators who invade and plunder other countries for a self-created ‘good’ cause.
Anyway, it made me wonder about the following thought experiment (which will never actually occur, hence a thought experiment):
What would happen if we all were to structurally and relentlessly serve the interests of others first, especially when doing so would hurt our own?
It would create an interesting recursive dynamic. If I hurt my own interest in the things that I do, the other person would automatically step in to stop my interest from being hurt. If they simply let me continue, they would be complicit in hurting my interest, which, by the same principle, they could not allow.
Could this somehow result in a balanced situation where my interests and yours are served equally? And would this balance represent some sort of, well, not an objective (universal) moral value, but rather a practical, step-by-step balancing equilibrium of interests?
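Just for fun, here is a toy numerical sketch of that balancing dynamic. Everything in it is my own invention for illustration: the two ‘agents’, the rule that each one cedes a fraction k of whatever surplus it holds over the other, and the starting split. The point is only that if each party keeps giving away part of its excess, the shares converge to an equal split, a crude picture of an equilibrium of interests:

```python
def step(a, b, k=0.5):
    """One round: whoever holds more cedes a fraction k of the surplus.

    a, b are the two agents' shares of a common resource (a + b = 1).
    The transfer is positive when A holds more than B, negative otherwise,
    so both agents follow the same 'serve the other first' rule.
    """
    transfer = k * (a - b) / 2
    return a - transfer, b + transfer

a, b = 0.9, 0.1  # start from a very unequal split
for _ in range(50):
    a, b = step(a, b)

print(round(a, 3), round(b, 3))  # → 0.5 0.5
```

Of course this says nothing about real human behaviour; it only shows that a mutual ‘cede your surplus’ rule has the equal split as its stable fixed point.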
And could that be a better solution: not to search for objective (universal) moral values, but to strive, case by case, for a balanced equilibrium of interests?
And isn’t the real reason why we don’t do this that it is too complicated for our brains? We would rather have a few simple, universal rules that we can blindly follow than make the effort of debating, again and again, a delicate balance of interests. And how would we deal with the fact that some people are more verbally skilled than others? Wouldn’t that create imbalance?
But I’m afraid an even stronger reason is that our brain is simply wired as a survival organ. It will always be inclined to advance our own interests, or those of our group, at the cost of the interests of others or of another group. It’s nature. How to deal with that? Modify our brains?
Not so easy, I’m afraid …
(Addition – June 15th, 2022)
By the way, the Error Theory of Ethics, really, is a form of what neuroscientists call naive realism. Here’s a quote from an article in Neuroscience News, called Well, I See It Differently! Why People Don’t View the World the Same Way Others Do:
People often mistake their own understanding of people and events as objective truth, rather than as merely their own interpretation. That phenomenon, called “naive realism,” leads people to believe that they should have the final word on the world around them. “We tend to have irrational confidence in our own experiences of the world, and to see others as misinformed, lazy, unreasonable or biased when they fail to see the world the way we do,” [UCLA psychology professor Matthew] Lieberman said.
And this has a huge impact in the world, because:
Naive realism may be the single most underappreciated source of conflict and distrust across individuals and groups, he said. “When others see the world differently than we do, it can serve as an existential threat to our own contact with reality and often leads to anger and suspicion about the others,” Lieberman said. “If we know how a person is seeing the world, their subsequent reactions are much more predictable.”
I like the term ‘naive realism’. It is really what I always call ‘mind models’. With mind models we create our own naive reality. “Reality“, because for us it is real. And “naive“, because like a child we don’t know better.
But it is not ‘reality’, whatever that is. It is ‘our reality’. And, coincidental as it may seem (or is), our naive reality somehow, always or almost always, works to our advantage and in our interest. I suppose that is what evolution has done to our brain as a survival organ.
– – –
By the way, naive realism is one of at least three philosophical theories of reality (naive realism, indirect realism and idealism). Naive realism states that we believe our perception of the world reflects it exactly as it is, whereas indirect realism suggests our perceptions are clouded by our biases, and idealism suggests the material world does not exist independently of our perceptions. (See also: Naive Realism, Wikipedia)
Personally, I believe that indirect realism is the most likely candidate to be true, with our biases strongly colored by our personal and group interests. ‘I see it in the way that is good for me.’ The brain as a survival organ, filtering reality into whatever suits me and serves my advantage.