I have discovered two logical fallacies in the past year, which I have not seen elucidated elsewhere: one formal, one informal. I present them for the evaluation, judgment, improvement, and utilization of readers:
1. The Relationship-Object Conflation (ROC) Fallacy
The ROC fallacy occurs when a proposed analogy between two relationships is conflated with an analogy between the constituent objects in these relationships.
For example, I might say that Joe’s relationship with Sue is like Adolf Hitler’s relationship with Eva Braun. Whatever I may be trying to say by making this analogy (perhaps that Joe’s inattention may lead to a non-fatal suicide attempt, or that Joe needs to marry Sue already), I am in some way comparing the relationship between Hitler and Braun to the relationship between Joe and Sue. I am not comparing Joe to Hitler, nor am I comparing Sue to Eva.
Someone could argue that, in fact, the two relationships aren’t analogous, or that there is a better (perhaps less offensive) analogy that could be made. But they could not criticize the comparison by saying that I was comparing Joe to Hitler. That would be a category error, confusing an object that forms a part of a relationship with the relationship itself. It would be a commission of the ROC fallacy.
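As a rough schematization (my own notation, not from the original argument), the fallacy can be put in relational terms, with $R(a,b)$ denoting the relationship between $a$ and $b$, and $\sim$ denoting "is analogous to":

```latex
\begin{align*}
\text{Asserted:}
  &\quad R(\mathrm{Joe},\, \mathrm{Sue}) \sim R(\mathrm{Hitler},\, \mathrm{Braun})\\
\text{Fallaciously inferred:}
  &\quad \mathrm{Joe} \sim \mathrm{Hitler}
   \;\;\text{or}\;\;
   \mathrm{Sue} \sim \mathrm{Braun}
\end{align*}
```

The inference in the second line does not follow from the first: an analogy between two relations carries no entailment about the relata, the objects standing in those relations.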
2. The Novalis Fallacy
I could not think of a good conceptually descriptive title for this informal fallacy, so, as a mnemonic, I’ve named it after its inspirational source: a lively thread about the merits and detriments of Jordan Peterson’s arguments for individual responsibility.
The Novalis fallacy is essentially a rhetorical excuse to sidestep the obligation of philosophical charity. It can sound somewhat persuasive because, ironically, it sometimes appeals to the intelligence of the subject, and uses this intelligence to presume the subject’s knowledge of some material.
Oftentimes, this is entirely legitimate. Assumptions about what other people know are necessary for any communication. But the Novalis fallacy takes this assumption and leverages it as evidence of ulterior motives, on the grounds that the subject fails to value or conclude what is tacitly asserted to be the “correct” things.
Mr. Novalis clarified his position in his response to my response:
> I assumed he has knowledge cuz its common knowledge and an educated person should have it. Since hes an educated person either he doesnt, and is thus not really educated, or he does, and is consequentially dishonest. The only reason he ignores it or it doesnt inform his analysis is cuz hes from Harvard, probably has a Jew wife, was propelled into stardom by the tribe and all his friends are Jews. Even if you cant see it, its obvious who hes allegiances are to.
Obviously, low-hanging fruit is easy to pick, but the argument is much more common, and often sounds far more persuasive. It is sometimes used by fairly intelligent people, and it sounds a little more persuasive even when it is merely laid out in bullets:
- All educated people know X.
- Therefore, Jordan Peterson either knows X or is not educated.
- Jordan Peterson is an educated person, so he knows X.
- Jordan Peterson adheres to [conclusion that appears antithetical to X].
- Therefore, Jordan Peterson is being dishonest.
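Laid out in quasi-formal notation (my own schematization; the predicate names are illustrative), the argument and its two weak points look like this:

```latex
\begin{align*}
&P_1:\ \forall p\,\bigl(\mathrm{Educated}(p) \rightarrow \mathrm{Knows}(p, X)\bigr)
  && \text{(dubious: educated people can be ignorant of } X\text{)}\\
&P_2:\ \mathrm{Educated}(\mathrm{JP})\\
&C_1:\ \mathrm{Knows}(\mathrm{JP}, X)
  && \text{(follows validly from } P_1 \text{ and } P_2\text{)}\\
&P_3:\ \mathrm{Holds}(\mathrm{JP}, Y), \text{ where } Y \text{ appears antithetical to } X\\
&C_2:\ \mathrm{Dishonest}(\mathrm{JP})
  && \text{(non sequitur: } C_1 \wedge P_3 \text{ do not entail dishonesty)}
\end{align*}
```

The argument can fail at either marked point: $P_1$ is an overbroad universal, and even if $C_1$ were granted, knowing $X$ while holding $Y$ does not entail dishonesty, as the following paragraphs explain.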
The reason that this is a fallacy is that even the smartest people in the world are often ignorant of particular facts. There are simply too many facts out there, in too many fields, for anyone to know all of them, even facts which some individuals hold to be basic.
But more importantly, there are a variety of theories and worldviews to explain facts that are not intuitively obvious. It might seem intuitive to believe that if someone knows a particular action will result in their death, then they would not knowingly perform that action… and yet people take exactly these kinds of risks all the time, from flying into space, to charging an enemy position on the battlefield, to running into a burning building to save someone else, to simply killing themselves for the explicit and sole purpose of ending their life. Sometimes the value we place on a conclusion drawn from a fact, as distinct from the fact itself, is not shared by another person. Alternatively, that person might share our value but simultaneously hold another, higher value which supersedes their loyalty to the first: Allan and Bob may both value life, but perhaps Allan values his honor over his life.
Rhetorically, the Novalis fallacy also fails because the speaker is, oftentimes, appealing to the intelligence of the subject. But this primes listeners to the possibility that the subject may in fact know more than the accuser believes, and that the subject withholds agreement on the basis of additional information the accuser lacks, rather than because of an ulterior motive or hidden loyalty.