
What's better - a broken clock or a clock that's always 5 minutes off?


<script src="https://cdnjs.cloudflare.com/ajax/libs/mathjax/2.7.1/MathJax.js?config=TeX-AMS-MML_HTMLorMML" type="text/javascript"></script>

This'll be a shorter post, but hopefully one that serves as a precursor to a longer one dealing with methods of argumentation, the inherent biases in our thought processes, and the differentiation of signal and noise in our beliefs.

I think it's important to start from a good reference point, which I feel is illustrated exceptionally well in this extract from *The Signal and the Corrective*:

> Most people have somewhat moderate views and they recognize that there is a bit of truth to both of two apparently opposing narratives. This can mask fundamental differences between those appearing to be in agreement.
>
> Like, look at [a] zebra[.]
>
> We can all agree on what it looks like. But some of us will think of it as a white horse with black stripes and some as a black horse with white stripes, and while it doesn’t actually matter now, that might change if whether “zebras are *fundamentally white*” or “zebras are *fundamentally black*” ever becomes an issue of political importance.
>
> In the real world zebras are (thank God) still politically neutral, but similar patterns exist. Two people with political views like:
>
> *“The free market is extremely powerful and will work best as a rule, but there are a few outliers where it won’t, and some people will be hurt so we should have a social safety net to contain the bad side effects.”*
>
> and
>
> *“Capitalism is morally corrupt and rewards selfishness and greed. An economy run for the people by the people is a moral imperative, but planned economies don’t seem to work very well in practice so we need the market to fuel prosperity even if it is distasteful.”*
>
> . . . have very different fundamental attitudes but may well come down quite close to each other in terms of supported policies. If you model them as having one “main signal” (basic attitude) paired with a corrective to account for how the basic attitude fails to match reality perfectly, then this kind of difference is *understated* when the conversation is about specific issues (because then signals *plus* correctives are compared and the correctives bring “opposite” people closer together) but *overstated* when the conversation is about general principles — because then it’s only about the signal.
>
> The ~~funny~~ sad thing is that this supports the view that if we saw every issue on its own terms instead of part of a [Big Referendum on Which Side is Right About Everything](http://www.slatestarcodex.com/2014/10/16/five-case-studies-on-politicization/) then we would agree more and fight less (which is part of the reason politics gets less terrible as it gets more local).
>
> It also explains the sort of situation (which happens to me a lot) where you switch sides based on who you’re talking to. If you’re with someone with an opposite signal, you prioritize boosting your own signal and ignore your own corrective that actually agrees with the other person. However, when talking to someone who agrees with your signal you may instead start to argue for your corrective. And if you’re in a social environment where everyone shares your signal and nobody ever mentions a corrective you’ll occasionally be tempted to defend something you don’t actually support (but typically you won’t because people will take it the wrong way). My “defense” of the concept “[War on Christmas](https://everythingstudies.com/2016/11/24/case-study-the-war-on-christmas/)” from last year is an example of that.
>
> The model gives us a new way to characterize zealots or ideologues (they’re people without correctives) and groupthink (that’s when correctives are not allowed). Such people and such environments creep me out.
>
> Finally, it offers a new perspective on the whole Rand lovers vs. Rand haters thing. Capitalist-types are usually the villains in fiction[3], and how the poor are oppressed by the evil rich has been dramatized so many times that the corrective[4] — that entrepreneurship is crucial for building wealth, capital owners fill a very important function, and the wealth of the industrialized world is disproportionally created by those who innovate and build technological systems — will be jarring when it’s for once brought out and given full signal-treatment instead (with the corresponding corrective ignored). It’ll feel perverse (or liberating) the same way “medical treatment hurts people who want to die” does in House of God.
>
> But why does it? I think most who aren’t hard left ideologues would recognize that there is at least some truth to the notion that entrepreneurs and innovators carry the modern world on their shoulders and that you shouldn’t live your life in service to others — and that there is less than perfect truth to the opposite notion that everyone wealthy is nothing but a rent-seeking parasite and everyone’s labor belongs to the community. If offering up one acknowledged-as-partial narrative is ok, why not another?
>
> Because “signal plus corrective” is not just a balanced view: there is a difference between “the market is amoral but also, I admit, effective” and “the market is effective but also, I admit, amoral”. While we’re willing to accept criticisms of what we support and defenses of what we oppose, respect to our basic view must be paid *first.* You can say how bad capitalism/government sometimes is but only after you’ve admitted its general goodness, and vice versa[5].
>
> I’ve noticed how well this explains how I function myself. There is a lot of valid criticisms of how science is done, for instance, but unless you first acknowledge that science is the best and perhaps only way to get reliable knowledge — far ahead any other contenders like religion or intuition — I’m not going to listen to them. I need to know you’re not going to use those valid points as a wedge to push something fundamentally unscientific. That’s not on the table and we need to be clear on that. In other words, I need to know you’re on my side (avoiding triggering our [coalitional](https://www.edge.org/response-detail/27168) instincts is key).
>
> And sure, criticize modern civilization for destructive wars and environmental degradation, but *not* until after you’ve given proper thanks for everything good it’s brought us.
>
> And sure, postmodernists and social constructionists [have some good points](https://everythingstudies.com/2017/03/06/science-the-constructionists-and-reality/) about the relationship between knowledge, labels, and power, but it took realizing that they typically *aren’t* denying objective reality to let myself appreciate those points.
>
> And, to pick a real life example: sure I’m aware of the [paradox of choice](https://en.wikipedia.org/wiki/The_Paradox_of_Choice) (that more options can sometimes make us less happy), and I have my reservations about consumerism and contemporary materialistic culture, etc. But those are correctives to a rock-hard commitment to individual freedom of choice, so when a coworker I happen to know sympathizes with an actual communist party mentions such points I’m considerably more hesitant to agree than if it had been someone else making them.
>
> Order matters. Try scrolling back up to the list of opposing narratives and do this with them, picking your own preferred side as a requirement for the other. I’ll wait.

Anchoring arguments

I want to use a fairly simple analogy to draw out a few more truths about our belief systems and how we defend them. More specifically, I'd like to talk about reference points and how we anchor certain portions of our arguments.

There's the old adage that a "broken clock is right twice a day". While this is commonly used to describe someone who cries wolf, or who is usually wrong but occasionally right by accident, I'd like to take a more pragmatic look at it as a question of value, and draw an analogy to our methods of argumentation.

Take two clocks: Clock A, which is broken and stuck at a certain time, say 6:00, and Clock B, which is 5 minutes fast.

Depending on what frame of reference you use, and which variables you "anchor", you can view either clock as the better, or more correct, one.

Clock A is, objectively, correct more often than Clock B. In a 24-hour period Clock A will have been right twice (or 3 times, if the window happens to start and end exactly at 6:00), while Clock B will have been right 0 times.
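
To make the count concrete, here is a minimal sketch in Python that checks each clock minute by minute. The 6:00 stuck time and the 5-minute offset are just the assumptions from the example above:

```python
# Count, over one 24-hour day sampled at whole minutes, how often each clock
# shows exactly the right time. Clock A is stuck at 6:00; Clock B is 5 minutes fast.
FACE = 12 * 60              # minutes on a 12-hour clock face
MINUTES_PER_DAY = 24 * 60

def clock_a(minute_of_day):
    """Broken clock: always shows 6:00."""
    return 6 * 60

def clock_b(minute_of_day):
    """Running clock that is 5 minutes ahead."""
    return (minute_of_day + 5) % FACE

def times_correct(clock):
    """Number of minutes in the day at which the clock shows the true face time."""
    return sum(clock(m) == m % FACE for m in range(MINUTES_PER_DAY))

print(times_correct(clock_a))   # -> 2 (at 6:00 and 18:00)
print(times_correct(clock_b))   # -> 0 (always exactly 5 minutes ahead)
```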

Now imagine two people arguing over which clock is better. For the first person, whichever clock is correct more often is better. 3 > 0, therefore Clock A is better, and provably so. In their mind there is no room for debate, provided the variable their argument anchors on is "absolute rate of being provably correct". Anyone who argues otherwise is not only incorrect, but foolishly so.

I doubt anyone reading this would agree with that conclusion, though: everyone would prefer the clock that is 5 minutes fast, and would argue that it is far more valuable, and arguably the only one of the two that actually does its job correctly.

How should one resolve this disagreement? The person who believes Clock A is better is, by their selected signal, also right in a scientific sense. A common mistake is to argue within the bounds of the first person's system, instead of arguing against the system itself.

They will attempt to argue that Clock B is right. Any logical person can see that arguing 0 > 3 is futile, and it obviously is in this contrived example, yet in real arguments online you'll see people attempt exactly the same thing.

It's also unfair to dismiss the first person's system entirely: their criteria have no obvious flaws at first glance, and seem like a fairly reasonable way to judge a clock.

The correct way to approach this problem is to take the parts of the system the first person already accepts and use them to slowly deconstruct or mutate it. For instance, in this case, you could relax the criterion from "correct more often in a day" to "how close the clock is, on average, to the current time".

If we assume discrete, minute-long intervals and measure how far the displayed time is from the true time on a 12-hour face, then Clock A will be, on average

$$\frac{2\cdot\sum_{n=1}^{360} n}{720} = \frac{2\cdot\frac{360\cdot 361}{2}}{720} = 180.5$$

minutes off, or roughly 3 hours. It's now much easier to argue that 5 < 180.5, and that Clock B is objectively better under this measure. If the first person is reasonable, it should be easy to convince them of this.
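
As a sanity check, here's a minimal sketch of the relaxed criterion in Python, again assuming the 6:00 stuck time and 5-minute offset and measuring error as the shortest distance around a 12-hour face. A straightforward minute-by-minute sampling gives 180.0 minutes for Clock A rather than 180.5, since it also counts the single minute of zero error, but either way it is roughly 3 hours against Clock B's constant 5 minutes:

```python
# Average per-minute error for each clock over a 24-hour day, where the error is
# the shortest distance (in minutes) between the shown time and the true time
# on a 12-hour face.
FACE = 12 * 60
MINUTES_PER_DAY = 24 * 60

def face_error(shown, actual):
    """Shortest distance around the 12-hour face, in minutes."""
    diff = abs(shown - actual) % FACE
    return min(diff, FACE - diff)

def average_error(clock):
    total = sum(face_error(clock(m), m % FACE) for m in range(MINUTES_PER_DAY))
    return total / MINUTES_PER_DAY

stuck_at_six = lambda m: 6 * 60                # Clock A: always shows 6:00
five_minutes_fast = lambda m: (m + 5) % FACE   # Clock B: 5 minutes ahead

print(average_error(stuck_at_six))       # -> 180.0, i.e. about 3 hours
print(average_error(five_minutes_fast))  # -> 5.0
```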

I hope to expand on this idea in a future, longer-form post.