However, this runs headlong into a problem that I highlighted in a previous post: that principles of metacoherence mean that such humble meta-beliefs rationally undermine our first-order beliefs. Or, as I put it in my comment:
If you think that both inductive arguments are equally reasonable, and that there's no sense in which X is objectively more similar to Y than to Z, then you have no grounds for believing one conclusion rather than the other (e.g. that future emeralds will be green rather than grue). By the principle of metacoherence, you should be agnostic.
I introduce this notion in my global rationality paper:
Sometimes we may be in a position to realize that our initial judgments should be revised. I may initially be taken in by a visual illusion, and falsely believe that the two lines I see are of different lengths. Learning how the illusion worked would undercut the evidence of my senses. I would come to see that the prima facie evidence was misleading, and the belief I formed on its basis likely false. Principles of meta-coherence suggest that it would be irrational to continue accepting the appearances after learning them to be deceptive, or more generally to hold a belief concurrently with the meta-belief that the former is unjustified or otherwise likely false.
As I further explain in my post 'Meta-Coherence vs. Humble Convictions':
It's [meta-]incoherent to believe that P whilst also believing that the weight of evidence fails to support P, since this is just to judge both [on the lower level] that P is true and [on the higher level] that P is probably not true after all.
If, faced with alternative possibilities, you nonetheless retain your belief in P, you must represent this to yourself as being because you judge that P is more likely true, and not just because you happened to have the belief in the past and don't feel like revising it. You can't consciously be an epistemic conservative, because the realization that you believe P partly for non-truth-related reasons will instantly lower your degree of belief in P, to match your assessment of the (subjective) probability that P is really true.
Some form of epistemic conservatism may be true nonetheless, in the trivial sense that we think our existing beliefs are true, and it will take positive work to convince us otherwise. But from a first-personal perspective, the reason we hold on to our existing beliefs is not the egoistic fact that they are our beliefs, but rather the "fact" (as it seems to us) that they are true. If you lose your confidence in the truth of a belief, and are "pretty sure" it's mistaken, then you can't sincerely believe it any more. You can't now appeal to epistemic conservatism as a way to hold onto your belief without judging it to be true. That's incoherent.