TLDR: The belief will not change when the reasons are defeated. The causality is reversed. People believe the reasons because they believe in the conclusion.
Why is it that we may agree in advance that a particular result would be a fair test of our theory, and then see so much more in it once the result is known?
To a good first approximation, people simply don't change their minds about anything that matters.
But we know that the power of reasons is an illusion. The belief will not change when the reasons are defeated. The causality is reversed. People believe the reasons because they believe in the conclusion.
In politics and in religion, the main driver is social. We believe what the people we love and trust believe. This is not a conscious decision to conform by hiding one's true beliefs; it is genuinely how belief works. Indeed, beliefs persevere even without any social pressure.
Classic studies by the late Stanford social psychologist Lee Ross established the phenomenon of belief perseverance. In those experiments, you first provide people with evidence that supports a particular belief. For example, you may give people the task of guessing which suicide notes are genuine, then provide feedback about their accuracy. People draw inferences from what they're told: those who have been given positive feedback score themselves much higher on empathy than people who have been given negative feedback.
Then you discredit the feedback by telling people there was a mix-up, and you test their beliefs about their empathy again. The outcome? Eliminating the evidence does not eliminate the beliefs that were inferred from it. People who have raised their opinions of how empathetic they are maintain their new belief, and the same is true of people who have been convinced that they're not very good at guessing other people's feelings.