Naked Science Forum


Title: How does cognitive bias affect our receptivity to new theories?
Post by: Martin J Sallberg on 13/05/2013 13:18:52
When alleged "cognitive biases" are used as arguments against theories (along the lines of "let's be sceptical of this claim because there is a cognitive bias towards believing it"), one important possibility is overlooked: the belief that there is a cognitive bias may itself be due to a bias towards believing that there is one. Thus, psychologistic arguments for scepticism cut off the branch they themselves sit on.
Title: Re: The bias bias.
Post by: dlorde on 13/05/2013 14:29:16
Follow the evidence. If there is evidence to support the possibility of cognitive bias in a given context, it is reasonable to take account of that possibility in the absence of compelling evidence for the claim.

Of course, the weightings you apply in assessing the probability of cognitive bias, and how compelling you consider the evidence for a claim, are themselves open to cognitive bias. The wise skeptic bears this in mind.
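
To put the weighting point in concrete terms, here is a rough sketch in Python (every probability in it is invented purely for illustration, not taken from any study): it applies Bayes' rule to show how the prior you place on "the observer is biased" changes how much credence a favourable report earns for the claim.

Code:
# Illustrative sketch only: how a judgement about bias feeds into the weight
# given to a piece of evidence. All probabilities below are made up.

def posterior(prior_claim, p_report_if_true, p_report_if_false):
    """P(claim is true | favourable report), by Bayes' rule."""
    numerator = p_report_if_true * prior_claim
    denominator = numerator + p_report_if_false * (1.0 - prior_claim)
    return numerator / denominator

prior_claim = 0.5          # start agnostic about the claim itself
p_report_if_true = 0.9     # an unbiased observer usually reports a real effect

# A biased observer tends to report a favourable result even when the claim
# is false; vary the assumed probability that the observer is biased.
for p_biased in (0.1, 0.5, 0.9):
    p_report_if_false = p_biased * 0.8 + (1.0 - p_biased) * 0.1
    credence = posterior(prior_claim, p_report_if_true, p_report_if_false)
    print(f"P(observer biased) = {p_biased:.1f} -> P(claim | report) = {credence:.2f}")

The same report counts for a great deal if you think bias is unlikely and for very little if you think it is likely; and the figure you put on "likely" is, as noted above, itself a judgement that is open to bias.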

Being aware of the possible influences on your judgement should help make for a better assessment of its reliability.
Title: Re: The bias bias.
Post by: Martin J Sallberg on 15/05/2013 06:36:15
Follow the evidence. If there is evidence to support the possibility of cognitive bias in a given context, it is reasonable to take account of that possibility in the absence of compelling evidence for the claim.

What if the whole claim of "evidence for a cognitive bias" is just a bunch of historical contingencies and self-fulfilling prophecies?

Being aware of the possible influences on your judgement should help make for a better assessment of its reliability.

No. Believing that you are biased becomes a self-fulfilling prophecy. Ever heard of the placebo effect? And if the bias is supposed to be panhuman, then peer review accomplishes nothing. Psychologistically motivated "scepticism" is effectively the most blatant conspiracy theory in the world.
Title: Re: How does cognitive bias affect our receptivity to new theories?
Post by: dlorde on 15/05/2013 10:29:54
What if the whole claim of "evidence for a cognitive bias" is just a bunch of historical contingencies and self-fulfilling prophecies?
If the evidence for cognitive bias was invalid or unreliable, then we would be unable to validly or reliably take account of cognitive bias. Fortunately, it is possible to reliably demonstrate the existence of cognitive bias by experiment, and corroborate it by investigation of well-documented past events. This has been done and continues to be done.

Quote
Believing that you are biased becomes a self-fulfilling prophecy. Ever heard of the placebo effect? And if the bias is supposed to be panhuman, then peer review accomplishes nothing. Psychologistically motivated "scepticism" is effectively the most blatant conspiracy theory in the world.
Being aware of the potential for bias enables you to control for bias. If you are unaware of the possibility of bias, you will be unable to control for it or eliminate it.
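
For a concrete, entirely made-up illustration of what "controlling for bias" can mean in practice, here is a small Python sketch of an unblinded versus a blinded assessment. The numbers are invented; the point is only the pattern, which is the standard reason trials are blinded.

Code:
import random

# Hypothetical numbers throughout. A rater who knows which group a subject is
# in tends to score the "treatment" group a little higher; blinding the rater
# stops that expectation from reaching the measurement.

random.seed(1)
TRUE_EFFECT = 0.0      # assume the treatment actually does nothing
RATER_BIAS = 0.5       # extra score given when the rater expects improvement

def rate(group, blinded):
    outcome = random.gauss(0.0, 1.0) + (TRUE_EFFECT if group == "treatment" else 0.0)
    if not blinded and group == "treatment":
        outcome += RATER_BIAS          # expectation leaks into the score
    return outcome

def mean_difference(blinded, n=10000):
    treated = sum(rate("treatment", blinded) for _ in range(n)) / n
    control = sum(rate("control", blinded) for _ in range(n)) / n
    return treated - control

print("unblinded difference:", round(mean_difference(False), 2))   # ~0.5, spurious
print("blinded difference:  ", round(mean_difference(True), 2))    # ~0.0, nothing real

The unblinded run shows an apparent effect that is pure expectation; the blinded run shows none. The rater's expectation has not gone away; it has simply been prevented from reaching the measurement.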

The placebo effect is a change in perception of health or well-being after a pharmacologically or physiologically inactive medical intervention and it may occur whether the subject or patient is aware of the placebo or not. How is it relevant here?

Scepticism as conspiracy theory is new to me; can you point to any evidence for it, or are you just suggesting the possibility? Conspiracy involves deliberate, informed agreement to malfeasance; your description of a "bunch of historical contingencies and self-fulfilling prophecies" doesn't fit that.
Title: Re: How does cognitive bias affect our receptivity to new theories?
Post by: Martin J Sallberg on 15/05/2013 13:23:13
If the evidence for cognitive bias was invalid or unreliable, then we would be unable to validly or reliably take account of cognitive bias. Fortunately, it is possible to reliably demonstrate the existence of cognitive bias by experiment, and corroborate it by investigation of well-documented past events. This has been done and continues to be done.

Now you are ignoring that it may simply be the intolerance, and the justifications that follow it, that create the "bias". As shown in "What is the brain basis for blame?", extreme recoveries from brain damage and other severe mental disorders are linked to tolerant environments. This supports the theory that biases are not fixed at all, but are simply products of cultural intolerance.

Being aware of the potential for bias enables you to control for bias. If you are unaware of the possibility of bias, you will be unable to control for it or eliminate it.

The placebo effect is a change in perception of health or well-being after a pharmacologically or physiologically inactive medical intervention and it may occur whether the subject or patient is aware of the placebo or not. How is it relevant here?

It is believing that one is biased that makes one biased; that is, bias is the product of believing that one is biased. The placebo effect, at least in this case, refers to the effects of beliefs. If you believe that you have a fixed bias, that belief makes you biased.

Scepticism as conspiracy theory is new to me; can you point to any evidence for it, or are you just suggesting the possibility? Conspiracy involves deliberate, informed agreement to malfeasance; your description of a "bunch of historical contingencies and self-fulfilling prophecies" doesn't fit that.

I never claimed that all scepticism consists of conspiracy theories, only that psychologistically motivated scepticism is effectively a conspiracy theory. By "effectively" I am leaving aside all quibbling about what is "deliberate". Alleging a panhuman bias has the same effect as alleging a worldwide conspiracy that can never leak.
Title: Re: How does cognitive bias affect our receptivity to new theories?
Post by: damocles on 16/05/2013 00:29:32
I would argue empirically that a cognitive bias makes one totally unreceptive to new theories. I see cognitive bias all the time -- it is amply demonstrated by refusal to follow the rules of a forum, the desire to always have the last word, the refusal to accept peer review as a valid mechanism, etc. And when I find it I know that it is totally pointless to argue back.
Title: Re: How does cognitive bias affect our receptivity to new theories?
Post by: dlorde on 16/05/2013 15:05:40
... when I find it I know that it is totally pointless to argue back.

Quite. 'Nuff said.
Title: Re: How does cognitive bias affect our receptivity to new theories?
Post by: sparshmishra1 on 18/05/2013 05:21:35
Cognitive bias can, to a large extent, shut down our receptivity to new theories. We tend to look only at the things we currently know and do not even try to accept new theories that may be different.
This makes an individual closed-minded rather than open-minded, which may distort one's judgement.
All the discoveries that have taken place in the world have only happened because of the open-mindedness of the people who made them.
Title: Re: How does cognitive bias affect our receptivity to new theories?
Post by: damocles on 18/05/2013 08:11:33
Cognitive bias can, to a large extent, shut down our receptivity to new theories. We tend to look only at the things we currently know and do not even try to accept new theories that may be different.
This makes an individual closed-minded rather than open-minded, which may distort one's judgement.
All the discoveries that have taken place in the world have only happened because of the open-mindedness of the people who made them.

This is certainly also true. So what standard should we adopt? It is fairly obvious that science should be generally conservative, else it would be quite chaotic, changing every time that anyone wakes up with a new thought. Fortunately the sorting does not occur on these forums.

When someone has a new theory, they should first submit it to a respectable journal. It will be passed out for peer review -- at least two independent scientists who are supposed to be expert in the field will review the work, make a recommendation regarding publication and specify any shortcomings they see in the work. The comments will be passed back to the originator of the work, who is given an opportunity to correct the detail of the work or to rebut the referees' comments. Nearly all of the great advances in science have gone through this procedure.

If there is a recommendation that the work be rejected, then the originator must either accept it or argue their case further. Where and how this is done is entirely up to the author's judgement. But if the author comes to a forum like this to argue the case, then they should produce the expert referees' comments alongside their argument; and if they find that less qualified experts are also questioning their "research" on any grounds whatever, it is a sign that their mind is closed, particularly if they feel that they have to have "the last word" on every point. The only battle to be won on these forums is a political one, not a scientific one.
Title: Re: How does cognitive bias affect our receptivity to new theories?
Post by: dlorde on 18/05/2013 14:28:25
Initial publication isn't the end of the story, by any means. Potentially important results should be subject to independent replication before they will be taken seriously by the community, and even then it may be some time before they are accepted and models are adjusted (if necessary) to accommodate them.
Title: Re: How does cognitive bias affect our receptivity to new theories?
Post by: Martin J Sallberg on 23/05/2013 09:15:41
The point is that psychologistic bias claims are the same thing as claiming that objectivity cannot exist, which is the same thing as saying that science does not really exist but is just another belief system. Psychologism must be wrong in order for science to exist.
Title: Re: How does cognitive bias affect our receptivity to new theories?
Post by: damocles on 23/05/2013 11:31:22
The point is that psychologistic bias claims are the same thing as claiming that objectivity cannot exist, which is the same thing as saying that science does not really exist but is just another belief system. Psychologism must be wrong in order for science to exist.
I think not. Modern Science has developed with a good but far from perfect system of checks and balances. It tends to have a conservative bias, but that means that most of the wacky new theories that are not likely to be productive do not immediately get incorporated into the mainstream of knowledge and belief about the way that nature works, and any worthwhile theory will eventually make people sit up and take notice because of the way that it succeeds in providing a description of uncomfortable anomalies. Empirically science does exist, and it is not just "another belief system" though it can be. Scientism does not provide any moral guidance (without extra assumptions), and I find that science sits very comfortably with my (Anglo-Catholic) Christian world view -- my God is a God of truth, who has commanded us to explore and come to terms with the world around us.
Title: Re: How does cognitive bias affect our receptivity to new theories?
Post by: Martin J Sallberg on 23/05/2013 13:37:01
I think not. Modern Science has developed with a good but far from perfect system of checks and balances. It tends to have a conservative bias, but that means that most of the wacky new theories that are not likely to be productive do not immediately get incorporated into the mainstream of knowledge and belief about the way that nature works, and any worthwhile theory will eventually make people sit up and take notice because of the way that it succeeds in providing a description of uncomfortable anomalies.

What does "wacky" mean? It is, if science exists, easy to test whether the predictions made by a theory pan out or not. There is no need to have an official viewpoint. Just admit that we do not ####ing know, until it is tested. The whole term "burden of proof" is all based on a deluded belief that there must somehow be an official viewpoint. Just collect "uncomfortable" anomalies en masse and see what it leads to, even if it means having to come up with completely new ideas after the data is collected.

Empirically science does exist, and it is not just "another belief system" though it can be. Scientism does not provide any moral guidance (without extra assumptions), and I find that science sits very comfortably with my (Anglo-Catholic) Christian world view -- my God is a God of truth, who has commanded us to explore and come to terms with the world around us.

To claim that "scientism does not provide any moral guidance without extra assumptions" naively assumes that there should be a reliable definition of "morality", which there is not. What if, for instance, the whole distinction between "intrinsic value" and "instrumental value" is just a delusion acquired through too much tool use?

As for religion, consider the fact that just 30 years ago, it was believed that any speculation about whether or not exoplanets existed was in the realm of faith and would never become testable. The lesson is: never ever be so sure that something will never be testable.

And the main point: any psychologistic bias claim inevitably leaves the door wide open for instinctualistic "explanations" of ideas, which renders any scientific system useless.
Title: Re: How does cognitive bias affect our receptivity to new theories?
Post by: damocles on 23/05/2013 16:37:26
From Martin J Sallberg:
Quote
What does "wacky" mean?
My apologies Martin. It is Australian (and North American?) slang for "eccentric", with perhaps a hint of being a pejorative term and of referring to an extreme form of eccentricity.

 
Quote
It is, if science exists, easy to test whether the predictions made by a theory pan out or not. There is no need to have an official viewpoint.

This is not necessarily the case. It can be very difficult to work out an experiment which will test a theory, and even with an alleged falsification it may not be quite clear where to lay the blame (Google Quine-Duhem problem)

Quote
Just admit that we do not ####ing know, until it is tested. The whole term "burden of proof" is all based on a deluded belief that there must somehow be an official viewpoint. Just collect "uncomfortable" anomalies en masse and see what it leads to, even if it means having to come up with completely new ideas after the data is collected.
This is not a practical possibility, because science is not a simple, timeless enterprise. There is no point at which we can say "we now have all the pieces of the jigsaw puzzle, so let us now go ahead and solve it". So at every point there has to be an "official viewpoint" corresponding to the best guess of the scientific community.
This "best guess" will generally involve leaving a number of issues unresolved on the "back burner". For example, when Marie Curie was concentrating her solutions of radium chloride, she arrived at the point where the solutions were glowing and getting quite warm -- emitting about 30 watts of power, but with no measurable change. This was as clear a counter example to the conservation of energy as one would ever have expected to find, but it was left on the back burner because too much else hinged on the conservation of energy.

The issue was resolved a few years later when radon gas was discovered in the air above radium salt solutions, and it was recognized that the loss of mass in the radium would have been unmeasurable.
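
A rough back-of-envelope check (a sketch only, taking the ~30 watt figure above at face value and using E = mc^2, which was of course not part of the picture at the time) shows why any corresponding loss of mass was far beyond measurement:

Code:
# Illustration only: how much mass does ~30 W of radiated power correspond to
# over a year, via E = m * c^2?

POWER_WATTS = 30.0
SECONDS_PER_YEAR = 3.156e7
SPEED_OF_LIGHT = 2.998e8            # metres per second

energy_joules = POWER_WATTS * SECONDS_PER_YEAR
mass_loss_kg = energy_joules / SPEED_OF_LIGHT ** 2

print(f"energy radiated in a year: {energy_joules:.2e} J")
print(f"equivalent mass loss:      {mass_loss_kg * 1e9:.0f} micrograms")

That is of the order of ten micrograms a year, utterly invisible to the balances of the day, which is why the anomaly could safely be left on the back burner.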

Quote
Quote from: damocles on Today at 10:31:22
Empirically science does exist, and it is not just "another belief system" though it can be. Scientism does not provide any moral guidance (without extra assumptions), and I find that science sits very comfortably with my (Anglo-Catholic) Christian world view -- my God is a God of truth, who has commanded us to explore and come to terms with the world around us.

To claim that "scientism does not provide any moral guidance without extra assumptions" naively assumes that there should be a reliable definition of "morality", which there is not.

I believe that "morality" arises from an innate sense of good and evil -- frequently described as "conscience", and described in the creation myth in terms of Adam eating the fruit of the tree of knowledge of good and evil. An adherent of scientism, like Dawkins, might believe that it arises from a gene selfishly trying to ensure its continuing expression. The point is that we both think in terms of "morality" and we find a lot of common ground in what we see as moral, as well as a number of points of quite serious difference. Empirically I have been quite interested recently to find that here in Australia there is a lot of public sympathy for the Government to upgrade its spending on mental health facilities. I believe that this suggests support for my notion of morality rather than Dawkins', though he would probably have the "out" that it arises from cultural factors.

Quote
As for religion, consider the fact that just 30 years ago, it was believed that any speculation about whether or not exoplanets existed was in the realm of faith and would never become testable. The lesson is: never ever be so sure that something will never be testable.

I am very familiar with this point in the atomist/anti-atomist debates in 19th century chemistry, and those who maintained that atoms were not a part of science because they would always be beyond the reach of the senses. The really interesting thing is that chemists of both persuasions were able to work together and lay out a lot of the foundations of their subject. What I do not understand is what all this has to do with religion, unless you are rather naively suggesting that at some future date scientific findings will rule out religion?
Title: Re: How does cognitive bias affect our receptivity to new theories?
Post by: cheryl j on 24/05/2013 04:18:01
What is the difference between bias and expectations based on past experience and learning? Over time one observes how things work in the world, sees patterns, deduces causes and effects, which results in expectations and predictions about what is likely to happen in the future.

From the standpoint of scientific experiments this could be called bias, since researchers do not want any expectations affecting the outcome of the experiment, even if they are reasonable expectations.
From the standpoint of psychology or personality, bias may be a particular type of belief, or an unwillingness to change one's views despite a lot of evidence to the contrary. Or it might be believing in an idea for reasons other than the reason one claims (or may not even be aware of). For example, do some people believe in religion because they genuinely think it is probably or likely to be true? Or do they believe in religion because it offers relief from existential anxiety, i.e. a loving father figure who will guide one through life, intervene during a catastrophic event, and promise life after death? Do some people reject science because they see it as incompatible with religion, and fear that accepting science would require abandoning the comforting ideas that religion offers? I'm not trying to start a religion vs science debate, or implying that all religious people think that way. There are lots of other examples of belief that may be motivated by emotional factors, like political correctness, concern for one's reputation or career advancement, and fear of criticism or ridicule. Another factor might also be respect for the ability or knowledge of other people who hold that belief, which might not be a legitimate reason, but may understandably tip the scales when one is undecided. I expect even physicists do this (Well, if Stephen Hawking believes it....) I also think it is more difficult to accept ideas that are depressing or predict something unpleasant, and that is probably the bias I am most prone to. I want to believe that people are basically good and capable of change. I want to believe in things like free will.

Some research has shown that different people tend to be more flexible or less flexible in changing their opinions or beliefs. One study showed more similarity in the brains of people who identified themselves politically as ultra conservatives or die hard liberals, than between either of these groups and moderate, swing voters.

Being flexible would seem to be the better option when it comes to learning, eliminating erroneous assumptions, and adapting to change. But as Damocles also pointed out, if one constantly had to revise or start from scratch with each novel experience, it would be difficult to function, and this seems true for individuals as well as collective thinkers like “the scientific community.” I don’t abandon my concept of gravity the first time I let go of a helium balloon and it floats up to the ceiling. I look for an explanation that is compatible, because the other evidence for gravity is so compelling. In addition, overwhelming uncertainty makes it difficult to make choices and act on them, so one tends to maintain certain beliefs despite a few doubts or apparent inconsistencies.

Ironically, some people or groups might be “biased” towards changing their views or adopting unorthodox opinions. I know individuals who become passionate about a spiritual practice or health regimen for about a year or so, and then abandon it for something else which they believe will be the answer to all their life issues. I’ve also heard it said that grad student theses are abundant with wacky ideas because one does not get as much attention or praise for a new finding that confirms conventional wisdom, as one gets from discovering/proposing something unexpected, contradictory, or revolutionary.
Title: Re: How does cognitive bias affect our receptivity to new theories?
Post by: Martin J Sallberg on 24/05/2013 10:43:00
My apologies Martin. It is Australian (and North American?) slang for "eccentric", with perhaps a hint of being a pejorative term and of referring to an extreme form of eccentricity.

It was not a dialectal misunderstanding; it was criticism of the term.


This is not necessarily the case. It can be very difficult to work out an experiment which will test a theory, and even with an alleged falsification it may not be quite clear where to lay the blame (Google Quine-Duhem problem)

It is not necessary to "know where to put the blame" in order to show that a theory is false. If there is empirical evidence for something that the theory predicts should be impossible, then the theory is false. Period.

Quote
Just admit that we do not ####ing know, until it is tested. The whole term "burden of proof" is all based on a deluded belief that there must somehow be an official viewpoint. Just collect "uncomfortable" anomalies en masse and see what it leads to, even if it means having to come up with completely new ideas after the data is collected.
This is not a practical possibility, because science is not a simple, timeless enterprise. There is no point at which we can say "we now have all the pieces of the jigsaw puzzle, so let us now go ahead and solve it". So at every point there has to be an "official viewpoint" corresponding to the best guess of the scientific community.

I meant collecting the results of experiments that have already been done. It is not necessary to do a new experiment for every theory one wants to falsify. And there is absolutely no need to pretend to know for sure when there is uncertainty.

This "best guess" will generally involve leaving a number of issues unresolved on the "back burner". For example, when Marie Curie was concentrating her solutions of radium chloride, she arrived at the point where the solutions were glowing and getting quite warm -- emitting about 30 watts of power, but with no measurable change. This was as clear a counter example to the conservation of energy as one would ever have expected to find, but it was left on the back burner because too much else hinged on the conservation of energy. The issue was resolved a few years later when radon gas was discovered in the air above radium salt solutions, and it was recognized that the loss of mass in the radium would have been unmeasurable.

Well, of course falsifications must be real falsifications, not errors of measurement, so that was an invalid example. I was talking about actual falsifications. When a theory fails an actual falsification, it does not matter at all "how much hinges on it"; the new theory must simply bear a close enough superficial resemblance to the old mainstream theory, in all the contexts of the specific observations where the old theory has passed the test, to predict those outcomes as well.


I believe that "morality" arises from an innate sense of good and evil -- frequently described as "conscience", and described in the creation myth in terms of Adam eating the fruit of the tree of knowledge of good and evil. An adherent of scientism, like Dawkins, might believe that it arises from a gene selfishly trying to ensure its continuing expression. The point is that we both think in terms of "morality" and we find a lot of common ground in what we see as moral, as well as a number of points of quite serious difference. Empirically I have been quite interested recently to find that here in Australia there is a lot of public sympathy for the Government to upgrade its spending on mental health facilities. I believe that this suggests support for my notion of morality rather than Dawkins', though he would probably have the "out" that it arises from cultural factors.

The evidence for the existence of rapid evolution proves that any belief in genetically-determined morality is bound to make racist predictions, as inevitably as the theory of a luminiferous aether predicts that the speed of light should be different in different directions due to the motion of the Earth.

And the main point about morality is that there is no definition of what morality really means, and thus no way to say for sure that "science cannot give moral guidance without extra assumptions" either.


I am very familiar with this point in the atomist/anti-atomist debates in 19th century chemistry, and those who maintained that atoms were not a part of science because they would always be beyond the reach of the senses. The really interesting thing is that chemists of both persuasions were able to work together and lay out a lot of the foundations of their subject. What I do not understand is what all this has to do with religion, unless you are rather naively suggesting that at some future date scientific findings will rule out religion?

I never said that science will rule out religion. Just that it is naive to be absolutely sure that it never will. And in fact, I am not really an atheist but rather an ignostic. Ignosticism is the view that the question "does God exist?" cannot be answered due to the lack of a clear definition of what the word "God" really means.
Title: Re: How does cognitive bias affect our receptivity to new theories?
Post by: Martin J Sallberg on 24/05/2013 10:55:59
What is the difference between bias and expectations based on past experience and learning? Over time one observes how things work in the world, sees patterns, deduces causes and effects, which results in expectations and predictions about what is likely to happen in the future.

From the standpoint of scientific experiments this could be called bias, since researchers do not want any expectations affecting the outcome of the experiment, even if they are reasonable expectations.
From the standpoint of psychology or personality, bias may be a particular type of belief, or an unwillingness to change one's views despite a lot of evidence to the contrary. Or it might be believing in an idea for reasons other than the reason one claims (or may not even be aware of). For example, do some people believe in religion because they genuinely think it is probably or likely to be true? Or do they believe in religion because it offers relief from existential anxiety, i.e. a loving father figure who will guide one through life, intervene during a catastrophic event, and promise life after death? Do some people reject science because they see it as incompatible with religion, and fear that accepting science would require abandoning the comforting ideas that religion offers? I'm not trying to start a religion vs science debate, or implying that all religious people think that way. There are lots of other examples of belief that may be motivated by emotional factors, like political correctness, concern for one's reputation or career advancement, and fear of criticism or ridicule. Another factor might also be respect for the ability or knowledge of other people who hold that belief, which might not be a legitimate reason, but may understandably tip the scales when one is undecided. I expect even physicists do this (Well, if Stephen Hawking believes it....) I also think it is more difficult to accept ideas that are depressing or predict something unpleasant, and that is probably the bias I am most prone to. I want to believe that people are basically good and capable of change. I want to believe in things like free will.

What a long rant. The point is that psychologism predicts that science should be impossible and is thus incompatible with the existence of science, and if science does not exist nothing can be supported or discredited by science.


Some research has shown that different people tend to be more flexible or less flexible in changing their opinions or beliefs. One study showed more similarity in the brains of people who identified themselves politically as ultra conservatives or die hard liberals, than between either of these groups and moderate, swing voters.

Of course thinking is interaction between material particles. If thinking were a separate soul it would be unable to perceive or affect the physical world. But the fact that thinking is associated with brains does not prove fixedness, any more than the ability to learn something new requires an immaterial soul.


Being flexible would seem to be the better option when it comes to learning, eliminating erroneous assumptions, and adapting to change. But as Damocles also pointed out, if one constantly had to revise or start from scratch with each novel experience, it would be difficult to function, and this seems true for individuals as well as collective thinkers like “the scientific community.”

Not start from scratch with every novel experience, just change the theory when an actual falsification comes along.


I don’t abandon my concept of gravity the first time I let go of a helium balloon and it floats up to the ceiling. I look for an explanation that is compatible, because the other evidence for gravity is so compelling. In addition,  overwhelming uncertainty makes it difficult to make choices and act on them, so one tends to maintain certain beliefs despite a few doubts or apparent inconsistencies.

It does, in fact, falsify the claim that everything always falls down. However, the existence of buoyancy makes the example a poor one. For that matter, Newtonian gravity is falsified by the bending of light by gravity. But an actual falsification means having to abandon the theory, even though the new theory must still predict the observations already made that appeared to confirm the old theory, in the specific contexts in which those observations were made.


Ironically, some people or groups might be “biased” towards changing their views or adopting unorthodox opinions. I know individuals who become passionate about a spiritual practice or health regimen for about a year or so, and then abandon it for something else which they believe will be the answer to all their life issues.

And the relevance is?

I’ve also heard it said that grad student theses are abundant with wacky ideas because one does not get as much attention or praise for a new finding that confirms conventional wisdom, as one gets from discovering/proposing something unexpected, contradictory, or revolutionary.

Are you kidding? Established dogma gets lots and lots of attention; it is regurgitated in textbooks ad infinitum.
Title: Re: How does cognitive bias affect our receptivity to new theories?
Post by: dlorde on 24/05/2013 11:53:16
What is the difference between bias and expectations based on past experience and learning?

I'd suggest that, in its derogatory sense, bias is an expectation that is unjustified by the balance of the evidence and critical thought, or an exaggerated expectation. At its worst it's a positive desire or conviction that things should be a certain way, regardless of critical thought and evidence.
Title: Re: How does cognitive bias affect our receptivity to new theories?
Post by: cheryl j on 24/05/2013 14:23:52



What a long rant. The point is that psychologism predicts that science should be impossible and is thus incompatible with the existence of science, and if science does not exist nothing can be supported or discredited by science.



Lol.

Well, my post may have been too long, or not of interest to you, but I don't think it qualified as a "rant." I was not attacking anyone's viewpoint, just sharing a few of my own thoughts about bias and the forms it takes.

I agree with the other posters: just because bias exists doesn't mean its effects can't be accounted for or controlled in experiments. It doesn't mean science is impossible or that science doesn't exist.
The existence of oppositional forces is not a contradiction.



Title: Re: How does cognitive bias affect our receptivity to new theories?
Post by: cheryl j on 24/05/2013 14:57:44
What is the difference between bias and expectations based on past experience and learning?

I'd suggest that, in its derogatory sense, bias is an expectation that is unjustified by the balance of the evidence and critical thought, or an exaggerated expectation. At its worst it's a positive desire or conviction that things should be a certain way, regardless of critical thought and evidence.

I would agree with that definition, although I don't think most people intentionally adopt the position "I'm going to believe in crazy stuff regardless of the facts!" They possess a different set of knowledge/facts, attribute more significance or importance to certain facts, or there are emotional motivations that sway them in one direction or another.

Although not a very scientific example, in the old Pepsi vs Coke taste test, more people said they preferred the taste of Pepsi when blindfolded, but the majority preferred the taste of Coke when they knew which was which. Was it the appealing script of the familiar Coca-Cola logo, nostalgia for the old green Coke bottles, or something else about the product that convinced them they actually liked the taste of Coke better? The marketing and advertising industry is based on taking advantage of people's subconscious bias.

Another example that comes to mind involves the famous Nixon Kennedy debate. People who listened to it on the radio thought Nixon was the winner. People who watched it on television thought Kennedy was more convincing. Researchers recently repeated this in an experiment with university students and got the same results. Which seems to suggest that even though people thought they were basing their decisions on the debaters' arguments and ideas, they were affected by certain visual or auditory cues as well. Nixon had a deeper, more resonating voice. Kennedy had a very nasal sound. But Nixon looked tired, sweaty and unshaven, and Kennedy appeared more polished and confident.
Title: Re: How does cognitive bias affect our receptivity to new theories?
Post by: damocles on 24/05/2013 21:51:26

This is not necessarily the case. It can be very difficult to work out an experiment which will test a theory, and even with an alleged falsification it may not be quite clear where to lay the blame (Google Quine-Duhem problem)

It is not necessary to "know where to put the blame" in order to show that a theory is false. If there is empirical evidence for something that the theory predicts should be impossible, then the theory is false. Period.
***
Quote
This "best guess" will generally involve leaving a number of issues unresolved on the "back burner". For example, when Marie Curie was concentrating her solutions of radium chloride, she arrived at the point where the solutions were glowing and getting quite warm -- emitting about 30 watts of power, but with no measurable change. This was as clear a counter example to the conservation of energy as one would ever have expected to find, but it was left on the back burner because too much else hinged on the conservation of energy. The issue was resolved a few years later when radon gas was discovered in the air above radium salt solutions, and it was recognized that the loss of mass in the radium would have been unmeasurable.

Well, of course falsifications must be real falsifications, not errors of measurement, so that was an invalid example. I was talking about actual falsifications.
***
Whoops!
Title: Re: How does cognitive bias affect our receptivity to new theories?
Post by: damocles on 24/05/2013 22:15:04
From Martin J Sallberg:
Quote
Quote from: damocles on 23/05/2013 15:37:26
My apologies Martin. It is Australian (and North American?) slang for "eccentric", with perhaps a hint of being a pejorative term and of referring to an extreme form of eccentricity.

It was not a dialectal misunderstanding; it was criticism of the term.
Why, Martin? Do you imagine that journal editors do not have a flood of wacky papers coming across their desks all the time? Do you imagine that they should just shrug their shoulders and publish everything? Do you imagine that working scientists do not have a hard enough time keeping up with the literature?

My reading of it is that you have a hidden agenda -- perhaps a piece of your work that is not 'wacky' that has been rejected by an editor, and you are so focussed on that that you are blind to the fact that there are hundreds of genuinely wacky articles being submitted?

Empirically you are showing a lot of the signs of cognitive bias, e.g. dismissing Cheryl's well thought through and fairly harmless post as a "rant", and feeling obliged to have the last word and to criticize every point that is made against you.

Title: Re: How does cognitive bias affect our receptivity to new theories?
Post by: damocles on 24/05/2013 22:33:50
The reason for a "collective judgement" from "the scientific community" is that science has become too large a subject for any one person to master. But different branches of science often rely on the results of experiments, or the dictates of theories, from another specialization. So that atmospheric modellers, for example, are primarily applied mathematicians. But they depend on research in oceanography to obtain a boundary condition for their models, on chemists for reliable measured rates of reaction (and the underlying principles of how to measure them) in order to incorporate chemical processes  into their models, etc.

There is a system that is far from perfect, but generally very reliable, that allows this to happen. Empirically this system can be judged by its productivity in terms of technological spin-offs, predictions that are borne out in practice, etc.
Title: Re: How does cognitive bias affect our receptivity to new theories?
Post by: dlorde on 25/05/2013 01:26:03
Although not a very scientific example, in the old Pepsi vs Coke taste test, more people said they preferred the taste of Pepsi when blindfolded, but the majority preferred the taste of Coke when they knew which was which. Was it the appealing script of the familiar Coca-Cola logo, nostalgia for the old green Coke bottles, or something else about the product that convinced them they actually liked the taste of Coke better? The marketing and advertising industry is based on taking advantage of people's subconscious bias.

Coke made a particularly interesting marketing research snafu with New Coke. They produced the New Coke formula and did the taste tests around America, and found that people overwhelmingly preferred New Coke to classic Coke in blind taste tests. So they bet the company on this wonderful new product. What they hadn't allowed for was that the new sweeter product tasted better in a single sip comparison, but the majority found it too sweet and cloying to drink a whole can or bottle. The less sweet original Coke formula didn't get sickly after a couple of gulps like the new one. That mistake almost put them out of business and nearly left Pepsi with a monopoly; fortunately for them, relaunching 'Classic Coke' was a marketing success...

Quote
Another example that comes to mind involves the famous Nixon Kennedy debate. People who listened to it on the radio thought Nixon was the winner. People who watched it on television thought Kennedy was more convincing. Researchers recently repeated this in an experiment with university students and got the same results. Which seems to suggest that even though people thought they were basing their decisions on the debaters' arguments and ideas, they were affected by certain visual or auditory cues as well. Nixon had a deeper, more resonating voice. Kennedy had a very nasal sound. But Nixon looked tired, sweaty and unshaven, and Kennedy appeared more polished and confident.
It may be of interest that visual cues actually do distract from the subtle interpretation of honesty and dishonesty. It's been found that people who only hear a speech, rather than see and hear the speaker, have a better success rate at picking up lies, evasions, half-truths, etc.
Title: Re: How does cognitive bias affect our receptivity to new theories?
Post by: Martin J Sallberg on 25/05/2013 06:35:59
From Martin J Sallberg:
Quote
Quote from: damocles on 23/05/2013 15:37:26
My apologies Martin. It is Australian (and North American?) slang for "eccentric", with perhaps a hint of being a pejorative term and of referring to an extreme form of eccentricity.

It was not a dialectal misunderstanding; it was criticism of the term.
Why, Martin? Do you imagine that journal editors do not have a flood of wacky papers coming across their desks all the time? Do you imagine that they should just shrug their shoulders and publish everything? Do you imagine that working scientists do not have a hard enough time keeping up with the literature?

My reading of it is that you have a hidden agenda -- perhaps a piece of your work that is not 'wacky' that has been rejected by an editor, and you are so focussed on that that you are blind to the fact that there are hundreds of genuinely wacky articles being submitted?

Empirically you are showing a lot of the signs of cognitive bias, e.g. dismissing Cheryl's well thought through and fairly harmless post as a "rant", and feeling obliged to have the last word and to criticize every point that is made against you.

You are completely ignoring several facts:

#The same evidence can disprove multiple theories, so there is no need to test every theory separately.

#Technically speaking, the word "predictor" should replace both "theory" and "hypothesis". Quibbling about distinctions between "theory" and "hypothesis" totally ignores the fact that no matter how many apparent verifications there are, they are no guarantee against falsification, so the "hypothesis/theory" threshold is arbitrary. The term "predictor", however, properly says what it really is: something that makes predictions.

#Predictors can be tested without being published in a few elitist, select papers.
Title: Re: How does cognitive bias affect our receptivity to new theories?
Post by: Martin J Sallberg on 25/05/2013 06:37:38
The reason for a "collective judgement" from "the scientific community" is that science has become too large a subject for any one person to master. But different branches of science often rely on the results of experiments, or the dictates of theories, from another specialization. So that atmospheric modellers, for example, are primarily applied mathematicians. But they depend on research in oceanography to obtain a boundary condition for their models, on chemists for reliable measured rates of reaction (and the underlying principles of how to measure them) in order to incorporate chemical processes  into their models, etc.

There is a system that is far from perfect, but generally very reliable, that allows this to happen. Empirically this system can be judged by its productivity in terms of technological spin-offs, predictions that are borne out in practice, etc.

It is possible to work collectively without an official viewpoint that ridicules other viewpoints. It is just that all psychologism inherently ridicules its opposition and can therefore not be part of it.
Title: Re: How does cognitive bias affect our receptivity to new theories?
Post by: damocles on 25/05/2013 23:03:44
The reason for a "collective judgement" from "the scientific community" is that science has become too large a subject for any one person to master. But different branches of science often rely on the results of experiments, or the dictates of theories, from another specialization. So that atmospheric modellers, for example, are primarily applied mathematicians. But they depend on research in oceanography to obtain a boundary condition for their models, on chemists for reliable measured rates of reaction (and the underlying principles of how to measure them) in order to incorporate chemical processes  into their models, etc.

There is a system that is far from perfect, but generally very reliable, that allows this to happen. Empirically this system can be judged by its productivity in terms of technological spin-offs, predictions that are borne out in practice, etc.

It is possible to work collectively without an official viewpoint that ridicules other viewpoints. It is just that all psychologism inherently ridicules its opposition and can therefore not be part of it.
I think that I disagree with you here. Let me illustrate with an example: A team of scientists is working on the circulation of air pollutants. They need to be able to insert a value for the rate of a chemical reaction which must be measured indirectly.

There are three teams of scientists who have studied this reaction, each using a different method of determining the rate, and each coming up with a wildly different result. Team A used a method that the referees found fault with, and after the referees' comments were sent to the authors and their replies considered, the journal editor decided not to publish the paper. Teams B and C have managed to get their very different results published. An expert reviewer has reviewed the work and recommended that the value obtained by team C is the most reliable.

The scientists working on the circulation of pollutants need to be able to take the value obtained by team C and insert it into their own study. They do not want to have to evaluate the three pieces of work. It might be only one of twenty such reactions, and they lack the expertise to do this evaluation anyway. Their main concern is to find a reliable value to plug into their circulation model. Life is too short!

Meanwhile the results of team B are there if new evidence leads to a new comparison. The results of team A are, unfortunately, lost forever.

This is how science works. It is, as I have repeatedly been saying, not a perfect vehicle for discovering the way that nature works, but it has been very fruitful.

I think that perhaps Martin has a very idealized vision of the way that science "ought to be" -- perhaps defining "science" as "knowledge of nature" -- whereas my vision is of "science" as it "is": a "warts and all" methodology that has proved to be very fruitful.
Title: Re: How does cognitive bias affect our receptivity to new theories?
Post by: Martin J Sallberg on 26/05/2013 06:25:12
It is either absolutely objective science, or just a belief system among other belief systems. Geocentrism was also "fruitful" in that it could predict most of the motion of celestial bodies and left the anomalies on the "back burner".
Title: Re: How does cognitive bias affect our receptivity to new theories?
Post by: yor_on on 31/05/2013 19:10:49
Science has a bias :) towards getting information that is as accurate as possible. And it is conservative, but not impossible to change, although the more 'off the mainstream' one's ideas are, the slower the ship will turn, and one will need experiments proving it, at least to me. Being objective in an ideal manner is not something I expect to be possible, as we're all human.