Why do people believe what they believe?

  1. Why do people believe what they believe? Can anyone direct me to studies of this? I'm not interested in Kant or other philosophers, case studies would be more like it.
  3. Why do YOU believe what you believe?
  4. Unpublished theories are not allowed in physicsforums.
    Last edited: Apr 11, 2013
  5. Ryan_m_b

    Staff: Mentor

    This thread is far too vague, you'll have to be more specific. Are you asking about religious belief?
  6. Bobbywhy

    Bobbywhy 1,864
    Gold Member

  7. Also, people are told what to believe by moral entrepreneurs. It might be an idea to study that a bit more closely.
  8. A lot of it is mob mentality. If enough people around you believe in the same things as you and reflect that back to you, then you're more likely to retain that belief.
  9. That could work quite well. I'm more interested in why people believe in commonly accepted, mainstream things, though the dynamics could be pretty much the same. The same author's "The Believing Brain" is probably more like I would want. I assume that Shermer gives references, right? I can trace those down and that should do the trick.

    "The rest of the book is more information about the author's personal beliefs, pet peeves, etc. Interestingly, when discussing theories he is critical of, the author holds studies to a very high standard, but when discussing his own theory, he references studies and concepts that often do not reach the same level of rigor." Yep. Well, if his own theories didn't apply to him, wouldn't that be a weakness?
  10. That's still kind of broad, because there are so many ways to come at this.

    For example, I've recently been reading a lot of books on probability and randomness, which suggest that people's inability to cope with randomness leads them to believe things that aren't true or find patterns that aren't there (pareidolia).

    Then you've got basic cognitive biases (See here for a great website and book: http://59ways.blogspot.com/p/index-of-unnatural-acts-that-can.html)

    I think the area of decision making is related, since decisions tend to be based on people's beliefs. We tend to make decisions based on emotional connections and then use "logic" to justify the belief. (This explanation is a tad simplistic, but I would say that's the idea.)

    "How We Decide" (I haven't read it, but I've heard interviews with the author) touches on this area.


    Cognitive dissonance (holding two conflicting beliefs simultaneously) is another interesting area. A book popped up in relation to the last one, though I don't know anything about it.


    I think the list could go on. But maybe these topics can narrow it down for you.

    -Dave K
  11. Bobbywhy


    May I ask, what is the source of the above quotation you posted? Thank you.

  12. It is from an Amazon review of "The Believing Brain."
  13. I'd be particularly interested in cases where people say one thing and do another. There must be at least one clever psychology experiment or study out there that covers this. The "cognitive dissonance" studies are close, but not quite what I'm looking for. It would be more a question of what happens when people can profit by secretly defecting from group norms.
  14. Bobbywhy


    ImaLooser, Thank you for that clarification. Unfortunately, that quote was NOT from a review of the book that that link referred to. The quote you posted was from a review of “The Believing Brain: From Ghosts and Gods to Politics and Conspiracies---How We Construct Beliefs and Reinforce...” by Michael Shermer. That was the second book at that Amazon website.

    My reference was to the first book on that Amazon website entitled: "Why People Believe Weird Things: Pseudoscience, Superstition, and Other Confusions of Our Time" by Michael Shermer and Stephen Jay Gould. Sorry I did not specify that title in my first post.

  15. Bobbywhy


    As for folks who say one thing and do another, this behavior is common among psychopaths. In politics, for instance, envision the suave manipulator who makes all types of promises, depending on who’s listening. Some politicians will say anything, regardless of their core beliefs, just to get others to vote for them.

    The psychopath’s need for absolute power over others, and occasionally the wish to inflict pain for the enjoyment of watching others suffer, are almost never apparent to the casual observer, because disguise is another core psychopathic trait: these individuals usually mask themselves as good-natured people. Goal-oriented deceitfulness, superficial charm, an outward friendly appearance, and a lack of remorse are further traits that help them achieve their goals. Power is the political aphrodisiac, and psychopaths of tremendous wealth will definitely use that wealth to further their objectives, often by creating a charitable or humanitarian front organization as part of the mask.

  16. But that seems more behavioral. You originally asked about why people believe the things they do, not why they behave the way they do. Obviously people act out of their own self-interest: they say one thing because they profit from saying it, and they do another thing because they profit from doing it. Is that such a tough nut to crack? I don't see anything interesting there.

    Cognitive dissonance - actually believing (or more generally cognizing, feeling, etc.) contrary notions at the same time - is significantly less trivial. It's about self-interest as well, but at a deeper level, where we have to be able to hold contradicting notions in order to function.

    -Dave K
  17. The trouble with the "everyone does everything out of self-interest" theory is that there is no definition of self-interest. The only way out is a circular definition in which we observe what people do and say retroactively that this is their self-interest. That makes the theory tautological, with no predictive value. Without predictive value it is both useless and unfalsifiable: it is a definition, not a scientifically testable hypothesis. Defining self-interest as strictly economic is testable, but that definition is easily found to fail in many situations.

    I thought the same of cognitive dissonance until I read the Wikipedia page, which I found very interesting and would recommend. According to that article, it is more about lying to yourself than about "doublethink." So this is very much what I'm looking for, since there are experiments and studies.
  18. There is a gray elephant in your closet. You can choose to believe me or not, and yet it could be true or not. We think in terms of probability and make that our truth. Knowing is separate from belief, and yet every vibration of existence moves. In every moment the universe and all of infinity is different and ever-changing. If you really want the answer you must ask the question of yourself, and if we really did that, would we fall into a silence and never speak again? For humankind there are as many beliefs as there are people.
  19. julian

    julian 426
    Gold Member

    Psychologists thought that humans were for the most part rational; experiments over the past couple of decades have contradicted this. I like the book "You Are Not So Smart" by David McRaney. The full title of the book sounds a bit trashy, but each chapter is based on psychological research, and reviewers include a clinical psychologist who says it is accurate.

    Also relevant to "Why do people believe what they believe" is the interesting topic of self-deceit, a subject approached by Robert Trivers from an evolutionary perspective.
    Last edited: Jul 14, 2013
  20. Pythagorean

    Pythagorean 4,471
    Gold Member

    Human belief systems are essentially a matter of comparing an internal model to external observations.

    Via Piaget, there are basically two ways people collect information. Mostly, it's assimilation: people take new information and work it into their established model. Less typically, and often in association with a negative emotional state, people will accommodate: change their internal model to better represent the new information.

    Changing your whole internal model takes a lot of emotional energy; you have to override a lot of thinking habits. Assimilation therefore takes the least energy in information acquisition, which is why confirmation bias is such a typical outcome of it.
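    A toy sketch of the assimilate-versus-accommodate distinction described above (purely illustrative; the threshold rule, the rates, and the function name are invented here, not taken from Piaget or the psychology literature):

```python
# Hypothetical toy model: belief updating as comparing an internal model
# to observations. All numbers and names are invented for illustration.

def update_belief(model, observation, tolerance=2.0, assimilation_rate=0.1):
    """Assimilate when the observation roughly fits the model;
    accommodate (a large, costly revision) when it strongly conflicts."""
    error = observation - model
    if abs(error) <= tolerance:
        # Assimilation: small, cheap adjustment of the existing model.
        return model + assimilation_rate * error, "assimilated"
    # Accommodation: restructure the model toward the surprising data.
    return model + 0.8 * error, "accommodated"

model = 10.0
for obs in [10.5, 11.0, 9.8, 25.0]:  # the last observation strongly conflicts
    model, mode = update_belief(model, obs)
    print(f"obs={obs:5.1f} -> model={model:6.2f} ({mode})")
```

    In this caricature, small prediction errors get folded into the existing model cheaply, while a large conflict forces a big revision — mirroring the point that assimilation is the low-energy default.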

    You also don't want to be accommodating (significantly changing your internal model) every day, obviously, or the world would seem unstable and inconsistent to you, which is probably not a very healthy mental perspective to have.

    So when are internal models constructed? What roles do genetics, development, and parental guidance play in the shape and structure of those models? Which models are more general in the first place, and thus more conducive to new information? These are probably all open questions with some insights. A meta-example: psychologists now use biopsychosocial models, such as the diathesis-stress model. Rather than bickering over nature versus nurture, it has become accepted that many factors can play significant roles simultaneously. In other words, the psychology discipline has accommodated its model to the diversity of modern evidence.

    I think the scientific community, in general, requires a lot of accommodating.