An a priori justified?

  1. Jul 30, 2010 #1
    What justifies an a priori statement? If not experience, then what else is left?
    I mean, I get the example of counting numbers by always adding one more to the last number, so that we arrive at a concept called "infinity". How is that an example of the a priori?

    I'm not getting the big picture here. :cry:
  3. Jul 30, 2010 #2
    So, nobody knows what an a priori is? Or maybe I don't know... OK let's see:
  4. Jul 30, 2010 #3
    Well, okay, understanding what your idea of a priori knowledge is would help. If you're talking in terms of mathematics, then a priori knowledge tends to mean knowledge that is gained by deduction rather than by empirical evidence.

    So, for example, with math, you can think about having two pencils in front of you now and, when Bob brings you three more pencils, you can deduce that you'll then have five. You've arrived at that by way of deduction and don't actually have to place two pencils in front of you and then add another three to know how many you'll have then.

    The other use for a priori, though, is "innate" or "inborn" knowledge. There is an argument, which I think is rather flawed, asserting that mathematics is true without reference to reality: that the knowledge of mathematics -- as opposed to the knowledge created by mathematics -- is known without reference to reality.
  5. Jul 30, 2010 #4
    Deduction of what, if not empirical evidence?

    I will not know whether he brought me 5 pencils in total until I see them all laid out together.

    I have a feeling that 'a priori' is, in other words, a capacity of ours rather than something captured by ambiguous terms such as 'innate' or 'inborn'. But how do we measure this capacity?
  6. Jul 30, 2010 #5
    Of definition. If I play an A minor triad on a piano, you know that the notes I played were A-C-E. You don't have to walk over and look at my fingers because by definition an A minor triad consists of A-C-E.

    Granted. But we're going on the assumption that you have two and he's bringing you three. Can you conceive of a scenario where you have two pencils, and someone brings you three, and yet the pencils he brings and the ones you have somehow add up to six?
  7. Jul 31, 2010 #6
    I wouldn't consider either of those cases a priori things. Each is, really, deduced from empirical evidence: you're taking an event (data input) and expecting a pattern that has emerged to continue. You just happen to have a very limited data set... Or maybe I have the idea of an a priori statement mixed up with something else.

    Remove the association of an innate 'truth' from the definition of the thing, and what you have left is a statement that is taken to be true without the gathering of any evidence whatsoever. Mathematical proofs come to mind. You can derive something from them that is fully functional within a mathematical construct that adheres to certain rules; otherwise, the reasoning (the proof) fails. It just so happens that the math could also be derived from a bunch of data (or not, but the line blurs when you get to applications of any kind of math, e.g. physics...).

    It seems silly to have two strict definitions of what is a priori and what isn't, though. They inherently complement each other (e.g. very simple kinematics).
  8. Jul 31, 2010 #7
    Now for me a priori is more or less synonymous with 'by definition', because that just seems like a more useful way to apply the phrase. But this caught my interest:
    My understanding of empiricism is that it refers solely to things which we observe through our senses. But you've related it to 'data-input'.

    So let's take my perceptions. I know (i.e. have stored data) that I am perceiving things right now - that is, I don't just know what I see, but I know that I see. Do I perceive that I perceive? Or, do I 'just know' that I have perceptions? Or something else?

    I don't presume to have the answer to this question, btw
  9. Aug 1, 2010 #8
    I don't think the question has no answer; rather, it's just too complicated for us to fully comprehend at the moment.

    This unknown would, of course, become a bit clearer if one of the amazing scientists around here would kindly explain how in the world our brain functions!

    Going back to Willow's quote (Kemp's definition): the data I would consider relevant to this very broad and vague 'a priori' notion (a fault of most non-modern, and a lot of modern, "philosophy") is mostly separated from human senses. You point out that we ultimately need to interpret some group of data in some way. So why is something commonsense disregarded?

    I think that the idea is not to remove the human factor, but to remove those commonsense, and usually terribly wrong, ideas from the (a priori) statement. I think, by saying 'empirical evidence' I made things a little confusing. What I meant was to only consent to using data in formulating some (temporary) conclusion so long as it can be quantified. Can you mathematically represent it? Unfortunately I feel that this explanation is also a bit vague, but hopefully someone comes along with a better idea.

    This whole a priori business really starts to become important when put into the context of the decisions made by people in power (politicians). The Senate would have dealt with the healthcare issue more promptly (and more comprehensively, too!) if its members had understood the information brought to them. Sadly, what we do end up getting is a bunch of unfounded judgments and plenty of unnecessary and uncompromising partisanship. But this is only one little example.
  10. Aug 1, 2010 #9
    Such a scenario is quite plausible when applied, for instance, to counting clouds. They can easily merge together, in which case the total number of clouds is not equal to the sum of the individual counts.
  11. Aug 1, 2010 #10
    Counting rabbits or mice - or, for that matter, bacteria - is another example. They don't add up, they multiply!
  12. Aug 1, 2010 #11
    An a priori statement is not a statement about the world; it is an expansion of an already known concept. We don't need data to back it up, because the statement doesn't refer to something which makes sense to back up empirically.

    One classical example of an a priori conclusion is that all bachelors are unmarried. This does not refer to the bachelors themselves, but rather to the state of being a bachelor. The state of being a bachelor is exactly that of being an unmarried man. So the a priori conclusion is a tautology, it simply cannot be false. We need no evidence to back that up.
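    To make the "true by definition" point concrete, here is a minimal sketch in the Lean proof assistant (the names Person, isBachelor, and the Bool fields are my own illustration, not anything from the thread): once "bachelor" is defined as an unmarried man, the statement that bachelors are unmarried is proved by simply unpacking the definition, with no evidence gathered about any actual bachelor.

        -- A minimal sketch in Lean 4; names are illustrative only.
        structure Person where
          male    : Bool
          married : Bool

        -- "Bachelor" is defined as an unmarried man.
        def isBachelor (p : Person) : Prop :=
          p.male = true ∧ p.married = false

        -- The conclusion follows from the definition alone: no inspection of
        -- any particular person is needed.
        theorem bachelors_are_unmarried (p : Person) (h : isBachelor p) :
            p.married = false :=
          And.right h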

    When we are counting clouds, the a priori conclusion we arrive at depends on the logical picture we have of the situation. We quantify clouds in our mind, and the logical conclusions only hold to the degree clouds can be quantified. The conclusions based on a physical model are always a priori, but the question of whether the conclusions correspond to reality is the same question as whether the model actually represents reality.

    So we can say that all statements about the world are a posteriori, while a statement deduced from a logical picture is a priori, but the logical picture of a situation does not necessarily correspond to the situation itself.
  13. Aug 1, 2010 #12
    If I understand you correctly:
    Mathematics is true without reference to reality! All mathematical conclusions are arrived at through logical proof, and all mathematical inferences therein are in reference to the given axioms, not reality.

    Whether or not the axioms can be used in a given physical model is not relevant to mathematics itself, but to the use of mathematics in physics. What we deduce from the axioms does not depend on the physical situation they may represent.
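    As a small illustration of this point (my own sketch, not something from the thread): in the Lean proof assistant, 2 + 3 = 5 is certified purely by unfolding the definitions of the numerals and of addition on the natural numbers. No pencils are consulted.

        -- The statement reduces to a reflexive equality once the definitions of the
        -- numerals and of Nat addition are unfolded; nothing empirical is used.
        example : 2 + 3 = 5 := rfl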
  14. Aug 1, 2010 #13
    So, concepts are unrelated to the world? Does this lead us to some Platonism?

    "...the logical picture of a situation does not necessarily correspond to the situation itself" How can you tell?
  15. Aug 1, 2010 #14
    They're mechanical pencils, and one has twice the lead, and all you care about is how much lead you have to write with.

    Maybe you were thinking of wooden pencils and the textual definition of pencil as an object, not based on logistics.

    Either way, the point is that you have to qualify it. There would be no way ever to conceive of math if we couldn't empirically experience a world where matter is conserved. 2 pencils + 3 pencils = 5 pencils would be as meaningful as 2 pencils + 8 pencils = -3 pencils.

    My conclusion: purely quantitative descriptions are meaningless, while purely qualitative descriptions are ambiguous. There's a harmony between the two.
  16. Aug 1, 2010 #15
    We think like this all the time. What happens when you are imagining a move in the game of chess? You are drawing necessary conclusions based on the rules of the game. You don't need the actual game to do this, but does that mean the game you are thinking of must be located in a platonic realm? No, it does not; this has nothing to do with platonism.
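    For a concrete sketch of "drawing necessary conclusions based on the rules" (the function and square names below are my own illustration, not anything from the thread): given only the rule for how a knight moves, we can work out its legal destinations from any square without ever setting up a physical board.

        # A minimal sketch, assuming standard chess coordinates "a1".."h8":
        # the legal destinations of a knight follow from the movement rule alone.
        FILES = "abcdefgh"

        # The knight's rule: one square in one direction, two in the other.
        OFFSETS = [(1, 2), (2, 1), (2, -1), (1, -2), (-1, -2), (-2, -1), (-2, 1), (-1, 2)]

        def knight_moves(square: str) -> list[str]:
            """Return every square a knight on `square` could reach on an empty board."""
            file_idx = FILES.index(square[0])
            rank = int(square[1])
            moves = []
            for df, dr in OFFSETS:
                f, r = file_idx + df, rank + dr
                if 0 <= f < 8 and 1 <= r <= 8:
                    moves.append(FILES[f] + str(r))
            return moves

        print(knight_moves("g1"))  # ['h3', 'e2', 'f3'], deduced from the rule, not from a board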

    Experience shows us if our way of thinking of a situation is reasonable or not. We can easily count stones, but we are always careful when we are counting drops of water.
  17. Aug 2, 2010 #16
    Precisely. The whole point is that once you have a working definition of something, you can infer things based on that definition. The first paragraph doesn't really disrupt this, because in that case 'pencil' means 'a certain amount of graphite'. Same with the example with clouds earlier in this thread - you either define 'cloud' as a certain amount of vapor, or you just ask how many there are before they merge.

    A lot of us are talking past each other in this thread. I'll say that I don't deny that all of our knowledge has an empirical origin in principle. But as soon as you start using language, you open the door to analytic statements. And you can go quite a ways just on definition, as mathematics shows.
  18. Aug 2, 2010 #17
    So then your stance is that your definition is based on how it's written in a dictionary, based on the text, not on the meaning the author was trying to convey with it? And that the textual definition is the 'right' definition? (see: http://en.wikipedia.org/wiki/Textualism)

    What if two clouds ARE merging when you freeze the frame? At what percentage of overlap do you consider them one cloud? Do you think the distinction you draw is universal?
  19. Aug 2, 2010 #18
    A large pencil can contain twice the amount of graphite, but we still consider it one pencil.

    In language, definitions are fleeting, and only as precise as the situation requires. Definitions are commonly a way of describing how a word is used; the usage does not flow from the definitions, but the other way around.

    We don't need a definition of 'cloud' to know what a cloud is, but we do need to know what a cloud is in order to define one.
  20. Aug 2, 2010 #19
    I mean whatever definition we agree on. I defaulted to the textual example of 'pencil' because the original context didn't seem to imply anything else.

    I don't think I clearly understand you. Are you saying that things inferred from definition ('a priori') won't be clear in a discussion between two people, or that we can't make real-world inferences about things because they may not fit our definition? Or am I missing the boat entirely here?
  21. Aug 2, 2010 #20
    How could I play a game while not playing by the rules?