Causal inference developed by Pearl

  • Thread starter Demystifier
  • #1
Demystifier
Science Advisor
Insights Author
Gold Member
I have some questions about causal inference developed by Pearl. For a background see e.g. the thread https://www.physicsforums.com/threads/the-causal-revolution-and-why-you-should-study-it.987205/
and wikipedia page https://en.wikipedia.org/wiki/Causal_model

Scientists think in causal terms intuitively, even without knowing the formal theory of causal inference. Is there an example of an actual scientific problem that scientists couldn't solve with intuitive causal thinking, but did solve using formal causal inference?
 
  • #2
Demystifier said:
I have some questions about causal inference developed by Pearl. For a background see e.g. the thread https://www.physicsforums.com/threads/the-causal-revolution-and-why-you-should-study-it.987205/
and wikipedia page https://en.wikipedia.org/wiki/Causal_model

Scientists think in causal terms intuitively, even without knowing the formal theory of causal inference. Is there an example of an actual scientific problem that scientists couldn't solve with intuitive causal thinking, but did solve using formal causal inference?
I think that the real problem is that scientists and everyone else use intuitive causal thinking when there is no valid reason to. Formal causal methods may be safer to use to avoid false conclusions about causality.

Example: There are innumerable examples where someone will say (or imply) something like "better sleep gives you better health" based on statistical evidence. But they do not say how they adjusted for the possibility that people in poor health may not sleep well. So the causality may be the opposite of what they are implying. The same thing occurs with exercise, social interaction, etc. "causing" better health or longer life.
I see examples like that all the time.
 
  • Like
Likes Demystifier
  • #3
FactChecker said:
But they do not say how they adjusted for the possibility that people in poor health may not sleep well. So the causality may be the opposite of what they are implying.
Yes, but the point is that you recognized this possibility without using formal causal inference. You, as well as scientists in general, are aware of this intuitively. So how exactly does formal causal inference help in the "real life" of a scientist?
 
  • Like
Likes FactChecker
  • #4
What about the cases in math where a pattern is found and a conjecture made that a counterexample disproved later on?
 
  • #5
jedishrfu said:
What about the cases in math where a pattern is found and a conjecture made that a counterexample disproved later on?
Mathematicians resolved it without using Pearl's method of causal inference.
 
  • #6
Demystifier said:
Yes, but the point is that you recognized this possibility without using formal causal inference. You, as well as scientists in general, are aware of this intuitively. So how exactly does formal causal inference help in the "real life" of a scientist?
Good point. I am not an expert on this so I don't know if there are examples of practical cases where formal causal inference really paid off.
I am not as confident as you seem to be that scientists intuitively recognize the error of assuming that a particular correlation that they have established implies causation. I think they often jump to an unproven conclusion.
 
  • #7
Demystifier said:
I have some questions about causal inference developed by Pearl. For a background see e.g. the thread https://www.physicsforums.com/threads/the-causal-revolution-and-why-you-should-study-it.987205/
and wikipedia page https://en.wikipedia.org/wiki/Causal_model

Scientists think in causal terms intuitively, even without knowing the formal theory of causal inference. Is there an example of an actual scientific problem that scientists couldn't solve with intuitive causal thinking, but did solve using formal causal inference?
Interesting. I have seen the math in this called structural equation modeling. I never heard of it in terms of causality. So I am aware of the method, but not the causal interpretation.
 
  • #8
FactChecker said:
I am not as confident as you seem to be that scientists intuitively recognize the error of assuming that a particular correlation that they have established implies causation. I think they often jump to an unproven conclusion.
Often they do jump, of course. But later other scientists often correct them.
 
  • Like
Likes FactChecker
  • #9
Demystifier said:
Often they do jump, of course. But later other scientists often correct them.
I am probably not being fair. I do not have the time or background to read and understand the scientific publications. It is the general public news reports that I am complaining about.
 
  • Like
Likes Demystifier
  • #10
Demystifier said:
Scientists think in causal terms intuitively, even without knowing about formal theory of causal inference.
Do you agree that P(A | B) is a causal relationship?
That is to say, P(A) given P(B) is a mathematical model of dependence, with a before/after status and causality?
 
  • #11
AngleWyrm said:
Do you agree that P(A | B) is a causal relationship?
No, I do not agree. If P(A|B) > P(A), then it is often also true that P(B|A) > P(B). So which is causing the other? Or did neither cause the other and something else caused them both? Probability relationships alone should not be mistaken for proof of a causal relationship.

Suppose A="a person has long arms" and B="a person has long legs". You can not say that the long arms caused the long legs, or that the long legs caused the long arms. Something else caused them both.
In fact, I see this mistake all the time: "If you sleep well then you will be healthier." But also people who are sick and in pain can not sleep as well as people who are healthy. So which is causing the other?
Also: "If you exercise more, then you will live longer." But people who are sick have trouble exercising, so which causes the other?
 
  • Like
Likes Demystifier
  • #12
FactChecker said:
No, I do not agree. If P(A|B) > P(A), then it is often also true that P(B|A) > P(B). So which is causing the other?
Then it should be possible to say that one of these two conditions holds:
  1. P(A|B) > P(A) AND P(B|A) > P(B), the above stated scenario
  2. the relationship measured in #1 isn't the case
Since the statement in #1 is a logical AND operation, there are four possible outcomes. So it can be represented as a Venn diagram:

[Attachment: Venn diagram of the four outcomes]


Example case
Two variables A and B are claimed to have P(A)=0.90 and P(B)=0.20. In actuality, they merely start out that way: it's a simulation of drawing cards from a deck, so the results alternately accumulate a causal effect on each other.

500 Observations | B   | not B
A                | 104 | 340
not A            | 5   | 51

Expected     | B (0.20)                   | not B (0.80)
A (0.90)     | 0.9 × 0.2 = 0.18 (90/500)  | 0.9 × 0.8 = 0.72 (360/500)
not A (0.10) | 0.1 × 0.2 = 0.02 (10/500)  | 0.1 × 0.8 = 0.08 (40/500)

Chi-squared test for independence: 8.8139 > 3.8415 (the critical value at the 5% level), so the observations are inconsistent with independence at 95% confidence.
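For readers who want to check the arithmetic, the chi-squared statistic above can be reproduced directly from the observed and expected counts in the tables (the counts are taken from this post; the code is only an illustrative check):

```python
# Reproduce the chi-squared statistic from the 2x2 table above.
# Observed counts, reading the table row by row (A&B, A&notB, notA&B, notA&notB),
# and expected counts under independence with the claimed marginals
# P(A)=0.9, P(B)=0.2, scaled to 500 observations.
observed = [104, 340, 5, 51]
expected = [90.0, 360.0, 10.0, 40.0]

chi2 = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
print(round(chi2, 4))  # 8.8139
```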
 
Last edited:
  • #13
AngleWyrm said:
Do you agree that P(A | B) is a causal relationship?
No. It is a quantity, not even a relationship, let alone a causal one.
 
  • Haha
Likes Demystifier
  • #14
AngleWyrm said:
Then it should be possible to say that one of these two conditions holds:
  1. P(A|B) > P(A) AND P(B|A) > P(B), the above stated scenario
  2. the relationship measured in #1 isn't the case
Yes, either #1 is true or it is not.
AngleWyrm said:
Since the statement in #1 is a logical AND operation, there are four possible outcomes. So it can be represented as a Venn diagram:

[Attachment: Venn diagram of the four outcomes]

Example case
Two variables A and B are claimed to have P(A)=0.90 and P(B)=0.20. In actuality, they merely start out that way: it's a simulation of drawing cards from a deck, so the results alternately accumulate a causal effect on each other.
What does "causal effect" mean?
AngleWyrm said:
500 Observations | B   | not B
A                | 104 | 340
not A            | 5   | 51

Expected     | B (0.20)                   | not B (0.80)
A (0.90)     | 0.9 × 0.2 = 0.18 (90/500)  | 0.9 × 0.8 = 0.72 (360/500)
not A (0.10) | 0.1 × 0.2 = 0.02 (10/500)  | 0.1 × 0.8 = 0.08 (40/500)

Chi-squared test for independence: 8.8139 > 3.8415 (the critical value at the 5% level), so the observations are inconsistent with independence at 95% confidence.
That doesn't say anything about causality. Statistical or probabilistic dependence does not, alone, imply anything about causality. There are simple counterexamples to any conclusion you try to state.
If a man has a short right arm it tends to imply that he has a short left arm, but it does not cause it. If the right arm is chopped off, the left arm does not fall off.
 
  • #15
I guess we also have to filter out the issue of lurking variables before we get to any potential causality.
An increase in consumption of ice cream is followed by an increase in skin cancer rates, or in the number of shark attacks. And we may have perfect correlation without causation: between length in feet and length in meters.
 
  • Like
Likes FactChecker
  • #16
WWGD said:
I guess we also have to filter out the issue of lurking variables before we get to any potential causality.
Yes, and I think that is what they try to do in a "causal model". They try to hold all other variables in set K constant.
##P(Y|X, K=k) \gt P(Y|K=k)##
But there is a lot to do to make that work correctly. I think it might just be a theoretical exercise, but I am not familiar with this subject.
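A minimal numeric sketch of that stratification idea (the counts below are invented for illustration, not real data): a confounder K drives both X and Y, so marginally P(Y|X) > P(Y), but within each stratum of K the dependence disappears.

```python
# Toy dataset: counts[(k, x, y)] = number of observations with K=k, X=x, Y=y.
# Within each stratum of K, X and Y are independent by construction; K alone
# produces the marginal association between X and Y.
counts = {
    (1, 1, 1): 64, (1, 1, 0): 16, (1, 0, 1): 16, (1, 0, 0): 4,   # K = 1
    (0, 1, 1): 4,  (0, 1, 0): 16, (0, 0, 1): 16, (0, 0, 0): 64,  # K = 0
}

def p(pred, given=lambda k, x, y: True):
    """Conditional relative frequency P(pred | given) over the counts."""
    num = sum(n for (k, x, y), n in counts.items() if given(k, x, y) and pred(k, x, y))
    den = sum(n for (k, x, y), n in counts.items() if given(k, x, y))
    return num / den

p_y = p(lambda k, x, y: y == 1)                                    # 0.50
p_y_given_x = p(lambda k, x, y: y == 1, lambda k, x, y: x == 1)    # 0.68
p_y_given_k1 = p(lambda k, x, y: y == 1, lambda k, x, y: k == 1)   # 0.80
p_y_given_xk1 = p(lambda k, x, y: y == 1,
                  lambda k, x, y: x == 1 and k == 1)               # 0.80

print(p_y, p_y_given_x)          # marginal: X raises P(Y) from 0.50 to 0.68
print(p_y_given_k1, p_y_given_xk1)  # within K=1: X raises nothing (0.80 vs 0.80)
```

Holding K fixed removes the apparent effect of X on Y entirely, which is exactly the pattern one would expect if K, not X, is the cause.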
 
  • Like
Likes WWGD
  • #17
FactChecker said:
What does "causal effect" mean?
Given P(cancer | smoker) > P(cancer) -- cancer is more prevalent in smokers -- does that declare smoking causes cancer? Not quite yet, because it's also possible P(smoker | cancer) > P(smoker) -- smokers are more common in cancer victims. Simpson's Paradox.

This leads to two classifications:
  1. P(cancer|smoker)>P(cancer) AND P(smoker|cancer)>P(smoker)
  2. #1 is false
Thus P(cancer|smoker)>P(cancer) XOR P(smoker|cancer)>P(smoker) asserts causality and selects culprit
 
  • #18
AngleWyrm said:
Thus P(cancer|smoker)>P(cancer) XOR P(smoker|cancer)>P(smoker) asserts causality and selects culprit
Do you have a professional scientific reference that supports this claim?
 
  • Like
Likes FactChecker
  • #19
Dale said:
Do you have a professional scientific reference that supports this claim?
I believe at this stage, causation is more a philosophical matter than a scientific one.
 
  • #20
AngleWyrm said:
Thus P(cancer|smoker)>P(cancer) XOR P(smoker|cancer)>P(smoker) asserts causality and selects culprit
No. You have to give up the idea that simple probabilities alone can give a conclusion about causality. They are not the same. You CAN say that one is a good predictor of the other, but that is not the same as physically causing the other.
 
  • #21
FactChecker said:
No. You have to give up the idea that simple probabilities alone can give a conclusion about causality. They are not the same. You CAN say that one is a good predictor of the other, but that is not the same as physically causing the other.
I'd like something more than a verbal assertion of falsehood on my claim that causality is a demonstrable subset of two dependent variables.
 
  • #22
AngleWyrm said:
Thus P(cancer|smoker)>P(cancer) XOR P(smoker|cancer)>P(smoker) asserts causality and selects culprit
This XOR condition can never be true except in the degenerate case where one probability (P(smoker) or P(cancer)) is zero. Suppose that ##P(C|S)\gt P(C)## and that neither ##P(C)## nor ##P(S)## is zero.
By Bayes' theorem, ##P(S|C)=\frac{P(C|S)P(S)}{P(C)} \gt P(S)##.
 
Last edited:
  • #23
AngleWyrm said:
I'd like something more than a verbal assertion of falsehood on my claim that causality is a demonstrable subset of two dependent variables.
I have already given some examples of the falsehood of your claim. That should be enough, but you seem to be missing my point.
 
  • Like
Likes Dale
  • #24
FactChecker said:
Suppose that ##P(C|S)\gt P(C)## and that neither ##P(C)## nor ##P(S)## is zero.
By Bayes' theorem, ##P(S|C)=\frac{P(C|S)P(S)}{P(C)} \gt P(S)##.
I think that this may illustrate the difference between S causing C versus S increasing the probability of C.
In probability, if S increases the probability of C ( i.e. ##P(C|S)\gt P(C)##), then it follows that C increases the probability of S (i.e. ##P(S|C)\gt P(S)##). That is not like "causation", you would not say that S causing C meant that C causes S. So there is a real fundamental difference between probabilistic inference and causation.
 
  • #25
WWGD said:
I believe at this stage, causation is more a philosophical matter than a scientific one.
So then it isn’t a valid topic of discussion
 
  • #26
AngleWyrm said:
I'd like something more than a verbal assertion of falsehood on my claim that causality is a demonstrable subset of two dependent variables.
You already received that. However, it is important to understand that this is not how this forum is supposed to work. When you make a claim the onus is on you. You are the one who needs to show that your claim is correct, not the other way around.

Please PM me with a reference that supports your claim. I don’t think that one exists.
 
  • Like
Likes FactChecker
  • #27
Suppose A causes B. You would not say that B causes A.
But the probabilities do something like that. Because ##P(A|B)P(B) = P(A\cap B) = P(B|A)P(A)##, we have that ##\frac{P(A|B)}{P(A)} = \frac{P(B|A)}{P(B)}##. So if B increases the probability of A then also A increases the probability of B by the same factor. Clearly, using an increase in probability as the sole evidence of causation is wrong. In fact, there is no way to use these conditional probabilities to rule out some other thing being the physical cause of both A and B.
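The symmetry can be checked numerically with any joint distribution (the values below are arbitrary illustrations, not data):

```python
# Arbitrary illustrative joint distribution over two events A and B.
p_ab = 0.12            # P(A and B)
p_a, p_b = 0.30, 0.50  # marginals P(A), P(B)

p_a_given_b = p_ab / p_b   # P(A|B) = 0.24
p_b_given_a = p_ab / p_a   # P(B|A) = 0.40

# The "probability-raising" factor is identical in both directions (both ~0.8
# here), so probability raising alone cannot single out a causal direction.
print(p_a_given_b / p_a, p_b_given_a / p_b)
```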
 
Last edited:
  • #28
Demystifier said:
I have some questions about causal inference developed by Pearl. For a background see e.g. the thread https://www.physicsforums.com/threads/the-causal-revolution-and-why-you-should-study-it.987205/
and wikipedia page https://en.wikipedia.org/wiki/Causal_model

Scientists think in causal terms intuitively, even without knowing the formal theory of causal inference. Is there an example of an actual scientific problem that scientists couldn't solve with intuitive causal thinking, but did solve using formal causal inference?
I don't know of specific papers off the top of my head, but it is used in fields like epidemiology and behavioral sciences as far as I am aware.

Basically, if your model is not set up properly then you may not "measure" what you want to. So you can encode your model as a causal graph and then check it. These are usually probabilistic models, not the deterministic models we rely on for much of physics, so it doesn't really apply in that domain. There is a good discussion of why on Sean Carroll's podcast with Pearl, although Carroll seems to be working on something using this framework.

https://www.preposterousuniverse.com/podcast/2022/05/09/196-judea-pearl-on-cause-and-effect/

I am no expert, though, so take the above with a grain of salt.
 
  • #29
WWGD said:
I believe at this stage, causation is more a philosophical matter than a scientific one.
When we say that smoking causes cancer, is it philosophy or science?
 
  • #30
Demystifier said:
When we say that smoking causes cancer, is it philosophy or science?
I was referring more to framing the concept in ways that are clear and specific enough to be used and applied.
 
  • #31
WWGD said:
I was referring more to framing the concept in ways that are clear and specific enough to be used, applied.
I disagree. Pearl's framework is specific and clear. I think most people are simply unaware of the methods.
 
  • Like
Likes Dale
  • #32
jbergman said:
I disagree. Pearl's framework is specific and clear. I think most just have an ignorance of the methods.
Or, like me, some are aware of the mathematical and statistical methods, but not of the connection to causal inference.
 
  • Like
Likes jbergman
  • #33
Dale said:
Interesting. I have seen the math in this called structural equation modeling. I never heard of it in terms of causality. So I am aware of the method, but not the causal interpretation.
Structural Equation Modeling is an integral part of the New Causal Revolution. Indeed, it is the glue that shows why Pearl's Directed Acyclic Graph approach and Rubin's Potential Outcomes Framework are equivalent.
AngleWyrm said:
Do you agree that P(A | B) is a causal relationship?
That is to say, P(A) given P(B) is a mathematical model of dependence, with a before/after status and causality?
At the intervention level (Pearl's "second rung"), causality looks like this: ##P(Y|do(X))>P(Y).## In this case, we would say that ##X## causes ##Y.## Here, the ##do## operator means "force the variable ##X## to have the value ##x##." Much of Pearl's framework has to do with eliminating the ##do## expression so that you can get an expression you can evaluate in terms of data (the ##do## operator is not directly measurable). Conditional probability on its own is utterly incapable of expressing the ideas of causality.
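To make the ##do## operator concrete, here is a minimal numeric sketch (all probabilities invented for illustration) of the back-door adjustment ##P(Y|do(X=x)) = \sum_z P(Y|x,z)P(z)##, assuming a single confounder ##Z## satisfies the back-door criterion. The naive conditional ##P(Y|X=x)## weights the strata by ##P(z|x)## instead of ##P(z)##, and therefore differs:

```python
# Invented toy probabilities: Z confounds both X and Y.
p_z1 = 0.5                       # P(Z=1)
p_x1_given_z = {1: 0.9, 0: 0.1}  # P(X=1 | Z=z)
p_y1_given_xz = {                # P(Y=1 | X=x, Z=z)
    (1, 1): 0.8, (1, 0): 0.4,
    (0, 1): 0.7, (0, 0): 0.3,
}

# Back-door adjustment: average over strata weighted by the marginal P(z).
p_y1_do_x1 = (p_y1_given_xz[(1, 1)] * p_z1
              + p_y1_given_xz[(1, 0)] * (1 - p_z1))            # 0.6

# Naive conditioning: average over strata weighted by P(z | X=1) instead.
p_x1 = p_x1_given_z[1] * p_z1 + p_x1_given_z[0] * (1 - p_z1)   # 0.5
p_z1_given_x1 = p_x1_given_z[1] * p_z1 / p_x1                  # 0.9
p_y1_given_x1 = (p_y1_given_xz[(1, 1)] * p_z1_given_x1
                 + p_y1_given_xz[(1, 0)] * (1 - p_z1_given_x1))  # 0.76

print(p_y1_do_x1, p_y1_given_x1)  # interventional 0.6 vs observational 0.76
```

The gap between 0.6 and 0.76 is exactly the part of the observed association that is due to the confounder rather than to ##X## itself.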
 
Last edited:
  • Like
Likes Dale, jbergman, FactChecker and 1 other person
  • #34
Demystifier said:
I have some questions about causal inference developed by Pearl. For a background see e.g. the thread https://www.physicsforums.com/threads/the-causal-revolution-and-why-you-should-study-it.987205/
and wikipedia page https://en.wikipedia.org/wiki/Causal_model

Scientists think in causal terms intuitively, even without knowing the formal theory of causal inference. Is there an example of an actual scientific problem that scientists couldn't solve with intuitive causal thinking, but did solve using formal causal inference?
One almost-example is the idea that smoking causes lung cancer. Now this had actually been causally established before Pearl's ideas came out. However, it is possible to show this using the front-door criterion - a dimension of Pearl's framework. Pearl's magnum opus, Causality: Models, Reasoning, and Inference, has been cited thousands of times. If you want examples of how the New Causal Revolution has produced causal information that wouldn't otherwise have been obtained, look at the papers that cite that book.

The fact is this: before the New Causal Revolution, if you wanted to demonstrate causality, you had one and only one tool: the Randomized Controlled Trial (RCT) - the good ol' tried-and-true experiment. This harks back to Mill's Methods for demonstrating causality, as well as Francis Bacon. If you didn't manipulate variables (force them to have particular values), then you didn't have causality. An observational study, in particular, was utterly incapable of demonstrating causality because by definition you don't manipulate variables in an observational study.

However, with the coming of the New Causal Revolution, while you certainly still have the experiment available, you can get causality from an observational study, given the right data and the right model. How does that all work? Study the New Causal Revolution (see link in the OP) to find out! The importance of the New Causal Revolution is that many of the experiments you might like to run are impractical or unethical (smoking, anyone?). So we can still get causality sometimes, even when a RCT is not available.
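For reference, the front-door adjustment (from Pearl's Causality, assuming a mediator ##M## satisfying the front-door criterion relative to ##(X, Y)##) expresses the interventional distribution in purely observational terms:

$$P(y \mid do(x)) = \sum_m P(m \mid x) \sum_{x'} P(y \mid x', m)\, P(x')$$

In the smoking example, tar deposits in the lungs play the role of the mediator ##M##, which is how causal information can be extracted even though the confounder (a hypothetical "smoking gene") is unobserved.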
 
  • Like
  • Informative
Likes jbergman and FactChecker
  • #35
WWGD said:
I believe at this stage, causation is more a philosophical matter than a scientific one.
I strongly disagree with this. Science has long been concerned with causation - I would say primarily concerned with causation. It's the most important question! Mill's Methods show how an experiment demonstrates causality, but as I have just said in this thread, the New Causal Revolution has demonstrated how you can get causality from an observational study, given the right conditions. This opens up many new possibilities.

The field of statistics, for a long time, distanced itself from causality because it didn't have the vocabulary and tools necessary to deal with it, other than in experiments. But again, the New Causal Revolution has changed all that.
 
  • Like
Likes jbergman
