Recent content by AngleWyrm

  1. AngleWyrm

    Graduate Causal inference developed by Pearl

    I'd like something more than a bare assertion that my claim is false; the claim is that causality is a demonstrable subset of two dependent variables.
  2. AngleWyrm

    Graduate Causal inference developed by Pearl

    Given P(cancer | smoker) > P(cancer) -- cancer is more prevalent among smokers -- does that establish that smoking causes cancer? Not quite yet, because it's also possible that P(smoker | cancer) > P(smoker) -- smokers are more common among cancer victims. Simpson's Paradox. This leads to two classifications...
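    A minimal numerical sketch of the point above, using a made-up 2×2 contingency table (the counts are assumptions for illustration, not real data); it shows both conditional inequalities holding at once:

        # Toy contingency table (made-up counts, purely illustrative)
        #                 cancer   no cancer
        # smoker              80         920
        # non-smoker          40        1960
        smoker_cancer, smoker_ok = 80, 920
        nonsmoker_cancer, nonsmoker_ok = 40, 1960

        total = smoker_cancer + smoker_ok + nonsmoker_cancer + nonsmoker_ok
        p_cancer = (smoker_cancer + nonsmoker_cancer) / total                       # 0.04
        p_smoker = (smoker_cancer + smoker_ok) / total                              # 0.333...
        p_cancer_given_smoker = smoker_cancer / (smoker_cancer + smoker_ok)         # 0.08
        p_smoker_given_cancer = smoker_cancer / (smoker_cancer + nonsmoker_cancer)  # 0.666...

        print(p_cancer_given_smoker > p_cancer)   # True: P(cancer | smoker) > P(cancer)
        print(p_smoker_given_cancer > p_smoker)   # True: P(smoker | cancer) > P(smoker)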
  3. AngleWyrm

    Graduate Causal inference developed by Pearl

    Then it should be possible to say that one of these two conditions holds: (1) P(A|B) > P(A) AND P(B|A) > P(B), the scenario stated above; or (2) the relationship measured in #1 isn't the case. Since the statement in #1 is a logical AND operation, there are four possible outcomes. So it can be represented...
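    A small sketch of the four outcomes mentioned above, simply enumerating the truth values of the two inequalities (the names c1 and c2 are introduced here for illustration):

        from itertools import product

        # The two inequalities from condition #1:
        #   c1: P(A|B) > P(A)      c2: P(B|A) > P(B)
        # A logical AND of two statements has four truth-value combinations.
        for c1, c2 in product([True, False], repeat=2):
            print(f"c1={c1!s:5}  c2={c2!s:5}  ->  condition #1 (c1 AND c2): {c1 and c2}")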
  4. AngleWyrm

    Graduate Causal inference developed by Pearl

    Do you agree that P(A | B) is a causal relationship? That is to say, that the probability of A given B is a mathematical model of dependence, with a before/after status and causality?
  5. AngleWyrm

    High School How to Find the Intersection of a Logarithmic Curve and a Tangent Line?

    Re-read the thread summary. Are there an infinite number of tangent lines with a slope of -50? Or does that restrict the answer set to one unique line and set of coordinates?
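    A short worked check of that uniqueness question, assuming the curve is the y = log(x)/log(0.9) from the original post further down this list and the slope is the -50 estimated there:

        % derivative of y = ln(x)/ln(0.9), and the x at which the slope equals -50
        y' = \frac{1}{x \ln 0.9}, \qquad
        y' = -50 \;\Longrightarrow\; x = \frac{-1}{50 \ln 0.9} \approx 0.19

    Since y' = 1/(x ln 0.9) takes each negative value at exactly one x > 0, there is exactly one tangent line with that slope, not infinitely many.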
  6. AngleWyrm

    High School How to Find the Intersection of a Logarithmic Curve and a Tangent Line?

    No, that's an estimate of the answer to the question of what the coordinates are.
  7. AngleWyrm

    High School How to Find the Intersection of a Logarithmic Curve and a Tangent Line?

    Yeah, that works; it looks like a slope of about -25/0.5 = -50.
  8. AngleWyrm

    High School How to Find the Intersection of a Logarithmic Curve and a Tangent Line?

    I have a formula y = log(x)/log(0.9), which has this graph: [graph]. I want to find the intersection of this curve and a tangent line, illustrated in this rough approximation: [sketch]. The axes have very different scales, so the line doesn't actually have a slope of -1; it just looks that way. How can I figure out: 1)...
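    A minimal Python sketch of one way to attack this, assuming the tangent of interest is the one with slope -50 discussed in the replies listed above (the slope value is an assumption; substitute whichever slope the sketch is meant to show):

        import math

        # Curve: y = log(x) / log(0.9), i.e. ln(x) / ln(0.9)
        LN09 = math.log(0.9)

        def f(x):
            return math.log(x) / LN09

        m = -50.0                    # assumed tangent slope, taken from the thread
        x0 = 1.0 / (m * LN09)        # solve f'(x) = 1/(x*ln(0.9)) = m for x
        y0 = f(x0)                   # the point where the tangent touches the curve

        # Tangent line: y = y0 + m*(x - x0); it meets the curve exactly at (x0, y0)
        print(f"tangency point: ({x0:.4f}, {y0:.4f})")
        print(f"tangent line:   y = {y0:.4f} + {m}*(x - {x0:.4f})")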
  9. AngleWyrm

    Undergrad Confidence as used in Probability

    RE: "standard terms" Variables are placeholders; they are nouns but they are not proper nouns. They are always a mapping of the form a → b, such as "let Ω be the set of outcomes" Use of these forums is not predicated on a specific naming convention.
  10. AngleWyrm

    Undergrad Confidence as used in Probability

    I'm sensing hostility and denial; would you like to bargain? Because I hear the next step is depression.
  11. AngleWyrm

    Undergrad Confidence as used in Probability

    Let's start with a model from the Mirror Universe. In the Lab-O-Doom I conduct three trial runs of an experiment. I've got a bag of jelly beans and there's a yummy green one in there. There are also four other jelly beans in the bag. And because I'm wearing a white smock and steam-punk eye...
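    A minimal simulation sketch of the setup above, assuming (since the preview is cut off) that each trial run is one draw with replacement from the five beans and that success means pulling the yummy green one; the other bean colors are invented for illustration:

        import random

        BEANS = ["green", "red", "blue", "yellow", "black"]   # 1 green + 4 others

        def trial_run():
            """One draw with replacement; success = drawing the green bean (p = 1/5)."""
            return random.choice(BEANS) == "green"

        runs = [trial_run() for _ in range(3)]   # the three trial runs mentioned
        print(runs, "-> at least one green:", any(runs))

        # Analytically: P(at least one green in 3 draws) = 1 - (4/5)**3 = 0.488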
  12. AngleWyrm

    Undergrad Confidence as used in Probability

    The casino's offer of a gamble is a Bernoulli trial: the flip of a coin before the coin has been flipped, in this case a very unfair coin. The two possible outcomes are called success and failure to describe a preference, such as winning a gamble. Each try is a movement from before we know the...
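    A tiny sketch of that framing, a single Bernoulli trial with an assumed 1% success chance (the 1% figure comes from the example problem further down this list):

        import random

        def bernoulli_trial(p_success=0.01):
            """One flip of the 'very unfair coin': True = success, False = failure."""
            return random.random() < p_success

        print("success" if bernoulli_trial() else "failure")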
  13. AngleWyrm

    Undergrad Confidence as used in Probability

    Let's take a look at that, because I may have made a tragic error. There are two quantities and a formula that uses them. First quantity is confidence: confidence = 0.95, chanceToBeWrong = 1 - confidence. Second quantity is the chance of success on each try: chanceOfSuccess = 0.01, chanceOfFailure = 1 -...
  14. AngleWyrm

    Undergrad Confidence as used in Probability

    Example problem: A casino offers you a gamble with a 1% chance of winning each try. How many tries will it take to win at least once? Solution: For this example, I chose 95% confidence, a willingness to be wrong once in twenty...
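    The arithmetic the example appears to be heading toward, spelled out under the usual at-least-one-win reading (find the smallest n with 1 - 0.99^n >= 0.95); this completion is an assumption, since the preview is cut off:

        import math

        confidence = 0.95            # willingness to be wrong once in twenty
        chance_of_success = 0.01     # 1% chance of winning each try

        chance_to_be_wrong = 1 - confidence          # 0.05
        chance_of_failure = 1 - chance_of_success    # 0.99

        # Smallest n with P(at least one win in n tries) >= confidence,
        # i.e. 1 - 0.99**n >= 0.95  <=>  n >= log(0.05)/log(0.99) ~= 298.07
        n = math.ceil(math.log(chance_to_be_wrong) / math.log(chance_of_failure))
        print(n)   # 299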