Seminar in Causal Modeling and Current Topics in Cognitive Systems


This seminar explores the mathematics of cause and effect: structural causal models, counterfactuals, and how causal relationships can be inferred from assumptions, data, and systematic experimentation. Prerequisites: Bayesian Networks or Commonsense. Taught by Judea Pearl at the University of California, Los Angeles.


Presentation Transcript


1. RELATED CLASS
CS 262Z: SEMINAR IN CAUSAL MODELING / CURRENT TOPICS IN COGNITIVE SYSTEMS
Instructor: Judea Pearl
Spring Quarter, Monday and Wednesday, 2-4pm, A74 Haines Hall (location may change; check the Schedule of Classes beforehand)
Prerequisites: Bayesian Networks or Commonsense

2. THE MATHEMATICS OF CAUSE AND EFFECT
Judea Pearl, University of California, Los Angeles

3. David Hume (1711-1776)
"I would rather discover one causal law than be King of Persia." (Democritus, 460-370 B.C.)
"Development of Western science is based on two great achievements: the invention of the formal logical system (in Euclidean geometry) by the Greek philosophers, and the discovery of the possibility to find out causal relationships by systematic experiment (during the Renaissance)." (A. Einstein, April 23, 1953)

4. HUME'S LEGACY
1. Analytical vs. empirical claims
2. Causal claims are empirical
3. All empirical claims originate from experience.

5. THE TWO RIDDLES OF CAUSATION
- What empirical evidence legitimizes a cause-effect connection?
- What inferences can be drawn from causal information? And how?

7. The Art of Causal Mentoring
"Easy, man! That hurts!"

8. OLD RIDDLES IN NEW DRESS
1. How should a robot acquire causal information from the environment?
2. How should a robot process causal information received from its creator-programmer?

9. CAUSATION AS A PROGRAMMER'S NIGHTMARE
Input:
1. If the grass is wet, then it rained.
2. If we break this bottle, the grass will get wet.
Output: If we break this bottle, then it rained.

10. CAUSATION AS A PROGRAMMER'S NIGHTMARE (CONT) (Lin, 1995)
Input:
1. A suitcase will open iff both locks are open.
2. The right lock is open.
Query: What if we open the left lock?
Output: The right lock might get closed.

11. THE BASIC PRINCIPLES
Causation = encoding of behavior under interventions
Interventions = surgeries on mechanisms
Mechanisms = stable functional relationships = equations + graphs

12. CAUSATION AS A PROGRAMMER'S NIGHTMARE
Input:
1. If the grass is wet, then it rained.
2. If we break this bottle, the grass will get wet.
Output: If we break this bottle, then it rained.

13. TRADITIONAL STATISTICAL INFERENCE PARADIGM
[Diagram: P (joint distribution) → Data → Inference → Q(P) (aspects of P)]
e.g., infer whether customers who bought product A would also buy product B: Q = P(B | A).

14. FROM STATISTICAL TO CAUSAL ANALYSIS: 1. THE DIFFERENCES
Probability and statistics deal with static relations. What happens when P changes?
e.g., infer whether customers who bought product A would still buy A if we were to double the price.
[Diagram as on slide 13, except that the joint distribution P changes]

15. FROM STATISTICAL TO CAUSAL ANALYSIS: 1. THE DIFFERENCES
What remains invariant when P changes, say, to satisfy P(price = 2) = 1?
Note: P(v) ≠ P(v | price = 2).
P does not tell us how it ought to change.
e.g., curing symptoms vs. curing diseases; analogy: mechanical deformation.
[Diagram as on slide 13, except that the joint distribution P changes]

16. FROM STATISTICAL TO CAUSAL ANALYSIS: 1. THE DIFFERENCES (CONT)
1. Causal and statistical concepts do not mix:
CAUSAL: spurious correlation, randomization, confounding / effect, instrument, holding constant, explanatory variables.
STATISTICAL: regression, association / independence, controlling for / conditioning, odds and risk ratios, collapsibility.
2. 3. 4. (filled in on the next two slides)

17. FROM STATISTICAL TO CAUSAL ANALYSIS: 2. MENTAL BARRIERS
1. Causal and statistical concepts do not mix (CAUSAL vs. STATISTICAL glossary as on slide 16).
2. No causes in, no causes out (Cartwright, 1989):
{statistical assumptions + data} + {causal assumptions} ⇒ causal conclusions
3. Causal assumptions cannot be expressed in the mathematical language of standard statistics.
4. (filled in on the next slide)

18. FROM STATISTICAL TO CAUSAL ANALYSIS: 2. MENTAL BARRIERS (CONT)
Points 1-3 as on slide 17, plus:
4. Non-standard mathematics:
a) Structural equation models (Wright, 1920; Simon, 1960)
b) Counterfactuals (Neyman-Rubin: Y_x; Lewis: x □→ Y)

19. THE STRUCTURAL MODEL PARADIGM
[Diagram: data-generating model M → joint distribution → Data → Inference → Q(M) (aspects of M)]
M: an oracle for computing answers to queries Q.
e.g., infer whether customers who bought product A would still buy A if we were to double the price.

20. FAMILIAR CAUSAL MODEL: ORACLE FOR MANIPULATION
[Diagram: a causal graph over X, Y, Z, with a designated INPUT and OUTPUT]

21. STRUCTURAL CAUSAL MODELS
Definition: A structural causal model is a 4-tuple <V, U, F, P(u)>, where
V = {V_1, ..., V_n} are observable variables,
U = {U_1, ..., U_m} are background variables,
F = {f_1, ..., f_n} are functions determining V: v_i = f_i(v, u),
P(u) is a distribution over U.
P(u) and F induce a distribution P(v) over the observable variables.
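
The definition translates almost line for line into code. A minimal Python sketch, assuming an acyclic model and purely illustrative names (nothing here is from the slides):

```python
# A minimal sketch of the 4-tuple <V, U, F, P(u)>. Class name and layout
# are illustrative only.
class SCM:
    def __init__(self, functions, sample_u):
        self.functions = functions  # F: maps each V_i to f_i, in causal order
        self.sample_u = sample_u    # callable drawing u ~ P(u)

    def solve(self, u):
        """Compute v_i = f_i(v, u) for every observable (assumes acyclic F)."""
        values = dict(u)
        for name, f in self.functions.items():
            values[name] = f(values)
        return values

    def sample(self):
        """P(u) and F induce P(v): draw u, then solve for v."""
        return self.solve(self.sample_u())
```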

22. CAUSAL MODELS AT WORK (The impatient firing squad)
[Diagram: U (court order) → C (captain) → A, B (riflemen) → D (death)]

23. CAUSAL MODELS AT WORK (Glossary)
U: Court orders the execution
C: Captain gives a signal
A: Rifleman A shoots
B: Rifleman B shoots
D: Prisoner dies
= : functional equality (new symbol)
C = U,  A = C,  B = C,  D = A ∨ B
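
The glossary's equations in runnable form; a sketch, with the `a_override` parameter added as a hypothetical hook for the surgeries performed on the following slides:

```python
def firing_squad(u, a_override=None):
    """The glossary's equations. `a_override` replaces A's equation with a
    constant -- the 'surgery' used on the slides that follow."""
    c = u                                          # C = U
    a = c if a_override is None else a_override    # A = C
    b = c                                          # B = C
    d = a or b                                     # D = A v B
    return {"U": u, "C": c, "A": a, "B": b, "D": d}

# S1 (prediction): in every solution of the model, A implies D.
assert all(firing_squad(u)["D"] for u in (False, True) if firing_squad(u)["A"])
```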

24. SENTENCES TO BE EVALUATED
S1. Prediction: A ⇒ D
S2. Abduction: ¬D ⇒ ¬C
S3. Transduction: A ⇒ B
S4. Action: ¬C ⇒ D_A
S5. Counterfactual: D ⇒ D_{¬A}
S6. Explanation: Caused(A, D)

25. STANDARD MODEL FOR STANDARD QUERIES
S1 (prediction): If rifleman A shot, the prisoner is dead: A ⇒ D.
S2 (abduction): If the prisoner is alive, the captain did not signal: ¬D ⇒ ¬C.
S3 (transduction): If rifleman A shot, then B shot as well: A ⇒ B.
[Diagram: the model with each equation read as an "iff" and D = A ∨ B as an OR gate]

26. WHY CAUSAL MODELS? GUIDE FOR SURGERY
S4 (action): If the captain gave no signal and Mr. A decides to shoot, the prisoner will die (¬C ⇒ D_A), and B will not shoot (¬C ⇒ ¬B_A).

27. WHY CAUSAL MODELS? GUIDE FOR SURGERY (CONT)
S4 evaluated by surgery: the equation A = C is deleted and A is set to TRUE; the rest of the model is left intact.
[Diagram: the model with the arrow C → A removed and A = TRUE]

28. 3 STEPS TO COMPUTING COUNTERFACTUALS
S5. If the prisoner is dead, he would still be dead if A were not to have shot: D ⇒ D_{¬A}.
Step 1, Abduction: from the evidence D = TRUE, infer U = TRUE.
Step 2, Action: perform surgery, replacing A's equation with A = FALSE.
Step 3, Prediction: solve the mutilated model; D = TRUE still holds, because B shoots.
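
A sketch of the three steps on the firing-squad model, reusing the `firing_squad` function from the slide-23 sketch above:

```python
# S5 on the firing-squad model.
# Step 1 -- abduction: the evidence D = True is consistent only with U = True.
u_hat = [u for u in (False, True) if firing_squad(u)["D"]][0]

# Step 2 -- action: surgery on A's equation ("had A not shot": A = False).
# Step 3 -- prediction: solve the mutilated model with the abduced U.
print(firing_squad(u_hat, a_override=False)["D"])   # True: B still shoots
```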

29. COMPUTING PROBABILITIES OF COUNTERFACTUALS
P(S5): The prisoner is dead. How likely is it that he would be dead had A not shot? P(D_{¬A} | D) = ?
Step 1, Abduction: update the prior P(u) to the posterior P(u | D).
Step 2, Action: set A = FALSE by surgery.
Step 3, Prediction: compute the probability of D in the mutilated model under P(u | D); the result is P(D_{¬A} | D).
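
The same three steps with an uncertain court order; a sketch, again reusing `firing_squad`, with an assumed prior P(U = TRUE) that is not given on the slide:

```python
def p_dead_had_a_not_shot(p_u=0.6):
    """P(D_{not-A} | D) by the three steps. The prior P(U=True) = p_u is an
    assumed illustrative number."""
    prior = {True: p_u, False: 1 - p_u}
    # Abduction: condition P(u) on the evidence D = True.
    post = {u: p for u, p in prior.items() if firing_squad(u)["D"]}
    z = sum(post.values())
    post = {u: p / z for u, p in post.items()}
    # Action + prediction: set A = False and read off D under each u.
    return sum(p for u, p in post.items()
               if firing_squad(u, a_override=False)["D"])

print(p_dead_had_a_not_shot())   # 1.0: in this model B duplicates A's effect
```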

30. CAUSAL MODEL (FORMAL)
M = <U, V, F>
U: background variables
V: endogenous variables
F: set of functions {f_i : U × (V \ V_i) → V_i}, v_i = f_i(pa_i, u_i)
Submodel: M_x = <U, V, F_x>, representing do(x); F_x replaces the equation for X with X = x.
Actions and counterfactuals: Y_x(u) = the solution of Y in M_x; P(y | do(x)) ≜ P(Y_x = y).

31. WHY COUNTERFACTUALS?
Action queries are triggered by (modifiable) observations, demanding an abductive step, i.e., counterfactual processing.
E.g., troubleshooting:
Observation: The output is low.
Action query: Will the output get higher if we replace the transistor?
Counterfactual query: Would the output be higher had the transistor been replaced?

32. APPLICATIONS
1. Predicting effects of actions and policies
2. Learning causal relationships from assumptions and data
3. Troubleshooting physical systems and plans
4. Finding explanations for reported events
5. Generating verbal explanations
6. Understanding causal talk
7. Formulating theories of causal thinking

33. CAUSAL MODELS AND COUNTERFACTUALS
Definition: The sentence "Y would be y (in situation u), had X been x", denoted Y_x(u) = y, means: the solution for Y in the mutilated model M_x (i.e., the equations for X replaced by X = x), with input U = u, is equal to y.
Joint probabilities of counterfactuals: the super-distribution P* is derived from M; it is parsimonious, consistent, and transparent.
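
The slide does not spell out how P* is derived from M; a sketch of the usual construction (the pair (Y_x, Z_w) is just an example):

```latex
P^*(Y_x = y) = \sum_{u:\, Y_x(u) = y} P(u), \qquad
P^*(Y_x = y,\ Z_w = z) = \sum_{u:\, Y_x(u) = y,\ Z_w(u) = z} P(u)
```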

34. AXIOMS OF CAUSAL COUNTERFACTUALS
Y_x(u) = y: Y would be y, had X been x (in state U = u).
1. Definiteness
2. Uniqueness
3. Effectiveness
4. Composition
5. Reversibility
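
The slide gives only the axiom names. The usual statements of the last three, as in Galles and Pearl's axiomatization, are quoted here from memory and should be treated as a sketch (the first two say that X_y(u) has exactly one value):

```latex
\text{Effectiveness:}\quad X_{xw}(u) = x
\\
\text{Composition:}\quad W_x(u) = w \;\Rightarrow\; Y_{xw}(u) = Y_x(u)
\\
\text{Reversibility:}\quad \big(Y_{xw}(u) = y\big) \wedge \big(W_{xy}(u) = w\big) \;\Rightarrow\; Y_x(u) = y
```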

35. GRAPHICAL-COUNTERFACTUAL SYMBIOSIS
Every causal graph expresses counterfactual assumptions, e.g., X → Y → Z:
1. Missing arrows (e.g., no arrow X → Z)
2. Missing arcs (e.g., no arc between Y and Z)
Consistency of the assumptions is guaranteed.
The assumptions are readable from the graph.

36. DERIVATION IN CAUSAL CALCULUS
[Diagram: Smoking → Tar → Cancer, with an unobserved Genotype confounding Smoking and Cancer]
P(c | do{s})
= Σ_t P(c | do{s}, t) P(t | do{s})   (probability axioms)
= Σ_t P(c | do{s}, do{t}) P(t | do{s})   (Rule 2)
= Σ_t P(c | do{s}, do{t}) P(t | s)   (Rule 2)
= Σ_t P(c | do{t}) P(t | s)   (Rule 3)
= Σ_s' Σ_t P(c | do{t}, s') P(s' | do{t}) P(t | s)   (probability axioms)
= Σ_s' Σ_t P(c | t, s') P(s' | do{t}) P(t | s)   (Rule 2)
= Σ_s' Σ_t P(c | t, s') P(s') P(t | s)   (Rule 3)
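
The final line is the front-door estimand, computable from observational data alone. A numeric Python sketch with an assumed joint distribution P(s, t, c) (the probabilities are made up for illustration; nothing here is from the lecture):

```python
# Assumed joint distribution P(s, t, c) over binary Smoking, Tar, Cancer.
P = {(1, 1, 1): .18, (1, 1, 0): .22, (1, 0, 1): .02, (1, 0, 0): .08,
     (0, 1, 1): .03, (0, 1, 0): .02, (0, 0, 1): .05, (0, 0, 0): .40}

KEYS = ("s", "t", "c")

def marg(**fixed):
    """Probability that the named coordinates take the given values."""
    return sum(p for v, p in P.items()
               if all(v[KEYS.index(k)] == val for k, val in fixed.items()))

def p_cancer_do_smoking(s):
    """Front-door adjustment: sum_t P(t|s) * sum_{s'} P(c=1|t,s') P(s')."""
    return sum((marg(s=s, t=t) / marg(s=s))
               * sum((marg(s=sp, t=t, c=1) / marg(s=sp, t=t)) * marg(s=sp)
                     for sp in (0, 1))
               for t in (0, 1))

print(p_cancer_do_smoking(1), p_cancer_do_smoking(0))
```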

37. THE BACK-DOOR CRITERION
Graphical test of identification: P(y | do(x)) is identifiable in G if there is a set Z of variables such that Z d-separates X from Y in G_X, the subgraph of G with all arrows emanating from X removed.
[Diagram: a graph G over X, Y, Z_1, ..., Z_6, shown next to G_X with an admissible set Z highlighted]
Moreover, P(y | do(x)) = Σ_z P(y | x, z) P(z)   (adjusting for Z).
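
A Python sketch of the adjustment formula for a single binary admissible variable Z, again with an assumed joint distribution (illustrative numbers only):

```python
# Assumed joint P(z, x, y), all binary; illustrative numbers only.
P = {(1, 1, 1): .20, (1, 1, 0): .05, (1, 0, 1): .10, (1, 0, 0): .15,
     (0, 1, 1): .04, (0, 1, 0): .06, (0, 0, 1): .10, (0, 0, 0): .30}

def pr(z=None, x=None, y=None):
    """Probability of the event given by the non-None coordinates."""
    return sum(p for (zz, xx, yy), p in P.items()
               if (z is None or zz == z) and (x is None or xx == x)
               and (y is None or yy == y))

def p_y1_do_x(x):
    """Back-door adjustment: sum_z P(y=1 | x, z) P(z)."""
    return sum(pr(z=z, x=x, y=1) / pr(z=z, x=x) * pr(z=z) for z in (0, 1))

print(p_y1_do_x(1))            # interventional: 0.6
print(pr(x=1, y=1) / pr(x=1))  # conditional P(y=1 | x=1): ~0.686 (confounded)
```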

38. RECENT RESULTS ON IDENTIFICATION
- do-calculus is complete.
- Complete graphical criterion for identifying causal effects (Shpitser and Pearl, 2006).
- Complete graphical criterion for empirical testability of counterfactuals (Shpitser and Pearl, 2007).

39. DETERMINING THE CAUSES OF EFFECTS (The Attribution Problem)
"Your Honor! My client (Mr. A) died BECAUSE he used that drug."

40. DETERMINING THE CAUSES OF EFFECTS (The Attribution Problem)
"Your Honor! My client (Mr. A) died BECAUSE he used that drug."
Court to decide if it is MORE PROBABLE THAN NOT that A would be alive BUT FOR the drug:
PN = P(A would be alive had he not taken the drug | A is dead, took the drug) > 0.50

41. THE PROBLEM
Semantical problem:
1. What is the meaning of PN(x, y)? The probability that event y would not have occurred if it were not for event x, given that x and y did in fact occur.

42. THE PROBLEM
Semantical problem:
1. What is the meaning of PN(x, y)? The probability that event y would not have occurred if it were not for event x, given that x and y did in fact occur.
Answer: computable from M.

43. THE PROBLEM
Semantical problem:
1. What is the meaning of PN(x, y)? The probability that event y would not have occurred if it were not for event x, given that x and y did in fact occur.
Analytical problem:
2. Under what conditions can PN(x, y) be learned from statistical data, i.e., observational, experimental, and combined?
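
In the counterfactual notation of the earlier slides, with x' and y' denoting the complements of x and y, this definition reads:

```latex
\mathrm{PN}(x, y) \;\triangleq\; P\big(Y_{x'} = y' \mid X = x,\ Y = y\big)
```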

44. TYPICAL THEOREMS (Tian and Pearl, 2000)
Bounds on PN given combined nonexperimental and experimental data.
Identifiability under monotonicity (combined data): the corrected excess-risk-ratio.
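
The formulas themselves did not survive in the transcript. The results in Tian and Pearl (2000) have the following form (quoted from memory, so treat this as a sketch): the bound from combined data, and the corrected excess-risk-ratio under monotonicity:

```latex
\max\left\{0,\ \frac{P(y) - P(y_{x'})}{P(x, y)}\right\}
\;\le\; \mathrm{PN} \;\le\;
\min\left\{1,\ \frac{P(y'_{x'}) - P(x', y')}{P(x, y)}\right\}
\\
\mathrm{PN} =
\underbrace{\frac{P(y \mid x) - P(y \mid x')}{P(y \mid x)}}_{\text{excess risk ratio}}
+ \underbrace{\frac{P(y \mid x') - P(y_{x'})}{P(x, y)}}_{\text{confounding correction}}
\quad \text{(under monotonicity)}
```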

45. CAN FREQUENCY DATA DECIDE LEGAL RESPONSIBILITY?
Nonexperimental data: drug usage predicts longer life.
Experimental data: the drug has negligible effect on survival.

                    Experimental          Nonexperimental
                    do(x)     do(x')      x         x'
Deaths (y):            16         14      2         28
Survivals (y'):       984        986    998        972
Total:              1,000      1,000  1,000      1,000

Plaintiff: Mr. A is special.
1. He actually died.
2. He used the drug by choice.
Court to decide (given both data): Is it more probable than not that A would be alive BUT FOR the drug?
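
A Python sketch that plugs this table into the lower bound sketched after slide 44, assuming (from the equal arm sizes) that half the nonexperimental population chose the drug:

```python
# Nonexperimental (choice) data, 1,000 subjects per arm:
p_y_given_x  = 2 / 1000      # death rate among those who took the drug (x)
p_y_given_x_ = 28 / 1000     # death rate among those who did not (x')
p_x = 0.5                    # assumed: half the population chose the drug

# Experimental data:
p_y_do_x_ = 14 / 1000        # P(y_{x'}): death rate when the drug is withheld

p_y   = p_x * p_y_given_x + (1 - p_x) * p_y_given_x_    # P(y)   = 0.015
p_x_y = p_x * p_y_given_x                               # P(x,y) = 0.001

pn_lower = max(0.0, (p_y - p_y_do_x_) / p_x_y)
print(round(pn_lower, 6))    # 1.0 -- so PN = 1, as slide 46 concludes
```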

46. SOLUTION TO THE ATTRIBUTION PROBLEM
WITH PROBABILITY ONE: 1 ≤ P(y'_{x'} | x, y) ≤ 1, i.e., PN = 1.
Combined data tell more than each study alone.

47. CONCLUSIONS
Structural-model semantics, enriched with logic and graphs, provides:
- a complete formal basis for causal reasoning
- a powerful and friendly causal calculus
and lays the foundations for asking more difficult questions: What is an action? What is free will? Should robots be programmed to have this illusion?

48. RELATED CLASS
CS 262Z: SEMINAR IN CAUSAL MODELING / CURRENT TOPICS IN COGNITIVE SYSTEMS
Instructor: Judea Pearl
Spring Quarter, Monday and Wednesday, 2-4pm, A74 Haines Hall (location may change; check the Schedule of Classes beforehand)
Prerequisites: Bayesian Networks or Commonsense
