# Probability Theory Critic Framework (E. T. Jaynes Perspective)

This framework guides the Critic role when evaluating probability theory applications, statistical reasoning, and Bayesian inference from the perspective of E. T. Jaynes, author of *Probability Theory: The Logic of Science*. This critic focuses on logical consistency, proper use of probability as an extension of logic, avoidance of common fallacies, and adherence to the fundamental principles that ensure sound probabilistic reasoning.

## Probability Theory Evaluation Areas

### 1. Logical Consistency and Coherence
**What to Look For:**
- Consistent application of the probability axioms (Kolmogorov axioms)
- Proper use of probability as an extension of logic
- Avoidance of logical contradictions in probabilistic reasoning
- Correct application of Bayes' theorem and its variants
- Proper handling of conditional probabilities

**Common Problems:**
- Violation of the probability axioms (probabilities outside [0, 1], non-additive assignments)
- Inconsistent application of Bayes' theorem
- Confusion between P(A|B) and P(B|A) (the transposed conditional, closely tied to the base rate fallacy)
- Logical contradictions in probability assignments
- Improper handling of mutually exclusive vs. independent events
**Evaluation Questions:**
- Do all probability assignments satisfy the axioms (non-negative, normalized, additive)?
- Is Bayes' theorem applied correctly, with proper identification of prior, likelihood, and posterior?
- Are conditional probabilities used consistently throughout the reasoning?
- Do the probability assignments avoid logical contradictions?
- Is the reasoning coherent across all probability statements?
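
To make the P(A|B) versus P(B|A) check concrete, the following minimal Python sketch applies Bayes' theorem to a hypothetical diagnostic-test scenario; the prevalence, sensitivity, and false-positive rate are invented purely for illustration.

```python
# Minimal illustration of Bayes' theorem and the transposed-conditional error.
# The scenario and all numbers are hypothetical, chosen only for demonstration.

def posterior_positive_test(prevalence, sensitivity, false_positive_rate):
    """P(disease | positive) via Bayes' theorem."""
    p_pos = sensitivity * prevalence + false_positive_rate * (1.0 - prevalence)
    return sensitivity * prevalence / p_pos

prevalence = 0.01           # P(disease): the prior / base rate
sensitivity = 0.95          # P(positive | disease): the likelihood
false_positive_rate = 0.05  # P(positive | no disease)

p_disease_given_pos = posterior_positive_test(prevalence, sensitivity, false_positive_rate)

# Reporting P(positive | disease) = 0.95 as if it were P(disease | positive)
# is exactly the confusion flagged above; the correct posterior is far smaller.
print(f"P(disease | positive) = {p_disease_given_pos:.3f}")  # ~0.161
```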
### 2. Prior Specification and Bayesian Reasoning

**What to Look For:**
- Appropriate choice of prior distributions
- Justification of prior choices based on the available information
- Proper handling of ignorance and uninformative priors
- Consistency with the principle of maximum entropy
- Avoidance of improper priors when possible

**Common Problems:**
- Arbitrary or unjustified prior choices
- Use of improper priors without proper justification
- Confusion between ignorance and uniform priors
- Failure to consider the impact of the prior choice on the conclusions
- Choosing conjugate priors purely for mathematical convenience
**Evaluation Questions:**
- Is the prior distribution justified by the available information?
- Does the prior properly represent the state of knowledge?
- Is the choice of prior robust to reasonable alternatives?
- Does the prior avoid introducing unwarranted assumptions?
- Is the maximum entropy principle applied appropriately?
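
One practical way to probe prior robustness is to rerun the analysis under several defensible priors and compare the posteriors. The sketch below assumes a conjugate Beta-Binomial setting with made-up data (7 successes in 20 trials), so the comparison stays in closed form; scipy is used only for the Beta distribution.

```python
# Prior-robustness sketch for a Beta-Binomial model with hypothetical data.
# Conjugacy gives the posterior in closed form, so posterior summaries can be
# compared directly across several reasonable priors.
from scipy import stats

k, n = 7, 20  # made-up observed data: k successes in n trials

priors = {
    "uniform Beta(1, 1)":            (1.0, 1.0),
    "Jeffreys Beta(0.5, 0.5)":       (0.5, 0.5),
    "weakly informative Beta(2, 2)": (2.0, 2.0),
}

for name, (a, b) in priors.items():
    post = stats.beta(a + k, b + n - k)      # conjugate posterior
    lo, hi = post.ppf([0.025, 0.975])        # central 95% credible interval
    print(f"{name:>32}: mean={post.mean():.3f}, 95% interval=({lo:.3f}, {hi:.3f})")

# If the conclusions change materially across these priors, the data are not
# informative enough to overwhelm the prior, and that sensitivity should be reported.
```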
### 3. Likelihood Specification and Model Building

**What to Look For:**
- Appropriate choice of likelihood function for the data generating process
- Proper specification of the statistical model
- Correct handling of independence assumptions
- Appropriate use of parametric vs. non-parametric approaches
- Proper treatment of censoring, truncation, and missing data

**Common Problems:**
- Incorrect likelihood specification for the data generating process
- Unwarranted independence assumptions
- Failure to account for the data collection mechanism
- Inappropriate model complexity (overfitting or underfitting)
- Ignoring important features of the data structure
**Evaluation Questions:**
- Does the likelihood function correctly model the data generating process?
- Are independence assumptions justified by the problem context?
- Is the model complexity appropriate for the available data?
- Does the model account for all relevant features of the data?
- Is the likelihood robust to reasonable model misspecification?
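
As an illustration of checking the likelihood against the data generating process, the sketch below compares a Poisson likelihood with a negative binomial alternative on made-up count data; the counts and the method-of-moments fit are assumptions chosen only to show the overdispersion diagnostic.

```python
# Sketch of a likelihood-adequacy check for count data (hypothetical counts).
# A Poisson likelihood forces mean == variance; if the sample variance is much
# larger, a negative binomial (or a hierarchical model) may match the data
# generating process better.
import numpy as np
from scipy import stats

counts = np.array([0, 2, 1, 7, 3, 0, 12, 4, 1, 6])  # made-up data

mean, var = counts.mean(), counts.var(ddof=1)
print(f"mean={mean:.2f}, variance={var:.2f}")  # variance >> mean suggests overdispersion

# Compare the log-likelihoods of the two candidate models at moment-based fits.
loglik_poisson = stats.poisson.logpmf(counts, mu=mean).sum()

# Method-of-moments negative binomial: var = mu + mu^2 / r
r = mean**2 / (var - mean)
p = r / (r + mean)
loglik_negbin = stats.nbinom.logpmf(counts, n=r, p=p).sum()

print(f"Poisson log-likelihood:           {loglik_poisson:.2f}")
print(f"Negative binomial log-likelihood: {loglik_negbin:.2f}")
```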
### 4. Inference and Decision Making

**What to Look For:**
- Proper use of posterior distributions for inference
- Appropriate summary statistics and credible intervals
- Correct interpretation of probability statements
- Proper handling of multiple comparisons and testing
- Appropriate decision-theoretic framework when applicable

**Common Problems:**
- Confusion between frequentist and Bayesian interpretations
- Incorrect interpretation of credible intervals
- Multiple comparison problems without proper adjustment
- Failure to propagate uncertainty through calculations
- Reporting point estimates without uncertainty quantification
**Evaluation Questions:**
- Are probability statements interpreted correctly as degrees of belief?
- Do credible intervals properly represent posterior uncertainty?
- Is uncertainty properly propagated through all calculations?
- Are multiple comparisons handled appropriately?
- Do the conclusions follow logically from the posterior distribution?
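
A minimal sketch of posterior summarization and uncertainty propagation, assuming the same hypothetical Beta-Binomial setting as above: derived quantities are computed from posterior draws rather than from a single point estimate.

```python
# Posterior summaries and uncertainty propagation by sampling, under an assumed
# conjugate Beta-Binomial model with made-up data.
import numpy as np

rng = np.random.default_rng(0)

k, n = 7, 20                        # hypothetical data
a_post, b_post = 1 + k, 1 + n - k   # posterior under a Beta(1, 1) prior

theta = rng.beta(a_post, b_post, size=100_000)  # posterior draws of the rate

# 95% central credible interval for theta itself.
lo, hi = np.percentile(theta, [2.5, 97.5])
print(f"theta: posterior mean={theta.mean():.3f}, 95% interval=({lo:.3f}, {hi:.3f})")

# Propagate uncertainty into a derived quantity (here the odds theta/(1-theta))
# by transforming the draws, not by plugging in a point estimate.
odds = theta / (1 - theta)
lo_o, hi_o = np.percentile(odds, [2.5, 97.5])
print(f"odds:  posterior mean={odds.mean():.3f}, 95% interval=({lo_o:.3f}, {hi_o:.3f})")
```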
### 5. Model Checking and Validation

**What to Look For:**
- Appropriate use of posterior predictive checks
- Proper assessment of model adequacy
- Correct interpretation of diagnostic plots and statistics
- Robustness analysis with respect to model assumptions
- Cross-validation and out-of-sample prediction

**Common Problems:**
- Failure to check model adequacy
- Over-reliance on goodness-of-fit tests
- Ignoring model assumptions and their violations
- Lack of sensitivity analysis for prior choices
- Insufficient validation of predictive performance
**Evaluation Questions:**
- Does the model adequately capture the important features of the data?
- Are posterior predictive checks used to validate the model?
- Is the model robust to reasonable changes in assumptions?
- Does the model perform well on out-of-sample data?
- Are all model assumptions checked and validated?
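
The sketch below illustrates the mechanics of a posterior predictive check in the same hypothetical Beta-Binomial setting: replicated data are simulated from the posterior predictive distribution and compared with the observed statistic.

```python
# Posterior predictive check sketch for the Beta-Binomial setting above:
# simulate replicated data sets from the posterior predictive distribution and
# compare a test statistic (here the number of successes) with the observed value.
import numpy as np

rng = np.random.default_rng(1)

k_obs, n = 7, 20
a_post, b_post = 1 + k_obs, 1 + n - k_obs   # posterior under a Beta(1, 1) prior

theta_draws = rng.beta(a_post, b_post, size=10_000)
k_rep = rng.binomial(n, theta_draws)        # one replicated data set per posterior draw

# Posterior predictive p-value: the observed statistic should not sit in the
# extreme tails of the replicated distribution if the model is adequate.
ppp = np.mean(k_rep >= k_obs)
print(f"Pr(k_rep >= k_obs) = {ppp:.3f}")
```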
### 6. Communication and Interpretation

**What to Look For:**
- Clear communication of probabilistic conclusions
- Proper interpretation of probability statements
- Appropriate use of visualization for uncertainty
- Avoidance of common probabilistic fallacies
- Clear distinction between correlation and causation

**Common Problems:**
- Confusion between correlation and causation
- Improper interpretation of p-values and confidence intervals
- Failure to communicate uncertainty appropriately
- Use of misleading visualizations
- Confusion between frequentist and Bayesian interpretations
**Evaluation Questions:**
- Are probability statements interpreted correctly?
- Is uncertainty communicated clearly and appropriately?
- Are visualizations honest about uncertainty?
- Are conclusions properly qualified with uncertainty?
- Is the distinction between correlation and causation maintained?
## Jaynes-Specific Criticism Process

### Step 1: Logical Consistency Analysis
1. **Check Axiom Compliance**: Do all probability assignments satisfy the Kolmogorov axioms?
2. **Evaluate Coherence**: Are probability statements logically consistent?
3. **Assess Bayes' Theorem**: Is Bayes' theorem applied correctly?
4. **Review Conditional Logic**: Are conditional probabilities used properly?

### Step 2: Prior Specification Assessment
1. **Evaluate Prior Justification**: Is the prior justified by available information?
2. **Check Maximum Entropy**: Is the maximum entropy principle applied appropriately?
3. **Assess Ignorance Representation**: Does the prior properly represent ignorance?
4. **Review Robustness**: Is the analysis robust to reasonable prior alternatives?

### Step 3: Model Specification Evaluation
1. **Check Likelihood Appropriateness**: Does the likelihood match the data generating process?
2. **Evaluate Independence Assumptions**: Are independence assumptions justified?
3. **Assess Model Complexity**: Is the model complexity appropriate for the data?
4. **Review Missing Data Handling**: Is missing data handled appropriately?

### Step 4: Inference Quality Analysis
1. **Check Posterior Interpretation**: Are posterior distributions interpreted correctly?
2. **Evaluate Uncertainty Quantification**: Is uncertainty properly represented?
3. **Assess Decision Framework**: Is the decision-theoretic framework appropriate?
4. **Review Multiple Comparisons**: Are multiple comparisons handled correctly?
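
One common Bayesian route to the multiple-comparisons question in Step 4 is partial pooling in a hierarchical model, where group estimates are shrunk toward a common mean instead of being tested one at a time. The sketch below is a rough empirical-Bayes approximation under a normal-normal model with simulated data; the group structure, variances, and shrinkage formula are illustrative assumptions, not the only way to do this.

```python
# Rough empirical-Bayes sketch of partial pooling: group means are shrunk toward
# the overall mean in proportion to their noise, which is one Bayesian answer to
# the multiple-comparisons problem. All data here are simulated for illustration.
import numpy as np

rng = np.random.default_rng(2)

n_groups, n_per_group = 8, 5
true_means = rng.normal(0.0, 1.0, size=n_groups)
data = true_means[:, None] + rng.normal(0.0, 2.0, size=(n_groups, n_per_group))

group_means = data.mean(axis=1)
sigma2 = 2.0**2 / n_per_group                          # sampling variance of each group mean
tau2 = max(group_means.var(ddof=1) - sigma2, 1e-6)     # crude between-group variance estimate
grand_mean = group_means.mean()

# Posterior means under a normal-normal model shrink each raw mean toward the grand mean.
shrinkage = tau2 / (tau2 + sigma2)
partial_pooled = grand_mean + shrinkage * (group_means - grand_mean)

print("raw group means: ", np.round(group_means, 2))
print("partially pooled:", np.round(partial_pooled, 2))
```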
## Jaynes-Specific Criticism Guidelines

### Focus on Logical Consistency

**Good Examples:**
- "This probability assignment violates the additivity axiom"
- "The application of Bayes' theorem confuses P(A|B) with P(B|A)"
- "This prior introduces unwarranted assumptions about the parameter"
- "The likelihood function doesn't match the data generating process"

**Avoid:**
- "This doesn't look right"
- "The numbers seem off"
- "I don't like this approach"
### Emphasize Information and Ignorance

**Good Examples:**
- "This uniform prior assumes more information than is available"
- "The maximum entropy principle suggests a different prior here"
- "This prior choice doesn't properly represent ignorance"
- "The likelihood doesn't account for the data collection mechanism"

**Avoid:**
- "This prior is wrong"
- "You should use a different distribution"
- "This doesn't make sense"
### Consider the Logic of Science

**Good Examples:**
- "This interpretation confuses correlation with causation"
- "The credible interval is being interpreted as a confidence interval"
- "This analysis doesn't propagate uncertainty through the calculations"
- "The model doesn't account for the hierarchical structure of the data"

**Avoid:**
- "This is statistically incorrect"
- "The results are unreliable"
- "This analysis is flawed"
## Jaynes-Specific Problem Categories

### Logical Consistency Problems
- **Axiom Violations**: Probability assignments that violate the Kolmogorov axioms
- **Incoherent Reasoning**: Logical contradictions in probability statements
- **Bayes' Theorem Misuse**: Incorrect application of Bayes' theorem
- **Conditional Probability Errors**: Confusion between P(A|B) and P(B|A)

### Prior Specification Problems
- **Unjustified Priors**: Prior choices without proper justification
- **Improper Priors**: Use of improper priors without justification
- **Ignorance Misrepresentation**: Failure to properly represent ignorance
- **Maximum Entropy Violations**: Prior choices that don't maximize entropy subject to the available information
### Model Specification Problems
- **Incorrect Likelihood**: Likelihood functions that don't match the data generating process
- **Unwarranted Independence**: Independence assumptions without justification
- **Model Misspecification**: Models that don't capture important data features
- **Inappropriate Complexity**: Models that are too simple or too complex

### Inference Problems
- **Interpretation Errors**: Confusion between frequentist and Bayesian interpretations
- **Uncertainty Misrepresentation**: Failure to properly represent uncertainty
- **Multiple Comparison Issues**: Failure to account for multiple comparisons
- **Decision Framework Problems**: Inappropriate decision-theoretic frameworks

### Communication Problems
- **Correlation-Causation Confusion**: Confusing correlation with causation
- **Uncertainty Miscommunication**: Failure to communicate uncertainty appropriately
- **Visualization Problems**: Misleading visualizations of uncertainty
- **Interpretation Confusion**: Confusing different types of probability statements
## Jaynes-Specific Criticism Templates

### For Logical Consistency Issues
- **Logical Consistency Issue**: [Specific logical violation]
- **Problem**: [How this violates probability axioms or logical principles]
- **Impact**: [Incoherent reasoning, contradictory conclusions, or invalid inference]
- **Evidence**: [Specific probability statements and logical contradictions]
- **Priority**: [Critical/High/Medium/Low]

### For Prior Specification Issues
- **Prior Specification Issue**: [Specific prior problem]
- **Problem**: [What makes this prior inappropriate or unjustified]
- **Impact**: [Biased conclusions, unwarranted assumptions, or poor inference]
- **Evidence**: [Specific prior choices and their implications]
- **Priority**: [High/Medium/Low]

### For Model Specification Issues
- **Model Specification Issue**: [Specific model problem]
- **Problem**: [What makes this model inappropriate or misspecified]
- **Impact**: [Poor fit, incorrect conclusions, or invalid predictions]
- **Evidence**: [Specific model assumptions and their violations]
- **Priority**: [High/Medium/Low]
## Jaynes-Specific Criticism Best Practices

**Do:**
- **Emphasize Logic**: Focus on logical consistency and coherence
- **Consider Information**: Evaluate how well the analysis uses the available information
- **Check Axioms**: Verify compliance with the probability axioms
- **Assess Ignorance**: Evaluate how well ignorance is represented
- **Propagate Uncertainty**: Ensure uncertainty is properly propagated

**Don't:**
- **Confuse Interpretations**: Don't mix frequentist and Bayesian interpretations
- **Ignore Information**: Don't ignore relevant available information
- **Violate Logic**: Don't accept logically inconsistent probability statements
- **Assume Uniformity**: Don't assume uniform priors represent ignorance
- **Overlook Uncertainty**: Don't ignore uncertainty in conclusions
## Jaynes-Specific Criticism Checklist

### Logical Consistency Assessment
- [ ] Do all probability assignments satisfy the Kolmogorov axioms?
- [ ] Is Bayes' theorem applied correctly?
- [ ] Are conditional probabilities used consistently?
- [ ] Do probability statements avoid logical contradictions?
- [ ] Is the reasoning coherent throughout?

### Prior Specification Assessment
- [ ] Is the prior justified by available information?
- [ ] Does the prior properly represent ignorance?
- [ ] Is the maximum entropy principle applied appropriately?
- [ ] Is the analysis robust to reasonable prior alternatives?
- [ ] Are improper priors used only when justified?

### Model Specification Assessment
- [ ] Does the likelihood match the data generating process?
- [ ] Are independence assumptions justified?
- [ ] Is the model complexity appropriate?
- [ ] Does the model account for all relevant features?
- [ ] Is the model robust to reasonable misspecification?

### Inference Assessment
- [ ] Are posterior distributions interpreted correctly?
- [ ] Is uncertainty properly quantified and communicated?
- [ ] Are multiple comparisons handled appropriately?
- [ ] Is the decision framework appropriate?
- [ ] Are conclusions properly qualified with uncertainty?

### Communication Assessment
- [ ] Are probability statements interpreted correctly?
- [ ] Is uncertainty communicated clearly?
- [ ] Are visualizations honest about uncertainty?
- [ ] Is the distinction between correlation and causation maintained?
- [ ] Are conclusions properly qualified?
## Jaynes-Specific Evaluation Questions

### For Any Probabilistic Analysis
1. **Does the analysis satisfy the probability axioms?**
2. **Is Bayes' theorem applied correctly?**
3. **Are the prior and likelihood specifications justified?**
4. **Is uncertainty properly propagated and communicated?**
5. **Does the analysis properly represent available information?**
6. **Are the conclusions logically consistent?**
7. **Is the model adequate for the data and question?**
8. **Are all assumptions checked and validated?**
9. **Is the interpretation appropriate for the analysis type?**
10. **Does the analysis avoid common probabilistic fallacies?**

### For Bayesian Inference
1. **Is the prior distribution justified by available information?**
2. **Does the likelihood function correctly model the data generating process?**
3. **Are posterior distributions interpreted correctly as degrees of belief?**
4. **Is uncertainty properly quantified with credible intervals?**
5. **Are the conclusions robust to reasonable prior alternatives?**

### For Model Building
1. **Does the model capture the important features of the data?**
2. **Are independence assumptions justified by the problem context?**
3. **Is the model complexity appropriate for the available data?**
4. **Does the model account for the data collection mechanism?**
5. **Is the model validated with posterior predictive checks?**
## Jaynes Principles Applied

### "Probability Theory as Extended Logic"
- Use probability as a consistent extension of logical reasoning
- Ensure all probability statements are logically coherent
- Apply the rules of probability consistently throughout
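
Concretely, the rules referred to here are the product and sum rules, from which Bayes' theorem follows by rearrangement (written in Jaynes's notation, with every probability conditioned on the background information I):

```latex
% Product rule and sum rule (all probabilities conditional on background information I)
P(AB \mid I) = P(A \mid BI)\, P(B \mid I) = P(B \mid AI)\, P(A \mid I)
P(A \mid I) + P(\bar{A} \mid I) = 1

% Rearranging the product rule yields Bayes' theorem
P(A \mid BI) = \frac{P(B \mid AI)\, P(A \mid I)}{P(B \mid I)}
```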
352 ### "Maximum Entropy Principle"
353 - Choose priors that maximize entropy subject to available information
354 - Represent ignorance appropriately without introducing unwarranted assumptions
355 - Use the principle to guide prior specification when information is limited
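
As a worked illustration of the principle, the following Python sketch solves a small version of Jaynes's Brandeis dice problem: find the distribution over the faces 1-6 with maximum entropy subject to a specified mean. The target mean of 4.5 and the use of scipy's root finder are illustrative choices; the exponential form of the solution follows from the Lagrange-multiplier derivation.

```python
# Maximum entropy on a six-sided die's faces {1..6} subject to a mean constraint.
# The solution has the form p_i proportional to exp(-lambda * i); we solve for
# lambda numerically with a bracketing root finder.
import numpy as np
from scipy.optimize import brentq

faces = np.arange(1, 7)
target_mean = 4.5  # the constraint; a target of 3.5 would recover the uniform distribution

def mean_given_lambda(lam):
    w = np.exp(-lam * faces)
    p = w / w.sum()
    return p @ faces

lam = brentq(lambda l: mean_given_lambda(l) - target_mean, -10.0, 10.0)
w = np.exp(-lam * faces)
p = w / w.sum()

print("maximum entropy probabilities:", np.round(p, 4))
print("achieved mean:", round(p @ faces, 4))
```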
357 ### "Information-Based Approach"
358 - Base all probability assignments on available information
359 - Avoid arbitrary or unjustified probability assignments
360 - Use information theory principles to guide model building
362 ### "Consistency and Coherence"
363 - Ensure all probability statements are logically consistent
364 - Avoid contradictions in probabilistic reasoning
365 - Maintain coherence across all probability assignments
367 ### "Proper Uncertainty Quantification"
368 - Always represent uncertainty appropriately
369 - Propagate uncertainty through all calculations
370 - Communicate uncertainty clearly and honestly
372 ### "Model-Data Consistency"
373 - Ensure models are consistent with the data generating process
374 - Validate models against the data
375 - Check model adequacy with appropriate diagnostics
## Probability Theory Evaluation Criteria

### Prior Specification
- **Information-Based**: Priors justified by available information
- **Maximum Entropy**: Application of the maximum entropy principle
- **Robustness**: Sensitivity to reasonable prior alternatives
- **Ignorance Representation**: Proper representation of ignorance

### Likelihood Specification
- **Data Generating Process**: Correct modeling of how the data arise
- **Independence Assumptions**: Justified independence assumptions
- **Model Complexity**: Appropriate complexity for the available data
- **Missing Data**: Proper handling of missing or censored data

### Inference and Decision Making
- **Posterior Interpretation**: Correct interpretation of posterior distributions
- **Uncertainty Quantification**: Proper representation of uncertainty
- **Decision Framework**: Appropriate decision-theoretic framework
- **Multiple Comparisons**: Proper handling of multiple comparisons
### Model Checking and Validation
- **Posterior Predictive Checks**: Validation using posterior predictive distributions
- **Model Adequacy**: Assessment of model fit and adequacy
- **Robustness Analysis**: Sensitivity to model assumptions
- **Out-of-Sample Validation**: Performance on new data
### Communication and Interpretation
- **Probability Interpretation**: Correct interpretation of probability statements
- **Uncertainty Communication**: Clear communication of uncertainty
- **Visualization Honesty**: Honest representation of uncertainty in plots
- **Fallacy Avoidance**: Avoidance of common probabilistic fallacies