P-Doom Measure
A P-Doom Measure is an AI risk probability measure that expresses an estimated probability that advanced artificial intelligence causes an existential catastrophe.
- AKA: Probability of Doom, P(Doom), AI Doom Probability.
- Context:
- It can typically express Individual AI Risk Assessments through percentage values ranging from 0% to 100%.
- It can typically capture Existential Risk Beliefs about artificial general intelligence development timelines.
- It can typically inform AI Safety Research Priorities through risk quantification frameworks.
- It can typically facilitate AI Risk Communication between AI safety researchers and policy makers.
- It can typically vary based on AGI Timeline Assumptions and AI capability trajectory beliefs.
- ...
- It can often reflect Personal Epistemic Uncertainty about AI development pathways.
- It can often influence AI Governance Decisions through risk perception metrics.
- It can often serve as Community Coordination Signals within AI safety communities.
- It can often incorporate Multiple Risk Factors including AI misalignment risks and AI capability risks.
- ...
- It can range from being a Low P-Doom Measure to being a High P-Doom Measure, depending on its AI risk assessment value.
- It can range from being a Near-Term P-Doom Measure to being a Long-Term P-Doom Measure, depending on its AI timeline scope.
- ...
- It can be derived from AI Safety Expert Surveys for consensus assessments.
- It can be updated with New AI Capability Evidence for dynamic risk tracking.
- It can be compared across Different AI Safety Organizations for perspective analysis.
- It can be incorporated into AI Policy Frameworks for regulatory decision support.
- It can be communicated through Public AI Safety Discussions for awareness building.
- ...
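The survey-derivation and evidence-updating bullets above can be sketched numerically. This is a minimal illustration, not any organization's actual method; the individual estimates, the likelihood ratio, and the helper names are hypothetical assumptions:

```python
# Sketch: aggregate individual p(doom) estimates into a survey-style
# consensus, then update that consensus with new capability evidence
# using a Bayesian odds-form update. All numbers are illustrative.
from statistics import median

def aggregate_p_doom(estimates):
    """Median of individual p(doom) values (hypothetical figures)."""
    return median(estimates)

def update_p_doom(prior, likelihood_ratio):
    """Odds-form Bayesian update: evidence with LR > 1 raises the estimate."""
    prior_odds = prior / (1 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

# Illustrative individual estimates as fractions (not real survey data).
estimates = [0.05, 0.10, 0.20, 0.50, 0.90]
consensus = aggregate_p_doom(estimates)

# A new capability result judged twice as likely under the doom hypothesis.
updated = update_p_doom(consensus, likelihood_ratio=2.0)
print(f"consensus={consensus:.2f}, updated={updated:.2f}")
```

The median is used rather than the mean so that a single extreme respondent does not dominate the aggregate; the odds-form update keeps the result strictly between 0% and 100%.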
- Example(s):
- Individual P-Doom Measures, such as:
- AI Researcher P-Doom Measures, such as:
- Eliezer Yudkowsky P-Doom Measure (reported as >90%), expressing high existential concern.
- Stuart Russell P-Doom Measure, reflecting moderate AI risk assessment.
- AI Organization P-Doom Measures, such as:
- MIRI P-Doom Measure, representing organizational risk consensus.
- DeepMind Safety Team P-Doom Measure, indicating industry safety perspective.
- Survey-Based P-Doom Measures, such as:
- AI Impacts Survey P-Doom Measure (2022), aggregating expert opinions.
- FHI Survey P-Doom Measure, compiling academic risk assessments.
- Conditional P-Doom Measures, such as:
- P-Doom Given No AI Alignment Progress, assessing worst-case scenarios.
- P-Doom Given Current Trajectory, evaluating status quo risks.
- ...
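Conditional P-Doom Measures like those above can be combined into an unconditional estimate via the law of total probability. The scenario names and all figures below are hypothetical assumptions for illustration:

```python
# Sketch: unconditional p(doom) as a probability-weighted mix of
# conditional readings. All scenarios and numbers are hypothetical.
scenarios = {
    # scenario: (P(scenario), P(doom | scenario))
    "no alignment progress": (0.3, 0.60),
    "current trajectory":    (0.5, 0.20),
    "strong alignment wins": (0.2, 0.02),
}

# Law of total probability: sum over scenarios of P(s) * P(doom | s).
# The P(scenario) weights must sum to 1 for the mix to be a probability.
p_doom = sum(p_s * p_doom_given_s for p_s, p_doom_given_s in scenarios.values())
print(f"p(doom) = {p_doom:.3f}")
```

This decomposition makes disagreements easier to locate: two people with the same unconditional p(doom) may differ sharply on either the scenario weights or the conditional values.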
- Counter-Example(s):
- General AI Risk Measures, which lack specific doom probability quantification.
- AI Capability Predictions, which focus on performance milestones rather than existential risk probability.
- AI Timeline Estimates, which predict AGI arrival dates without catastrophic outcome probability.
- See: Humanitarian Measure, AI Existential Risk, AI Safety Research, AGI Timeline, AI Alignment Problem.