LLM Safety Metric


An LLM Safety Metric is a safety evaluation metric (an AI safety measure) that quantifies large language model risks and LLM harmful behaviors.
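
As an illustration only, a minimal sketch of one such metric is an unsafe-response rate: the fraction of sampled model responses flagged as harmful by a safety judge (human or classifier). The function name and label format below are assumptions for this example, not a metric defined by this page.

<pre>
from typing import List

def unsafe_response_rate(safety_labels: List[bool]) -> float:
    """Fraction of model responses judged unsafe.

    `safety_labels` is a hypothetical per-response list where True marks a
    response flagged as harmful. A lower score indicates a safer model.
    """
    if not safety_labels:
        raise ValueError("safety_labels must be non-empty")
    return sum(safety_labels) / len(safety_labels)

# Example: 2 of 5 sampled responses were flagged as harmful.
print(unsafe_response_rate([False, True, False, True, False]))  # 0.4
</pre>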