LLM as Judge Bias Mitigation System

An LLM as Judge Bias Mitigation System is a bias mitigation system that identifies, measures, and reduces systematic biases (such as position bias, verbosity bias, and self-preference bias) in large language model evaluation decisions to ensure fair and consistent judgments.
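As a concrete illustration, position bias (favoring whichever response appears first) can be mitigated by judging the same pair of responses in both orderings and accepting only a verdict that is consistent across both. The following is a minimal sketch of this idea; the `Judge` callable, the `debiased_pairwise_verdict` function, and all parameter names are hypothetical stand-ins, not part of any specific system described on this page.

```python
from collections import Counter
from typing import Callable

# Hypothetical judge interface: given a prompt and two candidate
# responses (shown in positions "A" and "B"), it returns "A", "B",
# or "tie". In practice this would wrap an LLM API call.
Judge = Callable[[str, str, str], str]

def debiased_pairwise_verdict(judge: Judge, prompt: str,
                              resp_1: str, resp_2: str) -> str:
    """Mitigate position bias by judging both candidate orderings."""
    first = judge(prompt, resp_1, resp_2)    # resp_1 shown in position "A"
    second = judge(prompt, resp_2, resp_1)   # resp_1 shown in position "B"

    # Map each positional verdict back onto the underlying response.
    map_first = {"A": "resp_1", "B": "resp_2", "tie": "tie"}
    map_second = {"A": "resp_2", "B": "resp_1", "tie": "tie"}
    votes = Counter([map_first[first], map_second[second]])

    # Accept only a winner that is consistent across both orderings;
    # any order-dependent flip is treated as a tie.
    for candidate in ("resp_1", "resp_2"):
        if votes[candidate] == 2:
            return candidate
    return "tie"
```

The same swap-and-aggregate pattern generalizes to other order-sensitive biases: a verdict is trusted only when it survives a perturbation that should not change the correct answer.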