There should be a Wikipedia LLM whose sole purpose is to check that the tone of a text is objective and matches Wikipedia's standards.
The LLM should flag any changes it would make, and if the number of changes exceeds a threshold, the edit should be escalated for review by a human.
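The threshold-and-escalate workflow could be sketched as follows. This is a minimal illustration, not a real implementation: `suggest_tone_edits` is a hypothetical placeholder for the LLM call, and the threshold value is an assumption.

```python
def suggest_tone_edits(text: str) -> list[str]:
    """Placeholder for an LLM that returns the tone changes it would make.

    A real system would call a language model here; this stub just flags
    sentences containing an obviously promotional word, for illustration.
    """
    return [s for s in text.split(". ") if "amazing" in s.lower()]

def review_edit(text: str, threshold: int = 2) -> dict:
    """Collect suggested tone changes; escalate to a human above the threshold."""
    changes = suggest_tone_edits(text)
    return {
        "suggested_changes": changes,
        # Only edits with many suggested changes go to a human reviewer.
        "needs_human_review": len(changes) > threshold,
    }

result = review_edit("This amazing topic is amazing. It is neutral. Amazing stuff here.")
```

With two flagged sentences and a threshold of two, this example stays below the escalation bar; a third flagged sentence would route the edit to a human.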