FIX: Update spam controller action should consider seeded LLM properly (#1053)
The seeded LLM setting `SiteSetting.ai_spam_detection_model_allowed_seeded_models` returns a _string_ of IDs separated by pipes. Calling the `_map` variant of the setting returns an array of strings. We were previously checking for the ID with a custom prefix identifier (`"custom:#{llm_model_id}"`), but we should instead be checking against the stringified ID.
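A minimal sketch of the bug, outside of Discourse: assuming the `_map` helper simply splits the pipe-separated setting on `|` and that seeded models use negative IDs, the prefixed lookup can never match, while the stringified ID does.

```ruby
# Illustrative sketch (not actual Discourse code): seeded models use
# negative IDs, and the "_map" setting variant yields an array of strings.
allowed_seeded_models = "-1|-2"                 # raw site setting value
allowed_map = allowed_seeded_models.split("|")  # => ["-1", "-2"]

llm_model_id = -2 # seeded model selected in the admin UI

# Old check: the prefixed identifier is never present in an array of bare IDs.
old_check = allowed_map.include?("custom:#{llm_model_id}") # => false, always

# New check: compare the stringified ID directly.
new_check = allowed_map.include?(llm_model_id.to_s)        # => true

puts old_check # => false
puts new_check # => true
```

This is why the controller guard rejected a seeded model even when its ID was explicitly listed in the allowed setting.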
keegangeorge authored Jan 8, 2025
1 parent 404092a commit 24b69bf
Showing 2 changed files with 22 additions and 2 deletions.
2 changes: 1 addition & 1 deletion — app/controllers/discourse_ai/admin/ai_spam_controller.rb

```diff
@@ -15,7 +15,7 @@ def update
     llm_model_id = updated_params[:llm_model_id] = allowed_params[:llm_model_id]
     if llm_model_id.to_i < 0 &&
          !SiteSetting.ai_spam_detection_model_allowed_seeded_models_map.include?(
-           "custom:#{llm_model_id}",
+           llm_model_id.to_s,
          )
       return(
         render_json_error(
```
22 changes: 21 additions & 1 deletion — spec/requests/admin/ai_spam_controller_spec.rb

```diff
@@ -37,7 +37,7 @@

       expect(response.status).to eq(422)

-      SiteSetting.ai_spam_detection_model_allowed_seeded_models = seeded_llm.identifier
+      SiteSetting.ai_spam_detection_model_allowed_seeded_models = seeded_llm.id.to_s

       put "/admin/plugins/discourse-ai/ai-spam.json",
           params: {
@@ -49,6 +49,26 @@
       expect(response.status).to eq(200)
     end

+    it "ensures that seeded llm ID is properly passed and allowed" do
+      seeded_llm = Fabricate(:seeded_model)
+
+      SiteSetting.ai_spam_detection_model_allowed_seeded_models = [
+        llm_model.id,
+        seeded_llm.id,
+      ].join("|")
+
+      put "/admin/plugins/discourse-ai/ai-spam.json",
+          params: {
+            is_enabled: true,
+            llm_model_id: seeded_llm.id,
+            custom_instructions: "custom instructions",
+          }
+      expect(SiteSetting.ai_spam_detection_model_allowed_seeded_models).to eq(
+        "#{llm_model.id}|#{seeded_llm.id}",
+      )
+      expect(response.status).to eq(200)
+    end
+
     it "can not enable spam detection without a model selected" do
       put "/admin/plugins/discourse-ai/ai-spam.json",
           params: {
```
