Connectors list: hide inference connector if inference endpoint does not exist #217585
Conversation
Pinging @elastic/ml-ui (:ml)
if (
  connector.actionTypeId !== '.inference' ||
  (connector.actionTypeId === '.inference' &&
    // @ts-ignore property 'config' does not exist on type ActionConnector
Due to the time-sensitive nature of this change, I did not update the types, as I wanted to keep code churn to a minimum. I can create a follow-up to get rid of this ignore.
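For context, a minimal sketch of what that follow-up might look like: a type guard that narrows the connector to a variant declaring `config`, so the `@ts-ignore` is no longer needed. The `InferenceConnector` shape and the `inferenceId` field below are illustrative assumptions, not the actual Kibana types.

```ts
// Sketch only: these interfaces are illustrative, not Kibana's real types.
interface ActionConnector {
  id: string;
  actionTypeId: string;
}

interface InferenceConnector extends ActionConnector {
  actionTypeId: '.inference';
  config: { inferenceId: string }; // assumed field name for illustration
}

// Type guard so downstream code can read `connector.config` without @ts-ignore.
function isInferenceConnector(
  connector: ActionConnector
): connector is InferenceConnector {
  return connector.actionTypeId === '.inference';
}
```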
I think it would be better to filter on the server side instead of the client making HTTP calls for each .inference connector.
It 100% should be on the server; otherwise there are too many places where this can be used, and it's A) a lot of work to find and update them all, B) we risk missing something.
Also, I agree with Yuliia: making this a blocking HTTP call per connector is not the right choice (at all).
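To illustrate the server-side approach being advocated here, a minimal sketch under assumptions: `getAllConnectors` is a hypothetical stand-in for whatever lists connectors on the server, and the existence check goes through the Elasticsearch client's inference API; the actual follow-up PR may do this differently.

```ts
import type { ElasticsearchClient } from '@kbn/core/server';

// Hypothetical connector shape for this sketch.
interface ConnectorListItem {
  id: string;
  actionTypeId: string;
  config?: { inferenceId?: string };
}

// Stand-in for whatever already lists connectors on the server.
declare function getAllConnectors(): Promise<ConnectorListItem[]>;

// Treat any error (e.g. 404) as "endpoint does not exist".
async function inferenceEndpointExists(
  esClient: ElasticsearchClient,
  inferenceId?: string
): Promise<boolean> {
  if (!inferenceId) return false;
  try {
    await esClient.inference.get({ inference_id: inferenceId });
    return true;
  } catch {
    return false;
  }
}

// Filter once on the server so every consumer of the connectors list
// gets the same result, with no per-connector HTTP calls from the client.
export async function getVisibleConnectors(esClient: ElasticsearchClient) {
  const connectors = await getAllConnectors();
  const visible = await Promise.all(
    connectors.map((c) =>
      c.actionTypeId !== '.inference'
        ? Promise.resolve(true)
        : inferenceEndpointExists(esClient, c.config?.inferenceId)
    )
  );
  return connectors.filter((_, i) => visible[i]);
}
```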
+1 on this.
PR with the server-side fix is up: #217641
@elasticmachine merge upstream
💔 Build Failed
Created a duplicate inference endpoint check function to avoid circular dependency errors. This is a temporary fix due to the time constraints of trying to get this in to fix the release blocker.
Closing this in favor of #217641
Summary
This PR adds a check to see if the inference endpoint exists before displaying inference connectors in the Connectors list.
This means that the default Elastic LLM connector (which is currently shown but simply does not work) will now be hidden for users who do not have it enabled.
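A rough sketch of the client-side behavior described above (names here are placeholders; `endpointExists` stands in for the endpoint check this PR added, and this is not the PR's actual code):

```ts
// Sketch only: `endpointExists` is a placeholder for the actual check.
declare function endpointExists(inferenceId?: string): Promise<boolean>;

interface ConnectorListItem {
  actionTypeId: string;
  config?: { inferenceId?: string };
}

// Keep non-inference connectors as-is; hide `.inference` connectors whose
// backing inference endpoint does not exist (e.g. the default Elastic LLM
// connector for users who do not have it enabled).
export async function filterVisibleConnectors(
  connectors: ConnectorListItem[]
): Promise<ConnectorListItem[]> {
  const visible = await Promise.all(
    connectors.map((c) =>
      c.actionTypeId !== '.inference'
        ? Promise.resolve(true)
        : endpointExists(c.config?.inferenceId)
    )
  );
  return connectors.filter((_, i) => visible[i]);
}
```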
Checklist
Check the PR satisfies following conditions.
Reviewers should verify this PR satisfies this list as well.
The release_note:breaking label should be applied in these situations.
The release_note:* label is applied per the guidelines.
Identify risks
Does this PR introduce any risks? For example, consider risks like hard to test bugs, performance regression, potential of data loss.
Describe the risk, its severity, and mitigation for each identified risk. Invite stakeholders and evaluate how to proceed before merging.