
use kubectl create to provision calico CRDs #243

Merged 1 commit into main on Dec 17, 2024
Conversation

@tariq1890 (Contributor) commented Dec 17, 2024

This PR fixes the regression introduced by PR #175

In PR #175 I switched the kubectl commands from create to apply, since apply is idempotent and therefore safer to run in a retry loop. However, the Installation CRD of the Calico Operator is so large (1.1 MB) that it exceeds the Kubernetes annotation size limit of 262144 bytes (every resource created via kubectl apply has the full object stored in its kubectl.kubernetes.io/last-applied-configuration annotation).
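The size constraint above can be sketched in shell. The 262144-byte threshold is the real Kubernetes annotation limit, but the helper name and the demo files below are illustrative, not part of holodeck:

```shell
#!/usr/bin/env bash
# kubectl apply stores the full object in the
# kubectl.kubernetes.io/last-applied-configuration annotation, and
# Kubernetes caps annotation size at 262144 bytes, so very large
# manifests (like Calico's ~1.1 MB Installation CRD) must be created
# rather than applied. Hypothetical helper, not from holodeck:
ANNOTATION_LIMIT=262144

choose_verb() {
    local manifest="$1" size
    size=$(wc -c < "$manifest")
    if [ "$size" -gt "$ANNOTATION_LIMIT" ]; then
        echo "create"   # too big for last-applied-configuration
    else
        echo "apply"    # idempotent, safe in a retry loop
    fi
}

# Demo with synthetic files standing in for the real manifests:
head -c 300000 /dev/zero > big.yaml    # stand-in for the large CRD manifest
head -c 10000  /dev/zero > small.yaml  # stand-in for custom-resources.yaml
choose_verb big.yaml    # -> create
choose_verb small.yaml  # -> apply
```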

Since the holodeck e2e flakes happen when executing the following command in a retry loop, it should suffice to kubectl apply just the calico custom-resources.yaml file, which is small enough to fit in the annotation:

with_retry 3 10s kubectl --kubeconfig $KUBECONFIG apply -f https://raw.githubusercontent.com/projectcalico/calico/${CALICO_VERSION}/manifests/custom-resources.yaml
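with_retry is a holodeck helper whose real implementation lives in the repo; a minimal sketch of the retry-loop behavior it provides might look like this (bash, with the argument convention assumed from the usage above):

```shell
#!/usr/bin/env bash
# Sketch of a with_retry helper: run a command up to N times, sleeping
# between attempts. Usage: with_retry <attempts> <delay> <command...>
# (Assumption: the actual holodeck helper may differ in details.)
with_retry() {
    local attempts="$1" delay="$2"
    shift 2
    local i
    for ((i = 1; i <= attempts; i++)); do
        if "$@"; then
            return 0
        fi
        if ((i < attempts)); then
            echo "attempt $i/$attempts failed; retrying in $delay" >&2
            sleep "${delay%s}"   # accept "10s"-style delay values
        fi
    done
    return 1
}

# Example: a command that fails twice, then succeeds on the third try.
n=0
flaky() { n=$((n + 1)); [ "$n" -ge 3 ]; }
with_retry 3 0s flaky && echo "succeeded after $n attempts"
```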

@tariq1890 requested a review from shivakunv December 17, 2024 01:10
@tariq1890 changed the title from "use kubectl apply to create calico CRDs" to "use kubectl create to provision calico CRDs" Dec 17, 2024
Signed-off-by: Tariq Ibrahim <tibrahim@nvidia.com>
@shivakunv (Contributor) left a comment

LGTM

@tariq1890 merged commit f478ba5 into main on Dec 17, 2024
5 checks passed
@tariq1890 deleted the kubectl-create-crds branch December 17, 2024 06:22