I've recently received approval to open-source the TypeScript definitions for Tekton in jkcfg that I authored internally at $DAYJOB, and wanted to gauge interest in this style of API for this project. I could also see these types of functions making sense in a separate package that uses jkcfg/tekton as a dependency.
For context, here is the file I wrote for generating Pipeline objects:
```ts
import { ResourceDeclarations, ResourceDeclaration } from 'lib/tekton/resource';
import {
  ParameterSpecs,
  Parameters,
  ParameterValue,
  ParameterSpec,
} from 'lib/tekton/param';
import { Workspaces, Workspace } from 'lib/tekton/workspace';
import { taskSpec, taskRef } from 'lib/tekton/task';
import { TaskOptions, TaskRef, Task } from 'lib/tekton/task';
import { KubernetesObject } from 'lib/models';
import { resource } from 'lib/tekton/common';
import { objToNamedObj, objToNameValue } from 'lib/util';

/**
 * Resource models
 */
interface Pipeline extends KubernetesObject {
  spec: PipelineSpec;
}

export interface PipelineSpec {
  tasks: PipelineTaskSpec[];
  resources?: ResourceDeclaration[];
  params?: ParameterSpec[];
  workspaces?: Workspace[];
}

/**
 * Pipeline Task Resources
 */
interface PipelineTaskResource {
  name: string;
  resource: string;
}

interface PipelineTaskOutputResource extends PipelineTaskResource {}

interface PipelineTaskInputResource extends PipelineTaskResource {
  // list of other pipeline tasks the resource has to come from
  from?: string[];
}

/**
 * Object format of PipelineTaskOutputResource[] for convenience
 */
interface PipelineTaskOutputResources {
  [prop: string]: Omit<PipelineTaskOutputResource, 'name'>;
}

/**
 * Object format of PipelineTaskInputResource[] for convenience
 */
interface PipelineTaskInputResources {
  [prop: string]: Omit<PipelineTaskInputResource, 'name'>;
}

interface PipelineTaskConditionSpec {
  conditionRef: string;
  params?: ParameterValue[];
  resources?: PipelineTaskInputResource[];
}

/**
 * Object format of PipelineTaskConditionSpec for convenience
 */
interface PipelineTaskConditions {
  [prop: string]: {
    params?: Parameters;
    resources?: PipelineTaskInputResources;
  };
}

interface PipelineTaskSpec {
  name: string;
  taskRef?: TaskRef;
  taskSpec?: Task['spec'];
  resources?: {
    inputs?: PipelineTaskInputResource[];
    outputs?: PipelineTaskOutputResource[];
  };
  params?: ParameterValue[];
  conditions?: PipelineTaskConditionSpec[];
  retries?: number;
  runAfter?: string[];
}

/**
 * Abstract internal representation of what goes into PipelineTask
 */
interface PipelineTaskOptions {
  name: string;
  // simplify to string while keeping flexibility to use type for final interface
  taskRef?: string;
  taskSpec?: TaskOptions;
  resources?: {
    inputs?: PipelineTaskInputResources;
    outputs?: PipelineTaskOutputResources;
  };
  params?: Parameters;
  runAfter?: string[];
  retries?: number;
  conditions?: PipelineTaskConditions;
}

export interface PipelineOptions {
  tasks: PipelineTaskOptions[];
  /**
   * Technically these are not the same in Pipelines, but shrug:
   * https://github.com/tektoncd/pipeline/blob/release-v0.10.x/pkg/apis/pipeline/v1alpha2/pipeline_types.go#L170
   */
  resources?: ResourceDeclarations;
  params?: ParameterSpecs;
  workspaces?: Workspaces;
}

/**
 * Resource creation functions
 */

/**
 * Creates PipelineTaskSpec from PipelineTaskOptions
 * @param opts
 */
export const pipelineTask = (opts: PipelineTaskOptions): PipelineTaskSpec => {
  const { name } = opts;
  const spec: PipelineTaskSpec = { name };
  if (opts.taskRef) spec.taskRef = taskRef(opts.taskRef);
  if (opts.taskSpec) spec.taskSpec = taskSpec(opts.taskSpec);
  if (opts.retries) spec.retries = opts.retries;
  if (opts.runAfter) spec.runAfter = opts.runAfter;
  if (opts.resources) spec.resources = {};
  if (opts.resources?.inputs) {
    spec.resources!.inputs = objToNamedObj<PipelineTaskInputResource>(
      opts.resources.inputs,
    );
  }
  if (opts.resources?.outputs) {
    spec.resources!.outputs = objToNamedObj<PipelineTaskOutputResource>(
      opts.resources.outputs,
    );
  }
  if (opts.params) {
    spec.params = objToNameValue(opts.params) as ParameterValue[];
  }
  if (opts.conditions) {
    spec.conditions = (objToNamedObj(opts.conditions) as unknown) as PipelineTaskConditionSpec[];
  }
  return spec;
};

export const pipelineSpec = (opts: PipelineOptions): PipelineSpec => {
  const spec: PipelineSpec = {
    tasks: opts.tasks.map(t => pipelineTask(t)),
  };
  if (opts.resources) spec.resources = objToNamedObj(opts.resources);
  if (opts.workspaces) spec.workspaces = objToNamedObj(opts.workspaces);
  if (opts.params) spec.params = objToNamedObj(opts.params);
  return spec;
};

/**
 * Creates Pipeline object
 * @param name
 * @param opts
 */
export const pipeline = (name: string, opts: PipelineOptions): Pipeline =>
  resource(name, 'Pipeline', pipelineSpec(opts));
```
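To give a sense of the calling convention, here is a minimal, hypothetical usage sketch; the pipeline, task, and param names are invented, and I'm assuming `ParameterSpecs`/`Parameters` follow the same name-keyed convention as the maps shown elsewhere in this issue:

```ts
// Hypothetical usage of pipeline(); names are illustrative only.
const ciPipeline = pipeline('ci', {
  params: {
    gitrepourl: { description: 'git repository url' },
  },
  tasks: [
    {
      name: 'unit-tests',
      taskRef: 'run-tests', // converted to a TaskRef by taskRef()
      params: { url: '$(params.gitrepourl)' },
      retries: 2,
    },
  ],
});
```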
To enable the terser, jsonnet-style approach of defining Kubernetes named arrays as maps, I added additional types on top of the types that describe the raw spec (those raw-spec types could probably be scrapped and replaced with this project's types with a little work):
```ts
/**
 * Pipeline Task Resources
 */
interface PipelineTaskResource {
  name: string;
  resource: string;
}

interface PipelineTaskOutputResource extends PipelineTaskResource {}

interface PipelineTaskInputResource extends PipelineTaskResource {
  // list of other pipeline tasks the resource has to come from
  from?: string[];
}

/**
 * Object format of PipelineTaskOutputResource[] for convenience
 */
interface PipelineTaskOutputResources {
  [prop: string]: Omit<PipelineTaskOutputResource, 'name'>;
}

/**
 * Object format of PipelineTaskInputResource[] for convenience
 */
interface PipelineTaskInputResources {
  [prop: string]: Omit<PipelineTaskInputResource, 'name'>;
}
```
By omitting `name` from the value and using it as the map key, I can define things a lot more concisely, which brings joy to all, I think.
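For instance, a hypothetical input-resource block in the map form versus the raw-spec array it stands for (the resource name `git-source` and the task name `clone` are made up for illustration):

```ts
// Map form accepted by the convenience types: the key becomes `name`.
const inputs: PipelineTaskInputResources = {
  source: { resource: 'git-source', from: ['clone'] },
};

// Equivalent raw-spec array form it expands to:
const rawInputs: PipelineTaskInputResource[] = [
  { name: 'source', resource: 'git-source', from: ['clone'] },
];
```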
The really neat part, and I think where most of the value lies, is that by using TypeScript generics I can convert these "objectified" versions of arrays back into arrays without losing any type checking:
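Continuing the sketch above, the generic parameter tells the conversion helper what the element type of the result is, so the output stays fully typed (this mirrors how `pipelineTask` uses it in the Pipeline file earlier):

```ts
// `asArray` is typed as PipelineTaskInputResource[] rather than any[],
// so downstream code keeps full type checking.
const asArray: PipelineTaskInputResource[] =
  objToNamedObj<PipelineTaskInputResource>(inputs);
```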
The generic utils for doing that are pretty stupidly simple, and might make sense to go into the @jkcfg/kubernetes project?
```ts
/**
 * Easily turn objects into arrays where the original key name is now `name:`
 * and the value is now `value:`
 *
 * e.g, { foo: bar } => [{ name: foo, value: bar }]
 *
 * @param obj
 */
export const objToNameValue = (
  obj: { [prop: string]: any },
  valueKeyName = 'value',
) => Object.keys(obj).map(key => ({ name: key, [valueKeyName]: obj[key] }));

/**
 * Easily turn objects into arrays of objects that have a `name` field based on
 * their original key name. Contents of obj[key] are spread alongside name.
 * @param obj
 */
export const objToNamedObj = <T = NamedObj>(obj: {
  [prop: string]: any;
}): T[] => Object.keys(obj).map(key => ({ name: key, ...obj[key] }));

/**
 * Turns ['name1', 'name2', 'name3' ] => [{ name: name1, ... }]
 * @param arr
 */
export const arrToNamedObj = (arr: string[]) => arr.map(name => ({ name }));
```
Additionally, since most Tekton resources end up being scheduled as part of a TriggerTemplate, I separate the spec generation from the resource generation:
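Concretely, that split is the `pipelineSpec`/`pipeline` pair in the snippet above, and the same shape repeats for the other resources in my internal code:

```ts
// Taken from the Pipeline snippet above: the spec function builds only the
// spec, and the thin resource function wraps it with apiVersion/kind/metadata
// via resource(). Anything that needs to embed a spec rather than apply a
// standalone object can call the spec function directly.
export const pipeline = (name: string, opts: PipelineOptions): Pipeline =>
  resource(name, 'Pipeline', pipelineSpec(opts));
```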
As an example of what this all looks like in the context of generating a Tekton TriggerTemplate:
```ts
const prTriggerTemplate = triggerTemplate(
  'pull-request',
  {
    gitrefafter: {
      description: 'git ref pointing at HEAD of pull request',
    },
    gitrefbefore: {
      description: 'git ref pointing at HEAD of base branch in pull request',
    },
    gitrepourl: {
      description: 'git repository url',
    },
    pullrequesturl: {
      description: 'pull request url',
    },
    gitbranch: {
      description: 'head branch in pull request',
    },
    pullrequestnum: {
      description: 'pull request number',
    },
  },
  [
    pipelineRun(k8sManifestsPresubmit.metadata?.name!, {
      pipelineRef: k8sManifestsPresubmit.metadata?.name,
      resources: {
        source: {
          resourceSpec: {
            type: ResourceTypes.git,
            params: objToNameValue({
              revision: '$(params.gitbranch)',
              url: '$(params.gitrepourl)',
            }) as ResourceParameter[],
          },
        },
      },
      params: {
        branch: '$(params.gitbranch)',
      },
      serviceAccountName: saName,
    }),
    pipelineRun(checkForLinkedIssues.metadata?.name! + '-$(uid)', {
      // dont generate name so we can control the run name and produce URL
      // reliably
      generateName: false,
      pipelineRef: checkForLinkedIssues.metadata?.name!,
      serviceAccountName: saName,
      params: {
        url: pipelineRunLogsUrl(checkForLinkedIssues.metadata?.name + '-$(uid)'),
      },
      resources: { pr },
    }),
  ].map(r => {
    r.metadata!.labels = labels;
    r.metadata!.annotations = annos;
    return r;
  }),
);
```
Using array-based values everywhere would make a declaration like this considerably longer and add a bunch of boilerplate.
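For comparison, just the two `params` maps from the first `pipelineRun` above, written against the raw array-based spec (following the `objToNameValue` docstring):

```ts
// Raw array-based equivalent of `params: { branch: '$(params.gitbranch)' }`:
const rawParams = [{ name: 'branch', value: '$(params.gitbranch)' }];

// ...and of the nested resource params built with objToNameValue above:
const rawResourceParams = [
  { name: 'revision', value: '$(params.gitbranch)' },
  { name: 'url', value: '$(params.gitrepourl)' },
];
```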
If there is any interest in these kinds of patterns, I would be happy to refactor my existing code to compose with what already exists in this repository and add some of our internal tasks/pipelines/etc. as examples. Either way, I honestly just adore working with this project and am interested in your thoughts <3