Build (branch-4.0, Scala 2.13, Hadoop 3, JDK 17) #41
build_branch40.yml
on: workflow_dispatch
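The `workflow_dispatch` trigger means this run was started manually (from the Actions UI or the API) rather than by a push or a schedule. A minimal sketch of such a trigger section is below; this is an illustrative assumption, not the actual contents of `build_branch40.yml`, and the build step is a placeholder:

```yaml
# Hypothetical, trimmed-down sketch of a manually triggered workflow.
# Job names and steps are illustrative; the real build_branch40.yml differs.
name: Build (branch-4.0, Scala 2.13, Hadoop 3, JDK 17)

on:
  workflow_dispatch:   # no push/schedule triggers: runs only when dispatched manually

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Build (placeholder step)
        run: ./build/mvn -DskipTests package
```

A run created this way shows `on: workflow_dispatch` in its summary, as seen above.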
Run / Check changes (35s)
Run / Protobuf breaking change detection and Python CodeGen check (0s)
Run / Run TPC-DS queries with SF=1 (1h 24m)
Run / Run Docker integration tests (1h 29m)
Run / Run Spark on Kubernetes Integration test (1h 1m)
Run / Run Spark UI tests (0s)
Matrix: Run / build
Run / Build modules: sparkr (25m 59s)
Run / Linters, licenses, and dependencies (28m 41s)
Run / Documentation generation (0s)
Matrix: Run / pyspark
Annotations
10 errors and 6 warnings
Errors:

- Run / Run Spark on Kubernetes Integration test: HashSet() did not contain "decomtest-ad38fd950528ff7e-exec-1".
- Run / Run Spark on Kubernetes Integration test: HashSet() did not contain "decomtest-4f43ff950529f1f9-exec-1".
- Run / Run Spark on Kubernetes Integration test: sleep interrupted
- Run / Run Spark on Kubernetes Integration test: sleep interrupted
- Run / Run Spark on Kubernetes Integration test: Task io.fabric8.kubernetes.client.utils.internal.SerialExecutor$$Lambda$825/0x00007f2b1c6e1a30@5a554667 rejected from java.util.concurrent.ThreadPoolExecutor@254efbfc[Shutting down, pool size = 1, active threads = 1, queued tasks = 0, completed tasks = 333]
- Run / Run Spark on Kubernetes Integration test: Task io.fabric8.kubernetes.client.utils.internal.SerialExecutor$$Lambda$825/0x00007f2b1c6e1a30@3d964c7d rejected from java.util.concurrent.ThreadPoolExecutor@254efbfc[Shutting down, pool size = 2, active threads = 2, queued tasks = 0, completed tasks = 332]
- Run / Run Spark on Kubernetes Integration test: HashSet() did not contain "decomtest-2de45795053cf627-exec-1".
- Run / Run Spark on Kubernetes Integration test: HashSet() did not contain "decomtest-b2639995053de5ab-exec-1".
- Run / Run Spark on Kubernetes Integration test: HashSet() did not contain "decomtest-c2b5ef9505419789-exec-1".
- Run / Run Spark on Kubernetes Integration test: Status(apiVersion=v1, code=404, details=StatusDetails(causes=[], group=null, kind=pods, name=spark-test-app-7de3d112a7274533b897c988205357f9-driver, retryAfterSeconds=null, uid=null, additionalProperties={}), kind=Status, message=pods "spark-test-app-7de3d112a7274533b897c988205357f9-driver" not found, metadata=ListMeta(_continue=null, remainingItemCount=null, resourceVersion=null, selfLink=null, additionalProperties={}), reason=NotFound, status=Failure, additionalProperties={})..
Warnings:

- Run / Base image build: Failed to save: Failed to CreateCacheEntry: Received non-retryable error: Failed request: (409) Conflict: cache entry with the same key, version, and scope already exists
- Run / Run TPC-DS queries with SF=1: The Ubuntu-20.04 brownout takes place from 2025-02-01. For more details, see https://github.com/actions/runner-images/issues/11101
- Run / Build modules: sparkr: Cache not found for keys: sparkr-coursier-649032160d1ffa7f1722df9b15695c315fadaedc302b084342f5a13723baf6ea, sparkr-coursier-
- Run / Linters, licenses, and dependencies: Cache not found for keys: docs-maven-742be464ad063e0623e7cc1bd513cd18b8ae673281ad75a075e6b7bad43c396c, docs-maven-
- Run / Linters, licenses, and dependencies: Cache not found for keys: docs-coursier-649032160d1ffa7f1722df9b15695c315fadaedc302b084342f5a13723baf6ea, docs-coursier-
- Run / Build modules: core, unsafe, kvstore, avro, utils, network-common, network-shuffle, repl, launcher, examples, sketch, variant: Failed to save: Failed to CreateCacheEntry: Received non-retryable error: Failed request: (409) Conflict: cache entry with the same key, version, and scope already exists
Artifacts
Produced during runtime
Name | Size
---|---
test-results-core, unsafe, kvstore, avro, utils, network-common, network-shuffle, repl, launcher, examples, sketch, variant--17-hadoop3-hive2.3 | 801 KB