
Build / Python-only (branch-4.1) #11

Triggered via schedule: November 12, 2025 12:07
Status: Success
Total duration: 1h 50m 27s
Artifacts: 9
Jobs

Run / Check changes (51s)
Run / Base image build (53s)
Run / Protobuf breaking change detection and Python CodeGen check (0s)
Run / Java 17 build with Maven (0s)
Run / Java 25 build with Maven (0s)
Run / Run TPC-DS queries with SF=1 (0s)
Run / Run Docker integration tests (0s)
Run / Run Spark on Kubernetes Integration test (0s)
Run / Run Spark UI tests (0s)
Matrix: Run / build
Run / Build modules: sparkr (0s)
Run / Linters, licenses, and dependencies (0s)
Run / Documentation generation (0s)
Matrix: Run / pyspark

Annotations

3 warnings, all from Run / Build modules: pyspark-core, pyspark-errors, pyspark-streaming, pyspark-logger:

- No files were found with the provided path: **/target/test-reports/*.xml **/target/surefire-reports/*.xml. No artifacts will be uploaded. (A quick local check for these patterns is sketched after this list.)
- WARNING conda.cli.main_config:_set_key(451): Key auto_activate_base is an alias of auto_activate; setting value with latter
- The 'defaults' channel might have been added implicitly. If this is intentional, add 'defaults' to the 'channels' list. Otherwise, consider setting 'conda-remove-defaults' to 'true'.
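
The first warning comes from a test-report upload step whose globs matched nothing, likely because the Maven-based stages that write target/test-reports and target/surefire-reports did not run in this Python-only build. A minimal sketch for checking those patterns locally, assuming Python 3 and the repository root as the working directory (the script is illustrative, not part of the workflow):

# Illustrative only: check whether any files match the report globs
# quoted in the warning above. Run from the repository root.
from pathlib import Path

PATTERNS = [
    "**/target/test-reports/*.xml",
    "**/target/surefire-reports/*.xml",
]

# Collect every file matching either recursive glob pattern.
matches = [path for pattern in PATTERNS for path in Path(".").glob(pattern)]

if matches:
    print(f"{len(matches)} report file(s) found:")
    for path in matches:
        print(f"  {path}")
else:
    print("No matching report files; an upload step using these globs would warn and skip.")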

Artifacts

Produced during runtime
Name | Size | Digest
apache~spark~VX1LTI.dockerbuild | 27.1 KB | sha256:014d7257674eeebe688ee050abb3e45ec4f25b38ea6d15367ab83c68c3d97653
test-results-pyspark-connect--17-hadoop3-hive2.3-python3.11 | 228 KB | sha256:d6ad07485b58869a1eb6ada07abd454b57f4875e6cc3c27e3c47761b31772c36
test-results-pyspark-mllib, pyspark-ml, pyspark-ml-connect, pyspark-pipelines--17-hadoop3-hive2.3-python3.11 | 176 KB | sha256:25eb7699f172eada64f083c74e077cff056def764e56540e92aae3a158233bcd
test-results-pyspark-pandas--17-hadoop3-hive2.3-python3.11 | 196 KB | sha256:e9eacd6b28ad2f6fa9551efc72a2f251d18c2c1b6608aee1698aea541def28d9
test-results-pyspark-pandas-connect--17-hadoop3-hive2.3-python3.11 | 206 KB | sha256:3c74682d1f82c3def303cab341afe27e52cd739b276e11d03164c5767d1a6961
test-results-pyspark-pandas-slow--17-hadoop3-hive2.3-python3.11 | 173 KB | sha256:c274b5d808677d82010bf9dbaf88a217b4c8a8b2b35e92cc4ca69b48528e2bbf
test-results-pyspark-pandas-slow-connect--17-hadoop3-hive2.3-python3.11 | 183 KB | sha256:45f4ed8c2058ccaba49fd810c5c137fd34f92314d622c6c407244f1e62d24dbb
test-results-pyspark-sql, pyspark-resource, pyspark-testing--17-hadoop3-hive2.3-python3.11 | 250 KB | sha256:ac3b5abfcb837a544df1b878e4897c819bad494e57cabd93b1d9286a128a9597
test-results-pyspark-structured-streaming, pyspark-structured-streaming-connect--17-hadoop3-hive2.3-python3.11 | 41.5 KB | sha256:c982659224bfc441f6fefb47c535717d2fc36b7c6419531f21ebf62cdbc17270