
Build / Python-only (branch-4.1) #6
Triggered via schedule: November 7, 2025 12:06
Status: Success
Total duration: 1h 51m 43s
Artifacts: 9
Run / Check changes (38s)
Run / Base image build (1m 3s)
Run / Protobuf breaking change detection and Python CodeGen check (0s)
Run / Java 17 build with Maven (0s)
Run / Java 25 build with Maven (0s)
Run / Run TPC-DS queries with SF=1 (0s)
Run / Run Docker integration tests (0s)
Run / Run Spark on Kubernetes Integration test (0s)
Run / Run Spark UI tests (0s)
Matrix: Run / build
Run / Build modules: sparkr (0s)
Run / Linters, licenses, and dependencies (0s)
Run / Documentation generation (0s)
Matrix: Run / pyspark

Annotations

3 warnings
Run / Build modules: pyspark-core, pyspark-errors, pyspark-streaming, pyspark-logger
- No files were found with the provided path: **/target/test-reports/*.xml **/target/surefire-reports/*.xml. No artifacts will be uploaded.
- WARNING conda.cli.main_config:_set_key(451): Key auto_activate_base is an alias of auto_activate; setting value with latter
- The 'defaults' channel might have been added implicitly. If this is intentional, add 'defaults' to the 'channels' list. Otherwise, consider setting 'conda-remove-defaults' to 'true'.
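The last warning suggests its own fix: declare the channel explicitly. A minimal sketch of what that could look like, assuming the runner's `.condarc` is where the job's conda configuration lives (the file location and any other channels are assumptions, not taken from this run):

```yaml
# .condarc (sketch): list 'defaults' explicitly so conda stops
# warning that it was added implicitly.
channels:
  - defaults
```

Alternatively, if the defaults channel is not actually wanted, setting 'conda-remove-defaults' to 'true' (as the warning proposes) silences it the other way.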

Artifacts

Produced during runtime
Name (Size): Digest
apache~spark~NPQM5N.dockerbuild (27.5 KB): sha256:cfa421a9bcf8369fe97c87805d7bef8de0ecae861efac851b4fa7eb4d5e99535
test-results-pyspark-connect--17-hadoop3-hive2.3-python3.11 (227 KB): sha256:6c9af65b6706da73480792724769c014493959fb6342b614f690dcf2b51703bf
test-results-pyspark-mllib, pyspark-ml, pyspark-ml-connect, pyspark-pipelines--17-hadoop3-hive2.3-python3.11 (182 KB): sha256:0a1c9276bb9bf186227923377114978328d026a34b9cc7b04e387a87c6d5ffbe
test-results-pyspark-pandas--17-hadoop3-hive2.3-python3.11 (196 KB): sha256:be2aa2675566f35828c0a90ad632d1242542b8513801330d2df31f96ab7470ac
test-results-pyspark-pandas-connect--17-hadoop3-hive2.3-python3.11 (206 KB): sha256:cc054bd3fed9e7994c19ac870e99b1f633eb40065856f54c3549fd07ab197183
test-results-pyspark-pandas-slow--17-hadoop3-hive2.3-python3.11 (173 KB): sha256:ef834a33d16c6c4a1c0c1797a847ba47d172a49b6d52840143a1f0dfd4d6ad1b
test-results-pyspark-pandas-slow-connect--17-hadoop3-hive2.3-python3.11 (183 KB): sha256:471124df355636fd12637d5d64b57f19e48e441575402b6d8afb4de33210123d
test-results-pyspark-sql, pyspark-resource, pyspark-testing--17-hadoop3-hive2.3-python3.11 (250 KB): sha256:9a1d7d945b2ed86921d327c570f3109b33edfb7c120358f79a8c7e6fe34d16a0
test-results-pyspark-structured-streaming, pyspark-structured-streaming-connect--17-hadoop3-hive2.3-python3.11 (41.6 KB): sha256:669ef6f757f4aed8ea7e1f01914edf0ccf76a3897aef4ea38fe66cca8b860c66