Commit ac55ce2

fix: Fixing Spark min / max entity df event timestamps range return order (#2735)
Fixes the return order of elements when calculating the min and max entity-DF event timestamps in the Spark offline store.

Signed-off-by: Lev Pickovsky <[email protected]>
1 parent a15fcb4 · commit ac55ce2

File tree

1 file changed (+1, −1):

  • sdk/python/feast/infra/offline_stores/contrib/spark_offline_store/spark.py

sdk/python/feast/infra/offline_stores/contrib/spark_offline_store/spark.py

Lines changed: 1 addition & 1 deletion
```diff
@@ -330,8 +330,8 @@ def _get_entity_df_event_timestamp_range(
         df = spark_session.sql(entity_df).select(entity_df_event_timestamp_col)
         # TODO(kzhang132): need utc conversion here.
         entity_df_event_timestamp_range = (
-            df.agg({entity_df_event_timestamp_col: "max"}).collect()[0][0],
             df.agg({entity_df_event_timestamp_col: "min"}).collect()[0][0],
+            df.agg({entity_df_event_timestamp_col: "max"}).collect()[0][0],
         )
     else:
         raise InvalidEntityType(type(entity_df))
```
