pyspark.sql.functions.hours
Partition transform function: partitions data into hours based on a timestamp column.
New in version 3.1.0.
Notes
This function can be used only in combination with the partitionedBy() method of DataFrameWriterV2.
Examples
>>> df.writeTo("catalog.db.table").partitionedBy(
...     hours("ts")
... ).createOrReplace()
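For broader context, a more self-contained sketch follows. It assumes an active SparkSession whose catalog (here named catalog, with a namespace db) is backed by a V2 data source such as Apache Iceberg; the DataFrame and its timestamp column ts are illustrative placeholders, not part of the original example.

>>> from pyspark.sql import SparkSession
>>> from pyspark.sql.functions import hours
>>> spark = SparkSession.builder.getOrCreate()
>>> # Build a small DataFrame with a proper TIMESTAMP column named "ts".
>>> df = spark.createDataFrame(
...     [("2021-01-01 10:30:00",), ("2021-01-01 14:05:00",)], ["ts_str"]
... ).selectExpr("CAST(ts_str AS TIMESTAMP) AS ts")
>>> # Write into an hourly-partitioned table through the V2 writer.
>>> df.writeTo("catalog.db.table").partitionedBy(
...     hours("ts")
... ).createOrReplace()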