pyspark.sql.functions.try_to_time
- pyspark.sql.functions.try_to_time(str, format=None)
Converts a Column into pyspark.sql.types.TimeType using the optionally specified format. Specify formats according to datetime pattern. By default, it follows casting rules to pyspark.sql.types.TimeType if the format is omitted, equivalent to col.cast("time"). The function always returns NULL on invalid input.

New in version 4.1.0.
- Parameters
  - str : Column or column name
    string to be parsed to time.
  - format : Column or column name, optional
    time format pattern to follow.
- Returns
  - Column
    time value as pyspark.sql.types.TimeType type.
Examples
Example 1: Convert string to a time
>>> import pyspark.sql.functions as sf >>> df = spark.createDataFrame([("10:30:00",)], ["str"]) >>> df.select(sf.try_to_time(df.str).alias("time")).show() +--------+ | time| +--------+ |10:30:00| +--------+
Example 2: Convert string to a time with a format
>>> import pyspark.sql.functions as sf >>> df = spark.createDataFrame([("10:30:00", "HH:mm:ss")], ["str", "format"]) >>> df.select(sf.try_to_time(df.str, df.format).alias("time")).show() +--------+ | time| +--------+ |10:30:00| +--------+
Example 3: Conversion failure results in NULL
>>> import pyspark.sql.functions as sf >>> df = spark.createDataFrame([("malformed",)], ["str"]) >>> df.select(sf.try_to_time(df.str).alias("time")).show() +----+ |time| +----+ |NULL| +----+