pyspark.sql.functions.make_timestamp_ltz
pyspark.sql.functions.make_timestamp_ltz(years, months, days, hours, mins, secs, timezone=None)
Create a timestamp with local time zone from years, months, days, hours, mins, secs and timezone fields. If the configuration spark.sql.ansi.enabled is false, the function returns NULL on invalid inputs. Otherwise, it throws an error.
New in version 3.5.0.
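For instance, a minimal sketch of the NULL-on-invalid-input behaviour (assuming a running spark session, as in the Examples below; the alias ts is arbitrary): with ANSI mode disabled, an out-of-range month such as 13 yields NULL rather than an error.

>>> import pyspark.sql.functions as sf
>>> spark.conf.set("spark.sql.ansi.enabled", "false")
>>> df = spark.createDataFrame([[2014, 13, 28, 6, 30, 45.887]],
...     ['year', 'month', 'day', 'hour', 'min', 'sec'])
>>> df.select(
...     sf.make_timestamp_ltz('year', 'month', 'day', 'hour', 'min', 'sec').alias('ts')
... ).first()
Row(ts=None)
>>> spark.conf.unset("spark.sql.ansi.enabled")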
- Parameters
- years
Column or str
The year to represent, from 1 to 9999
- months
Column or str
The month-of-year to represent, from 1 (January) to 12 (December)
- days
Column or str
The day-of-month to represent, from 1 to 31
- hours
Column or str
The hour-of-day to represent, from 0 to 23
- mins
Column or str
The minute-of-hour to represent, from 0 to 59
- secs
Column or str
The second-of-minute and its micro-fraction to represent, from 0 to 60. The value can be either an integer like 13, or a fraction like 13.123. If the secs argument equals 60, the seconds field is set to 0 and 1 minute is added to the final timestamp (see the sketch after the Returns entry below).
- timezone
Column or str, optional
The time zone identifier, for example CET or UTC.
- Returns
Column
A new column that contains the resulting timestamp.
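A brief sketch of the seconds rollover noted for secs above (assuming a running spark session; the alias ts is arbitrary): a value of 60 sets the seconds field to 0 and carries one minute.

>>> import pyspark.sql.functions as sf
>>> spark.conf.set("spark.sql.session.timeZone", "UTC")
>>> df = spark.createDataFrame([[2014, 12, 28, 6, 30, 60]],
...     ['year', 'month', 'day', 'hour', 'min', 'sec'])
>>> df.select(
...     sf.make_timestamp_ltz('year', 'month', 'day', 'hour', 'min', 'sec').alias('ts')
... ).show(truncate=False)
+-------------------+
|ts                 |
+-------------------+
|2014-12-28 06:31:00|
+-------------------+
>>> spark.conf.unset("spark.sql.session.timeZone")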
Examples
>>> spark.conf.set("spark.sql.session.timeZone", "America/Los_Angeles")
Example 1: Make a timestamp with local time zone from years, months, days, hours, mins, secs and timezone.
>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame([[2014, 12, 28, 6, 30, 45.887, 'CET']],
...     ['year', 'month', 'day', 'hour', 'min', 'sec', 'tz'])
>>> df.select(
...     sf.make_timestamp_ltz(df.year, df.month, 'day', df.hour, df.min, df.sec, 'tz')
... ).show(truncate=False)
+--------------------------------------------------------+
|make_timestamp_ltz(year, month, day, hour, min, sec, tz)|
+--------------------------------------------------------+
|2014-12-27 21:30:45.887                                 |
+--------------------------------------------------------+
Example 2: Make a timestamp without the timezone field.
>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame([[2014, 12, 28, 6, 30, 45.887, 'CET']],
...     ['year', 'month', 'day', 'hour', 'min', 'sec', 'tz'])
>>> df.select(
...     sf.make_timestamp_ltz(df.year, df.month, 'day', df.hour, df.min, df.sec)
... ).show(truncate=False)
+----------------------------------------------------+
|make_timestamp_ltz(year, month, day, hour, min, sec)|
+----------------------------------------------------+
|2014-12-28 06:30:45.887                             |
+----------------------------------------------------+
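Example 3 (illustrative sketch, assuming the session configured above; the alias ts is arbitrary): the result is a fixed instant, and changing spark.sql.session.timeZone only changes how it is displayed.

>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame([[2014, 12, 28, 6, 30, 45.887, 'CET']],
...     ['year', 'month', 'day', 'hour', 'min', 'sec', 'tz'])
>>> result = df.select(
...     sf.make_timestamp_ltz(df.year, df.month, 'day', df.hour, df.min, df.sec, 'tz').alias('ts')
... )
>>> spark.conf.set("spark.sql.session.timeZone", "UTC")
>>> result.show(truncate=False)
+-----------------------+
|ts                     |
+-----------------------+
|2014-12-28 05:30:45.887|
+-----------------------+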
>>> spark.conf.unset("spark.sql.session.timeZone")