pyspark.sql.functions.make_interval

pyspark.sql.functions.make_interval(years=None, months=None, weeks=None, days=None, hours=None, mins=None, secs=None)
Make interval from years, months, weeks, days, hours, mins and secs.

New in version 3.5.0.

Parameters
years : Column or column name, optional
    The number of years, positive or negative.
months : Column or column name, optional
    The number of months, positive or negative.
weeks : Column or column name, optional
    The number of weeks, positive or negative.
days : Column or column name, optional
    The number of days, positive or negative.
hours : Column or column name, optional
    The number of hours, positive or negative.
mins : Column or column name, optional
    The number of minutes, positive or negative.
secs : Column or column name, optional
    The number of seconds with the fractional part in microsecond precision.
 
Returns
Column
    A new column that contains an interval.
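As a quick illustrative check (a sketch, not part of the original page, assuming an active `spark` session as in the examples below), the result carries Spark's calendar interval type:

>>> import pyspark.sql.functions as sf
>>> # The alias 'i' is arbitrary; dtypes reports CalendarIntervalType as 'interval'.
>>> spark.range(1).select(sf.make_interval(sf.lit(100)).alias('i')).dtypes
[('i', 'interval')]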
 
Examples

Example 1: Make interval from years, months, weeks, days, hours, mins and secs.

>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame([[100, 11, 1, 1, 12, 30, 01.001001]],
...     ['year', 'month', 'week', 'day', 'hour', 'min', 'sec'])
>>> df.select(
...     sf.make_interval(df.year, df.month, 'week', df.day, df.hour, df.min, df.sec)
... ).show(truncate=False)
+---------------------------------------------------------------+
|make_interval(year, month, week, day, hour, min, sec)          |
+---------------------------------------------------------------+
|100 years 11 months 8 days 12 hours 30 minutes 1.001001 seconds|
+---------------------------------------------------------------+

Example 2: Make interval from years, months, weeks, days, hours and mins.

>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame([[100, 11, 1, 1, 12, 30, 01.001001]],
...     ['year', 'month', 'week', 'day', 'hour', 'min', 'sec'])
>>> df.select(
...     sf.make_interval(df.year, df.month, 'week', df.day, df.hour, df.min)
... ).show(truncate=False)
+---------------------------------------------------+
|make_interval(year, month, week, day, hour, min, 0)|
+---------------------------------------------------+
|100 years 11 months 8 days 12 hours 30 minutes     |
+---------------------------------------------------+

Example 3: Make interval from years, months, weeks, days and hours.

>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame([[100, 11, 1, 1, 12, 30, 01.001001]],
...     ['year', 'month', 'week', 'day', 'hour', 'min', 'sec'])
>>> df.select(
...     sf.make_interval(df.year, df.month, 'week', df.day, df.hour)
... ).show(truncate=False)
+-------------------------------------------------+
|make_interval(year, month, week, day, hour, 0, 0)|
+-------------------------------------------------+
|100 years 11 months 8 days 12 hours              |
+-------------------------------------------------+

Example 4: Make interval from years, months, weeks and days.

>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame([[100, 11, 1, 1, 12, 30, 01.001001]],
...     ['year', 'month', 'week', 'day', 'hour', 'min', 'sec'])
>>> df.select(sf.make_interval(df.year, df.month, 'week', df.day)).show(truncate=False)
+----------------------------------------------+
|make_interval(year, month, week, day, 0, 0, 0)|
+----------------------------------------------+
|100 years 11 months 8 days                    |
+----------------------------------------------+

Example 5: Make interval from years, months and weeks.

>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame([[100, 11, 1, 1, 12, 30, 01.001001]],
...     ['year', 'month', 'week', 'day', 'hour', 'min', 'sec'])
>>> df.select(sf.make_interval(df.year, df.month, 'week')).show(truncate=False)
+--------------------------------------------+
|make_interval(year, month, week, 0, 0, 0, 0)|
+--------------------------------------------+
|100 years 11 months 7 days                  |
+--------------------------------------------+

Example 6: Make interval from years and months.

>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame([[100, 11, 1, 1, 12, 30, 01.001001]],
...     ['year', 'month', 'week', 'day', 'hour', 'min', 'sec'])
>>> df.select(sf.make_interval(df.year, df.month)).show(truncate=False)
+-----------------------------------------+
|make_interval(year, month, 0, 0, 0, 0, 0)|
+-----------------------------------------+
|100 years 11 months                      |
+-----------------------------------------+

Example 7: Make interval from years.
>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame([[100, 11, 1, 1, 12, 30, 01.001001]],
...     ['year', 'month', 'week', 'day', 'hour', 'min', 'sec'])
>>> df.select(sf.make_interval(df.year)).show(truncate=False)
+-------------------------------------+
|make_interval(year, 0, 0, 0, 0, 0, 0)|
+-------------------------------------+
|100 years                            |
+-------------------------------------+

Example 8: Make empty interval.

>>> import pyspark.sql.functions as sf
>>> spark.range(1).select(sf.make_interval()).show(truncate=False)
+----------------------------------+
|make_interval(0, 0, 0, 0, 0, 0, 0)|
+----------------------------------+
|0 seconds                         |
+----------------------------------+
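A common use of the resulting interval is timestamp arithmetic. The following is a minimal sketch, not part of the original examples: it relies on the keyword parameters from the signature above (here hours=) and on Spark's support for adding a calendar interval to a timestamp column, again assuming an active `spark` session.

>>> import pyspark.sql.functions as sf
>>> df = spark.createDataFrame([('2024-01-15 10:00:00',)], ['ts'])
>>> # Add a 3-hour interval to the parsed timestamp; 'plus_3h' is an arbitrary alias.
>>> df.select(
...     (sf.to_timestamp('ts') + sf.make_interval(hours=sf.lit(3))).alias('plus_3h')
... ).show(truncate=False)
+-------------------+
|plus_3h            |
+-------------------+
|2024-01-15 13:00:00|
+-------------------+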