pyspark.sql.functions.try_remainder#
pyspark.sql.functions.try_remainder(left, right)[source]#
Returns the remainder of dividend divided by divisor. The result is always NULL if the divisor is 0.
New in version 4.0.0.
Examples
Example 1: Integer divided by Integer.
>>> import pyspark.sql.functions as sf
>>> spark.createDataFrame(
...     [(6000, 15), (3, 2), (1234, 0)], ["a", "b"]
... ).select(sf.try_remainder("a", "b")).show()
+-------------------+
|try_remainder(a, b)|
+-------------------+
|                  0|
|                  1|
|               NULL|
+-------------------+
Example 2: Exception during division, resulting in NULL even when ANSI mode is on.
>>> import pyspark.sql.functions as sf
>>> origin = spark.conf.get("spark.sql.ansi.enabled")
>>> spark.conf.set("spark.sql.ansi.enabled", "true")
>>> try:
...     df = spark.range(1)
...     df.select(sf.try_remainder(df.id, sf.lit(0))).show()
... finally:
...     spark.conf.set("spark.sql.ansi.enabled", origin)
+--------------------+
|try_remainder(id, 0)|
+--------------------+
|                NULL|
+--------------------+
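The semantics shown in the examples above can be sketched in plain Python. This is a hypothetical helper (try_remainder_py is not part of PySpark), written under the assumption that the remainder's sign follows the dividend, as in Java/Scala's % operator, which differs from Python's % (whose sign follows the divisor):

```python
def try_remainder_py(left, right):
    """Sketch of try_remainder semantics for integer inputs (an assumption,
    not the actual Spark implementation)."""
    if right == 0:
        # try_remainder never raises on a zero divisor; it yields NULL (None).
        return None
    # Compute a remainder whose sign follows the dividend (Java-style %),
    # unlike Python's built-in %, whose sign follows the divisor.
    r = abs(left) % abs(right)
    return r if left >= 0 else -r
```

For the rows in Example 1, this sketch gives 0 for (6000, 15), 1 for (3, 2), and None for (1234, 0), matching the NULL in the output table.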