pyspark.sql.functions.nanvl

pyspark.sql.functions.nanvl(col1, col2)

Returns col1 if it is not NaN, or col2 if col1 is NaN.

Both inputs should be floating point columns (DoubleType or FloatType). Note that NaN is distinct from NULL: a NULL in col1 is not replaced, since NULL is not NaN.

New in version 1.6.0.

Changed in version 3.4.0: Supports Spark Connect.

Parameters
col1 : Column or str

first column to check.

col2 : Column or str

second column to return if the first is NaN.

Returns
Column

value from the first column if it is not NaN, otherwise the value from the second column.

Examples

>>> df = spark.createDataFrame([(1.0, float('nan')), (float('nan'), 2.0)], ("a", "b"))
>>> df.select(nanvl("a", "b").alias("r1"), nanvl(df.a, df.b).alias("r2")).collect()
[Row(r1=1.0, r2=1.0), Row(r1=2.0, r2=2.0)]
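The per-row logic can be sketched in plain Python (a hypothetical `nanvl` helper written for illustration, not PySpark's implementation):

```python
import math

def nanvl(a, b):
    # Return a unless it is NaN, in which case return b.
    # Mirrors the per-row semantics of pyspark.sql.functions.nanvl;
    # None (SQL NULL) is not NaN, so it passes through unchanged.
    if a is not None and math.isnan(a):
        return b
    return a

rows = [(1.0, float("nan")), (float("nan"), 2.0)]
print([nanvl(a, b) for a, b in rows])  # [1.0, 2.0]
```

This also makes the NULL-versus-NaN distinction concrete: `nanvl(None, 2.0)` yields `None`, not `2.0`.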