Trouble passing in lambda to apply for pandas DataFrame

Tags: Python, Pandas, Dataframe, Lambda, Apply

Python Problem Overview


I'm trying to apply a function to all rows of a pandas DataFrame (actually just one column in that DataFrame).

I'm sure this is a syntax error, but I'm not sure what I'm doing wrong:

df['col'].apply(lambda x, y:(x - y).total_seconds(), args=[d1], axis=1)

The col column contains a bunch of datetime.datetime objects, and d1 is the earliest of them. I'm trying to get a column of the total number of seconds elapsed since d1 for each of the rows.

EDIT: I keep getting the following error:

TypeError: <lambda>() got an unexpected keyword argument 'axis'

I don't understand why axis is getting passed to my lambda function.

EDIT 2

I've also tried doing:

def diff_dates(d1, d2):
    return (d1-d2).total_seconds()

df['col'].apply(diff_dates, args=[d1], axis=1)

And I get the same error.

Python Solutions


Solution 1 - Python

Note that there is no axis param for a Series.apply call, as distinct from a DataFrame.apply call.

> Series.apply(func, convert_dtype=True, args=(), **kwds)
>
> func : function
> convert_dtype : boolean, default True
>     Try to find better dtype for elementwise function results. If False, leave as dtype=object
> args : tuple
>     Positional arguments to pass to function in addition to the value
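
For example, here is a minimal sketch of how args feeds extra positional arguments to the function after the element value itself (the values are purely illustrative):

import pandas as pd

s = pd.Series([1, 2, 3])
# apply passes each element as the first positional argument (x);
# args supplies the remaining positional arguments (y)
s.apply(lambda x, y: x + y, args=(10,))
# 0    11
# 1    12
# 2    13
# dtype: int64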

There is an axis param for DataFrame.apply, but it's unclear how you expect this to work: you're calling apply on a Series, yet you expect it to operate on a row.
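
So dropping axis and keeping args should work. A minimal sketch, assuming df['col'] holds datetimes and d1 is the earliest of them as described in the question (the sample values below are illustrative):

import pandas as pd
from datetime import datetime

df = pd.DataFrame({'col': [datetime(2015, 1, 1, 0, 0, 0),
                           datetime(2015, 1, 1, 0, 1, 30)]})
d1 = df['col'].min()

# no axis argument; each element of the Series is passed
# to the lambda as its first positional argument
df['col'].apply(lambda x, y: (x - y).total_seconds(), args=(d1,))

# or skip apply entirely with vectorized datetime arithmetic
(df['col'] - d1).dt.total_seconds()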

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

Content Type | Original Author | Original Content on Stackoverflow
Question | sedavidw | View Question on Stackoverflow
Solution 1 - Python | EdChum | View Answer on Stackoverflow