Output from traditional Task can not be used in task decorator without a *reduce* task #27209
Replies: 4 comments 3 replies
-
Thanks for opening your first issue here! Be sure to follow the issue template!
-
I believe you should make sure that your task returns a serializable value. Maybe @uranusjr has more to say about it, but IMHO returning a JSON-serializable value is the responsibility of the person who wants the operator/task to work in a dynamic task-mapping context. Converting this into a discussion if needed.
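To illustrate the advice above, here is a minimal, hedged sketch (plain Python, no Airflow required; `S3Object` is a hypothetical stand-in for a non-serializable client object, not anything from the original thread) showing why a task's return value should round-trip through JSON before it is used for dynamic task mapping:

```python
import json

# A hypothetical stand-in for an object a task might be tempted to return,
# e.g. a raw client/response object. It is NOT JSON-serializable.
class S3Object:
    pass

# A plain list of strings round-trips through JSON cleanly, so it is safe
# to push to XCom and to feed into .expand()/dynamic task mapping.
serializable = ["s3://bucket/key1", "s3://bucket/key2"]
encoded = json.dumps(serializable)

# A raw object does not serialize; returning it from a task is the kind of
# mistake the comment above warns about.
try:
    json.dumps(S3Object())
    raised = False
except TypeError:
    raised = True

print(encoded)
print("non-serializable raised TypeError:", raised)
```

A quick `json.dumps(value)` check on a task's return value is a cheap way to confirm it is safe to hand to downstream mapped tasks.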
-
The example is far too long as a sample, but I guess the relevant part is basically `non_reduce_task(arg_1=reduce_task(list_task_sensor_2.output))`, where …
-
I filed two PRs, #27250 and #27251, as two approaches to resolve the issue (by either documenting the …
-
Apache Airflow version
2.4.1
What happened
When building a flow between a traditional task and the TaskFlow API, I saw this error log:
What you think should happen instead
If a task `expand` call follows an S3KeySensor and I wrap it with another reduce task (returning a list or doing an aggregation), it works again.
Shouldn't the result pushed through XCom be reduced by default so it can be consumed by the next task when that next task is not a mapped/expand task?
How to reproduce
Here is sample code that includes both the working and the broken flow:
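The original sample code is not included here. As a minimal stand-in sketch (plain Python, no Airflow; `list_task_sensor_output`, `reduce_task`, and `non_reduce_task` are hypothetical names echoing the pattern discussed above), this shows the workaround the report describes: collapsing the lazily-mapped sensor output with a reduce step so a plain, non-mapped task receives a concrete value:

```python
def list_task_sensor_output():
    # Stand-in for the XCom value an S3KeySensor-based task would push:
    # a lazy iterable of matched keys (hypothetical data).
    return iter(["s3://bucket/key1", "s3://bucket/key2"])

def reduce_task(items):
    # The "reduce" step: aggregate mapped results into one plain,
    # JSON-serializable list instead of a lazy mapped proxy.
    return list(items)

def non_reduce_task(arg_1):
    # A plain downstream task that expects a concrete list,
    # not a mapped/expand argument.
    return len(arg_1)

# Mirrors the working flow: non_reduce_task(arg_1=reduce_task(sensor.output))
result = non_reduce_task(arg_1=reduce_task(list_task_sensor_output()))
print(result)  # prints 2
```

Without the `reduce_task` wrapper, the downstream task would receive the lazy mapped output directly, which is the situation the error log above corresponds to.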
Operating System
RHEL 7
Versions of Apache Airflow Providers
apache-airflow-providers-amazon==4.1.0
Deployment
Other
Deployment details
No response
Anything else
No response
Are you willing to submit PR?
Code of Conduct