TaskFlow/Task Arguments and Results
Overview
In TaskFlow, all flow and task state goes to (potentially persistent) storage (via the logbook backends and persistence design). That includes all the information the tasks in a flow need when they are executed, and all the information a task produces (via serializable task results). A developer who implements tasks or flows can specify what arguments a task accepts and what result it returns in several ways. This document describes those ways and how to use them to accomplish your desired TaskFlow usage pattern.
- Task arguments: a set of names of task arguments, available as the requires property of the task instance. When a task is about to be executed, values with these names are retrieved from storage and passed to the execute method of the task as keyword arguments (i.e., kwargs).
- Task results: a set of names of task results (what the task provides), available as the provides property of the task instance. After a task finishes successfully, its result(s) (what the execute method returns) are available by these names from storage (see examples below).
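To make this storage round trip concrete, here is a minimal standalone simulation; a plain dict stands in for TaskFlow's storage layer, and run_one is a hypothetical helper written for illustration, not a TaskFlow API:

```python
def run_one(execute, requires, provides, storage):
    """Fetch required values from storage by name, run the task,
    and save its single result back under the provides name."""
    kwargs = {name: storage[name] for name in requires}
    storage[provides] = execute(**kwargs)

# a plain dict stands in for TaskFlow's (potentially persistent) storage
storage = {'spam': 1, 'eggs': 2}
run_one(lambda spam, eggs: spam + eggs, {'spam', 'eggs'}, 'total', storage)
print(storage['total'])  # 3
```

The sections below cover how the requires and provides sets are actually declared on real tasks.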
Arguments Specification
There are several ways to specify the task argument requires set.
Arguments Inference
Task arguments can be inferred from the arguments of the execute method of the task.
For example:
```python
>>> class MyTask(task.Task):
...     def execute(self, spam, eggs):
...         return spam + eggs
...
>>> MyTask().requires
set(['eggs', 'spam'])
```
Inference from the method signature is the simplest way to specify task arguments. Optional arguments (with default values) and special arguments like self, *args and **kwargs are ignored during inference (as these names have special meaning/usage in Python).
For example:
```python
>>> class MyTask(task.Task):
...     def execute(self, spam, eggs=()):
...         return spam + eggs
...
>>> MyTask().requires
set(['spam'])
>>>
>>> class UniTask(task.Task):
...     def execute(self, *args, **kwargs):
...         pass
...
>>> UniTask().requires
set([])
```
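The inference rules above can be pictured with Python's standard inspect module. The following self-contained sketch is not TaskFlow's actual implementation, just an illustration of how required names could be derived from an execute signature:

```python
import inspect

def infer_requires(execute_method):
    """Derive required argument names from a method signature, skipping
    'self', arguments with defaults, and *args/**kwargs."""
    required = set()
    sig = inspect.signature(execute_method)
    for name, param in sig.parameters.items():
        if name == 'self':
            continue
        # *args and **kwargs have special meaning, so they are ignored
        if param.kind in (param.VAR_POSITIONAL, param.VAR_KEYWORD):
            continue
        # optional arguments (with default values) are ignored as well
        if param.default is not param.empty:
            continue
        required.add(name)
    return required

def execute(self, spam, eggs=()):
    return spam + eggs

print(infer_requires(execute))  # {'spam'}
```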
Rebinding
There are cases when the value you want to pass to a task is stored with a name other than the corresponding task argument's name. That's when the rebind task constructor parameter comes in handy. Using it, the flow author can instruct the engine to fetch a value from storage by one name, but pass it to a task's execute method under another name.
There are two possible ways of using it. The first is to pass a dictionary that maps the task argument name to the name of a saved value.
For example:
If you have the task

```python
class SpawnVMTask(task.Task):
    def execute(self, vm_name, vm_image_id, **kwargs):
        pass  # TODO(imelnikov): use parameters to spawn vm
```
and you saved the vm name under the 'name' key in storage, you can spawn a vm with that name like this:

```python
SpawnVMTask(rebind={'vm_name': 'name'})
```
The second way is to pass a tuple/list/dict of argument names. The length of the tuple/list/dict should not be less than the number of required task parameters. For example, you can achieve the same effect as the previous example with:

```python
SpawnVMTask(rebind_args=('name', 'vm_image_id'))
```
which is equivalent to the more elaborate:

```python
SpawnVMTask(rebind=dict(vm_name='name', vm_image_id='vm_image_id'))
```
In both cases, if your task accepts arbitrary arguments with the **kwargs construct, you can specify extra arguments.

For example:

```python
SpawnVMTask(rebind=('name', 'vm_image_id', 'admin_key_name'))
```
When such a task is about to be executed, the name, vm_image_id and admin_key_name values are fetched from storage: the value from name is passed to the execute method as vm_name, the value from vm_image_id is passed as vm_image_id, and the value from admin_key_name is passed as the admin_key_name parameter in kwargs.
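The fetch-and-rename behavior described above can be sketched as follows; storage here is a plain dict standing in for TaskFlow's storage layer, and run_task is a hypothetical helper written for illustration, not a TaskFlow API:

```python
def run_task(execute, rebind, storage):
    """Fetch each saved value by its storage name, then pass it to
    execute() under the task argument name from the rebind mapping."""
    kwargs = {arg_name: storage[stored_name]
              for arg_name, stored_name in rebind.items()}
    return execute(**kwargs)

def spawn_vm(vm_name, vm_image_id, **kwargs):
    return 'spawned %s from image %s' % (vm_name, vm_image_id)

storage = {'name': 'web-1', 'vm_image_id': 'img-42', 'admin_key_name': 'root-key'}
# fetch by 'name', pass as 'vm_name'; 'vm_image_id' keeps its own name
rebind = {'vm_name': 'name', 'vm_image_id': 'vm_image_id'}
print(run_task(spawn_vm, rebind, storage))  # spawned web-1 from image img-42
```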
Manually Specifying Requirements
Why: It is often useful to manually specify the requirements of a task, either by the task author or by the flow author (allowing the flow author to override the task requirements). To accomplish this, use the task constructor to specify manual requirements when creating your task. Those manual requirements (if they are not functional arguments) will appear in the kwargs of the execute() method.
For example:
```python
>>> class Cat(task.Task):
...     def __init__(self):
...         super(Cat, self).__init__(requires=("food", "milk"))
...     def execute(self, food, **kwargs):
...         pass
...
>>> Cat().requires
set(['food', 'milk'])
```
During flow construction, the flow author can also add additional requirements to your task if desired. Those manual requirements (if they are not functional arguments) will appear in the kwargs or args of the execute() method.
For example:
```python
>>> class Dog(task.Task):
...     def execute(self, food, **kwargs):
...         pass
...
>>> Dog(requires=("food", "water", "grass")).requires
set(['food', 'water', 'grass'])
```
If the flow author desires to add on to existing task requirements, they can also turn off the argument inference and manually override what a task requires; use this at your own risk, as you must be careful to avoid invalid argument mappings.
For example:
```python
>>> class Bird(task.Task):
...     def execute(self, food, *args, **kwargs):
...         pass
...
>>> Bird(requires=("food", "water", "grass"),
...      auto_extract=False).requires
set(['food', 'water', 'grass'])
```
Results Specification
In Python, function results are not named, so we cannot infer what a task returns. Of course, the complete task result (what the execute method returns) is saved in (potentially persistent) storage, but it is not accessible to others unless the task specifies the names of those values via its provides task constructor parameter.
Returning One Value
If the task returns just one value, provides should be a string -- the name of the value.
For example:
```python
class TheAnswerReturningTask(task.Task):
    def execute(self):
        return 42

TheAnswerReturningTask(provides='the_answer')
```
Returning Tuple
For a task that returns several values, one option (as usual in Python) is to return those values via a tuple.
For example:
```python
class BitsAndPiecesTask(task.Task):
    def execute(self):
        return 'BITs', 'PIECEs'
```
Then, you can give the values individual names by passing a tuple or list as the provides parameter:

```python
BitsAndPiecesTask(provides=('bits', 'pieces'))
```
After such a task executes, you (and the engine, which is useful for other tasks) will be able to get those elements from storage by name:

```python
>>> storage.fetch('bits')
'BITs'
>>> storage.fetch('pieces')
'PIECEs'
```
The provides argument can be shorter than the actual tuple returned by a task -- then the extra values are ignored (but, as expected, all of those values are saved and passed to the revert method).

Note: The provides arguments tuple can also be longer than the actual tuple returned by the task -- when this happens the extra parameters are left undefined: a warning is printed to the logs, and if use of such a parameter is attempted, a NotFound exception is raised.
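The name-to-value pairing described above amounts to zipping the provides tuple with the returned tuple; here is a standalone sketch (save_tuple_result is a hypothetical helper for illustration, and a plain dict stands in for storage):

```python
def save_tuple_result(provides, result, storage):
    """Pair each name in provides with the corresponding element of the
    returned tuple; extra returned values simply get no name."""
    for name, value in zip(provides, result):
        storage[name] = value

storage = {}
# the task returned three values, but only two names were provided,
# so the third value is not accessible by name
save_tuple_result(('bits', 'pieces'), ('BITs', 'PIECEs', 'extra'), storage)
print(storage)  # {'bits': 'BITs', 'pieces': 'PIECEs'}
```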
Returning Dictionary
Another option is to return several values as a dictionary (aka a dict).
For example:
```python
class BitsAndPiecesTask(task.Task):
    def execute(self):
        return {
            'bits': 'BITs',
            'pieces': 'PIECEs',
        }
```
TaskFlow expects that a dict will be returned if the provides argument is a set:

```python
BitsAndPiecesTask(provides=set(['bits', 'pieces']))
```
After such a task executes, you (and the engine, which is useful for other tasks) will be able to get elements from storage by name:

```python
>>> storage.fetch('bits')
'BITs'
>>> storage.fetch('pieces')
'PIECEs'
```
Note: if some items from the dict returned by the task are not present in the provides argument -- then those extra values are ignored (but, of course, saved and passed to the revert method). If the provides argument has some items not present in the actual dict returned by the task -- then those extra parameters are left undefined: a warning is printed to the logs, and if use of such a parameter is attempted, a NotFound exception is raised.
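For a set-typed provides and a dict result, the pairing is a key lookup rather than positional; the following sketch (save_dict_result is a hypothetical helper, and a plain dict stands in for storage) shows both mismatch cases:

```python
def save_dict_result(provides, result, storage):
    """Save each declared name that the returned dict actually contains;
    declared names missing from the dict stay undefined in storage."""
    for name in provides:
        if name in result:
            storage[name] = result[name]

storage = {}
save_dict_result({'bits', 'pieces', 'missing'},            # 'missing' stays undefined
                 {'bits': 'BITs', 'pieces': 'PIECEs', 'extra': 1},  # 'extra' is ignored
                 storage)
print(sorted(storage))  # ['bits', 'pieces']
```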
Default Provides
As mentioned above, the default task base class provides nothing, which means task results are not accessible to the other tasks in the flow. The task author can override this and specify a default value for provides using the default_provides class variable:
```python
class BitsAndPiecesTask(task.Task):
    default_provides = ('bits', 'pieces')

    def execute(self):
        return 'BITs', 'PIECEs'
```
Of course, the flow author can override this to change the names if needed:

```python
BitsAndPiecesTask(provides=('b', 'p'))
```
or to change the structure -- e.g. this instance will make the whole tuple accessible to other tasks by the name 'bnp':

```python
BitsAndPiecesTask(provides='bnp')
```
or the flow author may want to return to the default behavior and hide the results of the task from other tasks in the flow (e.g. to avoid naming conflicts):

```python
BitsAndPiecesTask(provides=())
```