
TaskFlow/Task Arguments and Results

Revision as of 19:36, 21 October 2013 by Harlowja (talk | contribs) (Returning Tuple)

Overview

In TaskFlow, all flow and task state goes to storage (potentially persistent). That includes all the information that tasks in the flow need when they are executed (task dependencies, passed via arguments), and all the information a task produces (serializable task results). A developer who implements tasks or flows can specify what arguments a task accepts and what result it returns in several ways. This document explains those ways and how to use them to accomplish your desired TaskFlow usage pattern.

Task arguments
The set of names of task arguments, available as the requires property of the task instance. When the task is about to be executed, values with these names are retrieved from storage and passed to the task's execute method as keyword arguments (i.e., kwargs).
Task results
The set of names of task results (what the task provides), available as the provides property of the task instance. After the task finishes successfully, its results (what the task's execute method returns) are available under these names in storage (see examples below).

Arguments Specification

There are several ways to specify a task's requires set.

Arguments Inference

Task arguments can be inferred from the arguments of the task's execute method.

For example:

   >>> from taskflow import task
   >>> class MyTask(task.Task):
   ...     def execute(self, spam, eggs):
   ...         return spam + eggs
   ... 
   >>> MyTask().requires
   set(['eggs', 'spam'])

Inference from the signature is the simplest way to specify task arguments. Optional arguments (those with default values) and special arguments like self, *args and **kwargs are ignored during inference (as these names have special meaning/usage in Python).

For example:

   >>> class MyTask(task.Task):
   ...     def execute(self, spam, eggs=()):
   ...         return spam + eggs
   ... 
   >>> MyTask().requires
   set(['spam'])
   >>>
   >>> class UniTask(task.Task):
   ...     def execute(self, *args, **kwargs):
   ...         pass
   ... 
   >>> UniTask().requires
   set([])
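The inference mechanism above can be sketched with the standard library's inspect module. This is a simplified illustration of the idea, not TaskFlow's actual implementation (TaskFlow uses its own reflection helpers internally):

```python
import inspect

def infer_requires(execute):
    """Collect required argument names from an execute method's signature,
    skipping 'self', parameters with defaults, *args and **kwargs."""
    required = set()
    for name, param in inspect.signature(execute).parameters.items():
        if name == 'self':
            continue
        if param.kind in (inspect.Parameter.VAR_POSITIONAL,
                          inspect.Parameter.VAR_KEYWORD):
            continue  # *args / **kwargs are ignored
        if param.default is not inspect.Parameter.empty:
            continue  # optional arguments (with defaults) are ignored
        required.add(name)
    return required

class MyTask:
    def execute(self, spam, eggs=()):
        return spam + eggs

print(infer_requires(MyTask.execute))  # → {'spam'}
```

Only 'spam' survives: 'self' and the defaulted 'eggs' are filtered out, matching the inference rules described above.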

Rebinding

There are cases when the value you want to pass to a task is stored under a name other than the corresponding task argument's name. That's when the rebind task constructor parameter comes in handy. Using it, the flow author can instruct the engine to fetch a value from storage by one name, but pass it to the task's execute method under another name.

There are two possible ways of using it. The first is to pass a dictionary that maps the task argument name to the name of the saved value.

For example:

If you have a task

   class SpawnVMTask(task.Task):
       def execute(self, vm_name, vm_image_id, **kwargs):
           pass  # TODO(imelnikov): use parameters to spawn vm

and you saved the vm name under the 'name' key in storage, you can spawn a vm with that name like this:

   SpawnVMTask(rebind={'vm_name': 'name'})

Second, you can pass a tuple or list of argument names, and values with those names are passed to the task in order. The length of the tuple or list should not be less than the number of required task parameters. For example, you can achieve the same effect as the previous example with:

   SpawnVMTask(rebind=('name', 'vm_image_id'))

which is equivalent to a more elaborate:

   SpawnVMTask(rebind=dict(vm_name='name',
                           vm_image_id='vm_image_id'))

In both cases, if your task accepts arbitrary arguments via the **kwargs construct, you can specify extra arguments.

For example:

   SpawnVMTask(rebind=('name', 'vm_image_id', 'admin_key_name'))

When such a task is about to be executed, the name, vm_image_id and admin_key_name values are fetched from storage: the value stored under name is passed to the execute method as vm_name, the value under vm_image_id is passed as vm_image_id, and the value under admin_key_name is passed as the admin_key_name parameter in kwargs.
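The fetch-and-rename step the engine performs can be sketched as a small helper. This is an illustrative stand-in (the function name and the plain-dict storage are assumptions), not TaskFlow's actual engine code:

```python
def build_kwargs(storage, requires, rebind=None):
    """Assemble the kwargs for a task's execute(): for each required
    argument, fetch the value from storage, honoring any rebind mapping
    (argument name -> storage name)."""
    rebind = rebind or {}
    kwargs = {}
    for arg_name in requires:
        # Fall back to the argument's own name when it is not rebound.
        storage_name = rebind.get(arg_name, arg_name)
        kwargs[arg_name] = storage[storage_name]
    return kwargs

storage = {'name': 'vm-1', 'vm_image_id': 'img-42', 'admin_key_name': 'root-key'}
kwargs = build_kwargs(storage,
                      requires=['vm_name', 'vm_image_id', 'admin_key_name'],
                      rebind={'vm_name': 'name'})
print(kwargs)
# → {'vm_name': 'vm-1', 'vm_image_id': 'img-42', 'admin_key_name': 'root-key'}
```

Note how only vm_name is rebound; the other two arguments are fetched under their own names.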

Manually Specifying Requirements

TODO(imelnikov): describe requires parameter, optional task args and **kwargs.

Results Specification

In Python, function results are not named, so we can not infer what a task returns. Of course, the complete task result (what the execute method returns) is saved in (potentially persistent) storage, but it is not accessible to others unless the task specifies the names of those values via its provides task constructor parameter.

Returning One Value

If a task returns just one value, provides should be a string -- the name of that value.

For example:

   class TheAnswerReturningTask(task.Task):
       def execute(self):
           return 42
   TheAnswerReturningTask(provides='the_answer')

Returning Tuple

For a task that returns several values, one option (as is usual in Python) is to return them via a tuple.

For example:

   class BitsAndPiecesTask(task.Task):
       def execute(self):
           return 'BITs', 'PIECEs'

Then you can give those values individual names by passing a tuple or list as the provides parameter:

   BitsAndPiecesTask(provides=('bits', 'pieces'))

After such a task executes, you (and the engine, which is useful for other tasks) will be able to get those elements from storage by name:

   >>> storage.fetch('bits')
   'BITs'
   >>> storage.fetch('pieces')
   'PIECEs'

The provides argument can be shorter than the actual tuple returned by a task -- then the extra values are ignored (but, as expected, all those values are still saved and passed to the revert method).

Note: the provides argument tuple can also be longer than the actual tuple returned by the task -- when this happens the extra parameters are left undefined: a warning is printed to the logs, and if use of such a parameter is attempted, a NotFound exception is raised.
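Both the shorter and longer cases come down to pairing names with values positionally, which a minimal sketch (an illustration, not TaskFlow's storage code) makes concrete:

```python
def name_tuple_result(provides, result):
    """Pair each name in provides with the tuple value at the same
    position. Extra result values are ignored; names without a matching
    value are simply left undefined (absent from the mapping)."""
    return {name: value for name, value in zip(provides, result)}

# Result longer than provides: the third value is ignored.
print(name_tuple_result(('bits', 'pieces'), ('BITs', 'PIECEs', 'extra')))
# → {'bits': 'BITs', 'pieces': 'PIECEs'}

# Provides longer than result: 'pieces' stays undefined.
print(name_tuple_result(('bits', 'pieces'), ('BITs',)))
# → {'bits': 'BITs'}
```

In real TaskFlow, attempting to fetch the undefined 'pieces' in the second case is what raises the NotFound exception mentioned above.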

Returning Dictionary

Another option is to return several values as a dictionary (aka a dict).

For example:

   class BitsAndPiecesTask(task.Task):
       def execute(self):
           return {
               'bits': 'BITs',
               'pieces': 'PIECEs'
           }

TaskFlow expects a dict to be returned if the provides argument is a set:

   BitsAndPiecesTask(provides=set(['bits', 'pieces']))

After such a task executes, you (and the engine, which is useful for other tasks) will be able to get elements from storage by name:

   >>> storage.fetch('bits')
   'BITs'
   >>> storage.fetch('pieces')
   'PIECEs'

Note: if some items from the dict returned by the task are not present in the provides argument, those extra values are ignored (but, of course, still saved and passed to the revert method). If the provides argument has some items not present in the actual dict returned by the task, those extra parameters are left undefined: a warning is printed to the logs, and if use of such a parameter is attempted, a NotFound exception is raised.
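The dict case filters by name rather than by position; a minimal sketch of that behavior (again an illustration, not TaskFlow's storage code):

```python
def name_dict_result(provides, result):
    """Keep only the keys named in provides. Extra keys in the result
    are ignored; names missing from the result stay undefined."""
    return {name: result[name] for name in provides if name in result}

result = {'bits': 'BITs', 'pieces': 'PIECEs', 'scraps': 'SCRAPs'}

# 'scraps' is not in provides, so it is ignored.
print(name_dict_result({'bits', 'pieces'}, result))
# → {'bits': 'BITs', 'pieces': 'PIECEs'} (in some order)

# 'pieces' is named but absent from the result, so it stays undefined.
print(name_dict_result({'bits', 'pieces'}, {'bits': 'BITs'}))
# → {'bits': 'BITs'}
```

As with the tuple case, fetching an undefined name in real TaskFlow is what raises the NotFound exception.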

Default Provides

As mentioned above, by default a task provides nothing, which means its results are not accessible to the other tasks in the flow.

A task author can override this and specify a default value for provides using the default_provides class variable:

   class BitsAndPiecesTask(task.Task):
       default_provides = ('bits', 'pieces')
       def execute(self):
           return 'BITs', 'PIECEs'

Of course, a flow author can override this to change the names:

   BitsAndPiecesTask(provides=('b', 'p'))

or to change the structure -- e.g. this instance will make the whole tuple accessible to other tasks by the name 'bnp':

   BitsAndPiecesTask(provides='bnp')

or a flow author may want to restore the default behavior and hide the results of the task from other tasks in the flow (e.g. to avoid naming conflicts):

   BitsAndPiecesTask(provides=())