TaskFlow/Task Arguments and Results

Revised on: 11/12/2013 by Ivan Melnikov

Overview

In TaskFlow, all flow and task state goes to (potentially persistent) storage (via the logbook backends and persistence design). That includes all the information that tasks in a flow need when they are executed, and all the information a task produces (via serializable task results). A developer who implements tasks or flows can specify what arguments a task accepts and what result it returns in several ways. This document will help you understand what those ways are and how to use them to achieve your desired TaskFlow usage pattern.

Task arguments
The set of names of task arguments, available as the requires property of the task instance. When a task is about to be executed, values with these names are retrieved from storage and passed to the execute method of the task.
Task results
The set of names of task results (what the task provides), available as the provides property of the task instance. After a task finishes successfully, its result(s) (what the task's execute method returns) are available by these names from storage (see examples below).
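
For instance, here is a minimal sketch tying the two together (the MakeLunchTask class and its argument names are purely illustrative; both properties are covered in detail in the sections below):

    >>> class MakeLunchTask(task.Task):
    ...     def execute(self, bread, filling):
    ...         return bread + filling
    ... 
    >>> MakeLunchTask(provides='lunch').requires
    set(['bread', 'filling'])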

Arguments Specification

There are several ways to specify the requires set of task arguments.

Arguments Inference

Task arguments can be inferred from the arguments of the task's execute method.

For example:

    >>> class MyTask(task.Task):
    ...     def execute(self, spam, eggs):
    ...         return spam + eggs
    ... 
    >>> MyTask().requires
    set(['eggs', 'spam'])

Inference from the method signature is the simplest way to specify task arguments. Optional arguments (with default values) and special arguments like self, *args and **kwargs are ignored on inference (as these names have special meaning/usage in Python).

For example:

    >>> class MyTask(task.Task):
    ...     def execute(self, spam, eggs=()):
    ...         return spam + eggs
    ... 
    >>> MyTask().requires
    set(['spam'])
    >>>
    >>> class UniTask(task.Task):
    ...     def execute(self,  *args, **kwargs):
    ...         pass
    ... 
    >>> UniTask().requires
    set([])

Rebinding

Why: There are cases when the value you want to pass to a task is stored under a name other than the corresponding task argument's name. That's when the rebind task constructor parameter comes in handy. Using it, the flow author can instruct the engine to fetch a value from storage by one name, but pass it to the task's execute method under another name. There are two possible ways of accomplishing this.

The first is to pass a dictionary that maps the task argument name to the name of a saved value.

For example:

If you have task

    class SpawnVMTask(task.Task):
        def execute(self, vm_name, vm_image_id, **kwargs):
            pass  # TODO(imelnikov): use parameters to spawn vm

and you saved the value for 'vm_name' under the 'name' key in storage, you can spawn a vm with that name like this:

    SpawnVMTask(rebind={'vm_name': 'name'})

The second way is to pass a tuple/list/dict of argument names. The length of the tuple/list/dict should not be less than the number of required task parameters. For example, you can achieve the same effect as the previous example with:

    SpawnVMTask(rebind=('name', 'vm_image_id'))

which is equivalent to a more elaborate:

    SpawnVMTask(rebind=dict(vm_name='name',
                            vm_image_id='vm_image_id'))

In both cases, if your task accepts arbitrary arguments via the **kwargs construct, you can specify extra arguments.

For example:

    SpawnVMTask(rebind=('name', 'vm_image_id', 'admin_key_name'))

When such a task is about to be executed, the name, vm_image_id and admin_key_name values are fetched from storage; the value stored under name is passed to the execute method as vm_name, the value of vm_image_id is passed as vm_image_id, and the value of admin_key_name is passed as the admin_key_name parameter in kwargs.
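
Conceptually, the engine ends up making a call roughly like the following (a sketch only, not the engine's actual code path; the storage object and its fetch method are the same ones used in the examples further below):

    # Illustrative only: how the rebound names map onto the execute() call.
    rebind = {'vm_name': 'name',
              'vm_image_id': 'vm_image_id',
              'admin_key_name': 'admin_key_name'}
    kwargs = dict((arg_name, storage.fetch(stored_name))
                  for arg_name, stored_name in rebind.items())
    SpawnVMTask().execute(**kwargs)  # admin_key_name lands in **kwargs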

Manually Specifying Requirements

Why: It is often useful to manually specify the requirements of a task, either by a task author or by the flow author (allowing the flow author to override the task requirements).

To accomplish this, when creating your task, use the constructor to specify manual requirements. Those manual requirements (if they are not functional arguments) will appear in the kwargs of the execute() method.

For example:

    >>> class Cat(task.Task):
    ...     def __init__(self, **kwargs):
    ...         if 'requires' not in kwargs:
    ...             kwargs['requires'] = ("food", "milk")
    ...         super(Cat, self).__init__(**kwargs)
    ...     def execute(self, food, **kwargs):
    ...         pass
    ... 
    >>> Cat().requires
    set(['food', 'milk'])

During flow construction, the flow author can also add additional requirements to your task if desired. Those manual requirements (if they are not functional arguments) will appear in the kwargs or args of the execute() method.

For example:

    >>> class Dog(task.Task):
    ...     def execute(self, food, **kwargs):
    ...         pass
    ... 
    >>> Dog(requires=("food", "water", "grass")).requires
    set(['food', 'water', 'grass'])

If the flow author desires to turn off the argument inference altogether and manually override what a task requires, that is possible as well; use this at your own risk, as you must be careful to avoid invalid argument mappings.

For example:

    >>> class Bird(task.Task):
    ...     def execute(self, food, *args, **kwargs):
    ...         pass
    ... 
    >>> Bird(requires=("food", "water", "grass"),
    ...      auto_extract=False).requires
    set(['food', 'water', 'grass'])

Results Specification

Why: In Python, function results are not named, so we cannot infer what a task returns. This is important since the complete task result (what the execute method returns) is saved in (potentially persistent) storage, and it is typically (but not always) desirable to make those results accessible to other tasks. To accomplish this, the task specifies the names of those values via its provides task constructor parameter or another method (see below).

Returning One Value

If a task returns just one value, provides should be a string -- the name of that value.

For example:

    class TheAnswerReturningTask(task.Task):
        def execute(self):
            return 42

    TheAnswerReturningTask(provides='the_answer')

Returning Tuple

For a task that returns several values, one option (as usual in Python) is to return those values via a tuple.

For example:

    class BitsAndPiecesTask(task.Task):
        def execute(self):
            return 'BITs', 'PIECEs'

Then, you can give the values individual names by passing a tuple or list as the provides parameter:

    BitsAndPiecesTask(provides=('bits', 'pieces'))

After such a task executes, you (and the engine, which is useful for other tasks) will be able to get those elements from storage by name:

    >>> storage.fetch('bits')
    'BITs'
    >>> storage.fetch('pieces')
    'PIECEs'

The provides argument can be shorter than the actual tuple returned by a task -- then the extra values are ignored (but, as expected, all those values are saved and passed to the revert method).

Note: The provides tuple can also be longer than the actual tuple returned by the task -- when this happens the extra parameters are left undefined: a warning is printed to the logs, and if use of such a parameter is attempted, a NotFound exception is raised.
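
For instance, a quick sketch of both mismatch cases (the names used here are purely illustrative):

    # provides is shorter than the returned tuple: 'PIECEs' is still saved
    # (and passed to revert), it just gets no name in storage.
    BitsAndPiecesTask(provides=('bits',))

    # provides is longer than the returned tuple: 'leftovers' stays undefined,
    # a warning is logged and fetching it raises NotFound.
    BitsAndPiecesTask(provides=('bits', 'pieces', 'leftovers'))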

Returning Dictionary

Another option is to return several values as a dictionary (aka a dict).

For example:

    class BitsAndPiecesTask(task.Task):
        def execute(self):
            return {
                'bits': 'BITs',
                'pieces': 'PIECEs'
            }

TaskFlow expects that a dict will be returned if the provides argument is a set:

    BitsAndPiecesTask(provides=set(['bits', 'pieces']))

After such a task executes, you (and the engine, which is useful for other tasks) will be able to get the elements from storage by name:

    >>> storage.fetch('bits')
    'BITs'
    >>> storage.fetch('pieces')
    'PIECEs'

Note: if some items from the dict returned by the task are not present in the provides argument, those extra values are ignored (but, of course, saved and passed to the revert method). If the provides argument has some items not present in the actual dict returned by the task, those extra parameters are left undefined: a warning is printed to the logs, and if use of such a parameter is attempted, a NotFound exception is raised.
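
The same two mismatch cases can be sketched for the dict form (again, the names are illustrative):

    # 'pieces' is returned but not listed in provides: it is saved and passed
    # to revert, but cannot be fetched from storage by name.
    BitsAndPiecesTask(provides=set(['bits']))

    # 'leftovers' is listed in provides but never returned: it stays undefined,
    # a warning is logged and fetching it raises NotFound.
    BitsAndPiecesTask(provides=set(['bits', 'pieces', 'leftovers']))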

Default Provides

As mentioned above, the default task base class provides nothing, which means its results are not accessible to the other tasks in the flow.

The task author can override this and specify a default value for provides using the default_provides class variable:

    class BitsAndPiecesTask(task.Task):
        default_provides = ('bits', 'pieces')
        def execute(self):
            return 'BITs', 'PIECEs'

Of course, the flow author can override this to change names if needed:

    BitsAndPiecesTask(provides=('b', 'p'))

or to change the structure -- e.g. this instance will make the whole tuple accessible to other tasks by the name 'bnp':

    BitsAndPiecesTask(provides='bnp')

or the flow author may want to return to the default behavior and hide the results of the task from other tasks in the flow (e.g. to avoid naming conflicts):

    BitsAndPiecesTask(provides=())

Revert Arguments

To revert a task, the engine calls its revert method. This method should accept the same arguments as the execute method of the task, plus one more special keyword argument, named result.

For the result value, two cases are possible:

  • if the task is being reverted because it failed (an exception was raised from its execute method), the result value is an instance of taskflow.utils.misc.Failure, which holds the exception information;
  • if the task is being reverted because some other task failed, and this task itself finished successfully, the result value is the task result fetched from storage: basically, what the execute method returned.

All other arguments are fetched from storage in the same way as it is done for the execute method.

For example:

    from taskflow.utils import misc

    class RevertingTask(task.Task):
        def execute(self, spam, eggs):
            return do_something(spam, eggs)
        def revert(self, result, spam, eggs):
            if isinstance(result, misc.Failure):
                print("This task failed, exception: %s"  % result.exception_str)
            else:
                print("do_something returned %r" % result)

If this task failed (do_something raised an exception), it will print "This task failed, exception:" and the exception message on revert. If this task finished successfully, it will print "do_something returned" and a representation of the result.