Scheduler/NormalizedWeights

Revision as of 08:57, 5 June 2013


Overview

Currently the weighter uses the raw values instead of normalizing them. This makes it difficult to properly use multipliers for establishing the relative importance of two weighters (one large magnitude could overshadow a smaller one).

For example, a weighter that returns values of 0 and 1 will need a multiplier that is large enough for it to be taken into account relative to the RAM weighter, which returns higher values.

This blueprint aims to introduce weight normalization so that we can apply multipliers easily. The commit does not change the behavior per se, since we currently have only one weighter. All the weights will be normalized to the range 0.0 to 1.0, so that the final weight for a host will be as follows:

   weight = w1_multiplier * norm(w1) + w2_multiplier * norm(w2) + ...

This makes it easier for a resource provider to configure the weighters and establish their relative importance.
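As a sketch, the normalization and combination described above could look like the following (`norm` and `host_weights` are illustrative names, not the actual scheduler API):

```python
def norm(values):
    """Normalize a weigher's raw values into [0.0, 1.0] (min-max)."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]

def host_weights(multipliers, raw_weights_per_weighter):
    """Combine per-weighter values: weight = sum(m_i * norm(w_i)) per host."""
    normalized = [norm(ws) for ws in raw_weights_per_weighter]
    n_hosts = len(raw_weights_per_weighter[0])
    return [sum(m * ws[h] for m, ws in zip(multipliers, normalized))
            for h in range(n_hosts)]

# Two weighters with very different raw magnitudes contribute equally
# once normalized, so equal multipliers mean equal importance:
print(host_weights([1.0, 1.0], [[0, 1], [512, 1024]]))  # [0.0, 2.0]
```

With normalization, each weighter contributes at most its multiplier to the final weight, so the multipliers alone express relative importance.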

Two kinds of normalization will be provided:

  • If the weighter specifies the upper and lower values, the weighted objects will be normalized with respect to these values.
  • If the weighter does not supply lower and upper limits, the minimum and maximum values found among the weighted objects will be used.
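Both cases can be illustrated with a single hypothetical helper, where the `lower`/`upper` parameters stand in for weigher-supplied limits (a sketch, not the actual implementation):

```python
def normalize(values, lower=None, upper=None):
    """Normalize weights into [0.0, 1.0].

    If the weigher supplies lower/upper limits they are used;
    otherwise the min and max of the values themselves are used.
    """
    if lower is None or upper is None:
        lower, upper = min(values), max(values)
    if upper == lower:
        return [0.0 for _ in values]
    return [(v - lower) / (upper - lower) for v in values]

# With weigher-supplied limits, absolute magnitudes are preserved:
print(normalize([99, 100], lower=0, upper=100))  # [0.99, 1.0]
# Without limits, only the observed spread matters:
print(normalize([99, 100]))                      # [0.0, 1.0]
```

Note how the two modes differ: supplied limits keep close values close, while derived limits always stretch the observed values across the full range.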


Comment from Lindgren in https://review.openstack.org/#/c/27160/ :

<blockquote>
If I have a weigher for a resource that ranges between 1 and 100 and normalize the weights of two objects, the normalization will produce equal results for weights that differ in scale (as long as the same object has the highest weight).

For weights [0, 1], this will result in normalized values [0.0, 1.0]. For weights [99, 100] this will still result in the same [0.0, 1.0].

Let's assume I have two objects and two weighers. Both weighers produce values in the same range (this could possibly be already normalized values, like utilization of resources, which range from 0.0 to 1.0). The weighers give weights [0.1, 0.2] and [1.0, 0.1], which without further normalization gives final weights [1.1, 0.3]. With the current way of calculating normalization, the result will instead be [1.0, 1.0], which has an effect on the ordering of the objects.
</blockquote>
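The arithmetic in the comment can be reproduced with a small sketch, using plain min-max normalization over the observed values:

```python
def normalize(values):
    """Min-max normalization over the observed values (no supplied limits)."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) if hi != lo else 0.0 for v in values]

w1, w2 = [0.1, 0.2], [1.0, 0.1]   # two weighers, two objects

# Without normalization, object 1 wins clearly:
raw = [round(a + b, 2) for a, b in zip(w1, w2)]
print(raw)  # [1.1, 0.3]

# With per-weigher min-max normalization, the objects tie:
norm_sum = [a + b for a, b in zip(normalize(w1), normalize(w2))]
print(norm_sum)  # [1.0, 1.0]
```

This shows the concern concretely: normalizing against the observed min/max discards how far apart the raw values were, which can change the ordering of the objects.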