Sahara/ReleaseNotes/0.3

Blueprint and bugs info: https://launchpad.net/savanna/0.3/0.3


Elastic Data Processing (first release):

  • MapReduce jobs, including Pig and Hive scripts, can be created and launched through the REST API or the UI in a flexible, configurable manner (see the sketch after this list);
  • jobs can be stored both in the Savanna internal DB and in Swift;
  • Swift is supported as a Data Source for MapReduce job input and output;
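
For illustration, below is a minimal sketch of driving EDP over the REST API with Python's requests library. The endpoint paths, payload fields, placeholder values, and the swift:// URL form are assumptions modelled on the v1.1 EDP API rather than a verbatim excerpt from this release; check the Savanna documentation for the exact details.

  # Hedged sketch: register a Swift Data Source and launch an existing job.
  # Endpoint paths and payload fields are assumptions based on the v1.1 EDP API.
  import requests

  SAVANNA = "http://savanna.example.com:8386/v1.1/<tenant-id>"  # hypothetical endpoint
  HEADERS = {"X-Auth-Token": "<keystone-token>",
             "Content-Type": "application/json"}

  # Register a Swift container as the input Data Source for a MapReduce job.
  input_ds = requests.post(f"{SAVANNA}/data-sources", headers=HEADERS, json={
      "name": "pig-input",
      "type": "swift",
      "url": "swift://demo-container.savanna/input",
      "credentials": {"user": "demo", "password": "secret"},
  }).json()

  # Launch a previously created job (e.g. a Pig script) on a running cluster.
  run = requests.post(f"{SAVANNA}/jobs/<job-id>/execute", headers=HEADERS, json={
      "cluster_id": "<cluster-id>",
      "input_id": input_ds["data_source"]["id"],
      "output_id": "<output-data-source-id>",
      "job_configs": {"configs": {}, "args": []},
  }).json()
  print(run)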


Hadoop provisioning:

  • data locality is now fully supported, with rack and 4-level node group awareness for both HDFS and Swift;
  • the Vanilla plugin supports parallel provisioning of the latest Hadoop 1.2.1 clusters, including the Oozie and Hive components needed for EDP (see the sketch after this list);
  • the Hortonworks Data Platform (HDP) plugin likewise supports parallel provisioning of HDP 1.3.2;
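
As a rough illustration of the provisioning API, here is a hedged sketch of creating a Vanilla Hadoop 1.2.1 cluster from a pre-built cluster template over REST; the endpoint and field names are assumptions based on the v1.1 clusters API, not a verbatim excerpt from this release.

  # Hedged sketch: provision a cluster with the Vanilla plugin.
  import requests

  SAVANNA = "http://savanna.example.com:8386/v1.1/<tenant-id>"  # hypothetical endpoint
  HEADERS = {"X-Auth-Token": "<keystone-token>",
             "Content-Type": "application/json"}

  cluster = requests.post(f"{SAVANNA}/clusters", headers=HEADERS, json={
      "name": "demo-cluster",
      "plugin_name": "vanilla",            # or "hdp" for the HDP 1.3.2 plugin
      "hadoop_version": "1.2.1",
      "cluster_template_id": "<cluster-template-id>",
      "default_image_id": "<image-id>",
      "user_keypair_id": "<keypair-name>",
  }).json()
  print(cluster)                           # instances are provisioned in parallel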


Core Savanna:

  • extended network configuration, including a floating IP pool setting for node groups and automatic floating IP assignment (see the sketch after this list);
  • Neutron support (nova-network is supported too);
  • all DB-related code has now been moved to the “conductor”;
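
A hedged sketch of the floating IP configuration follows: a node group template that names a floating IP pool so that every instance in that group is automatically assigned a floating address. The field names are assumptions based on the node group templates API; consult the documentation for the exact schema.

  # Hedged sketch: node group template with a floating IP pool.
  import requests

  SAVANNA = "http://savanna.example.com:8386/v1.1/<tenant-id>"  # hypothetical endpoint
  HEADERS = {"X-Auth-Token": "<keystone-token>",
             "Content-Type": "application/json"}

  ng_template = requests.post(f"{SAVANNA}/node-group-templates", headers=HEADERS, json={
      "name": "worker-with-floating-ip",
      "plugin_name": "vanilla",
      "hadoop_version": "1.2.1",
      "flavor_id": "2",
      "node_processes": ["tasktracker", "datanode"],
      # Neutron external network (or nova-network pool) used for auto assignment.
      "floating_ip_pool": "<external-network-id-or-pool-name>",
  }).json()
  print(ng_template)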


Python client:

  • all core and EDP features are implemented and fully operational through python-savannaclient (see the sketch after this list);
  • it supports both the v2.0 and v3 versions of the Keystone REST API;
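
Below is a minimal sketch of using python-savannaclient. The import path, constructor arguments, and manager names are assumptions modelled on other OpenStack clients of the same era; check the client's own documentation for the exact signature.

  # Hedged sketch: list plugins and clusters through python-savannaclient.
  from savannaclient.api.client import Client   # import path is an assumption

  savanna = Client(username="demo",
                   api_key="secret",             # the user's password
                   project_name="demo",
                   auth_url="http://keystone.example.com:5000/v2.0/",
                   savanna_url="http://savanna.example.com:8386/v1.1/<tenant-id>")

  for plugin in savanna.plugins.list():          # manager names are assumptions
      print(plugin.name)
  for cluster in savanna.clusters.list():
      print(cluster.name, cluster.status)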


OpenStack Dashboard plugin:

  • all core and EDP features are supported in the UI;
  • it now uses python-savannaclient to communicate with Savanna;


Others:

  • supports building images with Oozie, Hive, and optional MySQL for the Vanilla plugin;
  • one-click image building for the Vanilla plugin;
  • UI integration tests implemented using Selenium;
  • Savanna can be installed on Fedora using yum;
  • all four main Savanna subprojects are now available on PyPI.