Sahara/SparkPlugin

Introduction

Spark is an in-memory cluster computing framework, written in Scala, that generalizes the MapReduce model.
This blueprint proposes a Savanna provisioning plugin for Spark that can launch and resize Spark clusters and run EDP jobs.
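
As a rough illustration of what such a plugin could look like, the sketch below outlines a class with the kind of entry points Savanna's provisioning plugin SPI calls into (version listing, node processes, configure/start/scale hooks). The class name, method bodies and the process labels are illustrative assumptions, not part of this blueprint; a real plugin would subclass Savanna's provisioning plugin base class rather than a plain object.

 # Illustrative sketch only: a real plugin would subclass Savanna's
 # provisioning plugin base class and implement its SPI. Names, return
 # values and process labels here are assumptions for illustration.
 class SparkPluginSketch(object):

     def get_title(self):
         return "Apache Spark"

     def get_description(self):
         return "Deploys Spark in standalone mode on top of HDFS"

     def get_versions(self):
         # Only Spark 0.8.0 and later are targeted by this blueprint.
         return ["0.8.0"]

     def get_node_processes(self, version):
         # Hypothetical process names for the Spark master/worker roles,
         # plus the HDFS daemons used for data storage.
         return {
             "Spark": ["master", "slave"],
             "HDFS": ["namenode", "datanode"],
         }

     def configure_cluster(self, cluster):
         # Push Spark and HDFS configuration files to all instances.
         pass

     def start_cluster(self, cluster):
         # Start HDFS first, then the Spark master and the workers.
         pass

     def scale_cluster(self, cluster, instances):
         # Add the new instances as Spark workers / HDFS datanodes.
         pass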

Requirements

Support for Spark version 0.8.0 and later is planned, since 0.8.0 relaxed the dependencies on Hadoop and HDFS library versions. Only Spark in standalone mode is targeted; no support for Mesos or YARN is planned for now.
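
Standalone mode keeps the deployment simple: a cluster only needs a master address and a list of worker hosts. The snippet below sketches how a plugin might render the two files Spark's standalone scripts read, conf/spark-env.sh and conf/slaves. The helper names, the default worker memory and the example addresses are assumptions for illustration, not part of this blueprint.

 # Illustrative helpers: generate minimal standalone-mode configuration
 # (conf/spark-env.sh and conf/slaves) that the plugin would push to the
 # cluster nodes. Helper names and defaults are assumptions.
 def render_spark_env(master_ip, worker_memory_mb=1024):
     return "\n".join([
         "export SPARK_MASTER_IP=%s" % master_ip,
         "export SPARK_MASTER_PORT=7077",
         "export SPARK_WORKER_MEMORY=%dm" % worker_memory_mb,
     ]) + "\n"

 def render_slaves(worker_hostnames):
     # One worker hostname per line, as read by Spark's start-slaves.sh.
     return "\n".join(worker_hostnames) + "\n"

 if __name__ == "__main__":
     print(render_spark_env("10.0.0.2"))
     print(render_slaves(["10.0.0.3", "10.0.0.4"]))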

Implementation Notes

TBD

Related Resources