OverOps for Spark


Monitor and analyze all Spark errors. Get full visibility into your cluster.

OverOps tracks ALL errors coming from Spark and provides rich analytics
to help you see which ones really affect you.

See the entire stack and the variable values that led to each error, laid over your original code.
See how changes to your code impact Spark.

WordCount tokenize

private def tokenize(text: String): Array[String] = {
  // value recorded at the error: text = null
  text.toLowerCase.replaceAll("[^a-zA-Z0-9\\s]", "").split("\\s+")
}

WordCount execute

def execute(master: String, args: List[String], jars: Seq[String] = Nil) {
  // values recorded at the error: master = "local",
  // args = "/tmp/1427040635690-0/input" :: ...
  val sc = new SparkContext(master, "WordCountJob", null, jars)
  // sc: appName = "WordCountJob", master = "local", startTime = 1427040637334
  val file = sc.textFile(args(0))
  // file: origin = "textFile at WordCount.scala:29"
  val words = file.flatMap(line => tokenize(line))
  val wordCounts = words.map(x => (x, 1)).reduceByKey(_ + _)
  wordCounts.saveAsTextFile(args(1))
}

Hover over variables to see their value when the error occurred. OverOps automatically reconstructs the code leading to the error within the JVM.
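For readers who want to run the demo job themselves, here is a minimal, self-contained sketch of the full application. It is an illustration under assumptions: the object name, main method, and spark-submit invocation are not part of the original demo, and it uses the legacy Spark 1.x SparkContext constructor shown in the snapshot.

import org.apache.spark.SparkContext

object WordCountJob {

  // Splits a line into lowercase alphanumeric tokens. Throws a
  // NullPointerException when text is null, as in the demo snapshot
  // above where text was recorded as null.
  private def tokenize(text: String): Array[String] =
    text.toLowerCase.replaceAll("[^a-zA-Z0-9\\s]", "").split("\\s+")

  def execute(master: String, args: List[String], jars: Seq[String] = Nil): Unit = {
    // Legacy Spark 1.x constructor: (master, appName, sparkHome, jars).
    val sc = new SparkContext(master, "WordCountJob", null, jars)
    try {
      val file = sc.textFile(args(0))       // input path
      val words = file.flatMap(line => tokenize(line))
      val wordCounts = words.map(x => (x, 1)).reduceByKey(_ + _)
      wordCounts.saveAsTextFile(args(1))    // output path
    } finally {
      sc.stop()                             // release cluster resources
    }
  }

  def main(args: Array[String]): Unit =
    execute("local", args.toList)
}

Packaged as a jar, a run might look like: spark-submit --class WordCountJob wordcount.jar /tmp/input /tmp/output.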

FREE trial   |   Unlimited number of servers   |   Unlimited number of users