Things on this page are fragmentary and immature notes/thoughts of the author. Please read with your own judgement!
Symptom
OutOfMemoryError
Cause
java.lang.OutOfMemoryError
is thrown when the JVM does not have enough heap memory to allocate new objects.
Solution
Increase executor memory, e.g.:
--executor-memory 20G
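A minimal sketch of passing the flag to spark-submit (the class name and jar below are hypothetical placeholders; the equivalent config key is spark.executor.memory):

```shell
# Hypothetical job submission; adjust the class, jar, and sizes to your cluster.
spark-submit \
  --class com.example.MyApp \
  --executor-memory 20G \
  --driver-memory 4G \
  my-app.jar

# Equivalent via configuration property:
spark-submit \
  --class com.example.MyApp \
  --conf spark.executor.memory=20g \
  my-app.jar
```

Note that the requested value must fit within what the cluster manager (YARN, Kubernetes, etc.) allows per container, or the executors will fail to launch.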
Reference:
http://stackoverflow.com/questions/27462061/why-does-spark-fail-with-java-lang-outofmemoryerror-gc-overhead-limit-exceeded