Things on this page are fragmentary and immature notes/thoughts of the author. Please read with your own judgement!
Symptoms
```
java.io.IOException: ViewFs: Cannot initialize: Empty Mount table in config for viewfs://cluster-name-ns02/
```
Possible Causes
As the error message says, the mount table for `viewfs://cluster-name-ns02` is not configured on the client.

- It is possible that the cluster has just migrated to Router-based Federation (RBF) NameNodes, but the Spark client has not been updated accordingly.
- The HDFS path is not configured to be accessible.
Possible Solutions
- Ask the Hadoop admin to update the Hadoop/Spark client (both the Hadoop binaries and the configuration files) if the issue is due to missing RBF compatibility.
- Ask the Hadoop admin to configure the mount table for `viewfs://cluster-name-ns02` if the issue is due to misconfiguration.
- Use a different HDFS path.
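
For reference, a ViewFs mount table is declared in the client's `core-site.xml` via `fs.viewfs.mounttable.<name>.link.<path>` properties. A minimal sketch for the cluster above; the backing nameservice `hdfs://ns02-real` and the `/user` link are hypothetical placeholders:

```xml
<!-- core-site.xml (sketch): mount table for viewfs://cluster-name-ns02/ -->
<property>
  <name>fs.defaultFS</name>
  <value>viewfs://cluster-name-ns02</value>
</property>
<!-- Map /user under the view to a backing nameservice (hypothetical name) -->
<property>
  <name>fs.viewfs.mounttable.cluster-name-ns02.link./user</name>
  <value>hdfs://ns02-real/user</value>
</property>
```

If no `fs.viewfs.mounttable.cluster-name-ns02.link.*` entry is present on the client, ViewFs sees an empty mount table and fails with exactly the exception shown above.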