When Java 9 is the default version resolved in the environment, pyspark throws the error below,
and you will see a name 'xx' is not defined error when trying to access sc, spark, etc. from the shell or Jupyter.
Python 3.6.3 (default, Oct 19 2017, 13:58:41)
[GCC 4.2.1 Compatible Apple LLVM 9.0.0 (clang-900.0.38)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
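Before changing anything, it helps to confirm which Java installation pyspark is actually picking up. The following is a minimal sketch, assuming a pip-installed pyspark and a Java 8 home at the macOS path shown (the path and the appName are placeholders; adjust them for your machine):

import os
import subprocess

# Show what Java the Spark launcher will resolve right now.
print(os.environ.get("JAVA_HOME", "JAVA_HOME is not set"))
print(subprocess.check_output(["java", "-version"], stderr=subprocess.STDOUT).decode())

# Point JAVA_HOME at a Java 8 install BEFORE the JVM gateway is started,
# so the SparkContext can be created and sc / spark get defined as usual.
# Hypothetical path -- replace with the Java 8 location on your machine.
os.environ["JAVA_HOME"] = "/Library/Java/JavaVirtualMachines/jdk1.8.0_152.jdk/Contents/Home"

from pyspark.sql import SparkSession

spark = SparkSession.builder.master("local[*]").appName("java-check").getOrCreate()
sc = spark.sparkContext
print(sc.version)  # prints the Spark version once the JVM starts under Java 8

The key detail is that JAVA_HOME has to be set before the first SparkSession (or SparkContext) is created, because the launcher reads it only when it spawns the JVM.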