Install Spark on Windows (PySpark)
Michael Galarnyk

Michael, I have some problems:

1. I get a warning when I re-open the CMD window to call pyspark: "Subcommand 'ipython notebook' is deprecated... You likely want to use 'jupyter notebook' in the future." Should I go back and change the SETX sentence somewhere earlier?

2. When Jupyter opens, I only have "new Python 3" (not 2); does that matter?

3. When I just type in the first section from your GitHub, there is an error:

    ---------------------------------------------------------------------------
    NameError                                 Traceback (most recent call last)
    <ipython-input-2-0ecff6c59792> in <module>()
          2 #
          3
    ----> 4 sc = SparkContext.getOrCreate()
    NameError: name 'SparkContext' is not defined

Do you know the problem and have any recommendation to solve it? Thank you!
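For what it's worth, that NameError can be reproduced outside the notebook; my assumption is simply that the name `SparkContext` was never imported before the cell ran. A minimal sketch:

```python
# Sketch: using SparkContext without importing it first raises exactly
# this NameError, regardless of whether Spark itself is set up correctly.
try:
    sc = SparkContext.getOrCreate()  # no "from pyspark import SparkContext" ran
except NameError as err:
    print(err)  # prints: name 'SparkContext' is not defined

# The presumed fix (assuming pyspark is installed and on sys.path):
# from pyspark import SparkContext
# sc = SparkContext.getOrCreate()
```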

And I notice the Path in my User Variables is not the same as the Path in my System Variables.

Also, I was using another way to set up the PySpark environment, and I have this path as my User Variable. Do those conflict, and does it matter?

I set JAVA_HOME to: C:\Program Files\Java\jdk1.8.0_131

and the Path: C:\Users\Python\Python36-32\Scripts\;C:\Program Files\Java\jdk1.8.0_131\bin;C:\Program Files (x86)\sbt\bin;

So should I also add


to my user Path?
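For context, here is roughly how the variables I listed above could be set from CMD with setx; I am not sure this is the right way, and the exact values are just the ones from my machine:

```shell
:: Sketch: setting the variables above with setx (writes to the *user* scope).
:: Values are the ones from this comment; adjust for your own install paths.
setx JAVA_HOME "C:\Program Files\Java\jdk1.8.0_131"

:: This replaces the user Path only; the system Path is stored separately.
setx PATH "C:\Users\Python\Python36-32\Scripts\;C:\Program Files\Java\jdk1.8.0_131\bin;C:\Program Files (x86)\sbt\bin"
```

Note that setx changes only take effect in newly opened CMD windows, not the current one.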
