When I run a single Scala class with Spark in IntelliJ IDEA, the following error occurs:
Error: A JNI error has occurred, please check your installation and try again
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/SparkContext
    at java.lang.Class.getDeclaredMethods0(Native Method)
    at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
    at java.lang.Class.privateGetMethodRecursive(Class.java:3048)
    at java.lang.Class.getMethod0(Class.java:3018)
    at java.lang.Class.getMethod(Class.java:1784)
    at sun.launcher.LauncherHelper.validateMainClass(LauncherHelper.java:544)
    at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:526)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.SparkContext
    at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 7 more

Process finished with exit code 1
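
For context, the class being run is an ordinary Spark driver that creates a SparkContext. A minimal sketch of such a class, assuming spark-core is on the classpath (the object name, app name, and master URL are illustrative):

import org.apache.spark.{SparkConf, SparkContext}

object SparkSmokeTest {
  def main(args: Array[String]): Unit = {
    // Launching this directly from IDEA requires spark-core on the runtime
    // classpath; if it is missing, the JVM fails while validating the main
    // class, before any user code runs.
    val conf = new SparkConf().setAppName("SparkSmokeTest").setMaster("local[*]")
    val sc = new SparkContext(conf)
    println(sc.parallelize(1 to 10).count())
    sc.stop()
  }
}

Note that the stack trace goes through validateMainClass: the JVM cannot resolve org.apache.spark.SparkContext while loading the main class's methods, so the program dies before main() even starts.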
To fix it, open the class's Run/Debug Configuration dialog, click "Modify options", select "Include dependencies with 'Provided' scope", and click OK. This resolved the issue.
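
The underlying cause is the dependency scope: Spark projects usually mark the Spark artifacts as "provided" because spark-submit supplies them on the cluster, so IDEA excludes them from the runtime classpath by default. As a sketch, assuming an sbt build (the version number is illustrative), the declaration that triggers this looks like:

// build.sbt: "provided" keeps spark-core out of the assembly jar,
// but also off IDEA's default runtime classpath
libraryDependencies += "org.apache.spark" %% "spark-core" % "3.5.0" % "provided"

The IDEA option above keeps the "provided" scope for packaging while still putting those jars on the classpath for local runs, which is usually preferable to removing the scope and bundling Spark into the assembly jar.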