PySpark & Jupyter: Unsupported major.minor version 52.0



























I downgraded from JDK 1.8 to 1.7 as I'm trying to deal with another problem for which one suggestion was to use 1.7.



However, I'm now finding that my Jupyter notebook hangs on this line:



spark = SparkSession.builder.appName("Basic").master("local[*]").config("spark.network.timeout","50s").config("spark.executor.heartbeatInterval", "50s").getOrCreate();


Looking at the console I see:



Exception in thread "main" java.lang.UnsupportedClassVersionError: org/apache/spark/launcher/Main : Unsupported major.minor version 52.0
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(Unknown Source)
at java.security.SecureClassLoader.defineClass(Unknown Source)
at java.net.URLClassLoader.defineClass(Unknown Source)
at java.net.URLClassLoader.access$100(Unknown Source)
at java.net.URLClassLoader$1.run(Unknown Source)
at java.net.URLClassLoader$1.run(Unknown Source)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at sun.launcher.LauncherHelper.checkAndLoadMain(Unknown Source)


From searching, I understand this is caused by mixing different Java versions. However, both my PATH and JAVA_HOME point to 1.7, not 1.8, and I have rebooted my machine. What else should I do? Should I remove and redo my pip install of pyspark?
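[Editor's note, not part of the original post: the suspected PATH/JAVA_HOME mismatch can be checked programmatically. A minimal diagnostic sketch, assuming only that `java` may be on the PATH and `JAVA_HOME` may be set:]

```python
import os
import shutil

# Diagnostic sketch: Spark's launcher runs whichever `java` wins on the
# PATH, so a stale PATH entry can silently undo a JAVA_HOME change.
path_java = shutil.which("java")          # first `java` found on PATH, or None
java_home = os.environ.get("JAVA_HOME")   # None if the variable is unset
home_java = os.path.join(java_home, "bin", "java") if java_home else None

print("java resolved from PATH:", path_java)
print("java under JAVA_HOME:   ", home_java)

# Compare resolved paths (realpath follows symlinks) to spot a mismatch.
if path_java and home_java and os.path.realpath(path_java) != os.path.realpath(home_java):
    print("Mismatch: PATH and JAVA_HOME point at different JDKs")
```

If the two paths differ, the JVM that Spark actually launches is not the one JAVA_HOME advertises.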



































  • It appears that you are using Spark 2. As I understand it, Spark 2 requires JDK 1.8, whereas your system is configured to use JDK 1.7, which is causing the issue. If possible, use Spark 1.6 if you want to stick with JDK 1.7.

    – Irfan Elahi
    Jan 7 at 3:49
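[Editor's note, not part of the original thread: the "52.0" in the error is the class-file format version stored in the first bytes of every `.class` file, and it maps directly to a Java release (50 = Java 6, 51 = Java 7, 52 = Java 8). A minimal sketch decoding such a header, using a fabricated 8-byte example rather than a real class file:]

```python
import struct

# Each Java release bumps the class-file major version; a JVM refuses to
# load classes newer than it understands, producing UnsupportedClassVersionError.
JAVA_RELEASE = {50: "Java 6", 51: "Java 7", 52: "Java 8"}

def class_file_major(class_bytes):
    """Return the major version from the first 8 bytes of a .class file:
    magic (4 bytes), minor version (2 bytes), major version (2 bytes)."""
    magic, _minor, major = struct.unpack(">IHH", class_bytes[:8])
    if magic != 0xCAFEBABE:
        raise ValueError("not a Java class file")
    return major

# Illustrative header only (magic, minor=0, major=52), not a real class:
header = struct.pack(">IHH", 0xCAFEBABE, 0, 52)
print(JAVA_RELEASE[class_file_major(header)])  # → Java 8
```

So the Spark launcher jar in the trace was compiled for Java 8, which a Java 7 JVM cannot load.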
















pyspark






edited Jan 2 at 16:23 by user1761806

asked Jan 2 at 16:16 by user1761806













1 Answer
































Just use a Docker container from https://github.com/jupyter/docker-stacks.
Why make it difficult for yourself?





























answered Jan 7 at 2:04 by user9382513






























