PySpark "does not exist in the JVM" error when initializing SparkContext












I am using Spark on EMR and writing a PySpark script. I am getting an error when running:



from pyspark import SparkContext
sc = SparkContext()


This is the error:



File "pyex.py", line 5, in <module>
sc = SparkContext() File "/usr/local/lib/python3.4/site-packages/pyspark/context.py", line 118, in __init__
conf, jsc, profiler_cls) File "/usr/local/lib/python3.4/site-packages/pyspark/context.py", line 195, in _do_init
self._encryption_enabled = self._jvm.PythonUtils.getEncryptionEnabled(self._jsc) File "/usr/local/lib/python3.4/site-packages/py4j/java_gateway.py", line 1487, in __getattr__
"{0}.{1} does not exist in the JVM".format(self._fqn, name)) py4j.protocol.Py4JError: org.apache.spark.api.python.PythonUtils.getEncryptionEnabled does not exist in the JVM


I found this answer stating that I need to import SparkContext, but that is not working either.










python python-3.x apache-spark pyspark amazon-emr






asked Nov 5 '18 at 20:45 by thebeancounter (edited Nov 7 '18 at 16:22)













  • Did you close the SparkContext? Also, can you show the full code?

    – karma4917
    Nov 5 '18 at 21:04













  • This is happening before I get the chance to use it. I create it and immediately get the error.

    – thebeancounter
    Nov 5 '18 at 21:05











  • What do you get if you do print(conf)?

    – karma4917
    Nov 5 '18 at 21:13











  • <module 'pyspark.conf' from '/usr/local/lib/python3.4/site-packages/pyspark/conf.py'>

    – thebeancounter
    Nov 5 '18 at 21:26











  • Try sc = SparkContext(conf)

    – karma4917
    Nov 5 '18 at 21:26
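For reference, pyspark's SparkContext takes the configuration through the conf keyword argument, so a minimal sketch of the suggestion above would be (the app name is only an illustrative placeholder, not something from the question):

from pyspark import SparkConf, SparkContext

conf = SparkConf().setAppName("pyex")  # illustrative name
sc = SparkContext(conf=conf)           # pass the SparkConf as a keyword argument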





















5 Answers
PySpark recently released 2.4.0, but there is no stable Spark release coinciding with this new version. Try downgrading to pyspark 2.3.2; this fixed it for me.



Edit: to be clear, your PySpark version needs to be the same as the Apache Spark version you have downloaded, or you may run into compatibility issues.



Check the version of pyspark with:




pip freeze







– svw, answered Nov 7 '18 at 6:01 (edited Nov 28 '18 at 13:40 by Rob)


























  • Which Spark version have they even released PySpark 2.4.0 for, then?

    – shubhamgoel27
    Nov 16 '18 at 7:22











  • When I made this post, spark.apache.org/downloads.html did not have 2.4.0 available for download, only 2.3.2. As long as the pyspark version == Apache Spark's, you should be good. I will update the post.

    – svw
    Nov 17 '18 at 14:37
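A minimal way to confirm the version match described above, assuming the pyspark package is importable in the same Python environment, is to print its version and compare it against the Spark release installed on the cluster (for example the output of spark-submit --version on the EMR master node):

import pyspark

# The pip-installed PySpark version; it should match the cluster's
# Apache Spark release (2.3.2 in the answer above).
print(pyspark.__version__)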





















I just had a fresh pyspark installation on my Windows device and was having the exact same issue. What seems to have helped is the following:



Go to your System Environment Variables and add PYTHONPATH with the following value: %SPARK_HOME%\python;%SPARK_HOME%\python\lib\py4j-<version>-src.zip;%PYTHONPATH% (just check which py4j version you have in your spark/python/lib folder).



I think this works because when I installed pyspark using conda, it also downloaded a py4j version that may not be compatible with the specific version of Spark; Spark seems to package its own py4j version.






– mugurkt, answered Nov 6 '18 at 16:06
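A hedged alternative sketch of the same idea, pointing sys.path at Spark's bundled Python sources and py4j zip at runtime instead of editing the system environment variables; the py4j file name below is an assumption, so check the actual file under %SPARK_HOME%\python\lib:

import os
import sys

spark_home = os.environ["SPARK_HOME"]  # assumes SPARK_HOME is already set
sys.path.insert(0, os.path.join(spark_home, "python"))
# The zip name is version-specific; adjust it to whatever ships with your Spark.
sys.path.insert(0, os.path.join(spark_home, "python", "lib", "py4j-0.10.7-src.zip"))

from pyspark import SparkContext  # should now resolve Spark's bundled py4j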































Use SparkContext().stop() at the end of the program to avoid this situation.






– abhishek kumar, answered Nov 21 '18 at 18:30
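A minimal sketch of that suggestion, keeping a reference to the context and stopping it explicitly when the script finishes so it does not linger between runs:

from pyspark import SparkContext

sc = SparkContext()
try:
    rdd = sc.parallelize([1, 2, 3])  # placeholder work; the real job goes here
    print(rdd.count())
finally:
    sc.stop()  # release the JVM gateway even if the job above fails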































The following steps solved my issue:
- Downgrading pyspark to 2.3.2
- Adding PYTHONPATH as a System Environment Variable with the value %SPARK_HOME%\python;%SPARK_HOME%\python\lib\py4j-<version>-src.zip;%PYTHONPATH%

Note: use the proper py4j version in the value above; don't copy it exactly.






– Babu Reddy, answered Dec 22 '18 at 8:05































Instead of editing the Environment Variables, you might just ensure that the Python environment (the one with pyspark) also has the same py4j version as the zip file present in the python\lib directory within your Spark folder, e.g. d:\Programs\Spark\python\lib\py4j-0.10.7-src.zip on my system, for Spark 2.3.2. It's the py4j version shipped as part of the Spark archive file.






– Pawel Kranzberg, answered Feb 1 at 10:57
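A hedged way to perform that check, assuming py4j was installed with pip or conda: print the installed version and compare it against the py4j-<version>-src.zip file name under Spark's python\lib directory.

import pkg_resources

# Version of the py4j package in the current Python environment; it should
# match the py4j zip bundled with Spark (e.g. py4j-0.10.7 for Spark 2.3.2).
print(pkg_resources.get_distribution("py4j").version)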






















