Unable to load a CSV file as a DataFrame in Spark
I am trying to load a CSV file into a DataFrame, and my objective is to use the first row of the CSV file as the column names. But with the code below, I get this error:



Exception in thread "main" java.lang.AbstractMethodError
at scala.collection.TraversableLike$class.filterNot(TraversableLike.scala:278)


Code:



import org.apache.spark.sql.SparkSession

def main(args: Array[String]): Unit = {
  val spark: SparkSession = SparkSession
    .builder()
    .master("local")
    .appName("SparkSessioncsvExample")
    .config("spark.some.config.option", "some-value")
    .getOrCreate()

  val df = spark.read
    .format("csv")
    .option("header", "true") // use the first row as column names
    .load("D:/Scala/C2ImportCalEventSample.csv")
}


However, I am able to load the file with this code:



import spark.implicits._ // needed for .toDF()

val df = spark.sparkContext
  .textFile("D:/Scala/C2ImportCalEventSample1.csv")
  .map(line => line.split(","))
  .toDF()


With this second snippet the file loads successfully, but the first row is not used as the column names of the DataFrame.
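Note that sparkContext.textFile treats the header as just another record, so the RDD route requires separating the first line from the data manually. That logic can be sketched on a plain Scala collection (the object name CsvHeaderDemo and the sample rows below are made up for illustration):

```scala
object CsvHeaderDemo {
  // Treat the first CSV line as the header and the rest as data rows --
  // the same separation that option("header", "true") performs for spark.read.
  def splitHeader(lines: Seq[String]): (Array[String], Seq[Array[String]]) = {
    val header = lines.head.split(",")
    val rows   = lines.tail.map(_.split(","))
    (header, rows)
  }

  def main(args: Array[String]): Unit = {
    val lines = Seq("event,date", "meeting,2019-01-01", "review,2019-01-02")
    val (columns, rows) = splitHeader(lines)
    println(columns.mkString(", "))             // event, date
    rows.foreach(r => println(r.mkString(", ")))
  }
}
```

On an actual RDD the usual pattern is the same idea: take rdd.first() as the header and filter it out of the data. The spark.read approach with the header option does this for you, which is why it is preferred when it works.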



Spark version: 2.3.2
Scala: 2.11.3
JDK: 1.8.0_20
sbt: 1.2.7


Thanks to anyone who can help me with this.
  • Your second method is loading the file as plain text, so you won't get the column names. Could you please paste a sample of the CSV into the question?

    – stack0114106
    Dec 31 '18 at 18:57











  • Do you have Scala 2.10 on your classpath as well?

    – Harjeet Kumar
    Jan 1 at 6:57











  • How do you execute the app?

    – Jacek Laskowski
    Jan 1 at 17:24
Tags: apache-spark-sql, apache-spark-dataset
edited Dec 31 '18 at 13:29 by Yaron

asked Dec 31 '18 at 9:09 by sanjeet













1 Answer
java.lang.AbstractMethodError almost always means that the libraries on your runtime classpath differ from the ones present at compile time. In this case, check that you have the correct version of Scala (and only one version of Scala) on the classpath.
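One common way this mismatch happens with sbt is a scalaVersion that disagrees with the Scala binary version the Spark artifacts were built for (Spark 2.3.x is built for Scala 2.11). A build.sbt along these lines, sketched with the versions from the question, keeps them consistent; the %% operator appends the Scala binary suffix (_2.11) to the artifact names automatically:

```scala
// build.sbt -- pin one Scala binary version and let %% pick matching Spark jars.
// Versions below come from the question; adjust to your environment.
scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.3.2",
  "org.apache.spark" %% "spark-sql"  % "2.3.2"
)
```

If scalaVersion were set to 2.12.x here, %% would look for spark-core_2.12, which does not exist for Spark 2.3.2, and mixing a 2.12 scala-library with _2.11 Spark jars produces exactly the kind of AbstractMethodError shown in the question.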
  • Thanks, Harjeet Kumar. My environment variable was pointing to Scala 2.12.8 while my sbt file had 2.11.3. I changed the environment variable to 2.11.8 and set 2.11.8 in the sbt file as well, and it is working fine now. Thanks for the help.

    – sanjeet
    Jan 2 at 6:30
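A quick way to confirm which scala-library actually ends up on the runtime classpath is to print it from the application itself, a minimal sketch using only the standard library (the object name ScalaVersionCheck is made up):

```scala
object ScalaVersionCheck {
  def main(args: Array[String]): Unit = {
    // scala.util.Properties reports the version of the scala-library jar
    // that was actually loaded, not the one your build file requests.
    println(scala.util.Properties.versionString)        // e.g. "version 2.11.8"
    println(scala.util.Properties.versionNumberString)  // e.g. "2.11.8"
  }
}
```

If this prints a different binary version (2.12.x instead of 2.11.x) than the one your Spark artifacts were built for, runtime failures such as AbstractMethodError or NoSuchMethodError are expected.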
answered Jan 1 at 6:56 by Harjeet Kumar