How to find installation directory of Apache Spark package in Homebrew?

I installed Spark on my Mac with Homebrew, and I'm trying to find the directory where it was installed. I've tried googling it without much luck, and it doesn't seem like it should be that tricky. What do I need to run in the macOS Terminal, or from the Spark shell, to find Spark's installation directory?



Update:



Code:



brew info apache-spark


Output:



apache-spark: stable 2.3.2, HEAD
Engine for large-scale data processing
https://spark.apache.org/
/usr/local/Cellar/apache-spark/2.3.2 (1,058 files, 244.6MB) *
Built from source on 2018-10-30 at 14:16:30
From: https://github.com/Homebrew/homebrew-core/blob/master/Formula/apache-spark.rb
==> Requirements
Required: java = 1.8 ✔
==> Options
--HEAD
Install HEAD version
==> Analytics
install: 4,534 (30 days), 14,340 (90 days), 56,698 (365 days)
install_on_request: 4,263 (30 days), 13,490 (90 days), 51,876 (365 days)
build_error: 0 (30 days)
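If you only need the path itself, Homebrew can also print it directly (a minimal sketch, assuming the formula is installed):

brew --prefix apache-spark

This prints the formula's prefix, e.g. /usr/local/opt/apache-spark, which is a symlink Homebrew maintains to the current versioned directory under /usr/local/Cellar.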


Code:



which spark-shell


Output:



/Users/sshields/anaconda2/bin/spark-shell
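To check what that binary actually is (a quick sketch; the result depends on your setup):

ls -l $(which spark-shell)

If the listing shows a symlink, its target reveals whether the Anaconda spark-shell is a separate installation or just a link to another one.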

macos apache-spark homebrew

asked Nov 19 '18 at 19:59 by user3476463
edited Nov 19 '18 at 23:25

  • Did you look at apple.stackexchange.com/questions/253128/…?
    – tk421
    Nov 19 '18 at 21:42

1 Answer

You should use brew info apache-spark; if you installed the formula with brew install, the output includes the installation path. (I did not install it, so the path is missing from the output below.)



$ brew info apache-spark
apache-spark: stable 2.3.2, HEAD
Engine for large-scale data processing
https://spark.apache.org/
Not installed
From: https://github.com/Homebrew/homebrew-core/blob/master/Formula/apache-spark.rb
==> Requirements
Required: java = 1.8 ✔
==> Options
--HEAD
Install HEAD version
==> Analytics
install: 4,534 (30 days), 14,340 (90 days), 56,698 (365 days)
install_on_request: 4,263 (30 days), 13,490 (90 days), 51,876 (365 days)
build_error: 0 (30 days)


From the website:




Homebrew installs packages to their own directory and then symlinks their files into /usr/local.
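As an illustration of that layout (hypothetical listing, based on the Cellar path shown in the question):

ls -l /usr/local/bin/spark-shell
# lrwxr-xr-x  ...  /usr/local/bin/spark-shell -> ../Cellar/apache-spark/2.3.2/bin/spark-shell

The versioned directory under /usr/local/Cellar is the actual installation; /usr/local/bin only holds symlinks into it.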

answered Nov 19 '18 at 22:01 by Jacek Laskowski

  • Thank you for getting back to me so quickly. Your suggestion works. I've added an update to my original post with the output. I noticed also if I use "which spark-shell" I get a different directory. Does that mean I have two spark environments installed?
    – user3476463
    Nov 19 '18 at 23:27

  • @user3476463 Yes. Anaconda is a Python development environment; I have no experience with it. BTW, please accept the answer if it worked for you. Thanks.
    – Jacek Laskowski
    Nov 20 '18 at 6:55
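To check for multiple installations directly (a sketch; the actual output depends on your PATH):

which -a spark-shell

This lists every spark-shell found on the PATH, in lookup order; two lines (e.g. the Anaconda copy and a Homebrew symlink) would confirm two separate installations.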
