Hadoop performs best on a cluster of multiple nodes/servers, but it also runs fine on a single machine, even a Mac, which makes it convenient for development. Spark is a popular tool for processing data in Hadoop. The purpose of this blog is to show you the steps to install Hadoop and Spark on a Mac.
Operating System: OS X El Capitan 10.11.3
Hadoop Version 2.7.2
1. Install Java
Open a terminal window to check what Java version is installed.
$ java -version
If Java is not installed, go to https://java.com/en/download/ to download and install the latest JDK. If Java is installed, use the following command in a terminal window to find the Java home path:
$ /usr/libexec/java_home
Next, we need to set the JAVA_HOME environment variable on the Mac:
$ echo 'export JAVA_HOME=$(/usr/libexec/java_home)' >> ~/.bash_profile
$ source ~/.bash_profile
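To confirm the variable took effect, you can check that the export line landed in your profile and print the resolved value; a quick sanity check (the example JDK path in the comment is illustrative, yours will differ) might look like:

```shell
# Confirm the export line was appended to the profile
grep 'JAVA_HOME' ~/.bash_profile

# Print the resolved value; it should be a path such as
# /Library/Java/JavaVirtualMachines/jdk1.8.0_xx.jdk/Contents/Home
echo "$JAVA_HOME"
```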
2. Enable SSH, as Hadoop requires it.
Go to System Preferences -> Sharing -> and check “Remote Login”.
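Hadoop's startup scripts ssh into localhost, so you will also want to be able to log in without a passphrase. A typical sketch (assuming you have no existing key at ~/.ssh/id_rsa that this would overwrite) is:

```shell
# Generate a key pair with an empty passphrase (skip if ~/.ssh/id_rsa already exists)
ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa

# Authorize the new key for logins to this machine
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys

# This should now log in and exit without prompting for a password
ssh localhost exit
```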