Use jps to check whether startup succeeded:
[hadoop@localhost sbin]$ jps
4706 Jps
3692 DataNode
3876 SecondaryNameNode
4637 Worker
4137 NodeManager
4517 Master
4026 ResourceManager
3587 NameNode
Both a Master and a Worker process are present, which means the standalone cluster started successfully.
You can view the Spark cluster status through the master's web UI on port 8080.
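This check can be scripted. The sketch below (an illustration, not part of the original setup) defines a hypothetical `check_spark` helper that scans jps output for the Master and Worker process names, here fed the sample output shown above:

```shell
# check_spark: succeed (exit 0) only if the jps output passed as $1
# contains both a Master and a Worker line.
check_spark() {
  echo "$1" | grep -qw Master && echo "$1" | grep -qw Worker
}

# Sample jps output from a running standalone cluster
jps_out="4706 Jps
4637 Worker
4517 Master"

if check_spark "$jps_out"; then
  echo "Spark standalone cluster is up"
else
  echo "Master or Worker process is missing" >&2
fi
```

In a real session you would pipe live output instead, e.g. `check_spark "$(jps)"`.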
4 Running the example programs bundled with Spark
First, take a look at Spark's bin directory:
[hadoop@localhost sbin]$ ll ../bin/
total 56
-rw-rw-r--. 1 hadoop hadoop 2601 Mar 27 13:44 compute-classpath.cmd
-rwxrwxr-x. 1 hadoop hadoop 3330 Mar 27 13:44 compute-classpath.sh
-rwxrwxr-x. 1 hadoop hadoop 2070 Mar 27 13:44 pyspark
-rw-rw-r--. 1 hadoop hadoop 1827 Mar 27 13:44 pyspark2.cmd
-rw-rw-r--. 1 hadoop hadoop 1000 Mar 27 13:44 pyspark.cmd
-rwxrwxr-x. 1 hadoop hadoop 3055 Mar 27 13:44 run-example
-rw-rw-r--. 1 hadoop hadoop 2046 Mar 27 13:44 run-example2.cmd
-rw-rw-r--. 1 hadoop hadoop 1012 Mar 27 13:44 run-example.cmd
-rwxrwxr-x. 1 hadoop hadoop 5151 Mar 27 13:44 spark-class
-rwxrwxr-x. 1 hadoop hadoop 3212 Mar 27 13:44 spark-class2.cmd
-rw-rw-r--. 1 hadoop hadoop 1010 Mar 27 13:44 spark-class.cmd
-rwxrwxr-x. 1 hadoop hadoop 3184 Mar 27 13:44 spark-shell
-rwxrwxr-x. 1 hadoop hadoop 941 Mar 27 13:44 spark-shell.cmd
The run-example script launches one of the bundled example programs against the standalone master. For instance, to run the logistic-regression and Pi-estimation examples:
run-example org.apache.spark.examples.SparkLR spark://localhost:7077
run-example org.apache.spark.examples.SparkPi spark://localhost:7077
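SparkPi estimates Pi by Monte Carlo sampling: it throws random points into the unit square and counts how many land inside the quarter circle, a computation Spark then distributes across the cluster's workers. A minimal local sketch of the same idea in awk (just to illustrate what the example computes, not how Spark parallelizes it):

```shell
# Monte Carlo estimate of Pi: the fraction of random points (x, y) in the
# unit square with x^2 + y^2 <= 1 approximates Pi/4.
awk 'BEGIN {
  srand(1)                     # fixed seed for repeatability
  n = 100000; hits = 0
  for (i = 0; i < n; i++) {
    x = rand(); y = rand()
    if (x * x + y * y <= 1) hits++
  }
  printf "Pi is roughly %f\n", 4 * hits / n
}'
```

SparkPi prints a similar "Pi is roughly ..." line once all tasks finish.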