[Linux Terminal]
cd
ls zeppelin*
ln -s zeppelin-0.5.6-incubating-bin-all zeppelin
ls 20*
cp Downloads/hadoop_cnf/* hadoop/etc/hadoop/
cp -rp spark_project01/ spark_project02
jps
pwd
sudo -i
unzip ml-100k.zip
tar -zxf zeppelin-0.5.6-incubating-bin-all.tgz
gedit conf/flume_avro.txt
vi hadoop_cnf/hdfs-site.xml
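hdfs-site.xml would look something like the minimal sketch below; the replication factor and directory paths are assumptions for a small two-node cluster, not values from the original post.

<configuration>
  <property>
    <name>dfs.replication</name>
    <!-- assumed: master plus hadoop02 as DataNodes -->
    <value>2</value>
  </property>
  <property>
    <name>dfs.namenode.name.dir</name>
    <!-- hypothetical local path -->
    <value>file:///home/hadoop/dfs/name</value>
  </property>
  <property>
    <name>dfs.datanode.data.dir</name>
    <!-- hypothetical local path -->
    <value>file:///home/hadoop/dfs/data</value>
  </property>
</configuration>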
mkdir /home/hadoop/syslog
touch /home/hadoop/syslog/a.txt
echo "TESTTEST" >> syslog/a.txt
ssh hadoop02 "netstat -nl | grep 4545"
ssh hadoop02 "ln -s spark-1.4.0-bin-hadoop2.4 spark"
ssh hadoop02 "tar -zxf spark-1.4.0-bin-hadoop2.4.tgz"
head -5 ml-100k/u.user
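Each u.user line follows the MovieLens 100k layout: user id | age | gender | occupation | zip code. As a sketch, once ml-100k is in HDFS (see the copyFromLocal step in the [Hadoop] section), occupations can be counted from spark-shell like this:

val users = sc.textFile("ml-100k/u.user")
// split on the pipe delimiter; field 3 is the occupation
val byOccupation = users.map(_.split("\\|")).map(f => (f(3), 1)).reduceByKey(_ + _)
byOccupation.collect().foreach(println)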
wget https://statistics.stanford.edu/~tibs/ElemStatLearn/datasets/spam.data
scp spark/conf/* hadoop@hadoop02:~/spark/conf/
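The conf files being copied would typically include the standalone-cluster settings; a sketch, where the hadoop01 hostname and worker memory are assumptions:

# conf/slaves -- one worker host per line (hadoop01 is assumed)
hadoop01
hadoop02

# conf/spark-env.sh
export SPARK_MASTER_IP=192.168.56.101
export SPARK_WORKER_MEMORY=1600m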
head -5 symbols.txt
head -5 input/NASDAQ_daily_prices_A.csv
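Each row of the daily-prices file carries exchange, symbol, date, open/high/low/close and volume columns (the exact layout is assumed from the common Infochimps NASDAQ dump). A spark-shell sketch averaging the closing price per symbol:

val prices = sc.textFile("input/NASDAQ_daily_prices_A.csv")
// drop the header row; the header text and column order are assumptions
val rows = prices.filter(!_.startsWith("exchange")).map(_.split(","))
val avgClose = rows.map(f => (f(1), (f(6).toDouble, 1)))
  .reduceByKey((a, b) => (a._1 + b._1, a._2 + b._2))
  .mapValues { case (sum, n) => sum / n }
avgClose.take(5).foreach(println)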
R
ssh hadoop02 "netstat -nl | grep 4545"
[Spark]
spark/sbin/start-all.sh
spark/bin/spark-shell --master spark://192.168.56.101:7077
SPARK_REPL_OPTS="-XX:MaxPermSize=256m" spark-shell --executor-memory 1600m --driver-memory 1G --master spark://192.168.56.101:7077
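Once the shell is up, a quick smoke test against the README.md that the [Hadoop] section puts into HDFS (a standard first example, not from the original post):

val readme = sc.textFile("README.md")
readme.count()                                  // number of lines
readme.filter(_.contains("Spark")).count()      // lines mentioning Spark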
MASTER=spark://192.168.56.101:7077 /home/hadoop/spark/bin/run-example streaming.FlumeEventCount 192.168.56.102 4545
spark-submit --master spark://192.168.56.101:7077 --class StatefulNetworkWordCount target/scala-2.10/simple-project_2.10-1.0.jar
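A sketch of the submitted class, modeled on the Spark 1.x streaming example of the same name; the socket host/port and the 5-second batch interval are assumptions:

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object StatefulNetworkWordCount {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("StatefulNetworkWordCount")
    val ssc = new StreamingContext(conf, Seconds(5))
    ssc.checkpoint("checkpoint")   // updateStateByKey requires a checkpoint dir

    // carry a running count per word across batches
    val updateFunc = (values: Seq[Int], state: Option[Int]) =>
      Some(values.sum + state.getOrElse(0))

    val lines = ssc.socketTextStream("192.168.56.101", 9999)   // assumed host/port
    val counts = lines.flatMap(_.split(" ")).map(w => (w, 1)).updateStateByKey[Int](updateFunc)

    counts.print()
    ssc.start()
    ssc.awaitTermination()
  }
}

Feed it by typing into nc -lk 9999 on the source host.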
sbt/sbt package
sbt/sbt run
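The jar path used with spark-submit above implies a build.sbt roughly like this; the Scala and Spark library versions are inferred from the 2.10 artifact name and the spark-1.4.0 install used in this series:

name := "Simple Project"

version := "1.0"

scalaVersion := "2.10.4"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.0"

libraryDependencies += "org.apache.spark" %% "spark-streaming" % "1.4.0"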
[Hadoop]
hadoop/sbin/start-all.sh
hadoop/sbin/stop-all.sh
hadoop/bin/hadoop fs -cat /rlt01/part-00000
hadoop fs -copyFromLocal ml-100k ml-100k
hadoop fs -copyFromLocal input input
hadoop fs -put spam.data spam.data
hadoop fs -put spark/README.md README.md
hadoop fs -mkdir -p /user/hadoop
hadoop fs -cat rvis/* | head -5
[Hive]
tar -zxf apache-hive-0.13.1-bin.tar.gz
ln -s apache-hive-0.13.1-bin hive
gedit Downloads/hive-site.xml
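A minimal hive-site.xml sketch pointing the metastore at MySQL; the JDBC URL and credentials are placeholders, not values from the original post.

<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://localhost:3306/metastore?createDatabaseIfNotExist=true</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <!-- placeholder -->
    <value>hive</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <!-- placeholder -->
    <value>hive</value>
  </property>
</configuration>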
hive/bin/hive
[Zeppelin]
zeppelin/bin/zeppelin-daemon.sh stop
zeppelin/bin/zeppelin-daemon.sh start
[Flume]
bin/flume-ng agent --conf ./conf/ -f conf/flume_avro.txt -Dflume.root.logger=DEBUG,console -n a1
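flume_avro.txt (edited with gedit in the [Linux Terminal] section) would define agent a1; a sketch consistent with port 4545 and the FlumeEventCount arguments above, where the exec source tailing syslog/a.txt is an assumption that ties in the mkdir/touch/echo test steps:

a1.sources = r1
a1.channels = c1
a1.sinks = k1

# exec source: follow the test file that the echo command appends to (assumed)
a1.sources.r1.type = exec
a1.sources.r1.command = tail -F /home/hadoop/syslog/a.txt
a1.sources.r1.channels = c1

# avro sink: send events to the Spark FlumeEventCount receiver
a1.sinks.k1.type = avro
a1.sinks.k1.hostname = 192.168.56.102
a1.sinks.k1.port = 4545
a1.sinks.k1.channel = c1

a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 100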