Linux/Unix
- Copy a file to another server:
scp prime.war lchen@ny-sample-002:/home/lchen
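To copy a whole directory, add -r (the local path below is illustrative):
scp -r ./build lchen@ny-sample-002:/home/lchen/build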
- Grep log with regular expression from the zip file:
zgrep --color -E 'CCVI[0-9]{12}' /logs/pl_timings.17-11-01_21.log.gz
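An equivalent pipeline with zcat, if you prefer to pipe (same file as above):
zcat /logs/pl_timings.17-11-01_21.log.gz | grep -E --color 'CCVI[0-9]{12}'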
- Check whether a domain name resolves in DNS:
dig codebycase.github.io +noall +answer
codebycase.github.io. 1799 IN CNAME codebycase.github.io.
codebycase.github.io. 3600 IN CNAME sni.github.map.fastly.net.
sni.github.map.fastly.net. 30 IN A 151.101.21.147
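To query a specific resolver instead of the system DNS (8.8.8.8 is just an example public resolver):
dig @8.8.8.8 codebycase.github.io +noall +answer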
- Curl with proxy and client certificate
curl -E "./Priceline DB.pem" --key "./Priceline DB.key" -H "Authorization: orgId=116770" --proxy "nw-prx-v01.dqs.pcln.com:8080" https://api.searchads.apple.com/api/v1/campaigns
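For debugging, the same request can be rerun with -v to show the proxy CONNECT and TLS handshake:
curl -v -E "./Priceline DB.pem" --key "./Priceline DB.key" -H "Authorization: orgId=116770" --proxy "nw-prx-v01.dqs.pcln.com:8080" https://api.searchads.apple.com/api/v1/campaigns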
- Create a pkcs12 (.pfx or .p12) from OpenSSL files (.pem, .cer, .crt,…)
openssl pkcs12 -export -in certificate.pem -inkey private.key -out certificate.p12
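To inspect/verify the resulting bundle (file name matches the placeholder above):
openssl pkcs12 -info -in certificate.p12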
- Install certificate to JVM keystore
sudo keytool -import -alias sunas -keystore /Library/Java/JavaVirtualMachines/jdk1.8.0_144.jdk/Contents/Home/jre/lib/security/cacerts -file /Users/lchen/Downloads/deva-consul.dqs.pcln.com-443.crt
the keystore password is changeit by default
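To verify the import afterwards (same keystore path as above):
sudo keytool -list -alias sunas -keystore /Library/Java/JavaVirtualMachines/jdk1.8.0_144.jdk/Contents/Home/jre/lib/security/cacerts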
Install Hadoop in Ubuntu
sudo apt-get install default-jdk
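Verify the JDK is installed:
java -version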
Mac OS
- If installing Hadoop on macOS, make sure Remote Login is turned on under System Preferences > Sharing (the same pane as File Sharing).
- Show the listening TCP ports:
sudo lsof -iTCP -sTCP:LISTEN -n -P
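To check a single port, name it explicitly (8080 here is just an example):
sudo lsof -iTCP:8080 -sTCP:LISTEN -n -P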
- Show the most recently changed files (newest last):
ls -ltr
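A find-based alternative that lists files changed in the last hour (the 60-minute window is just an example):
find . -type f -mmin -60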
MacOS VirtualBox
- In Settings > General > Advanced, set Shared Clipboard to Bidirectional to allow copy & paste in both directions.
- Install Guest Additions
- Change the Host Key Combination to the Right Command key
- In Host Network Manager, add a host-only network (vboxnet0)
- In the guest VM's settings, set Network Adapter 2 to Host-only Adapter (vboxnet0)
- In the Ubuntu guest, install the OpenSSH server:
sudo apt-get install openssh-server
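Then confirm the SSH service is running:
sudo systemctl status ssh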
- In the Ubuntu guest, check the IP address:
ifconfig -a
- From the host terminal, you can now SSH into the guest, e.g.
ssh lchen@192.168.56.101
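To skip the password prompt on later logins, copy your public key to the guest first (same IP as above):
ssh-copy-id lchen@192.168.56.101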
Install Hadoop & Hbase on macOS
- Prepare local SSH access
$ ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
$ cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
$ chmod 0600 ~/.ssh/authorized_keys
$ ssh localhost
$ exit
- Install Ruby & Brew
$ rvm install ruby-2.5
$ brew search hadoop
- Install Hadoop
$ brew install hadoop
$ ls /usr/local/Cellar/hadoop
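To confirm the install, print the version that brew linked:
$ hadoop version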
- Update /usr/local/Cellar/hadoop/3.1.1/libexec/etc/hadoop/hadoop-env.sh
export HADOOP_OPTS="-Djava.net.preferIPv4Stack=true -Djava.security.krb5.realm= -Djava.security.krb5.kdc="
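If Hadoop cannot locate Java, also set JAVA_HOME in the same file (the 1.8 selector below is an assumption; adjust to your JDK):
export JAVA_HOME="$(/usr/libexec/java_home -v 1.8)"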
- Update /usr/local/Cellar/hadoop/3.1.1/libexec/etc/hadoop/core-site.xml
<configuration>
<property>
<name>fs.defaultFS</name>
<value>hdfs://localhost:9000</value>
</property>
<property>
<name>hadoop.tmp.dir</name>
<value>/usr/local/Cellar/hadoop/hdfs/tmp</value>
<description>A base for other temporary directories.</description>
</property>
</configuration>
- Update /usr/local/Cellar/hadoop/3.1.1/libexec/etc/hadoop/mapred-site.xml (a typical minimal example follows)
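A typical minimal mapred-site.xml for running MapReduce on YARN (this block is an assumption, not from the original notes; adjust to your setup):
<configuration>
<property>
<name>mapreduce.framework.name</name>
<value>yarn</value>
</property>
</configuration>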
- Update /usr/local/Cellar/hadoop/3.1.1/libexec/etc/hadoop/hdfs-site.xml
<configuration>
<property>
<name>dfs.replication</name>
<value>1</value>
</property>
</configuration>
- Format HDFS and start all
$ hdfs namenode -format
$ sh start-all.sh
NameNode UI: http://localhost:9870/
Scheduler: http://localhost:8088/cluster/nodes
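To confirm the daemons are up, jps should list NameNode, DataNode, SecondaryNameNode, ResourceManager and NodeManager:
$ jps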
- Install HBase
$ brew install hbase
- Update /usr/local/Cellar/hbase/1.2.8/libexec/conf/hbase-site.xml
<configuration>
<property>
<name>hbase.rootdir</name>
<!--<value>file:///usr/local/var/hbase</value>-->
<value>hdfs://localhost:9000/hbase</value>
</property>
</configuration>
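Then start HBase and open the shell to verify it comes up (assumes brew has linked the HBase scripts onto your PATH):
$ start-hbase.sh
$ hbase shell
Run the status command inside the shell to check the running servers.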