Table of Contents
- 1. Install WSL
- 2. Install Java
- Install Hadoop 3.3
- Configuration files
- 1. Edit hadoop-env.sh
- 2. Edit core-site.xml
- 3. Edit hdfs-site.xml
- ssh
- Start
1. Install WSL
Open PowerShell as administrator:
PS C:\windows\system32> wsl --list --online
PS C:\windows\system32> wsl --install -d Ubuntu-20.04
- Restart the computer after the install finishes
2. Install Java
# Download link
https://repo.huaweicloud.com/java/jdk/8u202-b08/jdk-8u202-linux-x64.tar.gz
# Copy the archive into Ubuntu
# Install Java
sudo tar -zxvf jdk-*-linux-x64.tar.gz -C /usr/local
cd /usr/local
sudo mv jdk* jdk8
Environment variables
vim ~/.bashrc
# JDK environment
export JAVA_HOME=/usr/local/jdk8
export JRE_HOME=${JAVA_HOME}/jre
export CLASSPATH=.:${JAVA_HOME}/lib:${JRE_HOME}/lib
export PATH=${JAVA_HOME}/bin:$PATH
source ~/.bashrc
java -version
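If `java -version` is not found at this point, the usual cause is that the new variables were not picked up. A minimal sanity check, assuming the `/usr/local/jdk8` layout used above:

```shell
# Simulate the ~/.bashrc additions and confirm ${JAVA_HOME}/bin lands on PATH.
# The /usr/local/jdk8 path is the install location from this guide.
export JAVA_HOME=/usr/local/jdk8
export PATH=${JAVA_HOME}/bin:$PATH
case ":$PATH:" in
  *":${JAVA_HOME}/bin:"*) echo "PATH ok" ;;
  *)                      echo "PATH is missing ${JAVA_HOME}/bin" ;;
esac
```

If this prints that PATH is missing the entry, re-open the terminal or re-run `source ~/.bashrc`.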
Install Hadoop 3.3
# Download
https://mirrors.tuna.tsinghua.edu.cn/apache/hadoop/common/hadoop-3.3.1/hadoop-3.3.1.tar.gz
# Copy the archive into Ubuntu
sudo tar -zxvf hadoop*.tar.gz -C /usr/local
cd /usr/local
sudo mv hadoop* hadoop
- Environment variables
vim ~/.bashrc
# Hadoop environment
export HADOOP_HOME=/usr/local/hadoop
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
export JAVA_LIBRARY_PATH=/usr/local/hadoop/lib/native
source ~/.bashrc
hadoop version
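Before moving on, it can help to confirm that the native-library directory exported above actually exists; the well-known "Unable to load native-hadoop library" warning (addressed again below) usually means this directory is missing or empty. A small check, assuming the layout used in this guide:

```shell
# Check the native library dir pointed at by HADOOP_COMMON_LIB_NATIVE_DIR.
# The /usr/local/hadoop path is an assumption from this guide's layout.
NATIVE_DIR="${HADOOP_HOME:-/usr/local/hadoop}/lib/native"
if [ -d "$NATIVE_DIR" ]; then
  ls "$NATIVE_DIR"
else
  echo "native dir not found: $NATIVE_DIR"
fi
```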
Configuration files
cd /usr/local/hadoop/etc/hadoop
1. Edit hadoop-env.sh
- Add at the top of hadoop-env.sh (replace <username> with your Ubuntu user):
export HDFS_NAMENODE_USER=<username>
export HDFS_DATANODE_USER=<username>
export HDFS_SECONDARYNAMENODE_USER=<username>
export YARN_RESOURCEMANAGER_USER=<username>
export YARN_NODEMANAGER_USER=<username>
- Add at the end of hadoop-env.sh:
export JAVA_HOME=/usr/local/jdk8
# Fixes "Unable to load native-hadoop library for your platform..."
export HADOOP_OPTS="-Djava.library.path=${HADOOP_HOME}/lib/native"
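The edit above can also be scripted so it is safe to run more than once. A minimal sketch, assuming it is run from `/usr/local/hadoop/etc/hadoop` (`ENV_FILE` is a name introduced here for illustration):

```shell
# Append the export JAVA_HOME line to hadoop-env.sh only if it is not
# already present, so re-running the script does not duplicate it.
ENV_FILE=${ENV_FILE:-hadoop-env.sh}
grep -q '^export JAVA_HOME=' "$ENV_FILE" 2>/dev/null ||
  echo 'export JAVA_HOME=/usr/local/jdk8' >> "$ENV_FILE"
```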
2. Edit core-site.xml
<configuration>
    <property>
        <name>hadoop.tmp.dir</name>
        <value>file:/usr/local/hadoop/tmp</value>
        <description>Abase for other temporary directories.</description>
    </property>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://localhost:9000</value>
    </property>
</configuration>
3. Edit hdfs-site.xml
<configuration>
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
    <property>
        <name>dfs.namenode.name.dir</name>
        <value>file:/usr/local/hadoop/tmp/dfs/name</value>
    </property>
    <property>
        <name>dfs.datanode.data.dir</name>
        <value>file:/usr/local/hadoop/tmp/dfs/data</value>
    </property>
</configuration>
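The directories named in core-site.xml and hdfs-site.xml above must be writable by your user before the format step. A sketch assuming the paths from this guide (if `/usr/local/hadoop` is still root-owned after the extract, run `sudo chown -R $USER /usr/local/hadoop` first):

```shell
# Create hadoop.tmp.dir plus the name/data dirs from hdfs-site.xml.
# HADOOP_TMP is a name introduced here; it defaults to the hadoop.tmp.dir
# value used in this guide.
HADOOP_TMP=${HADOOP_TMP:-/usr/local/hadoop/tmp}
mkdir -p "$HADOOP_TMP/dfs/name" "$HADOOP_TMP/dfs/data"
```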
ssh
sudo service ssh start
# Check that you can ssh into this machine (important)
ssh <username>@localhost
If the connection fails with "Permission denied (publickey)", set up key-based login for localhost before continuing.
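A common fix for that publickey error is to authorize a local key pair so `ssh localhost` works without a password, which is also what start-all.sh needs later. These are standard OpenSSH commands; the keygen step is skipped if `~/.ssh/id_rsa` already exists:

```shell
# Create ~/.ssh, generate a key if one is missing, and authorize it for
# logins to this machine; then fix the permissions sshd insists on.
mkdir -p ~/.ssh && chmod 700 ~/.ssh
if [ ! -f ~/.ssh/id_rsa ] && command -v ssh-keygen >/dev/null; then
  ssh-keygen -t rsa -N '' -f ~/.ssh/id_rsa
fi
touch ~/.ssh/authorized_keys
[ -f ~/.ssh/id_rsa.pub ] && cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys
```

After this, `ssh <username>@localhost` should log in without a password prompt.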
Start
/usr/local/hadoop/bin/hdfs namenode -format
/usr/local/hadoop/sbin/start-all.sh
jps
If jps lists NameNode, DataNode, SecondaryNameNode, ResourceManager, and NodeManager, congratulations, the installation succeeded!