The following covers installing fuse-dfs under Ubuntu 12.04 amd64 with Hadoop CDH3u3. A pre-built 64-bit fuse-dfs binary already ships in ${HADOOP_HOME}/contrib/fuse-dfs, so there is no need to compile it yourself (my own attempts to build it kept failing =.=).
Problem: some .so files are missing.
Fix: add the directories containing those libraries to LD_LIBRARY_PATH; this can go in ~/.bashrc:
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$JAVA_HOME/jre/lib/amd64/server
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$HADOOP_HOME/c++/lib
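As a quick sanity check, the sketch below confirms the shared objects actually exist at the paths just added; the libjvm.so and libhdfs.so names are assumptions based on the standard JDK amd64 and CDH3 layouts:

```shell
# Sketch: verify that the shared objects fuse_dfs loads at runtime exist
# at the paths added to LD_LIBRARY_PATH (library names are assumed)
check_lib() {
  [ -f "$1" ] && echo "found: $1" || echo "missing: $1"
}
check_lib "$JAVA_HOME/jre/lib/amd64/server/libjvm.so"
check_lib "$HADOOP_HOME/c++/lib/libhdfs.so"
```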
Problem: a "no permission on /etc/fuse.conf" error.
Fix: add the current user to the fuse group with adduser <username> fuse,
then check that /etc/fuse.conf is readable.
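The readability check can be scripted; this is a hypothetical helper, not part of the wrapper script, and the chmod hint in the message is only one possible remedy:

```shell
# Sketch: fail loudly if a config file is not readable by the current user;
# fuse_dfs needs read access to /etc/fuse.conf
check_readable() {
  if [ -r "$1" ]; then
    echo "readable: $1"
  else
    echo "NOT readable: $1 (one option: sudo chmod a+r $1)"
  fi
}
check_readable /etc/fuse.conf
```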
Problem: input/output error at the mount point.
The directory listing turns into a pile of ??? marks; the cause was mounting without the -d parameter:
./fuse_dfs_wrapper.sh dfs://localhost:8020 /mnt/dfs -d
When the mount shows ???, it can be unmounted with sudo umount /mnt/xxx.
class not found
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.conf.Configuration
Fix: add all of Hadoop's jars to the CLASSPATH:
for f in $(find $HADOOP_HOME/ -name '*.jar')
do
export CLASSPATH=$CLASSPATH:$f
done
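To see what this loop produces, here is the same pattern run against a throwaway directory of empty jar files; the directory and jar names below are fabricated for illustration:

```shell
# Demo of the CLASSPATH-building loop using a temporary directory
# instead of $HADOOP_HOME; the jar names are made up
demo_home=$(mktemp -d)
mkdir -p "$demo_home/lib"
touch "$demo_home/hadoop-core.jar" "$demo_home/lib/commons-logging.jar"
DEMO_CLASSPATH=""
for f in $(find "$demo_home" -name '*.jar'); do
  DEMO_CLASSPATH=$DEMO_CLASSPATH:$f
done
echo "$DEMO_CLASSPATH"
rm -rf "$demo_home"
```

Each iteration appends one jar path, colon-separated, exactly as the Java classpath expects.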
The final script:
#!/bin/sh
for f in $(find $HADOOP_HOME/ -name '*.jar')
do
export CLASSPATH=$CLASSPATH:$f
done
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$JAVA_HOME/jre/lib/amd64/server
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$HADOOP_HOME/c++/lib
./fuse_dfs_wrapper.sh dfs://localhost:8020 /mnt/dfs -d &
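Once the script is running, one way to confirm the mount actually appeared is to scan /proc/mounts for the mount point; this is a Linux-specific sketch and assumes the /mnt/dfs path used above:

```shell
# Sketch: check whether a path is currently a mount point by
# looking for it in /proc/mounts (Linux-specific)
is_mounted() {
  grep -qs " $1 " /proc/mounts
}
if is_mounted /mnt/dfs; then
  echo "/mnt/dfs is mounted"
else
  echo "/mnt/dfs is NOT mounted"
fi
```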