
I was able to set up Hadoop-0.19.1 on my Windows system by following the steps in the tutorial below, but I have not been able to set up Hadoop-1.2.1 on Windows.

http://ebiquity.umbc.edu/Tutorials/Hadoop/09%20-%20unpack%20hadoop.html

Now I am trying to set up Hadoop-1.2.1 following the same steps, but I am stuck at the command

$ bin/hadoop namenode -format

and it gives me the error shown below:

[email protected] ~ 
$ cd hadoop-1.2.1 

[email protected] ~/hadoop-1.2.1 
$ bin/hadoop namenode -format 
/home/313159/hadoop-1.2.1/bin/../libexec/hadoop-config.sh: line 15: $'\r': command not found 
/home/313159/hadoop-1.2.1/bin/../libexec/hadoop-config.sh: line 19: $'\r': command not found 
/home/313159/hadoop-1.2.1/bin/../libexec/hadoop-config.sh: line 21: $'\r': command not found 
/home/313159/hadoop-1.2.1/bin/../libexec/hadoop-config.sh: line 26: $'\r': command not found 
/home/313159/hadoop-1.2.1/bin/../libexec/hadoop-config.sh: line 30: cd: /home/31: No such file or directory 
/home/313159/hadoop-1.2.1/bin/../libexec/hadoop-config.sh: line 32: $'\r': command not found 
/home/313159/hadoop-1.2.1/bin/../libexec/hadoop-config.sh: line 35: $'\r': command not found 
/home/313159/hadoop-1.2.1/bin/../libexec/hadoop-config.sh: line 87: syntax error: unexpected end of file 
Error: Could not find or load main class org.apache.hadoop.util.PlatformName 
Error: Could not find or load main class org.apache.hadoop.hdfs.server.namenode.NameNode 

[email protected] ~/hadoop-1.2.1 
$ ssh-host-config 

*** ERROR: There are still ssh processes running. Please shut them down first. 

[email protected] ~/hadoop-1.2.1 
$ ssh-host-config 

*** Query: Overwrite existing /etc/ssh_config file? (yes/no) yes 
*** Info: Creating default /etc/ssh_config file 
*** Query: Overwrite existing /etc/sshd_config file? (yes/no) yes 
*** Info: Creating default /etc/sshd_config file 
*** Info: Privilege separation is set to yes by default since OpenSSH 3.3. 
*** Info: However, this requires a non-privileged account called 'sshd'. 
*** Info: For more info on privilege separation read /usr/share/doc/openssh/README.privsep. 
*** Query: Should privilege separation be used? (yes/no) yes 
*** Info: Updating /etc/sshd_config file 
*** Query: Overwrite existing /etc/inetd.d/sshd-inetd file? (yes/no) yes 
*** Info: Creating default /etc/inetd.d/sshd-inetd file 
*** Info: Updated /etc/inetd.d/sshd-inetd 

*** Info: Sshd service is already installed. 

*** Info: Host configuration finished. Have fun! 

[email protected] ~/hadoop-1.2.1 
$ cd .. 

[email protected] ~ 
$ ssh-keygen 
Generating public/private rsa key pair. 
Enter file in which to save the key (/home/313159/.ssh/id_rsa): 
/home/313159/.ssh/id_rsa already exists. 
Overwrite (y/n)? y 
Enter passphrase (empty for no passphrase): 
Enter same passphrase again: 
Your identification has been saved in /home/313159/.ssh/id_rsa. 
Your public key has been saved in /home/313159/.ssh/id_rsa.pub. 
The key fingerprint is: 
21:45:11:30:6b:3a:af:c6:4a:2d:ed:3a:bf:be:69:be [email protected] 
The key's randomart image is: 
+--[ RSA 2048]----+ 
|  oo=o  | 
|  +   | 
|  + .  | 
|  o . .  | 
| o S  | 
| o o   | 
| o.o .   | 
| ..o+o   | 
| oXE+   | 
+-----------------+ 

[email protected] ~/.ssh 
$ ls -l 
total 12 
-rw-r--r-- 1 313159 mkpasswd 2004 Feb 18 11:59 authorized_keys 
-rw------- 1 313159 mkpasswd 668 Oct 21 12:06 id_dsa 
-rw-r--r-- 1 313159 mkpasswd 605 Oct 21 12:06 id_dsa.pub 
-rw------- 1 313159 mkpasswd 1679 Feb 18 11:57 id_rsa 
-rw-r--r-- 1 313159 mkpasswd 397 Feb 18 11:57 id_rsa.pub 
-rw-r--r-- 1 313159 mkpasswd 171 Oct 21 12:00 known_hosts 

[email protected] ~/.ssh 
$ cd .. 

[email protected] ~ 
$ cd hadoop-1.2.1 

[email protected] ~/hadoop-1.2.1 
$ bin/hadoop namenode -format 
/home/313159/hadoop-1.2.1/bin/../libexec/hadoop-config.sh: line 15: $'\r': command not found 
/home/313159/hadoop-1.2.1/bin/../libexec/hadoop-config.sh: line 19: $'\r': command not found 
/home/313159/hadoop-1.2.1/bin/../libexec/hadoop-config.sh: line 21: $'\r': command not found 
/home/313159/hadoop-1.2.1/bin/../libexec/hadoop-config.sh: line 26: $'\r': command not found 
/home/313159/hadoop-1.2.1/bin/../libexec/hadoop-config.sh: line 30: cd: /home/31: No such file or directory 
/home/313159/hadoop-1.2.1/bin/../libexec/hadoop-config.sh: line 32: $'\r': command not found 
/home/313159/hadoop-1.2.1/bin/../libexec/hadoop-config.sh: line 35: $'\r': command not found 
/home/313159/hadoop-1.2.1/bin/../libexec/hadoop-config.sh: line 87: syntax error: unexpected end of file 
Error: Could not find or load main class org.apache.hadoop.util.PlatformName 
Error: Could not find or load main class org.apache.hadoop.hdfs.server.namenode.NameNode 

[email protected] ~/hadoop-1.2.1 
$ cd bin 

[email protected] ~/hadoop-1.2.1/bin 
$ dos2unix *.sh 
dos2unix: converting file hadoop-config.sh to Unix format ... 
dos2unix: converting file hadoop-daemon.sh to Unix format ... 
dos2unix: converting file hadoop-daemons.sh to Unix format ... 
dos2unix: converting file slaves.sh to Unix format ... 
dos2unix: converting file start-all.sh to Unix format ... 
dos2unix: converting file start-balancer.sh to Unix format ... 
dos2unix: converting file start-dfs.sh to Unix format ... 
dos2unix: converting file start-jobhistoryserver.sh to Unix format ... 
dos2unix: converting file start-mapred.sh to Unix format ... 
dos2unix: converting file stop-all.sh to Unix format ... 
dos2unix: converting file stop-balancer.sh to Unix format ... 
dos2unix: converting file stop-dfs.sh to Unix format ... 
dos2unix: converting file stop-jobhistoryserver.sh to Unix format ... 
dos2unix: converting file stop-mapred.sh to Unix format ... 

[email protected] ~/hadoop-1.2.1/bin 
$ cd .. 

[email protected] ~/hadoop-1.2.1 
$ cd conf 

[email protected] ~/hadoop-1.2.1/conf 
$ dos2unix *.sh 
dos2unix: converting file hadoop-env.sh to Unix format ... 

[email protected] ~/hadoop-1.2.1/conf 
$ cd .. 

[email protected] ~/hadoop-1.2.1 
$ bin/hadoop namenode -format 
/home/313159/hadoop-1.2.1/bin/../libexec/hadoop-config.sh: line 15: $'\r': command not found 
/home/313159/hadoop-1.2.1/bin/../libexec/hadoop-config.sh: line 19: $'\r': command not found 
/home/313159/hadoop-1.2.1/bin/../libexec/hadoop-config.sh: line 21: $'\r': command not found 
/home/313159/hadoop-1.2.1/bin/../libexec/hadoop-config.sh: line 26: $'\r': command not found 
/home/313159/hadoop-1.2.1/bin/../libexec/hadoop-config.sh: line 30: cd: /home/31: No such file or directory 
/home/313159/hadoop-1.2.1/bin/../libexec/hadoop-config.sh: line 32: $'\r': command not found 
/home/313159/hadoop-1.2.1/bin/../libexec/hadoop-config.sh: line 35: $'\r': command not found 
/home/313159/hadoop-1.2.1/bin/../libexec/hadoop-config.sh: line 87: syntax error: unexpected end of file 
Error: Could not find or load main class org.apache.hadoop.util.PlatformName 
Error: Could not find or load main class org.apache.hadoop.hdfs.server.namenode.NameNode 

Can anyone please suggest or share a solution to this problem? Thanks.


This question appears to be off-topic because it is a possible duplicate of http://stackoverflow.com/questions/12839705/cant-find-or-load-main-class-error-in-hadoop and http://stackoverflow.com/questions/14852199/hdfs-namenood-formatting-error-could-not-find-the-main-class-namenood-code-a – Chiron

Answer


The cause is probably the difference between Unix and Windows newline characters (LF vs. CRLF).
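
A quick way to confirm that diagnosis is to check the script named in the error messages for carriage returns. This is only a sketch, assuming the standard Cygwin file and grep utilities are installed:

$ cd ~/hadoop-1.2.1
$ # "with CRLF line terminators" in the output would indicate Windows line endings
$ file libexec/hadoop-config.sh
$ # count lines containing a carriage return; any non-zero count means CRLF endings
$ grep -c $'\r' libexec/hadoop-config.sh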

I think the fix is to run the dos2unix command on that file, as suggested in this post [possible duplicate].
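
A minimal sketch of that fix, based on the log above: the failing script is libexec/hadoop-config.sh, which the earlier dos2unix passes over bin/*.sh and conf/*.sh never touched. The exact globs below are an assumption; converting the libexec scripts (and, in case it was also checked out with CRLF endings, the bin/hadoop launcher, which the *.sh glob did not match) and then re-running the format command should clear the $'\r' errors:

$ cd ~/hadoop-1.2.1
$ # convert the scripts that the error messages actually reference
$ dos2unix libexec/*.sh
$ # optional: convert the launcher, which the earlier *.sh glob skipped
$ dos2unix bin/hadoop
$ # retry once every sourced script uses Unix line endings
$ bin/hadoop namenode -format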
