In the shell, type 'jps' (you may need the JDK installed to run jps). This lists all running Java processes, including any Hadoop daemons that are currently up.
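For example, on a healthy NameNode host the output might look roughly like the following (the process IDs are illustrative, and which daemons appear depends on what you run on that machine):

$ jps
4825 NameNode
5046 SecondaryNameNode
5237 ResourceManager
5921 Jps

On a worker node you would instead expect to see DataNode (and NodeManager if YARN is running there).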
If ps -ef | grep hadoop shows that the Hadoop processes are not running, start them with sbin/start-dfs.sh. Then monitor the cluster with hdfs dfsadmin -report:
[[email protected] bin]$ hadoop dfsadmin -report
Configured Capacity: 105689374720 (98.43 GB)
Present Capacity: 96537456640 (89.91 GB)
DFS Remaining: 96448180224 (89.82 GB)
DFS Used: 89276416 (85.14 MB)
DFS Used%: 0.09%
Under replicated blocks: 0
Blocks with corrupt replicas: 0
Missing blocks: 0
-------------------------------------------------
Datanodes available: 2 (2 total, 0 dead)
Name: 192.168.1.16:50010
Decommission Status : Normal
Configured Capacity: 52844687360 (49.22 GB)
DFS Used: 44638208 (42.57 MB)
Non DFS Used: 4986138624 (4.64 GB)
DFS Remaining: 47813910528(44.53 GB)
DFS Used%: 0.08%
DFS Remaining%: 90.48%
Last contact: Tue Aug 20 13:23:32 EDT 2013
Name: 192.168.1.17:50010
Decommission Status : Normal
Configured Capacity: 52844687360 (49.22 GB)
DFS Used: 44638208 (42.57 MB)
Non DFS Used: 4165779456 (3.88 GB)
DFS Remaining: 48634269696(45.29 GB)
DFS Used%: 0.08%
DFS Remaining%: 92.03%
Last contact: Tue Aug 20 13:23:34 EDT 2013