I am using Hadoop 2.6.0.
When I force Hadoop to leave safe mode with hdfs dfsadmin -safemode leave, it reports Safe mode is OFF, but I still cannot delete the file in the directory; the attempt fails with:
rm: Cannot delete /mei/app-20151013055617-0001-614d554c-cc04-4800-9be8-7d9b3fd3fcef. Name node is in safe mode.
I have tried the fixes for this problem that I found on the Internet, one by one, but none of them works.
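For reference, this is the sequence of commands involved, as a minimal sketch (the path is the one from the error message above; hdfs dfsadmin -safemode get simply queries the current state):

hdfs dfsadmin -safemode get     # query the current safe-mode state
hdfs dfsadmin -safemode leave   # prints "Safe mode is OFF"
hdfs dfs -rm -r /mei/app-20151013055617-0001-614d554c-cc04-4800-9be8-7d9b3fd3fcef   # the delete attempt still fails with "Name node is in safe mode"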
When I run hdfs dfsadmin -report, it shows:
Safe mode is ON
Configured Capacity: 52710469632 (49.09 GB)
Present Capacity: 213811200 (203.91 MB)
DFS Remaining: 0 (0 B)
DFS Used: 213811200 (203.91 MB)
DFS Used%: 100.00%
Under replicated blocks: 39
Blocks with corrupt replicas: 0
Missing blocks: 0
-------------------------------------------------
Live datanodes (1):
Name: 127.0.0.1:50010 (bdrhel6)
Hostname: bdrhel6
Decommission Status : Normal
Configured Capacity: 52710469632 (49.09 GB)
DFS Used: 213811200 (203.91 MB)
Non DFS Used: 52496658432 (48.89 GB)
DFS Remaining: 0 (0 B)
DFS Used%: 0.41%
DFS Remaining%: 0.00%
Configured Cache Capacity: 0 (0 B)
Cache Used: 0 (0 B)
Cache Remaining: 0 (0 B)
Cache Used%: 100.00%
Cache Remaining%: 0.00%
Xceivers: 1
Last contact: Wed Oct 14 03:30:33 EDT 2015
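From the report, DFS Remaining is 0 B while Non DFS Used is almost the whole 49 GB, so the local disk backing this single node appears to be full, which could explain why the NameNode keeps dropping back into safe mode. A quick way to check where the space has gone would be something like the following (a sketch assuming a Linux host; the data-directory and log paths are only examples and depend on dfs.datanode.data.dir and the local setup):

df -h                              # overall disk usage on the host
du -sh /tmp/hadoop-*/dfs/data      # example location of dfs.datanode.data.dir (assumed path)
du -sh /var/log/hadoop*            # logs are a common source of "Non DFS Used" space (assumed path)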
Has anyone run into the same problem? Any help would be appreciated.