Preface: By default, HDFS removes files permanently when they are deleted. The walkthrough below first demonstrates a permanent delete, then enables the trash feature via fs.trash.interval in core-site.xml, and finally recovers a mistakenly deleted file from the trash.
[hadoop@hadoop000 ~]$ hdfs dfs -put test.log /
[hadoop@hadoop000 ~]$ hdfs dfs -ls /
Found 3 items
-rw-r--r-- 1 hadoop supergroup 34 2018-05-23 16:49 /test.log
drwx------ - hadoop supergroup 0 2018-05-19 15:48 /tmp
drwxr-xr-x - hadoop supergroup 0 2018-05-19 15:48 /user
# Delete test.log; note the message printed
[hadoop@hadoop000 ~]$ hdfs dfs -rm -r /test.log
Deleted /test.log
# List again: test.log is gone
[hadoop@hadoop000 ~]$ hdfs dfs -ls /
Found 2 items
drwx------ - hadoop supergroup 0 2018-05-19 15:48 /tmp
drwxr-xr-x - hadoop supergroup 0 2018-05-19 15:48 /user
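# Why was the delete permanent? Trash is disabled out of the box
# (fs.trash.interval defaults to 0 in core-default.xml). A quick way to check
# the value the client currently resolves:
[hadoop@hadoop000 ~]$ hdfs getconf -confKey fs.trash.interval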
[hadoop@hadoop000 hadoop]$ pwd
/opt/software/hadoop-2.8.1/etc/hadoop
# Add the fs.trash.* properties to enable trash (no daemon restart required)
[hadoop@hadoop000 hadoop]$ vi core-site.xml
<property>
    <name>fs.trash.interval</name>
    <value>1440</value>
</property>
<property>
    <name>fs.trash.checkpoint.interval</name>
    <value>1440</value>
</property>
# fs.trash.interval is the retention window: within this period a deleted file is moved under the trash directory instead of being removed outright, and HDFS only deletes the data for real once the period expires. The unit is minutes; 1440 minutes = 60 * 24, exactly one day. fs.trash.checkpoint.interval is the interval between trash checkpoints and should be less than or equal to fs.trash.interval.
# See the official documentation: http://hadoop.apache.org/docs/r2.8.4/hadoop-project-dist/hadoop-common/core-default.xml
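# Sanity check after saving core-site.xml: the same key should now resolve to
# 1440. The hdfs shell re-reads the config on every invocation, which is why the
# move-to-trash behavior takes effect without restarting any daemon (assuming
# this shell picks up the etc/hadoop directory edited above).
[hadoop@hadoop000 hadoop]$ hdfs getconf -confKey fs.trash.interval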
[hadoop@hadoop000 ~]$ hdfs dfs -put test.log /
[hadoop@hadoop000 ~]$ hdfs dfs -ls /
Found 3 items
-rw-r--r-- 1 hadoop supergroup 34 2018-05-23 16:54 /test.log
drwx------ - hadoop supergroup 0 2018-05-19 15:48 /tmp
drwxr-xr-x - hadoop supergroup 0 2018-05-19 15:48 /user
# Delete test.log again; note how the message differs
[hadoop@hadoop000 ~]$ hdfs dfs -rm -r /test.log
18/05/23 16:54:55 INFO fs.TrashPolicyDefault: Moved: 'hdfs://192.168.6.217:9000/test.log' to trash at: hdfs://192.168.6.217:9000/user/hadoop/.Trash/Current/test.log
# The deleted file now sits in the trash
[hadoop@hadoop000 ~]$ hdfs dfs -ls /user/hadoop/.Trash/Current
Found 1 items
-rw-r--r-- 1 hadoop supergroup 34 2018-05-23 16:54 /user/hadoop/.Trash/Current/test.log
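# Caveat: even with trash enabled, -rm accepts -skipTrash, which bypasses the
# trash and deletes immediately; useful for large temporary data, but there is
# no recovery afterwards. /tmp/scratch below is just a hypothetical path.
[hadoop@hadoop000 ~]$ hdfs dfs -rm -r -skipTrash /tmp/scratch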
# Recover the mistakenly deleted file by moving it back out of the trash
[hadoop@hadoop000 ~]$ hdfs dfs -mv /user/hadoop/.Trash/Current/test.log /test.log
[hadoop@hadoop000 ~]$ hdfs dfs -ls /
Found 3 items
-rw-r--r-- 1 hadoop supergroup 34 2018-05-23 16:54 /test.log
drwx------ - hadoop supergroup 0 2018-05-19 15:48 /tmp
drwxr-xr-x - hadoop supergroup 0 2018-05-19 15:48 /user
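# To empty the trash without waiting for the retention window, force a
# checkpoint: -expunge rolls .Trash/Current into a timestamped checkpoint and
# permanently removes checkpoints older than fs.trash.interval.
[hadoop@hadoop000 ~]$ hdfs dfs -expunge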