Remove Duplicate Files on Linux using FSlint


Over time, your Linux system can accumulate tons of duplicate files. Those pesky files end up eating a lot of unnecessary disk space and can sometimes even slow down the system. There's a tool named FSlint that can effectively clean out all the unnecessary files from your system and make it lighter and (maybe) faster.

Installing FSlint

If you're using a mainstream Linux distro or one of its derivatives, installing this tool is really easy. Run the command that matches your distro.

  • Ubuntu
sudo apt install fslint

  • Debian
sudo apt-get install fslint
  • OpenSUSE
sudo zypper install fslint
  • Fedora
sudo dnf install fslint
  • Arch Linux
sudo pacman -S fslint
  • Other Linux distros
wget http://www.pixelbeat.org/fslint/fslint-2.46.tar.xz

tar -xf fslint-2.46.tar.xz
cd fslint-2.46
cd po && make && cd ..
./fslint-gui

FSlint runs straight from the extracted directory; the make step only builds the translation files, and you need to return to the top-level directory before launching fslint-gui.
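Under the hood, a duplicate scan boils down to hashing files and grouping identical hashes. Here's a minimal shell sketch of that idea using standard tools (the directory and file names are hypothetical, purely for illustration; this is not FSlint's actual implementation):

```shell
# Set up a scratch directory with two identical files and one unique file.
workdir=$(mktemp -d)
echo "same content" > "$workdir/a.txt"
echo "same content" > "$workdir/b.txt"
echo "different"    > "$workdir/c.txt"

# Hash every file, sort by hash, and keep only entries whose hash repeats.
# uniq -w32 compares just the 32-character md5 field; -D prints all repeats.
dupes=$(find "$workdir" -type f -exec md5sum {} + | sort | uniq -w32 -D)
echo "$dupes"
```

Only a.txt and b.txt appear in the output, since c.txt has a unique hash.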

Using FSlint

FSlint is a GUI tool that makes selecting and removing unnecessary duplicate files a lot easier. That makes sense: on the command line it's easy to mislead yourself and delete an important file (unless you're an expert).

After installation is complete, start the program.

Note that FSlint will only work on the directories you select to scan. The more directories you select, the longer the duplicate-file scan will take. Please be patient while the scan completes.

From the left panel, you can choose from lots of actions to perform on your files, such as "Bad names" and "Temp files".

You can also merge duplicate files into one single file, or delete the duplicate one(s) with the tool.
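One common way to "merge" duplicates, which you can also do by hand, is to replace the extra copy with a hard link so both names point at a single copy of the data on disk. A small sketch (file names are made up for the example):

```shell
# Two files with identical content in a scratch directory.
dir=$(mktemp -d)
echo "identical payload" > "$dir/keep.txt"
echo "identical payload" > "$dir/dupe.txt"

# Replace dupe.txt with a hard link to keep.txt: both names now
# share one inode, so the duplicate data is stored only once.
ln -f "$dir/keep.txt" "$dir/dupe.txt"
```

After this, editing the file through either name changes both, so merging only makes sense when the files should genuinely stay identical.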

Another useful feature of FSlint is whitespace cleaning. The option is available at the bottom of the top panel. With it, you can slightly reduce file sizes by deleting unnecessary whitespace, such as trailing spaces and tabs at the ends of lines.
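The same kind of cleanup can be done with sed. This sketch strips trailing spaces and tabs from every line of a throwaway file (similar in spirit to FSlint's whitespace cleaner, not its exact behavior):

```shell
# Create a file whose lines end in stray spaces and a tab.
f=$(mktemp)
printf 'hello   \nworld\t\n' > "$f"

# Delete any run of whitespace at the end of each line, in place.
sed -i 's/[[:space:]]*$//' "$f"
```

Note that `sed -i` edits the file in place, so try it on a copy first.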

Enjoy.
