
In this guide, we will explore how to move 30GB of files out of an 80GB folder using the terminal. This task can be accomplished through several methods: a selective move, a script-based move, console-based file managers, and a tar-based move.
To move 30GB of files from an 80GB folder in the terminal, you have several options. You can use commands like mv or find to move specific files based on criteria such as name or size. Alternatively, you can use a script that automates the process and moves files until a certain total size is reached. Console-based file managers like GNU Midnight Commander or fff provide an interactive approach, letting you select and move multiple files at once. Finally, you can use tar to create a stream of the files and extract the desired amount of data into the destination folder.
Selective Move
If you have specific criteria for selecting the files to move, such as files starting with a certain prefix or files larger than a certain size, you can use commands like mv or find.
The mv command moves or renames files. The general syntax is mv [options] source destination.
For example, if you wish to move all files starting with “a” to another folder, you can use the following command:
mv /path/to/source/a* /path/to/destination/
Here, /path/to/source/a* is the source pattern, which matches all files starting with “a”, and /path/to/destination/ is the destination folder where the files will be moved.
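If the selection criterion is size rather than name, find can do the filtering. The following is a minimal sketch, assuming GNU find and coreutils, that moves every regular file larger than 100MB (an arbitrary threshold chosen for illustration) from the top level of the source folder:
find /path/to/source -maxdepth 1 -type f -size +100M -exec mv -t /path/to/destination/ {} +
Here, -size +100M selects files larger than 100MiB, and mv -t names the destination up front so find can pass many files to a single mv invocation.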
Script-based Move
If you want to automate the process and move files until a certain total size has been reached, you can use a script.
Here’s an example of a bash script that moves files from the source directory to the destination directory until the desired size is reached:
#!/bin/bash
SOURCE="/path/to/source"
DESTINATION="/path/to/destination"
LIMIT=30G

# Convert a human-readable size such as 30G to bytes
to_bytes() {
    numfmt --from=iec "$1"
}

LIMIT_BYTES=$(to_bytes "$LIMIT")
TOTAL_SIZE=0

# Move files one by one until the next file would push the total past the limit
while IFS= read -r -d '' file
do
    FILE_SIZE=$(stat -c%s "$file")
    if (( TOTAL_SIZE + FILE_SIZE > LIMIT_BYTES )); then
        break
    fi
    mv "$file" "$DESTINATION"/
    (( TOTAL_SIZE += FILE_SIZE ))
done < <(find "$SOURCE" -type f -print0)
This script uses find to list all regular files in the source directory and moves them one by one to the destination until moving the next file would push the total past the limit. Note that it stops at the first file that does not fit rather than skipping it and trying smaller files later in the list.
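To run it, save the script under a name of your choice (move-by-size.sh below is just a placeholder), make it executable, and check the result with du:
chmod +x move-by-size.sh
./move-by-size.sh
du -sh /path/to/destination/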
Console-based File Managers
If you prefer a more interactive approach, you can use console-based file managers like GNU Midnight Commander or fff (Fucking Fast File-Manager).
These file managers allow you to navigate through your files, select multiple files, and move them to the desired destination.
To install GNU Midnight Commander on a Debian- or Ubuntu-based system, use the following command:
sudo apt install mc
And to run it, use:
mc
For fff, you can download the script from its GitHub repository and run it with bash fff.
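As a rough sketch, assuming git is available and using the commonly referenced upstream repository, fetching and launching fff might look like this:
# Clone the fff repository and run the script directly from the clone
git clone https://github.com/dylanaraps/fff
cd fff
bash fff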
Tar-based Move
Another efficient method to move large amounts of data is to use tar to create a stream of the files and then extract the desired amount of data into the destination folder.
Here’s an example of a bash script that demonstrates how to achieve this:
#!/bin/bash
SOURCE="/path/to/source"
DESTINATION="/path/to/destination"
LIMIT=30G

# Convert a human-readable size such as 30G to bytes
to_bytes() {
    numfmt --from=iec "$1"
}

LIMIT_BYTES=$(to_bytes "$LIMIT")
TOTAL_SIZE=0

# Stream files through tar until the next file would push the total past the limit
find "$SOURCE" -type f -print0 | while IFS= read -r -d '' file
do
    FILE_SIZE=$(du -sb "$file" | cut -f1)
    if (( TOTAL_SIZE + FILE_SIZE > LIMIT_BYTES )); then
        break
    fi
    # Archive the file relative to SOURCE so tar does not recreate the full source path
    rel="${file#"$SOURCE"/}"
    tar -C "$SOURCE" -cf - "$rel" | tar -C "$DESTINATION" -xf - \
        && rm -- "$file"   # remove the original so the transfer behaves as a move
    (( TOTAL_SIZE += FILE_SIZE ))
done
This script streams each file through tar into the destination folder and then deletes the original, stopping once the desired size is reached. Archiving relative to the source directory keeps tar from rebuilding the full source path inside the destination.
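The underlying idiom is simply two tar processes connected by a pipe. As a minimal sketch (not size-limited), copying an entire directory tree while preserving permissions and structure can look like this:
tar -C /path/to/source -cf - . | tar -C /path/to/destination -xf -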
Remember to adjust the paths and sizes in the provided examples to match your specific setup. With these methods, you can efficiently move a specific amount of data from one folder to another using the terminal.