GNU Wget (or simply Wget, originally named Geturl) is a GNU Project utility that retrieves content from web servers. Its name combines the words World Wide Web and get. It supports downloads over the HTTP, HTTPS, and FTP protocols.
Why should I use Wget?
Using Wget at the UNIX command line to download files from a remote server directly to your GreggHost server is a useful shortcut.
Because Wget saves files straight to your GreggHost server, it avoids a slow, sometimes painful two-step process: downloading the files to your own computer first, then uploading them to your server with an FTP client such as FileZilla, which takes longer owing to the nature of those applications.
Wget is a sophisticated tool with many options, but even its most basic features are valuable.
For customers migrating between two rsync-enabled servers (such as moving from GreggHost Shared Hosting to GreggHost VPS hosting), rsync may be a better solution: faster and less difficult.
Wget’s basic usage is as follows:
In your panel, create a shell user.
Use SSH to connect to your server.
Type wget followed by the complete URL of the file you want to download. For example, to get the .tgz file for Python 3.8.1, run the following command:
$ wget https://www.python.org/ftp/python/3.8.1/Python-3.8.1.tgz
The .tgz file is downloaded to the directory where you run the command.
Wget is a popular tool for downloading compressed files.
If the file you’re downloading is compressed, open it with gunzip, unzip, or tar to expand and unpack it.
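For example, a gzip-compressed tarball like the Python archive above can be expanded and unpacked in one step with tar. The sketch below builds a small sample archive first so it is self-contained; the file and directory names are stand-ins for whatever you actually downloaded:

```shell
# Create a small sample archive to stand in for a downloaded .tgz
mkdir -p demo/src
echo "print('hello')" > demo/src/app.py
tar -czf Sample-1.0.tgz -C demo src

# Unpack a .tgz (a gzip-compressed tar archive) in a single step
tar -xzf Sample-1.0.tgz

# The extracted files appear in the current directory
ls src/app.py
```

For plain .gz files use gunzip, and for .zip files use unzip instead.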
If you need to supply variables to a script, surround the URL in single quotes so the ampersand character is not misinterpreted as the shell's background operator:
$ wget 'https://www.example.com/myscript.php?var1=foo&var2=bar'
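To see why the quotes matter, compare how the shell hands the URL to a program with and without them (echo stands in for wget here, so the sketch runs without touching the network):

```shell
# Unquoted: the shell treats & as "run this in the background", so the
# program only receives the URL up to var1=foo; the remainder is parsed
# as a separate shell command.
echo https://www.example.com/myscript.php?var1=foo&var2=bar

# Quoted: the full query string reaches the program intact.
echo 'https://www.example.com/myscript.php?var1=foo&var2=bar'
```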
To make a mirror image of a folder on a separate server (with the same structure as the original), simply point Wget at the folder over FTP:
$ wget -r 'ftp://username:[email protected]/folder/*'
This command downloads folder/ and everything within it while preserving the folder structure. This saves a lot of time compared with running wget on each file individually.
To save space, simply compress the folder using the following command:
$ zip -r folder.zip folder
Clean up by deleting the copy of the zip folder:
$ rm -rf folder
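The compress-and-clean-up sequence above can be sketched end to end. This sketch uses tar instead of zip only so it runs on systems where zip is not installed, and the folder name is a placeholder for the folder you actually mirrored:

```shell
# Stand-in for the folder mirrored from the remote server
mkdir -p folder
echo "<html></html>" > folder/index.html

# Compress the folder (zip -r folder.zip folder is equivalent)
tar -czf folder.tgz folder

# Remove the uncompressed copy once the archive exists
rm -rf folder
ls folder.tgz
```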
It’s a fantastic way to back up your entire website at once, and it’s also really useful for migrating large sites between providers.
To download the whole contents of example.com, for example, execute the following command:
$ wget -r -l 0 https://www.example.com/
Information adapted from the GNU Wget Manual/Examples – Advanced Usage Man page.
Run the following command in your terminal to read the Wget manual page:
$ man wget