
Wget Command: What It Is and How to Use It (12 Examples Included)

March 22, 2026 · 28 min read

This guide teaches you how to use the wget command in Linux. It walks through 12 practical examples of wget in action. By the end, you will know how to use wget to download files from the web.

What is the Wget Command?

Wget is a command-line tool created by the GNU Project. You can use it to retrieve content and files from web servers. The name is a combination of World Wide Web and the word get. It supports downloads via FTP, SFTP, HTTP, and HTTPS.

Wget is written in portable C and is usable on any Unix-like system. It can also run on macOS, Microsoft Windows, AmigaOS, and other popular platforms.

How to Install Wget?

For this wget command demonstration, we will use Ubuntu 18.04, but the syntax works on any other Linux distribution.

First, access your server via SSH:

```bash

ssh user@your_server_ip -p port_number

```

To install wget on Ubuntu 18.04 or similar, run the following command:

```bash

sudo apt-get install wget

```

To install wget on CentOS 7 and other RHEL-based distributions, use:

```bash

sudo yum install wget

```

Once the installation is complete, you will be ready to use it. Additionally, knowledge of basic SSH commands can make things easier.

Wget Command Examples:

To get started, here are 12 wget command examples that you can use for daily tasks. Keep in mind that you can also call wget from scripts and cron jobs!
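For instance, a minimal crontab entry that fetches a file every night could look like the sketch below. The schedule, directory, and URL are placeholder assumptions, not taken from this guide:

```
# Run wget quietly at 02:00 every day; -q suppresses output, -P sets the target directory
0 2 * * * /usr/bin/wget -q -P /home/user/backups https://example.com/backup.tar.gz
```

You can add such a line by running crontab -e as the user who should own the download.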

1. Using Wget to Download Single Files:

One of the basic wget command examples is downloading a single file and storing it in your current working directory. For example, you can get the latest version of WordPress with the following command:

```bash

wget https://wordpress.org/latest.zip

```

In this example, a file called latest.zip will be downloaded to the current working directory. You will also see additional information, such as download progress, speed, size, time and date.

2. Using Wget to Download Multiple Files:

We can take the use of wget one step further and download multiple files at once. To do that, we will need to create a text document and place the download URLs there.

For example, we will retrieve the latest versions of WordPress, Joomla and Drupal with wget. Enter the following:

```bash

nano example.txt

```

This will create a file called example.txt and open a text editor interface. Paste these links there:

```

https://wordpress.org/latest.zip

https://downloads.joomla.org/cms/joomla3/3-8-5/Joomla_3-8-5-Stable-Full_Package.zip

https://ftp.drupal.org/files/projects/drupal-8.4.5.zip

```
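If you prefer to skip the interactive editor, the same file can be created in one step with a here-document:

```shell
# Create example.txt non-interactively, with one download URL per line
cat > example.txt <<'EOF'
https://wordpress.org/latest.zip
https://downloads.joomla.org/cms/joomla3/3-8-5/Joomla_3-8-5-Stable-Full_Package.zip
https://ftp.drupal.org/files/projects/drupal-8.4.5.zip
EOF

# Count the lines to confirm all three URLs were written
wc -l < example.txt
```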

Once done, use the -i option to download all the files listed in your text file:

```bash

wget -i example.txt

```

Wait for the process to finish, and you will have the installation packages of three of the most popular content management systems.

3. Using Wget to Get Files with Different Names:

In this wget example, we will save a file with a different name using the -O option:

```bash

wget -O wordpress-install.zip https://wordpress.org/latest.zip

```

In this case, the downloaded resource will be saved as wordpress-install.zip instead of its original name.

4. Using Wget to Save Files in a Specified Directory:

You can use wget to save a file in a different directory using the -P option:

```bash

wget -P documents/files/ https://wordpress.org/latest.zip

```

The file you retrieve with this syntax will appear in the documents/files/ folder.

5. Using Wget to Limit Download Speed:

With wget, you can also limit the download speed. This is useful when retrieving large files, as it prevents wget from using all of your bandwidth. This wget example sets the limit to 500 KB/s:

```bash

wget --limit-rate=500k https://wordpress.org/latest.zip

```

6. Using Wget to Set Retry Attempts:

Internet connection issues can interrupt your download. To address this problem, you can increase the number of retry attempts using the --tries option:

```bash

wget --tries=100 https://wordpress.org/latest.zip

```

7. Using Wget to Download in the Background:

For extremely large files, you can take advantage of the -b option. It downloads your content in the background.

```bash

wget -b http://example.com/large-file.tar.gz

```

A wget-log file will appear in your working directory, which can be used to verify the progress and status of your download. You can also use the tail command:

```bash

tail -f wget-log

```

8. Using Wget to Download via FTP:

The command is also usable with FTP. You will just need to specify the username and password, as in this wget example:

```bash

wget --ftp-user=YOUR_USERNAME --ftp-password=YOUR_PASSWORD ftp://example.com/something.tar

```
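Note that passwords passed on the command line can be visible in your shell history and process list. As an alternative, wget can read credentials from a ~/.netrc file; the hostname and credentials below are placeholders:

```
machine example.com
login YOUR_USERNAME
password YOUR_PASSWORD
```

Restrict the file's permissions with chmod 600 ~/.netrc so other users cannot read it.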

9. Using Wget to Continue Interrupted Downloads:

Your download may be interrupted if you lose your internet connection or experience a power outage. This is quite common when downloading large files. Instead of starting over, it is possible to continue the download using the -c option:

```bash

wget -c https://example.com/very-large-file.zip

```

If you proceed without the -c option, wget will save the new download with .1 appended to its name, since a file with the original name already exists.

10. Using Wget to Retrieve Complete Websites:

It is also possible to use the wget command to download the content of an entire website. This will allow you to view it locally without an internet connection. Here is an example:

```bash

wget --mirror --convert-links --page-requisites --no-parent -P documents/websites/ https://some-site.com

```

Let us break down the options used in this wget command:

- --mirror: Makes your download recursive.

- --convert-links: All links will be converted for proper offline use.

- --page-requisites: Downloads all files needed to display the page properly, such as CSS, JS, and images.

- --no-parent: Ensures that directories above the starting point in the hierarchy are not retrieved.

- -P documents/websites/: Ensures that all content goes to our specified directory.

Once the process finishes, you will be able to open the downloaded website locally and find all files in the documents/websites/ folder.

11. Using Wget to Locate Broken Links:

Let us try something more advanced. We can use the wget command to locate all broken URLs that display the 404 error on a specific website. Start by running the following:

```bash

wget -o wget-log -r -l 5 --spider http://example.com

```

- -o: Writes the output to a log file (here, wget-log) for later use.

- -r: Makes the download recursive.

- -l: Specifies the maximum recursion depth (5 in this case).

- --spider: Sets wget to spider mode, which checks URLs without downloading their content.

Now we can investigate the wget-log file to find the list of broken links. Here is the command to do it:

```bash

grep -B 2 '404' wget-log | grep "http" | cut -d " " -f 4 | sort -u

```
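To see what this pipeline does, here is a sketch run against a fabricated log fragment; the timestamps and URLs are made up, but the line layout mimics a real wget spider log, where each request starts with a `--TIMESTAMP--  URL` line:

```shell
# Simulate a fragment of a wget spider log containing one broken link
cat > wget-log <<'EOF'
--2024-01-01 10:00:00--  http://example.com/good.html
Connecting to example.com... connected.
HTTP request sent, awaiting response... 200 OK
--2024-01-01 10:00:05--  http://example.com/missing.html
Connecting to example.com... connected.
HTTP request sent, awaiting response... 404 Not Found
EOF

# Keep the two lines before each 404, pick the request line, extract the URL field
grep -B 2 '404' wget-log | grep "http" | cut -d " " -f 4 | sort -u
# → http://example.com/missing.html
```

The `cut -d " " -f 4` works because the double space after the timestamp produces an empty third field, leaving the URL as field four.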

12. Using Wget to Download Numbered Files:

If you have numbered files or images in a specific list, you can easily download all of them with the following syntax:

```bash

wget http://example.com/images/{1..50}.jpg

```
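The `{1..50}` range is brace expansion, which is performed by the shell (bash) before wget ever runs. Replacing wget with echo shows the exact URL list that wget would receive:

```shell
# The shell expands the braces into one argument per URL
echo http://example.com/images/{1..3}.jpg

# For zero-padded file names like 01.jpg, pad the range bounds (bash 4+)
echo http://example.com/images/{01..03}.jpg
```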

Conclusion:

Congratulations! By completing this tutorial, you have learned several uses of the wget command. Now you can use it to get single or multiple files. Additionally, you have learned some advanced uses, such as downloading a complete website or locating broken URLs. For more information, you can also refer to the official documentation.

Is there any hidden tip or trick you want to share? Feel free to do so in the comments!
