The curl command transfers data to or from a network server, using one of the supported protocols (HTTP, HTTPS, FTP, FTPS, SCP, SFTP, TFTP, DICT, TELNET, LDAP or FILE). It is designed to work without user interaction, so it is ideal for use in a shell script.
The software offers proxy support, user authentication, FTP uploading, HTTP posting, SSL connections, cookies, file transfer resume, metalink, and other features.
Syntax
curl [options] [URL…]
URLs
The URL syntax is protocol-dependent. You’ll find a detailed description in RFC 3986.
You can specify multiple URLs or parts of URLs by writing part sets within braces as in:
http://site.{one,two,three}.com
or you can get sequences of alphanumeric series using [] as in:
ftp://ftp.numericals.com/file[1-100].txt
ftp://ftp.numericals.com/file[001-100].txt
ftp://ftp.letters.com/file[a-z].txt
Nested sequences are not supported, but you can use several ones next to each other:
http://any.org/archive[1996-1999]/vol[1-4]/part{a,b,c}.html
You can specify any number of URLs on the command line. They are fetched sequentially, in the specified order.
You can specify a step counter for the ranges to get every Nth number or letter:
http://www.numericals.com/file[1-100:10].txt
http://www.letters.com/file[a-z:2].txt
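The globbing syntax above can be tried without a remote server by pointing curl at local files through its FILE protocol support; the directory and file names below are made up for this sketch. The `#1` marker in the `-o` argument is replaced by whatever string matched the first glob.

```shell
# Create some hypothetical numbered files to stand in for a server.
dir=$(mktemp -d)
for i in 1 2 3 4 5; do
    echo "contents of file $i" > "$dir/file$i.txt"
done

# Fetch file1, file3, and file5 using a [1-5:2] range with a step
# counter of 2; "#1" in the -o name is replaced by the matched number.
curl -s "file://$dir/file[1-5:2].txt" -o "$dir/copy_#1.txt"

ls "$dir"   # copy_1.txt, copy_3.txt, and copy_5.txt now exist
```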
If you specify a URL without a protocol:// prefix, curl attempts to guess which protocol you want. It defaults to HTTP, but tries other protocols based on often-used hostname prefixes. For example, for hostnames starting with “ftp.”, curl assumes you want to speak FTP.
curl does its best to use what you pass to it as a URL. It is not trying to validate it as a syntactically correct URL by any means but is instead very liberal with what it accepts.
curl attempts to reuse connections for multiple file transfers, so that getting many files from the same server does not require repeated connects and handshakes. This improves speed. Of course, this only applies to files specified on a single command line; connections cannot be reused between separate curl invocations.
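As a sketch, the two transfers below share one curl invocation, so over HTTP they would travel through a single reused connection; each `-o` option pairs with the URL in the same position. Local FILE URLs and made-up file names are used here only so the example is self-contained.

```shell
# Hypothetical stand-in files; with http:// URLs on a real server,
# both transfers would reuse one connection.
dir=$(mktemp -d)
echo "first page"  > "$dir/a.txt"
echo "second page" > "$dir/b.txt"

# One invocation, two URLs: the first -o receives the first URL,
# the second -o receives the second URL.
curl -s -o "$dir/copy_a.txt" -o "$dir/copy_b.txt" \
     "file://$dir/a.txt" "file://$dir/b.txt"
```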
Progress meter
curl normally displays a progress meter during operations, indicating the amount of transferred data, transfer speeds and estimated time left, etc.
curl displays this meter in the terminal by default. If curl is about to write response data to the terminal, it disables the progress meter, which would otherwise garble the output by mixing the meter with the response data.
If you want a progress meter for HTTP POST or PUT requests, you need to redirect the response output to a file, using shell redirect (>), -o [file] or similar.
This is not the case for FTP upload, as that operation does not write any response data to the terminal.
If you prefer a progress “bar” instead of the regular meter, -# is your friend.
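A minimal sketch of the redirect-to-keep-the-meter idea, using a made-up local FILE URL so it runs without a server; with an HTTP POST you would add `-d data` and redirect the response the same way. The `-#` flag requests the bar form of the meter.

```shell
dir=$(mktemp -d)
printf 'response body' > "$dir/data.txt"

# Writing the response to a file with -o keeps the terminal free
# for the progress display; -# selects the bar-style meter.
curl -# -o "$dir/out.txt" "file://$dir/data.txt"
```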
Environment variables
The environment variables can be specified in lowercase or uppercase. The lowercase version has precedence. http_proxy is an exception as it is only available in lowercase.
Using an environment variable to set the proxy has the same effect as using the --proxy option.
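For example, the following two invocations behave the same way; the proxy address is a placeholder, so this fragment only runs as-is if such a proxy exists on your network.

```shell
# proxy.example.com:3128 is a hypothetical proxy address.
export http_proxy="http://proxy.example.com:3128/"
curl http://www.example.com/

# The environment variable above is equivalent to:
curl --proxy "http://proxy.example.com:3128/" http://www.example.com/
```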
Exit codes
A number of error codes, with corresponding error messages, may appear during bad conditions. Consult the curl man page for the complete list of exit codes and their meanings.
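In a script, the numeric exit status is available through `$?` after curl returns; zero means success and any nonzero value identifies the class of failure. Here a deliberately bogus FILE URL forces a failure so the branch is exercised.

```shell
# Request a file that does not exist; the transfer fails and curl
# exits with a nonzero code describing the error class.
status=0
curl -s "file:///no/such/path/missing.txt" -o /dev/null || status=$?

if [ "$status" -ne 0 ]; then
    echo "curl failed with exit code $status" >&2
fi
```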
Examples
curl https://www.computerhope.com/index.htm
Fetch the file index.htm from www.computerhope.com using the HTTP protocol, and display it to standard output. This is essentially the same as “viewing the source” of the webpage; the raw HTML is displayed.
curl https://www.computerhope.com/index.htm > index.htm
Fetch the same file as above, but redirect the output to a file, index.htm, in the current directory.
curl -O https://www.computerhope.com/index.htm
Fetch the same file as above, and write it to a file with the same name (index.htm) in the current directory, this time using curl's -O option.
curl --limit-rate 1234 -O https://www.computerhope.com/index.htm
Same as above, but this time, limit the download speed to an average of 1,234 bytes per second.
curl --limit-rate 1234 -O -# https://www.computerhope.com/index.htm
Same as above, but this time, display a progress bar (the -# option) instead of the numerical progress meter.
Related commands
wget — Download files via HTTP or FTP.