Mac OS X: Terminal Download File Command
I often need to download files using the Terminal. However, I am unable to find the wget command on OS X. How do I download files from the web via the Mac OS X bash command line?
You need to use a tool (command) called curl. It is a tool to transfer data from or to a server using one of the following supported protocols:
FTP
HTTP
HTTPS
FTPS
POP3
SFTP
SMTPS
SMTP and more.
This command is designed to work without user interaction.
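curl ships with OS X by default, so there is nothing to install. As a quick sanity check, assuming the standard Terminal bash shell, you can verify the binary is present and see which protocols your build supports:
## verify curl is available and list supported protocols/features ##
which curl
curl --version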
curl command syntax
The syntax is:
curl url
curl [option] url
curl -O url
curl -L -O url
curl -o output.file.name.here url-here
curl -o foo.pdf http://server1.cyberciti.biz/foo.pdf
Examples
Open the Terminal and then type the following command to grab “Mastering vim” in pdf format from the www.cyberciti.biz server:
curl -o mastering-vim.pdf http://www.cyberciti.biz/files/mastering-vi-vim.pdf
Sample outputs:
Fig.01: curl command in action
The -o option writes output to a file called mastering-vim.pdf instead of to the screen. You can skip the -o option and use -O (capital letter O) to save the output to a local file named like the remote file. Only the file part of the remote URL is used; the path is cut off:
curl -O http://www.cyberciti.biz/files/mastering-vi-vim.pdf
ls -l *.pdf
How do I specify multiple URLs or parts of URLs?
The syntax is:
## grab foo.pdf from server1.cyberciti.biz, server2.cyberciti.biz, and server3.cyberciti.biz ##
curl -O http://server{1,2,3}.cyberciti.biz/foo.pdf
## grab the latest reports from US, UK, and India FTP servers ##
curl -O ftp://intranet.site.{us,uk,in}.lic.net.in/reports/latest[a-z].tar.gz
You can get sequences of alphanumeric series by using []. In this example, grab invoices-1.pdf, invoices-2.pdf, …, invoices-1000.pdf using the curl command:
curl -O ftp://ftp.cyberciti.biz/invoices-[1-1000].pdf
You can grab URLs with leading zeros as follows:
curl -O ftp://ftp.cyberciti.biz/images-[001-300].png
You can combine various techniques to build a complex download URL structure as follows:
curl -O ftp://ftp.cyberciti.biz/backups-us-[a-z].tar.gz
curl -O http://server1.cyberciti.biz/music[1997-2000]/series[1-20]/dump{a,b,c,d,e,f}.tar.gz
You can set a step counter for the ranges to get every Nth number or letter:
## grab http://server1.cyberciti.biz/files/foo1.txt, foo3.txt, foo5.txt, and so on ##
curl -O http://server1.cyberciti.biz/files/foo[1-10:2].txt
## grab bar-a.pdf, bar-c.pdf, bar-e.pdf, and so on ##
curl -O http://server1.cyberciti.biz/files/bar-[a-z:2].pdf
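If you would rather give each globbed download a custom local name instead of the remote name, curl can substitute the current glob value into the -o file name using #1, #2, and so on. Here is a minimal sketch using the same placeholder host as above:
## save foo1.txt, foo3.txt, foo5.txt, ... as report-1.txt, report-3.txt, report-5.txt, ... ##
curl 'http://server1.cyberciti.biz/files/foo[1-10:2].txt' -o 'report-#1.txt'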
DISPLAYING A PROGRESS BAR
You can force curl to show progress as a simple progress bar instead of the standard, more informational, meter:
curl -# -O http://www.cyberciti.biz/files/mastering-vi-vim.pdf
Sample outputs:
################################################################## 100.0%
DEALING WITH URL REDIRECTION
The following is the recommended syntax for servers that issue an HTTP redirect before serving the file, or that hide the actual download file name:
curl -L -o file.name.here 'http://example.com/download.php?fileID=foo'
Consider the following FileZilla download URL from the SourceForge FOSS hosting platform:
## Problems ##
# 1. Long URL with special characters in it
# 2. URL hides the actual download file name
# 3. URL does an HTTP 301 redirect to pick the nearest mirror
# -----------------------------------------------------------
http://downloads.sourceforge.net/project/filezilla/FileZilla_Client/3.7.3/FileZilla_3.7.3_i686-apple-darwin9.app.tar.bz2?r=http%3A%2F%2Fsourceforge.net%2F&ts=1381651492&use_mirror=ncu
To avoid problems, use the following syntax:
## ****************** TIP *************************************** ##
a) Put all URLs in single quotes to avoid nasty shell surprises
b) The -L option follows URL redirection
c) The -o option writes the output to the given file name
## ****************** TIP *************************************** ##
curl -L -o filezilla.tar.bz2 'http://downloads.sourceforge.net/project/filezilla/FileZilla_Client/3.7.3/FileZilla_3.7.3_i686-apple-darwin9.app.tar.bz2?r=http%3A%2F%2Fsourceforge.net%2F&ts=1381651492&use_mirror=ncu'
Sample outputs:
Fig.02: Put the URL in single quotes and follow URL redirection with the -L option
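If your curl version is 7.20.0 or later, another option worth knowing about is -J (--remote-header-name), which tells -O to save the file under the name the server suggests in its Content-Disposition header rather than a name taken from the URL. Whether this helps depends on the server actually sending that header, so treat this as an optional sketch:
## let the server's Content-Disposition header pick the local file name ##
curl -O -J -L 'http://downloads.sourceforge.net/project/filezilla/FileZilla_Client/3.7.3/FileZilla_3.7.3_i686-apple-darwin9.app.tar.bz2?r=http%3A%2F%2Fsourceforge.net%2F&ts=1381651492&use_mirror=ncu'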
SAVE BANDWIDTH
You can pass the --compressed option to HTTP-based URLs to request a compressed response using one of the algorithms curl supports, and save the uncompressed document. If this option is used and the server sends an unsupported encoding, curl will report an error:
curl -L -O --compressed http://server1.cyberciti.biz/large.report-tab.html
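To confirm that the server actually honored the compression request, one approach is to dump the response headers while saving the body and look for a Content-Encoding line. This is just a sketch against the same placeholder URL:
## headers go to the screen (-D -), the body goes to the file (-o) ##
curl -s -L --compressed -D - -o large.report-tab.html http://server1.cyberciti.biz/large.report-tab.html | grep -i '^content-encoding'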
DOWNLOAD A FILE USING USERNAME AND PASSWORD
The syntax is:
Security alert: Anything (usernames/passwords) sent over plain HTTP/FTP is completely open to interception. Do not pass usernames/passwords using the ftp/http protocols.
## Insecure examples ##
curl ftp://username:passwd@ftp1.cyberciti.biz:21/path/to/backup.tar.gz
curl http://username:passwd@server1.cyberciti.biz/file/path/data.tar.gz
## Secure examples: SSL/HTTPS/SFTP etc. ##
curl --ftp-ssl -u UserName:PassWord ftp://ftp1.cyberciti.biz:21/backups/07/07/2012/mysql.blog.sql.tar.gz
## SFTP example ##
curl -u userNameHere sftp://home1.cyberciti.biz/~/docs/resume.pdf
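To keep passwords out of your shell history, you can give -u just the user name and let curl prompt for the password interactively, or store the credentials in a ~/.netrc file (readable only by you) and pass -n; the host, user, and path below are placeholders:
## curl will prompt for the password ##
curl -u userNameHere --ftp-ssl ftp://ftp1.cyberciti.biz:21/backups/mysql.blog.sql.tar.gz
## or read credentials from ~/.netrc, e.g. a line such as: ##
## machine ftp1.cyberciti.biz login userNameHere password superSecretHere ##
curl -n --ftp-ssl ftp://ftp1.cyberciti.biz:21/backups/mysql.blog.sql.tar.gz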
Check out our previous video tutorial on the curl command for more information:
(Video 01: curl Command Line Download Examples For FTP / HTTP Protocols)
Recommended readings
Linux / Unix: curl Command Download File Examples
Linux / Unix: curl Command Pass Host Headers
See curl(1) for more information.