When we start learning about command line tools, we tend to see them as single-purpose. You're taught that cat prints file contents, ls lists the contents of a directory, and du shows disk space usage. However, many command line tools have dozens of options, all neatly described in their man pages. Some of them can do wonders when combined with other commands.

Of course, it's unreasonable to expect anyone to remember every single option. With that in mind, it's good to occasionally refresh your knowledge of Linux commands, because you might discover new uses for them.


This time, we're focusing on cURL, a tool for transferring data via a number of Internet protocols such as HTTP(S), FTP, Telnet, LDAP, IMAP, POP3, SMTP, and more.

In simplified terms, cURL performs various requests from a client to a server, establishing a connection between them by means of a specific protocol and its associated methods. For example, as an HTTP client, cURL can send a request to view or download content (the GET request method), or to submit content through a form on a website (the POST request method). Many web applications and services allow cURL to interact with their APIs (application programming interfaces).
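For instance, a bare curl call issues a GET request, while the -d (data) option turns it into a POST; the URLs and form fields below are just placeholders:

        # GET: fetch a page (placeholder URL)
        curl https://example.com/page

        # POST: submit form data (placeholder URL and fields)
        curl -d "name=linux&topic=curl" https://example.com/form
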

Because their functionality overlaps to an extent, cURL and wget are often compared to each other. Both tools can download content from the Internet, but wget supports recursive downloads, web scraping, and generally feels simpler to use. If you just want to download files from the terminal, wget is probably a better choice.

On the other hand, if you need advanced HTTP authentication methods, and want to upload files as well as download them, it's worth learning how to use cURL. Also, wget only supports HTTP(S) and FTP, while cURL covers a much wider range of protocols. This means cURL can do more cool stuff -- and here are ten examples to prove it.

1. Get the Weather Report

If someone told you to check the weather from the terminal, you'd expect to see some boring numbers. Not with this command.

        curl http://wttr.in/LOCATION
    
[Image: wttr.in weather forecast in the terminal]

The information is provided by a CLI application called wego, but if you don't want to install it, cURL can fetch the forecast from its web frontend, wttr.in. All it needs is the location you want the forecast for: the name of a city, its airport code, or your current IP address. A newer feature shows moon phase information if you type:

        curl wttr.in/Moon
    
[Image: wttr.in moon phase report]
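Besides city names, wttr.in also understands the airport codes and IP addresses mentioned above; for example:

        # JFK is just an example airport code
        curl http://wttr.in/JFK
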

2. Download Files and Resume Downloads

Downloading files is something we usually do in the browser. Sometimes you'll want to use a download manager; for example, when downloading several files at once, or when you want to pause downloads. Although cURL isn't a popular choice for simultaneous downloads (wget is recommended instead), you can still use it for that purpose by combining its powerful options (switches). First you'll need a direct link to the file. In this example, we'll use a PDF of the Linux Voice magazine.

        curl -O -C - https://www.linuxvoice.com/issues/016/Linux-Voice-Issue-016.pdf
    

The uppercase O switch (-O) makes cURL save the file with the default filename (usually the one from the link itself). If you wanted to save it under a different name, you'd use lowercase o followed by the new name:

        curl -o magazine.pdf -C - https://www.linuxvoice.com/issues/016/Linux-Voice-Issue-016.pdf
    

By default, the files are saved in the current directory (check it with the pwd command). To save them elsewhere, provide the path after the -o switch. The -C - switch enables cURL to resume the download. You'd pause it by pressing Ctrl+C in the terminal, and resume by running the same download command again:

[Image: resuming a paused cURL download]
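To illustrate the earlier point about paths, saving the same file into another directory only requires passing that path along with the new name after -o:

        # ~/Downloads is just an example target directory
        curl -o ~/Downloads/magazine.pdf -C - https://www.linuxvoice.com/issues/016/Linux-Voice-Issue-016.pdf
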

cURL displays the download progress in a table-like format, with columns containing information about download speed, total file size, elapsed time, and more. If you dislike this, you can opt for a simpler progress bar by adding -# or --progress-bar to your cURL command.
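For example, here is the same download with the minimal progress bar:

        # -# replaces the progress table with a simple bar
        curl -# -O https://www.linuxvoice.com/issues/016/Linux-Voice-Issue-016.pdf
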

To download multiple files at once, just list the links one after the other:

        curl -O http://example.com/file1.txt -O http://example.com/file2.pdf -O http://example.com/file3.zip
    

With the help of other command-line tools, we can batch-download all PNG and JPG images from a Tumblr blog:

        curl http://concept-art.tumblr.com/ | grep -oE 'src="[^"]*\.(png|jpg)"' | cut -d\" -f2 | while read l; do curl "$l" -o "${l##*/}"; done
    

In this case, grep and cut collect information about the filenames and format it so that only files with the specified extensions are displayed. If you run the command without the last pipe:

        curl http://concept-art.tumblr.com/ | grep -oE 'src="[^"]*\.(png|jpg)"' | cut -d\" -f2
    

you'll just get a list of files that satisfy our criteria, but they won't actually be downloaded. cURL can get a list of images from a range of pages, provided that the blog uses standard pagination:

        curl http://concept-art.tumblr.com/page/[1-7] | grep -oE 'src="[^"]*\.(png|jpg)"' | cut -d\" -f2
    

You can modify the range by changing the numbers in square brackets. Again, this command would only list the images; to download them, run the full command in the directory where you want to save the images:

        curl http://concept-art.tumblr.com/page/[1-7] | grep -oE 'src="[^"]*\.(png|jpg)"' | cut -d\" -f2 | while read l; do curl "$l" -o "${l##*/}"; done
    

If you're well-versed in regular expressions, you can improve the looks and the efficiency of this command, and share the result in the comments.

3. Manage Files on an FTP Server

We don't hear much about FTP these days, but that doesn't mean it's obsolete. In fact, many open source projects and Linux distributions share their software on FTP servers. Since FTP is supported by cURL, you can use it as a simple FTP client to upload and download files. You can browse the files on an FTP server by accessing the directories:

        curl ftp://ftp.debian.org/debian/
    

To enter a subdirectory, type its name followed by a forward slash (/).
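For example, to list the dists subdirectory of the Debian archive used above:

        # dists/ is one of the Debian archive's top-level directories
        curl ftp://ftp.debian.org/debian/dists/
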

[Image: browsing an FTP server with cURL]

Downloading files is similar to the HTTP downloads described in the previous section. You can use either -o or -O, and add -C - if you want to be able to pause and resume the download.

        curl -O ftp://ftp.heanet.ie/mirrors/linuxmint.com/stable/17.3/linuxmint-17.3-kde-64bit.iso
    

Although cURL doesn't support recursive downloads (remember, wget does!), it can still download a range of files at once. The only condition is that the filenames follow a pattern. For example, we could download from a wallpaper-hosting server where the wallpapers are all named "wallpaperNUMBER":

        curl -O ftp://ftp.myserver.com/files/wallpaper[0-120].jpg
    

Some FTP servers require authentication before you can download files. cURL lets you log in with the -u (user) option:

        curl -u username:password -O ftp://ftp.protectedserver.com/files/example.txt
    

You can also upload files to an FTP server with the -T (transfer) option:

        curl -u username:password -T /home/user/Documents/test.txt ftp://ftp.myserver.com
    

Here you can also define multiple files as a range. This feature is sometimes called "globbing". If the filenames don't follow a pattern, just list them within curly brackets (-T "{file1.txt,image27.jpg}"). Conversely, if they have similar names, apply the same logic from the Tumblr download example and use square brackets (-T "photo[1-50].jpg"). Make sure to provide the full path to the files if they're not in your current directory.
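Put together, uploading a whole range of similarly named files might look like this (the server, credentials, filenames, and remote directory are all placeholders):

        # upload photo1.jpg through photo50.jpg into the remote uploads/ directory
        curl -u username:password -T "photo[1-50].jpg" ftp://ftp.myserver.com/uploads/
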

4. Check If a Website Is Down

We've all been there. A website you absolutely need suddenly stops working. Then Facebook won't load. Faced with a true first world problem, what do you do?

You could Google it, ask a friend to test it for you, or use one of those single-serving sites that tell you if a website is down. Or you could just fire up the terminal and run cURL:

        curl -Is https://www.twitter.com -L | grep HTTP/
    

The uppercase I switch (-I) fetches only the HTTP headers of a page, -s keeps the output quiet, and the -L (location) option makes cURL follow redirections. This means you don't have to type a site's full URL; just write twitter.com and cURL will take care of the rest thanks to -L. If there are any redirections, each will be displayed with its own HTTP status.

[Image: HTTP status output for a site that is up]

The message we're interested in is "200 OK", which means everything is fine with the website. If it's indeed down, you'll see something like this:

[Image: HTTP status output for a site that is down]

HTTP status codes are only as informational as your understanding of them allows. This method is not completely reliable, because a website may return a status code indicating a successfully processed request, yet it will be empty when you open it in the browser. Still, in most cases it should correspond to the real situation, and let you know what's up -- or down.
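If you check sites often, you could wrap the one-liner in a small shell function in your ~/.bashrc; this is just a sketch, and the name isup is arbitrary:

        # isup is an arbitrary name; usage: isup twitter.com
        isup() {
            curl -Is "$1" -L | grep HTTP/
        }
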

5. Expand Shortened URLs

Shortened URLs aren't inherently bad. Without them, it would be difficult to share links on Twitter and other character-limited social networks. Some URL shortening services offer useful analytics, too. But there's always a risk that someone is trying to hide malicious content behind a shortened URL, or that a troll is masking a Rickroll (or something much, much worse). If you ever feel suspicious of a shortened URL for any reason, cURL can help you expand it and find out exactly where it leads:

        curl -sIL http://buff.ly/1lTcZSM | grep ^Location;
    

or

        curl -sI http://buff.ly/1lTcZSM | sed -n 's/Location: *//p';
    
[Image: expanding a shortened URL with cURL]

You can combine cURL with grep or sed; the main difference is in the formatting. Sed is one of those tools every Linux user should know, and it complements cURL in this and a few other use cases. Let's not forget that cURL can download files from a shortened URL (provided that the URL actually points to a file):

        curl -L -o filename.txt http://short.url
    

The syntax is the same as with other cURL downloads, and the -L option takes care of the redirection from a shortened URL to the original one.
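Another approach is to let cURL report the final destination itself with the -w (write-out) option and its url_effective variable, discarding the response along the way:

        # -w prints the final URL after redirects; -o /dev/null discards the headers
        curl -sIL -o /dev/null -w '%{url_effective}\n' http://buff.ly/1lTcZSM
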

6. Show Your Appreciation for ASCII Art

Admittedly, this isn't particularly useful, but it looks cool. With the help of pv, a utility for monitoring data progress, cURL can display ASCII animations in the terminal.

        curl -s http://artscene.textfiles.com/vt100/wineglas.vt | pv -L9600 -q
    
[Image: ASCII animation playing in the terminal]

The -s and -q options keep both commands in silent (quiet) mode. The -L option here refers to the pv command, and lets you modify the transfer rate of data in bytes per second. In other words, if the animation is moving too fast or too slowly, try playing with that number. Apart from animations, cURL can display plain, static ASCII art:

[Image: static ASCII art displayed with cURL]

The Web has plenty of websites with all kinds of ASCII art out there: from amazingly detailed, high-quality pieces to weird, silly, and even NSFW material. This digital art technique dates back to the 1960s, and today it's part of Internet culture and history, kept alive in numerous collections and tools that let you convert text and images to ASCII art. You can use it to decorate your terminal or to prank your friends -- whatever floats your boat.

7. Experiment with Social Media

Using social media from the terminal is nothing new -- we've already shown you command-line Twitter clients for Linux. While you probably won't switch to cURL as your online socializing tool, it's good to know that you can post to Facebook with it, as described here. You'll notice that, technically, cURL doesn't do it on its own; a combination of tools gets the job done.

[Image: posting to Facebook from the terminal]

As for Twitter, it used to be possible to manage it directly from the terminal with cURL. Then Twitter changed its API, and now there's a special cURL client for Twitter called Twurl. It's not the easiest thing to use, especially for a beginner, and it requires authentication with the Twitter Ad Platform. This makes sense if you're a developer or an advanced user, but not so much if you just want to tweet from the command-line. Still, there are ways to have fun with Twitter. You can use cURL to check a user's follower count:

        curl -s https://twitter.com/username | grep -o '[0-9,]* Followers';
    
[Image: checking a Twitter follower count with cURL]

8. Find Your External IP Address

Finding your local IP address is easy enough -- just run ifconfig or consult your Network Management applet. For the external IP, most people use specialized websites to obtain this information. Still, some things are just easier to do from the terminal, and this might be one of them. You can also create an alias for the cURL command. There are several online services that cooperate with cURL:

        curl ipinfo.io
        curl -s https://4.ifcfg.me
        curl -s http://whatismyip.akamai.com
        curl ifconfig.me
        curl -s icanhazip.com

Some can tell you more about any external IP address:

        curl ipinfo.io/207.46.13.41
        curl ifconfig.me/207.46.13.41

[Image: detailed information about an external IP address]

All you have to do is choose a service. If you're indecisive, just include them all in your alias, as backup solutions.
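For example, you could add something along these lines to your ~/.bashrc (the alias name and the fallback chain are just suggestions):

        # myip is a suggested name; icanhazip.com and ifconfig.me are from the list above
        alias myip='curl -s icanhazip.com || curl -s ifconfig.me'
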

9. Paste Text and Share Images

Breaking your workflow is never good for productivity and focus. If you do most of your work in the terminal, switching to a browser just to share a few files can be impractical, if not annoying. Luckily, some pastebin and file sharing services were born to work with cURL, so you can use them straight from the terminal, without a user account.

Clbin and Sprunge.us have similar syntax. With Clbin, you pipe a local file or the output of a command, and it returns a link to your uploaded text:

        cat textfile.txt | curl -F 'clbin=<-' https://clbin.com
    

It also supports image uploads (PNG, JPG, and GIF):

        curl -F 'clbin=@image.png' https://clbin.com
    

If you want to use Sprunge.us instead, type:

        cat textfile.txt | curl -F 'sprunge=<-' http://sprunge.us
    

Sprunge.us doesn't support image uploads for now.
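As with Clbin, piping the output of any command works just as well as piping a file; for example:

        # any command's output can be piped in; dmesg is just an example
        dmesg | curl -F 'sprunge=<-' http://sprunge.us
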

Ix.io is based on the same principle as the previous two services, with a few extra features. To upload a file, type:

        cat file.txt | curl -F 'f:1=<-' ix.io
    

or

        curl -F 'f:1=@file.txt' ix.io
    

When you get a link to the uploaded text, you can modify its URL to show syntax highlighting (ix.io/yourpaste+, ix.io/yourpaste/, or ix.io/yourpaste/language for a specific scripting or programming language). It's also possible to limit how many times a link can be viewed by modifying the number after the 'read:1' value:

        cat file.txt | curl -F 'f:1=<-' -F 'read:1=2' ix.io
    

Ix.io is primarily intended for text-based files such as source code or system logs. If you want to upload a variety of file formats, use Transfer.sh. It supports images, file encryption, and keeps your files online for two weeks. You can upload up to 5 GB of data to Transfer.sh. Here's how:

        curl --upload-file bunnies.jpg https://transfer.sh/bunnies.jpg 
    

You're free to define the name of the uploaded file. To upload multiple files, list them one after the other with the -F option:

        curl -i -F filedata=@/tmp/hello.txt -F filedata=@/tmp/hello2.txt https://transfer.sh/
    

10. Check Unread Mail on Gmail

There is massive potential to be unlocked in cURL if you're willing to delve into the details of email-related protocols (SMTP, POP, IMAP). For a quick email check, though, this command will do. It parses your Gmail Atom feed and formats the output (email subject and sender) with the tr, awk, sed, and/or grep commands. Note that this solution is extremely unsafe, because it exposes your login credentials to anyone with access to your terminal. The first version shows the sender's name, while the second one prints only unread email subjects:

        
        curl -u username:password --silent "https://mail.google.com/mail/feed/atom" | tr -d '\n' | awk -F '<entry>' '{for (i=2; i<=NF; i++) {print $i}}' | sed -n "s/<title>\(.*\)<\/title.*name>\(.*\)<\/name>.*/\2 - \1/p"

        curl -u username:password --silent "https://mail.google.com/mail/feed/atom" | grep -oPm1 "(?<=<title>)[^<]+" | sed '1d'

[Image: checking unread Gmail messages with cURL]

What Else Can cURL Do?

cURL is rarely used as a standalone command. Most people use it as part of a script or an application. Still, it's possible to create practical one-liners with cURL, as we've demonstrated here. Many of these examples were adapted from CommandLineFu, a fantastic source of smart command-line hacks, and you shouldn't consider them as set in stone.

With enough knowledge and experience, we can modify every command, format it differently, or completely replace it with a better solution. Can you improve our suggested cURL commands? Do you know of any other cool uses for cURL? Share your tips in the comments.

Image Credits: Inside Introduction to the Command Line by Osama Khalid via Flickr.