Linux curl command: Web Interaction from the Shell

curl is a powerful command that lets you transfer data to or from a server.

It supports a wide range of protocols, and it’s equipped to deal with various network-related tasks without requiring user interaction.

 

Install curl

curl is a widely-used tool, and most Linux distributions include it in their repositories. Depending on your operating system, the installation command can vary.

Debian/Ubuntu-based Distributions

To install curl on Debian, Ubuntu, or any other Debian-based system:

$ sudo apt update
$ sudo apt install curl

Red Hat/Fedora/CentOS

For Red Hat and its derivatives:

$ sudo yum install curl

If you’re on a system using dnf:

$ sudo dnf install curl

 

Supported protocols

With curl, you can work with numerous protocols including HTTP, HTTPS, FTP, FTPS, SCP, SFTP, LDAP, LDAPS, and many others.

$ curl --version

Output:

Protocols: dict file ftp ftps gopher gophers http https imap imaps ldap ldaps mqtt pop3 pop3s rtmp rtsp scp sftp smb smbs smtp smtps telnet tftp 

You’ll find the protocols supported by your installed version of curl on the “Protocols:” line of the output.

It’s crucial to know which protocols your curl installation supports, especially if you’re dealing with many servers and services.

 

URL Globbing

URL globbing in curl allows you to fetch multiple URLs with a single command by specifying parts of the URL in a brace {} sequence.

This can save time and automate processes when dealing with sites or servers that have a predictable URL structure.

Fetching Sequential URLs

If you know that URLs are numerically sequenced, you can use URL globbing to fetch them all:

$ curl http://example.com/file[1-5].zip

This command fetches file1.zip, file2.zip, … up to file5.zip.
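If you want to preview a range expansion before letting curl loose on it, you can reproduce the numbering in the shell (example.com is a placeholder host):

```shell
# Preview the URLs that curl's [1-5] range expands to.
# Tip: in the real curl command, you can reference the matched
# number as "#1" in -o, e.g. -o "saved_#1.zip", to name each file.
for n in $(seq 1 5); do
  echo "http://example.com/file${n}.zip"
done
```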

Fetching Based on Specific Patterns

You can specify multiple patterns within the braces:

$ curl http://example.com/file_{A,B,C}.zip

This fetches file_A.zip, file_B.zip, and file_C.zip.

Combining Multiple Patterns

URL globbing allows multiple pattern sequences in a single URL:

$ curl http://example.com/{year2022,year2023}/event_{A,B,C}.html

This command would fetch six URLs: from http://example.com/year2022/event_A.html to http://example.com/year2023/event_C.html.
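As a quick sanity check, nested shell loops reproduce the same six-URL expansion (again with example.com as a placeholder):

```shell
# Expand {year2022,year2023}/event_{A,B,C}.html by hand:
# 2 x 3 = 6 URLs, with the rightmost glob varying fastest.
for y in year2022 year2023; do
  for e in A B C; do
    echo "http://example.com/${y}/event_${e}.html"
  done
done
```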

Use globbing responsibly to avoid unintentional DoS attacks or being rate-limited.

 

Dynamic URLs using Variables

Starting from version 8.3.0, curl introduced support for command-line variables. These variables can be set using the --variable name=content or --variable name@file syntax.

The latter allows you to set the variable’s content from a file, and if the file is set to a single dash (-), it reads from stdin.

To expand the content of these variables in option parameters, you can use the {{name}} syntax, provided the option name is prefixed with --expand-.

For instance, if you have a variable named “username”, you can use it in a URL like this: --expand-url "https://example.com/{{username}}/details".

Basic Variable Expansion:

Set a variable named “username” and use it in a URL.

$ curl --variable username=JohnDoe --expand-url "https://example.com/profile/{{username}}"

Using Environment Variables:

Use the USER environment variable in a URL.

$ curl --variable %USER --expand-url "https://example.com/api/{{USER}}/details"

Using Variables from a File:

If you have a file named token.txt containing an authentication token, you can use its content in a URL.

$ curl --variable token@token.txt --expand-url "https://api.example.com/data?auth={{token}}"

Using Multiple Variables:

Combine multiple variables in a single URL.

$ curl --variable user=JohnDoe --variable action=edit --expand-url "https://example.com/{{user}}/{{action}}"

By leveraging variables, you can make your curl commands more dynamic and adaptable to different scenarios.

 

Downloading files

To download a file using curl, you simply provide the URL of the resource:

$ curl http://example.com/file.txt -o localfile.txt

With this command, you download “file.txt” from “example.com” and save it as “localfile.txt” on your local machine. The -o flag lets you specify the name of the saved file.

Use the -o flag to choose the local filename (or -O to keep the remote file’s name), and check that you aren’t overwriting existing files.

 

Uploading files

You can upload files to a server using the HTTP PUT or POST methods.

For PUT method:

$ curl -T localfile.txt http://example.com/upload/

Here, you upload “localfile.txt” to the specified URL using the HTTP PUT method.

For POST method:

$ curl -F "data=@localfile.txt" http://example.com/upload/

This command uploads “localfile.txt” using the HTTP POST method. The -F flag tells curl to POST the data as multipart/form-data.

The string “data=@localfile.txt” means post the file’s content as the value of the “data” form field.

When uploading, ensure you have appropriate permissions and know the method the server expects (either PUT or POST).

 

Form submissions with POST

Submitting forms is a common task. With curl, you can easily simulate a form submission.

$ curl -d "username=john&password=12345" http://example.com/login

This command submits a POST request to “example.com/login” with the provided data encoded as “application/x-www-form-urlencoded”.

The -d flag tells curl to send data in the POST request body.

Ensure that you correctly encode data for transmission, especially if the data contains special characters.
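For form data with special characters, curl’s --data-urlencode option does the percent-encoding for you. To see what that encoding actually produces, here is a small bash sketch of the same transformation (illustrative only; prefer the built-in option in practice):

```shell
# Percent-encode a string the way application/x-www-form-urlencoded
# expects (bash sketch; in real commands, prefer
# curl --data-urlencode "password=p@ss w0rd!" instead).
urlencode() {
  local s=$1 out= c i
  for (( i = 0; i < ${#s}; i++ )); do
    c=${s:i:1}
    case $c in
      [a-zA-Z0-9._~-]) out+=$c ;;                # unreserved characters pass through
      *) printf -v c '%%%02X' "'$c"; out+=$c ;;  # everything else becomes %XX
    esac
  done
  printf '%s\n' "$out"
}

urlencode 'p@ss w0rd!'   # prints p%40ss%20w0rd%21
```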

 

Using other HTTP methods

curl can utilize various HTTP methods, not just GET and POST.

$ curl --head http://example.com

This command fetches only the headers from “example.com” using the HEAD method; --head (or -I) is the correct way to do it. Avoid -X HEAD: the -X flag merely replaces the method name in the request line, so curl would still wait for a response body that never arrives. Reserve -X for methods that have no dedicated option.

For DELETE method:

$ curl -X DELETE http://example.com/resource/1

This command sends a DELETE request to remove a resource on “example.com”.

 

Authentication with cURL

Sometimes, resources are protected and require authentication.

$ curl -u username:password http://example.com/protected

This command sends a request to “example.com/protected” with Basic Authentication using the provided username and password. The -u flag followed by the credentials is what allows this authentication.
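Under the hood, -u simply adds an Authorization header containing “Basic” plus the base64 encoding of “user:password”. You can build the same header by hand (the credentials and URL below are placeholders):

```shell
# -u username:password is shorthand for this request header:
auth="Authorization: Basic $(printf '%s' 'username:password' | base64)"
echo "$auth"   # prints: Authorization: Basic dXNlcm5hbWU6cGFzc3dvcmQ=

# So these two commands send an identical header:
# curl -u username:password http://example.com/protected
# curl -H "$auth" http://example.com/protected
```

Note that base64 is an encoding, not encryption, which is why Basic Authentication over plain HTTP is so risky.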

Always be cautious with this method. Transmitting credentials in plaintext can be a security risk, especially over unencrypted connections.

 

Crafting custom headers and user agents

You may need to send custom headers or change the User-Agent.

$ curl -H "X-Custom-Header: customvalue" -A "MyUserAgent/1.0" http://example.com

This sends a request to “example.com” with a custom header and a custom User-Agent.

The -H flag sets custom headers, and the -A flag sets the User-Agent.

Custom headers can be useful for APIs or services expecting specific information, while changing the User-Agent can help emulate different devices or browsers.

 

Manipulating the Expect: header

Sometimes, when sending POST requests with a large payload, curl sends an “Expect: 100-continue” header. To explicitly control this behavior:

$ curl -H "Expect:" -d "large_data_payload" http://example.com/upload

By setting the “Expect:” header with no value, you effectively disable the default behavior, allowing immediate transmission of the data without waiting for server confirmation.

When dealing with servers that don’t handle “Expect: 100-continue” gracefully, this manipulation can be crucial.

 

Handling compressed responses

Many servers compress their responses to save bandwidth. curl can automatically decompress these responses for you.

$ curl --compressed http://example.com/resource

By using the --compressed option, curl requests a compressed response (gzip, deflate and, in builds with the relevant libraries, brotli or zstd) and decompresses it automatically.

This option ensures that the data you receive is immediately readable and usable.

 

Following redirects

Servers can redirect requests to other URLs. To make curl follow these redirects:

$ curl -L http://example.com/redirect

Using the -L or --location flag instructs curl to follow any redirects sent by the server.

It’s important to be aware when using this option, as following redirects blindly might lead you to unexpected destinations.

 

Using cookies and sessions

Cookies are essential for maintaining sessions or retaining certain preferences across requests.

$ curl -c cookies.txt -b cookies.txt http://example.com/login

Here, -c cookies.txt tells curl to save cookies to the file “cookies.txt”. The -b cookies.txt instructs curl to read and send cookies from the same file for subsequent requests.

Managing cookies properly ensures seamless interaction with sites that require session persistence.

 

cURL and APIs

APIs often require specific headers, methods, or data formats. curl is a valuable tool for interacting with them.

$ curl -H "Authorization: Bearer YOUR_API_TOKEN" -H "Content-Type: application/json" -d '{"key":"value"}' http://api.example.com/data

This sends a POST request to the API endpoint with an authorization header, and a JSON payload.

The multiple -H flags allow setting required headers.

When working with APIs, ensure you’re using the correct headers, methods, and data formats expected by the API.

 

SSL/TLS and certificate verification

By default, curl verifies the SSL certificate of the server.

$ curl https://secured-example.com

If there’s an issue with the server’s certificate, curl will produce an error.

However, in some scenarios, like testing or internal networks, you might encounter self-signed certificates. To bypass the verification:

$ curl -k https://internal-server

The -k or --insecure option disables the SSL certificate verification. Use this option with caution and only in trusted environments.

 

Using curl behind a proxy

If you’re operating behind a proxy, curl can be configured to use it:

$ curl -x http://proxyserver:port https://example.com

The -x flag followed by the proxy details informs curl to route its request through the specified proxy.

 

Overriding DNS resolution

In cases where you want curl to use a specific IP address for a hostname, regardless of what DNS says:

$ curl --resolve example.com:80:1.2.3.4 http://example.com

Here, curl will connect to “1.2.3.4” for “example.com” on port 80 without performing a DNS lookup for that host. This is useful for testing a server before its DNS records exist, or for working around DNS-related issues. (Pointing curl at an actual alternative DNS server requires --dns-servers, which is only available in builds compiled with c-ares support.)

Ensure you trust the address you’re specifying to prevent requests from being sent to the wrong server.

 

Reusing connections for multiple requests

When fetching several resources from the same server, let a single curl invocation handle all of them:

$ curl http://example.com/page1 -o page1.html http://example.com/page2 -o page2.html

curl automatically keeps the TCP connection open and reuses it for the subsequent requests, reducing overhead. Note that running curl once per URL cannot reuse a connection, even with a “Connection: keep-alive” header: each process opens and closes its own connection.

Connection reuse is particularly useful for scripts or automated tasks that interact repeatedly with the same server.

 

Limiting download/upload speeds

To limit the bandwidth used by curl:

$ curl --limit-rate 100K -O http://example.com/largefile.zip

This limits the download rate to 100 KBytes per second. The --limit-rate option followed by a speed (like 100K or 500K) sets the restriction.

Limiting speeds can be useful when bandwidth is shared and you don’t want curl to saturate the link.

 

Using curl for FTP transfers

curl supports FTP and allows you to upload and download files.

$ curl -u ftpuser:ftppass -O ftp://ftp.example.com/file.zip

This command downloads “file.zip” from the FTP server using the provided credentials.

For uploading:

$ curl -u ftpuser:ftppass -T localfile.zip ftp://ftp.example.com/

This uploads “localfile.zip” to the FTP server.

 

Displaying response headers

To view the headers returned by a server:

$ curl -I http://example.com

The -I option fetches only the headers, which is useful for checking status codes, content types, cookies, and other metadata.

This method helps you understand server responses without fetching the entire content.

 

IPv6 and curl

curl fully supports IPv6. To make a request over IPv6:

$ curl http://[2001:0db8:85a3:0000:0000:8a2e:0370:7334]/

Brackets are used to encapsulate IPv6 addresses in URLs.

Ensure your network infrastructure supports IPv6 when making such requests.

 

Streaming data with curl

curl can stream data, useful for watching logs or ongoing data feeds:

$ curl -N http://example.com/streaming-data

The -N (or --no-buffer) flag disables output buffering, so data appears in your console as soon as it arrives.

When streaming, always monitor the amount of data being received to ensure it doesn’t overwhelm your system or network.

 

Using .curlrc for default settings

For frequently used options, you can set defaults in a .curlrc file in your home directory:

Sample .curlrc content:

--compressed
--user-agent "MyUserAgent/1.0"

With this file in place, curl will always use the --compressed option and set the specified User-Agent by default.

Customizing .curlrc can save time and ensure consistency in repeated operations.
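A slightly fuller .curlrc might also follow redirects and set timeouts by default; all of the lines below are standard curl long options, with arbitrary example values:

```
--compressed
--location
--connect-timeout 10
--max-time 300
--user-agent "MyUserAgent/1.0"
```

Since curl reads .curlrc before parsing the command line, options you type generally override these defaults.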

 

The Magic of Curl in a Telecom Crisis

I received a distress call from one of the top telecom operators.

They were experiencing a major hiccup in their system: a large chunk of their prepaid customers were unable to top up their accounts, leading to widespread dissatisfaction.
The issue? The application they used to communicate with their payment gateway was malfunctioning and they needed a quick way to validate if the issue was from their end or the gateway’s end.

The current toolchain the company had required a complex process of logging into the system, running multiple scripts, waiting for batch processes, and then obtaining logs.

To extract data for just one user, it took roughly 20 minutes. Given that they wanted to verify data for 500 users as a sample, we were looking at about 166 hours of continuous work. An unrealistic task under the circumstances.

Familiar with the company’s APIs, I proposed using the curl command to bypass the faulty application and directly communicate with the payment gateway.

By scripting a loop in bash and using curl, I could programmatically send HTTP requests and get immediate feedback on the responses.
Here’s the simplified version of what I did:

#!/bin/bash
# Read one user ID per line and hit the gateway directly for each.
while IFS= read -r user; do
    curl -X POST https://gateway.example.com/topup -d "userid=${user}&amount=50" >> results.txt
done < user_list.txt

After implementing the curl solution:

The entire process took roughly 30 seconds per user, a significant improvement from the initial 20 minutes. In just over 4 hours, we had results for all 500 users. That’s a 97.5% reduction in time per user!

Analysis of the results.txt showed that the gateway responded correctly to our requests, indicating that the issue was indeed within the company’s application.

 

Resources

https://curl.se/docs/manpage.html
