cURL Linux Command Guide: Options, Examples, and Tips

By

Ethan Fahey

Sep 5, 2025

Two developers coding at dual monitors with speech bubbles and code icons, representing curl Linux command usage and API testing workflows.

cURL in Linux is one of those essential tools every engineer and recruiter working with technical teams should know about. It’s a command-line utility designed for transferring data with URLs, and it supports a wide range of protocols. From downloading files to interacting with APIs or quickly testing if a website is up, cURL is versatile and reliable. In this guide, we’ll break down the must-know commands, share real-world examples, and offer practical tips to get the most out of it. And for recruiters or businesses, platforms like Fonzi AI help you identify candidates who already have these hands-on skills, saving time and ensuring you’re connecting with professionals who can put tools like cURL to work effectively.

Key Takeaways

  • cURL is a versatile command-line tool for transferring data with URLs, supporting multiple protocols like HTTP, HTTPS, and FTP, and is lightweight and pre-installed on most Linux systems.

  • You can perform a variety of tasks with cURL, from fetching data with simple commands to making complex HTTP requests with various methods, including GET and POST, using straightforward syntax and options.

  • Advanced features of cURL allow fine-tuning of your commands for tasks like file uploads, handling authentication securely, and customizing output, making it an essential tool for developers and sysadmins.

Understanding cURL in Linux

A terminal window showing cURL commands being executed in Linux.

cURL is a command-line tool for transferring data with URLs, essential for anyone working with servers and networks. Its versatility is notable: it supports multiple protocols, including HTTP, HTTPS, FTP, and even the lesser-known DICT protocol for dictionary lookups. This multi-protocol support means you can use cURL for a wide range of tasks, from downloading files and making API calls to checking whether a website is up over HTTPS.

A significant advantage of cURL is its lightweight nature. Unlike graphical tools that need substantial system resources, cURL launches quickly from the terminal and performs efficiently.

Additionally, it’s pre-installed on most Linux distributions, allowing immediate use without additional setup. Whether fetching data from a web server over HTTP or uploading files via FTP, cURL is your go-to command-line tool.

Basic Syntax of cURL Commands

cURL’s straightforward syntax is one of its strengths. At its core, a cURL command comprises options that modify its behavior, followed by a specified URL. This simplicity allows you to quickly get started without memorizing complex commands. For example, a basic cURL command to fetch the main page of a website looks like this:

curl http://example.com

In this command, http://example.com is treated as the URL; any argument cURL does not recognize as an option is treated as a URL.

cURL offers a plethora of options to cater to different needs. Options are prefixed with one or two dashes, and single-dash options can be combined without spaces. For instance, to fetch a webpage and save the output to a file, use the -o option:

curl -o output.html http://example.com

Moreover, cURL supports various protocols, including:

  • HTTP

  • HTTPS

  • FTP

  • and more, each with its unique set of command-line flags.

You can even handle multiple URLs in a single command; they are processed sequentially in the order listed unless you enable parallel transfers with the -Z option.

cURL also supports ‘globbing’, which allows you to specify multiple URLs or parts of URLs using braces or brackets. This feature is handy when you need to fetch or upload multiple files efficiently. Additionally, you can use environmental variables in cURL commands by setting them with the --variable option.
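For example, bracket and brace globbing can fetch a whole series of files in one command (the filenames here are illustrative):

curl -O "http://example.com/file[1-3].txt"

curl -O "http://example.com/{index,about}.html"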

When dealing with complex commands, you can read command-line arguments from a specified configuration file using the -K option, making it easier to manage and reuse your commands.
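A minimal sketch of how this might look, assuming a config file named curl.cfg (the filename and contents are hypothetical); in a config file, long options are written without the leading dashes:

url = "http://example.com"

output = "page.html"

silent

curl -K curl.cfg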

Fetching Data with cURL

cURL is frequently used to retrieve data from a specific URL. Whether you need to fetch a webpage, download a file, or get some JSON data, cURL makes it incredibly simple. To fetch a web page, provide the URL as an argument. For example, to get the main page from a web server, you would use:

curl http://example.com

This command shows the HTML content of the main page directly in the terminal. cURL also displays a progress meter during data fetching, which is useful for monitoring the download process.

If you prefer a different type of progress bar, you can use the -# option. For a cleaner output without progress bars and error messages, the -s option is your friend. Additionally, the -w or --write-out option in curl lets you specify what information to display after a transfer, such as HTTP status codes or timing details.

The --write-out option allows you to do the following (an example follows this list):

  • Customize the output

  • Print details about the transfer, making it easier to integrate into scripts and workflows

  • Format the output as a JSON object, providing a structured and readable summary of the transfer.
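For instance, the following prints just the HTTP status code and total transfer time while discarding the body (which variables you print is up to you):

curl -s -o /dev/null -w "%{http_code} %{time_total}\n" http://example.com

Using -w "%{json}" instead emits the full set of transfer details as a JSON object.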

Performing HTTP Requests with cURL

cURL is not just for fetching data; it’s also a powerful tool for performing various HTTP requests. Whether you need to send:

  • a GET request

  • a POST request

  • a PUT request

  • or a DELETE request, cURL has you covered. For instance, to send a POST request with data, use the -d flag along with a Content-Type header when necessary:

curl -d "name=JohnDoe&age=25" -H "Content-Type: application/x-www-form-urlencoded" http://example.com/post

This command sends the data in a format that mimics browser behavior, ensuring compatibility with most web servers.
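If the endpoint expects JSON instead of form data, the same -d and -H options apply; this sketch assumes a hypothetical JSON endpoint on the same host:

curl -d '{"name": "JohnDoe", "age": 25}' -H "Content-Type: application/json" http://example.com/post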

Handling multiple HTTP methods in a single command is possible with cURL using the --next option.

This option allows you to specify different methods for multiple requests within a single command. For example:

curl -X GET http://example.com --next -X POST -d "name=JohnDoe" http://example.com/post

This command includes:

  • -X GET http://example.com: a GET request to http://example.com

  • --next: separates the two requests

  • -X POST -d "name=JohnDoe" http://example.com/post: a POST request to http://example.com/post with the data name=JohnDoe

Redirects are a common aspect of web interactions, and cURL handles them gracefully with the --location or -L option. This option ensures that your requests follow redirects, reaching the final destination without manual intervention.

Additionally, files can be uploaded with the PUT method, provided the server is set up to receive the uploaded stream.
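A minimal sketch of such a PUT upload, assuming the server exposes an upload path that accepts the stream (the filename and path are illustrative); with an HTTP URL, -T issues a PUT request:

curl -T report.txt http://example.com/upload/report.txt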

File Transfers using cURL

An illustration of file transfers using cURL.

cURL excels in file transfers, whether you’re downloading files from the internet or uploading them to a server. It supports multiple protocols, including:

  • FTP

  • HTTP

  • HTTPS

  • SFTP

  • SCP

This gives you the flexibility to work with various systems. To download a file and keep its remote name, pass the URL with the -O option:

curl -O http://example.com/file.txt

The same command works with FTP URLs (for example, ftp://example.com/file.txt).

If you want to save the download under a custom filename, use the -o option:

curl -o myfile.txt http://example.com/file.txt

Uploading files is just as straightforward. Use the -T option to upload a file to an FTP server, along with the necessary authentication credentials:

curl -T uploadfile.txt -u username:password ftp://ftp.example.com/upload/

For resuming incomplete downloads, the -C - option is a lifesaver, provided the server supports range requests. This option allows you to pick up where you left off, saving time and bandwidth. Additionally, the -L option ensures that file downloads are directed correctly when URLs change due to redirects.
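For example, to resume a partially downloaded file (reusing the file from the earlier example):

curl -C - -O http://example.com/file.txt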

For multiple files, specify them in a single cURL command to transfer sequentially in the specified order.
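For instance (the filenames are illustrative):

curl -O http://example.com/file1.txt -O http://example.com/file2.txt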

Handling Authentication with cURL

Authentication is a critical aspect of secure file transfers and data retrieval. cURL supports multiple authentication methods, including:

  • Basic authentication

  • Digest

  • Negotiate

  • NTLM

The default method is Basic, which sends credentials base64-encoded, so it is not secure over plain HTTP connections. To include credentials in a cURL command, use the -u option:

curl -u username:password http://example.com

For secure data transfers, keep in mind the following:

  • Always use HTTPS to encrypt the credentials and data.

  • When a server requires authentication, it typically responds with a 401 status code and a WWW-Authenticate header detailing the supported methods.

  • cURL can automatically switch to the appropriate authentication method with the --anyauth option, ensuring the most secure supported method is used (see the example below).
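For example (the protected path is hypothetical):

curl --anyauth -u username:password https://example.com/protected/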

Advanced cURL Options

cURL offers a range of advanced options to fine-tune your data transfers. For instance, you can limit the transfer rate with the --limit-rate option, specifying the value in bytes per second (suffixes such as K, M, and G are also accepted). This is useful for managing bandwidth usage:

curl --limit-rate 1000K http://example.com

Customizing HTTP headers is another powerful feature. Use the -H or --header option to set any HTTP header, including the User-Agent. This is particularly important for mimicking browser behavior or preventing detection by anti-bot systems:

curl -H "User-Agent: Mozilla/5.0" http://example.com

Disabling the progress meter without affecting other output messages is possible with the --no-progress-meter option. For output that is silent except for error messages, combine the -s option with -S. These advanced options let you tailor your cURL commands to fit various needs and scenarios.
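As a quick illustration (the output filename is just an example), the following downloads a page silently but still reports any errors:

curl -sS -o page.html http://example.com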

Using libcurl in C Programs

libcurl is the library that powers the features of the curl command, making it a valuable tool for developers integrating cURL functionalities into C programs. Various programming languages have ported the libcurl library, extending its capabilities beyond the command line.

For developers, the --libcurl option is a treasure trove, outputting libcurl-based C source code for the same operation as the command.

Setting Up libcurl

Setting up libcurl is straightforward. When you pass the --libcurl option with a filename, cURL writes the equivalent C source code into that file (for example, code.c), while the fetched HTML response can be saved to a separate file (for example, log.html). This makes it easy for developers to see how cURL operations translate into C code, providing a solid foundation for integrating cURL functionalities into their applications.

Once set up, linking your C program with libcurl lets you perform cURL operations programmatically, enhancing your application’s capabilities.
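For example, the following command (using the filenames mentioned above) writes the generated C source to code.c and the fetched page to log.html:

curl --libcurl code.c -o log.html http://example.com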

Making HTTP Requests with libcurl

Making HTTP requests with libcurl involves the following steps:

  1. Initialize a CURL handle with curl_easy_init().

  2. Set the URL with curl_easy_setopt() and the CURLOPT_URL option.

  3. Perform the transfer with curl_easy_perform().

  4. Clean up the handle with curl_easy_cleanup().


Here is the basic structure:

#include <curl/curl.h>

int main(void) {
    CURL *curl;
    CURLcode res;

    curl = curl_easy_init();                         /* create an easy handle */
    if(curl) {
        curl_easy_setopt(curl, CURLOPT_URL, "http://example.com");
        res = curl_easy_perform(curl);               /* run the transfer */
        curl_easy_cleanup(curl);                     /* free the handle */
    }
    return 0;
}

Linking your C program with libcurl is made easier with the curl-config tool, which provides the necessary compiler flags.
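Assuming your source file is named fetch.c (a hypothetical name), a typical build line looks like this:

gcc fetch.c $(curl-config --cflags --libs) -o fetch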

Handling Responses and Errors

Handling responses and errors is crucial for robust and resilient applications using libcurl. You can check HTTP response codes to determine the outcome of your requests, facilitating the diagnosis of issues. libcurl provides comprehensive error codes through the curl_easy_perform function, allowing developers to identify issues effectively.

Managing timeout settings prevents your application from hanging indefinitely, enhancing the user experience. Enabling verbose output during development helps trace and debug issues by providing detailed information about the process.
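The sketch below pulls these ideas together: it sets a timeout, enables verbose output, checks the result of curl_easy_perform, and reads the HTTP response code. The URL and the 10-second timeout are illustrative choices, not requirements.

#include <stdio.h>
#include <curl/curl.h>

int main(void) {
    CURL *curl;
    CURLcode res;
    long http_code = 0;

    curl = curl_easy_init();
    if(curl) {
        curl_easy_setopt(curl, CURLOPT_URL, "http://example.com");
        curl_easy_setopt(curl, CURLOPT_TIMEOUT, 10L);   /* fail rather than hang: 10-second cap */
        curl_easy_setopt(curl, CURLOPT_VERBOSE, 1L);    /* print connection details for debugging */

        res = curl_easy_perform(curl);
        if(res != CURLE_OK) {
            fprintf(stderr, "transfer failed: %s\n", curl_easy_strerror(res));
        } else {
            curl_easy_getinfo(curl, CURLINFO_RESPONSE_CODE, &http_code);
            printf("HTTP status: %ld\n", http_code);
        }
        curl_easy_cleanup(curl);
    }
    return 0;
}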

Web Scraping with cURL and C

Web scraping is an advanced and interesting application of cURL and C. Using cURL in C enables the automation of data extraction from web pages, turning repetitive tasks into efficient scripts. For instance, you can use cURL to fetch HTML content, parse it, and extract the desired information, useful for gathering data from websites for analysis or monitoring.
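As a rough sketch of that workflow, the following program collects the fetched HTML into a memory buffer using a write callback; the buffer type and names are illustrative, and the parsing step is left as a comment.

#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <curl/curl.h>

struct buffer { char *data; size_t len; };

/* libcurl calls this for each chunk of the response body */
static size_t collect(char *chunk, size_t size, size_t nmemb, void *userp) {
    size_t total = size * nmemb;
    struct buffer *buf = userp;
    char *grown = realloc(buf->data, buf->len + total + 1);
    if(!grown) return 0;                     /* returning a short count signals an error */
    buf->data = grown;
    memcpy(buf->data + buf->len, chunk, total);
    buf->len += total;
    buf->data[buf->len] = '\0';
    return total;
}

int main(void) {
    struct buffer buf = {0};
    CURL *curl = curl_easy_init();
    if(curl) {
        curl_easy_setopt(curl, CURLOPT_URL, "http://example.com");
        curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, collect);
        curl_easy_setopt(curl, CURLOPT_WRITEDATA, &buf);
        if(curl_easy_perform(curl) == CURLE_OK)
            printf("fetched %zu bytes of HTML\n", buf.len);  /* parse/extract from buf.data here */
        curl_easy_cleanup(curl);
    }
    free(buf.data);
    return 0;
}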

However, web scraping comes with its own set of challenges:

  • Dynamic websites that rely heavily on JavaScript may require additional libraries to render and interact with the content properly.

  • Advanced web scraping often involves dealing with rate limiting.

  • Geo-blocking can complicate the scraping process.

  • Fingerprinting is another challenge that can make scraping more difficult.

Despite these challenges, mastering cURL and C for web scraping can open new avenues for data collection and automation.

Best Practices for Using cURL

A visual guide on best practices for using cURL commands.

Using cURL effectively requires following best practices to ensure security and efficiency. Avoid passing sensitive information like passwords through command line options, as this can expose them to other users on the system. Instead, use secure methods to handle credentials, such as environment variables or secure authentication protocols.

Monitoring the amount of data received during a transfer is essential to prevent denial-of-service-style resource exhaustion. To safeguard your systems from malicious activities:

  • Set appropriate limits on data transfers.

  • Sanitize user inputs.

  • Avoid using .netrc files for storing credentials, as they pose a security risk due to clear-text storage.

Handling redirects carefully is also important, as they can lead to unintended file access or data exposure. Note that when you redirect cURL output to a file, the progress meter still appears on the terminal; add the -s option (optionally with -S to keep error messages) to suppress it.

For debugging and development, the --verbose option is invaluable, as it allows you to see the commands cURL sends to the server along with other connection details. Setting a maximum time limit for cURL operations using the --max-time option ensures that long requests do not hang indefinitely, enhancing the overall user experience.
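Putting several of these practices together, a script might read credentials from environment variables (API_USER and API_PASS are hypothetical names) and cap the request time:

curl -sS --max-time 30 -u "$API_USER:$API_PASS" https://example.com/api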

Quick-Reference Table of cURL Options and Examples

To make your cURL experience even smoother, here’s a table summarizing the most common cURL options along with practical examples. It is designed as a quick reference, helping you find the right option for your specific needs without sifting through extensive documentation.

Option | Description | Example
------ | ----------- | -------
-o | Write output to a specified file | curl -o output.txt http://example.com
-O | Save the file under its remote name | curl -O http://example.com/file.txt
-u | User authentication | curl -u username:password http://example.com
-d | Send POST data | curl -d "name=JohnDoe" http://example.com/post
-H | Set a custom HTTP header | curl -H "User-Agent: Mozilla/5.0" http://example.com
-s | Silent mode (no progress meter or error messages) | curl -s http://example.com
-L | Follow redirects | curl -L http://example.com
--limit-rate | Limit the data transfer rate | curl --limit-rate 1000K http://example.com
--max-time | Maximum time allowed for the operation (seconds) | curl --max-time 30 http://example.com
-K | Read arguments from a config file | curl -K config.txt http://example.com
--libcurl | Write equivalent libcurl C code to a file | curl --libcurl code.c http://example.com

This comprehensive table will help you quickly find the right options and understand how to use them with practical examples, making your cURL commands more efficient and effective.

Summary

In summary, cURL is an incredibly powerful and versatile command-line tool that can handle a wide range of tasks, from fetching data and performing HTTP requests to transferring files and handling authentication. Its lightweight nature and support for multiple protocols make it an indispensable tool for developers, system administrators, and Linux enthusiasts alike. By mastering cURL, you can streamline your workflows, automate repetitive tasks, and enhance your productivity.


FAQ

How do I add headers and authentication to cURL requests?

Use the -H option to set custom headers and the -u option to pass credentials, for example: curl -H "User-Agent: Mozilla/5.0" -u username:password https://example.com. Always prefer HTTPS so credentials are encrypted in transit.

How can I limit the data transfer rate with cURL?

Use the --limit-rate option, for example curl --limit-rate 1000K http://example.com, to cap bandwidth usage.

Can cURL handle file uploads?

Yes. Use the -T option to upload a file over FTP or HTTP PUT (the server must be set up to accept the stream), or send form-style data with the -d option.

How do I resume an incomplete download with cURL?

Use the -C - option, provided the server supports range requests: curl -C - -O http://example.com/file.txt.

What are some best practices for using cURL?

Avoid passing credentials on the command line or storing them in .netrc files, use HTTPS for sensitive transfers, handle redirects carefully, set a --max-time limit so requests cannot hang indefinitely, and use --verbose when debugging.