Downloading files from URLs in Laravel
- Handling file downloads from URLs in Laravel
- Basic controller setup
- Directly streaming a remote URL to the browser
- Downloading a file from a URL and storing it on disk
- Streaming a URL directly into storage using HTTP sink
- Letting the user download the stored file
- Combining remote download and user download in one request
- Using queues or cron jobs to download from URLs
- Conclusion
Handling file downloads from URLs in Laravel
In this article you will see how to download files from external URLs in a Laravel application, both to save the file on your server and to send it to the user as a download. Everything happens inside normal Laravel code, for example in controllers or jobs. We will focus only on practical, copy-pasteable patterns you can adapt to your own project.
Basic controller setup
All the examples below assume you are calling a controller method from a route. For example:
// routes/web.php

use App\Http\Controllers\FileDownloadController;
use Illuminate\Support\Facades\Route;

Route::get('/download-remote', [FileDownloadController::class, 'downloadRemote']);
Route::get('/cache-remote', [FileDownloadController::class, 'cacheRemote']);
// app/Http/Controllers/FileDownloadController.php

namespace App\Http\Controllers;

use Illuminate\Http\Request;
use Illuminate\Support\Facades\Http;
use Illuminate\Support\Facades\Storage;

class FileDownloadController extends Controller
{
    // methods will go here
}
You do not have to structure your code exactly like this, but having a dedicated controller keeps the examples clear.
Directly streaming a remote URL to the browser
Sometimes you just want to take a remote file URL and let the user download it through your Laravel app, without storing it on disk. This is common when you proxy downloads through your app for authentication or logging.
You can do this with response()->streamDownload() and a plain PHP stream:
// app/Http/Controllers/FileDownloadController.php

public function downloadRemote(Request $request)
{
    $url = $request->query('url'); // example: https://example.com/file.pdf

    if (! $url) {
        abort(400, 'Missing url parameter');
    }

    $fileName = basename(parse_url($url, PHP_URL_PATH)) ?: 'downloaded-file';

    return response()->streamDownload(function () use ($url) {
        $read = fopen($url, 'r');

        if (! $read) {
            abort(502, 'Unable to open remote file');
        }

        while (! feof($read)) {
            echo fread($read, 1024 * 1024); // 1 MB chunks
        }

        fclose($read);
    }, $fileName);
}
What this does:
- Reads the url query parameter from the request.
- Uses streamDownload to stream the remote file directly to the browser.
- Uses fopen and fread in a loop so the file is not fully loaded into memory.
- Picks a filename from the URL path, or falls back to downloaded-file.
Edge cases to be aware of:
- allow_url_fopen must be enabled in PHP for fopen($url, 'r') to work.
- If the remote server is slow or unreachable, your request will also be slow or fail.
- You might want to add your own authorization logic before allowing the download.
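As a sketch of such a check, one option is a simple host allowlist applied before streaming. The host names below are placeholders, not part of the original example:

```php
// Hypothetical safeguard: only proxy downloads from known hosts.
$allowedHosts = ['example.com', 'cdn.example.com'];

$host = parse_url($url, PHP_URL_HOST);

if (! $host || ! in_array($host, $allowedHosts, true)) {
    abort(403, 'This host is not allowed');
}
```

Placing a check like this before the streamDownload call also helps guard against requests that point your server at internal services.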
Downloading a file from a URL and storing it on disk
Very often you want to fetch a remote file once, store it somewhere in your storage disk, then serve it from there. This is handy for caching remote assets, invoices, reports, and similar files.
A simple approach is to use the Laravel HTTP client and the Storage facade.
// app/Http/Controllers/FileDownloadController.php

public function cacheRemote(Request $request)
{
    $url = $request->query('url');

    if (! $url) {
        abort(400, 'Missing url parameter');
    }

    $response = Http::timeout(30)->get($url);

    if (! $response->successful()) {
        abort(502, 'Failed to download remote file');
    }

    $extension = pathinfo(parse_url($url, PHP_URL_PATH) ?? '', PATHINFO_EXTENSION);
    $fileName = 'remote_' . time() . ($extension ? ".{$extension}" : '');
    $path = "downloads/{$fileName}";

    Storage::disk('local')->put($path, $response->body());

    return response()->json([
        'stored_as' => $path,
        'disk' => 'local',
    ]);
}
What this does:
- Uses the HTTP client to GET the remote URL with a timeout.
- Checks successful() so you do not store error pages as files.
- Derives a simple extension from the URL path.
- Writes the raw response body to storage/app/downloads/... using the local disk.
- Returns a small JSON payload with where the file was stored.
For typical small and medium files this is fine. For very large files this approach reads the full file into memory once, so use the streaming approach in the next section if that matters.
Streaming a URL directly into storage using HTTP sink
For large files you usually do not want to hold the entire file in memory. Laravel's HTTP client is built on top of Guzzle, which supports a sink option that streams the response directly into a file.
// app/Http/Controllers/FileDownloadController.php

use GuzzleHttp\Psr7\Utils;

public function cacheRemoteStreamed(Request $request)
{
    $url = $request->query('url');

    if (! $url) {
        abort(400, 'Missing url parameter');
    }

    $extension = pathinfo(parse_url($url, PHP_URL_PATH) ?? '', PATHINFO_EXTENSION);
    $fileName = 'remote_' . time() . ($extension ? ".{$extension}" : '');
    $relativePath = "downloads/{$fileName}";

    // Make sure the target directory exists before opening a raw file handle.
    Storage::disk('local')->makeDirectory('downloads');

    $absolutePath = Storage::disk('local')->path($relativePath);
    $resource = Utils::tryFopen($absolutePath, 'w');

    $response = Http::withOptions([
        'sink' => $resource,
        'timeout' => 60,
    ])->get($url);

    // The body went straight into the sink, but the status code is still
    // available; clean up so you do not keep an error page on disk.
    if (! $response->successful()) {
        Storage::disk('local')->delete($relativePath);
        abort(502, 'Failed to download remote file');
    }

    return response()->json([
        'stored_as' => $relativePath,
        'disk' => 'local',
    ]);
}
What this does:
- Creates a writable file handle inside the local disk root (by default storage/app/private/downloads/...).
- Passes that handle to the HTTP client via the sink option.
- Guzzle streams the response into the file as it arrives.
- Memory usage stays low even for large files.
If you prefer, you can pass a full file path string instead of a resource to sink, and Guzzle will open the file for you.
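For example, a minimal variant of the call above, assuming the downloads directory already exists on the local disk:

```php
// Guzzle opens (and closes) the target file itself when sink is a path string.
Http::withOptions([
    'sink' => Storage::disk('local')->path('downloads/large-file.zip'),
    'timeout' => 60,
])->get($url);
```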
Letting the user download the stored file
Once the file is stored on a disk, you usually want to let the user download it. If you stored the file using Storage on an internal disk (for example local), the simplest way is Storage::download().
// app/Http/Controllers/FileDownloadController.php

public function downloadCached(Request $request)
{
    $path = $request->query('path'); // for example downloads/remote_1700000000.pdf

    if (! $path) {
        abort(400, 'Missing path parameter');
    }

    if (! Storage::disk('local')->exists($path)) {
        abort(404, 'File not found');
    }

    return Storage::disk('local')->download($path);
}
What this does:
- Reads the path that was returned when you cached the file.
- Checks if the file exists on the local disk.
- Returns a download response that forces the browser to download the file.
You can also override the download filename and headers:
return Storage::disk('local')->download($path, 'report.pdf', [
    'Content-Type' => 'application/pdf',
]);
Combining remote download and user download in one request
Sometimes you want to fetch the file from a URL and immediately send it to the browser once, without keeping a copy. You can combine the HTTP client and streamDownload for this.
// app/Http/Controllers/FileDownloadController.php

public function proxyRemote(Request $request)
{
    $url = $request->query('url');

    if (! $url) {
        abort(400, 'Missing url parameter');
    }

    $fileName = basename(parse_url($url, PHP_URL_PATH)) ?: 'downloaded-file';

    $response = Http::timeout(60)->get($url);

    if (! $response->successful()) {
        abort(502, 'Failed to download remote file');
    }

    // header() takes only the header name; fall back manually if it is absent.
    $contentType = $response->header('Content-Type') ?: 'application/octet-stream';

    return response()->streamDownload(function () use ($response) {
        echo $response->body();
    }, $fileName, [
        'Content-Type' => $contentType,
    ]);
}
What this does:
- Fetches the remote URL using the HTTP client.
- Validates that the response was successful.
- Uses the remote Content-Type header when sending the file to the browser.
- Wraps the body in a streamDownload so the browser gets a proper attachment download.
This approach is fine for small or moderate file sizes. For very large files you should stick with the streaming variant that does not hold the whole response body in memory.
Using queues or cron jobs to download from URLs
If downloading the remote file takes a long time, you usually do not want to block a normal web request. A common pattern is:
- The HTTP request dispatches a job that downloads and stores the remote file using one of the cache methods above.
- The job writes the path to the database.
- A separate controller action later exposes the file via Storage::download().
The exact queue and scheduling setup depends on your application, but you can reuse the same cacheRemote or cacheRemoteStreamed logic inside your job class.
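As a sketch of that pattern, a queued job might look like the following. The class name and the way the stored path is persisted are illustrative, not prescribed by the examples above:

```php
// app/Jobs/DownloadRemoteFile.php (illustrative)

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\Http;
use Illuminate\Support\Facades\Storage;

class DownloadRemoteFile implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public function __construct(public string $url)
    {
    }

    public function handle(): void
    {
        $response = Http::timeout(60)->get($this->url);

        if (! $response->successful()) {
            $this->fail(new \RuntimeException('Failed to download remote file'));

            return;
        }

        $path = 'downloads/remote_' . time();
        Storage::disk('local')->put($path, $response->body());

        // Persist $path (for example in a database row) so a controller
        // can later serve it via Storage::download().
    }
}
```

Dispatch it from a controller with DownloadRemoteFile::dispatch($url) and process it with a queue worker such as php artisan queue:work.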
Conclusion
You have seen how to download files from external URLs inside a Laravel application.
You can stream remote files directly to the browser, cache them to storage, or do both.
You also know how to handle small files with simple Http::get calls and large files with the HTTP sink option.
With these patterns you can cover most real-world "download from URL" requirements without extra packages.