
S3 n8n Integration: Automate S3 with n8n
Looking to automate S3 with n8n? Whether you're managing file storage on MinIO, DigitalOcean Spaces, or any other S3-compatible service, manually handling uploads, downloads, and bucket management quickly becomes a bottleneck as your operations scale.
The S3 n8n integration gives you access to 12 powerful actions to automate your object storage workflows without writing code. From uploading files automatically to organizing folders and managing entire buckets, you can connect S3 to hundreds of other applications and build sophisticated data pipelines.
In this guide, you'll discover exactly how to connect S3 to n8n, explore every available action in detail, and learn how to leverage this integration to streamline your file management processes.
Why automate S3 with n8n?
The S3 n8n integration opens up 12 distinct actions covering three core resource types: files, folders, and buckets. This means you can automate virtually any operation you'd typically perform manually through your S3 dashboard or command line.
Significant time savings come from eliminating repetitive tasks. Instead of manually uploading processed files, downloading reports, or reorganizing folder structures, you set up workflows once and let them run 24/7. For teams handling hundreds or thousands of files daily, this translates to hours reclaimed every week. And because workflows run unattended, they execute reliably without oversight—whether that's backing up data at 3 AM or instantly copying files between buckets when a trigger fires elsewhere in your workflow.
Concrete business examples include: automatically uploading invoice PDFs generated by your billing system, syncing media files between your CMS and S3 storage, bulk-downloading analytics exports for processing, creating organized folder structures when new projects launch, and cleaning up temporary files on a schedule. The integration works with any S3-compatible service (MinIO, DigitalOcean Spaces, Wasabi, etc.), giving you flexibility across providers.
How to connect S3 to n8n?
- 01
Add the node
Search for "S3" in n8n's node panel, add it to your workflow, and select or create your S3 credentials.
💡 TIP: This S3 node is specifically designed for S3-compatible services like MinIO or DigitalOcean Spaces. If you're using AWS S3 directly, n8n provides a dedicated AWS S3 node that handles AWS-specific authentication and regions more seamlessly.
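Under the hood, the S3 node issues standard S3 API calls against your provider's endpoint. As a point of reference, here is a minimal TypeScript sketch using the AWS SDK v3 (`@aws-sdk/client-s3`), which works with any S3-compatible endpoint; the endpoint URL and environment variable names are hypothetical placeholders. The per-action sketches later in this guide reuse this `s3` client.

```typescript
import { S3Client } from "@aws-sdk/client-s3";

// Hypothetical endpoint; point this at your MinIO, DigitalOcean Spaces,
// or other S3-compatible service.
const s3 = new S3Client({
  region: "us-east-1",                          // many compatible services accept any region
  endpoint: "https://s3.example-provider.com",
  forcePathStyle: true,                         // required by MinIO and most compatible services
  credentials: {
    accessKeyId: process.env.S3_ACCESS_KEY_ID!,
    secretAccessKey: process.env.S3_SECRET_ACCESS_KEY!,
  },
});

export { s3 };
```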
S3 actions available in n8n
Action 01: Get Many Folders
This action retrieves a list of folders from a specified S3 bucket, making it invaluable when you need to programmatically discover your storage structure or iterate through directories in subsequent workflow steps.
Key parameters: Credential to connect with (required), Resource set to "Folder" (required), Operation set to "Get Many" (required), Bucket Name (required), Return All toggle (optional), and Limit defaulting to 100 (optional).
Use this action when building workflows that need to audit folder structures, generate reports on storage organization, or dynamically process content across multiple directories. It's particularly useful for backup verification workflows or content management systems that need to sync folder hierarchies.
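For reference, here is a minimal sketch of the equivalent API call, reusing the `s3` client from the setup sketch above (bucket name hypothetical). S3 has no true folders, so "folders" are the common key prefixes you get when grouping keys by `/`:

```typescript
import { ListObjectsV2Command } from "@aws-sdk/client-s3";

const res = await s3.send(new ListObjectsV2Command({
  Bucket: "my-bucket",
  Delimiter: "/",   // group keys by "/" so shared prefixes behave like folders
}));
const folders = (res.CommonPrefixes ?? []).map((p) => p.Prefix);
```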

Action 02: Delete Folder
When you need to clean up your S3 storage programmatically, the Delete Folder action removes entire directory structures from your buckets. This is essential for maintenance workflows, temporary file cleanup, or project archival processes.
Key parameters: Credential to connect with (required), Resource set to "Folder" (required), Operation set to "Delete" (required), Bucket Name (required), and Folder Key providing the full path of the folder to delete (required).
Ideal for automated cleanup routines—for example, deleting temporary processing folders after jobs complete, or removing outdated backup directories based on retention policies.
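Because S3 folders are just key prefixes, deleting a folder amounts to deleting every object under that prefix. A hedged sketch of that pattern, reusing the `s3` client from above (bucket and prefix hypothetical; this handles only the first page of up to 1,000 keys):

```typescript
import { ListObjectsV2Command, DeleteObjectsCommand } from "@aws-sdk/client-s3";

const prefix = "tmp/run-42/";   // hypothetical Folder Key
const listed = await s3.send(
  new ListObjectsV2Command({ Bucket: "my-bucket", Prefix: prefix })
);

if (listed.Contents?.length) {
  // Batch-delete every object under the prefix (paginate for >1,000 keys).
  await s3.send(new DeleteObjectsCommand({
    Bucket: "my-bucket",
    Delete: { Objects: listed.Contents.map((o) => ({ Key: o.Key! })) },
  }));
}
```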

Action 03: Create Folder
Programmatically create new folder structures in your S3 buckets. This action is fundamental for workflows that need to organize files dynamically, such as creating date-based directories or project-specific storage locations.
Key parameters: Credential to connect with (required), Resource set to "Folder" (required), Operation set to "Create" (required), Bucket Name (required), Folder Name (required), and Additional Fields for optional configurations.
Perfect for onboarding workflows that provision storage for new clients, or content pipelines that need organized directory structures before file uploads begin.
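By convention, creating an S3 "folder" means writing a zero-byte object whose key ends in `/`. A minimal sketch with a hypothetical bucket and folder name, reusing the `s3` client from the setup sketch:

```typescript
import { PutObjectCommand } from "@aws-sdk/client-s3";

// A zero-byte object with a trailing "/" is the standard folder convention.
await s3.send(new PutObjectCommand({
  Bucket: "my-bucket",
  Key: "projects/acme-2024/",
  Body: "",
}));
```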

Action 04: Upload a File
The Upload a File action is likely your most-used S3 operation—it takes binary data from your workflow and stores it in your S3 bucket. Whether you're saving processed images, generated reports, or backup files, this action handles it all.
Key parameters: Credential to connect with (required), Resource set to "File" (required), Operation set to "Upload" (required), Bucket Name (required), File Name with full path (required), Binary File toggle (optional), Input Binary Field defaulting to "data" (required when binary is enabled), Additional Fields and Tags for metadata (optional).
Use cases include: automatically uploading invoice PDFs after generation, saving processed images from an image manipulation workflow, or archiving webhook payloads for audit purposes.
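The equivalent API call is a `PutObject` with the file's bytes as the body—roughly what the node does with the binary data you feed its Input Binary Field. A sketch with hypothetical names, reusing the `s3` client from above:

```typescript
import { PutObjectCommand } from "@aws-sdk/client-s3";
import { readFile } from "node:fs/promises";

const body = await readFile("./invoice-1042.pdf");   // stand-in for workflow binary data
await s3.send(new PutObjectCommand({
  Bucket: "invoices",
  Key: "2024/05/invoice-1042.pdf",   // File Name with full path
  Body: body,
  ContentType: "application/pdf",
}));
```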

Action 05: Get Many Files
Retrieve lists of files from your S3 buckets for processing, reporting, or synchronization workflows. This action returns metadata about multiple files, which you can then iterate through for bulk operations.
Key parameters: Credential to connect with (required), Resource set to "File" (required), Operation set to "Get Many" (required), Bucket Name (required), Return All toggle (optional), Limit defaulting to 100 (optional), and Options for filtering like prefix or delimiter (optional).
Essential for workflows that need to process all files in a bucket, generate file inventories, or sync S3 contents with other systems like Dropbox or Google Sheets.
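A sketch of the underlying listing call, including the pagination loop that a "Return All" toggle implies (bucket and prefix hypothetical; `s3` client from the setup sketch):

```typescript
import { ListObjectsV2Command } from "@aws-sdk/client-s3";

const keys: string[] = [];
let token: string | undefined;
do {
  const page = await s3.send(new ListObjectsV2Command({
    Bucket: "reports",
    Prefix: "exports/",            // the optional prefix filter
    ContinuationToken: token,
  }));
  for (const obj of page.Contents ?? []) keys.push(obj.Key!);
  token = page.NextContinuationToken;   // keep paging until exhausted ("Return All")
} while (token);
```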

Action 06: Download
The Download action retrieves actual file content from S3 and makes it available as binary data in your workflow. This is crucial for any automation that needs to process, transform, or transfer files stored in S3.
Key parameters: Credential to connect with (required), Resource set to "File" (required), Operation set to "Download" (required), Bucket Name (required), File Key with full path (required), and Put Output File in Field defaulting to "data" (optional).
Use this when building ETL pipelines that process S3-stored data, email workflows that attach S3 files via Gmail, or integration flows that move files between S3 and other services.
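Downloading maps to a `GetObject` call whose streamed body becomes the workflow's binary field. A minimal sketch with a hypothetical bucket and key, reusing the `s3` client from above:

```typescript
import { GetObjectCommand } from "@aws-sdk/client-s3";

const res = await s3.send(new GetObjectCommand({
  Bucket: "reports",
  Key: "exports/latest.csv",
}));
// Collect the streamed body into bytes (SDK v3 helper).
const bytes = await res.Body!.transformToByteArray();
```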

Action 07: Delete File
Remove specific files from your S3 storage programmatically. The Delete File action is essential for cleanup routines, file rotation policies, and workflows that manage temporary storage.
Key parameters: Credential to connect with (required), Resource set to "File" (required), Operation set to "Delete" (required), Bucket Name (required), File Key with exact path (required), and Options for additional parameters (optional).
Commonly used in: log rotation workflows that delete old files, temporary file cleanup after processing completes, or content management systems that need to remove deprecated assets.
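The simplest call in the set—a single `DeleteObject` (hypothetical bucket and key; `s3` client from the setup sketch):

```typescript
import { DeleteObjectCommand } from "@aws-sdk/client-s3";

await s3.send(new DeleteObjectCommand({
  Bucket: "logs",
  Key: "app/2024-05-01.log",
}));
```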

Action 08: Copy
Duplicate files within or between S3 buckets without downloading and re-uploading. The Copy action is efficient for backup creation, file organization, and multi-region replication scenarios.
Key parameters: Credential to connect with (required), Resource set to "File" (required), Operation set to "Copy" (required), Source Path including bucket (required), Destination Path with new location (required), and Additional Fields for extra options (optional).
Perfect for creating backup copies before processing, organizing files into new folder structures, or duplicating assets across different bucket configurations.
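Server-side copy maps to `CopyObject`, where the source is written as `bucket/key` and the destination is the target bucket and key—no bytes pass through your workflow. A sketch with hypothetical paths, reusing the `s3` client from above:

```typescript
import { CopyObjectCommand } from "@aws-sdk/client-s3";

await s3.send(new CopyObjectCommand({
  CopySource: "reports/exports/report.csv",   // "source-bucket/source-key"
  Bucket: "backups",                          // destination bucket
  Key: "daily/report.csv",                    // destination key
}));
```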

Action 09: Search
Search within S3 buckets to find specific objects based on criteria. This action helps you locate files dynamically rather than hardcoding paths, making your workflows more flexible and resilient.
Key parameters: Credential to connect with (required), Resource set to "Bucket" (required), Operation set to "Search" (required), Bucket Name (required), Return All toggle (optional), Limit defaulting to 100 (optional), and Additional Fields for search filters (optional).
Useful when building workflows that need to find files matching certain patterns, locate the most recent upload, or discover files for batch processing.
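Note that the S3 API itself has no content search; matching happens over object keys. A hedged sketch that lists under a prefix and filters keys client-side (hypothetical bucket and pattern; `s3` client from the setup sketch):

```typescript
import { ListObjectsV2Command } from "@aws-sdk/client-s3";

const res = await s3.send(new ListObjectsV2Command({
  Bucket: "media",
  Prefix: "uploads/",
}));
// Client-side filter over object keys, e.g. all PNG uploads.
const matches = (res.Contents ?? []).filter((o) => o.Key!.endsWith(".png"));
```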

Action 10: Get Many Buckets
List all buckets accessible with your credentials. This action is useful for administrative workflows, storage auditing, and multi-tenant applications that need to discover available storage locations.
Key parameters: Credential to connect with (required), Resource set to "Bucket" (required), Operation set to "Get Many" (required), Return All toggle (optional), and Limit defaulting to 100 (optional).
Ideal for monitoring dashboards that track storage usage, provisioning workflows that verify bucket existence, or administrative tools that manage multi-bucket environments.
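A minimal sketch of the underlying `ListBuckets` call, reusing the `s3` client from the setup sketch:

```typescript
import { ListBucketsCommand } from "@aws-sdk/client-s3";

const res = await s3.send(new ListBucketsCommand({}));
const names = (res.Buckets ?? []).map((b) => b.Name);
```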

Action 11: Delete Bucket
Remove entire buckets from your S3 storage. This is a powerful administrative action typically used for cleanup, decommissioning, or automated environment management.
Key parameters: Credential to connect with (required), Resource set to "Bucket" (required), Operation set to "Delete" (required), and Name with exact bucket name (required).
Note: Buckets typically must be empty before deletion. Use this in controlled administrative workflows or teardown processes for temporary environments.
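The equivalent `DeleteBucket` call, with a hypothetical bucket name (`s3` client from the setup sketch):

```typescript
import { DeleteBucketCommand } from "@aws-sdk/client-s3";

// Fails unless the bucket is empty; delete its objects first.
await s3.send(new DeleteBucketCommand({ Bucket: "tmp-staging-env" }));
```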

Action 12: Create Bucket
Programmatically provision new S3 buckets. This action is essential for workflows that need to create isolated storage for new clients, projects, or environments.
Key parameters: Credential to connect with (required), Resource set to "Bucket" (required), Operation set to "Create" (required), Name for the unique bucket name (required), and Additional Fields for settings like region or ACLs (optional).
Perfect for onboarding automations that provision storage for new accounts, or infrastructure workflows that set up complete environments including storage. Consider pairing with Supabase for database provisioning.
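And the matching `CreateBucket` call (hypothetical bucket name—names must be unique within your provider; `s3` client from the setup sketch):

```typescript
import { CreateBucketCommand } from "@aws-sdk/client-s3";

await s3.send(new CreateBucketCommand({ Bucket: "client-acme-assets" }));
```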

Build your first workflow with our team
Drop your email and we'll send you the catalog of automations you can ship today.
- Free n8n & Make scenarios to import
- Step-by-step setup docs
- Live cohort + community support
Frequently asked questions
Is the S3 n8n integration free to use?
Yes, the S3 node is included in n8n's core nodes and available on all plans, including the free self-hosted version. There are no additional costs from n8n to use the S3 integration. However, your S3-compatible storage provider (MinIO, DigitalOcean Spaces, Wasabi, etc.) will charge based on their own pricing—typically for storage volume, API requests, and data transfer. The n8n side of the equation adds no extra expense.
What's the difference between the S3 node and the AWS S3 node in n8n?
The generic S3 node is designed for S3-compatible services that follow the S3 API standard, such as MinIO, DigitalOcean Spaces, Backblaze B2, and Wasabi. The dedicated AWS S3 node is optimized specifically for Amazon Web Services S3, handling AWS-specific authentication (IAM roles, AWS regions) more seamlessly. If you're using AWS S3, choose the AWS S3 node; for any other S3-compatible service, use the generic S3 node covered in this guide.
Can I automate file uploads from other apps to S3 using n8n?
Absolutely. That's one of the most common use cases. You can trigger workflows from virtually any app that n8n supports—webhooks, email attachments, form submissions, CRM updates, database changes—and use the S3 Upload action to store files. For example, you could automatically save email attachments to S3, upload images processed by an AI node, or archive form submissions as JSON files. The key is ensuring your workflow passes binary data to the S3 node's "Input Binary Field" parameter. Learn more about building these automations in our AI agent tutorial or explore our downloadable automation catalog.



