Daily Database Backup to GitHub (CSV Format) - n8n Workflow

Automate daily PostgreSQL backups using this robust n8n workflow. Convert tables to CSV format and upload or update them in a GitHub repository to ensure data integrity and version control.

Ready to automate?

Download this n8n workflow template and start using it instantly.

Who is this best for?


  • DevOps engineers needing automated database archival.

  • Developers requiring daily version-controlled backups of production data.

  • Users looking for advanced examples of conditional logic within an n8n workflow.

  • Data administrators seeking reliable, automated data export solutions using n8n templates.

Overview

Maintaining reliable database backups is crucial for any application. This n8n workflow provides a complete solution for automated PostgreSQL data extraction, conversion, and archival, using GitHub for storage and version control. A periodic n8n trigger runs the process every 24 hours: data from the specified database tables is queried, transformed into CSV, and committed to a repository. For each table, the workflow determines whether it is new (requiring an upload) or already backed up (requiring an update), which keeps commits efficient and prevents unnecessary file duplication. Built entirely within n8n, this configuration is a solid addition to your operational toolkit.

How it Works

This automation is initiated by the Daily Schedule n8n trigger, set to run every 24 hours.


  1. Preparation (GitHub Check): The n8n workflow first uses the GitHub n8n node to list all existing files in the target repository. These filenames are aggregated into a single list for rapid lookup.

  2. Table Listing (Postgres): Concurrently, the PostgreSQL n8n node connects to the database to retrieve a list of all table names from information_schema.tables in the public schema (see the query sketch after this list).

  3. Data Extraction Loop: The workflow enters a primary batch loop (Loop Over Items) to process each table name individually. For each table, a PostgreSQL n8n node executes a SELECT query to retrieve all data.

  4. Conversion: The retrieved JSON data for the table is then passed to the Convert to File n8n node, which transforms the data into a binary CSV file. The resulting file is dynamically named after the source table (e.g., users.csv).

  5. Conditional Upload/Update: A subsequent Split to single items n8n node prepares the CSV file for final action. An IF n8n node (Check if file exists in repository) compares the CSV file name against the list of aggregated GitHub filenames from Step 1.

  6. Action:
     * If True (File Exists): The workflow executes the Update file [GITHUB] n8n node, committing the new CSV binary data to the existing file path.
     * If False (New Table): The workflow executes the Upload file [GITHUB] n8n node, creating a new file in the repository.
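
For reference, the table-listing step above is conceptually the standard information_schema lookup shown below; the template's exact SQL may differ slightly.

```sql
-- List the table names in the public schema (step 2).
-- Add "AND table_type = 'BASE TABLE'" if you also want to exclude views.
SELECT table_name
FROM information_schema.tables
WHERE table_schema = 'public';
```

Each returned table_name then drives one iteration of the batch loop in step 3.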

Installation Guide

To deploy this powerful n8n workflow template, follow these steps:


  1. Import: Copy the provided JSON code and import it directly into your n8n instance via the 'Workflows' > 'New' > 'Import from JSON' option.

  2. PostgreSQL Credentials: Locate the PostgreSQL n8n nodes (List tables and List tables1) and configure credentials for your database. Ensure the credentials can read information_schema.tables for listing purposes and have SELECT permissions on the tables you intend to back up (see the example grants after this list).

  3. GitHub Credentials: Configure the OAuth2 credentials for the GitHub n8n node instances. This account must have read/write access to the target repository (github-repo in the template) to list, update, and upload files.

  4. Configuration: Update the GitHub n8n node parameters to reflect your specific GitHub 'Owner' and 'Repository' names.

  5. Activate: Save the workflow and switch the toggle from 'Inactive' to 'Active'. The Daily Schedule n8n trigger will now execute the backup every 24 hours.
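
If you prefer a dedicated, least-privilege database role for this n8n workflow (optional, and not part of the template itself), grants along these lines satisfy the permissions described in step 2. The role name n8n_backup is purely illustrative.

```sql
-- Read-only role for the backup workflow (role name is illustrative)
CREATE ROLE n8n_backup LOGIN PASSWORD 'choose-a-strong-password';

-- Let the role see and read every table in the public schema
GRANT USAGE ON SCHEMA public TO n8n_backup;
GRANT SELECT ON ALL TABLES IN SCHEMA public TO n8n_backup;

-- Also cover tables created in the future
ALTER DEFAULT PRIVILEGES IN SCHEMA public GRANT SELECT ON TABLES TO n8n_backup;
```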

Node Details

Daily Schedule (n8n trigger): The starting n8n node, responsible for running the entire backup automation once every 24 hours.
List files from repository [GITHUB] (GitHub n8n node): Lists the current contents of the target GitHub repository to determine which tables have existing backups.
Combine file names [GITHUB] (Item Lists n8n node): Aggregates the names of the existing files from GitHub, streamlining the check performed later by the IF node.
List tables1 (PostgreSQL n8n node): Queries the database's system schema (information_schema) to retrieve a list of all tables that need backing up.
List tables (PostgreSQL n8n node): Executed inside the loop, this node dynamically selects all data (operation: select, returnAll: true) from the table whose name is received from the previous n8n node (see the query sketch below).
Convert to File1 (Convert To File n8n node): Takes the raw PostgreSQL data (JSON format) and converts it into a binary CSV file, ensuring compatibility with version control systems. The file name parameter is set dynamically using the table name.
Check if file exists in repository (IF n8n node): The core control flow n8n node that uses the aggregated list of existing files to decide whether the current CSV file should be updated (True) or uploaded as new (False).
Update file [GITHUB] / Upload file [GITHUB] (GitHub n8n node): These nodes handle the final commitment of the CSV binary data to the repository, ensuring the daily backup is stored and versioned.
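
To make the loop concrete: on each iteration the List tables node effectively runs a full-table select against the current table. The table name below is only a stand-in; in the workflow it is injected dynamically from the looped item.

```sql
-- Full dump of the current table; "users" stands in for the table name
-- supplied dynamically by the loop (operation: select, returnAll: true)
SELECT * FROM users;
```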

Created by

AI and automation developer. I implement n8n and AI tools to automate marketing and sales for companies.
