Automate daily PostgreSQL backups using this robust n8n workflow. Convert tables to CSV format and upload or update them in a GitHub repository to ensure data integrity and version control.
Download this n8n workflow template and start using it instantly.
Maintaining reliable database backups is crucial for any application. This n8n workflow provides a complete solution for automated PostgreSQL data extraction, conversion, and archival, using GitHub for storage and version control. A scheduled n8n trigger runs every 24 hours: data from the specified database tables is queried, transformed into the universal CSV format, and conditionally committed to a repository. The workflow determines whether a table's backup is new (requiring an upload) or already exists (requiring an update), avoiding unnecessary file duplication. This n8n node configuration is an excellent addition to your operational toolkit.
This automation is initiated by the Daily Schedule n8n trigger, set to run every 24 hours.
The workflow then proceeds as follows:
1. A PostgreSQL n8n node queries information_schema.tables for every table in the public schema.
2. A loop (Loop Over Items) processes each table name individually. For each table, a PostgreSQL n8n node executes a SELECT query to retrieve all data.
3. The results pass to the Convert to File n8n node, which transforms the data into a binary CSV file. The resulting file is dynamically named after the source table (e.g., users.csv).
4. The Split to single items n8n node prepares the CSV file for the final action. An IF n8n node (Check if file exists in repository) compares the CSV file name against the list of aggregated GitHub filenames from Step 1.
5. If the file already exists, the workflow routes to the Update file [GITHUB] n8n node, committing the new CSV binary data to the existing file path.
6. If it does not exist, the workflow routes to the Upload file [GITHUB] n8n node, creating a new file in the repository.

To deploy this n8n workflow template, follow these steps:
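Conceptually, the two PostgreSQL queries in steps 1 and 2 look like the sketch below. This is an illustration only; the actual n8n PostgreSQL node assembles its SQL from node parameters, and the helper names here are assumptions, not part of the template.

```python
# Step 1: list every user table in the public schema.
# (Filtering on table_type avoids picking up views; this filter is an assumption.)
LIST_TABLES_SQL = (
    "SELECT table_name FROM information_schema.tables "
    "WHERE table_schema = 'public' AND table_type = 'BASE TABLE';"
)

def select_all_sql(table_name: str) -> str:
    """Build the per-table dump query (step 2). Quoting the identifier
    guards against mixed-case or reserved-word table names."""
    safe = table_name.replace('"', '""')  # escape embedded double quotes
    return f'SELECT * FROM "{safe}";'

def backup_filename(table_name: str) -> str:
    """The CSV file is named after the source table, e.g. users.csv."""
    return f"{table_name}.csv"

print(select_all_sql("users"))    # SELECT * FROM "users";
print(backup_filename("users"))   # users.csv
```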
1. Configure the required PostgreSQL credentials for the two database nodes (List tables and List tables1). Ensure the credentials allow access to information_schema.tables for listing purposes and SELECT permissions on the tables you intend to back up.
2. Configure GitHub credentials with access to your target repository (github-repo in the template) to list, update, and upload files.
3. Activate the workflow. The Daily Schedule n8n trigger will now execute the backup every 24 hours.

Daily Schedule (n8n trigger): The starting n8n node, responsible for running the entire backup automation once every 24 hours.
List files from repository [GITHUB] (GitHub n8n node): Lists the current contents of the target GitHub repository to determine which tables have existing backups.
Combine file names [GITHUB] (Item Lists n8n node): Aggregates the names of the existing files from GitHub, streamlining the check performed later by the IF node.
List tables1 (PostgreSQL n8n node): Queries the database's system schema (information_schema) to retrieve a list of all tables that need backing up.
List tables (PostgreSQL n8n node): Executed inside the loop, this dynamically queries all data (operation: select, returnAll: true) from a specific table name received from the previous n8n node.
Convert to File1 (Convert To File n8n node): Takes the raw PostgreSQL data (JSON format) and converts it into a binary CSV file, ensuring compatibility with version control systems. The file name parameter is set dynamically using the table name.
Check if file exists in repository (IF n8n node): The core control flow n8n node that uses the aggregated list of existing files to decide whether the current CSV file should be updated (True) or uploaded as new (False).
Update file [GITHUB] / Upload file [GITHUB] (GitHub n8n node): These nodes handle the final commitment of the CSV binary data to the repository, ensuring the daily backup is stored and versioned.
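To make the convert-then-commit steps concrete, here is a minimal stdlib-only sketch of the same logic: rows are serialized to CSV bytes (as Convert to File does), and the filename is checked against the repository listing to choose between an update and a fresh upload. The GitHub Contents API (PUT /repos/{owner}/{repo}/contents/{path}) requires the existing file's sha for updates and omits it for new files. Function names, the existing_files set, and the payload shape are illustrative assumptions, not the n8n nodes' internals.

```python
import base64
import csv
import io

def rows_to_csv_bytes(rows: list[dict]) -> bytes:
    """Mirror of the Convert to File node: JSON rows -> binary CSV."""
    if not rows:
        return b""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(rows[0].keys()))
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue().encode("utf-8")

def build_commit(table: str, rows: list[dict], existing_files: set[str],
                 sha_by_file: dict[str, str]) -> dict:
    """Mirror of the IF node plus the two GitHub nodes: decide update vs
    upload and build a Contents-API-style payload."""
    filename = f"{table}.csv"
    payload = {
        "path": filename,
        "message": f"Daily backup of {table}",  # commit message is an assumption
        "content": base64.b64encode(rows_to_csv_bytes(rows)).decode("ascii"),
    }
    if filename in existing_files:              # True branch: Update file [GITHUB]
        payload["sha"] = sha_by_file[filename]  # updates must reference the old blob
    # False branch (Upload file [GITHUB]) omits "sha": the API creates the file.
    return payload

commit = build_commit(
    "users",
    [{"id": 1, "name": "Ada"}],
    existing_files={"users.csv"},
    sha_by_file={"users.csv": "abc123"},
)
```

Sending the payload (with an Authorization header and JSON body) is left out here, since the workflow's GitHub nodes handle authentication via the configured credentials.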
AI and automation developer. I implement n8n and AI tools to automate marketing and sales for companies.