
@shraey96
Last active January 26, 2026 19:22
251 Free n8n Workflow Templates
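Each entry in the array below follows one schema: `templateId`, `templateName`, `templateDescription`, `templateUrl`, `jsonURL` (the raw workflow JSON), `screenshotURL`, `gistId`, and `isPaid`. As a minimal sketch of how the index might be consumed — assuming the array has been saved locally (the filename and sample records here are illustrative, not part of the gist) — the free templates can be listed with a few lines of Python:

```python
import json

# Parse the template index and list the free templates with their
# downloadable workflow JSON URLs. Field names (templateId, templateName,
# jsonURL, isPaid) come from the entries in this gist; the sample data
# below is hypothetical and only demonstrates the shape of each record.
def free_templates(raw: str) -> list[dict]:
    """Return id/name/url dicts for every entry not marked as paid."""
    return [
        {"id": t["templateId"], "name": t["templateName"], "url": t["jsonURL"]}
        for t in json.loads(raw)
        if not t.get("isPaid", False)
    ]

sample = json.dumps([
    {"templateId": "2682", "templateName": "Perplexity Researcher",
     "jsonURL": "https://example.com/a.json", "isPaid": False},
    {"templateId": "9999", "templateName": "Paid Example",
     "jsonURL": "https://example.com/b.json", "isPaid": True},
])

for entry in free_templates(sample):
    print(f'{entry["id"]}: {entry["name"]} -> {entry["url"]}')
```

Each `jsonURL` points at a raw gist file that can be imported directly into n8n via its "Import from URL" option.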
[
{
"templateId": "2682",
"templateName": "🔍🛠️Perplexity Researcher to HTML Web Page",
"templateDescription": "Transform simple queries into comprehensive, well-structured content with this n8n workflow that leverages Perplexity AI for research and GPT-4 for content...",
"templateUrl": "https://n8n.io/workflows/2682",
"jsonFileName": "Perplexity_Researcher_to_HTML_Web_Page.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Perplexity_Researcher_to_HTML_Web_Page.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/ccf6d8b0b7e9f589f1d5e712d3288b89/raw/81c6b34b4191475558bf5559a021568c2c81027f/Perplexity_Researcher_to_HTML_Web_Page.json",
"screenshotURL": "https://i.ibb.co/KjkzhGGv/7c63edfd6c16.png",
"workflowUpdated": true,
"gistId": "ccf6d8b0b7e9f589f1d5e712d3288b89",
"templateDescriptionFull": "Transform simple queries into comprehensive, well-structured content with this n8n workflow that leverages Perplexity AI for research and GPT-4 for content transformation. Create professional blog posts and HTML content automatically while maintaining accuracy and depth.\n\nIntelligent Research & Analysis\n\n🚀 Automated Research Pipeline\n\nHarnesses Perplexity AI's advanced research capabilities\nProcesses complex topics into structured insights\nDelivers comprehensive analysis in minutes instead of hours\n\n🧠 Smart Content Organization\n\nAutomatically structures content with clear hierarchies\nIdentifies and highlights key concepts\nMaintains technical accuracy while improving readability\nCreates SEO-friendly content structure\n\nContent Transformation Features\n\n📝 Dynamic Content Generation\n\nConverts research into professional blog articles\nGenerates clean, responsive HTML output\nImplements proper semantic structure\nIncludes metadata and categorization\n\n🎨 Professional Formatting\n\nResponsive Tailwind CSS styling\nClean, modern HTML structure\nProper heading hierarchy\nMobile-friendly layouts\nBlockquote highlighting for key insights\n\nPerfect For\n\n📚 Content Researchers\nSave hours of manual research by automating the information gathering and structuring process.\n\n✍️ Content Writers\nFocus on creativity while the workflow handles research and technical formatting.\n\n🌐 Web Publishers\nGenerate publication-ready HTML content with modern styling and proper structure.\n\nTechnical Implementation\n\n⚡ Workflow Components\n\nWebhook endpoint for query submission\nPerplexity AI integration for research\nGPT-4 powered content structuring\nHTML transformation engine\nTelegram notification system (optional)\n\nTransform your content creation process with an intelligent system that handles research, writing, and formatting while you focus on strategy and creativity.",
"isPaid": false
},
{
"templateId": "3161",
"templateName": "template_3161",
"templateDescription": "AI-Powered Social Media Content Automation 🧑‍💻 Who is this for?This workflow is perfect for social media managers, content creators, and digital marketers...",
"templateUrl": "https://n8n.io/workflows/3161",
"jsonFileName": "template_3161.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_3161.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/9eb813e446a312179afb3cad183e2932/raw/3c7aa149191919da4ac2c9f66152e70e5f111d6e/template_3161.json",
"screenshotURL": "https://i.ibb.co/4RMZr704/fc94b92eace1.png",
"workflowUpdated": true,
"gistId": "9eb813e446a312179afb3cad183e2932",
"templateDescriptionFull": "This workflow is perfect for social media managers, content creators, and digital marketers who want to save time by automating social media post generation and publishing across platforms.\n\nManually generating and scheduling social media content is time-consuming and repetitive. This workflow automates content creation and publishing, allowing you to:\n\nStreamline content generation using AI\nEnsure consistent posting across social media platforms\nTrack published posts in Google Sheets\n\nFetches content ideas from a Google Sheet.\nGenerates social media posts using OpenAI's GPT-4.\nChecks the target platform (e.g., Twitter/X, LinkedIn).\nPosts the content to the chosen social media platform.\nUpdates the Google Sheet with the generated post and timestamp.\n\nConnect Google Sheets: Ensure you have a Google Sheet with content ideas (columns: Idea, Status, Generated Post).\nSet up OpenAI API Key: Provide your OpenAI API key for GPT-4.\nConfigure Social Media Accounts: Link your Twitter/X or other social media accounts using n8n's built-in nodes.\nTest the Workflow: Run the workflow to verify automation.\nSchedule Automation: Set a recurring trigger (e.g., daily) to automate posting.\n\nAdjust prompt inputs in the OpenAI node to tailor the tone and style.\nAdd more platforms (e.g., Instagram, Facebook) by duplicating the social media node.\nInclude analytics tracking for engagement insights.\n\nAutomatically generate and share daily motivational quotes.\nPost product updates and announcements.\nShare curated industry news and insights.\n\nThis workflow saves time and keeps your social media presence active and engaging effortlessly. 🚀",
"isPaid": false
},
{
"templateId": "2557",
"templateName": "Hacker News to Video Template - AlexK1919",
"templateDescription": "Hacker News to Video Content OverviewThis workflow converts trending articles from Hacker News into engaging video content. It integrates AI-based tools to...",
"templateUrl": "https://n8n.io/workflows/2557",
"jsonFileName": "Hacker_News_to_Video_Template_-_AlexK1919.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Hacker_News_to_Video_Template_-_AlexK1919.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/22e61ed4c166ae3805a4d1713484180a/raw/61c199cc54dd4fd46bdd6714bf87959ae6d4340d/Hacker_News_to_Video_Template_-_AlexK1919.json",
"screenshotURL": "https://i.ibb.co/twqggGFf/9f0d3b11a0a2.png",
"workflowUpdated": true,
"gistId": "22e61ed4c166ae3805a4d1713484180a",
"templateDescriptionFull": "This workflow converts trending articles from Hacker News into engaging video content. It integrates AI-based tools to analyze, summarize, and generate multimedia content, making it ideal for content creators, educators, and marketers.\n\nArticle Retrieval:\n\nPulls trending articles from Hacker News.\nLimits the number of articles to process (configurable).\nPulls trending articles from Hacker News.\nLimits the number of articles to process (configurable).\nContent Analysis:\n\nUses OpenAI's GPT model to:\n\nSummarize articles.\nAssess their relevance to specific topics like automation or AI.\nExtract key image URLs.\nUses OpenAI's GPT model to:\n\nSummarize articles.\nAssess their relevance to specific topics like automation or AI.\nExtract key image URLs.\nSummarize articles.\nAssess their relevance to specific topics like automation or AI.\nExtract key image URLs.\nImage and Video Generation:\n\nLeonardo.ai: Creates stunning AI-generated images based on extracted prompts.\nRunwayML: Converts images into high-quality videos.\nLeonardo.ai: Creates stunning AI-generated images based on extracted prompts.\nRunwayML: Converts images into high-quality videos.\nStructured Content Creation:\n\nParses content into structured formats for easy reuse.\nGenerates newsletter-friendly blurbs and social media-ready captions.\nParses content into structured formats for easy reuse.\nGenerates newsletter-friendly blurbs and social media-ready captions.\nCloud Integration:\n\nUploads generated assets to:\n\nDropbox\nGoogle Drive\nMicrosoft OneDrive\nMinIO\nUploads generated assets to:\n\nDropbox\nGoogle Drive\nMicrosoft OneDrive\nMinIO\nDropbox\nGoogle Drive\nMicrosoft OneDrive\nMinIO\nSocial Media Posting (Optional):\n\nSupports posting to YouTube, X (Twitter), LinkedIn, and Instagram.\nSupports posting to YouTube, X (Twitter), LinkedIn, and Instagram.\n\nInitiated manually via the \"Test Workflow\" button.\n\nRetrieves articles from Hacker 
News.\nLimits the results to avoid processing overload.\n\nEvaluates if articles are related to AI/Automation using OpenAI's language model.\n\nGenerates:\n\nAI-driven image prompts via Leonardo.ai.\nVideos using RunwayML.\nAI-driven image prompts via Leonardo.ai.\nVideos using RunwayML.\n\nSaves the output to cloud storage services or uploads directly to social media platforms.\n\nAPI Keys:\n\nHacker News\nOpenAI\nLeonardo.ai\nRunwayML\nCreatomate\nHacker News\nOpenAI\nLeonardo.ai\nRunwayML\nCreatomate\nn8n Installation:\nEnsure n8n is installed and configured locally or on a server.\nCredentials:\nSet up credentials in n8n for all external services used in the workflow.\n\nReplace Hacker News with any other data source node if needed.\nConfigure the \"Article Analysis\" node for different topics.\nAdjust the cloud storage services or add custom storage options.\n\nImport this workflow into your n8n instance.\nConfigure your API credentials.\nTrigger the workflow manually or schedule it as needed.\nCheck the outputs in your preferred cloud storage or social media platform.\n\nExtend this workflow further by automating social media posting or newsletter integration.\nFor any questions, refer to the official documentation or reach out to the creator.\n\nThis workflow was built by AlexK1919, an AI-native workflow automation architect. Check out the overview video for a quick demo.\n\nLeonardo.ai\nRunwayML\nCreatomate\nHacker News API\nOpenAI GPT\n\nFeel free to adapt and extend this workflow to meet your specific needs! 🎉",
"isPaid": false
},
{
"templateId": "4022",
"templateName": "BuzzBlast",
"templateDescription": "Amplify your social media presence with BuzzBlast, an n8n workflow designed to make your content go viral across X, Discord, and LinkedIn. By sending a...",
"templateUrl": "https://n8n.io/workflows/4022",
"jsonFileName": "BuzzBlast.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/BuzzBlast.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/a423f4a6b2b9c06c2b51d91a14b8fad6/raw/ac0ca5fd62d883aeb6f6b8fe042bb938e8ba2813/BuzzBlast.json",
"screenshotURL": "https://i.ibb.co/RG5wkTrZ/815c89d979d7.png",
"workflowUpdated": true,
"gistId": "a423f4a6b2b9c06c2b51d91a14b8fad6",
"templateDescriptionFull": "Amplify your social media presence with BuzzBlast, an n8n workflow designed to make your content go viral across X, Discord, and LinkedIn. By sending a single chat message, BuzzBlast leverages OpenRouter's AI to optimize your input for each platform’s unique audience—crafting punchy tweets for X, engaging messages for Discord, and professional posts for LinkedIn. With smart language detection, it ensures the output matches your input’s language for authentic engagement.\n\n🚀 Multi-Platform Posting: Shares optimized content to X, Discord, and LinkedIn simultaneously.\n🧠 AI Optimization: Uses OpenRouter’s AI to tailor content for virality on each platform.\n🌐 Language Detection: Matches output to your input language for seamless engagement.\n🔄 Smart Routing: Automatically directs content to the right platform using a switch node.\n📱 Chat Trigger: Initiates posts via a simple chat message.\n⚡ Zero Hassle: No manual reformatting—BuzzBlast handles it all.\n\nSocial media managers looking to streamline cross-platform posting.\nContent creators aiming to boost engagement with minimal effort.\nBusinesses seeking to maximize reach across diverse audiences.\n\nn8n instance: A running n8n instance (cloud or self-hosted).\nCredentials:\n\nX account with OAuth2 API access.\nDiscord Webhook API setup for your server.\nLinkedIn account with OAuth2 API access.\nOpenRouter account for AI language model access.\nX account with OAuth2 API access.\nDiscord Webhook API setup for your server.\nLinkedIn account with OAuth2 API access.\nOpenRouter account for AI language model access.\nChat Trigger Setup: A configured chat platform (e.g., Slack, Telegram) to send input messages to the workflow.\n\nImport the Workflow:\n\nCopy the provided workflow JSON and import it into your n8n instance via the \"Import Workflow\" option in the n8n editor.\nCopy the provided workflow JSON and import it into your n8n instance via the \"Import Workflow\" option in the n8n 
editor.\nConfigure Credentials:\n\nIn the Post to X node, set up OAuth2 credentials for your X account.\nIn the Post to Discord node, configure a Discord Webhook for your server.\nIn the Post to LinkedIn node, add OAuth2 credentials for your LinkedIn account.\nIn the OpenRouter AI Model node, provide API credentials for your OpenRouter account.\nIn the Post to X node, set up OAuth2 credentials for your X account.\nIn the Post to Discord node, configure a Discord Webhook for your server.\nIn the Post to LinkedIn node, add OAuth2 credentials for your LinkedIn account.\nIn the OpenRouter AI Model node, provide API credentials for your OpenRouter account.\nSet Up Chat Trigger:\n\nIn the Chat Input Trigger node, configure your preferred chat platform (e.g., Slack, Telegram) to send trigger messages.\nEnsure the webhook is active and correctly linked to your chat platform.\nIn the Chat Input Trigger node, configure your preferred chat platform (e.g., Slack, Telegram) to send trigger messages.\nEnsure the webhook is active and correctly linked to your chat platform.\nTest the Workflow:\n\nSend a test message via your chat platform (e.g., \"Announcing our new product launch!\").\nVerify that the AI optimizes the content and posts it to X, Discord, and LinkedIn as expected.\nSend a test message via your chat platform (e.g., \"Announcing our new product launch!\").\nVerify that the AI optimizes the content and posts it to X, Discord, and LinkedIn as expected.\nActivate the Workflow:\n\nOnce tested, toggle the workflow to \"Active\" in n8n to enable automatic execution when chat messages are received.\nOnce tested, toggle the workflow to \"Active\" in n8n to enable automatic execution when chat messages are received.\n\nChanges Chat Trigger: Adjust the chat trigger using your preference platform like telegram, discord, etc.\nModify AI Prompt: Adjust the prompt in the AI Content Optimizer node to change the tone or style (e.g., more professional for LinkedIn or conversational 
for Discord).\nAdd New Platforms: Extend the Route to Platforms node by adding conditions for additional platforms (e.g., Instagram or Facebook) and corresponding posting nodes.\nChange AI Model: In the OpenRouter AI Model node, select a different OpenRouter model to optimize content quality or manage costs.\nEnhance Output Format: Update the JSON schema in the Parse AI Output node to include additional fields like hashtags, emojis, or links for specific platforms.\nAdd Error Handling: Include an error-handling node after the Route to Platforms node to log failed posts or retry them automatically.\n\nBuzzBlast saves time, maximizes reach, and lets AI craft platform-perfect posts that resonate with your audience. Whether you're an influencer, marketer, or business, this workflow makes cross-platform posting effortless. Ready to make waves online? Grab BuzzBlast and start buzzing!\n\nmade by: khmuhtadin\nNeed a custom? contact me on LinkedIn or Web",
"isPaid": false
},
{
"templateId": "3057",
"templateName": "template_3057",
"templateDescription": "Description: Create Social Media Content from Telegram with AI This n8n workflow empowers you to effortlessly generate social media content and captivating...",
"templateUrl": "https://n8n.io/workflows/3057",
"jsonFileName": "template_3057.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_3057.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/8c2ac45f9d6cf029abf14a7a5cb298d8/raw/b9c6dca145f3f891977c5d8aad29a80b6a87dc27/template_3057.json",
"screenshotURL": "https://i.ibb.co/RG5wkTrZ/815c89d979d7.png",
"workflowUpdated": true,
"gistId": "8c2ac45f9d6cf029abf14a7a5cb298d8",
"templateDescriptionFull": "Description:\n\nThis n8n workflow empowers you to effortlessly generate social media content and captivating image prompts, all powered by AI. Simply send a topic request through Telegram (as a voice or text message), and watch as the workflow conducts research, crafts engaging social media posts, and creates detailed image prompts ready for use with your preferred AI art generation tool.\n\nThis workflow streamlines the content creation process by automating research, social media content generation, and image prompt creation, triggered by a simple Telegram message.\n\nSocial Media Managers: Quickly generate engaging content and image ideas for various platforms.\nContent Creators: Overcome writer's block and discover fresh content ideas with AI assistance.\nMarketing Teams: Boost productivity by automating social media content research and drafting.\nAnyone looking to leverage AI for efficient and creative social media content creation.\n\nEffortless Content and Image Prompt Generation: Automate the creation of social media posts and detailed image prompts.\nAI-Powered Creativity: Leverage the power of LLMs to generate original content ideas and captivating image prompts.\nIncreased Efficiency: Save time and resources by automating the research and content creation process.\nVoice-to-Content: Use voice messages to request content, making content creation even more accessible.\nEnhanced Engagement: Create high-quality, attention-grabbing content that resonates with your audience.\n\nReceive Request: The workflow listens for incoming voice or text messages on Telegram containing your content request.\nProcess Voice (if necessary): If the message is a voice message, it's transcribed into text using OpenAI's Whisper API.\nAI Takes Over: The AI agent, powered by an OpenAI Chat Model and SerpAPI, conducts online research based on your request.\nContent and Image Prompt Generation: The AI agent generates engaging social media content and a 
detailed image prompt based on the research.\nImage Generation (Optional): You can use the generated image prompt with your preferred AI art generation tool (e.g., DALL-E, Stable Diffusion) to create a visual.\nOutput: The workflow provides you with the social media content and the detailed image prompt, ready for you to use or refine.\n\nTelegram Trigger\nSwitch\nTelegram (for fetching voice messages)\nOpenAI (Whisper API for voice-to-text)\nSet (for preparing variables)\nAI Agent (with OpenAI Chat Model and SerpAPI tool)\nHTTP Request (for optional image generation)\nExtract from File (for optional image processing)\nSet (for final output)\n\nActive n8n instance\nTelegram account with a bot\nOpenAI API key\nSerpAPI account\nHugging Face API key (if you want to generate images within the workflow)\n\nImport the workflow JSON into your n8n instance.\nConfigure the Telegram Trigger node with your Telegram bot token.\nSet up the OpenAI and SerpAPI credentials in the respective nodes.\nIf you want to generate images directly within the workflow, configure the HTTP Request node with your Hugging Face API key.\nTest the workflow by sending a voice or text message to your Telegram bot with a topic request.\n\nThis workflow combines the convenience of Telegram with the power of AI to provide a seamless content creation experience. Start generating engaging social media content today!",
"isPaid": false
},
{
"templateId": "2903",
"templateName": "Youtube Searcher",
"templateDescription": "Video explanation This n8n workflow helps you identify trending videos within your niche by detecting outlier videos that significantly outperform a...",
"templateUrl": "https://n8n.io/workflows/2903",
"jsonFileName": "Youtube_Searcher.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Youtube_Searcher.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/c1b12bc705a546e85389a5007d103300/raw/eace0f1c15a7e531c4c636d63ac8209d6b43be41/Youtube_Searcher.json",
"screenshotURL": "https://i.ibb.co/Cp2z25LF/6db7e9681cd0.png",
"workflowUpdated": true,
"gistId": "c1b12bc705a546e85389a5007d103300",
"templateDescriptionFull": "Video explanation\n\nThis n8n workflow helps you identify trending videos within your niche by detecting outlier videos that significantly outperform a channel's average views. It automates the process of monitoring competitor channels, saving time and streamlining content research.\n\nAutomated Competitor Video Tracking\nMonitors videos from specified competitor channels, fetching data directly from the YouTube API.\nOutlier Detection Based on Channel Averages\nCompares each video’s performance against the channel’s historical average to identify significant spikes in viewership.\nHistorical Video Data Management\nStores video statistics in a PostgreSQL database, allowing the workflow to only fetch new videos and optimize API usage.\nShort Video Filtering\nAutomatically removes short videos based on duration thresholds.\nFlexible Video Retrieval\nFetches up to 3 months of historical data on the first run and only new videos on subsequent runs.\nPostgreSQL Database Integration\nIncludes SQL queries for database setup, video insertion, and performance analysis.\nConfigurable Outlier Threshold\nFocuses on videos published within the last two weeks with view counts at least twice the channel's average.\nData Output for Analysis\nOutputs best-performing videos along with their engagement metrics, making it easier to identify trending topics.\n\nn8n installed on your machine or server\nA valid YouTube Data API key\nAccess to a PostgreSQL database\n\nThis workflow is intended for educational and research purposes, helping content creators gain insights into what topics resonate with audiences without manual daily monitoring.",
"isPaid": false
},
{
"templateId": "2981",
"templateName": "✍️🌄 Your First Wordpress Content Creator - Quick Start",
"templateDescription": "✍️🌄 WordPress + AI Content Creator This workflow automates the creation and publishing of multi-reading-level content for WordPress blogs. It leverages AI...",
"templateUrl": "https://n8n.io/workflows/2981",
"jsonFileName": "_Your_First_Wordpress_Content_Creator_-_Quick_Start.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/_Your_First_Wordpress_Content_Creator_-_Quick_Start.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/4c5a33a39041ecefb44b64f5fbf4c0f9/raw/54756a0c24807140349ff24285b3422868412dfe/_Your_First_Wordpress_Content_Creator_-_Quick_Start.json",
"screenshotURL": "https://i.ibb.co/yBSLm408/d3c2eaf0b670.png",
"workflowUpdated": true,
"gistId": "4c5a33a39041ecefb44b64f5fbf4c0f9",
"templateDescriptionFull": "This workflow automates the creation and publishing of multi-reading-level content for WordPress blogs. It leverages AI to generate optimized articles, automatically creates featured images, and provides versions of the content at different reading levels (Grade 2, 5, and 9).\n\nStarts with a manual trigger and a user-defined blog topic\nUses AI to create a structured blog post with proper HTML formatting\nSeparates and validates the title and content components\nSaves a draft version to Google Drive for backup\n\nAutomatically rewrites the content for different reading levels:\n\nGrade 9: Sophisticated language with appropriate metaphors\nGrade 5: Simplified with light humor and age-appropriate examples\nGrade 2: Basic language with simple metaphors and child-friendly explanations\n\nCreates a draft post in WordPress with the Grade 9 version\nGenerates a relevant featured image using Pollinations.ai\nAutomatically uploads and sets the featured image\nSends success/error notifications via Telegram\n\nSet up WordPress API connection\nConfigure OpenAI API access\nSet up Google Drive integration\nAdd Telegram bot credentials for notifications\n\nAdjust reading level prompts as needed\nModify image generation settings\nSet WordPress post parameters\n\nRun a test with a sample topic\nVerify all reading level versions\nCheck WordPress draft creation\nConfirm notification system\n\nThis workflow is perfect for content creators who need to maintain a consistent blog presence while catering to different audience reading levels. It's especially useful for educational content, news sites, or any platform that needs to communicate complex topics to diverse audiences.",
"isPaid": false
},
{
"templateId": "3822",
"templateName": "Search news using Perplexity AI and post to X (Twitter)",
"templateDescription": "Stay ahead of the curve and keep your followers informed—automatically. This n8n workflow uses Perplexity AI to generate insightful answers to scheduled...",
"templateUrl": "https://n8n.io/workflows/3822",
"jsonFileName": "Search_news_using_Perplexity_AI_and_post_to_X_Twitter.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Search_news_using_Perplexity_AI_and_post_to_X_Twitter.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/da8ac2a9a95b322061cc8cff5f23d62c/raw/31f8fe87520bbb27bdb036bf6dcd96b18f21b4fa/Search_news_using_Perplexity_AI_and_post_to_X_Twitter.json",
"screenshotURL": "https://i.ibb.co/DHtpcHMG/a5f9dd442054.png",
"workflowUpdated": true,
"gistId": "da8ac2a9a95b322061cc8cff5f23d62c",
"templateDescriptionFull": "Stay ahead of the curve and keep your followers informed—automatically.\nThis n8n workflow uses Perplexity AI to generate insightful answers to scheduled queries, then auto-posts the responses directly to X (Twitter).\n\nScheduled Trigger – Runs at set times (daily, hourly, etc.).\nsearchQuery – Define what kind of trending or relevant insight you want (e.g. “latest AI trends”).\nset API Key – Securely insert your Perplexity API key.\nPerplexity API Call – Fetches a short, insightful response to your query.\nPost to X – Automatically publishes the result as a tweet.\n\nAn n8n account (self-hosted or cloud)\nA Perplexity API key\nA connected X (Twitter) account via n8n’s credentials\n\nAdd this workflow into your n8n account.\nEdit the searchQuery node with a topic (e.g. “What’s new in ecommerce automation?”).\nPaste your Perplexity API key into the set API key node.\nConnect your X (Twitter) account in the final node.\nAdjust the schedule timing to suit your content frequency.\n\n💬 Add a formatting step to shorten or hashtag the response.\n📊 Pull multiple trending questions and auto-schedule posts.\n🔁 Loop responses to queue a full week of content.\n🌐 Translate content before posting to reach a global audience.\n\nFeel free to contact us at 1 Node.\nGet instant access to a library of free resources we created.",
"isPaid": false
},
{
"templateId": "4827",
"templateName": "template_4827",
"templateDescription": "Who is this for?This template is designed for internal support teams, product specialists, and knowledge managers in technology companies who want to...",
"templateUrl": "https://n8n.io/workflows/4827",
"jsonFileName": "template_4827.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_4827.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/a7459dac43335353c5dc793929ab0adb/raw/4542f5f6e9be4bf4d6559eccd7fecb008f7817db/template_4827.json",
"screenshotURL": "https://i.ibb.co/7938pnk/8670e3e1c08e.png",
"workflowUpdated": true,
"gistId": "a7459dac43335353c5dc793929ab0adb",
"templateDescriptionFull": "This template is designed for internal support teams, product specialists, and knowledge managers in technology companies who want to automate ingestion of product documentation and enable AI-driven, retrieval-augmented question answering via WhatsApp.\n\nSupport agents often spend too much time manually searching through lengthy documentation, leading to inconsistent or delayed answers. This solution automates importing, chunking, and indexing product manuals, then uses retrieval-augmented generation (RAG) to answer user queries accurately and quickly with AI via WhatsApp messaging.\n\nManually triggered to import product documentation from Google Docs.\nAutomatically splits large documents into chunks for efficient searching.\nGenerates vector embeddings for each chunk using OpenAI embeddings.\nInserts the embedded chunks and metadata into a MongoDB Atlas vector store, enabling fast semantic search.\n\nListens for incoming WhatsApp user messages, supporting various types:\n\nText messages: Plain text queries from users.\nAudio messages: Voice notes transcribed into text for processing.\nImage messages: Photos or screenshots analyzed to provide contextual answers.\nDocument messages: PDFs, spreadsheets, or other files parsed for relevant content.\nText messages: Plain text queries from users.\nAudio messages: Voice notes transcribed into text for processing.\nImage messages: Photos or screenshots analyzed to provide contextual answers.\nDocument messages: PDFs, spreadsheets, or other files parsed for relevant content.\nConverts incoming queries to vector embeddings and performs similarity search on the MongoDB vector store.\nUses OpenAI’s GPT-4o-mini model with retrieval-augmented generation to produce concise, context-aware answers.\nMaintains conversation context across multiple turns using a memory buffer node.\nRoutes different message types to appropriate processing nodes to maximize answer quality.\n\nAuthenticate Google Docs and 
connect your Google Docs URL containing the product documentation you want to index.\nAuthenticate MongoDB Atlas and connect the collection where you want to store the vector embeddings. Create a search index on this collection to support vector similarity queries.\nEnsure the index name matches the one configured in n8n (data_index).\nSee the example MongoDB search index template below for reference.\n\nAuthenticate the WhatsApp node with your Meta account credentials to enable message receiving and sending.\nConnect the MongoDB collection containing embedded product documentation to the MongoDB Vector Search node used for similarity queries.\nSet up the system prompt in the Knowledge Base Agent node to reflect your company’s tone, answering style, and any business rules, ensuring it references the connected MongoDB collection for context retrieval.\n\nBoth MongoDB nodes (in ingestion and chat workflows) are connected to the same collection with:\n\nAn embedding field storing vector data,\n\nRelevant metadata fields (e.g., document ID, source), and\n\nThe same vector index name configured (e.g., data_index).\n\n{\n\"mappings\": {\n\"dynamic\": false,\n\"fields\": {\n\"_id\": { \"type\": \"string\" },\n\"text\": { \"type\": \"string\" },\n\"embedding\": {\n\"type\": \"knnVector\",\n\"dimensions\": 1536,\n\"similarity\": \"cosine\"\n},\n\"source\": { \"type\": \"string\" },\n\"doc_id\": { \"type\": \"string\" }\n}\n}\n}",
"isPaid": false
},
{
"templateId": "3809",
"templateName": "Automated Content SEO Audit Report",
"templateDescription": "IntroductionThe Content SEO Audit Workflow is a powerful automated solution that generates comprehensive SEO audit reports for websites. By combining the...",
"templateUrl": "https://n8n.io/workflows/3809",
"jsonFileName": "Automated_Content_SEO_Audit_Report.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Automated_Content_SEO_Audit_Report.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/d50944b8189056a1279118cf913196bf/raw/21c9eba86b254b0535064811088da41683ab85b3/Automated_Content_SEO_Audit_Report.json",
"screenshotURL": "https://i.ibb.co/4RVVwb4n/f02b78e8ad77.png",
"workflowUpdated": true,
"gistId": "d50944b8189056a1279118cf913196bf",
"templateDescriptionFull": "The Content SEO Audit Workflow is a powerful automated solution that generates comprehensive SEO audit reports for websites.\n\nBy combining the crawling capabilities of DataForSEO with the search performance metrics from Google Search Console, this workflow delivers actionable insights into content quality, technical SEO issues, and performance optimization opportunities.\n\nThe workflow crawls up to 1,000 pages of a website, analyzes various SEO factors including metadata, content quality, internal linking, and search performance, and then generates a professional, branded HTML report that can be shared directly with clients.\n\nThe entire process is automated, transforming what would typically be hours of manual analysis into a streamlined workflow that produces consistent, thorough results.\n\nThis workflow bridges the gap between technical SEO auditing and practical, client-ready deliverables, making it an invaluable tool for SEO professionals and digital marketing agencies.\n\nThis workflow is designed for SEO consultants, digital marketing agencies, and content strategists who need to perform comprehensive content audits for clients or their own websites. It's particularly valuable for professionals who:\n\nRegularly conduct SEO audits as part of their service offerings\nNeed to provide branded, professional reports to clients\nWant to automate the time-consuming process of content analysis\nRequire data-driven insights to inform content strategy decisions\n\nUsers should have basic familiarity with SEO concepts and metrics, as well as a basic understanding of how to set up API credentials in n8n.\n\nWhile no coding knowledge is required to run the workflow, users should be comfortable with configuring workflow parameters and following setup instructions.\n\nContent audits are essential for SEO strategy but are traditionally labor-intensive and time-consuming. 
This workflow addresses several key challenges:\n\nManual Data Collection: Gathering data from multiple sources (crawlers, Google Search Console, etc.) typically requires hours of work. This workflow automates the entire data collection process.\nInconsistent Analysis: Manual audits can suffer from inconsistency in methodology. This workflow applies the same comprehensive analysis criteria to every page, ensuring thorough and consistent results.\nReport Generation: Creating professional, client-ready reports often requires additional design work after the analysis is complete. This workflow generates a fully branded HTML report automatically.\nData Integration: Correlating technical SEO issues with actual search performance metrics is difficult when working with separate tools. This workflow seamlessly integrates crawl data with Google Search Console metrics.\nScale Limitations: Manual audits become increasingly difficult with larger websites. This workflow can efficiently process up to 1,000 pages without additional effort.\n\nThe Content SEO Audit Workflow crawls a specified website, analyzes its content for various SEO issues, retrieves performance data from Google Search Console, and generates a comprehensive HTML report.\n\nThe workflow identifies issues in five key categories: status issues (404 errors, redirects), content quality (thin content, readability), metadata SEO (title/description issues), internal linking (orphan pages, excessive click depth), and performance (underperforming content).\n\nThe final report includes executive summaries, detailed issue breakdowns, and actionable recommendations, all branded with your company's colors and logo.\n\nInitial Configuration: The workflow begins by setting parameters including the target domain, crawl limits, company information, and branding colors.\nWebsite Crawling: The workflow creates a crawl task in DataForSEO and periodically checks its status until completion.\nData Collection: Once crawling is 
complete, the workflow:\n\nRetrieves the raw audit data from DataForSEO\nExtracts all URLs with status code 200 (successful pages)\nQueries the Google Search Console API for each URL to get clicks and impressions data\nIdentifies 404 and 301 pages and retrieves their source links\nData Analysis: The workflow analyzes the collected data to identify issues including:\n\nTechnical issues: 404 errors, redirects, canonicalization problems\nContent issues: thin content, outdated content, readability problems\nSEO metadata issues: missing/duplicate titles and descriptions, H1 problems\nInternal linking issues: orphan pages, excessive click depth, low internal links\nPerformance issues: underperforming pages based on GSC data\nReport Generation: Finally, the workflow:\n\nCalculates a health score based on the severity and quantity of issues\nGenerates prioritized recommendations\nCreates a comprehensive HTML report with interactive tables and visualizations\nCustomizes the report with your company's branding\nProvides the report as a downloadable HTML file\n\nTo set up this workflow, follow 
these steps:\n\nImport the workflow: Download the JSON file and import it into your n8n instance.\nConfigure DataForSEO credentials:\n\nCreate a DataForSEO account at https://app.dataforseo.com/api-access (they offer a free $1 credit for testing)\nAdd a new \"Basic Auth\" credential in n8n following the HTTP Request Authentication guide\nAssign this credential to the \"Create Task\", \"Check Task Status\", \"Get Raw Audit Data\", and \"Get Source URLs Data\" nodes\nConfigure Google Search Console credentials:\n\nAdd a new \"Google OAuth2 API\" credential following the Google OAuth guide\nEnsure your Google account has access to the Google Search Console property you want to analyze\nAssign this credential to the \"Query GSC API\" node\nUpdate the \"Set Fields\" node with:\n\ndfs_domain: The website domain you want to audit\ndfs_max_crawl_pages: Maximum number of pages to crawl (default: 1000)\ndfs_enable_javascript: Whether to enable JavaScript rendering (default: false)\ncompany_name: Your company name for the report branding\ncompany_website: Your company website URL\ncompany_logo_url: URL to your company logo\nbrand_primary_color: Your primary brand color (hex code)\nbrand_secondary_color: Your secondary brand color (hex code)\ngsc_property_type: Set to \"domain\" or \"url\" depending on your Google Search Console property type\nRun the workflow: Click \"Start\" and wait for it to complete (approximately 20 minutes for 500 pages).\nDownload the report: Once complete, download the HTML file from the \"Download Report\" node.\n\nThis workflow can be adapted in several ways to better suit your specific requirements:\n\nAdjust crawl parameters: Modify the \"Set Fields\" node to change:\n\nThe maximum number of pages to crawl (dfs_max_crawl_pages); this workflow supports up to 1,000 pages\nWhether to enable JavaScript rendering for JavaScript-heavy sites (dfs_enable_javascript)\nCustomize issue detection thresholds: In the \"Build Report Structure\" code node, you can modify:\n\nWord count thresholds for thin content detection (currently 1,500 words)\nClick depth thresholds (currently flags pages deeper than 4 clicks)\nTitle and description length parameters (currently 40-60 characters for titles, 70-155 for descriptions)\nReadability score thresholds (currently flags Flesch-Kincaid scores below 55)\nModify the report design: In the \"Generate HTML Report\" code node, you can:\n\nAdjust the HTML/CSS to change the report layout and styling\nAdd or remove sections from the report\nChange the recommendations logic\nModify the health score calculation algorithm\nAdd additional data sources: You could extend the workflow by:\n\nAdding PageSpeed Insights data for performance metrics\nIncorporating backlink data from other APIs\nAdding keyword ranking data from rank tracking APIs\nImplement automated delivery: Add nodes after the \"Download Report\" node to:\n\nSend the report directly to clients via email\nUpload it to cloud storage\nCreate a PDF version of the report",
"isPaid": false
},
{
"templateId": "4484",
"templateName": "AI Voice Chat Agent with ElevenLabs and InfraNodus Graph RAG Knowledge",
"templateDescription": "Set Up ElevenLabs Voice Chat Agent using Graph RAG Knowledge Graphs as Experts This workflow creates an AI voice chatbot agent that has access to several...",
"templateUrl": "https://n8n.io/workflows/4484",
"jsonFileName": "AI_Voice_Chat_Agent_with_ElevenLabs_and_InfraNodus_Graph_RAG_Knowledge.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/AI_Voice_Chat_Agent_with_ElevenLabs_and_InfraNodus_Graph_RAG_Knowledge.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/6d6444420677b96f1ca4c2ee7aecb57c/raw/ae3fdaf2324142bae07fd1afa6f19819f6e4e61e/AI_Voice_Chat_Agent_with_ElevenLabs_and_InfraNodus_Graph_RAG_Knowledge.json",
"screenshotURL": "https://i.ibb.co/4RVVwb4n/f02b78e8ad77.png",
"workflowUpdated": true,
"gistId": "6d6444420677b96f1ca4c2ee7aecb57c",
"templateDescriptionFull": "This workflow creates an AI voice chatbot agent that has access to several knowledge bases at the same time (used as \"experts\").\n\nThese knowledge bases are provided by InfraNodus GraphRAG, which uses knowledge graphs to deliver high-quality responses without the need to set up complex RAG vector store workflows.\n\nWe use ElevenLabs to set up a voice agent that can be embedded into any website or used via their API.\n\nThe advantages of using GraphRAG instead of standard vector stores for knowledge are:\n\nEasy and quick to set up (no complex data import workflows needed) and to update with new knowledge\nA knowledge graph has a holistic overview of your knowledge base\nBetter retrieval of relations between the document chunks = higher quality responses\nAbility to reuse in other n8n workflows\n\nThis template uses the n8n AI agent node as an orchestrating agent that decides which tool (knowledge graph) to use based on the user's prompt.\n\nThe user's prompt is received from the ElevenLabs Conversational AI agent via an n8n Webhook, which also takes care of the voice interaction.\n\nThe response from n8n is then sent to the Webhook, which is polled by the ElevenLabs voice agent. This agent processes the response and provides the final answer.\n\nHere's a description step by step:\n\nThe user submits a question using the ElevenLabs voice interface\nThe question is sent via the knowledge_base tool in ElevenLabs to the n8n Webhook with a POST request containing the user's prompt and a sessionID for the Chat Memory node in n8n.\nThe n8n AI agent node checks a list of tools it has access to. Each tool has a description of the knowledge auto-generated by InfraNodus (we call each tool an \"expert\").\nThe n8n AI agent decides which tool should be used to generate a response. 
It may reformulate the user's query to be more suitable for the expert.\nThe query is then sent to the InfraNodus HTTP node endpoint, which will query the graph that corresponds to that expert.\nEach InfraNodus GraphRAG expert provides a rich, context-aware response from its graph, along with a list of relevant statements retrieved using a combination of RAG and GraphRAG.\nThe n8n AI Agent node integrates the responses received from the experts to produce the final answer.\nThe final answer is sent back to the Webhook endpoint.\nThe ElevenLabs conversational AI agent picks up the response arriving from the knowledge_base tool via the webhook, condenses it into a conversational format, and transforms the text into voice.\n\nYou need an InfraNodus GraphRAG API account and key to use this workflow.\n\nCreate an InfraNodus account\nGet the API key at https://infranodus.com/api-access and create a Bearer authorization key for the InfraNodus HTTP nodes.\nCreate a separate knowledge graph for each expert (using the PDF / content import options) in InfraNodus\nFor each graph, go to the workflow and paste the name of the graph into the body name field.\nKeep the other settings intact, or learn more about them on the InfraNodus access points page.\nOnce you add one or more graphs as experts to your flow, add the LLM key to the OpenAI node and launch the workflow\nYou will also need an ElevenLabs account and a conversational AI agent set up there. See the post note in the n8n workflow for a complete step-by-step description, or our support article on setting up an ElevenLabs AI voice agent\nOnce the voice AI agent is ready, you might want to combine it with a text AI chatbot workflow so your users have a choice between text and voice interaction. 
In that case, you may be interested in using our free open-source website popup chat widget popupchat.dev, where you can create an embed code to add to your blog or website and allow the user to choose between text and voice interaction.\n\nAn InfraNodus account and API key\nAn OpenAI (or any other LLM) API key\nAn ElevenLabs account\n\n1. How many \"experts\" should I aim for?\n\nWe recommend aiming for the same number of experts as the optimal size of a team, which is usually 2-7. If you add more experts, your orchestrating AI agent will have trouble choosing the most suitable \"expert\" tool for the user's query. You can mitigate this by specifying in the AI agent description that it can choose a maximum of 3-7 experts to provide a response.\n\n2. Why use InfraNodus GraphRAG and not a standard vector store for knowledge?\n\nFirst, vector stores are complex to set up and to update. You'd need a separate workflow for that, decide on the vector dimensions, add metadata to your knowledge, etc.\nWith InfraNodus, you have a complete RAG / GraphRAG solution under the hood that is easy to set up and provides high-quality responses that take the overall structure and the relations between your ideas into account.\n\n3. Why not use ElevenLabs' own knowledge base?\n\nOne reason is that you want your knowledge base to be in one place so you can reuse it in other n8n workflows. Another is that you will not have such a good separation between the \"experts\" when you converse with the agent: the answers you get will be based on top matches from all the books / articles you upload, while with the InfraNodus GraphRAG setup you can better control which graphs are consulted as experts and have an explicit way to display this data.\n\nYou can use this same workflow with a Telegram bot, so you can interact with it using Telegram. 
There are many more customizations available on our GitHub repo for n8n workflows.\n\nCheck out the complete setup guide for this workflow at https://support.noduslabs.com/hc/en-us/articles/20318967066396-How-to-Build-a-Text-Voice-AI-Agent-Chatbot-with-n8n-Elevenlabs-and-InfraNodus\n\nAlso check out the video tutorial with a demo:",
"isPaid": false
},
{
"templateId": "3289",
"templateName": "🎥 Analyze YouTube Video for Summaries, Transcripts & Content + Google Gemini AI",
"templateDescription": "🎥 Analyze YouTube Video for Summaries, Transcripts & Content + Google Gemini Who is this for?This workflow is ideal for content creators, video marketers,...",
"templateUrl": "https://n8n.io/workflows/3289",
"jsonFileName": "_Analyze_YouTube_Video_for_Summaries_Transcripts__Content__Google_Gemini_AI.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/_Analyze_YouTube_Video_for_Summaries_Transcripts__Content__Google_Gemini_AI.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/39127dc1dccad24a12eb1c39728dbd7f/raw/420b07ee25b5a558cbf3e60b7c3db349bd246181/_Analyze_YouTube_Video_for_Summaries_Transcripts__Content__Google_Gemini_AI.json",
"screenshotURL": "https://i.ibb.co/Jww72tHp/c052213c7a1a.png",
"workflowUpdated": true,
"gistId": "39127dc1dccad24a12eb1c39728dbd7f",
"templateDescriptionFull": "This workflow is ideal for content creators, video marketers, and research professionals who need to extract actionable insights, detailed transcripts, or metadata from YouTube videos efficiently. It is particularly useful for those leveraging AI tools to analyze video content and optimize audience engagement.\n\nAnalyzing video content manually can be time-consuming and prone to errors. This workflow automates the process by extracting key metadata, generating summaries, and providing structured transcripts tailored to specific use cases. It helps users save time and ensures accurate data extraction for content optimization.\n\nExtracts audience-specific metadata (e.g., video type, tone, key topics, engagement drivers).\nGenerates customized outputs based on six prompt types:\n\nDefault: Actionable insights and strategies.\nTranscribe: Verbatim transcription.\nTimestamps: Timestamped dialogue.\nSummary: Concise bullet-point summary.\nScene: Visual descriptions of settings and techniques.\nClips: High-engagement video segments with timestamps.\nSaves extracted data as a text file in Google Drive.\nSends analyzed outputs via Gmail or provides them in a completion form.\n\nConfigure API keys:\n\nAdd your Google API key as an environment variable.\nInput requirements:\n\nProvide the YouTube video ID (e.g., wBuULAoJxok).\nSelect a prompt type from the dropdown menu.\nConnect credentials:\n\nSet up Google Drive and Gmail integrations in n8n.\n\nModify the metadata prompt to 
extract additional fields relevant to your use case.\nAdjust the output format for summaries or transcripts based on your preferences (e.g., structured bullets or plain text).\nAdd nodes to integrate with other platforms like Slack or Notion for further collaboration.\n\nInput: YouTube video ID (wBuULAoJxok) and prompt type (summary).\nOutput: A concise summary highlighting actionable insights, tools, and resources mentioned in the video.",
"isPaid": false
},
{
"templateId": "3586",
"templateName": "AI-Powered WhatsApp Chatbot for Text, Voice, Images & PDFs",
"templateDescription": "This workflow is a highly advanced multimodal AI assistant designed to operate through WhatsApp. It can understand and respond to text, images, voice...",
"templateUrl": "https://n8n.io/workflows/3586",
"jsonFileName": "AI-Powered_WhatsApp_Chatbot_for_Text_Voice_Images__PDFs.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/AI-Powered_WhatsApp_Chatbot_for_Text_Voice_Images__PDFs.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/ce929c8c16de64924220f512eefbda2b/raw/a23f1b30f4973c4943ae0fb752e3dac3ac82ce9a/AI-Powered_WhatsApp_Chatbot_for_Text_Voice_Images__PDFs.json",
"screenshotURL": "https://i.ibb.co/HDrskWzT/42fd9c95b240.png",
"workflowUpdated": true,
"gistId": "ce929c8c16de64924220f512eefbda2b",
"templateDescriptionFull": "This workflow is a highly advanced multimodal AI assistant designed to operate through WhatsApp. It can understand and respond to text, images, voice messages, and PDF documents by combining OpenAI models with smart logic to adapt to the content received.\n\nUsing the Input type node, the bot detects whether the user has sent:\n\nText\nVoice messages\nImages\nFiles (PDF)\nOther unsupported content\n\nText messages are processed by an OpenAI GPT-4o-mini agent with a customized system prompt.\nReplies are concise, accurate, and formatted for mobile readability.\n\nImages are downloaded, converted to base64, and analyzed by an image-aware AI model.\nThe output is a rich, structured description, designed for visually impaired users or visual content interpretation.\n\nAudio messages are downloaded and transcribed using OpenAI Whisper.\nThe transcribed text is analyzed and answered by the AI.\nOptionally, the AI reply can be converted back to voice using OpenAI's text-to-speech, and sent as an audio message.\n\nOnly PDFs are allowed (filtered via MIME type).\nThe document’s content is extracted and combined with the user's message.\nThe AI then provides a relevant summary or answer.\n\nEach user has a personalized session ID with a memory window of 10 interactions.\nThis ensures a more natural and contextual conversation flow.\n\nThis workflow is designed to handle incoming WhatsApp messages and process different types of inputs (text, audio, images, and PDF documents) using AI-powered analysis. 
Here’s how it functions:\n\nTrigger: The workflow starts with the WhatsApp Trigger node, which listens for incoming messages (text, audio, images, or documents).\nInput Routing: The Input type (Switch node) checks the message type and routes it to the appropriate processing branch:\n\nText: Directly forwards the message to the AI agent for response generation.\nAudio: Downloads the audio file, transcribes it using OpenAI, and sends the transcription to the AI agent.\nImage: Downloads the image, analyzes it with OpenAI’s GPT-4 model, and generates a detailed description.\nPDF Document: Downloads the file, extracts text, and processes it with the AI agent.\nUnsupported Formats: Sends an error message if the input is not supported.\nAI Processing: The AI Agent1 node, powered by OpenAI, processes the input (text, transcribed audio, image description, or PDF content) and generates a response.\nResponse Handling:\n\nFor audio inputs, the AI’s response is converted back into speech (using OpenAI’s TTS) and sent as a voice message.\nFor other inputs, the response is sent as a text message via WhatsApp.\nMemory: The Simple Memory node maintains conversation context for follow-up interactions.\n\nTo deploy this workflow in n8n, follow these steps:\n\nConfigure WhatsApp API Credentials:\n\nSet up WhatsApp Business API credentials (Meta Developer Account).\nAdd the credentials in the WhatsApp Trigger, Get Image/Audio/File URL, and Send Message nodes.\n\nSet Up OpenAI Integration:\n\nProvide an OpenAI API key in the Analyze Image, Transcribe Audio, Generate Audio Response, and AI Agent1 nodes.\n\nAdjust Input Handling (Optional):\n\nModify the Switch node (\"Input type\") to handle additional message types if needed.\nUpdate the \"Only PDF File\" IF node to support other document formats.\n\nTest & Deploy:\n\nActivate the workflow and test with different message types (text, audio, image, PDF).\nEnsure responses are correctly generated and sent back via WhatsApp.\n\nContact me for consulting and support or add me on LinkedIn.",
"isPaid": false
},
{
"templateId": "3427",
"templateName": "template_3427",
"templateDescription": "Who is this template for? This template is ideal for small businesses, agencies, and solo professionals who want to automate appointment scheduling and...",
"templateUrl": "https://n8n.io/workflows/3427",
"jsonFileName": "template_3427.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_3427.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/92cd8d48313b646c690e1434b45abc70/raw/56b0b8788ed0d51411937ddfbce6c778ac1d985e/template_3427.json",
"screenshotURL": "https://i.ibb.co/7NRrP65d/f6dc600b6396.png",
"workflowUpdated": true,
"gistId": "92cd8d48313b646c690e1434b45abc70",
"templateDescriptionFull": "This template is ideal for small businesses, agencies, and solo professionals who want to automate appointment scheduling and caller follow-up through a voice-based AI receptionist. If you’re using tools like Google Calendar, Airtable, and Vapi (Twilio), this setup is for you.\n\nManual call handling, appointment booking, and email coordination can be time-consuming and prone to errors. This workflow solves that by automating the receptionist role: answering calls, checking calendar availability, managing appointments, and storing call summaries—all without human intervention.\n\nThis Agent Receptionist manages inbound voice calls and scheduling tasks using Vapi and Google Calendar. It checks availability, books or updates calendar events, sends email confirmations, and logs call details into Airtable. The workflow includes built-in logic for slot management, email triggers, and storing call transcripts.\n\nDuplicate Airtable Base: Use this Airtable base template: BASE LINK\n\nImport Workflow: Load the provided JSON into your n8n instance.\n\nCredentials: Connect your Google Calendar and Airtable credentials in n8n.\n\nActivate Workflow: Enable the workflow to get live webhook URLs.\n\nVapi Configuration:\n\nPaste the provided system prompt into the Vapi Assistant.\n\nLink the appropriate webhook URLs from n8n (GetSlots, BookSlots, UpdateSlots, CancelSlots, and end-of-call report).\n\nDisclaimer\n\nOptimized for cloud-hosted n8n instances. Self-hosted users should verify webhook and credential setups.",
"isPaid": false
},
{
"templateId": "4087",
"templateName": "AI Powered Content Creation and Publishing Engine with Mistral, Creatomate, and YouTube",
"templateDescription": "Description This n8n workflow automates the entire process of creating and publishing AI-generated videos, triggered by a simple message from a Telegram bot...",
"templateUrl": "https://n8n.io/workflows/4087",
"jsonFileName": "AI_Powered_Content_Creation_and_Publishing_Engine_with_Mistral_Creatomate_and_YouTube.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/AI_Powered_Content_Creation_and_Publishing_Engine_with_Mistral_Creatomate_and_YouTube.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/831eb4b4e9b14a77f2079aca052aff9c/raw/0dadf9096e9dd471c7cd262efb5dfbf7cb95030a/AI_Powered_Content_Creation_and_Publishing_Engine_with_Mistral_Creatomate_and_YouTube.json",
"screenshotURL": "https://i.ibb.co/CZWHQPH/6cef182627e4.png",
"workflowUpdated": true,
"gistId": "831eb4b4e9b14a77f2079aca052aff9c",
"templateDescriptionFull": "This n8n workflow automates the entire process of creating and publishing AI-generated videos, triggered by a simple message from a Telegram bot (YTAdmin). It transforms a text prompt into a structured video with scenes, visuals, and voiceover, stores assets in MongoDB, renders the final output using Creatomate, and uploads the video to YouTube. Throughout the process, YTAdmin receives real-time updates on the workflow’s progress. This is ideal for content creators, marketers, or businesses looking to scale video production using automation and AI.\n\nYou can see a video demonstrating this template in action here: https://www.youtube.com/watch?v=EjI-ChpJ4xA&t=200s\n\nTrigger: Message from YTAdmin (Telegram Bot)\n\nThe flow starts when YTAdmin sends a content prompt.\n\nGenerate Structured Content\n\nA Mistral language model processes the input and outputs structured content, typically broken into scenes.\n\nSplit & Process Content into Scenes\n\nThe content is split into categorized parts for scene generation.\n\nGenerate Media Assets\n\nFor each scene:\nImages: Generated using OpenAI’s image model.\nVoiceovers: Created using OpenAI’s text-to-speech.\nAudio files are encoded and stored in MongoDB.\n\nScene Composition\n\nAssets are grouped into coherent scenes.\n\nRender with Creatomate\n\nA complete payload is generated and sent to the Creatomate rendering API to produce the video.\nProgress messages are sent to YTAdmin.\nThe flow pauses briefly to avoid rate limits.\n\nRender Callback\n\nOnce Creatomate completes rendering, it sends a callback to the flow.\nIf the render fails, an error message is sent to YTAdmin.\nIf the render succeeds, the flow proceeds to post-processing.\n\nGenerate Title & Description\n\nA second Mistral prompt generates a compelling title and description for YouTube.\n\nUpload to YouTube\n\nThe rendered video is retrieved from Creatomate.\nIt’s uploaded to YouTube with the AI-generated metadata.\n\nFinal 
Update\n\nA success message is sent to YTAdmin, confirming upload completion.\n\nStep 1: Create Your Telegram Bot\n\nCreate a Telegram bot via BotFather and get your API token.\nAdd this token in n8n's Telegram credentials and link it to the \"Receive Message from YTAdmin\" trigger.\n\nStep 2: Connect Your AI Providers\n\nMistral: Add your API key under the HTTP Request or AI Model nodes.\nOpenAI: Create an account at platform.openai.com and obtain an API key. Use it for both image generation and voiceover synthesis.\n\nStep 3: Configure Audio File Storage with MongoDB via Custom API\n\nThe custom API:\n\nReceives the Base64-encoded audio data sent in the request body.\nConnects to the configured MongoDB instance (connection details are managed securely within the API; code below).\nUses the MongoDB driver and GridFS to store the audio data.\nReturns the unique _id (ObjectId) of the stored file in GridFS as a response.\nThis _id is crucial, as it will be used in subsequent steps to generate the download URL for the audio file.\nMy API code can be found here for reference: https://github.com/nanabrownsnr/YTAutomation.git\n\nStep 4: Set Up Creatomate\n\nCreate a Creatomate account, define your video templates, and retrieve your API key.\nConfigure the HTTP Request node to match your Creatomate payload requirements.\n\nStep 5: Connect YouTube\n\nIn n8n, add OAuth2 credentials for your YouTube account.\nMake sure your Google Cloud project has the YouTube Data API enabled.\n\nStep 6: Deploy and Test\n\nSend a message to YTAdmin and monitor the flow in n8n.\nVerify that content is generated, media is created, and the final video is rendered and uploaded.\n\nChange the AI Prompts\n\nModify the generation prompts to adjust tone, voice, or content type (e.g., news recaps, product videos, educational summaries).\n\nSwitch Messaging Platform\n\nReplace Telegram (YTAdmin) with Slack, Discord, or WhatsApp by swapping out the trigger and response nodes.\n\nAdd Subtitles or Effects\n\nIntegrate Whisper or another speech-to-text tool to generate 
subtitles.\nAdd overlay or transition effects in the Creatomate video payload.\n\nUse Local File Storage Instead of MongoDB\n\nSwap out the MongoDB upload HTTP nodes for filesystem or S3-compatible storage.\n\nRepurpose for Other Platforms\n\nSwap the YouTube upload with TikTok, Instagram, or Vimeo endpoints for broader publishing.\n\nNeed Help or Want to Customize This Workflow?\nIf you'd like assistance setting this up or adapting it for a different use case, feel free to reach out to me at nanabrownsnr@gmail.com. I'm happy to help!",
"isPaid": false
},
{
"templateId": "4494",
"templateName": "Airbnb Telegram Agent - Template",
"templateDescription": "Welcome to my Airbnb Telegram Agent Workflow!This workflow creates an intelligent Telegram bot that helps users search and find Airbnb accommodations using...",
"templateUrl": "https://n8n.io/workflows/4494",
"jsonFileName": "Airbnb_Telegram_Agent_-_Template.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Airbnb_Telegram_Agent_-_Template.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/f33879468bd32bab3ea43b007c500a67/raw/90fcf7217b345b97ba329dc2061f381947f5747d/Airbnb_Telegram_Agent_-_Template.json",
"screenshotURL": "https://i.ibb.co/rRTG8Pfq/75e471af9ed0.png",
"workflowUpdated": true,
"gistId": "f33879468bd32bab3ea43b007c500a67",
    "templateDescriptionFull": "This workflow creates an intelligent Telegram bot that helps users search and find Airbnb accommodations using natural language queries and voice messages.\n\nDISCLAIMER: This workflow only works with self-hosted n8n instances! You have to install the n8n-nodes-mcp-client Community Node!\n\n\n\nThis workflow processes incoming Telegram messages (text or voice) and provides personalized Airbnb accommodation recommendations. The AI agent understands natural language queries, searches through Airbnb data using MCP tools, and returns mobile-optimized results with clickable links, prices, and key details.\n\nKey Features:\n\nVoice message support (speech-to-text and text-to-speech)\nConversation memory for context-aware responses\nMobile-optimized formatting for Telegram\nReal-time Airbnb data access via MCP integration\n\nThis workflow has the following sequence:\n\nTelegram Trigger - Receives incoming messages from users\nText or Voice Switch - Routes based on message type\nVoice Processing (if applicable) - Downloads and transcribes voice messages\nText Preparation - Formats text input for the AI agent\nAirbnb AI Agent - Core logic that:\n\nLists available MCP tools for Airbnb data\nExecutes searches with parsed parameters\nFormats results for mobile display\nResponse Generation - Sends formatted text response\nVoice Response (optional) - Creates and sends audio summary\n\nTelegram Bot API: Documentation\n\nCreate a bot via @BotFather on Telegram\nGet bot token and configure webhook\nOpenAI API: Documentation\n\nUsed for speech transcription (Whisper)\nUsed for chat completion (GPT-4)\nUsed for text-to-speech generation\nMCP Community Client Node: Documentation\n\nCustom integration for Airbnb data\nRequires MCP server setup with Airbnb/Airtable connection\nProvides tools for accommodation search and details\n\nImportant: You need to set up an MCP server with Airbnb data access. The workflow uses MCP tools to retrieve real accommodation data, so ensure your MCP server is properly configured with the Airtable/Airbnb integration.\n\nConfiguration Notes:\n\nUpdate the Telegram chat ID in the trigger for your specific bot\nModify the system prompt in the Airbnb Agent for different use cases\nThe workflow supports both individual users and can be extended for group chats\n\nFeel free to contact me via LinkedIn, if you have any questions!",
"isPaid": false
},
{
"templateId": "4102",
"templateName": "Agent AI Calendar [n8n pro]",
"templateDescription": "Manage Calendar with Voice & Text Commands using GPT-4, Telegram & Google Calendar This n8n workflow transforms your Telegram bot into a personal AI...",
"templateUrl": "https://n8n.io/workflows/4102",
"jsonFileName": "Agent_AI_Calendar_n8n_pro.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Agent_AI_Calendar_n8n_pro.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/5839cee3a9038992fef1c4d0fc046a75/raw/34489871b71666454b572d3d6af902d3659fcf22/Agent_AI_Calendar_n8n_pro.json",
"screenshotURL": "https://i.ibb.co/pvcJM3yz/c5836dfc0f1f.png",
"workflowUpdated": true,
"gistId": "5839cee3a9038992fef1c4d0fc046a75",
    "templateDescriptionFull": "This n8n workflow transforms your Telegram bot into a personal AI calendar assistant, capable of understanding both voice and text commands in Romanian, and managing your Google Calendar using the GPT-4 model via LangChain.\n\nWhether you want to create, update, fetch, or delete events, you can simply speak or write your request to your Telegram bot — and the assistant takes care of the rest.\n\nVoice command support using Telegram voice messages (.ogg)\nTranscription using OpenAI Whisper\nNatural language understanding with GPT-4 via LangChain\nGoogle Calendar integration:\n\n✅ Create Events\n🔁 Update Events\n❌ Delete Events\n📅 Fetch Events\nResponses sent back via Telegram\n\nGo to @BotFather on Telegram.\nSend /newbot and follow the instructions.\nSave the Bot Token.\n\nPaste the Telegram token into the Telegram Trigger and Telegram nodes.\nSet updates to [\"message\"].\n\nGet an OpenAI API key from https://platform.openai.com\nCreate a credential in n8n for OpenAI.\nThis is used for both transcription and AI reasoning.\n\nIn Google Cloud Console:\n\nEnable Google Calendar API\nSet up OAuth2 credentials\nAdd your n8n redirect URI (usually https://yourdomain/rest/oauth2-credential/callback)\nCreate a credential in n8n using Google Calendar OAuth2\nGrant access to your calendar (e.g., \"Family\" calendar).\n\nThe transcription node uses \"en\" for English. Change to another locale if needed.\n\nYou can modify the prompt in the AI Agent node to include your name, work schedule, or specific behavior expectations.\n\nAdjust time ranges or filters in the Get Events node\nAdd custom logic before Create Event (e.g., validation, conflict checks)\n\nMake sure n8n has HTTPS enabled to receive Telegram updates.\nYou can test the flow first using only text, then voice.\nUse AI memory or vector stores (like Supabase) if you want context-aware planning in the future.",
"isPaid": false
},
{
"templateId": "2462",
"templateName": "template_2462",
"templateDescription": "How it works: This project creates a personal AI assistant named Angie that operates through Telegram. Angie can summarize daily emails, look up calendar...",
"templateUrl": "https://n8n.io/workflows/2462",
"jsonFileName": "template_2462.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_2462.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/d8a01c2b5e23b98aa9209591f0675fbf/raw/9bb5ca7442100616fcea0bee5a30d66a8890e0f3/template_2462.json",
"screenshotURL": "https://i.ibb.co/1YRgwRBV/6e2d5974f071.png",
"workflowUpdated": true,
"gistId": "d8a01c2b5e23b98aa9209591f0675fbf",
    "templateDescriptionFull": "How it works:\n\nThis project creates a personal AI assistant named Angie that operates through Telegram. Angie can summarize daily emails, look up calendar entries, remind users of upcoming tasks, and retrieve contact information. The assistant can interact with users via both voice and text inputs.\n\nStep-by-step:\n\nTelegram Trigger: The workflow starts with a Telegram trigger that listens for incoming message events. The system determines if the incoming message is voice or text. If voice, the voice file is retrieved and transcribed to text using OpenAI's Speech to Text API.\n\nAI Assistant: The Telegram request is passed to the AI assistant (Angie).\n\nTools Integration: The AI assistant is equipped with several tools:\n\nGet Email: Uses Gmail API to fetch recent emails, filtering by date.\nGet Calendar: Retrieves calendar entries for specified dates.\nGet Tasks: Connects to a Baserow (open-source Airtable alternative) database to fetch to-do list items.\nGet Contacts: Also uses Baserow to retrieve contact information.\n\nResponse Generation: The AI formulates a response based on the gathered information and sends it back to the user on Telegram.",
"isPaid": false
},
{
"templateId": "2846",
"templateName": "Voice RAG Chatbot with ElevenLabs and OpenAI",
"templateDescription": "The \"Voice RAG Chatbot with ElevenLabs and OpenAI\" workflow in n8n is designed to create an interactive voice-based chatbot system that leverages both text...",
"templateUrl": "https://n8n.io/workflows/2846",
"jsonFileName": "Voice_RAG_Chatbot_with_ElevenLabs_and_OpenAI.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Voice_RAG_Chatbot_with_ElevenLabs_and_OpenAI.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/db4c2136017a0d8b148b08f76a54ac8e/raw/4c2003edba35a1b3bdc4d37b1466376fa49069dd/Voice_RAG_Chatbot_with_ElevenLabs_and_OpenAI.json",
"screenshotURL": "https://i.ibb.co/MD56shtY/5c50cc5a92ea.png",
"workflowUpdated": true,
"gistId": "db4c2136017a0d8b148b08f76a54ac8e",
    "templateDescriptionFull": "The \"Voice RAG Chatbot with ElevenLabs and OpenAI\" workflow in n8n is designed to create an interactive voice-based chatbot system that leverages both text and voice inputs for providing information. Ideal for shops, commercial activities and restaurants.\n\nHere's how it operates:\n\nWebhook Activation: The process begins when a user interacts with the voice agent set up on ElevenLabs, triggering a webhook in n8n. This webhook sends a question from the user to the AI Agent node.\nAI Agent Processing: Upon receiving the query, the AI Agent node processes the input using predefined prompts and tools. It extracts relevant information from the knowledge base stored within the Qdrant vector database.\nKnowledge Base Retrieval: The Vector Store Tool node interfaces with the Qdrant Vector Store to retrieve pertinent documents or data segments matching the user’s query.\nText Generation: Using the retrieved information, the OpenAI Chat Model generates a coherent response tailored to the user’s question.\nResponse Delivery: The generated response is sent back through another webhook to ElevenLabs, where it is converted into speech and delivered audibly to the user.\nContinuous Interaction: For ongoing conversations, the Window Buffer Memory ensures context retention by maintaining a history of interactions, enhancing the conversational flow.\n\nTo configure this workflow effectively, follow these detailed setup instructions:\n\nElevenLabs Agent Creation:\n\nCreate a FREE account on ElevenLabs\nBegin by creating an agent on ElevenLabs (e.g., named 'test_n8n').\nCustomize the first message and define the system prompt specific to your use case, such as portraying a character like a waiter at \"Pizzeria da Michele\".\nAdd a Webhook tool labeled 'test_chatbot_elevenlabs' configured to receive questions via POST requests.\nQdrant Collection Initialization:\n\nUtilize the HTTP Request nodes ('Create collection' and 'Refresh collection') to initialize and clear existing collections in Qdrant. Ensure you update placeholders QDRANTURL and COLLECTION accordingly.\nDocument Vectorization:\n\nUse Google Drive integration to fetch documents from a designated folder. These documents are then downloaded and processed for embedding.\nEmploy the Embeddings OpenAI node to generate embeddings for the downloaded files before storing them into Qdrant via the Qdrant Vector Store node.\nAI Agent Configuration:\n\nDefine the system prompt for the AI Agent node which guides its behavior and responses based on the nature of queries expected (e.g., product details, troubleshooting tips).\nLink necessary models and tools including OpenAI language models and memory buffers to enhance interaction quality.\nTesting Workflow:\n\nExecute test runs of the entire workflow by clicking 'Test workflow' in n8n alongside initiating tests on the ElevenLabs side to confirm all components interact seamlessly.\nMonitor logs and outputs closely during testing phases to ensure accurate data flow between systems.\nIntegration with Website:\n\nFinally, integrate the chatbot widget onto your business website replacing placeholder AGENT_ID with the actual identifier created earlier on ElevenLabs.\n\nBy adhering to these comprehensive guidelines, users can successfully deploy a sophisticated voice-driven chatbot capable of delivering precise answers utilizing advanced retrieval-augmented generation techniques powered by OpenAI and ElevenLabs technologies.\n\nContact me for consulting and support or add me on LinkedIn.",
"isPaid": false
},
{
"templateId": "3657",
"templateName": "Build a Chatbot, Voice Agent and Phone Agent with Voiceflow, Google Calendar and RAG",
"templateDescription": "Voiceflow is a no-code platform that allows you to design, prototype, and deploy conversational assistants across multiple channels—such as chat, voice, and...",
"templateUrl": "https://n8n.io/workflows/3657",
"jsonFileName": "Build_a_Chatbot_Voice_Agent_and_Phone_Agent_with_Voiceflow_Google_Calendar_and_RAG.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Build_a_Chatbot_Voice_Agent_and_Phone_Agent_with_Voiceflow_Google_Calendar_and_RAG.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/189f135c6ff46a308325dcf4e20504da/raw/cfad9f9b535c2d44b746e335250c9d7bc0ce30fa/Build_a_Chatbot_Voice_Agent_and_Phone_Agent_with_Voiceflow_Google_Calendar_and_RAG.json",
"screenshotURL": "https://i.ibb.co/MD56shtY/5c50cc5a92ea.png",
"workflowUpdated": true,
"gistId": "189f135c6ff46a308325dcf4e20504da",
    "templateDescriptionFull": "Voiceflow is a no-code platform that allows you to design, prototype, and deploy conversational assistants across multiple channels—such as chat, voice, and phone—with advanced logic and natural language understanding. It supports integration with APIs, webhooks, and even tools like Twilio for phone agents. It's perfect for building customer support agents, voice bots, or intelligent assistants.\n\nThis workflow connects n8n and Voiceflow with tools like Google Calendar, Qdrant (vector database), OpenAI, and an order tracking API to power a smart, multi-channel conversational agent.\n\n\n\nThere are 3 main webhook endpoints in n8n that Voiceflow interacts with:\n\nn8n_order – receives user input related to order tracking, queries an API, and responds with tracking status.\nn8n_appointment – processes appointment booking, reformats date input using OpenAI, and creates a Google Calendar event.\nn8n_rag – handles general product/service questions using a RAG (Retrieval-Augmented Generation) system backed by:\n\nGoogle Drive document ingestion,\nQdrant vector store for search,\nand OpenAI models for context-based answers.\n\nEach webhook is connected to a corresponding \"Capture\" block inside Voiceflow, which sends data to n8n and waits for the response.\n\nThis n8n workflow integrates Voiceflow for chatbot/voice interactions, Google Calendar for appointment scheduling, and RAG (Retrieval-Augmented Generation) for knowledge-based responses. Here’s the flow:\n\nTrigger:\n\nThree webhooks (n8n_order, n8n_appointment, n8n_rag) receive inputs from Voiceflow (chat, voice, or phone calls).\nEach webhook routes requests to specific functions:\n\nOrder Tracking: Fetches order status via an external API.\nAppointment Scheduling: Uses OpenAI to parse dates, creates Google Calendar events, and confirms via WhatsApp.\nRAG System: Queries a Qdrant vector store (populated with Google Drive documents) to answer customer questions using GPT-4.\nAI Processing:\n\nOpenAI Chains: Convert natural language dates to Google Calendar formats and generate responses.\nRAG Pipeline: Embeds documents (via OpenAI), stores them in Qdrant, and retrieves context-aware answers.\nVoiceflow Integration: Routes responses back to Voiceflow for multi-channel delivery (chat, voice, or phone).\nOutputs:\n\nConfirmation messages (e.g., \"Event created successfully\").\nDynamic responses for orders, appointments, or product support.\n\nAPIs:\n\nGoogle Calendar & Drive OAuth credentials.\nQdrant vector database (hosted or cloud).\nOpenAI API key (for GPT-4 and embeddings).\n\nQdrant Setup:\n\nRun the \"Create collection\" and \"Refresh collection\" nodes to initialize the vector store.\nPopulate it with documents using the Google Drive → Qdrant pipeline (embeddings generated via OpenAI).\nVoiceflow Webhooks:\n\nLink Voiceflow’s \"Captures\" to n8n’s webhook URLs (n8n_order, n8n_appointment, n8n_rag).\nGoogle Calendar:\n\nAuthenticate the Google Calendar node and set event templates (e.g., summary, description).\nRAG System:\n\nConfigure the Qdrant vector store and OpenAI embeddings nodes.\nAdjust the Retrieve Agent’s system prompt for domain-specific queries (e.g., electronics store support).\n\nAdd Twilio for phone-agent capabilities.\nCustomize OpenAI prompts for tone/accuracy.\n\n\n\nPS. You can import a Twilio number and assign it to your agent to turn it into a Phone Agent.\n\n\n\nContact me for consulting and support or add me on LinkedIn",
"isPaid": false
},
{
"templateId": "5042",
"templateName": "SmartMail Agent – Your AI Email Assistant, Powered by WhatsApp",
"templateDescription": "🔍 How it works This workflow turns WhatsApp into a smart email command center using AI. Users can speak or type instructions like: \"Send a follow-up to...",
"templateUrl": "https://n8n.io/workflows/5042",
"jsonFileName": "SmartMail_Agent__Your_AI_Email_Assistant_Powered_by_WhatsApp.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/SmartMail_Agent__Your_AI_Email_Assistant_Powered_by_WhatsApp.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/230baa9586a45cccf38f0253781ddf0b/raw/c696317c06ad7e6a508168cffdf564c3bc548a2b/SmartMail_Agent__Your_AI_Email_Assistant_Powered_by_WhatsApp.json",
"screenshotURL": "https://i.ibb.co/Y7Rps45M/4588447469b1.png",
"workflowUpdated": true,
"gistId": "230baa9586a45cccf38f0253781ddf0b",
    "templateDescriptionFull": "This workflow turns WhatsApp into a smart email command center using AI.\n\nUsers can speak or type instructions like:\n\n\"Send a follow-up to Claire\"\n\"Write a draft email to Claire to confirm tomorrow’s meeting at 5 PM\"\n\"What is the name of Claire's firm?\"\n\nThe agent transcribes voice notes, extracts intent with GPT, interacts with Gmail (send, draft, search), and replies with a confirmation via WhatsApp — either as text or a voice message.\n\nWhatsApp Business Webhook (Meta)\nOpenAI Whisper (voice transcription)\nGPT (intent + content generation)\nGmail (search, draft, send)\nAirtable (contact lookup + memory logging)\n\nThe agent logs key fields in Airtable:\n\nRecipient email\nCompany / job title\nAnd more...\nThis creates a lightweight \"gut memory\" so the agent feels context-aware.\n\nConnect WhatsApp Business API (via Meta Developer Console)\nAdd OpenAI and Gmail credentials in n8n\nLink your Airtable base for contacts and logging\n\nHands-free email reply while commuting\nFast Gmail access for busy consultants / solopreneurs\nCustom business agents for service-based professionals\n\n30–60 minutes\n\nWhatsApp Business Cloud access\nOpenAI API Key\nGmail or Google Workspace\nAirtable account (free plan OK)\nn8n instance (cloud or self-hosted with HTTPS)",
"isPaid": false
},
{
"templateId": "2405",
"templateName": "AI Voice Chat using Webhook, Memory Manager, OpenAI, Google Gemini & ElevenLabs",
"templateDescription": "Who is this for?This workflow is designed for businesses or developers looking to integrate voice-based chat applications with dynamic responses and...",
"templateUrl": "https://n8n.io/workflows/2405",
"jsonFileName": "AI_Voice_Chat_using_Webhook_Memory_Manager_OpenAI_Google_Gemini__ElevenLabs.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/AI_Voice_Chat_using_Webhook_Memory_Manager_OpenAI_Google_Gemini__ElevenLabs.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/5aa7a9ed00acc3475487e5c7e9b2817f/raw/aa4369cdd9f5379a3cd8a7aad81bd446048141f2/AI_Voice_Chat_using_Webhook_Memory_Manager_OpenAI_Google_Gemini__ElevenLabs.json",
"screenshotURL": "https://i.ibb.co/8gDQxG4m/d2a6c9a0d106.png",
"workflowUpdated": true,
"gistId": "5aa7a9ed00acc3475487e5c7e9b2817f",
    "templateDescriptionFull": "This workflow is designed for businesses or developers looking to integrate voice-based chat applications with dynamic responses and conversational memory.\n\nIt automates AI-powered voice conversations, maintaining context between sessions and converting speech-to-text and text-to-speech.\n\nThe workflow receives audio input, transcribes it using OpenAI, and processes the conversation using the Google Gemini Chat Model (you can use the OpenAI Chat Model instead). Responses are converted back to speech using ElevenLabs.\n\nYou'll need API keys for:\n\nOpenAI (you can obtain it from the OpenAI website)\nElevenLabs (you can obtain it from their website)\nGoogle Gemini (you can obtain it from Google AI Studio)\n\nConfigure your API keys\nEnsure that the value (voice_message) in the \"Path\" parameter in the Webhook node is used as the name of the parameter that will contain the voice message you are sending via the HTTP POST request.",
"isPaid": false
},
{
"templateId": "2436",
"templateName": "template_2436",
"templateDescription": "This template demonstrates how to trigger an AI Agent with Siri and Apple Shortcuts, showing a simple pattern for voice-activated workflows in n8n. It's...",
"templateUrl": "https://n8n.io/workflows/2436",
"jsonFileName": "template_2436.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_2436.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/077d4f83e0023392c61529c8c586d97a/raw/13e6dec8c3a6abd9fb1f0290ec093032fc171abb/template_2436.json",
"screenshotURL": "https://i.ibb.co/1fjHvDgk/3ad64fbd3de0.png",
"workflowUpdated": true,
"gistId": "077d4f83e0023392c61529c8c586d97a",
"templateDescriptionFull": "This template demonstrates how to trigger an AI Agent with Siri and Apple Shortcuts, showing a simple pattern for voice-activated workflows in n8n. It's easy to customize—add app nodes before the AI Agent step to pass additional context, or modify the Apple Shortcut to send inputs like text, geolocation, images, or files.\n\n\n\nBasic instructions in template itself.\n\nn8n account (cloud or self-hosted)\nApple Shortcuts app on iOS or macOS. Dictation (\"Siri\") must be activated. Download the Shortcuts template here.\n\nVoice-Controlled AI: Trigger AI Agent via Siri for real-time voice replies.\nCustomizable Inputs: Modify Apple Shortcut to send text, images, geolocation, and more.\nFlexible Outputs: Siri can return the AI’s response as text, files, or customize it to trigger CRUD actions in connected apps.\nContext-Aware: Automatically feeds the current date and time to the AI Agent, with easy options to pass in more data.\n\nActivate Siri and speak your request.\nSiri sends the transcribed text to the n8n workflow via Apple Shortcuts.\nAI Agent processes the request and generates a response.\nSiri reads the response, or the workflow can return geolocation, files, or even perform CRUD actions in apps.\n\nTweak this template and make it your own.\n\nCapture Business Cards: Snap a photo of a business card and record a voice note. Have the AI Agent draft a follow-up email in Gmail, ready to send.\nVoice-to-Task Automation: Speak a new to-do item, and the workflow will add it to a Notion task board.\nBusiness English on the Fly: Convert casual speech into polished business language, and save the refined text directly to your pasteboard, ready to be pasted into any app. \"It's late because of you\" -> \"There has been a delay, and I believe your input may have contributed to it.\"",
"isPaid": false
},
{
"templateId": "3805",
"templateName": "template_3805",
"templateDescription": "OverviewThis workflow allows you to trigger custom logic in n8n directly from Retell's Voice Agent using Custom Functions.It captures a POST webhook from...",
"templateUrl": "https://n8n.io/workflows/3805",
"jsonFileName": "template_3805.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_3805.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/c004f40050734df8af5ed1e3dd60b7d6/raw/79dc8714eb0bf0adec7e29c7e0ad8df1f668f607/template_3805.json",
"screenshotURL": "https://i.ibb.co/LzhZWykj/4d278f57feb9.png",
"workflowUpdated": true,
"gistId": "c004f40050734df8af5ed1e3dd60b7d6",
    "templateDescriptionFull": "This workflow allows you to trigger custom logic in n8n directly from Retell's Voice Agent using Custom Functions.\nIt captures a POST webhook from Retell every time a Voice Agent reaches a Custom Function node.\nYou can plug in any logic—call an external API, book a meeting, update a CRM, or even return a dynamic response back to the agent.\n\nFor builders using Retell who want to extend Voice Agent functionality with real-time custom workflows or AI-generated responses.\n\nHave a Retell AI Account\nA Retell agent with a Custom Function node in its conversation flow (see template below)\nSet your n8n webhook URL in the Custom Function configuration (see \"How to use it\" below)\n(Optional) Familiarity with Retell's Custom Function docs\nStart a conversation with the agent (text or voice)\n\nTo get you started, we've prepared a Retell Agent ready to be imported, that includes the call to this template.\n\nImport the agent to your Retell workspace (top-right button on your agent's page)\nYou will need to modify the function URL in order to call your own instance.\nThis template is a simple hotel agent that calls the custom function to confirm a booking, passing basic formatted data.\n\nRetell sends a webhook to n8n whenever a Custom Function is triggered during a call (or test chat).\nThe webhook includes:\n\nFull call context (transcript, call ID, etc.)\nParameters defined in the Retell function node\nYou can process this data and return a response string back to the Voice Agent in real-time.\n\nCopy the webhook URL (e.g. https://your-instance.app.n8n.cloud/webhook/hotel-retell-template)\nModify the Retell Custom Function webhook URL (see template description for screenshots)\n\nEdit the function\nModify the URL\nModify the logic in the Set node or replace it with your own custom flow\nDeploy and test: Retell will hit your n8n workflow during the conversation\n\nCall a third-party API to fetch data (e.g. hotel availability, CRM records)\nUse an LLM node to generate dynamic responses\nTrigger a parallel automation (Slack message, calendar invite, etc.)",
"isPaid": false
},
{
"templateId": "4528",
"templateName": "Transcribe Telegram Audio with OpenAI",
"templateDescription": "This n8n workflow processes incoming Telegram messages, differentiating between text and voice messages. How it works:Message Trigger: The workflow...",
"templateUrl": "https://n8n.io/workflows/4528",
"jsonFileName": "Transcribe_Telegram_Audio_with_OpenAI.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Transcribe_Telegram_Audio_with_OpenAI.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/ddac758c71b3cfba0c6dd8a0541f3ae9/raw/22ed4c48bd67b2fc41e140e16939fa26f056417e/Transcribe_Telegram_Audio_with_OpenAI.json",
"screenshotURL": "https://i.ibb.co/RkHRcY2T/5a38c87b7413.png",
"workflowUpdated": true,
"gistId": "ddac758c71b3cfba0c6dd8a0541f3ae9",
    "templateDescriptionFull": "This n8n workflow processes incoming Telegram messages, differentiating between text and voice messages.\n\nMessage Trigger: The workflow initiates when a new message is received via the Telegram \"Message Trigger\" node.\nSwitch Node: This node acts as a router. It examines the incoming message:\n\nIf the message is text, it directs the flow along the \"text\" branch.\nIf the message contains voice, it directs the flow along the \"voice\" branch.\nGet Audio File: For audio messages, this node downloads the audio file from Telegram.\nTranscribe Audio: The downloaded audio file is then sent to an \"OpenAI Transcribe Recording\" node, which uses OpenAI's whisper-1 speech-to-text model to convert the audio into a text transcript.\nSend Transcription Message: Regardless of whether the original message was text or transcribed audio, the final text content is then passed to a \"Send transcription message\" node.\n\nTelegram Bot Token: You will need a Telegram bot token configured in the \"Message Trigger\" node to receive messages.\nOpenAI API Key: An OpenAI API key is required for the \"Transcribe audio\" node to perform speech transcription.\n\nThis workflow provides a foundational step for building more complex AI-driven applications. The transcribed text or original text message can be easily piped into an AI agent (e.g., a large language model) for analysis, response generation, or interaction with other tools, extending the bot's capabilities beyond simple message reception and transcription.\n\nFeel free to contact us at 1 Node.\nGet instant access to a library of free resources we created.",
"isPaid": false
},
{
"templateId": "3194",
"templateName": "template_3194",
"templateDescription": "This workflow automates voice reminders for upcoming appointments by generating a professional audio message and sending it to clients via email with the...",
"templateUrl": "https://n8n.io/workflows/3194",
"jsonFileName": "template_3194.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_3194.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/0a1ad3920dadf9b6fb036c1a807f233c/raw/3465fd099c671e1052d5bfd07baae29e11ed7f49/template_3194.json",
"screenshotURL": "https://i.ibb.co/DfYKnDWX/bed24ef15399.png",
"workflowUpdated": true,
"gistId": "0a1ad3920dadf9b6fb036c1a807f233c",
"templateDescriptionFull": "This workflow automates voice reminders for upcoming appointments by generating a professional audio message and sending it to clients via email with the voice file attached.\n\nIt integrates Google Calendar to track appointments, ElevenLabs to generate high-quality voice messages, and Gmail to deliver them efficiently.\n\n\n\nThis automated voice appointment reminder system is ideal for businesses that rely on scheduled appointments. It helps reduce no-shows, improve client engagement, and streamline communication.\n\nMedical Offices & Clinics – Ensure patients receive timely appointment reminders.\nReal Estate Agencies – Keep potential buyers and renters informed about property visits.\nService-Based Businesses – Perfect for salons, consultants, therapists, and coaches.\nLegal & Financial Services – Help clients remember important meetings and consultations.\n\nIf your business depends on scheduled appointments, this workflow saves time and enhances client satisfaction. 
🚀\n\nEnsures clients receive timely reminders.\nReduces appointment no-shows and scheduling issues.\nAutomates the process with a personalized voice message.\n\nTrigger the Workflow – The system runs manually or on a schedule to check upcoming appointments in Google Calendar.\nRetrieve Appointment Data – It fetches event details (client name, time, and location) from Google Calendar.\nThe workflow uses the summary, start.dateTime, location, and attendees[0].email fields from Google Calendar to personalize and send the voice reminders.\nGenerate a Voice Reminder – Using ElevenLabs, the workflow converts the appointment details into a natural-sounding voice message.\nSend via Email – The generated audio file is attached to an email and sent to the client as a reminder.\n\nAdjust Trigger Frequency – Modify the scheduling to run daily, hourly, or at specific intervals.\nCustomize Voice Message Format – Change the script structure and voice tone to match your business needs.\nChange Notification Method – Instead of email, integrate SMS or WhatsApp for delivery.\n\nGoogle Calendar Access – Ensure you have access to the calendar with scheduled appointments.\nElevenLabs API Key – Required for generating voice messages (you can start for free).\nGmail API Access – Needed for sending reminder emails.\nn8n Setup – The workflow runs on an n8n instance (self-hosted or cloud).\n\nSet Up Google Calendar API\n\nGo to Google Cloud Console.\nCreate a new project and enable Google Calendar API.\nGenerate OAuth 2.0 credentials and save them for n8n.\nGo to Google Cloud Console.\nCreate a new project and enable Google Calendar API.\nGenerate OAuth 2.0 credentials and save them for n8n.\nGet an ElevenLabs API Key\n\nSign up at ElevenLabs.\nRetrieve your API key from the dashboard.\nSign up at ElevenLabs.\nRetrieve your API key from the dashboard.\nConfigure Gmail API\n\nEnable Gmail API in Google Cloud Console.\nCreate OAuth credentials and authorize your email address for 
sending.\nEnable Gmail API in Google Cloud Console.\nCreate OAuth credentials and authorize your email address for sending.\nDeploy n8n & Install the Workflow\n\nInstall n8n (Installation Guide).\nAdd the required Google Calendar, ElevenLabs, and Gmail nodes.\nImport or build the workflow with the correct credentials.\nTest and fine-tune as needed.\nInstall n8n (Installation Guide).\nAdd the required Google Calendar, ElevenLabs, and Gmail nodes.\nImport or build the workflow with the correct credentials.\nTest and fine-tune as needed.\n\nThe LangChain Community node used in this workflow only works on self-hosted n8n instances. It is not compatible with n8n Cloud. Please ensure you are running a self-hosted instance before using this workflow.\n\nThis workflow ensures a professional and seamless experience for your clients, keeping them informed and engaged. 🚀🔊\n\n.\n\nPhil | Inforeole | Linkedin\n\n🇫🇷 Contactez nous pour automatiser vos processus",
"isPaid": false
},
{
"templateId": "3054",
"templateName": "template_3054",
"templateDescription": "🎥 AI Video Generator with HeyGen 🚀 Create AI-Powered Videos in n8n with HeyGen This workflow enables you to generate realistic AI videos using HeyGen, an...",
"templateUrl": "https://n8n.io/workflows/3054",
"jsonFileName": "template_3054.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_3054.json",
"jsonURL": "",
"screenshotURL": "",
"workflowUpdated": true,
"templateDescriptionFull": "This workflow enables you to generate realistic AI videos using HeyGen, an advanced AI platform for video automation. Simply input your text, choose an AI avatar and voice, and let HeyGen generate a high-quality video for you – all within n8n!\n\n✅ Ideal for:\n\nContent creators & marketers 🏆\nAutomating personalized video messages 📩\nAI-powered video tutorials & training materials 🎓\n\n1️⃣ Provide a text script – This will be spoken in the AI-generated video.\n2️⃣ Select an Avatar & Voice – Choose from a variety of AI-generated avatars and voices.\n3️⃣ Run the workflow – HeyGen processes your request and generates a video.\n4️⃣ Download your video – Get the direct link to your AI-powered video!\n\nSign up for a HeyGen account.\nGo to your account settings and retrieve your API Key.\n\nIn n8n, create new credentials and select \"Custom Auth\" as the authentication type.\nIn the Name provide : X-Api-Key\nAnd in the value paste your API key from Heygen\nUpdate the 2 http node with the right credentials.\n\nBrowse available avatars & voices in your HeyGen account.\nCopy the Avatar ID and Voice ID for your video.\n\nEnter your text, avatar ID, and voice ID.\nExecute the workflow – your video will be generated automatically!\n\n✔️ Fully Automated – No manual editing required!\n✔️ Realistic AI Avatars – Choose from a variety of digital avatars.\n✔️ Seamless Integration – Works directly within your n8n workflow.\n✔️ Scalable & Fast – Generate multiple videos in minutes.\n\n🔗 Start automating AI-powered video creation today with n8n & HeyGen!",
"isPaid": false
},
{
"templateId": "3142",
"templateName": "xSend and check TTS (Text-to-speech) voice calls end email verification",
"templateDescription": "This workflow automates the process of sending voice calls for verification purposes and combines it with email verification. It uses the ClickSend API for...",
"templateUrl": "https://n8n.io/workflows/3142",
"jsonFileName": "xSend_and_check_TTS_Text-to-speech_voice_calls_end_email_verification.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/xSend_and_check_TTS_Text-to-speech_voice_calls_end_email_verification.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/9abde228dce66265585d41010fdb77b5/raw/e11785b0c69b6f23fb2d5496fb7bde27b36a87df/xSend_and_check_TTS_Text-to-speech_voice_calls_end_email_verification.json",
"screenshotURL": "https://i.ibb.co/SwC2KXcH/eb404d0b0f98.png",
"workflowUpdated": true,
"gistId": "9abde228dce66265585d41010fdb77b5",
"templateDescriptionFull": "This workflow automates the process of sending voice calls for verification purposes and combines it with email verification. It uses the ClickSend API for voice calls and integrates with SMTP for email verification.\n\nThis workflow is a powerful tool for automating phone and email verification, ensuring a seamless and secure user verification process.\n\nBelow is a breakdown of the workflow:\n\nThe workflow is designed to verify a user's phone number and email address through a combination of voice calls and email verification. Here's how it works:\n\nForm Submission:\n\nThe workflow starts with a Form Trigger node, where users submit a form with the following fields:\n\nTo: The recipient's phone number (including the international prefix, e.g., +1xxxx).\nVoice: The voice type (male or female).\nLang: The language for the voice call (e.g., en-us, it-it, fr-fr, etc.).\nEmail: The recipient's email address.\nName: The recipient's name.\nThe workflow starts with a Form Trigger node, where users submit a form with the following fields:\n\nTo: The recipient's phone number (including the international prefix, e.g., +1xxxx).\nVoice: The voice type (male or female).\nLang: The language for the voice call (e.g., en-us, it-it, fr-fr, etc.).\nEmail: The recipient's email address.\nName: The recipient's name.\nTo: The recipient's phone number (including the international prefix, e.g., +1xxxx).\nVoice: The voice type (male or female).\nLang: The language for the voice call (e.g., en-us, it-it, fr-fr, etc.).\nEmail: The recipient's email address.\nName: The recipient's name.\nSet Voice Code:\n\nThe Set Voice Code node defines the verification code that will be spoken during the voice call.\nThe Set Voice Code node defines the verification code that will be spoken during the voice call.\nFormat Code for Voice:\n\nThe Code for Voice node formats the verification code by adding spaces between characters for better clarity during the voice call.\nThe 
Code for Voice node formats the verification code by adding spaces between characters for better clarity during the voice call.\nSend Voice Call:\n\nThe call includes the verification code, which is read aloud to the recipient.\nThe call includes the verification code, which is read aloud to the recipient.\nVerify Voice Code:\n\nThe Verify Voice Code node prompts the user to enter the code they received via the voice call.\nThe Is Voice Code Correct? node checks if the entered code matches the predefined code.\n\nIf correct, the workflow proceeds to email verification.\nIf incorrect, the user is notified of the failure.\nThe Verify Voice Code node prompts the user to enter the code they received via the voice call.\nThe Is Voice Code Correct? node checks if the entered code matches the predefined code.\n\nIf correct, the workflow proceeds to email verification.\nIf incorrect, the user is notified of the failure.\nIf correct, the workflow proceeds to email verification.\nIf incorrect, the user is notified of the failure.\nSet Email Code:\n\nThe Set Email Code node defines the verification code that will be sent via email.\nThe Set Email Code node defines the verification code that will be sent via email.\nSend Email:\n\nThe Send Email node sends an email to the recipient with the verification code using SMTP.\nThe Send Email node sends an email to the recipient with the verification code using SMTP.\nVerify Email Code:\n\nThe Verify Email Code node prompts the user to enter the code they received via email.\nThe Is Email Code Correct? node checks if the entered code matches the predefined code.\n\nIf correct, the user is notified of successful verification.\nIf incorrect, the user is notified of the failure.\nThe Verify Email Code node prompts the user to enter the code they received via email.\nThe Is Email Code Correct? 
node checks if the entered code matches the predefined code.\n\nIf correct, the user is notified of successful verification.\nIf incorrect, the user is notified of the failure.\nIf correct, the user is notified of successful verification.\nIf incorrect, the user is notified of the failure.\n\nTo set up and use this workflow in n8n, follow these steps:\n\nClickSend API Key:\n\nCreate an account on ClickSend and obtain your API Key.\nIn the Send Voice node, set up HTTP Basic Authentication:\n\nUsername: Use the username you registered with on ClickSend.\nPassword: Use the API Key provided by ClickSend.\nCreate an account on ClickSend and obtain your API Key.\nIn the Send Voice node, set up HTTP Basic Authentication:\n\nUsername: Use the username you registered with on ClickSend.\nPassword: Use the API Key provided by ClickSend.\nUsername: Use the username you registered with on ClickSend.\nPassword: Use the API Key provided by ClickSend.\nSMTP Configuration:\n\nSet up SMTP credentials in n8n for the Send Email node.\nEnsure the SMTP server is configured to send emails from the specified email address.\nSet up SMTP credentials in n8n for the Send Email node.\nEnsure the SMTP server is configured to send emails from the specified email address.\nForm Configuration:\n\nThe Form Trigger node is pre-configured with fields for:\n\nTo: The recipient's phone number.\nVoice: Choose between male or female voice.\nLang: Select the language for the voice call.\nEmail: The recipient's email address.\nName: The recipient's name.\n\n\nCustomize the form fields if needed.\nThe Form Trigger node is pre-configured with fields for:\n\nTo: The recipient's phone number.\nVoice: Choose between male or female voice.\nLang: Select the language for the voice call.\nEmail: The recipient's email address.\nName: The recipient's name.\nTo: The recipient's phone number.\nVoice: Choose between male or female voice.\nLang: Select the language for the voice call.\nEmail: The recipient's email 
address.\nName: The recipient's name.\nCustomize the form fields if needed.\nSet Verification Codes:\n\nIn the Set Voice Code node, define the verification code that will be spoken during the voice call.\nIn the Set Email Code node, define the verification code that will be sent via email.\nIn the Set Voice Code node, define the verification code that will be spoken during the voice call.\nIn the Set Email Code node, define the verification code that will be sent via email.\nTest the Workflow:\n\nSubmit the form with the required details (phone number, voice, language, email, and name).\nThe workflow will:\n\nSend a voice call with the verification code.\nPrompt the user to verify the code.\nSend an email with the verification code.\nPrompt the user to verify the email code.\nNotify the user of success or failure.\nSubmit the form with the required details (phone number, voice, language, email, and name).\nThe workflow will:\n\nSend a voice call with the verification code.\nPrompt the user to verify the code.\nSend an email with the verification code.\nPrompt the user to verify the email code.\nNotify the user of success or failure.\nSend a voice call with the verification code.\nPrompt the user to verify the code.\nSend an email with the verification code.\nPrompt the user to verify the email code.\nNotify the user of success or failure.\n\nContact me for consulting and support or add me on Linkedin.",
"isPaid": false
},
{
"templateId": "3072",
"templateName": "Send TTS (Text-to-speech) voice calls",
"templateDescription": "This workflow automates the process of sending text-to-speech (TTS) voice calls using API. It allows users to submit a form with the message content,...",
"templateUrl": "https://n8n.io/workflows/3072",
"jsonFileName": "Send_TTS_Text-to-speech_voice_calls.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Send_TTS_Text-to-speech_voice_calls.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/23547a5cabf2eceaf66752184541bf6e/raw/db519df14da2929ce2270c8f9ec782e51523169a/Send_TTS_Text-to-speech_voice_calls.json",
"screenshotURL": "https://i.ibb.co/mFXC83sq/f163ad53ba6b.png",
"workflowUpdated": true,
"gistId": "23547a5cabf2eceaf66752184541bf6e",
"templateDescriptionFull": "This workflow automates the process of sending text-to-speech (TTS) voice calls using API. It allows users to submit a form with the message content, recipient's phone number, voice type, and language, and then sends a voice call with the provided text.\n\nThis workflow is a simple yet powerful way to automate text-to-speech voice calls using API. It’s ideal for notifications, reminders, or any scenario where voice communication is needed.\n\nBelow is a breakdown of the workflow:\n\nThe workflow is designed to send voice calls with text-to-speech functionality. Here's how it works:\n\nForm Submission:\n\nThe workflow starts with a Form Trigger node, where users submit a form with the following fields:\n\nBody: The text message to be converted to speech (max 600 characters).\nTo: The recipient's phone number (including the international prefix, e.g., +39xxxxxxxxxx).\nVoice: The voice type (male or female).\nLang: The language for the voice call (e.g., en-us, it-it, fr-fr, etc.).\n\n\nOnce the form is submitted, the workflow is triggered.\nThe workflow starts with a Form Trigger node, where users submit a form with the following fields:\n\nBody: The text message to be converted to speech (max 600 characters).\nTo: The recipient's phone number (including the international prefix, e.g., +39xxxxxxxxxx).\nVoice: The voice type (male or female).\nLang: The language for the voice call (e.g., en-us, it-it, fr-fr, etc.).\nBody: The text message to be converted to speech (max 600 characters).\nTo: The recipient's phone number (including the international prefix, e.g., +39xxxxxxxxxx).\nVoice: The voice type (male or female).\nLang: The language for the voice call (e.g., en-us, it-it, fr-fr, etc.).\nOnce the form is submitted, the workflow is triggered.\nSend Voice Call:\n\nThe Send Voice node sends a POST request to the ClickSend API (https://rest.clicksend.com/v3/voice/send).\nThe request includes:\n\nThe text message (Body) to be converted to 
speech.\nThe recipient's phone number (To).\nThe voice type (Voice).\nThe language (Lang).\nMachine detection is enabled to detect if the call is answered by a machine.\n\n\nThe API processes the request and initiates a voice call to the specified number, where the text is read aloud by the selected voice.\nThe Send Voice node sends a POST request to the ClickSend API (https://rest.clicksend.com/v3/voice/send).\nThe request includes:\n\nThe text message (Body) to be converted to speech.\nThe recipient's phone number (To).\nThe voice type (Voice).\nThe language (Lang).\nMachine detection is enabled to detect if the call is answered by a machine.\nThe text message (Body) to be converted to speech.\nThe recipient's phone number (To).\nThe voice type (Voice).\nThe language (Lang).\nMachine detection is enabled to detect if the call is answered by a machine.\nThe API processes the request and initiates a voice call to the specified number, where the text is read aloud by the selected voice.\nOutcome:\n\nThe recipient receives a voice call, and the submitted text is read aloud in the chosen voice and language.\nThe recipient receives a voice call, and the submitted text is read aloud in the chosen voice and language.\n\nTo set up and use this workflow in n8n, follow these steps:\n\nRegister on ClickSend:\n\nGo to ClickSend and create an account.\nObtain your API Key and take advantage of the 2 € free credits provided.\nGo to ClickSend and create an account.\nObtain your API Key and take advantage of the 2 € free credits provided.\nConfigure ClickSend API in n8n:\n\nIn the Send Voice node, set up HTTP Basic Authentication:\n\nUsername: Use the username you registered with on ClickSend.\nPassword: Use the API Key provided by ClickSend.\nIn the Send Voice node, set up HTTP Basic Authentication:\n\nUsername: Use the username you registered with on ClickSend.\nPassword: Use the API Key provided by ClickSend.\nUsername: Use the username you registered with on 
ClickSend.\nPassword: Use the API Key provided by ClickSend.\nSet Up the Form Trigger:\n\nThe Form Trigger node is pre-configured with fields for:\n\nBody: The text message to be converted to speech.\nTo: The recipient's phone number.\nVoice: Choose between male or female voice.\nLang: Select the language for the voice call.\n\n\nCustomize the form fields if needed (e.g., add more languages or voice options).\nThe Form Trigger node is pre-configured with fields for:\n\nBody: The text message to be converted to speech.\nTo: The recipient's phone number.\nVoice: Choose between male or female voice.\nLang: Select the language for the voice call.\nBody: The text message to be converted to speech.\nTo: The recipient's phone number.\nVoice: Choose between male or female voice.\nLang: Select the language for the voice call.\nCustomize the form fields if needed (e.g., add more languages or voice options).\nTest the Workflow:\n\nSubmit the form with the required details (text, phone number, voice, and language).\nThe workflow will send a voice call to the specified number, and the recipient will hear the text read aloud.\nSubmit the form with the required details (text, phone number, voice, and language).\nThe workflow will send a voice call to the specified number, and the recipient will hear the text read aloud.\nOptional Customization:\n\nModify the workflow to include additional features, such as:\n\nAdding more languages or voice options.\nSending multiple voice calls in bulk.\nIntegrating with other APIs or services for advanced use cases.\nModify the workflow to include additional features, such as:\n\nAdding more languages or voice options.\nSending multiple voice calls in bulk.\nIntegrating with other APIs or services for advanced use cases.\nAdding more languages or voice options.\nSending multiple voice calls in bulk.\nIntegrating with other APIs or services for advanced use cases.\n\nContact me for consulting and support or add me on Linkedin.",
"isPaid": false
},
{
"templateId": "2648",
"templateName": "template_2648",
"templateDescription": "This n8n template demonstrates a simple approach to using AI to automate the generation of blog content which aligns to your organisation's brand voice and...",
"templateUrl": "https://n8n.io/workflows/2648",
"jsonFileName": "template_2648.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_2648.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/bec8adbfdbbf27c3483a1c12136737a6/raw/8da6ed527621e9952901da9343fdb8e89de2d34d/template_2648.json",
"screenshotURL": "https://i.ibb.co/JR9k2d4L/d9121f468b91.png",
"workflowUpdated": true,
"gistId": "bec8adbfdbbf27c3483a1c12136737a6",
"templateDescriptionFull": "This n8n template demonstrates a simple approach to using AI to automate the generation of blog content which aligns to your organisation's brand voice and style by using examples of previously published articles.\n\nIn a way, it's quick and dirty \"training\" which can get your automated content generation strategy up and running for very little effort and cost whilst you evaluate our AI content pipeline.\n\nIn this demonstration, the n8n.io blog is used as the source of existing published content and 5 of the latest articles are imported via the HTTP node.\nThe HTML node is extract the article bodies which are then converted to markdown for our LLMs.\nWe use LLM nodes to (1) understand the article structure and writing style and (2) identify the brand voice characteristics used in the posts.\nThese are then used as guidelines in our final LLM node when generating new articles.\nFinally, a draft is saved to Wordpress for human editors to review or use as starting point for their own articles.\n\nUpdate Step 1 to fetch data from your desired blog or change to fetch existing content in a different way.\nUpdate Step 5 to provide your new article instruction. For optimal output, theme topics relevant to your brand.\n\nA source of text-heavy content is required to accurately breakdown the brand voice and article style. Don't have your own? Maybe try your competitors?\nOpenAI for LLM - though I recommend exploring other models which may give subjectively better results.\nWordpress for blog but feel free to use other preferred publishing platforms.\n\nIdeally, you'd want to \"train\" your agent on material which is similar to your output ie. your social media post may not get the best results from your blog content due to differing formats.\nTypically, this brand voice extraction exercise should run once and then be cached somewhere for reuse later. This would save on generation time and overall cost of the workflow.",
"isPaid": false
},
{
"templateId": "3435",
"templateName": "Lead Generation System (Template)",
"templateDescription": "You Don’t Need More Tools. You Just Need the Right Leads.Why spend $1,000s on lead gen when your perfect leads are already waiting in Apollo? You’ve already...",
"templateUrl": "https://n8n.io/workflows/3435",
"jsonFileName": "Lead_Generation_System_Template.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Lead_Generation_System_Template.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/490baea221a514e66137722574d38d27/raw/c4c684108c7bdf36e135485b25377aad2c1f2e0c/Lead_Generation_System_Template.json",
"screenshotURL": "https://i.ibb.co/SpZcRrS/6f2fa8c9f526.png",
"workflowUpdated": true,
"gistId": "490baea221a514e66137722574d38d27",
"templateDescriptionFull": "Why spend $1,000s on lead gen when your perfect leads are already waiting in Apollo? You’ve already filtered the ideal prospects. You know who they are, where they work, and what they do.\n\nNow imagine turning that list into enriched, ready-to-contact leads—without paying pricey Apollo's recurring subscription (spoiler: you will pay only 0.60$ per 500 leads).\n\nWith the Lead Generation System, you just drop your Apollo search URL.\n\nThe workflow does the rest:\n✅ Scrapes all matching contacts from your Apollo filter\n✅ Enriches and organizes the data (names, roles, emails, LinkedIns, companies, etc.)\n✅ Delivers the final lead list to Airtable—or your CRM of choice\n\nNo more manual exports.\nNo CSV mess.\nNo VA needed.\nJust qualified leads, cleaned and ready to go.\n\nFounders doing DIY outbound\nGrowth marketers scaling cold email\nAgencies running lead-gen for clients\nAnyone tired of paying too much for messy, outdated lists\n\nI built a step-by-step guide to setup this workflow in 5 to 10 minutes, available here: https://notanothermarketer.gitbook.io/home/templates/lead-generation\n\nThis template is free. Enjoy!",
"isPaid": false
},
{
"templateId": "3580",
"templateName": "template_3580",
"templateDescription": "LinkedIn Hiring Signal Scraper — Jobs & Prospecting Using Bright Data Purpose:Discover recent job posts from LinkedIn using Bright Data's Dataset API, clean...",
"templateUrl": "https://n8n.io/workflows/3580",
"jsonFileName": "template_3580.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_3580.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/9789a0e2da58207a040fde7967027082/raw/eddde4c088bff95017614ff730f6a2c48dbb0b78/template_3580.json",
"screenshotURL": "https://i.ibb.co/1YF76bhk/19d9f1977c9c.png",
"workflowUpdated": true,
"gistId": "9789a0e2da58207a040fde7967027082",
"templateDescriptionFull": "LinkedIn Hiring Signal Scraper — Jobs & Prospecting Using Bright Data\n\nDiscover recent job posts from LinkedIn using Bright Data's Dataset API, clean the results, and log them into Google Sheets — for both job hunting and identifying high-intent B2B leads based on hiring activity.\n\nJob Seekers – Spot relevant openings filtered by role, city, and country.\nSales & Prospecting – Use job posts as buying signals.\nIf a company is hiring for a role you support (e.g. marketers, developers, ops) —\nit's the perfect time to reach out and offer your services.\n\nn8n Nodes:\n\nForm Trigger\nHTTP Request\nWait\nIf\nCode\nGoogle Sheets\nSticky Notes (for embedded guidance)\nForm Trigger\nHTTP Request\nWait\nIf\nCode\nGoogle Sheets\nSticky Notes (for embedded guidance)\nExternal Services:\n\nBright Data (Dataset API)\nGoogle Sheets\nBright Data (Dataset API)\nGoogle Sheets\n\nBright Data API Key\n→ Add in the HTTP Request headers:\nAuthorization: Bearer YOUR_BRIGHTDATA_API_KEY\nGoogle Sheets OAuth2\n→ Connect your account in n8n to allow read/write access to the spreadsheet.\n\nUse descriptive names for all nodes.\nInclude retry logic in polling to avoid infinite loops.\nFlatten nested fields (like job_poster and base_salary).\nStrip out HTML tags from job descriptions for clean output.\n\nBright Data snapshots take ~1–3 minutes — use a Wait node and polling.\nForm filters affect output significantly:\n🔍 We recommend filtering by \"Last 7 days\" or \"Past 24 hours\" for fresher data.\nAvoid hardcoding values in the form — leave optional filters empty if unsure.\n\nAfter data lands in Google Sheets, you can use it to:\n\nPersonalize cold emails based on job titles, locations, and hiring signals.\nSend thoughtful LinkedIn messages (e.g., \"Saw you're hiring a CMO...\")\nPrioritize outreach to companies actively growing in your niche.\nPersonalize cold emails based on job titles, locations, and hiring signals.\nSend thoughtful LinkedIn messages 
(e.g., \"Saw you're hiring a CMO...\")\nPrioritize outreach to companies actively growing in your niche.\n\n📄 Copy the Google Sheet Template:\nClick here to make your copy\n→ Rename for each campaign or client.\nForm fields include:\n\nJob Location (city or region)\nKeyword (e.g., CMO, Backend Developer)\nCountry (2-letter code, e.g., US, UK)\nJob Location (city or region)\nKeyword (e.g., CMO, Backend Developer)\nCountry (2-letter code, e.g., US, UK)\n\nThis workflow gives you a competitive edge —\n📌 For candidates: Be first to apply.\n📌 For sellers: Be first to pitch.\nAll based on live hiring signals from LinkedIn.\n\nOpen this template\nGo to File → Make a copy\nYou'll use this copy as the destination for the scraped job posts\n\nThe form allows you to define what kind of job posts you want to scrape.\n\nFields:\n\nJob Location → e.g. New York, Berlin, Remote\nKeyword → e.g. CMO, AI Architect, Ecommerce Manager\nCountry Code (2-letter) → e.g. US, UK, IL\n\n💡 Pro Tip:\nFor best results, set the filter inside the workflow to:\ntime_range = \"Past 24 hours\" or \"Last 7 days\"\nThis keeps results relevant and fresh.\n\nThe workflow sends a request to Bright Data with your input.\n\nExample API Call Body:\n\nBright Data will start preparing the dataset in the background.\n\nThe workflow includes a Wait Node and Polling Loop that checks every few minutes until the data is ready.\n\nYou don't need to do anything here — it's all automated.\n\nOnce Bright Data responds with the full job post list:\n\n✔️ Nested fields like job_poster and base_salary are flattened\n✔️ HTML in job descriptions is removed\n✔️ Final data is formatted for export\n\nThe final cleaned list is added to your Google Sheet (first tab).\n\nEach row = one job post, with columns like:\n\njob_title, company_name, location, salary_min, apply_link, job_description_plain\n\nYou search for:\n\nLocation: Berlin\nKeyword: Product Designer\nCountry: DE\nTime range: Past 7 days\n\nNow you've got a live list of 
roles — with salary, recruiter info, and apply links.\n→ Use it to apply faster than others.\n\nYou search for:\n\nLocation: London\nKeyword: Growth Marketing\nCountry: UK\n\nAnd find companies hiring growth marketers.\n→ That's your signal to offer help with media buying, SEO, CRO, or your relevant service.\n\nUse the data to:\n\nWrite personalized cold emails (\"Saw you're hiring a Growth Marketer…\")\nStart warm LinkedIn outreach\nBuild lead lists of companies actively expanding in your niche\n\nBright Data API Key\nUsed in HTTP headers: Authorization: Bearer YOUR_BRIGHTDATA_API_KEY\nGoogle Sheets OAuth2\nAllows n8n to read/write to your spreadsheet\n\nModify the HTTP Request body to add more filters (e.g. job_type, remote, company)\nIncrease or reduce polling wait time depending on Bright Data speed\nAdd scoring logic to prioritize listings based on title or location\n\n📄 Google Sheet Template:\nMake your copy here\n⚙️ Bright Data Dataset API:\nVisit BrightData.com\n📬 Personalization works best when you act quickly.\nUse the freshest data to reach out with context — not generic pitches.\n\nThis workflow turns LinkedIn job posts into sales insights and job leads.\nAll in one click. Fully automated. Ready for your next move.",
"isPaid": false
},
{
"templateId": "5385",
"templateName": "Google Maps Email Scraper with HTTP Requests & JavaScript",
"templateDescription": "Google Maps Email Scraper System. Categories: Lead Generation, Web Scraping, Business Automation. This workflow creates a completely free Google Maps email...",
"templateUrl": "https://n8n.io/workflows/5385",
"jsonFileName": "Google_Maps_Email_Scraper_with_HTTP_Requests__JavaScript.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Google_Maps_Email_Scraper_with_HTTP_Requests__JavaScript.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/58737e75950b2920c57148399418dbb1/raw/0454b3844ea19244ab4a07591e905971efa8e89d/Google_Maps_Email_Scraper_with_HTTP_Requests__JavaScript.json",
"screenshotURL": "https://i.ibb.co/JWRkmTrJ/1c08cc80815c.png",
"workflowUpdated": true,
"gistId": "58737e75950b2920c57148399418dbb1",
"templateDescriptionFull": "Categories: Lead Generation, Web Scraping, Business Automation\n\nThis workflow creates a completely free Google Maps email scraping system that extracts unlimited business emails without requiring expensive third-party APIs. Built entirely in N8N using simple HTTP requests and JavaScript, this system can generate thousands of targeted leads for any industry or location while operating at 99% free cost structure.\n\nZero API Costs - Operates entirely through free Google Maps scraping without expensive third-party services\nUnlimited Lead Generation - Extract emails from thousands of Google Maps listings across any industry\nGeographic Targeting - Search by specific cities, regions, or business types for precise lead targeting\nComplete Automation - From search query to organized email list with minimal manual intervention\nBuilt-in Data Cleaning - Automatic duplicate removal, filtering, and data validation\nScalable Processing - Handle hundreds of businesses per search with intelligent rate limiting\n\nGoogle Maps Search Integration:\n\nUses strategic HTTP requests to Google Maps search URLs\nProcesses search queries like \"Calgary + dentist\" to extract business listings\nBypasses API restrictions through direct HTML scraping techniques\n\nIntelligent URL Extraction:\n\nCustom JavaScript regex patterns extract website URLs from Google Maps data\nFilters out irrelevant domains (Google, schema, static files)\nReturns clean list of actual business websites for processing\n\nSmart Website Processing:\n\nLoop-based architecture prevents IP blocking through intelligent batching\nBuilt-in delays and redirect handling for reliable scraping\nProcesses each website individually with error handling\n\nEmail Pattern Recognition:\n\nAdvanced regex patterns identify email addresses within website HTML\nExtracts contact emails, info emails, and administrative addresses\nHandles multiple email formats and validation patterns\n\nData Aggregation & 
Cleaning:\n\nAutomatically removes duplicate emails across all processed websites\nFilters null entries and invalid email formats\nExports clean, organized email lists to Google Sheets\n\nCreate a Google Sheet with these exact column headers:\n\nSearch Tracking Sheet:\n\nsearches - Contains your search queries (e.g., \"Calgary dentist\", \"Miami lawyers\")\n\nEmail Results Sheet:\n\nemails - Contains extracted email addresses from all processed websites\n\nSetup Instructions:\n\nCreate Google Sheet with two tabs: \"searches\" and \"emails\"\nAdd your target search queries to the searches tab (one per row)\nConnect Google Sheets OAuth credentials in n8n\nUpdate the Google Sheets document ID in all sheet nodes\n\nThe workflow reads search queries from the first sheet and exports results to the second sheet automatically.\n\nLocal Service Providers - Find competitors and potential partners in specific geographic areas\nB2B Sales Teams - Generate targeted prospect lists for cold outreach campaigns\nMarketing Agencies - Build industry-specific lead databases for client campaigns\nReal Estate Professionals - Identify businesses in target neighborhoods for commercial opportunities\nFranchise Development - Research potential markets and existing competition\nMarket Research - Analyze business density and contact information across regions\n\nThis system transforms lead generation economics:\n\n$0 per lead vs. $2-5 per lead from paid databases\nProcess 1,000+ leads daily without hitting API limits\nSell as a service for $500-2,000 per industry/location\nPerfect for agencies offering lead generation to local businesses\n\nDifficulty Level: Intermediate\nEstimated Build Time: 1-2 hours\nMonthly Operating Cost: $0 (completely free)\n\nWant to watch me build this entire system live from scratch? 
I walk through every single step - including the JavaScript code, regex patterns, error handling, and all the debugging that goes into creating a bulletproof scraping system.\n\n🎥 Watch My Live Build: \"Scrape Unlimited Leads WITHOUT Paying for APIs (99% FREE)\"\n\nThis comprehensive tutorial shows the real development process - including writing custom JavaScript, handling rate limits, and building systems that actually work at scale without getting blocked.\n\nBasic Workflow Architecture:\n\nSet up manual trigger for testing and Google Sheets integration\nConfigure initial HTTP request node for Google Maps searches\nEnable SSL ignore and response headers for reliable scraping\n\nURL Extraction Code Setup:\n\nConfigure JavaScript code node with custom regex patterns\nSet up input data processing from Google Maps HTML responses\nImplement URL filtering logic to remove irrelevant domains\n\nWebsite Processing Pipeline:\n\nAdd \"Split in Batches\" node for intelligent loop processing\nConfigure HTTP request nodes with proper delays and redirect handling\nSet up error handling for websites that can't be scraped\n\nEmail Extraction System:\n\nImplement JavaScript code node with email-specific regex patterns\nConfigure email validation and format checking\nSet up data aggregation for multiple emails per website\n\nData Cleaning & Export:\n\nConfigure filtering nodes to remove null entries and duplicates\nSet up \"Split Out\" node to aggregate emails into single list\nConnect Google Sheets integration for organized data export\n\nTesting & Optimization:\n\nUse limit nodes during testing to prevent IP blocking\nTest with small batches before scaling to full searches\nImplement proxy integration for high-volume usage\n\nScale the system with:\n\nMulti-Page Scraping: Extract URLs from homepages, then scrape contact pages for more emails\nProxy Integration: Add residential proxies for unlimited scraping without rate limits\nIndustry Templates: Create pre-configured searches 
for different business types\nContact Information Expansion: Extract phone numbers, addresses, and social media profiles\nCRM Integration: Automatically add leads to sales pipelines and marketing sequences\n\nRate Limiting: Built-in delays prevent IP blocking during normal usage\nScalability: For high-volume usage, consider proxy services for unlimited requests\nCompliance: Ensure proper usage rights for extracted contact information\nData Quality: System includes filtering but manual verification recommended for critical campaigns\n\nFor more advanced automation systems and business-building strategies that generate real revenue, explore my YouTube channel where I share proven automation techniques used by successful agencies and entrepreneurs.",
"isPaid": false
},
{
"templateId": "4685",
"templateName": "Lead 1",
"templateDescription": "🔧 Workflow Summary: This system automates LinkedIn lead generation and enrichment in six clear stages: 1. Lead Collection (via Apollo.io) Automatically pulls...",
"templateUrl": "https://n8n.io/workflows/4685",
"jsonFileName": "Lead_1.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Lead_1.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/c580679135397ba214f7d2e039c3826d/raw/8fc7990f9348d687354a3a4e72a93ebd12854186/Lead_1.json",
"screenshotURL": "https://i.ibb.co/SwC2KXcH/eb404d0b0f98.png",
"workflowUpdated": true,
"gistId": "c580679135397ba214f7d2e039c3826d",
"templateDescriptionFull": "This system automates LinkedIn lead generation and enrichment in six clear stages:\n\n1. Lead Collection (via Apollo.io)\n\nAutomatically pulls leads based on keywords, roles, or industries using Apollo’s API.\nCaptures name, job title, company, and LinkedIn profile URL.\nYou can kick off the workflow via form, webhook, WhatsApp, Telegram, or any other custom trigger that passes search parameters.\n\n2. LinkedIn Username Extraction\n\nExtracts usernames from LinkedIn profile URLs using a script step.\nThese usernames are required for further enrichment using RapidAPI.\n\n3. Email Retrieval (via Apollo.io User ID)\n\nFetches the verified work email using the Apollo User ID.\nEmail validity is double-checked using www.mails.so, which filters out undeliverable or inactive emails by checking MX records and deliverability.\n\n4. Profile Summary (via LinkedIn API on RapidAPI)\n\nEnriches lead data by pulling bio/summary details to understand their background and expertise.\n\n5. Activity Insights (Posts & Reposts)\n\nCollects recent posts or reposts to help craft personalised messages based on what they’re currently engaging with.\n\n6. Leads Sheet Update\n\nAll data is written into a Google Sheet.\nNew columns are populated dynamically without erasing existing data.\n\n⸻\n\nEach workflow is equipped with a fail-safe system:\n\nTracks status per row: ✅ done, ❌ failed, ⏳ pending\nFailed rows are automatically retried after a custom delay (e.g., 2 weeks).\nEnsures minimal drop-offs and complete data coverage.\n\nMake a copy of the following:\n\nTemplate 1: Apollo Leads Scraper & Enrichment\nTemplate 2: Final Enriched Leads\n\nThe system appends data (like emails, bios, activity) step by step.\n\n1. Apollo API\n\nSign up and generate an API key at the Apollo Developer Portal\nBe sure to enable the “Master API Key” toggle so the same key works for all endpoints.\n\n2. 
LinkedIn Data API (via RapidAPI)\n\nSubscribe at RapidAPI - LinkedIn Data\nUse your key in the x-rapidapi-key header.\n\n3. Mails.so API\n\nGet your API key from the mails.so dashboard\n\n✅ Common Mistakes & Fixes\n\n1. API Keys Not Working\n\nMake sure API keys for Apollo, RapidAPI, and mails.so are correct.\nApollo “Master API Key” must be enabled.\nKeys should be saved as Generic Credentials in n8n.\n\n2. Leads Not Found\n\nCheck if the search query (keyword/job title) is too narrow.\nApollo might return empty results if the filters are incorrect.\n\n3. LinkedIn URLs Missing or Invalid\n\nEnsure Apollo is returning valid LinkedIn URLs.\nImproper URLs will cause username extraction and enrichment steps to fail.\n\n4. Emails Not Coming Through\n\nApollo may not have verified emails for all leads.\nmails.so might reject invalid or expired email addresses.\n\n5. Google Sheet Not Updating\n\nMake sure the Google Sheet is shared with the right Google account (linked to n8n).\nCheck if the column names match and data isn’t blocked due to formatting.\n\n6. Status Columns Not Changing\n\nEach row must have done, failed, or pending in the status column.\nIf the status doesn’t update, the retry logic won’t trigger.\n\n7. RapidAPI Not Returning Data\n\nDouble-check that the username is present and valid.\nMake sure the RapidAPI plan is active and within limits.\n\n8. Workflow Not Running\n\nCheck if the trigger node (form, webhook, etc.) is connected and active.\nMake sure you’re passing the required inputs (keyword, role, etc.).\n\nNeed Help? Contact www.KrupalPatel.com for support and custom workflow development.",
"isPaid": false
},
{
"templateId": "4794",
"templateName": "Upwork Job Poster Email",
"templateDescription": "Automated solution to extract and organize contact information from Upwork job postings, enabling direct outreach to potential clients who post jobs...",
"templateUrl": "https://n8n.io/workflows/4794",
"jsonFileName": "Upwork_Job_Poster_Email.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Upwork_Job_Poster_Email.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/b7c5c18f0c0493a9f755c82afd7ba6d2/raw/9f1917bab1b9479731ad74b107a78fbff2e7fecc/Upwork_Job_Poster_Email.json",
"screenshotURL": "https://i.ibb.co/mFXC83sq/f163ad53ba6b.png",
"workflowUpdated": true,
"gistId": "b7c5c18f0c0493a9f755c82afd7ba6d2",
"templateDescriptionFull": "Automated solution to extract and organize contact information from Upwork job postings, enabling direct outreach to potential clients who post jobs matching your expertise.\n\nScrapes job postings for contact information\nExtracts email addresses and social profiles\nOrganizes leads in a structured format\nEnables direct outreach campaigns\nTracks response rates\n\nFreelancers looking to expand their client base\nAgencies targeting specific industries\nSales professionals in the gig economy\nRecruiters sourcing clients\nDigital marketing agencies\n\n✅ Access to hidden contact information\n✅ Expand your client base\n✅ Beat the competition to opportunities\n✅ Targeted outreach campaigns\n✅ Higher response rates\n\nUpwork account\nn8n instance\nEmail service (for outreach)\nCRM (optional)\n\nEmail pattern detection\nSocial media profile extraction\nCompany website discovery\nLead scoring system\nOutreach tracking\n\nStart collecting leads in 20 minutes with our step-by-step guide\n\nTake control of your freelance career with direct access to potential clients. Transform how you find and secure projects on Upwork.",
"isPaid": false
},
{
"templateId": "2446",
"templateName": "template_2446",
"templateDescription": "Purpose: Use a lightweight Voice Interface, for you and your entire organization, to interact with an AI Supervisor, a personal AI Assistant, which has access...",
"templateUrl": "https://n8n.io/workflows/2446",
"jsonFileName": "template_2446.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_2446.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/9d286eecc37157601bdf4a164c5fea79/raw/56643579389606036cb9c60f683ab47a22c87653/template_2446.json",
"screenshotURL": "https://i.ibb.co/yFmjHvGZ/fa3bfbf39a72.png",
"workflowUpdated": true,
"gistId": "9d286eecc37157601bdf4a164c5fea79",
"templateDescriptionFull": "Use a lightweight Voice Interface, for you and your entire organization, to interact with an AI Supervisor, a personal AI Assistant, which has access to your custom workflows. You can also connect the supervisor to your already existing Agents.\n\nAfter recording a message in the Vagent App, it gets transcribed and sent in combination with a session ID to the registered webhook\nThe Main Agent acts as a router. It interprets the message while using the stored chat history (bound to the session ID) and chooses which tool to use to perform the required action. Tools on this level are workflows, which contain subordinated Agents. Since the Main Agent interprets the original message, the raw input is passed to the Tools/Sub-Agents as a separate parameter\nWithin the Sub-Agents the actual processing takes place. Each of those has its separate chat memory (with a suffix to the main session ID), to achieve a clear separation of concerns\nDepending on the required action an HTTP Request Tool is called. The result is formatted in Markdown and returned to the Main Agent with an additional short prompt, so it does not get interpreted by the Main Agent.\nDrafts are separated from a short message by added indentation (angle brackets). If some information is missing, no tool is called just yet; instead, a message is returned to the user\nThe Main Agent then outputs the result from the called Sub-Agent. If a draft is included, it gets separated from the spoken output\nFinally, the formatted output is returned as the response to the webhook. The message is split into a spoken and a text version, which enables the App to avoid reading out loud unnecessary information like drafts in this example\n\nSee the full documentation of Vagent: https://vagent.io/docs\n\nImport this workflow into your n8n instance\nFollow the instructions given in the sticky notes on the canvas\nSet up your credentials. 
OpenAI can be replaced by another LLM in the workflow, but is required for the App to work. Google Calendar and Notion are required for all scenarios to work\nCopy the Webhook URL from the Webhook node of the main workflow\nDownload the Vagent App from https://vagent.io\nIn the settings paste your OpenAI API Token, the Webhook URL and the password defined for Header Auth\nNow you can use the App to interact with the Multi-Agent using your Voice by tapping the Mic symbol in the App to record your message.\n\nTo use the chat trigger (for testing) properly, temporarily disable the nodes after the Tools Agent.",
"isPaid": false
},
{
"templateId": "3666",
"templateName": "Property Lead Contact Enrichment from CRM",
"templateDescription": "How It Works: This workflow automates the entire property lead generation process in a few simple steps: Property Search: Connects to BatchData's Property...",
"templateUrl": "https://n8n.io/workflows/3666",
"jsonFileName": "Property_Lead_Contact_Enrichment_from_CRM.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Property_Lead_Contact_Enrichment_from_CRM.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/dab4a84695a124fd47a54a05b4330719/raw/81abfa3b106e49c27b11d92c8ab4aeb5f390911d/Property_Lead_Contact_Enrichment_from_CRM.json",
"screenshotURL": "https://i.ibb.co/8L1Kbbmm/f0a410e5b0b1.png",
"workflowUpdated": true,
"gistId": "dab4a84695a124fd47a54a05b4330719",
"templateDescriptionFull": "This workflow automates the entire property lead generation process in a few simple steps:\n\nProperty Search: Connects to BatchData's Property Search API with customizable parameters (location, property type, value range, equity percentage, etc.)\nLead Filtering & Scoring: Processes results to identify the most promising leads based on criteria like absentee ownership, years owned, equity percentage, and tax status. Each property receives a lead score to prioritize follow-up.\nSkip Tracing: Automatically retrieves owner contact information (phone, email, mailing address) for each qualified property.\nData Formatting: Structures all property and owner data into a clean, organized format ready for your systems.\nMulti-Channel Output:\n\nGenerates an Excel spreadsheet with all lead details\nPushes leads directly to your CRM (configurable for HubSpot, Salesforce, etc.)\nSends a summary email with the spreadsheet attached\n\nThe workflow can run on a daily schedule or be triggered manually as needed. All parameters are easily configurable through dedicated nodes, requiring no coding knowledge.\n\nThis workflow is perfect for:\n\nReal Estate Investors looking to find off-market properties with motivated sellers\nReal Estate Agents who want to generate listing leads from distressed or high-equity properties\nInvestment Companies that need regular lead flow for acquisitions\nReal Estate Marketers who run targeted campaigns to property owners\nWholesalers seeking to build a pipeline of potential deals\nProperty Service Providers (roof repair, renovation contractors, etc.) who target specific property types\n\nAnyone who needs reliable, consistent lead generation for real estate without the manual work of searching, filtering, and organizing property data will benefit from this automation.\n\nBatchData is a comprehensive property data provider that offers access to nationwide property information, owner details, and skip tracing services. 
Key features include:\n\nExtensive Database: Covers 150+ million properties across all 50 states\nRich Property Data: Includes ownership information, tax records, sales history, valuation estimates, equity positions, and more\nSkip Tracing Services: Provides owner contact information including phone numbers, email addresses, and mailing addresses\nDistressed Property Indicators: Flags for pre-foreclosure, tax delinquency, vacancy, and other motivation factors\nRESTful API: Professional API for programmatic access to all property data services\nRegular Updates: Continuously refreshed data for accurate information\n\nBatchData's services are designed for real estate professionals who need reliable property and owner information to power their marketing and acquisition strategies. Their API-first approach makes it ideal for workflow automation tools like N8N.",
"isPaid": false
},
{
"templateId": "3490",
"templateName": "HDW Lead Geländewagen",
"templateDescription": "⚠️ DISCLAIMER: This workflow uses the AnySite LinkedIn community node, which is only available on self-hosted n8n instances....",
"templateUrl": "https://n8n.io/workflows/3490",
"jsonFileName": "HDW_Lead_Geländewagen.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/HDW_Lead_Geländewagen.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/f700d904aabeab178a35cd7572359cb7/raw/5888f17540aff996e9d7c7e42effd42b0136bd89/HDW_Lead_Gel%C3%A4ndewagen.json",
"screenshotURL": "https://i.ibb.co/fY3M11Vt/9490ae2743f3.png",
"workflowUpdated": true,
"gistId": "f700d904aabeab178a35cd7572359cb7",
"templateDescriptionFull": "⚠️ DISCLAIMER: This workflow uses the AnySite LinkedIn community node, which is only available on self-hosted n8n instances. It will not work on n8n.cloud.\n\nThis workflow automates the entire LinkedIn lead generation process from finding prospects that match your Ideal Customer Profile (ICP) to sending personalized messages. It uses AI to analyze lead data, score potential clients, and prioritize your outreach efforts.\n\nAI-Driven Lead Generation: Convert ICP descriptions into LinkedIn search parameters\nComprehensive Data Enrichment: Analyze company websites, LinkedIn posts, and news\nIntelligent Lead Scoring: Prioritize leads based on AI analysis of intent signals\nAutomated Outreach: Connect with prospects and send personalized messages\n\nSelf-hosted n8n instance with the AnySite LinkedIn community node installed\nOpenAI API access (for GPT-4o)\nGoogle Sheets access\nAnySite API key (available at anysite.io)\nLinkedIn account\n\nEnsure the AnySite LinkedIn community node is installed on your n8n instance\nCommand: npm install n8n-nodes-hdw\n(or use this instruction)\n\nOpenAI: Add your OpenAI API key\nGoogle Sheets: Set up Google account access\nAnySite LinkedIn: Configure your API key from AnySite.io\n\nCreate a new Google Sheet with the following columns (or copy template):\n\nName, URN, URL, Headline, Location, Current company, Industry, etc.\nThe workflow will populate these columns automatically\n\nUse chat to provide the AI Agent with your Ideal Customer Profile\nExample: \"Target marketing directors at SaaS companies with 50-200 employees\"\n\nModify the lead scoring prompt in the \"Company Score Analysis\" node to match your specific product/service\nTune the evaluation criteria based on your unique business needs\n\nUpdate the AnySite LinkedIn Send Message node with your custom message\n\nICP 
Translation: AI converts your ICP description into LinkedIn search parameters\nLead Discovery: Workflow searches LinkedIn using these parameters\nData Collection: Results are saved to Google Sheets\nEnrichment: System collects additional data about each lead:\n\nCompany website analysis\nLead's LinkedIn posts\nCompany's LinkedIn posts\nRecent company news\n\nIntent Analysis: AI analyzes all data to identify buying signals\nLead Scoring: Leads are scored on a 1-10 scale based on likelihood of interest\nConnection Requests: Top-scoring leads receive connection requests\nFollow-Up: When connections are accepted, automated messages are sent\n\nSearch Parameters: Adjust the AI Agent prompt to refine your target audience\nScoring Criteria: Modify scoring prompts to highlight indicators relevant to your product\nMessage Content: Update message templates for personalized outreach\nSchedule: Configure when connection requests and messages are sent\n\nLinkedIn has connection request limits (approximately 100-200 per week)\nThe workflow includes safeguards to avoid exceeding these limits\nConsider spacing your outreach for better response rates\n\nNote: Always use automation tools responsibly and in accordance with LinkedIn's terms of service.",
"isPaid": false
},
{
"templateId": "5611",
"templateName": "template_5611",
"templateDescription": "AI-Powered Lead Generation with Apollo, GPT-4, and Telegram to Database. Overview: This intelligent lead generation workflow transforms voice commands or text...",
"templateUrl": "https://n8n.io/workflows/5611",
"jsonFileName": "template_5611.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_5611.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/8227aaf5b2d421f737b1912ff6809ecd/raw/546047066b1d0c48a86a5e3d768fe5e0450f53a9/template_5611.json",
"screenshotURL": "https://i.ibb.co/vxGxjtht/a31901309051.png",
"workflowUpdated": true,
"gistId": "8227aaf5b2d421f737b1912ff6809ecd",
"templateDescriptionFull": "This intelligent lead generation workflow transforms voice commands or text input into verified prospect lists through automated Apollo.io scraping. The system processes natural language requests, extracts search parameters using AI, and delivers clean, verified contact data directly to your database.\n\nVoice Recognition: Converts audio messages to text using OpenAI's transcription API\nNatural Language Processing: AI agent interprets requests and extracts search criteria\nFlexible Input: Supports both voice commands and text messages\n\nApollo.io Integration: Automated scraping using official Apollo.io API\nDynamic URL Generation: Builds search URLs based on extracted parameters\nIntelligent Parsing: Processes location, industry, and job title criteria\n\nVerified Emails Only: Filters results to include only verified email addresses\nDuplicate Prevention: Compares against existing database to avoid duplicates\nData Quality Control: Ensures high-quality prospect data\n\nDatabase Integration: Automatic storage in PostgreSQL/Supabase\nStructured Data: Organizes contacts with complete profile information\nReal-time Updates: Instant database updates with new prospects\n\nInput Processing: Receive voice message or text command\nAI Analysis: Extract search parameters (location, industry, job titles)\nURL Construction: Build Apollo.io search URL with extracted criteria\nData Scraping: Retrieve prospect data via Apollo.io API\nEmail Verification: Filter for verified email addresses only\nDuplicate Check: Compare against existing database records\nData Storage: Save new prospects to database\nConfirmation: Send success notification with count of new leads\n\nLocation: City, state, country combinations\nIndustry: Business sectors and verticals\nJob Titles: Executive roles, departments, seniority levels\nCompany Size: Organization scale and employee count\n\nFirst Name & Last Name\nEmail Address (verified only)\nLinkedIn Profile URL\nPhone Number 
(when available)\n\nCurrent Job Title\nCompany Name\nIndustry\nSeniority Level\nEmployment History\n\nCity & State\nCountry\nFull Location String\n\nWebsite URL\nBusiness Industry\nOrganization Details\n\nn8n Workflow Engine: Orchestrates the entire process\nOpenAI Integration: Powers voice transcription and AI analysis\nApollo.io API: Source for prospect data\nPostgreSQL/Supabase: Database storage and management\n\nOpenAI Whisper API for voice transcription\nOpenAI GPT for natural language processing\nApollo.io API for lead data retrieval\nSupabase API for database operations\n\nQuickly build prospect lists for outreach campaigns\nTarget specific industries or job roles\nMaintain clean, verified contact databases\n\nGenerate targeted lead lists for campaigns\nResearch prospects in specific markets\nBuild comprehensive contact databases\n\nIdentify potential partners or clients\nResearch competitive landscapes\nGenerate contact lists for networking\n\nFind candidates in specific locations\nTarget particular job roles or industries\nBuild talent pipeline databases\n\nVoice commands for instant lead generation\nAutomated processing eliminates manual work\nBatch processing for large prospect lists\n\nAI-powered parameter extraction\nFlexible search criteria combinations\nIndustry and role-specific filtering\n\nVerified email addresses only\nDuplicate prevention\nStructured, consistent data format\n\nEnd-to-end automated workflow\nReal-time database updates\nInstant confirmation notifications\n\nn8n workflow platform\nOpenAI API access\nApollo.io API credentials\nPostgreSQL or Supabase database\nMessaging platform integration\n\nImport workflow into n8n\nConfigure API credentials\nSet up database connections\nCustomize search parameters\nTest with sample voice/text input\n\nModify location formats\nAdd custom industry categories\nAdjust job title variations\nSet result limits\n\nCustomize field mappings\nAdd data validation rules\nImplement additional 
filters\nConfigure output formats\n\nConnect to CRM systems\nAdd email marketing tools\nIntegrate with sales platforms\nExport to various formats\n\nProcessing Speed: Voice-to-database in under 30 seconds\nData Accuracy: 95%+ verified email addresses\nAutomation Level: 100% hands-free operation\nScalability: Process 500+ leads per request\n\nTransform your lead generation process with intelligent automation that understands natural language and delivers verified prospects directly to your database.",
"isPaid": false
},
{
"templateId": "3830",
"templateName": "template_3830",
"templateDescription": "🧩 What This Workflow Does: This workflow automates the process of identifying and enriching decision-maker contacts from a list of companies. By integrating...",
"templateUrl": "https://n8n.io/workflows/3830",
"jsonFileName": "template_3830.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_3830.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/56c644b11a15ac49c728bbe569b22427/raw/771e5d14a7c95d0db6916fda940dbecc28852957/template_3830.json",
"screenshotURL": "https://i.ibb.co/jvrsgZQP/0394f254e9e1.png",
"workflowUpdated": true,
"gistId": "56c644b11a15ac49c728bbe569b22427",
"templateDescriptionFull": "This workflow automates the process of identifying and enriching decision-maker contacts from a list of companies. By integrating with Apollo's APIs and Google Sheets, it streamlines lead generation, ensures data accuracy through human verification, and maintains an organized leads database.\n\nIdeal for sales and marketing teams aiming to:\n\nAutomate the discovery of key decision-makers (e.g., CEOs, CTOs).\nEnrich contact information with LinkedIn profiles, emails, and phone numbers.\nMaintain an up-to-date leads database with minimal manual intervention.\nReceive weekly summaries of newly verified leads.\n\n1. Google Sheets Preparation:\n\nUse the following pre-configured Google Sheet: Company Decision Maker Discovery Sheet.\nThis spreadsheet includes the necessary tabs and columns: Companies, Contacts, and Contacts (Verified).\nIt also contains a custom onEdit Apps Script function that automatically updates the Status column to Pending whenever the Domain field is modified.\nTo review or modify the script, navigate to Extensions > Apps Script within the Google Sheet.\n\n2. Credentials Setup:\n\nConfigure the following credentials in your n8n instance:\n\nGoogle Sheets: To read from and write to the spreadsheet.\nSlack: To send verification prompts and weekly reports.\nApollo: To access the Organization Search, Organization Enrichment, People Search, and Bulk People Enrichment APIs.\nLLM Service (e.g., OpenAI): To generate company summaries and determine departments based on job titles.\n\n3. 
Workflow Configuration:\n\nImport the workflow into your n8n instance.\nUpdate the nodes to reference the correct Google Sheet and Slack channel.\nEnsure that the Apollo and LLM nodes have the appropriate API keys and configurations.\n\n4. Testing the Workflow:\n\nAdd a new company entry in the Companies tab of the Google Sheet.\nVerify that the workflow triggers automatically, processes the data, and updates the Contacts and Contacts (Verified) tabs accordingly.\nCheck Slack for any verification prompts and confirm that weekly reports are sent as scheduled.",
"isPaid": false
},
{
"templateId": "3791",
"templateName": "LinkedIn Leads Scraping & Enrichment (Main)",
"templateDescription": "Note: Now includes an Apify alternative for Rapid API (Some users can't create new accounts on Rapid API, so I have added an alternative for you. But...",
"templateUrl": "https://n8n.io/workflows/3791",
"jsonFileName": "LinkedIn_Leads_Scraping__Enrichment_Main.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/LinkedIn_Leads_Scraping__Enrichment_Main.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/a07f750717303220231e8cedd1ad9c2f/raw/c218a81f0ebb1c73b959d116762026fbfba84a43/LinkedIn_Leads_Scraping__Enrichment_Main.json",
"screenshotURL": "https://i.ibb.co/s927F32t/d822f1666af7.png",
"workflowUpdated": true,
"gistId": "a07f750717303220231e8cedd1ad9c2f",
"templateDescriptionFull": "Note: Now includes an Apify alternative for Rapid API (Some users can't create new accounts on Rapid API, so I have added an alternative for you. But immediately you are able to get access to Rapid API, please use that option, it returns more detailed data). Scroll to bottom for APify setup guide\n\nThis n8n workflow automates LinkedIn lead generation, enrichment, and activity analysis using Apollo.io, RapidAPI, Google Sheets and Mail.so.\n\nPerfect for sales teams, founders, B2B marketers, and cold outreach pros who want personalized lead insights to drive better conversion rates.\n\nThe workflow is broken down into several key steps, each designed to help you build and enrich a valuable list of LinkedIn leads:\n\nPulls leads using Apollo.io's API based on keywords, industries, or job titles.\nSaves lead name, title, company, and LinkedIn URL to your Google Sheet.\nYou can replace the trigger node from the form node to a webhook, whatsapp, telegram, etc, any way for you to send over your query variables over to initiate the workflow.\n\nExtracts the LinkedIn username from profile URLs using a simple script node.\nThis is required for further enrichment via RapidAPI.\n\nUses the Apollo User ID to retrieve the lead’s verified work email.\nEnsures high-quality leads with reliable contact info.\nTo double check that the email is currently valid, we use the mail.so api and filter out emails that fail deliverability and mx-record check. 
We don't wanna risk sending emails to no longer existent addresses, right?\n\nQueries the LinkedIn Data API to fetch a lead’s profile summary/bio.\nGives you a deeper understanding of their background and expertise.\n\nRetrieves recent posts or reposts from each lead’s profile.\nGreat for tailoring outreach with reference to what they’re currently talking about.\n\nAll enriched data is written to the same Google Sheet.\nNew columns are filled in without overwriting existing data.\n\nEvery subworkflow includes a fail-safe mechanism to ensure:\n\n✅ Each row has status columns (e.g., done, failed, pending).\n🕒 A scheduled retry workflow resets failed rows to pending after 2 weeks (customizable).\n💬 This gives failed enrichments another chance to be processed later, reducing data loss.\n\nTemplate 1: Apollo Leads Scraping & Enrichment\n\nTemplate 2: Enriched Leads Database\n\nMake a copy to your Drive and use.\n\nColumns will be filled as each subworkflow runs (email, summary, interests, etc.)\n\nTo use this workflow, you’ll need the following credentials:\n\nSign up and get your key here: Apollo.io API Keys\n⚠️ Important: Toggle the “Master API Key” option to ON when generating your key.\nThis ensures the same key can be used for all Apollo endpoints in this workflow.\n\nSubscribe to the API here: LinkedIn Data API on RapidAPI\nUse the key in the x-rapidapi-key header in the relevant nodes.\n\nSign up and get your key here: Mail.so API\n\nModify the Apollo filters (location, industry, seniority) to target your ideal customers.\nChange retry interval in the scheduler (e.g., weekly instead of 2 weeks).\nConnect the database to your email campaign tool like Mailchimp or Instantly.ai.\nReplace the AI nodes with your desired AI agents and customize the system messages further to get desired results.\n\nTo use this workflow, you’ll need the following credentials:\n\nLogin to Apify, then open this link; https://console.apify.com/actors/2SyF0bVxmgGr8IVCZ/\n\nClick on 
integrations and scroll down to API Solutions and select \"Use API endpoints\". Scroll to \"Run Actor synchronously and get dataset items\" and copy the actor endpoint url then paste it in the placeholder inside the http node of Apify alternative flow \"apify-actor-endpoint\". That's it, you are set to go.\n\nI am available for custom n8n workflows, if you like my work, please get in touch with me on email at joseph@uppfy.com",
"isPaid": false
},
{
"templateId": "2899",
"templateName": "Unique QRcode coupon assignment and validation for Lead Generation system",
"templateDescription": "This workflow is designed to manage the assignment and validation of unique QR code coupons within a lead generation system with SuiteCRM. How it Works This...",
"templateUrl": "https://n8n.io/workflows/2899",
"jsonFileName": "Unique_QRcode_coupon_assignment_and_validation_for_Lead_Generation_system.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Unique_QRcode_coupon_assignment_and_validation_for_Lead_Generation_system.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/b1c77d689cadc949baa0c2d5e0d1d3eb/raw/b87eb6520985a29093469d4b74d550ea44637ada/Unique_QRcode_coupon_assignment_and_validation_for_Lead_Generation_system.json",
"screenshotURL": "https://i.ibb.co/bgPw67tm/12e941f31f1b.png",
"workflowUpdated": true,
"gistId": "b1c77d689cadc949baa0c2d5e0d1d3eb",
"templateDescriptionFull": "This workflow is designed to manage the assignment and validation of unique QR code coupons within a lead generation system with SuiteCRM.\n\nThis workflow automates the process of assigning unique QR code coupons to leads generated through a form submission, ensuring no duplicates are created, and validating the usage of coupons. Here's how it operates:\n\nWebhook Trigger:\n\nThe workflow starts with a Webhook node that listens for incoming requests containing QR code data.\nA Set coupon node extracts the QR code value from the request parameters.\nThe workflow starts with a Webhook node that listens for incoming requests containing QR code data.\nA Set coupon node extracts the QR code value from the request parameters.\nValidation of QR Code:\n\nAn If node checks if the QR code exists in the incoming data. If it does, the process proceeds; otherwise, a \"No coupon\" response is sent back.\nAn If node checks if the QR code exists in the incoming data. If it does, the process proceeds; otherwise, a \"No coupon\" response is sent back.\nCoupon Lookup:\n\nThe Get Lead node queries a Google Sheets document to check if the QR code corresponds to an existing lead.\nA subsequent Not used? node verifies whether the coupon has already been used by checking the \"USED COUPON?\" field in the sheet.\nThe Get Lead node queries a Google Sheets document to check if the QR code corresponds to an existing lead.\nA subsequent Not used? node verifies whether the coupon has already been used by checking the \"USED COUPON?\" field in the sheet.\nLead Duplication Check:\n\nWhen a new lead submits the form (On form submission), the Duplicate Lead? node checks if the email already exists in the system to prevent duplicates.\nWhen a new lead submits the form (On form submission), the Duplicate Lead? 
node checks if the email already exists in the system to prevent duplicates.\nCoupon Assignment:\n\nIf the lead is not a duplicate, the Get Coupon node retrieves an available unassigned coupon from the Google Sheets document.\nThe Token SuiteCRM node generates an access token for SuiteCRM, and the Create Lead SuiteCRM node creates a new lead entry in SuiteCRM, associating it with the assigned coupon.\nIf the lead is not a duplicate, the Get Coupon node retrieves an available unassigned coupon from the Google Sheets document.\nThe Token SuiteCRM node generates an access token for SuiteCRM, and the Create Lead SuiteCRM node creates a new lead entry in SuiteCRM, associating it with the assigned coupon.\nQR Code Generation and Email Notification:\n\nThe Get QR node generates a QR code image URL for the assigned coupon.\nThe Send Email node sends an email to the lead with the QR code attached.\nThe Get QR node generates a QR code image URL for the assigned coupon.\nThe Send Email node sends an email to the lead with the QR code attached.\nResponse Handling:\n\nDepending on the validation results, the workflow responds with appropriate messages:\n\n\"Coupon OK\" if the coupon is valid and unused.\n\"Coupon KO\" if the coupon has already been used.\n\"Coupon not valid\" if the QR code does not exist.\nDepending on the validation results, the workflow responds with appropriate messages:\n\n\"Coupon OK\" if the coupon is valid and unused.\n\"Coupon KO\" if the coupon has already been used.\n\"Coupon not valid\" if the QR code does not exist.\n\"Coupon OK\" if the coupon is valid and unused.\n\"Coupon KO\" if the coupon has already been used.\n\"Coupon not valid\" if the QR code does not exist.\n\nTo replicate this workflow in your own n8n environment, follow these steps:\n\nConfiguration:\n\nSet up an n8n instance either locally or via cloud services.\nImport the provided JSON configuration file into your workspace.\nConfigure all required credentials, such as:\n\nGoogle 
Sheets OAuth2 API for accessing the spreadsheet.\nSuiteCRM API credentials (e.g., SUITECRMURL, CLIENTID, CLIENTSECRET).\nSMTP credentials for sending emails.\nSet up an n8n instance either locally or via cloud services.\nImport the provided JSON configuration file into your workspace.\nConfigure all required credentials, such as:\n\nGoogle Sheets OAuth2 API for accessing the spreadsheet.\nSuiteCRM API credentials (e.g., SUITECRMURL, CLIENTID, CLIENTSECRET).\nSMTP credentials for sending emails.\nGoogle Sheets OAuth2 API for accessing the spreadsheet.\nSuiteCRM API credentials (e.g., SUITECRMURL, CLIENTID, CLIENTSECRET).\nSMTP credentials for sending emails.\nCustomization:\n\nAdjust the Webhook URL to match your deployment environment.\nModify the Google Sheets document ID and sheet name in nodes like Duplicate Lead?, Get Coupon, Update Sheet, and Update coupon used.\nUpdate the SuiteCRM API endpoint and credentials in nodes like Token SuiteCRM and Create Lead SuiteCRM.\nCustomize the email template in the Send Email node to match your branding and messaging requirements.\nEnsure the QR code generation URL in the Get QR node points to a valid QR code generator service.\nAdjust the Webhook URL to match your deployment environment.\nModify the Google Sheets document ID and sheet name in nodes like Duplicate Lead?, Get Coupon, Update Sheet, and Update coupon used.\nUpdate the SuiteCRM API endpoint and credentials in nodes like Token SuiteCRM and Create Lead SuiteCRM.\nCustomize the email template in the Send Email node to match your branding and messaging requirements.\nEnsure the QR code generation URL in the Get QR node points to a valid QR code generator service.\n\nBy following these steps, you can effectively implement and customize this workflow to manage lead generation and coupon assignments in your organization.",
"isPaid": false
},
{
"templateId": "2890",
"templateName": "Automate Drive-To-Store Lead Generation System (with coupon) on SuiteCRM",
"templateDescription": "Drive-to-Store is a multi-channel marketing strategy that includes both the web and the physical context, with the aim of increasing the number of customers...",
"templateUrl": "https://n8n.io/workflows/2890",
"jsonFileName": "Automate_Drive-To-Store_Lead_Generation_System_with_coupon_on_SuiteCRM.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Automate_Drive-To-Store_Lead_Generation_System_with_coupon_on_SuiteCRM.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/3280c75e32bb8892292cc7d87b5badbc/raw/45722377a8ffaff730a2099af3df04bc8853416f/Automate_Drive-To-Store_Lead_Generation_System_with_coupon_on_SuiteCRM.json",
"screenshotURL": "https://i.ibb.co/TxqmJBhJ/fefbca407c66.png",
"workflowUpdated": true,
"gistId": "3280c75e32bb8892292cc7d87b5badbc",
"templateDescriptionFull": "Drive-to-Store is a multi-channel marketing strategy that includes both the web and the physical context, with the aim of increasing the number of customers and sales in physical stores. This strategy guides potential customers from the online world to the physical point of sale through the provision of a coupon that can be spent in the store or on an e-commerce site.\n\nThe basic idea is to have a landing page with a form and a series of unique coupons to assign to leads as a \"reward\" for filling out the form.\n\nThis workflow is ideal for businesses looking to automate lead generation and management, especially when integrating with CRM systems like SuiteCRM and using Google Sheets for data tracking.\n\nForm Submission:\n\nThe workflow starts with the On form submission node, which triggers when a user submits a form on a landing page. The form collects the user's name, surname, email, and phone number.\nThe workflow starts with the On form submission node, which triggers when a user submits a form on a landing page. The form collects the user's name, surname, email, and phone number.\nForm Data Processing:\n\nThe Form Fields node extracts and sets the form data (name, surname, email, and phone) for use in subsequent steps.\nThe Form Fields node extracts and sets the form data (name, surname, email, and phone) for use in subsequent steps.\nDuplicate Lead Check:\n\nThe Duplicate Lead? node checks if the submitted email already exists in a Google Sheets document. If the email is found, the workflow responds with a \"duplicate lead\" message (Respond KO node) and stops further processing.\nThe Duplicate Lead? node checks if the submitted email already exists in a Google Sheets document. 
If the email is found, the workflow responds with a \"duplicate lead\" message (Respond KO node) and stops further processing.\nCoupon Retrieval:\n\nIf the email is not a duplicate, the Get Coupon node retrieves a coupon code from the Google Sheets document based on the lead's email.\nIf the email is not a duplicate, the Get Coupon node retrieves a coupon code from the Google Sheets document based on the lead's email.\nLead Creation in SuiteCRM:\n\nThe Create Lead SuiteCRM node creates a new lead in SuiteCRM using the form data and the retrieved coupon code. The lead includes:\n\nFirst name, last name, email, phone number, and coupon code.\nThe Create Lead SuiteCRM node creates a new lead in SuiteCRM using the form data and the retrieved coupon code. The lead includes:\n\nFirst name, last name, email, phone number, and coupon code.\nFirst name, last name, email, phone number, and coupon code.\nGoogle Sheets Update:\n\nThe Update Sheet node updates the Google Sheets document with the newly created lead's details, including:\n\nName, surname, email, phone, coupon code, lead ID, and the current date and time.\nThe Update Sheet node updates the Google Sheets document with the newly created lead's details, including:\n\nName, surname, email, phone, coupon code, lead ID, and the current date and time.\nName, surname, email, phone, coupon code, lead ID, and the current date and time.\nResponse to Webhook:\n\nThe Respond OK node sends a success response back to the webhook, indicating that the lead was created successfully.\nThe Respond OK node sends a success response back to the webhook, indicating that the lead was created successfully.\n\nConfigure Form Trigger:\n\nSet up the On form submission node to collect user data (name, surname, email, and phone) via a web form.\nSet up the On form submission node to collect user data (name, surname, email, and phone) via a web form.\nSet Up Google Sheets Integration:\n\nConfigure the Duplicate Lead?, Get Coupon, and Update 
Sheet nodes to interact with the Google Sheets document. Ensure the document contains columns for email, coupon, lead ID, and other relevant fields.\nConfigure the Duplicate Lead?, Get Coupon, and Update Sheet nodes to interact with the Google Sheets document. Ensure the document contains columns for email, coupon, lead ID, and other relevant fields.\nSet Up SuiteCRM Authentication:\n\nConfigure the Token SuiteCRM node with the appropriate client credentials (client ID and client secret) to obtain an access token from SuiteCRM.\nConfigure the Token SuiteCRM node with the appropriate client credentials (client ID and client secret) to obtain an access token from SuiteCRM.\nSet Up Lead Creation in SuiteCRM:\n\nConfigure the Create Lead SuiteCRM node to send a POST request to SuiteCRM's API to create a new lead. Include the form data and coupon code in the request body.\nConfigure the Create Lead SuiteCRM node to send a POST request to SuiteCRM's API to create a new lead. Include the form data and coupon code in the request body.\nSet Up Webhook Responses:\n\nConfigure the Respond OK and Respond KO nodes to send appropriate JSON responses back to the webhook based on whether the lead was created or if it was a duplicate.\nConfigure the Respond OK and Respond KO nodes to send appropriate JSON responses back to the webhook based on whether the lead was created or if it was a duplicate.\nTest the Workflow:\n\nSubmit a test form to ensure the workflow correctly checks for duplicates, retrieves a coupon, creates a lead in SuiteCRM, and updates the Google Sheets document.\nSubmit a test form to ensure the workflow correctly checks for duplicates, retrieves a coupon, creates a lead in SuiteCRM, and updates the Google Sheets document.\nActivate the Workflow:\n\nOnce tested, activate the workflow to automate the process of handling form submissions and lead creation.\nOnce tested, activate the workflow to automate the process of handling form submissions and lead 
creation.\n\nDuplicate Lead Check: Prevents duplicate leads by checking if the email already exists in the Google Sheets document.\nCoupon Assignment: Retrieves a coupon code from Google Sheets and assigns it to the new lead.\nSuiteCRM Integration: Automatically creates a new lead in SuiteCRM with the form data and coupon code.\nData Logging: Logs all lead details in a Google Sheets document for tracking and analysis.\nWebhook Responses: Provides immediate feedback on whether the lead was created successfully or if it was a duplicate.",
"isPaid": false
},
{
"templateId": "4589",
"templateName": "template_4589",
"templateDescription": "📌 HubSpot Lead Enrichment with Bright Data MCP Screenshot 20250603 at 1.24.49 This template enables natural-language-driven automation using Bright Data's...",
"templateUrl": "https://n8n.io/workflows/4589",
"jsonFileName": "template_4589.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_4589.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/cf5d57a99190b1476aef71f0339299fb/raw/ea4d485bb5ce18427b044c0b155cb8e9640f40c7/template_4589.json",
"screenshotURL": "https://i.ibb.co/V0cjK9Yt/c998c6d096a9.png",
"workflowUpdated": true,
"gistId": "cf5d57a99190b1476aef71f0339299fb",
"templateDescriptionFull": "This template enables natural-language-driven automation using Bright Data's MCP tools, triggered directly by new leads in HubSpot. It dynamically extracts and executes the right tool based on lead context—powered by AI and configurable in N8N.\n\nManual lead enrichment is slow, inconsistent, and drains valuable time. This solution automates the process using a no-code workflow that connects HubSpot, Bright Data MCP, and an AI agent—without requiring scripts or technical skills. Perfect for marketing, sales, and RevOps teams.\n\nTo use this template, you’ll need:\n\nA self-hosted or cloud instance of N8N\nA Bright Data MCP API token\nA valid OpenAI API key (or compatible AI model)\nA HubSpot account\nEither a Private App token or OAuth credentials for HubSpot\nBasic familiarity with N8N workflows\n\nLog in to your HubSpot account.\nNavigate to Settings → Integrations → Private Apps.\nCreate a new Private App with the following scopes:\n\ncrm.objects.contacts.read\ncrm.objects.contacts.write\ncrm.schemas.contacts.read\ncrm.objects.companies.read (optional)\ncrm.objects.contacts.read\ncrm.objects.contacts.write\ncrm.schemas.contacts.read\ncrm.objects.companies.read (optional)\nCopy the Access Token.\nIn N8N, create a credential for HubSpot App Token and paste the app token in the field.\nGo back to Hubspot Private App settings to setup a webhook.\n\nCopy the url in your workflow's Webhook node and paste it here.\nCopy the url in your workflow's Webhook node and paste it here.\n\nIn HubSpot, go to Settings → Integrations → Apps → Create App.\nSet your Redirect URL to match your N8N OAuth2 redirect path.\nChoose scopes like:\n\ncrm.objects.companies.read\ncrm.objects.contacts.read\ncrm.objects.deals.read\ncrm.schemas.companies.read\ncrm.schemas.contacts.read\ncrm.schemas.deals.read\ncrm.objects.contacts.write (conditionally 
required)\ncrm.objects.companies.read\ncrm.objects.contacts.read\ncrm.objects.deals.read\ncrm.schemas.companies.read\ncrm.schemas.contacts.read\ncrm.schemas.deals.read\ncrm.objects.contacts.write (conditionally required)\nNote the Client ID and Client Secret.\nCopy the App ID and the developer API key\nIn N8N, create a credential for HubSpot Developer API and paste those info from previous step.\nAttach these credentials to the HubSpot node in N8N.\n\nIn your Bright Data account, obtain the following information:\n\nAPI token\nWeb Unlocker zone name (optional)\nBrowser API username and password string separated by colon (optional)\n\nThe methods below will allow you to receive SSE (Server-Sent Events) from Bright Data MCP via a local Supergateway or Smithery\n\nMethod 1: Run Supergateway in a separate web service (Recommended)\n\nThis method will work for both cloud version and self-hosted N8N.\n\nSignup to any cloud services of your choice (DigitalOcean, Heroku, Hetzner, Render, etc.).\n\nCreate a new web service.\nChoose Node.js as runtime environment and setup a custom server without repository.\nIn your server’s settings to define environment variables or .env file, add:\nAPI_TOKEN=your_brightdata_api_token WEB_UNLOCKER_ZONE=optional_zone_name BROWSER_AUTH=optional_browser_auth\nPaste the following text as a start command: npx -y supergateway --stdio \"npx -y @brightdata/mcp\" --port 8000 --baseUrl http://localhost:8000 --ssePath /sse --messagePath /message\nDeploy it and copy the web server URL, then append /sse into it.\nYour SSE server should now be accessible at: https://your_server_url/sse\n\nCreate a new web service.\nChoose Docker as the runtime environment.\nSet up your Docker environment by pulling the necessary images or creating a custom Dockerfile.\nIn your server’s settings to define environment variables or .env file, add:\nAPI_TOKEN=your_brightdata_api_token WEB_UNLOCKER_ZONE=optional_zone_name BROWSER_ZONE=optional_browser_zone_name\n- Use the 
following Docker command to run Supergateway: docker run -it --rm -p 8000:8000 supercorp/supergateway \\ --stdio \"npx -y @brightdata/mcp /\" \\ --port 8000\nDeploy it and copy the web server URL, then append /sse into it.\nYour SSE server should now be accessible at: https://your_server_url/sse\n\nFor more installation guides, please refer to https://github.com/supercorp-ai/supergateway.git.\n\nMethod 2: Run Supergateway in the same web service as the N8N instance\n\nThis method will only work for self-hosted N8N.\n\nIn your server's settings to define environment variables or .env file, add:\n\nUse the command above to execute it through the cloud shell or set it as a pre-deploy command.\n\nYour SSE server should now be accessible at:\nhttp://localhost:8000/sse\n\nFor more installation guides, please refer to https://github.com/supercorp-ai/supergateway.git.\n\nMethod 3: Configure via Smithery.ai (Easiest)\nIf you don't want additional setup and want to test it right away, follow these instructions:\n\nVisit https://smithery.ai/server/@luminati-io/brightdata-mcp/tools to:\n\nSignup (if you are new to Smithery)\nCreate an API key\nDefine environment variables via a profile\nRetrieve your SSE server HTTP URL\n\nEnsure your Google Sheet:\n\nContains columns like row_id, first_name, last_name, email, and status.\nIs shared with your N8N service account (or connected via OAuth)\nContains columns like row_id, first_name, last_name, email, and status.\nIs shared with your N8N service account (or connected via OAuth)\nIn N8N:\n\nAdd a Google Sheets Trigger node\nSet it to watch for new rows in your lead sheet\nAdd a Google Sheets Trigger node\nSet it to watch for new rows in your lead sheet\n\nImport the provided JSON workflow into N8N\nUpdate nodes with your credentials:\n\nHubspot: Add your API key or connect it via OAuth.\nGoogle Sheets Trigger: Link to your actual sheet\nOpenAI Node: Add your API key\nBright Data Tool Execution: Add Bright Data token and SSE 
URL\nHubspot: Add your API key or connect it via OAuth.\nGoogle Sheets Trigger: Link to your actual sheet\nOpenAI Node: Add your API key\nBright Data Tool Execution: Add Bright Data token and SSE URL\n\nNew contact in Hubspot or a new row is added to the Google Sheet\nN8N triggers the workflow\nAI agent classifies the task (e.g., “Find LinkedIn”, “Get company info”)\nThe relevant MCP tool is called\nResults are appended back to the sheet or routed to another destination\nRerun the specific record by specifying status \"needs more enrichment\", or leaving it blank.\n\nB2B Lead Enrichment – Add missing fields (title, domain, social profiles)\nEmail Intelligence – Validate and enrich based on email\nMarket Research – Pull company or contact data on demand\nCRM Auto-fill – Push enriched leads to tools like HubSpot or Salesforce\n\nPrompt Tuning – Adjust how the AI interprets input data\nColumn Mapping – Customize which fields to pull from the sheet\nTool Logic – Add retries, fallback tools, or confidence-based routing\nDestination Output – Integrate with CRMs, Slack, or webhook endpoints\n\nThis template turns a Google Sheet into an AI-powered lead enrichment engine. By combining Bright Data’s tools with a natural language AI agent, your team can automate repetitive tasks and scale lead ops—without writing code.\n\nJust add a row, and let the workflow do the rest.",
"isPaid": false
},
{
"templateId": "2840",
"templateName": "template_2840",
"templateDescription": "n8n Workflow: Automate SIEM Alert Enrichment with MITRE ATT&CK & Qdrant Who is this for? This workflow is ideal for: Cybersecurity teams & SOC analysts who...",
"templateUrl": "https://n8n.io/workflows/2840",
"jsonFileName": "template_2840.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_2840.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/7bbab337eec93cd1afdaac2a9e8036a4/raw/b82605b4160c0d70f91d1f2ff8cd8fdf57550437/template_2840.json",
"screenshotURL": "https://i.ibb.co/Gf4yFbZy/089403418057.png",
"workflowUpdated": true,
"gistId": "7bbab337eec93cd1afdaac2a9e8036a4",
"templateDescriptionFull": "This workflow is ideal for:\n\nCybersecurity teams & SOC analysts who want to automate SIEM alert enrichment.\nIT security professionals looking to integrate MITRE ATT&CK intelligence into their ticketing system.\nOrganizations using Zendesk for security incidents who need enhanced contextual threat data.\nAnyone using n8n and Qdrant to build AI-powered security workflows.\n\nSecurity teams receive large volumes of raw SIEM alerts that lack actionable context. Investigating every alert manually is time-consuming and can lead to delayed response times. This workflow solves this problem by:\n✔ Automatically enriching SIEM alerts with MITRE ATT&CK TTPs.\n✔ Tagging & classifying alerts based on known attack techniques.\n✔ Providing remediation steps to guide the response team.\n✔ Enhancing security tickets in Zendesk with relevant threat intelligence.\n\n1️⃣ Ingests SIEM alerts (via chatbot or ticketing system like Zendesk).\n2️⃣ Queries a Qdrant vector store containing MITRE ATT&CK techniques.\n3️⃣ Extracts relevant TTPs (Tactics, Techniques, & Procedures) from the alert.\n4️⃣ Generates remediation steps using AI-powered enrichment.\n5️⃣ Updates Zendesk tickets with threat intelligence & recommended actions.\n6️⃣ Provides structured alert data for further automation or reporting.\n\nn8n instance (Cloud or Self-hosted).\nQdrant vector store with MITRE ATT&CK data embedded.\nOpenAI API key (for AI-based threat processing).\nZendesk account (for ticket enrichment, if applicable).\nClean Mitre Data Python Script\nCleaned Mitre Data\nFull Mitre Data\n\n1️⃣ Embed MITRE ATT&CK data into Qdrant\n\nThis workflow pulls MITRE ATT&CK data from Google Drive and loads it into Qdrant.\nThe data is vectorized using OpenAI embeddings for fast retrieval.\n\n2️⃣ Deploy the n8n Chatbot\n\nThe chatbot listens for SIEM alerts and sends them to the AI processing pipeline.\nAlerts are analyzed using an AI agent trained on MITRE ATT&CK.\n\n3️⃣ Enrich Zendesk 
Tickets\n\nThe workflow extracts MITRE ATT&CK techniques from alerts.\nIt updates Zendesk tickets with contextual threat intelligence.\nThe remediation steps are included as internal notes for SOC teams.\n\n🔧 Modify the chatbot trigger: Adapt the chatbot node to receive alerts from Slack, Microsoft Teams, or any other tool.\n\n🔧 Change the SIEM input source: Connect your workflow to Splunk, Elastic SIEM, or Chronicle Security.\n\n🔧 Customize remediation steps: Use a custom AI model to tailor remediation responses based on organization-specific security policies.\n\n🔧 Extend ticketing integration: Modify the Zendesk node to also work with Jira, ServiceNow, or another ITSM platform.\n\n✅ Saves time: Automates alert triage & classification.\n✅ Improves security posture: Helps SOC teams act faster on threats.\n✅ Leverages AI & vector search: Uses LLM-powered enrichment for real-time context.\n✅ Works across platforms: Supports n8n Cloud, Self-hosted, and Qdrant.\n\n📖 Watch the Setup Video\n💬 Have Questions? Join the Discussion in the YouTube Comments!",
"isPaid": false
},
{
"templateId": "1862",
"templateName": "template_1862",
"templateDescription": "Enrich your company lists with OpenAI GPT-3 ↓ You’ll get valuable information such as: Market (B2B or B2C)IndustryTarget AudienceValue Proposition This will...",
"templateUrl": "https://n8n.io/workflows/1862",
"jsonFileName": "template_1862.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_1862.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/b08f975d6f79d423e86d4b6192e6ab85/raw/67588fa35f31d0543988487625a6f27e93de9f33/template_1862.json",
"screenshotURL": "https://i.ibb.co/W4GBQmyP/0377df5a7720.png",
"workflowUpdated": true,
"gistId": "b08f975d6f79d423e86d4b6192e6ab85",
"templateDescriptionFull": "Enrich your company lists with OpenAI GPT-3 ↓\n\nYou’ll get valuable information such as:\n\nMarket (B2B or B2C)\nIndustry\nTarget Audience\nValue Proposition\n\nThis will help you to:\n\nadd more personalization to your outreach\nmake informed decisions about which accounts to target\n\nI've made the process easy with an n8n workflow.\n\nHere is what it does:\n\nRetrieve website URLs from Google Sheets\nExtract the content for each website\nAnalyze it with GPT-3\nUpdate Google Sheets with GPT-3 data",
"isPaid": false
},
{
"templateId": "4641",
"templateName": "template_4641",
"templateDescription": "This workflow automates the process of handling conversation transcriptions and distributing key information across your organization. Here's what it does:...",
"templateUrl": "https://n8n.io/workflows/4641",
"jsonFileName": "template_4641.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_4641.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/47bda1c84c74d895d80324d970dd4384/raw/7dc88769cc9df0cc9d91979f723051ab46740c7f/template_4641.json",
"screenshotURL": "https://i.ibb.co/339c8JS/f89e35cbd795.png",
"workflowUpdated": true,
"gistId": "47bda1c84c74d895d80324d970dd4384",
"templateDescriptionFull": "This workflow automates the process of handling conversation transcriptions and distributing key information across your organization. Here's what it does:\n\nTrigger: The workflow is initiated via a webhook that receives a transcription (e.g., from a call or meeting).\n\nSummarization & Extraction: Using AI, the transcription is summarized, and key information is extracted — such as action items, departments involved, and client details.\n\nDepartment Notifications: The relevant summarized information is automatically routed to specific departments via email based on content classification.\n\nCRM Sync: The summarized version is saved to the associated contact or deal in HubSpot for future reference and visibility.\n\n**Multi-Channel Alerts: **The summary is also sent via WhatsApp and Slack to keep internal teams instantly informed, regardless of platform.\n\nUse Case:\nIdeal for sales, customer service, or operations teams who manage client conversations and want to ensure seamless cross-departmental communication, documentation, and follow-up.\n\nApps Used:\n\nWebhook (Trigger)\n\nOpenAI (or other AI/NLP for summarization)\n\nHubSpot\n\nEmail\n\nSlack\n\nWhatsApp (via Twilio or third-party provider)",
"isPaid": false
},
{
"templateId": "3767",
"templateName": "template_3767",
"templateDescription": "Who is this for? This workflow is designed for Customer Success Managers (CSM), sales, support, or marketing teams using HubSpot CRM who want to automate...",
"templateUrl": "https://n8n.io/workflows/3767",
"jsonFileName": "template_3767.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_3767.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/f84b7dc8e2c6e48d37b7a860630dcda5/raw/1f1d87fdc293c7264ca9281beceb3c1c62cb378e/template_3767.json",
"screenshotURL": "https://i.ibb.co/bgPw67tm/12e941f31f1b.png",
"workflowUpdated": true,
"gistId": "f84b7dc8e2c6e48d37b7a860630dcda5",
"templateDescriptionFull": "This workflow is designed for Customer Success Managers (CSM), sales, support, or marketing teams using HubSpot CRM who want to automate customer engagement tracking when new emails arrive. It’s ideal for businesses looking to streamline CRM updates without manual data entry.\n\nManually logging email interactions in HubSpot is time-consuming. This workflow automatically parses incoming emails, checks if the sender exists in HubSpot, and either:\n\nCreates a new contact + logs the email as an engagement (if the sender is new).\nLogs the email as an engagement for an existing contact.\n\nTriggers when a new email arrives in a connected IMAP inbox.\nParses the email using AI (OpenAI) to extract structured data.\nSearches HubSpot for the sender’s email address.\nUpdates HubSpot:\n\nCreates a contact (if missing) and logs the email as an engagement.\nOr logs the engagement for an existing contact.\nCreates a contact (if missing) and logs the email as an engagement.\nOr logs the engagement for an existing contact.\n\nConfigure Email Account: Replace the default IMAP node with your email provider\nHubSpot Credentials: Add your HubSpot API key in the HubSpot nodes.\nOpenAI Integration: Ensure your OpenAI API key is set for email parsing.\n\nImprove AI Prompt: Modify the OpenAI prompt to extract specific email data (e.g., customer intent).\nAdd Filters: Exclude auto-replies or spam by adding a filter node.\nExtend Functionality: Use the parsed data to trigger follow-up tasks (e.g., Slack alerts, tickets).\n\nNeed Help? Contact thomas@pollup.net for workflow modifications or help.\n\nDiscover my other workflows here",
"isPaid": false
},
{
"templateId": "3706",
"templateName": "template_3706",
"templateDescription": "Who is this for? This workflow is designed for Customer Satisfaction Managers (CSM), sales professionals, and operations managers who need to automate the...",
"templateUrl": "https://n8n.io/workflows/3706",
"jsonFileName": "template_3706.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_3706.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/0e819f71f35600971835edd956c1a8bc/raw/4e6c5a98112b5104c10180a3f79f53c10bc121b2/template_3706.json",
"screenshotURL": "https://i.ibb.co/xttqpStX/d48f37aa53a0.png",
"workflowUpdated": true,
"gistId": "0e819f71f35600971835edd956c1a8bc",
"templateDescriptionFull": "This workflow is designed for Customer Satisfaction Managers (CSM), sales professionals, and operations managers who need to automate the analysis of client transcripts, save summarized notes to HubSpot, and route relevant feedback to the appropriate departments via email.\n\nManually processing client conversations, extracting key insights, and distributing them to the right teams is time-consuming and error-prone. This workflow automates:\n\nTranscript analysis using AI (OpenAI) to identify relevant content.\nHubSpot integration to log meeting notes against client records.\nEmail routing to ensure feedback reaches the correct departments (e.g., support, sales, product, admin).\n\nInput Transcript: Accepts a client conversation transcript (e.g., from emails, calls, or chats).\nHubSpot Sync:\n\nSearches for the client’s HubSpot ID using their email.\nUploads a summarized version of the conversation as meeting notes.\nSearches for the client’s HubSpot ID using their email.\nUploads a summarized version of the conversation as meeting notes.\nAI-Powered Routing:\n\nUses an OpenAI model to analyze the transcript and categorize content by department.\nTriggers emails (via Gmail) to route feedback to the relevant teams.\nUses an OpenAI model to analyze the transcript and categorize content by department.\nTriggers emails (via Gmail) to route feedback to the relevant teams.\nForm Completion: Ends the workflow with optional user confirmation.\n\nPrerequisites:\n\nn8n instance (cloud or self-hosted).\nHubSpot API credentials (for contact lookup and notes upload).\nOpenAI API key (for transcript analysis).\nGmail account (for sending emails).\nn8n instance (cloud or self-hosted).\nHubSpot API credentials (for contact lookup and notes upload).\nOpenAI API key (for transcript analysis).\nGmail account (for sending emails).\nConfiguration:\n\nReplace placeholder nodes (e.g., HubSpot, OpenAI, Gmail) with your authenticated accounts.\nDefine email 
templates and recipient addresses for routing.\nAdjust the OpenAI prompt to match your categorization criteria (e.g., \"support,\" \"billing\").\nReplace placeholder nodes (e.g., HubSpot, OpenAI, Gmail) with your authenticated accounts.\nDefine email templates and recipient addresses for routing.\nAdjust the OpenAI prompt to match your categorization criteria (e.g., \"support,\" \"billing\").\n\nTranscript Sources: Extend the workflow to pull transcripts from other sources (e.g., Zoom, Slack).\nDepartments: Modify the routing logic to include additional teams or conditions.\nNotifications: Add Slack/MS Teams alerts for urgent feedback.\nError Handling: Introduce retries or fallback actions for failed HubSpot/Gmail steps.",
"isPaid": false
},
{
"templateId": "2610",
"templateName": "template_2610",
"templateDescription": "*Smartlead to HubSpot Performance AnalyticsA streamlined workflow to analyze your Smartlead performance metrics by tracking lifecycle stages in HubSpot and...",
"templateUrl": "https://n8n.io/workflows/2610",
"jsonFileName": "template_2610.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_2610.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/ca35f4ce375f8692e6cdf2bde233b99a/raw/b1f2e0322ad3622ef8f05abf900e48e707482f77/template_2610.json",
"screenshotURL": "https://i.ibb.co/QFVq54Mz/d6303cc7b532.png",
"workflowUpdated": true,
"gistId": "ca35f4ce375f8692e6cdf2bde233b99a",
"templateDescriptionFull": "Smartlead to HubSpot Performance Analytics\nA streamlined workflow to analyze your Smartlead performance metrics by tracking lifecycle stages in HubSpot and generating automated reports.\n\n(Outbound) Automation Agencies, Sales and marketing teams using Smartlead for outreach campaigns who want to track their performance metrics and lead progression in HubSpot.\n\nWhat problem does this workflow solve?\n\nManual tracking of lead performance across Smartlead and HubSpot is time-consuming and error-prone. This workflow automates performance reporting by connecting your Smartlead data with HubSpot lifecycle stages, providing clear insights into your outreach campaign effectiveness.\n\nWhat this workflow does\n\nAutomatically pulls performance data from your Smartlead campaigns\nCross-references contact status with HubSpot lifecycle stages\nGenerates comprehensive performance reports in Google Sheets\nProvides customizable reporting schedules to match your team's needs\n\nSet up your PostgreSQL instance (includes $300 free GCP credits)\nFollow our step-by-step setup guide: Find a step-by-step guide here\n\nConnect your Google Account to n8n\nFind the guide here\n\nConnect your Smartlead instance:\nDetailed connection guide included in workflow\n\nConfigure the Trigger node to adjust report frequency\nModify the Google Sheets template to match your specific KPIs\nCustomize HubSpot lifecycle stage mapping in the Function node\nAdjust PostgreSQL queries to track additional metrics\n\nNeed assistance or have suggestions?\nlmk here",
"isPaid": false
},
{
"templateId": "2131",
"templateName": "template_2131",
"templateDescription": "Use CaseWhen tracking your contacts and leads in Hubspot CRM, every new contact might be a potential customer. To guarantee that you're keeping the overview...",
"templateUrl": "https://n8n.io/workflows/2131",
"jsonFileName": "template_2131.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_2131.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/bd58b890ea43effbe59d74661ea01a62/raw/583d441d02f6fb3bfea70bce1b0b15739f391671/template_2131.json",
"screenshotURL": "https://i.ibb.co/TxPtkh8f/fd19e9bb44cd.png",
"workflowUpdated": true,
"gistId": "bd58b890ea43effbe59d74661ea01a62",
"templateDescriptionFull": "When tracking your contacts and leads in Hubspot CRM, every new contact might be a potential customer. To guarantee that you're keeping the overview you'd normally need to look at every new lead that is coming in manually to identify high-quality leads to prioritize their engagement and optimize the sales process. This workflow saves the work and does it for you.\n\nThe workflow runs every 5 minutes. On every run, it checks the Hubspot CRM for contacts that were added since the last check. It then checks if they meet certain criteria (in this case if they are making +5m annual revenue) and alerts you in Slack for every match.\n\nAdd Hubspot, and Slack credentials.\nClick on Test workflow.\n\nChange the schedule interval\nAdjust the criteria to send alerts",
"isPaid": false
},
{
"templateId": "2130",
"templateName": "template_2130",
"templateDescription": "Use CaseWhenever someone shows interest in your offerings by subscribing to a list in ConvertKit it could be a potential new customer. Typically you need to...",
"templateUrl": "https://n8n.io/workflows/2130",
"jsonFileName": "template_2130.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_2130.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/61c1a958a51af794f3516cabe09e13e2/raw/e4662cab9d35cdc263f332147b1b69b98c4c725b/template_2130.json",
"screenshotURL": "https://i.ibb.co/nNXWP44h/9ed21248cb76.png",
"workflowUpdated": true,
"gistId": "61c1a958a51af794f3516cabe09e13e2",
"templateDescriptionFull": "Whenever someone shows interest in your offerings by subscribing to a list in ConvertKit it could be a potential new customer. Typically you need to gather more detailed information about them (data enrichment) and finally update their profile in your CRM system to better manage and nurture your relationship with them. This workflow does this all for you!\n\nThe workflow runs every time a user is subscribed to a ConvertKit list. It then filters out personal emails, before enriching the email. If the email is attached to a company it enriches the company and upserts it in your Hubspot CRM.\n\nAdd Clearbit, Hubspot, and ConvertKit credentials.\nClick on Test workflow.\nSubscribe to a list on ConvertKit to trigger the workflow.\n\nBe aware that you can adapt this workflow to work with your enrichment tool, CRM, and email automation tool of choice.",
"isPaid": false
},
{
"templateId": "4837",
"templateName": "Automated AI Lead Enrichment: Salesforce to Explorium for Enhanced Prospect Data",
"templateDescription": "Salesforce Lead Enrichment with Explorium Template Download the following json file and import it to a new n8n workflow: salesforce\\_workflow.json Automated...",
"templateUrl": "https://n8n.io/workflows/4837",
"jsonFileName": "Automated_AI_Lead_Enrichment_Salesforce_to_Explorium_for_Enhanced_Prospect_Data.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Automated_AI_Lead_Enrichment_Salesforce_to_Explorium_for_Enhanced_Prospect_Data.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/bc461e11596ecad25090991b2c4479ec/raw/b1f1685a12da9f0bc13489cfeecf5a3ca1a634a9/Automated_AI_Lead_Enrichment_Salesforce_to_Explorium_for_Enhanced_Prospect_Data.json",
"screenshotURL": "https://i.ibb.co/wrL9fWcx/06ff9231c1eb.png",
"workflowUpdated": true,
"gistId": "bc461e11596ecad25090991b2c4479ec",
"templateDescriptionFull": "Salesforce Lead Enrichment with Explorium\n\nDownload the following json file and import it to a new n8n workflow:\n\nsalesforce_workflow.json\n\n\n\nThis n8n workflow monitors your Salesforce instance for new leads and automatically enriches them with missing contact information. When a lead is created, the workflow:\n\nDetects the new lead via Salesforce trigger\nMatches the lead against Explorium's database using name and company\nEnriches the lead with professional email addresses and phone numbers\nUpdates the Salesforce lead record with the discovered contact information\n\nThis automation ensures your sales team always has the most up-to-date contact information for new leads, improving reach rates and accelerating the sales process.\n\nReal-time Processing: Triggers automatically when new leads are created in Salesforce\nIntelligent Matching: Uses lead name and company to find the correct person in Explorium's database\nContact Enrichment: Adds professional emails, mobile phones, and office phone numbers\nBatch Processing: Efficiently handles multiple leads to optimize API usage\nError Handling: Continues processing other leads even if some fail to match\nSelective Updates: Only updates leads that successfully match in Explorium\n\nBefore setting up this workflow, ensure you have:\n\nn8n instance (self-hosted or cloud)\nSalesforce account with:\n\nOAuth2 API access enabled\nLead object permissions (read/write)\nAPI usage limits available\nOAuth2 API access enabled\nLead object permissions (read/write)\nAPI usage limits available\nExplorium API credentials (Bearer token) - Get explorium api key\nBasic understanding of Salesforce lead management\n\nThe workflow expects these standard Salesforce lead fields:\n\nFirstName - Lead's first name\nLastName - Lead's last name\nCompany - Company name\nEmail - Will be populated/updated by the workflow\nPhone - Will be populated/updated by the workflow\nMobilePhone - Will be populated/updated 
by the workflow\n\nYour Salesforce integration user needs:\n\nRead access to Lead object\nWrite access to Lead object fields (Email, Phone, MobilePhone)\nAPI enabled on the user profile\nSufficient API calls remaining in your org limits\n\nCopy the workflow JSON from the template\nIn n8n: Navigate to Workflows → Add Workflow → Import from File\nPaste the JSON and click Import\n\nClick on the Salesforce Trigger node\nUnder Credentials, click Create New\nFollow the OAuth2 flow:\n\nClient ID: From your Salesforce Connected App\nClient Secret: From your Salesforce Connected App\nCallback URL: Copy from n8n and add to your Connected App\nClient ID: From your Salesforce Connected App\nClient Secret: From your Salesforce Connected App\nCallback URL: Copy from n8n and add to your Connected App\nAuthorize the connection\nSave the credentials as \"Salesforce account connection\"\n\nNote: Use the same credentials for all Salesforce nodes in the workflow.\n\nClick on the Match_prospect node\nUnder Credentials, click Create New (HTTP Header Auth)\nConfigure the header:\n\nName: Authorization\nValue: Bearer YOUR_EXPLORIUM_API_TOKEN\nName: Authorization\nValue: Bearer YOUR_EXPLORIUM_API_TOKEN\nSave as \"Header Auth account\"\nApply the same credentials to the Explorium Enrich Contacts Information node\n\nSalesforce Trigger:\n\nTrigger On: Lead Created\nPoll Time: Every minute (adjust based on your needs)\nTrigger On: Lead Created\nPoll Time: Every minute (adjust based on your needs)\nSalesforce Get Leads:\n\nOperation: Get All\nCondition: CreatedDate = TODAY (fetches today's leads)\nLimit: 20 (adjust based on volume)\nOperation: Get All\nCondition: CreatedDate = TODAY (fetches today's leads)\nLimit: 20 (adjust based on volume)\nLoop Over Items:\n\nBatch Size: 6 (optimal for API rate limits)\nBatch Size: 6 (optimal for API rate limits)\n\nSave the workflow\nToggle the Active switch to ON\nThe workflow will now monitor for new leads every minute\n\nSalesforce Trigger: Polls 
Salesforce every minute for new leads\nGet Today's Leads: Retrieves all leads created today to ensure none are missed\nLoop Over Items: Processes leads in batches of 6 for efficiency\nMatch Prospect: Searches Explorium for matching person using name + company\nFilter: Checks if a valid match was found\nExtract Prospect IDs: Collects all matched prospect IDs\nEnrich Contacts: Fetches detailed contact information from Explorium\nMerge: Combines original lead data with enrichment results\nSplit Out: Separates individual enriched records\nUpdate Lead: Updates Salesforce with new contact information\n\nThe workflow maps Explorium data to Salesforce fields as follows:\n\nOnce activated, the workflow runs automatically:\n\nChecks for new leads every minute\nProcesses any leads created since the last check\nUpdates leads with discovered contact information\nContinues running until deactivated\n\nTo test the workflow manually:\n\nCreate a test lead in Salesforce\nClick \"Execute Workflow\" in n8n\nMonitor the execution to see each step\nVerify the lead was updated in Salesforce\n\nTrack workflow performance:\n\nGo to Executions in n8n\nFilter by this workflow\nReview successful and failed executions\nCheck logs for any errors or issues\n\nNo leads are being processed\n\nVerify the workflow is activated\nCheck Salesforce API limits haven't been exceeded\nEnsure new leads have FirstName, LastName, and Company populated\nConfirm OAuth connection is still valid\n\nLeads not matching in Explorium\n\nVerify company names are accurate (not abbreviations)\nCheck that first and last names are properly formatted\nSome individuals may not be in Explorium's database\nTry testing with known companies/contacts\n\nContact information not updating\n\nCheck Salesforce field-level security\nVerify the integration user has edit permissions\nEnsure Email, Phone, and MobilePhone fields are writeable\nCheck for validation rules blocking updates\n\nAuthentication errors\n\nSalesforce: 
Re-authorize OAuth connection\nExplorium: Verify Bearer token is valid and not expired\nCheck API quotas haven't been exceeded\n\nThe workflow includes built-in error handling:\n\nFailed matches don't stop other leads from processing\nEach batch is processed independently\nFailed executions are logged for review\nPartial successes are possible (some leads updated, others skipped)\n\nEnsure complete lead data: FirstName, LastName, and Company should be populated\nUse full company names: \"Microsoft Corporation\" matches better than \"MSFT\"\nStandardize data entry: Consistent formatting improves match rates\n\nAdjust batch size: Lower if hitting API limits, higher for efficiency\nModify polling frequency: Every minute for high volume, less frequent for lower volume\nSet appropriate limits: Balance between processing speed and API usage\n\nData permissions: Ensure you have rights to enrich lead data\nGDPR compliance: Consider privacy regulations in your region\nData retention: Follow your organization's data policies\nAudit trail: Monitor who has access to enriched data\n\nAdd more Explorium enrichment by:\n\nAdding firmographic data (company size, revenue)\nIncluding technographic information\nAppending social media profiles\nAdding job title and department verification\n\nChange when enrichment occurs:\n\nTrigger on lead updates (not just creation)\nAdd specific lead source filters\nProcess only leads from certain campaigns\nInclude lead score thresholds\n\nEnhance with alerts:\n\nEmail sales reps when leads are enriched\nSend Slack notifications for high-value matches\nCreate tasks for leads that couldn't be enriched\nLog enrichment metrics to dashboards\n\nAPI calls: Each execution uses ~4 Salesforce API calls\nPolling frequency: Consider your daily API limit\nBatch processing: Reduces API usage vs. 
individual processing\n\nMatch API: One call per batch of leads\nEnrichment API: One call per batch of matched prospects\nRate limits: Respect your plan's requests per minute\n\nThis workflow can be part of a larger lead management system:\n\nLead Capture → This Workflow → Lead Scoring → Assignment\nCan trigger additional workflows based on enrichment results\nCompatible with existing Salesforce automation (Process Builder, Flows)\nWorks alongside other enrichment tools\n\nCredentials: Stored securely in n8n's credential system\nData transmission: Uses HTTPS for all API calls\nAccess control: Limit who can modify the workflow\nAudit logging: All executions are logged with details\n\nFor assistance with:\n\nn8n issues: Consult n8n documentation or community forum\nSalesforce integration: Reference Salesforce API documentation\nExplorium API: Contact Explorium support for API questions\nWorkflow logic: Review execution logs for debugging",
"isPaid": false
},
{
"templateId": "2129",
"templateName": "template_2129",
"templateDescription": "Use CaseWhen having a call with a new potential customer, one of the keys to getting the most out of the call is to find out as much information as you can...",
"templateUrl": "https://n8n.io/workflows/2129",
"jsonFileName": "template_2129.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_2129.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/07bab81e0ead70f4dfcac1089ecbf10f/raw/e8ae8f9729be1a0bd5be7d366ffd40b2b228f04c/template_2129.json",
"screenshotURL": "https://i.ibb.co/RTjZRQNr/4ab65d6e944a.png",
"workflowUpdated": true,
"gistId": "07bab81e0ead70f4dfcac1089ecbf10f",
"templateDescriptionFull": "When having a call with a new potential customer, one of the keys to getting the most out of the call is to find out as much information as you can about them before the call. Normally this involves a lot of manual research before every call. This workflow automates this tedious work for you.\n\nThe workflow runs every time a new call is booked via your Calendly. It then filters out personal emails, before enriching the email. If the email is attached to a company it enriches the company and upserts it in your Hubspot CRM.\n\nAdd Clearbit, Hubspot, and Calendly credentials.\nClick on Test workflow.\nBook a meeting on Calendly so the event starts the workflow.\nBe aware that you can adapt this workflow to work with your enrichment tool, CRM, and booking tool of choice.",
"isPaid": false
},
{
"templateId": "2122",
"templateName": "template_2122",
"templateDescription": "Use caseThis workflow automatically qualifies great leads from a form and sends them an email 😮..It also adds the user to Hubspot if not already added and...",
"templateUrl": "https://n8n.io/workflows/2122",
"jsonFileName": "template_2122.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_2122.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/68d57fe95ce14af4617f6f4fde8bcfc9/raw/525418666f1639048ebec964d8049b299bdd832e/template_2122.json",
"screenshotURL": "https://i.ibb.co/MyPpNLFt/66039aaaf6f8.png",
"workflowUpdated": true,
"gistId": "68d57fe95ce14af4617f6f4fde8bcfc9",
"templateDescriptionFull": "This workflow automatically qualifies great leads from a form and sends them an email 😮..\nIt also adds the user to Hubspot if not already added and records the outreach.\n\nAdd you MadKudu, Hunter, and Gmail credentials\nSetup your HubSpot Oauth2 creds using n8n docs\nSet the email content and subject\nClick the Test Workflow button, enter your email and check the Slack channel\nActivate the workflow and use the form trigger production URL to collect your leads in a smart way\n\nYou may want to raise or lower the threshold for your leads, as you see fit.\nYou also need to update the content (the email and the subject), obviously 😅.",
"isPaid": false
},
{
"templateId": "2118",
"templateName": "template_2118",
"templateDescription": "Use CaseFollowing up at the right time is one of the most important parts of sales. This workflow uses Gmail to send outreach emails to Hubspot contacts...",
"templateUrl": "https://n8n.io/workflows/2118",
"jsonFileName": "template_2118.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_2118.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/7cd7727101c043959dc5da18bd0a6725/raw/6c9008b49074830bd868f8ba0d8215eef1385eee/template_2118.json",
"screenshotURL": "https://i.ibb.co/ccyqhxTX/44354a3fc1d9.png",
"workflowUpdated": true,
"gistId": "7cd7727101c043959dc5da18bd0a6725",
"templateDescriptionFull": "Following up at the right time is one of the most important parts of sales. This workflow uses Gmail to send outreach emails to Hubspot contacts that have already been contacted only once more than a month ago, and records the engagement in Hubspot.\n\nSetup HubSpot Oauth2 creds (Be careful with scopes. They have to be exact, not less or more. Yes, it’s not simple, but it’s well documented in the n8n docs. Be smarter than me, read the docs)\nSetup Gmail creds.\nChange the email variables in the Set keys node\n\nThere's plenty to do here because the approach here is really just a starting point. Most important here is to figure out what your rules are to follow up. After a month? More than once?\n\nAlso, remember to update the follow-up email! Unless you want to sell n8n 😉",
"isPaid": false
},
{
"templateId": "2117",
"templateName": "template_2117",
"templateDescription": "Use caseTo guarantee an effective sales process deals must be distributed between sales reps in the best way. Normally, this involves manually assigning new...",
"templateUrl": "https://n8n.io/workflows/2117",
"jsonFileName": "template_2117.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_2117.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/5620c16ac529178a936979aef64a3cba/raw/f4e22fb95f599a168029be9c7ad936106b7c4697/template_2117.json",
"screenshotURL": "https://i.ibb.co/Ps6kxw1w/aacde34e0dfb.png",
"workflowUpdated": true,
"gistId": "5620c16ac529178a936979aef64a3cba",
"templateDescriptionFull": "To guarantee an effective sales process deals must be distributed between sales reps in the best way. Normally, this involves manually assigning new deals that have come in. This workflow automates it for you!\n\nThis workflow runs once a day and checks for unassigned deals in your Hubspot CRM. Once it finds one, it enriches the deal with information about the assigned contact and their company. It then checks the region of the assigned company before looking at the company's employee size. Based on this, it assigns the deal to the right sales rep within your company.\n\nNew deals in Hubspot need to be unassigned in the beginning\nNew deals have to have an attached contact that has an attached company in Hubspot\nThe company needs to have values for region and employee count in Hubspot\n\nThe setup is quite straight forward and will probably take a few minutes only.\n\nAdd your Hubspot credentials\nCustomize your criterias for assigning deals in the Assign by Region and the following Assign nodes\nMake sure deals are assigned to the right salesrep in the Hubspot nodes at the end\nActivate the workflow\n\nAdjust the trigger interval to your needs. Currently, it defaults to once a day\nAdjust your region settings by adding/updating/removing options in the respective node\nAdjust your employee size settings by adding/updating/removing options in the respective node\n\nWrap each region's assigned criteria into different sub-workflows for easier maintainability. This will not consume additional execution counts.\nAdd more logic on what happens once a deal does not match any criteria you've set",
"isPaid": false
},
{
"templateId": "3958",
"templateName": "template_3958",
"templateDescription": "This n8n workflow streamlines the onboarding process for new customers by automating personalized email communication, calendar scheduling, and contact...",
"templateUrl": "https://n8n.io/workflows/3958",
"jsonFileName": "template_3958.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_3958.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/4cfe81e4f6162b7b4000abcbff2a1a68/raw/5823ab9b710d4ec4beb8c1e93d0bd28dbf14dc3a/template_3958.json",
"screenshotURL": "https://i.ibb.co/pYYW6Nk/8b4417bdea1b.png",
"workflowUpdated": true,
"gistId": "4cfe81e4f6162b7b4000abcbff2a1a68",
"templateDescriptionFull": "This n8n workflow streamlines the onboarding process for new customers by automating personalized email communication, calendar scheduling, and contact assignment in HubSpot. It is perfect for businesses looking to ensure a smooth and personalized onboarding experience for new clients.\n\nCustomer success teams who need to onboard new clients efficiently.\nSales teams who want to ensure smooth transitions from prospect to customer.\nSmall businesses that want to automate customer onboarding without complex systems.\n\nThis workflow reduces the manual effort involved in onboarding new customers by:\n\nAutomatically sending personalized welcome emails.\nScheduling a welcome meeting using a calendar tool.\nAssigning the customer to a Customer Success Manager (CSM) in HubSpot.\n\nTrigger via Webhook or HubSpot:\n\nThe workflow can be triggered either by a webhook (direct API call) or a HubSpot trigger (e.g., when a new contact is created).\nHubSpot Connection:\n\nRetrieves the list of HubSpot owners (users with contact access).\nIdentifies the owner of the new contact.\nCalendar Management:\n\nUtilizes a Calendar Agent to schedule a welcome meeting with the new customer.\nThe Calendar Agent can create, update, or delete events as needed.\nPersonalized Email Creation:\n\nUses an AI-powered Email Writer (OpenAI) to generate a personalized welcome email.\nTransforms the email text into HTML for a polished format.\nEmail Sending via Gmail:\n\nSends the personalized email to the customer using Gmail.\nSets the new contact’s owner in HubSpot for further communication tracking.\n\nWebhook Setup in n8n:\n\nCreate a new workflow and add a Webhook node.\nSet the Webhook URL path (e.g., /webhook-customer-onboarding).\nMake sure the workflow is active.\nWebhook Setup in HubSpot:\n\nGo to HubSpot Developer Account.\nNavigate to Settings > Integrations > Webhooks.\nCreate a new webhook and set the URL as the n8n Webhook URL.\nChoose POST as the request method.\nTest the webhook to ensure it triggers the workflow in n8n.\nCalendar Agent Configuration:\n\nThe Calendar Agent can be configured to create, update, or delete events.\nConnect it to your calendar tool (Google Calendar, Outlook, etc.).\nCustomize the calendar event details (title, description, time).\nEmail Writer Setup:\n\nCustomize the AI prompt in the Email Writer node to match your brand’s voice.\nAdjust the email text format for your specific needs.\nGmail Integration:\n\nConnect your Gmail account in n8n.\nSet the recipient email to the new customer’s email address.\n\nModify the AI-Powered Email:\n\nAdjust the email prompt for the AI model to create a different welcome message.\nChange the email format or add custom variables (e.g., customer name, service details).\nCustomize Calendar Settings:\n\nSet default time slots for welcome meetings.\nSpecify which calendar to use for scheduling.\nAdd Additional Steps:\n\nExtend the workflow to automatically assign the customer to a specific HubSpot list.\nAdd a follow-up email or survey after the welcome meeting.\n\nThis workflow is perfect for businesses seeking an efficient and personalized onboarding process, ensuring new customers feel welcomed and supported from day one.",
"isPaid": false
},
{
"templateId": "2116",
"templateName": "template_2116",
"templateDescription": "Use case: When collecting leads via a form you're typically facing a few problems: Often end up with a bunch of leads who don't have a valid email address. You...",
"templateUrl": "https://n8n.io/workflows/2116",
"jsonFileName": "template_2116.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_2116.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/9d50ad4e96cf02d3eb11974eb12e0618/raw/54b54b174eeb09bbcefb7ccd912982a6128b877c/template_2116.json",
"screenshotURL": "https://i.ibb.co/zWYJ9371/b2fdd8637002.png",
"workflowUpdated": true,
"gistId": "9d50ad4e96cf02d3eb11974eb12e0618",
"templateDescriptionFull": "When collecting leads via a form you're typically facing a few problems:\n\nOften end up with a bunch of leads who don't have a valid email address\nYou want to know as much about the new lead as possible but also want to keep the form short\nAfter forms are submitted you have to walk through the submissions and see which you want to add to your CRM\n\nThis workflow helps you to fix all those problems.\n\nThe workflow checks every new form submission and verifies the email using Hunter.io. If the email is valid, it then tries to enrich the person using Clearbit and saves the new lead into your HubSpot CRM.\n\nAdd your Hunter, Clearbit, and HubSpot credentials\nClick the Test Workflow button, enter your email, and check your HubSpot\nActivate the workflow and use the form trigger production URL to collect your leads in a smart way\n\nChange the form to the form you need in your use case (e.g. Typeform, Google Forms, SurveyMonkey etc.)\nAdd criteria before an account is added to your CRM. This could for example be the size of company, industry etc. You can find some inspiration in our other template Reach out via Email to new form submissions that meet a certain criteria\nAdd more data sources to save the new lead in",
"isPaid": false
},
{
"templateId": "2112",
"templateName": "template_2112",
"templateDescription": "Use case This workflow uses Gmail to send outreach emails to new Hubspot contacts that have yet to be contacted (usually unknown contacts), and records the...",
"templateUrl": "https://n8n.io/workflows/2112",
"jsonFileName": "template_2112.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_2112.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/175884f1bca5886ee424d393d73a8202/raw/97ab8c1e1e6934841b66b34f34968bca9282f933/template_2112.json",
"screenshotURL": "https://i.ibb.co/ttZSrFr/58fa10f5221b.png",
"workflowUpdated": true,
"gistId": "175884f1bca5886ee424d393d73a8202",
"templateDescriptionFull": "This workflow uses Gmail to send outreach emails to new HubSpot contacts that have yet to be contacted (usually unknown contacts), and records the outreach in HubSpot.\n\nSet up HubSpot OAuth2 creds (Be careful with scopes. They have to be exact, not less or more. Yes, it's not simple, but it's well documented in the n8n docs. Be smarter than me, read the docs)\nSet up Gmail creds.\nChange the from email and from name in the Record outreach in HubSpot node\n\nChange the email message in the Set keys node\nThink about your criteria to reach out to new contacts. Here we simply filter for only contacts with unknown dates.",
"isPaid": false
},
{
"templateId": "2111",
"templateName": "template_2111",
"templateDescription": "Use Case This workflow aims to enrich new contacts in HubSpot. The more relevant the HubSpot profile, the more useful it is. Once active, this n8n workflow...",
"templateUrl": "https://n8n.io/workflows/2111",
"jsonFileName": "template_2111.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_2111.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/ac81ab391af63f39f56889394634d64f/raw/46cdebd4c537b0922495a57ae0e6f79e248958f7/template_2111.json",
"screenshotURL": "https://i.ibb.co/KzW8SRgn/b0bf193248da.png",
"workflowUpdated": true,
"gistId": "ac81ab391af63f39f56889394634d64f",
"templateDescriptionFull": "This workflow aims to enrich new contacts in HubSpot. The more relevant the HubSpot profile, the more useful it is. Once active, this n8n workflow will update the social profiles, contact data (phone, email) as well as location data from ExactBuyer.\n\nAdd HubSpot trigger credential (be careful, scopes must be exactly as in the n8n docs)\nAdd your ExactBuyer API key\nAdd HubSpot credential for the update node (be careful, scopes must be the same as the n8n docs for this. This is different from the trigger cred)\nActivate workflow\n\nThere's plenty of interesting info that ExactBuyer returns that could be helpful. Take a look and update this workflow to add what you need.",
"isPaid": false
},
{
"templateId": "1855",
"templateName": "template_1855",
"templateDescription": "This workflow pushes Stripe charges to HubSpot contacts. It uses the Stripe API to get all charges and the HubSpot API to update the contacts. The workflow...",
"templateUrl": "https://n8n.io/workflows/1855",
"jsonFileName": "template_1855.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_1855.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/282184a9a503e306d5f8f7942d3b49e5/raw/f0c4e0980caffe091af5b0e386b2ecf0b85d984c/template_1855.json",
"screenshotURL": "https://i.ibb.co/nNXWP44h/9ed21248cb76.png",
"workflowUpdated": true,
"gistId": "282184a9a503e306d5f8f7942d3b49e5",
"templateDescriptionFull": "This workflow pushes Stripe charges to HubSpot contacts. It uses the Stripe API to get all charges and the HubSpot API to update the contacts. The workflow will create a new HubSpot property to store the total amount charged. If the property already exists, it will update the property.\n\nStripe credentials.\nHubSpot credentials.\n\nOn a schedule, check if the property exists in HubSpot. If it doesn't exist, create it. The default schedule is once a day at midnight.\nOnce the property is ascertained, the first Stripe node gets all charges.\nOnce the charges are returned, the second Stripe node gets extra customer information.\nOnce the customer information is returned, the Merge data node will merge the customer information with the charges so that the next node, Aggregate totals, can calculate the total amount charged per contact.\nOnce we have the total amount charged per contact, the Create or update customer node will create a new HubSpot contact if it doesn't exist or update the contact if it does exist with the total amount charged.",
"isPaid": false
},
{
"templateId": "1771",
"templateName": "template_1771",
"templateDescription": "This workflow sends new Mailchimp subscribers to HubSpot as new or updated contacts. PrerequisitesMailchimp account and Mailchimp credentialsHubSpot account...",
"templateUrl": "https://n8n.io/workflows/1771",
"jsonFileName": "template_1771.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_1771.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/89d71d468bfebe77937686c441d7326c/raw/27b12cb319d527d78ff4f75362156c39b76334c9/template_1771.json",
"screenshotURL": "https://i.ibb.co/SD6JgqMj/d464ae951802.png",
"workflowUpdated": true,
"gistId": "89d71d468bfebe77937686c441d7326c",
"templateDescriptionFull": "This workflow sends new Mailchimp subscribers to HubSpot as new or updated contacts.\n\nMailchimp account and Mailchimp credentials\nHubSpot account and HubSpot credentials\n\nCron node triggers this workflow every day at 7:00.\nMailchimp node searches for new subscribers.\nNew Mailchimp subscribers get sent to HubSpot.\nHubSpot node either updates the existing contact or adds a new one to the pipeline.",
"isPaid": false
},
{
"templateId": "3033",
"templateName": "template_3033",
"templateDescription": "CallForge - AI Gong Transcript PreProcessor: Transform your Gong.io call transcripts into structured, enriched, and AI-ready data for better sales insights...",
"templateUrl": "https://n8n.io/workflows/3033",
"jsonFileName": "template_3033.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_3033.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/a5e577e990b77aae60e4657528adf7f1/raw/2c7e936757eac07a49eaadd704f9c3005277bd6d/template_3033.json",
"screenshotURL": "https://i.ibb.co/Kjtn4SXw/8dc13d9ad917.png",
"workflowUpdated": true,
"gistId": "a5e577e990b77aae60e4657528adf7f1",
"templateDescriptionFull": "Transform your Gong.io call transcripts into structured, enriched, and AI-ready data for better sales insights and analytics.\n\nThis workflow is designed for:\n✅ Sales teams looking to automate call transcript formatting.\n✅ Revenue operations (RevOps) professionals optimizing AI-driven insights.\n✅ Businesses using Gong.io that need structured, enriched call transcripts for better decision-making.\n\nManually processing raw Gong call transcripts is inefficient and often lacks essential context for AI-driven insights.\n\nWith CallForge, you can:\n✔ Extract and format Gong call transcripts for structured AI processing.\n✔ Enhance metadata using sales data from Salesforce.\n✔ Classify speakers as internal (sales team) or external (customers).\n✔ Identify external companies by filtering out free email domains (e.g., Gmail, Yahoo).\n✔ Enrich customer profiles using PeopleDataLabs to identify company details and locations.\n✔ Prepare transcripts for AI models by structuring conversations and removing unnecessary noise.\n\nCalls the Gong API to extract call metadata, speaker interactions, and collaboration details.\nFetches call transcripts for AI processing.\n\nConverts call transcripts into structured, speaker-based dialogues.\nAssigns each speaker as either Internal (Sales Team) or External (Customer).\n\nRetrieves Salesforce data to match customers with existing sales opportunities.\nFilters out free email domains to determine the customer’s actual company domain.\nCalls the PeopleDataLabs API to retrieve additional company data and location details.\n\nCombines Gong metadata, Salesforce customer details and insights.\nEnsures all necessary data is available for AI-driven sales insights.\n\nMerges all call transcript data into a single structured format for AI analysis.\nExtracts the final cleaned, enriched dataset for further AI-powered insights.\n\n🔹 Gong API Access – Set up your Gong API credentials in n8n.\n🔹 Salesforce Setup – Ensure 
API access if you want customer enrichment.\n🔹 PeopleDataLabs API – Required to retrieve company and location details based on email domains.\n🔹 Webhook Integration – Modify the webhook call to push enriched call data to an internal system.\n\nCallForge - 01 - Filter Gong Calls Synced to Salesforce by Opportunity Stage\nCallForge - 02 - Prep Gong Calls with Sheets & Notion for AI Summarization\nCallForge - 03 - Gong Transcript Processor and Salesforce Enricher\nCallForge - 04 - AI Workflow for Gong.io Sales Calls\nCallForge - 05 - Gong.io Call Analysis with Azure AI & CRM Sync\nCallForge - 06 - Automate Sales Insights with Gong.io, Notion & AI\nCallForge - 07 - AI Marketing Data Processing with Gong & Notion\nCallForge - 08 - AI Product Insights from Sales Calls with Notion\n\n💡 Modify Data Sources – Connect different CRMs (e.g., HubSpot, Zoho) instead of Salesforce.\n💡 Expand AI Analysis – Add another AI model (e.g., OpenAI GPT, Claude) for advanced conversation insights.\n💡 Change Speaker Classification Rules – Adjust internal vs. external speaker logic to match your team’s structure.\n💡 Filter Specific Customers – Modify the free email filtering logic to better fit your company’s needs.\n\n🚀 Automate Gong call transcript processing to save time.\n📊 Improve AI accuracy with enriched, structured data.\n🛠 Enhance sales strategy by extracting actionable insights from calls.\n\nStart optimizing your Gong transcript analysis today!",
"isPaid": false
},
{
"templateId": "3031",
"templateName": "template_3031",
"templateDescription": "Who is this for? This workflow is designed for sales and revenue teams using Gong and Salesforce to track and analyze sales calls. It...",
"templateUrl": "https://n8n.io/workflows/3031",
"jsonFileName": "template_3031.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_3031.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/f2562d781b51645cc0224ac167398aa9/raw/3e4efce0a7882e7bf3624b44a2d65430395ab2b1/template_3031.json",
"screenshotURL": "https://i.ibb.co/MxM0BP43/6d72c4e402d6.png",
"workflowUpdated": true,
"gistId": "f2562d781b51645cc0224ac167398aa9",
"templateDescriptionFull": "This workflow is designed for sales and revenue teams using Gong and Salesforce to track and analyze sales calls. It helps automate the extraction, filtering, and preprocessing of Gong call data for further AI analysis.\n\nSales teams often generate large amounts of call data, but not all calls are relevant for deeper analysis. This workflow filters calls based on predefined criteria, extracts relevant metadata, and formats the data before passing it to an AI processing pipeline.\n\nTriggers on new Gong calls synced to Salesforce every hour.\nFilters calls based on opportunity stage (Discovery or Meeting Booked).\nRetrieves Gong call details via API.\nFormats call data into a structured JSON object for AI processing.\nPasses the structured data to a Gong Call Preprocessor workflow for further insights.\n\nEnsure that you have connected Salesforce and Gong APIs with valid credentials.\nModify the Salesforce query in Get all custom Salesforce Gong Objects to match your organization’s requirements.\nSet the schedule trigger interval in the Run Hourly node if needed.\nConnect this workflow to an AI processing workflow to analyze call transcripts.\n\nWorkflow Templates:\n\nCallForge - 01 - Filter Gong Calls Synced to Salesforce by Opportunity Stage\nCallForge - 02 - Prep Gong Calls with Sheets & Notion for AI Summarization\nCallForge - 03 - Gong Transcript Processor and Salesforce Enricher\nCallForge - 04 - AI Workflow for Gong.io Sales Calls\nCallForge - 05 - Gong.io Call Analysis with Azure AI & CRM Sync\nCallForge - 06 - Automate Sales Insights with Gong.io, Notion & AI\nCallForge - 07 - AI Marketing Data Processing with Gong & Notion\nCallForge - 08 - AI Product Insights from Sales Calls with Notion\n\nChange filtering logic: Adjust the opportunity stage filter (Check if Opportunity Stage is Meeting Booked or Discovery) to match your sales process.\nModify data formatting: Add or remove fields in the Format call into correct JSON Object 
node to customize the output.\nAdjust trigger frequency: Change the Run Hourly node to run at a different interval if required.",
"isPaid": false
},
{
"templateId": "1794",
"templateName": "template_1794",
"templateDescription": "This workflow shows a no code approach to creating Salesforce accounts and contacts based on data coming from an Excel file. For Excel 365 (the online...",
"templateUrl": "https://n8n.io/workflows/1794",
"jsonFileName": "template_1794.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_1794.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/64b4d0a8e08c52acbf704911e21de257/raw/d855abf1e65da07e9a54c9c9711d5f15c2337b31/template_1794.json",
"screenshotURL": "https://i.ibb.co/QFJftmD7/460f07180a5e.png",
"workflowUpdated": true,
"gistId": "64b4d0a8e08c52acbf704911e21de257",
"templateDescriptionFull": "This workflow shows a no code approach to creating Salesforce accounts and contacts based on data coming from an Excel file. For Excel 365 (the online version of Microsoft Excel) check out this workflow instead.\n\n\n\nTo run the workflow:\n\nMake sure your Salesforce account is authenticated with n8n.\nHave a Microsoft Excel workbook with contacts and their account names ready. The workflow uses this example file, but you probably want to use your own data instead.\nHit the Execute Workflow button at the bottom of the n8n canvas.\n\nHere is how it works:\n\nThe workflow first searches for existing Salesforce accounts by name. It then branches out depending on whether the account already exists in Salesforce or not. If an account does not exist yet, it will be created. The data is then normalised before both branches converge again. Finally the contacts are created or updated as needed in Salesforce.",
"isPaid": false
},
{
"templateId": "1793",
"templateName": "template_1793",
"templateDescription": "This workflow shows a no code approach to creating Salesforce accounts and contacts based on data coming from Excel 365 (the online version of Microsoft...",
"templateUrl": "https://n8n.io/workflows/1793",
"jsonFileName": "template_1793.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_1793.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/212597b03af0e085424eef12fcad23b2/raw/620730ed1575d9cd71bcb23603325f155a6a4169/template_1793.json",
"screenshotURL": "https://i.ibb.co/twqNxq2D/b953c4e0f231.png",
"workflowUpdated": true,
"gistId": "212597b03af0e085424eef12fcad23b2",
"templateDescriptionFull": "This workflow shows a no code approach to creating Salesforce accounts and contacts based on data coming from Excel 365 (the online version of Microsoft Excel). For a version working with regular Excel files check out this workflow instead.\n\n\n\nTo run the workflow:\n\nMake sure you have both Excel 365 and Salesforce authenticated with n8n.\nHave a Microsoft Excel workbook with contacts and their account names ready:\nSelect the workbook and sheet in the Microsoft Excel node of the workflow, then configure the range to read data from:\nHit the Execute Workflow button at the bottom of the n8n canvas:\n\nHere is how it works:\n\nThe workflow first searches for existing Salesforce accounts by name. It then branches out depending on whether the account already exists in Salesforce or not. If an account does not exist yet, it will be created. The data is then normalised before both branches converge again. Finally the contacts are created or updated as needed in Salesforce.",
"isPaid": false
},
{
"templateId": "1792",
"templateName": "template_1792",
"templateDescription": "This workflow shows a no code approach to creating Salesforce accounts and contacts based on data coming from Google Sheets. To run the workflow: Make sure...",
"templateUrl": "https://n8n.io/workflows/1792",
"jsonFileName": "template_1792.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_1792.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/bb74400e953809c92c0f9046a59ffd7f/raw/3000c42bfdbf771d6adf5b4953e1ef50a205a18a/template_1792.json",
"screenshotURL": "https://i.ibb.co/j9DLZ6rm/82ab1b76c5e9.png",
"workflowUpdated": true,
"gistId": "bb74400e953809c92c0f9046a59ffd7f",
"templateDescriptionFull": "This workflow shows a no code approach to creating Salesforce accounts and contacts based on data coming from Google Sheets.\n\n\n\nTo run the workflow:\n\nMake sure you have both Google Sheets and Salesforce authenticated with n8n.\nHave a Google Sheet with contacts and their account names ready, copy the respective sheet ID from the URL:\nAdd the sheet ID to the Google Sheet node of the workflow:\nHit Execute Workflow\n\nHere is how it works:\n\nThe workflow first searches for existing Salesforce accounts by name. It then branches out depending on whether the account already exists in Salesforce or not. If an account does not exist yet, it will be created. The data is then normalised before both branches converge again. Finally the contacts are created or updated as needed in Salesforce.",
"isPaid": false
},
{
"templateId": "4031",
"templateName": "Cold Outreach Automation: Scrape Local Leads with Dumpling AI & Call via Vapi",
"templateDescription": "Who is this for? This template is for sales teams, agencies, or local service providers who want to quickly generate cold outreach lists and automatically...",
"templateUrl": "https://n8n.io/workflows/4031",
"jsonFileName": "Cold_Outreach_Automation_Scrape_Local_Leads_with_Dumpling_AI__Call_via_Vapi.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Cold_Outreach_Automation_Scrape_Local_Leads_with_Dumpling_AI__Call_via_Vapi.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/00b6761f02b6f317bf462e819eb9e491/raw/12ac66ce22aacb4844f6b4ca190a4ad7b73bc412/Cold_Outreach_Automation_Scrape_Local_Leads_with_Dumpling_AI__Call_via_Vapi.json",
"screenshotURL": "https://i.ibb.co/vt1wzHT/b4ff0f0a7757.png",
"workflowUpdated": true,
"gistId": "00b6761f02b6f317bf462e819eb9e491",
"templateDescriptionFull": "This template is for sales teams, agencies, or local service providers who want to quickly generate cold outreach lists and automatically call local businesses with a Vapi AI assistant. It’s perfect for automating cold calls from scraped local listings with no manual dialing or research.\n\nFinding leads and initiating outreach calls can be time-consuming. This workflow automates the process: it scrapes business listings from Google Maps using Dumpling AI, extracts phone numbers, filters out incomplete data, formats the numbers, and uses Vapi to make outbound AI-powered calls. Every call is logged in Google Sheets for follow-up and tracking.\n\nStarts manually and pulls search queries (e.g., \"plumbers in Austin\") from Google Sheets.\nSends each query to Dumpling AI’s Google Maps scraping endpoint.\nSplits the returned business data into individual leads.\nExtracts key info like business name, website, and phone number.\nFilters to only keep leads with valid phone numbers.\nFormats phone numbers for Vapi dialing (adds +1).\nCalls each business using Vapi AI.\nLogs each successful call in a Google Sheet.\n\nGoogle Sheets Setup\n\nCreate a sheet with business search queries in the first column (e.g., best+restaurants+in+Chicago)\nMake sure the tab name is set and authorized in your credentials.\nConnect your Google Sheets account in the Get Search Keywords from Google Sheets node.\nDumpling AI Setup\n\nGo to dumplingai.com\nGenerate an API Key and connect it as a header token in the Scrape Google Map Businesses using Dumpling AI node\nVapi Setup\n\nSign into Vapi and create an assistant\nGet your assistantId and phoneNumberId\nInsert these into the JSON payload of the Initiate Vapi AI Call to Business node\nAdd your Vapi API key to the credentials section\nCall Logging\n\nCreate another tab in your sheet (e.g., “leads”) with these headers:\n\ncompany name\nphone number\nwebsite\n\nThis will be used in the Log Called Business Info to Sheet node\n\nModify the business search terms in your Google Sheet to target specific industries or locations.\nAdd filters to exclude certain businesses based on ratings, keywords, or location.\nUpdate your Vapi assistant script to match the type of outreach or pitch you’re using.\nAdd additional integrations (e.g., CRM logging, Slack notifications, follow-up emails).\nChange the trigger to run on a schedule or webhook instead of manually.\n\nStart Workflow Manually: Initiates the automation manually for testing or controlled runs.\nGet Search Keywords from Google Sheets: Reads search phrases from the spreadsheet.\nScrape Google Map Businesses using Dumpling AI: Sends each search query to Dumpling AI and receives matching local business data.\nSplit Each Business Result: Breaks the returned array of businesses into individual records for processing.\nExtract Business Name, Phone and website: Extracts title, phone, and website from each business record.\nFilter Valid Phone Numbers Only: Ensures only entries with a phone number move forward.\nFormat Phone Number for Calling: Adds a +1 country code and strips non-numeric characters.\nInitiate Vapi AI Call to Business: Uses the business name and number to initiate a Vapi AI outbound call.\nLog Called Business Info to Sheet: Appends business details into a Google Sheet for tracking.\n\nYou must have valid API keys and authorized connections for Dumpling AI, Google Sheets, and Vapi.\nMake sure to handle API rate limits if you're running the workflow on large datasets.\nThis workflow is optimized for US-based leads (+1 country code); adjust the formatting node if calling internationally.",
"isPaid": false
},
{
"templateId": "4775",
"templateName": "LinkedIn Job Finder Automation using Bright Data API & Google Sheets Integration",
"templateDescription": "💼 LinkedIn Job Finder Automation using Bright Data API & Google Sheets A comprehensive n8n automation that searches LinkedIn job postings using Bright...",
"templateUrl": "https://n8n.io/workflows/4775",
"jsonFileName": "LinkedIn_Job_Finder_Automation_using_Bright_Data_API__Google_Sheets_Integration.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/LinkedIn_Job_Finder_Automation_using_Bright_Data_API__Google_Sheets_Integration.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/4ac58889d9c7f28667c40886121831fa/raw/d26fcfea7d4a82e4c679b8764ab17f4431db2d6e/LinkedIn_Job_Finder_Automation_using_Bright_Data_API__Google_Sheets_Integration.json",
"screenshotURL": "https://i.ibb.co/twqNxq2D/b953c4e0f231.png",
"workflowUpdated": true,
"gistId": "4ac58889d9c7f28667c40886121831fa",
"templateDescriptionFull": "A comprehensive n8n automation that searches LinkedIn job postings using Bright Data’s API and automatically organizes results in Google Sheets for efficient job hunting and recruitment workflows.\n\nThis workflow provides an automated LinkedIn job search solution that collects job postings based on your search criteria and organizes them in Google Sheets. Perfect for job seekers, recruiters, HR professionals, and talent acquisition teams.\n\n🔍 Smart Job Search: Form-based input for city, job title, country, and job type\n🛍 LinkedIn Integration: Uses Bright Data’s LinkedIn dataset for accurate job posting data\n📊 Automated Organization: Populates Google Sheets with structured job data\n📧 Real-time Processing: Processes job search requests in real-time\n📈 Data Storage: Stores job details including company info, locations, and apply links\n🔄 Batch Processing: Handles multiple job postings efficiently\n⚡ Fast & Reliable: Built-in error handling for scraping\n🎯 Customizable Filters: Advanced job filtering based on criteria\n\nJob Search Criteria: City, job title, country, and optional job type\nSearch Parameters: Configurable filters and limits\nOutput Preferences: Google Sheets destination\n\nForm Submission\nData Request to Bright Data API\nStatus Monitoring\nData Extraction\nData Filtering\nSheet Update\nError Handling\n\nField\n\nDescription\n\nExample\n\nJob Title\n\nPosition title from posting\n\nSenior Software Engineer\n\nCompany Name\n\nEmployer company name\n\nTech Solutions Inc.\n\nJob Detail\n\nJob summary/description\n\nRemote position requiring 5+ years…\n\nLocation\n\nJob location\n\nSan Francisco, CA\n\nCompany URL\n\nCompany profile link\n\nView Profile\n\nApply Link\n\nDirect application link\n\nApply Now\n\nn8n instance (self-hosted or cloud)\nGoogle account with Sheets access\nBright Data account with LinkedIn dataset access\n\nImport the Workflow: Use JSON import in n8n\nConfigure Bright Data: Add API credentials and 
dataset ID\nConfigure Google Sheets: Create sheet, set credentials, map columns\nUpdate Workflow Settings: Replace placeholders with your actual data\nTest & Activate: Submit test form and verify data in Google Sheets\n\nGo to your webhook URL and fill in the form with:\n\nCity: e.g., New York\nJob Title: e.g., Software Engineer\nCountry: e.g., US\nJob Type: Optional (Full-Time, Remote, etc.)\n\nComprehensive job data\nCompany info and profile links\nDirect application links\nLocation and job descriptions\n\nEdit the Create Snapshot ID node to change:\n\nTime range (e.g., “Past month”)\nResult limits\nCompany filters\n\nMore Data Points: Add salary, seniority, applicants, etc.\nCustom Form Fields: Add filters for salary, experience, industry\nMultiple Sheets: Route results by job type or location\n\nBright Data connection failed: Check API credentials and dataset access\nNo job data extracted: Verify search parameters and API limits\nGoogle Sheets permission denied: Re-authenticate and check sharing\nForm not working: Check webhook URL and field mappings\nFilter issues: Review logic and data types\nExecution failed: Check logs, retry logic, and network status\n\nJob Seeker Dashboard: Automate job search and track applications\nRecruitment Pipeline: Source candidates and monitor hiring trends\nMarket Research: Analyze job trends and salary benchmarks\nHR Analytics: Support workforce planning and competitive insights\n\nBatch Processing: Queue multiple searches with delays\nSearch History: Track and analyze past searches\nTool Integration: Connect to CRM, Slack, databases, BI tools\n\nProcessing Time: 30–60 seconds per search\nConcurrent Requests: 2–3 (depends on Bright Data plan)\nData Accuracy: 95%+\nSuccess Rate: 90%+\nDaily Capacity: 50–200 searches\nMemory: ~50MB per execution\nAPI Calls: 3–4 Bright Data + 1 Google Sheets per search\n\nn8n Community: community.n8n.io\nDocumentation: docs.n8n.io\nBright Data Support: Via your Bright Data dashboard\nGitHub Issues: 
Report bugs and request features\n\nYour workflow is ready for automated LinkedIn job searching. Customize it to your recruiting or job search needs.\n\nWebhook URL: https://your-n8n-instance.com/webhook/linkedin-job-finder\n\n* ✅ Job Title\n* ✅ Company Information\n* ✅ Location Data\n* ✅ Job Details\n* ✅ Application Links\n* ✅ Processing Timestamps\n\n### Use Cases:\n\n* 🔍 Job Search Automation\n* 📊 Recruitment Intelligence\n* 📝 Market Research\n* 🎯 HR Analytics",
"isPaid": false
},
{
"templateId": "4352",
"templateName": "Social_media_post _automation_from_google_trends_and _perplexity copy",
    "templateDescription": "This comprehensive n8n workflow automatically transforms trending Google search queries into engaging LinkedIn posts using AI....",
"templateUrl": "https://n8n.io/workflows/4352",
"jsonFileName": "Social_media_post__automation_from_google_trends_and__perplexity_copy.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Social_media_post__automation_from_google_trends_and__perplexity_copy.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/45036e90e3ee4e9bb4f7b189beeef1dd/raw/7657d27f66467cd540a895034d612562f569ec5c/Social_media_post__automation_from_google_trends_and__perplexity_copy.json",
"screenshotURL": "https://i.ibb.co/9kgK319M/5ac23acb1970.png",
"workflowUpdated": true,
"gistId": "45036e90e3ee4e9bb4f7b189beeef1dd",
"templateDescriptionFull": "This comprehensive n8n workflow automatically transforms trending Google search queries into engaging LinkedIn posts using AI. The system runs autonomously, discovering viral topics, researching content, and publishing professionally formatted posts to grow your social media presence.\n\nAutomate your entire social media content pipeline - from trend discovery to publication. This workflow monitors Google Trends, selects high-potential topics, creates human-like content using advanced AI, and publishes across multiple social platforms with built-in tracking.\n\nAutomated Trend Discovery: Pulls trending topics from Google Trends API with customizable filters\nIntelligent Topic Selection: AI chooses the most relevant trending topic for your niche\nMulti-AI Content Generation: Combines Perplexity for research and OpenAI for content curation\nHuman-Like Writing: Advanced prompts eliminate AI detection markers\nLinkedIn Optimization: Proper formatting with Unicode characters, emojis, and engagement hooks\nMulti-Platform Support: Ready for LinkedIn, Twitter/X, and Facebook posting\nAutomated Scheduling: Configurable posting times (default: 6 AM & 6 PM daily)\nPerformance Tracking: Automatic logging to Google Sheets with timestamps and metrics\nError Handling: Built-in delays and retry mechanisms for API stability\n\nSchedule Trigger: Automated execution at specified intervals\nGoogle Trends API: Fetches trending search queries with geographical filtering\nData Processing: JavaScript code node filters high-volume keywords (30+ search volume)\nTopic Selection: OpenAI GPT-3.5 evaluates and selects optimal trending topic\nContent Research: Perplexity AI researches selected topic for current information\nContent Generation: Advanced prompt engineering creates LinkedIn-optimized posts\nContent Distribution: Multi-platform posting with platform-specific formatting\nAnalytics Tracking: Google Sheets integration for performance monitoring\n\nSchedule 
Trigger: Configurable timing for automated execution\nHTTP Request (Google Trends): SerpAPI integration for trend data\nSet Node: Structures trending data for processing\nCode Node: JavaScript filtering for high-volume keywords\nOpenAI Node: Intelligent topic selection based on relevance and trend strength\nHTTP Request (Perplexity): Advanced AI research with anti-detection prompts\nWait Node: Rate limiting and API respect\nSplit Out: Prepares content for multi-platform distribution\nLinkedIn Node: Authenticated posting with community management\nGoogle Sheets Node: Automated tracking and analytics\nSocial Media Nodes: Twitter/X, LinkedIn and Facebook ready for activation\n\nContent Creators: Maintain consistent posting schedules with trending content\nMarketing Agencies: Scale content creation across multiple client accounts\nBusiness Development: Build thought leadership with timely industry insights\nPersonal Branding: Establish authority by commenting on trending topics\nSEO Professionals: Create content around high-search-volume keywords\n\nSerpAPI: Google Trends data access\nPerplexity AI: Advanced content research capabilities\nOpenAI: Content curation and topic selection\nLinkedIn Community Management API: Professional posting access\nGoogle Sheets API: Analytics and tracking\n\nLinkedIn OAuth2 community management credentials\nGoogle Sheets OAuth2 integration\nHTTP header authentication for AI services\n\nIndustry Targeting: Modify prompts for specific business verticals\nPosting Schedule: Adjust timing based on audience activity\nContent Tone: Customize voice and style through prompt engineering\nPlatform Selection: Enable/disable specific social media channels\nTrend Filtering: Adjust search volume thresholds and geographic targeting\nContent Length: Modify character limits for different platforms\n\nAnti-AI Detection: Sophisticated prompts create human-like content\nRate Limit Management: Built-in delays prevent API throttling\nError Recovery: Robust 
error handling with retry mechanisms\nContent Deduplication: Prevents posting duplicate content\nEngagement Optimization: LinkedIn-specific formatting for maximum reach\n\nTime Savings: Eliminates 10+ hours of weekly content creation\nConsistency: Maintains regular posting schedule without manual intervention\nRelevance: Content always based on current trending topics\nEngagement: Optimized formatting increases social media interaction\nScalability: Single workflow manages multiple platform posting\n\nImport JSON workflow file into n8n instance\nConfigure all required API credentials\nSet up a Google Sheets tracking document\nTest workflow execution with manual trigger\nEnable schedule trigger for automated operation\n\nMonitor API usage to stay within rate limits\nRegularly update prompts based on content performance\nReview and adjust trending topic filters for your niche\nMaintain backup of workflow configuration\nTest content output before enabling automation\n\nComprehensive setup documentation included\nConfiguration troubleshooting guide provided\nRegular workflow updates for API changes\nCommunity support through n8n forums\n\nsocial-media content-automation linkedin ai-generation google-trends perplexity openai marketing trend-analysis content-creation\n\nn8n Version: 1.0+\nNode Requirements: Standard n8n installation\nExternal Dependencies: API access to listed services\nHosting: Compatible with cloud and self-hosted n8n instances",
"isPaid": false
},
{
"templateId": "664",
"templateName": "template_664",
"templateDescription": "Companion workflow for Salesforce node docs workflow-screenshot",
"templateUrl": "https://n8n.io/workflows/664",
"jsonFileName": "template_664.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_664.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/fe9d2485cc1b8b7172b58f1a0b72f1f8/raw/08d16f55c62696932b1a4100c52f7aff77595a65/template_664.json",
"screenshotURL": "https://i.ibb.co/KxwtFN7C/ba165e47a624.png",
"workflowUpdated": true,
"gistId": "fe9d2485cc1b8b7172b58f1a0b72f1f8",
"templateDescriptionFull": "Companion workflow for Salesforce node docs",
"isPaid": false
},
{
"templateId": "2277",
"templateName": "Training Feedback Automation",
"templateDescription": "Who is this template for? This workflow template is designed for teams involved in training management and feedback analysis. It is particularly useful for:...",
"templateUrl": "https://n8n.io/workflows/2277",
"jsonFileName": "Training_Feedback_Automation.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Training_Feedback_Automation.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/2d80f529e7987cdfba4ecf3e27647ebf/raw/d5a5d266ad90ef3ce1a12588fb59bbd822c20ff8/Training_Feedback_Automation.json",
"screenshotURL": "https://i.ibb.co/bgr0Q0wc/fbd69415626e.png",
"workflowUpdated": true,
"gistId": "2d80f529e7987cdfba4ecf3e27647ebf",
    "templateDescriptionFull": "This workflow template is designed for teams involved in training management and feedback analysis. It is particularly useful for:\n\nHR Departments: Automating the collection and response to training feedback.\nTraining Managers: Streamlining the process of handling feedback and ensuring timely follow-up.\nCorporate Trainers: Receiving direct feedback and taking actions to improve training sessions.\n\nThis workflow offers a comprehensive solution for automating feedback management, ensuring timely responses, and improving the quality of training programs.\n\nThis workflow operates with an Airtable trigger but can be easily adapted to work with other triggers like webhooks from external applications.\n\nOnce feedback data is captured, the workflow evaluates the feedback and directs it to the appropriate channel for action. Tasks are created in Usertask based on the feedback rating, and notifications are sent to relevant parties.\n\nHere’s a brief overview of this n8n workflow template:\n\nAirtable Trigger: Captures new or updated feedback entries from Airtable.\nSwitch Node: Evaluates the feedback rating and directs the workflow based on the rating.\nWebhook: Retrieves the result of a Usertask task.\nTask Creation:\n\nCreates tasks in Usertask for poor feedback.\nCreates follow-up tasks for fair to good feedback.\nDocuments positive feedback and posts recognition on LinkedIn for very good to excellent ratings.\nNotifications:\n\nSends email notifications to responsible parties for urgent actions.\nSends congratulatory emails and posts on LinkedIn for positive feedback.\n\nFlexible Integration: This 
workflow can be triggered by various methods like Airtable updates or webhooks from other applications.\nAutomated Task Management: It creates tasks in Usertask based on feedback ratings to ensure timely follow-up.\nMultichannel Notifications: Sends notifications via email and LinkedIn to keep stakeholders informed and recognize successes.\nComprehensive Feedback Handling: Automates the evaluation and response to training feedback, improving efficiency and response time.\n\nSet Up Airtable: Create a table in Airtable to capture training feedback.\nConfigure n8n: Set up the Airtable trigger in n8n to capture new or updated feedback entries.\nSet Up Usertask: Configure the Usertask nodes in n8n to create and manage tasks based on feedback ratings.\nConfigure Email and LinkedIn Nodes: Set up the email and LinkedIn nodes to send notifications and post updates.\nTest the Workflow: Run tests to ensure the workflow captures feedback, creates tasks, and sends notifications correctly.\n\nVideo: https://youtu.be/U14MhTcpqeY\n\nRemember, this template was created in n8n v1.38.2.",
"isPaid": false
},
{
"templateId": "3304",
"templateName": "template_3304",
"templateDescription": "This n8n workflow template automates the process of collecting and delivering the \"Top Deals of the Day\" from MediaMarkt, tailored to user preferences. By...",
"templateUrl": "https://n8n.io/workflows/3304",
"jsonFileName": "template_3304.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_3304.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/8aeaeab19d3d9b0bb98f9f904f4dc4d7/raw/5bdacac18d0e20f949b83a61b1457b509dd1007a/template_3304.json",
"screenshotURL": "https://i.ibb.co/5XYPHWpc/da164a3e752d.png",
"workflowUpdated": true,
"gistId": "8aeaeab19d3d9b0bb98f9f904f4dc4d7",
    "templateDescriptionFull": "This n8n workflow template automates the process of collecting and delivering the \"Top Deals of the Day\" from MediaMarkt, tailored to user preferences. By combining user-submitted forms, Bright Data web scraping, GPT-4o-mini deal generation, and email delivery, this workflow sends personalized product recommendations straight to a user’s inbox.\n\nCollects user preferences via a form (categories + email)\nScrapes MediaMarkt’s deals page using Bright Data\nUses GPT-4o-mini (OpenAI) to recommend top deals\nGenerates a structured HTML email using a template\nSends the personalized deals directly via email\n\nWe created and used the following community nodes:\n\nBright Data – To scrape MediaMarkt deals using proxy-based scraping\nDocument Generator – To generate a templated HTML document from deal data\n\nThese nodes are not available in n8n Cloud and require self-hosted n8n.\n\nInstall Community Nodes\nMake sure you're on a self-hosted n8n instance. Install:\n\nn8n-nodes-brightdata\nn8n-nodes-document-generator\nConfigure Credentials\n\nBright Data API Key (Proxy + Scraping setup)\nOpenAI API Key (GPT-4o-mini access)\nSMTP Credentials for sending emails\nCustomize the Form\nAdapt the form node to collect desired categories and email addresses. Typical categories include appliances, phones, laptops, etc.\nDesign Your HTML Template\nIn the Document Generator node, you can tweak the HTML/CSS to change how deals appear in the final email.\nTest the Workflow\nSubmit the form with test data and check that the entire flow—from scraping to email—executes as expected.\n\nUser Interaction via Form\nUsers select product categories and enter their email. 
This triggers the workflow.\nData Extraction via Bright Data\nBright Data scrapes the MediaMarkt offers page and returns HTML content.\nHTML Parsing\nKey elements like product names, prices, and links are extracted for processing.\nGPT-4o-mini Recommendation Generation\nThe extracted data is sent to OpenAI (GPT-4o-mini), which filters, ranks, and enhances deals based on the user’s preferences.\nData Structuring & Split\nThe result is split into individual deal items to be formatted.\nHTML Document Creation\nDocument Generator populates a clean HTML template with the top recommended deals.\nEmail Delivery\nThe final document is emailed via SMTP to the user with a friendly message.\n\nUsers receive a custom HTML email featuring a curated list of top MediaMarkt deals based on their selected categories.\n\nBright Data API – Web scraping with proxy support\nOpenAI API – Generating personalized recommendations\nSMTP – Sending personalized deal emails\n\nChange the Data Source: You can adapt this to scrape other e-commerce sites.\nUpdate the Email Template: Make it match your branding or include images.\nExtend the Form: Add preferences like price range or specific brands.\nAdd Scheduling: Use Cron to run the workflow daily or weekly.\n\nTemplate and node created by Miquel Colomer and n8nhackers.com.\n\nNeed help customizing or deploying? Contact us for consulting and support.",
"isPaid": false
},
{
"templateId": "2367",
"templateName": "template_2367",
    "templateDescription": "Replicate Line Items on New Deal in HubSpot Workflow. Use Case: This workflow solves the problem of manually copying line items from one deal to another in...",
"templateUrl": "https://n8n.io/workflows/2367",
"jsonFileName": "template_2367.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_2367.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/1b08cbaf300c98b8626c8e9d5541bffa/raw/950e3b44cca7fd91949329e6f64aab75d3ae025d/template_2367.json",
"screenshotURL": "https://i.ibb.co/yFgtj1Ms/a4715fd454ab.png",
"workflowUpdated": true,
"gistId": "1b08cbaf300c98b8626c8e9d5541bffa",
    "templateDescriptionFull": "This workflow solves the problem of manually copying line items from one deal to another in HubSpot, reducing manual work and minimizing errors.\n\nTriggers upon receiving a webhook with deal IDs.\nRetrieves the IDs of the won and created deals.\nFetches line items associated with the won deal.\nExtracts product SKUs from the retrieved line items.\nFetches product details based on SKUs.\nCreates new line items for the created deal and associates them.\nSends a Slack notification with success details.\n\nCreate a HubSpot Deal Workflow\n1.1 Set up your trigger (ex: when deal stage = Won)\n1.2 Add step: Create Record (deal)\n1.3 Add step: Send webhook. The webhook should be a GET request to your n8n workflow's first trigger. Set two query parameters:\n\ndeal_id_won as the Record ID of the deal triggering the HubSpot Workflow\ndeal_id_create as the Record ID of the deal created above. Click Insert Data -> The created object\nSet up your HubSpot App token in HubSpot -> Settings -> Integration -> Private Apps\nSet up your HubSpot Token integration using the predefined model.\nSet up your Slack connection\nAdd an error Workflow to monitor errors",
"isPaid": false
},
{
"templateId": "4721",
"templateName": "Lead Magnet Agent - Trigify",
    "templateDescription": "Want to check out all my flows? Follow me on: https://maxmitcham.substack.com/ https://www.linkedin.com/in/max-mitcham/ This automation flow is designed to...",
"templateUrl": "https://n8n.io/workflows/4721",
"jsonFileName": "Lead_Magnet_Agent_-_Trigify.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Lead_Magnet_Agent_-_Trigify.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/78c7037d7a632bc4744e882f405b5bfe/raw/1031ba41e46b9e2b0157f525f2d5e0cf5391d15c/Lead_Magnet_Agent_-_Trigify.json",
"screenshotURL": "https://i.ibb.co/wFPZt6jN/0108fe5fe88c.png",
"workflowUpdated": true,
"gistId": "78c7037d7a632bc4744e882f405b5bfe",
    "templateDescriptionFull": "Want to check out all my flows? Follow me on:\n\nhttps://maxmitcham.substack.com/\n\nhttps://www.linkedin.com/in/max-mitcham/\n\nThis automation flow is designed to generate comprehensive, research-backed lead magnet articles based on a user-submitted topic, conduct deep research across multiple sources, and automatically create a professional Google Doc ready for LinkedIn sharing.\n\n⚙️ How It Works (Step-by-Step):\n\n📝 Chat Input (Entry Point)\nA user submits a topic through the chat interface:\n\n🔍 Query Builder Agent\nAn AI agent refines the input by:\n\n📚 Research Leader Agent\nConducts comprehensive research that:\n\n📋 Project Planner Agent\nStructures the content by:\n\n✍️ Research Assistant Team\nMultiple AI agents write simultaneously:\n\n📝 Editor Agent\nProfessional content polishing:\n\n📄 Google Docs Creation\nAutomated document generation:\n\n🛠️ Tools Used:\n\n📦 Key Features:\n\n🚀 Ideal Use Cases:",
"isPaid": false
},
{
"templateId": "5449",
"templateName": "Reranks #1",
"templateDescription": "This workflow contains community nodes that are only compatible with the self-hosted version of n8n. Cold Calling Automation - End-to-End Automated Cold...",
"templateUrl": "https://n8n.io/workflows/5449",
"jsonFileName": "Reranks_1.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Reranks_1.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/9588290d1b6d51cb98dab67a8f43b6af/raw/4b753ea42f753861aa9d0d10cd8a339663f47bed/Reranks_1.json",
"screenshotURL": "https://i.ibb.co/jZqkcNR6/2d7d25ba616d.png",
"workflowUpdated": true,
"gistId": "9588290d1b6d51cb98dab67a8f43b6af",
"templateDescriptionFull": "This workflow contains community nodes that are only compatible with the self-hosted version of n8n.\n\n\n\nThe \"Cold Calling Automation\" workflow is designed to fully automate the end-to-end cold calling process by intelligently combining web scraping, AI-powered research, and WhatsApp messaging. Leveraging key technologies such as Apify for data scraping, RAG (Retrieval-Augmented Generation) for intelligent content creation, and WhatsApp integration for automated outreach, this workflow transforms raw prospect data into personalized, high-converting cold calling campaigns with minimal manual intervention.\n\nScale Your Outreach: Automate hundreds of personalized cold calls without manual effort or hiring additional staff.\nIntelligent Personalization: RAG technology creates highly relevant, personalized messages based on prospect research.\nMulti-Channel Approach: Seamlessly integrate WhatsApp messaging with traditional cold calling methods.\nReal-Time Optimization: Continuously improve message performance and conversion rates through AI analysis.\nCost-Effective: Reduce cold calling costs while dramatically increasing reach and response rates.\n\nSales Teams: Looking to scale their cold calling efforts with intelligent automation and personalization.\nLead Generation Agencies: Needing to deliver high-volume, high-quality cold calling services to clients.\nBusiness Development Professionals: Seeking to maximize outreach efficiency while maintaining personal touch.\nSmall Business Owners: Who want professional-grade cold calling capabilities without hiring expensive sales teams.\nMarketing Agencies: Offering comprehensive lead generation and conversion services to clients.\n\nTraditional cold calling is time-consuming, expensive, and often ineffective due to lack of personalization and poor timing. Manual prospect research, script writing, and call execution create bottlenecks that limit outreach scale. 
Generic messages result in low response rates and damaged brand reputation. This workflow solves these problems by automating the entire cold calling pipeline - from prospect identification and research to personalized message creation and delivery - while maintaining the high quality and relevance that converts prospects into qualified leads.\n\n⏱ Prospect Scraping: Uses Apify to automatically scrape and identify high-quality prospects based on your target criteria.\n🔍 Intelligent Research: Employs RAG technology to research each prospect and gather relevant business intelligence.\n✍️ Personalized Content: Automatically generates custom messages, scripts, and talking points for each prospect.\n📱 WhatsApp Integration: Delivers personalized messages through WhatsApp automation for maximum engagement.\n📊 Performance Tracking: Monitors response rates, engagement metrics, and conversion data for continuous optimization.\n🤖 AI-Powered Follow-up: Automatically handles initial responses and schedules appropriate follow-up actions.\n📈 Campaign Analytics: Provides detailed insights on campaign performance and ROI metrics.\n🔄 Continuous Learning: Improves message effectiveness and targeting based on campaign results.\n\nThis workflow also uses the community node @devlikeapro/n8n-nodes-waha.\n\nImport the provided workflow JSON into your n8n instance (Cloud or self-hosted).\nSet up credentials:\n\nApify API credentials for prospect scraping\nOpenAI API key for RAG and content generation\nWhatsApp Business API credentials or WAHA integration\nDatabase credentials for prospect and campaign tracking\nEmail credentials for notifications and reporting\nCustomize parameters:\n\nTarget prospect criteria and scraping parameters\nMessage templates and personalization rules\nCampaign timing and frequency settings\nResponse handling and follow-up logic\nPerformance tracking and reporting preferences\nTest the complete workflow with a small prospect list to verify scraping, personalization, and delivery.\n\nActive n8n instance (Cloud or Self-hosted)\nApify account with appropriate scraping credits\nOpenAI API key with sufficient usage limits\nWhatsApp Business account or WAHA setup\nDatabase system for prospect and campaign management\nBasic understanding of your target audience and value proposition\n\nIntegrate with CRM systems to sync prospects and track conversion through the sales pipeline.\nAdd voice calling capabilities using VoIP services for complete omnichannel outreach.\nImplement A/B testing for message templates and timing optimization.\nConnect with social media platforms for multi-channel prospecting and engagement.\nAdd sentiment analysis to optimize message tone and approach for different prospect types.\nIntegrate with calendar systems for automatic meeting scheduling from qualified responses.\n\nApify nodes for prospect scraping and data collection\nOpenAI Chat Model and Embeddings for RAG implementation\nWhatsApp/WAHA nodes for message delivery and response handling\nDatabase nodes for prospect storage and campaign tracking\nHTTP Request nodes for API integrations and webhooks\nCode nodes for data processing and personalization logic\nSchedule Trigger for automated campaign execution\nConditional nodes for response handling and follow-up logic\nSet nodes for parameter configuration and data transformation\nSplit In Batches for efficient bulk processing\n\n50-80% increase in cold calling efficiency and prospect reach\n25-40% higher response rates compared to generic 
cold calling\n60-75% reduction in manual research and message preparation time\nReal-time insights into campaign performance and prospect engagement\nScalable system that grows with your business needs\n\nMade by: khaisa Studio\nTag: automation, cold calling, lead generation, apify, RAG, whatsapp, AI, sales automation, outreach\nCategory: Sales Automation & Lead Generation\nNeed a custom workflow? Contact me for more tailored templates",
"isPaid": false
},
{
"templateId": "3350",
"templateName": "template_3350",
"templateDescription": "This n8n template demonstrates an approach to perform bot-to-human handoff using Human-in-the-loop functionality as a switch. In this experiment, we play...",
"templateUrl": "https://n8n.io/workflows/3350",
"jsonFileName": "template_3350.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_3350.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/85d3305abd4d99a8066d340673dcf58c/raw/f78dd4c10637f9c033a2fe24274042228ee413e2/template_3350.json",
"screenshotURL": "https://i.ibb.co/5XYPHWpc/da164a3e752d.png",
"workflowUpdated": true,
"gistId": "85d3305abd4d99a8066d340673dcf58c",
    "templateDescriptionFull": "In this experiment, we play with the idea of states we want our agent to be in, which control its interaction with the user.\n\nFirst state - the agent onboards the user by collecting their details for a sales inquiry. After that, the user is handed off / transferred to a human to continue the conversation.\nSecond state - the agent is essentially \"deactivated\", as further messages to the bot will not reach it. Instead, a canned response is given to the user. The human agent must \"reactivate\" the bot by completing the human-in-the-loop form and giving a summary of their conversation with the user.\nThird state - the agent is \"reactivated\" with the context of the human-to-user conversation and is set to provide after-sales assistance. A tool is made available to the agent to delegate back to the human agent when requested.\n\nThis template uses Telegram to handle the interaction between the user and the agent.\nEach user message is checked for a session state to ensure it is guided to the right stage of the conversation. For this, we can use Redis as a simple key-value store.\nWhen no state is set, the user is directed through an onboarding step to obtain their details. Once complete, the agent will \"transfer\" the user to a human agent - technically, all this involves is an update to the session state and a message to another chat forwarding the user's details.\nDuring this \"human\" state, the agent cannot reply to the user and must wait until the human \"transfers\" the conversation back. The human can do this by replying to the \"human-in-the-loop\" message with a summary of their conversation with the user. The session state now changes to \"bot\" and the context is implanted in the agent's memory so that the agent can respond to future questions.\nAt this stage of the conversation, the agent is expected to help the user with after-sales questions. 
The user can at any time request a transfer back to the human agent, repeating the previous steps as necessary.\n\nPlan your user journey! Here is a very basic example of a sales inquiry with at most 3 states. More planning is needed when many more states are involved.\nYou may want to better log and manage session states so no user is left in limbo. Try connecting the users and sessions to your CRM.\nNote, the Onboarding agent and After-Sales agent have separate chat memories. When adding more agents, it is recommended to keep separate chat memories to help focus between states.\n\nTelegram for chatbot & interface\nRedis for session store and chat memory\nOpenAI for AI agent\n\nNot using Telegram? This template works with WhatsApp and other services with equivalent functionality.",
"isPaid": false
},
{
"templateId": "3433",
"templateName": "Customer and Sales Support",
    "templateDescription": "Who is this template for? This workflow powers a simple yet effective customer and sales support chatbot for your webshop. It's perfect for solopreneurs who...",
"templateUrl": "https://n8n.io/workflows/3433",
"jsonFileName": "Customer_and_Sales_Support.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Customer_and_Sales_Support.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/8427bd96246fe6440bf7d5b9fc1ca544/raw/1aa98aef1c70d45fd365f428932bfc561b532d84/Customer_and_Sales_Support.json",
"screenshotURL": "https://i.ibb.co/HTkrx65p/6c8cc9cb6e53.png",
"workflowUpdated": true,
"gistId": "8427bd96246fe6440bf7d5b9fc1ca544",
    "templateDescriptionFull": "This workflow powers a simple yet effective customer and sales support chatbot for your webshop. It's perfect for solopreneurs who want to automate customer interactions without relying on expensive or complex support tools.\n\nThe chatbot listens to user requests—such as checking product availability—and automatically handles the following:\n\nFetches product information from a Google Sheet\nAnswers customer queries\nPlaces an order\nUpdates the stock after a successful purchase\n\nEverything runs through a single Google Sheet used for both stock tracking and order management.\n\nBefore you begin, connect your Google Sheets credentials by following this guide. This will be used to connect all the tools to Google Sheets.\n👉 Setup Google Sheets credentials\n\nGet Stock\n\nOpen the \"Get Stock\" tool node and select the Google Sheets credentials you created.\nChoose the correct Google Sheet document and sheet name and you are done.\nPlace Order\n\nGo to your \"Place Order\" tool node and select the Google Sheets credentials you created.\nChoose the correct Google Sheet document and sheet name.\nUpdate Stock\n\nOpen your \"Update Stock\" tool node and select the Google Sheets credentials you created.\nChoose the correct Google Sheet document and sheet name.\nIn the \"Mapping Column Mode\" section, select \"Map Each Column Manually\".\nIn \"Column to match on\", select the column with a unique identifier (e.g., Product ID) to match stock items.\nIn the values to update section, add only the column(s) that need to be updated—usually the stock count.\nAI Agent node\n\nAdjust the prompt according to your use case and customize what you need.\n\nStock sheet\n\nOrder sheet",
"isPaid": false
},
{
"templateId": "2464",
"templateName": "template_2464",
"templateDescription": "Are you a popular tech startup accelerator (named after a particular higher order function) overwhelmed with 1000s of pitch decks on a daily basis? Wish you...",
"templateUrl": "https://n8n.io/workflows/2464",
"jsonFileName": "template_2464.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_2464.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/ddec926a045b1f95cd094be8e815b7c7/raw/8e719af15b0d8f6b23f4f99ce197cd5f5bc664a9/template_2464.json",
"screenshotURL": "https://i.ibb.co/PsYK9PfZ/8e02465b16f5.png",
"workflowUpdated": true,
"gistId": "ddec926a045b1f95cd094be8e815b7c7",
"templateDescriptionFull": "Are you a popular tech startup accelerator (named after a particular higher order function) overwhelmed with 1000s of pitch decks on a daily basis? Wish you could filter through them quickly using AI but the decks are unparseable through conventional means? Then you're in luck!\n\nThis n8n template uses Multimodal LLMs to parse and extract valuable data from even the most overly designed pitch decks in quick fashion. Not only that, it'll also create the foundations of a RAG chatbot at the end so you or your colleagues can drill down into the details if needed. With this template, you'll scale your capacity to find interesting companies you'd otherwise miss!\n\nRequires n8n v1.62.1+\n\nAirtable is used as the pitch deck database and PDF decks are downloaded from it.\nAn AI Vision model is used to transcribe each page of the pitch deck into markdown.\nAn Information Extractor is used to generate a report from the transcribed markdown and update required information back into pitch deck database.\nThe transcribed markdown is also uploaded to a vector store to build an AI chatbot which can be used to ask questions on the pitch deck.\n\nCheck out the sample Airtable here: https://airtable.com/appCkqc2jc3MoVqDO/shrS21vGqlnqzzNUc\n\nThis template depends on the availability of the Airtable - make a duplicate of the airtable (link) and its columns before running the workflow.\nWhen a new pitchdeck is received, enter the company name into the Name column and upload the pdf into the File column. Leave all other columns blank.\nIf you have the Airtable trigger active, the execution should start immediately once the file is uploaded. 
Otherwise, click the manual test trigger to start the workflow.\nWhen manually triggered, all \"new\" pitch decks will be handled by the workflow as separate executions.\n\nOpenAI for LLM\nAirtable For Database and Interface\nQdrant for Vector Store\n\nExtend this starter template by adding more AI agents to validate claims made in the pitch deck eg. Linkedin Profiles, Page visits, Reviews etc.",
"isPaid": false
},
{
"templateId": "2324",
"templateName": "template_2324",
"templateDescription": "Who is this for?This workflow is for all sales reps and lead generation manager who need to prepare their prospecting activities, and find relevant...",
"templateUrl": "https://n8n.io/workflows/2324",
"jsonFileName": "template_2324.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_2324.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/0ab641823b5b8616536fa9eca731c043/raw/b1c188e74bd12ef1f44ab78bfe21dcb7190a514f/template_2324.json",
"screenshotURL": "https://i.ibb.co/7tZZgKkR/f6da2fd98d8c.png",
"workflowUpdated": true,
"gistId": "0ab641823b5b8616536fa9eca731c043",
"templateDescriptionFull": "This workflow is for all sales reps and lead generation manager who need to prepare their prospecting activities, and find relevant information to personalize their outreach.\n\nThis workflow allows you to do account research with the web using AI.\n\nIt has the potential to replace manual work done by sales rep when preparing their prospecting activities by searching complex information available online.\n\nThe advanced AI module has 2 capabilities:\n\nResearch Google using SerpAPI\nVisit and get website content using a sub-workflow\n\nFrom an unstructured input like a domain or a company name.\n\nIt will return the following properties:\n\ndomain\ncompany Linkedin Url\ncheapest plan\nhas free trial\nhas entreprise plan\nhas API\nmarket (B2B or B2C)\n\nThe strength of n8n here is that you can adapt this workflow to research whatever information you need.\n\nYou just have to precise it in the prompt and to precise the output format in the \"Strutured Output Parser\" module.\n\nDetailed instructions + video guide can be found by following this link.",
"isPaid": false
},
{
"templateId": "2325",
"templateName": "ERP AI chatbot for Odoo sales module",
"templateDescription": "Who is this for?This workflow is for everyone who wants to have easier access to their Odoo sales data without complex queries. Use CaseTo have a clear...",
"templateUrl": "https://n8n.io/workflows/2325",
"jsonFileName": "ERP_AI_chatbot_for_Odoo_sales_module.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/ERP_AI_chatbot_for_Odoo_sales_module.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/b0244fa52c8c3d15af8ed183e8cd0938/raw/bfafdf6b5fe0ea32cd7372ed3000becb664c03af/ERP_AI_chatbot_for_Odoo_sales_module.json",
"screenshotURL": "https://i.ibb.co/JR65053s/a1d934f6b4d3.png",
"workflowUpdated": true,
"gistId": "b0244fa52c8c3d15af8ed183e8cd0938",
"templateDescriptionFull": "This workflow is for everyone who wants to have easier access to their Odoo sales data without complex queries.\n\nTo have a clear overview of your sales data in Odoo you typically needs to extract data from it manually to analyse it. This workflow uses OpenAI's language models to create an intelligent chatbot that provides conversational access to your Odoo sales opportunity data.\n\nCreates a summary of all Odoo sales opportunities using OpenAI\nUses that summary as context for the OpenAI chat model\nKeeps the summary up to date using a schedule trigger\n\nConfigure the Odoo credentials\nConfigure OpenAI credentials\nToggle \"Make Chat Publicly Available\" from the Chat Trigger node.",
"isPaid": false
},
{
"templateId": "5074",
"templateName": "Telegram Sales Agent For Business",
"templateDescription": "A complete, ready-to-deploy Telegram chatbot template for food delivery businesses. This intelligent assistant handles orders, payments, customer service,...",
"templateUrl": "https://n8n.io/workflows/5074",
"jsonFileName": "Telegram_Sales_Agent_For_Business.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Telegram_Sales_Agent_For_Business.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/4226301a1a5e97342d0ec38f7072bbb0/raw/d5e67784a8636672afed60d3092b8447a00c15cb/Telegram_Sales_Agent_For_Business.json",
"screenshotURL": "https://i.ibb.co/cKfjBMdr/f51d0c3647f4.png",
"workflowUpdated": true,
"gistId": "4226301a1a5e97342d0ec38f7072bbb0",
"templateDescriptionFull": "A complete, ready-to-deploy Telegram chatbot template for food delivery businesses. This intelligent assistant handles orders, payments, customer service, and order tracking with human-in-the-loop payment verification.\n✨ Key Features\n\n📱 Telegram Integration - Seamless customer interaction via Telegram\n💳 Payment Verification - Screenshot-based payment confirmation with admin approval\n📊 Order Tracking - Automatic Google Sheets logging of all orders\n🧠 Memory Management - Contextual conversation memory for better customer experience\n🌍 Multi-Currency Support - Easily customizable for any currency (USD, EUR, GBP, etc.)\n📍 Location Flexible - Adaptable to any city/country\n🔄 Human Oversight - Manual payment approval workflow for security\n\nCore Workflow\n\nCustomer Interaction - AI assistant takes orders via Telegram\nOrder Confirmation - Summarizes order with total and payment details\nInformation Collection - Gathers customer name, phone, and delivery address\nPayment Processing - Handles payment screenshots and verification\nAdmin Approval - Human verification of payments before order confirmation\nOrder Tracking - Automatic logging to Google Sheets with delivery estimates\n\nAI Agent Node - Google Gemini-powered conversation handler\nMemory System - Maintains conversation context per customer\nGoogle Sheets Integration - Automatic order logging and tracking\nTelegram Nodes - Customer and admin communication\nPayment Verification - Screenshot detection and approval workflow\nConditional Logic - Smart routing based on message types\n\nPrerequisites\n\nn8n instance (cloud or self-hosted)\nTelegram Bot Token\nGoogle Sheets API access\nGoogle Gemini API key\n\nSearch and replace the following placeholders throughout the template:\nBusiness Information\n\n[YOUR_BUSINESS_NAME] → Your restaurant/food business name\n[ASSISTANT_NAME] → Your bot's name (e.g., \"Alex\", \"Bella\", \"Chef Bot\")\n[YOUR_CITY] → Your city\n[YOUR_COUNTRY] → Your 
country\n[YOUR_ADDRESS] → Your business address\n[YOUR_PHONE] → Your business phone number\n[YOUR_EMAIL] → Your business email\n[YOUR_HOURS] → Your operating hours (e.g., \"9AM - 11PM daily\")\n\nCurrency & Localization\n\n[YOUR_CURRENCY] → Your currency name (e.g., \"USD\", \"EUR\", \"GBP\")\n[CURRENCY_SYMBOL] → Your currency symbol (e.g., \"$\", \"€\", \"£\")\n[YOUR_TIMEZONE] → Your timezone (e.g., \"EST\", \"PST\", \"GMT\")\n[PREFIX] → Order ID prefix (e.g., \"FB\" for \"Food Business\")\n\nMenu Items (Customize Completely)\n\n[CATEGORY_1] → Food category (e.g., \"Burgers\", \"Pizza\", \"Sandwiches\")\n[ITEM_1] through [ITEM_8] → Your menu items\n[PRICE_1] through [DELIVERY_FEE] → Your prices\nAdd or remove categories and items as needed\n\nPayment & Support\n\n[YOUR_PAYMENT_DETAILS] → Your payment information\n[YOUR_PAYMENT_PROVIDER] → Your payment method (e.g., \"Venmo\", \"PayPal\", \"Bank Transfer\")\n[YOUR_SUPPORT_HANDLE] → Your Telegram support username\n\nTelegram Bot - Add your bot token to Telegram credentials\nGoogle Sheets - Connect your Google account and create/select your orders spreadsheet\nGoogle Gemini - Add your Gemini API key\nSheet ID - Replace [YOUR_GOOGLE_SHEET_ID] with your actual Google Sheet ID\n\nUpdate the menu section in the AI Agent system message with your actual:\n\nFood categories\nItem names and prices\nDelivery fees\nAny special offerings or combos\n\nImport the template into your n8n instance\nTest the conversation flow with a test Telegram account\nVerify Google Sheets logging works correctly\nTest the payment approval workflow\nActivate the workflow\n\n💰 Currency Examples\nUSD Version\n🍔 MENU & PRICES (USD)\nBurgers\n\nClassic Burger – $12.99\nCheese Burger – $14.99\nDeluxe Burger – $18.99\n\nDelivery Fee – $3.99\nEUR Version\n🍔 MENU & PRICES (EUR)\nBurgers\n\nClassic Burger – €11.50\nCheese Burger – €13.50\nDeluxe Burger – €17.50\n\nDelivery Fee – €3.50\n📊 Google Sheets Structure\nThe template automatically logs orders with 
these columns:\n\nOrder ID\nCustomer Name\nChat ID\nPhone Number\nDelivery Address\nOrder Info\nTotal Price\nPayment Status\nOrder Status\nTimestamp\n\n🔧 Customization Options\nEasy Customizations\n\nMenu Items - Add/remove/modify any food items\nPricing - Update to your local pricing structure\nCurrency - Change to any currency worldwide\nBusiness Hours - Modify operating hours\nDelivery Areas - Add location restrictions\nPayment Methods - Update payment information",
"isPaid": false
},
{
"templateId": "6137",
"templateName": "template_6137",
"templateDescription": "How it works This template is a complete, hands-on tutorial for building a RAG (Retrieval-Augmented Generation) pipeline. In simple terms, you'll teach an...",
"templateUrl": "https://n8n.io/workflows/6137",
"jsonFileName": "template_6137.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_6137.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/28b05230e3b7144e7f47de955b9071c9/raw/ca9904f0a17c5e4e87cb33844252ea125e40d99a/template_6137.json",
"screenshotURL": "https://i.ibb.co/fzjjLnLg/a6c038d941bd.png",
"workflowUpdated": true,
"gistId": "28b05230e3b7144e7f47de955b9071c9",
"templateDescriptionFull": "This template is a complete, hands-on tutorial for building a RAG (Retrieval-Augmented Generation) pipeline. In simple terms, you'll teach an AI to become an expert on a specific topic—in this case, the official n8n documentation—and then build a chatbot to ask it questions.\n\nThink of it like this: instead of a general-knowledge AI, you're building an expert librarian.\n\nThe workflow is split into two main parts:\n\nPart 1: Indexing the Knowledge (Building the Library)\nThis is a one-time process you run manually. The workflow automatically scrapes all pages of the n8n documentation, breaks them down into small, digestible chunks, and uses an AI model to create a special numerical representation (an \"embedding\") for each chunk. These embeddings are then stored in n8n's built-in Simple Vector Store. This is like a librarian reading every book and creating a hyper-detailed index card for every paragraph.\nImportant: This in-memory knowledge base is temporary. It will be erased if you restart your n8n instance, and you will need to run the indexing process again.\nPart 2: The AI Agent (The Expert Librarian)\nThis is the chat interface. When you ask a question, the AI agent doesn't guess the answer. Instead, it uses your question to find the most relevant \"index cards\" (chunks) from the knowledge base it just built. It then feeds these specific, relevant chunks to a powerful language model (Gemini) with a strict instruction: \"Answer the user's question using ONLY this information.\" This ensures the answers are accurate, factual, and grounded in your provided documents.\n\nSetup time: ~2 minutes (plus ~15-20 minutes for indexing)\n\nThis template uses n8n's built-in tools, removing the need for an external database. 
Follow these simple steps to get started.\n\nConfigure Google AI Credentials:\n\nYou will need a Google AI API key for the Gemini models.\nIn your n8n workflow, go to any of the three Gemini nodes (e.g., Gemini 2.5 Flash).\nClick the Credential dropdown and select + Create New Credential.\nEnter your Gemini API key and save.\n\nApply Credentials to All Nodes:\n\nYour new Google AI credential is now saved. Go to the other two Gemini nodes (Gemini Chunk Embedding and Gemini Query Embedding) and select your newly created credential from the dropdown list.\n\nBuild the Knowledge Base:\n\nFind the Start Indexing manual trigger node at the top-left of the workflow.\nClick its \"Execute workflow\" button to start the indexing process.\n⚠️ Be Patient: This will take 15-20 minutes as it scrapes and processes the entire n8n documentation. You only need to do this once per n8n session. If you restart n8n, you must run this step again.\n\nChat with Your Expert Agent:\n\nOnce the indexing is complete, Activate the entire workflow using the toggle at the top of the screen.\nOpen the RAG Chatbot chat trigger node (bottom-left) and copy its Public URL.\nOpen the URL in a new tab and start asking questions about n8n! For example: \"How does the IF node work?\" or \"What is a sub-workflow?\".",
"isPaid": false
},
{
"templateId": "4969",
"templateName": "template_4969",
"templateDescription": "Description This workflow creates an automated video content pipeline that generates creative TikTok-style videos using AI. It combines OpenAI's GPT-4o-mini...",
"templateUrl": "https://n8n.io/workflows/4969",
"jsonFileName": "template_4969.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_4969.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/3bdedbfeea81978999ba639250cadc94/raw/81125d4b74355647e02485e014a0b2aa739a28b1/template_4969.json",
"screenshotURL": "https://i.ibb.co/tTmrGfBv/27b692020453.png",
"workflowUpdated": true,
"gistId": "3bdedbfeea81978999ba639250cadc94",
"templateDescriptionFull": "This workflow creates an automated video content pipeline that generates creative TikTok-style videos using AI. It combines OpenAI's GPT-4o-mini for idea generation with Sisif.ai's text-to-video AI technology to produce engaging short-form content automatically.\n\nPerfect for: Content creators, social media managers, marketing teams, and anyone who wants to maintain a consistent flow of AI-generated video content without manual intervention.\n\nSisif.ai Account: Sign up at sisif.ai and get your API token from sisif.ai/api/\nOpenAI Account: Get your API key from OpenAI platform\nn8n Instance: Self-hosted or cloud instance\n\nThe workflow operates on a scheduled cycle, generating fresh video content every 6 hours:\n\n🤖 AI Idea Generation: OpenAI's GPT-4o-mini acts as a creative video strategist, generating unique, trend-aware video concepts optimized for TikTok and social media\n🎬 Video Creation: Sisif.ai transforms each creative prompt into a high-quality 5-second video in 540x960 resolution\n⏱️ Smart Monitoring: The workflow intelligently monitors video generation progress, waiting for completion before proceeding\n📊 Data Processing: Final video data is structured and prepared for further use or storage\n\nRuns every 6 hours without manual intervention\nGenerates 4 unique videos daily (28 videos per week)\nSelf-monitoring with automatic retry logic\n\nTikTok-perfect 540x960 resolution\n5-second duration for maximum engagement\nTrend-aware content generation\nAction-packed, visual storytelling\n\nSimple HTTP requests for reliable operation\nBearer token authentication for secure API access\nAutomatic status checking and waiting logic\nError handling and retry mechanisms",
"isPaid": false
},
{
"templateId": "3495",
"templateName": "Qdrant Vector Database Embedding Pipeline",
"templateDescription": "🧠 This workflow is designed for one purpose only, to bulk-upload structured JSON articles from an FTP server into a Qdrant vector database for use in...",
"templateUrl": "https://n8n.io/workflows/3495",
"jsonFileName": "Qdrant_Vector_Database_Embedding_Pipeline.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Qdrant_Vector_Database_Embedding_Pipeline.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/f92a9f06e7a57c2600447d1fdec72e88/raw/a4f5ff9e861a7234250fd2527f6956bec3efff72/Qdrant_Vector_Database_Embedding_Pipeline.json",
"screenshotURL": "https://i.ibb.co/wFPZt6jN/0108fe5fe88c.png",
"workflowUpdated": true,
"gistId": "f92a9f06e7a57c2600447d1fdec72e88",
"templateDescriptionFull": "🧠 This workflow is designed for one purpose only, to bulk-upload structured JSON articles from an FTP server into a Qdrant vector database for use in LLM-powered semantic search, RAG systems, or AI assistants.\n\nThe JSON files are pre-cleaned and contain metadata and rich text chunks, ready for vectorization. This workflow handles\n\nDownloading from FTP\nParsing & splitting\nEmbedding with OpenAI-embedding\nStoring in Qdrant for future querying\n\n✅ Automated Vector Loading\nHandles FTP → JSON → Qdrant in a hands-free pipeline.\n\n✅ Clean Embedding Input\nSupports pre-validated chunks with metadata: titles, tags, language, and article ID.\n\n✅ AI-Ready Format\nPerfect for Retrieval-Augmented Generation (RAG), semantic search, or assistant memory.\n\n✅ Flexible Architecture\nModular and swappable: FTP can be replaced with GDrive/Notion/S3, and embeddings can switch to local models like Ollama.\n\n✅ Community Friendly\nThis template helps others adopt best practices for vector DB feeding and LLM integration.",
"isPaid": false
},
{
"templateId": "1045",
"templateName": "ETL pipeline",
"templateDescription": "This workflow allows you to collect tweets, store them in MongoDB, analyse their sentiment, insert them into a Postgres database, and post positive tweets...",
"templateUrl": "https://n8n.io/workflows/1045",
"jsonFileName": "ETL_pipeline.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/ETL_pipeline.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/5a24bdb60e3b5cccc9512498a312e29e/raw/7f54731dfe1f679e99bcda3d7e5ed5e74fff04f9/ETL_pipeline.json",
"screenshotURL": "https://i.ibb.co/MyfLBj60/a54c7ad0d74a.png",
"workflowUpdated": true,
"gistId": "5a24bdb60e3b5cccc9512498a312e29e",
"templateDescriptionFull": "This workflow allows you to collect tweets, store them in MongoDB, analyse their sentiment, insert them into a Postgres database, and post positive tweets in a Slack channel.\n\n\n\nCron node: Schedule the workflow to run every day\n\nTwitter node: Collect tweets\n\nMongoDB node: Insert the collected tweets in MongoDB\n\nGoogle Cloud Natural Language node: Analyse the sentiment of the collected tweets\n\nSet node: Extract the sentiment score and magnitude\n\nPostgres node: Insert the tweets and their sentiment score and magnitude in a Posgres database\n\nIF node: Filter tweets with positive and negative sentiment scores\n\nSlack node: Post tweets with a positive sentiment score in a Slack channel\n\nNoOp node: Ignore tweets with a negative sentiment score",
"isPaid": false
},
{
"templateId": "1575",
"templateName": "Zammad Open Tickets",
"templateDescription": "Fetches Zammad tickets at daily basis at 08:30 then sends them to #customer support>ticket on zulip for daily standups.",
"templateUrl": "https://n8n.io/workflows/1575",
"jsonFileName": "Zammad_Open_Tickets.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Zammad_Open_Tickets.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/8319e3bb45199ddd191624c4e2c9c5c5/raw/531cda7e9aaaf9b05ebf65172e39734e57c6380f/Zammad_Open_Tickets.json",
"screenshotURL": "https://i.ibb.co/qMc0GLKN/d195f465789f.png",
"workflowUpdated": true,
"gistId": "8319e3bb45199ddd191624c4e2c9c5c5",
"templateDescriptionFull": "Fetches Zammad tickets at daily basis at 08:30 then sends them to #customer support>ticket on zulip for daily standups.",
"isPaid": false
},
{
"templateId": "3890",
"templateName": "Create Custom Presentations per Lead",
"templateDescription": "👥 Who Is This For? Sales and marketing teams seeking efficient, hands‑free generation of personalized slide decks for each prospect from CSV lead lists. 🛠...",
"templateUrl": "https://n8n.io/workflows/3890",
"jsonFileName": "Create_Custom_Presentations_per_Lead.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Create_Custom_Presentations_per_Lead.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/20d39d96a4c755e4388a367e85e68317/raw/45e7f2a0e3bb888a3c8b67dfd4be97b54191711c/Create_Custom_Presentations_per_Lead.json",
"screenshotURL": "https://i.ibb.co/KcgnmMF0/37bce8c02850.png",
"workflowUpdated": true,
"gistId": "20d39d96a4c755e4388a367e85e68317",
"templateDescriptionFull": "Sales and marketing teams seeking efficient, hands‑free generation of personalized slide decks for each prospect from CSV lead lists.\n\nManually editing presentation decks for large lead lists is slow and error‑prone. This workflow fully automates:\n\nImporting and parsing CSV lead data\nLogging leads and outputs in Google Sheets\nDuplicating a master Slides template per lead\nInjecting lead‑specific variables into slides\n\nn8n with Google Drive, Sheets, and Slides credentials\nA master Google Slides deck with placeholder tokens (e.g. {{Name}}, {{Company}})\nA Drive folder for incoming CSV lead files\n\nImport this workflow into your n8n instance.\nConfigure the New Leads Arrived node to watch your CSV folder.\nEnter your Google credentials in the Drive, Sheets, and Slides nodes.\nSpecify the master Slides template ID in the Copy Slides Template node.\nIn Create Custom Presentation, map slide tokens to sheet column names.\nDisable “Keep Binary Data” in Copy Slides Template to conserve memory.\nUpload a sample CSV (with headers like Name, Company, Metric) to test.\n\nAdd or remove variables by editing the CSV headers and updating the mapping in Merge Data for new Lead Document.\nInsert an AI/natural‑language node before slide creation to generate more advanced and personalized text blocks.\nUse SplitInBatches to throttle API calls and avoid rate‑limit errors.\nAdd error‑handling branches to capture and log failed operations.\n\nThe workflow uses placeholder variables for file and folder IDs, so no actual IDs are exposed in the template.\nEnsure OAuth scopes are limited to only the required Google APIs.",
"isPaid": false
},
{
"templateId": "4804",
"templateName": "AI Proposal Generator System",
"templateDescription": "AI Proposal Generator System CategoriesSales AutomationDocument GenerationAI Business Tools This workflow creates a complete AI-powered proposal generation...",
"templateUrl": "https://n8n.io/workflows/4804",
"jsonFileName": "AI_Proposal_Generator_System.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/AI_Proposal_Generator_System.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/2163d922c2209942df50d65bf67a6e7a/raw/32169cd05fcd51c40b24750abbb3d7ecb0e1762f/AI_Proposal_Generator_System.json",
"screenshotURL": "https://i.ibb.co/jv9Z09c4/b38c9d6de26b.png",
"workflowUpdated": true,
"gistId": "2163d922c2209942df50d65bf67a6e7a",
"templateDescriptionFull": "AI Proposal Generator System\n\nCategories\n\nSales Automation\nDocument Generation\nAI Business Tools\n\nThis workflow creates a complete AI-powered proposal generation system that transforms simple form inputs into professional, personalized proposals in under 30 seconds and can be deployed during live sales calls, allowing you to send polished proposals before the call even ends.\n\nBenefits\n\nInstant Proposal Generation - Convert 30-second form inputs into professional proposals automatically\nHigh-Value Business Tool - Generates $1,500-$5,000 per client implementation\nLive Sales Integration - Generate and send proposals during active sales calls\nComplete Automation Pipeline - From form submission to email delivery with zero manual work\nProfessional Presentation - Produces proposals indistinguishable from manually crafted documents\nDual Platform Support - Works with both Google Slides (free) and PandaDoc (premium) integration\n\nHow It Works\n\nSmart Form Interface:\n\nSimple N8N form captures essential deal information\nCollects prospect details, problems, solutions, scope, timeline, and budget\nDesigned for rapid completion during live sales conversations\n\nAdvanced AI Processing:\n\nUses sophisticated GPT-4 prompting with example-based training\nConverts basic form inputs into professionally written proposal sections\nApplies consistent tone, formatting, and business language automatically\n\nDynamic Document Generation:\n\nCreates duplicate proposal templates for each new prospect\nReplaces template variables with AI-generated personalized content\nMaintains professional formatting and visual consistency\n\nAutomated Email Delivery:\n\nSends personalized email with proposal link immediately after generation\nIncludes professional messaging and clear next steps\nOptionally includes invoice for immediate payment processing\n\nPremium PandaDoc Integration:\n\nAdvanced version includes built-in payment processing\nCombines 
proposal, agreement, and invoice in single document\nEnables immediate signature and payment collection\n\nBusiness Use Cases\n\nService-Based Businesses - Generate proposals for consulting, agencies, and professional services\nAutomation Agencies - Offer proposal generation as a high-value service to clients\nSales Teams - Accelerate proposal creation and improve close rates\nFreelancers - Professionalize client interactions with instant custom proposals\nConsultants - Streamline business development with automated proposal workflows\nB2B Companies - Scale personalized proposal generation across entire sales organization\n\nDifficulty Level: Intermediate\nEstimated Build Time: 2-3 hours\nMonthly Operating Cost: $20-150 (depending on Google Slides vs PandaDoc)\n\nWatch My Complete Live Build\nWant to see me build this entire $2,485 proposal system from scratch? I walk through every component live - including the AI prompting strategies, form design, Google Slides integration, and the advanced PandaDoc setup that enables payment collection.\n🎥 See My Live Build Process: \"I Built A $2,485 AI Proposal Generator In N8N (Copy This)\"\nThis comprehensive tutorial shows the real development process - including advanced AI prompting, template design, API integrations, and the exact pricing strategy that generates $1,500-$5,000 per client.\n\nGoogle Slides Template: Create a professional proposal template with these variable placeholders (wrapped in double curly braces):\n{{proposalTitle}} - Main proposal heading\n{{descriptionName}} - Project subtitle/description\n{{oneParagraphProblemSummary}} - Problem analysis section\n{{solutionHeadingOne}}, {{solutionHeadingTwo}}, {{solutionHeadingThree}} - Solution titles\n{{shortScopeTitleOne}} through {{shortScopeTitleThree}} - Scope sections\n{{milestoneOneDay}} through {{milestoneFourDay}} - Timeline milestones\n{{cost}} - Project pricing\nForm Field Requirements: The N8N form must include these exact field labels:\nFirst Name, 
Last Name, Company Name, Email, Website\nProblem (textarea) - Client's current challenges\nSolution (textarea) - Your proposed approach\nScope (textarea) - Specific deliverables\nCost - Project pricing\nHow soon? - Timeline expectations\nPandaDoc Setup (Premium): Configure PandaDoc template with token placeholders matching the AI-generated content structure. Template must include pricing tables and signature fields for complete proposal-to-payment automation.\n\nSet Up Steps\n\nForm Design & Integration:\n\nCreate N8N form with optimized fields for proposal generation\nDesign form flow for rapid completion during sales calls\nConfigure form triggers and data validation\n\nAI Content Generation Setup:\n\nConfigure OpenAI API for sophisticated proposal writing\nImplement example-based training with input/output pairs\nSet up JSON formatting for structured content generation\n\nGoogle Slides Integration (Free Version):\n\nCreate professional proposal templates with variable placeholders\nSet up Google Cloud Console API access and credentials\nConfigure template duplication and text replacement workflows\n\nEmail Automation Setup:\n\nConfigure Gmail integration for automated proposal delivery\nDesign professional email templates with proposal links\nSet up dynamic content insertion and personalization\n\nPandaDoc Integration (Premium Version):\n\nSet up PandaDoc API for advanced document generation\nConfigure payment processing and signature collection\nImplement proposal-to-payment automation workflows\n\nTesting & Quality Control:\n\nTest complete workflow with various proposal scenarios\nValidate AI output quality and professional presentation\nOptimize form fields and content generation based on results\n\nAdvanced Features\n\nPremium system includes:\n\nPayment Processing Integration: Collect payments immediately after proposal acceptance\nDigital Signature Collection: Streamline agreement execution with electronic signatures\nCustom Branding: Apply company 
branding and visual identity automatically\nMulti-Template Support: Generate different proposal types based on service offerings\nCRM Integration: Automatically sync proposal data with existing sales systems\n\nWhy This System Works\n\nThe competitive advantage lies in speed and professionalism:\n30-second generation time vs. hours of manual proposal writing\nProfessional presentation that matches or exceeds manual proposals\nLive sales integration - send proposals during active conversations\nConsistent quality - eliminates human error and formatting inconsistencies\nImmediate follow-up - maintain sales momentum with instant delivery\n\nSystem Architecture\n\nThe workflow follows a simple but powerful 6-step process:\nForm Trigger - Captures essential deal information\nAI Processing - Converts inputs to professional content\nTemplate Duplication - Creates unique document for each prospect\nContent Replacement - Populates template with AI-generated content\nEmail Delivery - Sends proposal with professional messaging\nPayment Collection (PandaDoc) - Enables immediate signature and payment\n\nCheck Out My Channel\nFor more high-value automation systems and proven business-building strategies, explore my YouTube channel where I share the exact systems used to build successful automation businesses and scale to $72K+ monthly revenue.",
"isPaid": false
},
{
"templateId": "2244",
"templateName": "template_2244",
"templateDescription": "🎉 Do you want to master AI automation, so you can save time and build cool stuff? I’ve created a welcoming Skool community for non-technical yet...",
"templateUrl": "https://n8n.io/workflows/2244",
"jsonFileName": "template_2244.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_2244.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/377be0a0ae60ef053904c30efc15e1a4/raw/fbdaf0da0f222f42c15a30122d19a6ddf66d5cb8/template_2244.json",
"screenshotURL": "https://i.ibb.co/qvqyL7x/0a3d66b42c24.png",
"workflowUpdated": true,
"gistId": "377be0a0ae60ef053904c30efc15e1a4",
"templateDescriptionFull": "🎉 Do you want to master AI automation, so you can save time and build cool stuff?\n\nI’ve created a welcoming Skool community for non-technical yet resourceful learners.\n\n👉🏻 Join the AI Atelier 👈🏻\n\nThis workflow exposes an API endpoint that lets you dynamically replace an image in Google Slides, perfect for automating deck presentations like updating backgrounds or client logos.\n\n📺 Youtube Overview 📺\n\nAdd a unique key identifier to the images you want to replace.\n\nClick on the image.\nGo to Format Options and then Alt Text.\nEnter your unique identifier, like client_logo or background.\n\nSend a POST request to the workflow endpoint with the following parameters in the body:\n\npresentation_id: The ID of your Google Slides presentation.\nYou can find it in the URL of your Google presentation: https://docs.google.com/presentation/d/<this-part>/edit)\nimage_key: The unique identifier you created.\nimage_url: The URL of the new image.\n\nThat's it! The specified image in your Google Slides presentation will be replaced with the new one from the provided URL.\n\nThis workflow is designed to be flexible, allowing you to use the same identifier across multiple slides and presentations. I hope it streamlines your slide automation process!\n\nExample Curl Request to execute:\n\nHappy automating!\nThe n8Ninja 🥷",
"isPaid": false
},
{
"templateId": "4359",
"templateName": "AI Proposal Generator System",
"templateDescription": "AI-Powered Proposal Generator - Sales Automation Workflow OverviewScreenshot 20250524 This n8n workflow automates the entire proposal generation process...",
"templateUrl": "https://n8n.io/workflows/4359",
"jsonFileName": "AI_Proposal_Generator_System.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/AI_Proposal_Generator_System.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/d12fd8f0e80da101f1ade489c699362e/raw/fa693daec5c25edbb8c494cc03d4d7b03cc8191e/AI_Proposal_Generator_System.json",
"screenshotURL": "https://i.ibb.co/JR65053s/a1d934f6b4d3.png",
"workflowUpdated": true,
"gistId": "d12fd8f0e80da101f1ade489c699362e",
"templateDescriptionFull": "This n8n workflow automates the entire proposal generation process using AI, transforming client requirements into professional, customized proposals delivered via email in seconds.\n\nPerfect for agencies, consultants, and sales teams who need to generate high-quality proposals quickly. Instead of spending hours writing proposals manually, this workflow captures client information through a web form and uses GPT-4 to generate contextually relevant, professional proposals.\n\nForm Trigger - Captures client information through a customizable web form\nOpenAI Integration - Processes form data and generates structured proposal content\nGoogle Drive - Creates a copy of your proposal template\nGoogle Slides - Populates the template with AI-generated content\nGmail - Automatically sends the completed proposal to the client\n\nAI Content Generation: Uses GPT-4 to create personalized proposal content\nProfessional Templates: Integrates with Google Slides for polished presentations\nAutomated Delivery: Sends proposals directly to clients via email\nForm Integration: Captures all necessary client data through web forms\nCustomizable Output: Generates structured proposals with multiple sections\n\nProposal title and description\nProblem summary analysis\nThree-part solution breakdown\nProject scope details\nMilestone timeline with dates\nCost integration\n\nn8n instance (cloud or self-hosted)\nOpenAI API key for content generation\nGoogle Workspace account for Slides and Gmail\nBasic n8n knowledge for setup and customization\n\nIntermediate - Requires API credentials setup and basic workflow customization\n\nTime Savings: Reduces proposal creation from hours to minutes\nConsistency: Ensures all proposals follow the same professional structure\nPersonalization: AI analyzes client needs for relevant content\nAutomation: Eliminates manual copy-paste and formatting work\nScalability: Handle multiple proposal requests simultaneously\n\nModify AI prompts 
for different industries or services\nCustomize Google Slides template design\nAdjust form fields for specific information needs\nPersonalize email templates and signatures\nConfigure milestone templates for different project types\n\nIncludes basic error handling for API failures and form validation to ensure reliable operation.\n\nAll credentials have been removed from this template. Users must configure their own:\n\nOpenAI API credentials\nGoogle OAuth2 connections for Slides, Drive, and Gmail\nForm webhook configuration\n\nThis workflow demonstrates practical AI integration in business processes and showcases n8n's capabilities for complex automation scenarios.",
"isPaid": false
},
{
"templateId": "1225",
"templateName": "template_1225",
"templateDescription": "This workflow is triggered when a new deal is created in HubSpot. Then, it processes the deal based on its value and stage. The first branching follows...",
"templateUrl": "https://n8n.io/workflows/1225",
"jsonFileName": "template_1225.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_1225.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/1514d60427c03132e58a43c464eb9f7d/raw/6aaba95e4009572807c189363b46cecc56321d59/template_1225.json",
"screenshotURL": "https://i.ibb.co/DPL9w0qs/df6b8fa803da.png",
"workflowUpdated": true,
"gistId": "1514d60427c03132e58a43c464eb9f7d",
"templateDescriptionFull": "This workflow is triggered when a new deal is created in HubSpot. Then, it processes the deal based on its value and stage.\n\nThe first branching follows three cases:\n\nIf the deal is closed and won, a message is sent in a Slack channel, so that the whole team can celebrate the success.\nIf a presentation has been scheduled for the deal, then a Google Slides presentation template is created.\nIf the deal is closed and lost, the deal’s details are added to an Airtable table. From here, you can analyze the data to get insights into what and why certain deals don’t get closed.\n\nThe second branching follows two cases:\n\nIf the deal is for a new business and has a value above 500, a high-priority ticket assigned to an experienced team member is created in HubSpot\nIf the deal is for an existing business and has a value below 500, a low-priority ticket is created.",
"isPaid": false
},
{
"templateId": "4328",
"templateName": "Scrape TikTok Profile & Transcript with Dumpling AI and Save to Google Sheets",
"templateDescription": "Who is this for? This workflow is built for marketers, researchers, and content analysts who need to monitor TikTok content, analyze user data, or track...",
"templateUrl": "https://n8n.io/workflows/4328",
"jsonFileName": "Scrape_TikTok_Profile__Transcript_with_Dumpling_AI_and_Save_to_Google_Sheets.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Scrape_TikTok_Profile__Transcript_with_Dumpling_AI_and_Save_to_Google_Sheets.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/21db64567797da45a5dcea7ab0ae9d16/raw/d99de562203202a88f0e272ccf7137379404d655/Scrape_TikTok_Profile__Transcript_with_Dumpling_AI_and_Save_to_Google_Sheets.json",
"screenshotURL": "https://i.ibb.co/ksWt60Qw/993cc0df6757.png",
"workflowUpdated": true,
"gistId": "21db64567797da45a5dcea7ab0ae9d16",
"templateDescriptionFull": "This workflow is built for marketers, researchers, and content analysts who need to monitor TikTok content, analyze user data, or track trends across influencers. It's useful for agencies that manage creators or want to keep an organized record of profile performance and video content for reporting or outreach.\n\nInstead of manually checking TikTok profiles or watching videos to understand performance or content, this workflow automates everything. It extracts both profile statistics and full video transcripts, then logs them in Google Sheets for easy access, filtering, and segmentation.\n\nThe automation watches for new TikTok video URLs added to a Google Sheet. When a new row is detected:\n\nIt extracts the username from the URL.\nSends the username to Dumpling AI to get full profile data (followers, likes, videos).\nSends the video URL to Dumpling AI to extract the full transcript.\nAppends all this information back into the same sheet.\n\nEverything happens automatically after a new URL is added to the sheet.\n\nGoogle Sheets Trigger\n\nConnect your Google account and select the spreadsheet where TikTok links will be added.\nThe workflow will trigger on each new row.\nExample sheet column: USERNAME Video\nConnect your Google account and select the spreadsheet where TikTok links will be added.\nThe workflow will trigger on each new row.\nExample sheet column: USERNAME Video\nExtract Username\n\nThis Set node uses RegEx to extract the username (handle) from the TikTok video URL.\nNo need to change anything unless TikTok URL formatting changes.\nThis Set node uses RegEx to extract the username (handle) from the TikTok video URL.\nNo need to change anything unless TikTok URL formatting changes.\nDumpling AI Profile Scraper\n\nGo to Dumpling AI\nSign in and retrieve your API key\nCreate an agent using the get-tiktok-profile endpoint\nPaste your API key into the httpHeaderAuth field in n8n\nGo to Dumpling AI\nSign in and retrieve your API 
key\nCreate an agent using the get-tiktok-profile endpoint\nPaste your API key into the httpHeaderAuth field in n8n\nDumpling AI Transcript Scraper\n\nAlso uses Dumpling AI\nMake sure the endpoint get-tiktok-transcript is enabled in your Dumpling account\nConnect using the same API key\nAlso uses Dumpling AI\nMake sure the endpoint get-tiktok-transcript is enabled in your Dumpling account\nConnect using the same API key\nSave to Google Sheets\n\nThe final node appends data back to your original Google Sheet\nRequired columns: USERNAME Video, Username, Follower count, Following Count, heart count, Video Count, Transcript\nThe final node appends data back to your original Google Sheet\nRequired columns: USERNAME Video, Username, Follower count, Following Count, heart count, Video Count, Transcript\n\nAdd a filter node to only save profiles with a minimum follower count\nAdd sentiment analysis for the transcript using OpenAI\nConnect Airtable or Notion instead of Google Sheets\nUse GPT to summarize or classify transcripts for research\n\nRequires a Dumpling AI account and API key\nMake sure Google Sheets API is connected and has the correct permissions\nTikTok usernames must start with @ for RegEx to work",
"isPaid": false
},
{
"templateId": "1035",
"templateName": "template_1035",
"templateDescription": "This workflow allows you to get all the slides from a presentation and get thumbnails of pages. workflow-screenshot Google Slides node: This Google Slides...",
"templateUrl": "https://n8n.io/workflows/1035",
"jsonFileName": "template_1035.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_1035.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/89584e442b1069b6043f66a36cbf3490/raw/83db9a37e65246146eb4261a057cca45afc3292b/template_1035.json",
"screenshotURL": "https://i.ibb.co/zV2CRKgg/285634992792.png",
"workflowUpdated": true,
"gistId": "89584e442b1069b6043f66a36cbf3490",
"templateDescriptionFull": "This workflow allows you to get all the slides from a presentation and get thumbnails of pages.\n\n\n\nGoogle Slides node: This Google Slides node will get all the slides from a presentation.\n\nGoogle Slides1 node: This node will return thumbnails of the pages that were returned by the previous node.\n\nBased on your use case, to upload the thumbnails to Dropbox, Google Drive, etc, you can use the respective nodes.",
"isPaid": false
},
{
"templateId": "4731",
"templateName": "CrunchBase Invester Data",
"templateDescription": "🚀 Automated Investor Intelligence: CrunchBase to Google Sheets Data Harvester! Workflow OverviewThis cutting-edge n8n automation is a sophisticated...",
"templateUrl": "https://n8n.io/workflows/4731",
"jsonFileName": "CrunchBase_Invester_Data.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/CrunchBase_Invester_Data.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/30c9f759fb30dbbe81ce6c4356bea3e5/raw/29b9901404f8f951584cbc1e7a6dfb055065ca66/CrunchBase_Invester_Data.json",
"screenshotURL": "https://i.ibb.co/0jCtxwdm/0415f8127db7.png",
"workflowUpdated": true,
"gistId": "30c9f759fb30dbbe81ce6c4356bea3e5",
"templateDescriptionFull": "This cutting-edge n8n automation is a sophisticated investor intelligence tool designed to transform market research into actionable insights. By intelligently connecting CrunchBase, data processing, and Google Sheets, this workflow:\n\nDiscovers Investor Insights:\n\nAutomatically retrieves latest investor data\nTracks key investment organizations\nEliminates manual market research efforts\nAutomatically retrieves latest investor data\nTracks key investment organizations\nEliminates manual market research efforts\nIntelligent Data Processing:\n\nFilters investor-specific organizations\nExtracts critical investment metrics\nEnsures comprehensive market intelligence\nFilters investor-specific organizations\nExtracts critical investment metrics\nEnsures comprehensive market intelligence\nSeamless Data Logging:\n\nAutomatically updates Google Sheets\nCreates real-time investor database\nEnables rapid market trend analysis\nAutomatically updates Google Sheets\nCreates real-time investor database\nEnables rapid market trend analysis\nScheduled Intelligence Gathering:\n\nDaily automated tracking\nConsistent investor insight updates\nZero manual intervention required\nDaily automated tracking\nConsistent investor insight updates\nZero manual intervention required\n\n🤖 Full Automation: Zero-touch investor research\n💡 Smart Filtering: Targeted investment insights\n📊 Comprehensive Tracking: Detailed investor intelligence\n🌐 Multi-Source Synchronization: Seamless data flow\n\nScheduled Trigger: Daily market scanning\nCrunchBase API Integration\nIntelligent Filtering:\n\nInvestor-specific organizations\nKey investment metrics\nMost recent data\nInvestor-specific organizations\nKey investment metrics\nMost recent data\n\nComprehensive Metadata Parsing\nKey Information Retrieval\nStructured Data Preparation\n\nGoogle Sheets Integration\nAutomatic Row Appending\nReal-Time Database Updates\n\nVenture Capitalists: Investment ecosystem mapping\nStartup 
Scouts: Investor trend analysis\nMarket Researchers: Comprehensive investment insights\nBusiness Development: Strategic partnership identification\nInvestment Analysts: Market intelligence gathering\n\nCrunchBase API\n\nAPI credentials\nConfigured access permissions\nInvestor organization tracking setup\nAPI credentials\nConfigured access permissions\nInvestor organization tracking setup\nGoogle Sheets\n\nConnected Google account\nPrepared tracking spreadsheet\nAppropriate sharing settings\nConnected Google account\nPrepared tracking spreadsheet\nAppropriate sharing settings\nn8n Installation\n\nCloud or self-hosted instance\nWorkflow configuration\nAPI credential management\nCloud or self-hosted instance\nWorkflow configuration\nAPI credential management\n\n🤖 Advanced investment trend analysis\n📊 Multi-source investor aggregation\n🔔 Customizable alert mechanisms\n🌐 Expanded investment stage tracking\n🧠 Machine learning insights generation\n\nImplement robust error handling\nUse secure API authentication\nMaintain flexible data processing\nEnsure compliance with API usage guidelines\n\nRespect business privacy\nUse data for legitimate research\nMaintain transparent information gathering\nProvide proper attribution\n\n#InvestorIntelligence #VentureCapital #MarketResearch #AIWorkflow #DataAutomation #StartupEcosystem #InvestmentTracking #BusinessIntelligence #TechInnovation #StartupFunding\n\nReady to revolutionize your investor research?\n\n📧 Email: Yaron@nofluff.online\n\n🎥 YouTube: @YaronBeen\n\n💼 LinkedIn: Yaron Been\n\nTransform your market intelligence with intelligent, automated workflows!",
"isPaid": false
},
{
"templateId": "2074",
"templateName": "template_2074",
"templateDescription": "This template is an error handler that will log n8n workflow errors to a Monday.com board for troubleshooting and tracking. PrerequisitesMonday account and...",
"templateUrl": "https://n8n.io/workflows/2074",
"jsonFileName": "template_2074.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_2074.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/55422225764adb4edcd704d72ad5bec4/raw/15dcfc4f0db2436ff840319bece8ac3f48471a4b/template_2074.json",
"screenshotURL": "https://i.ibb.co/hJMxKgbg/8e4e488e803b.png",
"workflowUpdated": true,
"gistId": "55422225764adb4edcd704d72ad5bec4",
"templateDescriptionFull": "This template is an error handler that will log n8n workflow errors to a Monday.com board for troubleshooting and tracking.\n\nPrerequisites\n\nMonday account and Monday credential\nCreate a board on Monday for error logging, with the following columns and types:\n\nTimestamp (text)\nError Message (text)\nStack Trace (long text)\nTimestamp (text)\nError Message (text)\nStack Trace (long text)\nDetermine the column IDs using Monday's instructions\n\nSetup\n\nEdit the Monday nodes to use your credential\nEdit the node labeled CREATE ERROR ITEM to point to your error log board and group name\nEdit the column IDs in the \"Column Values\" field of the UPDATE node to match the IDs of the fields on your error log board\nTo trigger error logging, select this automation as the error workflow on any automation\n\nFor more detailed logging, add Stop and Error nodes in your workflow to send specific error messages to your board.\nFor more detailed logging, add Stop and Error nodes in your workflow to send specific error messages to your board.",
"isPaid": false
},
{
"templateId": "556",
"templateName": "template_556",
"templateDescription": "Companion workflow for monday.com node docs workflow-screenshot",
"templateUrl": "https://n8n.io/workflows/556",
"jsonFileName": "template_556.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_556.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/58779db1df4e6427b8f59a0d6cf6d1df/raw/ad0867b65a0099f429d3f5975c5688d7fe422f33/template_556.json",
"screenshotURL": "https://i.ibb.co/XZ4SB8yr/63f8069192b6.png",
"workflowUpdated": true,
"gistId": "58779db1df4e6427b8f59a0d6cf6d1df",
"templateDescriptionFull": "Companion workflow for monday.com node docs",
"isPaid": false
},
{
"templateId": "2741",
"templateName": "RAG Workflow For Stock Earnings Report Analysis",
"templateDescription": "This n8n workflow creates a financial analysis tool that generates reports on a company's quarterly earnings using the capabilities of OpenAI GPT-4o-mini,...",
"templateUrl": "https://n8n.io/workflows/2741",
"jsonFileName": "RAG_Workflow_For_Stock_Earnings_Report_Analysis.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/RAG_Workflow_For_Stock_Earnings_Report_Analysis.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/c3b493b56a1be84ac8b05d09da1de556/raw/028847a1f4c83aadc64d220ab60d07f7dcefc0cc/RAG_Workflow_For_Stock_Earnings_Report_Analysis.json",
"screenshotURL": "https://i.ibb.co/qvqyL7x/0a3d66b42c24.png",
"workflowUpdated": true,
"gistId": "c3b493b56a1be84ac8b05d09da1de556",
"templateDescriptionFull": "This n8n workflow creates a financial analysis tool that generates reports on a company's quarterly earnings using the capabilities of OpenAI GPT-4o-mini, Google's Gemini AI and Pinecone's vector search. By analyzing PDFs of any company's earnings reports from their Investor Relations page, this workflow can answer complex financial questions and automatically compile findings into a structured Google Doc.\n\nData loading and indexing\n\nFetches links to PDF earnings document from a Google Sheet containing a list of file links.\nDownloads the PDFs from Google Drive.\nParses the PDFs, splits the text into chunks, and generates embeddings using the Embeddings Google AI node (text-embedding-004 model).\nStores the embeddings and corresponding text chunks in a Pinecone vector database for semantic search.\n\nReport generation with AI agent\n\nUtilizes an AI Agent node with a specifically crafted system prompt. The agent orchestrates the entire process.\nThe agent uses a Vector Store Tool to access and retrieve information from the Pinecone database.\n\nReport delivery\n\nSaves the generated report as a Google Doc in a specified Google Drive location.\n\nGoogle Cloud Project & Vertex AI API:\n\nCreate a Google Cloud project.\nEnable the Vertex AI API for your project.\nCreate a Google Cloud project.\nEnable the Vertex AI API for your project.\nGoogle AI API key:\n\nObtain a Google AI API key from Google AI Studio.\nObtain a Google AI API key from Google AI Studio.\nPinecone account and API key:\n\nCreate a free account on the Pinecone website.\nObtain your API key from your Pinecone dashboard.\nCreate an index named company-earnings in your Pinecone project.\nCreate a free account on the Pinecone website.\nObtain your API key from your Pinecone dashboard.\nCreate an index named company-earnings in your Pinecone project.\nGoogle Drive - download and save financial documents:\n\nGo to a company you want to analize and download their quarterly 
earnings PDFs\nSave the PDFs in Google Drive\nCreate a Google Sheet that stores a list of file URLs pointing to the PDFs you downloaded and saved to Google Drive\nGo to a company you want to analize and download their quarterly earnings PDFs\nSave the PDFs in Google Drive\nCreate a Google Sheet that stores a list of file URLs pointing to the PDFs you downloaded and saved to Google Drive\nConfigure credentials in your n8n environment for:\n\nGoogle Sheets OAuth2\nGoogle Drive OAuth2\nGoogle Docs OAuth2\nGoogle Gemini(PaLM) Api (using your Google AI API key)\nPinecone API (using your Pinecone API key)\nGoogle Sheets OAuth2\nGoogle Drive OAuth2\nGoogle Docs OAuth2\nGoogle Gemini(PaLM) Api (using your Google AI API key)\nPinecone API (using your Pinecone API key)\nImport and configure the workflow:\n\nImport this workflow into your n8n instance.\nUpdate the List Of Files To Load (Google Sheets) node to point to your Google Sheet.\nUpdate the Download File From Google Drive to point to the column where the file URLs are\nUpdate the Save Report to Google Docs node to point to your Google Doc where you want the report saved.\nImport this workflow into your n8n instance.\nUpdate the List Of Files To Load (Google Sheets) node to point to your Google Sheet.\nUpdate the Download File From Google Drive to point to the column where the file URLs are\nUpdate the Save Report to Google Docs node to point to your Google Doc where you want the report saved.",
"isPaid": false
},
{
"templateId": "4849",
"templateName": "ocr Telegram - SAP",
"templateDescription": "++HOW IT WORKS:++This workflow automates the processing of invoices sent via Telegram. It extracts the data using LlamaIndex OCR, logs it in Google Sheets,...",
"templateUrl": "https://n8n.io/workflows/4849",
"jsonFileName": "ocr_Telegram_-_SAP.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/ocr_Telegram_-_SAP.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/2f06476d02edd721108c9c50ba40ec77/raw/d2c3ee080bba2cf6de5e98d6aa46da56ba6571e7/ocr_Telegram_-_SAP.json",
"screenshotURL": "https://i.ibb.co/Jj3m3HLn/0eb6ff7e5b56.png",
"workflowUpdated": true,
"gistId": "2f06476d02edd721108c9c50ba40ec77",
"templateDescriptionFull": "++HOW IT WORKS:++\nThis workflow automates the processing of invoices sent via Telegram. It extracts the data using LlamaIndex OCR, logs it in Google Sheets, and optionally pushes the structured data to SAP Business One\n\n🔹 1. Receive Invoice via Telegram:\n\nA user sends a PDF of an invoice through Telegram\nA Telegram Trigger node listens for incoming messages and captures the file and metadata\nThe document is downloaded and prepared for OCR\n\n🔹 2. OCR with LlamaIndex:\n\nThe file is uploaded to the LlamaIndex OCR API.\nThe workflow polls the API until the processing status returns SUCCESS\nOnce ready, the parsed content is fetched in Markdown format\n\n🔹 3. Data Extraction via LLM (editable):\n\nThe Markdown content is sent to a language model (LLM) using LangChain\nA Structured Output Parser transforms the result into a clean, structured editable JSON\n\n🔹 4. Save to Google Sheets:\nThe structured JSON is split into:\n\nHeader (main invoice metadata)\nDetail (individual line items)\n\nEach part is stored in a dedicated tab within a connected Google Sheets file\n\n🔹 5. Ask for SAP Confirmation:\nThe bot replies to the user via Telegram:\n\n\"Do you want to send the data to SAP?\"\n\nIf the user clicks \"Yes\", the next automation path is triggered.\n\n🔹 6. 
Push Data to SAP B1:\nA connection is made to SAP Business One's Service Layer API\n\nHeader and detail data are fetched from Google Sheets\n\nThe invoice structure is rebuilt as required by SAP (DocumentLines, CardCode, etc.)\n\nA POST request creates the Purchase Invoice in SAP\n\nA confirmation message with the created DocEntry is sent back to the user on Telegram\n\n++SET UP STEPS:++\nFollow these steps to properly configure the workflow before execution:\n\n1️⃣ Create Required Credentials:\nGo to Credentials > + New Credential and create the following:\n\nTelegram API (set your bot token; get it from BotFather)\nGoogle Sheets\nOpenAI\n\n2️⃣ Set Up Environment Variables (Optional but Recommended):\nLLAMAINDEX_API_KEY\nSAP_USER\nSAP_PASSWORD\nSAP_COMPANY_DB\nSAP_URL\n\n3️⃣ Prepare Google Sheets:\nEnsure your Google Spreadsheet has the following:\n➤ Sheet 1: Header\n➤ Sheet 2: Details (contains columns for invoice lines)",
"isPaid": false
},
{
"templateId": "3790",
"templateName": "template_3790",
"templateDescription": "Stock Analysis Agent (Hebrew, RTL, GPT-4o) Overview Get comprehensive stock analysis with this AI-powered workflow that provides actionable insights for...",
"templateUrl": "https://n8n.io/workflows/3790",
"jsonFileName": "template_3790.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_3790.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/6865bf5869bc7e90256b3144de5d289f/raw/268a39ecd881db1a24da9c12a2311770f47deccc/template_3790.json",
"screenshotURL": "https://i.ibb.co/8gWpbgry/5b354bcd4d71.png",
"workflowUpdated": true,
"gistId": "6865bf5869bc7e90256b3144de5d289f",
"templateDescriptionFull": "Stock Analysis Agent (Hebrew, RTL, GPT-4o)\n\nGet comprehensive stock analysis with this AI-powered workflow that provides actionable insights for your investment decisions. On a weekly basis, this workflow:\n\nAnalyzes stock data from multiple sources (Chart-img, Twelve Data API, Alphavantage)\nPerforms technical analysis using advanced indicators (RSI, MACD, Bollinger Bands, Resistance and Support Levels)\nScans financial news from Alpha Vantage to capture market sentiment\nUses OpenAI's GPT-4o to identify patterns, trends, and trading opportunities\nGenerates a fully styled, responsive HTML email (with proper RTL layout) in Hebrew\nSends detailed recommendations directly to your inbox\n\nPerfect for investors, traders, and financial analysts who want data-driven stock insights - combining technical indicators with news sentiment for more informed decisions.\n\nEstimated setup time:\n\n15 minutes\n\nRequired credentials:\n\nOpenAI API Key\nChart-img API Key (free tier)\nTwelve Data API Key (free tier)\nAlpha Vantage API Key (free tier)\nSMTP credentials (for email delivery)\n\nSteps:\n\nImport this template into your n8n instance.\nAdd your API keys under credentials.\nConfigure the SMTP Email node with: Host (e.g., smtp.gmail.com), Port (465 or 587), Username (your email), Password (app-specific password or login).\nActivate the workflow.\nFill in the Form.\nEnjoy! (Check your Spam mailbox)\n\nModify the analysis timeframe (daily, weekly, monthly)\nAdd integrations with trading platforms or portfolio management tools\nAdjust the recommendation criteria based on your risk tolerance\n\nThis is more than just stock data. It's an intelligent financial assistant that combines technical analysis with market sentiment to provide actionable recommendations - automatically.\n\nThis report is being generated automatically and does not constitute an investment recommendation. 
Please consult a licensed investment advisor before making any investment decisions.",
"isPaid": false
},
{
"templateId": "4877",
"templateName": "template_4877",
"templateDescription": "Who is this for?Content creators, social media managers, digital marketers, and businesses looking to automate video production without expensive equipment...",
"templateUrl": "https://n8n.io/workflows/4877",
"jsonFileName": "template_4877.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_4877.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/da5a4f09d310e332d7fee81b5e5acec0/raw/eae0ead52938543c69d5d1c1f1a7e25c484761cd/template_4877.json",
"screenshotURL": "https://i.ibb.co/YFc5SnK6/d2c82d4a4393.png",
"workflowUpdated": true,
"gistId": "da5a4f09d310e332d7fee81b5e5acec0",
"templateDescriptionFull": "Content creators, social media managers, digital marketers, and businesses looking to automate video production without expensive equipment or technical expertise.\n\nTraditional video creation requires cameras, editing software, voice recording equipment, and hours of post-production work. This workflow eliminates all these barriers by automatically generating professional videos with audio using just text prompts.\n\nThis automated workflow takes video ideas from Google Sheets, generates optimized prompts using AI, creates videos through Google's V3 model via Fal AI, monitors the generation progress, and saves the final video URLs back to your spreadsheet for easy access and management.\n\nSign up for Fal AI account and obtain API key\nCreate Google Sheet with video ideas and status columns\nConfigure n8n with required credentials (Google Sheets, Fal AI API)\nImport the workflow template\nSet up authentication for all connected services\nTest with sample video idea\n\nModify the AI prompts to match your brand voice, adjust video styles and camera movements, change polling intervals for video generation status, customize Google Sheet column mappings, and add additional processing steps like thumbnail generation or social media posting.",
"isPaid": false
},
{
"templateId": "3900",
"templateName": "Youtube_Automation",
"templateDescription": "👥 Who Is This For?Content creators, marketing teams, and channel managers who need to streamline video publishing with optimized metadata and scheduled...",
"templateUrl": "https://n8n.io/workflows/3900",
"jsonFileName": "Youtube_Automation.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Youtube_Automation.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/0d380fc68b4e74477ff97a8c982a172f/raw/442da72088ebf928633cfa13e4db2157bfa45cb2/Youtube_Automation.json",
"screenshotURL": "https://i.ibb.co/W4WdVYDK/552a7fb668a4.png",
"workflowUpdated": true,
"gistId": "0d380fc68b4e74477ff97a8c982a172f",
"templateDescriptionFull": "Content creators, marketing teams, and channel managers who need to streamline video publishing with optimized metadata and scheduled releases across multiple videos.\n\nManual YouTube video publishing is time-consuming and often results in inconsistent descriptions, tags, and scheduling. This workflow fully automates:\n\nExtracting video transcripts via Apify for metadata generation\nCreating SEO-optimized descriptions and tags for each video\nSetting videos to private during initial upload (critical for scheduling)\nImplementing scheduled publishing at strategic times\nMaintaining consistent branding and formatting across all content\n\nn8n with YouTube API credentials configured\nApify account with API access for transcript extraction\nYouTube channel with upload permissions\nMaster templates for description formatting\nVideos must be initially set to private for scheduling to work\n\nImport this workflow into your n8n instance.\nConfigure YouTube API credentials with proper channel access.\nSet up Apify integration with appropriate actor for transcript extraction.\nDefine scheduling parameters in the Every Day node.\nConfigure description templates with placeholders for dynamic content.\nSet default tags and customize tag generation rules.\nTest with a single video before batch processing.\n\nAdjust prompt templates for description generation to match your brand voice.\nModify tag selection algorithms based on your channel's SEO strategy.\nCreate multiple publishing schedules for different content categories.\nIntegrate with analytics tools to optimize publishing times.\nAdd notification nodes to alert when videos are successfully scheduled.\n\nVideos MUST be uploaded as private initially - the Publish At logic only works for private videos that haven't been published before.\nPublishing schedules require videos to remain private until their scheduled time.\nTranscript quality affects metadata generation results.\nConsider YouTube 
API quotas when scheduling large batches of videos.\n\nAPI credentials are stored securely within n8n.\nTranscripts are processed temporarily and not stored permanently.\nWebhook URLs should be protected to prevent unauthorized triggering.\nAccess to the workflow should be limited to authorized team members only.",
"isPaid": false
},
{
"templateId": "3442",
"templateName": "AI-Powered Short-Form Video Generator with OpenAI, Flux, Kling, and ElevenLabs and upload to all social networks",
"templateDescription": "Description This comprehensive n8n automation template orchestrates a complete end-to-end workflow for generating engaging short-form Point-of-View (POV)...",
"templateUrl": "https://n8n.io/workflows/3442",
"jsonFileName": "AI-Powered_Short-Form_Video_Generator_with_OpenAI_Flux_Kling_and_ElevenLabs_and_upload_to_all_social_networks.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/AI-Powered_Short-Form_Video_Generator_with_OpenAI_Flux_Kling_and_ElevenLabs_and_upload_to_all_social_networks.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/34d10cebfe954fcd66b762ec02c4a01e/raw/396e1b2414f68ba91b8c41c9840be67c20e7d2f4/AI-Powered_Short-Form_Video_Generator_with_OpenAI_Flux_Kling_and_ElevenLabs_and_upload_to_all_social_networks.json",
"screenshotURL": "https://i.ibb.co/Jj3m3HLn/0eb6ff7e5b56.png",
"workflowUpdated": true,
"gistId": "34d10cebfe954fcd66b762ec02c4a01e",
"templateDescriptionFull": "This comprehensive n8n automation template orchestrates a complete end-to-end workflow for generating engaging short-form Point-of-View (POV) style videos using multiple AI services and automatically publishing them across major social media platforms. It takes ideas from a Google Sheet and transforms them into finished videos with captions, voiceovers, and platform-specific descriptions, ready for distribution.\n\nContent Creators & Agencies: Mass-produce unique short-form video content for various clients or channels with minimal manual effort.\nDigital Marketers: Automate video content pipelines to boost online presence and engagement across multiple platforms simultaneously.\nSocial Media Managers: Schedule and distribute consistent video content efficiently without juggling multiple tools and manual uploads.\nBusinesses: Leverage AI to create branded video content for marketing, reducing production time and costs.\n\nCreating and distributing high-quality short-form video content consistently across multiple social networks is incredibly time-consuming and resource-intensive. 
This workflow tackles these challenges by:\n\nAutomating Idea-to-Video Pipeline: Generates video concepts, image prompts, scripts, images, video clips, and voiceovers using AI.\nStreamlining Video Assembly: Automatically combines generated assets into a final video using a template.\nGenerating Platform-Optimized Descriptions: Creates relevant descriptions for posts by transcribing the final video audio.\nAutomating Multi-Platform Publishing: Uploads the final video and description to TikTok, Instagram, YouTube, Facebook, and LinkedIn simultaneously.\nReducing Manual Workload: Drastically cuts down the time and effort required for video production and distribution.\nCentralized Tracking: Updates a Google Sheet with results, costs, and status for easy monitoring.\n\nTrigger & Input: Runs on a daily schedule (configurable) and fetches new video ideas from a designated Google Sheet.\nAI Content Generation:\n\nUses OpenAI to generate video captions and image prompts based on the idea.\nUses PiAPI (Flux) to generate images from prompts.\nUses PiAPI (Kling) to generate video clips from the images (Image-to-Video).\nUses OpenAI to generate a voiceover script based on the captions.\nUses ElevenLabs to generate voiceover audio from the script and uploads it to Google Drive.\nUses OpenAI to generate video captions and image prompts based on the idea.\nUses PiAPI (Flux) to generate images from prompts.\nUses PiAPI (Kling) to generate video clips from the images (Image-to-Video).\nUses OpenAI to generate a voiceover script based on the captions.\nUses ElevenLabs to generate voiceover audio from the script and uploads it to Google Drive.\nVideo Assembly: Combines the generated video clips, captions, and voiceover audio using a Creatomate template to render the final video.\nDescription Generation: Uploads the final video to Google Drive, extracts the audio using OpenAI (Whisper), and generates a social media description using OpenAI (GPT).\nMulti-Platform Distribution: Uses 
upload-post.com to upload the final video and generated description to TikTok, Instagram, YouTube, Facebook, and LinkedIn.\nTracking & Notification: Updates the original Google Sheet row with output details (video link, costs, tokens used) and sends a completion notification via Discord.\n\nAccounts & API Keys: Obtain accounts and generate API keys/credentials for:\n\nn8n\nGoogle Cloud Platform (for Google Sheets & Google Drive APIs + OAuth Credentials)\nOpenAI\nPiAPI\nElevenLabs\nCreatomate\nupload-post.com\nDiscord (Webhook URL)\nn8n\nGoogle Cloud Platform (for Google Sheets & Google Drive APIs + OAuth Credentials)\nOpenAI\nPiAPI\nElevenLabs\nCreatomate\nupload-post.com\nDiscord (Webhook URL)\nGoogle Sheet: Make a copy of the provided Google Sheet Template and connect it in the Load Google Sheet node.\nCreatomate Template: Set up a video template in Creatomate (use the provided JSON source code as a base) and note its Template ID.\nConfigure Nodes:\n\nEnter all API Keys/Credentials in the Set API Keys node and other relevant credential sections (Google nodes, upload-post nodes, etc.).\nConfigure Google Drive nodes (Folder IDs, Permissions).\nConfigure the upload-post.com nodes with your user identifier and necessary platform details (e.g., Facebook Page ID).\nCustomize AI prompts within the OpenAI nodes (Generate Video Captions, Generate Image Prompts, Generate Script, Generate Description...) if desired.\nSet the Discord Webhook URL in the Notify me on Discord node.\nEnter all API Keys/Credentials in the Set API Keys node and other relevant credential sections (Google nodes, upload-post nodes, etc.).\nConfigure Google Drive nodes (Folder IDs, Permissions).\nConfigure the upload-post.com nodes with your user identifier and necessary platform details (e.g., Facebook Page ID).\nCustomize AI prompts within the OpenAI nodes (Generate Video Captions, Generate Image Prompts, Generate Script, Generate Description...) 
if desired.\nSet the Discord Webhook URL in the Notify me on Discord node.\nEnable Google APIs: Ensure Google Drive API and Google Sheets API are enabled in your Google Cloud Project.\n\nAccounts: n8n, Google (Sheets, Drive, Cloud Platform), OpenAI, PiAPI, ElevenLabs, Creatomate, The social media api Upload-Post, Discord.\nAPI Keys & Credentials: API Keys for OpenAI, PiAPI, ElevenLabs, Creatomate, upload-post.com. Google Cloud OAuth 2.0 Credentials. Discord Webhook URL.\nTemplates: A configured Google Sheet based on the template, a configured Creatomate video template.\n(Potentially) Paid Plans: Some services (OpenAI, PiAPI, Creatomate, upload-post.com) may require paid plans depending on usage volume after free trials/credits are exhausted.\n\nUse this template to build a powerful, automated video content factory, scaling your production and distribution efforts across the social media landscape.",
"isPaid": false
},
{
"templateId": "4588",
"templateName": "FireCrawl Summary Bot",
"templateDescription": "Workflow OverviewThis cutting-edge n8n automation is a sophisticated market research and intelligence gathering tool designed to transform web content...",
"templateUrl": "https://n8n.io/workflows/4588",
"jsonFileName": "FireCrawl_Summary_Bot.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/FireCrawl_Summary_Bot.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/e3dfc89b84dd4fde5a19896d46a800f6/raw/5e36cfe4c0c058fdca64dc88060c0b606f826be4/FireCrawl_Summary_Bot.json",
"screenshotURL": "https://i.ibb.co/cXp97Ry5/6826311799dc.png",
"workflowUpdated": true,
"gistId": "e3dfc89b84dd4fde5a19896d46a800f6",
"templateDescriptionFull": "This cutting-edge n8n automation is a sophisticated market research and intelligence gathering tool designed to transform web content discovery into actionable insights. By intelligently combining web crawling, AI-powered filtering, and smart summarization, this workflow:\n\nDiscovers Relevant Content:\n\nAutomatically crawls target websites\nIdentifies trending topics\nExtracts comprehensive article details\nAutomatically crawls target websites\nIdentifies trending topics\nExtracts comprehensive article details\nIntelligent Content Filtering:\n\nApplies custom keyword matching\nFilters for most relevant articles\nEnsures high-quality information capture\nApplies custom keyword matching\nFilters for most relevant articles\nEnsures high-quality information capture\nAI-Powered Summarization:\n\nGenerates concise, meaningful summaries\nExtracts key insights\nProvides quick, digestible information\nGenerates concise, meaningful summaries\nExtracts key insights\nProvides quick, digestible information\nSeamless Delivery:\n\nSends summaries directly to Slack\nEnables instant team communication\nFacilitates rapid information sharing\nSends summaries directly to Slack\nEnables instant team communication\nFacilitates rapid information sharing\n\n🤖 Full Automation: Continuous market intelligence\n💡 Smart Filtering: Precision content discovery\n📊 AI-Powered Insights: Intelligent summarization\n🚀 Instant Delivery: Real-time team updates\n\nScheduled Trigger: Daily market research\nFireCrawl Integration: Web content crawling\nComprehensive Site Scanning:\n\nExtracts article metadata\nCaptures full article content\nIdentifies key information sources\nExtracts article metadata\nCaptures full article content\nIdentifies key information sources\n\nKeyword-Based Matching\nRelevance Assessment\nCustom Domain Optimization:\n\nAI and technology focus\nStartup and innovation tracking\nAI and technology focus\nStartup and innovation tracking\n\nOpenAI GPT 
Integration\nContextual Understanding\nConcise Insight Generation:\n\n3-point summary format\nCaptures essential information\n3-point summary format\nCaptures essential information\n\nSlack Integration\nInstant Information Sharing\nFormatted Insight Delivery\n\nMarket Research Teams: Trend tracking\nInnovation Departments: Technology monitoring\nStartup Ecosystems: Competitive intelligence\nProduct Management: Industry insights\nStrategic Planning: Rapid information gathering\n\nFireCrawl API\n\nWeb crawling credentials\nConfigured crawling parameters\nWeb crawling credentials\nConfigured crawling parameters\nOpenAI API\n\nGPT model access\nSummarization configuration\nAPI key management\nGPT model access\nSummarization configuration\nAPI key management\nSlack Workspace\n\nChannel for insights delivery\nAppropriate app permissions\nWebhook configuration\nChannel for insights delivery\nAppropriate app permissions\nWebhook configuration\nn8n Installation\n\nCloud or self-hosted instance\nWorkflow configuration\nAPI credential management\nCloud or self-hosted instance\nWorkflow configuration\nAPI credential management\n\n🤖 Multi-source crawling\n📊 Advanced sentiment analysis\n🔔 Customizable alert mechanisms\n🌐 Expanded topic tracking\n🧠 Machine learning refinement\n\nImplement robust error handling\nUse exponential backoff for API calls\nMaintain flexible crawling strategies\nEnsure compliance with website terms of service\n\nRespect content creator rights\nUse data for legitimate research\nMaintain transparent information gathering\nProvide proper attribution\n\nReady to revolutionize your market research?\n\n📧 Email: Yaron@nofluff.online\n\n🎥 YouTube: @YaronBeen\n\n💼 LinkedIn: Yaron Been\n\nTransform your information gathering with intelligent, automated workflows!\n\n#AIResearch #MarketIntelligence #AutomatedInsights #TechTrends #WebCrawling #AIMarketing #InnovationTracking #BusinessIntelligence #DataAutomation #TechNews",
"isPaid": false
},
{
"templateId": "2860",
"templateName": "HR-focused automation pipeline with AI",
"templateDescription": "How it Works This workflow automates the process of handling job applications by extracting relevant information from submitted CVs, analyzing the...",
"templateUrl": "https://n8n.io/workflows/2860",
"jsonFileName": "HR-focused_automation_pipeline_with_AI.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/HR-focused_automation_pipeline_with_AI.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/74f8a24dfeab93989e8da1ba1bfc5234/raw/d3b1f0fd3b42c3d00f0d596cf01878e0172e5322/HR-focused_automation_pipeline_with_AI.json",
"screenshotURL": "https://i.ibb.co/SXhKK6St/daa6ffb7deef.png",
"workflowUpdated": true,
"gistId": "74f8a24dfeab93989e8da1ba1bfc5234",
"templateDescriptionFull": "This workflow automates the process of handling job applications by extracting relevant information from submitted CVs, analyzing the candidate's qualifications against a predefined profile, and storing the results in a Google Sheet. Here’s how it operates:\n\nData Collection and Extraction:\n\nThe workflow begins with a form submission (On form submission node), which triggers the extraction of data from the uploaded CV file using the Extract from File node.\nTwo informationExtractor nodes (Qualifications and Personal Data) are used to parse specific details such as educational background, work history, skills, city, birthdate, and telephone number from the text content of the CV.\nThe workflow begins with a form submission (On form submission node), which triggers the extraction of data from the uploaded CV file using the Extract from File node.\nTwo informationExtractor nodes (Qualifications and Personal Data) are used to parse specific details such as educational background, work history, skills, city, birthdate, and telephone number from the text content of the CV.\nProcessing and Evaluation:\n\nA Merge node combines the extracted personal and qualification data into a single output.\nThis merged data is then passed through a Summarization Chain that generates a concise summary of the candidate’s profile.\nAn HR Expert chain evaluates the candidate against a desired profile (Profile Wanted), assigning a score and providing considerations for hiring.\nFinally, all collected and processed data including the evaluation results are appended to a Google Sheets document via the Google Sheets node for further review or reporting purposes [[9]].\nA Merge node combines the extracted personal and qualification data into a single output.\nThis merged data is then passed through a Summarization Chain that generates a concise summary of the candidate’s profile.\nAn HR Expert chain evaluates the candidate against a desired profile (Profile 
Wanted), assigning a score and providing considerations for hiring.\nFinally, all collected and processed data including the evaluation results are appended to a Google Sheets document via the Google Sheets node for further review or reporting purposes [[9]].\n\nTo replicate this workflow within your own n8n environment, follow these steps:\n\nConfiguration:\n\nBegin by setting up an n8n instance if you haven't already; you can sign up directly on their website or self-host the application.\nImport the provided JSON configuration into your n8n workspace. Ensure that all necessary credentials (e.g., Google Drive, Google Sheets, OpenAI API keys) are correctly configured under the Credentials section since some nodes require external service integrations like Google APIs and OpenAI for language processing tasks.\nBegin by setting up an n8n instance if you haven't already; you can sign up directly on their website or self-host the application.\nImport the provided JSON configuration into your n8n workspace. Ensure that all necessary credentials (e.g., Google Drive, Google Sheets, OpenAI API keys) are correctly configured under the Credentials section since some nodes require external service integrations like Google APIs and OpenAI for language processing tasks.\nCustomization:\n\nAdjust the parameters of each node according to your specific requirements. For example, modify the fields in the formTrigger node to match what kind of information you wish to collect from applicants.\nCustomize the prompts given to AI models in nodes like Qualifications, Summarization Chain, and HR Expert so they align with the type of analyses you want performed on the candidates' profiles.\nUpdate the destination settings in the Google Sheets node to point towards your own spreadsheet where you would like the final outputs recorded.\nAdjust the parameters of each node according to your specific requirements. 
For example, modify the fields in the formTrigger node to match what kind of information you wish to collect from applicants.\nCustomize the prompts given to AI models in nodes like Qualifications, Summarization Chain, and HR Expert so they align with the type of analyses you want performed on the candidates' profiles.\nUpdate the destination settings in the Google Sheets node to point towards your own spreadsheet where you would like the final outputs recorded.\n\nContact me for consulting and support or add me on Linkedin.",
"isPaid": false
},
{
"templateId": "5139",
"templateName": "insta Reel publish",
"templateDescription": "This workflow automates the process of creating and posting Instagram Reels, combining Google Drive, AI, Airtable, and the Facebook Graph API. It supports...",
"templateUrl": "https://n8n.io/workflows/5139",
"jsonFileName": "insta_Reel_publish.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/insta_Reel_publish.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/36e49f01d955bfb1efd1079518ba845c/raw/7edf57b2077d97c23e0493ebb707b9c5df923ab9/insta_Reel_publish.json",
"screenshotURL": "https://i.ibb.co/Kjnvwb6w/b1eb8bbee442.png",
"workflowUpdated": true,
"gistId": "36e49f01d955bfb1efd1079518ba845c",
"templateDescriptionFull": "This workflow automates the process of creating and posting Instagram Reels, combining Google Drive, AI, Airtable, and the Facebook Graph API. It supports two content creation paths:\n\nScheduled Random Video Selection & Posting\n\nSelects a random video from a Google Drive folder named \"Random video mover\" based on a schedule.\n\nMoves the video to a processing folder for posting.\n\nManual Upload Trigger & Posting\n\nWatches a specific Google Drive folder (\"n8n reels automation on instagram\").\n\nTriggers the workflow when a new video is uploaded.\n\nCore Process (applies to both paths)\n\nDownload Video from Google Drive.\n\nAI Caption Generation with Google Gemini, using the file name as context. The AI creates concise captions with hashtags and a call-to-action.\n\nAirtable Logging to store video name, caption, and URL.\n\nInstagram Reels Posting via the Facebook Graph API.\n\nRecent Change\nIn early 2025, Meta tightened its requirements for video_url and image_url parameters. URLs must now be direct, public links to the raw media file with no redirects or authentication. Google Drive links no longer work.\n\nOur Fix\n\nStore the binary file locally on the n8n server at /tmp/video.mp4.\n\nServe the file through a public n8n webhook with the correct Content-Type.\n\nUse the webhook URL in the Facebook Graph API request.\n\nUpload succeeds without the “Media download has failed” error.\n\nCleanup\n\nDeletes the temporary file after posting.\n\nBenefits\n\nSaves time with full automation.\n\nImproves engagement through AI-generated captions.\n\nKeeps content organized in Airtable.\n\nWorks with Meta’s updated API requirements by hosting files directly from the n8n server.",
"isPaid": false
},
{
"templateId": "4971",
"templateName": "Free_Auto_Post_Latest_Breaking_News_Content_Using_Groq_+_Google_Search_to_X__Twitter_",
"templateDescription": "This n8n workflow automates the process of finding, summarizing, and posting breaking news headlines on X (formerly Twitter). It combines Google Custom...",
"templateUrl": "https://n8n.io/workflows/4971",
"jsonFileName": "Free_Auto_Post_Latest_Breaking_News_Content_Using_Groq__Google_Search_to_X__Twitter_.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Free_Auto_Post_Latest_Breaking_News_Content_Using_Groq__Google_Search_to_X__Twitter_.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/cd2d9d4283eaa1ce7a53fa417e0b5d19/raw/307ca497c83348b8b40c59cc7205069389ce29f3/Free_Auto_Post_Latest_Breaking_News_Content_Using_Groq__Google_Search_to_X__Twitter_.json",
"screenshotURL": "https://i.ibb.co/4ZCZtJkx/fec00525cbc3.png",
"workflowUpdated": true,
"gistId": "cd2d9d4283eaa1ce7a53fa417e0b5d19",
"templateDescriptionFull": "This n8n workflow automates the process of finding, summarizing, and posting breaking news headlines on X (formerly Twitter). It combines Google Custom Search for finding the latest news articles with Groq's LLaMA 3 model to generate short, engaging headlines — complete with hashtags — and posts them on your X account.\n\n🔧 Features\n\nCustom topic support (e.g., \"AI\", \"health\", \"technology\")\n\nAutomated scheduling every few hours\n\nGoogle Custom Search to find the most recent news articles\n\nGroq LLaMA3-based headline generation with hashtags\n\nAuto-post to X (Twitter)\n\nBuilt-in credential separation for API keys and access tokens\n\n📦 Included Nodes\n\nSchedule Trigger\n\nSet (Set Topic, Google API Key, Custom Search CX, etc.)\n\nHTTP Request (Google Search API)\n\nCode Node (Format prompt and extract article data)\n\nHTTP Request (Groq API for headline generation)\n\nTwitter Node (Post to X)\n\n⚙️ How It Works (Step-by-Step)\n\nTrigger\n\nThe workflow starts on a scheduled interval (default: every 5 hours, at a random minute within the hour).\n\nSet Topic\n\nYou can define your own topic keyword (e.g., AI, mental health, climate change) by editing the Set Topic node.\n\nBuild Search Query\n\nConstructs a Google search query like:\nlatest {topic} news.\n\nGoogle API Config\n\nInjects your own Google API Key and Custom Search CX (replace the placeholders in the Google Config node).\n\nSearch for News\n\nPerforms a real-time search using Google Custom Search API and fetches the latest article result.\n\nGenerate Prompt for AI\n\nA JavaScript Function node extracts the top article’s title and link, formats it into a clean prompt including instructions to append hashtags.\n\nGroq AI Request\n\nSends the prompt to Groq’s LLaMA 3 model to generate a concise, tweet-length headline with 1–2 relevant hashtags.\n\nPost to Twitter (X)\n\nThe generated headline is posted to your connected X account via the Twitter OAuth2 API.\n\n✅ 
Requirements\n\nGoogle API Key\nGoogle Custom Search Engine (CX)\nGroq API Key\nTwitter Developer App with OAuth2 credentials\n\n💡 Customization Tips\n\nChange the topic in the Set Topic node to anything you like.\nAdjust the posting frequency in the Schedule Trigger node.\nModify prompt behavior in the Function node to fit a specific tone or brand voice.\nAdd logging, filtering, or multiple post variations as needed.",
"isPaid": false
},
{
"templateId": "3477",
"templateName": "LinkedIn Profile Discovery",
"templateDescription": "About The LinkedIn Profile Discovery Automation Are you tired of manually searching for LinkedIn profiles or paying expensive data providers for often...",
"templateUrl": "https://n8n.io/workflows/3477",
"jsonFileName": "LinkedIn_Profile_Discovery.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/LinkedIn_Profile_Discovery.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/ef0373c665a99df4efaadfa5668d0de3/raw/39cfe091642bf5de8f1810ba3cac58b07f4e4108/LinkedIn_Profile_Discovery.json",
"screenshotURL": "https://i.ibb.co/gbbn9rY9/0a13c18c528c.png",
"workflowUpdated": true,
"gistId": "ef0373c665a99df4efaadfa5668d0de3",
"templateDescriptionFull": "Are you tired of manually searching for LinkedIn profiles or paying expensive data providers for often outdated information? If you spend countless hours trying to find accurate LinkedIn URLs for your prospects or candidates, this automation will change your workflow forever. Just give this workflow the information you have about a contact, and it will automatically augment it with a LinkedIn profile.\n\nIn this guide, you'll learn how to automate LinkedIn profile link discovery using Airtop's built-in node in n8n. Using this automation, you'll have a fully automated workflow that saves you hours of manual searching while providing accurate, validated LinkedIn URLs.\n\nA free Airtop API key\nA Google Workspace account. If you have a Gmail account, you’re all set\nEstimated setup time: 10 minutes\n\nThis automation leverages the power of intelligent search algorithms combined with LinkedIn validation to ensure accuracy. Here's how it works:\n\nTakes your input data (name, company, etc.) 
and constructs intelligent search queries\nUtilizes Google search to identify potential LinkedIn profile URLs\nValidates the discovered URLs directly against LinkedIn to ensure accuracy\nReturns confirmed, accurate LinkedIn profile URLs\n\n\n\nGetting started with this automation is straightforward:\n\nCreate a new Google Sheet with columns for input data (name, company, domain, etc.)\nAdd columns for the output LinkedIn URL and validation status (see this example)\n\nConnect your Google Workspace account to n8n if you haven't already\nAdd your Airtop API credentials\n(Optionally) Configure your Airtop Profile and sign-in to LinkedIn in order to validate profile URL's\n\nAdd a few test entries to your Google Sheet\nRun the workflow\nCheck the results in your output columns\n\n\n\nWhile the default setup uses Google Sheets, this automation is highly flexible:\n\nWebhook Integration: Perfect for integrating with tools like Clay, Instantly, or your custom applications\nAlternatives: Replace Google Sheets with Airtable, Notion, or any other tools you already use for more robust database capabilities\nCustom Output Formatting: Modify the output structure to match your existing systems\nBatch Processing: Configure for bulk processing of multiple profiles\n\nThis automation has the potential to transform how we organizations handle profile enrichment.\n\nWith this automation, a recruiting firm could save hundreds of dollars a month in data enrichment fees, achieve better accuracy, and eliminate subscription costs. 
They would also be able to process thousands of profiles weekly with near-perfect accuracy.\n\nA B2B sales team could integrate this automation with their CRM, automatically enriching new leads with validated LinkedIn profiles and saving their SDRs hours per week on manual research.\n\nTo maximize the accuracy of your results:\n\nAlways include company information (domain or company name) with your search queries\nUse full names rather than nicknames or initials when possible\nConsider including location data for more accurate results with common names\nImplement rate limiting to respect LinkedIn's usage guidelines\nKeep your input data clean and standardized for best results\nUse the integrated proxy to navigate more effectively through Google and LinkedIn\n\nNow that you've automated LinkedIn profile discovery, consider exploring related automations:\n\nAutomated lead scoring based on LinkedIn profile data\nEmail finder automation using validated LinkedIn profiles\nIntegration with your CRM for automated contact enrichment",
"isPaid": false
},
{
"templateId": "4783",
"templateName": "Google Calendar Coming Week",
"templateDescription": "Workflow: Automated Weekly Google Calendar Summary via Email with AI ✨🗓️📧 Get a personalized, AI-powered summary of your upcoming week's Google Calendar...",
"templateUrl": "https://n8n.io/workflows/4783",
"jsonFileName": "Google_Calendar_Coming_Week.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Google_Calendar_Coming_Week.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/4748deceabc615d6f7d0e2e777e54c8e/raw/c92dcf7bb3ffd2ed6db724eb2684ec5cbf827b34/Google_Calendar_Coming_Week.json",
"screenshotURL": "https://i.ibb.co/DHsyK2wx/0f5cbb16e985.png",
"workflowUpdated": true,
"gistId": "4748deceabc615d6f7d0e2e777e54c8e",
"templateDescriptionFull": "Get a personalized, AI-powered summary of your upcoming week's Google Calendar events delivered straight to your inbox! This workflow automates the entire process, from fetching events to generating an intelligent summary and emailing it to you.\n\nThis n8n workflow connects to your Google Calendar, retrieves events for the upcoming week (Monday to Sunday, based on the day the workflow runs), uses Google Gemini AI to create a well-structured and insightful summary, and then emails this summary to you. It's designed to help you start your week organized and aware of your commitments.\n\nKey Features:\n\nAutomated Weekly Summary: Runs on a schedule (default: weekly) to keep you updated.\nAI-Powered Insights: Leverages Google Gemini to not just list events, but to identify important ones and offer a brief weekly outlook.\nPersonalized Content: Uses your specified timezone, locale, name, and city for accurate and relevant information.\nClear Formatting: Events are grouped by day and displayed chronologically with start and end times. Important events are highlighted.\nEmail Delivery: Receive your schedule directly in your inbox in a clean HTML format.\nCustomizable: Easily adapt to your specific calendar, AI preferences, and email settings.\n\nThe workflow consists of the following nodes, working in sequence:\n\nweekly_schedule (Schedule Trigger):\n\nWhat it does: Initiates the workflow.\nDefault: Triggers once a week at 12:00 PM. You can adjust this to your preference (e.g., Sunday evening or Monday morning).\nWhat it does: Initiates the workflow.\nDefault: Triggers once a week at 12:00 PM. You can adjust this to your preference (e.g., Sunday evening or Monday morning).\nlocale (Set Node):\n\nWhat it does: This is a crucial node for you to configure! It sets user-specific parameters like your preferred language/region (users-locale), timezone (users-timezone), your name (users-name), and your home city (users-home-city). 
These are used throughout the workflow for correct date/time formatting and personalizing the AI prompt.\nWhat it does: This is a crucial node for you to configure! It sets user-specific parameters like your preferred language/region (users-locale), timezone (users-timezone), your name (users-name), and your home city (users-home-city). These are used throughout the workflow for correct date/time formatting and personalizing the AI prompt.\ndate-time (Set Node):\n\nWhat it does: Dynamically generates various date and time strings based on the current execution time and the locale settings. This is used to define the precise 7-day window (from the current day to 7 days ahead, ending at midnight) for fetching calendar events.\nWhat it does: Dynamically generates various date and time strings based on the current execution time and the locale settings. This is used to define the precise 7-day window (from the current day to 7 days ahead, ending at midnight) for fetching calendar events.\nget_next_weeks_events (Google Calendar Node):\n\nWhat it does: Connects to your specified Google Calendar and fetches all events within the 7-day window calculated by the date-time node.\nRequires: Google Calendar API credentials and the ID of the calendar you want to use.\nWhat it does: Connects to your specified Google Calendar and fetches all events within the 7-day window calculated by the date-time node.\nRequires: Google Calendar API credentials and the ID of the calendar you want to use.\nsimplify_evens_json (Code Node):\n\nWhat it does: Runs a small JavaScript snippet to clean up the raw event data from Google Calendar. It removes several fields that aren't needed for the summary (like htmlLink, etag, iCalUID), making the data more concise for the AI.\nWhat it does: Runs a small JavaScript snippet to clean up the raw event data from Google Calendar. 
It removes several fields that aren't needed for the summary (like htmlLink, etag, iCalUID), making the data more concise for the AI.\naggregate_events (Aggregate Node):\n\nWhat it does: Takes all the individual (and now simplified) event items and groups them into a single JSON array called eventdata. This is the format the AI agent expects for processing.\nWhat it does: Takes all the individual (and now simplified) event items and groups them into a single JSON array called eventdata. This is the format the AI agent expects for processing.\nGoogle Gemini (LM Chat Google Gemini Node):\n\nWhat it does: This node is the connection point to the Google Gemini language model.\nRequires: Google Gemini (or PaLM) API credentials.\nWhat it does: This node is the connection point to the Google Gemini language model.\nRequires: Google Gemini (or PaLM) API credentials.\nevent_summary_agent (Agent Node):\n\nWhat it does: This is where the magic happens! It uses the Google Gemini model and a detailed system prompt to generate the weekly schedule summary.\nThe Prompt Instructs the AI to:\n\nStart with a friendly greeting.\nGroup events by day (Monday to Sunday) for the upcoming week, using the user's timezone and locale.\nFormat event times clearly (e.g., 09:30 AM - 10:30 AM: Event Summary).\nIdentify and prefix \"IMPORTANT:\" to events with keywords like \"urgent,\" \"deadline,\" \"meeting,\" etc., in their summary or description.\nConclude with a 1-2 sentence helpful insight about the week's schedule.\nProcess the input eventdata (the JSON array of calendar events).\nWhat it does: This is where the magic happens! 
It uses the Google Gemini model and a detailed system prompt to generate the weekly schedule summary.\nThe Prompt Instructs the AI to:\n\nStart with a friendly greeting.\nGroup events by day (Monday to Sunday) for the upcoming week, using the user's timezone and locale.\nFormat event times clearly (e.g., 09:30 AM - 10:30 AM: Event Summary).\nIdentify and prefix \"IMPORTANT:\" to events with keywords like \"urgent,\" \"deadline,\" \"meeting,\" etc., in their summary or description.\nConclude with a 1-2 sentence helpful insight about the week's schedule.\nProcess the input eventdata (the JSON array of calendar events).\nStart with a friendly greeting.\nGroup events by day (Monday to Sunday) for the upcoming week, using the user's timezone and locale.\nFormat event times clearly (e.g., 09:30 AM - 10:30 AM: Event Summary).\nIdentify and prefix \"IMPORTANT:\" to events with keywords like \"urgent,\" \"deadline,\" \"meeting,\" etc., in their summary or description.\nConclude with a 1-2 sentence helpful insight about the week's schedule.\nProcess the input eventdata (the JSON array of calendar events).\nMarkdown (Markdown to HTML Node):\n\nWhat it does: Converts the text output from the event_summary_agent (which is generated in Markdown format for easy structure) into HTML. This ensures the email body is well-formatted with proper line breaks, lists, and emphasis.\nWhat it does: Converts the text output from the event_summary_agent (which is generated in Markdown format for easy structure) into HTML. 
This ensures the email body is well-formatted with proper line breaks, lists, and emphasis.\nsend_email (Email Send Node):\n\nWhat it does: Sends the final HTML summary to your specified email address.\nRequires: SMTP (email sending) credentials and your desired \"From\" and \"To\" email addresses.\nWhat it does: Sends the final HTML summary to your specified email address.\nRequires: SMTP (email sending) credentials and your desired \"From\" and \"To\" email addresses.\n\nFollow these steps to get the workflow up and running:\n\nImport the Workflow:\n\nDownload the workflow JSON file.\nIn your n8n instance, go to \"Workflows\" and click the \"Import from File\" button. Select the downloaded JSON file.\nDownload the workflow JSON file.\nIn your n8n instance, go to \"Workflows\" and click the \"Import from File\" button. Select the downloaded JSON file.\nConfigure Credentials:\nYou'll need to set up credentials for three services. In n8n, go to \"Credentials\" on the left sidebar and click \"Add credential.\"\n\nGoogle Calendar API:\n\nSearch for \"Google Calendar\" and create new credentials using OAuth2. Follow the authentication flow.\nOnce created, select these credentials in the get_next_weeks_events node.\n\n\nGoogle Gemini (PaLM) API:\n\nSearch for \"Google Gemini\" or \"Google PaLM\" and create new credentials. You'll typically need an API key from Google AI Studio or Google Cloud.\nOnce created, select these credentials in the Google Gemini node.\n\n\nSMTP / Email:\n\nSearch for your email provider (e.g., \"SMTP,\" \"Gmail,\" \"Outlook\") and create credentials. This usually involves providing your email server details, username, and password/app password.\nOnce created, select these credentials in the send_email node.\nGoogle Calendar API:\n\nSearch for \"Google Calendar\" and create new credentials using OAuth2. 
Follow the authentication flow.\nOnce created, select these credentials in the get_next_weeks_events node.\nSearch for \"Google Calendar\" and create new credentials using OAuth2. Follow the authentication flow.\nOnce created, select these credentials in the get_next_weeks_events node.\nGoogle Gemini (PaLM) API:\n\nSearch for \"Google Gemini\" or \"Google PaLM\" and create new credentials. You'll typically need an API key from Google AI Studio or Google Cloud.\nOnce created, select these credentials in the Google Gemini node.\nSearch for \"Google Gemini\" or \"Google PaLM\" and create new credentials. You'll typically need an API key from Google AI Studio or Google Cloud.\nOnce created, select these credentials in the Google Gemini node.\nSMTP / Email:\n\nSearch for your email provider (e.g., \"SMTP,\" \"Gmail,\" \"Outlook\") and create credentials. This usually involves providing your email server details, username, and password/app password.\nOnce created, select these credentials in the send_email node.\nSearch for your email provider (e.g., \"SMTP,\" \"Gmail,\" \"Outlook\") and create credentials. This usually involves providing your email server details, username, and password/app password.\nOnce created, select these credentials in the send_email node.\n‼️ IMPORTANT: Customize User Settings in the locale Node:\n\nOpen the locale node.\nUpdate the following values in the \"Assignments\" section:\n\nusers-locale: Set your locale string (e.g., \"en-AU\" for English/Australia, \"en-US\" for English/United States, \"de-DE\" for German/Germany). This affects how dates, times, and numbers are formatted.\nusers-timezone: Set your timezone string (e.g., \"Australia/Sydney\", \"America/New_York\", \"Europe/London\"). This is critical for ensuring event times are displayed correctly for your location.\nusers-name: Enter your name (e.g., \"Bob\"). This is used to personalize the email greeting.\nusers-home-city: Enter your home city (e.g., \"Sydney\"). 
This can be used for additional context by the AI.\nOpen the locale node.\nUpdate the following values in the \"Assignments\" section:\n\nusers-locale: Set your locale string (e.g., \"en-AU\" for English/Australia, \"en-US\" for English/United States, \"de-DE\" for German/Germany). This affects how dates, times, and numbers are formatted.\nusers-timezone: Set your timezone string (e.g., \"Australia/Sydney\", \"America/New_York\", \"Europe/London\"). This is critical for ensuring event times are displayed correctly for your location.\nusers-name: Enter your name (e.g., \"Bob\"). This is used to personalize the email greeting.\nusers-home-city: Enter your home city (e.g., \"Sydney\"). This can be used for additional context by the AI.\nusers-locale: Set your locale string (e.g., \"en-AU\" for English/Australia, \"en-US\" for English/United States, \"de-DE\" for German/Germany). This affects how dates, times, and numbers are formatted.\nusers-timezone: Set your timezone string (e.g., \"Australia/Sydney\", \"America/New_York\", \"Europe/London\"). This is critical for ensuring event times are displayed correctly for your location.\nusers-name: Enter your name (e.g., \"Bob\"). This is used to personalize the email greeting.\nusers-home-city: Enter your home city (e.g., \"Sydney\"). This can be used for additional context by the AI.\nConfigure the get_next_weeks_events (Google Calendar) Node:\n\nOpen the node.\nIn the \"Calendar\" parameter, you need to specify which calendar to fetch events from.\n\nThe default might be a placeholder like c_4d9c2d4e139327143ee4a5bc4db531ffe074e98d21d1c28662b4a4d4da898866@group.calendar.google.com.\nChange this to your primary calendar (often your email address) or the specific Calendar ID you want to use. 
You can find Calendar IDs in your Google Calendar settings.\nOpen the node.\nIn the \"Calendar\" parameter, you need to specify which calendar to fetch events from.\n\nThe default might be a placeholder like c_4d9c2d4e139327143ee4a5bc4db531ffe074e98d21d1c28662b4a4d4da898866@group.calendar.google.com.\nChange this to your primary calendar (often your email address) or the specific Calendar ID you want to use. You can find Calendar IDs in your Google Calendar settings.\nThe default might be a placeholder like c_4d9c2d4e139327143ee4a5bc4db531ffe074e98d21d1c28662b4a4d4da898866@group.calendar.google.com.\nChange this to your primary calendar (often your email address) or the specific Calendar ID you want to use. You can find Calendar IDs in your Google Calendar settings.\nConfigure the send_email Node:\n\nOpen the node.\nSet the fromEmail parameter to the email address you want the summary to be sent from.\nSet the toEmail parameter to the email address(es) where you want to receive the summary.\nYou can also customize the subject line if desired.\nOpen the node.\nSet the fromEmail parameter to the email address you want the summary to be sent from.\nSet the toEmail parameter to the email address(es) where you want to receive the summary.\nYou can also customize the subject line if desired.\n(Optional) Customize the AI Prompt in event_summary_agent:\n\nIf you want to change how the AI summarizes events (e.g., different keywords for important events, a different tone, or specific formatting tweaks), you can edit the \"System Message\" within the event_summary_agent node's parameters.\nIf you want to change how the AI summarizes events (e.g., different keywords for important events, a different tone, or specific formatting tweaks), you can edit the \"System Message\" within the event_summary_agent node's parameters.\n(Optional) Adjust the Schedule in weekly_schedule:\n\nOpen the weekly_schedule node.\nModify the \"Rule\" to change when and how often the workflow runs 
(e.g., a specific day of the week, a different time).\nOpen the weekly_schedule node.\nModify the \"Rule\" to change when and how often the workflow runs (e.g., a specific day of the week, a different time).\nActivate the Workflow:\n\nOnce everything is configured, toggle the \"Active\" switch in the top right corner of the workflow editor to ON.\nOnce everything is configured, toggle the \"Active\" switch in the top right corner of the workflow editor to ON.\n\nYou'll receive an email (based on your schedule) with a subject like \"Next Week Calendar Summary : [Start Date] - [End Date]\". The email body will contain:\n\nA friendly greeting.\nYour schedule for the upcoming week (Monday to Sunday), with events listed chronologically under each day.\nEvent times displayed in your local timezone (e.g., 09:30 AM - 10:30 AM: Team Meeting).\nPriority events clearly marked (e.g., IMPORTANT: 02:00 PM - 03:00 PM: Project Deadline Review).\nA brief, insightful observation about your week's schedule.\n\nTimezone is Key: Ensure your users-timezone in the locale node is correct. This is the most common source of incorrect event times.\nGoogle API Permissions: When setting up Google Calendar and Gemini credentials, make sure you grant the necessary permissions.\nAI Output Varies: The AI-generated summary can vary slightly each time. The prompt is designed to guide it, but LLMs have inherent creativity.\nCalendar Event Details: The quality of the summary (especially for identifying important events) depends on how detailed your calendar event titles and descriptions are. Including keywords like \"meeting,\" \"urgent,\" \"prepare for,\" etc., in your events helps the AI.\n\nFeel free to modify and enhance this workflow! If you have suggestions, improvements, or run into issues, please share them in the n8n community.\n\nHappy scheduling!",
"isPaid": false
},
{
"templateId": "4733",
"templateName": "Upwork Job Aggregator & Notifier",
"templateDescription": "🚀 Automated Job Hunter: Upwork Opportunity Aggregator & AI-Powered Notifier! Workflow OverviewThis cutting-edge n8n automation is a sophisticated job...",
"templateUrl": "https://n8n.io/workflows/4733",
"jsonFileName": "Upwork_Job_Aggregator__Notifier.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Upwork_Job_Aggregator__Notifier.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/3eb6d343aa76d98e01b1b40b3744ae1f/raw/c31cee156e6552f16ae925045fb4c56f11068e8d/Upwork_Job_Aggregator__Notifier.json",
"screenshotURL": "https://i.ibb.co/4nnffr1V/77f971e09022.png",
"workflowUpdated": true,
"gistId": "3eb6d343aa76d98e01b1b40b3744ae1f",
"templateDescriptionFull": "This cutting-edge n8n automation is a sophisticated job discovery and notification tool designed to transform freelance job hunting into a seamless, intelligent process. By intelligently connecting Apify, OpenAI, Google Sheets, and Gmail, this workflow:\n\nDiscovers Job Opportunities:\n\nAutomatically scrapes Upwork job listings\nTracks recent freelance opportunities\nEliminates manual job searching efforts\nAutomatically scrapes Upwork job listings\nTracks recent freelance opportunities\nEliminates manual job searching efforts\nIntelligent Data Processing:\n\nFilters and extracts key job details\nStructures job information\nEnsures comprehensive opportunity tracking\nFilters and extracts key job details\nStructures job information\nEnsures comprehensive opportunity tracking\nAI-Powered Summarization:\n\nGenerates concise job summaries\nCreates human-readable job digests\nProvides quick, actionable insights\nGenerates concise job summaries\nCreates human-readable job digests\nProvides quick, actionable insights\nSeamless Notification:\n\nAutomatically logs jobs to Google Sheets\nSends personalized email digests\nEnables rapid opportunity assessment\nAutomatically logs jobs to Google Sheets\nSends personalized email digests\nEnables rapid opportunity assessment\n\n🤖 Full Automation: Zero-touch job discovery\n💡 Smart Filtering: Targeted job opportunities\n📊 Comprehensive Tracking: Detailed job market insights\n🌐 Multi-Platform Synchronization: Seamless data flow\n\nScheduled Trigger: Daily job scanning\nApify Integration: Upwork job scraping\nIntelligent Filtering:\n\nRecent job postings\nSpecific keywords\nRelevant opportunities\nRecent job postings\nSpecific keywords\nRelevant opportunities\n\nComprehensive Job Metadata Parsing\nKey Information Retrieval\nStructured Data Preparation\n\nOpenAI GPT Processing\nProfessional Summary Generation\nContextual Job Insight Creation\n\nGoogle Sheets Logging\nGmail Integration\nAutomated Job Digest 
Delivery\n\nFreelancers: Opportunity tracking\nJob Seekers: Automated job discovery\nRecruitment Agencies: Market intelligence\nSkill Development Professionals: Trend monitoring\nCareer Coaches: Client opportunity identification\n\nApify\n\nUpwork scraping actor\nAPI token\nConfigured scraping parameters\nUpwork scraping actor\nAPI token\nConfigured scraping parameters\nOpenAI API\n\nGPT model access\nSummarization configuration\nAPI key management\nGPT model access\nSummarization configuration\nAPI key management\nGoogle Sheets\n\nConnected Google account\nPrepared job tracking spreadsheet\nAppropriate sharing settings\nConnected Google account\nPrepared job tracking spreadsheet\nAppropriate sharing settings\nGmail Account\n\nConnected email\nJob digest configuration\nAppropriate sending permissions\nConnected email\nJob digest configuration\nAppropriate sending permissions\nn8n Installation\n\nCloud or self-hosted instance\nWorkflow configuration\nAPI credential management\nCloud or self-hosted instance\nWorkflow configuration\nAPI credential management\n\n🤖 Advanced job matching algorithms\n📊 Multi-platform job aggregation\n🔔 Customizable alert mechanisms\n🌐 Expanded job category tracking\n🧠 Machine learning job recommendation\n\nImplement robust error handling\nUse secure API authentication\nMaintain flexible data processing\nEnsure compliance with platform guidelines\n\nRespect job poster privacy\nUse data for legitimate job searching\nMaintain transparent information gathering\nProvide proper attribution\n\n#FreelanceJobHunting #CareerAutomation #JobDiscovery #AIJobSearch #WorkflowAutomation #FreelanceTech #CareerIntelligence #JobMarketInsights #ProfessionalNetworking #TechJobSearch\n\nReady to revolutionize your job hunting strategy?\n\n📧 Email: Yaron@nofluff.online\n\n🎥 YouTube: @YaronBeen\n\n💼 LinkedIn: Yaron Been\n\nTransform your job search with intelligent, automated workflows!",
"isPaid": false
},
{
"templateId": "4452",
"templateName": "template_4452",
"templateDescription": "Who is this for? This workflow is ideal for: Finance teams that need to process incoming invoices faster with minimal errorsSmall to mid-sized businesses...",
"templateUrl": "https://n8n.io/workflows/4452",
"jsonFileName": "template_4452.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_4452.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/1315e8770dfcd98ad0072db9f50c48da/raw/6632d989bf94caae5020f98138a7d7238b9fb719/template_4452.json",
"screenshotURL": "https://i.ibb.co/ych9wDNp/4ad107681ebf.png",
"workflowUpdated": true,
"gistId": "1315e8770dfcd98ad0072db9f50c48da",
"templateDescriptionFull": "This workflow is ideal for:\n\nFinance teams that need to process incoming invoices faster with minimal errors\nSmall to mid-sized businesses that want to automate invoice intake, review, and storage\nOperations managers who require approval workflows and centralized record-keeping\n\nManually processing invoices is time-consuming, error-prone, and often lacks structure. This workflow solves those challenges by:\n\nAutomating the intake of invoices from multiple sources (email, Google Drive, web form)\nExtracting invoice data using AI, eliminating manual data entry\nImplementing an email-based approval system to add human oversight\nAutomatically storing approved invoice data in Google Sheets for easy access and reporting\nNotifying stakeholders when invoices are approved or rejected\n\nThis end-to-end invoice processing workflow includes:\n\nThree invoice input methods: Google Drive folder monitor, Gmail attachments, and web form uploads\nPDF to text extraction for each input method using native PDF parsing\nAI-powered invoice analysis with GPT-4 to extract structured fields such as vendor, total, and due date\nDynamic categorization of invoice type (e.g., Travel, Software, Utilities) via AI\nEmail-based approval workflow with embedded forms to collect decisions and notes\nAutomated Google Sheets logging of all invoice data, approval status, and reviewer feedback\nRejection notifications sent automatically to your finance team for transparency and follow-up\n\nCopy the Google Sheet template here:\n👉 PDF Invoice Parser with Approval Workflow – Google Sheet Template\nConnect your Google Drive account and specify the invoice folder ID\nSet up Gmail to monitor incoming invoices with PDF attachments\nEnable your form trigger to accept direct uploads from your internal or external users\nEnter your OpenAI API key in the AI processing node for data extraction\nConfigure Google Sheets with a target spreadsheet to store invoice data\nSet 
recipient email addresses for invoice approvals and rejection notifications\nTest with a sample invoice to ensure end-to-end flow is working\n\nChange input sources: Replace Gmail with Outlook or use Slack uploads instead\nAdd validation steps: Include regex or keyword checks before AI analysis\nCustomize the AI schema: Modify the expected JSON structure based on your internal finance system\nIntegrate with accounting tools: Add Xero, QuickBooks, or custom API nodes to push data\nRoute based on category: Add conditional logic to handle invoices differently based on vendor or category\nMulti-level approvals: Add additional email steps if higher-level signoff is needed\nAudit logging: Use database or Google Sheets to maintain a historical log of approvals and rejections\n\nContact me for consulting and support:\n📧 billychartanto@gmail.com",
"isPaid": false
},
{
"templateId": "3713",
"templateName": "comentarios automaticos",
"templateDescription": "Instagram Auto-Comment Responder with AI Agent Integration Version: 1.1.0 ‧ n8n Version: 1.88.0+ ‧ License: MIT A fully automated workflow for managing and...",
"templateUrl": "https://n8n.io/workflows/3713",
"jsonFileName": "comentarios_automaticos.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/comentarios_automaticos.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/cad1bf10b23598ea71d4dec26b84b9a1/raw/f05fe7f5ef1e7ae308cd5636e8fd7bded744a44e/comentarios_automaticos.json",
"screenshotURL": "https://i.ibb.co/SXhKK6St/daa6ffb7deef.png",
"workflowUpdated": true,
"gistId": "cad1bf10b23598ea71d4dec26b84b9a1",
"templateDescriptionFull": "Version: 1.1.0 ‧ n8n Version: 1.88.0+ ‧ License: MIT\n\nA fully automated workflow for managing and responding to Instagram comments using AI agents. Designed to improve engagement and save time, this system listens for new Instagram comments, verifies and filters them, fetches relevant post data, processes valid messages with a natural language AI, and posts context-aware replies directly on the original post.\n\n💬 AI-Driven Engagement: Intelligent responses to comments via a GPT-powered agent.\n✅ Webhook Verification: Handles Instagram webhook handshake to ensure secure integration.\n📦 Data Extraction: Maps incoming payload fields (user ID, username, message text, media ID) for processing.\n🚫 Self-Comment Filtering: Automatically skips comments made by the account owner to prevent loops.\n📡 Post Data Retrieval: Fetches the media’s id and caption from the Graph API (v22.0) before generating a reply.\n🧠 Natural Language Processing: Uses a custom system prompt to maintain brand tone and context.\n🔁 Automated Replies: Posts the AI-generated message back to the comment thread using Instagram’s API.\n🧩 Modular Architecture: Clear separation of steps via sticky notes and dedicated HTTP Request and Agent nodes.\n\nSocial Media Automation: Keep followers engaged 24/7 with instant, relevant replies.\nCommunity Building: Maintain a consistent voice and tone across all interactions.\nBrand Reputation Management: Ensure no valid comment goes unanswered.\nAI Customer Support: Triage simple questions and direct followers to resources or support.\n\nWebhook Verification\n\nNode: Webhook + Respond to Webhook\nEchoes hub.challenge to confirm subscription and secure incoming events.\nNode: Webhook + Respond to Webhook\nEchoes hub.challenge to confirm subscription and secure incoming events.\nData Extraction\n\nNode: Set\nMaps payload fields into structured variables: conta.id, usuario.id, usuario.name, usuario.message.id, usuario.message.text, 
usuario.media.id, endpoint.\nNode: Set\nMaps payload fields into structured variables: conta.id, usuario.id, usuario.name, usuario.message.id, usuario.message.text, usuario.media.id, endpoint.\nUser Validation\n\nNode: Filter\nSkips processing if conta.id equals usuario.id (self-comments).\nNode: Filter\nSkips processing if conta.id equals usuario.id (self-comments).\nPost Data Retrieval\n\nNode: HTTP Request (Get post data)\nGET https://graph.instagram.com/v22.0/{{ $json.usuario.media.id }}?fields=id,caption&access_token={{ credentials }}\nCaptures the media’s caption for richer context in replies.\nNode: HTTP Request (Get post data)\nGET https://graph.instagram.com/v22.0/{{ $json.usuario.media.id }}?fields=id,caption&access_token={{ credentials }}\nCaptures the media’s caption for richer context in replies.\nAI Response Generation\n\nNodes: AI Agent + OpenRouter Chat Model\nUses a detailed system prompt with:\n\nProfile persona (expert in AI & automations, friendly tone).\nInput data (username, comment text, post caption).\nFiltering logic (spam, praise, questions, vague comments).\n\n\nReturns either the reply text or [IGNORE] for irrelevant content.\nNodes: AI Agent + OpenRouter Chat Model\nUses a detailed system prompt with:\n\nProfile persona (expert in AI & automations, friendly tone).\nInput data (username, comment text, post caption).\nFiltering logic (spam, praise, questions, vague comments).\nProfile persona (expert in AI & automations, friendly tone).\nInput data (username, comment text, post caption).\nFiltering logic (spam, praise, questions, vague comments).\nReturns either the reply text or [IGNORE] for irrelevant content.\nPosting the Reply\n\nNode: HTTP Request (Post comment)\nPOST {{ $json.endpoint }}/{{ $json.usuario.message.id }}/replies with message={{ $json.output }}\nSends the AI answer back under the original comment.\nNode: HTTP Request (Post comment)\nPOST {{ $json.endpoint }}/{{ $json.usuario.message.id }}/replies with message={{ 
$json.output }}\nSends the AI answer back under the original comment.\n\nImport Workflow\nIn n8n > Workflows > Import from File, upload the provided .json template.\nConfigure Credentials\n\nInstagram Graph API (Header Auth or FacebookGraphApi) with instagram_basic, instagram_manage_comments scopes.\nOpenRouter/OpenAI API key for AI agent.\nInstagram Graph API (Header Auth or FacebookGraphApi) with instagram_basic, instagram_manage_comments scopes.\nOpenRouter/OpenAI API key for AI agent.\nCustomize System Prompt\n\nEdit the AI Agent’s prompt to adjust brand tone, language (Brazilian Portuguese), length, or emoji usage.\nEdit the AI Agent’s prompt to adjust brand tone, language (Brazilian Portuguese), length, or emoji usage.\nTest & Activate\n\nPublish a test comment on an Instagram post.\nVerify each node’s execution, ensuring the webhook, filter, data extraction, HTTP requests, and AI Agent respond as expected.\nPublish a test comment on an Instagram post.\nVerify each node’s execution, ensuring the webhook, filter, data extraction, HTTP requests, and AI Agent respond as expected.\nExtend & Monitor\n\nAdd sentiment analysis or lead capture nodes as needed.\nMonitor execution logs for errors or rate-limit events.\nAdd sentiment analysis or lead capture nodes as needed.\nMonitor execution logs for errors or rate-limit events.\n\nSocial Media • Instagram Automation • Webhook Verification • AI Agent • HTTP Request • Auto Reply • Community Management",
"isPaid": false
},
{
"templateId": "4064",
"templateName": "template_4064",
"templateDescription": "This workflow provides a robust solution for automatically backing up all your n8n workflows to a designated GitHub repository on a daily basis. By...",
"templateUrl": "https://n8n.io/workflows/4064",
"jsonFileName": "template_4064.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_4064.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/9b93dcb617d5921bb51774e6dea5e087/raw/e351f3fa501cfdb4e78b1ea27598428010403d6a/template_4064.json",
"screenshotURL": "https://i.ibb.co/HfHnDH03/30f1b4fe0e1c.png",
"workflowUpdated": true,
"gistId": "9b93dcb617d5921bb51774e6dea5e087",
"templateDescriptionFull": "This workflow provides a robust solution for automatically backing up all your n8n workflows to a designated GitHub repository on a daily basis. By leveraging the n8n API and GitHub API, it ensures your workflows are version-controlled and securely stored, safeguarding against data loss and facilitating disaster recovery.\n\nThe automation follows these key steps:\n\nScheduled trigger: The workflow is initiated automatically every day at a pre-configured time.\nList existing backups: It first connects to your GitHub repository to retrieve a list of already backed-up workflow files. This helps in determining whether a workflow's backup file needs to be created or updated.\nRetrieve n8n workflows: The workflow then fetches all current workflows directly from your n8n instance using the n8n REST API.\nProcess and prepare: Each retrieved workflow is individually processed. Its data is converted into JSON format. This JSON content is then encoded to base64, a format suitable for GitHub API file operations.\nCommit to GitHub: For each n8n workflow:\n\nA standardized filename is generated (e.g., workflow-name-tag.json).\nThe workflow checks if a file with this name already exists in the GitHub repository (based on the list fetched in step 2).\nIf the file exists: It updates the existing file with the latest version of the workflow.\nIf it's a new workflow (file doesn't exist): A new file is created in the repository.\nEach commit is timestamped for clarity.\nA standardized filename is generated (e.g., workflow-name-tag.json).\nThe workflow checks if a file with this name already exists in the GitHub repository (based on the list fetched in step 2).\nIf the file exists: It updates the existing file with the latest version of the workflow.\nIf it's a new workflow (file doesn't exist): A new file is created in the repository.\nEach commit is timestamped for clarity.\n\nThis process ensures that you always have an up-to-date version of all your n8n 
workflows stored securely in your GitHub version control system, providing peace of mind and a reliable backup history.\n\nBefore you can use this template, please ensure you have the following:\n\nAn active n8n instance (self-hosted or cloud).\nA GitHub account.\nA GitHub repository created where you want to store the workflow backups.\nA GitHub Personal Access Token with repo scope (or fine-grained token with read/write access to the specific backup repository). This token will be used for GitHub API authentication.\nn8n API credentials (API key) for your n8n instance.\n\nSetting up this workflow should take approximately 10-15 minutes if you have your credentials ready.\n\nImport the template: Import this workflow into your n8n instance.\nConfigure n8n API credentials:\n\nLocate the \"Retrieve workflows\" node.\nIn the \"Credentials\" section for \"n8n API\", create new credentials (or select existing ones).\nEnter your n8n instance URL and your n8n API Key (you can create your n8n api key in the settings of your n8n instance)\nLocate the \"Retrieve workflows\" node.\nIn the \"Credentials\" section for \"n8n API\", create new credentials (or select existing ones).\nEnter your n8n instance URL and your n8n API Key (you can create your n8n api key in the settings of your n8n instance)\nConfigure GitHub credentials:\n\nLocate the \"List files from repo\" node (and subsequently \"Update file\" / \"Upload file\" nodes which will use the same credential).\nIn the \"Credentials\" section for \"GitHub API\", create new credentials.\nSelect OAuth2/Personal Access Token authentication method.\nEnter the GitHub Personal Access Token you generated as per the pre-requisites.\nLocate the \"List files from repo\" node (and subsequently \"Update file\" / \"Upload file\" nodes which will use the same credential).\nIn the \"Credentials\" section for \"GitHub API\", create new credentials.\nSelect OAuth2/Personal Access Token authentication method.\nEnter the GitHub Personal 
Access Token you generated as per the pre-requisites.\nSpecify repository details:\n\nIn the \"List files from repo\", \"Update file\", and \"Upload file\" GitHub nodes:\n\nSet the Owner: Your GitHub username or organization name.\nSet the Repository: The name of your GitHub repository dedicated to backups.\nSet the Branch (e.g., main or master) where backups should be stored.\n(Optional) Specify a Path within the repository if you want backups in a specific folder (e.g., n8n_backups/). Leave blank to store in the root.\nAdjust schedule (Optional):\n\nSelect the \"Schedule Trigger\" node.\nModify the trigger interval (e.g., change the time of day or frequency) as needed. 
By default, it's set for a daily run.\nActivate the workflow: Save and activate the workflow.\n\nHere's a detailed breakdown of each node used in this workflow:\n\nSchedule trigger\n\nType: n8n-nodes-base.scheduleTrigger\nPurpose: This node automatically starts the workflow based on a defined schedule (e.g., daily at midnight).\nList files from repo\n\nType: n8n-nodes-base.github\nPurpose: Connects to your specified GitHub repository and lists all files, primarily to check for existing workflow backups.\nAggregate\n\nType: n8n-nodes-base.aggregate\nPurpose: Consolidates the list of file names obtained from the \"List files from repo\" node into a single item for easier lookup later in the \"Check if file exists\" node.\nRetrieve workflows\n\nType: n8n-nodes-base.n8n\nPurpose: Uses the n8n API to fetch a list of all workflows currently present in your n8n instance.\nJson file\n\nType: n8n-nodes-base.convertToFile\nPurpose: Takes the data of each workflow (retrieved by the \"Retrieve workflows\" node) and converts it into a structured JSON file format.\nTo base64\n\nType: n8n-nodes-base.extractFromFile\nPurpose: Converts the binary content of the JSON file (from the 
\"Json file\" node) into a base64 encoded string. This encoding is required by the GitHub API for file content.\nType: n8n-nodes-base.extractFromFile\nPurpose: Converts the binary content of the JSON file (from the \"Json file\" node) into a base64 encoded string. This encoding is required by the GitHub API for file content.\nCommit date & file name\n\nType: n8n-nodes-base.set\nPurpose: Prepares metadata for the GitHub commit. It generates:\n\ncommitDate: The current date and time for the commit message.\nfileName: A standardized file name for the workflow backup (e.g., my-workflow-vps-backups.json), typically using the workflow's name and its first tag.\nType: n8n-nodes-base.set\nPurpose: Prepares metadata for the GitHub commit. It generates:\n\ncommitDate: The current date and time for the commit message.\nfileName: A standardized file name for the workflow backup (e.g., my-workflow-vps-backups.json), typically using the workflow's name and its first tag.\ncommitDate: The current date and time for the commit message.\nfileName: A standardized file name for the workflow backup (e.g., my-workflow-vps-backups.json), typically using the workflow's name and its first tag.\nCheck if file exists\n\nType: n8n-nodes-base.if\nPurpose: A conditional node. It checks if the fileName (generated by \"Commit date & file name\") is present in the list of files aggregated by the \"Aggregate\" node. This determines if the workflow backup already exists in GitHub.\nType: n8n-nodes-base.if\nPurpose: A conditional node. It checks if the fileName (generated by \"Commit date & file name\") is present in the list of files aggregated by the \"Aggregate\" node. 
This determines if the workflow backup already exists in GitHub.\nUpdate file\n\nType: n8n-nodes-base.github\nPurpose: If the \"Check if file exists\" node determines the file does exist, this node updates that existing file in your GitHub repository with the latest workflow content (base64 encoded) and a commit message.\nUpload file\n\nType: n8n-nodes-base.github\nPurpose: If the \"Check if file exists\" node determines the file does not exist, this node creates and uploads a new file to your GitHub repository with the workflow content and a commit message.\n\nHere are a few ways you can customize this template to better fit your needs:\n\nBackup path: In the GitHub nodes (\"List files from repo\", \"Update file\", \"Upload file\"), you can specify a Path parameter to store backups in a specific folder within your repository (e.g., workflows/ or daily_backups/).\nFilename convention: Modify the \"Commit date & file name\" node (specifically the expression for fileName) to change how backup files are named. 
For example, you might want to include the workflow ID or a different date format.\nCommit messages: Customize the commit messages in the \"Update file\" and \"Upload file\" GitHub nodes to include more specific information if needed.\nError handling: Consider adding error handling branches (e.g., using the \"Error Trigger\" node or checking for node execution failures) to notify you if a backup fails for any reason.\nFiltering workflows: If you only want to back up specific workflows (e.g., those with a particular tag or name pattern), you can add a \"Filter\" node after \"Retrieve workflows\" to include only the desired workflows in the backup process.\nBackup frequency: Adjust the \"Schedule Trigger\" node to change how often the backup runs (e.g., hourly, weekly, or on specific days).\n\nTemplate was created in n8n v1.92.2",
"isPaid": false
},
{
"templateId": "3960",
"templateName": "N8N Financial Tracker Telegram Invoices to Notion with AI Summaries & Reports",
"templateDescription": "Automated Financial Tracker: Telegram Invoices to Notion with AI Summaries & Reports Tired of manually logging every expense? Streamline your financial...",
"templateUrl": "https://n8n.io/workflows/3960",
"jsonFileName": "N8N_Financial_Tracker_Telegram_Invoices_to_Notion_with_AI_Summaries__Reports.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/N8N_Financial_Tracker_Telegram_Invoices_to_Notion_with_AI_Summaries__Reports.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/b2ec1a85ca4f44c784b97b0c68bc0bb1/raw/3521b201349cf5bdc6c47f0f7b44608d4becf7ad/N8N_Financial_Tracker_Telegram_Invoices_to_Notion_with_AI_Summaries__Reports.json",
"screenshotURL": "https://i.ibb.co/jvBxQ4j0/fc89f9f3d926.png",
"workflowUpdated": true,
"gistId": "b2ec1a85ca4f44c784b97b0c68bc0bb1",
"templateDescriptionFull": "Automated Financial Tracker: Telegram Invoices to Notion with AI Summaries & Reports\n\nTired of manually logging every expense? Streamline your financial tracking with this powerful n8n workflow!\n\nSnap a photo of your invoice in Telegram, and let AI (powered by Google Gemini) automatically extract the details, record them in your Notion database, and even send you a quick summary. Plus, get scheduled weekly reports with charts to visualize your spending. Automate your finances, save time, and gain better insights with this easy-to-use template!\n\nTransform your expense tracking from a chore into an automated breeze. Try it out!\n\nThis workflow revolutionizes how you track your finances by automating the entire process from invoice capture to reporting. Simply send a photo of an invoice or receipt to a designated Telegram chat, and this workflow will:\n\nExtract Data with AI: Utilize Google Gemini's capabilities to perform OCR on the image, understand the content, and extract key details like item name, quantity, price, total, date, and even attempt to categorize the expense.\nStore in Notion: Automatically log each extracted transaction into a structured Notion database.\nInstant Feedback: Send a summary of the processed transaction back to your Telegram chat.\nScheduled Reporting: Generate and send a visual summary of your expenses (e.g., weekly spending by category) as a chart to your preferred Telegram chat or group.\n\nThis workflow is perfect for individuals, freelancers, or small teams looking to effortlessly manage their expenses without manual data entry.\n\nEffortless Expense Logging: Just send a picture – no more typing!\nAI-Powered Data Extraction: Leverages Google Gemini for intelligent invoice processing.\nCentralized Data in Notion: Keep all your financial records neatly organized in a Notion database.\nAutomated Categorization: AI helps in categorizing your expenses (e.g., Food & Beverage, Transportation).\nInstant 
Summaries: Get immediate confirmation and a summary of what was recorded.\nVisual Reporting: Receive scheduled charts (e.g., bar charts of spending by category) directly in Telegram.\nCustomizable: Easily adapt the workflow to your specific needs, categories, and reporting preferences.\nTime-Saving: Drastically reduces the time spent on manual financial administration.\n\nThe workflow is divided into two main parts:\n\nPart 1: Real-time Invoice Processing & Logging (## Auto Notes Transaction with Telegram and Notion database)\n\nTelegram Trigger (Telegram Trigger | When recive photo): Activates when a new photo is sent to the configured Telegram chat.\nGet Photo Info (Get Info Photo from telegram chat): Retrieves the details of the received photo.\nGet Image Info (Get Image Info): Prepares the image data.\nAI Data Extraction (Google Gemini Chat Model & Basic LLM Chain):\n\nThe image data is sent to the Google Gemini Chat Model.\nA specific prompt instructs the AI to extract details (date, ID, name, quantity, price, total, category, tax) in a JSON array format and provide a summary message. 
The categories include Food & Beverage, Transportation, Utilities, Shopping, Healthcare, Entertainment, Housing, and Education.\nParse AI Output (Parse To your object | Table): Structures the AI's JSON output for easier handling.\nSplit Transactions (Split Out | data transaction): If an invoice contains multiple items, this node splits them into individual records.\nRecord to Notion (Record To Notion Database): Each transaction item is added as a new page/entry in your specified Notion database, mapping fields like Name, Quantity, Price, Total, Category, Date, and Tax.\nSend Telegram Summary (Sendback to chat and give summarize text): The summary message generated by the AI is sent back to the original Telegram chat.\n\nPart 2: Scheduled Financial Reporting (## Schedule report to send on chanel or private message)\n\nSchedule Trigger (Schedule Trigger | for send chart report): Runs at a predefined interval (e.g., every week) to generate reports.\nGet Recent Data from Notion (Get Recent Data from Notions): Fetches transaction data from the Notion database for a specific period (e.g., the past week).\nSummarize Data (Summarize Transaction Data): Aggregates the data, for example, by summing up the 'total' amount for each 'category'.\nPrepare Chart Data (Convert Data to JSON chart payload): Transforms the summarized data into a JSON format suitable for generating a chart (e.g., labels for categories, data for spending amounts).\nGenerate Chart (Generate Chart): Uses the QuickChart node to create a visual chart (e.g., a bar chart) from the prepared data.\nSend Chart to Telegram (Send Chart Image to Group or Private Chat): Sends the generated chart image to a specified Telegram chat ID or group.\n\nTelegram Trigger & Telegram Node: For receiving images and sending messages/images.\nGoogle Gemini Chat Model (Langchain): For AI-powered OCR and data extraction from invoices.\nBasic LLM Chain (Langchain): To interact with the language model using specific prompts.\nOutput 
Parser Structured (Langchain): To structure the output from the language model.\nNotion Node: For reading from and writing to your Notion databases.\nSchedule Trigger: To automate the reporting process.\nSummarize Node: To aggregate data for reports.\nCode Node: Used here to format data for the chart.\nQuickChart Node: For generating charts.\nSplitOut Node: To process multiple items from a single invoice.\n\nCredentials:\n\nTelegram: Create a Telegram bot and get its API token. You'll also need the Chat ID where you'll send invoices and where reports should be sent.\nGoogle Gemini (PaLM) API: You'll need an API key for Google Gemini.\nNotion: Create a Notion integration and get the API key. Create a Notion database with properties corresponding to the data you want to save (e.g., Name (Title), Quantity (Number), Price (Number), Total (Number), Category (Select), Date (Text or Date), Tax (Number)). Share this database with your Notion integration.\nConfigure Telegram Trigger:\n\nAdd your Telegram Bot API token.\nWhen you first activate the workflow or test the trigger, send /start to your bot in the chat you want to use for sending invoices. 
n8n will then capture the Chat ID.\nConfigure Google Gemini Node (Google Gemini Chat Model):\n\nSelect or add your Google Gemini API credentials.\nReview the prompt in the Basic LLM Chain node and adjust if necessary (e.g., date format, categories).\nConfigure Notion Nodes:\n\nRecord To Notion Database:\n\nSelect or add your Notion API credentials.\nSelect your target Notion Database ID.\nMap the properties from the workflow (e.g., ={{ $json.name }}) to your Notion database columns.\n\nGet Recent Data from Notions:\n\nSelect or add your Notion API credentials.\nSelect your target Notion Database ID.\nAdjust the filter if needed (default is \"past_week\").\nConfigure Telegram Node for Reports (Send Chart Image to Group or Private Chat):\n\nSelect or add your Telegram Bot API token.\nEnter the Chat ID for the group or private chat where you want to receive the reports.\nConfigure Schedule Trigger (Schedule Trigger | for send chart report):\n\nSet your desired schedule (e.g., every Monday at 
9 AM).\nTest: Send an image of an invoice to your Telegram bot and check if the data appears in Notion and if you receive a summary message. Wait for the scheduled report or manually trigger it to test the reporting functionality.\n\n(These are suggestions. You would place these directly into the sticky notes within your n8n workflow editor.)\n\nExisting High-Level Sticky Notes:\n\n## Auto Notes Transaction with Telegram and Notion database\n## Schedule report to send on chanel or private message\n\nSpecific Sticky Notes to Add:\n\nOn Telegram Trigger | When recive photo:📸 INVOICE INPUT 📸\nBot listens here for photos of your receipts/invoices.\nEnsure your Telegram Bot API token is set in credentials.\nNear Google Gemini Chat Model & Basic LLM Chain:🤖 AI MAGIC HAPPENS HERE 🧠\n- Image is sent to Google Gemini for data extraction.\n- Check 'Basic LLM Chain' to customize the AI prompt (e.g., categories, output format).\n- Requires Google Gemini API credentials.\nOn Parse To your object | Table:✨ STRUCTURING AI DATA ✨\nConverts the AI's text output into a usable JSON object.\nCheck the schema if you modify the AI prompt significantly.\nOn Record To Notion Database:📝 SAVING TO NOTION 📝\n- Extracted transaction data is saved here.\n- Configure with your Notion API key & Database ID.\n- Map fields correctly to your database columns!\nOn Sendback to chat and give summarize text:💬 TRANSACTION SUMMARY 💬\nSends a confirmation message back to the user in Telegram\nwith a summary of the recorded expense.\nOn Schedule Trigger | for send chart report:🗓️ REPORTING SCHEDULE 🗓️\nSet how often you want to receive your spending report (e.g., weekly, monthly).\nOn Get Recent Data from Notions:📊 FETCHING DATA FOR REPORT 📊\n- Retrieves transactions from Notion for the report period.\n- Default: \"Past Week\". 
Adjust filter as needed.\n- Requires Notion API credentials & Database ID.\nOn Summarize Transaction Data:➕ SUMMARIZING SPENDING ➕\nAggregates your expenses, usually by category,\nto prepare for the chart.\nOn Convert Data to JSON chart payload (Code Node):🎨 PREPARING CHART DATA 🎨\nThis Code node formats the summarized data\ninto the JSON structure needed by QuickChart.\nOn Generate Chart (QuickChart Node):📈 GENERATING VISUAL REPORT 📈\nCreates the actual chart image based on your spending data.\nYou can customize chart type (bar, pie, etc.) here.\nOn Send Chart Image to Group or Private Chat:📤 SENDING REPORT TO TELEGRAM 📤\n- Delivers the generated chart to your chosen Telegram chat/group.\n- Set the correct Chat ID and Bot API token.\nGeneral Sticky Note (Place where relevant):🔑 CREDENTIALS NEEDED 🔑\nRemember to set up API keys/tokens for:\n- Telegram\n- Google Gemini\n- Notion\nGeneral Sticky Note (Place where relevant):💡 CUSTOMIZE ME! 💡\n- Adjust AI prompts for better accuracy.\n- Change Notion database structure.\n- Modify report frequency and content.",
"isPaid": false
},
{
"templateId": "3804",
"templateName": "Code Review workflow",
"templateDescription": "AI-Agent Code Review for GitHub Pull Requests Description: This n8n workflow automates the process of reviewing code changes in GitHub pull requests using...",
"templateUrl": "https://n8n.io/workflows/3804",
"jsonFileName": "Code_Review_workflow.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Code_Review_workflow.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/7bfbbcab112ec62c8f7fa7f40e8f2912/raw/508f5d2d47e13f13c7be88a8e25a2741035cf060/Code_Review_workflow.json",
"screenshotURL": "https://i.ibb.co/ktrbfxj/eca950589342.png",
"workflowUpdated": true,
"gistId": "7bfbbcab112ec62c8f7fa7f40e8f2912",
"templateDescriptionFull": "AI-Agent Code Review for GitHub Pull Requests\n\nThis n8n workflow automates the process of reviewing code changes in GitHub pull requests using an OpenAI-powered agent.\n\nIt connects your GitHub repo, extracts modified files, analyzes diffs, and uses an AI agent to generate a code review based on your internal code best practices (fed from a Google Sheet).\n\nIt ends by posting the review as a comment on the PR and tagging it with a visual label like ✅ Reviewed by AI.\n\n🔧 What It Does\n\nTriggered on PR creation\nExtracts code diffs from the PR\nFormats and feeds them into an OpenAI prompt\nEnriches the prompt using a Google Sheet of Swift best practices\nPosts an AI-generated review as a comment on the PR\nApplies a PR label to visually mark reviewed PRs\n\nBefore deploying this workflow, ensure you have the following:\n\nn8n Instance (Self-hosted or Cloud)\nGitHub Repository with PR activity\nOpenAI API Key for GPT-4o, GPT-4-turbo, or GPT-3.5\nGitHub OAuth App (or PAT) connected to n8n to post comments and access PR diffs\n(Optional) Google Sheets API credentials if using the code best practices lookup node.\n\n1. Import the Workflow in n8n, click on Workflows → Import from file or JSON\nPaste or upload the JSON code of this template\n\n2. 
Configure Triggers and Connections\n\nNode: PR Trigger\nRepository: Select the GitHub repo(s) to monitor\nEvents: Set to pull_request\nAuth: Use GitHub OAuth2 credentials\n\nNode: Get file's Diffs from PR\n\nNo authentication needed; it uses the dynamic path from the trigger\n\nNode: OpenAI Chat Model\nModel: Select gpt-4o, gpt-4-turbo, or gpt-3.5-turbo\nCredential: Provide your OpenAI API Key\n\nNode: Code Review Agent\nConnected to OpenAI and optionally to tools like Google Sheets\n\nUses GitHub API to post review comments back on the PR\nNode: GitHub Robot\nCredential: Use the agent's GitHub account (OAuth or PAT)\nRepo: Pick your own GitHub repository\n\nAdds the label ReviewedByAI after a successful comment\n\nNode: Add Label to PR\nLabel: You can customize the label text of your own tag.\n\nConnects to a Google Sheet for coding guideline lookups; you can replace the Google Sheet with another tool or database\n\nFirst, prepare your best-practices list with clear descriptions and good/bad code examples\nAdd all the best practices to your Google Sheet\nConfigure the Code Best Practices node in the template:",
"isPaid": false
},
{
"templateId": "4330",
"templateName": "Automated Resume Job Matching Engine with Bright Data & OpenAI 4o mini",
"templateDescription": "Automated Resume Job Matching NoticeCommunity nodes can only be installed on self-hosted instances of n8n. Who this is forThe Automated Resume Job Matching...",
"templateUrl": "https://n8n.io/workflows/4330",
"jsonFileName": "Automated_Resume_Job_Matching_Engine_with_Bright_Data__OpenAI_4o_mini.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Automated_Resume_Job_Matching_Engine_with_Bright_Data__OpenAI_4o_mini.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/443ac0db013a9d186f1700b19ad60993/raw/abf9c03be62049852024c5f636395737bafa9416/Automated_Resume_Job_Matching_Engine_with_Bright_Data__OpenAI_4o_mini.json",
"screenshotURL": "https://i.ibb.co/4Z2sZgPY/bd85d786ec1e.png",
"workflowUpdated": true,
"gistId": "443ac0db013a9d186f1700b19ad60993",
"templateDescriptionFull": "Community nodes can only be installed on self-hosted instances of n8n.\n\nThe Automated Resume Job Matching Engine is an intelligent workflow designed for career platforms, HR tech startups, recruiting firms, and AI developers who want to streamline job-resume matching using real-time data from LinkedIn and job boards.\n\nThis workflow is tailored for:\n\nHR Tech Founders - Building next-gen recruiting products\nRecruiters & Talent Sourcers - Seeking automated candidate-job fit evaluation\nJob Boards & Portals - Enriching user experience with AI-driven job recommendations\nCareer Coaches & Resume Writers - Offering personalized job fit analysis\nAI Developers - Automating large-scale matching tasks using LinkedIn and job data\n\nManually matching a resume to job description is time-consuming, biased, and inefficient. Additionally, accessing live job postings and candidate profiles requires overcoming web scraping limitations.\n\nThis workflow solves:\n\nAutomated LinkedIn profile and job post data extraction using Bright Data MCP infrastructure\nSemantic matching between job requirements and candidate resume using OpenAI 4o mini\nPagination handling for high-volume job data\nEnd-to-end automation from scraping to delivery via webhook and persisting the job matched response to disk\n\nBright Data MCP for Job Data Extraction\n\nUses Bright Data MCP Clients to extract multiple job listings (supports pagination)\nPulls job data from LinkedIn with the pre-defined filtering criteria's\n\nOpenAI 4o mini LLM Matching Engine\n\nExtracts paginated job data from the Bright Data MCP extracted info via the MCP scrape_as_html tool.\nExtracts textual job description information via the scraped job information by leveraging the Bright Data MCP scrape_as_html tool.\nAI Job Matching node handles the job description and the candidate resume compare to generate match scores with insights\n\nData Delivery\n\nSends final match report to a Webhook Notification 
endpoint\nPersists the AI-matched job response to disk\n\nWorking knowledge of the Model Context Protocol (MCP) is essential. Please read this blog post - model-context-protocol\nYou need to have a Bright Data account and do the necessary setup as mentioned in the Setup section below.\nYou need to have the Google Gemini API Key. Visit Google AI Studio\nYou need to install the Bright Data MCP Server @brightdata/mcp\nYou need to install the n8n-nodes-mcp\n\nPlease make sure to set up n8n locally with MCP Servers by navigating to n8n-nodes-mcp\nPlease make sure to install the Bright Data MCP Server @brightdata/mcp on your local machine.\nSign up at Bright Data.\nNavigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.\nCreate a Web Unlocker proxy zone called mcp_unlocker on the Bright Data control panel.\nIn n8n, configure the OpenAI account credentials.\nIn n8n, configure the credentials to connect the MCP Client (STDIO) account with the Bright Data MCP Server as shown below.\n\nMake sure to copy the Bright Data API_TOKEN within the Environments textbox above as API_TOKEN=<your-token>.\nUpdate the Set input fields for candidate resume, keywords, and other filtering criteria.\nUpdate the Webhook HTTP Request node with the Webhook endpoint of your choice.\nUpdate the file name and path to persist on disk.\n\nTarget Different Job Boards\n\nSet the input fields to sites like Indeed, ZipRecruiter, or Monster\n\nCustomize Matching Criteria\n\nAdjust the prompt inside the AI Job Match node\nInclude scoring metrics like skills match %, experience relevance, or cultural fit\n\nAutomate Scheduling\n\nUse a Cron Node to periodically check for new jobs matching a profile\nSet triggers based on webhook or input form submissions\n\nOutput Customization\n\nAdd Markdown/PDF formatting for report summaries\nExtend with Google Sheets export for internal analytics\n\nEnhance Data Security\n\nMask personal info before 
sending to external endpoints",
"isPaid": false
},
{
"templateId": "3363",
"templateName": "template_3363",
"templateDescription": "✨ Overview This workflow allows candidates to schedule interviews through a conversational AI assistant. It integrates with your Google Calendar to check...",
"templateUrl": "https://n8n.io/workflows/3363",
"jsonFileName": "template_3363.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_3363.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/944256a00933bccbfba2aeb6fdc23676/raw/416add85dda0c66cd0b2f8a9418f4e624c8af6de/template_3363.json",
"screenshotURL": "https://i.ibb.co/kgCnSdyR/1a9ac5a3fdaf.png",
"workflowUpdated": true,
"gistId": "944256a00933bccbfba2aeb6fdc23676",
"templateDescriptionFull": "✨ Overview\n\nThis workflow allows candidates to schedule interviews through a conversational AI assistant. It integrates with your Google Calendar to check for existing events and generates a list of available 30-minute weekday slots between 9 AM and 5 PM Eastern Time. Once the candidate selects a suitable time and provides their contact information, the AI bot automatically books the meeting on your calendar and confirms the appointment.\n\n⚡ Prerequisites\n\nTo use this workflow, you need an OpenAI account with access to the GPT-4o model, a Google account with a calendar that can be accessed through the Google Calendar API, and an active instance of n8n—either self-hosted or via n8n cloud. Within n8n, you must have two credential configurations ready: one for Google Calendar using OAuth2 authentication, and another for your OpenAI API key.\n\n🔐 API Credentials Setup\n\nFor Google Calendar, go to the Google Cloud Console and create a new project. Enable the Google Calendar API, then create OAuth2 credentials by selecting “Web Application” as the application type. Add http://localhost:5678/rest/oauth2-credential/callback as the redirect URI if using local n8n. After that, go to n8n, navigate to the Credentials section, and create a new Google Calendar OAuth2 credential using your account. For OpenAI, visit platform.openai.com to retrieve your API key. Then go to the n8n Credentials page, create a new credential for OpenAI, paste your key, and name it for reference.\n\n🔧 How to Make This Workflow Yours\n\nTo customize the workflow for your use, start by replacing all instances of the calendar email rbreen.ynteractive@gmail.com with your own Google Calendar email. 
This email is referenced in multiple places, including Google Calendar nodes and the ToolWorkflow JSON for the node named \"Run Get Availability.\" Also update any instances where the Google Calendar credential is labeled as Google Calendar account to match your own credential name within n8n. Do the same for the OpenAI credential label, replacing OpenAi account with the name of your own credential.\n\nNext, go to the node labeled Candidate Chat and copy the webhook URL. This is the public chat interface where candidates will engage with the bot—share this URL with them through email, your website, or anywhere you want to allow access. Optionally, you can also tweak the system message in the Interview Scheduler node to modify the tone, language, or logic used during conversations. If you want to add branding, update the title, subtitle, and inputPlaceholder in the Candidate Chat node, and consider modifying the final confirmation message in Final Response to User to reflect your brand voice. You can also update the business rules such as time zone, working hours, or default duration by editing the logic in the Generate 30 Minute Timeslots code node.\n\n🧩 Workflow Explanation\n\nThis workflow begins with the Candidate Chat node, which triggers when a user visits the public chat URL. The Interview Scheduler node acts as an AI agent, guiding the user through providing their email, phone number, and preferred interview time. It checks availability using the Run Get Availability tool, which in turn reads your calendar and compares it with generated free time slots from the Generate 30 Minute Timeslots node. The check day names tool helps the AI interpret natural language date expressions like “next Tuesday.”\n\nThe schedule is only populated with 30-minute weekday slots from 9 AM to 5 PM Eastern Time, and no events are scheduled if they overlap with existing ones. 
When a suitable time is confirmed, the AI formats the result into structured JSON, creates an event on your Google Calendar, and sends a confirmation back to the user with all relevant meeting details.\n\n🚀 Deployment Steps\n\nTo deploy the interview scheduler, import the provided workflow JSON into your n8n instance. Update the Google Calendar email, OpenAI and Google credential labels, system prompts, and branding as needed. Test the connections to ensure the API credentials are working correctly. Once everything is configured, copy and share the public chat URL from the Candidate Chat node. When candidates engage with the chat, the workflow will walk them through the interview booking process, check your availability, and finalize the booking automatically.\n\n💡 Additional Tips\n\nBy default, the workflow avoids scheduling interviews on weekends and outside of 9–5 EST. Each interview lasts exactly 30 minutes, and overlapping with existing events is prevented. The assistant does not reveal details about other meetings. You can customize every part of this workflow to fit your use case, including subworkflows like Get Availability and check day names, or even white-label it for client use. This workflow is ready to become your AI-powered interview scheduling assistant.\n\nI’m Robert Breen, founder of Ynteractive — a consulting firm that helps businesses automate operations using n8n, AI agents, and custom workflows. I’ve helped clients build everything from intelligent chatbots to complex sales automations, and I’m always excited to collaborate or support new projects.\n\nIf you found this workflow helpful or want to talk through an idea, I’d love to hear from you.\n\n🌐 Website: https://www.ynteractive.com\n📺 YouTube: @ynteractivetraining\n💼 LinkedIn: https://www.linkedin.com/in/robert-breen\n📬 Email: rbreen@ynteractive.com",
"isPaid": false
},
{
"templateId": "5374",
"templateName": "Copycat SEO article (public version)",
"templateDescription": "**Content engine that ships fresh, SEO-ready articles every single day.** Workflow: ⸻ Layout Blueprint •\tPurpose: Define content structure before writing...",
"templateUrl": "https://n8n.io/workflows/5374",
"jsonFileName": "Copycat_SEO_article_public_version.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Copycat_SEO_article_public_version.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/b493bfc59cf98b7a43975d37b9df68b2/raw/cec65839c290effb6a7897465b26dec96fc607cf/Copycat_SEO_article_public_version.json",
"screenshotURL": "https://i.ibb.co/ymJN60wm/c3dcc792d641.png",
"workflowUpdated": true,
"gistId": "b493bfc59cf98b7a43975d37b9df68b2",
"templateDescriptionFull": "**Content engine that ships fresh, SEO-ready articles every single day.\n**\n\nWorkflow:\n\n⸻\n\nLayout Blueprint\n•\tPurpose: Define content structure before writing begins.\n•\tWhat’s Included:\n•\tSearch intent mapping\n•\tInternal link planning\n•\tCall-to-action (CTA) placement\n•\tBenefit: Ensures consistency, SEO alignment, and content goals are baked in early.\n\n⸻\n\nAI-Assisted Drafting\n•\tTool: GPT generates the first draft.\n•\tEditor’s Role:\n•\tFocus on depth and accuracy\n•\tAlign tone and style with existing site content\n•\tContext-Aware: Pulls insights from top-ranking articles already live on the site.\n\n⸻\n\nSEO Validation\n•\tAutomated Checks for:\n•\tKeyword coverage\n•\tReadability scoring\n•\tSchema markup\n•\tInternal/external link quality\n•\tOutcome: Each piece is validated before hitting publish.\n\n⸻\n\nMedia Production\n•\tProcess: AI auto-generates relevant images.\n•\tDelivery: Visual assets are automatically added to the CMS library.\n\n⸻\n\nOptional Human Review: Team feedback via Slack or Teams if needed.\n\n⸻\n\nAutomated Publishing\n•\tAction: Instantly publishes content to Webflow once approved.\n•\tResult: A fully streamlined pipeline from draft to live with minimal manual steps.",
"isPaid": false
},
{
"templateId": "2644",
"templateName": "Flux Dev Image Generation Fal.ai",
"templateDescription": "This workflow automates AI-based image generation using the Fal.ai Flux API. Define custom prompts, image parameters, and effortlessly generate, monitor,...",
"templateUrl": "https://n8n.io/workflows/2644",
"jsonFileName": "Flux_Dev_Image_Generation_Fal.ai.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Flux_Dev_Image_Generation_Fal.ai.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/b430c3bbd5cfd3e0b21a14e44c92c852/raw/e134513142a35106d94919235c6f5f0a9be45aee/Flux_Dev_Image_Generation_Fal.ai.json",
"screenshotURL": "https://i.ibb.co/rGZhkt6S/b85d9d9949c2.png",
"workflowUpdated": true,
"gistId": "b430c3bbd5cfd3e0b21a14e44c92c852",
"templateDescriptionFull": "This workflow automates AI-based image generation using the Fal.ai Flux API. Define custom prompts, image parameters, and effortlessly generate, monitor, and save the output directly to Google Drive. Streamline your creative automation with ease and precision.\n\nWho is this for?\n\nThis template is for content creators, developers, automation experts, and creative professionals looking to integrate AI-based image generation into their workflows. It’s ideal for generating custom visuals with the Fal.ai Flux API and automating storage in Google Drive.\n\nWhat problem is this workflow solving?\n\nManually generating AI-based images, checking their status, and saving results can be tedious. This workflow automates the entire process — from requesting image generation, monitoring its progress, downloading the result, and saving it directly to a Google Drive folder.\n\nWhat this workflow does\n1.\tSets Custom Image Parameters: Allows you to define the prompt, resolution, guidance scale, and steps for AI image generation.\n2.\tSends a Request to Fal.ai: Initiates the image generation process using the Fal.ai Flux API.\n3.\tMonitors Image Status: Checks for completion and waits if needed.\n4.\tDownloads the Generated Image: Fetches the completed image once ready.\n5.\tSaves to Google Drive: Automatically uploads the generated image to a specified Google Drive folder.\n\nSetup\n1.\tPrerequisites:\n•\tFal.ai API Key: Obtain it from the Fal.ai platform and set it as the Authorization header in HTTP Header Auth credentials.\n•\tGoogle Drive OAuth Credentials: Connect your Google Drive account in n8n.\n2.\tConfiguration:\n•\tUpdate the “Edit Fields” node with your desired image parameters:\n•\tPrompt: Describe the image (e.g., “Thai young woman net idol 25 yrs old, walking on the street”).\n•\tWidth/Height: Define image resolution (default: 1024x768).\n•\tSteps: Number of inference steps (e.g., 30).\n•\tGuidance Scale: Controls image adherence to 
the prompt (e.g., 3.5).\n•\tSet your Google Drive folder ID in the “Google Drive” node to save the image.\n3.\tRun the Workflow:\n•\tTrigger the workflow manually to generate the image.\n•\tThe workflow waits, checks status, and saves the final output seamlessly.\n\nCustomization\n•\tModify Image Parameters: Adjust the prompt, resolution, steps, and guidance scale in the “Edit Fields” node.\n•\tChange Storage Location: Update the Google Drive node with a different folder ID.\n•\tAdd Notifications: Integrate an email or messaging node to alert you when the image is ready.\n•\tAdditional Outputs: Expand the workflow to send the generated image to Slack, Dropbox, or other platforms.\n\nThis workflow streamlines AI-based image generation and storage, offering flexibility and customization for creative automation.",
"isPaid": false
},
{
"templateId": "4467",
"templateName": "PromptCraft AI",
"templateDescription": "PromptCraft AI – Telegram Image Generator 🚀 How It Works PromptCraft AI is an n8n automation that transforms simple image ideas sent through Telegram into...",
"templateUrl": "https://n8n.io/workflows/4467",
"jsonFileName": "PromptCraft_AI.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/PromptCraft_AI.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/0004ecdee6bd5255763a7caeaa2bb478/raw/82441318a920af45c5fa49dfb119b7a35e723371/PromptCraft_AI.json",
"screenshotURL": "https://i.ibb.co/4RkDbRqM/4ca496b7f81e.png",
"workflowUpdated": true,
"gistId": "0004ecdee6bd5255763a7caeaa2bb478",
"templateDescriptionFull": "PromptCraft AI is an n8n automation that transforms simple image ideas sent through Telegram into stunning AI-generated images using OpenAI's DALL·E (or other image models).\n\nTelegram Trigger: Listens for messages from a user on Telegram.\nPrompt Expansion: The message is transformed into a rich image description using GPT (OpenAI Chat Model).\nImage Generation: The prompt is passed to OpenAI's image API to generate a high-quality image.\nSend Image: The final image is sent back to the user on Telegram.\n(Optional) Log image titles and links to Google Drive and Google Sheets.\n\n[ ] n8n installed (Self-hosted or via n8n.cloud)\n[ ] Telegram bot token (via @BotFather)\n[ ] OpenAI API key (platform.openai.com)\n[ ] Google Sheets & Drive OAuth2 credentials (optional)\n\nGo to n8n → click Import → upload PromptCraft_AI_Template.json\n\nIn Credentials, add the following:\n\nTelegram API → Paste your bot token\nOpenAI API → Paste your OpenAI API key\n(Optional) Google Sheets OAuth2, Google Drive OAuth2\nTelegram API → Paste your bot token\nOpenAI API → Paste your OpenAI API key\n(Optional) Google Sheets OAuth2, Google Drive OAuth2\n\nOpen each node that requires credentials:\n\nReplace REPLACE_OPENAI_API_KEY with your actual OpenAI API key\nReplace REPLACE_TELEGRAM_API_ID and credential names as needed\n(Optional) Update Google Drive Folder ID & Sheet ID in respective nodes\nReplace REPLACE_OPENAI_API_KEY with your actual OpenAI API key\nReplace REPLACE_TELEGRAM_API_ID and credential names as needed\n(Optional) Update Google Drive Folder ID & Sheet ID in respective nodes\n\nTurn on the Telegram Trigger node.\nDeploy and activate the full workflow.\n\nSend your Telegram bot a message like:\n\na knight riding a robotic horse in the future\nReceive the generated image back in Telegram!\n\nUse detailed or imaginative inputs for better outputs.\nFine-tune the GPT prompt for specific visual styles.\nExtend with Google Vision, image upscaling, or 
watermarking.\n\nFor setup assistance or custom feature requests, feel free to contact me @dimejicole21@gmail.com\n\nHappy Prompting! 🖼✨",
"isPaid": false
},
{
"templateId": "2217",
"templateName": "Image Generation API",
"templateDescription": "How it works:Webhook URL that responds to Requests with an AI generated Image based on the prompt provided in the URL. Setup Steps:Ideate your promptURL...",
"templateUrl": "https://n8n.io/workflows/2217",
"jsonFileName": "Image_Generation_API.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Image_Generation_API.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/33c5cc4409a91bed0b15ddd168a78f76/raw/a5cb5fbc36d2b79b3ce1abfcd687db553494fb82/Image_Generation_API.json",
"screenshotURL": "https://i.ibb.co/whgdF6Kd/e291961b3e25.png",
"workflowUpdated": true,
"gistId": "33c5cc4409a91bed0b15ddd168a78f76",
"templateDescriptionFull": "How it works:\nWebhook URL that responds to Requests with an AI generated Image based on the prompt provided in the URL.\n\nSetup Steps:\n\nIdeate your prompt\nURL Encode The Prompt (as shown in the Template)\nAuthenticate with your OpenAI Credentials\nPut together the Webhook URL with your prompt and enter into a webbrowser\n\nIn this way you can expose a public url to users, employee's etc. without exposing your OpenAI API Key to them.\n\nClick here to find a blog post with additional information.",
"isPaid": false
},
{
"templateId": "1900",
"templateName": "OpenAI-model-examples",
"templateDescription": "Primer workflow for OpenAI models: ChatGPT, DALLE-2, WhisperThis workflow contains 5 examples on how to work with OpenAI API. Transcribe voice into text via...",
"templateUrl": "https://n8n.io/workflows/1900",
"jsonFileName": "OpenAI-model-examples.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/OpenAI-model-examples.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/1cd2a92afa6b3062e84157973818fabb/raw/6637074d8811ac906da1c36afa655e03fd79912e/OpenAI-model-examples.json",
"screenshotURL": "https://i.ibb.co/qY0T5sG7/323abe693a86.png",
"workflowUpdated": true,
"gistId": "1cd2a92afa6b3062e84157973818fabb",
"templateDescriptionFull": "This workflow contains 5 examples on how to work with OpenAI API.\n\nTranscribe voice into text via Whisper model (disabled, please put your own mp3 file with voice)\nThe old way of using OpenAI conversational model via text-davinci-003\nExamples 1.x. Simple ChatGPT calls. Text completion and text edit\nExample 2. Provide system and user content into ChatGPT\nExamples 3.x. Create system / user / assistanc content via Code Node. Promtp chaining technique example\nExample 4. Generate code via ChatGPT\nExample 5. Return multiple answers. Useful for providing picking the most relevant reply",
"isPaid": false
},
{
"templateId": "3655",
"templateName": "Create Animated Stories using GPT-4o-mini, Midjourney, Kling and Creatomate API",
"templateDescription": "What does the workflow do? This workflow is designed to generate high-quality short videos, primarily uses GPT-4o-mini (unofficial), Midjourney (unofficial)...",
"templateUrl": "https://n8n.io/workflows/3655",
"jsonFileName": "Create_Animated_Stories_using_GPT-4o-mini_Midjourney_Kling_and_Creatomate_API.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Create_Animated_Stories_using_GPT-4o-mini_Midjourney_Kling_and_Creatomate_API.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/11814c518c0454d39b62893a97e580a8/raw/071985722cf5c0b7267dd8a477764eed3289a79b/Create_Animated_Stories_using_GPT-4o-mini_Midjourney_Kling_and_Creatomate_API.json",
"screenshotURL": "https://i.ibb.co/r2jr65r0/74a35086a498.png",
"workflowUpdated": true,
"gistId": "11814c518c0454d39b62893a97e580a8",
"templateDescriptionFull": "This workflow is designed to generate high-quality short videos, primarily uses GPT-4o-mini (unofficial), Midjourney (unofficial) and Kling (unofficial) APIs from PiAPI and Creatomate API mainly for content creator, social media bloggers and short-form video creators. Through this short video workflow, users can quickly validate their creative ideas and focus more on enhancing the quality of their video concepts.\n\nSocial Media Influencers: produce content videos based on inspiration efficiently.\nVloggers: generate vlogs based on inspiration.\nEducational Creators: explain specific topics via animated short videos or demonstrate a specific imagined scenario to students for enhanced educational impact.\nAdvertising Agencies: generate short videos based on specific products.\nAI Tool Developers: automatically generate product demo videos.\n\nFill in X-API-key of PiAPI account in Basic Params node.\nFill in the scenario of the image and video prompt.\nSet a video template on Creatomate and make an API call in the final node with core and processing modules provided in Creatomate. Before full video generation, you can first use basic assets in Creatomate for a prototype demo, then integrate with n8n after verifying the expected results.\nFill in your Creatomate account settings following the image guildline.\nClick Test Workflow and wait for a generation (within 10~20min).\n\nIn this workflow, we've established a basic structure for image-to-video generation with subtitle integration. You can further enhance it by adding music nodes using either PiAPI's audio models or your preferred music solution. All video elements will ultimately be composited through Creatomate. For best practice, please refer to PiAPI's official API documentation or Creatomate's API documentation to comprehend more use cases.\n\nParams Settings\n\nstyle: a children’s book cover, ages 6-10. 
--s 500 --sref 4028286908 --niji 6\ncharacter: A gentle girl and a fluffy rabbit explore a sunlit forest together, playing by a sparkling stream\nsituational_keywords: Butterflies flutter around them as golden sunlight filters through green leaves. Warm and peaceful atmosphere\n\nOutput Video\n<video src=\"https://static.piapi.ai/n8n-instruction/short-video/example1.mp4\" controls />",
"isPaid": false
},
{
"templateId": "3627",
"templateName": "Generate Graphic Wallpaper with Midjourney, GPT-4o-mini and Canvas APIs",
"templateDescription": "Who is the template for?This workflow is specifically designed for content creators and social media professionals, enabling Instagram and X (Twitter)...",
"templateUrl": "https://n8n.io/workflows/3627",
"jsonFileName": "Generate_Graphic_Wallpaper_with_Midjourney_GPT-4o-mini_and_Canvas_APIs.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Generate_Graphic_Wallpaper_with_Midjourney_GPT-4o-mini_and_Canvas_APIs.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/79e38cf78114aed374a98a92929a41bb/raw/8e8cb36e86dc638b8d94315c4885b2fe3b976a9c/Generate_Graphic_Wallpaper_with_Midjourney_GPT-4o-mini_and_Canvas_APIs.json",
"screenshotURL": "https://i.ibb.co/7dQGg67s/787bbb8520dc.png",
"workflowUpdated": true,
"gistId": "79e38cf78114aed374a98a92929a41bb",
"templateDescriptionFull": "This workflow is specifically designed for content creators and social media professionals, enabling Instagram and X (Twitter) influencers to produce highly artistic visual posts, empowering marketing teams to quickly generate event promotional graphics, assisting blog authors in creating featured images and illustrations, and helping knowledge-based creators transform key insights into easily shareable card visuals.\n\nFill in your API key from PiAPI.\nFill in Basic Params Node following the sticky note guidelines.\nSet up a design template in Canvas Switchboard.\nMake a simple template in Switchboard.\nClick Crul and get the API code to fill in JSON of Design in Canvas.\nClick Test Workflow and get a url result.\n\nHere we will provide some setting examples to help users find a proper way to use this workflow. User could change these settings based on specific purposes.\n\nBasic Params Setting:\n\ntheme: Hope\nscenario: Don't know about the future, confused and feel lost with tech-development.\nstyle: Cinematic Grandeur, Sci-Tech Aesthetic, 3D style\nexample: 1. March. Because of your faith, it will happen. 2. Something in me will save me. 3. To everyone carrying a heavy heart in silence. You are going to be okay. 4. Tomorrow will be better.\nimage prompt: A cinematic sci-fi metropolis where Deep Neural Nets control a hyper-connected society. Holographic interfaces glow in the air as robotic agents move among humans, symbolizing Industry 4.0. The scene contrasts organic human emotion with cold machine precision, rendered in a hyper-realistic 3D style with futuristic lighting. Epic wide shots showcase the grandeur of this civilization’s industrial evolution.\n\nOutput Image:",
"isPaid": false
},
{
"templateId": "3626",
"templateName": "Motion-illustration Workflow Generated with Midjourney and Kling API",
"templateDescription": "What does the workflow do?This workflow is primarily designed to generate animated illustrations for content creators and social media professionals with...",
"templateUrl": "https://n8n.io/workflows/3626",
"jsonFileName": "Motion-illustration_Workflow_Generated_with_Midjourney_and_Kling_API.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Motion-illustration_Workflow_Generated_with_Midjourney_and_Kling_API.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/7f19d86120add94818ac36ef65fc8a38/raw/c314360fbbb903ff529725b15822ea724ac86130/Motion-illustration_Workflow_Generated_with_Midjourney_and_Kling_API.json",
"screenshotURL": "https://i.ibb.co/j9bcPV7Z/91ec26358f0f.png",
"workflowUpdated": true,
"gistId": "7f19d86120add94818ac36ef65fc8a38",
"templateDescriptionFull": "This workflow is primarily designed to generate animated illustrations for content creators and social media professionals with Midjourney (unoffcial) and Kling (unofficial) API served by PiAPI.\nPiAPI is an API platform which provides professional API service. With service provided by PiAPI, users could generate a fantastic animated artwork simply using workflow on n8n without complex settings among various AI models.\n\nAn animated illustration is a digitally enhanced artwork that combines traditional illustration styles with subtle, purposeful motion to enrich storytelling while preserving its original artistic essence.\n\nSocial Media Content Creators: Produces animated illustrations for social media posts.\nDigital Marketers: Generates marketing materials with motion graphics.\nIndependent Content Producers: Creates animated content without specialized animation skills.\n\nTo simplify workflow settings, usually users just need to change basic prompt of the image and the motion of the final video following the instrution below:\n\nSign in your PiAPI account and get your X-API-Key.\nFill in your X-API-Key of PiAPI account in Midjourney and Kling nodes.\nEnter your desired image prompt in the Prompt node.\nEnter the motion prompt in Kling Video Generator node.\n\nFor more complex or customization settings, users could also add more nodes to get more output images and generate more videos. Also, they could change the target image to gain a better result. As for recommendation, users could change the video models for which we would recommend live-wallpaper LoRA of Wanx. Users could check API doc to see more use cases of video models and image models for best practice.\n\nInput Prompt\nA gentle girl and a fluffy rabbit explore a sunlit forest together, playing by a sparkling stream. Butterflies flutter around them as golden sunlight filters through green leaves. Warm and peaceful atmosphere, 4K nature documentary style. 
--s 500 --sref 4028286908 --niji 6\n\nOutput Video\n<video src=\"https://static.piapi.ai/n8n-instruction/motion-illustration/example1.mp4\" controls />\n\nCheck that the X-API-Key has been filled in all required nodes.\nCheck Task History in PiAPI for more details about task status.\n\n<video src=\"https://static.piapi.ai/n8n-instruction/motion-illustration/example2.mp4\" controls />",
"isPaid": false
},
{
"templateId": "2417",
"templateName": "template_2417",
"templateDescription": "Easily generate images with Black Forest's Flux Text-to-Image AI models using Hugging Face’s Inference API. This template serves a webform where you can...",
"templateUrl": "https://n8n.io/workflows/2417",
"jsonFileName": "template_2417.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_2417.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/01f05e0de48738e56052e4ac1f69eb26/raw/7603bf76877ea360211abb12d0e1528bdb6114aa/template_2417.json",
"screenshotURL": "https://i.ibb.co/WpDTqK9X/1d4f87bc9278.png",
"workflowUpdated": true,
"gistId": "01f05e0de48738e56052e4ac1f69eb26",
"templateDescriptionFull": "Easily generate images with Black Forest's Flux Text-to-Image AI models using Hugging Face’s Inference API. This template serves a webform where you can enter prompts and select predefined visual styles that are customizable with no-code. The workflow integrates seamlessly with Hugging Face's free tier, and it’s easy to modify for any Text-to-Image model that supports API access.\n\nCurious what this template does? Try a public version here: https://devrel.app.n8n.cloud/form/flux\n\nWatch this quick set up video 👇\n\nHuggingface.co account (free)\nCloudflare.com account (free - used for storage; but can be swapped easily e.g. GDrive)\n\nText-to-Image Creation: Generates unique visuals based on your prompt and style.\nHugging Face Integration: Utilizes Hugging Face’s Inference API for reliable image generation.\nCustomizable Visual Styles: Select from preset styles or easily add your own.\nAdaptable: Swap in any Hugging Face Text-to-Image model that supports API calls.\n\nCreators: Rapidly create visuals for projects.\nMarketers: Prototype campaign visuals.\nDevelopers: Test different AI image models effortlessly.\n\nYou submit an image prompt via the webform and select a visual style, which appends style instructions to your prompt. The Hugging Face Inference API then generates and returns the image, which gets hosted on Cloudflare S3. The workflow can be easily adjusted to use other models and styles for complete flexibility.",
"isPaid": false
},
{
"templateId": "2738",
"templateName": "Transform Image to Lego Style Using Line and Dall-E",
"templateDescription": "Who is this for?This workflow is designed for:Content creators, artists, or hobbyists looking to experiment with AI-generated art.Small business owners or...",
"templateUrl": "https://n8n.io/workflows/2738",
"jsonFileName": "Transform_Image_to_Lego_Style_Using_Line_and_Dall-E.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Transform_Image_to_Lego_Style_Using_Line_and_Dall-E.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/27fbefba6d9ddbd2beeb7a53df7b7064/raw/c3594f3bd06869d184970867f25a2a44bd02e1bf/Transform_Image_to_Lego_Style_Using_Line_and_Dall-E.json",
"screenshotURL": "https://i.ibb.co/DDF07j3p/6dadf928bbd1.png",
"workflowUpdated": true,
"gistId": "27fbefba6d9ddbd2beeb7a53df7b7064",
"templateDescriptionFull": "This workflow is designed for:\n\nContent creators, artists, or hobbyists looking to experiment with AI-generated art.\nSmall business owners or marketers using LEGO-style designs for branding or promotions.\nDevelopers or AI enthusiasts wanting to automate image transformations through messaging platforms like LINE.\n\nSimplifies the process of creating custom AI-generated LEGO-style images.\nAutomates the manual effort of transforming user-uploaded images into AI-generated artwork.\nBridges the gap between messaging platforms (LINE) and advanced AI tools (DALL·E).\nProvides a seamless system for users to upload an image and receive an AI-transformed output without technical expertise.\n\nImage Upload via LINE:\n\nUsers send an image to the LINE chatbot.\nUsers send an image to the LINE chatbot.\nAI-Powered Prompt Creation:\n\nGPT generates a prompt to describe the uploaded image for LEGO-style conversion.\nGPT generates a prompt to describe the uploaded image for LEGO-style conversion.\nAI Image Generation:\n\nDALL·E 3 processes the prompt and creates a LEGO-style isometric image.\nDALL·E 3 processes the prompt and creates a LEGO-style isometric image.\nImage Delivery:\n\nThe generated image is returned to the user in LINE.\nThe generated image is returned to the user in LINE.\n\nLINE Developer Account with API credentials.\nAccess to OpenAI API with DALL·E and GPT-4 capabilities.\nA configured n8n instance to run this workflow.\n\nEnvironment Setup:\n\nAdd your LINE API Token and OpenAI credentials as environment variables (LINE_API_TOKEN, OPENAI_API_KEY) in n8n.\nAdd your LINE API Token and OpenAI credentials as environment variables (LINE_API_TOKEN, OPENAI_API_KEY) in n8n.\nConfigure LINE Webhook:\n\nPoint the LINE webhook to your n8n instance.\nPoint the LINE webhook to your n8n instance.\nConnect OpenAI:\n\nSet up OpenAI API credentials in the workflow nodes for GPT-4 and DALL·E.\nSet up OpenAI API credentials in the workflow 
nodes for GPT-4 and DALL·E.\nTest Workflow:\n\nUpload a sample image in LINE and ensure it returns the LEGO-style AI image.\nUpload a sample image in LINE and ensure it returns the LEGO-style AI image.\n\nLocalization:\n\nModify response messages in LINE to fit your audience's language and tone.\nModify response messages in LINE to fit your audience's language and tone.\nIntegration:\n\nAdd nodes to send notifications through other platforms like Slack or email.\nAdd nodes to send notifications through other platforms like Slack or email.\nImage Style:\n\nReplace the LEGO-style image prompt with other artistic styles or themes.\nReplace the LEGO-style image prompt with other artistic styles or themes.\n\nArt Contests:\n\nUsers upload images and receive AI-enhanced outputs for community voting or branding.\nUsers upload images and receive AI-enhanced outputs for community voting or branding.\nMarketing Campaigns:\n\nQuickly generate creative visual content for ads and promotions using customer-submitted photos.\nQuickly generate creative visual content for ads and promotions using customer-submitted photos.\nEducation:\n\nUse the workflow to teach students about AI-generated art and automation through a hands-on approach.\nUse the workflow to teach students about AI-generated art and automation through a hands-on approach.\n\nError Handling:\n\nAdd fallback nodes to handle invalid images or API errors gracefully.\nAdd fallback nodes to handle invalid images or API errors gracefully.\nLogging:\n\nImplement a logging mechanism to track requests and outputs for debugging and analytics.\nImplement a logging mechanism to track requests and outputs for debugging and analytics.\nScalability:\n\nUse queue-based systems or cloud scaling to handle large volumes of image requests.\nUse queue-based systems or cloud scaling to handle large volumes of image requests.\n\nAdd sticky notes in n8n to provide inline instructions for configuring each node.\nCreate a tutorial video or 
documentation for first-time users to set up and customize the workflow.\nInclude advanced filters to allow users to select from multiple styles beyond LEGO (e.g., pixel art, watercolor).\n\nThis workflow enables seamless interaction between messaging platforms and advanced AI capabilities, making it highly versatile for various creative and business applications.",
"isPaid": false
},
{
"templateId": "3531",
"templateName": "Convert YouTube Videos into SEO Blog Posts",
"templateDescription": "Workflow Description This workflow helps content creators automatically repurpose YouTube videos into SEO-friendly blog posts. It extracts the video...",
"templateUrl": "https://n8n.io/workflows/3531",
"jsonFileName": "Convert_YouTube_Videos_into_SEO_Blog_Posts.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Convert_YouTube_Videos_into_SEO_Blog_Posts.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/01ac546aaaa050ae46618695c3e6cac4/raw/1c268996df055660ab1d8ed6dbc91c2d87b33a9b/Convert_YouTube_Videos_into_SEO_Blog_Posts.json",
"screenshotURL": "https://i.ibb.co/XfZbc5kC/a54494964ab6.png",
"workflowUpdated": true,
"gistId": "01ac546aaaa050ae46618695c3e6cac4",
"templateDescriptionFull": "This workflow helps content creators automatically repurpose YouTube videos into SEO-friendly blog posts. It extracts the video transcript, uses AI to generate a full blog post with a relevant image, and sends the complete package via email, ready for publication.\n\nThis workflow relies on external AI services. You will need:\n\nOpenAI Account: Used for generating the blog post text (specifically mentioned using GPT-4o in the workflow notes).\n\nCredentials: Requires an API key from OpenAI.\nCost: OpenAI API usage is typically paid based on the amount of text processed (tokens). Check OpenAI's current pricing.\nSetup: Sign up at OpenAI and obtain your API key.\nCredentials: Requires an API key from OpenAI.\nCost: OpenAI API usage is typically paid based on the amount of text processed (tokens). Check OpenAI's current pricing.\nSetup: Sign up at OpenAI and obtain your API key.\nDumpling AI Account: Used for retrieving YouTube video transcript and generating the blog post image.\n\nCredentials: Requires an API key from Dumpling AI.\nCost: Dumpling AI offers 250 free credits to start with and different plans for different levels of usage. Check the pricing page for more details.\nSetup: Sign up at Dumpling AI and obtain your API key/credentials.\nCredentials: Requires an API key from Dumpling AI.\nCost: Dumpling AI offers 250 free credits to start with and different plans for different levels of usage. 
Check the pricing page for more details.\nSetup: Sign up at Dumpling AI and obtain your API key/credentials.\nEmail Account: Credentials for the email service (e.g., Gmail) used to send the final result.\n\nInput Video Details: You provide the YouTube video URL and your email address.\nGet Transcript: The workflow fetches the transcript of the specified YouTube video.\nGenerate Content: An AI model crafts a blog post (title, description, body) based on the transcript.\nCreate Image: Another AI model generates a suitable image for the blog post.\nFormat & Package: The blog post is converted to HTML, and the image is prepared for sending.\nEmail Result: The final HTML blog post and image are emailed to you.\n\n\n\nConfigure Variables: Enter the specific YouTube video URL and the recipient email address in the \"Set Variables\" node.\nConnect Credentials: Add your credentials for the services used (e.g., OpenAI for text generation, Dumpling AI for YouTube Transcript and AI image generation service).\nConnect Email Credentials: Authenticate your Gmail account (or chosen email provider) to allow the workflow to send the email.\n\nDirect Publishing: Instead of emailing the result, connect directly to your CMS (like WordPress, Ghost, Webflow) to automatically create a draft or publish the blog post.\nAI Agent Integration: Replace the single \"Generate Blog Post\" step with an AI Agent for more sophisticated content generation, potentially researching topics or structuring the post section by section based on the transcript.\nSocial Media Snippets: Add steps to generate companion social media posts (e.g., for Twitter, LinkedIn) summarizing the blog post.\nBatch Processing: Modify the trigger to read multiple YouTube URLs from a spreadsheet or database to convert videos in bulk.\nEnhanced SEO: Refine the AI prompts to specifically target keywords or incorporate SEO best practices more deeply into the generated content.\nMultiple Image Options: Generate several image 
variations and include them in the email or draft post for selection.",
"isPaid": false
},
{
"templateId": "4916",
"templateName": "Generate AI Contextualize Images with FLUX Kontext",
"templateDescription": "This workflow automates the generation of AI-enhanced, contextualized images using FLUX Kontext, based on prompts stored in a Google Sheet. The generated...",
"templateUrl": "https://n8n.io/workflows/4916",
"jsonFileName": "Generate_AI_Contextualize_Images_with_FLUX_Kontext.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Generate_AI_Contextualize_Images_with_FLUX_Kontext.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/892c6019ae4cb9949aabb5d4e848b8f6/raw/e4b1e5ee51d84f1547de12cb0140bd34a1af38d6/Generate_AI_Contextualize_Images_with_FLUX_Kontext.json",
"screenshotURL": "https://i.ibb.co/whgdF6Kd/e291961b3e25.png",
"workflowUpdated": true,
"gistId": "892c6019ae4cb9949aabb5d4e848b8f6",
"templateDescriptionFull": "This workflow automates the generation of AI-enhanced, contextualized images using FLUX Kontext, based on prompts stored in a Google Sheet. The generated images are then saved to Google Drive, and their URLs are written back to the spreadsheet for easy access.\n\nImage:\n\nPrompt:\nThe girl is lying on the bed and sleeping\n\nResult:\n\nThis workflow is especially useful for e-commerce businesses:\n\nGenerate product images with dynamic backgrounds based on the use-case or season.\nCreate contextual marketing visuals for ads, newsletters, or product pages.\nScale visual content creation without the need for manual design work.\n\nTrigger: The workflow can be started manually (via \"Test workflow\") or scheduled at regular intervals (e.g., every 5 minutes) using the \"Schedule Trigger\" node.\nData Fetch: The \"Get new image\" node retrieves a row from a Google Sheet where the \"RESULT\" column is empty. It extracts the prompt, image URL, output format, and aspect ratio for processing.\nImage Generation: The \"Create Image\" node sends a request to the FLUX Kontext API (fal.run) with the provided parameters to generate a new AI-contextualized image.\nStatus Check: The workflow waits 60 seconds (\"Wait 60 sec.\" node) before checking the status of the image generation request via the \"Get status\" node. 
If the status is \"COMPLETED,\" it proceeds; otherwise, it loops back to wait.\nResult Handling: Once completed, the \"Get Image Url\" node fetches the generated image URL, which is then downloaded (\"Get Image File\"), uploaded to Google Drive (\"Upload Image\"), and the Google Sheet is updated with the result (\"Update result\").\n\nTo configure this workflow, follow these steps:\n\nGoogle Sheet Setup:\n\nCreate a Google Sheet with columns for PROMPT, IMAGE URL, ASPECT RATIO, OUTPUT FORMAT, and RESULT (leave this empty).\nLink the sheet in the \"Get new image\" and \"Update result\" nodes.\nCreate a Google Sheet with columns for PROMPT, IMAGE URL, ASPECT RATIO, OUTPUT FORMAT, and RESULT (leave this empty).\nLink the sheet in the \"Get new image\" and \"Update result\" nodes.\nAPI Key Configuration:\n\nSign up at fal.ai to obtain an API key.\nIn the \"Create Image\" node, set the Header Auth with:\n\nName: Authorization\nValue: Key YOURAPIKEY\nSign up at fal.ai to obtain an API key.\nIn the \"Create Image\" node, set the Header Auth with:\n\nName: Authorization\nValue: Key YOURAPIKEY\nName: Authorization\nValue: Key YOURAPIKEY\nGoogle Drive Setup:\n\nSpecify the target folder ID in the \"Upload Image\" node where generated images will be saved.\nSpecify the target folder ID in the \"Upload Image\" node where generated images will be saved.\nSchedule Trigger (Optional):\n\nAdjust the \"Schedule Trigger\" node to run the workflow at desired intervals (e.g., every 5 minutes).\nAdjust the \"Schedule Trigger\" node to run the workflow at desired intervals (e.g., every 5 minutes).\nTest Execution:\n\nRun the workflow manually via the \"Test workflow\" node to verify all steps function correctly.\nRun the workflow manually via the \"Test workflow\" node to verify all steps function correctly.\n\nOnce configured, the workflow will automatically process pending prompts, generate images, and update the Google Sheet with results.\n\nContact me for consulting and support or 
add me on Linkedin.",
"isPaid": false
},
{
"templateId": "3416",
"templateName": "Youtube Shorts Generator",
"templateDescription": "Automated Video Creation Workflow Using n8n This workflow automates the creation and publishing of animated videos based on ideas listed in a Google Sheet....",
"templateUrl": "https://n8n.io/workflows/3416",
"jsonFileName": "Youtube_Shorts_Generator.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Youtube_Shorts_Generator.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/54bfc8444c420e375df3b9edd2fda205/raw/49802240d6d08aec773fe160b00b1d837b2e500d/Youtube_Shorts_Generator.json",
"screenshotURL": "https://i.ibb.co/35Dfj6nH/6e8dee3fb093.png",
"workflowUpdated": true,
"gistId": "54bfc8444c420e375df3b9edd2fda205",
"templateDescriptionFull": "This workflow automates the creation and publishing of animated videos based on ideas listed in a Google Sheet. It processes one idea at a time, generating text prompts, images, animations, sound effects, and merging them into a final video before uploading it to YouTube.\n\nPre-conditions and Requirements\nGoogle Sheets Setup\nStep-by-Step Workflow Explanation\nCustomization Guide\n\nTo run this workflow, you'll need API access to the following services:\n\nAnthropic Claude or Google Gemini (for text prompt generation)\nFlux AI (RapidAPI) (for AI-generated images)\nRunwayML (API Documentation) (for AI video animation)\nElevenLabs (for AI-generated voiceovers and sound effects)\nCreatomate (Website) (for video/audio merging and rendering)\nYouTube API (for video upload and posting)\n\nUse cloud (n8n.io) or Install and run n8n (Official Guide)\nSet up credentials for each API in n8n’s settings\n\nBefore running the workflow, ensure your Google Sheet is structured as follows:\n\nThe workflow retrieves the first row where videoStatus = \"To Do\".\nMarks it as Processing to avoid duplicate processing.\n\nUses Anthropic Claude or Google Gemini to generate prompts.\n\nSends the prompt to Flux AI to create a high-quality image.\n\nThe generated image is sent to RunwayML, which animates the image.\n\nElevenLabs produces a realistic narration based on the video content.\nBackground sound effects (e.g., storm sounds, fire crackling) are also generated.\n\nCreatomate compiles the animated video with the audio.\n\nThe finalized video is automatically uploaded to YouTube using the YouTube API.\n\nMarks videoStatus as Created.\nMarks publishStatus as Processed.\n\nUpdate the style column in Google Sheets with custom animation preferences (e.g., cinematic, slow-motion).\nModify the prompt generation step in n8n to incorporate different styles.\n\nAdjust the RunwayML settings to control animation speed and length.\nModify the Creatomate rendering step 
to adjust clip duration.\n\nModify the Creatomate step to include AI-generated subtitles from ElevenLabs' text output.\n\nAdd additional steps to post to TikTok, Instagram, or Facebook using their respective APIs.\n\nThis workflow ensures a fully automated video creation pipeline, reducing manual effort and optimizing content production. 🚀",
"isPaid": false
},
{
"templateId": "3121",
"templateName": "AI Automated TikTok/Youtube Shorts/Reels Generator",
"templateDescription": "Who is this for?Content creators, digital marketers, and social media managers who want to automate the creation of short-form videos for platforms like...",
"templateUrl": "https://n8n.io/workflows/3121",
"jsonFileName": "AI_Automated_TikTok_Youtube_Shorts_Reels_Generator.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/AI_Automated_TikTok_Youtube_Shorts_Reels_Generator.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/027d27b7a1063d5a19f32383ad4f6483/raw/ed5987f4a3e606e6b874d45829be9b5eff1d0427/AI_Automated_TikTok_Youtube_Shorts_Reels_Generator.json",
"screenshotURL": "https://i.ibb.co/DPg4GHgs/e307f8f89241.png",
"workflowUpdated": true,
"gistId": "027d27b7a1063d5a19f32383ad4f6483",
"templateDescriptionFull": "Content creators, digital marketers, and social media managers who want to automate the creation of short-form videos for platforms like TikTok, YouTube Shorts, and Instagram Reels without extensive video editing skills.\n\nCreating engaging short-form videos consistently is time-consuming and requires multiple tools and skills. This workflow automates the entire process from ideation to publishing, significantly reducing the manual effort needed while maintaining content quality.\n\nThis all-in-one solution transforms ideas into fully produced short-form videos through a 5-step process:\n\nGenerate video captions from ideas stored in a Google Sheet\nCreate AI-generated images using Flux and the OpenAI API\nConvert images to videos using Kling's API\nAdd voice-overs to your content with Eleven Labs\nComplete the video production with Creatomate by adding templates, transitions, and combining all elements\n\nThe workflow handles everything from sourcing content ideas to rendering the final video, and even notifies you on Discord when videos are ready.\n\nBefore getting started, you'll need:\n\nn8n installation (tested on version 1.81.4)\nOpenAI API Key (free trial credits available)\nPiAPI (free trial credits available)\nEleven Labs (free account)\nCreatomate API Key (free trial credits available)\nGoogle Sheets API enabled in Google Cloud Console\nGoogle Drive API enabled in Google Cloud Console\nOAuth 2.0 Client ID and Client Secret from your Google Cloud Console Credentials\n\nAdjust the Google Sheet structure to include additional data like video length, duration, style, etc.\nModify the prompt templates for each AI service to match your brand voice and content style\nUpdate the Creatomate template to reflect your visual branding\nConfigure notification preferences in Discord to manage your workflow\n\nThis workflow combines multiple AI technologies to create a seamless content production pipeline, saving you hours of work per video 
and allowing you to focus on strategy rather than production.",
"isPaid": false
},
{
"templateId": "3501",
"templateName": "💥AI Social Video Generator with GPT-4, Kling & Blotato —Auto-Post to Instagram, Facebook,, TikTok, Twitter & Pinterest - vide",
"templateDescription": "Workflow Screenshot AI-Powered Social Video Generator with Auto-Posting to Instagram, TikTok, YouTube, Facebook, LinkedIn, Threads, Pinterest, Twitter (X),...",
"templateUrl": "https://n8n.io/workflows/3501",
"jsonFileName": "AI_Social_Video_Generator_with_GPT-4_Kling__Blotato_Auto-Post_to_Instagram_Facebook_TikTok_Twitter__Pinterest_-_vide.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/AI_Social_Video_Generator_with_GPT-4_Kling__Blotato_Auto-Post_to_Instagram_Facebook_TikTok_Twitter__Pinterest_-_vide.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/371e5c0a866187a2de96a2194c98b28f/raw/e9cf7c80ccffd508a93b0cf4d70faab024ca4148/AI_Social_Video_Generator_with_GPT-4_Kling__Blotato_Auto-Post_to_Instagram_Facebook_TikTok_Twitter__Pinterest_-_vide.json",
"screenshotURL": "https://i.ibb.co/fs5MGb0/4d8a7c9e70b6.png",
"workflowUpdated": true,
"gistId": "371e5c0a866187a2de96a2194c98b28f",
"templateDescriptionFull": "This workflow is ideal for content creators, marketers, social media managers, and automation enthusiasts who want to generate, customize, and publish short-form videos across multiple platforms without manual editing or posting. If you use tools like ChatGPT, Kling, or Blotato and want to streamline your content creation process, this workflow is made for you.\n\nPublishing regular video content on multiple platforms is time-consuming—especially when adding voice-overs, captions, and managing distribution. This workflow solves that by:\n\nAutomating video generation using AI (Kling + GPT-4)\nAdding realistic voice narration\nStyling subtitles for social media\nCreating titles and social captions\nAuto-posting to Instagram, TikTok, YouTube, Facebook, Threads, Twitter (X), LinkedIn, Pinterest, and Bluesky\n\nAll of this is triggered by a simple message sent via Telegram.\n\nThis end-to-end automation transforms a short Telegram message into a fully produced and published social video:\n\nReceives a text prompt from Telegram\nTransforms it into a detailed video scene using GPT-4\nGenerates a cinematic video with Kling\nCreates a voice-over script and converts it to audio\nMerges the video and the audio\nAdds styled captions\nWrites a social caption and an SEO-optimized title\nSaves metadata to Google Sheets\nSends a preview via Telegram\nPublishes the video to 9 social platforms using Blotato\n\nConnect your Telegram bot to the \"Telegram Trigger\" node\nAdd your OpenAI API key to all GPT-related nodes\nConfigure Kling API access in the \"Generate Video\" node\nLink your Cloudinary account for audio upload\nConnect JSON2Video to handle video merging and captioning\nSet up Google Sheets with your preferred spreadsheet ID\nConnect your Blotato API key and fill in the platform IDs (Instagram, TikTok, etc.)\nTest by sending a Telegram message like:\ngenerate video A robot exploring Mars, Why AI will reshape humanity\n\nChange the visual style: 
Adjust the GPT-4 prompt formatting to reflect your brand tone\nEdit voice style: Replace the TTS provider or tweak OpenAI's voice settings\nRevise captions or titles: Fine-tune the system prompts in the \"Create Description\" or \"Create Title\" nodes\nTarget fewer platforms: Disable or remove nodes for platforms you don’t use\nAdd approval steps: Insert a Telegram confirmation step before auto-publishing\n\n📄 Documentation: Notion Guide\n\n🎥 Watch the full tutorial here: YouTube Demo",
"isPaid": false
},
{
"templateId": "1306",
"templateName": "template_1306",
"templateDescription": "This easy-to-extend workflow automatically serves a static HTML page when a URL is accessed in a browser. Prerequisites Basic knowledge of HTML Nodes...",
"templateUrl": "https://n8n.io/workflows/1306",
"jsonFileName": "template_1306.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_1306.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/57cb2471b580e46e7e04bfcb2204c6db/raw/3018d6392f4f2d04287a47f8708600c546b1debf/template_1306.json",
"screenshotURL": "https://i.ibb.co/ymKHbdXd/6094e71ca80a.png",
"workflowUpdated": true,
"gistId": "57cb2471b580e46e7e04bfcb2204c6db",
"templateDescriptionFull": "This easy-to-extend workflow automatically serves a static HTML page when a URL is accessed in a browser.\n\nBasic knowledge of HTML\n\nWebhook node triggers the workflow on an incoming request.\nRespond to Webhook node serves the HTML page in response to the webhook.",
"isPaid": false
},
{
"templateId": "3438",
"templateName": "AutoQoutesV2_template",
"templateDescription": "⚠️ Important Disclaimer:This template is only compatible with a self-hosted n8n instance using a community node.Who is this for?This workflow is ideal for...",
"templateUrl": "https://n8n.io/workflows/3438",
"jsonFileName": "AutoQoutesV2_template.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/AutoQoutesV2_template.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/fc1e140ee0d0bff10adb5390a464446a/raw/f895640f84b09ef0347845298ffa5b32c2b6e9fd/AutoQoutesV2_template.json",
"screenshotURL": "https://i.ibb.co/h14mLMWP/08c0de9b718b.png",
"workflowUpdated": true,
"gistId": "fc1e140ee0d0bff10adb5390a464446a",
"templateDescriptionFull": "This template is only compatible with a self-hosted n8n instance using a community node.\n\nThis workflow is ideal for digital content creators, marketers, social media managers, and automation enthusiasts who want to produce fully automated vertical video content featuring inspirational or motivational quotes. Specifically tailored for Thai language, it effectively demonstrates integration of AI-generated imagery, video, ambient sound, and visually appealing quote overlays.\n\nManually creating high-quality, vertically formatted quote videos is often repetitive, time-consuming, and involves multiple tedious steps like selecting suitable visuals, editing audio tracks, and correctly overlaying text. Additionally, manual uploading to platforms like YouTube and maintaining accurate content records are prone to errors and inefficiencies.\n\nFetches a quote, author, and scenic background description from a Google Sheet.\nAutomatically generates a vertical background image using the Flux AI (txt2img) API.\nTransforms the AI-generated image into a subtly animated cinematic vertical video using the Kling video-generation API.\nGenerates an immersive, ambient background sound using ElevenLabs’ sound generation API.\nDynamically overlays the selected Thai-language quote and author text onto the generated video using FFmpeg, ensuring visually appealing typography (e.g., Kanit font).\nAutomatically uploads the final video to YouTube.\nUpdates the resulting YouTube video URL back to the Google Sheet, keeping your content records current and well-organized.\n\nThis workflow requires a self-hosted n8n instance, as the execution of FFmpeg commands is not supported on n8n Cloud. 
Ensure FFmpeg is installed on your self-hosted environment.\nAPI keys and accounts setup for Flux, Kling, ElevenLabs, Google Sheets, Google Drive, and YouTube.\n\nYour Google Sheet must include these columns:\n\nIndex\tUnique identifier for each quote\nQuote (Thai)\tQuote text in Thai language (or your chosen language)\nPen Name (Thai)\tAuthor or pen name of the quote's creator\nBackground (EN)\tShort English description of the scene (e.g., \"sunrise over mountains\")\nPrompt (EN)\tDetailed English prompt describing the image/video scene (e.g., \"peaceful sunrise with misty mountains\")\nBackground Image\tURL of AI-generated image (updated automatically)\nBackground Video\tURL of generated video (updated automatically)\nMusic Background\tURL of generated ambient audio (updated automatically)\nVideo Status\tYouTube URL (updated automatically after upload)\nA ready-to-use Google Sheets template is provided [here (provide your actual link)].\n\nTo help you get started quickly, you can use this template spreadsheet.\n\nAuthenticate Google Sheets, Google Drive, YouTube API, Flux AI, Kling API, and ElevenLabs API within n8n.\nEnsure FFmpeg supports fonts compatible with your chosen language (for Thai, \"Kanit\" font is recommended).\nPrepare your Google Sheets with desired quotes, authors, and image/video prompts.\n\nFonts: Adjust font type, size, color, and positioning within the provided FFmpeg commands in the workflow’s code nodes. 
Verify that selected fonts properly support your target language.\nMedia Customization: Customize the scene descriptions in your Google Sheet to change image/video backgrounds automatically generated by AI.\nQuote Management: Easily manage, add, or update quotes and associated details directly via Google Sheets without workflow modifications.\nAudio Ambiance: Customize or adjust the ambient sound prompt for ElevenLabs within the workflow’s HTTP Request node to match your video's desired mood.\n\nLeveraging AI-generated visual and audio elements along with localized fonts greatly enhances audience engagement by creating visually appealing, professional-quality content tailored specifically for your target audience. This automated workflow drastically reduces production time and manual effort, enabling rapid, consistent content creation optimized for platforms such as YouTube Shorts, Instagram Reels, and TikTok.",
"isPaid": false
},
{
"templateId": "5228",
"templateName": "GoogleVertex_template",
"templateDescription": "Who’s it forThis template is perfect for content creators, AI enthusiasts, marketers, and developers who want to automate the generation of cinematic videos...",
"templateUrl": "https://n8n.io/workflows/5228",
"jsonFileName": "GoogleVertex_template.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/GoogleVertex_template.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/4668b276f01246c492276a386fea4dac/raw/4688f19100d431642917240ac3d891598170985d/GoogleVertex_template.json",
"screenshotURL": "https://i.ibb.co/GQDgq0Dc/90be2ee73378.png",
"workflowUpdated": true,
"gistId": "4668b276f01246c492276a386fea4dac",
"templateDescriptionFull": "This template is perfect for content creators, AI enthusiasts, marketers, and developers who want to automate the generation of cinematic videos using Google Vertex AI’s Veo 3 model. It’s also ideal for anyone experimenting with generative AI for video using n8n.\n\n\n\nThis workflow:\n\nAccepts a text prompt and a GCP access token via form.\nSends the prompt to the Veo 3 (preview model) using Vertex AI’s predictLongRunning endpoint.\nWaits for the video rendering to complete.\nFetches the final result and converts the base64-encoded video to a file.\nUploads the resulting .mp4 to your Google Drive.\n\n\n\n\n\nEnable Vertex AI API in your GCP project:\nhttps://console.cloud.google.com/marketplace/product/google/aiplatform.googleapis.com\nAuthenticate with GCP using Cloud Shell or local terminal:\n\nCopy the token and use it in the form when running the workflow.\n⚠️ This token lasts ~1 hour. Regenerate as needed.\n\nConnect your Google Drive OAuth2 credentials to allow file upload.\nImport this workflow into n8n and execute it via form trigger.\n\nn8n (v1.94.1+)\nA Google Cloud project with:\n\nVertex AI API enabled\nBilling enabled\nVertex AI API enabled\nBilling enabled\nA way to get Access Token gcloud auth print-access-token\nA Google Drive OAuth2 credential connected to n8n\n\nYou can modify the\n\ndurationSeconds\naspectRatio\ngenerateAudio\ndurationSeconds\naspectRatio\ngenerateAudio\nin the HTTP node to match your use case.\nReplace the Google Drive upload node with alternatives like Dropbox, S3, or YouTube upload.\nExtend the workflow to add subtitles, audio dubbing, or LINE/Slack alerts.\n\nStep-by-step for each major node:\n\nPrompt Input → Vertex Predict → Wait → Fetch Result → Convert to File → Upload\n\nNo hardcoded API tokens\nSecure: GCP token is input via form, not stored in workflow\nAll nodes are renamed with clear purpose\nAll editable config grouped in Set node\n\nGCP Veo API 
Docs:\nhttps://cloud.google.com/vertex-ai/docs/generative-ai/video/overview\n\nThis workflow uses official Google Cloud APIs and requires a valid GCP project.\nAccess token should be generated securely using gcloud CLI.\nDo not embed tokens in the workflow itself.\n\nTo use the Vertex AI API in n8n securely:\n\nRun the following on your local machine or GCP Cloud Shell:\n\nPaste the token in the workflow form field YOUR_ACCESS_TOKEN\nwhen submitting.\nDo not hardcode the token into HTTP nodes or Set nodes — input it each time or use a secure credential vault.",
"isPaid": false
},
{
"templateId": "4921",
"templateName": "Veo3 AI Marketing Agent",
"templateDescription": "This workflow contains community nodes that are only compatible with the self-hosted version of n8n. Automate your entire video content creation pipeline...",
"templateUrl": "https://n8n.io/workflows/4921",
"jsonFileName": "Veo3_AI_Marketing_Agent.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Veo3_AI_Marketing_Agent.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/40e5db5560c1ca4b9ada91aa2a9fce47/raw/8271f9eaf91794398139d16b33423f9fd0cdcb70/Veo3_AI_Marketing_Agent.json",
"screenshotURL": "https://i.ibb.co/KtpG4wd/64da6003a173.png",
"workflowUpdated": true,
"gistId": "40e5db5560c1ca4b9ada91aa2a9fce47",
"templateDescriptionFull": "This workflow contains community nodes that are only compatible with the self-hosted version of n8n.\n\nAutomate your entire video content creation pipeline with this AI-powered, no-code workflow built in n8n.\nWatch Step-by-step video guide here: https://www.youtube.com/watch?v=x7nHpcggpX8&t=5s\n\nThis template connects a suite of smart tools to help you generate scroll-stopping short video ideas based on daily trending topics and auto-deliver them via email—ready for production in Veo 3.\n\n🔧 How it works:\nScheduled Trigger (Daily)\nKicks off the process each day at your chosen time.\n\nTavily Agent (Web Search)\nSearches the latest trends, viral moments, or market news based on your e-commerce brand (e.g. “Sally’s Closet”).\n\nOpenAI GPT-4 Agent (Creative Brainstorming)\nGenerates high-conversion marketing video ideas based on your brand’s tone and what’s trending.\n\nPrompt Formatter for Veo 3\nConverts the idea into a cinematic-style prompt, optimized for Veo’s video generation engine (via FAL API).\n\nSend via Gmail\nThe final Veo 3 prompt is emailed to you or your creative team for immediate use or manual refinement.\n\nWatch full step-by-step Tutorial Build Video: https://youtu.be/x7nHpcggpX8\n\n🧠 Use Cases:\nE-commerce brands that need fresh daily content\n\nMarketing teams looking to automate creative ideation\n\nSolopreneurs building a lean video production engine\n\nAnyone experimenting with Veo 3 prompt-based storytelling\n\n🛠️ Tools used:\nn8n Scheduled Trigger\n\nTavily Node (for real-time web search)\n\nLangchain Agent (GPT-4 via OpenAI)\n\nFAL API (Veo 3 prompt delivery)\n\nGmail Node (send final output)\n\n⚡️ Ready-to-use. Fully editable. Zero coding required.\n\n💡 Pro Tip: You can hook this up with the Veo 3 generation API (FAL) to complete the automation end-to-end!",
"isPaid": false
},
{
"templateId": "3775",
"templateName": "🎥 Gemini AI Video Analysis",
"templateDescription": "📝 Overview This workflow leverages Google Gemini 2.0 Flash multimodal AI to automatically generate detailed descriptions of video content from any public...",
"templateUrl": "https://n8n.io/workflows/3775",
"jsonFileName": "_Gemini_AI_Video_Analysis.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/_Gemini_AI_Video_Analysis.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/cb5fc0327f3a5f776f80c1526424e4f1/raw/4bf8b7d7e5be6bc86835ac7db6d8385342eaf3b1/_Gemini_AI_Video_Analysis.json",
"screenshotURL": "https://i.ibb.co/k2ddRTC7/42df3fb0ace0.png",
"workflowUpdated": true,
"gistId": "cb5fc0327f3a5f776f80c1526424e4f1",
"templateDescriptionFull": "This workflow leverages Google Gemini 2.0 Flash multimodal AI to automatically generate detailed descriptions of video content from any public URL. It streamlines video understanding, making it ideal for content cataloging, accessibility, and content moderation.\n\n♿ Accessibility: Automatically generate detailed video descriptions for visually impaired users.\n🛡️ Content Moderation: Detect inappropriate or off-brand material without manual watching.\n🗂️ Media Cataloging: Enrich your media library with automatically extracted metadata.\n📈 Marketing & Branding: Gain fast insights into key elements, tone, and branding in video content.\n\n🔑 Get a Gemini API Key\n\nRegister at ai.google.dev and create an API key.\nBefore running the workflow, set your Gemini API key as an environment variable named GeminiKey for secure access within the workflow.\nIn the Set Input node, reference this environment variable instead of hardcoding the key.\nRegister at ai.google.dev and create an API key.\nBefore running the workflow, set your Gemini API key as an environment variable named GeminiKey for secure access within the workflow.\nIn the Set Input node, reference this environment variable instead of hardcoding the key.\n🌐 Configure Video URL\n\nReplace the sample URL in the Set Input node with your desired public video URL.\nEnsure the video is directly accessible (no login or special permissions required).\nReplace the sample URL in the Set Input node with your desired public video URL.\nEnsure the video is directly accessible (no login or special permissions required).\n📝 Optional: Customize the Analysis\n\nEdit the prompt in the Analyze video Gemini node to focus on the most relevant video details for your use case (e.g., branding, key actions, visual elements).\nEdit the prompt in the Analyze video Gemini node to focus on the most relevant video details for your use case (e.g., branding, key actions, visual elements).\n🔒 Security Tip\n\nUse n8n's 
credentials manager or environment variables (like GeminiKey) to store your API key securely.\nAvoid hardcoding API keys directly in workflow nodes, especially in production environments.\nUse n8n's credentials manager or environment variables (like GeminiKey) to store your API key securely.\nAvoid hardcoding API keys directly in workflow nodes, especially in production environments.\n\n📥 Download the video from the provided URL.\n☁️ Upload the video to Gemini’s server for processing.\n⏳ Wait for Gemini to complete processing.\n🤖 Analyze the video with Gemini AI using your customized prompt.\n📄 Output a comprehensive description of the video as videoDescription.\n\nUses HTTP Request nodes to interact with Gemini API endpoints.\nHandles file download, upload, status checking, and result retrieval.\nCustomizable Gemini AI parameters for fine-tuned response.\nMain output: videoDescription (detailed text describing video content).\n\nSet your Gemini API key as the GeminiKey environment variable and configure your video URL in the workflow.\nExecute the workflow.\nRetrieve your rich, AI-generated video description for downstream use such as automation, tagging, or reporting.",
"isPaid": false
},
{
"templateId": "3408",
"templateName": "⚡AI-Powered YouTube Playlist & Video Summarization and Analysis v2",
"templateDescription": "AI YouTube Playlist & Video Analyst Chatbot This n8n workflow transforms entire YouTube playlists or single videos into interactive knowledge bases you can...",
"templateUrl": "https://n8n.io/workflows/3408",
"jsonFileName": "AI-Powered_YouTube_Playlist__Video_Summarization_and_Analysis_v2.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/AI-Powered_YouTube_Playlist__Video_Summarization_and_Analysis_v2.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/e60d566f482e04d2261e8ec76a5f0dc3/raw/20176d61558f215488d93e7a326c753cbee4ec2b/AI-Powered_YouTube_Playlist__Video_Summarization_and_Analysis_v2.json",
"screenshotURL": "https://i.ibb.co/tpHTGbpD/722070e4b6b8.png",
"workflowUpdated": true,
"gistId": "e60d566f482e04d2261e8ec76a5f0dc3",
"templateDescriptionFull": "This n8n workflow transforms entire YouTube playlists or single videos into interactive knowledge bases you can chat with. Ask questions and get summaries without needing to watch hours of content.\n\n🔗 Provide a Link: Start by giving the workflow a URL for a YouTube playlist or a single video.\n📄 Content Retrieval: The workflow automatically fetches the video details and transcripts for the provided link. For playlists, it can process multiple videos at once (you might be asked how many).\n🧠 AI Processing: Google's Gemini AI reads through the transcripts, understands the content, and creates summaries.\n💾 Storage & Context: The processed information and summaries are stored in a vector database (Qdrant), making them ready for conversation. Context is managed using Redis, remembering the current video/playlist you're discussing.\n💬 Chat & Ask: Now, you can ask the AI agent questions about the playlist or video content! Because context is maintained, you can ask follow-up questions (like \"expand on point X\") without needing to provide the URL again.\n\nCommunity Node: This workflow uses the youtubeTranscripter community node to fetch video transcripts. You'll need to install it in your n8n environment.\n\nInstallation: npm install n8n-nodes-youtube-transcription-dmr\nImportant: Community nodes require a self-hosted n8n instance.\nInstallation: npm install n8n-nodes-youtube-transcription-dmr\nImportant: Community nodes require a self-hosted n8n instance.\nRedis: A Redis instance is required for managing conversation context and status between interactions.\nCredentials: You will need API credentials configured in your n8n instance for:\n\nGoogle Gemini (AI Models)\nQdrant (Vector Store)\nRedis (Context Store)\nGoogle Gemini (AI Models)\nQdrant (Vector Store)\nRedis (Context Store)\n\nEngage with the AI agent to explore the video content. 
Since the agent remembers the context of your conversation, you can ask detailed follow-up questions naturally:\n\nGet a quick summary of a single video or an entire playlist.\nAsk for key takeaways or main topics discussed.\nQuery for specific information mentioned in the videos.\nAsk the agent to elaborate on a specific point previously mentioned.\nUnderstand complex subjects without watching the full duration.\n\n📊 Content Analysis: Quickly understand the themes and key points across a playlist or long video.\n📚 Research & Learning: Extract insights from educational series or tutorials efficiently.\n✍️ Content Creation: Easily repurpose video transcript information into blog posts, notes, or social media content.\n⏱️ Save Time: Get the essence of video content when you're short on time.\n♿ Accessibility: Offers a text-based way to interact with and understand video content.\n\nPlease analyze this playlist and tell me the main topics covered: [YouTube Playlist URL]\nSummarize the first 5 videos in this playlist: [YouTube Playlist URL]\n(Follow-up) Tell me more about the main point in video 3.\nWhat are the key points discussed in this video? [YouTube Video URL]\n(Follow-up) Expand on the second key point you mentioned.\nDoes the video at [YouTube Video URL] mention [specific topic]?",
"isPaid": false
},
{
"templateId": "2467",
"templateName": "template_2467",
"templateDescription": "This n8n template takes a video and extracts frames from it which are used with a multimodal LLM to generate a script. The script is then passed to the same...",
"templateUrl": "https://n8n.io/workflows/2467",
"jsonFileName": "template_2467.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_2467.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/f131f18c4d444f7fe5b1aec3f6552dfd/raw/6935b781809f22700a18e0cdf472053351254f7f/template_2467.json",
"screenshotURL": "https://i.ibb.co/kskD6HHW/f0844ed7cbd4.png",
"workflowUpdated": true,
"gistId": "f131f18c4d444f7fe5b1aec3f6552dfd",
"templateDescriptionFull": "This n8n template takes a video and extracts frames from it which are used with a multimodal LLM to generate a script. The script is then passed to the same multimodal LLM to generate a voiceover clip.\n\nThis template was inspired by Processing and narrating a video with GPT's visual capabilities and the TTS API\n\nVideo is downloaded using the HTTP node.\nPython code node is used to extract the frames using OpenCV.\nLoop node is used o batch the frames for the LLM to generate partial scripts.\nAll partial scripts are combined to form the full script which is then sent to OpenAI to generate audio from it.\nThe finished voiceover clip is uploaded to Google Drive.\n\nSample the finished product here: https://drive.google.com/file/d/1-XCoii0leGB2MffBMPpCZoxboVyeyeIX/view?usp=sharing\n\nOpenAI for LLM\nIdeally, a mid-range (16GB RAM) machine for acceptable performance!\n\nFor larger videos, consider splitting into smaller clips for better performance\nUse a multimodal LLM which supports fully video such as Google's Gemini.",
"isPaid": false
},
{
"templateId": "3200",
"templateName": "Luma AI Dream Machine - Simple v1 - AK",
"templateDescription": "Automate Video Creation with Luma AI Dream Machine and Airtable (Part 1) Description This workflow automates video creation using Luma AI Dream Machine and...",
"templateUrl": "https://n8n.io/workflows/3200",
"jsonFileName": "Luma_AI_Dream_Machine_-_Simple_v1_-_AK.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Luma_AI_Dream_Machine_-_Simple_v1_-_AK.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/0a522b7c78d6cf9875d879f1e655c6fa/raw/82e1a32eeb341f0dafbf3fb9b1e34028f4a175e2/Luma_AI_Dream_Machine_-_Simple_v1_-_AK.json",
"screenshotURL": "https://i.ibb.co/pB1y9YBM/8efa0dafd8e6.png",
"workflowUpdated": true,
"gistId": "0a522b7c78d6cf9875d879f1e655c6fa",
"templateDescriptionFull": "This workflow automates video creation using Luma AI Dream Machine and n8n. It generates dynamic videos based on custom prompts, random camera motion, and predefined settings, then stores the video and thumbnail URLs in Airtable for easy access and tracking. This automation makes it easy to create high-quality videos at scale with minimal effort.\n\n👉 Airtable Base Template\n🎥 Tutorial Video\n\nCreate an account with Luma AI.\nGenerate an API key from Luma AI for authentication.\nEnsure the API key has permission to create and manage video requests.\n\nCreate an Airtable base with the following fields:\n\nGeneration ID – To match incoming webhook data.\nStatus – Workflow status (e.g., \"Done\").\nVideo URL – Stores the generated video URL.\nThumbnail URL – Stores the thumbnail URL.\nPrompt – The video prompt used in the request.\nAspect Ratio – Defines the video format (e.g., 9:16).\nDuration – Length of the video.\n\n👉 Use the Airtable template linked above to simplify setup.\n\nInstall n8n (local or cloud).\nSet up Luma AI and Airtable credentials in n8n.\nImport the workflow and customize the settings based on your needs.\n\nThe Set node defines key settings such as:\n\nPrompt – Example: \"A crocheted parrot in a crocheted pirate outfit swinging on a crocheted perch.\"\nAspect Ratio – Example: \"9:16\"\nLoop – Example: \"true\"\nDuration – Example: \"5 seconds\"\nCluster ID – Used to group related videos for easy tracking.\nCallback URL - Used for the Webhook workflow in Part 2\n\nThe Code node randomly selects a camera motion (e.g., Zoom In, Pan Left, Crane Up) to create dynamic and visually engaging videos.\n\nThe HTTP Request node sends a POST request to Luma AI’s API with the following parameters:\n\nPrompt – Uses the defined global settings.\nAspect Ratio – Matches the target platform (e.g., TikTok or YouTube).\nDuration – Length of the video.\nLoop – Determines if the video should loop.\nCallback URL – Sends a POST response when 
the video is complete.\nPrompt – Uses the defined global settings.\nAspect Ratio – Matches the target platform (e.g., TikTok or YouTube).\nDuration – Length of the video.\nLoop – Determines if the video should loop.\nCallback URL – Sends a POST response when the video is complete.\n\nLuma AI sends a POST response to the callback URL once video generation is complete.\nThe response includes:\n\nVideo URL – Direct link to the video.\nThumbnail URL – Link to the video thumbnail.\nGeneration ID – Used to match the record in Airtable.\nVideo URL – Direct link to the video.\nThumbnail URL – Link to the video thumbnail.\nGeneration ID – Used to match the record in Airtable.\n\nThe Airtable node updates the record with the video and thumbnail URLs.\nGeneration ID is crucial for matching future webhook responses to the correct video record.\n\n✅ Automates high-quality video creation\n✅ Reduces manual effort by handling prompt generation and API calls\n✅ Random camera motion makes videos more dynamic\n✅ Ensures organized tracking with Airtable\n✅ Scalable – Ideal for automating large-scale content creation\n\nPart 2 – Handling webhook responses and updating Airtable automatically.\nFuture Enhancements – Adding more camera motions, multi-platform support, and automated video editing.",
"isPaid": false
},
{
"templateId": "4881",
"templateName": "Google Sheets to Veo 3: Instantly Create Videos with n8n & Fal.AI",
"templateDescription": "Turn Your Ideas into Videos—Right from Google Sheets! This workflow helps you make cool 8-second videos using Fal.AI and Veo 3, just by typing your idea...",
"templateUrl": "https://n8n.io/workflows/4881",
"jsonFileName": "Google_Sheets_to_Veo_3_Instantly_Create_Videos_with_n8n__Fal.AI.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Google_Sheets_to_Veo_3_Instantly_Create_Videos_with_n8n__Fal.AI.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/24b7e48d2ed316da01a3ee00b67dab03/raw/52a24282d878ead727f8392f4ff97e48654dab21/Google_Sheets_to_Veo_3_Instantly_Create_Videos_with_n8n__Fal.AI.json",
"screenshotURL": "https://i.ibb.co/B2WMMgYs/137a98aa9916.png",
"workflowUpdated": true,
"gistId": "24b7e48d2ed316da01a3ee00b67dab03",
"templateDescriptionFull": "Turn Your Ideas into Videos—Right from Google Sheets!\n\nThis workflow helps you make cool 8-second videos using Fal.AI and Veo 3, just by typing your idea into a Google Sheet. You can even choose if you want your video to have sound or not. It’s super easy—no tech skills needed!\n\nWhy use this?\n\nJust type your idea in a sheet—no fancy tools or uploads.\n\nGet a video link back in the same sheet.\n\nWorks with or without sound—your choice!\n\nHow does it work?\n\nYou write your idea, pick the video shape, and say if you want sound (true or false) in the Google Sheet.\n\nn8n reads your idea and asks Fal.AI to make your video.\n\nWhen your video is ready, the link shows up in your sheet.\n\nWhat do you need?\n\nA Google account and Google Sheets connected with service account (check this link for reference)\nA copy of the following Google Spreadsheet:\nSpreadsheet to copy\nAn OpenAI API key\nA Fal.AI account with some money in it\n\nThat’s it! Just add your ideas and let the workflow make the videos for you. Have fun creating!\n\nIf you have any questions, just contact me in X @maxrojasdelgado.",
"isPaid": false
},
{
"templateId": "4846",
"templateName": "Create Video with Google Veo3 and Upload to Youtube",
"templateDescription": "This workflow allows users to generate AI videos using Google Veo3, save them to Google Drive, generate optimized YouTube titles with GPT-4o, and...",
"templateUrl": "https://n8n.io/workflows/4846",
"jsonFileName": "Create_Video_with_Google_Veo3_and_Upload_to_Youtube.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Create_Video_with_Google_Veo3_and_Upload_to_Youtube.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/b166c68fa055184ecabe569e063b78b7/raw/f99a7efb22352dd98af835be7a6fcf02857fe152/Create_Video_with_Google_Veo3_and_Upload_to_Youtube.json",
"screenshotURL": "https://i.ibb.co/PGFHqfVn/818f238d7626.png",
"workflowUpdated": true,
"gistId": "b166c68fa055184ecabe569e063b78b7",
"templateDescriptionFull": "This workflow allows users to generate AI videos using Google Veo3, save them to Google Drive, generate optimized YouTube titles with GPT-4o, and automatically upload them to YouTube with Upload-Post. The entire process is triggered from a Google Sheet that acts as the central interface for input and output.\n\nIT automates video creation, uploading, and tracking, ensuring seamless integration between Google Sheets, Google Drive, Google Veo3, and YouTube.\n\n💡 No Code Interface: Trigger and control the video production pipeline from a simple Google Sheet.\n⚙️ Full Automation: Once set up, the entire video generation and publishing process runs hands-free.\n🧠 AI-Powered Creativity:\n\nGenerates engaging YouTube titles using GPT-4o.\nLeverages advanced generative video AI from Google Veo3.\nGenerates engaging YouTube titles using GPT-4o.\nLeverages advanced generative video AI from Google Veo3.\n📁 Cloud Storage & Backup: Stores all generated videos on Google Drive for safekeeping.\n📈 YouTube Ready: Automatically uploads to YouTube with correct metadata, saving time and boosting visibility.\n🧪 Scalable: Designed to process multiple video prompts by looping through new entries in Google Sheets.\n🔒 API-First: Utilizes secure API-based communication for all services.\n\nTrigger: The workflow can be started manually (\"When clicking ‘Test workflow’\") or scheduled (\"Schedule Trigger\") to run at regular intervals (e.g., every 5 minutes).\nFetch Data: The \"Get new video\" node retrieves unfilled video requests from a Google Sheet (rows where the \"VIDEO\" column is empty).\nVideo Creation:\n\nThe \"Set data\" node formats the prompt and duration from the Google Sheet.\nThe \"Create Video\" node sends a request to the Fal.run API (Google Veo3) to generate a video based on the prompt.\nThe \"Set data\" node formats the prompt and duration from the Google Sheet.\nThe \"Create Video\" node sends a request to the Fal.run API (Google Veo3) to 
generate a video based on the prompt.\nStatus Check:\n\nThe \"Wait 60 sec.\" node pauses execution for 60 seconds.\nThe \"Get status\" node checks the video generation status. If the status is \"COMPLETED,\" the workflow proceeds; otherwise, it waits again.\nThe \"Wait 60 sec.\" node pauses execution for 60 seconds.\nThe \"Get status\" node checks the video generation status. If the status is \"COMPLETED,\" the workflow proceeds; otherwise, it waits again.\nVideo Processing:\n\nThe \"Get Url Video\" node fetches the video URL.\nThe \"Generate title\" node uses OpenAI (GPT-4.1) to create an SEO-optimized YouTube title.\nThe \"Get File Video\" node downloads the video file.\nThe \"Get Url Video\" node fetches the video URL.\nThe \"Generate title\" node uses OpenAI (GPT-4.1) to create an SEO-optimized YouTube title.\nThe \"Get File Video\" node downloads the video file.\nUpload & Update:\n\nThe \"Upload Video\" node saves the video to Google Drive.\nThe \"HTTP Request\" node uploads the video to YouTube via the Upload-Post API.\nThe \"Update Youtube URL\" and \"Update result\" nodes update the Google Sheet with the video URL and YouTube link.\nThe \"Upload Video\" node saves the video to Google Drive.\nThe \"HTTP Request\" node uploads the video to YouTube via the Upload-Post API.\nThe \"Update Youtube URL\" and \"Update result\" nodes update the Google Sheet with the video URL and YouTube link.\n\nGoogle Sheet Setup:\n\nCreate a Google Sheet with columns: PROMPT, DURATION, VIDEO, and YOUTUBE_URL.\nShare the Sheet link in the \"Get new video\" node.\nCreate a Google Sheet with columns: PROMPT, DURATION, VIDEO, and YOUTUBE_URL.\nShare the Sheet link in the \"Get new video\" node.\nAPI Keys:\n\nObtain a Fal.run API key (for Veo3) and set it in the \"Create Video\" node (Header: Authorization: Key YOURAPIKEY).\nGet an Upload-Post API key (for YouTube uploads) and configure the \"HTTP Request\" node (Header: Authorization: Apikey YOUR_API_KEY).\nObtain a Fal.run API key 
(for Veo3) and set it in the \"Create Video\" node (Header: Authorization: Key YOURAPIKEY).\nGet an Upload-Post API key (for YouTube uploads) and configure the \"HTTP Request\" node (Header: Authorization: Apikey YOUR_API_KEY).\nYouTube Upload Configuration:\n\nReplace YOUR_USERNAME in the \"HTTP Request\" node with your Upload-Post profile name.\nReplace YOUR_USERNAME in the \"HTTP Request\" node with your Upload-Post profile name.\nSchedule Trigger:\n\nConfigure the \"Schedule Trigger\" node to run periodically (e.g., every 5 minutes).\nConfigure the \"Schedule Trigger\" node to run periodically (e.g., every 5 minutes).\n\n👉 Subscribe to my new YouTube channel. Here I’ll share videos and Shorts with practical tutorials and FREE templates for n8n.\n\n\n\nContact me for consulting and support or add me on Linkedin.",
"isPaid": false
},
{
"templateId": "4767",
"templateName": "VEO3 Video Generator TEMPLATE",
"templateDescription": "Welcome to my VEO3 Video Generator Workflow! This automated workflow transforms simple text descriptions into professional 8-second videos using Google's...",
"templateUrl": "https://n8n.io/workflows/4767",
"jsonFileName": "VEO3_Video_Generator_TEMPLATE.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/VEO3_Video_Generator_TEMPLATE.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/1ca42c41fc058fad7a96bd4594da83ad/raw/8024915e31282d16a6fa9734224bde13f4ce3368/VEO3_Video_Generator_TEMPLATE.json",
"screenshotURL": "https://i.ibb.co/cc8p8tK6/9f7711dfdac2.png",
"workflowUpdated": true,
"gistId": "1ca42c41fc058fad7a96bd4594da83ad",
"templateDescriptionFull": "This automated workflow transforms simple text descriptions into professional 8-second videos using Google's cutting-edge VEO3 AI model. Users submit video ideas through a web form, and the system automatically generates optimized prompts, creates high-quality videos with native audio, and delivers them via Google Drive - all powered by Claude 4 Sonnet for intelligent prompt optimization.\n\n\n\nVEO3 Generator Form - Web form interface for users to input video content, format, and duration\nVideo Prompt Generator - AI agent powered by Claude 4 Sonnet that:\n\nAnalyzes user input for video content requirements\nCreates factual, professional video titles\nGenerates detailed VEO3 prompts with subject, context, action, style, camera motion, composition, ambiance, and audio elements\nOptimizes prompts for 16:9 landscape format and 8-second duration\nAnalyzes user input for video content requirements\nCreates factual, professional video titles\nGenerates detailed VEO3 prompts with subject, context, action, style, camera motion, composition, ambiance, and audio elements\nOptimizes prompts for 16:9 landscape format and 8-second duration\nCreate VEO3 Video - Submits the optimized prompt to fal.ai VEO3 API for video generation\nWait 30 seconds - Initial waiting period for video processing to begin\nCheck VEO3 Status - Monitors the video generation status via fal.ai API\nVideo completed? 
- Decision node that checks if video generation is finished\n\nIf not completed: Returns to wait cycle\nIf completed: Proceeds to video retrieval\nIf not completed: Returns to wait cycle\nIf completed: Proceeds to video retrieval\nGet VEO3 Video URL - Retrieves the final video download URL from fal.ai\nDownload VEO3 Video - Downloads the generated MP4 video file\nMerge - Combines video data with metadata for final processing\nSave Video to Google Drive - Uploads the video to specified Google Drive folder\nVideo Output - Displays completion message with Google Drive link to user\n\nAnthropic API (Claude 4 Sonnet): Documentation\nFal.ai API (VEO3 Model): Create API key at https://fal.ai/dashboard/keys\nGoogle Drive API: Documentation\n\nUser-friendly web form: Simple interface for video content input\nAI-powered prompt optimization: Claude 4 Sonnet creates professional VEO3 prompts\nAutomatic video generation: Leverages Google's VEO3 model via fal.ai\nStatus monitoring: Real-time tracking of video generation progress\nGoogle Drive integration: Automatic upload and sharing of generated videos\nStructured output: Consistent video titles and professional prompt formatting\nAudio optimization: VEO3's native audio generation with ambient sounds and music\n\nFormat: Only 16:9 landscape videos supported\nDuration: Only 8-second videos supported\nProcessing time: Videos typically take 60-120 seconds to generate\n\nContent creation: Generate videos for social media, websites, and presentations\nMarketing materials: Create promotional videos and advertisements\nEducational content: Produce instructional and explanatory videos\nPrototyping: Rapid video concept development and testing\nCreative projects: Artistic and experimental video generation\nBusiness presentations: Professional video content for meetings and pitches\n\nFeel free to contact me via LinkedIn, if you have any questions!",
"isPaid": false
},
{
"templateId": "5835",
"templateName": "Create Cheaper Video with Google Veo3 Fast and Upload to Social",
"templateDescription": "This workflow allows users to generate AI videos using the cheaper model Google Veo3 Fast, save them to Google Drive, generate optimized titles with GPT-4o,...",
"templateUrl": "https://n8n.io/workflows/5835",
"jsonFileName": "Create_Cheaper_Video_with_Google_Veo3_Fast_and_Upload_to_Social.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Create_Cheaper_Video_with_Google_Veo3_Fast_and_Upload_to_Social.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/a8da86a62cb64d4da8187edb208484b9/raw/9a1ec670446c629814e68e3dab5928c1940b77a8/Create_Cheaper_Video_with_Google_Veo3_Fast_and_Upload_to_Social.json",
"screenshotURL": "https://i.ibb.co/b5z5VnRz/7a51bc6d14d6.png",
"workflowUpdated": true,
"gistId": "a8da86a62cb64d4da8187edb208484b9",
"templateDescriptionFull": "This workflow allows users to generate AI videos using the cheaper model Google Veo3 Fast, save them to Google Drive, generate optimized titles with GPT-4o, and automatically upload them to YouTube and TikTok with Upload-Post. The entire process is triggered from a Google Sheet that acts as the central interface for input and output.\n\nIT automates video creation, uploading, and tracking, ensuring seamless integration between Google Sheets, Google Drive, Google Veo3 Fast, TikTok and YouTube.\n\n💡 No Code Interface: Trigger and control the video production pipeline from a simple Google Sheet.\n⚙️ Full Automation: Once set up, the entire video generation and publishing process runs hands-free.\n🧠 AI-Powered Creativity:\n\nGenerates engaging YouTube and TikTok titles using GPT-4o.\nLeverages advanced generative video AI from Google Veo3.\nGenerates engaging YouTube and TikTok titles using GPT-4o.\nLeverages advanced generative video AI from Google Veo3.\n📁 Cloud Storage & Backup: Stores all generated videos on Google Drive for safekeeping.\n📈 YouTube Ready: Automatically uploads to YouTube with correct metadata, saving time and boosting visibility.\n📈 TikTok Ready: Automatically uploads to TikTok with correct metadata, saving time and boosting visibility.\n🧪 Scalable: Designed to process multiple video prompts by looping through new entries in Google Sheets.\n🔒 API-First: Utilizes secure API-based communication for all services.\n\nTrigger: The workflow can be started manually (\"When clicking ‘Test workflow’\") or scheduled (\"Schedule Trigger\") to run at regular intervals (e.g., every 5 minutes).\nFetch Data: The \"Get new video\" node retrieves unfilled video requests from a Google Sheet (rows where the \"VIDEO\" column is empty).\nVideo Creation:\n\nThe \"Set data\" node formats the prompt and duration from the Google Sheet.\nThe \"Create Video\" node sends a request to the Fal.run API (Google Veo3 Fast) to generate a video based on 
the prompt.\nThe \"Set data\" node formats the prompt and duration from the Google Sheet.\nThe \"Create Video\" node sends a request to the Fal.run API (Google Veo3 Fast) to generate a video based on the prompt.\nStatus Check:\n\nThe \"Wait 60 sec.\" node pauses execution for 60 seconds.\nThe \"Get status\" node checks the video generation status. If the status is \"COMPLETED,\" the workflow proceeds; otherwise, it waits again.\nThe \"Wait 60 sec.\" node pauses execution for 60 seconds.\nThe \"Get status\" node checks the video generation status. If the status is \"COMPLETED,\" the workflow proceeds; otherwise, it waits again.\nVideo Processing:\n\nThe \"Get Url Video\" node fetches the video URL.\nThe \"Generate title\" node uses OpenAI (GPT-4.1) to create an SEO-optimized YouTube and TikTok title.\nThe \"Get File Video\" node downloads the video file.\nThe \"Get Url Video\" node fetches the video URL.\nThe \"Generate title\" node uses OpenAI (GPT-4.1) to create an SEO-optimized YouTube and TikTok title.\nThe \"Get File Video\" node downloads the video file.\nUpload & Update:\n\nThe \"Upload Video\" node saves the video to Google Drive.\nThe \"HTTP Request\" node uploads the video to YouTube via the Upload-Post API.\nThe \"HTTP Request\" node uploads the video to TikTok via the Upload-Post API.\nThe \"Update Youtube URL\" and \"Update result\" nodes update the Google Sheet with the video URL and YouTube link.\nThe \"Upload Video\" node saves the video to Google Drive.\nThe \"HTTP Request\" node uploads the video to YouTube via the Upload-Post API.\nThe \"HTTP Request\" node uploads the video to TikTok via the Upload-Post API.\nThe \"Update Youtube URL\" and \"Update result\" nodes update the Google Sheet with the video URL and YouTube link.\n\nGoogle Sheet Setup:\n\nCreate a Google Sheet with columns: PROMPT, DURATION, VIDEO, and YOUTUBE_URL.\nShare the Sheet link in the \"Get new video\" node.\nCreate a Google Sheet with columns: PROMPT, DURATION, VIDEO, and 
YOUTUBE_URL.\nShare the Sheet link in the \"Get new video\" node.\nAPI Keys:\n\nObtain a Fal.run API key (for Veo3) and set it in the \"Create Video\" node (Header: Authorization: Key YOURAPIKEY).\nGet an Upload-Post API key (for YouTube uploads) and configure the \"HTTP Request\" node (Header: Authorization: Apikey YOUR_API_KEY).\nGet an Upload-Post API key (for TikTok uploads) and configure the \"HTTP Request\" node (Header: Authorization: Apikey YOUR_API_KEY).\nObtain a Fal.run API key (for Veo3) and set it in the \"Create Video\" node (Header: Authorization: Key YOURAPIKEY).\nGet an Upload-Post API key (for YouTube uploads) and configure the \"HTTP Request\" node (Header: Authorization: Apikey YOUR_API_KEY).\nGet an Upload-Post API key (for TikTok uploads) and configure the \"HTTP Request\" node (Header: Authorization: Apikey YOUR_API_KEY).\nYouTube Upload Configuration:\n\nReplace YOUR_USERNAME in the \"HTTP Request\" node with your Upload-Post profile name.\nReplace YOUR_USERNAME in the \"HTTP Request\" node with your Upload-Post profile name.\nSchedule Trigger:\n\nConfigure the \"Schedule Trigger\" node to run periodically (e.g., every 5 minutes).\nConfigure the \"Schedule Trigger\" node to run periodically (e.g., every 5 minutes).\n\nContact me for consulting and support or add me on Linkedin.",
"isPaid": false
},
{
"templateId": "5633",
"templateName": "Image-to-Video with MiniMax Hailuo 02 and upload on Youtube and TikTok",
"templateDescription": "This automated workflow takes a static image and a textual prompt and transforms them into an animated video using the MiniMax Hailuo 02 model. It then...",
"templateUrl": "https://n8n.io/workflows/5633",
"jsonFileName": "Image-to-Video_with_MiniMax_Hailuo_02_and_upload_on_Youtube_and_TikTok.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Image-to-Video_with_MiniMax_Hailuo_02_and_upload_on_Youtube_and_TikTok.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/c7c9dbf1e57e4a016892bdd8970fe0c9/raw/79d5c7fdfde0b4ce2edea90f2f201bdd4d0d2034/Image-to-Video_with_MiniMax_Hailuo_02_and_upload_on_Youtube_and_TikTok.json",
"screenshotURL": "https://i.ibb.co/8gz2FRd0/deb5fdeee47a.png",
"workflowUpdated": true,
"gistId": "c7c9dbf1e57e4a016892bdd8970fe0c9",
"templateDescriptionFull": "This automated workflow takes a static image and a textual prompt and transforms them into an animated video using the MiniMax Hailuo 02 model. It then uploads the generated video to YouTube and TikTok, and updates a Google Sheet with the relevant links and metadata.\n\nFully Automated Pipeline: From prompt to video to social media publication — all without manual intervention.\nScalable Content Creation: Generate and distribute dozens of videos per hour with minimal human input.\nCross-Platform Posting: Automatically pushes content to YouTube and TikTok simultaneously.\nSEO Optimization: Uses AI to generate catchy, keyword-rich video titles that improve visibility.\nEasy Integration: Based on Google Sheets for input/output, making it accessible to non-technical users.\nTime-Efficient: Batch processing enabled with scheduled runs every few minutes.\nCustomizable Duration: Video duration can be adjusted (default is 6 seconds).\n\nTrigger & Data Fetching:\n\nThe workflow starts either manually or via a scheduled trigger (e.g., every 5 minutes).\nIt checks a Google Sheet for new entries where the \"VIDEO\" column is empty, indicating pending video generation tasks.\n\nVideo Creation:\n\nFor each entry, the workflow extracts the image URL and prompt from the Google Sheet.\nIt sends these inputs to the MiniMax Hailuo 02 model to generate a video. The API processes the image and prompt, optimizes the prompt, and creates a short video (default: 6 seconds).\n\nStatus Monitoring:\n\nThe workflow polls the API every 60 seconds to check if the video is COMPLETED.\nOnce ready, it retrieves the video URL and uploads the file to Google Drive.\n\nYouTube & TikTok Upload:\n\nThe video is sent to YouTube and TikTok via the Upload-Post.com API (the free plan allows uploads to all platforms except TikTok; upgrade to a paid plan to enable it).\nA GPT-generated SEO-optimized title is created for the video.\nThe Google Sheet is updated with the video URL and YouTube link.\n\nGoogle Sheet Setup:\n\nCreate a Google Sheet with columns: IMAGE (input image URL), PROMPT (video description), VIDEO (auto-filled), and YOUTUBE_URL (auto-filled).\nLink the sheet to the workflow using the Google Sheets node.\n\nAPI Keys:\n\nObtain a fal.run API key (for MiniMax Hailuo) and configure the \"Authorization\" header in the \"Create video\" node.\nGet an Upload-Post.com API key (10 free uploads/month) and set it in the \"Upload on YouTube/TikTok\" nodes.\n\nWorkflow Configuration:\n\nReplace YOUR_USERNAME in the Upload-Post nodes with your profile name (e.g., \"test1\").\nAdjust the video duration (6 or 10 seconds) in the \"Create video\" node.\nSet the Schedule Trigger interval (e.g., 5 minutes) to automate checks for new tasks.\n\nExecution:\n\nRun the workflow manually or let the scheduler process new rows automatically.\nThe system handles video generation, uploads, and Google Sheet updates end-to-end.\n\nContact me for consulting and support or add me on LinkedIn.",
"isPaid": false
},
{
"templateId": "3201",
"templateName": "Luma AI - Webhook Response v1 - AK",
"templateDescription": "Automate Video Creation with Luma AI Dream Machine and Airtable (Part 2) Description This is the second part of the Luma AI Dream Machine automation. It...",
"templateUrl": "https://n8n.io/workflows/3201",
"jsonFileName": "Luma_AI_-_Webhook_Response_v1_-_AK.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Luma_AI_-_Webhook_Response_v1_-_AK.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/7e16996f6f77b6309668b8a0cb09f519/raw/0da5285a65ce1b7c80c746e60b4c6a6a29163bd9/Luma_AI_-_Webhook_Response_v1_-_AK.json",
"screenshotURL": "https://i.ibb.co/CXtGrZX/9ad4a7a52dad.png",
"workflowUpdated": true,
"gistId": "7e16996f6f77b6309668b8a0cb09f519",
"templateDescriptionFull": "This is the second part of the Luma AI Dream Machine automation. It captures the webhook response from Luma AI after video generation is complete, processes the data, and automatically updates Airtable with the video and thumbnail URLs. This completes the end-to-end automation for video creation and tracking.\n\n👉 Airtable Base Template\n👉 Tutorial Video\n\nEnsure you’ve created an account with Luma AI and generated an API key.\nConfirm that the API key has permission to manage video requests.\n\nMake sure your Airtable base includes the following fields (set up in Part 1); use the Airtable Base Template linked above to simplify setup:\n\nGeneration ID – To match incoming webhook data.\nStatus – Workflow status (e.g., \"Done\").\nVideo URL – Stores the generated video URL.\nThumbnail URL – Stores the thumbnail URL.\n\nEnsure that the n8n workflow from Part 1 is set up and configured.\nImport this workflow and connect it to the webhook callback from Luma AI.\n\nThe Webhook node listens for a POST response from Luma AI once video generation is finished. The response includes:\n\nVideo URL – Direct link to the video.\nThumbnail URL – Link to the video thumbnail.\nGeneration ID – Used to match the record in Airtable.\n\nThe Set node extracts the video data from the webhook response.\nThe If node checks if the video URL is valid before proceeding.\n\nThe Airtable node uses the Generation ID to match the correct record and updates it with:\n\nVideo URL – Direct link to the video.\nThumbnail URL – Link to the video thumbnail.\nStatus – Marked as \"Done.\"\n\n✅ Automates the completion step for video creation\n✅ Ensures accurate record-keeping by matching generation IDs\n✅ Simplifies the process of managing and organizing video content\n✅ Reduces manual effort by automating the update process\n\nFuture Enhancements – Adding more complex post-processing, video trimming, and multi-platform publishing.",
"isPaid": false
},
{
"templateId": "5035",
"templateName": "Automate video creation with Veo3 and auto-post to Instagram, TikTok via Blotato - vide",
"templateDescription": "Workflow Screenshot Automate video creation with Veo3 and auto-post to Instagram, TikTok via Blotato Who is this for? This template is ideal for content...",
"templateUrl": "https://n8n.io/workflows/5035",
"jsonFileName": "Automate_video_creation_with_Veo3_and_auto-post_to_Instagram_TikTok_via_Blotato_-_vide.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Automate_video_creation_with_Veo3_and_auto-post_to_Instagram_TikTok_via_Blotato_-_vide.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/4e2576b82cd604be7978839e0553b4dc/raw/e8cb732345b8e9d12e2090f44b58d37cadd84027/Automate_video_creation_with_Veo3_and_auto-post_to_Instagram_TikTok_via_Blotato_-_vide.json",
"screenshotURL": "https://i.ibb.co/Z1BVsWPJ/8c47a1bf4a1f.png",
"workflowUpdated": true,
"gistId": "4e2576b82cd604be7978839e0553b4dc",
"templateDescriptionFull": "This template is ideal for content creators, social media managers, YouTubers, and digital marketers who want to generate high-quality videos daily using AI and distribute them effortlessly across multiple platforms.\n\nIt’s perfect for anyone who wants to scale short-form content creation without video editing tools.\n\nCreating and distributing consistent video content requires:\n\nGenerating ideas\nWriting scripts and prompts\nRendering videos\nManually posting to platforms\n\nThis workflow automates all of that. It transforms one prompt into a professional AI-generated video and publishes it automatically — saving time and increasing reach.\n\nTriggers daily to generate a new idea with OpenAI (or your custom prompt).\nCreates a video prompt formatted specifically for Google Veo3.\nGenerates a cinematic video using the Veo3 API.\nLogs the video data into a Google Sheet.\nRetrieves the final video URL once Veo3 finishes rendering.\nUploads the video to Blotato for publishing.\nAuto-posts the video to Instagram, TikTok, YouTube, Facebook, LinkedIn, Threads, Twitter (X), Pinterest, and Bluesky.\n\nAdd your OpenAI API key to the GPT-4.1 nodes.\nConnect your Veo3 API credentials in the video generation node.\nLink your Google Sheets account and use a sheet with columns: Prompt, Video URL, Status.\nConnect your Blotato API key and set your platform IDs in the Assign Social Media IDs node.\nAdjust the Schedule Trigger to your desired posting frequency.\n\nEdit the AI prompt to align with your niche (fitness, finance, education, etc.).\nAdd your own branding overlays using JSON2Video or similar tools.\nChange platform selection by enabling/disabling specific HTTP Request nodes.\nAdd a Telegram step to preview the video before auto-posting.\nTrack performance by adding metrics columns in Google Sheets.\n\n📄 Documentation: Notion Guide\n\nContact me for consulting and support: LinkedIn / YouTube",
"isPaid": false
},
{
"templateId": "2783",
"templateName": "Online Marketing Weekly Report",
"templateDescription": "What this workflow doesThis workflow retrieves Online Marketing data (Google Analytics for several domains, Google Ads, Meta Ads) from the last 7 days and...",
"templateUrl": "https://n8n.io/workflows/2783",
"jsonFileName": "Online_Marketing_Weekly_Report.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Online_Marketing_Weekly_Report.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/6c6c7615d62b4be2a592f0ac43545126/raw/330577a4e1eaf2935ea7273c58d0e928b5717231/Online_Marketing_Weekly_Report.json",
"screenshotURL": "https://i.ibb.co/Pzw6h0y8/af74a48c73f4.png",
"workflowUpdated": true,
"gistId": "6c6c7615d62b4be2a592f0ac43545126",
"templateDescriptionFull": "This workflow retrieves Online Marketing data (Google Analytics for several domains, Google Ads, Meta Ads) from the last 7 days and from the same period in the previous year. The data is then prepared by AI as a table, analyzed and provided with a short summary.\nThe summary is then sent by email to a desired address and, shortened and summarized once more, sent to a Telegram account.\n\nThis workflow has the following sequence:\n\ntime trigger (e.g. every Monday at 7 a.m.)\nretrieval of Online Marketing data from the last 7 days (via sub-workflows)\nassignment and summary of the data\nretrieval of Online Marketing data from the same time period of the previous year\nassignment and summary of the data\npreparation in tabular form and brief analysis by AI\nsending the report as an email\npreparation in short form by AI for Telegram (optional)\nsending as a Telegram message\n\nThe following credentials are required for the workflow:\n\nGoogle Analytics (via Google Analytics API): Documentation\nGoogle Ads (via HTTP Request -> Google Ads API): Documentation\nMeta Ads (via Facebook Graph API): Documentation\nAI API access (e.g. via OpenAI, Anthropic, Google or Ollama)\nSMTP credentials (for sending the mail)\nTelegram credentials (optional, for sending as a Telegram message): Documentation\n\nYou must set up the individual sub-workflows as separate workflows, each starting with an “Execute workflow trigger” node. Then select the corresponding sub-workflow in the AI Agent Tools.\nYou can choose the number of domains yourself. If a data query is not required, simply delete the corresponding tool (e.g. “Analytics_Domain_5”).\n\nFeel free to contact me via LinkedIn, if you have any questions!",
"isPaid": false
},
{
"templateId": "5386",
"templateName": "AI-Powered Meta Ads Weekly PDF Report – Sends to your Slack or Email",
"templateDescription": "What this workflow doesRuns automatically every Monday morning at 8 AMCollects your Meta Ads data from the last 7 days for a given account (date range is...",
"templateUrl": "https://n8n.io/workflows/5386",
"jsonFileName": "AI-Powered_Meta_Ads_Weekly_PDF_Report___Sends_to_your_Slack_or_Email.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/AI-Powered_Meta_Ads_Weekly_PDF_Report___Sends_to_your_Slack_or_Email.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/6a0e8bda3df6fb4393760fbb7e616206/raw/428db481d72b3db633cef6fd1d39712934623be3/AI-Powered_Meta_Ads_Weekly_PDF_Report___Sends_to_your_Slack_or_Email.json",
"screenshotURL": "https://i.ibb.co/6cm0Fm2Q/a94ffe78de15.png",
"workflowUpdated": true,
"gistId": "6a0e8bda3df6fb4393760fbb7e616206",
"templateDescriptionFull": "Runs automatically every Monday morning at 8 AM\nCollects your Meta Ads data from the last 7 days for a given account (date range is configurable)\nFormats the data, aggregating it at the campaign, ad set, and ad levels\nGenerates AI-driven analysis and insights on your results, providing actionable recommendations\nRenders the report as a visually appealing PDF with charts and tables\nSends the report via Slack (you can also add email or WhatsApp)\n\nA sample for the first page of the report:\n\nCreate an account with pdf noodle and use the pre-made Meta Ads template.\nConnect Meta Ads, OpenAI and Slack to n8n\nSet your Ad Account Id and date range (choose from 'last_7d', 'last_14d', 'last30d')\n(optional) Customize the scheduling date and time\n\nMeta Ads (via Facebook Graph API): Documentation\npdf noodle access: Integration guide\nAI API access (e.g. via OpenAI, Anthropic, Google or Ollama)\nSlack access (via OAuth2): Documentation\n\nFeel free to contact me via LinkedIn, if you have any questions! 👋🏻",
"isPaid": false
},
{
"templateId": "4717",
"templateName": "Create Landing Page Layouts with OpenAI GPT-4.1 from Competitor Analysis",
"templateDescription": "Who is this for? This workflow is ideal for SEO specialists, web designers, and digital marketers who want to quickly draft effective landing page layouts...",
"templateUrl": "https://n8n.io/workflows/4717",
"jsonFileName": "Create_Landing_Page_Layouts_with_OpenAI_GPT-4.1_from_Competitor_Analysis.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Create_Landing_Page_Layouts_with_OpenAI_GPT-4.1_from_Competitor_Analysis.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/e5a9d5d31367027655d41f353e459eec/raw/8aa8a8f302b9e9e99c80ba8718a3087ddfae87c9/Create_Landing_Page_Layouts_with_OpenAI_GPT-4.1_from_Competitor_Analysis.json",
"screenshotURL": "https://i.ibb.co/8gz2FRd0/deb5fdeee47a.png",
"workflowUpdated": true,
"gistId": "e5a9d5d31367027655d41f353e459eec",
"templateDescriptionFull": "This workflow is ideal for SEO specialists, web designers, and digital marketers who want to quickly draft effective landing page layouts by referencing established competitors. It suits users who need a fast, structured starting point for web design while ensuring competitive relevance.\n\nDesigning a high-converting landing page from scratch can be time-consuming. This workflow automates the process of analyzing a competitor’s website, identifying essential sections, and producing a tailored layout—helping users save time and improve their website’s effectiveness.\n\nThe workflow fetches and analyzes your chosen competitor’s landing page, using web scraping and structure-detection nodes in n8n. It identifies primary sections like hero banners, service highlights, testimonials, and contact forms, and then generates a simplified, customizable layout suitable for wireframing or initial design.\n\nPrepare your unique services and target audience profile for customization later.\nGather the competitor’s landing page URL you wish to analyze.\nRun the workflow, inputting your competitor’s URL when prompted.\n\nAfter generating the initial layout, adapt section names and content blocks to highlight your services and brand messaging.\nAdd or remove sections based on your objectives and audience insights.\nIntegrate additional nodes for richer analysis, such as keyword extraction or design pattern detection, to tailor the output further.",
"isPaid": false
},
{
"templateId": "3621",
"templateName": "template_3621",
"templateDescription": "📈 Daily Crypto Market Summary Bot (Binance to Telegram) This workflow fetches 24h price change data from Binance for selected crypto pairs (BTC/USDC,...",
"templateUrl": "https://n8n.io/workflows/3621",
"jsonFileName": "template_3621.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_3621.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/f786098cfcb7c0a14226e379a82717c5/raw/1419a2f5f405e59e894309027a14b34d6ae77b23/template_3621.json",
"screenshotURL": "https://i.ibb.co/gb760J9n/773b5f83955f.png",
"workflowUpdated": true,
"gistId": "f786098cfcb7c0a14226e379a82717c5",
"templateDescriptionFull": "This workflow fetches 24h price change data from Binance for selected crypto pairs (BTC/USDC, ETH/USDC, SOL/USDC) every hour using a cron schedule.\nIt performs in-depth analysis—including volatility, volume, bid-ask spread, momentum, and market comparison—then formats a detailed market summary.\nThe final report is sent to a Telegram chat using HTML formatting, highlighting top gainers, losers, and key metrics in a clean, readable layout.\n\n⏱ Runs every hour (cron: 5 * * * *)\n🔍 Filters and analyzes major coins: BTC, ETH, SOL\n📊 Calculates market metrics:\n\nVolatility\nBid-ask spread\nMomentum\nEstimated market cap\nMarket average comparison\n\n📈 Highlights gainers, losers, and top coins by volume\n✂️ Splits messages to fit Telegram’s 4096 character limit\n💬 Sends output in rich HTML format to a Telegram group or chat\n\n✅ Crypto traders wanting hourly performance insights\n✅ Telegram groups needing automated market updates\n✅ Analysts monitoring key coin metrics in real-time\n✅ Bot developers creating crypto dashboards or alerts\n\nData Source: Binance 24hr ticker API (/api/v3/ticker/24hr)\nCoins Monitored: BTCUSDC, ETHUSDC, SOLUSDC (can be expanded)\nMetrics Calculated:\n\nPrice change percentage\nVolatility (high vs low price)\nBid-ask spread %\nMomentum (vs weighted average)\nEstimated market cap\nNumber of trades\nMarket average movement\n\nMessage Format:\n\nHTML with emojis, bold styling, and section headings\nAuto-split messages when exceeding Telegram's 4096-char limit\n\nError Handling:\n\nRetry on HTTP failure (up to 5 times with 5s delay)\nMessage length checked and split for Telegram compatibility\n\nTelegram Bot Token — Create a bot via @BotFather on Telegram\nChat ID — Use a personal ID or group chat ID (add the bot to the group)\nn8n Instance — Either cloud or self-hosted\n(Optional) Modify relevantSymbols in the Function node to track different coins\n\nThis workflow is highly customizable—feel free to modify the analytics, tracked pairs, or formatting.\nGreat base for alerting systems or crypto dashboards.",
"isPaid": false
},
{
"templateId": "4741",
"templateName": "Binance SM Financial Analyst Tool",
"templateDescription": "This workflow powers the Binance Spot Market Quant AI Agent, acting as the Financial Market Analyst. It fuses real-time market structure data (price,...",
"templateUrl": "https://n8n.io/workflows/4741",
"jsonFileName": "Binance_SM_Financial_Analyst_Tool.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Binance_SM_Financial_Analyst_Tool.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/64a8a291845d3e6f7ee3881ae73df79b/raw/9778e76bf8661578a50474b6d639019c0780193b/Binance_SM_Financial_Analyst_Tool.json",
"screenshotURL": "https://i.ibb.co/0jc7LCpJ/2834ba3891d1.png",
"workflowUpdated": true,
"gistId": "64a8a291845d3e6f7ee3881ae73df79b",
"templateDescriptionFull": "This workflow powers the Binance Spot Market Quant AI Agent, acting as the Financial Market Analyst. It fuses real-time market structure data (price, volume, kline) with multiple timeframe technical indicators (15m, 1h, 4h, 1d) and returns a structured trading outlook—perfect for intraday and swing traders who want actionable analysis in Telegram.\n\n🔗 Requires the following sub-workflows to function:\n• Binance SM 15min Indicators Tool\n• Binance SM 1hour Indicators Tool\n• Binance SM 4hour Indicators Tool\n• Binance SM 1day Indicators Tool\n• Binance SM Price/24hStats/Kline Tool\n\nTriggered via webhook (typically by the Quant AI Agent).\nExtracts user symbol + timeframe from input (e.g., \"DOGE outlook today\").\nCalls all linked sub-workflows to retrieve indicators + live price data.\nMerges the data and formats a clean trading report using GPT-4o-mini.\nReturns HTML-formatted message suitable for Telegram delivery.\n\n🎥 Watch Tutorial:\n\n© 2025 Treasurium Capital Limited Company\nArchitecture, prompts, and trade report structure are IP-protected.\nNo unauthorized rebranding or redistribution permitted.\n\n🔗 For support: LinkedIn – Don Jayamaha",
"isPaid": false
},
{
"templateId": "5157",
"templateName": "Perplexity Powered AI News Search",
"templateDescription": "This workflow contains community nodes that are only compatible with the self-hosted version of n8n. 🧠 Perplexity-Powered Daily AI News Digest (via...",
"templateUrl": "https://n8n.io/workflows/5157",
"jsonFileName": "Perplexity_Powered_AI_News_Search.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Perplexity_Powered_AI_News_Search.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/f42ed7e402653f27c0c89559eeea171e/raw/5fcb983b4eabe8c302a8b1b420f7641cfabe7ad6/Perplexity_Powered_AI_News_Search.json",
"screenshotURL": "https://i.ibb.co/997QvtnS/c8f1f9afa50a.png",
"workflowUpdated": true,
"gistId": "f42ed7e402653f27c0c89559eeea171e",
"templateDescriptionFull": "This workflow contains community nodes that are only compatible with the self-hosted version of n8n.\n\n🧠 Perplexity-Powered Daily AI News Digest (via Telegram)\n\nThis ready-to-deploy n8n workflow automates the entire process of collecting, filtering, formatting, and distributing daily AI industry news summaries directly to your Telegram group or channel.\n\nPowered by Perplexity and OpenAI, it fetches only high-signal AI updates from trusted sources (e.g. OpenAI, DeepMind, HuggingFace, MIT Tech Review), filters out duplicates based on a Google Sheet archive, and delivers beautifully formatted news directly to your team — every morning at 10AM.\n\nFor more such build and step-by-step tutorials, check out:\nhttps://www.youtube.com/@Automatewithmarc\n\n🚀 Key Features:\nPerplexity AI Integration: Automatically fetches the most relevant AI developments from the last 24 hours.\n\nAI Formatter Agent: Cleans the raw feed, removes duplicates, adds summaries, and ensures human-friendly formatting.\n\nGoogle Sheets Log: Tracks previously reported news items to avoid repetition.\n\nTelegram Delivery: Sends a polished daily digest straight to your chat, ready for immediate team consumption.\n\nCustomizable Scheduling: Preconfigured for daily use, but can be modified to fit your team's preferred cadence.\n\n💼 Ideal For:\nAnyone who wants to stay ahead of fast-moving AI trends with zero manual effort\n\n🛠️ Tech Stack:\nPerplexity AI\n\nOpenAI (GPT-4 or equivalent)\n\nGoogle Sheets\n\nTelegram API\n\n✅ Setup Notes:\nYou’ll need to connect your own OpenAI, Perplexity, Google Sheets, and Telegram credentials.\n\nReplace the Google Sheet ID and Telegram channel settings with your own.",
"isPaid": false
},
{
"templateId": "5202",
"templateName": "AI-Powered Blog Post Generator",
"templateDescription": "This workflow contains community nodes that are only compatible with the self-hosted version of n8n. 🧠 AI-Powered Blog Post GeneratorCategory: Content...",
"templateUrl": "https://n8n.io/workflows/5202",
"jsonFileName": "AI-Powered_Blog_Post_Generator.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/AI-Powered_Blog_Post_Generator.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/c6a480dc4340190b1aba0bd892b974b2/raw/3b6c0cc46fbccd41578c73e24a0aae8f5ec4623c/AI-Powered_Blog_Post_Generator.json",
"screenshotURL": "https://i.ibb.co/nM2D4bbY/97cd57921761.png",
"workflowUpdated": true,
"gistId": "c6a480dc4340190b1aba0bd892b974b2",
"templateDescriptionFull": "This workflow contains community nodes that are only compatible with the self-hosted version of n8n.\n\n🧠 AI-Powered Blog Post Generator\nCategory: Content Automation / AI Writing / Marketing\n\nDescription:\nThis automated workflow helps you generate fresh, SEO-optimized blog posts daily using AI tools—perfect for solo creators, marketers, and content teams looking to stay on top of the latest AI trends without manual research or writing.\n\nFor more of such builds and step-by-step Tutorial Guides, check out:\nhttps://www.youtube.com/@Automatewithmarc\n\nHere’s how it works:\n\nSchedule Trigger kicks off the workflow daily (or at your preferred interval).\n\nPerplexity AI Node researches the most interesting recent AI news tailored for a non-technical audience.\n\nAI Agent (Claude via Anthropic) turns that news into a full-length blog post based on a structured prompt that includes title, intro, 3+ section headers, takeaway, and meta description—designed for clarity, engagement, and SEO.\n\nOptional Memory & Perplexity Tool Nodes enhance the agent's responses by allowing it to clarify facts or fetch more context.\n\nGoogle Docs Node automatically saves the final blog post to your selected document—ready for review, scheduling, or publishing.\n\nKey Features:\n\nCombines Perplexity AI + Claude AI (Anthropic) for research + writing\n\nBuilt-in memory and retrieval logic for deeper contextual accuracy\n\nNon-technical, friendly writing style ideal for general audiences\n\nOutput saved directly to Google Docs\n\nFully no-code, customizable, and extendable\n\nUse Cases:\n\nAutomate weekly blog content for your newsletter or site\n\nRepurpose content into social posts or scripts\n\nKeep your brand relevant in the fast-moving AI landscape\n\nSetup Requirements:\n\nPerplexity API Key\n\nAnthropic API Key\n\nGoogle Docs (OAuth2 connected)",
"isPaid": false
},
{
"templateId": "4412",
"templateName": "AI Daily News",
"templateDescription": "Overview This automated workflow delivers a weekly digest of the most important AI news directly to your inbox. Every Monday at 9 AM, it uses Perplexity AI...",
"templateUrl": "https://n8n.io/workflows/4412",
"jsonFileName": "AI_Daily_News.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/AI_Daily_News.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/793a4206f6fa6a4de41cd05aa8292e78/raw/370e09da4caf9bc4aa9650b9ab20e0aa63038b9a/AI_Daily_News.json",
"screenshotURL": "https://i.ibb.co/8nQGwTSG/5aa876d11e60.png",
"workflowUpdated": true,
"gistId": "793a4206f6fa6a4de41cd05aa8292e78",
"templateDescriptionFull": "This automated workflow delivers a weekly digest of the most important AI news directly to your inbox. Every Monday at 9 AM, it uses Perplexity AI to research the latest developments and organizes them into four key categories: New Technology, Trending Topics, Top Stories, and AI Security. The workflow then formats this information into a beautifully designed HTML email with summaries, significance explanations, and source links.\n\nAutomatically searches for the latest AI news using Perplexity AI\nCategorizes content into four focused areas most relevant to AI enthusiasts and professionals\nGenerates comprehensive summaries explaining why each story matters\nCreates a professional HTML email with styled sections and clickable links\nSends weekly on Monday at 9 AM (customizable schedule)\nIncludes error handling with fallback content if news parsing fails\n\nCopy the JSON code and import it into your n8n instance\nThe workflow will appear as “Daily AI News Summary”\n\nSign up for a Perplexity API account at perplexity.ai\nCreate new credentials in n8n:\n\nType: “OpenAI”\nName: “perplexity-credentials”\nAPI Key: Your Perplexity API key\nBase URL: https://api.perplexity.ai\n\nConfigure SMTP credentials in n8n:\n\nName: “email-credentials”\nAdd your email provider’s SMTP settings\nTest the connection to ensure emails can be sent\n\nOpen the “Send Email Summary” node\nUpdate the toEmail field with your email address\nModify the fromEmail if needed (must match your SMTP credentials)\n\nChange Schedule: Modify the “Daily Trigger” node to run at your preferred time\nAdjust Categories: Edit the Perplexity prompt to focus on different AI topics or change the theme altogether\nModify Styling: Update the HTML template in the “Format Email Content” node\n\nRun a test execution to ensure everything works correctly\nActivate the workflow to start receiving weekly AI news summaries\n\nn8n instance (cloud or self-hosted)\nPerplexity API account and key\nSMTP email access (Gmail, Outlook, etc.)",
"isPaid": false
},
{
"templateId": "3672",
"templateName": "SEO Blog Generator with GPT-4o, Perplexity, and Telegram Integration",
"templateDescription": "SEO Blog Generator with GPT-4o, Perplexity, and Telegram Integration This workflow helps you automatically generate SEO-optimized blog posts using...",
"templateUrl": "https://n8n.io/workflows/3672",
"jsonFileName": "SEO_Blog_Generator_with_GPT-4o_Perplexity_and_Telegram_Integration.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/SEO_Blog_Generator_with_GPT-4o_Perplexity_and_Telegram_Integration.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/2ba591cbe42b0f4c0e10df543f61ff57/raw/e9f0c827d0c08a991dcee4c4261e0b08e3b36f00/SEO_Blog_Generator_with_GPT-4o_Perplexity_and_Telegram_Integration.json",
"screenshotURL": "https://i.ibb.co/b5ZsCgth/e8437c9d984e.png",
"workflowUpdated": true,
"gistId": "2ba591cbe42b0f4c0e10df543f61ff57",
"templateDescriptionFull": "This workflow helps you automatically generate SEO-optimized blog posts using Perplexity.ai, OpenAI GPT-4o, and optionally Telegram for interaction.\n\n🧠 Topic research via Perplexity sub-workflow\n✍️ AI-written blog post generated with GPT-4o\n📊 Structured output with metadata: title, slug, meta description\n📩 Integration with Telegram to trigger workflows or receive outputs (optional)\n\n✅ OpenAI API Key (GPT-4o or GPT-3.5)\n✅ Perplexity API Key (with access to /chat/completions)\n✅ (Optional) Telegram Bot Token and webhook setup\n\nCredentials:\n\nAdd your OpenAI credentials (openAiApi)\nAdd your Perplexity credentials under httpHeaderAuth\nOptional: Set up Telegram credentials under telegramApi\n\nInputs:\n\nUse the Form Trigger or Telegram input node to send a Research Query\n\nSubworkflow:\n\nMake sure to import and activate the subworkflow Perplexity_Searcher to fetch recent search results\n\nCustomization:\n\nEdit prompt texts inside the Blog Content Generator and Metadata Generator to change writing style or target industry\nAdd or remove output nodes like Google Sheets, Notion, etc.\n\nThe final blog post includes:\n\n✅ Blog content (1500-2000 words)\n✅ Metadata: title, slug, and meta description\n✅ Extracted summary in JSON\n✅ Delivered to Telegram (if connected)\n\nNeed help? Reach out on the n8n community forum",
"isPaid": false
},
{
"templateId": "3107",
"templateName": "template_3107",
"templateDescription": "Startup Funding Research Automation with Claude, Perplexity AI, and Airtable How it worksThis intelligent workflow automatically discovers and analyzes...",
"templateUrl": "https://n8n.io/workflows/3107",
"jsonFileName": "template_3107.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_3107.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/b1eef214866d78e3712a4e5fb9f4ea35/raw/abd0ce72a5a6d4746a8bb5ff6cb4ba4016ba4d03/template_3107.json",
"screenshotURL": "https://i.ibb.co/GvBdG4DN/f979c7f58a85.png",
"workflowUpdated": true,
"gistId": "b1eef214866d78e3712a4e5fb9f4ea35",
"templateDescriptionFull": "This intelligent workflow automatically discovers and analyzes recently funded startups by:\n\nMonitoring multiple news sources (TechCrunch and VentureBeat) for funding announcements\nUsing AI to extract key funding details (company name, amount raised, investors)\nConducting automated deep research on each company through Perplexity deep research or Jina deep search\nOrganizing all findings into a structured Airtable database for easy access and analysis\n\nConnect your news feed sources (TechCrunch and VentureBeat). More sources could be added; these two were chosen because they are easy to scrape, and this data can otherwise be expensive.\nSet up your AI service credentials (Claude, plus Perplexity or Jina, which has a generous free tier)\nConnect your Airtable account and create a base with appropriate fields (can be imported from my base, or see the structure below).\nAirtable Base\n\nI found that by using Perplexity via OpenRouter, we lose access to the sources, as they are not stored in the same location as the report itself, so I opted to use the Perplexity API via the HTTP node.\n\nTo use Perplexity and/or Jina, you have to configure header auth as described in Header Auth - n8n Docs\n\nHow to scrape data using sitemaps\nHow to extract structured data from unstructured text\nHow to execute parts of the workflow as a subworkflow\nHow to use deep research in a practical scenario\nHow to define more complex JSON schemas",
"isPaid": false
},
{
"templateId": "3534",
"templateName": "Search & Summarize Web Data with Perplexity, Gemini AI & Bright Data to Webhooks",
"templateDescription": "Who this is for?This workflow is designed for professionals and teams who need real-time, structured insights from Perplexity Search results without manual...",
"templateUrl": "https://n8n.io/workflows/3534",
"jsonFileName": "Search__Summarize_Web_Data_with_Perplexity_Gemini_AI__Bright_Data_to_Webhooks.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Search__Summarize_Web_Data_with_Perplexity_Gemini_AI__Bright_Data_to_Webhooks.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/cfb918371ddf9b673cf5f05498d1b338/raw/2420fb67091362da5b50447263804174a8a86e3d/Search__Summarize_Web_Data_with_Perplexity_Gemini_AI__Bright_Data_to_Webhooks.json",
"screenshotURL": "https://i.ibb.co/7xLpYvKy/73675b717a7b.png",
"workflowUpdated": true,
"gistId": "cfb918371ddf9b673cf5f05498d1b338",
"templateDescriptionFull": "This workflow is designed for professionals and teams who need real-time, structured insights from Perplexity Search results without manual effort.\n\nThis n8n workflow solves the problem of automating Perplexity Search result extraction, cleanup, summarization, and AI-enhanced formatting for downstream use like sending the results to a webhook or another system.\n\nAutomates Perplexity Search via Bright Data\n\nUses Bright Data’s proxy-based SERP API to run a Google Search query programmatically.\nMakes the process repeatable and scriptable with different search terms and regions/zones.\n\nCleans and Extracts Useful Content\n\nThe Readable Data Extractor uses LLM-based cleaning to remove HTML/CSS/JS from the response and extract pure text data.\nConverts messy, unstructured web content into structured, machine-readable format.\n\nSummarizes Search Results\nThrough the Gemini Flash + Summarization Chain, it generates a concise summary of the search results. Ideal for users who don’t have time to read full pages of search results.\nFormats Data Using AI Agent\nThe AI Agent acts like a virtual assistant that: - Understands search results\n\nFormats them in a readable, JSON-compatible form\nPrepares them for webhook delivery\n\nDelivers Results to Webhook\nSends the final summary + structured search result to a webhook (could be your app, a Slack bot, Google Sheets, or CRM).\n\nSign up at Bright Data.\nNavigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.\nIn n8n, configure the Header Auth account under Credentials (Generic Auth Type: Header Authentication).\n\nThe Value field should be set with the\nBearer XXXXXXXXXXXXXX. 
The XXXXXXXXXXXXXX should be replaced by the Web Unlocker token.\nIn n8n, configure the Google Gemini (PaLM) API account with the Google Gemini API key (or access through Vertex AI or a proxy).\nUpdate the Perplexity Search Request node with the prompt for the search you wish to perform.\nUpdate the Webhook HTTP Request node with the webhook endpoint of your choice.\n\n1. Change the Perplexity Search Input\n\nDefault: It searches a fixed query or dataset.\n\nCustomize:\n\nAccept input from a Google Sheet, Airtable, or a form.\nAuto-trigger searches based on keywords or schedules.\n\n2. Customize Summarization Style (LLM Output)\n\nDefault: General summary using Google Gemini or OpenAI.\n\nCustomize:\n\nAdd tone: formal, casual, technical, executive-summary, etc.\nFocus on specific sections: pricing, competitors, FAQs, etc.\nTranslate the summaries into multiple languages.\nAdd bullet points, pros/cons, or insight tags.\n\n3. Choose Where the Results Go\n\nOptions:\n\nEmail, Slack, Notion, Airtable, Google Docs, or a dashboard.\nAuto-create content drafts for WordPress or newsletters.\nFeed into CRM notes or attach to Salesforce leads.",
"isPaid": false
},
{
"templateId": "3673",
"templateName": "Perplexity Researcher",
"templateDescription": "Name:AI-Powered Research Agent using Perplexity Sonar Description:This workflow acts as an AI-powered research assistant using the Perplexity Sonar model....",
"templateUrl": "https://n8n.io/workflows/3673",
"jsonFileName": "Perplexity_Researcher.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Perplexity_Researcher.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/fdb68ad3da836f3124de750df504b635/raw/f8b9890e08286987dc15fab70c7de23cc1b563c0/Perplexity_Researcher.json",
"screenshotURL": "https://i.ibb.co/8ngNQdx5/1637e40de4bf.png",
"workflowUpdated": true,
"gistId": "fdb68ad3da836f3124de750df504b635",
"templateDescriptionFull": "Name:\nAI-Powered Research Agent using Perplexity Sonar\n\nDescription:\nThis workflow acts as an AI-powered research assistant using the Perplexity Sonar model. When triggered by another workflow, it sends a user-defined prompt to the Perplexity API to retrieve up-to-date search results. The response is then parsed into a clean format for downstream processing.\n\nHow it Works:\nTrigger: Activated from another workflow via Execute Workflow Trigger.\n\nPrompt Setup: Sets a system role message and user query dynamically.\n\nAPI Call: Sends a POST request to Perplexity's /chat/completions endpoint with your credentials.\n\nResponse Handling: Extracts the message content from the API response.\n\nOutput: Returns the result, ready for display or further processing.\n\nRequirements:\nA Perplexity AI API Key\n\nSet up authentication via Header Auth with Bearer token\n\nEnsure your account allows outbound HTTP requests in n8n\n\nCustomization Tips:\nModify the system prompt to suit your research domain\n\nChain this workflow with other automation like blog creation, summaries, etc.\n\nReplace the output handling logic to fit into Google Sheets, Notion, or Telegram",
"isPaid": false
},
{
"templateId": "2824",
"templateName": "template_2824",
"templateDescription": "This workflow illustrates how to use Perplexity AI in your n8n workflow. Perplexity is a free AI-powered answer engine that provides accurate, trusted, and...",
"templateUrl": "https://n8n.io/workflows/2824",
"jsonFileName": "template_2824.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_2824.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/38eebff18c7f3d7be4b7167733d72c73/raw/1c4051a5038313f695cc4ff6f5125e6662c43c90/template_2824.json",
"screenshotURL": "https://i.ibb.co/d07hjw65/2e94caee0ff1.png",
"workflowUpdated": true,
"gistId": "38eebff18c7f3d7be4b7167733d72c73",
"templateDescriptionFull": "This workflow illustrates how to use Perplexity AI in your n8n workflow.\n\nPerplexity is a free AI-powered answer engine that provides accurate, trusted, and real-time answers to any question.\n\n1/ Go to the perplexity dashboard, purchase some credits and create an API Key\n\nhttps://www.perplexity.ai/settings/api\n\n2/ In the perplexity Request node, use Generic Credentials, Header Auth.\n\nFor the name, use the value \"Authorization\"\nAnd for the value \"Bearer pplx-e4...59ea\" (Your Perplexity Api Key)\n\nSonar Pro is the current top model used by perplexity.\nIf you want to use a different one, check this page:\n\nhttps://docs.perplexity.ai/guides/model-cards",
"isPaid": false
},
{
"templateId": "4509",
"templateName": "Firecrawl Extract - Quiver Q",
"templateDescription": "📬 What This Workflow DoesThis workflow automatically scrapes recent high-value congressional stock trades from Quiver Quantitative, summarizes the key...",
"templateUrl": "https://n8n.io/workflows/4509",
"jsonFileName": "Firecrawl_Extract_-_Quiver_Q.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Firecrawl_Extract_-_Quiver_Q.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/5222c038b422c6efbe32bbab1047bbc7/raw/9e4b204c8014fb892a32e805773ca461a9e27b40/Firecrawl_Extract_-_Quiver_Q.json",
"screenshotURL": "https://i.ibb.co/chNP9XZH/9ec6594d7a41.png",
"workflowUpdated": true,
"gistId": "5222c038b422c6efbe32bbab1047bbc7",
"templateDescriptionFull": "📬 What This Workflow Does\nThis workflow automatically scrapes recent high-value congressional stock trades from Quiver Quantitative, summarizes the key transactions, and delivers a neatly formatted report to your inbox — every single day.\n\nIt combines Firecrawl's powerful content extraction, OpenAI's GPT formatting, and n8n's automation engine to turn raw HTML data into a digestible, human-readable email.\n\nWatch Full Tutorial on how to build this workflow here:\nhttps://www.youtube.com/watch?v=HChQSYsWbGo&t=947s&pp=0gcJCb4JAYcqIYzv\n\n🔧 How It Works\n🕒 Schedule Trigger\nFires daily at a set hour (e.g., 6 PM) to begin the data pipeline.\n\n🔥 Firecrawl Extract API (POST)\nTargets the Quiver Quantitative “Congress Trading” page and sends a structured prompt asking for all trades over $50K in the past month.\n\n⏳ Wait Node\nAllows time for Firecrawl to finish processing before retrieving results.\n\n📥 Firecrawl Get Result API (GET)\nRetrieves the extracted and structured data.\n\n🧠 OpenAI Chat Model (GPT-4o)\nFormats the raw trading data into a readable summary that includes:\n\nDate of Transaction\n\nStock/Asset traded\n\nAmount\n\nCongress member’s name and political party\n\n📧 Gmail Node\nSends the summary to your inbox with the subject “Congress Trade Updates - QQ”.\n\n🧠 Why This is Useful\nCongressional trading activity often reveals valuable signals — especially when high-value trades are made.\nThis workflow:\n\nSaves time manually tracking Quiver Quant updates\n\nConverts complex tables into a daily, readable email\n\nKeeps investors, researchers, and newsrooms in the loop — hands-free\n\n🛠 Requirements\nFirecrawl API Key (with extract access)\n\nOpenAI API Key\n\nGmail OAuth2 credentials\n\nn8n (self-hosted or cloud)\n\n💬 Sample Output:\nCongress Trade Summary – May 21\n\nNancy Pelosi (D) sold TSLA for $85,000 on April 28\n\nJohn Raynor (R) purchased AAPL worth $120,000 on May 2\n... 
and more\n\n🪜 Setup Steps\nAdd your Firecrawl, OpenAI, and Gmail credentials in n8n.\n\nAdjust the schedule node to your desired time.\n\nCustomize the OpenAI system prompt if you want a different summary style.\n\nDeploy the workflow — and enjoy your daily edge.",
"isPaid": false
},
{
"templateId": "3101",
"templateName": "spy tool",
"templateDescription": "Who is this template for?This workflow template is designed for people seeking alerts when certain specific changes are made to any web page. Leveraging...",
"templateUrl": "https://n8n.io/workflows/3101",
"jsonFileName": "spy_tool.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/spy_tool.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/e91737b8120456861a7a65bd8a56bb3b/raw/efc46113f724ea6aa04f5fb5fc688dfd3a615df1/spy_tool.json",
"screenshotURL": "https://i.ibb.co/HTC9mn13/2572ea10a456.png",
"workflowUpdated": true,
"gistId": "e91737b8120456861a7a65bd8a56bb3b",
"templateDescriptionFull": "This workflow template is designed for people seeking alerts when certain specific changes are made to any web page. Leveraging agentic AI, it analyzes the page every day and autonomously decides whether to send you an e-mail notification.\n\nTrack price changes on [competitor's website]. Notify me when the price drops below €50.\nMonitor new blog posts on [industry leader's website] and summarize key insights.\nCheck [competitor's job page] for new job postings related to software development.\nWatch for new product launches on [e-commerce site] and send me a summary.\nDetect any changes in the terms and conditions of [specific website].\nTrack customer reviews for [specific product] on [review site] and extract key themes.\n\nWhen clicking 'test workflow' in the editor, a new browser tab will open where you can fill in the details of your espionage assignment\nMake sure you be as concise as possible when instructing AI. Instruct specific and to the point (see examples at the bottom).\nAfter submission, the flow will start off by extracting both the relevant website url and an optimized prompt. OpenAI's structured outputs is utilized, followed by a code node to parse the results for further use.\nFrom here on, the endless loop of daily checks will begin:\n\nInitial scrape\n1 day delay\nSecond scrape\nAI agent decides whether or not to notify you\nBack to step 1\n\nYou can cancel an espionage assignment at any time in the executions tab\n\nInsert your OpenAI API key in the structured outputs node (second one)\nCreate a Firecrawl account and connect your Firecrawl API key in both 'Scrape page'-nodes\nConnect your OpenAI account in the AI agents' model node\nConnect your Gmail account in the AI agents' Gmail tool node",
"isPaid": false
},
{
"templateId": "2328",
"templateName": "template_2328",
"templateDescription": "This n8n workflow demonstrates how you can summarise and automate post-meeting actions from video transcripts fed into an AI Agent. Save time between...",
"templateUrl": "https://n8n.io/workflows/2328",
"jsonFileName": "template_2328.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_2328.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/3ca0f3e2aebe2d6ccc58736ebf77a325/raw/e6b1654da65e2ccbe606d53fa71896b9a323bc29/template_2328.json",
"screenshotURL": "https://i.ibb.co/8ngNQdx5/1637e40de4bf.png",
"workflowUpdated": true,
"gistId": "3ca0f3e2aebe2d6ccc58736ebf77a325",
"templateDescriptionFull": "This n8n workflow demonstrates how you can summarise and automate post-meeting actions from video transcripts fed into an AI Agent.\n\nSave time between meetings by allowing AI handle the chores of organising follow-up meetings and invites.\n\nThis workflow scans for the calendar for client or team meetings which were held online. * Attempts will be made to fetch any recorded transcripts which are then sent to the AI agent.\nThe AI agent summarises and identifies if any follow-on meetings are required.\nIf found, the Agent will use its Calendar Tool to to create the event for the time, date and place for the next meeting as well as add known attendees.\n\nGoogle Calendar and the ability to fetch Meeting Transcripts (There is a special OAuth permission for this action!)\nOpenAI account for access to the LLM.\n\nThis example only books follow-on meetings but could be extended to generate reports or send emails.",
"isPaid": false
},
{
"templateId": "2683",
"templateName": "template_2683",
"templateDescription": "Video Guide I prepared a comprehensive guide detailing how to create a Smart Agent that automates meeting task management by analyzing transcripts,...",
"templateUrl": "https://n8n.io/workflows/2683",
"jsonFileName": "template_2683.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_2683.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/c682c89bf3418a9c841f37fc21cc5f70/raw/13ec24fad585684cff651278b257d3990607b4d0/template_2683.json",
"screenshotURL": "https://i.ibb.co/1twjzC5c/8115220a3367.png",
"workflowUpdated": true,
"gistId": "c682c89bf3418a9c841f37fc21cc5f70",
"templateDescriptionFull": "I prepared a comprehensive guide detailing how to create a Smart Agent that automates meeting task management by analyzing transcripts, generating tasks in Airtable, and scheduling follow-ups when necessary.\n\n\n\nYoutube Link\n\nThis workflow is ideal for project managers, team leaders, and business owners looking to enhance productivity during meetings. It is particularly helpful for those who need to convert discussions into actionable items swiftly and effectively.\n\nManaging action items from meetings can often lead to missed tasks and poor follow-up. This automation alleviates that issue by automatically generating tasks from meeting transcripts, keeping everyone informed about their responsibilities and streamlining communication.\n\nThe workflow leverages n8n to create a Smart Agent that listens for completed meeting transcripts, processes them using AI, and generates tasks in Airtable. Key functionalities include:\n\nCapturing completed meeting events through webhooks.\nExtracting relevant meeting details such as transcripts and participants using API calls.\nGenerating structured tasks from meeting discussions and sending notifications to clients.\n\nWebhook Integration: Listens for meeting completion events to trigger subsequent actions.\nAPI Requests for Data: Pulls necessary details like transcripts and participant information from Fireflies.\nTask and Notification Generation: Automatically creates tasks in Airtable and notifies clients of their responsibilities.\n\nConfigure the Webhook:\n\nSet up a webhook to capture meeting completion events and integrate it with Fireflies.\nSet up a webhook to capture meeting completion events and integrate it with Fireflies.\nRetrieve Meeting Content:\n\nUse GraphQL API requests to extract meeting details and transcripts, ensuring appropriate authentication through Bearer tokens.\nUse GraphQL API requests to extract meeting details and transcripts, ensuring appropriate authentication 
through Bearer tokens.\nAI Processing Setup:\n\nDefine system messages for AI tasks and configure connections to the AI chat model (e.g., OpenAI's GPT) to process transcripts.\nDefine system messages for AI tasks and configure connections to the AI chat model (e.g., OpenAI's GPT) to process transcripts.\nTask Creation Logic:\n\nCreate structured tasks based on AI output, ensuring necessary details are captured and records are created in Airtable.\nCreate structured tasks based on AI output, ensuring necessary details are captured and records are created in Airtable.\nClient Notifications:\n\nUse an email node to notify clients about their tasks, ensuring communications are client-specific.\nUse an email node to notify clients about their tasks, ensuring communications are client-specific.\nScheduling Follow-Up Calls:\n\nSet up Google Calendar events if follow-up meetings are required, populating details from the original meeting context.\nSet up Google Calendar events if follow-up meetings are required, populating details from the original meeting context.",
"isPaid": false
},
{
"templateId": "2752",
"templateName": "HR & IT Helpdesk Chatbot with Audio Transcription",
"templateDescription": "An intelligent chatbot that assists employees by answering common HR or IT questions, supporting both text and audio messages. This unique feature ensures...",
"templateUrl": "https://n8n.io/workflows/2752",
"jsonFileName": "HR__IT_Helpdesk_Chatbot_with_Audio_Transcription.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/HR__IT_Helpdesk_Chatbot_with_Audio_Transcription.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/b1f317216b2488aaa0d30f42cd014d90/raw/9669234ff6373d65fefa3d9013c3efce6a380296/HR__IT_Helpdesk_Chatbot_with_Audio_Transcription.json",
"screenshotURL": "https://i.ibb.co/zWhvVb56/e81a47cb371c.png",
"workflowUpdated": true,
"gistId": "b1f317216b2488aaa0d30f42cd014d90",
"templateDescriptionFull": "An intelligent chatbot that assists employees by answering common HR or IT questions, supporting both text and audio messages. This unique feature ensures employees can conveniently ask questions via voice messages, which are transcribed and processed just like text queries.\n\nMessage Capture: When an employee sends a message to the chatbot in WhatsApp or Telegram (text or audio), the chatbot captures the input.\nAudio Transcription: For audio messages, the chatbot transcribes the content into text using an AI-powered transcription service (e.g., Whisper, Google Cloud Speech-to-Text).\nQuery Processing:\n\nThe transcribed text (or directly entered text) is sent to an AI service (e.g., OpenAI) to generate embeddings.\nThese embeddings are used to search a vector database (e.g., Supabase or Qdrant) containing the company’s internal HR and IT documentation.\nThe most relevant data is retrieved and sent back to the AI service to compose a concise and helpful response.\nThe transcribed text (or directly entered text) is sent to an AI service (e.g., OpenAI) to generate embeddings.\nThese embeddings are used to search a vector database (e.g., Supabase or Qdrant) containing the company’s internal HR and IT documentation.\nThe most relevant data is retrieved and sent back to the AI service to compose a concise and helpful response.\nResponse Delivery: The chatbot sends the final response back to the employee, whether the input was text or audio.\n\nEstimated Time: 20–25 minutes\nPrerequisites:\n\nCreate an account with an AI provider (e.g., OpenAI).\nConnect WhatsApp or Telegram credentials in n8n.\nSet up a transcription service (e.g., Whisper or Google Cloud Speech-to-Text).\nConfigure a vector database (e.g., Supabase or Qdrant) and add your internal HR and IT documentation.\nImport the workflow template into n8n and update environment variables for your credentials.\nCreate an account with an AI provider (e.g., OpenAI).\nConnect WhatsApp or 
Telegram credentials in n8n.\nSet up a transcription service (e.g., Whisper or Google Cloud Speech-to-Text).\nConfigure a vector database (e.g., Supabase or Qdrant) and add your internal HR and IT documentation.\nImport the workflow template into n8n and update environment variables for your credentials.",
"isPaid": false
},
{
"templateId": "2547",
"templateName": "template_2547",
"templateDescription": "Video Guide I prepared a detailed guide that showed the whole process of building a call analyzer. OPENAI .png) Who is this for?This workflow is ideal for...",
"templateUrl": "https://n8n.io/workflows/2547",
"jsonFileName": "template_2547.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_2547.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/4c7d5e4e91bd673385bbbb7d52fd3f8b/raw/e8c37c46bc6bba007772ae0652535597a61ff96f/template_2547.json",
"screenshotURL": "https://i.ibb.co/HTC9mn13/2572ea10a456.png",
"workflowUpdated": true,
"gistId": "4c7d5e4e91bd673385bbbb7d52fd3f8b",
"templateDescriptionFull": "I prepared a detailed guide that showed the whole process of building a call analyzer.\n\n\n\nThis workflow is ideal for sales teams, customer support managers, and online education services that conduct follow-up calls with clients. It’s designed for those who want to leverage AI to gain deeper insights into client needs and upsell opportunities from recorded calls.\n\nMany follow-up sales calls lack structured analysis, making it challenging to identify client needs, gauge interest levels, or uncover upsell opportunities. This workflow enables automated call transcription and AI-driven analysis to generate actionable insights, helping teams improve sales performance, refine client communication, and streamline upselling strategies.\n\nThis workflow transcribes and analyzes sales calls using AssemblyAI, OpenAI, and Supabase to store structured data. The workflow processes recorded calls as follows:\n\nTranscribe Call with AssemblyAI: Converts audio into text with speaker labels for clarity.\nAnalyze Transcription with OpenAI: Using a predefined JSON schema, OpenAI analyzes the transcription to extract metrics like client intent, interest score, upsell opportunities, and more.\nStore and Access Results in Supabase: Stores both transcription and analysis data in a Supabase database for further use and display in interfaces.\n\nCreate Accounts: Set up accounts for N8N, Supabase, AssemblyAI, and OpenAI.\nGet Call Link: Upload audio files to public Supabase storage or Dropbox to generate a direct link for transcription.\nPrepare Artifacts for OpenAI:\n\nDefine Metrics: Identify business metrics you want to track from call analysis, such as client needs, interest score, and upsell potential.\nGenerate JSON Schema: Use GPT to design a JSON schema for structuring OpenAI’s responses, enabling efficient storage, analysis, and display.\nCreate Analysis Prompt: Write a detailed prompt for GPT to analyze calls based on your metrics and JSON 
schema.\nDefine Metrics: Identify business metrics you want to track from call analysis, such as client needs, interest score, and upsell potential.\nGenerate JSON Schema: Use GPT to design a JSON schema for structuring OpenAI’s responses, enabling efficient storage, analysis, and display.\nCreate Analysis Prompt: Write a detailed prompt for GPT to analyze calls based on your metrics and JSON schema.\n\nSet Up Request:\n\nHeader Authentication: Set Authorization with AssemblyAI API key.\nURL: POST to https://api.assemblyai.com/v2/transcript/.\nParameters:\n\naudio_url: Direct URL of the audio file.\nwebhook_url: URL for an N8N webhook to receive the transcription result.\nAdditional Settings:\n\nspeaker_labels (true/false): Enables speaker diarization.\nspeakers_expected: Specify expected number of speakers.\nlanguage_code: Set language (default: en_us).\nHeader Authentication: Set Authorization with AssemblyAI API key.\nURL: POST to https://api.assemblyai.com/v2/transcript/.\nParameters:\n\naudio_url: Direct URL of the audio file.\nwebhook_url: URL for an N8N webhook to receive the transcription result.\nAdditional Settings:\n\nspeaker_labels (true/false): Enables speaker diarization.\nspeakers_expected: Specify expected number of speakers.\nlanguage_code: Set language (default: en_us).\naudio_url: Direct URL of the audio file.\nwebhook_url: URL for an N8N webhook to receive the transcription result.\nAdditional Settings:\n\nspeaker_labels (true/false): Enables speaker diarization.\nspeakers_expected: Specify expected number of speakers.\nlanguage_code: Set language (default: en_us).\nspeaker_labels (true/false): Enables speaker diarization.\nspeakers_expected: Specify expected number of speakers.\nlanguage_code: Set language (default: en_us).\n\nWebhook Configuration: Set up a POST webhook to receive AssemblyAI’s transcription data.\nGet Transcription:\n\nHeader Authentication: Set Authorization with AssemblyAI API key.\nURL: GET 
https://api.assemblyai.com/v2/transcript/&lt;transcript_id&gt;.\nHeader Authentication: Set Authorization with AssemblyAI API key.\nURL: GET https://api.assemblyai.com/v2/transcript/&lt;transcript_id&gt;.\nSend to OpenAI:\n\nURL: POST to https://api.openai.com/v1/chat/completions.\nHeader Authentication: Set Authorization with OpenAI API key.\nBody Parameters:\n\nModel: Use gpt-4o-2024-08-06 for JSON Schema support, or gpt-4o-mini for a less costly option.\nMessages:\n\nsystem: Contains the main analysis prompt.\nuser: Combined speakers’ utterances to analyze in text format.\n\n\nResponse Format:\n\ntype: json_schema.\njson_schema: JSON schema for structured responses.\nURL: POST to https://api.openai.com/v1/chat/completions.\nHeader Authentication: Set Authorization with OpenAI API key.\nBody Parameters:\n\nModel: Use gpt-4o-2024-08-06 for JSON Schema support, or gpt-4o-mini for a less costly option.\nMessages:\n\nsystem: Contains the main analysis prompt.\nuser: Combined speakers’ utterances to analyze in text format.\n\n\nResponse Format:\n\ntype: json_schema.\njson_schema: JSON schema for structured responses.\nModel: Use gpt-4o-2024-08-06 for JSON Schema support, or gpt-4o-mini for a less costly option.\nMessages:\n\nsystem: Contains the main analysis prompt.\nuser: Combined speakers’ utterances to analyze in text format.\nsystem: Contains the main analysis prompt.\nuser: Combined speakers’ utterances to analyze in text format.\nResponse Format:\n\ntype: json_schema.\njson_schema: JSON schema for structured responses.\ntype: json_schema.\njson_schema: JSON schema for structured responses.\nSave Results in Supabase:\n\nOperation: Create a new record.\nTable Name: demo_calls.\nFields:\n\nInput: Transcription text, audio URL, and transcription ID.\nOutput: Parsed JSON response from OpenAI’s analysis.\nOperation: Create a new record.\nTable Name: demo_calls.\nFields:\n\nInput: Transcription text, audio URL, and transcription ID.\nOutput: Parsed JSON response from 
OpenAI’s analysis.\nInput: Transcription text, audio URL, and transcription ID.\nOutput: Parsed JSON response from OpenAI’s analysis.",
"isPaid": false
},
{
"templateId": "4640",
"templateName": "monitoring competitor prices - for free",
"templateDescription": "How it works ++Download the google sheet here++ and replace this with the googles sheet node: Google sheet , upload to google sheets and replace in the...",
"templateUrl": "https://n8n.io/workflows/4640",
"jsonFileName": "monitoring_competitor_prices_-_for_free.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/monitoring_competitor_prices_-_for_free.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/47a7aa7f7869f0d7ebc2d6e0dbce86ad/raw/b640e650650dfb493e2817eff0f2738aa243cfdc/monitoring_competitor_prices_-_for_free.json",
"screenshotURL": "https://i.ibb.co/tPBz8m7N/c0418496d380.png",
"workflowUpdated": true,
"gistId": "47a7aa7f7869f0d7ebc2d6e0dbce86ad",
"templateDescriptionFull": "++Download the google sheet here++ and replace this with the googles sheet node: Google sheet , upload to google sheets and replace in the google sheets node.\n\nScheduled trigger: Runs once a day at 8 AM (server time).\nFetch product list: Reads your “master” sheet (product_url + last known price) from Google Sheets.\nLoop with delay: Iterates over each row (product) one at a time, inserting a short pause (20 s) between HTTP requests to avoid blocking.\nScrape current price: Loads each product_url, extracts the current price via a simple CSS selector.\nCompare & normalize: Compares the newly scraped price against the “last_price” from your sheet, calculates percentage change, and tags items where price_changed == true.\n\nOn price change:\n\nSend alert: Formats a Telegram message (“Price Drop” or “Price Hike”) and pushes it to your configured chat.\nLog history: Appends a new row to a separate “price_tracking” tab with timestamp, old price, new price, and % change.\nUpdate master sheet: After a 1 min pause, writes the updated current_price back to your “master” sheet so future runs use it as the new baseline.\n\nGoogle Sheets credentials (~5 min)\nCreate a Google Sheets OAuth credential in n8n.\nCopy your sheet’s ID and ensure you have two tabs:\nproduct_data (columns: product_url, price)\nprice_tracking (columns: timestamp, product_url, last_price, current_price, price_diff_pct, price_changed)\nPaste the sheet ID into both Google Sheets nodes (“Read” and “Append/Update”).\nTelegram credentials (~5 min)\nCreate a Telegram Bot token via BotFather.\nCopy your chat_id (for your target group or personal chat).\nAdd those credentials to n8n and drop them into the “Telegram” node.\n\nVerify the schedule in the Schedule Trigger node is set to 08:00 (or adjust to your preferred run time).\nIn the Loop Over Items node, confirm “Batch Size” is 1 (to process one URL at a time).\nAdjust the Delay to avoid Request Blocking node if your site requires 
a longer pause (default is 20 s).\nIn the Parse Data From The HTML Page node, double-check the CSS selector matches how prices appear on your target site.\nOnce credentials are in place and your sheet tabs match the expected column names, the flow should be ready to activate. Total setup time is under 15 minutes—detailed notes are embedded as sticky comments throughout the workflow to help you tweak selectors, change timeouts, or adjust sheet names without digging into code.",
"isPaid": false
},
{
"templateId": "1111",
"templateName": "template_1111",
"templateDescription": "This workflow allows you to create transcription jobs for all your audio and video files stored in AWS S3. workflow-screenshot AWS S3: This node will...",
"templateUrl": "https://n8n.io/workflows/1111",
"jsonFileName": "template_1111.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_1111.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/223bd4b94ba01e25ae1b723ba7c80245/raw/22dec3171af5114a5d42925ea335f21bce2c6838/template_1111.json",
"screenshotURL": "https://i.ibb.co/4nDHdvvq/b4f54d991ddc.png",
"workflowUpdated": true,
"gistId": "223bd4b94ba01e25ae1b723ba7c80245",
"templateDescriptionFull": "This workflow allows you to create transcription jobs for all your audio and video files stored in AWS S3.\n\n\n\nAWS S3: This node will retrieve all the files from an S3 bucket you specify.\n\nAWS Transcribe: This node will create a transcription job for the files that get returned by the previous node.",
"isPaid": false
},
{
"templateId": "2354",
"templateName": "template_2354",
"templateDescription": "This n8n workflow demonstrates a simple multi-agent setup to perform the task of competitor research. It showcases how using the HTTP request tool could...",
"templateUrl": "https://n8n.io/workflows/2354",
"jsonFileName": "template_2354.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_2354.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/c243d8785d6152a1facb15953fd6f604/raw/cad3c6d0e68c2e31688a18aa12a83fdad44920eb/template_2354.json",
"screenshotURL": "https://i.ibb.co/0ydRXxZk/ccaad223d0dc.png",
"workflowUpdated": true,
"gistId": "c243d8785d6152a1facb15953fd6f604",
"templateDescriptionFull": "This n8n workflow demonstrates a simple multi-agent setup to perform the task of competitor research. It showcases how using the HTTP request tool can reduce the number of nodes needed to achieve a workflow like this.\n\nFor this template, a source company is defined by the user, which is sent to Exa.ai to find competitors.\nEach competitor is then funnelled through 3 AI agents that go out onto the internet and retrieve specific datapoints about the competitor: company overview, product offering, and customer reviews.\nOnce the agents are finished, the results are compiled into a report which is then inserted into a Notion database.\n\nCheck out an example output here: https://jimleuk.notion.site/2d1c3c726e8e42f3aecec6338fd24333?v=de020fa196f34cdeb676daaeae44e110&pvs=4\n\nAn OpenAI account for the LLM.\nExa.ai account for access to their AI search engine.\nSerpAPI account for Google search.\nFirecrawl.dev account for webscraping.\nNotion.com account for the database to save final reports.\n\nAdd additional agents to gather more datapoints such as SEO keywords and metrics.\n\nNot using Notion? Feel free to swap this out for your own database.",
"isPaid": false
},
{
"templateId": "1546",
"templateName": "Create a QuickBooks invoice on a new Onfleet Task creation",
"templateDescription": "Summary Onfleet is a last-mile delivery software that provides end-to-end route planning, dispatch, communication, and analytics to handle the heavy lifting...",
"templateUrl": "https://n8n.io/workflows/1546",
"jsonFileName": "Create_a_QuickBooks_invoice_on_a_new_Onfleet_Task_creation.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Create_a_QuickBooks_invoice_on_a_new_Onfleet_Task_creation.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/0984d248772df8649511d0f2e1e25cb4/raw/f53be13a2b841a01c522b6c5115214a4606bf6ed/Create_a_QuickBooks_invoice_on_a_new_Onfleet_Task_creation.json",
"screenshotURL": "https://i.ibb.co/Zw5bP2N/ee3b1873f7ff.png",
"workflowUpdated": true,
"gistId": "0984d248772df8649511d0f2e1e25cb4",
"templateDescriptionFull": "Summary\n\nOnfleet is a last-mile delivery software that provides end-to-end route planning, dispatch, communication, and analytics to handle the heavy lifting so you can focus on your customers.\n\nThis workflow template listens to an Onfleet event and interacts with the QuickBooks API. You can easily streamline this with your QuickBooks invoices or other entities. Typically, you would create an invoice when an Onfleet task is created so your customers can pay ahead of an upcoming delivery.\n\nConfigurations\n\nUpdate the Onfleet trigger node with your own Onfleet credentials. To register for an Onfleet API key, please visit https://onfleet.com/signup to get started\nYou can easily change which Onfleet event to listen to. Learn more about Onfleet webhooks with Onfleet Support\nUpdate the QuickBooks Online node with your QuickBooks credentials",
"isPaid": false
},
{
"templateId": "2800",
"templateName": "Zoom AI Meeting Assistant",
"templateDescription": "Update 19-04-2025Change from OpenAI to Claude 3.7 Sonnet moduleAdding the Think Tool The update enables significantly better results to be achieved. This is...",
"templateUrl": "https://n8n.io/workflows/2800",
"jsonFileName": "Zoom_AI_Meeting_Assistant.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Zoom_AI_Meeting_Assistant.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/77c8de4776806825a9ffd9a53545ce48/raw/123df43219e2b46b0a6f31c5a456892b7eb419d8/Zoom_AI_Meeting_Assistant.json",
"screenshotURL": "https://i.ibb.co/W40Hv6N4/bbb988280c68.png",
"workflowUpdated": true,
"gistId": "77c8de4776806825a9ffd9a53545ce48",
"templateDescriptionFull": "Change from OpenAI to Claude 3.7 Sonnet module\nAdding the Think Tool\n\nThe update enables significantly better results to be achieved. This is particularly noticeable during longer meetings!\n\nThis workflow retrieves the Zoom meeting data from the last 24 hours. The transcript of the last meeting is then retrieved and processed, a summary is created using AI, and it is sent to all participants by email.\nAI is then used to create tasks and follow-up appointments based on the content of the meeting.\n\nImportant: You need a Zoom Workspace Pro account and must have activated Cloud Recording/Transcripts!\n\nThis workflow has the following sequence:\n\nmanual trigger (can be replaced by a scheduled trigger or a webhook)\nretrieval of Zoom meeting data\nfilter the events of the last 24 hours\nretrieval of transcripts and extraction of the text\ncreate a meeting summary, format it to HTML, and send it via email\ncreate tasks and a follow-up call (if discussed in the meeting) in ClickUp/Outlook (can be replaced by Gmail, Airtable, and so forth) via sub-workflow\n\nZoom Workspace (via API and HTTP Request): Documentation\nMicrosoft Outlook: Documentation\nClickUp: Documentation\nAI API access (e.g. via OpenAI, Anthropic, Google or Ollama)\nSMTP access data (for sending the mail)\n\nYou must set up the individual sub-workflows as separate workflows. Then set the “Execute workflow trigger” here. Then select the corresponding sub-workflow in the AI Agent Tools.\nYou can select the number of domains yourself. If the data queries are not required, simply delete the corresponding tool (e.g. “Analytics_Domain_5”).\n\nFeel free to contact me via LinkedIn, if you have any questions!",
"isPaid": false
},
{
"templateId": "5904",
"templateName": "Google Meet Automation",
"templateDescription": "This workflow contains community nodes that are only compatible with the self-hosted version of n8n. ❓ What Problem Does It Solve? Manual transcription and...",
"templateUrl": "https://n8n.io/workflows/5904",
"jsonFileName": "Google_Meet_Automation.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Google_Meet_Automation.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/791a156e8edeeaf61154368f9c741048/raw/b6db8b50a0c59de94bcd4f67b9c1b3275d3b8cdc/Google_Meet_Automation.json",
"screenshotURL": "https://i.ibb.co/Jj4MNhgr/535addaccb79.png",
"workflowUpdated": true,
"gistId": "791a156e8edeeaf61154368f9c741048",
"templateDescriptionFull": "This workflow contains community nodes that are only compatible with the self-hosted version of n8n.\n\nManual transcription and action planning from meeting notes is often error-prone, time-consuming, and inconsistent. Important tasks, decisions, or deadlines can be overlooked or delayed. This workflow solves these pain points by automatically analyzing notes using AI and turning them into actionable, structured data. It drastically reduces follow-up delays, miscommunications, and administrative effort, letting teams focus on execution instead.\n\nSave Hours of Manual Work: Automatically transform raw meeting notes into structured tasks and emails without lifting a finger.\nEnsure Accurate Follow-up: Never miss important action items or decisions buried in text; everything is extracted and assigned clearly.\nImprove Team Collaboration: Instantly distribute meeting summaries and next steps to attendees, keeping everyone aligned.\nLeverage Advanced AI: Utilize Google Gemini’s powerful natural language processing tailored specifically for meetings.\nFully End-to-End Automated: From receiving notes to task creation and email dispatch — your post-meeting workflow is completely hands-free.\n\nProject Managers: Streamline task delegation and keep project timelines on track.\nTeam Leads: Quickly communicate key takeaways and follow-ups to team members.\nSales and Account Teams: Document client meetings efficiently and automate follow-up outreach.\nRemote Teams: Ensure clarity and continuity after virtual meetings.\nExecutives: Get concise summaries and important decision logs automatically.\n\n⏱ Trigger: Activated via a POST webhook receiving meeting notes, title, attendees, date, and duration.\n📎 Step 2: Validates inputs; if required fields are missing, sends an error response.\n🔍 Step 3: Extracts and formats meeting data into structured variables for processing.\n🤖 Step 4: Sends meeting notes to Google Gemini AI for advanced analysis to identify action items, decisions, summaries, follow-ups, and dates.\n💌 Step 5: Splits AI responses to create Google Tasks from action items and send personalized follow-up emails via Gmail.\n🗂 Step 6: Generates a Google Docs meeting summary document and finally returns a success response with all processed results.\n\nImport the provided Google Meet Automation.json file into your n8n instance and use the payload example.\nSet up credentials for:\n\nGoogle OAuth2 API (Google Tasks, Google Docs)\nGmail OAuth2 API for sending emails\nGoogle Palm API (for Google Gemini AI access)\n\nCustomize workflow parameters:\n\nWebhook URL and access permissions\nGoogle Tasks project or folders if applicable\nEmail templates if desired (subject line, branding)\n\nUpdate any API endpoints or credential references to match your account setup.\nThoroughly test with sample meeting note payloads to ensure smooth execution.\n\nActive n8n instance (Cloud or Self-hosted)\nGoogle Cloud Platform project with:\n\nGoogle Tasks API enabled\nGoogle Docs API enabled\nGmail API enabled\nGoogle Palm API access (Google Gemini AI)\n\nValid OAuth2 credentials configured in n8n for the above services\nAPI quota and permissions for sending emails, creating docs, and tasks\n\nIntegrate with calendar apps (Google Calendar, Outlook) to auto-schedule next meetings.\nAdd Slack or Microsoft Teams notifications for real-time alerts.\nExtend the AI prompt for deeper insights like sentiment analysis or risk flags.\nCustomize email templates with branding, signatures, or attachments.\nConnect task outputs with project management tools like Asana, Trello, or Jira.\n\nMade by: khaisa Studio\nTag: automation, google meet, meeting notes, AI, google tasks, gmail, google docs\nCategory: Productivity\nNeed a custom? Contact Us",
"isPaid": false
},
{
"templateId": "5001",
"templateName": "Google Calendar Interview Scheduler",
"templateDescription": "SEO-Optimized Description: Streamline your interview scheduling process with this intelligent n8n automation template powered by Google Calendar, Google...",
"templateUrl": "https://n8n.io/workflows/5001",
"jsonFileName": "Google_Calendar_Interview_Scheduler.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Google_Calendar_Interview_Scheduler.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/8e0126977fec688ab8f50dd202c5c83a/raw/c9d2d02f0f91a64cbed77ea517363a6ef4afdc91/Google_Calendar_Interview_Scheduler.json",
"screenshotURL": "https://i.ibb.co/j9C3k91P/768e050ceca3.png",
"workflowUpdated": true,
"gistId": "8e0126977fec688ab8f50dd202c5c83a",
"templateDescriptionFull": "SEO-Optimized Description:\n\nStreamline your interview scheduling process with this intelligent n8n automation template powered by Google Calendar, Google Sheets, and GPT-4. This workflow reads candidate information from a spreadsheet, automatically schedules interviews in Google Calendar, and sends personalized interview invitation emails—all without manual input.\n\nWhat This Template Does:\n\n📋 Monitors a Google Sheet for new candidate entries every minute\n🕒 Auto-selects the next available interview slot (Mon/Wed/Fri at 3 PM)\n📅 Creates a calendar invite in your Google Calendar\n✍️ Uses GPT-4 to generate personalized emails based on candidate data\n📧 Sends the email invite with the interview link via Gmail\n\nBuilt-in logic ensures:\n\nCandidates never get same-day interviews\nAI-generated emails are concise, polite, and professionally formatted\nScheduling remains conflict-free and easy to manage\n\nRequirements:\n\nGoogle Calendar API credentials\nGoogle Sheets with candidate info (Name, Email, Background)\nGmail account with OAuth2\nAzure OpenAI API (GPT-4o recommended)\n\nPerfect For:\n\nStartups, HR teams, and recruiters looking to automate interview scheduling, eliminate back-and-forth emails, and deliver a professional candidate experience—all with zero hassle.",
"isPaid": false
},
{
"templateId": "4927",
"templateName": "AILessonPlanGenerator_Teachers copy 2",
"templateDescription": "WhyTeachers now spend 3-4 hours per lesson creating materials and resources from scratch. With additional/special needs, this makes it difficult to create...",
"templateUrl": "https://n8n.io/workflows/4927",
"jsonFileName": "AILessonPlanGenerator_Teachers_copy_2.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/AILessonPlanGenerator_Teachers_copy_2.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/201345c3809dcec87719a7c80732e29f/raw/b16ddd5c5f6c6e215dce7fc7f7aca1c445ac5a98/AILessonPlanGenerator_Teachers_copy_2.json",
"screenshotURL": "https://i.ibb.co/1twjzC5c/8115220a3367.png",
"workflowUpdated": true,
"gistId": "201345c3809dcec87719a7c80732e29f",
"templateDescriptionFull": "Teachers now spend 3-4 hours per lesson creating materials and resources from scratch. With additional/special needs, it is difficult to create additional materials. This is unsustainable and takes their time away from teaching. Tailored for UK teachers but can be expanded globally with prompt and form enhancements.\n\nI built a system with three specialized AI agents that creates complete lesson packages, automatically uploads a document to Google Drive, and puts an appointment in the calendar to review the document.\n\nThe research agent pulls specific information, including special education needs and curriculums.\nThe scoring and assessment agent generates tailored assessment plans, assignments, and a grading mechanism based on the chosen requirements.\nThe integration agent provides ideas to expand to other tools. In future there is an opportunity to add Kahoot or other tools to create quizzes.\nFinally, the enriched document is emailed and a calendar invite is sent for review.\n\nn8n\nAny LLM API key (I used OpenAI)\nGoogle Drive integration\nGoogle Calendar integration\nChange the email address from XXX@gmail.com to your own email address in the email component.\n\nWatch this video for an intro on how it works.\n\nContact me on info@pankstr.com for any queries.",
"isPaid": false
},
{
"templateId": "3105",
"templateName": "template_3105",
"templateDescription": "How it works 🗣️&gt; 📖 I set up this workflow to convert any audio or video file into structured text using the new ElevenLabs Scribe model, one of the...",
"templateUrl": "https://n8n.io/workflows/3105",
"jsonFileName": "template_3105.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_3105.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/3b2b5930e6547c871d794bda402b25d9/raw/04e58cf7e64c1a13e2861230e0e05df681d14949/template_3105.json",
"screenshotURL": "https://i.ibb.co/ks0bZ7Gj/841dc9f825a0.png",
"workflowUpdated": true,
"gistId": "3b2b5930e6547c871d794bda402b25d9",
"templateDescriptionFull": "I set up this workflow to convert any audio or video file into structured text using the new ElevenLabs Scribe model, one of the best Speech-to-Text AIs, available in 99+ languages. It integrates seamlessly with n8n and leverages the ElevenLabs Scribe API to:\n\n✅ Upload audio/video files automatically\n✅ Transcribe them with industry-leading accuracy in any language\n✅ Export the text for further processing (summaries, subtitles, SEO content, etc.)\n\n👉 Try the new ElevenLabs Scribe model now: Convert speech to text instantly\n\n🔹 Podcast Transcriptions – Convert podcast episodes into blog posts for SEO and accessibility\n🔹 YouTube Subtitles – Generate captions automatically for increased engagement\n🔹 Legal & Compliance – Accurately transcribe meetings, interviews, or customer calls\n🔹 E-learning – Turn lectures and webinars into structured course notes\n🔹 SEO & Content Marketing – Repurpose videos into articles, quotes, and social media content\n\n💡 Boost your productivity with the new Scribe model → Start with ElevenLabs Scribe\n\n🚀 Quick & simple setup in n8n – Upload your file, select the model (scribe_v1), and let the AI handle the rest via the ElevenLabs API.\n\n⸻\n\nI wanted the most accurate and reliable transcription tool for my workflow. After testing different options, Scribe outperformed Google Gemini & OpenAI Whisper in independent benchmarks. It delivers high-quality transcriptions, even in underserved languages like Serbian, Mongolian, and many more.\n\n✅ Transcribes in 99+ languages\n✅ Fast, accurate, and easy to integrate\n✅ Suitable for content creators, businesses, and professionals\n\n🔗 Get started now and revolutionize your workflow with the new Scribe model → Try Scribe AI today 🚀\n\nPhil | Inforeole | Linkedin\n\n🇫🇷 Contact us to automate your processes",
"isPaid": false
},
{
"templateId": "2237",
"templateName": "Automate Your Customer Service With WhatsApp Business Cloud & Asana",
"templateDescription": "How it works:This workflow automates your customer service with built in notifications for your users & ticket creation with Asana. If a user submits a...",
"templateUrl": "https://n8n.io/workflows/2237",
"jsonFileName": "Automate_Your_Customer_Service_With_WhatsApp_Business_Cloud__Asana.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Automate_Your_Customer_Service_With_WhatsApp_Business_Cloud__Asana.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/eb334b2ca16adf10998d3ccd7ab0f635/raw/d705d6404afe240ec7b307a74eb22bd6e1a69560/Automate_Your_Customer_Service_With_WhatsApp_Business_Cloud__Asana.json",
"screenshotURL": "https://i.ibb.co/6cBxxP4Y/58a95517ec83.png",
"workflowUpdated": true,
"gistId": "eb334b2ca16adf10998d3ccd7ab0f635",
"templateDescriptionFull": "This workflow automates your customer service with built-in notifications for your users and ticket creation with Asana.\n\nWhen a user submits a form, they are sent a confirmation message via WhatsApp and a task is opened in Asana with their request in it.\n\nYou need to add your credentials to the WhatsApp Business Cloud node.\nYou need to add your credentials to the Asana node.\nReplace the placeholders with the correct phone number, ID, and so on.\nChange the confirmation message to your liking.\n\nYou could extend this workflow to update your user on the progress of the ticket in Asana.\nYou can change the messaging from WhatsApp to E-Mail.\nYou can change the form submission service from n8n-native to Typeform or similar.\nYou can change the task management software from Asana to the one you use.\n\nClick here to find a blog post with additional information.",
"isPaid": false
},
{
"templateId": "1769",
"templateName": "template_1769",
"templateDescription": "This workflow syncs data between Notion and Asana whenever a new task or an update is done in one of the apps. PrerequisitesAsana account and Asana...",
"templateUrl": "https://n8n.io/workflows/1769",
"jsonFileName": "template_1769.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_1769.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/8ab1ee1684cb2e4a98dd26334f6776c3/raw/bcbacb1a8cd7e6894683ad53b8ea77ccfedc2682/template_1769.json",
"screenshotURL": "https://i.ibb.co/27Q7zt8R/8075bdedfe58.png",
"workflowUpdated": true,
"gistId": "8ab1ee1684cb2e4a98dd26334f6776c3",
"templateDescriptionFull": "This workflow syncs data between Notion and Asana whenever a new task is created or an update is made in one of the apps.\n\nAsana account and Asana credentials\nNotion account and Notion credentials\n\nGo to your Asana account.\nCreate a new task in Asana.\nNotice a new task is created in your Notion account.\nUpdate the task in Asana.\nNotice the task is updated in Notion.",
"isPaid": false
},
{
"templateId": "654",
"templateName": "Receive updates when an event occurs in Asana",
"templateDescription": "workflow-screenshot",
"templateUrl": "https://n8n.io/workflows/654",
"jsonFileName": "Receive_updates_when_an_event_occurs_in_Asana.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Receive_updates_when_an_event_occurs_in_Asana.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/2d92c77ac524d1e625bf0ad761f7d43b/raw/4728b67576d70f793d6fcac1cc9e7f74ab980d84/Receive_updates_when_an_event_occurs_in_Asana.json",
"screenshotURL": "https://i.ibb.co/PGw3fpmN/fd523b64607e.png",
"workflowUpdated": true,
"gistId": "2d92c77ac524d1e625bf0ad761f7d43b",
"templateDescriptionFull": "workflow-screenshot",
"isPaid": false
},
{
"templateId": "478",
"templateName": "Create a new task in Asana",
"templateDescription": "workflow-screenshot",
"templateUrl": "https://n8n.io/workflows/478",
"jsonFileName": "Create_a_new_task_in_Asana.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Create_a_new_task_in_Asana.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/3be312df21a8b33abb3e80705fafc496/raw/1258e3d753561515f4ce16070bacbd8f9ff10a4a/Create_a_new_task_in_Asana.json",
"screenshotURL": "https://i.ibb.co/dJM6vrHZ/327f4423ece6.png",
"workflowUpdated": true,
"gistId": "3be312df21a8b33abb3e80705fafc496",
"templateDescriptionFull": "workflow-screenshot",
"isPaid": false
},
{
"templateId": "1118",
"templateName": "template_1118",
"templateDescription": "This workflow will allow you at the beginning of each day to copy your google calendar events into Trello so you can take notes, label, or automate your...",
"templateUrl": "https://n8n.io/workflows/1118",
"jsonFileName": "template_1118.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_1118.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/de51e52de7421e46b0878545bd104bb8/raw/7077f1aca6c20bf235ce80977523cce34b3e22c2/template_1118.json",
"screenshotURL": "https://i.ibb.co/zW8x0V9H/0579bf002efa.png",
"workflowUpdated": true,
"gistId": "de51e52de7421e46b0878545bd104bb8",
"templateDescriptionFull": "This workflow will allow you, at the beginning of each day, to copy your Google Calendar events into Trello so you can take notes, label, or automate your tasks.\n\nWhen deploying this, don't forget to change:\n\nThe label ID for meeting type under \"Create Trello Cards\". You should be able to find instructions here on how to find the label ID.\nThe description for Trello cards under \"Create Trello Cards\". I currently pull in notes, but it should be simple to change it to pull the Gcal description instead.\nYou can change the trigger to fire at a different time.",
"isPaid": false
},
{
"templateId": "791",
"templateName": "Get Product Feedback",
"templateDescription": "workflow-screenshot",
"templateUrl": "https://n8n.io/workflows/791",
"jsonFileName": "Get_Product_Feedback.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Get_Product_Feedback.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/a0b31e5e685e1932385b4127fffba666/raw/ac9c3e2fccd595ab1fcedfede88ae77c445844d7/Get_Product_Feedback.json",
"screenshotURL": "https://i.ibb.co/HLvjncTh/fe005bce23de.png",
"workflowUpdated": true,
"gistId": "a0b31e5e685e1932385b4127fffba666",
"templateDescriptionFull": "workflow-screenshot",
"isPaid": false
},
{
"templateId": "491",
"templateName": "Receive updates for changes in the specified list in Trello",
"templateDescription": "workflow-screenshot",
"templateUrl": "https://n8n.io/workflows/491",
"jsonFileName": "Receive_updates_for_changes_in_the_specified_list_in_Trello.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Receive_updates_for_changes_in_the_specified_list_in_Trello.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/316cd9e5b10b98004604c90622702c7c/raw/3337843e34922cf279065ac6f0a731afd258a950/Receive_updates_for_changes_in_the_specified_list_in_Trello.json",
"screenshotURL": "https://i.ibb.co/sJkRBXyq/781217a93c43.png",
"workflowUpdated": true,
"gistId": "316cd9e5b10b98004604c90622702c7c",
"templateDescriptionFull": "workflow-screenshot",
"isPaid": false
},
{
"templateId": "2785",
"templateName": "template_2785",
"templateDescription": "Who is this for? This workflow is designed for professionals and teams who need to monitor multiple RSS feeds, filter the latest content, and distribute...",
"templateUrl": "https://n8n.io/workflows/2785",
"jsonFileName": "template_2785.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_2785.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/2013dc1e2474901bc9f461bdf1cb1123/raw/5a2d5b035780e14617060c752f1e741bab428c04/template_2785.json",
"screenshotURL": "https://i.ibb.co/JWjJwyfw/a08d169bae6d.png",
"workflowUpdated": true,
"gistId": "2013dc1e2474901bc9f461bdf1cb1123",
"templateDescriptionFull": "This workflow is designed for professionals and teams who need to monitor multiple RSS feeds, filter the latest content, and distribute actionable updates as a Trello comment. Ideal for content managers, marketers, and team leads managing news or content pipelines.\n\nManually monitoring RSS feeds and keeping track of the latest content can be time-consuming. This workflow automates the aggregation, filtering, and distribution of news, ensuring that only relevant and timely updates are shared with your team or audience.\n\nAggregates RSS Feeds: Pulls data from up to three RSS feeds simultaneously.\nFilters Content: Filters articles based on their publication date (default: last 7 days).\nOrganizes and Sorts: Sorts filtered articles by date for clarity.\nFormats Updates: Transforms news items into Markdown format for better readability.\nPublishes and Notifies: Posts comments to Trello cards and sends an email to a moderator to check the comment.\n\nConnect your RSS feeds by configuring the RSS Read nodes.\nLink your Trello and Gmail accounts for seamless integration.\nAdjust the schedule trigger to set how often the workflow should run (e.g., daily, weekly).\nTest the workflow to ensure all connections and configurations are correct.\n\nChange the Number of RSS Feeds: Add or remove RSS Read nodes and update the merge configuration accordingly.\nAdjust the Date Filter: Modify the date logic in the “Filter by date” node to include more or fewer days.\nLimit the Number of Articles: Adjust the limit in the “Limit news to x” node.\nCustom Formatting: Update the Transform node to format the news items differently.\nAlternative Notifications: Replace Trello and Gmail with other integrations, such as Slack or Microsoft Teams.\n\nThis workflow ensures your team stays informed with minimal effort and delivers content updates in an organized and professional manner.",
"isPaid": false
},
{
"templateId": "2701",
"templateName": "Reschedule overdue Asana tasks and clean up completed tasks",
"templateDescription": "Description Boost your productivity and keep your Asana workspace clutter-free with this n8n workflow. It automatically scans for tasks whose due dates have...",
"templateUrl": "https://n8n.io/workflows/2701",
"jsonFileName": "Reschedule_overdue_Asana_tasks_and_clean_up_completed_tasks.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Reschedule_overdue_Asana_tasks_and_clean_up_completed_tasks.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/f9035f49822c8e1203921378bf380dbd/raw/e9d8122fb8f3eb19b0b8757fe5d344a66352343e/Reschedule_overdue_Asana_tasks_and_clean_up_completed_tasks.json",
"screenshotURL": "https://i.ibb.co/Q3KKMy8D/1d5400b0ec05.png",
"workflowUpdated": true,
"gistId": "f9035f49822c8e1203921378bf380dbd",
"templateDescriptionFull": "Boost your productivity and keep your Asana workspace clutter-free with this n8n workflow.\n\nIt automatically scans for tasks whose due dates have passed and reschedules them to the current date, ensuring no important to-dos slip through the cracks.\n\nAdditionally, any completed tasks in Asana with an overdue date are removed, maintaining a clear, organized task list.\n\nStreamline Task Management: No more manual updates—let the workflow reschedule overdue tasks for you.\nOptimize Workspace Organization: Eliminate finished tasks to focus on active priorities and reduce clutter.\nSave Time and Effort: Automate repetitive maintenance, freeing you to concentrate on what truly matters.\n\nAdd your Asana credentials\nSchedule the workflow to run at desired intervals (e.g., daily or weekly).\nSelect your Workspace Name and your Assignee Name (user) in the Get user tasks node\n(Optional) Tailor filtering conditions to match your preferred due-date rules and removal criteria.\nActivate the workflow and watch your Asana workspace stay up to date and clutter-free.",
"isPaid": false
},
{
"templateId": "1206",
"templateName": "template_1206",
"templateDescription": "This workflow is triggered when a new order is created in Shopify. Then:the order information is stored in Zoho CRM,an invoice is created in Harvest and...",
"templateUrl": "https://n8n.io/workflows/1206",
"jsonFileName": "template_1206.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_1206.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/5ec9940746104fab481935045b70fd3a/raw/38b561a869096a4e2e5953de6294bf40df00bed6/template_1206.json",
"screenshotURL": "https://i.ibb.co/cK2xQ5ts/09b1b2f3fb1a.png",
"workflowUpdated": true,
"gistId": "5ec9940746104fab481935045b70fd3a",
"templateDescriptionFull": "This workflow is triggered when a new order is created in Shopify. Then:\n\nthe order information is stored in Zoho CRM,\nan invoice is created in Harvest and stored in Trello,\nif the order value is above 50, an email with a discount coupon is sent to the customer and they are added to a MailChimp campaign for high-value customers; otherwise, only a \"thank you\" email is sent to the customer.\n\nNote that you need to replace the List ID in the Trello node with your own ID (see instructions in our docs). Same goes for the Account ID in the Harvest node (see instructions here).",
"isPaid": false
},
{
"templateId": "1109",
"templateName": "template_1109",
"templateDescription": "This workflow allows you to add positive feedback messages to a table in Notion. PrerequisitesCreate a Typeform that contains a Long Text field question type to accept feedback from users...",
"templateUrl": "https://n8n.io/workflows/1109",
"jsonFileName": "template_1109.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_1109.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/4f4d37f1ac18a9eb323cf5139bf5fce1/raw/2aa9b8448ad71257c1fe5eb5c9690dfff2772cfc/template_1109.json",
"screenshotURL": "https://i.ibb.co/0RdyHygP/93fa4acd6f48.png",
"workflowUpdated": true,
"gistId": "4f4d37f1ac18a9eb323cf5139bf5fce1",
"templateDescriptionFull": "This workflow allows you to add positive feedback messages to a table in Notion.\n\nCreate a Typeform that contains Long Text filed question type to accepts feedback from users.\nGet your Typeform credentials by following the steps mentioned in the documentation.\nFollow the steps mentioned in the documentation to create credentials for Google Cloud Natural Language.\nCreate a page on Notion similar to this page.\nCreate credentials for the Notion node by following the steps in the documentation.\nFollow the steps mentioned in the documentation to create credentials for Slack.\nFollow the steps mentioned in the documentation to create credentials for Trello.\n\n\n\nTypeform Trigger node: Whenever a user submits a response to the Typeform, the Typeform Trigger node will trigger the workflow. The node returns the response that the user has submitted in the form.\n\nGoogle Cloud Natural Language node: This node analyses the sentiment of the response the user has provided and gives a score.\n\nIF node: The IF node uses the score provided by the Google Cloud Natural Language node and checks if the score is positive (larger than 0). If the score is positive we get the result as True, otherwise False.\n\nNotion node: This node gets connected to the true branch of the IF node. It adds the positive feedback shared by the user in a table in Notion.\n\nSlack node: This node will share the positive feedback along with the score and username to a channel in Slack.\n\nTrello node: If the score is negative, the Trello node is executed. This node will create a card on Trello with the feedback from the user.",
"isPaid": false
},
{
"templateId": "2409",
"templateName": "template_2409",
"templateDescription": "Who might benfit from this workflow?Everyone organizing him/herself by using a notion database for tasks but losing track on some important tasks having a...",
"templateUrl": "https://n8n.io/workflows/2409",
"jsonFileName": "template_2409.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_2409.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/8195e7909dc898c6b80329fd65578a9e/raw/8a06554c5996cb27cc176d1e2ae556954e786adf/template_2409.json",
"screenshotURL": "https://i.ibb.co/rKH4Vj2S/3a8dc898e1f9.png",
"workflowUpdated": true,
"gistId": "8195e7909dc898c6b80329fd65578a9e",
"templateDescriptionFull": "Everyone organizing him/herself by using a notion database for tasks but losing track on some important tasks having a deadline. The weekly reminder helps you to not forget about your notion tasks.\n\nThe workflow fetches all your notion tasks from a desired database but the closed ones\nIt generates a html template for each tasks containing a headline and a short list of key data (prio, status deadline, tags)\nIt creates two groups based on the deadline date if a task is already overdue or not\nIt generates a complete html email containing both groups and some sugar around them\nIt sends the email to your desired email\nIt uses Pushover to send you a push notification to your phone\nIt is scheduled by the beginning of each week\n\nFill out the \"Set Workflow vars\" node with your data\nConnect your notion account and select the database your tasks are stored at\ndefine the status filters to the ones you are using for your tasks\nSetup your email server to enable the email node to deliver your html email\nCreate a Pushover account and setup the authentication for the Pushover node\nAdjust the last html node to change email style for your desire\n\nYou might adjust the filtering of the notion fetch node to filter for other statuses than provided in the example.\nYou apply your custom design to the html email\nYou could remove the filter which is filtering for tasks having a deadline and just send yourself a reminder for all tasks\n\nFeel free to contact me via LinkedIn or my business website.\n\nUse this link to register for n8n\n(This is an affiliate link)",
"isPaid": false
},
{
"templateId": "2377",
"templateName": "mails2notion V2",
"templateDescription": "Purpose This workflow automatically creates Tasks from forwarded Emails, similar to Asana, but better. Emails are processed by AI and converted to rather...",
"templateUrl": "https://n8n.io/workflows/2377",
"jsonFileName": "mails2notion_V2.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/mails2notion_V2.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/fc98016ed27f63394756fa166d3721c1/raw/6de652dbd4ac25bc35858a9cbbefdc7a4fe6e727/mails2notion_V2.json",
"screenshotURL": "https://i.ibb.co/GvSBJ7m7/c780dbfbb2a2.png",
"workflowUpdated": true,
"gistId": "fc98016ed27f63394756fa166d3721c1",
"templateDescriptionFull": "This workflow automatically creates Tasks from forwarded Emails, similar to Asana, but better. Emails are processed by AI and converted to rather actionable task.\n\nIn addition this workflow is build in a way, that multiple users can share this single process by setting up their individual configuration through a user friendly portal (internal tool) instead of the need to manage their own workflows.\n\n\n\nOne Gmail account is used to process inbound mails from different users.\nA custom web portal enables users to define “routes”. Thats where the mapping between an automatically generated Gmail Alias and a Notion Database URL, including the personal API Token, happens.\nUsing a Gmail Trigger, new entries are split by the Email Alias, so the corresponding route can be retrieved from the Database connected to the portal.\nEvery Email then gets processed by AI to get generate an actionable task and get a short summary of the original Email as well as some metadata.\nBased on a predefined structure a new Page is created in the corresponding Notion Database.\nFinally the Email is marked as “processed” in Gmail.\nIf an error happens, the route gets paused for a possible overflow and the user gets notified by Email.\n\nCreate a new Google account (alternatively you can use an existing one and set up rules to keep your inbox organized)\nCreate two Labels in Gmail: “Processed” and “Error”\nClone this Softr template including the Airtable dataset and publish the application\nClone this workflow and choose credentials (Gmail, Airtable)\nFollow the additional instructions provided within the workflow notes\nEnable the workflow, so it runs automatically in the background\n\nOpen published Softr application\nRegister as a new user\nCreate a new route containing the Notion API key and the Notion Database URL\nExpand the new entry to copy the Email address\nSave the address as a new contact in your Email provider of choice\nForward an Email to it and 
watch how it gets converted to an actionable task\n\nAirtable was chosen, so you can setup this template fairly quickly. It is advised to replace the persistence by something you own, like a self hosted SQL server, since we are dealing with sensitive information of multiple users\nThis solution is only meant for building internal tools, unless you own an embed license for n8n.",
"isPaid": false
},
{
"templateId": "1835",
"templateName": "template_1835",
"templateDescription": "This workflow creates/updates ClickUp tasks when Notion database pages are created/updated. All fields in the Notion database are mapped to a ClickUp...",
"templateUrl": "https://n8n.io/workflows/1835",
"jsonFileName": "template_1835.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_1835.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/e874faed9399705577e740bb66c33282/raw/00e9898690417e3fd02b76879b4de7f8d50f64f0/template_1835.json",
"screenshotURL": "https://i.ibb.co/Jwyjm76s/fa9bddfacef8.png",
"workflowUpdated": true,
"gistId": "e874faed9399705577e740bb66c33282",
"templateDescriptionFull": "This workflow creates/updates ClickUp tasks when Notion database pages are created/updated. All fields in the Notion database are mapped to a ClickUp property.\n\nNotion database will require setup before the workflow can be used. See the list of fields available in the setup below.\n\nNotion account and Notion credentials.\nClickUp account and ClickUp credentials.\n\nWhen a new database page is created in Notion, the workflow creates a new task in ClickUp with all required fields.\nThe new ClickUp task's ID is saved in the Notion database page's \"ClickUp ID\" field.\nThen, when the database page is updated in Notion, the workflow updates the specific ClickUp task identified by the \"ClickUp ID\" field in Notion.\n\nThis workflow requires that you set up a Notion database. To do so, follow the steps below:\n\nIn Notion, create a new database.\nAdd the following columns to the database:\n\nTask name (renamed from \"Name\")\nStatus (with type \"Select\" with the following options: \"to do\", \"in progress\", \"review\", \"revision\", \"complete\")\nDeadline (with type \"Date\")\nClickUp ID (with type \"Text\")\nAdd any other fields you require.\nTask name (renamed from \"Name\")\nStatus (with type \"Select\" with the following options: \"to do\", \"in progress\", \"review\", \"revision\", \"complete\")\nDeadline (with type \"Date\")\nClickUp ID (with type \"Text\")\nAdd any other fields you require.\nShare the database to n8n.\nBy default, the workflow will fill all the fields provided above, except for any other additional fields you add.",
"isPaid": false
},
{
"templateId": "1778",
"templateName": "Sync Todoist tasks to Notion",
"templateDescription": "This workflow checks if the task in Todoist has a specific label and based on that creates a new database page in Notion. Prerequisites Todoist account...",
"templateUrl": "https://n8n.io/workflows/1778",
"jsonFileName": "Sync_Todoist_tasks_to_Notion.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Sync_Todoist_tasks_to_Notion.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/8d63ed625efc08e339ec7e2d87597300/raw/ebed3626759a04860f1e9319cdbd2b2f9b446291/Sync_Todoist_tasks_to_Notion.json",
"screenshotURL": "https://i.ibb.co/7dy0dFmK/ad843a1d3a4d.png",
"workflowUpdated": true,
"gistId": "8d63ed625efc08e339ec7e2d87597300",
"templateDescriptionFull": "This workflow checks if the task in Todoist has a specific label and based on that creates a new database page in Notion.\n\nTodoist account and Todoist credentials\nNotion account and Notion credentials\n\nTo start the workflow add a task to Todoist and mark it with a label, e.g. “send-to-n8n”.\nWait a maximum of 30 seconds.\nTodoist node identifies the tasks marked as “send-to-n8n”.\nNotion node creates a new Notion database page. Notice Notion has a new task now with the same name as in Todoist.",
"isPaid": false
},
{
"templateId": "2582",
"templateName": "template_2582",
"templateDescription": "This n8n template builds a meeting assistant that compiles timely reminders of upcoming meetings filled with email history and recent LinkedIn activity of...",
"templateUrl": "https://n8n.io/workflows/2582",
"jsonFileName": "template_2582.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_2582.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/7eaebaed0a28623778678927f1a0927a/raw/d2e316495d115b6b1a536825f4216f5988e35e30/template_2582.json",
"screenshotURL": "https://i.ibb.co/RpnSshM8/ddcdb7350afe.png",
"workflowUpdated": true,
"gistId": "7eaebaed0a28623778678927f1a0927a",
"templateDescriptionFull": "This n8n template builds a meeting assistant that compiles timely reminders of upcoming meetings filled with email history and recent LinkedIn activity of other people on the invite. This is then discreetly sent via WhatsApp ensuring the user is always prepared, informed and ready to impress!\n\nA scheduled trigger fires hourly to check for upcoming personal meetings.\nWhen found, the invite is analysed by an AI agent to pull email and LinkedIn details of the other invitees.\n2 subworkflows are then triggered for each invitee to (1) search for last email correspondence with them and (2) scrape their LinkedIn profile + recent activity for social updates.\nUsing both available sources, another AI agent is used to summarise this information and generate a short meeting prep message for the user.\nThe notification is finally sent to the user's WhatsApp, allowing them ample time to review.\n\nThere are a lot of moving parts in this template so in it's current form, it's best to use this for personal rather than team calendars.\nThe LinkedIn scraping method used in this workflow requires you to paste in your LinkedIn cookies from your browser which essentially let's n8n impersonate you. You can retrieve this from dev console or ask someone technical for help!\n\nNote: It may be wise to switch to other LinkedIn scraping approaches which do not impersonate your own account for production.\n\nOpenAI for LLM\nGmail for Email\nGoogle Calendar for upcoming events\nWhatsApp Business account for notifications\n\nTry adding information sources which are relevant to you and your invitees. Such as company search, other social media sites etc.\nCreate an on-demand version which doesn't rely on the scheduled trigger. Sometimes you want to know prepare for meetings hours or days in advance where this could help immensely.",
"isPaid": false
},
{
"templateId": "4370",
"templateName": "template_4370",
"templateDescription": "🎤 Audio-to-Insights: Auto Meeting Summarizer Transform your meeting recordings into actionable insights automatically. This powerful n8n workflow monitors...",
"templateUrl": "https://n8n.io/workflows/4370",
"jsonFileName": "template_4370.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_4370.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/748109adbd7980cac70429709e9ac1f3/raw/ce569620bbfe58e862c1593f57c9865cc547dc51/template_4370.json",
"screenshotURL": "https://i.ibb.co/Y4KPHdbS/856cfd5e0589.png",
"workflowUpdated": true,
"gistId": "748109adbd7980cac70429709e9ac1f3",
"templateDescriptionFull": "Transform your meeting recordings into actionable insights automatically. This powerful n8n workflow monitors your Google Drive for new audio files, transcribes them using OpenAI's Whisper, generates intelligent summaries with ChatGPT, and logs everything in Google Sheets - all without lifting a finger.\n\nThis workflow operates as a seamless 6-step automation pipeline:\n\nStep 1: Smart Detection\nThe workflow continuously monitors a designated Google Drive folder (polls every minute) for newly uploaded audio files.\n\nStep 2: Secure Download\nWhen a new audio file is detected, the system automatically downloads it from Google Drive for processing.\n\nStep 3: AI Transcription\nOpenAI's Whisper technology converts your audio recording into accurate text transcription, supporting multiple audio formats.\n\nStep 4: Intelligent Summarization\nChatGPT processes the transcript using a specialized prompt that extracts:\n\nKey discussion points and decisions\nAction items with assigned persons and deadlines\nPriority levels and follow-up tasks\nClean, professional formatting\n\nStep 5: Timestamp Generation\nThe system automatically adds the current date and formats it consistently for tracking purposes.\n\nStep 6: Automated Logging\nThe final summary is appended to your Google Sheets document with the date, creating a searchable archive of all meeting insights.\n\nBefore setting up the workflow, ensure you have:\n\nActive Google Drive account\nOpenAI API key with credits\nGoogle Sheets access\nn8n instance (cloud or self-hosted)\n\n1. Credential Setup\n\nGoogle Drive OAuth2: Required for folder monitoring and file downloads\nOpenAI API Key: Needed for both transcription (Whisper) and summarization (ChatGPT)\nGoogle Sheets OAuth2: Essential for writing summaries to your spreadsheet\n\n2. 
Google Drive Configuration\n\nCreate a dedicated folder in Google Drive for meeting recordings\nCopy the folder ID from the URL (the long string after /folders/)\nUpdate the folderToWatch parameter in the workflow\n\n3. Google Sheets Preparation\n\nCreate a new Google Sheet or use an existing one\nEnsure it has columns: Date and Meeting Summary\nCopy the spreadsheet ID from the URL\nUpdate the documentId parameter in the workflow\n\n4. Audio Requirements\n\nSupported Formats: MP3, WAV, M4A, MP4\nRecommended Size: Under 100MB for optimal processing\nLanguage: Optimized for English (customizable for other languages)\nQuality: Clear audio produces better transcriptions\n\n5. Workflow Activation\n\nImport the workflow JSON into your n8n instance\nConfigure all credential connections\nTest with a sample audio file\nActivate the workflow trigger\n\nTeam Standup Summaries: Convert daily standups into actionable task lists\nSprint Retrospectives: Extract improvement points and action items\nStakeholder Updates: Generate concise reports for leadership\n\nDiscovery Call Notes: Capture prospect pain points and requirements\nDemo Follow-ups: Track questions, objections, and next steps\nCustomer Check-ins: Monitor satisfaction and expansion opportunities\n\nClient Strategy Sessions: Document recommendations and implementation plans\nRequirements Gathering: Organize complex project specifications\nProgress Reviews: Track deliverables and milestone achievements\n\nInterview Debriefs: Standardize candidate evaluation notes\nTraining Sessions: Create searchable knowledge bases\nPerformance Reviews: Document development plans and goals\n\nBrainstorming Sessions: Capture innovative ideas and concepts\nTechnical Reviews: Log decisions and architectural choices\nUser Research: Organize feedback and insights systematically\n\nEnhanced Summarization\nModify the ChatGPT prompt to focus on specific elements:\n\nIntegration Expansions\n\nSlack Integration: Auto-post summaries to relevant 
channels\nEmail Notifications: Send summaries to meeting participants\nCRM Updates: Push action items directly to Salesforce/HubSpot\nCalendar Integration: Schedule follow-up meetings based on action items\n\nQuality Improvements\n\nAudio Preprocessing: Add noise reduction before transcription\nMulti-language Support: Configure for international teams\nCustom Templates: Create industry-specific summary formats\nApproval Workflows: Add human review before final storage\n\nCommon Issues\n\nLarge File Processing: Split recordings over 100MB into smaller segments\nPoor Audio Quality: Use noise reduction tools before uploading\nAPI Rate Limits: Implement delay nodes for high-volume usage\nFormatting Issues: Adjust ChatGPT prompts for consistent output\n\nOptimization Tips\n\nUpload files in supported formats only\nEnsure stable internet connection for cloud processing\nMonitor OpenAI API usage and costs\nRegularly backup your Google Sheets data\nTest workflow changes with sample files first\n\nSample Summary Format:\n\nFor any questions, customizations, or technical support regarding this workflow:\n\n📧 Email Support\n\nPrimary Contact: Yaron@nofluff.online\nResponse Time: Within 24 hours on business days\nBest For: Setup questions, customization requests, troubleshooting\n\n🎥 Learning Resources\n\nYouTube Channel: https://www.youtube.com/@YaronBeen/videos\n\nStep-by-step setup tutorials\nAdvanced customization guides\nWorkflow optimization tips\nStep-by-step setup tutorials\nAdvanced customization guides\nWorkflow optimization tips\n\n🔗 Professional Network\n\nLinkedIn: https://www.linkedin.com/in/yaronbeen/\n\nConnect for ongoing support\nShare your workflow success stories\nGet updates on new automation ideas\nConnect for ongoing support\nShare your workflow success stories\nGet updates on new automation ideas\n\n💡 What to Include in Your Support Request\n\nDescribe your specific use case\nShare any error messages or logs\nMention your n8n version and setup 
type\nInclude sample audio file characteristics (if relevant)\n\nReady to transform your meeting chaos into organized insights? Download the workflow and start automating your meeting summaries today!",
"isPaid": false
},
{
"templateId": "2136",
"templateName": "template_2136",
"templateDescription": "Use caseWhen collecting leads via an online form, you often need to manually add those new leads into your Pipedrive CRM. This not only takes a lot of time...",
"templateUrl": "https://n8n.io/workflows/2136",
"jsonFileName": "template_2136.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_2136.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/26ceca2d68f2f523bd6caecc10a2ec07/raw/5c69df28192d2d445ba8c626200c6578556c8a14/template_2136.json",
"screenshotURL": "https://i.ibb.co/rKH4Vj2S/3a8dc898e1f9.png",
"workflowUpdated": true,
"gistId": "26ceca2d68f2f523bd6caecc10a2ec07",
"templateDescriptionFull": "When collecting leads via an online form, you often need to manually add those new leads into your Pipedrive CRM. This not only takes a lot of time but is also error-prone. This workflow automates this tedious work for you.\n\nThe workflow is triggered each time a form is submitted in n8n.\nIt validates the email address using Hunter.io.\nIf the email is valid, the workflow checks for an existing person with that email in Pipedrive.\nIf no existing person is found, it utilizes Clearbit to enrich the person's information.\nIt then verifies if the person's organization already exists in Pipedrive, creating a new organization if necessary.\nThe workflow then registers the person in Pipedrive.\nLastly, it creates a lead in Pipedrive using information from the person and organization.\n\nThis workflow is very quick to set up.\n\nAdd your Hunter.io, Clearbit and Pipedrive credentials\nClick the test workflow button\nActivate the workflow and use the form trigger production URL to collect your leads in a smart way\n\nExchange the n8n form trigger with your form of choice (Typeform, Google Forms, SurveyMonkey...)\nAdd a filter criteria to only add new leads if they match certain requirements\nRemove the email check with Hunter.io if you don't own this tool and expect new form submission to have a correct email anyways\nAdd ways to handle invalid emails or existing Persons",
"isPaid": false
},
{
"templateId": "2113",
"templateName": "template_2113",
"templateDescription": "Use CaseWhen trying to maximize your outreach, website visitors are often an overlooked source of qualified new leads. This workflow allows your to track...",
"templateUrl": "https://n8n.io/workflows/2113",
"jsonFileName": "template_2113.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_2113.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/f813df9c7a948c90afb4daf5d3d2029d/raw/e726fc5aa64e1b049dd872402a6a2664ad6df4a2/template_2113.json",
"screenshotURL": "https://i.ibb.co/JR65053s/a1d934f6b4d3.png",
"workflowUpdated": true,
"gistId": "f813df9c7a948c90afb4daf5d3d2029d",
"templateDescriptionFull": "When trying to maximize your outreach, website visitors are often an overlooked source of qualified new leads. This workflow allows your to track and enrich new website visitors and saves them to a Google Sheet once they meet a pre-defined criteria.\n\nThis workflow fires once a day and gets all your leads saved in Leadfeeder. It then takes the leads that meet a pre-defined engagement criteria, e.g. that they visited your site 3 times, and enriches them additionally with Clearbit. From there it filters the leads again by a criteria on the company, e.g. a minimum employee count, and saves matching leads into a Google Sheet document.\n\nAdd your Leedfeeder credentials. The name should be Authorization and the value Token token=yourapitoken. You can find your token via Settings -> Personal -> API-Token\nAdd your Google Sheet credentials\nSave the Leedfeeder account names you want to use in the Setup node\nCopy the Google Sheets Template and add its URL to the Setup node\n\nAdjust and/or remove the engagement and company criteria\nAdd more ways to enrich a company\n\nAutomatically reach out to users that meet the criteria / that get added to the sheet\nCreate a workflow that finds the right employee in companies that are identified by this workflow",
"isPaid": false
},
{
"templateId": "2109",
"templateName": "template_2109",
"templateDescription": "Who is this template for?This workflow template is designed for Sales and Customer Success professionals seeking alerts when potential high-value users,...",
"templateUrl": "https://n8n.io/workflows/2109",
"jsonFileName": "template_2109.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_2109.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/d30f20de0c6f8e61a0a0090ffa1072ec/raw/b2b34a6e453e7b68ed402fa83b167b48b23f7ccc/template_2109.json",
"screenshotURL": "https://i.ibb.co/Jwyjm76s/fa9bddfacef8.png",
"workflowUpdated": true,
"gistId": "d30f20de0c6f8e61a0a0090ffa1072ec",
"templateDescriptionFull": "This workflow template is designed for Sales and Customer Success professionals seeking alerts when potential high-value users, prospects, or existing customers register for a Discourse community. Leveraging Clearbit, it retrieves enriched data for the new member to assess their value.\n\n\n\nEach time a new member is created in Discourse, the workflow runs (powered by Discourse's native Webhooks feature).\nAfter filtering out popular private email accounts, we run the member's email through Clearbit to fetch available information on the member as well as their organization.\nIf the enriched data meets certain criteria, we send a Slack message to a channel. This message has a few quick actions: Open LinkedIn profile and Email member\n\nOverview is below. Watch this 🎥 quick set up video for detailed instructions on how to get the template running, as well as how to customize it.\n\nComplete the Set up credentials step when you first open the workflow. You'll need a Discourse (admin user), Clearbit, and Slack account.\nSet up the Webhook in Discourse, linking the On new Discourse user Trigger with your Discourse community.\nSet the correct channel to send to in the Post message in channel step\nAfter testing your workflow, swap the Test URL to Production URL in Discourse and activate your workflow\n\nTemplate was created in n8n v1.29.1",
"isPaid": false
},
{
"templateId": "1296",
"templateName": "template_1296",
"templateDescription": "This workflow enriches the information of a new contact that gets added to HubSpot. workflow-screenshot HubSpot Trigger: This node triggers the workflow...",
"templateUrl": "https://n8n.io/workflows/1296",
"jsonFileName": "template_1296.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_1296.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/9457e9013c0f0af48e709417271c1a77/raw/5d89904f24fc849abbd35509c646794de6609968/template_1296.json",
"screenshotURL": "https://i.ibb.co/rfK8W11p/589ac99809b1.png",
"workflowUpdated": true,
"gistId": "9457e9013c0f0af48e709417271c1a77",
"templateDescriptionFull": "This workflow enriches the information of a new contact that gets added to HubSpot.\n\n\n\nHubSpot Trigger: This node triggers the workflow when a new contact gets added to HubSpot.\n\nGet Contact: This node fetches the information of the new contact.\n\nClearbit: This node returns the data of the person and the company associated with the email address.\n\nUpdate Contact: This node will update the contact with the information returned by the Clearbit node. Based on your use case, you can select which fields you want to update.",
"isPaid": false
},
{
"templateId": "2124",
"templateName": "template_2124",
"templateDescription": "How it worksIt’s very important to come prepared to Sales calls. This often means a lot of manual research about the person you’re calling with. This...",
"templateUrl": "https://n8n.io/workflows/2124",
"jsonFileName": "template_2124.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_2124.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/73d4fb7c713d14e00a1e4172bb2eac1c/raw/d1566e6124edc1f5f87614a193ece0a350b0f4de/template_2124.json",
"screenshotURL": "https://i.ibb.co/KxNCsg64/fa63b82e153a.png",
"workflowUpdated": true,
"gistId": "73d4fb7c713d14e00a1e4172bb2eac1c",
"templateDescriptionFull": "It’s very important to come prepared to Sales calls. This often means a lot of manual research about the person you’re calling with. This workflow delivers the latest social media activity (LinkedIn + X) for businesses you are about to interact with each day.\n\nScans Your Calendar: Each morning, it reviews your Google Calendar for any scheduled meetings or calls with companies based on each attendee email address.\nFetches Latest Posts: For each identified company, it fetches recent LinkedIn and X posts\nDelivers Insights: You receive personalized emails via Gmail, each dedicated to a company you’re meeting with that day, containing a reminder of the meeting, list of posts categorized by the social media platform, and direct links to posts.\n\nThe workflow requires you to have the following accounts set up in their respective nodes:\n\nGoogle Calendar\nGMail\nClearbit\n\nBesides those, you will need an account on the RapidAPI platform and subscribe to the following APIs:\n\nFresh LinkedIn Profile Data\nTwitter",
"isPaid": false
},
{
"templateId": "2106",
"templateName": "template_2106",
"templateDescription": "Use caseWhen collecting new leads via a form, you need to follow up on new submissions. Often, this required a lot of manual work that includes reviewing...",
"templateUrl": "https://n8n.io/workflows/2106",
"jsonFileName": "template_2106.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_2106.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/dd3bcea56c2522e35db55f775aad2e98/raw/d275a062d4e4cf86dfbaa797bfe8dce32a8fa6bf/template_2106.json",
"screenshotURL": "https://i.ibb.co/dhm7fqn/9fd62a6072d7.png",
"workflowUpdated": true,
"gistId": "dd3bcea56c2522e35db55f775aad2e98",
"templateDescriptionFull": "When collecting new leads via a form, you need to follow up on new submissions. Often, this required a lot of manual work that includes reviewing each submission, checking if they meet your criteria and then outreaching. With this workflow you can do all of that fully automatically and save a lot of your valuable time.\n\nThis workflow runs every time you're receiving a new submission from an n8n form. It then filters out typical personal emails (such as Gmail, Hotmail, Yahoo etc.) before enriching the submission via Clearbit. It then checks, if the company of the submitter is a B2B company and has more than 499 employees. If it does, it sends an email via Gmail to the user.\n\nAdd the Clearbit and Gmail credentials\nClick on Test Workflow\nEnter your own email (which needs to be a business email to work) in the Form\nCheck your email\nOnce you're happy don't forget to activate this workflow\n\nReplace the form trigger with your form provider of choice (e.g. Typeform, SurveyMonkey, Google Forms etc.)\nAdjust the criteria to your needs via the If node\nAdjust the email you're sending in the Gmail node",
"isPaid": false
},
{
"templateId": "2390",
"templateName": "ProspectLens company research",
"templateDescription": "This n8n workflow automates the process of researching companies by gathering relevant data such as traffic volume, foundation details, funding information,...",
"templateUrl": "https://n8n.io/workflows/2390",
"jsonFileName": "ProspectLens_company_research.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/ProspectLens_company_research.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/e7e7e687c61c2f71a9ec4f3f25afc4d1/raw/22c0b8497b34aa8599f1c32f6c3e98971ef02cfd/ProspectLens_company_research.json",
"screenshotURL": "https://i.ibb.co/qFCJLvRg/c55031df9d46.png",
"workflowUpdated": true,
"gistId": "e7e7e687c61c2f71a9ec4f3f25afc4d1",
"templateDescriptionFull": "This n8n workflow automates the process of researching companies by gathering relevant data such as traffic volume, foundation details, funding information, founders, and more.\n\nThe workflow leverages the ProspectLens API, which is particularly useful for researching companies commonly found on Crunchbase and LinkedIn.\n\nProspectLens is an API that provides very detailed company data. All you need to do is supply the company's domain name.\n\nYou can obtain your ProspectLens API key here:\nhttps://apiroad.net/marketplace/apis/prospectlens\n\nIn n8n, create a new \"HTTP Header\" credential. Set x-apiroad-key as the \"Name\" and enter your APIRoad API key as the \"Value\". Use this credential in the HTTP Request node of the workflow.",
"isPaid": false
},
{
"templateId": "3664",
"templateName": "Lead Qualification with BatchData",
"templateDescription": "How It WorksThis workflow automates the real estate lead qualification process by leveraging property data from BatchData. The automation follows these...",
"templateUrl": "https://n8n.io/workflows/3664",
"jsonFileName": "Lead_Qualification_with_BatchData.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Lead_Qualification_with_BatchData.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/7db22ddd3431c244d1396482522d42e9/raw/dc41fa7a68f27822d61d89cb75f803d5bd768079/Lead_Qualification_with_BatchData.json",
"screenshotURL": "https://i.ibb.co/jvmWRyLX/2e003844fccb.png",
"workflowUpdated": true,
"gistId": "7db22ddd3431c244d1396482522d42e9",
"templateDescriptionFull": "This workflow automates the real estate lead qualification process by leveraging property data from BatchData. The automation follows these steps:\n\nWhen a new lead is received through your CRM webhook, the workflow captures their address information\nIt then makes an API call to BatchData to retrieve comprehensive property details\nA sophisticated scoring algorithm evaluates the lead based on property characteristics like:\n\nProperty value (higher values earn more points)\nSquare footage (larger properties score higher)\nProperty age (newer constructions score higher)\nInvestment status (non-owner occupied properties earn bonus points)\nLot size (larger lots receive additional score)\n\nLeads are automatically classified into categories (high-value, qualified, potential, or unqualified)\nThe workflow updates your CRM with enriched property data and qualification scores\nHigh-value leads trigger immediate follow-up tasks for your team\nNotifications are sent to your preferred channel (Slack in this example)\n\nThe entire process happens within seconds of receiving a new lead, ensuring your sales team can prioritize the most valuable opportunities immediately..\n\nThis workflow is perfect for:\n\nReal estate agents and brokers looking to prioritize high-value property leads\nMortgage lenders who need to qualify borrowers based on property assets\nHome service providers (renovators, contractors, solar installers) targeting specific property types\nProperty investors seeking specific investment opportunities\nReal estate marketers who want to segment audiences by property value\nHome insurance agents qualifying leads based on property characteristics\n\nAny business that bases lead qualification on property details will benefit from this automated qualification system.\n\nBatchData is a comprehensive property data provider that offers detailed information about residential and commercial properties across the United States. 
Their API provides:\n\nProperty valuation and estimates\nOwnership information\nProperty characteristics (size, age, bedrooms, bathrooms)\nTax assessment data\nTransaction history\nOccupancy status (owner-occupied vs. investment)\nLot details and dimensions\n\nBy integrating BatchData with your lead management process, you can automatically verify and enrich leads with accurate property information, enabling more intelligent lead scoring and routing based on actual property characteristics rather than just contact information.\n\nThis workflow demonstrates how to leverage BatchData's property API to transform your lead qualification process from manual research into an automated, data-driven system that ensures high-value leads receive immediate attention.",
"isPaid": false
},
{
"templateId": "2039",
"templateName": "template_2039",
"templateDescription": "About the workflowThe workflow reads every reply that is received from a cold email campaign and qualifies if the lead is interested in a meeting. If the...",
"templateUrl": "https://n8n.io/workflows/2039",
"jsonFileName": "template_2039.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_2039.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/2212b4fb57e32303a7402baf56f5e5bd/raw/a9c50b649a2d2cef9435e03f5d30d4c280beb238/template_2039.json",
"screenshotURL": "https://i.ibb.co/N2Z9JK7t/067efa6f5f1f.png",
"workflowUpdated": true,
"gistId": "2212b4fb57e32303a7402baf56f5e5bd",
"templateDescriptionFull": "The workflow reads every reply that is received from a cold email campaign and qualifies if the lead is interested in a meeting. If the lead is interested, a deal is made in pipedrive. You can add as many email inboxes as you need!\n\nAdd credentials to the Gmail, OpenAI and Pipedrive Nodes.\nAdd a in_campaign field in Pipedrive for persons. In Pipedrive click on your credentials at the top right, go to company settings > Data fields > Person and click on add custom field. Single option [TRUE/FALSE].\nIf you have only one email inbox, you can delete one of the Gmail nodes.\nIf you have more than two email inboxes, you can duplicate a Gmail node as many times as you like. Just connect it to the Get email node, and you are good to go!\nIn the Gmail inbox nodes, select Inbox under label names and uncheck Simplify.",
"isPaid": false
},
{
"templateId": "1822",
"templateName": "Two Way Sync Pipedrive and MySQL",
"templateDescription": "This workflow automates a two way sync of customer data between Pipedrive and MySQL. It will create new records in one source if it only exists in the...",
"templateUrl": "https://n8n.io/workflows/1822",
"jsonFileName": "Two_Way_Sync_Pipedrive_and_MySQL.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Two_Way_Sync_Pipedrive_and_MySQL.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/e154d6d747597b6721c4ebf639609f5b/raw/417df77ae90533945ef9ef8e39f6fbd6f859d5bf/Two_Way_Sync_Pipedrive_and_MySQL.json",
"screenshotURL": "https://i.ibb.co/nM1VsQXZ/cdb97758e468.png",
"workflowUpdated": true,
"gistId": "e154d6d747597b6721c4ebf639609f5b",
"templateDescriptionFull": "This workflow automates a two way sync of customer data between Pipedrive and MySQL. It will create new records in one source if it only exists in the other. Where matching records have different data for name, phone number or email address, it will sync the most recently updated version.",
"isPaid": false
},
{
"templateId": "2135",
"templateName": "template_2135",
"templateDescription": "Use CaseThis workflow is beneficial when you're automatically adding new leads to your Pipedrive CRM. Usually, you'd have to manually review each lead to...",
"templateUrl": "https://n8n.io/workflows/2135",
"jsonFileName": "template_2135.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_2135.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/235329c450b1ab68da3c24d23921134d/raw/9f0b4ca82233d1178b29c462c5108f5b081b31e7/template_2135.json",
"screenshotURL": "https://i.ibb.co/v6BQxHT0/916ad8a25088.png",
"workflowUpdated": true,
"gistId": "235329c450b1ab68da3c24d23921134d",
"templateDescriptionFull": "This workflow is beneficial when you're automatically adding new leads to your Pipedrive CRM. Usually, you'd have to manually review each lead to determine if they're a good fit. This process is time-consuming and increases the chances of missing important leads. This workflow ensures every new lead is promptly evaluated upon addition.\n\nThe workflow runs every 5 minutes. On every run, it checks your new Pipedrive leads and enriches them with Clearbit. It then marks items as enriched and checks if the company of the new lead matches certain criteria (in this case if they are B2B and have more than 100 employees) and sends a Slack alert to a channel for every match.\n\nYou must have Pipedrive, Clearbit, and Slack accounts. You also need to set up the custom fields Domain and Enriched at in Pipedrive.\n\nGo to Company Settings -> Data fields -> Organization and add Domain as a custom field\nGo to Company Settings -> Data fields -> Leads and add Enriched at as a custom date field\nAdd your Pipedrive, Clearbit and Slack credentials.\nFill the setup node below. To get the ID of your custom domain fields, simply run the Show only custom organization fields and Show only custom lead fields nodes below and copy the keys of your domain, and enriched at fields.\n\nModify the criteria to suit your definition of an interesting lead.\nIf you only want to focus on interesting leads in Pipedrive, add a node that archives all others.\n\nThis workflow was built using n8n version 1.29.1",
"isPaid": false
},
{
"templateId": "1787",
"templateName": "template_1787",
"templateDescription": "This workflow gets leads' contacts from a CSV file and adds it to the Pipedrive CRM by creating an organization and a person. The CSV file in this workflow...",
"templateUrl": "https://n8n.io/workflows/1787",
"jsonFileName": "template_1787.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_1787.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/805880c818026733dc6f21b83eac0442/raw/2bc192458573e0aa00a3a961bf1da8bb1c0c233c/template_1787.json",
"screenshotURL": "https://i.ibb.co/NR7sP6G/810529fbf7b0.png",
"workflowUpdated": true,
"gistId": "805880c818026733dc6f21b83eac0442",
"templateDescriptionFull": "This workflow gets leads' contacts from a CSV file and adds it to the Pipedrive CRM by creating an organization and a person. The CSV file in this workflow serves as a universal connector allowing you to export contacts from any platform like LinkedIn, Facebook, etc.\n\nGoogle account and Google credentials\nPipedrive account and Pipedrive credentials\n\nThe Google Drive Trigger node starts the workflow when a new CSV file is uploaded to a specific folder in Google Drive.\nGoogle Drive node downloads the CSV file.\nSpreadsheet File node reads data from the CSV file and sends the output to the Merge node. This Spreadsheet File's output becomes the input 1 for the Merge node.\nMeanwhile, the Pipedrive node gets the same list of contacts from the CSV file.\nIF node checks if Pipedrive has these contacts already created previously and sends the checked results to the Merge node. These results arrive at the Merge node as input 2.\nMerge node compares two inputs via email and removes the matches.\nPipedrive node creates new contacts based on the data provided by the Merge node with necessary details such as organization and notes.",
"isPaid": false
},
{
"templateId": "1776",
"templateName": "template_1776",
"templateDescription": "This workflow combines customers' details with their payment data and passes the input to Pipedrive as a note to the organization. Prerequisites Stripe...",
"templateUrl": "https://n8n.io/workflows/1776",
"jsonFileName": "template_1776.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_1776.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/3076c62bc6fb8dd379a9a3a109c9c599/raw/d548f07530c8a28f78842f0402a5fb3cde4f8441/template_1776.json",
"screenshotURL": "https://i.ibb.co/FbHjJLwW/b2beb3a9aeb2.png",
"workflowUpdated": true,
"gistId": "3076c62bc6fb8dd379a9a3a109c9c599",
"templateDescriptionFull": "This workflow combines customers' details with their payment data and passes the input to Pipedrive as a note to the organization.\n\nStripe account and Stripe credentials\nPipedrive account and Pipedrive credentials\n\nCron node triggers the workflow every day at 8 a.m.\nHTTP Request node searches for payments in Stripe.\nThe Item Lists node creates separate items from a list of payment data.\nMerge node takes in the payment data as an input 1.\nStripe node gets all the customers data.\nSet node renames customer-related data fields and keeps only needed fields.\nMerge node takes in the customer data as an input 2.\nMerge node combines the payment data with the customers one.\nPipedrive node searches for the organization and creates a note with payment data.",
"isPaid": false
},
{
"templateId": "1758",
"templateName": "template_1758",
"templateDescription": "This automated workflow takes a Typeform form, and once it is filled out, it is automatically uploaded as a Lead in Pipedrive. There is an option for custom...",
"templateUrl": "https://n8n.io/workflows/1758",
"jsonFileName": "template_1758.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_1758.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/491fbd267f2c71991675a173b88b113e/raw/c594940264898ab287c2b87ba7ce43062183cb24/template_1758.json",
"screenshotURL": "https://i.ibb.co/dhm7fqn/9fd62a6072d7.png",
"workflowUpdated": true,
"gistId": "491fbd267f2c71991675a173b88b113e",
"templateDescriptionFull": "This automated workflow takes a Typeform form, and once it is filled out, it is automatically uploaded as a Lead in Pipedrive. There is an option for custom fields (this workflow works with company size), and leaves notes in the note section based on questions answered.\n\nTypeform account and Typeform credentials and a form for people to fill out\nPipedrive account and Pipedrive credentials\n\nTypeform node gets the data after the survey is completed\nSet node extracts data from the Typeform node and keeps only relevant data\nFunction node maps the company size\nPipedrive node populates a pipeline with a deal and adds custom fields",
"isPaid": false
},
{
"templateId": "1333",
"templateName": "template_1333",
"templateDescription": "This workflow synchronizes data both ways between Pipedrive and HubSpot. workflow-screenshot Cron node schedules the workflow to run every minute.Pipedrive...",
"templateUrl": "https://n8n.io/workflows/1333",
"jsonFileName": "template_1333.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_1333.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/fc91c588f949948edd5952076c0d88b1/raw/e7250b28b04a16f02f8f61a01b2788fc6e754575/template_1333.json",
"screenshotURL": "https://i.ibb.co/qFCJLvRg/c55031df9d46.png",
"workflowUpdated": true,
"gistId": "fc91c588f949948edd5952076c0d88b1",
"templateDescriptionFull": "This workflow synchronizes data both ways between Pipedrive and HubSpot.\n\n\n\nCron node schedules the workflow to run every minute.\nPipedrive and Hubspot nodes pull in both lists of persons from Pipedrive and contacts from HubSpot.\nMerge1 and Merge2 nodes with the option Remove Key Matches identify the items that uniquely exist in HubSpot and Pipedrive, respectively.\nUpdate Pipedrive and Update HubSpot nodes take those unique items and add them in Pipedrive and HubSpot, respectively.",
"isPaid": false
},
{
"templateId": "2318",
"templateName": "piepdrive-test",
"templateDescription": "This workflow enriches new Pipedrive organization's data by adding a note to the organization object in Pipedrive. It assumes there is a custom \"website\"...",
"templateUrl": "https://n8n.io/workflows/2318",
"jsonFileName": "piepdrive-test.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/piepdrive-test.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/08c076057a4d12237b27c71a2f4ebd88/raw/af3348b809cafa338eb2fc8e2e1bd72f14d33ffd/piepdrive-test.json",
"screenshotURL": "https://i.ibb.co/0RZQHkCz/55e6933c170e.png",
"workflowUpdated": true,
"gistId": "08c076057a4d12237b27c71a2f4ebd88",
"templateDescriptionFull": "This workflow enriches new Pipedrive organization's data by adding a note to the organization object in Pipedrive. It assumes there is a custom \"website\" field in your Pipedrive setup, as data will be scraped from this website to generate a note using OpenAI. Then, a notification is sent in Slack.\n\nThis workflow uses a scraping API. Before using it, ensure you comply with the regulations regarding web scraping in your country or state.\n\nThe OpenAI model used is GPT-4o, chosen for its large input token capacity. However, it is not the cheapest model if cost is very important to you.\nThe system prompt in the OpenAI Node generates output with relevant information, but feel free to improve or modify it according to your needs.\n\nThis is the trigger of the workflow. When an organization object is created in Pipedrive, this node is triggered and retrieves the data. Make sure you have a \"website\" custom field in Pipedrive (the name of the field in the n8n node will appear as a random ID and not with the Pipedrive custom field name).\n\nThis node scrapes the content from the URL of the website associated with the Pipedrive Organization created in Node 1. The workflow uses the ScrapingBee API, but you can use any preferred API or simply the HTTP request node in n8n.\n\nThis node sends HTML-scraped data from the previous node to the OpenAI GPT-4o model. The system prompt instructs the model to extract company data, such as products or services offered and competitors (if known by the model), and format it as HTML for optimal use in a Pipedrive Note.\n\nThis node adds a Note to the Organization created in Pipedrive using the OpenAI node output. 
The Note will include the company description, target market, selling products, and competitors (if GPT-4o was able to determine them).\n\nThese two nodes format the HTML output to Slack Markdown.\n\nThe Note created in Pipedrive is in HTML format, as specified by the System Prompt of the OpenAI Node. To send it to Slack, it needs to be converted to Markdown and then to Slack Markdown.\n\nThis node sends a message in Slack containing the Pipedrive Organization Note created with this workflow.",
"isPaid": false
},
{
"templateId": "1998",
"templateName": "WordPress-to-Pipedrive Integration: Automating Contact & Lead Management",
"templateDescription": "How it Works: Capture Contact Requests: This template efficiently handles contact requests coming through a WordPress website using the Contact Form 7 (CF7)...",
"templateUrl": "https://n8n.io/workflows/1998",
"jsonFileName": "WordPress-to-Pipedrive_Integration_Automating_Contact__Lead_Management.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/WordPress-to-Pipedrive_Integration_Automating_Contact__Lead_Management.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/d22566c8987b6a3953042ee0789ad102/raw/f0e28a5144ec2fe7a4e336628df4f581115f6ff8/WordPress-to-Pipedrive_Integration_Automating_Contact__Lead_Management.json",
"screenshotURL": "https://i.ibb.co/Y4pW7Gm2/acb7ddc5d527.png",
"workflowUpdated": true,
"gistId": "d22566c8987b6a3953042ee0789ad102",
"templateDescriptionFull": "Capture Contact Requests: This template efficiently handles contact requests coming through a WordPress website using the Contact Form 7 (CF7) plugin with a webhook extension.\nContact Management: It automatically creates or updates contacts in Pipedrive upon receiving a new request.\nLead Management: Each contact request is securely stored in the lead inbox of Pipedrive, ensuring no opportunity is missed.\nTask Creation: For each new contact or update, the workflow triggers the creation of a related task, streamlining follow-up actions.\nNote Attachment: A comprehensive note containing all details from the contact request is attached to the corresponding lead, ensuring that all information is readily accessible.\n\nEstimated Setup Time: The setup process is straightforward and can be completed quickly. Specific time may vary depending on your familiarity with n8n and the systems involved.\n\nDetailed setup instructions are provided within the workflow via sticky notes. These notes offer in-depth guidance for configuring each component of the template to suit your specific needs.",
"isPaid": false
},
{
"templateId": "2319",
"templateName": "2. Refresh Pipedrive tokens",
"templateDescription": "This workflow provides an OAuth 2.0 auth token refresh process for better control. Developers can utilize it as an alternative to n8n's built-in OAuth flow...",
"templateUrl": "https://n8n.io/workflows/2319",
"jsonFileName": "2._Refresh_Pipedrive_tokens.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/2._Refresh_Pipedrive_tokens.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/b2c63b516289e1515954d987b0c241c6/raw/ef3238f46e7759f63a70f1a5e9af4a5978b03c5f/2._Refresh_Pipedrive_tokens.json",
"screenshotURL": "https://i.ibb.co/SwGyMwcY/f16b31074340.png",
"workflowUpdated": true,
"gistId": "b2c63b516289e1515954d987b0c241c6",
"templateDescriptionFull": "This workflow provides an OAuth 2.0 auth token refresh process for better control. Developers can utilize it as an alternative to n8n's built-in OAuth flow to achieve improved control and visibility. In this template, I've used Pipedrive API, but users can apply it with any app that requires the authorization_code for token access.\n\nThis resolves the issue of manually refreshing the OAuth 2.0 token when it expires, or when n8n's native OAuth stops working.\n\nYour database with a pre-existing table for storing authentication tokens and associated information. I'm using Supabase in this example, but you can also employ a self-hosted MySQL.\n\nHere's a quick video on setting up the Supabase table.\n\nCreate a client app for your chosen application that you want to access via the API.\nAfter duplicating the template:\n\na. Add credentials to your database and connect the DB nodes in all 3 workflows.\n\nEnable/Publish the first workflow, \"1. Generate and Save Pipedrive tokens to Database.\"\nOpen your client app and follow the Pipedrive instructions to authenticate.\n\nClick on Install and test.\n\nThis will save your initial refresh token and access token to the database.\n\nPlease watch the YouTube video for a detailed demonstration of the workflow:\n\nWorkflow 1. Create a workflow to capture the authorization_code, generate the access_token, and refresh the token, and then save the token to the database.\n\nWorkflow 2. Develop your primary workflow to fetch or post data to/from your application. Observe the logic to include an if condition when an error occurs with an invalid token. This triggers the third workflow to refresh the token.\n\nWorkflow 3. This workflow will handle the token refresh. Remember to send the unique ID to the webhook to fetch the necessary tokens from your table.\n\nDetailed demonstration of the workflow:\nhttps://youtu.be/6nXi_yverss",
"isPaid": false
},
{
"templateId": "4912",
"templateName": "Lead Workflow: Yelp & Trustpilot Scraping + OpenAI Analysis via BrightData",
"templateDescription": "🛒 Lead Workflow: Yelp & Trustpilot Scraping + OpenAI Analysis via BrightData &gt; Description: Automated lead generation workflow that scrapes business...",
"templateUrl": "https://n8n.io/workflows/4912",
"jsonFileName": "Lead_Workflow_Yelp__Trustpilot_Scraping__OpenAI_Analysis_via_BrightData.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Lead_Workflow_Yelp__Trustpilot_Scraping__OpenAI_Analysis_via_BrightData.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/33ec6821f576799532ae2d4a01e57d4c/raw/5327ab3d0ee5c8a1bdb735b469d7ac17eacb1260/Lead_Workflow_Yelp__Trustpilot_Scraping__OpenAI_Analysis_via_BrightData.json",
"screenshotURL": "https://i.ibb.co/FbHjJLwW/b2beb3a9aeb2.png",
"workflowUpdated": true,
"gistId": "33ec6821f576799532ae2d4a01e57d4c",
"templateDescriptionFull": "This workflow provides an automated lead generation solution that identifies high-quality prospects from Yelp and Trustpilot, analyzes their credibility through reviews, and sends personalized outreach emails. Perfect for digital marketing agencies, sales teams, and business development professionals.\n\n🎯 Smart Location Analysis\nAI breaks down cities into sub-locations for comprehensive coverage\n🛍 Yelp Integration\nScrapes business details using BrightData's Yelp dataset\n⭐ Trustpilot Verification\nValidates business credibility through review analysis\n📊 Data Storage\nAutomatically saves results to Google Sheets\n🤖 AI-Powered Outreach\nGenerates personalized emails using Claude AI\n📧 Automated Sending\nSends emails directly through Gmail integration\n\nUser Input: Submit location, country, and business category through a form\nAI Location Analysis: Gemini AI identifies sub-locations within the specified area\nYelp Scraping: BrightData extracts business information from multiple locations\nData Processing: Cleans and stores business details in Google Sheets\nTrustpilot Verification: Scrapes reviews and company details for credibility check\nEmail Generation: Claude AI creates personalized outreach messages\nAutomated Outreach: Sends emails to qualified prospects via Gmail\n\n⏱️ Estimated Setup Time: 10–15 minutes\n\nn8n instance (self-hosted or cloud)\nGoogle account with Sheets access\nBrightData account with Yelp and Trustpilot datasets\nGoogle Gemini API access\nAnthropic API key for Claude\nGmail account for sending emails\n\nCopy the JSON workflow code\nIn n8n: Workflows → + Add workflow → Import from JSON\nPaste JSON and click Import\n\nCreate two Google Sheets:\n\nYelp data: Name, Categories, Website, Address, Phone, URL, Rating\nTrustpilot data: Company Name, Email, Phone Number, Address, Rating, Company About\nYelp data: Name, Categories, Website, Address, Phone, URL, Rating\nTrustpilot data: Company Name, Email, Phone 
Number, Address, Rating, Company About\nCopy Sheet IDs from URLs\nIn n8n: Credentials → + Add credential → Google Sheets OAuth2 API\nComplete OAuth setup and test connection\nUpdate all Google Sheets nodes with your Sheet IDs\n\nSet up BrightData credentials in n8n\nReplace API token with: BRIGHT_DATA_API_KEY\nVerify dataset access:\n\nYelp dataset: gd_lgugwl0519h1p14rwk\nTrustpilot dataset: gd_lm5zmhwd2sni130p\nYelp dataset: gd_lgugwl0519h1p14rwk\nTrustpilot dataset: gd_lm5zmhwd2sni130p\nTest connections\n\nGoogle Gemini (Location Analysis)\n\nAdd Google Gemini API credentials\nConfigure model: models/gemini-1.5-flash\nAdd Google Gemini API credentials\nConfigure model: models/gemini-1.5-flash\nClaude AI (Email Generation)\n\nAdd Anthropic API credentials\nConfigure model: claude-sonnet-4-20250514\nAdd Anthropic API credentials\nConfigure model: claude-sonnet-4-20250514\n\nSet up Gmail OAuth2 credentials in n8n\nUpdate \"Send Outreach Email\" node\nTest email sending\n\nActivate the workflow\nTest with sample data:\n\nCountry: United States\nLocation: Dallas\nCategory: Restaurants\nCountry: United States\nLocation: Dallas\nCategory: Restaurants\nVerify data appears in Google Sheets\nCheck that emails are generated and sent\n\nAccess the form trigger URL\nEnter your target criteria:\n\nCountry: Target country\nLocation: City or region\nCategory: Business type (e.g., restaurants)\nCountry: Target country\nLocation: City or region\nCategory: Business type (e.g., restaurants)\nSubmit the form to start the process\n\nYelp Data Sheet: View scraped business information\nTrustpilot Sheet: Review credibility data\nGmail Sent Items: Track outreach emails sent\n\nEdit the \"AI Generate Email Content\" node to customize:\n\nEmail tone and style\nServices mentioned\nCall-to-action messages\nBranding elements\n\nModify rating thresholds\nSet minimum review counts\nAdd geographic restrictions\nFilter by business size\n\nIncrease batch sizes\nAdd delays between requests\nUse 
parallel processing\nAdd error handling\n\n1. BrightData Connection Failed\n\nCause: Invalid API credentials or dataset access\nSolution: Verify credentials and dataset permissions\n\n2. No Data Extracted\n\nCause: Invalid location or changed page structure\nSolution: Verify location names and test other categories\n\n3. Gmail Authentication Issues\n\nCause: Expired OAuth tokens\nSolution: Re-authenticate and check permissions\n\n4. AI Model Errors\n\nCause: API quota exceeded or invalid keys\nSolution: Check usage limits and API key\n\nRate Limiting: Add delays\nError Handling: Retry failed requests\nData Validation: Check for malformed data\nMemory Management: Process in smaller batches\n\nGoal: Find businesses needing marketing\nTarget: Restaurants, retail stores\nApproach: Focus on good-rated but low-online-presence businesses\n\nGoal: Find software solution clients\nTarget: Growing businesses\nApproach: Focus on recent positive reviews\n\nGoal: Find complementary businesses\nTarget: Established businesses\nApproach: Focus on reputation and satisfaction scores\n\nProcessing Time: 5–10 minutes/location\nData Accuracy: 90%+\nSuccess Rate: 85%+\nDaily Capacity: 100–500 leads\n\nAPI Calls: ~10–20 per business\nStorage: Minimal (Google Sheets)\nExecution Time: 3–8 minutes/10 businesses\nNetwork Usage: ~5–10MB/business\n\nn8n Community Forum: community.n8n.io\nDocs: docs.n8n.io\nBrightData Support: Via dashboard\n\nShare improvements\nReport issues and suggestions\nCreate industry-specific variations\nDocument best practices\n\nThis workflow provides a complete solution for automated lead generation and outreach. Customize it to fit your needs and start building your pipeline today!\n\nFor any questions or support, please contact:\n📧 info@incrementors.com\nor fill out this form: Contact Us",
"isPaid": false
},
{
"templateId": "2792",
"templateName": "Scrape Trustpilot Reviews with DeepSeek, Analyze Sentiment with OpenAI",
"templateDescription": "Workflow Overview This workflow automates the process of scraping Trustpilot reviews, extracting key details, analyzing sentiment, and saving the results to...",
"templateUrl": "https://n8n.io/workflows/2792",
"jsonFileName": "Scrape_Trustpilot_Reviews_with_DeepSeek_Analyze_Sentiment_with_OpenAI.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Scrape_Trustpilot_Reviews_with_DeepSeek_Analyze_Sentiment_with_OpenAI.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/5cd6cba47c9aa1134464c2e88abc1dcb/raw/4e9b6ebb8cad0213e2f586b1d00b411e2c200e5f/Scrape_Trustpilot_Reviews_with_DeepSeek_Analyze_Sentiment_with_OpenAI.json",
"screenshotURL": "https://i.ibb.co/fY2vNtY1/5ec70d95effd.png",
"workflowUpdated": true,
"gistId": "5cd6cba47c9aa1134464c2e88abc1dcb",
"templateDescriptionFull": "This workflow automates the process of scraping Trustpilot reviews, extracting key details, analyzing sentiment, and saving the results to Google Sheets. It uses OpenAI for sentiment analysis and HTML parsing for review extraction.\n\nHTTP Request:\n\nFetches review pages from Trustpilot (https://it.trustpilot.com/review/{{company_id}}).\nPaginates through pages (up to max_page limit).\nFetches review pages from Trustpilot (https://it.trustpilot.com/review/{{company_id}}).\nPaginates through pages (up to max_page limit).\nHTML Parsing:\n\nExtracts review URLs using CSS selectors\nSplits the URLs into individual review links.\nExtracts review URLs using CSS selectors\nSplits the URLs into individual review links.\n\nInformation Extractor:\n\nUses DeepSeek to extract structured data from the review:\n\nAuthor: Name of the reviewer.\nRating: Numeric rating (1-5).\nDate: Review date in YYYY-MM-DD format.\nTitle: Review title.\nText: Full review text.\nTotal Reviews: Number of reviews by the user.\nCountry: Reviewer’s country (2-letter code).\nUses DeepSeek to extract structured data from the review:\n\nAuthor: Name of the reviewer.\nRating: Numeric rating (1-5).\nDate: Review date in YYYY-MM-DD format.\nTitle: Review title.\nText: Full review text.\nTotal Reviews: Number of reviews by the user.\nCountry: Reviewer’s country (2-letter code).\nAuthor: Name of the reviewer.\nRating: Numeric rating (1-5).\nDate: Review date in YYYY-MM-DD format.\nTitle: Review title.\nText: Full review text.\nTotal Reviews: Number of reviews by the user.\nCountry: Reviewer’s country (2-letter code).\n\nSentiment Analysis Node:\n\nUses OpenAI to classify the review text as Positive, Neutral, or Negative.\nExample output:{ \n \"category\": \"Positive\", \n \"confidence\": 0.95 \n}\nUses OpenAI to classify the review text as Positive, Neutral, or Negative.\nExample output:{ \n \"category\": \"Positive\", \n \"confidence\": 0.95 \n}\n\nGoogle Sheets Node:\n\nAppends 
or updates the extracted data to a Google Sheet\nAppends or updates the extracted data to a Google Sheet\n\nEdit Fields1 Node:\n\nSet company_id to the Trustpilot company name\nSet max_page to limit the number of pages scraped.\nSet company_id to the Trustpilot company name\nSet max_page to limit the number of pages scraped.\n\nGoogle Sheets Node:\n\nUpdate the documentId with your Google Sheet ID\nEnsure the sheet has the required columns (Id, Data, Nome, etc.).\nUpdate the documentId with your Google Sheet ID\nEnsure the sheet has the required columns (Id, Data, Nome, etc.).\n\nOpenAI Chat Model Node:\n\nAdd your OpenAI API key.\nAdd your OpenAI API key.\nSentiment Analysis Node:\n\nEnsure the categories match your desired sentiment labels (Positive, Neutral, Negative).\nEnsure the categories match your desired sentiment labels (Positive, Neutral, Negative).\n\nNodes:\n\nHTTP Request/HTML: Scrape and parse Trustpilot reviews.\nInformation Extractor: Extract structured review data using DeepSeek.\nSentiment Analysis: Classify review sentiment.\nGoogle Sheets: Save and update review data.\nHTTP Request/HTML: Scrape and parse Trustpilot reviews.\nInformation Extractor: Extract structured review data using DeepSeek.\nSentiment Analysis: Classify review sentiment.\nGoogle Sheets: Save and update review data.\nCredentials:\n\nOpenAI API key.\nDeepSeek API key.\nGoogle Sheets OAuth2.\nOpenAI API key.\nDeepSeek API key.\nGoogle Sheets OAuth2.",
"isPaid": false
},
{
"templateId": "1221",
"templateName": "template_1221",
"templateDescription": "This workflow is triggered when a meeting is scheduled via Calendly. Then, an activity is automatically created in Pipedrive and 15 minutes after the end of...",
"templateUrl": "https://n8n.io/workflows/1221",
"jsonFileName": "template_1221.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_1221.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/7aeaa90737fb29ea2384ea12bebc1132/raw/3b2e847c78b8a719932c27a9daa8484e32a9581e/template_1221.json",
"screenshotURL": "https://i.ibb.co/p6QqR8PQ/49dbd81d849b.png",
"workflowUpdated": true,
"gistId": "7aeaa90737fb29ea2384ea12bebc1132",
"templateDescriptionFull": "This workflow is triggered when a meeting is scheduled via Calendly. Then, an activity is automatically created in Pipedrive and 15 minutes after the end of the meeting, a message is sent to the interviewer in Slack, reminding them to write down their notes and insights from the meeting.",
"isPaid": false
},
{
"templateId": "3168",
"templateName": "Scrape Trustpilot Reviews to Google Sheets",
"templateDescription": "This workflow scrapes Trustpilot reviews for a given profile and saves them into Google Sheets. How It Works Clone this Google Sheets template, which...",
"templateUrl": "https://n8n.io/workflows/3168",
"jsonFileName": "Scrape_Trustpilot_Reviews_to_Google_Sheets.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Scrape_Trustpilot_Reviews_to_Google_Sheets.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/20b04adc961f4b3a3deef03235118ec7/raw/0d679b3cd43101a119c4dbe9aeae7ae1b543d934/Scrape_Trustpilot_Reviews_to_Google_Sheets.json",
"screenshotURL": "https://i.ibb.co/SDTTvF6P/026e350e8391.png",
"workflowUpdated": true,
"gistId": "20b04adc961f4b3a3deef03235118ec7",
"templateDescriptionFull": "This workflow scrapes Trustpilot reviews for a given profile and saves them into Google Sheets.\n\nClone this Google Sheets template, which includes two sheets:\n\nA raw collection of Trustpilot reviews. You can customize it as needed.\n\nThis sheet follows the format from this HelpfulCrowd guide, with a slight modification: an added review_id column to support the upsert process.\n\nOnce the workflow is complete, export the sheet as a CSV and upload it to HelpfulCrowd. For detailed steps, see this post.\n\nYou can trigger the workflow on-demand or schedule it to run at a set interval.\n\nTrustpilot business name (e.g., n8n.io in https://www.trustpilot.com/review/n8n.io). Update this name and pagination settings in the Global node.\nGoogle Sheets API credentials\n\nCheck out my other templates:\n👉 My n8n Templates",
"isPaid": false
},
{
"templateId": "2852",
"templateName": "Email AI Auto-responder. Summerize and send email",
"templateDescription": "This workflow is ideal for businesses looking to automate their email responses, especially for handling inquiries about company information. It leverages...",
"templateUrl": "https://n8n.io/workflows/2852",
"jsonFileName": "Email_AI_Auto-responder._Summerize_and_send_email.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Email_AI_Auto-responder._Summerize_and_send_email.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/ae28fb053bb7e4d1945c09b8d6ab59f9/raw/1abddbc285062c9d46846622fee6974be812df4f/Email_AI_Auto-responder._Summerize_and_send_email.json",
"screenshotURL": "https://i.ibb.co/DNmQRTW/ec1a1f644e74.png",
"workflowUpdated": true,
"gistId": "ae28fb053bb7e4d1945c09b8d6ab59f9",
"templateDescriptionFull": "This workflow is ideal for businesses looking to automate their email responses, especially for handling inquiries about company information. It leverages AI to ensure accurate and professional communication.\n\nEmail Trigger:\n\nThe workflow starts with the Email Trigger (IMAP) node, which monitors an email inbox for new messages. When a new email arrives, it triggers the workflow.\nThe workflow starts with the Email Trigger (IMAP) node, which monitors an email inbox for new messages. When a new email arrives, it triggers the workflow.\nEmail Preprocessing:\n\nThe Markdown node converts the email's HTML content into plain text for easier processing by the AI models.\nThe Markdown node converts the email's HTML content into plain text for easier processing by the AI models.\nEmail Summarization:\n\nThe Email Summarization Chain node uses an AI model (DeepSeek R1) to generate a concise summary of the email. The summary is limited to 100 words and is written in Italian.\nThe Email Summarization Chain node uses an AI model (DeepSeek R1) to generate a concise summary of the email. The summary is limited to 100 words and is written in Italian.\nEmail Classification:\n\nThe Email Classifier node categorizes the email into predefined categories (e.g., \"Company info request\"). If the email does not fit any category, it is classified as \"other\".\nThe Email Classifier node categorizes the email into predefined categories (e.g., \"Company info request\"). If the email does not fit any category, it is classified as \"other\".\nEmail Response Generation:\n\nThe Write email node uses an AI model (OpenAI) to draft a professional response to the email. The response is based on the email content and is limited to 100 words.\nThe Review email node uses another AI model (DeepSeek) to review and format the drafted response. 
It ensures the response is professional and formatted in HTML (e.g., using &lt;br&gt;, &lt;b&gt;, &lt;i&gt;, &lt;p&gt; tags where necessary).\nThe Write email node uses an AI model (OpenAI) to draft a professional response to the email. The response is based on the email content and is limited to 100 words.\nThe Review email node uses another AI model (DeepSeek) to review and format the drafted response. It ensures the response is professional and formatted in HTML (e.g., using &lt;br&gt;, &lt;b&gt;, &lt;i&gt;, &lt;p&gt; tags where necessary).\nEmail Sending:\n\nThe Send Email node sends the reviewed and formatted response back to the original sender.\nThe Send Email node sends the reviewed and formatted response back to the original sender.\nVector Database Integration:\n\nThe Qdrant Vector Store node retrieves relevant information from a vector database (Qdrant) to assist in generating accurate responses. This is particularly useful for emails classified as \"Company info request\".\nThe Embeddings OpenAI node generates embeddings for the email content, which are used to query the vector database.\nThe Qdrant Vector Store node retrieves relevant information from a vector database (Qdrant) to assist in generating accurate responses. 
This is particularly useful for emails classified as \"Company info request\".\nThe Embeddings OpenAI node generates embeddings for the email content, which are used to query the vector database.\nDocument Vectorization:\n\nThe workflow includes steps to create and refresh a Qdrant collection (Create collection and Refresh collection nodes).\nDocuments from Google Drive are downloaded (Get folder and Download Files nodes), processed into embeddings (Embeddings OpenAI1 node), and stored in the Qdrant vector store (Qdrant Vector Store1 node).\nThe workflow includes steps to create and refresh a Qdrant collection (Create collection and Refresh collection nodes).\nDocuments from Google Drive are downloaded (Get folder and Download Files nodes), processed into embeddings (Embeddings OpenAI1 node), and stored in the Qdrant vector store (Qdrant Vector Store1 node).\n\nConfigure Email Trigger:\n\nSet up the Email Trigger (IMAP) node with the appropriate IMAP credentials to monitor the email inbox.\nSet up the Email Trigger (IMAP) node with the appropriate IMAP credentials to monitor the email inbox.\nSet Up AI Models:\n\nConfigure the DeepSeek R1, OpenAI, and DeepSeek nodes with the appropriate API credentials for text summarization, response generation, and review.\nConfigure the DeepSeek R1, OpenAI, and DeepSeek nodes with the appropriate API credentials for text summarization, response generation, and review.\nSet Up Email Classification:\n\nDefine the categories in the Email Classifier node (e.g., \"Company info request\", \"Other\").\nEnsure the OpenAI 4-o-mini node is configured to assist in classification.\nDefine the categories in the Email Classifier node (e.g., \"Company info request\", \"Other\").\nEnsure the OpenAI 4-o-mini node is configured to assist in classification.\nSet Up Vector Database:\n\nConfigure the Qdrant Vector Store and Qdrant Vector Store1 nodes with the appropriate Qdrant API credentials and collection details.\nSet up the Embeddings OpenAI 
and Embeddings OpenAI1 nodes to generate embeddings for the email content and documents.\nConfigure the Qdrant Vector Store and Qdrant Vector Store1 nodes with the appropriate Qdrant API credentials and collection details.\nSet up the Embeddings OpenAI and Embeddings OpenAI1 nodes to generate embeddings for the email content and documents.\nSet Up Document Processing:\n\nConfigure the Get folder and Download Files nodes to access and download documents from Google Drive.\nUse the Token Splitter and Default Data Loader nodes to process and split the documents into manageable chunks for vectorization.\nConfigure the Get folder and Download Files nodes to access and download documents from Google Drive.\nUse the Token Splitter and Default Data Loader nodes to process and split the documents into manageable chunks for vectorization.\nSet Up Email Sending:\n\nConfigure the Send Email node with the appropriate SMTP credentials to send responses.\nConfigure the Send Email node with the appropriate SMTP credentials to send responses.\nTest the Workflow:\n\nTrigger the workflow manually using the When clicking ‘Test workflow’ node to ensure all steps execute correctly.\nVerify that emails are summarized, classified, and responded to accurately.\nTrigger the workflow manually using the When clicking ‘Test workflow’ node to ensure all steps execute correctly.\nVerify that emails are summarized, classified, and responded to accurately.\nActivate the Workflow:\n\nOnce tested, activate the workflow to automate the process of handling incoming emails.\nOnce tested, activate the workflow to automate the process of handling incoming emails.\n\nAutomated Email Handling: Automatically processes incoming emails, summarizes them, and generates professional responses.\nAI-Powered Classification: Uses AI to classify emails into relevant categories for targeted responses.\nVector Database Integration: Retrieves relevant information from a vector database to enhance response 
accuracy.\nDocument Vectorization: Processes and stores documents from Google Drive in a vector database for quick retrieval.\nProfessional Email Formatting: Ensures responses are professionally formatted and concise.\n\nContact me for consulting and support or add me on Linkedin.",
"isPaid": false
},
{
"templateId": "4376",
"templateName": "template_4376",
"templateDescription": "Transform your invoice processing from manual data entry into an intelligent automation system. This powerful n8n workflow monitors Gmail for invoice...",
"templateUrl": "https://n8n.io/workflows/4376",
"jsonFileName": "template_4376.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_4376.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/5751ba02c69806a63a1ba1416da476f0/raw/71c82e984115217bf5f39761482638ac6654ea94/template_4376.json",
"screenshotURL": "https://i.ibb.co/Y4pW7Gm2/acb7ddc5d527.png",
"workflowUpdated": true,
"gistId": "5751ba02c69806a63a1ba1416da476f0",
"templateDescriptionFull": "Transform your invoice processing from manual data entry into an intelligent automation system. This powerful n8n workflow monitors Gmail for invoice attachments, extracts data using AI-powered analysis, and creates organized Google Sheets with all relevant financial information automatically structured and ready for your accounting workflows.\n\nThis sophisticated 8-step automation eliminates manual invoice processing:\n\nStep 1: Intelligent Email Monitoring\nThe workflow continuously monitors your Gmail account for emails with specific labels, checking every minute for new invoice attachments that need processing.\n\nStep 2: Attachment Verification\nSmart filtering ensures only emails with PDF attachments are processed, preventing unnecessary workflow triggers from text-only emails.\n\nStep 3: Advanced PDF Extraction\nThe system automatically downloads and converts PDF invoices into readable text, handling various invoice formats and layouts with high accuracy.\n\nStep 4: AI-Powered Data Analysis\nGPT-4 processes the extracted text using specialized prompts designed for financial document analysis, identifying and extracting:\n\nCompany information and contact details\nInvoice numbers, dates, and payment terms\nDetailed line items with quantities and pricing\nTax calculations including CGST, SGST, and VAT\nBilling and shipping addresses\nPayment methods and transaction references\n\nStep 5: Structured Data Formatting\nThe AI output is automatically formatted into clean, consistent JSON structure with 25+ standardized fields for comprehensive invoice tracking.\n\nStep 6: Dynamic Spreadsheet Creation\nEach processed invoice generates a new Google Sheets document with timestamp naming and organized data layout, ready for accounting review.\n\nStep 7: Automated File Organization\nProcessed spreadsheets are automatically moved to designated Google Drive folders, maintaining organized file structure for easy retrieval and audit 
trails.\n\nStep 8: Data Population\nAll extracted invoice data is populated into the spreadsheet with proper formatting, formulas, and structure for immediate use in accounting workflows.\n\nGmail account with invoice-receiving capability\nGoogle Workspace access for Sheets and Drive\nOpenAI API account for data extraction\nn8n instance (cloud or self-hosted)\nPDF invoices (text-based, not scanned images)\n\nLabel Setup:\nCreate specific Gmail labels for invoice processing:\n\nEmail Filter Configuration:\nSet up automatic labeling rules:\n\nEmails from known vendors → Auto-apply \"Invoice-Processing\"\nEmails with \"Invoice\" in subject → Auto-apply \"Invoice-Processing\"\nAttachments with PDF extension → Auto-apply \"Invoice-Processing\"\n\n1. Credential Setup\n\nGmail OAuth2: Full email access including attachments\nOpenAI API Key: GPT-4 access for intelligent data extraction\nGoogle Sheets OAuth2: Spreadsheet creation and editing permissions\nGoogle Drive OAuth2: File organization and folder management\n\n2. Google Drive Folder Structure\nCreate organized folder hierarchy:\n\n3. AI Extraction Customization\nThe default AI prompt extracts standard invoice fields but can be customized for:\n\nRegional Tax Systems: GST (India), VAT (EU), Sales Tax (US)\nIndustry-Specific Fields: Purchase orders, project codes, cost centers\nCompany Standards: Custom fields, approval workflows, coding requirements\nMulti-Currency: Exchange rates, currency conversion, international invoices\n\n4. Data Validation Rules\nImplement quality control measures:\n\nRequired Field Validation: Ensure critical data is always extracted\nFormat Standardization: Consistent date formats, number formatting\nDuplicate Detection: Identify potentially duplicate invoices\nAccuracy Scoring: Confidence levels for extracted data\n\n5. 
Workflow Activation\n\nImport the workflow JSON into your n8n instance\nConfigure all credential connections and test each step\nProcess test invoices to verify accuracy\nActivate Gmail trigger for continuous monitoring\n\nClient Service Automation: Process invoices for multiple clients efficiently\nData Entry Elimination: Convert hours of manual work into automated processing\nAccuracy Improvement: Reduce human errors in financial data transcription\nScalable Operations: Handle increased client volume without proportional staff increase\n\nAccounts Payable Automation: Streamline vendor invoice processing\nCash Flow Management: Quick access to payment due dates and amounts\nExpense Tracking: Organized categorization of business expenses\nAudit Preparation: Maintain organized, searchable invoice records\n\nProcurement Processing: Handle purchase orders and vendor invoices at scale\nMulti-Location Operations: Centralize invoice processing across offices\nCompliance Management: Ensure consistent data capture for regulatory requirements\nIntegration Readiness: Prepare data for ERP and accounting system import\n\nClient Invoice Tracking: Organize incoming payments and project billing\nExpense Management: Categorize business expenses for tax preparation\nCash Flow Monitoring: Track outstanding invoices and payment schedules\nProfessional Organization: Maintain clean financial records for business growth\n\nSupplier Invoice Processing: Manage inventory purchasing and cost tracking\nMulti-Vendor Operations: Handle invoices from numerous suppliers efficiently\nCost Analysis: Track product costs and supplier performance\nInventory Reconciliation: Match invoice data with purchase orders and receipts\n\nExtend processing capabilities:\n\nImplement quality assurance features:\n\nCross-Reference Validation: Compare extracted data against purchase orders\nVendor Database Matching: Verify company details against known vendor lists\nTax Calculation Verification: Validate tax 
amounts and rates for accuracy\nCurrency Conversion: Handle multi-currency invoices with real-time exchange rates\n\nConnect to existing business systems:\n\nERP Integration: Direct data export to SAP, Oracle, or Microsoft Dynamics\nAccounting Software: Push data to QuickBooks, Xero, or FreshBooks\nApproval Workflows: Add review and approval steps before final processing\nPayment Processing: Connect to banking systems for automated payment scheduling\n\nGenerate business insights:\n\nVendor Performance Analysis: Track pricing trends and payment terms\nExpense Category Reporting: Automated expense categorization and analysis\nCash Flow Forecasting: Predict payment obligations based on due dates\nAudit Trail Management: Maintain comprehensive processing logs for compliance\n\nThe AI extraction captures comprehensive invoice information:\n\nHeader Information:\n\nBilled To (Customer/Company Name)\nInvoice Number (Unique Identifier)\nDate of Issue (Invoice Creation Date)\nDue Date (Payment Deadline)\n\nLine Item Details:\n\nItem Description (Product/Service Details)\nQuantity (Number of Items/Hours)\nRate (Unit Price)\nAmount (Line Total)\n\nTax and Financial Calculations:\n\nCGST/SGST Rates and Amounts (Indian GST System)\nVAT Calculations (European Tax System)\nSubtotal (Pre-tax Amount)\nTotal Amount (Final Invoice Value)\n\nCompany and Contact Information:\n\nVendor Company Name\nContact Phone/Mobile\nEmail Address\nWebsite URL\nGST Registration Number\nPAN Number (Indian Tax ID)\n\nAddress Information:\n\nBilling Address\nShipping Address\nPlace of Supply\nPlace of Delivery\n\nPayment Details:\n\nTransaction IDs\nPayment Mode (Check, Bank Transfer, Card)\nTerms and Conditions\nSpecial Instructions\n\nPDF Extraction Challenges\n\nScanned Documents: Original workflow handles text-based PDFs only\nComplex Layouts: Some invoice formats may require prompt refinement\nMulti-Page Invoices: Large invoices might need pagination handling\nPassword Protection: Encrypted PDFs 
require manual processing\n\nAI Extraction Accuracy\n\nField Recognition: Some custom invoice formats may need prompt tuning\nCurrency Handling: Multi-currency invoices may require specific configuration\nDate Formats: International date formats might need standardization\nVendor Variations: Different vendor invoice styles may affect accuracy\n\nGmail Integration Limitations\n\nLabel Management: Ensure consistent labeling for proper processing\nAttachment Size: Large PDFs may hit Gmail API limits\nEmail Volume: High-volume processing may require rate limiting\nSecurity Settings: Corporate Gmail may have additional restrictions\n\nProcessing Efficiency\n\nBatch Processing: Group similar invoices for more efficient processing\nTemplate Recognition: Create vendor-specific extraction templates\nQuality Scoring: Implement confidence ratings for extracted data\nError Handling: Add fallback processes for failed extractions\n\nData Quality Assurance\n\nValidation Rules: Implement business logic for data verification\nDuplicate Detection: Prevent duplicate invoice processing\nManual Review Queues: Flag uncertain extractions for human review\nAudit Logging: Maintain detailed processing logs for troubleshooting\n\nBusiness Process Integration\n\nApproval Workflows: Add management approval steps for high-value invoices\nException Handling: Create special processes for unusual invoice types\nReporting Automation: Generate regular summaries of processed invoices\nArchive Management: Implement retention policies for processed documents\n\nProcessing Time: Reduce manual data entry from hours to minutes\nAccuracy Rates: Achieve 95%+ data extraction accuracy\nVolume Capacity: Process 10-50x more invoices with same resources\nError Reduction: Eliminate manual transcription errors\n\nCost Savings: Calculate labor cost reduction from automation\nCash Flow Management: Faster invoice processing enables better payment scheduling\nCompliance: Improved audit trails and data 
consistency\nScalability: Ability to handle business growth without proportional staff increase\n\nNeed help implementing or optimizing your AI Invoice Processor Agent?\n\n📧 Expert Technical Support\n\nEmail: Yaron@nofluff.online\nResponse Time: Within 24 hours on business days\nSpecialization: Invoice processing automation, AI data extraction, accounting workflow integration\n\n🎥 Comprehensive Training Resources\n\nYouTube Channel: https://www.youtube.com/@YaronBeen/videos\n\nComplete setup and configuration walkthroughs\nAdvanced customization for different invoice types\nIntegration tutorials for popular accounting software\nTroubleshooting common extraction and processing issues\nBest practices for financial document automation\n\n🤝 Professional Community & Updates\n\nLinkedIn: https://www.linkedin.com/in/yaronbeen/\n\nConnect for ongoing automation consulting and support\nShare your invoice processing success stories and ROI results\nAccess exclusive workflow templates and advanced configurations\nJoin discussions about financial automation trends and innovations\n\n💬 Support Request Guidelines\nInclude in your support message:\n\nYour current invoice processing volume and types\nSpecific vendor formats or invoice layouts you handle\nTarget accounting software or systems for integration\nAny technical errors or extraction accuracy issues\nCurrent manual processing workflow and pain points\n\nReady to eliminate manual invoice processing forever? Deploy this AI Invoice Processor Agent and transform your accounting workflow from tedious data entry into intelligent, automated financial management!",
"isPaid": false
},
{
"templateId": "122",
"templateName": "Steam + CF Report",
"templateDescription": "Webhook to report through Mailgun phishing websites to Steam and CloudFlare (if the domain is on CloudFlare) You have to set the Credentials for webhook and...",
"templateUrl": "https://n8n.io/workflows/122",
"jsonFileName": "Steam__CF_Report.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Steam__CF_Report.json",
"jsonURL": "",
"screenshotURL": "",
"workflowUpdated": false,
"templateDescriptionFull": "Webhook to report through Mailgun phishing websites to Steam and CloudFlare (if the domain is on CloudFlare)\n\nYou have to set the Credentials for webhook and Mailgun.\n\nYou have to set the email from for Mailgun.\n\nThis assumes it is running in n8n's Docker image where bind-tools is not readily available but installable.",
"isPaid": false
},
{
"templateId": "8",
"templateName": "template_8",
"templateDescription": "workflow-screenshot When set as \"Error Workflow\" on other workflow which does fail will it send an Email with information about which workflow did fail and...",
"templateUrl": "https://n8n.io/workflows/8",
"jsonFileName": "template_8.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_8.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/65637196c7555a280ec6cfe7f91553da/raw/9dee0738afae350c8664bf85b2007a84c88becae/template_8.json",
"screenshotURL": "https://i.ibb.co/Kx5dCj5v/f649cbbcb8e2.png",
"workflowUpdated": true,
"gistId": "65637196c7555a280ec6cfe7f91553da",
"templateDescriptionFull": "When set as \"Error Workflow\" on other workflow which does fail will it send an Email with information about which workflow did fail and what went wrong.",
"isPaid": false
},
{
"templateId": "2709",
"templateName": "Email form",
"templateDescription": "Case Study 📧Want to collect email subscribers without paying expensive monthly fees? 💰 This workflow creates a free email collection system with built-in...",
"templateUrl": "https://n8n.io/workflows/2709",
"jsonFileName": "Email_form.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Email_form.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/d766a44a784eca07efd5fda6ba88d4ae/raw/dae531fcde859a62f35c880638be9289a56af0cf/Email_form.json",
"screenshotURL": "https://i.ibb.co/pvvd2dm4/d0416d0548ce.png",
"workflowUpdated": true,
"gistId": "d766a44a784eca07efd5fda6ba88d4ae",
"templateDescriptionFull": "Want to collect email subscribers without paying expensive monthly fees? 💰 This workflow creates a free email collection system with built-in email verification to ensure you only collect legitimate email addresses! ✨\n\n📺 Watch the tutorial:\n\nCreates a customizable email collection form that can be embedded on your website 🌐\nVerifies email addresses using Hunter.io to filter out fake or invalid emails ✅\nStores verified email addresses in SendGrid for your email marketing needs 📊\nCompletely free solution (except for Hunter.io's 50 free monthly credits) 🆓\n\nSet up a free Hunter.io account for email verification\nConfigure your SendGrid account credentials\nCustomize the email collection form fields\nGet the embedded form code for your website\n\nAdd additional form fields beyond just email collection\nCustomize the form's appearance and labels\nModify the verification logic based on your requirements\nConnect to different email marketing platforms instead of SendGrid\nAdd additional automation steps after email verification\n\nNo monthly subscription fees for email collection 💸\nBuilt-in email verification prevents fake signups 🛡️\nScalable solution that won't lock you into expensive plans 📈\nClean email list with only verified addresses ✨\nSimple setup and customization 🎯\n\nThis workflow is perfect for bloggers, small businesses, and anyone looking to build an email list without getting locked into expensive email marketing platforms as their subscriber count grows! 🚀\n\nBuilt by rumjahn",
"isPaid": false
},
{
"templateId": "4774",
"templateName": "Automated TikTok Influencer Discovery & Analysis via Bright Data and Anthropic AI and Send Email Notification",
"templateDescription": "🎯 Automated TikTok Influencer Discovery & Analysis A complete n8n automation that discovers TikTok influencers using Bright Data, evaluates their fit using...",
"templateUrl": "https://n8n.io/workflows/4774",
"jsonFileName": "Automated_TikTok_Influencer_Discovery__Analysis_via_Bright_Data_and_Anthropic_AI_and_Send_Email_Notification.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Automated_TikTok_Influencer_Discovery__Analysis_via_Bright_Data_and_Anthropic_AI_and_Send_Email_Notification.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/1cb36d2f9d33f4bdffb7c3b8c8ecfa78/raw/eccb3c0e8f68babb220b98981901aa2fb5253cff/Automated_TikTok_Influencer_Discovery__Analysis_via_Bright_Data_and_Anthropic_AI_and_Send_Email_Notification.json",
"screenshotURL": "https://i.ibb.co/d0t1DC7s/4b67f037e701.png",
"workflowUpdated": true,
"gistId": "1cb36d2f9d33f4bdffb7c3b8c8ecfa78",
"templateDescriptionFull": "A complete n8n automation that discovers TikTok influencers using Bright Data, evaluates their fit using Claude AI, and sends personalized outreach emails. Designed for marketing teams and brands that need a scalable, intelligent way to find and connect with relevant creators.\n\nThis workflow provides a full-service influencer discovery pipeline: it finds TikTok profiles using search keywords, uses AI to assess alignment with your brand, and initiates contact with qualified influencers. Ideal for influencer marketing, brand partnerships, and campaign planning.\n\n🔍 Keyword-Based Discovery\nLocate TikTok influencers by specific niche-related keywords.\n📊 Bright Data Integration\nAccess accurate TikTok profile data from Bright Data’s datasets.\n🤖 AI-Powered Analysis\nClaude AI evaluates each profile's fit with your brand based on bio, content, and more.\n📧 Smart Email Notifications\nSends tailored outreach emails to creators deemed highly relevant.\n📈 Data Storage\nGoogle Sheets stores profile details, AI evaluation results, and outreach status.\n🎯 Intelligent Filtering\nProcesses only influencers who meet your criteria (e.g., 5000+ followers, industry match).\n⚡ Fast & Reliable\nUses professional scraping with robust error handling.\n🔄 Batch Processing\nSupports bulk influencer processing through a single automated flow.\n\nSearch Keywords – TikTok terms for finding niche creators\nBusiness Info – Brand description and industry\nCollaboration Criteria – Follower count minimum, niche alignment\n\nForm Submission\nTikTok Discovery via Bright Data\nData Extraction and Normalization\nSave to Google Sheets\nRelevance Scoring via Claude AI\nFiltering Based on AI Score + Follower Count\nPersonalized Email Outreach\n\nn8n (cloud or self-hosted)\nBright Data account with TikTok access\nGoogle Sheets + Gmail\nAnthropic Claude API key\n10–15 minutes setup time\n\nImport Workflow via JSON in n8n\nConfigure Bright Data – Add API credentials and 
dataset ID\nGoogle Sheets – Set up credentials and map columns\nClaude AI – Insert API key and select desired model\nGmail – Authenticate Gmail and update mail node settings\nUpdate Variables – Replace placeholders with business info\nTest & Launch – Submit a sample form and verify all outputs\n\nSubmit the form with search terms, business description, and industry category to trigger the workflow.\n\nEmails are sent only if:\n\nCollaboration status = Highly Relevant\nFollower count ≥ 5000\nIndustry alignment confirmed\nClaude AI returns a 50-word analysis justifying the match\n\nEdit the \"Find the Collaborator\" prompt to adjust:\n\nFollower thresholds\nIndustry relevance\nAdditional metrics (e.g., engagement rate)\n\nGoogle Sheets log includes:\n\nInfluencer metadata\nAI scores and rationale\nCollaboration status\nEmail delivery timestamp\n\nAdd More Fields: Engagement rate, contact email, content themes\nEmail Personalization: Customize message templates or integrate other mail services\nEnhanced Filtering: Use engagement rates, region, content frequency\n\nCheck n8n execution logs\nRun individual nodes for pinpointing failures\nConfirm all data formats\nHandle API rate limits\nAdd error-catch nodes for retries\n\nBrand Discovery: Fashion, tech, fitness creators\nCompetitor Insights: Find influencers used by rival brands\nCampaign Planning: Build targeted influencer lists\nMarket Research: Identify creator trends across regions\n\nBatch Execution: Process multiple keywords with delay logic\nEngagement Metrics: Scrape and calculate likes-to-follower ratios\nCRM Integration: Sync qualified profiles to HubSpot, Salesforce, or Slack\n\nProcessing Time: 3–5 minutes per keyword\nConcurrency: 3–5 simultaneous fetches (depends on plan)\nAccuracy: >95% influencer data reliability\nSuccess Rate: 90%+ for outreach and processing",
"isPaid": false
},
{
"templateId": "4606",
"templateName": "01 Scraping Site & Storing Records",
"templateDescription": "🔧 Automated Workflow: Scrape Travel Agent Contacts and Send Personalized Survey EmailsThis workflow is designed to automate the process of scraping travel...",
"templateUrl": "https://n8n.io/workflows/4606",
"jsonFileName": "01_Scraping_Site__Storing_Records.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/01_Scraping_Site__Storing_Records.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/dcbe30da87fa00b6bbbb136cb7335444/raw/891fe565a1f42482ebb493335920a72ddf64a2ab/01_Scraping_Site__Storing_Records.json",
"screenshotURL": "https://i.ibb.co/hJDtGkY9/2221faa8555b.png",
"workflowUpdated": true,
"gistId": "dcbe30da87fa00b6bbbb136cb7335444",
"templateDescriptionFull": "🔧 Automated Workflow: Scrape Travel Agent Contacts and Send Personalized Survey Emails\nThis workflow is designed to automate the process of scraping travel agent contact data, standardizing the information, storing it, and then sending out personalized survey emails using AI. It’s especially useful for outreach campaigns, research, or lead generation.\n\n⚙️ Workflow Breakdown\n📍 Part 1: Scraping and Storing Travel Agent Data\n\nHTTP Scrape Website\nType: HTTP Request (POST)\n\nFunction: Calls a third-party scraping API (https://api.firecrawl.dev...) to scrape data from a travel agent listing site.\n\nPurpose: Extract raw HTML or structured data from a website containing contact info.\n\nOpenAI Standardise Data\nType: OpenAI Message Model\n\nFunction: Uses AI to clean and standardize the raw scraped data into structured JSON (e.g., name, email, agency, location).\n\nPurpose: Ensures uniformity in formatting, making data easier to process downstream.\n\nSplit Out\nType: Item Splitter\n\nFunction: Splits the standardized array of agent records into individual items.\n\nPurpose: Allows appending each agent as a separate row in Google Sheets.\n\nGoogle Sheet - Data Store\nType: Google Sheets (Append)\n\nFunction: Stores each individual record in a spreadsheet.\n\nPurpose: Maintains a centralized and accessible log of scraped and processed contacts.\n\n📍 Part 2: Read Records and Send Personalized Survey Emails\nTriggered Manually – when “Test Workflow” button is clicked.\n\nTrigger – When clicking 'Test workflow'\nType: Manual Trigger\n\nFunction: Starts the second part of the workflow manually.\n\nUse Case: Testing or running the outreach email process on demand.\n\nGoogle Sheet Data Store (Read)\nType: Google Sheets (Read)\n\nFunction: Reads the stored travel agent records from the spreadsheet.\n\nPurpose: Retrieves contact details and context for personalized messaging.\n\nOpenAI Mail Composer\nType: OpenAI Message Model\n\nFunction: 
Generates a custom email for each agent using their details.\n\nPurpose: Creates human-like, engaging emails that include a survey link (optional input).\n\nGoogle Sheet Update Records\nType: Google Sheets (Update)\n\nFunction: Optionally marks the record as \"emailed\" or logs the date of outreach.\n\nPurpose: Prevents duplicate outreach and helps track campaign status.\n\nSend Email\nType: Email Node (SMTP or integrated service)\n\nFunction: Sends the personalized email generated by OpenAI.\n\nPurpose: Delivers the survey to each travel agent with contextually relevant messaging.\n\n🧠 Use Case:\nTargeted email outreach to travel agents.\n\nCollect insights or feedback via survey links.\n\nUse personalized messaging to improve response rates.\n\n📌 Benefits:\n✅ Fully automated scraping and processing.\n\n✅ Personalized at scale using OpenAI.\n\n✅ Easily repeatable for different domains or campaigns.\n\n✅ Centralized recordkeeping in Google Sheets.\n\n🛠️ Tech Stack:\nn8n: Automation and workflow management\n\nOpenAI: AI-based text standardization and email generation\n\nFirecrawl (or similar): Web scraping API\n\nGoogle Sheets: Data storage and tracking\n\nEmail Node: Survey email delivery",
"isPaid": false
},
{
"templateId": "2327",
"templateName": "template_2327",
"templateDescription": "This n8n workflow demonstrates how to build a simple uptime monitoring service using scheduled triggers. Useful for webmasters with a handful of sites who...",
"templateUrl": "https://n8n.io/workflows/2327",
"jsonFileName": "template_2327.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_2327.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/d10d445d4c2dedf2bed57b80b7d885da/raw/53238621cca23b6bbd8e56b46c6933b36783d308/template_2327.json",
"screenshotURL": "https://i.ibb.co/v6BQxHT0/916ad8a25088.png",
"workflowUpdated": true,
"gistId": "d10d445d4c2dedf2bed57b80b7d885da",
"templateDescriptionFull": "This n8n workflow demonstrates how to build a simple uptime monitoring service using scheduled triggers.\n\nUseful for webmasters with a handful of sites who want a cost-effective solution without the need for all the bells and whistles.\n\nScheduled trigger reads a list of website urls in a Google Sheet every 5 minutes\nEach website url is checked using the HTTP node which determines if the website is either in the UP or DOWN state.\nAn email and Slack message are sent for websites which are in the DOWN state.\nThe Google Sheet is updated with the website's state and a log created.\nLogs can be used to determine total % of UP and DOWN time over a period.\n\nGoogle Sheet for storing websites to monitor and their states\nGmail for email alerts\nSlack for channel alerts\n\nDon't use Google Sheets? This can easily be exchanged with Excel or Airtable.",
"isPaid": false
},
{
"templateId": "2530",
"templateName": "Automated Work Attendance with Location Triggers",
"templateDescription": "This workflow automates time tracking using location-based triggers. How it works: Trigger: It starts when you enter or exit a specified location, triggering a...",
"templateUrl": "https://n8n.io/workflows/2530",
"jsonFileName": "Automated_Work_Attendance_with_Location_Triggers.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Automated_Work_Attendance_with_Location_Triggers.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/85ae2b82c4678245cb638073b7a64dea/raw/3eb2ead86b189302a9c3cbdf6696c79c36bca632/Automated_Work_Attendance_with_Location_Triggers.json",
"screenshotURL": "https://i.ibb.co/Z6nvD0CG/ee178553ab36.png",
"workflowUpdated": true,
"gistId": "85ae2b82c4678245cb638073b7a64dea",
"templateDescriptionFull": "This workflow automates time tracking using location-based triggers.\n\nTrigger: It starts when you enter or exit a specified location, triggering a shortcut on your iPhone.\nWebhook: The shortcut sends a request to a webhook in n8n.\nCheck-In/Check-Out: The webhook receives the request and records the time and whether it was a \"Check-In\" or \"Check-Out\" event.\nGoogle Sheets: This data is then logged into a Google Sheet, creating a record of your work hours.\n\nGoogle Drive: Connect your Google Drive account.\nGoogle Sheets: Connect your Google Sheets account.\nWebhook: Set up a webhook node in n8n.\niPhone Shortcuts: Create two shortcuts on your iPhone, one for \"Check-In\" and one for \"Check-Out.\"\nConfigure Shortcuts: Configure each shortcut to send a request to the webhook with the appropriate \"Direction\" header.\n\nIt's easy to set up and takes around 5 minutes.",
"isPaid": false
},
{
"templateId": "2857",
"templateName": "template_2857",
"templateDescription": "Overview This template describes a possible approach to handle a pseudo-callback/trigger from an independent, external process (initiated from a workflow)...",
"templateUrl": "https://n8n.io/workflows/2857",
"jsonFileName": "template_2857.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_2857.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/74b93a87bac20603b40af0a2496142b7/raw/317c98d3af9ab25a520766ab3e0b5e3df5f1c4e5/template_2857.json",
"screenshotURL": "https://i.ibb.co/B5QbfTDW/b6079e8b498a.png",
"workflowUpdated": true,
"gistId": "74b93a87bac20603b40af0a2496142b7",
"templateDescriptionFull": "This template describes a possible approach to handle a pseudo-callback/trigger from an independent, external process (initiated from a workflow) and combine the received input with the workflow execution that is already in progress. This requires the external system to pass through some context information (resumeUrl), but allows the \"primary\" workflow execution to continue with BOTH its own (previous-node) context, AND the input received in the \"secondary\" trigger/process.\n\nThe workflow path from the primary trigger initiates some external, independent process and provides \"context\" which includes the value of $execution.resumeUrl. This execution then reaches a Wait node configured with Resume - On Webhook Call and stops until a call to resumeUrl is received.\n\nThe external, independent process could be anything like a Telegram conversation, or a web-service as long as:\n\nit results in a single execution of the Secondary Workflow Trigger, and\nit can pass through the value of resumeUrl associated with the Primary Workflow Execution\n\nThe secondary workflow execution can start with any kind of trigger as long as part of the input can include the resumeUrl. To combine / rejoin the primary workflow execution, this execution passes along whatever it receives from its trigger input to the resume-webhook endpoint on the Wait node.\n\nIMPORTANT: The workflow ids in the Set nodes marked Update Me have embedded references to the workflow IDs in the original system. They will need to be CHANGED to make this demo work.\nNote: The Resume Other Workflow Execution node in the template uses the $env.WEBHOOK_URL configuration to convert to an internal \"localhost\" call in a Docker environment. 
This can be done differently.\nALERT: This pattern is NOT suitable for a workflow that handles multiple items because the first workflow execution will only be waiting for one callback.\nThe second workflow (not the second trigger in the first workflow) is just to demonstrate how the Independent, External Process needs to work.",
"isPaid": false
},
{
"templateId": "1785",
"templateName": "template_1785",
"templateDescription": "This workflow uses a number of technologies to track the value of ETFs, stocks and other exchange-traded products: Baserow: To keep track of our...",
"templateUrl": "https://n8n.io/workflows/1785",
"jsonFileName": "template_1785.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_1785.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/722f426f3160ac41979a0ab4068531de/raw/8e6c28b25d2a5a25ffe2005e76cdfe5af78cb985/template_1785.json",
"screenshotURL": "https://i.ibb.co/ccSVrGsz/f43af65b0b83.png",
"workflowUpdated": true,
"gistId": "722f426f3160ac41979a0ab4068531de",
"templateDescriptionFull": "This workflow uses a number of technologies to track the value of ETFs, stocks and other exchange-traded products:\n\nBaserow: To keep track of our investments\nn8n’s Cron node: To trigger the workflow compiling our daily morning briefing\nWeb scraping: The HTTP Request & HTML Extract nodes to fetch up-to-date prices from the relevant stock exchange and structure this information\nJavaScript: We’ll use the Function node to build a custom HTML body with all the relevant information\nSendgrid: The Email Service Provider in this workflow to send out our email\n\nThanks to n8n, the steps in this workflow can easily be changed. Not a Sendgrid user? Simply remove the Sendgrid node and add a Gmail node instead. The stock exchange has a REST API? Just throw away the HTML Extract node.\n\nHere’s how it works:\n\nIn this scenario, our data source is Baserow. In our table, we’ll track all information needed to identify each investment product.\n\nWe have two text type columns (Name and ISIN) as well as two number type columns (Count and Purchase Price).\n\nThe Cron node will trigger our workflow to run each workday in the morning hours.\n\nThe Baserow node will fetch our investments from the database table shown above.\n\nUsing the HTTP Request node we can fetch live data from the stock exchange of our choice based on the ISIN. This example uses Tradegate, which is used by many German fintechs. The basic approach should also work for other exchanges, as long as they provide the required data to the public.\n\nSince our HTTP Request node fetches full websites, we’re using the HTML Extract node to extract the information we’re looking for from each website. If an exchange other than Tradegate is used, the selectors used in this node will most likely need to be updated.\n\nThe Set nodes help with setting the exact columns we’ll use in our table. In this case we’re first formatting the results from our exchange, then calculating the changes based on the purchase price.\n\nHere we’re using a bit of JavaScript magic to build an HTML email. This is where any changes to the email content would have to be made.\n\nFinally we send out the email built in the previous step. This is where you can configure sender and recipients.\n\nThe basic email generated by this workflow will look like so:",
"isPaid": false
},
{
"templateId": "3561",
"templateName": "template_3561",
"templateDescription": "LinkedIn Enrichment & Ice Breaker Generator For SDRs, growth marketers, and founders looking to scale personalized outreach. This workflow enriches LinkedIn...",
"templateUrl": "https://n8n.io/workflows/3561",
"jsonFileName": "template_3561.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_3561.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/75d423b2cf93a25a7b959ae81948c4c0/raw/83fdae3908c288015dbc4c3847760c10f8a1267a/template_3561.json",
"screenshotURL": "https://i.ibb.co/CKJvwgGV/df7364d909d2.png",
"workflowUpdated": true,
"gistId": "75d423b2cf93a25a7b959ae81948c4c0",
"templateDescriptionFull": "For SDRs, growth marketers, and founders looking to scale personalized outreach.\nThis workflow enriches LinkedIn profile data using Bright Data and generates AI-powered ice breakers using Claude (Anthropic).\nIt automates research and messaging to help you connect smarter and faster — without manual effort.\n\nThis workflow combines Google Sheets, Bright Data, and Claude (Anthropic) to fully automate your outreach research:\n\nTrigger\n\nManually trigger the workflow or run it on a schedule (via Manual Trigger or Schedule Trigger).\nRead Input Sheet\n\nFetches rows from a Google Sheet. Each row must contain at least a Linkedin_URL_Person and row_number.\nPrepare Input\n\nFormats each row for Bright Data’s API using Set and SplitInBatches nodes.\nEnrich Profile (Bright Data API)\n\nSends LinkedIn URLs to Bright Data’s Dataset API via HTTP Request.\nWaits for the snapshot to be ready using polling logic with Wait, If, and Snapshot Progress nodes.\nOnce ready, retrieves the enriched profile data including:\n\nName\nCity\nCurrent company\nAbout section\nRecent posts\nUpdate Sheet with Profile Data\n\nWrites the retrieved enrichment data into the corresponding row in Google Sheets (via row_number).\nGenerate Ice Breaker (Claude AI)\n\nSends enriched profile content to Claude (Anthropic) using a custom prompt.\nFocuses on recent posts for crafting relevant, respectful, 1–4-line ice breakers.\nUpdate Sheet with Ice Breaker\n\nWrites the generated ice breaker to the Ice Breaker 1 column in the original row.\n\nTo use this workflow, you must have the following:\n\nA Google account\nA Google Sheet with at least one sheet/tab containing:\n\nColumn: Linkedin_URL_Person\nColumn: row_number (used for mapping input and output rows)\n\nA Bright Data account with access to the Dataset API\nAn active dataset that accepts LinkedIn URLs\nAPI key with Dataset API access\n\nAn Anthropic API key (for Claude 3.5 Haiku or other Claude models)\n\nAccess to HTTP Request, Set, Wait, SplitInBatches, If, and Google Sheets nodes\nAccess to Claude integration (via LangChain nodes: @n8n/n8n-nodes-langchain)\nCredential manager properly configured with:\n\nGoogle Sheets OAuth2 credentials\nBright Data API key\nAnthropic API key\n\nFill the Linkedin_URL_Person column with LinkedIn profile URLs you want to enrich\nDo not modify headers or add filters to the sheet\nLeave other columns (name, city, about, posts, ice breaker) blank — the workflow fills them\n\nGoogle Sheets: Create a credential under Google Sheets OAuth2 API\nBright Data: Add your API key as a credential under HTTP Request (Authorization header)\nAnthropic: Create a credential for Anthropic API with your Claude key\n\nImport the workflow into your n8n instance.\nIn each Google Sheets node:\n\nSelect the copied Google Sheet\nSelect the correct tab (usually input or Sheet1)\nIn the HTTP Request node to Bright Data:\n\nPaste your Bright Data dataset ID\nIn the Claude prompt node:\n\nOptionally adjust the tone and length of the ice breaker prompt\n\nTest it using the Manual Trigger node\nFor daily automation, enable the Schedule Trigger and configure interval settings\nWatch your Google Sheet populate with enriched data and tailored ice breakers\n\nBright Data Delay: Snapshots may take time. The workflow polls the status until complete.\nRetry Protection: If and Wait nodes avoid infinite loops by checking snapshot status.\nMapping via row_number: Critical to ensure data is updated in the right row.\nPrompt Engineering: You can fine-tune Claude's behavior by editing the text prompt.\n\nOnce complete, each row in your Google Sheet will contain:\n\nQuestions? Want to tweak the prompt or expand the enrichment?\n\n📧 Email: Yaron@nofluff.online\n📺 YouTube: @YaronBeen\n🔗 LinkedIn: linkedin.com/in/yaronbeen",
"isPaid": false
},
{
"templateId": "2241",
"templateName": "Unsubscribe Mautic contacts from automated unsubscribe emails",
"templateDescription": "Who is this for?This template is designed for businesses and organizations that use Mautic for email marketing and want to automate the process of removing...",
"templateUrl": "https://n8n.io/workflows/2241",
"jsonFileName": "Unsubscribe_Mautic_contacts_from_automated_unsubscribe_emails.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Unsubscribe_Mautic_contacts_from_automated_unsubscribe_emails.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/75fc3d352657b96ab940615f27855c7c/raw/f75f828a04d3224f4eedff730d8945da64a3ff49/Unsubscribe_Mautic_contacts_from_automated_unsubscribe_emails.json",
"screenshotURL": "https://i.ibb.co/CK3Syjpv/4f708b2dbe65.png",
"workflowUpdated": true,
"gistId": "75fc3d352657b96ab940615f27855c7c",
"templateDescriptionFull": "This template is designed for businesses and organizations that use Mautic for email marketing and want to automate the process of removing contacts from specific segments when they receive an unsubscribe request via email.\n\nMany email recipients, especially those who are less tech-savvy, may not follow the standard unsubscribe link provided in emails. Instead, for example in Gmail, they click the \"Unsubscribe\" button in the Gmail web interface, which in turn sends an email with a consistent format, these emails contain the word unsubscribe in the 'To' field using the following structure:\n\nhello+unsubscribe_6629823aa976f053068426@example.com\n\nThis workflow automates the process of identifying such unsubscribe emails and removing the contact from the relevant Mautic segments, ensuring compliance with unsubscribe requests and maintaining a clean mailing list.\n\nMonitors a Gmail account for incoming emails.\nIdentifies unsubscribe emails based on specific patterns in the \"To\" field (e.g., containing the word \"unsubscribe\").\nRetrieves the contact's ID from Mautic based on the email address.\nRemoves the contact from the specified \"newsletter\" segment in Mautic.\nAdds the contact to the \"unsubscribed\" segment in Mautic.\nSends a confirmation email to the contact, acknowledging their unsubscribe request.\n\nConfigure your email address and unsubscribe message in the \"Edit Fields\" node.\nSet your credentials in the Gmail trigger and in the Mautic nodes.\nSet the segments for the \"newsletter\" and \"unsubscribed\" in the Mautic nodes.\nMake sure your n8n installation has a public endpoint for your Gmail trigger to work correctly.\nDeploy the workflow.\n\nAdjust the conditions for identifying unsubscribe emails based on your specific requirements.\nModify the segments or actions taken in Mautic according to your desired behavior.\nCustomize the confirmation email message and sender details.\n\nNote: This workflow assumes a 
consistent structure for unsubscribe emails, where the \"To\" field contains the word \"unsubscribe\" using the \"+\" sign. If your email provider follows a different convention, adjust the conditions in the \"Is automated unsubscribe?\" node accordingly.",
"isPaid": false
},
{
"templateId": "2191",
"templateName": "template_2191",
"templateDescription": "This workflow helps marketers verify and update data using EffiBotics Email Verifier API. Copy and create a list with emails as on this one...",
"templateUrl": "https://n8n.io/workflows/2191",
"jsonFileName": "template_2191.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_2191.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/216f6e22d717d5b77884a6045f95d5b4/raw/70bf267258495f74d4b6cff59f01cf061f96afa2/template_2191.json",
"screenshotURL": "https://i.ibb.co/RGyShhnn/8ed569c7668e.png",
"workflowUpdated": true,
"gistId": "216f6e22d717d5b77884a6045f95d5b4",
"templateDescriptionFull": "This workflow helps marketers verify and update data using the EffiBotics Email Verifier API.\n\nCopy and create a list with emails as on this one https://docs.google.com/spreadsheets/d/1rzuojNGTaBvaUEON6cakQRDva3ueGg5kNu9v12aaSP4/edit#gid=0\n\nThe trigger checks for any updates in the number of rows that are present in a sheet and updates the verified emails on Google Sheets.\n\nOnce you update a new cell, the new data is read and the email is checked for its validity. The results are then updated in real time on the sheet.\n\nHappy Emailing!",
"isPaid": false
},
{
"templateId": "2125",
"templateName": "template_2125",
"templateDescription": "How it works It’s very important to come prepared to Sales calls. This often means a lot of manual research about the person you’re calling with. This...",
"templateUrl": "https://n8n.io/workflows/2125",
"jsonFileName": "template_2125.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_2125.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/021f5e13fcbf33efd791b04a8f2a1887/raw/06be65fdf08d76ec46134c53d38d40df1c9e010f/template_2125.json",
"screenshotURL": "https://i.ibb.co/ZpGFHcy8/d86dc709d677.png",
"workflowUpdated": true,
"gistId": "021f5e13fcbf33efd791b04a8f2a1887",
"templateDescriptionFull": "It’s very important to come prepared to sales calls. This often means a lot of manual research about the person you’re calling with. This workflow delivers a summary of the latest social media activity (LinkedIn + X) for businesses you are about to interact with each day.\n\nScans Your Calendar: Each morning, it reviews your Google Calendar for any scheduled meetings or calls with companies based on each attendee's email address.\nFetches Latest Posts: For each identified company, it fetches recent LinkedIn and X posts and summarizes them using AI to deliver a quick overview for a busy sales rep.\nDelivers Insights: You receive personalized emails via Gmail, each dedicated to a company you’re meeting with that day, containing a reminder of the meeting and a summary of the company's recent social media activity.\n\nThe workflow requires you to have the following accounts set up in their respective nodes:\n\nGoogle Calendar\nGmail\nClearbit\nOpenAI\n\nBesides those, you will need an account on the RapidAPI platform and subscribe to the following APIs:\n\nFresh LinkedIn Profile Data\nTwitter",
"isPaid": false
},
{
"templateId": "2803",
"templateName": "Generate Instagram Content from Top Trends with AI Image Generation",
"templateDescription": "How it works This automated workflow discovers trending Instagram posts and creates similar AI-generated content. Here's the high-level process: 1. Content...",
"templateUrl": "https://n8n.io/workflows/2803",
"jsonFileName": "Generate_Instagram_Content_from_Top_Trends_with_AI_Image_Generation.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Generate_Instagram_Content_from_Top_Trends_with_AI_Image_Generation.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/645a200eebf9c3d0ba630bce10bf1b64/raw/fbc937ead2215c1f9a037bc2d00cff6bc3140243/Generate_Instagram_Content_from_Top_Trends_with_AI_Image_Generation.json",
"screenshotURL": "https://i.ibb.co/HDv0917c/f12f4e7c5152.png",
"workflowUpdated": true,
"gistId": "645a200eebf9c3d0ba630bce10bf1b64",
"templateDescriptionFull": "This automated workflow discovers trending Instagram posts and creates similar AI-generated content. Here's the high-level process:\n\nScrapes trending posts from specific hashtags\nAnalyzes visual elements using AI\nFilters out videos and duplicates\n\nCreates unique images based on trending content\nGenerates engaging captions with relevant hashtags\nMaintains brand consistency while being original\n\nPosts content directly to Instagram\nMonitors publication status\nSends notifications via Telegram\n\nSetting up this workflow takes approximately 15-20 minutes:\n\nInstagram Business Account setup\nTelegram Bot creation\nAPI key generation (OpenAI, Replicate, RapidAPI)\n\nCreate required database table\nConfigure PostgreSQL credentials\n\nSet scheduling preferences\nConfigure notification settings\nTest connection and permissions\n\nDetailed technical specifications and configurations are available in sticky notes within the workflow.",
"isPaid": false
},
{
"templateId": "3018",
"templateName": "Automated Content Generation & Publishing - Wordpress",
"templateDescription": "Workflow Description: Automated Content Publishing for WordPressThis n8n workflow automates the entire process of content generation, image selection, and...",
"templateUrl": "https://n8n.io/workflows/3018",
"jsonFileName": "Automated_Content_Generation__Publishing_-_Wordpress.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Automated_Content_Generation__Publishing_-_Wordpress.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/047d18c8e0f2b913a73434626e6f974b/raw/3b2d290c1befdd228cd7b0bf57127c2cd0f4c7ad/Automated_Content_Generation__Publishing_-_Wordpress.json",
"screenshotURL": "https://i.ibb.co/gbCTMtF7/5d93b7cfe874.png",
"workflowUpdated": true,
"gistId": "047d18c8e0f2b913a73434626e6f974b",
"templateDescriptionFull": "This n8n workflow automates the entire process of content generation, image selection, and scheduled publishing to a self-hosted WordPress website. It is designed for bloggers, marketers, and businesses who want to streamline their content creation and posting workflow.\n\n✅ AI-Powered Content Generation\n\nUses ChatGPT to generate engaging, market-ready blog articles\nDynamically incorporates high-search volume keywords\n\n✅ Automated Image Selection\n\nSearches for relevant stock images from Pexels\nEmbeds images directly into posts\n(Optional) Supports Featured Image from URL (FIFU) plugin for WordPress\n\n✅ Scheduled & Randomized Posting\n\nAutomatically schedules posts at predefined intervals\nSupports randomized delay (0-6 hours) for natural publishing\n\n✅ WordPress API Integration\n\nUses WordPress REST API to directly publish posts\nConfigures featured images, categories, and metadata\nSupports SEO-friendly meta fields\n\n✅ Flexible & Customizable\n\nWorks with any WordPress website (self-hosted)\nCan be modified for other CMS platforms\n\n1️⃣ Trigger & Scheduling\n\nAutomatically runs at preset times or on-demand\nSupports cron-like scheduling\n\n2️⃣ AI Content Generation\n\nUses a well-crafted prompt to generate high-quality blog posts\nExtracts relevant keywords for both SEO and image selection\n\n3️⃣ Image Fetching from Pexels\n\nSearches and retrieves high-quality images\nEmbeds image credits and ensures proper formatting\n\n4️⃣ WordPress API Integration\n\nSends post title, content, image, and metadata via HTTP Request\nCan include custom fields, categories, and tags\n\n5️⃣ Randomized Delay Before Publishing\n\nEnsures natural posting behavior\nAvoids bulk publishing issues\n\nSelf-hosted WordPress website with REST API enabled\nFIFU Plugin (optional) for external featured images\nn8n Self-Hosted or Cloud Instance\n\n✅ Bloggers who want to automate content publishing\n✅ Marketing teams looking to scale content 
production\n✅ Business owners who want to boost online presence\n✅ SEO professionals who need consistent, optimized content",
"isPaid": false
},
{
"templateId": "3291",
"templateName": "🔍🛠️Generate SEO-Optimized WordPress Content with Perplexity Research",
"templateDescription": "Generate SEO-Optimized WordPress Content with Perplexity Research Who is This For?This workflow is ideal for content creators, marketers, and businesses...",
"templateUrl": "https://n8n.io/workflows/3291",
"jsonFileName": "Generate_SEO-Optimized_WordPress_Content_with_Perplexity_Research.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Generate_SEO-Optimized_WordPress_Content_with_Perplexity_Research.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/33aa8127bc2a76dc3d0c6d491a7c1ddd/raw/8765903b4c4ae6acfa9dde7b78b899c9ec09017b/Generate_SEO-Optimized_WordPress_Content_with_Perplexity_Research.json",
"screenshotURL": "https://i.ibb.co/n8VHPwNg/dc2acceab462.png",
"workflowUpdated": true,
"gistId": "33aa8127bc2a76dc3d0c6d491a7c1ddd",
"templateDescriptionFull": "This workflow is ideal for content creators, marketers, and businesses looking to streamline the creation of SEO-optimized blog posts for WordPress. It is particularly suited for professionals in the AI consulting and workflow automation industries.\n\nCreating high-quality, SEO-friendly blog posts can be time-consuming and challenging, especially when trying to balance research, formatting, and publishing. This workflow automates the process by integrating research capabilities, AI-driven content creation, and seamless WordPress publishing. It reduces manual effort while ensuring professional-grade output.\n\nResearch: Gathers detailed insights from Perplexity AI based on user-provided queries.\nContent Generation: Uses OpenAI models to create structured blog posts, including titles, slugs, meta descriptions, and HTML content optimized for WordPress.\nImage Handling: Automatically fetches and uploads featured images to WordPress posts.\nPublishing: Drafts the blog post directly in WordPress with all necessary formatting and metadata.\nNotification: Sends a success message via Telegram upon completion.\n\nPrerequisites:\n\nA WordPress account with API access.\nOpenAI API credentials.\nPerplexity AI API credentials.\nTelegram bot credentials for notifications.\n\nSteps:\n\nImport the workflow into your n8n instance.\nConfigure API credentials for WordPress, OpenAI, Perplexity AI, and Telegram.\nCustomize the form trigger to define your research query.\nTest the workflow using sample queries to ensure smooth execution.\n\nModify the research query prompt in the \"Form Trigger\" node to suit your industry or niche.\nAdjust content generation guidelines in the \"Copywriter AI Agent\" node for specific formatting preferences.\nReplace the image URL in the \"Set Image URL\" node with your own source or dynamic image selection logic.",
"isPaid": false
},
{
"templateId": "3348",
"templateName": "DeepSeek v3.1",
"templateDescription": "Workflow Screenshot Who Is This ForThis workflow is ideal for content creators, bloggers, marketers, and professionals seeking to automate the creation and...",
"templateUrl": "https://n8n.io/workflows/3348",
"jsonFileName": "DeepSeek_v3.1.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/DeepSeek_v3.1.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/3c921d02d233b83bd33e7bc72bcabfb9/raw/d478f9608856d1df08d7d19ac884a6a80dcd51b1/DeepSeek_v3.1.json",
"screenshotURL": "https://i.ibb.co/67ggZBDp/602ea5ba2de0.png",
"workflowUpdated": true,
"gistId": "3c921d02d233b83bd33e7bc72bcabfb9",
"templateDescriptionFull": "This workflow is ideal for content creators, bloggers, marketers, and professionals seeking to automate the creation and publication of SEO-optimized articles. It's particularly beneficial for those utilizing Notion for content management and WordPress for publishing.\n\nManually creating SEO-friendly articles is time-consuming and requires consistent effort. This workflow streamlines the entire process—from detecting updates in Notion to publishing on WordPress—by leveraging AI for content generation, thereby reducing the time and effort involved.\n\nMonitor Notion Updates: Detects changes in a specified Notion database.\nAI Content Generation: Utilizes an AI model to produce an SEO-optimized article based on Notion data.\nPublish to WordPress: Automatically posts the generated article to a WordPress site.\nEmail Notification: Sends an email containing the article's title and URL.\nUpdate Notion Database: Updates the corresponding entry in the Notion database with the article details.\n\nWordPress account with API access.\nAPI key for the AI model used.\nNotion integration with the relevant database ID.\nCredentials for the email service used (e.g., Gmail).\n\nCommunity Node Requirement: This workflow utilizes the n8n-nodes-mcp community node, which is only compatible with self-hosted instances of n8n. For more information on installing and managing community nodes, refer to the n8n documentation.\n\nImport the workflow into your self-hosted n8n instance.\nInstall the required community node (n8n-nodes-mcp).\nConfigure API credentials for WordPress, the AI service, Notion, and the email service.\nDefine necessary variables, such as the notification email address and Notion database IDs.\nActivate the workflow to automate the process.\n\nAI Prompt: Adjust the prompt used for content generation to align with your preferred tone and style.\nArticle Structure: Modify the structure of the generated article by tweaking settings in the content generation node.\nNotifications: Customize the content and recipients of the emails sent post-publication.\nNotion Updates: Tailor the fields updated in Notion to suit your specific requirements.",
"isPaid": false
},
{
"templateId": "3581",
"templateName": "Domain -> Email Extraction using Apollo API copy",
"templateDescription": "Who is this for?Sales professionals looking to build lead lists from target company domainsBusiness development teams conducting outreach campaignsMarketers...",
"templateUrl": "https://n8n.io/workflows/3581",
"jsonFileName": "Domain_-_Email_Extraction_using_Apollo_API_copy.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Domain_-_Email_Extraction_using_Apollo_API_copy.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/f6ffb7200e07a1f9c58042553c996393/raw/3d2a9985b34f5fb456ccc0ef0ae71de57eff2aa3/Domain_-_Email_Extraction_using_Apollo_API_copy.json",
"screenshotURL": "https://i.ibb.co/xqDjkRYH/766367b0a3ef.png",
"workflowUpdated": true,
"gistId": "f6ffb7200e07a1f9c58042553c996393",
"templateDescriptionFull": "Sales professionals looking to build lead lists from target company domains\nBusiness development teams conducting outreach campaigns\nMarketers building contact databases for account-based marketing\nRecruiters searching for potential candidates at specific companies\nAnyone needing to transform a list of company domains into actionable contact information\n\nFinding business email addresses for outreach is a time-consuming process. The Apollo API doesn't provide a direct way to extract email contacts from domains in a single call. This workflow bridges that gap by:\n\nAutomating the two-step process required by Apollo's API\nProcessing multiple domains in batches without manual intervention\nExtracting, enriching, and storing contact information in a structured format\nEliminating hours of manual data entry and API interaction\n\nThis workflow creates an automated pipeline between Google Sheets and Apollo's API to:\n\nPull a list of target domains from a Google Sheet\nSubmit each domain to Apollo's search API to find associated people\nLoop through each person found and enrich their profile data\nExtract key information: name, title, email address, and LinkedIn URL\nWrite the enriched contact information back to a results sheet\nProcess the next domain automatically until all are complete\n\nAn n8n instance (cloud or self-hosted)\nApollo.io account with API access\nGoogle account with access to Google Sheets\n\nCreate a new Google Sheet with two tabs:\n\nTab 1: \"Target Domains\" with a column named \"Domain To Enrich\"\nTab 2: \"Results\" with columns: Company, First Name, Last Name, Title, Email, LinkedIn\n\nImport the workflow JSON into your n8n instance\nSet up Google Sheets credentials in n8n\nUpdate the Google Sheets document ID in both Google Sheets nodes\nAdd your Apollo API key to both HTTP Request nodes\nReview and adjust API rate limits if needed\n\nAdd a few test domains to your \"Target Domains\" sheet\nRun the workflow manually to verify it's working correctly\nCheck the \"Results\" sheet to confirm data is being properly populated\n\nModify the \"Clean Up\" node to extract additional fields from the Apollo API response\nAdd corresponding columns to your \"Results\" sheet\nUpdate the \"Results To Results Sheet\" node mapping to include the new fields\n\nAdd a Filter node after \"Clean Up\" to include only contacts with specific roles\nCreate conditions based on title, seniority, or other fields returned by Apollo\n\nReplace the manual trigger with a Schedule Trigger to run daily/weekly\nAdd a Filter node to process only domains with \"Not Processed\" status\nUpdate the status field in Google Sheets after processing\n\nThis workflow respects Apollo's API rate limits by processing one contact at a time\nThe Apollo API may not return contact information for all domains or all employees\nConsider legal and privacy implications when collecting and storing contact information\n\nMade with ❤️ by Hueston",
"isPaid": false
},
{
"templateId": "3624",
"templateName": "template_3624",
"templateDescription": "Scrape Competitor Reviews & Generate Ad Creatives with Bright data and OpenAI How the Flow Runs Fill the Form Enter the Amazon product URL to analyze...",
"templateUrl": "https://n8n.io/workflows/3624",
"jsonFileName": "template_3624.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_3624.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/6b9da75b45b6fb24612021015175aca0/raw/828261d1b871f30b9a57edf5162c5d41e76451d1/template_3624.json",
"screenshotURL": "https://i.ibb.co/1fmXT35R/acd840aa5ecc.png",
"workflowUpdated": true,
"gistId": "6b9da75b45b6fb24612021015175aca0",
"templateDescriptionFull": "Fill the Form\n\nEnter the Amazon product URL to analyze competitor reviews.\n\nTrigger Bright Data Scraper\n\nBright Data scrapes Amazon reviews based on the provided URL.\n\nWait for Snapshot Completion\n\nPeriodically checks Bright Data until the scraping is complete.\n\nRetrieve JSON Data\n\nCollects the scraped review data in JSON format.\n\nSave Reviews to Google Sheets\n\nAutomatically appends the scraped reviews to your Google Sheets.\n\nAggregate Reviews\n\nConsolidates all reviews into a single summary for simpler analysis.\n\nAnalyze Reviews with OpenAI LLM\n\nSends the aggregated reviews to OpenAI (GPT-4o mini) to summarize competitors’ main weaknesses clearly.\n\nGenerate Creative Ad Image\n\nOpenAI generates a visually appealing 1080x1080 ad image addressing these identified pain points.\n\nSend Ad Creative via Gmail\n\nAutomatically emails the creative and review summary to your media buying team for immediate use in Meta ads.\n\nGoogle Sheets: Template\nBright Data: Dataset and API key: www.brightdata.com\nOpenAI API Key: For GPT-4o mini or your preferred LLM\nAutomation Tool: Ensure it supports HTTP Requests, Wait, Conditional (If), Google Sheets integration, Form Trigger, OpenAI integration, and Gmail integration.\n\nAmazon Product URL: Enter the competitor’s product URL from Amazon.\n\nCopy the provided Google Sheet template.\nImport the JSON workflow into your automation tool.\nUpdate credentials for Bright Data, Google Sheets, Gmail, and OpenAI.\nTest manually by submitting the form and verifying functionality.\nOptional: Set a schedule for regular workflow execution.",
"isPaid": false
},
{
"templateId": "3665",
"templateName": "Property Lead Contact Enrichment from CRM",
"templateDescription": "How It WorksThis N8N workflow creates an automated system for discovering high-potential real estate investment opportunities. The workflow runs on a...",
"templateUrl": "https://n8n.io/workflows/3665",
"jsonFileName": "Property_Lead_Contact_Enrichment_from_CRM.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Property_Lead_Contact_Enrichment_from_CRM.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/9b297be3088bd8035e0ccea92ccd4928/raw/fdb8ffba64c71d013b05bcf1a650060646b4a880/Property_Lead_Contact_Enrichment_from_CRM.json",
"screenshotURL": "https://i.ibb.co/8L1Kbbmm/f0a410e5b0b1.png",
"workflowUpdated": true,
"gistId": "9b297be3088bd8035e0ccea92ccd4928",
"templateDescriptionFull": "This N8N workflow creates an automated system for discovering high-potential real estate investment opportunities. The workflow runs on a customizable schedule to scan the market for properties that match your specific criteria, then alerts your team about the most promising leads.\n\nThe process follows these steps:\n\nConnects to BatchData API on a regular schedule to search for properties matching your parameters\nCompares new results with previous scans to identify new listings and property changes\nApplies intelligent filtering to focus on high-potential opportunities (high equity, absentee owners, etc.)\nRetrieves comprehensive property details and owner information for qualified leads\nDelivers formatted alerts through multiple channels (email and Slack/Teams)\n\nEach email alert includes detailed property information, owner details, equity percentage, and a direct Google Maps link to view the property location. The workflow also posts concise notifications to your team's communication channels for quick updates.\n\nThis workflow is designed for:\n\nReal Estate Investors: Find off-market properties with high equity and motivated sellers\nReal Estate Agents: Identify potential listing opportunities before they hit the market\nProperty Acquisition Teams: Streamline the lead generation process with automated scanning\nReal Estate Wholesalers: Discover properties with significant equity spreads for potential deals\nREITs and Property Management Companies: Monitor market changes and expansion opportunities\n\nThe workflow is especially valuable for professionals who want to:\n\nSave hours of manual market research time\nGet early notifications about high-potential properties\nAccess comprehensive property and owner information in one place\nFocus their efforts on the most promising opportunities\n\nBatchData is a powerful property data platform for real estate professionals. 
Their API provides access to comprehensive property and owner information across the United States, including:\n\nProperty details (bedrooms, bathrooms, square footage, year built, etc.)\nValuation and equity estimates\nOwner information (name, mailing address, contact info)\nTransaction history and sales data\nForeclosure and distressed property status\nDemographic and neighborhood data\n\nThe platform specializes in providing accurate, actionable property data that helps real estate professionals make informed decisions and identify opportunities efficiently. BatchData's extensive database covers millions of properties nationwide and is regularly updated to ensure data accuracy.\nThe API's flexible search capabilities allow you to filter properties based on numerous criteria, making it an ideal data source for automated lead generation workflows like this one.",
"isPaid": false
},
{
"templateId": "4403",
"templateName": "Find the Content Gaps in Your Competitors' Discourse for Market Research and SEO",
"templateDescription": "This template can be used to find the content gaps in your competitors' discourse: identifying the topics they are not yet connecting and giving you an...",
"templateUrl": "https://n8n.io/workflows/4403",
"jsonFileName": "Find_the_Content_Gaps_in_Your_Competitors_Discourse_for_Market_Research_and_SEO.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Find_the_Content_Gaps_in_Your_Competitors_Discourse_for_Market_Research_and_SEO.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/0d1212f8fdc4acc4ca69d2e829fe7202/raw/c87e202d8c98d51ca43d1887be81f6c593caebf0/Find_the_Content_Gaps_in_Your_Competitors_Discourse_for_Market_Research_and_SEO.json",
"screenshotURL": "https://i.ibb.co/fd0BT0Vf/a975aaea9ed1.png",
"workflowUpdated": true,
"gistId": "0d1212f8fdc4acc4ca69d2e829fe7202",
"templateDescriptionFull": "This template can be used to find the content gaps in your competitors' discourse: identifying the topics they are not yet connecting and giving you an opportunity to fill this gap with your content and product ideas. It will also generate research questions that will help bridge the gaps and generate new ideas.\n\nThe template showcases the use of multiple n8n nodes and processes:\n\nenriching a Google Sheets file with the new data\ndata extraction\ncontent enhancement using the GraphRAG approach\ncontent gap / research question generation\n\nThis approach can be very useful for research, marketing, and SEO applications as you can quickly get an overview of the main topics that are available online for a certain niche and understand what is missing.\n\nIn the context of SEO, content gaps are usually understood as the topics that your competitors rank for but you do not.\n\nHowever, it's hard to rank for these topics because there's very high competition. A much more effective way is to identify the gaps between the topics your competitors are talking about that are not yet bridged in their discourse.\n\nIf you address these gaps in your content, you will increase the informational gain that your content offers and also offer a novel perspective while touching upon the topics that are relevant in your field.\n\nFor example, if we analyze the top websites for \"body and physical practices, fitness, etc.\" we will see that most of them are talking about the health and fitness aspects, and another big topic is the community aspect.\n\nHowever, there is a gap between the two topics: most of the websites (companies) that talk about this topic don't mention the two in the same context. This might be an opportunity: bridging the gap between health and fitness while also emphasizing the community aspect that comes with a collective practice.\n\nThis template consists of two stages:\n\nData enrichment of a Google Sheets file with a list of your competitors, using InfraNodus' GraphRAG to generate topical summaries and graph summaries for every URL you're analyzing.\nInsight generation: using InfraNodus to identify the main topical clusters and gaps in those summaries; these insights are then added to the Google Sheets file.\n\nAdditionally, it contains a sub-workflow that you can activate and launch to ask a Perplexity model to conduct market research, find the companies that operate in your field, and populate the original Google Sheets file.\n\nHere's a description step by step:\n\nStep 0: Populate the Google Sheets file with the company data (either manually or using the sub-workflow provided or Manus AI / Deep Research)\nSteps 1-2: Triggering and launching the workflow, extracting the company URL from the Google Sheets row\nStep 3: Scraping the URL content from the companies' websites and cleaning the data\nSteps 5-7: Use the InfraNodus GraphRAG Content Enhancer to get a topical summary and graph summary\nSteps 8-10: Use InfraNodus AI to generate insight advice and research questions based on the content gaps\n\nYou need an InfraNodus GraphRAG API account and key to use this workflow.\n\nCreate an InfraNodus account\nGet the API key at https://infranodus.com/api-access and create a Bearer authorization key for the InfraNodus HTTP nodes.\nCreate a separate knowledge graph for each expert (using PDF / content import options) in InfraNodus\nFor each graph, go to the workflow and paste the name of the graph into the body name field.\nKeep other settings intact or learn more about them at the InfraNodus access points page.\nOnce you add one or more graphs as experts to your flow, add the LLM key to the OpenAI node and launch the workflow\n\nAn InfraNodus account and API key\nA Google Sheets account and an authorization key\n\nNote: an OpenAI key is not required. But you might want to get a Perplexity AI key if you'd like to use the sub-workflow that populates the Google Sheets file with your competitors' website addresses (if you don't have this list yet).\n\nYou can use this same workflow with a Telegram bot or Slack (to be notified of the summaries and ideas).\n\nYou can also hook up automated social media content creation workflows at the end of this template, so you can generate posts that are relevant (covering the important topics in your niche) but also novel (because they connect them in a new way).\n\nCheck out our n8n templates for ideas at https://n8n.io/creators/infranodus/\n\nCheck out the complete guide at https://support.noduslabs.com/hc/en-us/articles/20234254556828-Find-Content-Gaps-in-Websites-Market-Research-and-SEO-n8n-Workflow\n\nAlso check the full tutorial with a conceptual explanation at https://support.noduslabs.com/hc/en-us/articles/20454382597916-Beat-Your-Competition-Target-Their-Content-Gaps-with-this-n8n-Automation-Workflow\n\nAlso check out the video tutorial with a demo.\n\nFor support and help with this workflow, please contact us at https://support.noduslabs.com",
"isPaid": false
},
{
"templateId": "4432",
"templateName": "Generate Content Ideas with Gemini and Store in Google Sheets",
"templateDescription": "Automated Content Idea Generation and Expansion with Google Gemini and Google Sheets This n8n workflow automates the process of generating content ideas...",
"templateUrl": "https://n8n.io/workflows/4432",
"jsonFileName": "Generate_Content_Ideas_with_Gemini_and_Store_in_Google_Sheets.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Generate_Content_Ideas_with_Gemini_and_Store_in_Google_Sheets.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/ab992197fbabfab142cd1c97f6d5633f/raw/2c417b7e8fcb94a58121f8973ce626a7c1ab04f8/Generate_Content_Ideas_with_Gemini_and_Store_in_Google_Sheets.json",
"screenshotURL": "https://i.ibb.co/ccv8t3Ln/d1fca96fe528.png",
"workflowUpdated": true,
"gistId": "ab992197fbabfab142cd1c97f6d5633f",
"templateDescriptionFull": "This n8n workflow automates the process of generating content ideas based on a user-defined topic, then expands each idea into a more detailed content piece (like a blog post) using Google Gemini, and finally saves all the generated data (idea title, description, and full content) into a Google Sheet. It's a powerful tool for streamlining content creation workflows.\n\nThis workflow includes:\n\nGeneration of multiple content ideas from a single topic.\nExpansion of each idea into detailed content using AI.\nStorage of ideas and generated content in a structured Google Sheet.\nSticky Notes within the workflow for inline documentation and setup guidance.\n\nn8n Instance: You need a running n8n instance (self-hosted or cloud).\nGoogle AI Account: Access to Google AI (Gemini). You will need an API key.\nGoogle Account: Access to Google Sheets. You will need to create or use an existing spreadsheet with specific column headers.\n\nImport the Workflow:\n\nCopy the entire JSON code provided.\nIn your n8n instance, go to \"Workflows\".\nClick \"New\" -> \"Import from JSON\".\nPaste the JSON code and click \"Import\".\n\nConfigure Credentials:\n\nGoogle AI (Gemini):\n\nFind the \"Google Gemini Chat Model for Content Idea Generator\" node and the \"Google Gemini Chat Model for Content Generation\" node.\nClick on the \"Credentials\" field in both nodes (it will likely show a placeholder name like \"Google Gemini(PaLM) Api account\").\nClick \"Create New\".\nSelect \"Google AI API\".\nEnter your Google AI API Key.\nSave the credential. (You can reuse the same credential for both nodes.)\n\nGoogle Sheets:\n\nFind the \"Google Sheets\" node.\nClick on the \"Credentials\" field (it will likely show a placeholder name like \"Google Sheets account\").\nClick \"Create New\".\nSelect \"Google Sheets OAuth2 API\".\nFollow the steps to connect your Google Account and grant n8n access to Google Sheets.\nSave the credential.\n\nConfigure Google Sheets Node:\n\nOpen the \"Google Sheets\" node settings.\nSpreadsheet ID: Replace the placeholder value with the actual ID of your Google Sheet. You can find the Spreadsheet ID in the URL of your Google Sheet (it's the long string of characters between /d/ and /edit).\nSheet Name: Select or enter the name or GID of the sheet within your spreadsheet where you want to save the data (e.g., Sheet1 or gid=0).\nColumns: Ensure your Google Sheet has columns named title, description, and content. The node is configured to map the generated data to these specific column headers.\nSave the node settings.\n\nReview Sticky Notes:\n\nLook at the Sticky Notes placed around the workflow canvas. They provide helpful context and reminders for setup, required Google Sheet columns, and the AI models used.\n\nActivate the Workflow: Toggle the workflow switch to \"Active\".\n\nTrigger the Workflow:\n\nSince this workflow uses a \"When clicking ‘Execute workflow’\" node as the trigger, you can run it directly from the n8n editor.\nClick the \"Execute Workflow\" button.\nThe workflow will start automatically.\n\nSet the Topic:\n\nOpen the \"Set the input fields\" node.\nModify the topic value to the subject you want to generate content ideas about.\nSave the node settings.\n\nMonitor Execution: Watch the workflow execute. The nodes will light up as they process. The \"Loop Over Items\" node will show multiple executions as it processes each generated idea.\n\nCheck Results:\n\nThe generated content ideas (title, description) and the expanded content will be written as new rows in the Google Sheet you configured. Each row will correspond to one generated idea and its content.\n\nThis workflow provides a robust starting point for AI-assisted content creation. You can customize the AI prompts in the \"Content Idea Generator\" and \"LLM Content Generator\" nodes to refine the output style and format, or integrate additional steps like sending notifications or further processing the generated content.",
"isPaid": false
},
{
"templateId": "4457",
"templateName": "template_4457",
"templateDescription": "Create your own intelligent Telegram bot that summarizes articles and processes commands automatically. This powerful workflow turns Telegram into your...",
"templateUrl": "https://n8n.io/workflows/4457",
"jsonFileName": "template_4457.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_4457.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/bbd7b74440126942021a7efa27dbab10/raw/ff2853e2a9d83ae92be3cd503c87f2c18970f644/template_4457.json",
"screenshotURL": "https://i.ibb.co/HT7YgRYR/3a6b7c6f26f9.png",
"workflowUpdated": true,
"gistId": "bbd7b74440126942021a7efa27dbab10",
"templateDescriptionFull": "Create your own intelligent Telegram bot that summarizes articles and processes commands automatically.\n\nThis powerful workflow turns Telegram into your personal AI assistant, handling /help, /summary <URL>, and /img <prompt> commands with intelligent responses - perfect for teams, content creators, and anyone wanting smart automation in their messaging.\n\nSmart Command Processing: Automatically recognizes and routes /help, /summary, and /img commands to appropriate AI-powered responses.\n\nArticle Summarization: Fetches any URL, extracts content, and generates professional 10-12 bullet point summaries using OpenAI.\n\nImage Generation: Processes image prompts and integrates with AI image generation services.\n\nHelp System: Provides users with clear command instructions and usage examples.\n\n✅ Personal AI Assistant: Get instant article summaries in Telegram\n✅ Team Productivity: Share quick content summaries with colleagues\n✅ Content Research: Rapidly digest articles and web content\n✅ 24/7 Availability: Bot works around the clock without maintenance\n✅ Easy Commands: Simple /summary <link> format anyone can use\n✅ Scalable: Handles multiple users and requests simultaneously\n\nJournalists quickly summarizing news articles\nMarketing teams researching competitor content\nStudents processing academic papers and articles\nAnalysts digesting industry reports\n\nTeam Communication: Share article insights in group chats\nResearch Assistance: Quick content analysis for decision making\nContent Curation: Summarize articles for newsletters or reports\nKnowledge Sharing: Help teams stay informed efficiently\n\nComplete Bot Workflow: Ready-to-deploy Telegram bot with all commands\nAI Integration: OpenAI-powered content summarization and processing\nSmart Routing: Intelligent command recognition and response system\nError Handling: Robust system handles invalid commands gracefully\nExtensible Design: Easy to add new commands and features\n\nn8n Platform: Cloud or self-hosted instance\nTelegram Bot Token: Create via @BotFather (free, 5 minutes)\nOpenAI API: For content summarization (pay-per-use)\nBasic Configuration: Follow 15-minute setup guide\n\nCommand Extensions: Add custom commands for specific workflows\nResponse Formatting: Customize summary style and length\nMulti-Language: Support different languages for international teams\nIntegration APIs: Connect additional AI services and tools\nUser Permissions: Control who can use specific commands\nAnalytics: Track usage patterns and popular content\n\n#telegram-bot #ai-automation #content-summarization #article-processing #team-productivity #openai-integration #smart-assistant #workflow-automation #messaging-bot #content-research #ai-agent #n8n-workflow #business-automation #telegram-integration #ai-powered-bot\n\nNews Team: Quickly summarize breaking news articles for editorial meetings\nMarketing Agency: Research competitor content and industry trends efficiently\nSales Team: Digest industry reports and share insights with prospects\nRemote Team: Keep everyone informed with summarized company updates\n\n80% faster content research and analysis\n50% more articles processed per day vs manual reading\n100% team accessibility through familiar Telegram interface\n24/7 availability for global teams across time zones\n\nQuick Start: Deploy your bot in 15 minutes with included guide\nVideo Tutorial: Complete walkthrough available\nTemplate Commands: Pre-built responses and formatting\nExpert Support: Direct help from workflow creator\n\nYouTube: https://www.youtube.com/@YaronBeen/videos\n\n💼 Professional Support\nLinkedIn: https://www.linkedin.com/in/yaronbeen/\n\n📧 Direct Help\nEmail: Yaron@nofluff.online - Response within 24 hours\n\nReady to build your intelligent Telegram assistant?\n\nGet this AI Telegram Bot Agent and transform your messaging app into a powerful content processing tool. Perfect for teams, researchers, and anyone who wants AI-powered assistance directly in Telegram.\n\nStop manually reading long articles. Start getting instant, intelligent summaries with simple commands.",
"isPaid": false
},
{
"templateId": "4498",
"templateName": "template_4498",
"templateDescription": "Automated Instagram posting with Facebook Graph API and content routing Who is this for? This workflow is perfect for social media managers, content...",
"templateUrl": "https://n8n.io/workflows/4498",
"jsonFileName": "template_4498.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_4498.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/cfe4085fb69b764361cc831f62036b23/raw/5f6d9f3dde0dd3ddca8011c1352a04561fd03fad/template_4498.json",
"screenshotURL": "https://i.ibb.co/9HDLnFrs/8a3cb3d9d5c7.png",
"workflowUpdated": true,
"gistId": "cfe4085fb69b764361cc831f62036b23",
"templateDescriptionFull": "This workflow is perfect for social media managers, content creators, digital marketing agencies, and small business owners who need to automate their Instagram posting process. Whether you're managing multiple client accounts or maintaining consistent personal branding, this template streamlines your social media operations.\n\nManual Instagram posting is time-inconsistent and prone to inconsistency. Content creators struggle with:\n\nRemembering to post at optimal times\nManaging different content types (images, videos, reels, stories, carousels)\nMaintaining posting schedules across multiple accounts\nEnsuring content is properly formatted for each post type\n\nThis workflow eliminates manual posting, reduces human error, and ensures consistent content delivery across all Instagram format types.\n\nThe workflow automatically publishes content to Instagram using Facebook's Graph API with intelligent routing based on content type. It handles image posts, video stories, Instagram reels, carousel posts, and story content. The system creates media containers, monitors processing status, and publishes content when ready. 
It supports both HTTP requests and Facebook SDK methods for maximum reliability and includes automatic retry mechanisms for failed uploads.\n\nConnect Instagram Business Account to a Facebook Page\nConfigure Facebook Graph API credentials with instagram_basic permissions\nUpdate the \"Configure Post Settings\" node with your Instagram Business Account ID\nSet media URLs and captions in the configuration section\nChoose post type (http_image, fb_reel, http_carousel, etc.)\nTest workflow with sample content before going live\n\nModify the post_type variable to control content routing:\n\nUse http_* prefixes for direct API calls\nUse fb_* prefixes for Facebook SDK calls\nUse both HTTP and Facebook SDK nodes as fallback mechanisms - if one method fails, automatically try the other for maximum success rate\nAdd scheduling by connecting a Cron node trigger\nIntegrate with Google Sheets or Airtable for content management\nConnect webhook triggers for automated posting from external systems\nCustomize wait times based on your content file sizes\nSet up error handling to switch between HTTP and Facebook SDK methods when API limits are reached",
"isPaid": false
},
{
"templateId": "4573",
"templateName": "Google Maps business scraper with contact extraction via Apify and Firecrawl",
"templateDescription": "Who is this for?Marketing agencies, sales teams, lead generation specialists, and business development professionals who need to build comprehensive...",
"templateUrl": "https://n8n.io/workflows/4573",
"jsonFileName": "Google_Maps_business_scraper_with_contact_extraction_via_Apify_and_Firecrawl.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Google_Maps_business_scraper_with_contact_extraction_via_Apify_and_Firecrawl.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/cb0e59100dd59491705e5733278e8bd5/raw/45eecf9a683164058356d9b8ded4c11b18110131/Google_Maps_business_scraper_with_contact_extraction_via_Apify_and_Firecrawl.json",
"screenshotURL": "https://i.ibb.co/84rHmQJB/52408eda52ac.png",
"workflowUpdated": true,
"gistId": "cb0e59100dd59491705e5733278e8bd5",
"templateDescriptionFull": "Marketing agencies, sales teams, lead generation specialists, and business development professionals who need to build comprehensive business databases with contact information for outreach campaigns across any industry.\n\nFinding businesses and their contact details manually is time-consuming and inefficient. This workflow automates the entire process of discovering businesses through Google Maps and extracting their digital contact information from websites, saving hours of manual research.\n\nThis automated workflow runs every 30 minutes to:\n\nScrape business data from Google Maps using Apify's Google Places crawler\nSave basic business information (name, address, phone, website) to Google Sheets\nFilter businesses that have websites\nScrape each business's website content using Firecrawl\nExtract contact information including emails, LinkedIn, Facebook, Instagram, and Twitter profiles\nStore all extracted data in organized Google Sheets for easy access and follow-up\n\nRequired Services:\n\nGoogle Sheets account with OAuth2 setup\nApify account with API access for Google Places scraping\nFirecrawl account with API access for website scraping\n\nPre-setup:\n\nCopy this Google Sheet\nConfigure your Apify and Firecrawl API credentials in n8n\nSet up Google Sheets OAuth2 connection\nUpdate the Google Sheet ID in all Google Sheets nodes\n\nQuick Start:\nThe workflow includes detailed sticky notes explaining each phase. Simply configure your API credentials and Google Sheet, then activate the workflow.\n\nChange search criteria: Modify the Apify scraping parameters to target different business types (restaurants, gyms, salons, etc.) 
or locations\nAdjust schedule: Change the trigger interval from 30 minutes to your preferred frequency\nAdd more contact fields: Extend the extraction code to find additional contact information like WhatsApp or Telegram\nFilter criteria: Modify the filter conditions to target businesses with specific characteristics\nBatch size: Adjust the batch processing to handle more or fewer websites simultaneously\n\nPerfect for lead generation, competitor research, and building targeted marketing lists across any industry or business type.",
"isPaid": false
},
{
"templateId": "4600",
"templateName": "🤖 AI content generation for Auto Service 🚘 Automate your social media📲!",
"templateDescription": "Auto Service social media ai generated content posting automation Who Is This For? 🚘This workflow is designed for auto service / car repair businesses...",
"templateUrl": "https://n8n.io/workflows/4600",
"jsonFileName": "_AI_content_generation_for_Auto_Service__Automate_your_social_media.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/_AI_content_generation_for_Auto_Service__Automate_your_social_media.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/b5773610701e4050f958eb99abc1ab3f/raw/f8d15295e4b28c0cd77eac9915c67e8f07896883/_AI_content_generation_for_Auto_Service__Automate_your_social_media.json",
"screenshotURL": "https://i.ibb.co/twT5jWZ6/348f25f37ea0.png",
"workflowUpdated": true,
"gistId": "b5773610701e4050f958eb99abc1ab3f",
"templateDescriptionFull": "Who Is This For?\n\nWhether you’re a small garage owner, a car repair shop, an automotive specialist or automechanic - this tool helps you maintain an active online presence without spending hours creating content.\n\n💪🏼 Though this template is set up for Auto Service daily content uploads, but the underlying logic is universal. You can easily adapt it for any niche by editing prompts, adding nodes, and creating or uploading a variety of content to any platform. You can use any LLM and generative AI of your choice. Personally, I prefer the smooth and effective results from ChatGPT 4.1 combined with GPT Image 1. But you can generate videos instead of images for your posts as well 😉\n\nWhat Problem Is This Workflow Solving?\n\n🤦‍♂️ Many auto service businesses struggle with consistently posting engaging content due to time constraints or lack of marketing expertise.\n\nWhat This Workflow Does:\n\nGenerates daily social media posts tailored specifically to the auto service niche using AI.\nAllows easy customization of post and image prompts.\nIntegrates research links through the Tavily Internet Search tool for relevant content.\nSupports starting posts based on reference article links via Google Sheets.\nAutomatically publishes posts to your connected social media platforms.\nEnables scheduled or trigger-based posting for maximum flexibility.\n\nHow It Works?\n\nEasy, actually ☺️\n\nAI creates daily social media content made just for Auto Service. You can simply edit prompts for both posts and images, set up news or articles research links via the Tavily Internet Search tool. You can also start the workflow with a reference article link through Google Sheets.\n\nSet Up Steps:\n\nI kept it quick and simple for you ✨\n\nIf you’re happy with the current LLM and image model configurations, just connect your OpenAI API credentials to enable AI content generation.\nLink your social media accounts (Facebook, Telegram, X, etc.) 
for autoposting.\nOptionally connect Google Sheets if you want to trigger posts based on sheet updates with reference links.\nCustomize prompts as needed to match your brand voice, style, and marketing tasks.\nChoose between:\n\nScheduled automatic generation and posting at the times social media algorithms favor.\nGoogle Sheets trigger with reference.\nManual start.\n\nHow to Customize This Workflow to Your Needs?\n\nSwitch AI models and edit prompts to better reflect your specific services or promotions.\nAdd or modify research links in Tavily to keep your posts timely and relevant.\nAdjust posting schedules to match peak engagement times for your audience.\nExpand or reduce the number of social platforms integrated depending on your marketing strategy.\nUse Google Sheets to batch upload ideas or curate specific content topics.\n\nAfter adjusting a few settings, activate the workflow and let it run.\n\n🤖 The system will then automatically publish your content across your selected social platforms — saving you time and effort.\n\n📌 You’ll find more detailed tips and additional AI models for customizing the AI generation process inside the workflow via sticky notes.\n\n💬 Need help? For additional guidance, feel free to message me — here’s my profile in the n8n community for direct contact 👈 click!",
"isPaid": false
},
{
"templateId": "461",
"templateName": "Create a new card in Trello",
"templateDescription": "workflow-screenshot",
"templateUrl": "https://n8n.io/workflows/461",
"jsonFileName": "Create_a_new_card_in_Trello.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Create_a_new_card_in_Trello.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/5bb70b925800689579d71a28e15c8923/raw/d01aaa535bb536527b96fef0cf3d8737319fd494/Create_a_new_card_in_Trello.json",
"screenshotURL": "https://i.ibb.co/8496Cmth/ca6c62e38d40.png",
"workflowUpdated": true,
"gistId": "5bb70b925800689579d71a28e15c8923",
"templateDescriptionFull": "workflow-screenshot",
"isPaid": false
},
{
"templateId": "4621",
"templateName": "22. Automated Video Analysis: AI-Powered Insight Generation from Google Drive",
"templateDescription": "Automated Video Analysis: AI-Powered Insight Generation from Google Drive Subtitle: From Google Drive Upload → Gemini AI → Video Insights 🌍 Overview This...",
"templateUrl": "https://n8n.io/workflows/4621",
"jsonFileName": "22._Automated_Video_Analysis_AI-Powered_Insight_Generation_from_Google_Drive.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/22._Automated_Video_Analysis_AI-Powered_Insight_Generation_from_Google_Drive.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/ad8fa167cf62e132e202622f203cb479/raw/21c3a26896946b536e78d5adb887ff9519bf9f46/22._Automated_Video_Analysis_AI-Powered_Insight_Generation_from_Google_Drive.json",
"screenshotURL": "https://i.ibb.co/LXdS209q/71e90a24f7c2.png",
"workflowUpdated": true,
"gistId": "ad8fa167cf62e132e202622f203cb479",
"templateDescriptionFull": "Subtitle: From Google Drive Upload → Gemini AI → Video Insights\n\nThis workflow automates the analysis of videos stored in Google Drive.\nIt downloads a video, validates it, sends it to Google Gemini AI for analysis, and returns a structured summary of the content.\n\nThink of it as your AI-powered video analyst that works on schedule.\n\n🔗 Node: Schedule Trigger\n\nRuns the workflow automatically at a defined interval (e.g., daily).\nEliminates the need to manually start each run.\n\n💡 Why useful?\nKeeps analysis consistent without human intervention.\n\n🔗 Node: Download Video from Drive\n\nConnects to Google Drive.\nFetches the video file you want to analyze.\n\n💡 Why useful?\nPulls the raw video directly from storage → no manual download needed.\n\n🔗 Nodes:\n\nBasic LLM Chain → Prepares a structured prompt for Gemini.\nGoogle Gemini Chat Model → Defines Gemini as the AI engine for analysis.\n\n💡 Why useful?\nEnsures that Gemini gets both the video file + clear instructions on what to analyze (e.g., \"Please provide a summary\").\n\n🔗 Node: Check File Status\n\nConfirms that the video file is uploaded and ready to be processed by Gemini’s API.\n\n💡 Why useful?\nPrevents wasted runs by making sure the file exists and is accessible before analysis.\n\n🔗 Node: Analyze Video\n\nSends the video file to Gemini (via API request).\nAsks Gemini to analyze and summarize the video.\n\n💡 Why useful?\nExtracts insights from video content automatically — no need to watch manually.\n\n📩 Example Output:\n\n🔗 Node: Format Analysis Result\n\nStructures the Gemini response into clean output.\nMakes it easy to forward results into email, Slack, or reporting tools.\n\n💡 Why useful?\nInstead of messy raw JSON, you get clear summaries ready to share.\n\nHands-free analysis → Videos summarized automatically.\nSaves time → No need to watch entire footage.\nReliable → Validates file before sending to AI.\nFlexible → Schedule runs (daily, weekly, 
etc.).\nScalable → Works for 1 video or 1,000.\nBeginner-friendly → Includes sticky notes and author support.",
"isPaid": false
},
{
"templateId": "4696",
"templateName": "Conversational Telegram Bot with GPT-4o for Text and Voice Messages",
"templateDescription": "This n8n workflow leverages a Telegram Message Trigger to activate an intelligent AI Agent capable of processing both text and voice messages. When a user...",
"templateUrl": "https://n8n.io/workflows/4696",
"jsonFileName": "Conversational_Telegram_Bot_with_GPT-4o_for_Text_and_Voice_Messages.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Conversational_Telegram_Bot_with_GPT-4o_for_Text_and_Voice_Messages.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/9a02522c8190eff8de6e48f11531a513/raw/3365b4952ffc5a9608c375945445c461485f97a1/Conversational_Telegram_Bot_with_GPT-4o_for_Text_and_Voice_Messages.json",
"screenshotURL": "https://i.ibb.co/20G5cNjT/8c6a7042c601.png",
"workflowUpdated": true,
"gistId": "9a02522c8190eff8de6e48f11531a513",
"templateDescriptionFull": "This n8n workflow leverages a Telegram Message Trigger to activate an intelligent AI Agent capable of processing both text and voice messages. When a user sends a message in text or in voice format, the workflow captures and transcribes it (if necessary), then passes it to the AI Agent for understanding and response generation.\n\nTo enhance user experience, the bot also displays a typing indicator while processing requests, simulating a natural, human-like interaction.\n\nMulti-Modal Input: Supports both text messages and voice notes from users.\nReal-Time Interaction: Shows a “typing…” action in Telegram while the AI processes the input.\nAI Agent Integration: Provides intelligent, context-aware, and conversational responses.\nSeamless Feedback Loop: Replies are sent directly back to the user within Telegram for smooth interaction.\n\nThe workflow triggers whenever a message or voice note is received on Telegram.\nIf the input is a voice note, the workflow transcribes it into text.\nThe text input is sent to the AI Agent for processing.\nWhile processing, the bot sends a typing indicator to the user.\nOnce the AI generates a response, the workflow sends it back to the user in Telegram.\n\nCreate a Telegram Bot:\n\nUse @BotFather to create a bot and obtain your bot token.\nUse @BotFather to create a bot and obtain your bot token.\nConfigure n8n Credentials:\n\nAdd Telegram API credentials in n8n with your bot token.\nAdd credentials for any speech-to-text service used for voice transcription (e.g., Open AI Transcribe A Recording).\nAdd Telegram API credentials in n8n with your bot token.\nAdd credentials for any speech-to-text service used for voice transcription (e.g., Open AI Transcribe A Recording).\nImport the Workflow:\n\nImport this workflow into your n8n instance.\nUpdate all credential nodes to use your Telegram and transcription service credentials.\nImport this workflow into your n8n instance.\nUpdate all credential nodes to 
use your Telegram and transcription service credentials.\nSet Webhook URLs:\n\nEnsure Telegram webhook is set properly for your bot to receive messages.\nMake sure your n8n instance is publicly accessible for Telegram callbacks.\nEnsure Telegram webhook is set properly for your bot to receive messages.\nMake sure your n8n instance is publicly accessible for Telegram callbacks.\nTest the Workflow:\n\nSend text messages and voice notes to your Telegram bot and observe the AI responses.\nSend text messages and voice notes to your Telegram bot and observe the AI responses.\n\nAdd new message handlers: Extend the workflow to handle additional message types (images, documents, etc.).\nImprove transcription: Swap or add speech-to-text services for better accuracy or language support.\nEnhance AI Agent: Customize prompts and context management to tailor the AI’s personality and responses.\nAI Model Flexibility: Swap between different AI models (e.g., GPT-5, GPT-4, Claude, or custom LLMs) based on task type, cost, or performance preferences. By default, I use GPT-4o in this template. However, you can use the latest GPT-5 model by changing them in OpenAI Chat Model Node. 
It will show you a list of all the available models to choose from.\nTool-Based Control: Add custom tools to the AI Agent such as calendar access, Notion, Google Sheets, web search, database queries, or custom APIs—allowing for dynamic, multi-functional agents.\n\nNotes: The Telegram node manages message reception and sending but does not directly handle AI processing.\nVoice transcription requires integration with external APIs; secure those credentials in n8n and monitor usage.\nTo simulate typing, the workflow uses Telegram’s “sendChatAction” API method, providing users with feedback that the bot is processing.\nEnsure your AI API keys and Telegram tokens are securely stored in n8n credentials and not exposed in workflows or logs.\n\nHandles natural conversational inputs with text or voice.\nProvides a smooth, engaging user experience via typing indicators.\nEasy integration of advanced AI conversational agents with Telegram.\nFlexible for personal assistants, helpdesks, or interactive chatbots.",
"isPaid": false
},
{
"templateId": "4888",
"templateName": "ELEVEN LABS WITH MULTI AGENTS",
"templateDescription": "🧠 Gwen – The AI Voice Marketing AgentGwen is your intelligent voice-powered marketing assistant built in n8n. She combines the power of OpenAI, ElevenLabs,...",
"templateUrl": "https://n8n.io/workflows/4888",
"jsonFileName": "ELEVEN_LABS_WITH_MULTI_AGENTS.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/ELEVEN_LABS_WITH_MULTI_AGENTS.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/4bcb6f4e962f9108c6c874297e6d567b/raw/8129e8699ad221ffdc7f0c5167a5b251725d8a23/ELEVEN_LABS_WITH_MULTI_AGENTS.json",
"screenshotURL": "https://i.ibb.co/0jZr0M8D/44119f3df74a.png",
"workflowUpdated": true,
"gistId": "4bcb6f4e962f9108c6c874297e6d567b",
"templateDescriptionFull": "🧠 Gwen – The AI Voice Marketing Agent\nGwen is your intelligent voice-powered marketing assistant built in n8n. She combines the power of OpenAI, ElevenLabs, and automation workflows to handle content creation, image generation, and voice delivery — all from a single agent interface.\n\nThis template shows a graphical illustration of how Gwen will work with subworkflows. These subworkflows are modular placeholders and need to be linked into Gwen for full deployment.\n\n✨ What Gwen Can Do\n📝 Generate Voice-Optimized Blog Posts\nTailored for your target audience with engaging intros, real-time research, and polished structure.\n🖼️ Create AI-Generated Visuals\nFrom simple concepts to detailed image prompts and Google Drive uploads.\n🧑‍🎨 Edit Images On Demand\nModify past images with a few words — powered by OpenAI's image editing API.\n🔍 Search Image Database\nQuickly find past content using title or intent.\n🧠 Think Tool\nGwen uses this to clarify uncertain tasks or analyze complex requests.\n🔊 Deliver Results in Natural Voice\nWith ElevenLabs, Gwen transforms all responses into human-like audio, perfect for marketing, social content, or voice interfaces.\n🛠️ Setup Instructions\nEstimated Time: 15–30 mins\n\n✅ Step 1: Subworkflows\n\nImport These Workflows\nBlog Post, Create Image, Edit Image, Search Images\nConnect Them to Gwen\nAssign as tools inside the Gwen agent node (Langchain AI Agent in n8n).\n🎙️ Step 2: Enable ElevenLabs Voice Agent\n\nSign up or log in: https://try.elevenlabs.io\nCopy your API key\nIn the ElevenLabs interface, create a new tool:\nMethod: POST\nURL: https://your-n8n-domain/webhook/042cc868-28b7-42a2-ab65-bc2944fc5a54\nUnder Body Parameters, add:\nprompt → value type: LLM Prompt\nsessionId → value type: Dynamic variable, name: system__conversation_id\nSave and connect this tool to your ElevenLabs agent\nRun a test and check n8n execution logs to confirm Gwen’s voice integration is active\n🔐 Step 3: Credentials to 
Set\n\nOpenAI – For text and image generation\nElevenLabs – For voice output\nTavily – For real-time research in blog generation\nTelegram – For sending content to users\nGoogle Sheets – To log all outputs like blogs and images",
"isPaid": false
},
{
"templateId": "5195",
"templateName": "AI-Powered Product Research & SEO Content Automation",
"templateDescription": "AI-Powered Product Research & SEO Content Automation Skip the guesswork and manual effort — this n8n flow automates the entire process of researching your...",
"templateUrl": "https://n8n.io/workflows/5195",
"jsonFileName": "AI-Powered_Product_Research__SEO_Content_Automation.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/AI-Powered_Product_Research__SEO_Content_Automation.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/c6d0da29d9d40d36324426f803ded581/raw/e0c96390b4f950fa2d60986a0f83bf016efa3e52/AI-Powered_Product_Research__SEO_Content_Automation.json",
"screenshotURL": "https://i.ibb.co/gZ9DrGtB/de30527f2d7f.png",
"workflowUpdated": true,
"gistId": "c6d0da29d9d40d36324426f803ded581",
"templateDescriptionFull": "Skip the guesswork and manual effort — this n8n flow automates the entire process of researching your product's online competition and generating high-quality SEO content. Whether you're launching a new product or optimizing existing listings, this workflow leverages real-time web data and AI-driven copywriting to deliver:\n\n📈 Search-optimized metadata (Title, Description, Keywords)\n🛍️ Engaging product descriptions tailored for marketing\n📊 Auto-organized output ready for use in your content or e-commerce platform\n\nAll of this happens with just one product title input!\n\n• User submits a product title via a form.\n• The workflow uses Google Custom Search to gather real-time competitor content based on that title.\n• Titles, snippets, and keywords are extracted from the search results.\n• This information is sent to a language model (Google Gemini via LangChain) to generate:\n\nSEO-optimized metadata (Title, Description, Keywords)\nA compelling product description tailored for marketing\n• The AI-generated content is then parsed and organized into two categories: SEO data and product content.\n• The structured output is saved automatically into a connected Google Sheet for easy access or further automation.\n\nManual competitor research and writing SEO content from scratch can be:\n\nTime-consuming\nInconsistent in quality\nNot optimized for search engines\nHard to scale for multiple products\n\nThis workflow automates the entire research + writing + structuring process.\n\nInstant Content Creation: Generate polished SEO content in seconds.\nCompetitor-Aware: Pulls in real-time data from the web for relevant, market-aligned content.\nScalable: Easily repeat the process for multiple product titles with minimal effort.\nData Centralization: Stores everything in Google Sheets—great for collaboration or syncing with other tools.\nCustomizable: Easily extend or modify the workflow to include translations, publishing, or social media 
automation.\n\n• Connect Google Custom Search API with a valid API key and search engine ID (CX).\n• Connect and configure Google Gemini or LangChain with access credentials.\n• Provide access to a Google Sheet with columns for storing SEO and product data.\n• Estimated setup time: ~15–25 minutes depending on API access and sheet setup.\n\nCreate your free n8n account and set up the workflow in just a few minutes using the link below:\n\n👉 Start Automating with n8n\n\nSave time, stay consistent, and grow your online presence effortlessly!",
"isPaid": false
},
{
"templateId": "5674",
"templateName": "Seedance Video Marketing AI Agent",
"templateDescription": "This workflow contains community nodes that are only compatible with the self-hosted version of n8n. 🎬 Seedance Video Marketing AI AgentDescription:This...",
"templateUrl": "https://n8n.io/workflows/5674",
"jsonFileName": "Seedance_Video_Marketing_AI_Agent.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Seedance_Video_Marketing_AI_Agent.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/646ce2805413888c48daddd7c1f25d77/raw/7beb432a8db75308a06aedd47ae60fa251d3e818/Seedance_Video_Marketing_AI_Agent.json",
"screenshotURL": "https://i.ibb.co/vvKsxZ16/7c907b6dfbc6.png",
"workflowUpdated": true,
"gistId": "646ce2805413888c48daddd7c1f25d77",
"templateDescriptionFull": "This workflow contains community nodes that are only compatible with the self-hosted version of n8n.\n\n🎬 Seedance Video Marketing AI Agent\nDescription:\nThis AI-powered marketing automation workflow takes a user prompt, researches trending topics, generates a compelling short-form video prompt, and sends it to the Seedance API via Wavespeed to create a ready-to-use video ad — all from a single Telegram message.\n\nBuilt for marketers, founders, and content creators who want to turn trend-based ideas into visual video content without touching a video editor.\n\nFor the step-by-step video tutorial guide on how to build this workflow, check out:\nhttps://youtu.be/2oZ5NhosKgo\n\n🤖 How It Works:\n📲 Telegram Trigger\n Kick off the workflow by messaging a short prompt (e.g., “Create a 5-second IG ad for my new perfume”) via Telegram.\n\n📈 Trend Research with Perplexity AI (Sonar Pro)\n An AI agent scans the latest 14-day trends and selects the top marketing angle based on the product/niche input.\n\n🧠 Video Prompt Engineer (ChatGPT)\n Crafts a concise, visually rich video generation prompt — optimized for Seedance — based on the trend insight and product.\n\n🎥 Video Generation (Wavespeed + Seedance API)\n Sends the AI-generated prompt to Seedance via Wavespeed to generate a 5-second short-form video ad.\n\n🔁 Status Loop & Response\n The workflow checks if the video is ready. 
Once complete, it sends the final video output URL back to you in Telegram.\n\n🔧 Tools Used:\nTelegram Trigger & Response\n\nPerplexity AI (Sonar Pro)\n\nOpenAI\n\nSeedance API (via Wavespeed)\n\nn8n HTTP Request, Wait, and Loop nodes\n\n💡 Use Cases:\nAuto-generate TikTok/Instagram ads from current trends\n\nScale creative content generation with zero design work\n\nPlug into your marketing chatbot or campaign assistant\n\nUse trends as visual inspiration for ad creatives\n\nIf you like the build, check out my channel and consider subscribing: https://www.youtube.com/@Automatewithmarc",
"isPaid": false
},
{
"templateId": "5840",
"templateName": "My workflow 19",
"templateDescription": "This n8n template from Intuz delivers a complete AI-powered solution for automated LinkedIn posts, including unique content, custom images, and optimized...",
"templateUrl": "https://n8n.io/workflows/5840",
"jsonFileName": "My_workflow_19.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/My_workflow_19.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/6d87f252ee5a7c3b079c02a8b7e90a75/raw/91f1bc9f0b4f0c274ea51584c282e3e28a7c5679/My_workflow_19.json",
"screenshotURL": "https://i.ibb.co/4ndkN5cB/eab1e21ef425.png",
"workflowUpdated": true,
"gistId": "6d87f252ee5a7c3b079c02a8b7e90a75",
"templateDescriptionFull": "Use cases are many: Generate and schedule tailored LinkedIn content for different use-cases. By feeding the AI specific prompts, you can create specific post depending upon the topics and visuals to maintain a consistency yet and an online presence.\n\nMaintaining a consistent and engaging presence on LinkedIn can be time-consuming, requiring constant ideation, content creation, and manual posting. This workflow takes that burden off your shoulders, delivering a fully automated solution for generating and publishing high-quality LinkedIn content.\n\nScheduled Content Engine: Each day (or on your chosen schedule), the workflow kicks into gear, ensuring a fresh stream of content.\nSmart Topic & Content Generation: Using the power of Google Gemini, it intelligently crafts unique content topics and then expands them into full, engaging posts, ensuring your message is always fresh and relevant.\nDynamic Image Creation: To make your posts stand out, the workflow leverages an AI image generator (like DALL-E) to produce a custom, eye-catching visual that perfectly complements your generated text.\nSEO-Optimized Hashtag Generation: Google Gemini then analyzes your newly created post and automatically generates a set of relevant, trending, and SEO-friendly hashtags, significantly boosting your content's reach and discoverability.\nSeamless LinkedIn Publishing: Finally, all these elements—your compelling text, unique image, and powerful hashtags—are merged and automatically published to your LinkedIn profile, establishing you as a thought leader with minimal effort.\n\nThis guide will get your AI LinkedIn Content Automation workflow up and running in n8n.\n\nImport Workflow Template:\n\nDownload the template's JSON file and import it into your n8n instance via \"File\" > \"Import from JSON.\"\n\nConfigure Credentials:\n\nGoogle Gemini: Set up and apply your API key credentials to all \"Google Gemini Chat Model\" nodes.\nAI Image Generation (e.g., 
OpenAI): Create and apply API key credentials for your chosen image generation service to the \"Generate an Image\" node.\nLinkedIn: Set up and apply OAuth credentials to the \"Create a post\" node for your LinkedIn account.\n\nCustomize Schedule & AI Prompts:\n\nSchedule Trigger: Double-click \"Schedule Trigger 1\" to set how often your workflow runs (e.g., daily, weekly).\nAI Prompts: Review and edit the prompts within the \"Content Topic Generator,\" \"Content Creator,\" and \"Hashtag Generator / SEO\" nodes to guide the AI for your desired content style and topics.\n\nTest & Activate:\n\nTest Run: Click \"Execute Workflow\" to perform a test run and verify all steps are working as expected.\nActivate: Once satisfied, toggle the workflow \"Active\" switch to enable automated posting on your defined schedule.\n\n\n\nTo use this workflow template, you will need:\n\nn8n Instance: A running n8n instance (cloud or self-hosted) to import and execute the workflow.\nGoogle Gemini Account: For content topic generation, content creation, and hashtag generation (requires Google Gemini API Key) from Google AI Studios.\nAI Image Generation Service Account: For creating images (e.g., OpenAI DALL-E API Key or similar service that the \"Generate an Image\" node uses).\nLinkedIn Account: For publishing the generated posts (requires LinkedIn OAuth Credentials for n8n connection).\n\nWebsite: https://www.intuz.com/services\nEmail: getstarted@intuz.com\nLinkedIn: https://www.linkedin.com/company/intuz\nGet Started: https://n8n.partnerlinks.io/intuz\n\nClick here- Get Started",
"isPaid": false
},
{
"templateId": "7372",
"templateName": "template_7372",
"templateDescription": "Automated trend monitoring for content strategy Who's it forContent creators, marketers, and social media managers who want to stay ahead of emerging trends...",
"templateUrl": "https://n8n.io/workflows/7372",
"jsonFileName": "template_7372.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_7372.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/091785d32a7bc2371856772688df6206/raw/1555a8cc6d6a5ffe91eba5fd2d8eb3329d5fb690/template_7372.json",
"screenshotURL": "https://i.ibb.co/Jj4MNhgr/535addaccb79.png",
"workflowUpdated": true,
"gistId": "091785d32a7bc2371856772688df6206",
"templateDescriptionFull": "Content creators, marketers, and social media managers who want to stay ahead of emerging trends and generate relevant content ideas based on data-driven insights.\n\nThis workflow automatically identifies trending topics related to your industry, collects recent news articles about these trends, and generates content suggestions. It transforms raw trend data into actionable editorial opportunities by analyzing search volume growth and current news coverage.\n\nThe workflow follows a three-step automation process:\nTrend Analysis: Examines searches related to your topics and identifies those with the strongest recent growth\nArticle Collection: Searches Google News for current articles about emerging trends and scrapes their full content\nContent Generation: Creates personalized content suggestions based on collected articles and trend data\nThe system automatically excludes geo-localized searches to provide a global perspective on trends, though this can be customized.\n\nSerpAPI account (for trend and news data)\nFirecrawl API key (for scraping article content from Google News results)\nGoogle Sheets access\nAI model API key (for content analysis and recommendations - you can use any LLM provider you prefer)\n\nDuplicate this Google Sheets template\nRename your copy and ensure it's accessible\n\nBefore running the workflow, set up the following credentials in n8n:\nSerpAPI: For trend analysis and Google News search\nFirecrawl API: For scraping article content\nAI Model API: For content analysis and recommendations (Anthropic Claude, OpenAI GPT, or any other LLM provider)\nGoogle Sheets OAuth2: For accessing and updating your tracking spreadsheet\n\nIn your Google Sheet \"Query\" tab:\n\nQuery column: Enter the main topics/keywords you want to monitor for trending queries (e.g., \"digital marketing\", \"artificial intelligence\", \"sustainable fashion\")\nQuery to avoid column: Optionally add specific queries you want to exclude from 
trend analysis (e.g., brand names, irrelevant terms, or overly specific searches that don't match your content strategy)\n\nThis step is crucial as these queries will be the foundation for discovering related trending topics.\n\nIn the \"Get Query\" node, paste your duplicated Google Sheets URL in the \"Document\" field\nEnsure your Google Sheet contains your monitoring topics in the Query column\n\nThe workflow is currently configured for French content and France location. You can modify these settings in the SerpAPI nodes:\nLanguage (hl): Change from \"fr\" to your preferred language code\nGeographic location (geo/gl): Change from \"FR\" to your target country code\nDate range: Currently set to \"today 1-m\" (last month) but can be adjusted\n\nThe \"Sorting Queries\" node excludes geo-localized queries by default. You can modify the AI agent's instructions to include location-specific queries or change filtering criteria based on your requirements. The system will also automatically exclude any queries you've listed in the \"Query to avoid\" column.\n\nThe workflow includes an automated scheduler that runs monthly (1st day of each month at 8 AM). 
You can modify the cron expression 0 8 1 * * in the Schedule Trigger node to change:\nFrequency (daily, weekly, monthly)\nTime of execution\nDay of the month\n\nChange trend count: The workflow processes up to 10 related queries per topic but filters them through AI to select the most relevant non-geolocalized ones\nAdjust article collection: Currently collects exactly 3 news articles per query for analysis\nContent style: Customize the AI prompts in content generation nodes to match your brand voice\nOutput format: Modify the Google Sheets structure to include additional data points\nAI model: Replace the Anthropic model with your preferred LLM provider\nScraping options: Configure Firecrawl settings to extract specific content elements from articles\n\nFor each monitored topic, the workflow generates a separate sheet named by month and topic (e.g., \"January Digital Marketing\") containing:\nData structure (four columns):\nQuery: The trending search term ranked by growth\nÉvolution: Growth percentage over the last month\nNews: Links to 3 relevant news articles\nIdée: AI-generated content suggestions based on comprehensive article analysis\nThe workflow provides monthly retrospective analysis, helping you identify emerging topics before competitors and optimize your content calendar with high-potential subjects.\n\nProcesses up to 10 related queries per topic with AI filtering\nCollects exactly 3 news articles per query\nResults are automatically organized in monthly sheets\nRequires stable internet connection for API calls",
"isPaid": false
}
]
[
{
"templateId": "3161",
"templateName": "template_3161",
"templateDescription": "AI-Powered Social Media Content Automation \ud83e\uddd1\u200d\ud83d\udcbb Who is this for?This workflow is perfect for social media managers, content creators, and digital marketers...",
"templateUrl": "https://n8n.io/workflows/3161",
"jsonFileName": "template_3161.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_3161.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/9eb813e446a312179afb3cad183e2932/raw/3c7aa149191919da4ac2c9f66152e70e5f111d6e/template_3161.json",
"screenshotURL": "https://i.ibb.co/4RMZr704/fc94b92eace1.png",
"workflowUpdated": true,
"gistId": "9eb813e446a312179afb3cad183e2932",
"templateDescriptionFull": "This workflow is perfect for social media managers, content creators, and digital marketers who want to save time by automating social media post generation and publishing across platforms.\n\nManually generating and scheduling social media content is time-consuming and repetitive. This workflow automates content creation and publishing, allowing you to:\n\nStreamline content generation using AI\nEnsure consistent posting across social media platforms\nTrack published posts in Google Sheets\n\nFetches content ideas from a Google Sheet.\nGenerates social media posts using OpenAI's GPT-4.\nChecks the target platform (e.g., Twitter/X, LinkedIn).\nPosts the content to the chosen social media platform.\nUpdates the Google Sheet with the generated post and timestamp.\n\nConnect Google Sheets: Ensure you have a Google Sheet with content ideas (columns: Idea, Status, Generated Post).\nSet up OpenAI API Key: Provide your OpenAI API key for GPT-4.\nConfigure Social Media Accounts: Link your Twitter/X or other social media accounts using n8n's built-in nodes.\nTest the Workflow: Run the workflow to verify automation.\nSchedule Automation: Set a recurring trigger (e.g., daily) to automate posting.\n\nAdjust prompt inputs in the OpenAI node to tailor the tone and style.\nAdd more platforms (e.g., Instagram, Facebook) by duplicating the social media node.\nInclude analytics tracking for engagement insights.\n\nAutomatically generate and share daily motivational quotes.\nPost product updates and announcements.\nShare curated industry news and insights.\n\nThis workflow saves time and keeps your social media presence active and engaging effortlessly. \ud83d\ude80"
},
{
"templateId": "4652",
"templateName": "GEO/SEO content engine",
"templateDescription": "\ud83e\udde0 Who is this for?Marketing teams, content creators, solopreneurs, and agencies who want to generate emotionally-resonant, SEO-optimized content tailored...",
"templateUrl": "https://n8n.io/workflows/4652",
"jsonFileName": "GEO_SEO_content_engine.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/GEO_SEO_content_engine.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/a820e5e6055ef3a91a5b06fd9937ae4b/raw/3c95fe689d84789952cac176f796b0b51b7f0a27/GEO_SEO_content_engine.json",
"screenshotURL": "https://i.ibb.co/WWvpWMFz/15d9a56231a8.png",
"workflowUpdated": true,
"gistId": "a820e5e6055ef3a91a5b06fd9937ae4b",
"templateDescriptionFull": "\ud83e\udde0 Who is this for?\nMarketing teams, content creators, solopreneurs, and agencies who want to generate emotionally-resonant, SEO-optimized content tailored to audience psychology and buyer journey stages \u2014 and get picked up by AI discovery engines like ChatGPT, Gemini, and Perplexity.\n\nHow it works:\n\u2705 Decodes why people buy (using buyer psychology)\n\u2705 Creates SEO + emotionally resonant content for 4 formats:\n\u2192 Blog Posts, Newsletters, Landing Pages, Social Media\n\u2705 Structures the content to be picked up by ChatGPT, Gemini, Perplexity & Google\n\u2705 Automatically routes it to Google Sheets, Gmail, or even WordPress\n\nThis isn\u2019t just about writing better content \u2014 it\u2019s about getting seen by the tools that shape the internet.\n\nHow long does it take to set-up: 30 Mins"
},
{
"templateId": "4226",
"templateName": "template_4226",
"templateDescription": "Automated Blog Post Review and Multi-Platform Publishing Workflow with RSS Feeds DescriptionThis workflow automates the process of generating, reviewing,...",
"templateUrl": "https://n8n.io/workflows/4226",
"jsonFileName": "template_4226.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_4226.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/42a3f83e1ebc0c15afb2ee6c8cd30978/raw/95dee7f4a8ea85d2cc58420dc6edccd4546b0f30/template_4226.json",
"screenshotURL": "https://i.ibb.co/JR65053s/a1d934f6b4d3.png",
"workflowUpdated": false,
"gistId": "42a3f83e1ebc0c15afb2ee6c8cd30978",
"templateDescriptionFull": "This workflow automates the process of generating, reviewing, and publishing blog posts across multiple platforms, now enhanced with support for RSS Feeds as a content source.\nIt streamlines the management of blog posts by fetching content from RSS Feeds, formatting, storing, reviewing, templating, and publishing to platforms like LinkedIn and WordPress.\nThe workflow is split into three key flows:\n\nInitial Flow: Fetches content from RSS Feeds, prepares and stores blog post data, sends a review email with approval/rejection links.\nApproval Flow: Handles review actions via a webhook to update the status in Google Sheets.\nStatus Update Flow: Monitors status changes and publishes approved posts.\n\nTarget Audience\n\nContent creators, bloggers, and digital marketers.\nTeams managing multi-platform content publishing.\nUsers familiar with n8n, Google Sheets, LinkedIn, and RSS Feeds.\n\nManually managing blog posts, especially when sourcing content from RSS Feeds, can be time-consuming and error-prone\nThis workflow addresses:\n\nContent Sourcing: Fetches blog posts from RSS Feeds for automated processing\nContent Formatting: Automatically formats and stores blog posts.\nReview Process: Simplifies approval with email notifications and webhook triggers.\nMulti-Platform Publishing: Publishes to LinkedIn, WordPress and optionally Medium) with delays to avoid rate limits\nStatus Tracking: Tracks approval and publishing status in Google Sheets.\n\nn8n Instance: Ensure you have an active n8n instance\nRSS Feed URL: Identify an RSS Feed URL (e.g., a blog\u2019s feed like https://example.com/feed)\nGoogle Sheets: Create a spreadsheet with columns: Title, Blogpost, Publication Date, Keywords, Status, Published, Featured Image, articleUrl, Rendered Blog.\n\nSheet Name: Posts Initial\nAdd a dropdown for Status: Pending, Approved, Rejected.\nSheet Name: Posts Initial\nAdd a dropdown for Status: Pending, Approved, Rejected.\nGmail Account: For 
sending review and notification emails.\nLinkedIn Account: For publishing posts (OAuth credentials needed).\nOptional: WordPress.com or Medium account for additional publishing.\n\nBelow is a detailed breakdown of each flow and node, including setup instructions.\n\nPurpose: Fetches blog posts from an RSS Feed, formats them, extracts images, stores data, and sends a review email.\n\nFetch from RSS Feed\n\nType: RSS Feed\nPurpose: Retrieves blog posts from an RSS Feed\nConfiguration:\n\nURL: https://example.com/feed (replace with your RSS Feed URL)\nLimit: 1 (or adjust based on your needs)\n\n\nSetup: Ensure the RSS Feed URL is valid and accessible; test the node to verify it fetches posts\nType: RSS Feed\nPurpose: Retrieves blog posts from an RSS Feed\nConfiguration:\n\nURL: https://example.com/feed (replace with your RSS Feed URL)\nLimit: 1 (or adjust based on your needs)\nURL: https://example.com/feed (replace with your RSS Feed URL)\nLimit: 1 (or adjust based on your needs)\nSetup: Ensure the RSS Feed URL is valid and accessible; test the node to verify it fetches posts\nSet Fields\n\nType: Set\nPurpose: Maps RSS Feed data to blog post fields\nSetup: Adjust field mappings based on your RSS Feed\u2019s structure\nType: Set\nPurpose: Maps RSS Feed data to blog post fields\nSetup: Adjust field mappings based on your RSS Feed\u2019s structure\nFormat Blog Post for Storage\n\nType: Code\nPurpose: Cleans up the blog post content.\nType: Code\nPurpose: Cleans up the blog post content.\nExtract Featured Image\n\nType: Code\nPurpose: Extracts or generates a featured image URL.\nSetup: Ensure originalHtml contains image data; otherwise, it uses a placeholder.\nType: Code\nPurpose: Extracts or generates a featured image URL.\nSetup: Ensure originalHtml contains image data; otherwise, it uses a placeholder.\nStore Blog Posts Initial\n\nType: Google Sheets\nPurpose: Stores initial blog post data\nSetup: Ensure Google Sheets credentials are set up and the spreadsheet has the 
required columns.\nType: Google Sheets\nPurpose: Stores initial blog post data\nSetup: Ensure Google Sheets credentials are set up and the spreadsheet has the required columns.\nSet Fields for Email\n\nType: Set\nPurpose: Prepares fields for the review email.\nSetup: Replace https://your-n8n-instance with your n8n instance URL.\nType: Set\nPurpose: Prepares fields for the review email.\nSetup: Replace https://your-n8n-instance with your n8n instance URL.\nPrepare Email HTML\n\nType: Code\nPurpose: Generates HTML email content with conditional image display\nSetup: No additional configuration needed\nType: Code\nPurpose: Generates HTML email content with conditional image display\nSetup: No additional configuration needed\nNotify for Review (Gmail)\n\nType: Gmail\nPurpose: Sends a review email with approval/rejection links\nType: Gmail\nPurpose: Sends a review email with approval/rejection links\n\nPurpose: Updates the blog post status based on approval/rejection\n\nWebhook Trigger\n\nType: Webhook\nPurpose: Triggers on approval/rejection link clicks\nConfiguration:\n\nHTTP Method: GET\nPath: approve-post\nResponse Code: 200\nResponse Data: {\"message\": \"Status updated\"}\n\n\nSetup: Ensure the webhook URL matches the one in Set Fields for Email\nType: Webhook\nPurpose: Triggers on approval/rejection link clicks\nConfiguration:\n\nHTTP Method: GET\nPath: approve-post\nResponse Code: 200\nResponse Data: {\"message\": \"Status updated\"}\nHTTP Method: GET\nPath: approve-post\nResponse Code: 200\nResponse Data: {\"message\": \"Status updated\"}\nSetup: Ensure the webhook URL matches the one in Set Fields for Email\nFind Row to Update\n\nType: Google Sheets\nPurpose: Retrieves all rows to find the matching blog post\nType: Google Sheets\nPurpose: Retrieves all rows to find the matching blog post\nFilter Row by Title\n\n\nType: Code\n\n\nPurpose: Filters the row matching the blog post title\n\n\nSetup: No additional configuration needed\nType: Code\nPurpose: Filters 
the row matching the blog post title\nSetup: No additional configuration needed\nUpdate Status on Approval\n\nType: Google Sheets\nPurpose: Updates the status to Approved or Rejected\nType: Google Sheets\nPurpose: Updates the status to Approved or Rejected\n\nPurpose: Monitors status changes and publishes approved posts\n\nGoogle Sheets Trigger (Fetch Row)\n\nType: Google Sheets Trigger\nPurpose: Triggers when a row\u2019s status is updated\nConfiguration:\n\nEvent: Update\nSheet Name: Posts Initial\nOutput Fields: title, status, published, featuredImage, articleUrl\n\n\nSetup: Ensure Google Sheets credentials are set up\nType: Google Sheets Trigger\nPurpose: Triggers when a row\u2019s status is updated\nConfiguration:\n\nEvent: Update\nSheet Name: Posts Initial\nOutput Fields: title, status, published, featuredImage, articleUrl\nEvent: Update\nSheet Name: Posts Initial\nOutput Fields: title, status, published, featuredImage, articleUrl\nSetup: Ensure Google Sheets credentials are set up\nRouter (Check Status)\n\nType: Router\nPurpose: Routes based on status and published state\nConfiguration:\n\nRoute 1: Approved and Not Published\n\nCondition: status equals Approved AND published equals NO\n\n\nRoute 2: Rejected\n\nCondition: status equals Rejected\n\n\nRoute 3: Pending\n\nCondition: status equals Pending\n\n\n\n\nSetup: No additional configuration needed\nType: Router\nPurpose: Routes based on status and published state\nConfiguration:\n\nRoute 1: Approved and Not Published\n\nCondition: status equals Approved AND published equals NO\n\n\nRoute 2: Rejected\n\nCondition: status equals Rejected\n\n\nRoute 3: Pending\n\nCondition: status equals Pending\nRoute 1: Approved and Not Published\n\nCondition: status equals Approved AND published equals NO\nCondition: status equals Approved AND published equals NO\nRoute 2: Rejected\n\nCondition: status equals Rejected\nCondition: status equals Rejected\nRoute 3: Pending\n\nCondition: status equals Pending\nCondition: 
status equals Pending\nSetup: No additional configuration needed\nApply Blog Template\nStore Blog Posts Final\n\nType: Google Sheets\nPurpose: Stores the final HTML content\nConfiguration:\n\nOperation: Update Row\n\n\nSetup: Ensure the Rendered Blog column exists\nType: Google Sheets\nPurpose: Stores the final HTML content\nConfiguration:\n\nOperation: Update Row\nOperation: Update Row\nSetup: Ensure the Rendered Blog column exists\nLoop Over Blog Posts\n\nType: Split in Batches\nPurpose: Processes each blog post individually\nConfiguration: Default settings\nSetup: No additional configuration needed\nType: Split in Batches\nPurpose: Processes each blog post individually\nConfiguration: Default settings\nSetup: No additional configuration needed\nDelay Between Posts\n\nType: Wait\nPurpose: Adds a delay to avoid rate limits\nConfiguration:\n\nWait Type: Delay\nAmount: 1 second\n\n\nSetup: Adjust delay as needed for LinkedIn rate limits\nType: Wait\nPurpose: Adds a delay to avoid rate limits\nConfiguration:\n\nWait Type: Delay\nAmount: 1 second\nWait Type: Delay\nAmount: 1 second\nSetup: Adjust delay as needed for LinkedIn rate limits\nPublish to LinkedIn\n\nType: LinkedIn\nPurpose: Publishes the blog post to LinkedIn\nConfiguration:\n\nOperation: Share Post\nAuthor: urn:li:person:YOUR_PERSONAL_URN\n\n\nSetup: Set up LinkedIn OAuth credentials and replace YOUR_PERSONAL_URN with your LinkedIn URN\nType: LinkedIn\nPurpose: Publishes the blog post to LinkedIn\nConfiguration:\n\nOperation: Share Post\nAuthor: urn:li:person:YOUR_PERSONAL_URN\nOperation: Share Post\nAuthor: urn:li:person:YOUR_PERSONAL_URN\nSetup: Set up LinkedIn OAuth credentials and replace YOUR_PERSONAL_URN with your LinkedIn URN\nUpdate Published State\n\n\nType: Google Sheets\n\n\nPurpose: Updates the published status\n\n\nConfiguration:\n\nOperation: Update Row\n\n\n\nSetup: Ensure the Published column exists\nType: Google Sheets\nPurpose: Updates the published status\nConfiguration:\n\nOperation: 
Update Row\nOperation: Update Row\nSetup: Ensure the Published column exists\nNotify Team\n\n\nType: Gmail\n\n\nPurpose: Notifies the team of successful publishing\n\n\nConfiguration:\nThe blog post \"{{ $json.title }}\" has been successfully published\n\n\nSetup: Set up Gmail credentials; replace [Link] with the LinkedIn URL if captured\nType: Gmail\nPurpose: Notifies the team of successful publishing\nConfiguration:\nThe blog post \"{{ $json.title }}\" has been successfully published\nSetup: Set up Gmail credentials; replace [Link] with the LinkedIn URL if captured\nNotify Rejection (Gmail) (Route 2)\n\n\nType: Gmail\n\n\nPurpose: Notifies on rejection\nThe blog post \"{{ $json.title }}\" has been rejected\nSuggestions: Rewrite with more engaging content, adjust keywords, or verify facts\nPlease update the status in Google Sheets if you wish to revise and resubmit\n\n\nSetup: Set up Gmail credentials\nType: Gmail\nPurpose: Notifies on rejection\nThe blog post \"{{ $json.title }}\" has been rejected\nSuggestions: Rewrite with more engaging content, adjust keywords, or verify facts\nPlease update the status in Google Sheets if you wish to revise and resubmit\nSetup: Set up Gmail credentials\nWait for Status Update (Route 3)\n\nType: Wait\nPurpose: Delays for status recheck\nConfiguration:\n\nWait Type: Delay\nDuration: 24h\n\n\nSetup: Adjust delay as needed\nType: Wait\nPurpose: Delays for status recheck\nConfiguration:\n\nWait Type: Delay\nDuration: 24h\nWait Type: Delay\nDuration: 24h\nSetup: Adjust delay as needed\n\nThis workflow streamlines blog post management with RSS Feeds, making it ideal\nfor busy content creators and teams.\nCustomize it by adding more platforms adjusting delays, or enhancing notifications.\nShare your feedback in the n8n community to help others benefit from this automation."
},
{
"templateId": "2557",
"templateName": "Hacker News to Video Template - AlexK1919",
"templateDescription": "Hacker News to Video Content OverviewThis workflow converts trending articles from Hacker News into engaging video content. It integrates AI-based tools to...",
"templateUrl": "https://n8n.io/workflows/2557",
"jsonFileName": "Hacker_News_to_Video_Template_-_AlexK1919.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Hacker_News_to_Video_Template_-_AlexK1919.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/22e61ed4c166ae3805a4d1713484180a/raw/61c199cc54dd4fd46bdd6714bf87959ae6d4340d/Hacker_News_to_Video_Template_-_AlexK1919.json",
"screenshotURL": "https://i.ibb.co/twqggGFf/9f0d3b11a0a2.png",
"workflowUpdated": true,
"gistId": "22e61ed4c166ae3805a4d1713484180a",
"templateDescriptionFull": "This workflow converts trending articles from Hacker News into engaging video content. It integrates AI-based tools to analyze, summarize, and generate multimedia content, making it ideal for content creators, educators, and marketers.\n\nArticle Retrieval:\n\nPulls trending articles from Hacker News.\nLimits the number of articles to process (configurable).\nPulls trending articles from Hacker News.\nLimits the number of articles to process (configurable).\nContent Analysis:\n\nUses OpenAI's GPT model to:\n\nSummarize articles.\nAssess their relevance to specific topics like automation or AI.\nExtract key image URLs.\nUses OpenAI's GPT model to:\n\nSummarize articles.\nAssess their relevance to specific topics like automation or AI.\nExtract key image URLs.\nSummarize articles.\nAssess their relevance to specific topics like automation or AI.\nExtract key image URLs.\nImage and Video Generation:\n\nLeonardo.ai: Creates stunning AI-generated images based on extracted prompts.\nRunwayML: Converts images into high-quality videos.\nLeonardo.ai: Creates stunning AI-generated images based on extracted prompts.\nRunwayML: Converts images into high-quality videos.\nStructured Content Creation:\n\nParses content into structured formats for easy reuse.\nGenerates newsletter-friendly blurbs and social media-ready captions.\nParses content into structured formats for easy reuse.\nGenerates newsletter-friendly blurbs and social media-ready captions.\nCloud Integration:\n\nUploads generated assets to:\n\nDropbox\nGoogle Drive\nMicrosoft OneDrive\nMinIO\nUploads generated assets to:\n\nDropbox\nGoogle Drive\nMicrosoft OneDrive\nMinIO\nDropbox\nGoogle Drive\nMicrosoft OneDrive\nMinIO\nSocial Media Posting (Optional):\n\nSupports posting to YouTube, X (Twitter), LinkedIn, and Instagram.\nSupports posting to YouTube, X (Twitter), LinkedIn, and Instagram.\n\nInitiated manually via the \"Test Workflow\" button.\n\nRetrieves articles from Hacker 
News.\nLimits the results to avoid processing overload.\n\nEvaluates if articles are related to AI/Automation using OpenAI's language model.\n\nGenerates:\n\nAI-driven image prompts via Leonardo.ai.\nVideos using RunwayML.\nAI-driven image prompts via Leonardo.ai.\nVideos using RunwayML.\n\nSaves the output to cloud storage services or uploads directly to social media platforms.\n\nAPI Keys:\n\nHacker News\nOpenAI\nLeonardo.ai\nRunwayML\nCreatomate\nHacker News\nOpenAI\nLeonardo.ai\nRunwayML\nCreatomate\nn8n Installation:\nEnsure n8n is installed and configured locally or on a server.\nCredentials:\nSet up credentials in n8n for all external services used in the workflow.\n\nReplace Hacker News with any other data source node if needed.\nConfigure the \"Article Analysis\" node for different topics.\nAdjust the cloud storage services or add custom storage options.\n\nImport this workflow into your n8n instance.\nConfigure your API credentials.\nTrigger the workflow manually or schedule it as needed.\nCheck the outputs in your preferred cloud storage or social media platform.\n\nExtend this workflow further by automating social media posting or newsletter integration.\nFor any questions, refer to the official documentation or reach out to the creator.\n\nThis workflow was built by AlexK1919, an AI-native workflow automation architect. Check out the overview video for a quick demo.\n\nLeonardo.ai\nRunwayML\nCreatomate\nHacker News API\nOpenAI GPT\n\nFeel free to adapt and extend this workflow to meet your specific needs! \ud83c\udf89"
},
{
"templateId": "4022",
"templateName": "BuzzBlast",
"templateDescription": "Amplify your social media presence with BuzzBlast, an n8n workflow designed to make your content go viral across X, Discord, and LinkedIn. By sending a...",
"templateUrl": "https://n8n.io/workflows/4022",
"jsonFileName": "BuzzBlast.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/BuzzBlast.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/a423f4a6b2b9c06c2b51d91a14b8fad6/raw/ac0ca5fd62d883aeb6f6b8fe042bb938e8ba2813/BuzzBlast.json",
"screenshotURL": "https://i.ibb.co/RG5wkTrZ/815c89d979d7.png",
"workflowUpdated": true,
"gistId": "a423f4a6b2b9c06c2b51d91a14b8fad6",
    "templateDescriptionFull": "Amplify your social media presence with BuzzBlast, an n8n workflow designed to make your content go viral across X, Discord, and LinkedIn. By sending a single chat message, BuzzBlast leverages OpenRouter's AI to optimize your input for each platform\u2019s unique audience\u2014crafting punchy tweets for X, engaging messages for Discord, and professional posts for LinkedIn. With smart language detection, it ensures the output matches your input\u2019s language for authentic engagement.\n\n\ud83d\ude80 Multi-Platform Posting: Shares optimized content to X, Discord, and LinkedIn simultaneously.\n\ud83e\udde0 AI Optimization: Uses OpenRouter\u2019s AI to tailor content for virality on each platform.\n\ud83c\udf10 Language Detection: Matches output to your input language for seamless engagement.\n\ud83d\udd04 Smart Routing: Automatically directs content to the right platform using a switch node.\n\ud83d\udcf1 Chat Trigger: Initiates posts via a simple chat message.\n\u26a1 Zero Hassle: No manual reformatting\u2014BuzzBlast handles it all.\n\nSocial media managers looking to streamline cross-platform posting.\nContent creators aiming to boost engagement with minimal effort.\nBusinesses seeking to maximize reach across diverse audiences.\n\nn8n instance: A running n8n instance (cloud or self-hosted).\nCredentials:\n\nX account with OAuth2 API access.\nDiscord Webhook API setup for your server.\nLinkedIn account with OAuth2 API access.\nOpenRouter account for AI language model access.\nChat Trigger Setup: A configured chat platform (e.g., Slack, Telegram) to send input messages to the workflow.\n\nImport the Workflow:\n\nCopy the provided workflow JSON and import it into your n8n instance via the \"Import Workflow\" option in the n8n editor.\nConfigure Credentials:\n\nIn the Post to X node, set up OAuth2 credentials for your X account.\nIn the Post to Discord node, configure a Discord Webhook for your server.\nIn the Post to LinkedIn node, add OAuth2 credentials for your LinkedIn account.\nIn the OpenRouter AI Model node, provide API credentials for your OpenRouter account.\nSet Up Chat Trigger:\n\nIn the Chat Input Trigger node, configure your preferred chat platform (e.g., Slack, Telegram) to send trigger messages.\nEnsure the webhook is active and correctly linked to your chat platform.\nTest the Workflow:\n\nSend a test message via your chat platform (e.g., \"Announcing our new product launch!\").\nVerify that the AI optimizes the content and posts it to X, Discord, and LinkedIn as expected.\nActivate the Workflow:\n\nOnce tested, toggle the workflow to \"Active\" in n8n to enable automatic execution when chat messages are received.\n\nChange Chat Trigger: Swap the chat trigger for your preferred platform, such as Telegram or Discord.\nModify AI Prompt: Adjust the prompt in the AI Content Optimizer node to change the tone or style (e.g., more professional for LinkedIn or conversational for Discord).\nAdd New Platforms: Extend the Route to Platforms node by adding conditions for additional platforms (e.g., Instagram or Facebook) and corresponding posting nodes.\nChange AI Model: In the OpenRouter AI Model node, select a different OpenRouter model to optimize content quality or manage costs.\nEnhance Output Format: Update the JSON schema in the Parse AI Output node to include additional fields like hashtags, emojis, or links for specific platforms.\nAdd Error Handling: Include an error-handling node after the Route to Platforms node to log failed posts or retry them automatically.\n\nBuzzBlast saves time, maximizes reach, and lets AI craft platform-perfect posts that resonate with your audience. Whether you're an influencer, marketer, or business, this workflow makes cross-platform posting effortless. Ready to make waves online? Grab BuzzBlast and start buzzing!\n\nMade by: khmuhtadin\nNeed a custom workflow? Contact me on LinkedIn or the web."
},
{
"templateId": "3057",
"templateName": "template_3057",
"templateDescription": "Description: Create Social Media Content from Telegram with AI This n8n workflow empowers you to effortlessly generate social media content and captivating...",
"templateUrl": "https://n8n.io/workflows/3057",
"jsonFileName": "template_3057.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_3057.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/8c2ac45f9d6cf029abf14a7a5cb298d8/raw/b9c6dca145f3f891977c5d8aad29a80b6a87dc27/template_3057.json",
"screenshotURL": "https://i.ibb.co/RG5wkTrZ/815c89d979d7.png",
"workflowUpdated": true,
"gistId": "8c2ac45f9d6cf029abf14a7a5cb298d8",
"templateDescriptionFull": "Description:\n\nThis n8n workflow empowers you to effortlessly generate social media content and captivating image prompts, all powered by AI. Simply send a topic request through Telegram (as a voice or text message), and watch as the workflow conducts research, crafts engaging social media posts, and creates detailed image prompts ready for use with your preferred AI art generation tool.\n\nThis workflow streamlines the content creation process by automating research, social media content generation, and image prompt creation, triggered by a simple Telegram message.\n\nSocial Media Managers: Quickly generate engaging content and image ideas for various platforms.\nContent Creators: Overcome writer's block and discover fresh content ideas with AI assistance.\nMarketing Teams: Boost productivity by automating social media content research and drafting.\nAnyone looking to leverage AI for efficient and creative social media content creation.\n\nEffortless Content and Image Prompt Generation: Automate the creation of social media posts and detailed image prompts.\nAI-Powered Creativity: Leverage the power of LLMs to generate original content ideas and captivating image prompts.\nIncreased Efficiency: Save time and resources by automating the research and content creation process.\nVoice-to-Content: Use voice messages to request content, making content creation even more accessible.\nEnhanced Engagement: Create high-quality, attention-grabbing content that resonates with your audience.\n\nReceive Request: The workflow listens for incoming voice or text messages on Telegram containing your content request.\nProcess Voice (if necessary): If the message is a voice message, it's transcribed into text using OpenAI's Whisper API.\nAI Takes Over: The AI agent, powered by an OpenAI Chat Model and SerpAPI, conducts online research based on your request.\nContent and Image Prompt Generation: The AI agent generates engaging social media content and a 
detailed image prompt based on the research.\nImage Generation (Optional): You can use the generated image prompt with your preferred AI art generation tool (e.g., DALL-E, Stable Diffusion) to create a visual.\nOutput: The workflow provides you with the social media content and the detailed image prompt, ready for you to use or refine.\n\nTelegram Trigger\nSwitch\nTelegram (for fetching voice messages)\nOpenAI (Whisper API for voice-to-text)\nSet (for preparing variables)\nAI Agent (with OpenAI Chat Model and SerpAPI tool)\nHTTP Request (for optional image generation)\nExtract from File (for optional image processing)\nSet (for final output)\n\nActive n8n instance\nTelegram account with a bot\nOpenAI API key\nSerpAPI account\nHugging Face API key (if you want to generate images within the workflow)\n\nImport the workflow JSON into your n8n instance.\nConfigure the Telegram Trigger node with your Telegram bot token.\nSet up the OpenAI and SerpAPI credentials in the respective nodes.\nIf you want to generate images directly within the workflow, configure the HTTP Request node with your Hugging Face API key.\nTest the workflow by sending a voice or text message to your Telegram bot with a topic request.\n\nThis workflow combines the convenience of Telegram with the power of AI to provide a seamless content creation experience. Start generating engaging social media content today!"
},
{
"templateId": "2903",
"templateName": "Youtube Searcher",
"templateDescription": "Video explanation This n8n workflow helps you identify trending videos within your niche by detecting outlier videos that significantly outperform a...",
"templateUrl": "https://n8n.io/workflows/2903",
"jsonFileName": "Youtube_Searcher.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Youtube_Searcher.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/c1b12bc705a546e85389a5007d103300/raw/eace0f1c15a7e531c4c636d63ac8209d6b43be41/Youtube_Searcher.json",
"screenshotURL": "https://i.ibb.co/Cp2z25LF/6db7e9681cd0.png",
"workflowUpdated": true,
"gistId": "c1b12bc705a546e85389a5007d103300",
"templateDescriptionFull": "Video explanation\n\nThis n8n workflow helps you identify trending videos within your niche by detecting outlier videos that significantly outperform a channel's average views. It automates the process of monitoring competitor channels, saving time and streamlining content research.\n\nAutomated Competitor Video Tracking\nMonitors videos from specified competitor channels, fetching data directly from the YouTube API.\nOutlier Detection Based on Channel Averages\nCompares each video\u2019s performance against the channel\u2019s historical average to identify significant spikes in viewership.\nHistorical Video Data Management\nStores video statistics in a PostgreSQL database, allowing the workflow to only fetch new videos and optimize API usage.\nShort Video Filtering\nAutomatically removes short videos based on duration thresholds.\nFlexible Video Retrieval\nFetches up to 3 months of historical data on the first run and only new videos on subsequent runs.\nPostgreSQL Database Integration\nIncludes SQL queries for database setup, video insertion, and performance analysis.\nConfigurable Outlier Threshold\nFocuses on videos published within the last two weeks with view counts at least twice the channel's average.\nData Output for Analysis\nOutputs best-performing videos along with their engagement metrics, making it easier to identify trending topics.\n\nn8n installed on your machine or server\nA valid YouTube Data API key\nAccess to a PostgreSQL database\n\nThis workflow is intended for educational and research purposes, helping content creators gain insights into what topics resonate with audiences without manual daily monitoring."
},
{
"templateId": "2981",
"templateName": "\u270d\ufe0f\ud83c\udf04 Your First Wordpress Content Creator - Quick Start",
"templateDescription": "\u270d\ufe0f\ud83c\udf04 WordPress + AI Content Creator This workflow automates the creation and publishing of multi-reading-level content for WordPress blogs. It leverages AI...",
"templateUrl": "https://n8n.io/workflows/2981",
"jsonFileName": "_Your_First_Wordpress_Content_Creator_-_Quick_Start.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/_Your_First_Wordpress_Content_Creator_-_Quick_Start.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/4c5a33a39041ecefb44b64f5fbf4c0f9/raw/54756a0c24807140349ff24285b3422868412dfe/_Your_First_Wordpress_Content_Creator_-_Quick_Start.json",
"screenshotURL": "https://i.ibb.co/yBSLm408/d3c2eaf0b670.png",
"workflowUpdated": true,
"gistId": "4c5a33a39041ecefb44b64f5fbf4c0f9",
"templateDescriptionFull": "This workflow automates the creation and publishing of multi-reading-level content for WordPress blogs. It leverages AI to generate optimized articles, automatically creates featured images, and provides versions of the content at different reading levels (Grade 2, 5, and 9).\n\nStarts with a manual trigger and a user-defined blog topic\nUses AI to create a structured blog post with proper HTML formatting\nSeparates and validates the title and content components\nSaves a draft version to Google Drive for backup\n\nAutomatically rewrites the content for different reading levels:\n\nGrade 9: Sophisticated language with appropriate metaphors\nGrade 5: Simplified with light humor and age-appropriate examples\nGrade 2: Basic language with simple metaphors and child-friendly explanations\n\nCreates a draft post in WordPress with the Grade 9 version\nGenerates a relevant featured image using Pollinations.ai\nAutomatically uploads and sets the featured image\nSends success/error notifications via Telegram\n\nSet up WordPress API connection\nConfigure OpenAI API access\nSet up Google Drive integration\nAdd Telegram bot credentials for notifications\n\nAdjust reading level prompts as needed\nModify image generation settings\nSet WordPress post parameters\n\nRun a test with a sample topic\nVerify all reading level versions\nCheck WordPress draft creation\nConfirm notification system\n\nThis workflow is perfect for content creators who need to maintain a consistent blog presence while catering to different audience reading levels. It's especially useful for educational content, news sites, or any platform that needs to communicate complex topics to diverse audiences."
},
{
"templateId": "3822",
"templateName": "Search news using Perplexity AI and post to X (Twitter)",
"templateDescription": "Stay ahead of the curve and keep your followers informed\u2014automatically. This n8n workflow uses Perplexity AI to generate insightful answers to scheduled...",
"templateUrl": "https://n8n.io/workflows/3822",
"jsonFileName": "Search_news_using_Perplexity_AI_and_post_to_X_Twitter.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/Search_news_using_Perplexity_AI_and_post_to_X_Twitter.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/da8ac2a9a95b322061cc8cff5f23d62c/raw/31f8fe87520bbb27bdb036bf6dcd96b18f21b4fa/Search_news_using_Perplexity_AI_and_post_to_X_Twitter.json",
"screenshotURL": "https://i.ibb.co/DHtpcHMG/a5f9dd442054.png",
"workflowUpdated": true,
"gistId": "da8ac2a9a95b322061cc8cff5f23d62c",
"templateDescriptionFull": "Stay ahead of the curve and keep your followers informed\u2014automatically.\nThis n8n workflow uses Perplexity AI to generate insightful answers to scheduled queries, then auto-posts the responses directly to X (Twitter).\n\nScheduled Trigger \u2013 Runs at set times (daily, hourly, etc.).\nsearchQuery \u2013 Define what kind of trending or relevant insight you want (e.g. \u201clatest AI trends\u201d).\nset API Key \u2013 Securely insert your Perplexity API key.\nPerplexity API Call \u2013 Fetches a short, insightful response to your query.\nPost to X \u2013 Automatically publishes the result as a tweet.\n\nAn n8n account (self-hosted or cloud)\nA Perplexity API key\nA connected X (Twitter) account via n8n\u2019s credentials\n\nAdd this workflow into your n8n account.\nEdit the searchQuery node with a topic (e.g. \u201cWhat\u2019s new in ecommerce automation?\u201d).\nPaste your Perplexity API key into the set API key node.\nConnect your X (Twitter) account in the final node.\nAdjust the schedule timing to suit your content frequency.\n\n\ud83d\udcac Add a formatting step to shorten or hashtag the response.\n\ud83d\udcca Pull multiple trending questions and auto-schedule posts.\n\ud83d\udd01 Loop responses to queue a full week of content.\n\ud83c\udf10 Translate content before posting to reach a global audience.\n\nFeel free to contact us at 1 Node.\nGet instant access to a library of free resources we created."
},
{
"templateId": "4827",
"templateName": "template_4827",
"templateDescription": "Who is this for?This template is designed for internal support teams, product specialists, and knowledge managers in technology companies who want to...",
"templateUrl": "https://n8n.io/workflows/4827",
"jsonFileName": "template_4827.json",
"jsonFilePath": "/Users/shraeychikker/work/personal/n8n-mcp/n8n/templates/template_4827.json",
"jsonURL": "https://gist.githubusercontent.com/shraey96/a7459dac43335353c5dc793929ab0adb/raw/4542f5f6e9be4bf4d6559eccd7fecb008f7817db/template_4827.json",
"screenshotURL": "https://i.ibb.co/7938pnk/8670e3e1c08e.png",
"workflowUpdated": true,
"gistId": "a7459dac43335353c5dc793929ab0adb",
    "templateDescriptionFull": "This template is designed for internal support teams, product specialists, and knowledge managers in technology companies who want to automate ingestion of product documentation and enable AI-driven, retrieval-augmented question answering via WhatsApp.\n\nSupport agents often spend too much time manually searching through lengthy documentation, leading to inconsistent or delayed answers. This solution automates importing, chunking, and indexing product manuals, then uses retrieval-augmented generation (RAG) to answer user queries accurately and quickly with AI via WhatsApp messaging.\n\nManually triggered to import product documentation from Google Docs.\nAutomatically splits large documents into chunks for efficient searching.\nGenerates vector embeddings for each chunk using OpenAI embeddings.\nInserts the embedded chunks and metadata into a MongoDB Atlas vector store, enabling fast semantic search.\n\nListens for incoming WhatsApp user messages, supporting various types:\n\nText messages: Plain text queries from users.\nAudio messages: Voice notes transcribed into text for processing.\nImage messages: Photos or screenshots analyzed to provide contextual answers.\nDocument messages: PDFs, spreadsheets, or other files parsed for relevant content.\nConverts incoming queries to vector embeddings and performs similarity search on the MongoDB vector store.\nUses OpenAI\u2019s GPT-4o-mini model with retrieval-augmented generation to produce concise, context-aware answers.\nMaintains conversation context across multiple turns using a memory buffer node.\nRoutes different message types to appropriate processing nodes to maximize answer quality.\n\nAuthenticate Google Docs
and connect your Google Docs URL containing the product documentation you want to index.\nAuthenticate MongoDB Atlas and connect the collection where you want to store the vector embeddings. Create a search index on this collection to support vector similarity queries.\nEnsure the index name matches the one configured in n8n (data_index).\nSee the example MongoDB search index template below for reference.\n\nAuthenticate the WhatsApp node with your Meta account credentials to enable message receiving and sending.\nConnect the MongoDB collection containing embedded product documentation to the MongoDB Vector Search node used for similarity queries.\nSet up the system prompt in the Knowledge Base Agent node to reflect your company\u2019s tone, answering style, and any business rules, ensuring it references the connected MongoDB collection for context retrieval.\n\nBoth MongoDB nodes (in ingestion and chat workflows) are connected to the same collection with:\n\nAn embedding field storing vector data,\n\nRelevant metadata fields (e.g., document ID, source), and\n\nThe same vector index name configured (e.g., data_index).\n\n{\n\"mappings\": {\n\"dynamic\": false,\n\"fields\": {\n\"_id\": { \"type\": \"string\" },\n\"text\": { \"type\": \"string\" },\n\"embedding\": {\n\"type\": \"knnVector\",\n\"dimensions\": 1536,\n\"similarity\": \"cosine\"\n},\n\"source\": { \"type\": \"string\" },\n\"doc_id\": { \"type\": \"string\" }\n}\n}\n}"
  }
]