I Built a Fully Automated LinkedIn Posting Pipeline (Node.js + BullMQ + MongoDB)

Source: DEV Community
Most people think consistency on LinkedIn is about discipline. I treated it as a backend problem. So I built a system that:

- fetches content from multiple sources
- generates LinkedIn posts using AI
- schedules them across the week
- publishes automatically

No manual intervention.

High-Level Flow

The system runs in 3 stages:

1. Content Fetching (Sunday)
2. Slot Allocation + AI Generation (Monday)
3. Scheduled Publishing (Tue-Thu)
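As a rough sketch, the three recurring stages can be registered as BullMQ repeatable jobs driven by cron patterns. The queue name, job names, and cron times below are my illustrative assumptions, not necessarily the author's exact configuration:

```javascript
// Sketch: the three pipeline stages as BullMQ repeatable jobs.
// Job names and cron patterns are assumptions for illustration.
const STAGE_CRONS = {
  "fetch-content": "0 6 * * 0",    // Sunday 06:00 - pull from all sources
  "generate-posts": "0 6 * * 1",   // Monday 06:00 - allocate slots + AI generation
  "publish-posts": "0 9 * * 2-4",  // Tue-Thu 09:00 - publish scheduled posts
};

async function registerPipeline() {
  // bullmq is required lazily so the cron map above is usable standalone
  const { Queue } = require("bullmq");
  const pipeline = new Queue("linkedin-pipeline", {
    connection: { host: "localhost", port: 6379 }, // assumed local Redis
  });
  for (const [name, pattern] of Object.entries(STAGE_CRONS)) {
    await pipeline.add(name, {}, { repeat: { pattern } });
  }
  return pipeline;
}

module.exports = { STAGE_CRONS, registerPipeline };
```

A worker process would then dispatch on the job name to run the matching stage.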
1. Fetch Scheduler

Every Sunday, a scheduler pulls content from multiple platforms:

```javascript
const sources = [
  "devto",
  "github",
  "medium",
  "npm",
  "Hashnode",
  "nodeweekly",
  "reddit",
];
```

Each source is processed and stored in MongoDB.

Fetch Logic

```javascript
const rawItems = await FetcherService.fetchFromSource(source, keyword);

for (const item of rawItems) {
  await FetchedContent.updateOne(
    { url: item.url },
    {
      $set: {
        ...item,
        source,
      },
      $setOnInsert: {
        expiresAt: new Date(Date.now() + 7 * 24 * 60 * 60 * 1000),
      },
    },
    { upsert: true }
  );
}
```

Why this design?

- upsert prevents duplicates
- a TTL index on `expiresAt` auto-expires fetched items after 7 days
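The per-item update above can be factored into a small pure helper, which makes the dedupe-plus-TTL pattern easy to unit test. The helper name `buildUpsertOp` and its shape are my assumptions; the filter and update documents mirror the article's code:

```javascript
// Sketch: build the arguments for FetchedContent.updateOne(...) as a
// plain object. buildUpsertOp is a hypothetical helper, not from the article.
const WEEK_MS = 7 * 24 * 60 * 60 * 1000;

function buildUpsertOp(item, source, now = Date.now()) {
  return {
    filter: { url: item.url }, // dedupe on the canonical URL
    update: {
      $set: { ...item, source },          // refreshed on every fetch
      $setOnInsert: {                     // written only when first inserted,
        expiresAt: new Date(now + WEEK_MS), // so the TTL clock is not reset
      },
    },
    options: { upsert: true },
  };
}

module.exports = { buildUpsertOp, WEEK_MS };
```

Usage would be `await FetchedContent.updateOne(op.filter, op.update, op.options)`. Keeping `expiresAt` in `$setOnInsert` is the key detail: re-fetching the same URL updates its fields without extending its lifetime.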