I Automated My Daily Marketing Banners. For $1/Month.

Canarist ships daily AI market intelligence. Solo founder, no design staff. There's one problem: every signal needs a banner. Social posts without visuals are invisible. Manual creation is unsustainable. Static templates break with dynamic content.

So I built a pipeline. It runs every morning without input. It costs $1/month. It has shipped a banner every day since it launched.

The Problem

Canarist surfaces AI market signals before they hit mainstream feeds. The product logic demands daily output—if signals are stale, the product is broken. That same logic applies to distribution: a daily newsletter without daily visuals reads like a ghost town.

The content is fully dynamic: headline changes every day, severity level changes, source citations change, recommended actions change. No two banners carry the same information. That kills every template-based approach before it starts.

Three hard constraints: high frequency (daily), zero design staff (solo founder), fully variable content (AI-synthesized every run). The standard solutions don’t survive all three at once.

What I Tried First

Figma template + manual updates. I lasted four days. The content changes too much to fit any fixed layout. When the alert headline ran 45 characters, the card looked fine. At 85 characters, I was manually resizing text boxes.

n8n canvas component. n8n’s canvas node is powerful but doesn’t support variable-length text layout. The output is rigid. When the hook phrase changes length, the composition breaks silently—no error, just a broken banner in your feed.

Commercial image APIs (Bannerbear, Placid) exist but charge per render, and at daily frequency that compounds fast. I needed zero marginal cost per image: code that generates a pixel-perfect PNG from raw data and runs on serverless infrastructure.

The Pipeline

  • Scheduler: Hetzner cron kicks off the run
  • Signal Collection: arXiv · HN · GitHub Trending → ~20 SignalItems
  • LLM Synthesis: Claude Haiku, structured JSON → BannerData (15 fields)
  • PNG Rendering: Satori on a Vercel edge function → 2400×1254 PNG bytes
  • Supabase Storage: banners/daily/{date}.png → public URL + metadata
  • Telegram Review: owner approval loop, receives photo + caption

The split architecture—Hetzner for scheduling, Vercel edge for rendering—keeps the render stateless and horizontally scalable without running a persistent server.

Try It Live

The live demo renders real past signals from the pipeline. Switching topics or accent color shows the layout adapting while the brand stays locked. Here is the content of one rendered banner:

  • Header: canarist · LIVE SIGNAL
  • Topic: MULTI-AGENT AI INFLECTION
  • Hook: "LLM agents solving long-horizon tasks now / production readiness uncertain"
  • Badges: HIGH · RESEARCH
  • Alert title: Agentic Multi-LLM Systems Emerge as Solvable Problem
  • Alert body: Six papers on agentic coordination hit arXiv simultaneously. ByteDance releases production SuperAgent framework. The gap isn't capability—it's error forecasting and safe decomposition at scale.

Under the Hood

Three async scrapers (arXiv RSS, HN Firebase API, GitHub Trending) run in parallel and return up to 20 signal items per run. Claude Haiku then looks for non-obvious convergence—the prompt asks it to find 2–3 signals from different sources pointing at the same underlying shift, and return structured JSON with 15 fields: hook phrases, alert title and body, confidence score (60–95), exactly 3 recommended actions, and an x_caption for social posting. Reliable structured output without function calling took four prompt iterations; the key was including a filled-in JSON example in the system prompt.
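To make the 15-field contract concrete, here is a minimal sketch of the BannerData shape and a validator that rejects malformed model output before it reaches the renderer. Only the fields the post names are grounded; the exact field names and any others are assumptions.

```typescript
// Sketch of the BannerData contract described above. Field names are
// hypothetical except for x_caption; the post lists hook phrases, alert
// title and body, confidence (60-95), and exactly 3 recommended actions.
interface BannerData {
  hook_phrases: string[];
  alert_title: string;
  alert_body: string;
  confidence: number;   // expected range: 60-95
  actions: string[];    // exactly 3 recommended actions
  x_caption: string;
}

// Fail fast on malformed LLM output instead of rendering a broken banner.
function validateBannerData(raw: unknown): BannerData {
  const d = raw as BannerData;
  if (typeof d.alert_title !== "string" || d.alert_title.length === 0)
    throw new Error("missing alert_title");
  if (typeof d.confidence !== "number" || d.confidence < 60 || d.confidence > 95)
    throw new Error("confidence out of range 60-95");
  if (!Array.isArray(d.actions) || d.actions.length !== 3)
    throw new Error("expected exactly 3 recommended actions");
  return d;
}
```

Running the model's JSON through a check like this turns a silently wrong banner into a loud failure at synthesis time.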

The PNG renderer lives at /api/banner—a Vercel edge function that takes 14 query params and returns a Satori-rendered 2400×1254 PNG. Font loading was the hard part: import.meta.url returns a 307 redirect on edge functions, causing a silent white PNG with no error message. Fix: read the font with fs.readFileSync once at module initialization and cache it. Three hours I won't get back. The finished PNG uploads to Supabase Storage and the bot forwards it to Telegram. One tap copies the x_caption to clipboard.
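The post doesn't show how the caller builds that 14-param request, so here is a sketch of serializing banner data into the renderer's query string. The param names are hypothetical; only the /api/banner path comes from the post.

```typescript
// Hypothetical param names: the post only says the edge function takes
// 14 query params. x_caption stays out of the URL since it is for the
// social post, not the rendered image.
function bannerUrl(base: string, fields: Record<string, string | number>): string {
  const params = new URLSearchParams();
  for (const [key, value] of Object.entries(fields)) {
    params.set(key, String(value)); // URLSearchParams handles encoding
  }
  return `${base}/api/banner?${params.toString()}`;
}
```

The scheduler side only ever deals in this URL; the edge function stays a pure params-to-pixels mapping, which is what keeps the render stateless.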

Design-as-Code

The banner design lives entirely in a JSX component. Every brand constraint is encoded in code, not a Figma file: orange is #f97316, hook text is Inter Bold, severity HIGH gets an orange badge, MEDIUM gets amber. The layout uses Satori’s flexbox model.
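Satori also accepts plain React-element-shaped objects, so a brand rule like the severity badge can be encoded as an ordinary function. A minimal sketch: the HIGH orange #f97316 comes from the post, while the amber hex and the layout values are illustrative assumptions.

```typescript
// Brand constants: orange for HIGH is from the post; the amber hex for
// MEDIUM and the fallback grey are assumptions.
const SEVERITY_COLORS: Record<string, string> = {
  HIGH: "#f97316",
  MEDIUM: "#f59e0b",
};

// Satori renders React-element-shaped objects, so no JSX is required:
function severityBadge(severity: string) {
  return {
    type: "div",
    props: {
      style: {
        display: "flex",
        backgroundColor: SEVERITY_COLORS[severity] ?? "#6b7280",
        color: "#ffffff",
        fontFamily: "Inter",
        fontWeight: 700,
        padding: "8px 16px",
        borderRadius: 8,
      },
      children: severity,
    },
  };
}
```

Because the badge is just a function of the data, a new severity level is a one-line change, not a template edit.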

This is the right trade-off for automation. A Figma component requires a human every time content changes. A JSX component doesn’t.

The trade-off: layout changes require a deploy. For a daily brand asset at consistent scale, that’s acceptable. Design consistency is the goal; design flexibility is not.

What Failed

  • Satori font loading. Three hours debugging a white PNG with no error. import.meta.url returns a redirect on Vercel edge. fs.readFileSync is the fix.
  • APScheduler silently skips jobs. If the container restarts within the scheduled window, the job is skipped without warning. Fix: misfire_grace_time=600, which lets a job still fire if the scheduler comes back within 10 minutes.
  • Structured JSON reliability. First prompt iteration produced varied field names across runs. Required four iterations and an explicit filled-in schema example to lock the output format.
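The structured-output fix from the last bullet can be sketched as a system prompt that embeds a fully filled-in example, so the model copies exact field names instead of inventing variants. The actual prompt isn't shown in the post; the example values and wording here are hypothetical.

```typescript
// Hypothetical filled-in example: the point is that the model sees the
// exact field names and shapes it must reproduce.
const EXAMPLE_OUTPUT = {
  hook_phrases: ["LLM agents solving", "long-horizon tasks now"],
  alert_title: "Agentic Multi-LLM Systems Emerge as Solvable Problem",
  alert_body: "Six papers on agentic coordination hit arXiv simultaneously.",
  confidence: 80,
  actions: ["Track SuperAgent", "Review decomposition papers", "Watch multi-agent repos"],
  x_caption: "Multi-agent AI just hit an inflection point.",
};

const SYSTEM_PROMPT = [
  "Return ONLY valid JSON matching this exact shape and these exact field names.",
  "Example of a correct response:",
  JSON.stringify(EXAMPLE_OUTPUT, null, 2),
].join("\n\n");
```

A concrete example constrains the output far harder than a prose description of the schema, which is consistent with the post's note that the filled-in example was what finally locked the format.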

Results

The pipeline has run every morning since mid-March 2026. No failed runs. No manual restarts. Each run takes 12–18 seconds end-to-end. Claude Haiku costs roughly $0.03/day at ~300 tokens per run. Vercel and Supabase stay within their free tiers.
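The monthly figure follows from the daily LLM spend, assuming the Hetzner box is already paid for by existing infrastructure (an assumption; the post doesn't break out the cron host's cost):

```typescript
// Rough cost check: only the LLM spend scales per run. Vercel and
// Supabase sit inside free tiers; the Hetzner scheduler is assumed sunk.
const llmPerDay = 0.03;          // ~$0.03/day for Claude Haiku
const monthly = llmPerDay * 30;  // ~$0.90/month, which rounds to ~$1
```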

The signal quality has been the real surprise. The LLM convergence logic finds non-obvious connections: the best banner flagged ByteDance’s SuperAgent release, three arXiv papers on agentic decomposition, and a GitHub trending spike in multi-agent frameworks—all on the same morning, pointing at the same inflection point.

Total cost: ~$1/month. Designer hours after initial build: zero.