Published on October 13, 2025

Introducing @mux/supabase, because every app needs a database

By Dylan Jhaveri

Okay, so maybe not every app strictly speaking needs a database. There’s a bunch of happy Mux customers who have hardcoded playback IDs into their page, or created a videos.json file to make things simple, and that’s great.

For the vast majority of non-trivial Mux integrations, however, you need a database of some sort. Let me briefly cover how a Mux integration typically works, and what @mux/supabase can do for you.

Saving data associated with each video

Mux is a video API for developers. Supabase is the open-source Postgres development platform that's built to scale to millions. It's an all-in-one suite with Database, Auth, Storage, Edge Functions, Realtime, and Vector search.

Just as Stripe provides payments infrastructure without knowing all the business logic of your application, Mux operates at a similar layer of abstraction for video infrastructure. There’s probably a bunch of things your application cares about that Mux does not know about.

For example:

  • Who is allowed to watch the video?
  • If the video is part of a series, where does it fall in that series?
  • For each user who watched the video, how much of it did they watch?
  • What is the video’s description?
  • What tags or categories are associated with it?

Note: the Mux Asset API does allow you to set a few metadata fields (title, creator_id and external_id).
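For instance, those fields can be passed along when you create an asset. Here is a hedged sketch of shaping that request body; the `meta` field names follow the note above, the input URL is a placeholder, and you should confirm the exact shape against the current Mux API reference:

```typescript
// Sketch: an asset-create request body that includes Mux's metadata fields.
// `title`, `creator_id`, and `external_id` are the fields mentioned above;
// the input URL and IDs here are placeholders.
type AssetMeta = {
  title?: string;
  creator_id?: string;
  external_id?: string;
};

function buildAssetCreateBody(inputUrl: string, meta: AssetMeta) {
  return {
    input: [{ url: inputUrl }],
    playback_policy: ["public"],
    meta,
  };
}

const body = buildAssetCreateBody("https://example.com/video.mp4", {
  title: "My video",
  creator_id: "user_123", // your app's user ID
  external_id: "video_abc", // your app's video ID
});

// With the @mux/mux-node SDK you would then call something like:
// const asset = await mux.video.assets.create(body);
```

Anything beyond those three fields still needs to live in your own database, which is the point of the rest of this post.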

Enter @mux/supabase for your database needs

This is where @mux/supabase comes in. In many ways (which we’ll get into), Supabase pairs really well as a database for your Mux Video-powered application.

First things first, you should already have Supabase set up for your project. (If you don’t, you can run everything locally, without setting up a Supabase account.)

You should have already run npx supabase init and have a supabase/ directory at the root of your project.

Supabase should be running with npx supabase start.

Assuming you have already done that, the next step is to run:

```bash
npx @mux/supabase init
```

This command will:

  • Create a mux schema in your Supabase database, with tables for your Mux assets and live streams. (A schema can be thought of as a namespace that contains a group of tables; learn more about schemas in Supabase.)
  • Create an Edge Function at supabase/functions/mux-webhook. You’ll need to go to the Mux dashboard and set up a webhook that points to that function. (For local development you’ll need to use ngrok or a similar tool to expose your localhost Edge Function as a Mux webhook endpoint.)

After running the init command, you’ll see the mux-webhook function in your project, along with migration files in the supabase/migrations directory that set up the schema and tables.

If you have existing assets that you need to backfill into your mux schema, run the backfill script:

```bash
npx @mux/supabase backfill
```

This will paginate through all of your Assets and Live Streams and backfill the tables in Supabase.

Now you have all your Mux data synced with your Supabase database.
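From here, reading that data is ordinary supabase-js. A minimal sketch, assuming the synced schema contains an `assets` table with a `status` column (the actual table and column names come from the generated migration files, so check those):

```typescript
// Hedged sketch: querying synced Mux data through supabase-js.
// Assumes a `mux.assets` table with a `status` column; verify the real
// names in the generated migrations. The client is passed in so the query
// logic is easy to exercise in isolation.
async function listReadyAssets(supabase: any) {
  // supabase-js v2 lets you target a non-public schema with .schema()
  return supabase.schema("mux").from("assets").select("*").eq("status", "ready");
}

// In your app you would pass a real client:
// import { createClient } from "npm:@supabase/supabase-js";
// const supabase = createClient(SUPABASE_URL, SUPABASE_ANON_KEY);
// const { data, error } = await listReadyAssets(supabase);
```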

Running AI workflows with Mux & Supabase

After we started building with this setup, we found ourselves repeatedly wanting to do similar things, like running AI workflows at specific points in the video lifecycle. For example, kicking off a content moderation check as soon as a new asset is ready.

Since the @mux/supabase webhook handler is already handling webhooks, it gives us the perfect entrypoint for these kinds of workflows. To set them up, run:

```bash
npx @mux/supabase init-workflows
```

Let’s go with the content moderation example. First, edit the mux.toml file in supabase/functions/mux-webhook/:

```toml
[workflows.content-moderation]
events = ["video.asset.ready"]
```

This tells the @mux/supabase integration that when the video.asset.ready webhook fires, we need to run the content-moderation Edge Function.
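Conceptually, the routing behind mux.toml boils down to a lookup from event type to workflow names. This sketch is our illustration of that idea, not the actual @mux/supabase source:

```typescript
// Illustrative sketch of mux.toml routing: each workflow name maps to the
// webhook event types that should trigger it. Not the real package source,
// just the concept.
type WorkflowConfig = Record<string, { events: string[] }>;

const config: WorkflowConfig = {
  "content-moderation": { events: ["video.asset.ready"] },
};

// Returns the workflow names that should run for a given webhook event type.
function workflowsForEvent(config: WorkflowConfig, eventType: string): string[] {
  return Object.entries(config)
    .filter(([, { events }]) => events.includes(eventType))
    .map(([name]) => name);
}

console.log(workflowsForEvent(config, "video.asset.ready")); // ["content-moderation"]
console.log(workflowsForEvent(config, "video.asset.deleted")); // []
```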

Next, create the content-moderation Edge Function:

```bash
npx supabase functions new content-moderation
```

In the body of that function, write the code you need to do your moderation logic. In this example, borrowed from the guide:

  • We grab the public playback ID on the asset.
  • We use the playback ID to grab thumbnails from the asset.
  • We send all of those to OpenAI with OpenAI’s omni-moderation-latest model.
  • We get back the moderation scores and check against our thresholds.

What you would need to do:

  • Adjust the threshold for your needs.
  • Flag the asset in the database and follow the steps under moderationResult.exceedsThreshold.
functions/content-moderation/index.ts

```typescript
import type { Webhooks } from "npm:@mux/mux-node/resources/webhooks.js";
import { OpenAI } from "npm:openai";

type UnwrapWebhookEvent = Webhooks.UnwrapWebhookEvent;

const openaiClient = new OpenAI({
  apiKey: Deno.env.get("OPENAI_API_KEY"),
});

// Moderation thresholds, you can adjust these as you refine your moderation logic
const THRESHOLDS = { sexual: 0.7, violence: 0.8 };

// Generates a list of thumbnail URLs at regular intervals, based on the asset's duration
export function getThumbnailUrls({ playbackId, duration }) {
  const timestamps = [];
  if (duration <= 50) {
    // Short videos less than 50 seconds: 5 evenly spaced thumbnails
    const interval = duration / 6;
    for (let i = 1; i <= 5; i++) {
      timestamps.push(Math.round(i * interval));
    }
  } else {
    // Longer videos: one thumbnail every 10 seconds
    for (let time = 0; time < duration; time += 10) {
      timestamps.push(time);
    }
  }
  return timestamps.map(
    (time) =>
      `https://image.mux.com/${playbackId}/thumbnail.png?time=${time}&width=640`
  );
}

async function requestModeration(imageUrls) {
  const moderationPromises = imageUrls.map(async (url) => {
    console.log(`Moderating image: ${url}`);
    try {
      const moderation = await openaiClient.moderations.create({
        model: "omni-moderation-latest",
        input: [
          {
            type: "image_url",
            image_url: { url },
          },
        ],
      });
      const categoryScores = moderation.results[0].category_scores;
      return {
        url,
        sexual: categoryScores.sexual || 0,
        violence: categoryScores.violence || 0,
        error: false,
      };
    } catch (error) {
      console.error("Failed to moderate image:", error);
      return { url, sexual: 0, violence: 0, error: true };
    }
  });

  const scores = await Promise.all(moderationPromises);

  // Find highest scores across all thumbnails
  const maxSexual = Math.max(...scores.map((s) => s.sexual));
  const maxViolence = Math.max(...scores.map((s) => s.violence));

  return {
    scores,
    maxScores: { sexual: maxSexual, violence: maxViolence },
    exceedsThreshold:
      maxSexual > THRESHOLDS.sexual || maxViolence > THRESHOLDS.violence,
  };
}

async function moderateAsset(asset) {
  const { id: assetId, duration, playback_ids } = asset;

  if (!playback_ids || playback_ids.length === 0) {
    console.log(`No playback IDs for asset ${assetId}, skipping moderation`);
    return;
  }

  // Filter for public playback IDs only
  const publicPlaybackIds = playback_ids.filter((pid) => pid.policy === "public");
  if (publicPlaybackIds.length === 0) {
    console.log(`Asset ${assetId} has only signed playback IDs, skipping moderation`);
    return;
  }
  const playbackId = publicPlaybackIds[0].id;

  console.log(`Starting moderation for asset ${assetId}`);
  const thumbnailUrls = getThumbnailUrls({ playbackId, duration });
  console.log(`Generated ${thumbnailUrls.length} thumbnails for moderation`);

  const moderationResult = await requestModeration(thumbnailUrls);
  console.log(
    `Moderation scores - Sexual: ${moderationResult.maxScores.sexual}, Violence: ${moderationResult.maxScores.violence}`
  );

  if (moderationResult.exceedsThreshold) {
    // Save a record in your database that this asset failed moderation
    // Make sure this asset will not be shown to end-users
    // Flag the user account who uploaded it, and consider what you should
    // do next:
    // - Ban the user from the platform?
    // - Open a support ticket?
    console.log(`Content exceeds thresholds, removing access to asset`);
  } else {
    console.log(`Asset ${assetId} passed moderation`);
  }
}

Deno.serve(async (req) => {
  try {
    const event = (await req.json()) as UnwrapWebhookEvent;
    const asset = event.data;
    if (!asset) {
      console.log("No asset");
      return new Response("No asset in webhook", { status: 500 });
    }
    await moderateAsset(asset);
    return new Response("Asset moderation complete", { status: 200 });
  } catch (error) {
    console.error("Error running content-moderation.ts:", error);
    return new Response(JSON.stringify({ error: "500" }), {
      status: 500,
      headers: { "Content-Type": "application/json" },
    });
  }
});
```

Some details about how the workflows work

Under the hood, workflows on Supabase work like this:

A webhook lands at functions/mux-webhook:

  • Rows in the mux schema tables get updated.
  • Any configured workflows get put into a Supabase Queue.

In the background, Supabase Cron runs every 10 seconds to process messages off the queue:

  • Each message on the queue corresponds to one workflow function with a payload.
  • If the workflow function succeeds, the message is deleted from the queue.
  • If the workflow function errors, the message stays on the queue so it can be retried.
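The delete-on-success, keep-on-error semantics can be sketched with a simple in-memory stand-in for the queue. The real implementation uses Supabase Queues; this is just the concept:

```typescript
// Conceptual sketch of the queue-processing loop: run each workflow with
// its payload, drop the message on success, keep it for retry on error.
// An in-memory stand-in for Supabase Queues, just to show the semantics.
type QueueMessage = { id: number; workflow: string; payload: unknown };

async function processQueue(
  queue: QueueMessage[],
  runWorkflow: (workflow: string, payload: unknown) => Promise<void>,
): Promise<QueueMessage[]> {
  const remaining: QueueMessage[] = [];
  for (const msg of queue) {
    try {
      await runWorkflow(msg.workflow, msg.payload);
      // Success: the message is "deleted" by not carrying it forward
    } catch {
      // Failure: keep the message so the next cron tick retries it
      remaining.push(msg);
    }
  }
  return remaining;
}
```

Each 10-second cron tick then effectively runs this loop again over whatever messages are left.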

Note that workflow functions are simply Supabase Edge Functions, which run in a Deno context. You can do a lot with Edge Functions, like run a local LLM model, write records to your database, or call any third-party API.

One limitation of Edge Functions is maximum duration: on the free plan that is currently 150 seconds, and on paid plans it is 400 seconds. If your workflow function takes longer than that, it will time out and the message will stay on the queue to be retried.

Your turn!

We want to see what you build! We’re seeing more and more Mux customers choose Supabase to build their applications, and we think this will make it easier. We also see a lot of Mux customers needing to build these kinds of AI workflows around their video assets. Give it a try and let us know what you think. If you run into any issues, please open a ticket on GitHub.


Written By

Dylan Jhaveri

Software Engineer and cold water surfer. Previously startup co-founder. Trying to find the best cheeseburger in San Francisco.
