diff --git a/content/800-guides/100-build-a-video-processing-pipeline.mdx b/content/800-guides/100-build-a-video-processing-pipeline.mdx
deleted file mode 100644
index 6ee15c9bda..0000000000
--- a/content/800-guides/100-build-a-video-processing-pipeline.mdx
+++ /dev/null
@@ -1,171 +0,0 @@
----
-title: 'How to build a video processing pipeline'
-metaTitle: 'How to build a video processing pipeline with Prisma Pulse and Trigger.dev'
-description: 'Learn how to build a scalable video processing pipeline using a decoupled, event-driven architecture with Prisma Pulse and Trigger.dev.'
-sidebar_label: 'Video processing pipeline'
-image: '/img/guides/video-processing-pipeline-cover.svg'
-tags:
-  - video-processing
-  - pulse
-  - trigger.dev
-  - automation
-  - serverless
-  - event-driven
-  - workflows
-  - real-time
-  - scalability
-  - integration
----
-
-## Introduction
-
-Serverless computing lets applications scale efficiently to support millions of users. However, it struggles with long-running tasks and data-intensive processing, both of which are essential for machine learning (ML) applications. This guide shows you how to use Prisma Pulse and Trigger.dev to create decoupled, event-driven workflows that handle complex video processing tasks efficiently.
-
-## Prerequisites
-
-Before starting this guide, make sure you have:
-
-- Node.js installed (version 18 or higher)
-- A PostgreSQL database
-- A [Trigger.dev](https://trigger.dev) account
-- A [Deepgram](https://deepgram.com) API key for transcription
-- FFmpeg installed on your system
-
-## 1. Set up your project
-
-### 1.1. Create a new project
-
-First, create a new Node.js project and install the necessary dependencies. Note that Prisma Pulse ships as the `@prisma/extension-pulse` client extension and the Deepgram SDK is published as `@deepgram/sdk`:
-
-```bash
-mkdir video-processing-pipeline
-cd video-processing-pipeline
-npm init -y
-npm install @prisma/client @prisma/extension-pulse @trigger.dev/sdk @deepgram/sdk fluent-ffmpeg
-npm install prisma --save-dev
-```
-
-### 1.2. 
Define the data model
-
-Initialize Prisma and create your data model:
-
-```bash
-npx prisma init
-```
-
-Add the following model to your `schema.prisma`:
-
-```prisma
-model Video {
-  id            Int      @id @default(autoincrement())
-  url           String
-  transcription String?
-  createdAt     DateTime @default(now())
-  updatedAt     DateTime @updatedAt
-}
-```
-
-### 1.3. Run database migration
-
-Apply the database schema:
-
-```bash
-npx prisma migrate dev --name init
-```
-
-## 2. Implement the video processing pipeline
-
-### 2.1. Set up Trigger.dev workflow
-
-Next, we'll set up a transcription task using Trigger.dev. This task takes a video URL, extracts its audio, and transcribes it using the Deepgram API.
-
-Create a new file called `transcription-workflow.ts`. The Deepgram source shape below assumes the v2 Node SDK, and the job registration follows the Trigger.dev style used in this guide; adapt both if your SDK versions differ:
-
-```typescript
-import { Trigger } from '@trigger.dev/sdk';
-import { prisma } from './prisma';
-import { deepgram } from './deepgram';
-import ffmpeg from 'fluent-ffmpeg';
-import { PassThrough } from 'stream';
-
-new Trigger({
-  id: 'video-transcription',
-  name: 'Video Transcription',
-  on: 'video.uploaded',
-  run: async (event) => {
-    const video = await prisma.video.findUnique({
-      where: { id: event.videoId },
-    });
-
-    if (!video) {
-      throw new Error('Video not found');
-    }
-
-    // Extract the audio track as WAV. Piping into a PassThrough gives us a
-    // readable stream we can hand to Deepgram (a bare Readable cannot be a
-    // pipe destination).
-    const audioStream = new PassThrough();
-    ffmpeg(video.url)
-      .noVideo()
-      .audioCodec('pcm_s16le')
-      .format('wav')
-      .pipe(audioStream);
-
-    const { results } = await deepgram.transcription.preRecorded(
-      { stream: audioStream, mimetype: 'audio/wav' },
-      { punctuate: true }
-    );
-
-    await prisma.video.update({
-      where: { id: video.id },
-      data: {
-        transcription: results.channels[0].alternatives[0].transcript,
-      },
-    });
-  },
-});
-```
-
-### 2.2. Configure Prisma Pulse events
-
-You can now create a file `index.ts` that will act as an API endpoint for uploading videos. 
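Before the endpoint saves anything, it is worth rejecting obviously invalid input. Below is a small guard you could call first; `isValidVideoUrl` is a hypothetical helper, not part of the guide's required code, and the extension allow-list is an assumption you should adjust to your needs:

```typescript
// Hypothetical validation helper for the upload endpoint: accept only
// http(s) URLs whose path ends in a common video container extension.
const VIDEO_EXTENSIONS = ['.mp4', '.mov', '.webm', '.mkv'];

function isValidVideoUrl(url: string): boolean {
  let parsed: URL;
  try {
    parsed = new URL(url);
  } catch {
    return false; // not a parseable URL at all
  }
  if (parsed.protocol !== 'http:' && parsed.protocol !== 'https:') {
    return false;
  }
  return VIDEO_EXTENSIONS.some((ext) =>
    parsed.pathname.toLowerCase().endsWith(ext)
  );
}
```

Calling this at the top of the endpoint keeps malformed URLs out of the database and, downstream, out of the FFmpeg step.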
When a new video is uploaded, we use Prisma ORM to save the video URL in the database:
-
-```typescript
-import { prisma } from './prisma';
-
-async function uploadVideo(url: string) {
-  const video = await prisma.video.create({
-    data: { url },
-  });
-
-  // No manual event emission is needed here: Prisma Pulse picks up the
-  // database insert and forwards it to Trigger.dev (see pulse-events.ts).
-  return video;
-}
-```
-
-Now, to connect Pulse and Trigger.dev, create a new file called `pulse-events.ts`. It streams `Video` change events with Pulse's client extension and forwards them to Trigger.dev. The `trigger` import is assumed to be a thin local wrapper exposing an `emit` helper, and the exact stream event shape may vary between Pulse versions, so check the Pulse reference for your release:
-
-```typescript
-import { PrismaClient } from '@prisma/client';
-import { withPulse } from '@prisma/extension-pulse';
-import { trigger } from './trigger';
-
-const prisma = new PrismaClient().$extends(
-  withPulse({ apiKey: process.env.PULSE_API_KEY! })
-);
-
-async function main() {
-  const stream = await prisma.video.stream();
-
-  for await (const event of stream) {
-    if (event.action === 'create') {
-      await trigger.emit('video.uploaded', { videoId: event.created.id });
-    }
-
-    if (event.action === 'update' && event.after.transcription) {
-      console.log('Transcription completed:', event.after.id);
-      // Notify a client or trigger additional actions here
-    }
-  }
-}
-
-main().catch(console.error);
-```
-
-## Next steps
-
-Now that you have a working video processing pipeline, you can:
-
-- Add error handling and retries
-- Implement progress tracking
-- Add support for different video formats
-- Scale your pipeline with additional processing steps
-
-For more information and updates:
-
-- [Get started with Trigger.dev](https://trigger.dev)
-- [Get started with Pulse](https://www.prisma.io/pulse)
-- Join our [Discord community](https://pris.ly/discord)
diff --git a/content/800-guides/200-build-real-time-durable-workflows.mdx b/content/800-guides/200-build-real-time-durable-workflows.mdx
deleted file mode 100644
index 10f7294282..0000000000
--- a/content/800-guides/200-build-real-time-durable-workflows.mdx
+++ /dev/null
@@ -1,218 +0,0 @@
----
-title: 'How to build real-time durable workflows'
-metaTitle: 'How to build real-time durable workflows with Pulse and Inngest'
-metaDescription: 'Learn how to create real-time, durable, and extensible workflows using Prisma Pulse and Inngest.'
-sidebar_label: 'Real-time durable workflows'
-image: '/img/guides/real-time-durable-workflows-cover.svg'
-tags:
-  - real-time
-  - workflows
-  - durable
-  - automation
-  - prisma
-  - inngest
-  - pulse
----
-
-## Introduction
-
-Building web applications today isn't just about functionality: it's about creating seamless, engaging user experiences. Users expect applications to be responsive, real-time, and feature-rich, and to handle ever-growing data sets efficiently. This guide shows you how to use Prisma Pulse and Inngest to create real-time, durable workflows that enhance your application's functionality.
-
-## Prerequisites
-
-Before starting this guide, make sure you have:
-
-- Node.js installed (version 18 or higher)
-- A PostgreSQL database
-- An [Inngest](https://www.inngest.com) account
-- The [Pulse/Inngest router service](https://github.com/prisma/pulse-inngest-router) repository
-- An OpenAI API key (for the AI writer example)
-
-## 1. Set up your project
-
-### 1.1. Create a new project
-
-First, clone the Pulse/Inngest router repository and install the necessary dependencies:
-
-```bash
-git clone https://github.com/prisma/pulse-inngest-router
-cd pulse-inngest-router
-npm install
-```
-
-### 1.2. Define the data model
-
-Initialize Prisma and create your data model:
-
-```bash
-npx prisma init
-```
-
-Add the following models to your `schema.prisma`. The `Suggestion` model stores the AI-generated feedback used by the AI writer workflow later in this guide:
-
-```prisma
-model User {
-  id        Int      @id @default(autoincrement())
-  email     String   @unique
-  name      String
-  createdAt DateTime @default(now())
-  updatedAt DateTime @updatedAt
-}
-
-model Article {
-  id        Int      @id @default(autoincrement())
-  title     String
-  content   String
-  status    String
-  createdAt DateTime @default(now())
-  updatedAt DateTime @updatedAt
-}
-
-model Suggestion {
-  id          Int      @id @default(autoincrement())
-  articleId   Int
-  suggestions String
-  createdAt   DateTime @default(now())
-}
-```
-
-### 1.3. Run database migration
-
-Apply the database schema:
-
-```bash
-npx prisma migrate dev --name init
-```
-
-## 2. Implement user onboarding workflow
-
-### 2.1. 
Set up Prisma Pulse stream
-
-Create a file called `user-stream.ts`. It subscribes to `User` create events via Prisma Pulse (assuming `./prismaClient` exports a Prisma Client instance extended with Pulse) and forwards each new user to Inngest:
-
-```typescript
-import { inngest } from './inngestClient';
-import { prisma } from './prismaClient';
-
-async function main() {
-  const stream = await prisma.user.stream({
-    create: {},
-  });
-
-  for await (const event of stream) {
-    const newUser = event.created;
-    await inngest.send({ name: 'user.signed_up', data: newUser });
-  }
-}
-
-main().catch((e) => {
-  console.error(e);
-  process.exit(1);
-});
-```
-
-### 2.2. Create onboarding workflow
-
-Create a file called `onboarding-workflow.ts`. The `sendEmail` helper is assumed to exist elsewhere in your project, for example as a thin wrapper around your email provider:
-
-```typescript
-import { inngest } from './inngestClient';
-import { sendEmail } from './email'; // assumed helper
-
-export const userOnboarding = inngest.createFunction(
-  { id: 'user-onboarding', name: 'User Onboarding' },
-  { event: 'user.signed_up' },
-  async ({ event, step }) => {
-    const { email } = event.data;
-
-    await step.run('Send Welcome Email', async () => {
-      await sendEmail(email, 'Welcome to our platform!');
-    });
-
-    await step.sleep('Wait 3 Days', '3d');
-
-    await step.run('Send Follow-Up Email', async () => {
-      await sendEmail(email, "We hope you're enjoying our platform!");
-    });
-  }
-);
-```
-
-## 3. Implement AI writer workflow
-
-### 3.1. Set up Pulse/Inngest router
-
-If you skipped step 1.1, clone and set up the Pulse/Inngest router now:
-
-```bash
-git clone https://github.com/prisma/pulse-inngest-router
-cd pulse-inngest-router
-npm install
-```
-
-### 3.2. 
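Long-running stream consumers like the one in step 2.1 should survive transient failures (network blips, a briefly unreachable Inngest endpoint). Below is a generic retry-with-exponential-backoff wrapper; `withRetry` is a hypothetical utility, not part of the Pulse or Inngest SDKs:

```typescript
// Generic retry helper with exponential backoff (hypothetical utility,
// not part of any SDK used in this guide). Wraps a flaky async call.
async function withRetry<T>(
  fn: () => Promise<T>,
  maxAttempts = 5,
  baseDelayMs = 100
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      // Back off: baseDelayMs, then 2x, 4x, ... before the next attempt.
      await new Promise((resolve) =>
        setTimeout(resolve, baseDelayMs * 2 ** attempt)
      );
    }
  }
  throw lastError;
}
```

You could wrap the forwarding call as `await withRetry(() => inngest.send({ name: 'user.signed_up', data: newUser }))` so a single failed send does not kill the stream loop.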
Create AI writer workflow
-
-Create a file called `ai-writer-workflow.ts`. The `openai.generateSuggestions` and `sendNotification` helpers are assumed to exist in your project (for example, a wrapper around the OpenAI API and a notification service, respectively):
-
-```typescript
-import { inngest } from './inngestClient';
-import { openai } from './openaiClient'; // assumed wrapper around the OpenAI API
-import { prisma } from './prismaClient';
-import { sendNotification } from './notifications'; // assumed helper
-
-export const aiWriterWorkflow = inngest.createFunction(
-  { id: 'ai-writer-workflow', name: 'AI Writer Workflow' },
-  { event: 'db/article.update' },
-  async ({ event, step }) => {
-    const { id, content, status } = event.data;
-
-    if (status !== 'REVIEW') {
-      return;
-    }
-
-    const suggestions = await step.run('Generate Suggestions', async () => {
-      return await openai.generateSuggestions(content);
-    });
-
-    await step.run('Save Suggestions', async () => {
-      await prisma.suggestion.create({
-        data: {
-          articleId: id,
-          suggestions,
-        },
-      });
-    });
-
-    // Pause the workflow until the same article is updated again,
-    // for up to 7 days. `waitForEvent` returns null on timeout.
-    const published = await step.waitForEvent('Wait For Publish', {
-      event: 'db/article.update',
-      match: 'data.id',
-      timeout: '7d',
-    });
-
-    if (published && published.data.status === 'PUBLISH') {
-      await step.run('Send Publication Notification', async () => {
-        await sendNotification(`Article ${id} has been published.`);
-      });
-    }
-  }
-);
-```
-
-## 4. Deploy and test
-
-### 4.1. Set up environment variables
-
-Create a `.env` file with your credentials (Prisma Pulse requires its own API key):
-
-```
-DATABASE_URL="your-database-url"
-PULSE_API_KEY="your-pulse-api-key"
-INNGEST_EVENT_KEY="your-inngest-key"
-OPENAI_API_KEY="your-openai-key"
-```
-
-### 4.2. 
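Misconfigured environments fail more gracefully if required variables are checked once at startup rather than mid-workflow. Below is a minimal fail-fast check; `assertEnv` is a hypothetical helper, and the variable names mirror the `.env` file from step 4.1:

```typescript
// Hypothetical startup check: throw immediately if a required environment
// variable is missing, instead of failing later inside a workflow step.
const REQUIRED_ENV = ['DATABASE_URL', 'INNGEST_EVENT_KEY', 'OPENAI_API_KEY'];

function assertEnv(env: Record<string, string | undefined>): void {
  const missing = REQUIRED_ENV.filter((name) => !env[name]);
  if (missing.length > 0) {
    throw new Error(
      `Missing required environment variables: ${missing.join(', ')}`
    );
  }
}
```

Call `assertEnv(process.env)` at the top of each service entry point (the stream consumer and the Inngest functions).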
Start the services - -Start both the Pulse stream and Inngest functions: - -```bash -npm run start:stream -npm run start:functions -``` - -## Next steps - -Now that you have working real-time workflows, you can: - -- Add error handling and retries -- Implement more complex workflow patterns -- Add monitoring and observability -- Scale your workflows with additional steps - -For more information and updates: -- [Get started with Inngest](https://www.inngest.com) -- [Get started with Pulse](https://www.prisma.io/pulse) -- Join our [Discord community](https://discord.com/invite/prisma) diff --git a/content/800-guides/900-using-prisma-orm-with-cloudflare-d1.mdx b/content/800-guides/900-using-prisma-orm-with-cloudflare-d1.mdx index 617c01ec9f..4d8dcfa618 100644 --- a/content/800-guides/900-using-prisma-orm-with-cloudflare-d1.mdx +++ b/content/800-guides/900-using-prisma-orm-with-cloudflare-d1.mdx @@ -19,9 +19,7 @@ Before starting this guide, make sure you have: - Wrangler CLI installed (version 3.39.0 or higher) - Basic familiarity with Cloudflare Workers and D1 -## Steps - -### 1. Configure Prisma schema +## 1. Configure Prisma schema In your Prisma schema, add the `driverAdapters` Preview feature to the `generator` block and set the `provider` of the `datasource` to `sqlite`. If you just bootstrapped the Prisma schema with `prisma init`, also be sure to add the following `User` model to it: @@ -43,7 +41,7 @@ model User { } ``` -### 2. Install dependencies +## 2. Install dependencies Next, install the required packages: @@ -53,7 +51,7 @@ npm install @prisma/adapter-d1 Also, be sure to use a version of the Wrangler CLI that's above [`wrangler@^3.39.0`](https://github.com/cloudflare/workers-sdk/releases/tag/wrangler%403.39.0), otherwise the `--remote` flag that's used in the next sections won't be available. -### 3. Set up D1 database connection +## 3. 
Set up D1 database connection
 
 To connect your Workers with the D1 instance, add the following binding to your `wrangler.toml`:
 
@@ -73,7 +71,7 @@
 Note that `__YOUR_D1_DATABASE_NAME__` and `__YOUR_D1_DATABASE_ID__` in the snippet above are placeholders that need to be replaced with the name and ID of your own D1 instance.
 
 If you weren't able to grab this ID from the terminal output, you can also find it in the Cloudflare Dashboard or by running `npx wrangler d1 list` and `npx wrangler d1 info __YOUR_D1_DATABASE_NAME__` in your terminal.
 
-### 4. Set up database migrations
+## 4. Set up database migrations
 
 Create and apply migrations using D1's [migration system](https://developers.cloudflare.com/d1/reference/migrations/):
 
@@ -134,7 +132,7 @@
 npx wrangler d1 execute __YOUR_D1_DATABASE_NAME__ --command "INSERT INTO \"User\" (\"email\", \"name\") VALUES
 ('jane@prisma.io', 'Jane Doe (Remote)');" --remote
 ```
 
-### 5. Implement the Worker
+## 5. Implement the Worker
 
 Before adding a Prisma Client query to your Worker, you need to generate Prisma Client with the following command:
 
@@ -174,7 +172,7 @@
 }
 ```
 
-### 6. Run the Worker locally
+## 6. Run the Worker locally
 
 With the database query in place and Prisma Client generated, you can go ahead and run the Worker locally:
 
@@ -188,7 +186,7 @@
 Now you can open your browser at [`http://localhost:8787`](http://localhost:8787) to see the result of the database query:
 
 ;[{ id: 1, email: 'jane@prisma.io', name: 'Jane Doe (Local)' }]
 
-### 7. Set the `DATABASE_URL` environment variable and deploy the Worker
+## 7. Set the `DATABASE_URL` environment variable and deploy the Worker
 
 To deploy the Worker, run the following command: