Inngest 1.0: Open-source durable workflows that run on every platform
Summary
Inngest is a tool that tackles the reliability problems that arise when implementing complex business logic. Where you previously had to manage workers, refactor code, and build separate monitoring infrastructure, Inngest automates all of it. With code such as `step.run`, you get transaction-level retries and recovery logic, and the resulting 'durable workflows' behave consistently in any environment, including edge computing and serverless. It is especially well suited to developers building AI agents and multi-step processes.
Key Points
- Code-level transactions: wrap work in `step.run` to build durable workflows that retry automatically on failure and run exactly once on success.
- Environment agnostic: run code unchanged from any trigger (API calls, webhooks, schedules) in edge, serverless, or traditional environments.
- Operational observability: query, cancel, and replay workflow runs from within the Inngest platform, without digging through logs scattered across systems.
- Enterprise-grade reliability: E2E encryption, SSO/SAML support, and HIPAA BAA availability meet large-scale business requirements.
Show HN: Inngest 1.0 – Open-source durable workflows on every platform
Make any code durable by default.
Workflows, agents, endpoints, background jobs—however it's written, wherever it runs—Inngest makes it unbreakable.
You're here because whatever you're building needs to be reliable. We're here because we think you shouldn't have to wrangle workers, refactor code, or build instrumentation to make that true.
Infraless
Your orchestration engine shouldn't dictate how you write production code. Wrap functions in Steps to automate retries, recovery, and flow without added infrastructure. step.run and done.
Agnostic
Inngest was built for change. Run anywhere (edge, serverless, traditional), from any trigger (API calls, webhooks, schedules), on any code (agents, endpoints, cron).
Observable
Focus on logic, not instrumentation. Data about your runs lives where runs happen. So you can query, cancel, or replay without grepping logs across systems.
Start locally, with your stack.
```javascript
// step.run is a code-level transaction: it retries automatically
// on failure and only runs once on success.
const transcript = await step.run('transcribe-video', async () =>
  deepgram.transcribe(event.data.videoUrl)
)

// function state is automatically managed for fault tolerance
// across steps.
const summary = await step.run('summarize-transcript', async () =>
  llm.createCompletion({
    model: "gpt-4o",
    prompt: createSummaryPrompt(transcript),
  })
)
```
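One way to build intuition for these semantics is a rough mental model of what a durable step runner does. The sketch below is illustrative only, not Inngest's actual implementation: step results are memoized by step ID, so a failed step is retried, and a step that already succeeded is skipped when the function is replayed.

```typescript
// Illustrative mental model of a durable step runner (NOT Inngest's
// real implementation): memoize results per step ID, retry failures.
type Memo = Map<string, unknown>;

async function runStep<T>(
  memo: Memo,
  id: string,
  fn: () => Promise<T>,
  maxAttempts = 3
): Promise<T> {
  // On replay, return the recorded result instead of re-running.
  if (memo.has(id)) return memo.get(id) as T;

  let lastError: unknown = new Error("step failed");
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      const result = await fn();
      memo.set(id, result); // persist so the step succeeds only once
      return result;
    } catch (err) {
      lastError = err; // transient failure: fall through and retry
    }
  }
  throw lastError;
}

// Usage: a flaky step succeeds on retry, then is skipped on replay.
async function demo() {
  const memo: Memo = new Map();
  let calls = 0;
  const value = await runStep(memo, "flaky", async () => {
    calls++;
    if (calls < 2) throw new Error("transient failure");
    return "ok";
  });
  const replayed = await runStep(memo, "flaky", async () => {
    calls++;
    return "never runs";
  });
  console.log(value, replayed, calls); // → ok ok 2
}
demo();
```

In a real durable engine the memo lives in persistent storage rather than memory, which is what lets a crashed or redeployed function resume from its last completed step.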
ONE-COMMAND SETUP
```shell
$ npx --ignore-scripts=false inngest-cli dev
Inngest dev server running...
```
Ship anywhere
Run anywhere, on any code, from any trigger. Deploy to your favorite cloud provider in one click.
Inngest automatically retries on error, while ensuring efficient runs via throttling, batching, and prioritization. If something breaks, Inngest picks up where it left off.
Scale like the billions of workflows processed this month
Configure, manage, and monitor your workflows while our platform scales for your needs.
- Flow control: Ensure that all users get a great experience by dynamically allocating resources to your AI workflows with concurrency keys, throttling, and more.
- Recovery tools: Quickly identify issues with Inngest Cloud alerting and metrics, and act on thousands of runs at once with Replay and Bulk Cancellation.
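As an illustration of what this flow-control configuration looks like in Inngest's TypeScript SDK, the sketch below limits per-tenant concurrency and overall throughput on one function. The event name and the `userId` key are assumptions made for the example; option names should be checked against the current Inngest docs.

```typescript
import { Inngest } from "inngest";

const inngest = new Inngest({ id: "my-app" });

// Sketch: per-tenant concurrency plus throttling on an AI workflow.
export const summarizeVideo = inngest.createFunction(
  {
    id: "summarize-video",
    // At most 5 concurrent runs per user (tenant), keyed on event data.
    concurrency: { limit: 5, key: "event.data.userId" },
    // At most 100 new runs per minute across the whole function.
    throttle: { limit: 100, period: "1m" },
  },
  { event: "video/uploaded" }, // illustrative event name
  async ({ event, step }) => {
    // ... step.run calls here ...
  }
);
```

Keying concurrency on a tenant identifier is what keeps one heavy user from starving everyone else's workflows.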
Built for trust.
Inngest provides enterprise-grade reliability and scalability for your most complex workflows, so your team can focus on building products, not managing infrastructure.
Key features and reliability:
- E2E ENCRYPTION: Encrypt all data that passes through Inngest with end-to-end encryption middleware.
- SSO & SAML: Single sign-on and SAML support for enterprise customers.
- 100K+ EXECUTIONS PER SECOND: Designed for your heavy workloads with capacity for bursting.
- LOW LATENCY: Inngest is designed to be low latency for all functions.
- HIPAA BAA AVAILABLE: Ready to handle sensitive data.
Trusted by software companies at scale, worldwide.
For anyone who is building multi-step AI agents, I highly recommend building it on top of Inngest, the traceability it provides is super useful, plus you get timeouts & retries for free.
Our context switching dropped significantly, because the code is just business logic. If you read the code, you know the steps that will execute, without having to manage any infrastructure.
Inngest completely transformed how we handle AI orchestration. Its intuitive DX, built-in multi-tenant concurrency, and flow control allowed us to scale without the complexity of other tools.
AI-Generated Content
This content is an AI-generated summary, translation, and analysis of the original HN AI Engineering post. Copyright remains with the original author; please consult the original for authoritative details.
Go to the original article