
Maitai
Reliable, self-improving enterprise AI
About the Company
Maitai manages the LLM stack for enterprise companies, enabling the fastest and most reliable inference. The future of enterprise AI revolves around mosaics of small, domain-specific models powering capable, responsive agents, and Maitai is well positioned to capture that market. If you're looking to get in early with a company redefining how large companies build with AI, then let's talk.
Tech Stack
Infra
As LLMs are core to our customers' products, resiliency and uptime are our top priorities. Since we act as a proxy, our uptime must exceed that of the providers themselves. We’re multi-cloud, multi-region, and built for seamless failover. Our infrastructure runs on Kubernetes, managed with Terraform, and deployed across AWS and GCP. We use GitHub Actions for CI/CD, with Datadog for monitoring, tracing, and performance insights.
Infra stack: Kubernetes, Terraform, AWS, GCP, GitHub Actions, PostgreSQL, Redis, Datadog.
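Because the proxy's uptime must exceed that of the providers themselves, requests that fail against one provider are routed to the next. A minimal sketch of that failover pattern, with hypothetical provider callables and error types (illustrative only, not Maitai's actual code):

```python
# Hypothetical failover routing sketch. Each provider is modeled as a
# callable that returns a completion or raises ProviderError on failure.
class ProviderError(Exception):
    """Raised when an upstream LLM provider call fails."""

def route_with_failover(providers, prompt, retries_per_provider=1):
    """Try each (name, call) pair in priority order, failing over on error."""
    last_error = None
    for name, call in providers:
        for _ in range(retries_per_provider):
            try:
                return name, call(prompt)
            except ProviderError as exc:
                last_error = exc  # remember the failure, move on
    raise ProviderError(f"all providers failed: {last_error}")

# Usage: the primary raises, so the request falls through to the secondary.
def flaky_primary(prompt):
    raise ProviderError("primary timed out")

def healthy_secondary(prompt):
    return f"echo: {prompt}"

name, reply = route_with_failover(
    [("primary", flaky_primary), ("secondary", healthy_secondary)],
    "hello",
)
```

In a multi-region deployment the same idea applies one level up: the provider list is itself ordered by region health, so a regional outage shifts traffic without client involvement.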
Backend
Our backend is a set of Python microservices, using Quart for web services, alongside Python-based fine-tuning jobs optimized for speed, cost, and accuracy. We use PostgreSQL for both conventional data persistence and vector storage. Go is being introduced where performance gains are critical.
Tech stack: Python (Quart), Go (in transition), PostgreSQL.
Frontend
Tech stack: React (TypeScript)
Founders
Founder and CTO @ Maitai. Previously a Tech Lead at Presto (S10), delivering CV-based analytics and LLM-powered voice ordering for enterprise drive-thrus. A true jack of all trades, I have hands-on experience with infrastructure, software development, the machine learning stack, and everything in between.
Open Positions at Maitai (1 Job)
Ready to start your career at Maitai?