Kai is the AI company rebuilding cybersecurity for the machine-speed era. Founded by second-time founders and trusted by Fortune 500 enterprises, Kai is building a future where security has no categories, no silos, and no human-speed bottlenecks. The Kai Agentic AI Platform replaces fragmented, human-limited workflows with agentic AI systems that continuously contextualize, assess, reason, and execute security work at machine speed, making human defenders superhuman.
Why Join Kai
- Well-funded: With $125M raised, we have the capital, runway, and resolve to rebuild cybersecurity from first principles.
- Proven: We've earned the trust of Fortune 500 and Global 1000 companies, and we're just getting started. Their confidence in Kai reflects what we've built: an AI-powered cybersecurity platform that performs at the scale and speed the enterprise demands.
- Experienced founders: Our founding team consists of second-time entrepreneurs, each with over 20 years of experience in the cybersecurity industry. Their proven expertise and vision drive our ambitious goals.
- World-class leadership team: Our Heads of AI, Engineering, and Product bring extensive experience from some of the world’s most influential companies, ensuring top-tier mentorship, direction, and vision.
- Frontier AI Applied Research Team: Our researchers operate at the leading edge of agentic AI systems, translating breakthrough capabilities into real-world cybersecurity applications.
- Generous compensation: We offer highly competitive salaries, equity options, and a supportive work environment. Your contributions will be valued and rewarded as we grow together.
You Have:
- 10+ years of software engineering experience, including 7+ years as a Data Engineer
- Proven experience building data lakes and extracting diverse data efficiently for downstream use
- Strong data engineering and data modeling skills
- Expertise in ETL, data warehousing, and data analytics
- Deep expertise in one or more data stores (SQL, NoSQL), with production experience in large GraphDB deployments
- Hands-on experience with big data technologies such as Spark and Delta Lake
- Cloud experience (AWS / Azure / GCP), Azure preferred
- Skilled in developing scalable APIs and containerizing them using Docker and Kubernetes
- Proficiency in Python and SQL; experience with Go or a similar language (e.g., Java, C++)
- Familiarity with Terraform and Infrastructure as Code (nice to have)
- General networking knowledge: TCP/IP, HTTP, REST
- Ability to work cross-functionally, including collaboration with AI teams, in a fast-paced environment with evolving requirements
- Bachelor’s or Master’s degree in Computer Science, Data Science, or a related field