SambaNova Systems is a technology company working in artificial intelligence (AI) and machine learning, headquartered in Palo Alto, California. Founded in 2017 by engineers and researchers from Sun Microsystems/Oracle and Stanford University, the company develops AI platforms for large-scale enterprise use, with a focus on generative and agentic AI applications supported by purpose-built hardware and software infrastructure.
Founding
SambaNova was established in 2017 by Rodrigo Liang, Kunle Olukotun, and Christopher Ré; Liang previously led processor engineering at Sun Microsystems and Oracle, while Olukotun and Ré are professors at Stanford University. Its investors include SoftBank Vision Fund 2, BlackRock, Intel Capital, GV (formerly Google Ventures), Walden International, Temasek, GIC, Redline Capital, Atlantic Bridge Ventures, and Celesta, among others.
Platform and Technological Architectures
SambaNova distinguishes itself through its proprietary Reconfigurable Dataflow Unit (RDU), developed as an alternative to traditional GPUs. The RDU organizes computation around the flow of data through the chip rather than around a fixed instruction stream, which the company says yields higher performance and lower power consumption. RDU-based systems support both training and inference as part of an integrated solution.
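The contrast between dataflow execution and conventional kernel-by-kernel execution can be sketched in plain Python. The example below is a conceptual illustration only, under the assumption that the key difference is whether intermediate results are materialized between operators; it does not model SambaNova's actual RDU hardware or compiler.

```python
# Conceptual sketch: streaming values through a fused pipeline of operators
# (dataflow style) versus running each operator to completion and writing
# full intermediate buffers (kernel-by-kernel style).

def kernel_style(xs):
    # Each "kernel" finishes before the next starts, producing intermediates.
    scaled = [x * 2.0 for x in xs]         # intermediate buffer 1
    shifted = [x + 1.0 for x in scaled]    # intermediate buffer 2
    return [max(x, 0.0) for x in shifted]  # final output

def dataflow_style(xs):
    # Operators are fused into one pipeline; each value streams through all
    # stages without intermediate buffers being materialized.
    stages = [lambda x: x * 2.0, lambda x: x + 1.0, lambda x: max(x, 0.0)]
    out = []
    for x in xs:
        for stage in stages:
            x = stage(x)
        out.append(x)
    return out

assert kernel_style([1.0, -2.0]) == dataflow_style([1.0, -2.0])
```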
SambaNova Suite and Its Components
The company’s flagship offering, the SambaNova Suite, is a fully integrated platform designed for large-scale generative and agentic AI. It consists of three main components:
- DataScale® SN40L: A hardware system capable of running multiple large language models (LLMs) simultaneously. It features high memory capacity, a three-tiered memory architecture, and supports up to 5 trillion parameters with model switching times in the microsecond range.
- SambaStudio: A software interface that allows users to train, manage, and control access to models.
- Composition of Experts (CoE): A multi-model architecture that integrates various open-source models, improving accuracy and performance by dynamically routing each task to the best-suited model (see the illustrative sketch after this list).
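The routing idea behind a Composition of Experts can be illustrated with a minimal Python sketch: a lightweight router inspects each prompt and dispatches it to the expert model judged best for the task. The experts, keywords, and routing rule below are hypothetical placeholders, not SambaNova's implementation.

```python
from typing import Callable, Dict

# Hypothetical expert models, each represented here as a simple callable.
EXPERTS: Dict[str, Callable[[str], str]] = {
    "code":    lambda p: f"[code expert] completion for: {p}",
    "finance": lambda p: f"[finance expert] completion for: {p}",
    "general": lambda p: f"[general expert] completion for: {p}",
}

def route(prompt: str) -> str:
    """Pick an expert via a trivial keyword heuristic (placeholder logic)."""
    lowered = prompt.lower()
    if any(k in lowered for k in ("python", "function", "bug")):
        return "code"
    if any(k in lowered for k in ("revenue", "portfolio", "forecast")):
        return "finance"
    return "general"

def answer(prompt: str) -> str:
    # Dispatch the prompt to the selected expert and return its completion.
    return EXPERTS[route(prompt)](prompt)

print(answer("Write a Python function that reverses a string"))
```

In a production CoE, the router itself would typically be a small learned model rather than a keyword rule, and the experts would be full LLMs held resident in memory so that switching between them is fast.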
Samba-1 Model
Introduced in 2024, Samba-1 is a trillion-parameter generative AI model built on the Composition of Experts architecture. It combines open-source models such as Meta's Llama 2, Mistral 7B, and Falcon 40B, routing each user prompt to the most suitable expert. The model allows enterprises to fine-tune on proprietary datasets while keeping that data private, and it can be deployed both in the cloud and on-premises.
AI Applications
The SambaNova platform supports agentic AI systems, in which multiple models operate in coordination to complete complex tasks. Because enterprises retain ownership of the open-source models they customize and train, the platform offers enterprise-level control; the company positions this as an advantage over closed-source alternatives in security, latency, and cost-efficiency.
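A minimal agentic pattern of this kind can be sketched as a planner model that decomposes a task into steps and a worker model that executes each step in turn. The functions below are stand-ins for illustration; a real deployment would replace them with calls to hosted LLM endpoints or tools.

```python
# Hypothetical planner/worker loop illustrating multi-model coordination.

def planner(task: str) -> list[str]:
    # Placeholder: a real planner would be an LLM that emits a step list.
    return [f"research: {task}", f"draft: {task}", f"review: {task}"]

def worker(step: str) -> str:
    # Placeholder: a real worker would be an LLM or tool call per step.
    return f"completed '{step}'"

def run_agent(task: str) -> list[str]:
    # Execute each planned step in order and collect the results.
    return [worker(step) for step in planner(task)]

for line in run_agent("summarize quarterly sales data"):
    print(line)
```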
Use Cases
Through SambaNova Cloud, developers can access high-speed inference endpoints for open-source models such as Llama and DeepSeek. Notable use cases include:
- InvestIQ – an investment strategy development platform
- Eveplora – a global hackathon tracking application
- Pokémon NPC Companion – a game assistant generator
- Food Spotlight – a food label analysis tool
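Developer access of this kind typically follows the OpenAI-compatible chat-completions pattern. The sketch below shows that pattern in Python; the base URL, environment variable name, and model identifier are assumptions for illustration and should be checked against SambaNova Cloud's current documentation.

```python
import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ["SAMBANOVA_API_KEY"],   # assumed environment variable name
    base_url="https://api.sambanova.ai/v1",    # assumed endpoint for SambaNova Cloud
)

response = client.chat.completions.create(
    model="Meta-Llama-3.1-8B-Instruct",        # assumed model identifier
    messages=[
        {"role": "user",
         "content": "Summarize dataflow architectures in one sentence."},
    ],
    temperature=0.2,
)
print(response.choices[0].message.content)
```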
International Collaborations
SambaNova partnered with SoftBank to build new AI data centers in the APAC region, offering accelerated inference services using open-source models. The company has also expanded its global reach through partnerships with STC, Hugging Face, Continue, and BlackBox AI.
Future Outlook
SambaNova aims to deliver scalable generative and agentic AI applications to enterprises by unifying fast, secure hardware and software infrastructure in a single platform. Its technological roadmap focuses on model ownership, privacy, regulatory compliance, performance, and efficiency. As of 2025, the company promotes inference speed and energy efficiency as its key differentiators, and its infrastructure is designed to support AI deployments across government agencies, scientific research institutions, and commercial enterprises.