About Analog Inference
Analog Inference builds artificial intelligence (AI) inference accelerators based on analog in-memory computing technology, offering a range of hardware acceleration products for neural computing. The company was founded in 2018 and is based in Santa Clara, California.
ESPs containing Analog Inference
The ESP matrix leverages data and analyst insight to identify and rank leading companies in a given technology landscape.
The AI processors market offers solutions for the increasing demand for artificial intelligence in various industries, including finance, retail, and autonomous vehicles. These processors provide high computing power and low power consumption at an advantageous cost, making AI enablement more accessible. Vendors in this market offer full-stack solutions integrating software and hardware, compatibl…
Expert Collections containing Analog Inference
Expert Collections are analyst-curated lists that highlight the companies you need to know in the most important technology spaces.
Analog Inference is included in 1 Expert Collection, including Semiconductors, Chips, and Advanced Electronics.
Semiconductors, Chips, and Advanced Electronics
Companies in the semiconductors & HPC space, including integrated device manufacturers (IDMs), fabless firms, semiconductor production equipment manufacturers, electronic design automation (EDA), advanced semiconductor material companies, and more
Latest Analog Inference News
May 26, 2023
Analog Inference: Lower Power, Higher Performance
Analog Inference is enabling powerful edge AI using analog computing.
As artificial intelligence (AI) and machine learning (ML) continue disrupting many industries, companies are increasingly turning to compute-intensive applications. However, sustaining the enormous amount of processing these workloads require is difficult because existing hardware cannot meet tight power budgets, especially in edge markets such as surveillance, smart retail, smart cities, mobile phones, and other edge devices. Alumni Ventures portfolio company Analog Inference is helping meet the high-performance, low-power needs of edge AI with next-generation technology that is radically superior to current digital computing solutions. The company's AI chips deliver far better performance per watt than traditional computing options, which is critical for enabling proper AI computation at the edge of networks and devices.
Momentous Market and Superior Product
Analog Inference is designing more affordable AI coprocessor chips that boast exceptional performance while requiring less power than current solutions. These chips have distinct advantages over incumbents in the marketplace, such as:
- Order-of-magnitude better performance per watt than traditional solutions
- A mature process node enabling GPU-class performance at edge cost
- Large capacity enabling data-center-grade neural nets at the edge
One of Analog's initial target markets is video, specifically surveillance players and other companies needing to run video analytics at the edge. However, the company's technology has the potential to serve other fast-growing markets, including autonomous vehicles, edge-cloud inference, and mobile phones. The company already has engagements with prominent companies across these industries.
Knowledgeable Lead Investor and Experienced Team
Analog is backed by an established investor syndicate, including Khosla Ventures. The well-known VC firm led the company's seed round and holds two board seats. Founder Vinod Khosla is one of the most respected Silicon Valley entrepreneurs and venture capitalists. Besides Khosla Ventures, he has founded multiple companies, including Daisy Systems and Sun Microsystems (acquired by Oracle in a deal worth $7 billion). He also incubated several other startups (Juniper Networks, Nexgen) and now acts as an advisor to Analog Inference.
Analog's team has deep technical expertise in semiconductors, AI, and analog computing. CEO Carey Kloss has been involved in the AI accelerator sector since 2014 and has deep expertise in AI and neural computing. Before joining Analog Inference in 2020, he was employee #1 at the leading AI company Nervana Systems and led full-stack engineering for Intel's data-center NNP AI products. Founder, President, & COO Vishal Sarin has led the creation of groundbreaking analog and AI technology at Micron, Fast-Chip, Information Storage Devices, SST/Microchip, and National Semiconductor. He holds over 100 patents.
How We Are Involved
Blue Ivy Ventures (for the Yale community) sponsored Alumni Ventures' investment in the company's $10.6 million Series A. Alumni Ventures secured an allocation through a connection with Khosla Ventures, which led the round. Sibling funds Green D Ventures (for the Dartmouth community), Strawberry Creek Ventures (for the UC Berkeley community), and Triphammer Ventures (for the Cornell community) also participated in the round, along with AV's Deep Tech Fund and Total Access Fund.
Analog Inference Frequently Asked Questions (FAQ)
When was Analog Inference founded?
Analog Inference was founded in 2018.
Where is Analog Inference's headquarters?
Analog Inference's headquarters is located at 2350 Mission College Boulevard, Santa Clara.
What is Analog Inference's latest funding round?
Analog Inference's latest funding round is Series B.
How much did Analog Inference raise?
Analog Inference raised a total of $50.51M.
Who are the investors of Analog Inference?
Investors of Analog Inference include Khosla Ventures, Athena Capital Advisors, MFV Partners, TDK Ventures and Green D Ventures.
Who are Analog Inference's competitors?
Competitors of Analog Inference include Mipsology, Mythic, ANAFLASH, Hailo, Neural Magic and 9 more.
Compare Analog Inference to Competitors
Groq designs chips based on its Tensor Streaming Processor (TSP) architecture: a single large processor with hundreds of functional units. The architecture reduces instruction-decoding overhead and handles both integer and floating-point data. The company was founded in 2016 and is based in Mountain View, California.
Cerebras is a computer systems company dedicated to accelerating deep learning. The Wafer-Scale Engine (WSE) is at the heart of its deep learning system, the Cerebras CS-1. The WSE delivers more compute, more memory, and more communication bandwidth, enabling AI research at high speed and scale.
SiPearl designs Rhea, a low-power microprocessor for European exascale supercomputers. The company was founded in 2019 and is based in Maisons-Laffitte, France.
Speedata provides analytics and database processing technology solutions. It develops a computer processor for big data and analytics workloads. The company was founded in 2019 and is based in Netanya, Israel.
Blaize provides a software platform that spans the complete edge artificial intelligence (AI) operational workflow from idea to deployed application, optimizing AI based on the location of data. The platform collects and processes data from the edge to the core, with a focus on automotive, smart vision, and enterprise computing markets. It was formerly known as ThinCI. It was founded in 2010 and is based in El Dorado Hills, California.
Graphcore provides artificial intelligence (AI) systems and services for organizations to build, train, and deploy models in the cloud using its intelligence processing unit (IPU) hardware. It offers products such as cloud IPUs, data-center IPUs, Bow IPUs, and more. The company serves the finance, biotech, scientific research, and consumer internet sectors. It was founded in 2016 and is based in Bristol, United Kingdom.