Cohere is a natural language processing (NLP) company. Its services let businesses integrate artificial intelligence (AI) into their products, with capabilities such as generating text for product descriptions, blog posts, and articles; understanding the meaning of text for search and content moderation; and summarizing text and documents. Cohere primarily serves the enterprise sector, providing AI solutions that can be customized for various use cases, domains, and industries. It was founded in 2019 and is based in Toronto, Canada.
ESPs containing Cohere
The ESP matrix leverages data and analyst insight to identify and rank leading companies in a given technology landscape.
The generative AI — large language model (LLM) developers market offers foundation models and APIs that enable enterprises to build natural language processing applications for a number of functions. These include content creation, summarization, classification, chat, sentiment analysis, and more. Enterprises can fine-tune and customize these large-scale language models — which are pre-trained on …
Cohere's Products & Differentiators
Access massive language models that can understand text and take appropriate action — like highlight a post that violates your community guidelines, or trigger accurate chatbot responses. Classify uses cutting-edge machine learning to analyze and bucket text into specific categories. Build automated text classifiers into your application to do things like identify toxic language, automatically route customer queries, or detect breaking trends in product reviews.
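The workflow described above, training on labeled examples and then bucketing new text, is a standard text-classification pattern. The sketch below illustrates it with a tiny hand-rolled naive Bayes classifier in plain Python; it is a generic stand-in, not Cohere's Classify API, and the example queries and category labels are invented.

```python
import math
from collections import Counter, defaultdict

def train(examples):
    """Count word frequencies per label from (text, label) pairs."""
    word_counts = defaultdict(Counter)
    label_counts = Counter()
    for text, label in examples:
        label_counts[label] += 1
        word_counts[label].update(text.lower().split())
    return word_counts, label_counts

def classify(text, word_counts, label_counts):
    """Return the label with the highest naive Bayes log-probability."""
    vocab = {w for counts in word_counts.values() for w in counts}
    total = sum(label_counts.values())
    best_label, best_score = None, float("-inf")
    for label in label_counts:
        # log prior plus summed log likelihoods with add-one smoothing
        score = math.log(label_counts[label] / total)
        denom = sum(word_counts[label].values()) + len(vocab)
        for word in text.lower().split():
            score += math.log((word_counts[label][word] + 1) / denom)
        if score > best_score:
            best_label, best_score = label, score
    return best_label

# Made-up labeled examples for routing customer queries
examples = [
    ("my package never arrived", "shipping"),
    ("where is my order", "shipping"),
    ("i was charged twice", "billing"),
    ("refund my payment", "billing"),
]
wc, lc = train(examples)
print(classify("order still not arrived", wc, lc))  # prints: shipping
```

In practice a hosted model replaces the hand-rolled statistics, but the interface has the same shape: labeled examples in, a predicted category out.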
Research containing Cohere
Get data-driven expert analysis from the CB Insights Intelligence Unit.
CB Insights Intelligence Analysts have mentioned Cohere in 18 CB Insights research briefs, most recently on Nov 21, 2023.
Expert Collections containing Cohere
Expert Collections are analyst-curated lists that highlight the companies you need to know in the most important technology spaces.
Cohere is included in 6 Expert Collections, including Unicorns - Billion Dollar Startups.
Unicorns - Billion Dollar Startups
Companies developing artificial intelligence solutions, including cross-industry applications, industry-specific products, and AI infrastructure solutions.
Digital Content & Synthetic Media
The Synthetic Media collection includes companies that use artificial intelligence to generate, edit, or enable digital content under all forms, including images, videos, audio, and text, among others.
Generative AI 50
CB Insights' list of the 50 most promising private generative AI companies across the globe.
Companies working on generative AI applications and infrastructure.
Latest Cohere News
Nov 24, 2023
SPECIAL REPORT: THE BATTLE FOR AI SUPREMACY by Paul Gillin

The speed with which Microsoft Corp. hired former OpenAI LP Chief Executive Sam Altman last weekend, along with the firm’s co-founder and an unspecified number of technical specialists, illustrates how high the stakes are in the $60 billion market for cloud artificial intelligence products and services.

Microsoft, which by all accounts is the early leader in a market that’s expected to grow 40% annually for the next several years, was clearly focused on protecting its $13 billion investment in OpenAI by ensuring that the brain trust behind ChatGPT and other generative AI products stayed in Microsoft’s inner circle. For the moment, at least, there’s a lot to protect.

“Microsoft has the early-mover advantage because of its investment in OpenAI and is bringing a lot of OpenAI models into the Azure cloud,” said Arun Chandrasekaran, distinguished vice president analyst at Gartner Inc. “They have so many business applications that touch practically every user that they want to take advantage of their incumbency to bake AI into existing workflows and applications.”

Microsoft calls those AI assistants “copilots,” and it has left no doubt that they will anchor its AI strategy going forward. Forrester Research Inc. expects nearly 7 million U.S. knowledge workers to adopt Microsoft copilots over the next year. “We are the copilot company,” Microsoft Chief Executive Satya Nadella said in his keynote speech at the company’s Ignite conference earlier this month. “Everyone will have a copilot for everything you do.”

First movers and fast followers

But does an early lead indicate that Microsoft can dominate the cloud AI market for the long term? Most observers say no. Predicting anything in a technology climate that turns out breathtaking innovations almost weekly is futile.
“With the exponential leaps in technology we have today, holding on to the leadership position is nearly impossible,” said Erin Boelkens, vice president of product at LiveRamp Holdings Inc.

AI is still in its earliest stages of commercial development, and its ripple effects are entirely unpredictable. “This is a turning point in the way Google was in the ’90s,” said Benjamin Lee, a computer science professor at the University of Pennsylvania who worked at Microsoft in 2008 and 2009. “It’s qualitatively changing the way we receive information. Instead of as a series of web links, we’re receiving synthesized text.”

Most market watchers agree that the early-mover phenomenon that rocketed Amazon Web Services Inc. to dominance in the public cloud is unlikely to play out the same way in AI. For one thing, all the major cloud providers already have well-established AI portfolios. Although the success of ChatGPT may have accelerated their urgency to define a strategy, none was unprepared to respond to the recent surge in customer interest.

AI also shows no signs of being a winner-take-all market. The generative AI segment, which is just one small part of the broader market, has numerous proprietary and open-source large language models for customers to choose from, with more on the way. Cloud providers’ strategies differ substantially and play to existing strengths and customer preferences. Companies such as IBM Corp. and Oracle Corp., both of which trail the big three in public cloud market share, have spoiler potential. Open-source options are also evolving quickly and promise to prevent the vendor lock-in issues that vexed many early cloud adopters.

SiliconANGLE contacted numerous hyperscalers and market watchers to size up the AI prospects of the public cloud leaders. They told us all have much to gain, even if they don’t dominate.
Microsoft: Driving from the desktop

Microsoft’s rapid embrace of OpenAI and pivot to the copilot strategy illustrate how effectively Nadella has shaken the desktop software giant out of its decade-long torpor under former CEO Steve Ballmer.

The highlight of Microsoft’s recent Ignite conference was the announcement of the company’s first specialized AI chips, filling a crucial gap in its AI strategy. Azure Maia is intended to support generative workloads such as LLMs and GitHub Copilot. The Azure Cobalt CPUs are built on an Arm architecture and aimed at delivering optimal performance and energy efficiency for common workloads. The announcements bring Microsoft closer to parity with rivals Google LLC and Amazon, which have had specialized chipsets for years.

Microsoft has earned praise for the speed with which it jumped on the generative AI opportunity early this year. Lee said three major moves by the software giant – its 2018 acquisition of open-source development platform GitHub, the OpenAI investment and the blockbuster purchase of video game maker Activision Blizzard Inc. earlier this year – “all lay the foundation for big shifts. Under new leadership they seem to be setting aside the sacred cows and taking bigger risks,” he said.

Although Microsoft has said its AI investments won’t show up on the bottom line this year, analysts attributed about three percentage points of the Azure cloud’s 29% growth in the most recent quarter to generative AI business. Nadella said Microsoft has signed up more than 18,000 customers for its Azure OpenAI services in just 10 months.

“Microsoft is absolutely the leader” in AI right now, said Kyle Campos, chief technology officer at CloudBolt Software Inc. “OpenAI’s production-level decisions around user experience have captured imaginations and it’s hard to believe that will slow down anytime soon.
OpenAI is setting the pace.”

Chandrasekaran said the OpenAI partnership will lift Microsoft’s cloud strategy across the board. “We’re seeing companies willing to give Azure a chance because they want to take advantage of the OpenAI models,” the Gartner analyst said.

Although Azure’s market share crept up only 1% over the past year, Ensono LP Cloud Evangelist Gordon McKenna believes estimates are understated. “The published share-of-market figures aren’t the same as what I’m seeing,” he said. “I think a lot of the net new business is coming to Microsoft and that will accelerate with copilots.”

The copilot strategy plays to Microsoft’s desktop dominance, giving it a unique opportunity to seed hundreds of millions of desktops with advisers it controls and create upsell opportunities for its enterprise applications. Copilots will no doubt also be a hit with buyers in the $250 billion global video game market. Microsoft’s decades-long relationship with senior information technology executives also works to its advantage. “Given its traditional cornerstone in large enterprise IT departments, we think that they’ll likely have significant and durable first mover advantage,” said Gregg Hill, co-founder of Parkway Venture Capital LLC.

Amazon goes wide and deep

Despite being perceived as a generative AI laggard behind Microsoft, AWS has been plenty busy in the AI market this year, as in previous years. Last spring it rolled out several new tools for generative AI training and deployment on its cloud platform. Early this summer, it invested $100 million in a program connecting customers to AI experts who will help them build generative AI models. In July it expanded its Bedrock managed AI foundational model service to support new models from Cohere Inc., Anthropic PBC and Stability AI Ltd.
Not slowing down, AWS in September invested $4 billion in Anthropic in a deal that will see the OpenAI rival build its Claude 2 LLM on AWS infrastructure, including the proprietary Trainium and Inferentia chipsets. On Halloween, it debuted a new consumption model that lets customers reserve access to large numbers of hard-to-find graphics processing units for short-duration AI projects. And earlier this month it launched a free training initiative to equip 2 million people with AI skills by 2025.

Although seen as playing catch-up to Microsoft in generative AI, AWS has been no laggard in bringing technology to market. Its SageMaker managed service for machine learning model development was introduced six years ago. Last year it added CodeWhisperer, an AI-powered rival to GitHub Copilot. Expect it to employ its usual shock-and-awe tactics at next week’s re:Invent conference.

AI for all

AWS plans to double down on the strategy that propelled it to market leadership in the cloud: putting resources into the hands of startups and companies that can’t afford the high costs of model training and inferencing, said Matt Wood, AWS vice president of product. “SageMaker took machine learning, which has traditionally only been within reach of well-funded companies or government agencies, and put it into the hands of tens of thousands of customers,” he said. “That was directly in line with our mission when we started AWS.”

The strategy of seeding the startup market may be less fruitful this time given Microsoft’s early lead, but AWS is in a good position to scoop up some business from companies that are wary of posting strategic projects on Azure, noted Chandrasekaran. “Microsoft is so broad-based that many vendors of technology solutions compete at some level with them,” he said.
In contrast to its rival’s decision to put all its chips on OpenAI — a bet that suddenly looks a little risky — AWS is positioning itself as an LLM Switzerland by supporting models from Anthropic, AI21 Labs Ltd., Cohere, Stability AI and Meta Platforms Inc. as well as open-source models from Hugging Face Inc. and its own. It’s also reported to be working on a massive 2 trillion-parameter LLM code-named “Olympus.”

Amazon says plenty of AI development is already happening on its infrastructure. “The majority of machine learning unicorns are running on AWS,” Wood said, citing the 40 billion-parameter Falcon40B as an example. Its strategy emphasizes customer choice with built-in privacy and simple access via an application program interface. “We’re the only place where you have privacy, security, access to the largest set of models and customization,” Wood said.

Experts say you can never count AWS out as a competitor, given its massive market share and customer loyalty. “There’s a perception that AWS is late to market, but technology-wise, I think they’re going to do much better,” said Bill Wong, a principal research director and head of the AI and data analytics practice at Info-Tech Research Group Inc. AWS’ strength is that “a lot of corporate data is already on their platform,” he said. “Microsoft will always be challenged because they’re tied at the hip to SQL Server, which nobody uses as an enterprise database.”

“We’ve seen first-hand how AWS can mobilize its customer base towards a new set of technologies and we believe it’s in a great position to deliver raw AI capabilities through infrastructure-as-a-service,” said Leon Kuperman, chief technology officer at Cast.AI Group Inc., which develops cloud optimization technology.

AWS’ Wood scoffed at the notion that Amazon is a latecomer to the generative AI party.
“You’d be pretty remiss to pick a winner of a 10K race three steps in,” he said.

Google struggles to regain early lead

In theory, the cloud AI market should have been Google’s to lose. The company built one of the first LLMs – Bidirectional Encoder Representations from Transformers, or BERT – which has since become a standard for natural language processing. Its TensorFlow machine learning framework is widely recognized as one of the leading platforms for deep learning, which is widely used in image and speech recognition. Google’s 2014 acquisition of DeepMind Technologies Ltd. set the stage for subsequent breakthroughs in weather forecasting and protein folding. Its Transformer neural network architecture for language understanding is a foundational technology for today’s LLMs. Its AlphaGo program beat the world’s top Go player in 2017, a feat that at the time was considered beyond the scope of machines.

Google was also the first cloud provider to introduce dedicated AI processing chips. Its Tensor Processing Units, which are now in their fifth generation, are highly regarded by AI startups, said Philip Moyer, vice president of Google Cloud’s global AI business. “Seventy percent of AI unicorns are running on Google Cloud,” he said. “These are organizations that work very close to the metal and processor choice is important.”

But Google was seemingly caught flat-footed by Microsoft’s rapid embrace of OpenAI and integration of ChatGPT into its Bing search engine. Google’s rival Bard technology was reportedly rushed to market despite quality concerns and an embarrassing demo that briefly knocked $100 billion off the company’s market capitalization. “You would have thought they would be the leaders in this space,” said CloudBolt’s Campos.
“To be a fly on the wall in that boardroom when they went over some of the missed opportunities would have been interesting.”

Moyer noted that Bard made its debut just 16 weeks after ChatGPT.

Strength in apps

Despite a rather rough 2023, the company is playing a long game, and its widespread AI integration across compute, storage and applications is considered a strength, as is the choice of models it offers. “One of the core tenets of our strategy is that we don’t believe one model will rule them all,” Moyer said. “We have over 100 foundation models on our platform. We believe in using the right model for the right job.”

Google’s foundation models include voice, coding, medicine and cybersecurity as well as most of the commercial and open-source LLMs. “Customers are excited that we’re innovating our own models but also about the fact that we treat third-party and open-source models as first-class citizens on our platform,” he said. Google researchers have published more than 7,000 papers about various AI models, “so we have this deep commitment of open sourcing and sharing and making sure that these models are explainable and transparent,” Moyer said.

Google believes it is unique in offering a choice of GPU and tensor processing unit chips, a wide variety of models, management tools and what it calls an adapter layer that ensures that customer data isn’t inadvertently incorporated into model training. “From day one our model is frozen and customers keep the adapter layer inside their tenant so all of their training and output is theirs,” Moyer said.

Like Microsoft, Google has a set of widely used productivity applications that it can use to introduce AI capabilities and hook users.
“By owning the entirety of these solutions and platforming them uniformly, Google can seamlessly infuse AI capabilities across its suite,” Dave Vellante, George Gilbert and Rob Strechay of SiliconANGLE Media’s theCUBE Research team wrote in a recent analysis. “This integration surpasses even Microsoft’s Office 365 with OpenAI thanks to Google’s cohesive platform design.”

Daniel Saroff, group vice president of consulting and research at International Data Corp., said Google’s broad end-user reach through its dominant browser and mobile operating system is potentially even greater than Microsoft’s. “It’s been talking about embedding AI in Chrome and Android and it’s been working on generative AI longer than just about anyone,” he said.

Moyer said Duet AI, Google’s version of a copilot, has already been broadly implemented throughout its productivity applications and cloud infrastructure. “If you’re a software developer, we can make recommendations for how to detect security vulnerabilities in your code,” he said. “If you’re an operator, we can suggest how you can set up controls or make your database queries more efficient.”

Google’s widely acknowledged analytics expertise, track record of breakthrough research and knack for wowing crowds with its demos all work to its advantage. It is also known as a good company to partner with. “Folks looking for strategic relationships in developing models tend to favor Google,” said Info-Tech’s Wong.

Google has been less active than its rivals this year, but it’s hardly standing still. It announced support for generative AI models in its Vertex AI suite of machine learning services, outlined a system that can generate high-fidelity music based on text commands and showed how dozens of partners are integrating its technology into their own products.
It also joined Microsoft and IBM in addressing a major enterprise AI concern by indemnifying customers against the use of third-party data in AI training models. Last month it followed Amazon’s lead by investing $2 billion in Anthropic in an odd-bedfellows arrangement apparently aimed at mutual foe Microsoft.

Despite its advantages, Google’s track record of killing off projects, sometimes abruptly, may work against it. “Google has a challenging history of false starts and early abandonment, which lowers my faith in placing big bets,” said CloudBolt’s Campos.

Spoilers: IBM, Oracle and niche players

It has been 12 years since IBM’s Watson question-answering computer trounced the world’s top players in a televised round of Jeopardy! in what was, at the time, the most impressive public demonstration yet of AI’s capabilities. Despite IBM’s predictions that Watson would become a ubiquitous digital assistant across a wide range of industries, the technology never lived up to its billing and came to be seen as a disappointment.

With the introduction of watsonx in March, IBM may finally have the AI breakthrough it has long sought. The platform combines a studio for building generative AI and machine learning foundation models, a data store based on an open data lakehouse architecture, and a set of tools for building AI workflows. That makes watsonx a “breakthrough opportunity,” says Vellante.

“It’s really, really good,” Vellante said on a recent episode of theCUBE’s weekly podcast. Describing his recent briefing at IBM’s Thomas J. Watson Research Center, Vellante said he came away “really impressed. They have a robust stack from silicon all the way through the analytics up to the [independent software vendors]. They’ve got AI chips, they’ve got partnerships. I think IBM finally got it right.”

Enterprise Technology Research data indicates that watsonx may be a winner.
It showed that the percentage of new customers adopting it jumped from 3.6% in July 2022 to 13% in the most recent October survey. Customer churn also dropped by more than half.

Going vertical

IBM’s AI strategy is similar to its approach to the cloud, said Hillery Hunter, chief technology officer of IBM Cloud. It addresses vertical use cases in large enterprises with a platform that emphasizes security, privacy protection and openness. IBM sees a huge opportunity in watsonx Code Assistant, a generative AI tool that will help organizations convert aging Cobol code to Java when it becomes generally available later this quarter. “There are billions of lines of Cobol still in use,” Hunter said. “This is a way for organizations to accelerate their digital transformation journey.”

The company is also bidding to cultivate a startup ecosystem with the launch of a $500 million venture fund targeting AI startups. And it has partnered with and invested in Hugging Face to provide access to that company’s open-source neural networks and data sets via watsonx. “IBM is leaning super-hard into this, probably because they moved slowly on the cloud and got left behind,” said Ensono’s McKenna. “They see this as something where they could be a player.”

It’s about the apps

Oracle has largely stayed out of the consumer-led generative AI fray, preferring to play to its strengths in enterprise applications and database management. “There’s a lot of noise in the consumer space,” said Leo Leung, Oracle’s vice president of products and strategy. “Our focus is on the B2B side and that’s perhaps a little less noticed.” Oracle has been quietly folding generative AI capabilities into its Fusion Cloud enterprise cloud applications as well as its NetSuite family aimed at smaller businesses.
“Businesses are going to use AI a lot in specific use cases,” Leung said, noting that generative features were added to the customer service, marketing and sales components of the Fusion suite in September. “You’ll be hearing a lot more about that from us than about consumer apps.”

Oracle has “depth and breadth when it comes to working across industries as well as human resources and financial management,” Leung said. “AWS doesn’t really play in the apps space. Their focus is on builders. A lot of the AI that businesses are going to use will be in their applications.”

Partnerships have so far played an outsized role in Oracle’s strategy. In a joint announcement in October, Oracle and Nvidia Corp. said the GPU maker’s AI supercomputing platform, Nvidia DGX Cloud, will be available in the Oracle Cloud, giving Oracle customers an inside edge on access to scarce GPUs. The two companies also partnered on an investment round in Cohere in June. Two months ago Oracle expanded its relationship with Microsoft to make Oracle’s Exadata database platform available in the Azure cloud. Compared with the hyperscale leaders, “I think we’re a better partner,” Leung said. “Companies that are looking to leverage their focus on the enterprise will find that we are a way to reach those customers.”

Between Oracle and MySQL, the company also has the top commercial and open-source database management systems. Both will soon support vector storage, which is considered critical to building generative AI applications. Cloud business also tends to migrate to where data is, a dynamic that has not been lost on the DBMS leader. “Data gravity will play a role,” Leung said.

Despite its many strengths, Oracle’s late start in the cloud is working against it, said Info-Tech’s Wong. “I’ve talked to literally hundreds of customers, and nobody’s said that Oracle was their go-to,” he said.
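The vector storage mentioned above holds embeddings: numeric vectors representing text, so that semantically similar items can be retrieved by vector math rather than keyword matching. A minimal sketch of the core retrieval step, using made-up three-dimensional vectors in place of real model embeddings:

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors; 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def nearest(query, store):
    """Return the stored key whose vector is most similar to the query."""
    return max(store, key=lambda key: cosine_similarity(query, store[key]))

# Toy "vector store": in practice these would be model-generated embeddings
# of documents, kept in a database with an index for fast similarity search.
store = {
    "invoice policy": [0.9, 0.1, 0.0],
    "shipping times": [0.1, 0.8, 0.2],
    "api reference":  [0.0, 0.2, 0.9],
}
query = [0.2, 0.9, 0.1]  # embedding of a shipping-related question
print(nearest(query, store))  # prints: shipping times
```

Real vector databases add indexing structures so this nearest-neighbor lookup stays fast over millions of embeddings, which is why DBMS vendors treat it as a storage feature.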
Open question

Experts said it’s wise not to underestimate the outsized impact of open-source foundation models and toolsets. Although providers all have their own development stacks, the Python language and frameworks like PyTorch and TensorFlow are firmly established and support large communities of developers. “What most vendors won’t share is that the adoption of the tools is low,” said Info-Tech’s Wong. “Most people are using open-source tooling. I’d guess that less than 10% of their customers are using the tools they push so hard.”

Unlike the database and application markets, which were dominated by commercial developers from the start, AI technology has been incubated in educational institutions. “I don’t think we’ve seen a technology where people are waiting with bated breath for what comes out of universities,” Wong said. “When it comes to data science, commercial vendors don’t really stand out. The open-source community has done a great job.”

Cast AI’s Kuperman agreed that open source is hard-coded into AI. “The open-source model movement heavily relies on Python frameworks and libraries,” he said. “Hyperscalers can do very little to change the direction of the overall open-source movement.”

Open source is a defense against lock-in and has cost benefits as well, said Google’s Moyer. “Many open-source models are distilled from larger models, have fewer parameters and cost less,” he said. “Open source is great for enterprises that don’t want the world’s knowledge but just want to get something done quickly.”

The current state of much commercial AI technology is also immature, experts say. Organizations should be wary of adopting toolsets that were rushed to market to take advantage of the generative AI craze, said Kjell Carlsson, head of AI strategy at Domino Data Lab Inc., which makes a data science platform.
“You effectively need what you need in the data science world, which is a data science platform that enables you to conduct and monitor every step of the development and deployment process,” he said. “None of the hyperscalers offers anything that is vaguely decent. They’re a collection of point services that they lump together with a brand, but nothing’s integrated.” As a result, he said, “I think in a year you’ll be writing about how many projects fail and how vendors pushed us in that direction.”

Small but mighty

J.J. Kardwell likes to think the generative AI rush will have outsized benefits for companies such as Vultr, a cloud service of The Constant Company LLC, where he is CEO. The ability of smaller cloud providers to move the newest generation of infrastructure hardware quickly into production gives them an edge over slower-moving cloud giants, he said. For example, Vultr has already deployed Nvidia’s newest GH200 GPU into its 32 cloud data centers, while Microsoft has only committed to offering GH200-based instances sometime next year.

With the newest versions of GPUs delivering three to five times the performance of their predecessors, the months-long lag times hyperscale giants require to provision new infrastructure put them at a disadvantage, Kardwell said. “We are in such a steep part of the performance curve that [being early] matters a ton,” he said. “Hyperscalers are slow-moving businesses that take a year or more to onboard a new technology. Being a year behind is a massive disadvantage.”

Without offering specifics, Kardwell said Vultr’s business “is growing dramatically faster than the overall cloud market. This is an incredible moment for independent cloud computing companies that stand to benefit disproportionately,” he said.
AI democratization

Regardless of who has the early lead, few people expect any one player to dominate the AI market in the cloud. Vigorous competition, a vibrant open-source ecosystem and a sharp increase in venture capital investments in AI companies will keep the competitive landscape fluid, which ultimately benefits buyers.

“The future of AI is not about one model. It’s multi-model with models applied to specific use cases,” Dario Gil, IBM’s director of research, noted in an IBM AI Academy video. Noting that more than 325,000 open-source AI models are already available, he said the opportunity is too big to accrue to just a small number of vendors. “For the good of society in the long term, we don’t want just a few winners. We’re going to see the democratization of AI,” Gil said.

In the battle for AI supremacy, in other words, everyone may ultimately be the winner.
Cohere Frequently Asked Questions (FAQ)
When was Cohere founded?
Cohere was founded in 2019.
Where is Cohere's headquarters?
Cohere's headquarters is located at 171 John Street, Toronto.
What is Cohere's latest funding round?
Cohere's latest funding round was a Corporate Minority round.
How much did Cohere raise?
Cohere raised a total of $469M.
Who are the investors of Cohere?
Investors of Cohere include SAP, Index Ventures, Salesforce Ventures, Thomvest Ventures, Schroders Capital and 16 more.
Who are Cohere's competitors?
Competitors of Cohere include Aleph Alpha, Primer, Hugging Face, Sana Labs, Vectara and 7 more.
What products does Cohere offer?
Cohere's products include Classify and 2 more.
Compare Cohere to Competitors
Anthropic provides artificial intelligence (AI) safety and research services specializing in developing general AI systems and language models. Its research fields include areas such as natural language, human feedback, scaling laws, reinforcement learning, code generation, and interpretability, spanning policy and societal impact analysis. Its products are primarily targeted toward businesses seeking to leverage AI technology. It was founded in 2021 and is based in San Francisco, California.
OpenAI is a company focused on artificial intelligence (AI) research and deployment. The company offers a platform that provides access to its latest AI models and guides for safety best practices. They primarily serve the technology industry. It was founded in 2015 and is based in San Francisco, California.
AI21 Labs operates as an artificial intelligence (AI) lab and product company. The company offers a range of AI-powered tools, including a writing companion tool to assist users in rephrasing their writing, and an AI reader that summarizes long documents. It also provides language models for developers to create AI-powered applications. It was founded in 2017 and is based in Tel Aviv-Yafo, Israel.
deepset offers enterprise machine learning and natural language processing (NLP) products and solutions. It enables developers to use language models and transfer learning techniques for their individual tasks. It also helps enterprises build, run, and maintain production-ready NLP applications. It was founded in 2018 and is based in Berlin, Germany.
Textify is a company focused on artificial intelligence and natural language processing in the technology sector. The company offers an AI membership platform that provides access to a wide range of AI tools, including predictive writing tools, text analysis and summarization tools, and AI-enabled text discovery tools. These tools are designed to simplify the process of building, distributing, and monetizing AI solutions. It was founded in 2021 and is based in Indore, India.
Symbl.ai is a conversational intelligence application programming interface (API) platform. It offers sales intelligence, customer service intelligence, human resource intelligence, meeting intelligence, communications intelligence, and more for enterprises. The company was founded in 2018 and is based in Seattle, Washington.