
This profile is currently unclaimed by the company. All information is provided by CB Insights.

cyabra.com

Founded Year

2017

Stage

Series A | Alive

Total Raised

$8.65M

Last Raised

$5.6M | 10 mos ago

About Cyabra

Cyabra has developed a platform designed to protect brands and politicians against misleading discourse spread on social networks through the use of false identities.

Cyabra Headquarter Location

Tel Aviv, Israel

Predict your next investment

The CB Insights tech market intelligence platform analyzes millions of data points on venture capital, startups, patents, partnerships, and news mentions to help you see tomorrow's opportunities, today.

Research containing Cyabra

Get data-driven expert analysis from the CB Insights Intelligence Unit.

CB Insights Intelligence Analysts have mentioned Cyabra in 1 CB Insights research brief, most recently on Oct 21, 2020.

Expert Collections containing Cyabra

Expert Collections are analyst-curated lists that highlight the companies you need to know in the most important technology spaces.

Cyabra is included in 2 Expert Collections, including Cybersecurity.

Cybersecurity

4,937 items

Digital Content & Synthetic Media

323 items

The Synthetic Media collection includes companies that use artificial intelligence to generate, edit, or enable digital content under all forms, including images, videos, audio, and text, among others.

Cyabra Patents

Cyabra has filed 1 patent.

The 3 most popular patent topics include:

  • Market segmentation
  • Social media
  • Social networking services
Application Date: 2/3/2020

Grant Date: N/A

Title: N/A

Related Topics: Social networking services, Social networks, Market segmentation, Virtual communities, Social media

Status: Application

Latest Cyabra News

“We really are trying to make the world a more transparent and genuine place.”

May 29, 2022

A lack of transparency around content online affects every person every day, says Dan Brahmy, co-founder and CEO of Cyabra. People tend to think of disinformation as being mostly around politics and government, but he says bad actors have gotten tangled up with everything we consume and skew many of our decisions with manipulation we should care about. He and his co-founders wanted to improve transparency with Cyabra, which Brahmy explains is like a filter for online conversations. Rather than fact checking, their product currently checks the author and propagation of written content to give users more information about what they read. While four years ago many people agreed that fake news was an issue, now they are asking for a solution, he shares. Meeting that need with a successful business plan is important, Brahmy says, but his goal is to make the world a more genuine and transparent place.

We live in a dangerous world. It's difficult to understand what's real and not real. You're trying to make sense of that with Cyabra. Let's start at a high level understanding where we're at today and what are some of the big challenges that you decided to create Cyabra to solve.

I stumbled upon the mission statement of Cyabra four years ago very randomly and very luckily. I got to know a good friend of mine who served 13 years in the army within the Israeli Special Operation Command. He was telling me he ran a huge information warfare department within the Israeli forces. He said he would like to try and solve a huge issue and that there is no truth, there's only falsehood and there's only propaganda. I said, "I don't really know what you're talking about, but it sounds like we're all going to be affected by this. Whatever we do, we have to do it together." I was not an expert in that field four years ago, but I got very lucky to be accompanied by my incredible co-founders who are coming from this information warfare field.

We were able to look at the current state of the internet. When you think about it, there is no real transparency. When you and I are reading something, we do not understand if this has been propagated by a real, a bad, or a fake author. Are we facing someone with a specific intention, with a specific agenda? We need to understand the snowball effect. When we created Cyabra four years ago, we said, "We're not doing fact checking." Because I think there's no genuine way of automating the process of fact-checking in an unbiased manner. But on the other hand, there might be a way to do what we may call the author checking and the propagation checking: are we facing a real, bad, fake person, and how much of a snowball effect is being created upon ourselves as the people consuming the information? That's been the mission statement for Cyabra. We're like a filtering mechanism for online conversations.

When most people talk about the internet, one of the first words that will come to their mind is the promotion of transparency. There is no filter. But your definition of transparency takes us one level deeper, the transparency behind not just the content itself but what is this content representing and whose view is it. We make assumptions that may not be true about the author and their intentions.

I think you nailed it perfectly. When I speak about transparency, I think there's the flip side of what you said, it's when we're looking at the really big and large social platforms out there. Our feeling and our experience makes us feel like there's bad and fake people that are writing shit on the internet, and the lack of transparency means that social platforms, in a sense, are being compensated by those mechanisms of viral propagation. The people who are inflicted by this, it's us, consumers. That's why we're trying to solve the problem.

We're all on our phone typing, swiping, listening, and watching stuff all day long. So we're all being affected by this every second of the day. Yes, of course we're affected. But who cares about the fact that there is no transparency and this viral propagation and the screwed up incentive system for the stakeholders? When you're looking to make a successful business and make a positive impact on a large scale, how do you rationalize through who cares about this problem?

We really are trying to make a dent and make the world a slightly better place or a more transparent and genuine place. We are not curing cancer. But, nevertheless, I could call what we're trying to solve an online illness. So who cares about this? First of all, as people, we should care about who is trying to skew our opinions. My opinion can be reflected in what kind of cereal I am going to be eating this morning; tomorrow it could be about the midterm elections. We did an analysis about Johnny Depp and Amber Heard and the crazy volumes of inauthentic activities with what they were doing. It's really funny because it surrounds celebrities and athletes. There's some sort of misconception that disinformation and fake news might only be related to political parties and governments. But this is such a tiny part of the problem. The bigger problem is those bad and fake actors found a way to skew every decision and get entangled with everything that we see, everything we consume.

What you're touching on is actually a real issue that needs to trouble everybody. I'm really curious then about how you make a business out of it. You need capital and you need to show that this is actually something that can be very, very profitable.

First of all, the reason why we're here is because we were able to show the worthwhileness and the technological advancements, thanks to the fundraising that we've conducted. A lot of incredible investors are on our side and enabled us to build what we believe will become this authentic search engine for online conversations. How do we make money? It's really easy. We sell a SaaS product, which works exactly that way, as a search engine. People pay for a volume-based kind of subscription. We usually work with the larger organizations, food and beverage, PR, consumer-oriented brands in the world, and some high level public sector agencies, like the US State Department. We did not reinvent the wheel with the business model. We reinvented the wheel with the approach on what information authenticity sounds like and feels like.

For stakeholders, companies, different people who are supporting this mission, do they understand the potential?

Four years ago when we started the company, it was a blue ocean, but it felt like a blue pond. The need for education was so high. We were talking to people, like in early 2018, with no product, just knocking on doors. People in mid, late-2018, they weren't laughing at us. They were saying, "Oh, yes. Fake news. Disinformation. But it's not really for us." Four years later and you see the shift in conversation. We see people coming to us. We see CXOs of huge corporations coming to us and saying, "We understand that it's about to happen and that we are all susceptible to this crazy shift." Now people don't question the need; people ask for a solution before it blows up.

How do you think through and even have the ability to create these mechanisms and these solutions for the different platforms or the different modes of communication: text, audio, video?

We started with the most common one, which is written content. We know that from a vision standpoint, acting as a filtering mechanism means that we need to have an answer for every type of medium and for every type of publicly available platform so that nothing falls through the cracks. For the last four years, we focused on written content. The next step for us within six to 12 months is focusing on the transcription from the visual content into written content. It's to say, "We do know that this is a picture or a frame within the video of Joe Rogan. And this is how it's combined with the written content talking about Joe Rogan." Audio is a whole new world; there's a lot of falsifications around audio. We've just started the research for this. But I think you'll hear more about the audio solution from our perspective probably within the next 18 to 24 months.

Cyabra Web Traffic

Web traffic chart showing Rank, Page Views per User (PVPU), Page Views per Million (PVPM), and Reach per Million (RPM).

  • When was Cyabra founded?

    Cyabra was founded in 2017.

  • Where is Cyabra's headquarters?

Cyabra's headquarters is located in Tel Aviv, Israel.

  • What is Cyabra's latest funding round?

    Cyabra's latest funding round is Series A.

  • How much did Cyabra raise?

    Cyabra raised a total of $8.65M.

  • Who are the investors of Cyabra?

    Investors of Cyabra include TAU Ventures, Founders Fund, Brian Norgard, Will Graylin, Red Sheperd Ventures and 10 more.

  • Who are Cyabra's competitors?

    Competitors of Cyabra include NewsGuard Technologies and 1 more.

You May Also Like

Peerwise

Peerwise is a phone app that allows users to perform a real-time, direct user-to-user identity verification test without any external, central, or third-party authority. The solution protects users from voice phishing, biometric impersonation, identity theft, and deepfake ransomware.

Deepware

Deepware, created by Zemana, develops deepfake detection technology designed to detect deepfake videos or, simply, any fake content in the areas of visual and audio communication. The company's cloud-based solution can scan a suspicious video to find out if it is synthetically manipulated.

Ad Fontes Media

Ad Fontes Media rates the news for bias and reliability.

AI Foundation

AI Foundation aims to democratize and decentralize artificial intelligence. Its first product, Reality Defender, combines human moderation and machine learning to identify malicious content meant to deceive people, such as deepfakes.

Sensity

Sensity focuses on deep learning for fake video detection and explanation.

FakeNetAI

FakeNetAI uses machine learning technology to detect fake videos and protect online trust. The company offers a web app and API that scans videos trained to detect a variety of alteration techniques, allowing clients to protect their users from the dangers of misinformation.

Discover the right solution for your team

The CB Insights tech market intelligence platform analyzes millions of data points on vendors, products, partnerships, and patents to help your team find their next technology solution.

