Cipia is a leading provider of in-cabin sensing solutions, operating in the automotive industry. The company offers driver monitoring systems and occupancy monitoring systems, which use edge-based computer vision and artificial intelligence to enhance safety and improve in-car experiences. These solutions primarily cater to the automotive and fleet management sectors. Cipia was formerly known as Eyesight Technologies. It was founded in 2006 and is based in Herzliya, Israel.
Research containing Cipia
Get data-driven expert analysis from the CB Insights Intelligence Unit.
CB Insights Intelligence Analysts have mentioned Cipia in 3 CB Insights research briefs, most recently on Sep 6, 2022.
Expert Collections containing Cipia
Expert Collections are analyst-curated lists that highlight the companies you need to know in the most important technology spaces.
Cipia is included in 4 Expert Collections, including Auto Tech.
Auto Tech
Companies working on automotive technology, which includes vehicle connectivity, autonomous driving technology, and electric vehicle technology. This includes EV manufacturers, autonomous driving developers, and companies supporting the rise of software-defined vehicles.
Supply Chain & Logistics Tech
Companies offering technology-driven solutions that serve the supply chain & logistics space (e.g., shipping, inventory management, last mile, trucking).
Companies that will be exhibiting at CES 2018
Artificial Intelligence
Companies developing artificial intelligence solutions, including cross-industry applications, industry-specific products, and AI infrastructure solutions.
Cipia has filed 41 patents.
Hand gestures, Gesture recognition, Image sensors, Computing input devices, Digital photography
Latest Cipia News
Nov 22, 2023
Gesture control systems work best when they are simple, quick, and easy to learn. They also have to feel natural and intuitive, preferably so you remember them, and most of all, they need to be reliable. Apple's Double Tap on the Apple Watch Series 9 and Apple Watch Ultra 2 is a good example of a gesture control working well. I think it's a cool, fun feature, and I'm not alone. However, it's not the first of its kind, and history shows us that, unfortunately, gestures on mobile devices are more likely to be abandoned and forgotten than they are loved and widely adopted. Without going way back to look at phones like the 2009 Sony Ericsson Yari, which used the front-facing camera to track body movements and control pre-loaded games, many tech fans may instantly think of the 2019 Google Pixel 4 and Pixel 4 XL when gesture controls are mentioned. These two devices were the first to feature Google's Project Soli chip, which used radar to recognize even the tiniest of movements around it. The resulting feature, called Motion Sense, allowed you to swipe over and around the phone to play or pause music, silence alarms, or mute an incoming call. It also unlocked the phone when you picked it up. It was technically exciting, but in practice it didn't work reliably enough, and regulatory issues meant it couldn't be used globally, hurting sales potential. The Pixel 4 ended up being the only smartphone outing for Project Soli's gesture controls, but the chip has lived on without Motion Sense in the Google Nest Hub, where it helps measure breathing while you sleep.
eyeSight Technologies
Motion Sense on the Pixel 4 is probably one of the better-known failed gesture control systems on a phone, but other companies had been working on gesture controls for a long time before it. In 2011, Korean electronics brand Pantech released the Vega LTE, which had basically the same gestures as the Pixel 4 but used a software-based system that relied on the front camera to "see." It was developed by a company called eyeSight Technologies. For several years after the Pantech Vega, eyeSight Technologies pushed very hard to make gesture controls on mobile devices a thing. It was counting on its platform-agnostic Natural User Interface (NUI) software for success, as it could be built directly into a device's operating system or even into apps to utilize the camera and add gesture controls. The company worked with Indian smartphone brand Micromax on the A85 Superfone, which used similar gestures to the Pantech Vega; it also made the NUI available on Android and iOS, and its technology was shown off at technology trade shows on multiple occasions. It boasted partnerships with companies ranging from Nokia to AMD and tried to capitalize on the early VR craze in 2016, too. Despite all these efforts, it never reached the mainstream, and the company eventually changed its name to Cipia and pivoted to in-car tech.
Air Gestures on the Samsung Galaxy S4
Around the same time eyeSight Technologies was promoting its software-based gesture system, Samsung introduced a small set of gesture controls called Air Gestures on the brand-new Galaxy S4 smartphone. The phone used an infrared sensor to spot basic hand movements over the screen, allowing you to activate it to check the time without touching it, accept calls, interact with apps, and even scroll through web pages with just a swipe.
It worked quite well, but the sensor's short range meant you were almost touching the screen anyway, making it appear more gimmicky than the cool tech perhaps deserved. The feature continued on in Samsung's repertoire but was slowly phased out and replaced by Air Actions, which uses gestures made with the S Pen stylus to perform similar actions without the need for an infrared sensor. So far, we've seen radar, software, and infrared sensors used to understand hand motions and control features on our phones, showing how companies were keen to experiment and that there wasn't one recognized "best" way of adding gesture recognition to a smartphone. But we're not done yet.
Elliptic Labs
Fast forward to 2017, and interestingly, the Galaxy S4 was also called into action in a demonstration of gesture recognition technology from Elliptic Labs, which, like eyeSight Technologies, spent a great deal of time and effort trying to get us waving at our smartphones and other devices. Elliptic Labs' technology used ultrasound to detect movement, which allowed a wider field of movement and a greater variety of gestures, with no reliance on light, lower power consumption, and better accuracy. It planned on licensing the ultrasound gesture technology to device makers, but it never seemed to get far beyond the demo and concept stage, despite adapting the same system to take advantage of the Internet of Things (IoT) boom and integrating it into speakers and lights. Instead, Xiaomi used the company's ultrasonic sensing to replace the traditional proximity sensor and minimize the bezel on the original Mi Mix.
Today, Elliptic Labs still works on proximity sensing and has now eliminated the hardware from the system entirely to offer software-driven proximity detection, which can be found on the Motorola Razr Plus and Razr (2023) compact folding phones.
Air Motion on the LG G8 ThinQ
Both Elliptic Labs and eyeSight Technologies, along with other companies like Neonode, experimented with gesture controls between 2010 and 2017, but without making much of an impact outside of tech trade shows like CES and MWC. When the Pixel 4 reignited interest in gesture controls in 2019, it was joined by another big-name device: the LG G8 ThinQ. LG, which has since stopped making smartphones entirely, loved to try new things with its phones, whether it was modular hardware with the LG G5 or secondary screens on phones like the LG V10 and the V50 ThinQ. The G8 ThinQ's Air Motion feature used front-facing cameras and a time-of-flight (ToF) sensor to detect various hand motions, including mimicking the twisting of a volume knob to adjust the music player's volume. Like all close-proximity gesture control systems, its usefulness was questionable, as the touchscreen was right there, mere inches from your twiddling fingers. It also wasn't particularly reliable, which stopped people from using it. The LG G8 ThinQ marked the end of the line for LG's G series, and along with Project Soli, Air Motion was perhaps the last gesture control system to be heavily promoted by a phone maker.
What about smartwatches?
Until recently, gesture controls have mostly been demonstrated or featured on smartphones. But what about smartwatches? Double Tap on the Apple Watch Series 9 and Apple Watch Ultra 2 owes its existence to an accessibility feature called AssistiveTouch, which has been part of watchOS for several years. Samsung provides a very similar accessibility feature on the Galaxy Watch 6, too.
Space is tight inside a smartwatch, and there's very little spare room for cameras, proximity sensors, or other complex bits of hardware. Double Tap uses the heart rate sensor, the accelerometer, and software to recognize when you are tapping your fingers, adding yet another recognition method to the list. Outside of the Apple Watch and Double Tap, Google demonstrated Project Soli inside a smartwatch, but it never made it to the eventual Google Pixel Watch. The bizarrely named Mad Gaze Watch apparently used bone conduction to enable a range of different gesture controls, from finger snaps to arm taps. In 2015, a company called Deus Ex crowdfunded an add-on module for the Pebble Watch called the Aria, and while it's not a smartwatch, Google used head nods to add hands-free use to Google Glass.
Simple gestures are best
All these examples show there are more failed attempts at making gesture control systems popular on our phones and smartwatches than there are successes. However, several simple gestures have proven to be effective and reliable, to the point where we don't even consider them special. A great example is raise-to-wake, where lifting or tilting a device's screen toward your face turns on the display; it's the perfect example of a natural movement activating a feature. It could be argued that anything beyond this is simply too complicated. Even wrist flicks and twists, which were used on the Moto 360 to aid scrolling, seem to be a gesture too far and have rarely been seen since the Fossil Hybrid HR. Outside of these few isolated examples and essential accessibility features, gestures have not transformed the regular, everyday use of a wearable or smartphone for most people. Double Tap has the potential to join raise-to-wake as one of the few widely usable gestures on a smartwatch, though, as it's simple, natural, and works really well.
Sadly, history shows that gesture control systems on mobile devices simply haven't captured our interest yet, and I hope Double Tap doesn't end up on a future list of promising but quickly abandoned gesture control systems. It's too interesting to suffer that fate.
Cipia Frequently Asked Questions (FAQ)
When was Cipia founded?
Cipia was founded in 2006.
Where is Cipia's headquarters?
Cipia's headquarters is located at 8 Maskit Street, Herzliya, Israel.
What is Cipia's latest funding round?
Cipia's latest funding round is PIPE.
How much did Cipia raise?
Cipia raised a total of $39.2M.
Who are the investors of Cipia?
Investors of Cipia include Alexandre Weinstein, Eli Talmor, Cartridge Holdings, Jebsen Capital, Mizrahi Tefahot Bank and 9 more.
Who are Cipia's competitors?
Competitors of Cipia include Mindtronic AI and 1 more.
Compare Cipia to Competitors
Mindtronic AI (MAI) is a technology IP company providing design services and licensing IP to automotive OEMs and Tier 1 suppliers for UX and autonomous driving solutions. MAI specializes in automotive-grade biometric recognition, computer vision, 3D, an ultra-light deep learning engine for embedded systems, and autonomous drive frameworks.
Jungo Connectivity is a company that focuses on in-cabin sensing AI software within the automotive industry. The company offers advanced driver monitoring and in-cabin monitoring solutions, which provide real-time alerts about unusual driver behavior that could lead to accidents, and are compatible with various types of hardware. Jungo Connectivity primarily sells to the automotive industry, including OEMs, Tier 1s, and fleet vendors. It was founded in 2013 and is based in Netanya, Israel.
Entropik provides an artificial intelligence (AI)-powered integrated market research platform. It captures and analyzes various forms of human emotional data, such as facial expressions, voice intonation, and other physiological signals, to improve audience engagement and user experiences. It provides solutions for digital brands, consumer brands, the media and entertainment industry, and more. The company was founded in 2016 and is based in Bengaluru, India.
Eyeris is a technology company that specializes in the automotive industry. The company's main offering is artificial intelligence that comprehends the entire in-cabin space of autonomous and highly automated vehicles, aiming to enhance safety and user experience by providing personalized comfort and convenience. Eyeris primarily serves the automotive industry. It was founded in 2014 and is based in Palo Alto, California.
Maaind operates as an artificial intelligence (AI) based neuroscience company. It specializes in tracking mental well-being in real-world situations through an assistant application and application programming interface (API)-as-a-service. It was founded in 2018 and is based in London, United Kingdom.
Seeing Machines is a company focused on safety technology in the transport industry. The company's main offerings include computer vision technologies that enable machines to see, understand, and assist people, primarily through a Driver Monitoring System (DMS) that monitors driver and operator attention state, including the detection of drowsiness and distraction. Seeing Machines primarily sells to the automotive, commercial fleet, aviation, rail, and off-road markets. It was founded in 2001 and is based in Fyshwick, Australian Capital Territory.