
Phiar
Founded Year: 2017
Stage: Series A | Alive
Total Raised: $15.12M
Last Raised: $12M | 1 yr ago

About Phiar
Phiar develops ultra-lightweight artificial intelligence (AI) software for the automotive industry. Its augmented reality (AR) navigation platform provides ground-plane estimation, depth perception, lane segmentation, semantic segmentation, and object detection and tracking. The company was founded in 2017 and is based in Redwood City, California.
Research containing Phiar
CB Insights Intelligence Analysts have mentioned Phiar in 1 CB Insights research brief, most recently on Feb 22, 2023.
Expert Collections containing Phiar
Expert Collections are analyst-curated lists that highlight the companies you need to know in the most important technology spaces.
Phiar is included in 4 Expert Collections, including AR/VR.
AR/VR
1,390 items
This collection includes companies creating hardware and/or software for augmented reality, virtual reality, and mixed reality applications.
Auto Tech
2,472 items
Startups building a next-generation mobility ecosystem, using technology to improve connectivity, safety, convenience, and efficiency in vehicles. Includes technologies such as ADAS and autonomous driving, connected vehicles, fleet telematics, V2V/V2X, and vehicle cybersecurity.
Artificial Intelligence
10,110 items
This collection includes startups selling AI SaaS, using AI algorithms to develop their core products, and those developing hardware to support AI workloads.
AI 100
100 items
Latest Phiar News
Jan 12, 2022
Jet Fighter with a Steering Wheel: Inside the Augmented Reality Car HUD (IEEE Spectrum)

[Image credit: Mercedes-Benz]

The 2022 Mercedes-Benz EQS glides through Brooklyn, the first all-electric sedan from the company that essentially invented the automobile in 1885-1886. But this is definitely the 21st century: Blue directional arrows seem to paint the pavement ahead via an augmented reality (AR) navigation system and color head-up display. Digital street signs and other graphics are superimposed over a camera view on the EQS’ much-hyped “Hyperscreen”—a 142 cm (56-inch), dash-spanning wonder that includes a 45 cm (17.7-inch) OLED center display.

But here’s my favorite bit: As I approach my destination, AR street numbers appear and then fade in front of buildings as I pass, like flipping through a virtual Rolodex; there’s no more craning your neck and getting distracted while trying to locate a home or business. Finally, a graphical map pin floats over the real-time scene to mark the journey’s end.

It’s cool stuff, albeit for folks who can afford a showboating Mercedes flagship that starts above $103,000, and topped $135,000 in my EQS 580 test car. But CES 2022 in Las Vegas saw Panasonic unveil a more affordable HUD that it says should reach a production car by 2024. Head-up displays have become a familiar automotive feature, with a speedometer, speed limit, engine rpm, or other information that hovers in the driver’s view, helping keep eyes on the road. Luxury cars from Mercedes, BMW, Genesis, and others have recently broadened HUD horizons with larger, crisper, more data-rich displays.
[Image: Mercedes-Benz augmented reality navigation]

Panasonic, powered by Qualcomm processing and AI navigation software from Phiar Technologies, hopes to push into the mainstream with its AR HUD 2.0. Its advances include an integrated eye-tracking camera to accurately match AR images to a driver’s line of sight. Phiar’s AI software lets it overlay crisply rendered navigation icons, and spot or highlight objects including vehicles, pedestrians, cyclists, barriers, and lane markers. The IR camera can monitor potential driver distraction, drowsiness, or impairment, with no need for a standalone camera as with GM’s semi-autonomous Super Cruise system.

[Image: Panasonic's AR HUD system includes eye-tracking to match AR images to the driver's line of sight. Credit: Panasonic]

Andrew Poliak, CTO of Panasonic Automotive Systems Company of America, said the eye tracker spots a driver’s height and head movement to adjust images in the HUD’s “eyebox.” “We can improve fidelity in the driver’s field of view by knowing precisely where the driver is looking, then matching and focusing AR images to the real world much more precisely,” Poliak said.

For a demo on the Las Vegas Strip, using a Lincoln Aviator as a test mule, Panasonic used its SkipGen infotainment system and a Qualcomm Snapdragon SA8155 processor. But AR HUD 2.0 could work with a range of in-car infotainment systems. That includes a new Snapdragon-powered generation of Android Automotive, an open-source infotainment ecosystem distinct from the Android Auto phone-mirroring app. The first-gen, Intel-based system made an impressive debut in the Polestar 2, from Volvo’s electric brand. The uprated Android Automotive will run in 2022’s lidar-equipped Polestar 3, an electric Volvo SUV, and potentially millions of cars from General Motors, Stellantis, and the Renault-Nissan-Mitsubishi alliance.

Gary Karshenboym helped develop Android Automotive for Volvo and Polestar as Google’s head of hardware platforms. Now, he’s chief executive of Phiar, a software company in Redwood City, Calif. Karshenboym said AI-powered AR navigation can greatly reduce a driver’s cognitive load, especially as modern cars put ever more information at their eyes and fingertips. Current embedded navigation screens force drivers to look away from the road and translate 2D maps as they hurtle along. “It’s still too much like using a paper map, and you have to localize that information with your brain,” Karshenboym says. In contrast, following arrows and stripes displayed on the road itself—a digital Yellow Brick Road, if you will—reduces fatigue and the notorious stress of map reading. It’s something that many direction-dueling couples might give thanks for. “You feel calmer,” he says. “You’re just looking forward, and you drive.”

[Video: Street testing Phiar's AI navigation engine]

The system classifies objects on a pixel-by-pixel basis at up to 120 frames per second. Potential hazards, like an upcoming crosswalk or a pedestrian about to dash across the road, can be highlighted by AR animations. Phiar’s synthetic model trained its AI for snowstorms, poor lighting, and other conditions, teaching it to fill in the blanks and create a reliable picture of its environment. And the system doesn’t require granular maps, monster computing power, or pricey sensors such as radar or lidar. Its AR tech runs off a single front-facing, roughly 720p camera, powered by a car’s onboard infotainment system and CPU. “There’s no additional hardware necessary,” he says.

The company is also making its AR markers appear more convincing by “occluding” them with elements from the real world. In Mercedes’ system, for example, directional arrows can run atop cars, pedestrians, trees, or other objects, slightly spoiling the illusion. In Phiar’s system, those objects can block off portions of a “magic carpet” guidance stripe, as though it were physically painted on the pavement.
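To make the occlusion idea concrete: given the per-pixel segmentation the article describes, compositing the guidance stripe is essentially a masked alpha blend, where pixels classified as foreground objects zero out the stripe's alpha so it appears painted on the road behind them. The sketch below is purely illustrative (the class IDs and function are hypothetical, not Phiar's actual code or labels):

```python
import numpy as np

# Hypothetical class IDs for a per-pixel segmentation mask (illustrative only)
ROAD, VEHICLE, PEDESTRIAN = 0, 1, 2
FOREGROUND = (VEHICLE, PEDESTRIAN)

def composite_guidance(frame, stripe_rgba, seg_mask):
    """Overlay an AR guidance stripe, hiding it behind detected objects.

    frame:       (H, W, 3) uint8 camera image
    stripe_rgba: (H, W, 4) uint8 rendered guidance stripe, alpha in channel 3
    seg_mask:    (H, W) int class ID per pixel
    """
    # Zero the stripe's alpha wherever a foreground object was detected, so
    # the stripe reads as painted on the pavement *behind* cars/pedestrians.
    occluded = np.isin(seg_mask, FOREGROUND)
    alpha = stripe_rgba[..., 3:].astype(np.float32) / 255.0
    alpha[occluded] = 0.0

    # Standard alpha blend of the remaining (visible) stripe over the frame.
    out = (frame.astype(np.float32) * (1.0 - alpha)
           + stripe_rgba[..., :3].astype(np.float32) * alpha)
    return out.astype(np.uint8)
```

A production system would run this per frame after the segmentation pass; at the 120 fps rate quoted above, the mask lookup and blend are cheap enough to stay on the infotainment CPU/GPU.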
“It brings an incredible sense of depth and realism to AR navigation,” Karshenboym says. Once visual data is captured, it can be processed and sent anywhere an automaker chooses, whether a center display, a HUD, or passenger entertainment screens. Those passenger screens could be ideal for Pokemon-style games, the metaverse, or other applications that combine real and virtual worlds.

Poliak said some current HUD units hog up to 14 liters of volume in a car. A goal is to reduce that to 7 liters or less, while simplifying and cutting costs. Panasonic says its single optical sensor can effectively mimic a 3D effect, taking a flat image and angling it to offer a generous 10-to-40-meter viewing range. The system also advances an industry trend by integrating display domains—including a HUD or driver’s cluster—in a central, powerful infotainment module. “You get smaller packaging and a lower price point to get into more entry-level vehicles, but with the HUD experience OEMs are clamoring for,” he said.
Phiar Frequently Asked Questions (FAQ)
When was Phiar founded?
Phiar was founded in 2017.
Where is Phiar's headquarters?
Phiar's headquarters is located at 1741 Broadway St Fl 3, Redwood City, California.
What is Phiar's latest funding round?
Phiar's latest funding round is Series A.
How much did Phiar raise?
Phiar raised a total of $15.12M.
Who are the investors of Phiar?
Investors of Phiar include The Venture Reality Fund, GFR Fund, Norwest Venture Partners, Cambridge Mobile Telematics, State Farm Ventures and 11 more.