Technology is automating out human error, democratizing the search for engineering talent, and speeding R&D times with implications across drug discovery, car manufacturing, and much more.
Across industries, designers, chemists, and engineers are constantly hypothesis testing.
Will this design look right?
Does this compound fit our needs?
Testing and iterating is the essence of research and development.
Major corporations across drugs, technology, aerospace, and more pour billions of dollars each year into R&D. General Motors alone spent upwards of $8B on new development last year.
In this report, we look at how new networking platforms, robotics & 3D printing, AI algorithms, and AR/VR tech are reshaping everything from talent acquisition to hypothesis testing and product design.
Networking platforms broaden talent pools
In the highly scientific world of R&D, finding high-caliber specialists is one of the biggest challenges. Now, software is helping companies tap into a pool of talent around the world.
When it comes to networking untapped talent in data science and finance, platforms like Kaggle, Quantopian, and Numerai are democratizing “quant” work and compensating their collaborators. The model has already taken off in pharmaceutical R&D and is spreading to other fields as well.
On-demand science platforms like Science Exchange currently work across R&D verticals, allowing corporations to quickly compensate for a lack of on-site talent by outsourcing R&D.
But even as companies bring in a bigger pool of talent, the process of hypothesis testing has room for improvement, and tightening iteration time will translate to faster and better discoveries.
Robotics & 3D printing speed up product development across verticals
Accelerating product development is the No. 1 priority for firms using 3D printing, according to a recent industry survey. Moreover, 57% of all 3D printing work is done in the first phases of new product development (i.e. proof of concept and prototyping).
3D printing is already a staple in the design studio. Before ordering thousands of physical parts, designers can use 3D printing to see what a future product will look like.
Similarly, robotics is automating the physical process of trial-and-error across a wide array of verticals.
In synthetic biology R&D, for example, robotics is making a big impact for companies like Zymergen and Ginkgo Bioworks, which manufacture custom chemicals from yeast microbes. Finding the perfect microbe requires testing up to 4,000 different variants concurrently, which translates to a lot of wet lab work.
Using automatic pipette systems and robotic arms, liquid-handling robots permit high-throughput experimentation, arriving at a winning combination faster and with less human error.
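As a loose illustration (not any company's actual software), the screening loop that liquid-handling robots automate can be sketched in a few lines of Python: score thousands of variants in one batch, then carry only the top performers into the next engineering round. The `assay` function here is a hypothetical stand-in for a real plate-reader measurement.

```python
import random

random.seed(42)  # deterministic for illustration

def assay(variant: int) -> float:
    """Stand-in for a robotic plate-reader measurement of one
    microbe variant's chemical yield (hypothetical scoring)."""
    return random.gauss(1.0 + 0.001 * (variant % 97), 0.1)

# Screen thousands of variants in one automated batch, as a
# liquid-handling robot would across microwell plates.
results = {v: assay(v) for v in range(4000)}

# Keep only the top performers for the next round of testing.
best = sorted(results, key=results.get, reverse=True)[:10]
print(best)
```

The point of the automation is the scale of the loop, not the scoring logic: a robot can run all 4,000 "assay calls" in parallel wells without pipetting mistakes.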
Below are gene-testing company Counsyl’s robot (left), used for transferring samples, and Zymergen’s pipetting robot (right), which automates microbe culture testing.
“Materials engineering is the ability to detect a very small particle — something like a 10-nanometer particle on a 300-millimeter wafer. That is really equivalent to finding an ant in the city of Seattle.” — Om Nalamasu, CTO at Applied Materials
Looking beyond biotech, material science has played a pivotal role in computing and electronics.
Notably, chip manufacturers like Intel and Samsung are among the largest R&D spenders in the world. As semiconductors get ever-smaller, working at nanoscale requires precision beyond human ability, making robotics the preferred option.
Tomorrow’s scientific tools will be increasingly automated and precise in order to handle work at the micro- and nanoscale.
AI is hastening materials science discoveries
Thomas Edison is well-known for highlighting materials science as a process of elimination: “I have not failed 10,000 times. I have not failed once. I have succeeded in proving that those 10,000 ways will not work.”
The spirit of Edison persists in today’s R&D labs, although R&D is still less digitized and software-enabled than one might expect (the National Academy of Sciences says developing new materials is often the longest stage of developing new products). Better digitization of the scientific method will be crucial to developing new products and materials and then manufacturing them at scale.
Currently, the hottest area for deals to AI startups is healthcare, as companies employ AI for drug discovery pipelines. Pharma companies are pouring cash into drug R&D startups such as Recursion Pharmaceuticals and twoXAR, and it’s only a matter of time until the approach takes off elsewhere.
One company working in chemistry and materials science is Citrine Informatics (below, left). Citrine runs AI on its massive materials database, and claims it helps organizations hit R&D and manufacturing milestones in half the time. Similarly, DeepChem (right) develops a Python library for applying deep learning to chemistry.
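To make the idea concrete, here is a toy sketch (pure Python, invented data; no relation to Citrine’s or DeepChem’s actual models) of AI-guided materials screening: fit a simple surrogate model on compounds with measured properties, then rank untested candidates by predicted property so the lab only synthesizes the most promising ones.

```python
def predict(train, features, k=3):
    """k-nearest-neighbor estimate of a material property
    from simple numeric descriptors (a toy surrogate model)."""
    nearest = sorted(
        train,
        key=lambda row: sum((a - b) ** 2 for a, b in zip(row[0], features)),
    )
    return sum(label for _, label in nearest[:k]) / k

# (descriptor vector, measured property) pairs — invented data.
measured = [
    ((1.0, 0.2), 3.1), ((0.9, 0.3), 2.9), ((0.2, 1.0), 7.5),
    ((0.3, 0.9), 7.2), ((0.5, 0.5), 5.0),
]

# Rank untested candidates by predicted property, highest first,
# so only the top candidates go on to physical synthesis.
candidates = [(0.25, 0.95), (0.95, 0.25), (0.5, 0.6)]
ranked = sorted(candidates, key=lambda c: predict(measured, c), reverse=True)
print(ranked[0])
```

Production systems replace the nearest-neighbor model with deep learning over large materials databases, but the workflow — predict first, experiment second — is the same, and it is what compresses Edison’s 10,000 failures into far fewer wet-lab runs.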
In short, manufacturers across sectors — industrial biotech, drugs, cars, electronics, or other material goods — are relying on robotic automation and 3D printing to remain competitive and tighten the feedback loop in bringing a product to launch.
Already, startups developing or commercializing complex materials are taking off in the 3D printing world. Companies like MarkForged employ carbon fiber composites, while others like BMF are developing composites with rare nanostructures and exotic physical properties.
Certainly, manufacturers of the future will rely on intelligent software to drive their R&D discoveries.
Augmented and virtual reality ‘abstract away’ the modeling process
Currently, manufacturers of all types rely on prototyping with computer aided design (CAD) software. In future manufacturing processes, augmented and virtual reality could play a greater role in R&D, and could effectively “abstract away” the desktop PC for industrial designers, possibly eliminating the need for 3D printed physical models.
Autodesk, the software developer of AutoCAD, is a bellwether for the future of prototyping and collaboration technology. The company has heavily invested in cutting-edge technology such as 3D printing, including a partnership with health AI startup Atomwise on a “confidential project.” Recently, Autodesk’s exploration into making an AR/VR game engine foreshadows the larger role it envisions for immersive computing in the design process.
Autodesk’s game engine, called Stingray, has added support for the HTC Vive and Oculus Rift headsets. Additionally, game and VR engine maker Unity has announced a partnership with Autodesk to increase interoperability.
Similarly, Apple has imagined AR/VR facilitating the design process in combination with 3D printing. Using the CB Insights database, we surfaced an Apple patent that envisions AR “overlaying computer-generated virtual information” onto real-world views of existing objects, effectively allowing industrial designers to make 3D-printed “edits” onto existing or unfinished objects.
The patent envisions using AR through “semi-transparent glasses,” but also mentions a “mobile device equipped with a camera,” hinting at potential 3D printing opportunities for using ARKit on an iPhone.
A researcher at Cornell has recently demonstrated the ability to sketch with AR/VR while 3D printing. Eventually, the human-computer interface could be so seamless that 3D models can be sculpted in real time.
Tomorrow’s R&D teams will explore AR and VR, testing how the technologies work in combination with 3D printing as well as the traditional prototyping stack.