July 29, 2025

Part 1: Designing the Future: How AI is Transforming Hardware Development

Type: Deep Dives

Contributors: Vikram Venkat

Recent advancements in Artificial Intelligence and Machine Learning have significantly changed the landscape of software development. Development cycles that once took several weeks can now be completed within a few minutes through the use of generative AI coding assistants, as well as a host of other platforms that speed up testing, integration, and deployment. While software engineering has benefited significantly from such solutions, the physical engineering fields that develop hardware (such as mechanical, aerospace, chemical, civil and structural, and electrical engineering) are lagging behind.

As a mechanical engineer myself, I was trained on and used traditional CAD and CAE tools such as AutoCAD, Creo, Solidworks, Fluent, and others – solutions that are still the platforms of choice at leading engineering companies and universities. The incumbents in this space – Dassault, Autodesk, PTC, and Ansys – remain household names for engineers in these fields and are among the largest engineering-software companies in the world by market cap and revenue. However, these solutions were primarily developed in the pre-AI and ML eras; in fact, the earliest of them date back to the 1980s, when computers, software architectures, and workflows were very different. While these platforms have attempted to innovate and adapt to an AI-first world, decades of legacy architecture make it nearly impossible for them to transform into truly modern, AI-native solutions, and the majority of their innovations and enhancements have been incremental or bolt-on. Consequently, hardware development cycles in these engineering disciplines still take several months – even years in complex cases – and require significant human effort across the entire development process, from design to manufacture.

From Concept to Completion: The Hardware Development Lifecycle

The typical development workflow is similar across each of the aforementioned engineering disciplines and across the industries to which they are applied, including aerospace, automotive, electrical appliances, buildings, and industrial equipment. There are four key steps in this workflow, with some minor nuances based on the use case and engineering discipline involved:

Design: In this stage, engineers ideate and create the basic product concepts to be manufactured. Typically, for complex products, these are broken down into individual components, each of which is designed separately. For example, an automotive gearbox would involve individually designing the gears, shafts, bearings, external housing, as well as any other components (synchronizers, selector forks, clutch, etc.) that may be needed for the system. 

2D and 3D models for these individual parts are typically designed using CAD (Computer-Aided Design) tools by mechanical, aerospace, and civil engineers – some examples of these tools include CATIA, Solidworks, AutoCAD, and Creo. Similarly, electrical and electronics engineers typically use EDA (Electronic Design Automation) tools for design – examples include tools from Synopsys and Cadence. Subsequently, these parts are “assembled,” usually on the same platforms, to create a unified model. Design is a complex process that blends scientific rigor with creativity, and it requires significant skill (and often multiple iterations) to create accurate models.
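
To make this concrete, below is a minimal sketch (in Python, with hypothetical names and deliberately simplified formulas) of the parametric idea that underlies these tools: a part is defined by a few driving parameters, and the rest of its geometry is derived from them. Real CAD kernels model full 3D geometry, constraints, and mating relations; this only gestures at the pattern.

```python
import math
from dataclasses import dataclass

# Hypothetical sketch of parametric part design: a spur gear defined by a few
# driving parameters, from which the remaining geometry is derived.

@dataclass
class SpurGear:
    module_mm: float      # tooth-size parameter (pitch diameter / tooth count)
    teeth: int
    face_width_mm: float

    @property
    def pitch_diameter_mm(self) -> float:
        return self.module_mm * self.teeth

    @property
    def outside_diameter_mm(self) -> float:
        # Standard full-depth tooth: the addendum equals one module.
        return self.module_mm * (self.teeth + 2)

    def meshes_with(self, other: "SpurGear") -> bool:
        """Two spur gears can mesh only if their modules match."""
        return math.isclose(self.module_mm, other.module_mm)

pinion = SpurGear(module_mm=2.0, teeth=20, face_width_mm=25.0)
gear = SpurGear(module_mm=2.0, teeth=60, face_width_mm=25.0)
print(pinion.meshes_with(gear), gear.pitch_diameter_mm)  # True 120.0
```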

Simulation: Once the designs are created, engineers virtually test the efficacy of the models in their expected use cases by simulating real-world performance. This typically includes simulating physical phenomena such as mechanical stress and strain, thermal and fluid flow, multibody dynamics and kinematics, or circuit behavior. These are usually analyzed using numerical methods that iteratively approximate solutions to the basic physical equations governing the phenomenon in question (usually differential equations, such as the Euler-Bernoulli equation for beam loading or the Navier-Stokes equations for fluid flow) within a set of constraints that govern the system being analyzed. These solvers typically subdivide (or ‘discretize’) the larger system into smaller parts (the ‘finite elements’ that lend their name to the technique of Finite Element Analysis), usually represented as a ‘mesh’ of numerical data points. The points in this mesh can be individually analyzed and approximated using simpler algebraic equations, which are then rolled up to describe the behavior of the entire system. However, as systems become larger and more complex, the number of data points (or ‘nodes’) within the mesh grows, driving a rapid increase in the computation required to model the system. Simulation is a challenging step, but it is critical for validating designs before they enter an expensive and time-consuming manufacturing process.
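
As an illustration of the discretize-assemble-solve pattern described above, here is a deliberately tiny sketch (a hypothetical one-dimensional example, not drawn from any commercial solver) that meshes the interval [0, 1], assembles a stiffness matrix from per-element contributions, and solves the resulting algebraic system for -u''(x) = 1 with u(0) = u(1) = 0. Industrial solvers apply the same pattern to 3D meshes with millions of nodes, which is where the computational cost explodes.

```python
import numpy as np

n_elements = 100                      # more elements -> finer mesh -> more nodes
nodes = np.linspace(0.0, 1.0, n_elements + 1)
h = nodes[1] - nodes[0]               # uniform element length

# Assemble the global system from 2x2 local stiffness blocks, one per element.
K = np.zeros((n_elements + 1, n_elements + 1))
F = np.zeros(n_elements + 1)
for e in range(n_elements):
    K[e:e+2, e:e+2] += (1.0 / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])
    F[e:e+2] += h / 2.0               # constant load f(x) = 1, split across the element's nodes

# Apply the boundary conditions u(0) = u(1) = 0 by solving only for interior nodes.
u = np.zeros(n_elements + 1)
u[1:-1] = np.linalg.solve(K[1:-1, 1:-1], F[1:-1])

# For this simple case, the nodal values agree with the exact solution
# u(x) = x(1 - x)/2 to machine precision.
print(np.max(np.abs(u - nodes * (1.0 - nodes) / 2.0)))
```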

Pre-manufacturing: Once the designs have been validated, they need to be prepared for prototyping and manufacturing. Given that design teams and manufacturing teams are usually separate, a critical intermediate step is transferring information from the former to the latter. This involves creating ‘exploded views’ that show the 3D models of the various components separately, typically arranged in the order in which they would be assembled. These are accompanied by 2D drawings, technical specifications, assembly instructions, bills of materials (BOMs), and any other guidance the manufacturing teams need. These are typically not complex processes, but they often demand a significant time commitment from the engineering team and a high level of attention to detail to ensure accuracy downstream in manufacturing.
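
A sketch of one small piece of this handoff: rolling a nested product structure up into a flattened bill of materials. The data layout and part names below are hypothetical; PLM systems manage far richer records (revisions, suppliers, costs), but this rollup logic is the essence of BOM generation.

```python
from collections import Counter

# Hypothetical nested product structure: (part_name, quantity, children).
gearbox = ("gearbox", 1, [
    ("housing", 1, []),
    ("shaft", 2, []),
    ("gear_set", 1, [
        ("gear", 4, []),
        ("bearing", 2, []),
    ]),
])

def flatten_bom(node, multiplier=1, counts=None):
    """Recursively accumulate total quantities for every leaf part in the tree."""
    counts = Counter() if counts is None else counts
    name, qty, children = node
    total = qty * multiplier
    if not children:                  # leaf part: record the rolled-up quantity
        counts[name] += total
    for child in children:
        flatten_bom(child, total, counts)
    return counts

for part, qty in flatten_bom(gearbox).items():
    print(f"{qty} x {part}")          # 1 x housing, 2 x shaft, 4 x gear, 2 x bearing
```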

This stage also involves identifying and procuring the relevant materials and parts, onboarding vendors and suppliers, and communicating and collaborating with all stakeholders in the workflow. At this point, the process typically moves beyond the engineering teams to involve many others, such as finance and operations. Finally, the process enters the fourth stage, manufacturing – an area also benefiting from AI, computer vision, and other technology innovations, but that’s a topic for a different time!

The Engine Behind the Engine: Tech Enablers 

The hardware development workflow is conceptually similar to the software development process, which likewise involves design (coding), testing, and deployment. Echoing the disruption that has revolutionized the software development life cycle, several technical breakthroughs are showing early promise in optimizing the hardware development life cycle.

First, there are large volumes of data available to build and train ML and AI models. The near-universal adoption of digital CAD, CAE, and CAM tools over the last few decades has led to the creation of vast data repositories that can be analyzed and used as a base for future engineering work. This is further augmented by large amounts of real-world data captured from sensors that track how the designed products perform in actual conditions. Finally, digitally available technical documentation of products on the public internet provides another vast data repository of similar products already available in the market, as well as publicly available user feedback.

Second, generative AI can understand complex technical documents and unstructured inputs, including images, technical drawings, 3D models, sensor data, user feedback, notes, and natural-language instructions (e.g., for assembly or manufacturing). This gives AI models a knowledge base broader than even the best engineers can hold, and it enables advanced reasoning and technical analyses that were beyond earlier software solutions. Generative AI can also draft documents, create drawings and models, and coordinate with stakeholders in the process, enabling automation of much of the development workflow.
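
A minimal sketch of what this can look like in practice: extracting a structured parts list from free-text assembly notes. The call_llm function below is a placeholder for whichever model API is actually used (no specific provider or SDK is assumed); the prompt-and-parse pattern is the point.

```python
import json

def call_llm(prompt: str) -> str:
    """Placeholder for a call to a generative model API of your choice."""
    raise NotImplementedError

ASSEMBLY_NOTES = """
Mount the cast-aluminum housing first, then press both 6205 bearings onto
the input shaft. Torque the eight M8 housing bolts to 25 Nm.
"""

PROMPT = f"""Extract every part mentioned in the notes below as JSON:
a list of objects with "part", "quantity", and "spec" fields.
Return only JSON.

Notes:
{ASSEMBLY_NOTES}"""

def extract_parts(prompt: str) -> list:
    raw = call_llm(prompt)
    # e.g. [{"part": "bearing", "quantity": 2, "spec": "6205"}, ...]
    return json.loads(raw)
```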

Third, ML and AI models have evolved to incorporate a fundamental understanding of physics, improving both accuracy and computational efficiency by orders of magnitude. Models can learn both from the fundamental equations of physics and from past product data (designs, simulations, and real-world usage data). This allows a model to reduce the number of computations required by prioritizing more likely solutions and solving across fewer nodes rather than the entire mesh. A first-principles understanding of physics also allows models to work where data is limited, unlocking new use cases.
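
A minimal sketch of the physics-informed idea, assuming a toy problem: instead of fitting a model to labeled simulation data, the training loss penalizes how badly the network's output violates the governing differential equation at a set of collocation points. Everything below (the network size, the optimizer, the oscillator ODE standing in for real governing equations) is a simplification for illustration, written with JAX for the automatic differentiation.

```python
import jax
import jax.numpy as jnp

# Fit a tiny network u(t) to the oscillator ODE u'' + u = 0 with u(0) = 0 and
# u'(0) = 1, whose exact solution is sin(t). No labeled data is required.

def init_params(key, width=32):
    k1, k2, k3 = jax.random.split(key, 3)
    return {
        "w1": 0.5 * jax.random.normal(k1, (1, width)), "b1": jnp.zeros(width),
        "w2": 0.5 * jax.random.normal(k2, (width, width)), "b2": jnp.zeros(width),
        "w3": 0.5 * jax.random.normal(k3, (width, 1)), "b3": jnp.zeros(1),
    }

def u(params, t):
    h = jnp.tanh(jnp.atleast_1d(t) @ params["w1"] + params["b1"])
    h = jnp.tanh(h @ params["w2"] + params["b2"])
    return (h @ params["w3"] + params["b3"])[0]

du = jax.grad(u, argnums=1)    # u'(t) via automatic differentiation
d2u = jax.grad(du, argnums=1)  # u''(t)

def loss(params, ts):
    residual = jax.vmap(lambda t: d2u(params, t) + u(params, t))(ts)
    physics = jnp.mean(residual ** 2)  # how badly the ODE is violated
    initial = u(params, 0.0) ** 2 + (du(params, 0.0) - 1.0) ** 2
    return physics + initial

params = init_params(jax.random.PRNGKey(0))
ts = jnp.linspace(0.0, jnp.pi, 64)     # collocation points, no labels needed
grad_fn = jax.jit(jax.grad(loss))
for _ in range(2000):                  # plain gradient descent for brevity
    grads = grad_fn(params, ts)
    params = jax.tree_util.tree_map(lambda p, g: p - 1e-3 * g, params, grads)
```

Approaches in this family (physics-informed networks, neural operators, hybrid solvers) trade some of the mesh-level computation described earlier for learned structure, which is where the promised gains in speed and data efficiency come from.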

These technical breakthroughs are laying the groundwork for new solutions that reimagine the entire hardware development life cycle – a disruption similar to what we are seeing in software development. In part 2 of this article, we will explore some of these new solutions.
