Electronic design automation (EDA), also referred to as electronic computer-aided design (ECAD),[1] is a category of software tools for designing electronic systems such as integrated circuits and printed circuit boards. The tools work together in a design flow that chip designers use to design and analyze entire semiconductor chips. Since a modern semiconductor chip can have billions of components, EDA tools are essential for their design. EDA encompasses a complete set of integrated technologies spanning software, hardware, and design methodologies (such as design services); this article describes EDA specifically with respect to integrated circuits (ICs).
The earliest electronic design automation is attributed to IBM with the documentation of its 700 series computers in the 1950s.[2]
IBM developed one of the earliest computer-aided design (CAD) systems, known as Automated Logic Diagram (ALD), which originally ran on the IBM 704 and 705 mainframe computers. The design process started with engineers manually drafting logic schematics, which were later transcribed onto standardized templates and converted into punch cards for digital processing.[2][3]
Although focused on mechanical geometry, General Motors' DAC-1, built jointly with IBM, was among the earliest interactive, graphics-driven CAD systems and proved the practicality of screen-based editing for complex engineering data, an idea later adopted by IC layout tools.[4]
Prior to the development of EDA, integrated circuits were designed by hand and manually laid out.[5] Some advanced shops used geometric software to generate tapes for a Gerber photoplotter, which produced a monochromatic exposure image, but even those copied digital recordings of mechanically drawn components. The process was fundamentally graphic, with the translation from electronics to graphics done manually; the best-known company from this era was Calma, whose GDSII format is still in use today.[6]
In the early 1970s, developers started to automate circuit design in addition to drafting, and the first placement and routing tools were developed. Because of the Cold War, developments often occurred in near parallel. In the Western world, the proceedings of the IEEE and the Design Automation Conference catalogued the large majority of the developments of the time,[5] and by 1973 a large bibliography was needed to keep track of the field.[7] In the Soviet Union, progress was largely described in a series of books, starting in 1975.[8]
Calma's Graphic Design System (GDS, 1971) and its 32-bit successor GDSII (1978) let engineers digitise and edit full-chip layouts on minicomputers; the accompanying GDSII Stream file became the de facto mask-exchange standard and is still recognised in modern design flows.[9]
The next era began following the publication of "Introduction to VLSI Systems" by Carver Mead and Lynn Conway in 1980,[10] which is considered the standard textbook for chip design.[11] The result was an increase in the complexity of the chips that could be designed, with improved access to design verification tools that used logic simulation. Chips became easier to lay out and more likely to function correctly, since their designs could be simulated more thoroughly prior to construction. Although the languages and tools have evolved, this general approach of specifying the desired behavior in a textual programming language and letting the tools derive the detailed physical design remains the basis of digital IC design today.
The earliest EDA tools were produced academically. One of the most famous was the "Berkeley VLSI Tools Tarball", a set of UNIX utilities used to design early VLSI systems. Widely used were the Espresso heuristic logic minimizer,[12] responsible for reducing circuit complexity, and Magic,[13] a computer-aided design platform. Another crucial development was the formation of MOSIS,[14] a consortium of universities and fabricators that developed an inexpensive way to train student chip designers by producing real integrated circuits. The basic concept was to use reliable, low-cost, relatively low-technology IC processes and pack a large number of projects per wafer, with several copies of chips from each project remaining preserved. Cooperating fabricators either donated the processed wafers or sold them at cost, as they saw the program as helpful to their own long-term growth.
1981 marked the beginning of EDA as an industry. For many years, the larger electronic companies, such as Hewlett-Packard, Tektronix and Intel, had pursued EDA internally; managers and developers began to spin out of these companies to concentrate on EDA as a business. Daisy Systems, Mentor Graphics and Valid Logic Systems were all founded around this time and are collectively referred to as DMV. Also in 1981, the U.S. Department of Defense began funding VHDL as a hardware description language. Within a few years, there were many companies specializing in EDA, each with a slightly different emphasis.
The first trade show for EDA was held at the Design Automation Conference in 1984, and in 1986 Verilog, another popular high-level design language, was first introduced as a hardware description language by Gateway Design Automation. Simulators quickly followed these introductions, permitting direct simulation of chip designs and executable specifications. Within several years, back-ends were developed to perform logic synthesis.
Current digital flows are extremely modular, with front ends producing standardized design descriptions that compile into invocations of cell-like units without regard to their individual technology. Cells implement logic or other electronic functions using a particular integrated-circuit technology. Fabricators generally provide libraries of components for their production processes, with simulation models that fit standard simulation tools.
Most analog circuits are still designed manually, requiring specialist knowledge unique to analog design (such as matching concepts).[15] Hence, analog EDA tools are far less modular: many more functions are required, they interact more strongly, and the components are, in general, less ideal.
EDA for electronics has rapidly increased in importance with the continuous scaling of semiconductor technology.[16] Some users are foundry operators, who run the semiconductor fabrication facilities ("fabs"); others are design-service companies, who use EDA software to evaluate an incoming design for manufacturing readiness. EDA tools are also used to program design functionality into FPGAs (field-programmable gate arrays), customisable integrated-circuit devices.
The design flow is characterised by several primary components, including:
High-level synthesis (also known as behavioral synthesis or algorithmic synthesis) – the conversion of a high-level design description (e.g. in C/C++) into RTL, the register-transfer level, which represents circuitry as interactions between registers.
Logic synthesis – the translation of an RTL design description (e.g. written in Verilog or VHDL) into a discrete netlist, a representation of logic gates.
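As a rough illustration of what logic synthesis produces, the following Python sketch (an illustrative toy, not any real tool's algorithm; the gate names and netlist format are invented) lowers a Boolean expression tree into a flat netlist of gate instances:

```python
# Toy sketch: "synthesizing" a nested Boolean expression into a flat
# netlist of gate instances, the essence of what logic synthesis emits.
from itertools import count

_ids = count()

def new_net():
    """Allocate a fresh wire name for a gate output."""
    return f"n{next(_ids)}"

def synthesize(expr, netlist):
    """Recursively lower a nested tuple such as ("and", a, b) into
    gate instances appended to netlist; returns the output net name."""
    if isinstance(expr, str):  # a primary input, already a net
        return expr
    op, *args = expr
    nets = [synthesize(a, netlist) for a in args]
    out = new_net()
    gate = op.upper() + "2" if len(nets) == 2 else op.upper()
    netlist.append((gate, nets, out))
    return out

netlist = []
out = synthesize(("or", ("and", "a", "b"), ("not", "c")), netlist)
for gate, ins, o in netlist:
    print(gate, ins, "->", o)
# AND2 ['a', 'b'] -> n0
# NOT ['c'] -> n1
# OR2 ['n0', 'n1'] -> n2
```

A real synthesizer additionally optimizes the logic and maps it onto a fabricator's cell library rather than generic gate names.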
(Figure: simulated lithographic and other fabrication defects visible in small standard-cell metal interconnects.)
Transistor simulation – low-level simulation of a schematic/layout's behavior, accurate at the device level.
Logic simulation – digital simulation of an RTL or gate-netlist's digital (Boolean 0/1) behavior, accurate at the Boolean level.
Behavioral simulation – high-level simulation of a design's architectural operation, accurate at cycle-level or interface-level.
Hardware emulation – use of special-purpose hardware to emulate the logic of a proposed design. Can sometimes be plugged into a system in place of a yet-to-be-built chip; this is called in-circuit emulation.
Technology CAD – simulates and analyzes the underlying process technology. Electrical properties of devices are derived directly from device physics.
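The Boolean-level behavior targeted by logic simulation can be illustrated with a minimal Python sketch that evaluates a topologically ordered gate netlist, here a half adder (the netlist format and names are invented for illustration; real simulators are event-driven and handle timing, X states, and much more):

```python
# Minimal sketch of Boolean-level logic simulation over a gate netlist.
GATES = {
    "AND": lambda a, b: a & b,
    "OR":  lambda a, b: a | b,
    "XOR": lambda a, b: a ^ b,
    "NOT": lambda a: 1 - a,
}

# A half adder: entries are (gate type, input nets, output net),
# listed in topological order so every net is defined before use.
NETLIST = [
    ("XOR", ("a", "b"), "sum"),
    ("AND", ("a", "b"), "carry"),
]

def simulate(netlist, inputs):
    """Evaluate each gate in order, propagating 0/1 values."""
    values = dict(inputs)
    for gate, ins, out in netlist:
        values[out] = GATES[gate](*(values[i] for i in ins))
    return values

print(simulate(NETLIST, {"a": 1, "b": 1}))
# {'a': 1, 'b': 1, 'sum': 0, 'carry': 1}
```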
Functional verification: ensures the logic design matches specifications and executes tasks correctly. Includes dynamic functional verification via simulation, emulation, and prototypes.[17]
RTL linting: checking RTL code for adherence to coding rules such as syntax, semantics, and style.[18]
Formal verification, also model checking: attempts to prove, by mathematical methods, that the system has certain desired properties, and that some undesired effects (such as deadlock) cannot occur.
Equivalence checking: algorithmic comparison between a chip's RTL description and synthesized gate netlist, to ensure functional equivalence at the logical level.
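For a handful of inputs, equivalence checking can be illustrated by brute-force enumeration of all input assignments; production tools use SAT or BDD techniques instead, since enumeration explodes exponentially. A Python sketch (the two "views" below are made up for illustration):

```python
# Sketch of equivalence checking by exhaustive enumeration:
# feasible only for tiny circuits, but it shows the goal precisely.
from itertools import product

def equivalent(f, g, n_inputs):
    """True iff f and g agree on every input assignment."""
    return all(f(*bits) == g(*bits)
               for bits in product((0, 1), repeat=n_inputs))

# "RTL" view vs. "synthesized" view of the same function (De Morgan):
rtl   = lambda a, b: 1 - (a | b)        # NOR as specified
gates = lambda a, b: (1 - a) & (1 - b)  # NOT/AND/NOT implementation

print(equivalent(rtl, gates, 2))  # True
```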
Static timing analysis: analysis of the timing of a circuit in an input-independent manner, hence finding a worst case over all possible inputs.
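The core of static timing analysis is an input-independent longest-path computation over a graph of gate and wire delays. A minimal Python sketch over a hypothetical timing graph (node names and delays are invented; real tools also handle clocking, slew, and on-chip variation):

```python
# Sketch of static timing analysis as a longest-path search on a DAG.
from functools import lru_cache

# Hypothetical timing graph: node -> list of (successor, delay in ps).
EDGES = {
    "in":  [("g1", 0)],
    "g1":  [("g2", 1200), ("g3", 800)],
    "g2":  [("out", 500)],
    "g3":  [("out", 1500)],
    "out": [],
}

@lru_cache(maxsize=None)
def longest_delay(node):
    """Worst-case delay from node to any endpoint, over all paths."""
    if not EDGES[node]:
        return 0
    return max(d + longest_delay(succ) for succ, d in EDGES[node])

print(longest_delay("in"))  # 2300 -- the critical path via g1 -> g3
```

Because the analysis maximizes over every path, no input vectors are needed, which is exactly the "input-independent" property described above.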
Layout extraction: starting with a proposed layout, compute the (approximate) electrical characteristics of every wire and device. Often used in conjunction with static timing analysis above to estimate the performance of the completed chip.
Physical verification (PV): checking whether a design is physically manufacturable, such that the resulting chips will not have any function-preventing physical defects and will meet the original specifications.
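One elementary physical-verification check, minimum spacing between shapes on a layer, can be sketched in a few lines of Python (the rectangles and the design-rule value are made up for illustration; real design-rule checkers handle millions of polygons with spatial indexing):

```python
# Toy design-rule check: minimum spacing between same-layer rectangles,
# each given as (x1, y1, x2, y2) in layout units.
def spacing(r1, r2):
    """Gap between two axis-aligned rectangles (0 if they touch/overlap)."""
    dx = max(r1[0] - r2[2], r2[0] - r1[2], 0)
    dy = max(r1[1] - r2[3], r2[1] - r1[3], 0)
    return (dx * dx + dy * dy) ** 0.5

MIN_SPACING = 3.0  # assumed design-rule value

shapes = [(0, 0, 10, 2), (12, 0, 20, 2), (0, 6, 10, 8)]
violations = [
    (a, b)
    for i, a in enumerate(shapes)
    for b in shapes[i + 1:]
    if 0 < spacing(a, b) < MIN_SPACING  # overlaps are a separate check
]
print(violations)  # the first two shapes are only 2 units apart
```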
Yield analysis: estimating the yield (and hence the cost) of the manufactured chip, and identifying yield bottlenecks to suggest beneficial changes.
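A classic first-order yield estimate is the Poisson defect model, Y = exp(-A*D), where A is the die area and D is the defect density; the numbers below are assumed for illustration (real yield models, such as the negative-binomial model, account for defect clustering):

```python
# Illustrative yield estimate with the Poisson defect model.
import math

def poisson_yield(area_cm2, defects_per_cm2):
    """Expected fraction of defect-free dice: Y = exp(-A * D)."""
    return math.exp(-area_cm2 * defects_per_cm2)

y = poisson_yield(area_cm2=1.0, defects_per_cm2=0.2)
print(f"{y:.1%}")  # 81.9%
```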
Chip finishing: adding custom designations and structures to improve the manufacturability of the layout. Examples of the latter are a seal ring and filler structures.[19]
Producing a reticle layout with test patterns and alignment marks.
Mask generation – the generation of a flat mask image from the hierarchical design.
Automatic test pattern generation (ATPG) – the systematic generation of pattern data to exercise as many logic gates and other components as possible.
Built-in self-test (BIST) – the insertion of self-contained test controllers to automatically test a logic or memory structure in the design.
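On the stimulus side, BIST controllers commonly use a linear-feedback shift register (LFSR) to generate pseudo-random test patterns on chip. A Python sketch of a 4-bit maximal-length Fibonacci LFSR (the width, taps, and seed are chosen for illustration; hardware BIST also compresses responses into a signature):

```python
# Sketch of a BIST pattern generator: a 4-bit Fibonacci LFSR with
# feedback taps at bits 3 and 2 (polynomial x^4 + x^3 + 1), which
# cycles through all 15 nonzero states before repeating.
def lfsr_states(seed=0b0001, width=4, taps=(3, 2)):
    """Return the list of states visited until the sequence repeats."""
    state, seen = seed, []
    while True:
        seen.append(state)
        fb = 0
        for t in taps:
            fb ^= (state >> t) & 1          # XOR the tapped bits
        state = ((state << 1) | fb) & ((1 << width) - 1)
        if state == seed:
            return seen

states = lfsr_states()
print(len(states))  # 15: a maximal-length sequence for this polynomial
```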
Functional safety analysis: systematic computation of failure-in-time (FIT) rates and diagnostic coverage metrics for designs in order to meet the compliance requirements for the desired safety integrity levels.
Functional safety synthesis: adding reliability enhancements to structured elements (modules, RAMs, ROMs, register files, FIFOs) to improve fault detection and fault tolerance. This includes (but is not limited to) the addition of error detection and/or correction codes (e.g. Hamming), redundant logic for fault detection and fault tolerance (duplication/triplication), and protocol checks (interface parity, address alignment, beat count).
Functional safety verification: running a fault campaign, including insertion of faults into the design and verification that the safety mechanism reacts appropriately for the faults that are deemed covered.
Synopsys announced a planned acquisition of Ansys in 2024, but Ansys remains an independent, publicly traded company until the deal closes (expected H1 2025).[26]
Many EDA companies acquire small companies with software or other technology that can be adapted to their core business.[32] Most of the market leaders are amalgamations of many smaller companies, and this trend is helped by the tendency of software companies to design tools as accessories that fit naturally into a larger vendor's suite of programs on digital circuitry; many new tools incorporate analog design and mixed systems.[33] This is happening due to a trend to place entire electronic systems on a single chip.
Machine learning and artificial-intelligence techniques
Machine-learning methods are now applied at every major stage of the integrated-circuit design flow, from high-level synthesis through sign-off, shortening turnaround times and improving power, performance and area (PPA).[34] EDA vendors have since integrated similar optimization engines into production toolchains.[35]
The OpenROAD Project (Foundations and Realisation of Open, Accessible Design), launched under DARPA's IDEA program,[36] released a no-human-in-the-loop RTL-to-GDS flow that has successfully taped out designs.[37][38] Conferences such as ORConf and the annual FOSSi Foundation roadmap sessions now dedicate substantial tracks to open-source EDA progress.[39][40]
^ Emerson, Roger (2015). "The Legendary IBM 1401 Data Processing System". Proceedings of the IEEE. 103 (12): 2250–2254. doi:10.1109/JPROC.2015.2480703.
^ Krull, F. N. (1994). "The Origin of Computer Graphics within General Motors". IEEE Annals of the History of Computing. 16 (3): 40. doi:10.1109/MAHC.1994.298419. S2CID 17776315.
^ Vancleemput, W. M. (1973). Automated Design of Digital Systems: A Bibliography. University of Waterloo, Dept. of Applied Analysis and Computer Science.
^ Brayton, Robert K.; Hachtel, Gary D.; McMullen, Curt; Sangiovanni-Vincentelli, Alberto (1984). Logic Minimization Algorithms for VLSI Synthesis. Vol. 2. Springer Science & Business Media.
^ Ousterhout, John K.; Hamachi, Gordon T.; Mayo, Robert N.; Scott, Walter S.; Taylor, George S. (1985). "The Magic VLSI layout system". IEEE Design & Test of Computers. 2 (1): 19–30. Bibcode:1985IDTC....2...19O. doi:10.1109/MDT.1985.294681.
^ Lavagno, Martin, and Scheffer (2006). Electronic Design Automation for Integrated Circuits Handbook. Taylor and Francis. ISBN 0849330963.
Electronic Design Automation for Integrated Circuits Handbook, by Lavagno, Martin, and Scheffer, ISBN 0-8493-3096-3, 2006
The Electronic Design Automation Handbook, by Dirk Jansen et al., Kluwer Academic Publishers, ISBN 1-4020-7502-2, 2003; also available in German, ISBN 3-446-21288-4 (2005)
Combinatorial Algorithms for Integrated Circuit Layout, by Thomas Lengauer, ISBN 3-519-02110-2, Teubner Verlag, 1997.