Dynamic digital timing analysis aims to substitute highly accurate but slow analog simulations of digital circuits with less accurate but fast digital approaches, in order to facilitate tracing timing relations between individual transitions in a signal trace. This primarily requires gate delay models in which the input-to-output delay of a transition also depends on the signal history. We focus on a recently proposed hybrid delay model for CMOS multi-input gates, exemplified by a 2-input \NOR\ gate, which is the only delay model known to us that faithfully captures both single-input switching (SIS) and multi-input switching (MIS) effects, also known as ``Charlie effects''. Despite the simplicity of this first-order model, simulations have revealed that suitably parametrized versions of it predict the actual delays of \NOR\ gates accurately. However, the approach considers isolated gates, without their interconnect. In this work, we augment the existing model and its theoretical analysis with a first-order interconnect, and conduct a systematic evaluation of the resulting modeling accuracy: using SPICE simulations, we study both SIS and MIS effects on the overall delay of \NOR\ gates under variation of input driving strength, wire length, load capacitance, and CMOS technology, and compare the results to the predictions of appropriately parametrized versions of our model. Overall, our results reveal a surprisingly good accuracy of our fast delay model.