I. INTRODUCTION

The delay model is a fundamental problem in logic simulation, switch-level simulation, timing simulation, and timing analysis and verification. Most current delay models are empirical, and their results can differ greatly from the actual circuit delay. The main reason is the complexity of real circuit delay behavior: transistor sizes, threshold voltages, the interconnection of cells, load capacitance, input waveforms, and the way transistors are used in a circuit all affect its delay. Building on the authors' recent work, this paper proposes a theoretically self-consistent delay modeling method based on the concept of the delay potential. The resulting model is simple and more accurate than previous delay models, laying a foundation for the practical application of the node delay-potential equation theory.
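As an illustration of the kind of empirical model referred to above (this formula is not taken from the paper, but is a common textbook approximation), gate delay is often fitted by a linear, load-dependent expression:

$$
t_d \approx t_{\mathrm{intrinsic}} + R_{\mathrm{eff}}\, C_L ,
$$

where $t_{\mathrm{intrinsic}}$ is the unloaded gate delay, $R_{\mathrm{eff}}$ is an effective driving resistance fitted from measurement or simulation, and $C_L$ is the load capacitance. Such a fit does not capture input slope, threshold voltage, or the context-dependent use of transistors, which is one source of the discrepancy between empirical models and actual circuit delay noted above.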