TY - JOUR
T1 - ASAP
T2 - Accelerating Corner-Based Timing Analysis With Bayesian Active Self-Attention Neural Process
AU - Wang, Longze
AU - Xing, Wei W.
AU - Wang, Zhelong
AU - Sotiriou, Christos
AU - Sketopoulos, Nikolaos
AU - Xu, Ning
AU - Cheng, Yuanqing
N1 - Publisher Copyright:
© 1982-2012 IEEE.
PY - 2026
Y1 - 2026
N2 - With the advancement of modern nanoscale technology nodes, static timing analysis (STA) has become an indispensable technique for ensuring circuit reliability and performance across diverse process conditions. However, traditional STA methods scale poorly to the explosion of process corners in nanoscale fabrication technologies. Despite some seminal works using AI to accelerate such processes, they lack either reliability or stability. To this end, we introduce the active self-attention neural process (ASAP), a novel approach that addresses this challenge by combining the latest deep learning methods with classical Bayesian models to deliver scalable and accurate predictions, together with a self-calibration strategy to ensure reliability. Technically, ASAP integrates self-attention to identify and prioritize crucial features under varying input conditions and employs a neural process to make confidence-based predictions for the final timing results. Furthermore, ASAP is equipped with active learning for self-refinement and self-correction. Experimental evaluations on benchmark circuits demonstrate that our method surpasses state-of-the-art work by 18% in prediction accuracy.
AB - With the advancement of modern nanoscale technology nodes, static timing analysis (STA) has become an indispensable technique for ensuring circuit reliability and performance across diverse process conditions. However, traditional STA methods scale poorly to the explosion of process corners in nanoscale fabrication technologies. Despite some seminal works using AI to accelerate such processes, they lack either reliability or stability. To this end, we introduce the active self-attention neural process (ASAP), a novel approach that addresses this challenge by combining the latest deep learning methods with classical Bayesian models to deliver scalable and accurate predictions, together with a self-calibration strategy to ensure reliability. Technically, ASAP integrates self-attention to identify and prioritize crucial features under varying input conditions and employs a neural process to make confidence-based predictions for the final timing results. Furthermore, ASAP is equipped with active learning for self-refinement and self-correction. Experimental evaluations on benchmark circuits demonstrate that our method surpasses state-of-the-art work by 18% in prediction accuracy.
KW - Active learning
KW - multiprocess corners
KW - neural process (NP)
KW - self-attention mechanism (SAM)
KW - static timing analysis (STA)
UR - https://www.scopus.com/pages/publications/105008957236
U2 - 10.1109/TCAD.2025.3581474
DO - 10.1109/TCAD.2025.3581474
M3 - Article
AN - SCOPUS:105008957236
SN - 0278-0070
VL - 45
SP - 480
EP - 493
JO - IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems
JF - IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems
IS - 1
ER -