
ASAP: Accelerating Corner-Based Timing Analysis With Bayesian Active Self-Attention Neural Process

  • Longze Wang
  • Wei W. Xing
  • Zhelong Wang
  • Christos Sotiriou
  • Nikolaos Sketopoulos
  • Ning Xu
  • Yuanqing Cheng*

  *Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

With the advancement of modern nanoscale technology nodes, static timing analysis (STA) has become an indispensable technique for ensuring circuit reliability and performance across diverse process conditions. However, traditional STA methods scale poorly with the explosion of process corners in nanoscale fabrication technologies. Despite some seminal work on using AI to accelerate this process, existing approaches lack either reliability or stability. To this end, we introduce the active self-attention neural process (ASAP), a novel approach that addresses this challenge by combining recent deep learning methods with classical Bayesian models to deliver scalable and accurate predictions, together with a self-calibration strategy to ensure reliability. Technically, ASAP integrates self-attention to identify and prioritize crucial features under various input conditions and employs a neural process to make confidence-based predictions of the final timing results. Furthermore, ASAP is equipped with active learning for self-refinement and self-correction. Experimental evaluations on benchmark circuits demonstrate that our method surpasses state-of-the-art STA work by 18% in prediction accuracy.
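The abstract's core loop, combining attention-based prediction, a confidence (variance) estimate, and active selection of the next corner to simulate, can be illustrated with a minimal NumPy sketch. This is not the paper's ASAP model; it is a hypothetical 1-D stand-in where RBF attention over already-simulated corners yields a mean prediction and a variance proxy, and active learning queries the corner with the largest uncertainty.

```python
import numpy as np

def attention_weights(queries, keys, scale=0.5):
    # RBF (kernel) attention between target corners and context corners;
    # stands in for the learned self-attention in the paper.
    d = queries[:, None] - keys[None, :]
    logits = -(d ** 2) / (2 * scale ** 2)
    w = np.exp(logits - logits.max(axis=1, keepdims=True))
    return w / w.sum(axis=1, keepdims=True)

def predict(x_ctx, y_ctx, x_tgt, scale=0.5):
    """Attention-weighted mean and a simple variance proxy per target corner."""
    w = attention_weights(x_tgt, x_ctx, scale)
    mean = w @ y_ctx
    # Variance proxy: weighted spread of context delays around the mean
    # (always >= 0 by Jensen's inequality; epsilon keeps it strictly positive).
    var = w @ (y_ctx ** 2) - mean ** 2 + 1e-6
    return mean, var

def active_pick(x_ctx, y_ctx, x_pool):
    """Active learning step: query the pool corner with the largest variance."""
    _, var = predict(x_ctx, y_ctx, x_pool)
    return int(np.argmax(var))

# Toy surrogate: path delay as a function of a 1-D "corner" parameter.
x_ctx = np.array([0.0, 0.2, 0.9, 1.0])   # corners already simulated
y_ctx = np.sin(3 * x_ctx)                # stands in for measured delays
x_pool = np.linspace(0.0, 1.0, 11)       # unlabeled candidate corners
idx = active_pick(x_ctx, y_ctx, x_pool)
print("next corner to simulate:", x_pool[idx])
```

The design point mirrored here is that the same predictive distribution serves two purposes: its mean replaces an expensive STA run, and its variance drives self-refinement by deciding which corner is worth a ground-truth simulation next.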

Original language: English
Pages (from-to): 480-493
Number of pages: 14
Journal: IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems
Volume: 45
Issue number: 1
DOI
Publication status: Published - 2026

