Issue link: https://iconnect007.uberflip.com/i/1295812
and small at the end. Thus, an analysis that exceeds the pass/fail threshold by a small amount at the start of the cycle might fail with a more detailed analysis. It's important that we have a sense of how big that error band is, so we can interpret the results of any particular analysis appropriately. Generally speaking, we group analysis results into three categories:

1. Green: The design has a significant positive margin, even after accounting for the error band in the model. We could perform a more accurate analysis to get a better assessment, but we may not need to. The design might be ready to fab-out as-is.

2. Red: The design has a significant negative margin, even considering the error band. The chances are that the design is broken, and something needs to change. Even though we could perform a more detailed assessment, it's probably best just to figure out what's wrong with the design and fix it.

3. Yellow: The degree of positive or negative margin is within the error band, so we're not really sure whether the design will work or not. A more detailed analysis is needed to be sure.

Process efficiency rises when the analysis result is green or red. In either case, the need for a more detailed analysis has been delayed, and design work can continue in the meantime. This doesn't eliminate the need to perform a complete post-layout verification before fab-out, and to have that verification performed by an SI expert. This final "gate" to fab-out is proven and valuable. The goal is to improve the quality of designs being passed to SI experts for verification so that fewer design changes are needed.

Shaughnessy: Are you saying you shouldn't have to be an analysis specialist to perform SI analysis?

Westerhoff: Exactly. Let's look at what happens in actual practice with DDR4 as an example. A lot of companies lay out their DDR4 designs based on the controller manufacturer's recommendations and then pass the layout back to the silicon vendor for review. Silicon vendors can't afford to set up and run detailed post-layout verification for all the designs customers want to pass their way, so they need to develop good design "screening" to find common problems. Think of it as trying to find 95% of the problems with 10% of the effort of a full post-layout analysis. Those screening efforts typically take one of two forms:

1. Automated electrical rule checks: These go beyond simple physical rule checks to include things like trace impedance, the proximity of return path vias, etc. The silicon vendors configure rules for their specific device technologies, allowing them to do the same screening an experienced designer might do, but much faster.

2. First-order simulation: Design volume and turnaround time make this simpler than what the vendor would use internally for sign-off. For example, the process might simulate data nets with generic technology models, computing a delay and figure of merit for each signal's quality. This simulation can be run quickly to identify any discrepancies between signals in a data bus.

The question becomes, "If the silicon vendors are already doing this type of screening on the designs customers give them for review, why not put those same processes directly in the hands of customers themselves?" That's what we're looking to do.
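To make the green/red/yellow triage described above concrete, here is a minimal sketch of how a margin might be compared against an error band. The function name, units, and example numbers are illustrative assumptions, not taken from any actual tool.

```python
# A minimal sketch of the green/red/yellow triage: compare a measured
# margin against the estimated error band of the analysis. All names
# and numbers here are hypothetical.

def classify_margin(margin_ps: float, error_band_ps: float) -> str:
    """Classify an analysis result by its timing margin.

    margin_ps     -- measured margin (positive = passing), in picoseconds
    error_band_ps -- estimated uncertainty of the analysis, in picoseconds
    """
    if margin_ps > error_band_ps:
        return "green"   # passes even if the model is off by the full error band
    if margin_ps < -error_band_ps:
        return "red"     # fails even giving the model full benefit of the doubt
    return "yellow"      # inside the error band; a more detailed analysis is needed

# Example: a 20 ps margin with a 50 ps error band is inconclusive.
print(classify_margin(20.0, 50.0))    # -> "yellow"
print(classify_margin(120.0, 50.0))   # -> "green"
print(classify_margin(-80.0, 50.0))   # -> "red"
```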
We already offer comprehensive electrical design rule checking, and we're bringing out a first-order pulse response analysis patterned after the techniques some of the silicon vendors use. You don't need any simulation model at all. You load your design, set up the analysis
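As a rough illustration of the first-order screening idea, the following hypothetical sketch assumes a quick simulation has already produced a delay and a figure of merit (FOM) for each signal, and it flags signals that stand out from the rest of the bus. All names, thresholds, and numbers are invented for illustration.

```python
# Hypothetical sketch of first-order bus screening: given a per-signal
# delay and figure of merit (FOM) from a quick pulse-response simulation,
# flag signals that deviate from the rest of the data bus.
from statistics import mean

def screen_bus(results, delay_tol_ps=25.0, fom_floor=0.8):
    """results: dict mapping signal name -> (delay_ps, fom in 0..1).
    Returns signals whose delay deviates from the bus average by more
    than delay_tol_ps, or whose signal-quality FOM falls below fom_floor."""
    avg_delay = mean(delay for delay, _ in results.values())
    flagged = {}
    for name, (delay, fom) in results.items():
        reasons = []
        if abs(delay - avg_delay) > delay_tol_ps:
            reasons.append(f"delay off bus average by {delay - avg_delay:+.0f} ps")
        if fom < fom_floor:
            reasons.append(f"FOM {fom:.2f} below floor {fom_floor}")
        if reasons:
            flagged[name] = reasons
    return flagged

# Example with made-up numbers for a 4-bit slice of a DDR4 data bus:
bus = {"DQ0": (510.0, 0.92), "DQ1": (505.0, 0.90),
       "DQ2": (560.0, 0.88), "DQ3": (508.0, 0.65)}
for sig, why in screen_bus(bus).items():
    print(sig, "->", "; ".join(why))   # flags DQ2 (delay) and DQ3 (FOM)
```

Comparing each signal against the bus average mirrors the stated goal of the screening pass: catching discrepancies between signals in a bus quickly, rather than reproducing a full sign-off analysis.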