Frequently asked questions
The following help topics provide answers to the most commonly asked questions about product registration and download. If you have any further questions, please don't hesitate to contact us!
There can be a few challenges, namely:
- As we move to more advanced technologies where devices keep shrinking, physical characteristics that were once negligible can become dominant challenges in semiconductor device design and validation. This requires advancements in EDA tools, which in turn may require additional input formats to model those new characteristics. Existing formats may also evolve.
- Parsers must therefore be kept up to date to handle the additional descriptions. Examples: multi-patterning in layout, noise/power/OCV/EM modelling, etc.
- New or extended formats subsequently require additional checks to handle the new information now present in the IP collateral.
- Besides, new technology nodes may push designers toward new architectural challenges, requiring different formats to be cross-checked in new ways. Example: with the introduction of multi-power-domain designs, checking that CDL and Liberty agree on power domains is paramount to ensure a synthesized design follows the proper circuit specifications.
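The kind of cross-format consistency check described above can be illustrated with a deliberately simplified sketch. The toy parsers and the hard-coded Liberty/CDL snippets below are illustrative assumptions only (real parsers must handle the full grammars, and this is not how Crossfire implements it): it extracts power/ground pins from Liberty `pg_pin` groups and from a CDL `.SUBCKT` header, then checks that every Liberty power pin appears in the CDL netlist.

```python
import re

# Toy snippets standing in for real IP collateral.
liberty_src = """
cell (INVX1) {
  pg_pin (VDD) { pg_type : primary_power; }
  pg_pin (VSS) { pg_type : primary_ground; }
}
"""

cdl_src = """
.SUBCKT INVX1 A Y VDD VSS
.ENDS
"""

def liberty_pg_pins(text):
    """Map each cell to its set of pg_pin names (naive line-based scan)."""
    pins, cell = {}, None
    for line in text.splitlines():
        m = re.search(r'cell\s*\(\s*(\w+)\s*\)', line)
        if m:
            cell = m.group(1)
            pins[cell] = set()
        m = re.search(r'pg_pin\s*\(\s*(\w+)\s*\)', line)
        if m and cell:
            pins[cell].add(m.group(1))
    return pins

def cdl_pins(text):
    """Map each .SUBCKT to its port list."""
    pins = {}
    for line in text.splitlines():
        if line.upper().startswith('.SUBCKT'):
            parts = line.split()
            pins[parts[1]] = set(parts[2:])
    return pins

lib = liberty_pg_pins(liberty_src)
cdl = cdl_pins(cdl_src)
for cell, pg in lib.items():
    missing = pg - cdl.get(cell, set())
    print(cell, 'OK' if not missing else f'missing in CDL: {missing}')
```

A production-grade version of such a check also has to resolve power-domain intent (e.g. related power pins per signal pin), which is exactly where hand-rolled scripts tend to fall short.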
Amount of data:
- Also, with the exponential growth in characterization corners seen in state-of-the-art technologies and the increased amount of modelling information per corner, processing all that data within a reasonable timeframe requires not only efficiently implemented parsers and checks, but also a scalable tool framework to manage the QA handling of all the data.
Homebrew QA solutions are typically only as good as the problems a team or company has already seen. Once a design team moves to a new technology node, a new IP vendor or new architectural changes, a plethora of new issues arise and the homebrew flow no longer has acceptable coverage. Moreover, the quality of homebrew solutions is typically not industry-grade (e.g. shortcuts are taken to get a check done, there is no standardization in how checks are written, debugged or regression-tested, etc.). Finally, the engineer who wrote a QA script may no longer be at the company, and that potentially non-industry-grade code may have to be discarded and rewritten from scratch by another engineer.
Crossfire is backed by most of the top semiconductor companies in the world. Fractal's customer portfolio spans from consumer electronics and mil-aero to design houses and foundries, meaning it can leverage expertise on problems exercised across a much broader set of technologies and design types than any single company could. All built-in checks in Crossfire have been developed in collaboration with its users, but they are fully maintained and enhanced by Fractal. So, when customers need a new check, they either ask Fractal to implement their spec or receive help to write their own custom checks using the Fractal parsers and APIs provided through the Crossfire framework. The Crossfire framework, besides providing a set of built-in checks that grows every month, also provides an unparalleled infrastructure for debugging, waiving, reporting and vendor-customer QA handshaking that no homebrew solution could provide in a standardized manner.
3rd-party design tools for characterization, synthesis and even verification may use their parsers to report syntax issues, but they have no insight into the validity of the inputs (and less so their outputs: garbage in, garbage out). Each of these tools may read only a subset of the files that become part of the IP deliverables, and they are not designed to read the different format types of the collateral, cross-check data consistency among all those formats, or use custom input parameters to ensure the data is within spec. Also, different vendors may read the same data using different parsers, so what can be read in by one tool may not be by another. Writing parsers that are fully compliant with the industry standards is at the core of what Fractal does (so that data can be read by any tool that supports the format).
IP consumers typically have no control over how IP vendors do QA. The responsibility for defining the quality standard of external IP to be integrated into a design has to be in the hands of whoever pays for tape-out re-spins: the company whose design integrates that external IP. The IP provider also cannot foresee all the scenarios in which customers may use their IP, making decent QA coverage hard to achieve. It is essential, then, that external IP is properly QA'd prior to integration. Crossfire provides a handshaking QA deck called CFS (Crossfire Setup), which allows IP providers and IP consumers to communicate and tune the QA parameters with the minimum number of cycles (i.e. if an IP integrator finds an issue in a vendor's IP, they can share with the vendor how to reproduce the QA issues with Crossfire, so that the vendor re-sends the IP only when the Crossfire report is in an "all-pass" state).
All checks in Crossfire have been specified in collaboration with Fractal's customers. When a user sees the need for a check not yet available in Crossfire, the user provides a spec and Fractal judges whether the check is beneficial for the tool portfolio or too customer-specific. In the former case, Fractal takes responsibility for developing and maintaining the new check; in the latter, the user can use the Fractal API to navigate their data in whichever way they see fit.
The number of built-in checks in Crossfire hovers in the 300s and grows every month. However, users typically pick between 100 and 200 of these checks, as not all checks fit all design flows.
The most fundamental checks are cell and pin consistency across formats, arc presence checks, corner trend checks, power domain checks, and layout vs. layout/abstract checks.
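To give an intuition for one of these categories, a corner trend check verifies that characterized values behave sensibly across corners, e.g. that a timing arc's delay grows monotonically from the fast to the slow corner. The sketch below is a hypothetical illustration with hard-coded delays (in a real flow the values would be extracted from the Liberty file of each corner; the corner names and arcs here are invented):

```python
# Corners ordered fast -> slow; delays per arc follow the same order.
corners = ['ff_0p88v_125c', 'tt_0p80v_25c', 'ss_0p72v_m40c']
arc_delays = {
    ('INVX1', 'A->Y'): [0.012, 0.021, 0.038],     # monotonic: fine
    ('NAND2X1', 'A->Y'): [0.020, 0.034, 0.031],   # dips at the slow corner: suspicious
}

def trend_violations(delays):
    """Return arcs whose delay does not increase monotonically fast -> slow."""
    bad = []
    for arc, vals in delays.items():
        if any(a > b for a, b in zip(vals, vals[1:])):
            bad.append(arc)
    return bad

print(trend_violations(arc_delays))  # → [('NAND2X1', 'A->Y')]
```

A non-monotonic trend does not always indicate a characterization bug, which is why such checks typically support tolerances and waivers rather than hard failures.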
Contact us and ask for an introduction and demo of Crossfire. Depending on your region, we will allocate one of our engineering resources to go through the tool's details and take the opportunity to discuss the exit criteria for an eventual evaluation. Typical evaluations with proper tool coverage take around two weeks.
There are two fundamental work models, depending on whether the IP is designed in-house or acquired from an external provider (vendor):
1) Regression testing: as the IP is developed, its set of collateral can be checked daily in a regression-testing fashion in order to detect issues in the flow as early as possible.
2) Sign-off only: before an IP from an external vendor can be integrated into a design, that IP needs to be checked. Issues found then need to be reported back to the vendor so that a new version of the IP (without the issues) can be released. The Crossfire Setup (CFS) operates as a QA deck which allows customers and vendors to communicate QA results and significantly reduce the turnaround time for a new IP version.
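The regression-testing model boils down to running a fixed battery of checks over each day's collateral and surfacing failures early. The following is a minimal sketch of such a driver loop, assuming invented check functions and a toy collateral dictionary (it does not reflect Crossfire's actual interface):

```python
# Hypothetical regression driver: run each check over the day's collateral
# and report the names of failing checks; an empty list means all-pass.

def check_cells_consistent(collateral):
    """Cell list in Liberty must match the cell list in LEF."""
    return collateral['lib_cells'] == collateral['lef_cells']

def check_pins_consistent(collateral):
    """Pin list in Liberty must match the pin list in CDL."""
    return collateral['lib_pins'] == collateral['cdl_pins']

CHECKS = [check_cells_consistent, check_pins_consistent]

def run_regression(collateral):
    failures = [c.__name__ for c in CHECKS if not c(collateral)]
    for name in failures:
        print('FAIL:', name)
    return failures

# Toy collateral with a deliberate pin mismatch between Liberty and CDL.
collateral = {
    'lib_cells': {'INVX1', 'NAND2X1'}, 'lef_cells': {'INVX1', 'NAND2X1'},
    'lib_pins': {'A', 'Y'}, 'cdl_pins': {'A', 'Y', 'Z'},
}
print(run_regression(collateral))  # → ['check_pins_consistent']
```

In a nightly job the failure list would gate the release of the collateral, which is what makes issues visible days rather than weeks after they are introduced.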
Crossfire customers range from consumer electronics and foundries to mil-aero industries. In many cases they intersect, and Crossfire is the link that ensures QA metrics are met on all ends of IP development.
Release notes for the weekly development and quarterly production releases can be found at our Downloads page.