Module-Level Verification Project Flow (Memory Controller Used as Reference)

Work in the VLSI industry can broadly be classified into three categories.

  • SoC Design & Verification
  • Subsystem Design & Verification
  • Module level(or IP) Design & Verification

A VLSI engineer should plan the initial part of their career so as to work on projects in all three of the above domains. Even though these domains differ, they follow common steps in how design & verification is approached. This article focuses on the detailed steps that go into a module-level verification project; the majority of these items are applicable to subsystem & SoC projects as well.

  • Design Specification understanding
  • Listing down of Features
  • Listing down scenarios
  • Test plan development
  • Test Bench architecture
  • Testbench component coding & integration
  • Sanity Testcase development
  • Testcase coding & debugging
  • Regression setup & debugging testcases
  • Verification closure using regression & coverage criteria

Reading Specification

  • Design Architecture understanding
  • Wishbone interface
  • Memory Interface
  • Configuration and status registers
  • Memory Timing Controller
  • Power down Controller
  • Open Bank & Row Tracking
  • Refresh Controller
  • Power on Config
  • Address & Mux Counter
  • Data Latch Packer and Parity
  • Design interface spec reading: Wishbone, SRAM, SDRAM, Flash, synchronous chip-select (CS) devices
  • Understand the different memory interface signals, timing diagrams, and features supported
  • SDRAM features like write, read, bank-based access, precharge, activate, refresh, auto refresh, and burst termination, and how these features are achieved using the different signals at the SDRAM interface
  • Significance of MODE register in SDRAM functionality
  • The design has configuration & status registers to configure its behaviour for different requirements
  • CSR, POC, BA_MASK, CSC[0..7], TMS[0..7]
  • Detailed understanding of the design sub-components and how they interact
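
The MODE register mentioned above follows the standard JEDEC SDR SDRAM layout; the sketch below shows the usual field positions, but they must be confirmed against the datasheet of the memory actually connected.

```systemverilog
// Sketch of the standard SDR SDRAM MODE register fields (JEDEC layout);
// confirm bit positions against the actual memory vendor datasheet.
typedef struct packed {
  logic       wr_burst_mode; // A9    : 0 = burst write, 1 = single-location write
  logic [1:0] op_mode;       // A8:A7 : 00 = standard operation
  logic [2:0] cas_latency;   // A6:A4 : e.g. 2 or 3 cycles
  logic       burst_type;    // A3    : 0 = sequential, 1 = interleaved
  logic [2:0] burst_length;  // A2:A0 : 1, 2, 4, 8, or full page
} sdram_mode_reg_t;
```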

Listing down of Features

  • The list of things you notice on a quick look through the design specification. In most cases the spec gives a list of features in the design description; however, the user needs to glance through the whole spec to list them all:
  • Chip select
  • Power on Configuration
  • Memory controller powering down
  • Different memory type access
  • SDRAM Refresh
  • Memories of different sizes, bus width, bank number, bank size, row & column count
  • Memories of different timing parameters
  • Parity enable & disable
  • Burst access to memories

Listing down scenarios (SN: scenario, NSN: error scenario)

  • Chip select
  • SN#1: All chip select enabled, same memory type connected to all chip selects. All memories accessed concurrently
  • SN#2: Different memories connected to different chip selects, concurrent access to all memories
  • NSN#3: Accessing a memory whose chip select is not enabled
  • NSN#4: Connecting different SDRAM types to same chip select. Do the same for SSRAM.
  • NSN#5: No memory connected to CS0, while the other chip selects have memories connected
  • Power On configuration
  • SN#6 : Check the POC reg value on power on reset
  • Memory Controller powering down
  • TODO
  • In total, 30-40 scenarios for the above features, including the error-case scenarios
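
A scenario such as SN#1 can be captured as a constrained-random configuration class; the class, enum, and field names below are purely illustrative, not the actual course code.

```systemverilog
// Hypothetical scenario configuration class; all names are illustrative.
typedef enum {SDRAM, SSRAM, FLASH, SCS} mem_type_e;

class scenario_cfg;
  rand bit        cs_en   [8]; // per-chip-select enable
  rand mem_type_e mem_type[8]; // memory type connected to each chip select

  // SN#1: all chip selects enabled, same memory type on every chip select
  constraint sn1_c {
    foreach (cs_en[i])    cs_en[i] == 1;
    foreach (mem_type[i]) mem_type[i] == mem_type[0];
  }
endclass
```

Dropping or relaxing the constraint block turns the same class into SN#2 (different memories on different chip selects).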

Test plan development

  • Each of the above scenarios is converted into one or more testcases
  • All tests are put in spreadsheet format, with test name, feature, test description, etc.

Test Bench architecture

  • Decide which components go into the testbench
  • Top module, Program, Env, BFM, Generator, Monitor, Functional coverage, reference model, checker, scoreboard, assertions, Interfaces, Mailboxes
  • Understanding Functionality of each component & coding each component
  • BFM:
  • Get the txs from the generator using a mailbox
  • Drive the tx on the interface as per the protocol
  • Get the response back from the DUT
  • Convert the response to object format and give it to the generator
  • Generator:
  • Get the test case information either as test number or text file
  • Generate scenario as per test requirements
  • Put the transaction in to BFM connected mailbox
  • Get the response transaction from BFM connected mailbox
  • Monitor:
  • Monitor interface signals, validate the transaction at clock positive edge
  • Populate the transaction for different phases(Address, data, response)
  • Put the transaction in to reference model & Coverage model mailbox
  • Coverage:
  • Define the covergroup; construct the covergroup in the new() function
  • Get the Tx from monitor mailbox, trigger covergroup sampling event
  • Assertions(module):
  • Declare all the interface signals as module ports
  • List down all the checks on the interface protocol signals. Timing diagram & transaction understanding makes this a lot easier
  • Bind design-internal signals to a local module in the testbench environment if there is a need to write assertions on design-internal interface interconnects
  • Understand how to write sequences, using sequences in property definitions, how to cover & assert a property
  • Note: assume(property) is used in Formal verification
  • Register Model
  • All the design configuration & status registers coded in class format
  • The register values in register model should always be kept in sync with design values.
  • Reference Model(Class):
  • Get all the design interface inputs in transaction format from the monitor mbox
  • If the transaction is meant for configuring a register, update the register value in the register model.
  • If the transaction is meant for a slave device, convert it into the slave-interface format as the expected transaction; this will be compared with the actual slave-interface transaction.
  • Checker(class):
  • Get the expected transaction from the reference model & the actual transaction from the design (slave) interface. Compare these two txs. Update the static match or mismatch field based on the comparison.
  • Match & Mismatch count will be used for checking testcase pass/fail status.
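
The component patterns above (the mailbox handshake between generator and BFM, the interface handle passed in at construction, and the checker's static match/mismatch counts) can be sketched as below; all class, signal, and field names are illustrative, not the actual course code.

```systemverilog
// Illustrative sketch: minimal generator -> BFM mailbox handshake and a
// checker with static match/mismatch counts used for end-of-test pass/fail.
interface wb_if (input logic clk);      // hypothetical Wishbone-style interface
  logic        cyc, stb, ack, we;
  logic [31:0] adr, dat;
endinterface

class wb_tx;                            // hypothetical transaction class
  rand bit [31:0] addr, data;
  rand bit        write;
endclass

class wb_gen;
  mailbox #(wb_tx) gen2bfm;
  function new(mailbox #(wb_tx) mbx);
    gen2bfm = mbx;                      // mailbox passed in as a new() argument
  endfunction
  task run();
    wb_tx tx = new();
    assert (tx.randomize());            // scenario-specific constraints go here
    gen2bfm.put(tx);                    // hand the tx to the BFM
  endtask
endclass

class wb_bfm;
  mailbox #(wb_tx) gen2bfm;
  virtual wb_if    vif;                 // interface handle, also passed in new()
  function new(mailbox #(wb_tx) mbx, virtual wb_if v);
    gen2bfm = mbx; vif = v;
  endfunction
  task run();
    wb_tx tx;
    forever begin
      gen2bfm.get(tx);                  // blocking get from the generator
      // ... drive tx on vif as per the Wishbone protocol, collect the
      // response from the DUT and return it to the generator as an object
    end
  endtask
endclass

class mc_checker;
  static int match_cnt, mismatch_cnt;   // read at end of test for pass/fail
  function void compare(wb_tx exp, wb_tx act);
    if (exp.addr == act.addr && exp.data == act.data) match_cnt++;
    else                                              mismatch_cnt++;
  endfunction
endclass
```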
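
The coverage-component pattern (covergroup defined in the class, constructed in new(), sampled when a tx arrives from the monitor mailbox) looks roughly like this; it assumes a hypothetical wb_tx transaction class with addr and write fields.

```systemverilog
// Illustrative coverage component; wb_tx with addr/write fields is assumed.
class wb_cov;
  mailbox #(wb_tx) mon2cov;
  wb_tx tx;

  covergroup wb_cg;                     // define the covergroup
    cp_write : coverpoint tx.write;
    cp_cs    : coverpoint tx.addr[31:24] { bins cs[] = {[0:7]}; }
    cross cp_write, cp_cs;
  endgroup

  function new(mailbox #(wb_tx) mbx);
    mon2cov = mbx;
    wb_cg   = new;                      // create the covergroup in new()
  endfunction

  task run();
    forever begin
      mon2cov.get(tx);                  // tx from the monitor mailbox
      wb_cg.sample();                   // trigger covergroup sampling
    end
  endtask
endclass
```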
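
An interface assertion module along the lines described (signals brought in as module ports, a sequence used inside a property, and the property both asserted and covered) might look like this sketch for a Wishbone-style handshake; the 16-cycle bound is an assumed example, not from the spec.

```systemverilog
// Sketch: Wishbone-style handshake check, signals taken in as module ports.
module wb_assertions (
  input logic clk, rst_n,
  input logic cyc, stb, ack
);
  // Sequence: a request is cyc & stb asserted together
  sequence s_req;
    cyc && stb;
  endsequence

  // Property: every request is acknowledged within 1 to 16 cycles
  // (the bound of 16 is an assumption for illustration, not from the spec)
  property p_ack_follows_req;
    @(posedge clk) disable iff (!rst_n)
      s_req |-> ##[1:16] ack;
  endproperty

  a_ack : assert property (p_ack_follows_req);
  c_ack : cover  property (p_ack_follows_req);
endmodule
// This module can be attached to the DUT or interface with a `bind` statement.
```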

Testbench component connection based on architecture

  • Top Module -> interfaces, design instance, program block, assertion module instance, clock & reset generation, and most importantly the memory Verilog models (downloaded from the memory vendors' websites)
  • Use defines like CS0_SDRAM_CON to indicate that SDRAM is connected to CS0; likewise CS0_SSRAM_CON, etc. Do the same for all chip selects. The testcase currently being run gives the information on what type of memory is connected to each chip select
  • However, CS0_SDRAM_CON & CS0_SSRAM_CON can't both be defined in the same testcase run, since that would indicate connection of different memories to the same chip select
  • These defines should be used when coding the memory monitor and when coding the register model for the TMS register.
  • Interface -> Signal declaration, Modport, Clocking block, task & function definition
  • Design: Comes as one of the inputs for functional verification. Know the top module
  • Program-> top_env -> wb_env, mem_env, Memory Controller scoreboard, Memory Controller checker
  • wb_env -> wb_bfm, wb_gen, wb_mon, wb_cov
  • mem_env -> mem_monitor
  • All the connections are made by passing mailboxes & interfaces as arguments to each component's new() function
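
The define-based memory hookup in the top module can be sketched as below; the macro names follow the convention mentioned above, while the model module names and ports depend on the vendor model downloaded.

```systemverilog
// Sketch of compile-time memory selection for CS0 in the top module.
// Only one of CS0_SDRAM_CON / CS0_SSRAM_CON may be defined per testcase run;
// using `elsif makes SDRAM win if both are accidentally defined.
`ifdef CS0_SDRAM_CON
  sdram_model u_cs0_mem ( /* vendor model port connections */ );
`elsif CS0_SSRAM_CON
  ssram_model u_cs0_mem ( /* vendor model port connections */ );
`endif
// Repeat the same pattern for CS1..CS7.
```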

Sanity Testcase Coding

  • Check whether register configuration happens correctly and whether traffic flow happens correctly
  • Check the txs at the different interfaces to validate the testcase flow
  • At this point the checker & scoreboard may not have been coded yet, so only waveforms confirm test pass/fail

Testcase coding & debugging

  • Code all the testcases, Run the tests
  • Testcase coding by updating the case statement in the wb_gen::run method with the required scenario. It involves creating an object and randomizing it as per the scenario requirements
  • Alternatively, testcases are coded as text files, which are parsed by a C/Perl program; the array of testcase input parameters (integers) is passed to the program block using DPI, which in turn passes it to wb_gen, which uses these parameters for test scenario generation. The parameters can be used for both configuring registers & traffic generation.
  • Debug failures
  • Issues can be in Design & testbench
  • Knowing the design sub-block connections, and having a good idea of how a transaction flows through the design, helps trace design issues quite easily.
  • Assertion based debugging
  • Debugging using log file information
  • Debugging using Waveform tracing
  • Signal tracing through design source files
  • Schematic tracing through the QuestaSim tool
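
Testcase selection inside wb_gen::run can be sketched as a case statement over the test number; the scenario bodies, the 'h100 POC address, and the wb_tx transaction class (with rand addr/data/write fields) are all assumptions for illustration.

```systemverilog
// Illustrative sketch of scenario selection inside wb_gen; a wb_tx class
// with rand addr/data/write fields is assumed.
class wb_gen;
  int              test_num;  // test number passed in from the program block
  int              num_txs;   // number of transactions to generate
  mailbox #(wb_tx) gen2bfm;

  task run();
    wb_tx tx;
    case (test_num)
      1: begin // SN#1: all chip selects enabled, concurrent access
           repeat (num_txs) begin
             tx = new();
             assert (tx.randomize()); // scenario constraints applied here
             gen2bfm.put(tx);
           end
         end
      6: begin // SN#6: read the POC register value after power-on reset
           tx = new();
           assert (tx.randomize() with { write == 0; addr == 'h100; }); // assumed POC address
           gen2bfm.put(tx);
         end
      default: $fatal(1, "Unknown test number %0d", test_num);
    endcase
  endtask
endclass
```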

Regression setup & debugging testcases

  • A Perl script is used to set up the regression
  • Debug failing tests
  • Analyze coverage
  • Write more constrained random/Directed tests to achieve 100% Functional & Code coverage

Closing verification

  • 100% passing of all testcases
  • 100% functional coverage
  • 100% code coverage