3. Getting Started

3.1. Requirements

To build the SoC and deploy on FPGAs, the following set of tools will be required:

  1. Bluespec Compiler: This is required to compile the BSV-based SoC, core, and other devices to Verilog.

  2. Python3.7: Python 3.7 is required to configure compilation macros and clone dependencies.

  3. Verilator 4.08+: Verilator is required for simulation purposes. Those not interested in simulation can skip installing this tool.

  4. RISC-V Toolchain 9.2.0+: You will need to install the RISC-V GNU toolchain to be able to compile programs that can run on ChromiteM. The toolchain must be built for march=rv64imac.

  5. RISC-V OpenOCD: The ChromiteM SoC has a JTAG-based debugger that provides access to the GDB interface. For GDB to be able to communicate with the processor, OpenOCD is required. We will be using the FTDI and JLink drivers available in OpenOCD.

  6. Miniterm: Miniterm is required for communication with the debug port on the FPGAs.

  7. Vivado 2019+: Xilinx Vivado is required to generate bitstreams and program any Xilinx-based FPGAs.
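
For reference, a minimal sketch of building the RISC-V GNU toolchain for march=rv64imac is shown below. The install prefix /opt/riscv and the lp64 ABI are assumptions; the tools_setup.sh script described in the next section performs an equivalent build for you.

$ git clone --recursive https://github.com/riscv-collab/riscv-gnu-toolchain.git
$ cd riscv-gnu-toolchain
$ ./configure --prefix=/opt/riscv --with-arch=rv64imac --with-abi=lp64
$ make -j<jobs>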

3.1.1. Quick Tools Setup

If you are using a Debian-based OS like Ubuntu, you can install the above tools using a single build script as shown below. <jobs> is a number representing the number of parallel cores/threads to be used during the build, and <path> is the location where you would like to install the tools.

$ git clone https://gitlab.com/incoresemi/utils/common_utils.git
$ cd common_utils/chromitem_soc
$ sudo JOBS=<jobs> ./tools_setup.sh all <path>
$ source tools_setup.sh export_path <path>
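
For example, assuming 8 parallel jobs and /opt/chromite_tools as the install location (both are placeholders you may change):

$ sudo JOBS=8 ./tools_setup.sh all /opt/chromite_tools
$ source tools_setup.sh export_path /opt/chromite_tools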

You can also install individual tools using the same script.

$ ./tools_setup.sh

tools_setup 0.1.0
================================================

Tools dir : /home/..../chromite_tools...

Usage: sudo JOBS=<n> ./tools_setup.sh <command> <install_path>

Command to install the following tools. JOBS option when given
is passed on to make command during installation.

Available commands:
sudo ./tools_setup.sh help                       Displays help
sudo ./tools_setup.sh bsc <path>                 Install Bluespec Compiler
sudo ./tools_setup.sh verilator <path>           Install Verilator
sudo ./tools_setup.sh riscv_tools <path>         Install RISC-V GNU Toolchain
sudo ./tools_setup.sh openocd <path>             Install RISC-V OpenOCD
sudo ./tools_setup.sh dtc <path>                 Install DTC 1.4.7
sudo ./tools_setup.sh python_setup <path>        Install pyenv and Python 3.7.0
sudo ./tools_setup.sh all <path>                 Installs  all the above tools
sudo ./tools_setup.sh clean_builds <path> []     Removes all build dirs
                                                 Usage: sudo ./tools_setup.sh clean_builds <path> all
sudo ./tools_setup.sh clean_all <path> []        Removes all src and build dirs
                                                 Usage: sudo ./tools_setup.sh clean_all <path> all
sudo ./tools_setup.sh export_path <path>         Sets up path variable

If you are on a non-Debian OS, it is advised to install each of the above tools from their source repositories. Please refer to tools_setup.sh for any patches and configuration settings specific to ChromiteM.

3.1.2. Vivado

Xilinx Vivado is a proprietary tool required to generate bitstreams and program your FPGA.

  1. Create an account on Xilinx (https://www.xilinx.com/registration/create-account.html) and download the Vivado Design Suite 2019.2 Linux Unified Installer (https://www.xilinx.com/member/forms/download/xef.html?filename=Xilinx_Unified_2019.2_1106_2127_Lin64.bin).

  2. Run the Xilinx unified installer

$ chmod +x Xilinx_Unified_2019.2_1106_2127_Lin64.bin
$ ./Xilinx_Unified_2019.2_1106_2127_Lin64.bin
  3. Install Vivado Design Suite HLx with the default settings that are present in the installer.

  4. Install the Xilinx cable drivers. Assuming you set the install path in the previous step as /tools/Xilinx, run the following commands:

$ cd /tools/Xilinx/Vivado/2019.2/data/xicom/cable_drivers/lin64/install_script/install_drivers
$ sudo ./install_drivers
  5. Add the Vivado binary path to your $PATH variable by appending the following line to your .bashrc or .cshrc:

export PATH=$PATH:/tools/Xilinx/Vivado/2019.2/bin
  6. Add Vivado board files for Digilent FPGA boards:

$ git clone https://github.com/Digilent/vivado-boards.git
$ sudo cp -r vivado-boards/new/board_files/* /tools/Xilinx/Vivado/2019.2/data/boards/board_files/
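
As a quick sanity check once the PATH has been updated, you can confirm that the Vivado binary is visible; the command below should report the installed version (2019.2 in this setup):

$ vivado -version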

3.2. Building the FPGA Target

Perform the following steps to generate the bit-stream:

  1. Clone the repository:

$ git clone https://gitlab.com/incoresemi/fpga_ports/chromitem_soc.git
$ cd chromitem_soc
  2. Configure the FPGA board/target: you need to edit only the board_alias field in the file chromitem.yaml. This field takes the alias name of the board you are targeting; an example snippet is shown after the list below. The following board aliases are currently supported:

    • arty100t
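
For example, to target the Arty A7-100T board the relevant entry in chromitem.yaml would look like the excerpt below; the exact nesting of the field should be taken from the shipped YAML, and all other fields are left at their defaults:

board_alias: arty100t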

  3. Run the configuration script:

$ python -m configure.main
  4. Generate Verilog: change <jobs> to the number of parallel threads you want to use for the build.

$ make -j<jobs> generate_verilog
  5. Build the mcs file:

$ make generate_hexfiles ip_build fpga_build generate_mcs
  6. Flash the mcs file:

$ make program_mcs

Once the MCS file has been flashed you will have to power-cycle the board (either through a switch or by removing the USB cable). To confirm a successful build, open a serial connection to the UART via miniterm and check for the following output:

$ sudo miniterm -f direct --eol CRLF /dev/ttyUSB1 115200

--- Miniterm on /dev/ttyUSB1  115200,8,N,1 ---
--- Quit: Ctrl+] | Menu: Ctrl+T | Help: Ctrl+T followed by Ctrl+H ---

  _____        _____
 |_   _|      / ____|
   | |  _ __ | |     ___  _ __ ___
   | | | '_ \| |    / _ \| '__/ _ \
  _| |_| | | | |___| (_) | | |  __/
 |_____|_| |_|\_____\___/|_|  \___|


            Chromite M SoC
            Version: 0.9.2
  Build Date: 2020-05-30 21:07:54 IST
Copyright (c) 2020 InCore Semiconductors
  Available under Apache v2.0 License

3.3. Debug Programs

Once you have successfully built the FPGA target you can debug programs on the FPGA via GDB.

The Chromite M SoC repository includes some benchmark and sample programs which can be run/debugged. These are available in the software directory. For the full list of the available programs:

$ cd software/
$ make list-programs
program-list: coremarks dhrystone hello

To compile and debug a program on the FPGA:

$ make PROGRAM=dhrystone connect_gdb

This compiles the program available in the software/programs folder using riscv-gcc, connects OpenOCD to the FPGA target to enable debug mode, and launches GDB with the ELF loaded in memory. You can now use standard GDB commands like info reg, info all-registers, continue, etc. to debug and execute the program (see the sample session below).
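
A short, illustrative GDB session is sketched below; the breakpoint address matches the memory region used in the examples later in this section, and the exact output will differ on your setup:

(gdb) b *0x80000000
Breakpoint 1 at 0x80000000
(gdb) continue
(gdb) info reg
(gdb) stepi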

List of quick GDB commands

Table 3.1 Quick list of common GDB commands that can be used for debugging the core.

Command                  Description
c/continue               causes the core to resume execution from the current pc
i r                      list all integer registers and the current pc
info all-registers       list all registers and CSRs available on the core
info reg <csr-name>      list the current value of <csr-name>
stepi                    step through a single instruction
b *0x80000000            set a breakpoint at address 0x80000000
i b                      list all breakpoints
del <num>                delete breakpoint number <num>
x/x 0x80000000           list the 4 bytes of content at location 0x80000000
x/10x 0x80000000         list 10 consecutive 4-byte words starting from location 0x80000000
load <elf>               load the <elf> into memory
file <elf>               read the debug symbols of the <elf>
compare-sections         check whether the loading of an ELF was successful. Requires
                         running the file and load commands on the same ELF before
                         issuing this command.
monitor reset halt       resets the SoC except the debug module. Has to be followed by
                         the monitor gdb_sync and stepi commands for proper effect.

3.4. Simulating the SoC

The following can be used to simulate the non-proprietary components of the SoC. Currently the SoC uses the DDR IP from the FPGA vendor, which is therefore not available for simulation through open-source simulators like Verilator.

In this case, we replace the DDR controller slave with a simple RAM-based memory in the test-bench to enable simulation.

3.4.1. Verilator Executable

We will be using the open-source Verilator tool to build a simulation executable. Please ensure you have the latest version of Verilator installed, or at least version 4.08.
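
You can verify the installed Verilator version with the command below:

$ verilator --version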

The YAML for simulation configuration is chromitem_sim.yaml. Once you have changed the settings in chromitem_sim.yaml, use the following steps to generate a verilated executable:

$ python -m configure.main -ispec chromitem_sim.yaml
$ make generate_verilog -j<jobs>
$ make link_verilator

The above should generate a bin folder with the following file:

  • chromitem_soc: verilated executable

3.4.2. Memory Hex Files

As described in Memory Map, there are three memory sources in the SoC :

  • 256MB of DDR

  • 4KB Boot ROM

  • 16KB of OCM (On Chip Memory).

By default the ZSBL (Zeroth Stage Boot Loader) will print the InCore ASCII art along with the build information and either execute an ebreak or jump to a different region, depending on the boot config provided during simulation. The ZSBL execution can be skipped in simulation by changing the reset_pc parameter in chromitem_sim.yaml to point to any of the above three memories, as sketched below.
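
As an illustration only, the excerpt below points reset_pc at the assumed start of DDR so that the ZSBL is skipped; the actual field layout and base addresses must be taken from the shipped chromitem_sim.yaml and the Memory Map section.

# chromitem_sim.yaml (excerpt, illustrative values)
reset_pc: 0x80000000   # assumed DDR base address; boots directly from DDR, skipping the ZSBL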

The simulation also requires hex memory files for the ROM, RAM, and OCM memories. Default copies of these can be generated using the following single command:

$ make generate_hexfiles

You should now have the following files in the bin folder:

  • chromitem_soc: verilator executable of the SoC

  • boot.mem: hex file containing the ZSBL of Boot ROM

  • ddr.mem: hex file for the RAM memory containing a hello-world program

  • ocm.mem: hex file for the OCM memory containing a hello-world program

Note

The following describes how to generate and modify each of the memory files. These steps can be skipped.

3.4.2.1. Custom Memory files

The boot ROM files can be modified from the boot-code folder. The hex files can be generated using the following command:

$ cd boot-code; make; cd ..;
$ cp boot-code/outputs/boot.mem bin/

This should generate a single file, boot.mem, in the bin folder.

To generate the hex file for the RAM memory, perform the following steps:

$ cd software; make PROGRAM=hello ddrhex; cd ..;
$ cp software/build/ddr.mem bin/

You can change hello to dhrystone or other programs available in software/programs directory.
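
For example, to pack the dhrystone program (listed earlier by make list-programs) into the DDR hex file:

$ cd software; make PROGRAM=dhrystone ddrhex; cd ..;
$ cp software/build/ddr.mem bin/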

Similarly, to generate the hex file for the OCM memory, do the following:

$ cd software; make PROGRAM=hello ocmhex; cd ..;
$ cp software/build/ocm.mem bin/

You can change hello to dhrystone or other programs available in software/programs directory.

You can start simulation using the chromitem_soc executable as elaborated in the next section.

3.4.3. Simulating Executable

Once you have the chromitem_soc executable ready from the above steps, you can use the following steps to simulate that executable:

  • ./chromitem_soc: This will start simulating the core with reset_pc as defined in the YAML. Any printf statements used in the application will be available in the app_log file.

  • ./chromitem_soc +openocd: With this option the executable will wait for an OpenOCD and GDB connection. One can use the scripts available in the sim/gdb_setup folder.

  • ./chromitem_soc +boot0: This option will first execute the ZSBL, then execute an ebreak and halt. This is the default behavior if no boot option is provided.

  • ./chromitem_soc +boot1: This option will first execute the ZSBL and then jump to the OCM region.

  • ./chromitem_soc +boot2: This option will first execute the ZSBL and then jump to the DDR region.

Along with the above, you can also use the following options:

  • +rtldump: This will create an instruction trace dump of the execution in the rtl.dump file.
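
For example, assuming the boot and trace options can be combined, the following invocation runs the ZSBL, jumps to the DDR region, and also produces the instruction trace:

$ ./chromitem_soc +boot2 +rtldump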