Running the Tutorials

This is a quick reference guide for running the precompiled tutorials. For in-depth explanations, see the individual Tutorials pages.

Tutorials are implemented as plugins which can be enabled by setting the corresponding environment variables in the config/<setup-type>/.env file. The following sections assume that the system is started in the rfsim configuration without connected hardware.
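For example, a config/rfsim/.env file that enables one of the plugins below contains a line of the following form (the exact values for each plugin are given in the corresponding sections):

GNB_EXTRA_OPTIONS=--loader.ldpc.shlibversion _cuda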

After setting the environment variable, the system can be started via:

# Start the system (rfsim or other configs in config/)
./scripts/start_system.sh rfsim

# Stop the system
./scripts/stop_system.sh

# Check running containers
docker compose ps

# View gNB logs
docker compose logs -f oai-gnb

GPU-Accelerated LDPC

Enable the CUDA-accelerated LDPC decoder by updating your configuration (config/rfsim/.env file).

GNB_EXTRA_OPTIONS=--loader.ldpc.shlibversion _cuda

Start the system:

./scripts/start_system.sh rfsim

Verify the plugin is loaded by checking the gNB logs:

[LOADER] library libldpc_cuda.so successfully loaded
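For example, you can filter the gNB container logs for this line (assuming the container is named oai-gnb, as in the commands above):

docker compose logs oai-gnb | grep -i libldpc_cuda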

Demapper Capture Plugin

To capture IQ samples and LLRs using the capture plugin, create the log files in the plugins/data_acquisition/logs directory:

mkdir -p plugins/data_acquisition/logs
cd plugins/data_acquisition/logs
touch demapper_in.txt demapper_out.txt
chmod 666 demapper_in.txt demapper_out.txt

The results will be written into these files.

Set the environment variable:

GNB_EXTRA_OPTIONS=--loader.demapper.shlibversion _capture

Start the system:

./scripts/start_system.sh rfsim

Verify the plugin is loaded by checking the gNB logs:

[LOADER] library libdemapper_capture.so successfully loaded

Inspect the captured data via:

cat plugins/data_acquisition/logs/demapper_in.txt
# Output: timestamps, modulation, IQ values...
cat plugins/data_acquisition/logs/demapper_out.txt
# Output: timestamps, modulation, LLR values...
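To watch new samples arrive while the system is running, a simple option (assuming the same file locations) is:

tail -f plugins/data_acquisition/logs/demapper_out.txt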

TensorRT Neural Demapper

Build the TensorRT engine using the plugins/neural_demapper/scripts/build-trt-plans.sh script. This is done automatically during installation of the Sionna Research Kit.
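If you want to (re)build the engine manually, e.g. after modifying the model, running the script directly should be enough; this is a sketch assuming the script takes no mandatory arguments:

./plugins/neural_demapper/scripts/build-trt-plans.sh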

Run the neural demapper inference using TensorRT by setting the environment variable:

# we limit the MCS indices to 10 in order to stay within the 16-QAM modulation order
GNB_EXTRA_OPTIONS=--loader.demapper.shlibversion _trt --MACRLCs.[0].dl_max_mcs 10 --MACRLCs.[0].ul_max_mcs 10

It will automatically load the TRT engine as defined in plugins/neural_demapper/config/demapper_trt.config.
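To check which engine will be loaded, you can inspect that file (its exact contents depend on your installation):

cat plugins/neural_demapper/config/demapper_trt.config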

Start the system:

./scripts/start_system.sh rfsim

Verify the plugin is loaded by checking the gNB logs:

[LOADER] library libdemapper_trt.so successfully loaded
Initializing TRT demapper (TID 20)
Initializing TRT runtime 20

Neural Receiver

Build the TensorRT engine using the plugins/neural_receiver/scripts/build-trt-plans.sh script. This is done automatically during installation of the Sionna Research Kit.

Run the neural receiver inference using TensorRT by setting the environment variable:

# we limit the MCS indices to 10 in order to stay within the 16-QAM modulation order
GNB_EXTRA_OPTIONS=--loader.receiver.shlibversion _trt --MACRLCs.[0].dl_max_mcs 10 --MACRLCs.[0].ul_max_mcs 10

Start the system:

./scripts/start_system.sh rfsim

Verify the plugin is loaded by checking the gNB logs:

[LOADER] library libreceiver_trt.so successfully loaded
Initializing TRT receiver (TID 20)
Initializing TRT runtime 20

If the receiver is running, you can also see the live statistics in the gNB logs. Note that this requires traffic to be scheduled on the PUSCH, i.e., run iperf3 on the UE side.
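A minimal sketch for generating such traffic, assuming the UE container is named oai-nr-ue and an iperf3 server is reachable at 192.168.70.135 on the core-network side (both are assumptions and may differ in your setup):

# Start an iperf3 server on the core-network side
iperf3 -s

# Generate uplink UDP traffic from the UE towards the server
docker compose exec oai-nr-ue iperf3 -u -b 10M -t 60 -c 192.168.70.135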