
Have you ever wondered about the status of a package you have had delivered? Has an impact caused significant damage to its contents?

    Sensors can assist in monitoring these conditions, but it is essential to have low power consumption to increase battery life. This is where edge AI comes into play. At the sensor level, it can help save power while tracking the state of your package and detecting any possible events.

In this use case, we will show you how to implement a smart asset tracking solution using ST MEMS sensors.

    Approach

    We combined two advanced features available in the ST MEMS sensors: the machine learning core (MLC) and the finite state machine (FSM):

    • We used MEMS-Studio to generate and configure a decision tree model with three nodes to detect the different classes.
    • The MLC processes the accelerometer data to detect the state of the package (e.g. stationary, in motion, shaken).
    • The FSM processes raw and filtered accelerometer data to detect impacts and falls.
• In this example, the threshold for impact detection is set to 0.5 g, and the angle for the upright position is set to 26°.
• The interrupts generated on the INT1/INT2 pins of the sensor wake up the microcontroller only when the desired events have been detected.

    Sensor

3-axis ultralow-power smart accelerometer with AI, antialiasing filter, and advanced digital features (reference: LIS2DUX12).

    Data

The accelerometer data have been acquired at a ±16 g full scale and a 25 Hz output data rate in low-power mode.
    Sensor orientation is set as for the ENU convention (with Z-axis pointing up).

    Results

Power consumption (sensor + algorithm): 14.7 µA

    The output of the MLC can be read from the MLC1_SRC (34h) register:

    • 00h = Stationary - Upright
    • 04h = Stationary - Not upright
    • 08h = In motion
    • 0Ch = Shaken
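The register values above can be decoded with a small helper. A real driver would read MLC1_SRC (34h) over I2C or SPI after the INT1 interrupt fires; the function below only maps the value to the corresponding package state.

```c
/* Decode the MLC1_SRC (34h) register value into the package states
 * listed above. */
static const char *mlc_state_name(unsigned char mlc1_src)
{
    switch (mlc1_src) {
    case 0x00: return "Stationary - Upright";
    case 0x04: return "Stationary - Not upright";
    case 0x08: return "In motion";
    case 0x0C: return "Shaken";
    default:   return "Unknown";
    }
}
```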


    The FSM detects the following states:

    • Impact, detected by FSM #1
    • Free-fall, detected by FSM #2


    The configuration generates an interrupt (pulsed and active high) on the INT1 pin every time the register MLC1_SRC (34h) is updated with a new value (when the state detected by the MLC changes). The duration of the interrupt pulse is 40 ms in this configuration.

The configuration generates an interrupt (pulsed and active high) on the INT2 pin when the FSM detects either a free-fall or an impact event. The free-fall interrupt remains active as long as the package is airborne. The FSM_STATUS (13h) register indicates which FSM generated the interrupt, so impact and free-fall events can be distinguished.
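After an INT2 interrupt, the host can inspect FSM_STATUS (13h) to tell the two events apart. The bit positions below are assumptions for illustration (FSM #1 on bit 0, FSM #2 on bit 1); check the FSM_STATUS description in the LIS2DUX12 datasheet for the actual mapping.

```c
#include <stdbool.h>

#define FSM1_IMPACT_BIT   0x01  /* assumed: FSM #1 -> impact */
#define FSM2_FREEFALL_BIT 0x02  /* assumed: FSM #2 -> free-fall */

/* fsm_status: value read from FSM_STATUS (13h) after INT2 fires */
static bool impact_event(unsigned char fsm_status)
{
    return (fsm_status & FSM1_IMPACT_BIT) != 0;
}

static bool freefall_event(unsigned char fsm_status)
{
    return (fsm_status & FSM2_FREEFALL_BIT) != 0;
}
```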

    Resources

    Model created with MEMS-Studio

    A complete software solution for desktops to enable AI features on smart sensors. It allows users to analyze data, evaluate embedded libraries, and design no-code algorithms for the entire portfolio of MEMS sensors.


    Compatible with LIS2DUX12

    Smart sensors capable of directly processing the data they capture and delivering meaningful insights to the host device. By processing data locally, smart sensors reduce transmitted data and cloud processing requirements, thus lowering power consumption at the system level.

