TRex, a fast multi-animal tracking system with markerless identification, and 2D estimation of posture and visual fields.

mooch443/trex

Folders and files

NameName
Last commit message
Last commit date

Latest commit

 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 

Repository files navigation

CondaBuildLinuxCondaBuildMacOSCondaBuildWindows

Now with native Apple Silicon (M1) and ML Compute support (see the arm64 installation instructions below).

Documentation: https://trex.run/docs

Hey there

Welcome to the git repository of TRex (https://trex.run) -- software designed to track and identify individuals and other moving entities using computer vision and machine learning. The workload is split into two (not entirely separate) tools:

  • TGrabs: Record or convert existing videos, perform live-tracking and closed-loop experiments
  • TRex: Track converted videos (in PV format), use the automatic visual recognition, explore the data with visual helpers, export task-specific data, and adapt tracking parameters to specific use-cases

TRex can track 256 individuals in real-time, or up to 128 with all the fancy features (such as posture estimation) enabled. For up to 100 individuals, it also allows you (when real-time speed is not required) to visually recognize individuals and automatically correct potential tracking errors.

TGrabs is used to process already saved videos or to record directly from webcams and/or Basler machine-vision cameras, with integrated and customizable closed-loop support. Camera support can be extended to other APIs with a bit of C++ knowledge, of course.

Installation

TRex supports all major platforms. You can create a new virtual environment (named tracking here) using Anaconda or miniconda/miniforge by running:

# macOS (Intel/arm64 M1), Windows
conda create -n tracking -c trexing trex

# Linux
conda create -n tracking -c defaults -c conda-forge -c trexing trex

macOS with an arm64 / M1 processor

If you own a new Mac with an Apple Silicon CPU, the Intel version (above) works fine in Rosetta. However, I would strongly encourage installing TRex via miniforge, a flavor of miniconda that natively supports arm64 packages. Simply follow the instructions here for installing miniforge: https://github.com/conda-forge/miniforge#download.

Once you're done, you can run this command to create the virtual environment:

# macOS (arm64/M1)
conda create -n tracking -c trexing trex

Installing TensorFlow on the M1 is a bit more complicated, which is why TRex will not allow you to use machine learning unless you install the following extra packages manually (instructions are also printed out after you create the environment). Apple provides its own TensorFlow version for macOS, including a native Metal plugin (https://developer.apple.com/metal/tensorflow-plugin/). To install TensorFlow inside your environment, just run:

# activate the TRex environment
conda activate tracking
# install tensorflow dependencies and metal plugin
conda install -c apple -y tensorflow-deps==2.7.0
python -m pip install tensorflow-macos==2.7.0 tensorflow-metal==0.3.0
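To confirm the install worked, a minimal check (not part of TRex; it only assumes the standard TensorFlow 2 API) is to import TensorFlow and list the GPU devices it can see -- with the Metal plugin active, the M1 GPU should appear:

```python
def check_tensorflow():
    """Report the installed TensorFlow version and visible GPU count,
    or a short message if TensorFlow is not installed."""
    try:
        import tensorflow as tf
    except ImportError:
        return "tensorflow not installed"
    gpus = tf.config.list_physical_devices("GPU")
    return f"tensorflow {tf.__version__}, GPUs: {len(gpus)}"

print(check_tensorflow())
```

If the GPU count is 0 inside the tracking environment, the Metal plugin most likely did not install correctly.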

Manual compilation

Pre-built binaries are compiled with fewer optimizations and features than manually compiled builds (due to compatibility and licensing issues) and thus are slightly slower =(. For example, the conda versions do not offer support for Basler cameras. If you need to use TGrabs with machine-vision cameras, or need as much speed as possible (or the newest version), please consider compiling the software yourself.

If you want compatibility with the Basler API (or other things with licensing/portability issues), please use one of the manual compilation options (see https://trex.run/docs/install.html).

Usage

Within the conda environment, simply run:

trex

Opening a video directly and adjusting parameters:

trex -i /path/to/video.pv -track_threshold 25 -track_max_individuals 10
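When scripting many runs, it can help to assemble these flags programmatically. The following is a hypothetical helper (not part of TRex) that mirrors the "-name value" flag style shown above; the parameter names are taken from the example command:

```python
def build_trex_cmd(video, params):
    """Build a trex argument list from an input path and a dict of
    parameter-name -> value pairs, using the '-name value' flag style."""
    cmd = ["trex", "-i", video]
    for name, value in params.items():
        cmd += [f"-{name}", str(value)]
    return cmd

cmd = build_trex_cmd("/path/to/video.pv",
                     {"track_threshold": 25, "track_max_individuals": 10})
print(" ".join(cmd))
```

The resulting list can be passed straight to subprocess.run for batch experiments.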

If you don't want a graphical user interface and want TRex to save and quit when tracking finishes:

trex -i /path/to/video.pv -nowindow -auto_quit

To convert a video to our custom pv format (for usage in TRex) from the command-line:

tgrabs -i /full/path/to/video.mp4 -o funny_name
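To convert a whole folder of videos, one option is to generate one such tgrabs command per file. This is a sketch (an assumed workflow, not a TRex feature), deriving each output name from the video's file stem as in the example above:

```python
from pathlib import Path

def batch_convert_cmds(folder):
    """Return one tgrabs command (as an argument list) per .mp4 file in
    `folder`, naming each output after the video's file stem."""
    cmds = []
    for video in sorted(Path(folder).glob("*.mp4")):
        cmds.append(["tgrabs", "-i", str(video), "-o", video.stem])
    return cmds

for cmd in batch_convert_cmds("/path/to/videos"):
    print(" ".join(cmd))
```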

Read more about parameters for TRex here and for TGrabs here.

Contributors, Issues, etc.

This project has been developed, and is still being updated, by Tristan Walter. If you want to contribute, please submit a pull request on GitHub and I will be happy to credit you here for any substantial contributions!

If you have any issues running the software, please consult the documentation first (especially the FAQ section), and if this does not solve your problem, please file an issue using the issue tracker here on GitHub. If you experience problems with TensorFlow, such as installing CUDA or cuDNN dependencies, please direct those issues to the respective development teams.

License

Released under the GPLv3 License (see LICENSE).

Reference

If you use this software in your work, please cite our open-access paper:

@article{walter2020trex,
  article_type = {journal},
  title = {TRex, a fast multi-animal tracking system with markerless identification, and 2D estimation of posture and visual fields},
  author = {Walter, Tristan and Couzin, Iain D},
  editor = {Lentink, David},
  volume = 10,
  year = 2021,
  month = {feb},
  pub_date = {2021-02-26},
  pages = {e64000},
  citation = {eLife 2021;10:e64000},
  doi = {10.7554/eLife.64000},
  url = {https://doi.org/10.7554/eLife.64000},
  journal = {eLife},
  issn = {2050-084X},
  publisher = {eLife Sciences Publications, Ltd},
}
