IQFormer: A Novel Transformer-Based Model with Multi-modality Fusion for Automatic Modulation Recognition


Official Code for "IQFormer: A Novel Transformer-Based Model With Multi-Modality Fusion for Automatic Modulation Recognition". [paper]

Citation

If our work is helpful to your research, please star us on GitHub and cite:

```bibtex
@ARTICLE{10729886,
  author={Shao, Mingyuan and Li, Dingzhao and Hong, Shaohua and Qi, Jie and Sun, Haixin},
  journal={IEEE Transactions on Cognitive Communications and Networking},
  title={IQFormer: A Novel Transformer-Based Model With Multi-Modality Fusion for Automatic Modulation Recognition},
  year={2024},
  volume={},
  number={},
  pages={1-1},
  keywords={Feature extraction;Modulation;Transformers;Convolution;Time-frequency analysis;Wireless communication;Time-domain analysis;Signal to noise ratio;Market research;Interference;Automatic modulation recognition;deep learning;multi-modality fusion;transformer},
  doi={10.1109/TCCN.2024.3485118}
}
```

Preparation

We conducted experiments on three datasets, namely RML2016.10a, RML2016.10b and HisarMod2019.1.

The datasets can be downloaded from DeepSig (RML2016 series) and HisarMod2019.1. Special thanks to Richardzhangxx for providing the .MAT file for HisarMod2019.1. For your convenience, I have combined the I/Q signals and saved them in h5py files. If you want to know the most widely used dataset division ratio, please read my paper.

```python
import os
import h5py

with h5py.File(os.path.join(args.database_path, 'HisarMod2019train.h5')) as h5file:
    train = h5file['samples'][:]
    train_label = h5file['labels'][:]
    SNR_tr = h5file['snr'][:]

with h5py.File(os.path.join(args.database_path, 'HisarMod2019test.h5')) as h5file:
    test = h5file['samples'][:]
    test_label = h5file['labels'][:]
    SNR_te = h5file['snr'][:]
```
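The RML2016 series is distributed by DeepSig as a Python pickle rather than an h5py file. A minimal loading sketch is below; the file name and the `(modulation, snr)` key layout follow the public DeepSig release, not this repository's scripts, so treat it as an assumption and adapt it to your local copy.

```python
import pickle
import numpy as np

def load_rml2016(path):
    """Load a RML2016-style pickle into flat arrays.

    Assumed layout (per the public DeepSig release): a dict keyed by
    (modulation_name, snr) tuples, each value an array of I/Q frames
    with shape (num_frames, 2, 128).
    """
    with open(path, 'rb') as f:
        # The original pickle was written with Python 2, so latin1
        # decoding is needed under Python 3.
        data = pickle.load(f, encoding='latin1')
    mods = sorted({mod for mod, _ in data})  # stable label ordering
    X, labels, snrs = [], [], []
    for (mod, snr), frames in data.items():
        X.append(frames)
        labels.extend([mods.index(mod)] * len(frames))
        snrs.extend([snr] * len(frames))
    return np.concatenate(X), np.array(labels), np.array(snrs), mods
```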

Please extract the downloaded archive directly into the ./dataset directory, or change args.database_path. args.database_choose should be one of [2016.10a, 2016.10b, 2019].

Then you just need to run:

```shell
python main.py
```

If you want our pre-trained models on all three datasets, please contact shaomy666@stu.xmu.edu.cn.

Environment

These models are implemented in PyTorch, and the environment setting is:

  • Python 3.11
  • PyTorch 1.12.0
  • pandas
  • seaborn
  • h5py
  • scikit-learn
  • matplotlib
  • tensorboardX
  • tqdm
  • timm

License

This code is distributed under an MIT LICENSE. Note that our code depends on other libraries and datasets which each have their own respective licenses that must also be followed.

