PyTorch (1.6+) implementation of https://github.com/kang205/SASRec
Update on 05/23/2025: thanks to Wentworth1028 and Tiny-Snow, we have a LayerNorm update for higher NDCG & HR, and here's the doc 👍.
Update on 04/13/2025: in https://arxiv.org/html/2504.09596v1, I listed ideas worth trying but not yet tried due to my limited bandwidth in sparse time.
Please feel free to run these experiments and have fun, and please consider citing the article if it helps in your recsys exploration:
```
@article{huang2025revisiting_sasrec,
  title={Revisiting Self-Attentive Sequential Recommendation},
  author={Huang, Zan},
  journal={CoRR},
  volume={abs/2504.09596},
  url={https://arxiv.org/abs/2504.09596},
  eprinttype={arXiv},
  eprint={2504.09596},
  year={2025}
}
```

or this bib for short:

```
@article{huang2025revisiting,
  title={Revisiting Self-Attentive Sequential Recommendation},
  author={Huang, Zan},
  journal={arXiv preprint arXiv:2504.09596},
  year={2025}
}
```

Paper source code is in the latex folder.
For questions or collaborations, please create a new issue in this repo or drop me an email at the address shared.
Modified based on the paper author's TensorFlow implementation, switching to PyTorch (v1.6) for simplicity, and fixed issues like positional embedding usage etc. (making it harder to overfit; except for that, in recsys, personalization = overfitting sometimes).

Code is in the python folder.
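For context on the positional embedding usage mentioned above: SASRec adds a learnable positional embedding to the item embedding at each sequence position. A minimal sketch of that pattern in PyTorch follows; the dimensions (`maxlen=200`, `hidden=50`, 10000 items) are illustrative, not the repo's exact configuration.

```python
import torch

# Illustrative sizes, not the repo's exact hyperparameters.
num_items, maxlen, hidden = 10000, 200, 50

# Item id 0 is reserved for padding; positions get their own table.
item_emb = torch.nn.Embedding(num_items + 1, hidden, padding_idx=0)
pos_emb = torch.nn.Embedding(maxlen, hidden)

seq = torch.randint(1, num_items, (4, maxlen))   # (batch, maxlen) item ids
positions = torch.arange(maxlen).unsqueeze(0)    # (1, maxlen), broadcast over batch

# Scale item embeddings (as in Transformer-style models), then add
# the positional embedding elementwise per position.
x = item_emb(seq) * (hidden ** 0.5) + pos_emb(positions)
print(x.shape)  # torch.Size([4, 200, 50])
```

The resulting tensor `x` is what feeds the self-attention blocks; the embedding tables are trained jointly with the rest of the model.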
To train:

```
python main.py --dataset=ml-1m --train_dir=default --maxlen=200 --dropout_rate=0.2 --device=cuda
```

For inference only:

```
python main.py --device=cuda --dataset=ml-1m --train_dir=default --state_dict_path=[YOUR_CKPT_PATH] --inference_only=true --maxlen=200
```

The output of each run will vary slightly, as negative samples are randomly sampled; here's my output for two consecutive runs:

1st run - test (NDCG@10: 0.5897, HR@10: 0.8190)
2nd run - test (NDCG@10: 0.5918, HR@10: 0.8225)

Please check the paper author's repo for a detailed intro and a more complete README, and here's the paper bib FYI :)
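The run-to-run wobble in those numbers comes from the sampled-negative evaluation protocol: each held-out test item is ranked against randomly drawn negatives, and HR@10 / NDCG@10 depend on the true item's rank among them. A minimal sketch of that protocol, for one user (the function name and signature are illustrative, not the repo's API, and real code would also exclude the user's history when sampling):

```python
import math
import random

def eval_one_user(score_fn, true_item, all_items, num_neg=100, k=10):
    """Rank the held-out item against `num_neg` random negatives."""
    candidates = [true_item]
    while len(candidates) < num_neg + 1:
        neg = random.choice(all_items)
        if neg != true_item and neg not in candidates:
            candidates.append(neg)
    # Higher score = better; stable sort, so rank is the true item's
    # position among the num_neg + 1 candidates.
    ranked = sorted(candidates, key=score_fn, reverse=True)
    rank = ranked.index(true_item)
    hr = 1.0 if rank < k else 0.0
    ndcg = 1.0 / math.log2(rank + 2) if rank < k else 0.0
    return hr, ndcg

# Toy scorer that always ranks item 1 first, so the metrics are perfect
# regardless of which negatives were sampled.
print(eval_one_user(lambda item: -item, 1, list(range(1, 1001))))  # -> (1.0, 1.0)
```

Because each call draws fresh negatives, averaging this over all test users gives slightly different NDCG/HR totals on every run, which is exactly the behavior noted above.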
```
@inproceedings{kang2018self,
  title={Self-attentive sequential recommendation},
  author={Kang, Wang-Cheng and McAuley, Julian},
  booktitle={2018 IEEE International Conference on Data Mining (ICDM)},
  pages={197--206},
  year={2018},
  organization={IEEE}
}
```

I see a dozen citations of the repo 🫰; please use the bib below if needed.

```
@misc{Huang_SASRec_pytorch,
  author = {Huang, Zan},
  title = {{SASRec.pytorch}},
  url = {https://github.com/pmixer/SASRec.pytorch},
  howpublished = {\url{https://github.com/pmixer/SASRec.pytorch}},
  year = {2020}
}
```