Open-Set Domain Adaptation via Joint Error Based Multi-class Positive and Unlabeled Learning

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 15131)

Included in the conference series: European Conference on Computer Vision (ECCV 2024)

Abstract

Open-set domain adaptation aims to improve the generalization performance of a learning algorithm on the more realistic problem of open-set domain shift, where the target data contain an additional unknown class that is not present in the source data. Most existing algorithms consist of two phases, which can be described as closed-set domain adaptation given a heuristic separation of the unknown class. Consequently, the generalization error cannot be strictly bounded, owing to the gap between the true distribution and the samples inferred from heuristics. In this paper, we propose an end-to-end algorithm that tightly bounds the risk of the entire target task using positive-unlabeled (PU) learning theory and the joint error from domain adaptation. Extensive experiments on various data sets demonstrate the effectiveness and efficiency of our proposed algorithm over open-set domain adaptation baselines.
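For readers unfamiliar with the PU learning theory invoked above, the sketch below shows the standard non-negative PU risk estimator (Kiryo et al., 2017) for a binary known-vs-unknown scorer. It is a generic illustration only, not the authors' multi-class joint-error formulation: the function name `nn_pu_risk`, the logistic surrogate losses, and the assumed class prior `prior` are all hypothetical choices made for this example.

```python
import torch
import torch.nn.functional as F

def nn_pu_risk(scores_pos, scores_unl, prior):
    """Non-negative PU risk for a binary known(+1) vs. unknown(-1) scorer.

    scores_pos: logits for labeled source ("positive") samples
    scores_unl: logits for unlabeled target samples
    prior:      assumed fraction of known-class samples in the target data
                (an assumption of this sketch; in practice it must be
                estimated or given)
    """
    # Logistic surrogate losses for predicting known (+1) or unknown (-1).
    loss_known = lambda s: F.softplus(-s)   # penalty for calling a sample unknown
    loss_unknown = lambda s: F.softplus(s)  # penalty for calling a sample known

    risk_p_pos = loss_known(scores_pos).mean()     # R_p^+(f)
    risk_p_neg = loss_unknown(scores_pos).mean()   # R_p^-(f)
    risk_u_neg = loss_unknown(scores_unl).mean()   # R_u^-(f)

    # R(f) = pi * R_p^+(f) + max(0, R_u^-(f) - pi * R_p^-(f))
    return prior * risk_p_pos + torch.clamp(risk_u_neg - prior * risk_p_neg, min=0.0)
```

In the setting of this paper, such an estimator would be extended to multiple known classes and combined with a joint-error term from domain adaptation theory; the binary version above only conveys how unlabeled target data can enter a risk bound without heuristic unknown-class separation.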


Notes

  1. See proofs in the supplemental material.


Acknowledgements

This research is partially supported by JST Moonshot R&D Grant Number JPMJPS2011, CREST Grant Number JPMJCR2015, and the Basic Research Grant (Super AI) of the Institute for AI and Beyond at the University of Tokyo.

Author information

Authors and Affiliations

  1. The University of Tokyo, Tokyo, Japan

    Dexuan Zhang, Thomas Westfechtel & Tatsuya Harada

  2. RIKEN, Wako, Japan

    Tatsuya Harada

Authors
  1. Dexuan Zhang
  2. Thomas Westfechtel
  3. Tatsuya Harada

Corresponding author

Correspondence to Dexuan Zhang.

Editor information

Editors and Affiliations

  1. University of Birmingham, Birmingham, UK

    Aleš Leonardis

  2. University of Trento, Trento, Italy

    Elisa Ricci

  3. Technical University of Darmstadt, Darmstadt, Germany

    Stefan Roth

  4. Princeton University, Princeton, NJ, USA

    Olga Russakovsky

  5. Czech Technical University in Prague, Prague, Czech Republic

    Torsten Sattler

  6. École des Ponts ParisTech, Marne-la-Vallée, France

    Gül Varol

Electronic supplementary material

Below is the link to the electronic supplementary material.

Rights and permissions

Copyright information

© 2025 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Zhang, D., Westfechtel, T., Harada, T. (2025). Open-Set Domain Adaptation via Joint Error Based Multi-class Positive and Unlabeled Learning. In: Leonardis, A., Ricci, E., Roth, S., Russakovsky, O., Sattler, T., Varol, G. (eds) Computer Vision – ECCV 2024. ECCV 2024. Lecture Notes in Computer Science, vol 15131. Springer, Cham. https://doi.org/10.1007/978-3-031-73464-9_7
