Commit fe26e4b

Add the release information of MMScan-devkit in the README (#96)
* Update README.md: announce the release of MMScan-devkit

1 parent 89aca6f · commit fe26e4b

1 file changed (+1 −0 lines changed)

README.md

Lines changed: 1 addition & 0 deletions
@@ -44,6 +44,7 @@ It encompasses over <b>5k scans encapsulating 1M ego-centric RGB-D views, 1M lan
 Building upon this database, we introduce a baseline framework named <b>Embodied Perceptron</b>. It is capable of processing an arbitrary number of multi-modal inputs and demonstrates remarkable 3D perception capabilities, both within the two series of benchmarks we set up, i.e., fundamental 3D perception tasks and language-grounded tasks, and <b>in the wild</b>.

 ## 🔥 News
+- \[2025-01\] We are delighted to present the official release of [MMScan-devkit](https://github.com/OpenRobotLab/EmbodiedScan/tree/mmscan), which encompasses a suite of data processing utilities, benchmark evaluation tools, and adaptations of some models for the MMScan benchmarks. We invite you to explore these resources and welcome any feedback or questions you may have!
 - \[2024-09\] We are pleased to announce the release of EmbodiedScan v2 beta, with original annotations on newly added ~5k scans from ARKitScenes and the beta version of MMScan's annotations on the original 5k scans. Fill in the [form](https://docs.google.com/forms/d/e/1FAIpQLScUXEDTksGiqHZp31j7Zp7zlCNV7p_08uViwP_Nbzfn3g6hhw/viewform) to apply for downloading. Welcome for any feedback!
 - \[2024-08\] We preliminarily release the [sample data](https://drive.google.com/file/d/1Y1_LOE35NpsnkneYElvNwuuR6-OAbwPm/view?usp=sharing) of [MMScan](https://tai-wang.github.io/mmscan/) and the full release will be ready with ARKitScenes' annotations this month, which will be announced via emails to the community. Please stay tuned!
 - \[2024-06\] The report of our follow-up work with the most-ever hierarchical grounded language annotations, [MMScan](https://tai-wang.github.io/mmscan/), has been released. Welcome to talk with us about EmbodiedScan and MMScan at Seattle, CVPR 2024!

Comments (0)
