haltakov/natural-language-image-search


Search photos on Unsplash using natural language descriptions. The search is powered by OpenAI's CLIP model and the Unsplash Dataset.

"Two dogs playing in the snow"

Search results for "Two dogs playing in the snow". Photos by Richard Burlton, Karl Anderson and Xuecheng Chen on Unsplash.

"The word love written on the wall"

Search results for "The word love written on the wall". Photos by Genton Damian, Anna Rozwadowska, Jude Beck on Unsplash.

"The feeling when your program finally works"

Search results for "The feeling when your program finally works". Photos by bruce mars, LOGAN WEAVER, Vasyl Skunziak on Unsplash.

"The Sydney Opera House and the Harbour Bridge at night"

Search results for "The Sydney Opera House and the Harbour Bridge at night". Photos by Dalal Nizam and Anna Tremewan on Unsplash.

How It Works

OpenAI's CLIP neural network transforms both images and text into the same latent space, where they can be compared using a similarity measure.

For this project, all photos from the full Unsplash Dataset (almost 2M photos) were downloaded and processed with CLIP.

The pre-computed feature vectors for all images can then be used to find the best match to a natural language search query.
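The matching step boils down to a cosine-similarity search, which can be sketched with plain NumPy. The arrays below are random stand-ins for the real CLIP embeddings (512 dimensions here is an assumption matching CLIP's ViT-B/32 model; the actual project loads the precomputed photo features from disk):

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-ins for the real data: in the project these come from CLIP's
# image and text encoders, not from a random generator.
photo_features = rng.standard_normal((1000, 512)).astype(np.float32)  # precomputed photo vectors
text_features = rng.standard_normal(512).astype(np.float32)           # encoded search query

# L2-normalize both sides so a dot product equals cosine similarity.
photo_features /= np.linalg.norm(photo_features, axis=1, keepdims=True)
text_features /= np.linalg.norm(text_features)

# Similarity of the query to every photo, then the indices of the top 3 matches.
similarities = photo_features @ text_features
best = np.argsort(-similarities)[:3]
print(best, similarities[best])
```

Because the photo features are computed once ahead of time, each query only costs one text encoding plus a single matrix-vector product over the whole dataset.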

How To Run The Code

On Google Colab

If you just want to play around with different queries, jump to the Colab notebook.


On your machine

Before running any of the code, make sure to install all dependencies:

pip install -r requirements.txt

If you want to run all the code yourself, open the Jupyter notebooks in the order they are numbered and follow the instructions there:

  • 01-setup-clip.ipynb - set up the environment by checking out and preparing the CLIP code
  • 02-download-unsplash-dataset.ipynb - download the photos from the Unsplash dataset
  • 03-process-unsplash-dataset.ipynb - process all photos from the dataset with CLIP
  • 04-search-image-dataset.ipynb - search for a photo in the dataset using natural language queries
  • 09-search-image-api.ipynb - search for a photo using the Unsplash Search API and filter the results with CLIP

NOTE: only the Lite version of the Unsplash Dataset is publicly available. If you want to use the Full version, you will need to apply for (free) access.

NOTE: searching for images using the Unsplash Search API doesn't require access to the Unsplash Dataset, but will probably deliver worse results.

Acknowledgements

This project was inspired by these projects:
