Caching best practices #14404

Unanswered
endingofourlife asked this question in Questions

First Check

  • I added a very descriptive title here.
  • I used the GitHub search to find a similar question and didn't find it.
  • I searched the FastAPI documentation, with the integrated search.
  • I already searched in Google "How to X in FastAPI" and didn't find any information.
  • I already read and followed all the tutorial in the docs and didn't find an answer.
  • I already checked if it is not related to FastAPI but to Pydantic.
  • I already checked if it is not related to FastAPI but to Swagger UI.
  • I already checked if it is not related to FastAPI but to ReDoc.

Commit to Help

  • I commit to help with one of those options 👆

Example Code

```python
class CacheManager(CacheBase):
    async def get_or_cache(
        self,
        key: str,
        fetch_func: callable,
        to_cache_mapper: callable,
        from_cache_mapper: callable,
        ttl: int = 3600,
    ):
        """
        Generic: check cache by key,
        if found -> from_cache_mapper -> pydantic model
        if not found -> fetch_func -> to_cache_mapper -> cache -> return pydantic model
        :param key: cache key
        :param ttl: time to live in seconds
        :param fetch_func: async () -> model
        :param to_cache_mapper: model -> dict
        :param from_cache_mapper: dict -> model
        """
        cached = await self.get_cached_data(key)
        if cached:
            logger.debug(f"{key} from cache")
            return from_cache_mapper(cached)
        logger.debug(f"Fetching {key} from DB")
        model = await fetch_func()
        if not model:
            return None
        data_to_cache = to_cache_mapper(model)  # -> dict
        await self.cache_data(key, data_to_cache, ttl)
        logger.info(f"{key} cached")
        return from_cache_mapper(data_to_cache)  # -> pydantic


# and usage like
languages_list = await self._cache_manager.get_or_cache(
    key=self._cache_all_key,
    fetch_func=fetch_all,
    to_cache_mapper=PLanguageMapper.language_models_to_cache,
    from_cache_mapper=PLanguageMapper.cache_many_to_response,
    ttl=3600,
)
```

Description

What is the best way to implement a clean caching layer in a mid-size FastAPI app (up to 20 users per second)?

I started with custom decorators and a CacheManager class, but it's becoming hard to maintain and use... What should I do in my case? Many routes follow the same structure: if the value is cached, return it; otherwise run the DB query, cache the result, and return it. Should I use aiocache[redis] instead of my custom classes?

Thank you!

Operating System

Windows

Operating System Details

No response

FastAPI Version

0.122.0

Pydantic Version

2.12.4

Python Version

3.12

Additional Context

No response


Replies: 1 comment


Hi @endingofourlife,
You're right, a custom CacheManager with manual mappers becomes cumbersome quickly. Switching to a standard library is definitely the best practice here.

I recommend using fastapi-cache2 with Redis. It handles the serialization and deserialization of Pydantic models, eliminating the need for manual mappers and boilerplate get_or_cache calls.

Setup (in your main file)

```python
from contextlib import asynccontextmanager

from fastapi import FastAPI
from fastapi_cache import FastAPICache
from fastapi_cache.backends.redis import RedisBackend
from redis import asyncio as aioredis


@asynccontextmanager
async def lifespan(app: FastAPI):
    # Initialize connection to Redis
    redis = aioredis.from_url("redis://localhost")
    FastAPICache.init(RedisBackend(redis), prefix="fastapi-cache")
    yield


app = FastAPI(lifespan=lifespan)
```
Labels
question (Question or problem)
2 participants
@endingofourlife, @JunjieAraoXiong
