Description
[REQUIRED] Step 2: Extension name
This feature request is for extension: storage-resize-images
What feature would you like to see?
Support for more scalable backfill processing. We're using `DO_BACKFILL=true` to resize ~600k images (JPG, PNG, etc.) into multiple AVIF sizes (~200 GB total), with the function running continuously for over a week. Initially we processed ~9 images/minute; it's now down to ~2/minute and continues to degrade.

The bottleneck appears to be the hardcoded `maxResults: 3` in `bucket.getFiles()`, which makes throughput unmanageable at scale. We'd like:

- `maxResults` to be configurable
- Options to parallelize the backfill (e.g. via Pub/Sub or task splitting)
- Optional progress tracking (so we don't have to guesstimate from the log messages)
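To illustrate the first two points, a configurable page size plus bounded per-page concurrency could look roughly like this. This is a minimal sketch, not the extension's actual code: `listPage` is a mock standing in for `bucket.getFiles({ maxResults, pageToken, autoPaginate: false })`, and `pageSize`/`concurrency` are hypothetical parameters.

```javascript
// Mock of one page of bucket.getFiles(): returns { files, nextPageToken }.
// In the real extension this would be a call against the Storage bucket.
function makeMockLister(totalFiles) {
  const names = Array.from({ length: totalFiles }, (_, i) => `img_${i}.jpg`);
  return async function listPage(pageToken, pageSize) {
    const start = pageToken ?? 0;
    const files = names.slice(start, start + pageSize);
    const next = start + pageSize < names.length ? start + pageSize : null;
    return { files, nextPageToken: next };
  };
}

// Walk all pages, resizing up to `concurrency` files at once within a page.
async function backfill(listPage, processFile, { pageSize, concurrency }) {
  let pageToken = null;
  let done = 0;
  do {
    const { files, nextPageToken } = await listPage(pageToken, pageSize);
    // Bounded parallelism: process the page in chunks of `concurrency`.
    for (let i = 0; i < files.length; i += concurrency) {
      await Promise.all(files.slice(i, i + concurrency).map(processFile));
    }
    done += files.length;
    // Persisting this token between runs would double as progress tracking.
    pageToken = nextPageToken;
  } while (pageToken !== null);
  return done;
}
```

Persisting `pageToken` (e.g. in Firestore) between function invocations would also give the resumability and progress visibility asked for above, instead of re-listing from the start.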
How would you use it?
One-time backfill of a large historical image set to generate multiple AVIF sizes; at the current rate it would take ~200 days to complete. We're running the extension with:
```
CONTENT_FILTER_LEVEL=OFF
DELETE_ORIGINAL_FILE=false
DO_BACKFILL=true
firebaseextensions.v1beta.function/location=europe-west1
FUNCTION_MEMORY=2048
IMAGE_TYPE=avif
IMG_SIZES=640x480,1200x800,1920x1280
IS_ANIMATED=true
MAKE_PUBLIC=true
OUTPUT_OPTIONS={"avif":{"quality":100}}
REGENERATE_TOKEN=false
RESIZED_IMAGES_PATH=thumbnails
SHARP_OPTIONS={"fit": "cover"}
```
We'd appreciate any of these improvements, and/or general scaling tips and best practices for improving throughput on large datasets.
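For reference, the ~200-day figure follows directly from the observed rate (a back-of-envelope check, assuming the degraded ~2 images/minute rate holds for the rest of the run):

```javascript
// Back-of-envelope ETA at the currently observed backfill rate.
const totalImages = 600_000;   // remaining originals to resize (approx.)
const imagesPerMinute = 2;     // observed degraded throughput
const minutes = totalImages / imagesPerMinute; // 300,000 minutes
const days = minutes / (60 * 24);              // ~208 days
console.log(Math.round(days)); // prints 208
```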