Firebase storage artifacts


Firebase Problem Overview


I'm trying to understand what eu.artifacts.%PROJECT NAME%.appspot.com is. It's currently taking up 800 MB of storage out of my daily 5 GB limit. It contains only files of type application/octet-stream. This bucket was created automatically, and the file path is eu.artifacts....appspot.com/containers/images. The two heaviest files there weigh 200 MB and 130 MB. I tried deleting the bucket, but it was automatically created again. Users can upload pictures on my website, but the bucket holding all the user images only takes up about 10 MB.

So my question is: what is this bucket for, and why does it weigh so much?

Firebase Solutions


Solution 1 - Firebase

firebaser here

If you are using Cloud Functions, the files you're seeing are related to a recent change in how the runtime (for Node 10 and up) is built.

Cloud Functions now uses Cloud Build to create the runtime (for Node 10 and up) for your Cloud Functions. And Cloud Build in turn uses Container Registry to store those runtimes, which stores them in a new Cloud Storage bucket under your project.

For more on this, also see the entry in the Firebase pricing FAQ: "Why will I need a billing account to use Node.js 10 or later for Cloud Functions for Firebase?"

Also see this thread on the firebase-talk mailing list about these artifacts.


Update: some other answers suggest deleting artifacts from the Storage buckets, and even setting up lifecycle management on them to do so automatically. This leads to dangling references to those artifacts in the Container Registry, which breaks future builds.

To safely get rid of the artifacts, delete the container from the Container Registry console (it's under the gcf folder) or with a script. That will then in turn also delete the artifacts from your Storage bucket.
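The script route can be as simple as two gcloud calls. Here is a minimal sketch, assuming the gcloud CLI is installed; the project ID, region path, image name, and digest are placeholders, and the DRY_RUN guard only prints the commands so you can review them before anything is deleted:

```shell
#!/bin/sh
# Placeholders: replace with your own project, region, and image names.
PROJECT_ID="my-project"
REPO="eu.gcr.io/${PROJECT_ID}/gcf/europe-west3"

# With DRY_RUN=1 (the default here) commands are printed, not executed.
DRY_RUN=${DRY_RUN:-1}
run() {
  if [ "$DRY_RUN" = "1" ]; then
    echo "would run: $*"
  else
    "$@"
  fi
}

# List the cached Cloud Functions images stored under the gcf folder.
run gcloud container images list --repository="$REPO"

# Deleting an image by digest also removes its layers from the
# eu.artifacts.<project>.appspot.com bucket.
run gcloud container images delete "$REPO/my-function@sha256:<digest>" \
  --quiet --force-delete-tags
```

Set DRY_RUN=0 once the printed commands look right for your project.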

Since version 9.14 of the CLI, the firebase deploy process automatically cleans up its container images after a deploy. So if you upgrade to the latest version, you should no longer get additional artifacts in your storage buckets.

Solution 2 - Firebase

I've consulted GCP support, and here are a few takeaways:

  • Cloud Functions caused the surge in storage usage.
  • Since these artifacts are not stored in the default bucket, you'll be charged for them even if your total bytes stored haven't reached the free-tier limit.
  • Remove the artifact bucket at https://console.cloud.google.com/storage/browser. According to the support staff: "Regarding the artifacts bucket, you can actually get rid of them, as they are storing previous versions of the function. However, I do not recommend deleting the "gcf-sources..." bucket(s), as it contains the current image, so deleting this bucket would mess up your function."

I tried removing it entirely, and so far it is not causing trouble. I'll update this answer if it breaks things later.


Edit 201118: see the comment below; you might need to keep the bucket itself while removing all of its content.

Solution 3 - Firebase

Adding to @yo1995's answer:
I consulted Firebase Support, and they confirmed that the artifacts bucket should not be deleted. Basically, the artifacts are used to help build the final image to be stored in the "gcf-sources" bucket.

To quote them directly
"you are free to delete the contents in "XX.artifacts", but please leave the bucket untouched, it will be used in the following deployment cycles."

There might be some unintended behaviour if you delete the artifacts bucket entirely.
Also "The team is working to clean up this bucket automatically, but there are some restrictions that they need to solve before publishing the solution."

For the time being, I have set the bucket to auto-delete files older than 1 day.
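That auto-delete setting can also be captured in a Cloud Storage lifecycle configuration; a minimal sketch (the age of 1 day matches the setting above, and the bucket name is a placeholder):

```json
{
  "lifecycle": {
    "rule": [
      {
        "action": { "type": "Delete" },
        "condition": { "age": 1 }
      }
    ]
  }
}
```

Save it as, say, lifecycle.json (a hypothetical filename) and apply it with gsutil lifecycle set lifecycle.json gs://<your-artifacts-bucket>.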

Solution 4 - Firebase

Adding to @yo1995's response: you can delete the artifacts in the bucket without going into GCP. Staying in Firebase, go to Storage, then "Add a Bucket". From there, you will see the option to import the GCP and artifact buckets. You can then delete the artifacts in those buckets accordingly.

Per some comments received, it's important not to delete the bucket. Rather, delete the artifacts in the bucket only!

(Screenshot: location of the artifact bucket)

Solution 5 - Firebase

EDIT early 2022: This whole answer is now moot. It may have worked in the past, but the actual root cause of the problem is now fixed in the Firebase CLI.

How to reduce storage

There is a great answer above explaining the issue, but working out how to actually fix it requires some further digging.

To help future developers cut right to the chase, here is the result you should see after adding the following rules to your project in GCP

(Screenshot: usage graph showing completely cleaned-up artifact storage)

The orange line is the us-artifacts.<your-project>.appspot.com bucket.

Steps to fix the issue
  1. Navigate to https://console.cloud.google.com/
  2. Open the GCP project that corresponds to the Firebase project
  3. In the menu, choose Storage -> Browser
  4. Click on the offending us-artifacts.<your-project>.appspot.com bucket
  5. Go to the 'Lifecycle' tab and add a rule with a life span of 3 days:
  • Add a rule
  • Action: Delete object
  • Condition: Age, 3 days

NB: Results will not appear on the usage graph until about 24 hours later.
Caveat

Firebase uses containers that back reference previous containers, so if you set a period of 3 days and your firebase deploy functions start failing, you will need to update the local name of your function to include versioning, and either specify a build flag to delete old versions, remove them from your firebase.json, or manually delete obsolete functions.

Using versioned API type functions

In your entrypoint, assuming index.ts, and assuming you have initialised Firebase with:

import * as functions from 'firebase-functions'
import * as admin from 'firebase-admin'

admin.initializeApp(functions.config().firebase)

// define the app as a cloud function called APIv1, build 20201202
export const APIv1b20201202 = functions.https.onRequest(main)

where main is the name of your app

and in your firebase.json

...
"hosting": {
    "public": "dist",
    "ignore": ["firebase.json", "**/.*", "**/node_modules/**", "**/tests/**"],
    "rewrites": [
      {
        "source": "/api/v1/**",
        "function": "APIv1b20201202"
      }
    ]
  },
...

Or, to Manually Update

# Deploy new function called APIv11
$ firebase deploy --only functions:APIv11

# Wait until deployment is done; now both APIv11 and APIv10 are running

# Delete APIv10
$ firebase functions:delete APIv10

Solution 6 - Firebase

Firebase said they have released a fix (as of June 2021):

https://github.com/firebase/firebase-tools/issues/3404#issuecomment-865270081

> Fix is in the next version of firebase-tools, which should be coming today.

To fix:

  1. Run npm i -g firebase-tools to upgrade to the latest version.

  2. Browse your containers in Cloud Storage at https://console.cloud.google.com/storage/browser/ (look for a bucket named gcf-sources-*****-us-central1).

  3. Functions deleted via firebase deploy --only functions seem to have their artifacts removed automatically, but if you delete them through the UI, the artifacts remain.

Solution 7 - Firebase

After some research and emailing with the Firebase team, this is what was suggested to me:

We are aware that Cloud Build is not automatically deleting old artifacts, so its size keeps increasing. As a workaround, I'd recommend deleting the files inside the bucket in order to reduce any possible charges.

You can delete the files in the mentioned buckets by going to the GCP console (use the same credentials as the Firebase console) -> select the correct project -> from the upper-left menu select Storage -> Browser. You will see all the buckets that belong to your project; click on the bucket you prefer, and you can delete the files from there.

One other option you may try is managing the bucket's object lifecycle. There is an option to delete objects when they meet all the conditions specified in a lifecycle rule; here is a link with one example of this option. This way, the bucket objects will be deleted automatically.

Solution 8 - Firebase

I have created a configuration file I named storage_artifacts_lifecycle.json with contents:

{
  "lifecycle": {
    "rule": [
      {
        "action": { "type": "Delete" },
        "condition": {
          "age": 21
        }
      }
    ]
  }
}

I configure my storage lifecycle with the command:

gsutil lifecycle set ./firebase/storage_artifacts_lifecycle.json gs://us.artifacts.${MY_PROJECT_ID}.appspot.com

and I validate its results after running with

gsutil lifecycle get gs://us.artifacts.${MY_PROJECT_ID}.appspot.com

Hope this helps some!

Solution 9 - Firebase

I did a bit of research on the topic and found the optimal solution for me: a script that I run before each deploy of my Firebase functions. The script scans my container images and:

  • Keeps the ones with the latest tag.
  • Deletes all the images except the last two.

This approach is semi-automated. The storage only grows when I deploy anyway, so it works really well for me.

The script is written in JavaScript for an environment with Node and the gcloud CLI available.

const spawn = require("child_process").spawn;

const KEEP_AT_LEAST = 2;
const CONTAINER_REGISTRIES = [
  "gcr.io/<your project name>",
  "eu.gcr.io/<your project name>/gcf/europe-west3"
];

async function go(registry) {
  console.log(`> ${registry}`);
  const images = await command(`gcloud`, [
    "container",
    "images",
    "list",
    `--repository=${registry}`,
    "--format=json",
  ]);
  for (let i = 0; i < images.length; i++) {
    console.log(`    ${images[i].name}`);
    const image = images[i].name;
    let tags = await command(`gcloud`, [
      "container",
      "images",
      "list-tags",
      image,
      "--format=json",
    ]);
    const totalImages = tags.length;
    // do not touch `latest`
    tags = tags.filter(({ tags }) => !tags.find((tag) => tag === "latest"));
    // sorting by date
    tags.sort((a, b) => {
      const d1 = new Date(a.timestamp.datetime);
      const d2 = new Date(b.timestamp.datetime);
      return d2.getTime() - d1.getTime();
    });
    // keeping at least X number of images
    tags = tags.filter((_, i) => i >= KEEP_AT_LEAST);

    console.log(`      For removal: ${tags.length}/${totalImages}`);
    for (let j = 0; j < tags.length; j++) {
      console.log(
        `      Deleting: ${formatImageTimestamp(tags[j])} | ${tags[j].digest}`
      );
      await command("gcloud", [
        "container",
        "images",
        "delete",
        `${image}@${tags[j].digest}`,
        "--format=json",
        "--quiet",
        "--force-delete-tags",
      ]);
    }
  }
}

function command(cmd, args) {
  return new Promise((done, reject) => {
    const ps = spawn(cmd, args);
    let result = "";

    ps.stdout.on("data", (data) => {
      result += data;
    });

    ps.stderr.on("data", (data) => {
      result += data;
    });

    ps.on("close", (code) => {
      if (code !== 0) {
        console.log(`process exited with code ${code}`);
      }
      try {
        done(JSON.parse(result));
      } catch (err) {
        done(result);
      }
    });
  });
}

function formatImageTimestamp(image) {
  const { year, month, day, hour, minute } = image.timestamp;
  return `${year}-${month}-${day} ${hour}:${minute}`;
}

(async function () {
  for (let i = 0; i < CONTAINER_REGISTRIES.length; i++) {
    await go(CONTAINER_REGISTRIES[i]);
  }
})();

It runs the following commands:

# finding images
gcloud container images list --repository=<your repository>

# getting metadata
gcloud container images list-tags <image name>

# deleting images
gcloud container images delete <image name>@<digest> --quiet --force-delete-tags

A blog post describing my findings is available here: https://krasimirtsonev.com/blog/article/firebase-gcp-saving-money

Solution 10 - Firebase

As an alternative, you can create a lifecycle rule to delete the objects inside the folder. Set the age to 1 day, so it will delete all objects in the folder that are more than 1 day old.

(Screenshots: lifecycle rule and Set Condition dialogs)

Attributions

All content for this solution is sourced from the original question on Stackoverflow.

The content on this page is licensed under the Attribution-ShareAlike 4.0 International (CC BY-SA 4.0) license.

Question: glaze (View Question on Stackoverflow)
Solution 1 - Firebase: Frank van Puffelen (View Answer on Stackoverflow)
Solution 2 - Firebase: yo1995 (View Answer on Stackoverflow)
Solution 3 - Firebase: decko (View Answer on Stackoverflow)
Solution 4 - Firebase: Mike Altonji (View Answer on Stackoverflow)
Solution 5 - Firebase: Beerswiller (View Answer on Stackoverflow)
Solution 6 - Firebase: d-_-b (View Answer on Stackoverflow)
Solution 7 - Firebase: Kejsaren (View Answer on Stackoverflow)
Solution 8 - Firebase: Greg Fenton (View Answer on Stackoverflow)
Solution 9 - Firebase: Krasimir (View Answer on Stackoverflow)
Solution 10 - Firebase: Ankur (View Answer on Stackoverflow)