SafetySetting

public final class SafetySetting


This class is deprecated.
The Vertex AI in Firebase SDK (firebase-vertexai) has been replaced with the FirebaseAI SDK (firebase-ai) to accommodate the evolving set of supported features and services. For migration details, see the migration guide: https://firebase.google.com/docs/vertex-ai/migrate-to-latest-sdk

A configuration that specifies the HarmBlockThreshold for a given HarmCategory, controlling what content is allowed or blocked in responses.

Summary

Public constructors

SafetySetting(
    @NonNull HarmCategory harmCategory,
    @NonNull HarmBlockThreshold threshold,
    HarmBlockMethod method
)

Public constructors

SafetySetting

public SafetySetting(
    @NonNull HarmCategory harmCategory,
    @NonNull HarmBlockThreshold threshold,
    HarmBlockMethod method
)
Parameters
@NonNull HarmCategory harmCategory

The relevant HarmCategory.

@NonNull HarmBlockThreshold threshold

The threshold of harm allowed for that category.

HarmBlockMethod method

Specifies whether the threshold applies to the probability score or the severity score. If not specified, it defaults to HarmBlockMethod.PROBABILITY.
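For example, a SafetySetting can be constructed as in the minimal Java sketch below. The package path (com.google.firebase.vertexai.type) and the values HARASSMENT, MEDIUM_AND_ABOVE, and PROBABILITY are assumptions based on the deprecated firebase-vertexai SDK; consult the migration guide above when moving to firebase-ai.

import com.google.firebase.vertexai.type.HarmBlockMethod;
import com.google.firebase.vertexai.type.HarmBlockThreshold;
import com.google.firebase.vertexai.type.HarmCategory;
import com.google.firebase.vertexai.type.SafetySetting;

// Block harassment content whose probability score is MEDIUM or above.
SafetySetting harassmentSetting = new SafetySetting(
        HarmCategory.HARASSMENT,
        HarmBlockThreshold.MEDIUM_AND_ABOVE,
        HarmBlockMethod.PROBABILITY);

The resulting setting is typically passed, together with settings for other categories, as a list of SafetySetting objects when configuring a generative model.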
