Image generation models like photo-face-swap include a safety checker designed to prevent them from generating images that depict nudity, violence, or other unsafe content.

To protect users, we enable the safety checker for web predictions on certain image-based models and all derivative fine-tunes of those models.

The safety checker is intended to protect users, but it can sometimes be overly restrictive, producing false positives that flag safe content as unsafe. In such cases, you can disable the safety checker when running the model via the API, as shown in the sketch below. This gives you the flexibility to use a custom safety-checking model or integrate a third-party moderation service into your workflow.
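For example, a prediction request that turns the safety checker off might look like the following. This is a minimal sketch, not an official snippet: the endpoint URL, the `disable_safety_checker` input name, and the example input fields are assumptions; check your platform's API reference and the model's input schema for the exact names.

```python
# Minimal sketch (not an official example): assumes the model accepts a boolean
# `disable_safety_checker` input and that predictions are created with an HTTP
# POST to a predictions endpoint. Adjust the URL, version ID, and input names
# to match your provider's API reference.
import os
import requests

API_TOKEN = os.environ["API_TOKEN"]                          # your platform API token
PREDICTIONS_URL = "https://api.example.com/v1/predictions"   # hypothetical endpoint

response = requests.post(
    PREDICTIONS_URL,
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    json={
        "version": "<model-version-id>",      # the photo-face-swap version to run
        "input": {
            "source_image": "https://example.com/face.jpg",   # example inputs; names are assumptions
            "target_image": "https://example.com/photo.jpg",
            "disable_safety_checker": True,   # assumed parameter name; confirm in the model's schema
        },
    },
    timeout=30,
)
response.raise_for_status()
prediction = response.json()
print(prediction["id"], prediction.get("status"))
```

If you disable the built-in checker, you are responsible for moderating the output yourself, for example by passing each generated image through your own safety-checking model or a third-party moderation API before showing it to users.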

For more details on allowed use, please refer to the Terms of Service.