Deep learning

BigGAN deep in TensorFlow 2

A TensorFlow 2 implementation of the BigGAN deep neural network

Johan Leduc

Conditional samples from BigGAN deep at 256x256 pixels

I recently went looking for a state-of-the-art conditional image generation model in TensorFlow 2 and found that the BigGAN deep model from the Large Scale GAN Training for High Fidelity Natural Image Synthesis paper was a perfect fit for the job. In this post, I present a full TensorFlow 2 implementation of the model with pre-trained weights.

DeepMind released the models on TF Hub as KerasLayer objects, but they are not fine-tunable with TensorFlow 2 and the model's code is not released. This was problematic since I wanted the ability to fine-tune the first blocks while freezing the last ones.

I found an implementation by HuggingFace that recodes the model's logic in PyTorch and fetches the pre-trained weights from the released models. I didn't want to switch to PyTorch, so I adapted the model's logic and weight mapping from PyTorch to TensorFlow 2. The code to build the model and load the pre-trained weights is available in our GitHub TF2 published models repository.
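
Porting the weights mostly boils down to renaming variables and transposing kernel axes between the two frameworks' conventions. As an illustration only, here is a minimal sketch of that idea; the helper names are hypothetical and this is not the actual mapping code from the repository.

import numpy as np

# Hypothetical helpers: PyTorch stores conv weights as (out_ch, in_ch, kH, kW),
# while TensorFlow expects (kH, kW, in_ch, out_ch).
def torch_conv_to_tf_kernel(torch_weight: np.ndarray) -> np.ndarray:
    return np.transpose(torch_weight, (2, 3, 1, 0))

# Dense layers only need a transpose: (out_features, in_features) -> (in, out).
def torch_dense_to_tf_kernel(torch_weight: np.ndarray) -> np.ndarray:
    return np.transpose(torch_weight, (1, 0))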

To generate samples, simply import the BigGAN object and sample:

from biggan import BigGAN

# Load the generator pre-trained at 128x128 resolution
model = BigGAN.from_pretrained("biggan-deep-128")
# Generate one conditional sample per class label
images = model.sample_with_labels(["lion", "tiger"])

Since the model's code is entirely released in TF2, you can fine-tune or modify it as you please, for instance by freezing the last blocks while training the first ones. Enjoy!
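
As a rough sketch of what that freezing could look like, assuming the model behaves like a standard tf.keras.Model whose blocks are exposed in order through model.layers (the actual layer structure may differ, so check the repository):

from biggan import BigGAN

model = BigGAN.from_pretrained("biggan-deep-128")

# Hypothetical sketch: keep the first blocks trainable and freeze the rest.
N_TRAINABLE_BLOCKS = 4
for i, layer in enumerate(model.layers):
    layer.trainable = i < N_TRAINABLE_BLOCKS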

About the author

Johan Leduc

Data Science Researcher @ Sarus
