Learning Rate Dropout
https://arxiv.org/abs/1912.00144

This train hook implements learning rate dropout, described in the paper linked above: at each training step a fraction of the per-parameter gradients is randomly dropped, with kept gradients multiplied by `ones` and dropped gradients multiplied by `zeros`.
examples
```json
{
  "class": "function:hypergan.train_hooks.learning_rate_dropout_train_hook.LearningRateDropoutTrainHook",
  "dropout": 0.01,
  "ones": 1e12,
  "zeros": 0.0,
  "skip_d": true
}
```
options
| attribute | description | type |
| --- | --- | --- |
| dropout | Dropout ratio between 0 and 1. Defaults to 0.5 | float |
| ones | The gradient multiplier when not dropped out. Defaults to 0.1 | float |
| zeros | The gradient multiplier when dropped out. Defaults to 0.0 | float |
| skip_d | Skip the discriminator's gradients. Defaults to false | boolean |
| skip_g | Skip the generator's gradients. Defaults to false | boolean |
Floats are configurable parameters.
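The snippet below is a minimal sketch of the gradient masking that learning rate dropout performs, wired to the `dropout`, `ones`, and `zeros` attributes documented above. The function name is hypothetical and this is not HyperGAN's implementation; it assumes `dropout` is the fraction of gradient elements dropped each step.

```python
import torch

def lr_dropout_mask(grad, dropout=0.5, ones=0.1, zeros=0.0):
    # Hypothetical sketch, not HyperGAN's code.
    # Each gradient element is dropped with probability `dropout`;
    # kept elements are scaled by `ones`, dropped elements by `zeros`.
    keep = (torch.rand_like(grad) > dropout).to(grad.dtype)
    return grad * (keep * ones + (1.0 - keep) * zeros)

# Using the defaults from the options table above:
grad = torch.randn(3, 3)
masked = lr_dropout_mask(grad, dropout=0.5, ones=0.1, zeros=0.0)
```

With the default `zeros` of 0.0, dropped coordinates receive no update at all for that step.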