Softmax Loss
Adapted from Softmax GAN (https://arxiv.org/abs/1704.06191)
Source: /losses/softmax_loss.py
```python
# Log of the softmax partition function over the full batch (real + fake),
# with a small epsilon added for numerical stability.
ln_zb = (((-d_real).exp().sum() + (-d_fake).exp().sum()) + 1e-12).log()

# The discriminator targets a uniform distribution over the real samples;
# the generator targets a uniform distribution over the whole batch.
d_target = 1.0 / d_real.shape[0]
g_target = d_target / 2.0

g_loss = g_target * (d_fake.sum() + d_real.sum()) + ln_zb
d_loss = d_target * d_real.sum() + ln_zb
```
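To make the computation concrete, here is a minimal NumPy sketch of the same loss, assuming `d_real` and `d_fake` are 1-D arrays of discriminator outputs for a batch of real and generated samples (the sample values below are illustrative, not from the source):

```python
import numpy as np

# Hypothetical discriminator outputs for 4 real and 4 fake samples.
d_real = np.array([0.2, -0.5, 0.1, 0.3])
d_fake = np.array([1.0, 0.8, 1.2, 0.9])

# Log-partition over the whole batch, with epsilon for stability.
ln_zb = np.log(np.exp(-d_real).sum() + np.exp(-d_fake).sum() + 1e-12)

# Discriminator target: 1/|real batch| mass on each real sample.
d_target = 1.0 / d_real.shape[0]
# Generator target: spread the same mass over real and fake equally.
g_target = d_target / 2.0

g_loss = g_target * (d_fake.sum() + d_real.sum()) + ln_zb
d_loss = d_target * d_real.sum() + ln_zb

print(d_loss, g_loss)
```

Because the fake samples score higher (the discriminator pushes `D` down on real data here, via the `exp(-D)` softmax), `d_loss` comes out smaller than `g_loss` for these values; both losses are minimized as the softmax over the batch approaches the respective target distributions.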
Examples
Configurations: /losses/softmax_loss/
```json
{
  "class": "function:hypergan.losses.softmax_loss.SoftmaxLoss"
}
```