# Gradient Penalty

$$
\lambda \cdot \mathrm{relu}\left(\lVert \mathrm{gradients}(\mathrm{target}, \mathrm{components}) \rVert_2 - \mathrm{flex}\right)^2
$$
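The formula can be sketched numerically as follows. This is an illustrative NumPy version (the function name and arguments are hypothetical, not HyperGAN's API): gradients whose L2 norm stays at or below `flex` incur no penalty, and anything beyond `flex` is penalized quadratically, scaled by `lambda`.

```python
import numpy as np

def gradient_penalty(gradients, lam=1.0, flex=1.0):
    """lambda * relu(||gradients||_2 - flex)^2, per the formula above.

    `gradients` stands in for gradients(target, components);
    the name and signature are illustrative only.
    """
    norm = np.linalg.norm(gradients)      # ||gradients||_2
    overflow = max(norm - flex, 0.0)      # relu(norm - flex)
    return lam * overflow ** 2            # lambda * (...)^2

# Norm exactly at flex (1.0 here): no penalty.
gradient_penalty(np.array([0.6, 0.8]))   # -> 0.0
# Norm 5.0 with flex 1.0: penalty (5 - 1)^2 = 16.0.
gradient_penalty(np.array([3.0, 4.0]))   # -> 16.0
```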

## examples

```json
{
  "class": "function:hypergan.train_hooks.gradient_penalty_train_hook.GradientPenaltyTrainHook",
  "lambda": 1.0,
  "flex": 1.0,
  "components": ["discriminator"],
  "target": "discriminator"
}
```

## options

| attribute | description | type |
| --------- | ----------- | ---- |
| target | <p>Used in gradients(target, components)</p><p>defaults to <code>discriminator</code></p> | string (optional) |
| lambda | <p>Loss multiplier</p><p>defaults to <code>1.0</code></p> | float |
| components | <p>Used in gradients(target, components)</p><p>defaults to all components</p> | array of strings |
| flex | <p>Gradient norm threshold below which no penalty applies.</p><p>Can also be a list for separate X/G flex, e.g. <code>[0.0, 10.0]</code></p> | float or array of floats |
| loss | <p>Which loss the penalty is added to: <code>g_loss</code> or <code>d_loss</code></p><p>defaults to <code>g_loss</code></p> | string |

{% hint style="info" %}
Floats are [configurable parameters](https://hypergan.gitbook.io/hypergan/configuration/configurable-parameters)
{% endhint %}
