---
title: Updaters
short_title: Updaters
description: Special algorithms for gradient descent.
category: Models
weight: 10
---

## What are updaters?

The main difference among updaters is how they treat the learning rate. Stochastic gradient descent, the most common learning algorithm in deep learning, relies on `Theta` (the weights in the hidden layers) and `alpha` (the learning rate). Different updaters adapt the learning rate during training to help the neural network converge on its most performant state.
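As a concrete baseline, plain SGD applies the same global learning rate to every parameter. The sketch below is illustrative only; the class and method names are hypothetical, not part of the DL4J API (DL4J performs this update internally when you configure `new Sgd(alpha)`):

```java
// Hypothetical sketch of a single plain-SGD step: theta <- theta - alpha * gradient.
public class SgdStep {
    public static double[] step(double[] theta, double[] gradient, double alpha) {
        double[] updated = new double[theta.length];
        for (int i = 0; i < theta.length; i++) {
            // Every parameter is scaled by the same fixed learning rate alpha.
            updated[i] = theta[i] - alpha * gradient[i];
        }
        return updated;
    }

    public static void main(String[] args) {
        double[] theta = {1.0, -2.0};
        double[] grad  = {0.5, -0.5};
        double[] next  = step(theta, grad, 0.1);
        System.out.println(next[0] + " " + next[1]);
    }
}
```

Adaptive updaters replace the fixed `alpha * gradient[i]` term with a per-parameter step derived from running gradient statistics.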
## Usage

To use an updater, pass a new instance of the updater class to the `updater()` method when configuring either a `ComputationGraph` or a `MultiLayerNetwork`.

```java
ComputationGraphConfiguration conf = new NeuralNetConfiguration.Builder()
    .updater(new Adam(0.01))
    // add your layers and hyperparameters below
    .build();
```
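For intuition on how an adaptive updater such as `Adam` rescales the step, here is a minimal, self-contained sketch of the Adam rule for a single parameter. The class and field names are hypothetical; DL4J's `Adam` implements the full vectorized version with its own defaults:

```java
// Hypothetical single-parameter Adam step, using the commonly cited defaults
// beta1 = 0.9, beta2 = 0.999, epsilon = 1e-8. Unlike plain SGD, the effective
// step is alpha * mHat / (sqrt(vHat) + epsilon), so it adapts to running
// gradient statistics instead of applying alpha to the raw gradient.
public class AdamSketch {
    double alpha, beta1 = 0.9, beta2 = 0.999, epsilon = 1e-8;
    double m = 0.0, v = 0.0; // first and second moment estimates
    int t = 0;               // time step, used for bias correction

    AdamSketch(double alpha) { this.alpha = alpha; }

    double step(double theta, double gradient) {
        t++;
        m = beta1 * m + (1 - beta1) * gradient;            // momentum-like average
        v = beta2 * v + (1 - beta2) * gradient * gradient; // squared-gradient average
        double mHat = m / (1 - Math.pow(beta1, t));        // bias-corrected moments
        double vHat = v / (1 - Math.pow(beta2, t));
        return theta - alpha * mHat / (Math.sqrt(vHat) + epsilon);
    }

    public static void main(String[] args) {
        AdamSketch adam = new AdamSketch(0.01);
        // On the first step the update magnitude is close to alpha regardless
        // of the raw gradient scale, one reason Adam is robust to tuning.
        System.out.println(adam.step(1.0, 100.0));
    }
}
```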

## Available updaters

{{autogenerated}}