
Commit d761e8a (parent: 565d600)

feat: expose leakyReluAlpha

very small feature, large consequences

7 files changed (+33, -25 lines)

README.md

Lines changed: 8 additions & 6 deletions
```diff
@@ -3,8 +3,8 @@
 
 <img src="https://cdn.rawgit.com/harthur-org/brain.js/ff595242/logo.svg" alt="Logo" width=200px/>
 
-[![npm](https://img.shields.io/npm/dt/brain.js.svg?style=flat-square)](https://npmjs.com/package/brain.js)
-[![Backers on Open Collective](https://opencollective.com/brainjs/backers/badge.svg)](#backers) [![Sponsors on Open Collective](https://opencollective.com/brainjs/sponsors/badge.svg)](#sponsors)
+[![npm](https://img.shields.io/npm/dt/brain.js.svg?style=flat-square)](https://npmjs.com/package/brain.js)
+[![Backers on Open Collective](https://opencollective.com/brainjs/backers/badge.svg)](#backers) [![Sponsors on Open Collective](https://opencollective.com/brainjs/sponsors/badge.svg)](#sponsors)
 
 [![Gitter](https://badges.gitter.im/Join%20Chat.svg)](https://gitter.im/brain-js/Lobby?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge) [![Slack](https://slack.bri.im/badge.svg)](https://slack.bri.im)
 
```

```diff
@@ -45,7 +45,7 @@
   + [`likely`](#likely)
 - [Neural Network Types](#neural-network-types)
   + [Why different Neural Network Types?](#why-different-neural-network-types)
-
+
 # Examples
 Here's an example showcasing how to approximate the XOR function using `brain.js`:
 more info on config [here](https://github.com/BrainJS/brain.js/blob/develop/src/neural-network.js#L31).
```
```diff
@@ -55,7 +55,8 @@ more info on config [here](https://github.com/BrainJS/brain.js/blob/develop/src/
 const config = {
   binaryThresh: 0.5,
   hiddenLayers: [3], // array of ints for the sizes of the hidden layers in the network
-  activation: 'sigmoid' // supported activation types: ['sigmoid', 'relu', 'leaky-relu', 'tanh']
+  activation: 'sigmoid', // supported activation types: ['sigmoid', 'relu', 'leaky-relu', 'tanh']
+  leakyReluAlpha: 0.01 // supported for activation type 'leaky-relu'
 };
 
 // create a simple feed forward neural network with backpropagation
```
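To see the new option end to end, here is a minimal usage sketch against the constructor API this README documents; the 0.05 value is arbitrary, chosen only to differ from the 0.01 default:

```js
// Minimal sketch, assuming the brain.js 1.x API shown in this README.
// 'leakyReluAlpha' only takes effect when activation is 'leaky-relu'.
const brain = require('brain.js');

const net = new brain.NeuralNetwork({
  hiddenLayers: [3],
  activation: 'leaky-relu',
  leakyReluAlpha: 0.05 // arbitrary example value; the default is 0.01
});

net.train([
  { input: [0, 0], output: [0] },
  { input: [0, 1], output: [1] },
  { input: [1, 0], output: [1] },
  { input: [1, 1], output: [0] }
]);

console.log(net.run([1, 0])); // a value close to 1 once trained
```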
````diff
@@ -293,7 +294,7 @@ const net = crossValidate.fromJSON(json);
 An example of using cross validate can be found in [examples/cross-validate.js](examples/cross-validate.js)
 
 ### Train Stream
-Streams are a very powerful tool in node for massive data spread across processes and are provided via the brain.js api in the following way:
+Streams are a very powerful tool in node for massive data spread across processes and are provided via the brain.js api in the following way:
 ```js
 const net = new brain.NeuralNetwork();
 const trainStream = new brain.TrainStream({
````

````diff
@@ -364,11 +365,12 @@ const net = new brain.NeuralNetwork({
 ```
 
 ### activation
-This parameter lets you specify which activation function your neural network should use. There are currently four supported activation functions, **sigmoid** being the default:
+This parameter lets you specify which activation function your neural network should use. There are currently four supported activation functions, **sigmoid** being the default:
 
 - [sigmoid](https://www.wikiwand.com/en/Sigmoid_function)
 - [relu](https://www.wikiwand.com/en/Rectifier_(neural_networks))
 - [leaky-relu](https://www.wikiwand.com/en/Rectifier_(neural_networks))
+  * related option - 'leakyReluAlpha' optional number, defaults to 0.01
 - [tanh](https://theclevermachine.wordpress.com/tag/tanh-function/)
 
 Here's a table (thanks, Wikipedia!) summarizing a plethora of activation functions — [Activation Function](https://www.wikiwand.com/en/Activation_function)
````
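For reference, the alpha that `leakyReluAlpha` controls is the slope a leaky ReLU applies on one side of zero; in the standard textbook form that is the negative side. A standalone sketch, independent of brain.js internals:

```js
// Textbook leaky ReLU: identity for x >= 0, slope alpha for x < 0,
// so negative inputs "leak" through instead of being clamped to 0.
const leakyRelu = (x, alpha = 0.01) => (x < 0 ? alpha * x : x);

// Its derivative, used during backpropagation: 1 above zero, alpha below.
const leakyReluPrime = (x, alpha = 0.01) => (x < 0 ? alpha : 1);

console.log(leakyRelu(2));  // 2
console.log(leakyRelu(-2)); // -0.02 with the default alpha of 0.01
```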

browser.js

Lines changed: 8 additions & 5 deletions
```diff
@@ -6,7 +6,7 @@
  * license: MIT (http://opensource.org/licenses/MIT)
  * author: Heather Arthur <fayearthur@gmail.com>
  * homepage: https://github.com/brainjs/brain.js#readme
- * version: 1.4.9
+ * version: 1.4.10
  *
  * acorn:
  * license: MIT (http://opensource.org/licenses/MIT)
```

```diff
@@ -1108,6 +1108,7 @@ var NeuralNetwork = function () {
     key: 'defaults',
     get: function get() {
       return {
+        leakyReluAlpha: 0.01,
         binaryThresh: 0.5,
         hiddenLayers: [3], // array of ints for the sizes of the hidden layers in the network
         activation: 'sigmoid' // Supported activation types ['sigmoid', 'relu', 'leaky-relu', 'tanh']
```

```diff
@@ -1298,7 +1299,7 @@ var NeuralNetwork = function () {
     key: '_runInputLeakyRelu',
     value: function _runInputLeakyRelu(input) {
       this.outputs[0] = input; // set output state of input layer
-
+      var alpha = this.leakyReluAlpha;
       var output = null;
       for (var layer = 1; layer <= this.outputLayer; layer++) {
         for (var node = 0; node < this.sizes[layer]; node++) {
```

```diff
@@ -1309,7 +1310,7 @@ var NeuralNetwork = function () {
             sum += weights[k] * input[k];
           }
           //leaky relu
-          this.outputs[layer][node] = sum < 0 ? 0 : 0.01 * sum;
+          this.outputs[layer][node] = sum < 0 ? 0 : alpha * sum;
         }
         output = input = this.outputs[layer];
       }
```
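Worth noting: the expression parameterized here, `sum < 0 ? 0 : alpha * sum`, clamps negative sums to 0 and scales the positive side, which is not the textbook leaky ReLU sketched earlier. The commit swaps the hard-coded 0.01 for `alpha` and leaves the surrounding formula exactly as it was; a side-by-side comparison, for illustration only:

```js
// What the compiled code computes vs. the standard definition; only the
// 0.01 -> alpha substitution is new in this commit.
const asCompiled = (sum, alpha) => (sum < 0 ? 0 : alpha * sum); // hunk above
const textbook = (x, alpha) => (x < 0 ? alpha * x : x); // standard leaky ReLU
```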
```diff
@@ -1678,6 +1679,7 @@ var NeuralNetwork = function () {
   }, {
     key: '_calculateDeltasLeakyRelu',
     value: function _calculateDeltasLeakyRelu(target) {
+      var alpha = this.leakyReluAlpha;
       for (var layer = this.outputLayer; layer >= 0; layer--) {
         for (var node = 0; node < this.sizes[layer]; node++) {
           var output = this.outputs[layer][node];
```

```diff
@@ -1692,7 +1694,7 @@ var NeuralNetwork = function () {
           }
         }
         this.errors[layer][node] = error;
-        this.deltas[layer][node] = output > 0 ? error : 0.01 * error;
+        this.deltas[layer][node] = output > 0 ? error : alpha * error;
       }
     }
   }
```
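Unlike the forward pass, this delta rule does line up with the standard derivative: leaky ReLU's gradient is 1 where the unit's output is positive and alpha elsewhere, so the propagated error is either passed through unchanged or scaled by alpha. A reduced sketch of the rule being parameterized:

```js
// delta = error * f'(output), with f'(x) = 1 for x > 0 and alpha otherwise.
const leakyReluDelta = (error, output, alpha) =>
  output > 0 ? error : alpha * error;
```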
```diff
@@ -2085,6 +2087,7 @@ var NeuralNetwork = function () {
     key: 'toFunction',
     value: function toFunction() {
       var activation = this.activation;
+      var leakyReluAlpha = this.leakyReluAlpha;
       var needsVar = false;
       function nodeHandle(layers, layerNumber, nodeKey) {
         if (layerNumber === 0) {
```

```diff
@@ -2114,7 +2117,7 @@ var NeuralNetwork = function () {
           case 'leaky-relu':
             {
               needsVar = true;
-              return '((v=' + result.join('') + ')<0?0:0.01*v)';
+              return '((v=' + result.join('') + ')<0?0:' + leakyReluAlpha + '*v)';
             }
           case 'tanh':
             return 'Math.tanh(' + result.join('') + ')';
```
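`toFunction()` compiles the trained network into a standalone function by concatenating source strings, so the configured slope has to be interpolated into the emitted expression rather than read from `this`. With this change, a network configured with `leakyReluAlpha: 0.05` would emit an expression shaped like the following; the weights and bias are invented for illustration:

```js
// Illustrative only: the shape of one node's emitted expression when
// leakyReluAlpha = 0.05; 'v' is the temporary that needsVar = true declares.
var v;
function node(input) {
  return (v = 0.4 * input[0] + -0.2 * input[1] + 0.1) < 0 ? 0 : 0.05 * v;
}

console.log(node([1, 0])); // v = 0.5, so 0.05 * 0.5 = 0.025
```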

browser.min.js

Lines changed: 6 additions & 7 deletions
(Generated file; diff not rendered.)

dist/neural-network.js

Lines changed: 7 additions & 4 deletions
(Generated file; diff not rendered.)

dist/neural-network.js.map

Lines changed: 1 addition & 1 deletion
(Generated file; diff not rendered.)

index.d.ts

Lines changed: 2 additions & 1 deletion
```diff
@@ -34,7 +34,8 @@ export interface INeuralNetworkJSON {
   outputLookup: any;
   inputLookup: any;
   activation: NeuralNetworkActivation,
-  trainOpts: INeuralNetworkTrainingOptions
+  trainOpts: INeuralNetworkTrainingOptions,
+  leakyReluAlpha?: number,
 }
 
 export interface INeuralNetworkTrainingData {
```
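The typing change implies the slope travels with the serialized network. A hedged round-trip sketch, assuming `toJSON()` actually emits the field the interface now declares:

```js
// Sketch only: INeuralNetworkJSON now allows an optional leakyReluAlpha,
// so a serialized network should carry its configured slope with it.
const net = new brain.NeuralNetwork({
  activation: 'leaky-relu',
  leakyReluAlpha: 0.05 // arbitrary example value
});
net.train([{ input: [0, 1], output: [1] }]);

const json = net.toJSON();
console.log(json.leakyReluAlpha); // expected: 0.05, per the typing above

const restored = new brain.NeuralNetwork().fromJSON(json);
```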

package.json

Lines changed: 1 addition & 1 deletion
```diff
@@ -1,7 +1,7 @@
 {
   "name": "brain.js",
   "description": "Neural network library",
-  "version": "1.4.9",
+  "version": "1.4.10",
   "author": "Heather Arthur <fayearthur@gmail.com>",
   "repository": {
     "type": "git",
```
