LSTM examples? #109
const trainingData = [
'Jane saw Doug.',
'Spot saw himself.',
'Doug saw Jane.'
];
const lstm = new brain.recurrent.LSTM();
const result = lstm.train(trainingData, { iterations: 1000 });
const run1 = lstm.run('Jane');
const run2 = lstm.run('Spot');
const run3 = lstm.run('Doug');
console.log('run 1: Jane' + run1);
console.log('run 2: Spot' + run2);
console.log('run 3: Doug' + run3);
https://jsfiddle.net/robertleeplummerjr/x3cna8rn/2/
I actually just discovered a bug, using the
😎 |
Notice too that after each name there are spaces. That isn't magic, nor was the net specifically told to put spaces; it is just math. The net saw them in the training data and added them after training. |
Now I'm starting to have fun with this: https://jsfiddle.net/robertleeplummerjr/x3cna8rn/4/
|
lol, this is actually addictive: https://jsfiddle.net/robertleeplummerjr/x3cna8rn/5/
|
https://twitter.com/robertlplummer/status/947275642365730816 Now we just need a publisher... |
Amazing!! So I was completely wrong about how LSTMs expect their training data, it seems? I was thinking the time series was each item in the array of training data, but it seems that it's not. In this case, it's using each character of each item? This is really fun! https://codepen.io/anon/pen/xpdmdN?editors=1010
I'm seeing some weird behavior sometimes, where instead of text, it gives output like '19,19,19,19...19' instead of words. Not sure if that is the bug you mentioned on the other issue.
So now that I can feed it text, and think I understand how it expects text input, I'm lost as to how to feed it anything else. Suppose we wanted to try to predict the weather. Our input data might look like this:
{
  date: '10-12-17',
  temperature: 45,
  humidity: 30
},
{
  date: '10-13-17',
  temperature: 40,
  humidity: 10
},
How would we convert that into something the LSTM can try to make sense of, both as training data and then, once trained, as input? |
Actually you are more correct than you think. The net does provide some syntactic sugar in some scenarios. Objects have not yet been worked out, unless you convert them to a string or number. |
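To make that concrete for the weather example above, here is one way it might look once the objects are flattened to strings. This is only a sketch: the separator characters and field layout are arbitrary choices, and (as explained in the next comment) the numbers are still seen by the net as sequences of characters, not quantities.
const brain = require('brain.js');

// Sketch: flatten each record into a string so the character-level LSTM can consume it.
// The formatting (separators, field order) is an arbitrary choice for illustration.
const records = [
  { date: '10-12-17', temperature: 45, humidity: 30 },
  { date: '10-13-17', temperature: 40, humidity: 10 }
];

const trainingData = records.map(r => `${r.date} temp:${r.temperature} hum:${r.humidity}`);
// -> ['10-12-17 temp:45 hum:30', '10-13-17 temp:40 hum:10']

const lstm = new brain.recurrent.LSTM();
lstm.train(trainingData, { iterations: 1000 });

// Prompt with the start of a record and let the net complete the rest,
// just like prompting with a name in the children's book example.
console.log(lstm.run('10-13-17'));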
To be clearer: things like numbers aren't so good with recurrent nets as they have been defined to me and as I've seen them implemented. Part of the reason is that every character is an input to the net, so numbers are seen more like characters (honestly more like buttons that are pushed in a sequence, which is translated to another sequence when predicting). This mentality is borrowed from recurrentjs, and is part of the reason I started working on v2. I want the ability to feed data into a recurrent net, not just "push buttons". |
I found the related issue when playing around with the toFunction method, here: #110. Will be pushing a PR soon; locally I have it fixed. |
released: https://www.npmjs.com/package/brain.js |
I'd like to use this children's book example in the release of v1 on the brain.js.org homepage. Are you ok with that? |
Ok yeah, that makes sense (the pushed buttons analogy)! That's what I've been observing. The LSTM currently throws an error, for example, if you try to run it with a word it's never seen before. I'd love to see the example in the readme! That was a big motivator for this exercise, to help others make sense of it. I got the idea for the children's book here btw: https://www.youtube.com/watch?v=WCUNPb-5EYI |
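A quick way to reproduce the unseen-input behavior described here. This is a sketch, not documented behavior: the exact error text depends on the version, though a later comment in this thread reports it as an "unrecognized character" error.
const brain = require('brain.js');

const lstm = new brain.recurrent.LSTM();
lstm.train([
  'Jane saw Doug.',
  'Spot saw himself.',
  'Doug saw Jane.'
], { iterations: 1000 });

try {
  // 'Rex' never appears in the training data, so some of its characters
  // fall outside the vocabulary the net built during training.
  console.log(lstm.run('Rex'));
} catch (err) {
  console.error(err.message); // e.g. an "unrecognized character" style error
}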
Are you cool with using it as a demo? It is simple enough and fun enough that I feel it'd be great as a primer into recurrent nets. |
Yeah, go for it! I think it does a good job of giving newcomers an idea of how the different types of networks work and what types of projects they're a good fit for. |
Fixed the output function: https://jsfiddle.net/robertleeplummerjr/x3cna8rn/10/ |
Published a new release; I believe this one is v1 ready. |
Awesome! I’m going to close this issue. |
Hey, how did you ever resolve this problem? I am facing the same situation. I have a time series with objects that look like this: {
My goal was to predict the closing price for a future date. I am trying to implement an LSTM. I am not sure what to send as the input and output, so I tried a couple of things and got different outputs. First I pushed only the closing price into the training data, like so:
I then receive the following error: Cannot read property 'runBackpropagate' of undefined. I then tried using the index as the input and the price as the output:
Now if I run it for an input between 0 and 20 it gives me the same values from the training data, but if I run it for anything out of that scope (22, for example) I get: unrecognized character "22". Finally, I tried pushing them in as strings:
And my output is a string like "112.3". However, I am not sure if that's supposed to be the expected value. Option 1 seems like it would make more sense, passing the position as input and the close as output. Thanks again! This library is truly awesome! |
TL;DR TS;DR Let that sink in for a minute. The values that you are training for aren't actually sent into the neural net (I want to change this).
Part of my frustration with javascript neural networks is that they are "just hobby architectures", for learning... as in "don't take it seriously"... Please. When I implemented the recurrent neural network from recurrent.js, I took a step back and thought "how can I fix this for javascript?", for it to not be a hobby, but for it to have that awakening much like when a superhero realizes its powers.
I did a lot of research and arrived at http://gpu.rocks/, but it was not very stable, and not yet released as v1. If you want things done right... After some initial pull requests, and fantastic conversations with the architects, I was added to the team, where I helped improve the architecture and enabled it to run on node with a CPU-only fallback, while I was working on brain.js to have an architecture that could use gpu.js. Our plan is to have opencl running in it soon, which is where we go from hobby to enterprise. Long story short, a few days before the new year, v1 was released.
The recurrent neural network concept is being planned, and will match very closely to the feedforward neural network; it will allow an easy means of adapting networks to time series data, translation, math, audio, or video, because the architecture is all about resembling very simple math.
So getting back to why this response is so long: I could have started experimenting with more interesting concepts on how to use LSTMs and recurrent neural networks, but I chose to build the foundation for making javascript a superhero. I talked with a fellow developer yesterday and said the following: "In javascript, if you want some serious neural networks to train, it could take a year of training. But if you had the right foundation, you could raise the performance a couple orders of magnitude and calculate those same values in a day. I'd take a year to build that foundation. Why? Because then I could do 4 lifetimes of work in two years." |
TY, that means a lot. |
Here may be more of what you are looking for to train over time-series data: https://jsfiddle.net/9t2787k5/4/ This uses https://github.com/wagenaartje/neataptic |
@robertleeplummerjr here's some end-of-data for IBM from 2014-to-2016 in JSON https://gist.github.com/RobertLowe/63642523f227b15c6616a1d89f5b489f (although a less stable stock might be more interesting) I've only seen LSTM's implemented in |
@RobertLowe the API for the time series is now built and ready for review. It strongly reuses the recurrent series, and should be simple enough for anyone who understands lists. Said differently: very low learning curve. The API is proposed here: #192
The API:
import brain from 'brain.js';

const net = new brain.recurrent.RNNTimeSeries(options);
// or
const net = new brain.recurrent.LSTMTimeSeries(options);
// or
const net = new brain.recurrent.GRUTimeSeries(options);

net.train([
  [1, 2, 3, 4, 5],
  [5, 4, 3, 2, 1],
]);

net.run([1, 2, 3, 4]); // -> 5
net.run([5, 4, 3, 2]); // -> 1 |
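Assuming the proposed API above lands as written, the closing-price question from earlier in the thread might look something like the sketch below. LSTMTimeSeries here is the proposed class, not a released API at the time of this thread, and the prices are made-up numbers; whether the data needs scaling or extra options is not specified in the proposal.
const brain = require('brain.js');

// Made-up closing prices, purely for illustration.
const closes = [112.3, 113.1, 112.8, 114.0, 115.2, 114.7, 116.1, 117.0];

// Build sliding windows of five consecutive closes, so the net learns
// "given four closes, predict the fifth".
const windows = [];
for (let i = 0; i + 5 <= closes.length; i++) {
  windows.push(closes.slice(i, i + 5));
}

const net = new brain.recurrent.LSTMTimeSeries();
net.train(windows);

// Ask for the close that should follow the last four known values.
console.log(net.run(closes.slice(-4)));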
@robertleeplummerjr wow, this is great! I'll have fun giving it a run with some larger dataset. Thanks for the ping! 👍 |
Hi @robertleeplummerjr, |
Hmmm. I'm experiencing a very strange error here. I have this LSTM network that I'm trying to train on two numbers and two strings. I have a pretty large set of data to pull from, and I use one variable, whether or not it's set to one, as the output. I've modified the data so that it should definitely return the right results, but the results are constantly incorrect. I've increased the number of training iterations, and pulled objects directly from the list, but I still get wrong answers. What's going on? nvm, increased iterations and it works. |
Can you share config and or data? |
Nevermind, I got everything working. Seems pretty accurate when I set hidden layers to [50,50] and provide more data for the network to consume, which is sort of blowing my mind. I'm using it at work in a backend, too. Does this have anything to do with tensorflow? |
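For anyone curious what that kind of configuration looks like, here is a rough sketch. The hiddenLayers option name and the toy training data are my assumptions; the actual config and dataset behind the comment above weren't posted.
const brain = require('brain.js');

// Placeholder data; the real setup used a much larger dataset.
const trainingData = [
  'Jane saw Doug.',
  'Spot saw himself.',
  'Doug saw Jane.'
];

const net = new brain.recurrent.LSTM({
  hiddenLayers: [50, 50] // two hidden layers of 50 units, as mentioned above
});

// More iterations (and more data) was what made the results accurate above.
net.train(trainingData, { iterations: 2000 });

console.log(net.run('Jane'));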
Sharing your example may help others, if you are up for it. I'm working on the GPU counterpart for node right now (GPU.js). So I'm seeing your C++ and raising you GPU. It will be soon. 99% of it works, but it needs multiple render targets to cover all our use cases, and it will have that in the coming days. |
No. |
Cool, it would be amazing if you could get this working at a very performant level. I'm excited to see it grow and maybe contribute once I actually understand it. My problem before was this:
Yup. Just set it up completely wrong. |
The whole idea in brain.js, and what we are proving, is that:
It is so easy to get caught up in cloning work and not focus on innovation. We are laying the groundwork to surpass the bigger libraries for innovation, because we won't have the tech debt. |
Any update on the GPU version? Also, why would a GPU shader version be better than rewriting the code in C++? |
The underlying engine (GPU.js) is in very active development, has been proven to work in node and will be released soon, which will then spark the completion of v2 of brain.js, including the lstm bits. |
I get really locked down with the current LSTM, mostly because of the gate size. It keeps repeating simple patterns like 'the cab the cab the', etc. Now I'm by no means an RNN genius, so... I found clipsize and outputsize in the rnn.defaults in the codebase. |
@Samuel-Clarke123 Can you post a script example to build context from? |
This is my current method for training my LSTM. Currently each file is approx 30 lines, about 45 chars long per line. So I guess my question about the LSTM is this: Thanks for your help 😄 |
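In case it helps others reading along, a generic version of that kind of setup could look like the sketch below. This is not the poster's actual code (which isn't shown above); the file names and preprocessing choices are assumptions, and it simply mirrors the array-of-strings training pattern used earlier in this thread.
// Generic sketch: read a few text files (~30 lines of ~45 characters each),
// split them into lines, and train on the resulting array of strings.
const fs = require('fs');
const brain = require('brain.js');

const files = ['story1.txt', 'story2.txt', 'story3.txt']; // hypothetical file names

const lines = files
  .flatMap(file => fs.readFileSync(file, 'utf8').split('\n'))
  .map(line => line.trim())
  .filter(line => line.length > 0);

const lstm = new brain.recurrent.LSTM();
lstm.train(lines, { iterations: 1000 });

// Prompt with the start of a known line and let the net continue it.
console.log(lstm.run(lines[0].slice(0, 10)));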
What is your tensorflow code to get this working? |
A bunch of long Keras and Python stuff; the relevant model definition:
with the sequences being manually controlled by me (i.e., X.shape[1]) |
@robertleeplummerjr any development here on @Samuel-Clarke123 issue/question? Thanks! |
Ty for the prodding! TL;DR TS;DR Further reading on why this has taken so long: Javascript for machine learning? Get a real language. If you've followed this thread, you must give that story a read. Also, please tweet it if you feel inclined.
In addition to fixing and finalizing all that came with getting GPU.js working on Node (almost a year of work): two weeks ago, when I was trying to perform code coverage on Brain.js, it would fail in a fireball of death. I had to find a way to jack into istanbul.js (the code coverage tool for JS) to bind code coverage in CPU mode, as well as a way to strip it out in GPU mode (under https://github.com/gpujs/gpu.js#gpu-settings, look at
It ended up turning, as you would imagine, into its own (simple) project called istanbul-spy, so others may use it if needed. It turns out the istanbul.js community met this with some praise.
So now code coverage works, now node support works, now we can get on with our lives, finish this backlog of work, and get v2 out the door! |
Thanks for the speedy reply @robertleeplummerjr AND all the work!! I'm really enjoying working on NN with JS. I see your notes on the Convolutional NN, amazing! Any update on that timeline? Sorry for being all 'gimmie, gimmie, gimmie'.. |
Following on from the other issue I created (#108), I'm trying to teach an LSTM network to write a simple children's book. I'm getting odd behavior but really don't know what I'm doing to begin with. I'd love to get this example working and added to the readme for others to follow, but I am hitting lots of little roadblocks. Here's my code:
The results in the console:
Some observations: run is a string. I was expecting an array. It obviously isn't working as I expected. Since I can't find a good example, I really have no idea what I'm doing wrong. I'm wondering if I'm training it wrong? I'm starting to think I should be giving it example sequences as input... like
{input: "Jane saw Spot", output: "."}
but I can't wrap my head around how to express that as valid input.