
Neural Network in pure TI-Basic

Started by _iPhoenix_, April 01, 2017, 01:04:29 PM



_iPhoenix_

I made probably the worst decision of my life. I am making a TI-Basic neural network.

Here's what's working:
-Perceptron/single neuron
-Input
-Weights

Here's what's not working:
-Outputs
-More than one neuron (the 'network' part of 'neural network')
-My brain
-TI-Basic

I have some code done, but it's really sloppy and uses a lot of subprograms to keep it organized. (I'm porting from Java, ok?!) In the end, I can just paste the code from the subprograms into the main program (dang, OOP is getting to me).

I will post progress updates here and potentially code snippets, but I'm going to take a small break from it. I spent way too much time getting to this point.

Also, this is definitely the first time anyone has done this, so I'm pretty proud of my messy code.

I will post code when I can, as it is in a very volatile state. I am currently working on solving some of the problems in the code and do not have a working version.

I expect to be done in a week or two, but I am not sure. [edit: not happening]

Additionally, it is surprisingly fast with one neuron/layer. From experience, though, I expect the time required to go up exponentially as layers are added.


Single Perceptron
Code/Download (version 1; I used a super cheaty, simple activation function to give me time to re-learn basic calculus.)
[dropbox download here]

Source (optimized, credits down below, may not work 100%):

prgmPERCEPT:
Ans->|LARGS

0.1->C
If 1=|LARGS(1
seq(2rand-1,I,1,|LARGS(2->|LWTS

If 2=|LARGS(1
Then
DelVar S
For(I,3,dim(|LARGS)-1
S+|LARGS(I)*|LWTS(I-2->S
End
1-2(S<0->theta
End

If 3=|LARGS(1
Then
|LARGS(2->D
dim(|LARGS)-2->dim(|LB
DelVar S
~1+2(5>2sum(seq(|LARGS(I)*|LWTS(I-2),I,3,dim(|LARGS)-1->S
D-Ans->E
For(I,1,dim(|LWTS
C*E*|LARGS(I+2->|LB(I
|LWTS(I)+C*E*|LARGS(I+2->|LWTS(I
End
{E,S,|LB(1),|LB(2
End

Last updated March 30, 2017
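
If you'd rather read the logic off-calc, here's a rough Python sketch of what prgmPERCEPT does (the function and variable names are mine, not from the download): mode 1 seeds random weights in [-1,1], mode 2 feeds the inputs through a +/-1 step activation, and mode 3 applies the classic perceptron rule, nudging each weight by C*error*input with C = 0.1.

perceptron.py (sketch):
import random

LEARNING_RATE = 0.1  # matches 0.1->C

def init_weights(n_inputs):
    # mode 1: random weights in [-1, 1], like seq(2rand-1,...
    return [2 * random.random() - 1 for _ in range(n_inputs)]

def activate(weights, inputs):
    # mode 2: weighted sum pushed through a +/-1 step, like 1-2(S<0
    s = sum(w * x for w, x in zip(weights, inputs))
    return 1 if s >= 0 else -1

def train_step(weights, inputs, desired):
    # mode 3: perceptron rule, each weight gets C*error*input added
    error = desired - activate(weights, inputs)
    new_weights = [w + LEARNING_RATE * error * x
                   for w, x in zip(weights, inputs)]
    return new_weights, error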
prgmTRAIN:
Menu("RESUME TRAINING?","YES",02,"NO",03
Lbl 03
3->dim(|LZYZYZ
Fill(0,|LZYZYZ
1->|LZYZYZ(1
ClrDraw
{1,2:prgmPERCEPT
Lbl 02

Input "NUMBER OF TRIALS: ",A
|LZYZYZ(2->I%
A->|N
Repeat not(|N
|N-1->|N
Text(0,1,|LZYZYZ(1
Text(12,1,|LZYZYZ(2
Text(24,1,|LZYZYZ(1)-|LZYZYZ(2
randInt(~100,100,2->L1
1-2(L1(1)>L1(2
augment({3,Ans},L1->|LZ
prgmPERCEPT
Ans->L1
If Ans(1
Pt-On(|LZ(3),|LZ(4),Black,3
|LZYZYZ(2)+not(Ans(1->|LZYZYZ(2
Pt-On(|LZ(3),|LZ(4),Black+L1(2),1
1+|LZYZYZ(1->|LZYZYZ(1
End

Last updated April 2, 2017
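
And for the curious, prgmTRAIN's loop boils down to something like this (a sketch building on the Python above, skipping the graph-screen drawing and the saved |LZYZYZ stats): scatter random points in [-100,100]^2 and teach the perceptron which side of the line y = x each one falls on.

train.py (sketch):
import random
# assumes init_weights and train_step from the sketch above

weights = init_weights(2)
hits = 0
trials = 1000
for t in range(trials):
    point = [random.randint(-100, 100), random.randint(-100, 100)]
    desired = 1 - 2 * (point[0] > point[1])  # matches 1-2(L1(1)>L1(2
    weights, error = train_step(weights, point, desired)
    hits += (error == 0)
print(hits, "of", trials, "classified correctly while training")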

And here's the README file (text, have fun putting this on your calc), for archival purposes (and for help):
Drag both onto CE/CSE (monochrome may work)

Launch prgmTRAIN

Do not change the name of prgmPERCEPT.

Bugs/Modifications are welcome.

If there is any code that doesn't run (I'm talking about stuff in prgmPERCEPT), it is there for later versions.

Thanks!

~iPhoenix


Credits for optimizations not included in download, but in source:
mr womp womp
PT_ or P_T (can't tell anymore!)
(potentially you?)


Here's a really good image on perceptrons I found; it explained a lot (to me, at least, a long time ago).



Here's a video about the speed (and I leaked some stuff)

Here's a great explanation (including pseudocode) of what a neural network is, and how to make one.

1-layer (3-neuron) neural network
"Change the {1 on the next line to {0 if you do not want to see the nerd stuff :P
{1->|LDEBUG
rand(6->|LN1
rand(3->|LN2
DelVar theta
{0,1,0,1->|LIN1
{0,0,1,1->|LIN2
{0,1,1,0->|LIN3
Repeat 0
theta+1->theta
Disp "---","Trial Num: "+toString(theta
1+remainder(theta-1,4->I
|LIN1(I->|LINPUT(1
|LIN2(I->|LINPUT(2
|LIN3(I->Z
Disp "Expected Output: "+toString(Z
For(A,1,2
Disp "Input "+toString(A)+": "+toString(|LINPUT(A
End
Disp "---",""
DelVar P
For(A,1,3
DelVar S
For(I,0,3,3
S+|LINPUT(I/3+1)*|LN1(A+I->S
End
S->L2(A
1/(1+e^(~S->|LRES(A
End
|LRES*|LN2->|LRES

sum(|LRES->A
1/(1+e^(~A))->A
For(N,1,3
If sum(|LDEBUG:Then
Disp "Synapse "+toString(N
Disp "Expected: "+toString(Z),"Result: "+toString(A),"Error: "+toString(Z-A
End
Z-A->E
nDeriv(1/(1+e^(~B)),B,A)*E->C
If sum(|LDEBUG
Disp "Change: "+toString(C
|LN2(N->L3(N
C+|LN2(N->|LN2(N

If sum(|LDEBUG:Then
Disp "old:"+toString(L3(N
Disp "new:"+toString(|LN2(N
Disp ""
End
End
If sum(|LDEBUG
Disp "","Input-Hidden:"
For(A,1,3
L2(A->O
C*L3(A)*nDeriv(1/(1+e^(~X)),X,O)->L4(A
If sum(|LDEBUG
Disp "Change "+toString(A)+": "+toString(Ans
End
For(A,1,3
|LINPUT(1->L5(A
|LINPUT(2->L5(A+3
End
3->dim(L4
augment(L4,L4->L4
L4*L5->L5
|LN1+L5->|LN1
If sum(|LDEBUG:Then
For(A,1,12
If A<6
Disp "old: "+toString(|LN1(A)-L5(A
If A=6
Disp ""
If A>6
Disp "new: "+toString(|LN1(A-6
End
End
Disp "",""
End

End

Last updated April 12, 2017
^^ The code looks weird here; just paste it into SC3 (SourceCoder 3).
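
To untangle what the listing above is attempting, here's a sketch of the same 2-3-1 sigmoid network in Python, trained on the XOR table. Two honest caveats: the names are mine, and I made two changes the calculator listing doesn't have. It uses the closed-form sigmoid derivative s*(1-s) instead of nDeriv (note the listing evaluates nDeriv at the post-activation value, which is one place a formula can go off), and it adds bias weights, without which XOR is very hard to fit. With those changes it should settle on the XOR table after a few thousand passes through the four cases.

xor_net.py (sketch):
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# 2 inputs -> 3 hidden sigmoid neurons -> 1 sigmoid output
hidden_w = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(3)]
hidden_b = [random.uniform(-1, 1) for _ in range(3)]
output_w = [random.uniform(-1, 1) for _ in range(3)]
output_b = random.uniform(-1, 1)
RATE = 2.0

cases = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]  # XOR

def forward(inputs):
    hidden = [sigmoid(sum(w * x for w, x in zip(ws, inputs)) + b)
              for ws, b in zip(hidden_w, hidden_b)]
    out = sigmoid(sum(w * h for w, h in zip(output_w, hidden)) + output_b)
    return hidden, out

for trial in range(20000):
    inputs, target = cases[trial % 4]
    hidden, out = forward(inputs)
    # backprop: sigmoid'(s) = s*(1-s) when s is the sigmoid's *output*
    out_delta = (target - out) * out * (1 - out)
    for i in range(3):
        h_delta = out_delta * output_w[i] * hidden[i] * (1 - hidden[i])
        output_w[i] += RATE * out_delta * hidden[i]
        hidden_b[i] += RATE * h_delta
        for j in range(2):
            hidden_w[i][j] += RATE * h_delta * inputs[j]
    output_b += RATE * out_delta

for inputs, target in cases:
    print(inputs, target, round(forward(inputs)[1], 3))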

I literally copied and pasted the code, so there may be a few bugs.
I am also not 100% sure the code works (it should), but I need a day or two to run ten thousand trials :P

I also will not (as of right now) be explaining what it writes on the home screen; it's a huge mess.

novenary

I don't feel like reading the whole post right now, but this sounds like a very interesting project. I wonder how much you'll be able to do with this, considering the speed of the platform. Good luck, and keep us posted!

_iPhoenix_

I am almost done, and have a code release happening tomorrow, or Monday at the latest.


It will include some really cool algorithms, even some that aren't implemented yet. I will also release my idea for integrating a set of neurons, so #hype!

[edit]
Here's a video about the speed (and I leaked some stuff)


_iPhoenix_

*bump* Code updated.
New features include:
-Speed optimizations
-New UI
-Saving of training data
-Easier-to-understand training results

I am unable to post my complete algorithms right now; doing that Monday :P

_iPhoenix_

*double bump, sorry, but this is important!*

I did it. I made it a neural network!

Although it only has one layer, it should be able to solve the XOR problem easily (in a day or two of training). I recommend running the code in CEmu with the throttle set at 500%, and perhaps removing some of the Disp tokens to speed it up.

Have fun!


Dream of Omnimaga

I have no clue what a neural network is, but have you managed to get good speed even in BASIC?

_iPhoenix_

Erm... Uhh... "The speed of light sucks." - John Carmack.

My rendition: "The speed of my program sucks." - _iPhoenix_

I thought it wouldn't take a month. I have been running CEmu for about that long, and little progress has been made. I think I messed up somewhere, so I have decided to re-write it from scratch. That'll take a while... It's happening, though.

Dream of Omnimaga

That's a lot O.O. Was that at CEmu's 1x speed or did you speed it up?

_iPhoenix_

Maximum.

I may have found the incorrect line of code (yeah, it is WAYYY over my head).
Unfortunately, I cannot change it today because I have state testing tomorrow.

I was told the tests were going to be hard, but I seem to be rather good at finishing things like that in <10 min.
And I NEVER guess.

Dream of Omnimaga

Is the speed issue caused by the error, or did you just push BASIC to its limits? I am curious how fast it would be on an HP Prime.

_iPhoenix_

No, I just got a few formulae off.
And the code is a mess.
