
Neural Network that learns to convert binary to decimal


  • Neural Network that learns to convert binary to decimal

    This code simulates a learning Neural Network. To understand what Neural Networks can be used for, think of it like this:
    1. You need a program that converts binary numbers (e.g. "0101") to decimal numbers (e.g. "5").
    2. You are too lazy to work out an appropriate algorithm, or the algorithm is too complex (as, for example, in face recognition).

    So what you do is set up a Neural Network. See it as a "self-configuring algorithm".
    Set up properly, it will find the right "algorithm" to solve your problem by itself. This is done during the so-called "backward pass".
    In this pass, the net changes itself in a direction that ends up finding the solution you need.

    Here it is the "binary to decimal" conversion, which is just an example.
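
    To make the idea concrete, here is a tiny stand-alone sketch (toy numbers of my own, not part of the class below) of the rule the backward pass applies to a single weight: error times sigmoid derivative times learning rate.
    Code:
    #COMPILE EXE

    FUNCTION PBMAIN () AS LONG
      ' Toy example: one input, one weight, one sigmoid neuron
      LOCAL W,XIn,YOut,T,D,LR AS SINGLE
      W=0.2 : XIn=1 : T=1 : LR=0.03
      YOut = 1/(1+EXP(-(W*XIn)))       ' forward pass: sigmoid of the weighted input
      D    = (T-YOut)*YOut*(1-YOut)    ' delta = error * derivative of the sigmoid
      W    = W + LR*D*XIn              ' the weight is nudged toward the target
      MSGBOX "New weight:"+STR$(W)
    END FUNCTION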


    It has 4 input neurons, 8 neurons in the hidden layer and 8 output neurons.

    0 I ---- H ---- O1
    1 I ---- H ---- O2
    0 I ---- H ---- O3
    1 I ---- H ---- O4

    The input neurons are given a binary number from 1 to 8 (for example, the input neurons are set to 0011 for "3").
    Then the output neuron O3 must have the highest value of all output neurons.
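
    As a quick illustration of that encoding, here is a stand-alone sketch (the array names Inp and Tgt are mine; the class itself uses Inputs() and Target()):
    Code:
    #COMPILE EXE

    FUNCTION PBMAIN () AS LONG
      ' Encode the number 3 the same way SetInputsAndTargets() does:
      ' 4 input bits from BIN$ plus a one-hot target over the 8 outputs
      LOCAL i AS LONG
      LOCAL S AS STRING
      DIM Inp(1 TO 4) AS SINGLE
      DIM Tgt(1 TO 8) AS SINGLE

      S=BIN$(3,4)                    ' "0011"
      FOR i=1 TO 4
        Inp(i)=VAL(MID$(S,i,1))      ' the inputs become 0,0,1,1
      NEXT
      Tgt(3)=1                       ' only output neuron O3 should end up highest

      MSGBOX "Inputs for 3: "+S
    END FUNCTION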

    When the net starts, the result values are rather random. After some learning, the net recognizes all binary numbers from 1 to 8.

    Note that this version does not produce any visible output. You need to add your own "Debug Print" to X_AU() to see something.
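
    One possible way to fill X_AU() is to append every message to a text file (just a sketch; the file name debug.txt is my own choice):
    Code:
    ' Example debug output for X_AU: append each message to a log file
    SUB X_AU(BYVAL P1 AS STRING)
      LOCAL ff AS LONG
      ff=FREEFILE
      OPEN "debug.txt" FOR APPEND AS #ff
      PRINT #ff, P1
      CLOSE #ff
    END SUB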

    This net is very flexible. You can change the number of neurons by just changing this line:

    Code:
      ACnt=4:BCnt=8:OCnt=8
    So, for example, you can change the number of neurons in the hidden layer and see how the net changes its behavior.
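
    For example, a run with a wider hidden layer could look like this (only BCnt is freely tunable here; ACnt is tied to the 4 input bits used in SetInputsAndTargets(), and OCnt also determines how many numbers, 1 to OCnt, are trained):
    Code:
      ACnt=4:BCnt=12:OCnt=8   ' 12 hidden neurons instead of 8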

    PS: Done faster with JK-Edit.
    Code:
    '***********************************************************************************************
    '
    '
    '***********************************************************************************************
    
    #COMPILE EXE
    
    '#INCLUDE ONCE "private\G_MainLib.inc"
    
    MACRO DebS(P1)=STR$(INT(P1*100000)/100000)
    MACRO INST = INSTANCE
    MACRO G_REG()
     REGISTER R01 AS LONG,R02 AS LONG
    END MACRO
    MACRO G_S01()
     LOCAL S01 AS STRING
    END MACRO
    MACRO G_S02()
     LOCAL S01,S02 AS STRING
    END MACRO
    
    MACRO G_S03()
     LOCAL S01,S02,S03 AS STRING
    END MACRO
    
    ' Your Debug Print comes in here
    SUB X_AU(BYVAL P1 AS STRING)
      ' Print String P1
    END SUB
    
    '***********************************************************************************************
    GLOBAL NA AS NeoNet_Interface
    
    FUNCTION PBMAIN () AS LONG
    
    NA=CLASS "NeoNet"
    NA.TrainNet()
    
    END FUNCTION
    
    
    '***********************************************************************************************
    
    '***********************************************************************************************
    CLASS NeoNet
    INST ACnt,BCnt,OCnt AS DWORD
    INST LearnSteps AS LONG
    INST LearnRate AS SINGLE
    INST Inputs(), Outputs() AS SINGLE
    INST Epsilon AS SINGLE
    INST WeightsInput(),WeightsHidden() AS SINGLE
    INST BiasHidden(),BiasOut(),HIDDEN() AS SINGLE
    INST DeltaHidden(),DeltaOut(),Target() AS SINGLE
    INST Net() AS BYTE
    INST Total_Error AS SINGLE
    ' Index of the smallest error (EMIN) and of the largest output (LAON)
    INST LAON,EMIN AS LONG
    ' Target output (the number the net is expected to recognize)
    INST Soll AS LONG
    '-------------------------------
    
    CLASS METHOD CREATE()
    
    END METHOD
    '-------------------------------
    
    '-------------------------------
    INTERFACE NeoNet_Interface
    INHERIT IUNKNOWN
    
    '-------------------------------
    
    '-------------------------------
    METHOD TrainNet()
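      ' Train for LearnSteps passes; each pass presents every pattern (1..OCnt) once
      ' and updates the weights right after each pattern (online learning)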
      G_REG
      G_S01
        ME.Init_Mem()
        ME.InitWeights()
    FOR R02=1 TO LearnSteps
        FOR R01=1 TO OCnt
          ME.SetInputsAndTargets(R01)
          ME.FeedForward()
          ME.FeedBackward()
          ME.DrawNet(R02)
        NEXT
    NEXT
    END METHOD
    '-------------------------------
    PROPERTY GET Inputs(BYVAL A1 AS LONG) AS SINGLE
    PROPERTY=Inputs(A1)
    END PROPERTY
    
    PROPERTY SET Inputs(BYVAL A1 AS LONG,BYREF A2 AS SINGLE)
    Inputs(A1)=A2
    END PROPERTY
    '-------------------------------
    END INTERFACE
    '-------------------------------
     CLASS METHOD Init_Mem()
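      ' Read the configuration (layer sizes, learn rate) and size all arrays accordingly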
      ME.InitVars()
      DIM Inputs(ACnt)
      DIM Outputs(OCnt),Target(OCnt),BiasOut(OCnt)
      DIM WeightsInput(BCnt,ACnt),WeightsHidden(OCnt,BCnt)
      DIM BiasHidden(BCnt),DeltaHidden(BCnt),HIDDEN(BCnt)
      DIM DeltaOut(OCnt)
    END METHOD
    '-------------------------------
    ' A1 - pass number
    CLASS METHOD DrawNet(BYVAL A1 AS DWORD)
    REGISTER R01 AS DWORD,R02 AS LONG
    G_S02
    X_AU "Pass-Nr."+STR$(A1)+" Total Error: "+TRIM$(Total_Error)
    S01="":S02=""
    FOR R01=0 TO ACnt
      S01+="I("+TRIM$(R01)+")="+TRIM$(Inputs(R01))+" -- "
    NEXT r01
    X_AU S01
    
    S01=" Soll="+TRIM$(Soll)+"   Ist="+TRIM$(LAON)+"  with: "+TRIM$(Outputs(LAON))
    X_AU S01
    X_AU "-------------------------------------------------------------------"
    'Sleep 10
    END METHOD
    '-------------------------------
    CLASS METHOD InitVars()
     G_REG
      ACnt=4:BCnt=8:OCnt=8
      LearnSteps=130000
      LearnRate=0.03
      Epsilon=LearnRate
    END METHOD
    '-------------------------------
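    ' Sigmoid activation 1/(1+e^-x): squashes any input into the range (0,1)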
    CLASS METHOD Activation(BYREF A1 AS SINGLE) AS SINGLE
     LOCAL S1 AS SINGLE
     S1=1/(1+EXP(-A1))
     METHOD=S1
    END METHOD
    '-------------------------------
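    ' Abl (German "Ableitung", derivative): the sigmoid's derivative expressed
    ' through its own output value s, i.e. s*(1-s)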
    CLASS METHOD Abl(BYREF P1 AS SINGLE) AS SINGLE
     LOCAL S1 AS SINGLE
     S1=P1*(1-P1)
      METHOD=S1
    END METHOD
    '-------------------------------
    CLASS METHOD InitWeights()
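      ' Start with random weights and biases in the range -1 to +1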
      REGISTER i AS LONG
      REGISTER j AS LONG
      FOR i=0 TO OCnt
        FOR j=0 TO BCnt
          WeightsHidden(i,j)=RND()*2 - 1
        NEXT
        BiasOut(i)=RND()*2 - 1
      NEXT
    
      FOR i=0 TO BCnt
        FOR j=0 TO ACnt
          WeightsInput(i,j)=RND()*2 - 1
        NEXT
        BiasHidden(i) =RND()*2 - 1
      NEXT
    END METHOD
    '-------------------------------
    CLASS METHOD SetInputsAndTargets(BYVAL A1 AS LONG)
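      ' Load the 4-bit binary pattern of A1 into the inputs and set a one-hot
      ' target: only Target(A1) is 1, all other outputs should stay low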
      G_REG
      G_S01
      S01=BIN$(A1,4)
      FOR R01=1 TO 4
        Inputs(R01)=VAL(MID$(S01,R01,1))
      NEXT r01
    
      FOR R01=1 TO OCnt
        Target(R01) = 0
      NEXT
     Target(A1)=1
     Soll=A1
    END METHOD
    
    '-------------------------------
    CLASS METHOD FeedForward()
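      ' Forward pass: every hidden neuron sums its weighted inputs plus bias and applies
      ' the sigmoid; the output layer then does the same with the hidden values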
      REGISTER i AS LONG
      REGISTER j AS LONG
      LOCAL aH,aI AS SINGLE
    
      FOR i = 0 TO BCnt
        aI=0
        FOR j=0 TO ACnt
          aI+=Inputs(j) * WeightsInput (i,j)
        NEXT
        HIDDEN(i)  = ME.Activation(aI+BiasHidden(i))
      NEXT
    
      FOR i = 0 TO OCnt
        aH=0
        FOR j=0 TO BCnt
          aH+=HIDDEN(j) * WeightsHidden(i,j)
        NEXT
        Outputs(i) = ME.Activation(aH+BiasOut(i)   )
      NEXT
    
    END METHOD
    
    '-------------------------------
    CLASS METHOD FeedBackward()
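      ' Backward pass: compute the output deltas from (Target - Output), update the
      ' hidden->output weights, back-propagate the deltas to the hidden layer and
      ' finally update the input->hidden weights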
      REGISTER i AS LONG
      REGISTER j AS LONG
      LOCAL S1,S2 AS SINGLE
      ' Minimum error and largest output
      LOCAL EMI,LAO AS SINGLE
      EMI=1:LAO=-1:EMIN=-1:LAON=-1
      FOR i=0 TO OCnt
        S1=Target(i)-Outputs(i)
        S2+=ABS(S1)
        DeltaOut(i) = S1 * ME.Abl(Outputs(i))
        IF S1<EMI THEN EMI=S1:EMIN=i
        IF Outputs(i)>LAO THEN LAO=Outputs(i):LAON=i
      NEXT
        Total_Error=S2
    
      FOR i=0 TO OCnt
        FOR j=0 TO BCnt
          WeightsHidden(i,j)+=(Epsilon*DeltaOut(i)*HIDDEN(j))
        NEXT
        BiasOut(i)+=(Epsilon*DeltaOut(i))
      NEXT
    
      ' Back-propagate the output deltas to the hidden layer.
      ' WeightsHidden() is dimensioned (output,hidden), hence WeightsHidden(j,i) here.
      FOR i=0 TO BCnt
        DeltaHidden(i)=0
        FOR j=0 TO OCnt
          DeltaHidden(i)+=(DeltaOut(j)*WeightsHidden(j,i))
        NEXT
        DeltaHidden(i)*=ME.Abl(HIDDEN(i))
      NEXT
    
      FOR i=0 TO BCnt
        FOR j=0 TO ACnt
          WeightsInput(i,j)=WeightsInput(i,j) + (Epsilon * DeltaHidden(i) * Inputs(j))
        NEXT
        BiasHidden(i)+=(Epsilon * DeltaHidden(i))
      NEXT
    END METHOD
    END CLASS
    Last edited by Theo Gottwald; 3 Feb 2018, 02:33 AM.
    --Theo Gottwald
    ------------------------------------------------
    76706 Dettenheim * Germany * info@it-berater.org
    ------------------------------------------------
    Joses Forum * Theo's Link Site * IT-Berater.org

  • #2
    Hi Theo,

    I think many of us are interested in this, but the code is very hard for me to understand. If you find the time, it would be nice to see the code documented more extensively.

    Kind regards,
    Steven
    So here we are, this is the end.
    But all that dies, is born again.
    - From The Ashes (In This Moment)



    • #3
      The underlying concepts are pretty simple. But for non-mathematicians it may take you 15 months to get it running and understand it completely.
      Here is a good start:
      https://www.coursera.org/learn/neural-networks

      --Theo Gottwald
      ------------------------------------------------
      76706 Dettenheim * Germany * info@it-berater.org
      ------------------------------------------------
      Joses Forum * Theo's Link Site * IT-Berater.org



      • #4
        Originally posted by Theo Gottwald
        The underlying concepts are pretty simple.
        Maybe too simple

        I immediately thought of this thread when I came across this article:
        https://medium.com/the-spike/your-co...s-9034e42d34f2
        --
        CAMCo - Applications Development & ICT Consultancy (http://www.camcopng.com)
        PNG Domain Hosting (http://www.hostingpng.com)

