
The Anagrambler

After running a few more tests, we eventually convince ourselves that NN.cicada is working, so we open a new file in our Cicada directory and start thinking about how to put our networks to use.

The particular learning algorithm we are using is well suited to the task of pattern completion. To demonstrate, let’s build a network to unscramble anagrams. The inputs to this network will be the number of times each of the 26 letters appears in a word, encoded in the activity levels of 26 input neurons. The outputs will encode the ordering of those letters relative to alphabetical order, using n output neurons for a maximum word length of n. (For example, suppose that for the input ‘ortob’ the five output activities rank 3rd, 2nd, 4th, 1st and 5th from lowest to highest. Then the alphabetized characters ‘b-o-o-r-t’ take positions 3, 2, 4, 1 and 5 respectively in the unscrambled word, which spells ‘robot’.)
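
To make that bookkeeping concrete, here is a short stand-alone C sketch. It is not part of NN.cicada or anagrambler.cicada, and its ranks array is hard-coded where the real network’s outputs would go; it simply encodes ‘ortob’ into letter counts and decodes the ranking 3-2-4-1-5 back into ‘robot’.


    #include <stdio.h>
    #include <string.h>
    
    int main(void)
    {
        const char *scrambled = "ortob";
        int len = (int) strlen(scrambled);
        int counts[26] = { 0 };              /* the 26 'input activities' */
        int ranks[5] = { 3, 2, 4, 1, 5 };    /* stand-in for the ranked outputs */
        char sorted[8] = { 0 };              /* the letters, alphabetized */
        char unscrambled[8] = { 0 };
        int c, i, k = 0;
        
        /* encode:  input c is the number of times letter c appears */
        for (i = 0; i < len; i++)
            counts[scrambled[i] - 'a']++;
        
        /* alphabetize:  expand the counts back into a sorted string */
        for (c = 0; c < 26; c++)
            while (counts[c]-- > 0)
                sorted[k++] = (char) ('a' + c);
        
        /* decode:  the i-th alphabetized letter lands at position ranks[i] */
        for (i = 0; i < len; i++)
            unscrambled[ranks[i] - 1] = sorted[i];
        
        printf("%s -> %s\n", scrambled, unscrambled);    /* ortob -> robot */
        return 0;
    }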

anagrambler.cicada


    forEach :: {
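       | runs the code block passed after the ';' once for each element
       | of args[1], passing (element, index) as the code's own args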
       
       counter :: int
       
       code
       
       for counter in <1, top(args[1])> &
           args(args[1][counter], counter)
    }
   
   
    anagrambler :: neural_network : {
       
       setupNN :: {
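           | shared preamble for ask() and teach():  stores the word,
           | loads its letter counts into NN_in, and sizes NN_out to
           | one entry per character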
           
           ltr :: string
           params :: { step_size :: learning_rate :: double }
           
           
           code
           
           params = { .5, .1 }
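           
           | store the word and apply any optional arguments; trap()
           | turns a bad argument list into a printed error, not a crash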
           if trap(
               the_word = args[1]
               (params<<args)()
           ) /= passed then (
               printl("Error: optional params are step_size, learning_rate")
               return     )
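           
           | NN_in[c] = the number of times letter c of the alphabet
           | appears in the word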
           
           forEach(NN_in;
               ltr =! alph[args[2]]
               args[1] = find(the_word, ltr; mode = 0)   )
   
           NN_out[^size(the_word)]
           NN_out[].letter =! the_word
       }
       
       
       ask :: setupNN : {
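           | runs the word's letter counts through the network and
           | prints the letters reordered by their output activities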
           
           outputString :: string
           
           
           code
           
           process(NN_in, params.step_size)
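           
           | alphabetize the letters, then read each one's predicted
           | position in the word from the output activities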
           
           sort(NN_out, 2)
           
           NN_out[^numOutputs]
           NN_out[].order = activity[<numInputs+2, numInputs+numOutputs+1>]
           if size(the_word) < numOutputs then NN_out[^size(the_word)]
           
           sort(NN_out, 1)
           outputString =! NN_out[].letter
           
           print(outputString)
       }
       
       
       teach :: setupNN : {
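           | trains the network on a correctly-spelled word for args[2]
           | training cycles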
           
           c1 :: int
           
           
           code
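           
           | target ordering:  the i-th letter of the word gets rank
           | i/size(the_word)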
           
           forEach(NN_out; args[1].order = args[2]/size(the_word))
           sort(NN_out, 2)
           
           NN_out[^numOutputs]
           
           for c1 in <1, args[2]> &
               process(NN_in, NN_out[].order, params.step_size, params.learning_rate)
       }
    }
   
   
    | shared workspace:  the word being processed, its letter-count
    | inputs (NN_in), and the ranked output letters (NN_out)
    the_word :: string
    NN_in :: [26] double
    NN_out :: [anagrambler.numOutputs] { order :: double, letter :: char }
   
    alph := "abcdefghijklmnopqrstuvwxyz"
   

At last, we’re ready to build a digital brain and put it to the task of unscrambling anagrams. We run Cicada, then load each of the two .cicada source files.


    > run("NN")
   
   
    > run("anagrambler")
   

Next we specify how big a brain we need. Let’s decide to work with words of 6 or fewer characters (so, 6 output neurons), and of course we expect a 26-character alphabet (lowercase only, please).


    > anagrambler.init(26, 6, 0)
   

With the custom brain built and ready, we can try


    > anagrambler.ask("lleoh")
   
    ehllo
   

Hardly a surprise; we haven’t taught it its first word yet, so the untrained network simply hands the letters back in alphabetical order.


    > anagrambler.teach("hello", 10)   | 10 = # training cycles
   
   
    > anagrambler.ask("lleoh")
   
    hello
   

Voilà---it successfully reconstructed a pattern.

A little bit of playing around shows that our little network can actually learn a few words at a time. Next we might rebuild it to learn longer words, or see whether adding more hidden neurons increases its vocabulary (the experimental cycles are very short at a command prompt). There are also the training parameters: is 10 rounds of training on each word too many? Will our network learn faster if we increase the learning rate, or will it become unstable? It’s simple to test.


    > anagrambler.teach("hello", 5; learning_rate = that*2)
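   

Rebuilding the network is just as quick. A guess, to be checked against NN.cicada: if init()’s third argument sets the hidden-neuron count (it was 0 in our original call), then a larger brain for words of up to 8 letters, with 10 hidden neurons, might look like


    > anagrambler.init(26, 8, 10)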
   

One final detail: it’s annoying to have to type run("NN"), run("anagrambler") every time we start our program. Going back to NN.c, let’s put those commands into a startup script and have runCicada() run it before presenting the command prompt. NN.c’s main() function now reads:


    int main(int argc, char **argv)
    {
       const char *startupScript = "run(\"NN\"), run(\"anagrambler\")";
       return runCicada(fs, startupScript, true);
    }
   

Now our compiled application is basically a version of Cicada that has a C-coded Anagrambler engine built into the language.



