Well, here’s my actual thesis… November 11, 2008. Posted by olvidadorio in Machine Learning.
Tags: backpropagation, evolution, neural nets, neuro-symbolic integration, thesis
… as handed in on Oct. 27th: Subsymbolic String Representations Using Simple Recurrent Neural Nets (Bsc_Pickard.pdf)
- Actually, I’m not flying until Monday.
- Yes, one of the outcomes is that it’s rather difficult to memorize fairly large sequences using feedforward nets and backpropagation. Did I finally prove it? No. But I produced material to support that claim, I guess. And it’s also my strong intuition that regardless of the learning paradigm used, fairly small (i.e. non-huge) nets just run into big problems: they don’t generalize. Most probably this has to do with the fact that neural activation (with a non-linear activation function) can at most encode ordinal information, not scalar data, because of the non-linear distortion. One way or another this makes it hard to encode arbitrarily long sequences, since (and this I haven’t proven) all linearly-decodable encoding schemes that press arbitrarily long sequences into fixed-size vectors rely heavily on scaling.
So the problem is that ffNNs introduce non-linear distortion, which mixes up signals, but offer only linear means to decode with.
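To make that concrete, here’s a toy sketch of my own (not from the thesis): a classic scaling-based scheme packs an arbitrarily long bit sequence into one fixed-size scalar via positional fractions, and a linear read-out (doubling plus a threshold) recovers it exactly. Squash that scalar through a non-linear activation like tanh, though, and the same linear decoding falls apart, because the low-order bits live at tiny scales that the distortion scrambles.

```python
import math

def encode(bits):
    """Pack a bit sequence into one scalar: b1/2 + b2/4 + b3/8 + ..."""
    x = 0.0
    for i, b in enumerate(bits, start=1):
        x += b * 2.0 ** -i
    return x

def decode(x, n):
    """Recover n bits by repeated doubling: a linear read-out plus a
    threshold at each step. Works only if the scaling is undistorted."""
    bits = []
    for _ in range(n):
        x *= 2.0
        b = int(x >= 1.0)
        bits.append(b)
        x -= b
    return bits

bits = [1, 0, 1, 1, 0, 1]
x = encode(bits)
assert decode(x, len(bits)) == bits  # exact recovery relies on scaling

# Push the code through a non-linear activation, as a neural unit would,
# then try the very same linear decoding scheme:
y = math.tanh(x)
print(decode(y, len(bits)))  # no longer matches: later bits are garbled
```

It’s only an analogy, of course; trained nets don’t literally use base-2 fractions. But it shows why “linear decoder + non-linear encoder” is a bad match for scaling-based sequence codes.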
- So about that “evolving metadata along with evolution” approach — what do they work on concretely? Like, for what kind of domain is that useful? I can see that being a fairly interesting area… I think I read something on that once. One of the issues is that one tends to end up solving the problem already within the evolutionary parameters. But no idea. Just keep giving me reasons to move my ass to Amsti! ;}