Well, here’s my actual thesis… November 11, 2008
Posted by olvidadorio in Machine Learning.
Tags: backpropagation, evolution, neural nets, neuro-symbolic integration, thesis
… as handed in on Oct. 27th: Subsymbolic String Representations Using Simple Recurrent Neural Nets (Bsc_Pickard.pdf)
- Actually, I’m not flying until Monday.
- Yes, one of the outcomes is that it’s rather difficult to memorize fairly large sequences using feedforward nets and backpropagation. Did I finally prove it? No, but I produced material to support that claim, I guess. It’s also my strong intuition that, regardless of the learning paradigm used, fairly small (i.e. non-huge) nets just run into big problems: they don’t generalize. Most probably this has to do with the fact that neural activation (with a non-linear activation function) can at most encode ordinal information, not scalar data, due to the non-linear distortion. One way or another this makes it hard to encode arbitrarily long sequences, as (and this I haven’t proven) all linearly-decodable encoding schemes that press arbitrarily long sequences into fixed-size vectors rely heavily on scaling.
So the problem is that ffNNs introduce non-linear distortion that mixes up the signals, but offer only linear means to decode with.
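To make that intuition concrete, here’s a toy illustration (my own, not from the thesis): a scaling-based scheme that packs an arbitrary-length symbol sequence into a single fixed-size value, base-k style. Decoding depends on the exact magnitudes, so a squashing non-linearity like tanh wrecks it. All names here are illustrative.

```python
import math

K = 4  # alphabet size; symbols are 0..K-1

def encode(seq):
    """Pack a symbol sequence into one number in [0, 1) by repeated scaling."""
    x = 0.0
    for s in reversed(seq):
        x = (x + s) / K   # each step shrinks earlier symbols by another factor of K
    return x

def decode(x, length):
    """Peel symbols back off by undoing the scaling, one factor of K at a time."""
    out = []
    for _ in range(length):
        x *= K
        s = int(x)        # the leading symbol sits in the integer part
        out.append(s)
        x -= s
    return out

seq = [3, 0, 2, 1, 3, 3, 0, 2]
code = encode(seq)
assert decode(code, len(seq)) == seq   # exact magnitudes preserved -> decodable

squashed = math.tanh(code * 2.0)       # non-linear distortion of the code
assert decode(squashed, len(seq)) != seq  # the scaling info the decoder needs is warped
```

The point being: the scheme works only because every intermediate magnitude survives exactly; push the code through a non-linearity and the linear peel-off decoder recovers garbage.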
- So about that “evolving metadata along with evolution” approach: what do they work on concretely? Like, for what kind of domain is that useful? I can see that being a fairly interesting area… I think I read something on that once. One of the issues is that one tends to end up solving the problem already within the evolutionary parameters. But no idea. Just keep giving me reasons to move my ass to Amsti! ;}
Work In Progress August 11, 2008
Posted by olvidadorio in Programming.
Tags: neural networks, obama, thesis, xmpp
I am still hacking away at my thesis. That’s why I haven’t posted in a while. I’d have loved to make some pithy comments about Obama’s evolving policy platform for president and how seamlessly he’s getting into the mood. Oh, and I’d have loved to talk about a few more issues: there were some further thoughts on Browser as Platform vs. other ways of doing that (remember, I had opted for IMs). It’s interesting that XMPP has been getting more and more attention (as well as discontent) as it becomes clear that some sort of push mechanism is really kinda necessary…
Most of the other topics are fermenting! My thesis, as I said, has taken the front seat. I still haven’t really started writing the text (that could worry me). I’ve set myself the end of September to finish, meaning I should be done and ready for review by mid-September — wrapping up the programming part is next, and that doesn’t leave a heck of a lot of time. Oh no. Since a straightforward way of learning the task hasn’t worked, I’m opting for an augmentative strategy: much like heuristics help you in search problems, I’ll help the learning algorithm along by providing a (hopefully) well-picked set of more easily learnable sub-problems and sub-networks that are merged and integrated step by step. Let’s hope it’ll fly.
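The augmentative idea above can be sketched roughly like this (a hypothetical illustration, not the thesis code — `SubNet`, `train`, and `merge` are names I made up): train small helpers on easy sub-problems first, then merge them into one bigger net that starts the full task with a head start instead of from scratch.

```python
import random

class SubNet:
    """Stand-in for a tiny linear network: one weight per input feature."""
    def __init__(self, n_inputs):
        self.w = [random.uniform(-0.1, 0.1) for _ in range(n_inputs)]

    def output(self, x):
        return sum(wi * xi for wi, xi in zip(self.w, x))

def train(net, samples, lr=0.05, epochs=200):
    """Plain delta-rule training on (input, target) pairs."""
    for _ in range(epochs):
        for x, t in samples:
            err = t - net.output(x)
            net.w = [wi + lr * err * xi for wi, xi in zip(net.w, x)]
    return net

def merge(nets):
    """Combine sub-nets by concatenating their weights: each sub-net stays
    responsible for its own slice of the larger input."""
    combined = SubNet(0)
    combined.w = [w for net in nets for w in net.w]
    return combined

random.seed(0)
# Two easily learnable sub-problems: each net learns to sum its 2 inputs.
sub_a = train(SubNet(2), [([1, 0], 1), ([0, 1], 1), ([1, 1], 2)])
sub_b = train(SubNet(2), [([2, 0], 2), ([0, 3], 3), ([1, 1], 2)])

# The merged net already handles the 4-input task its parts were trained on,
# and further training would fine-tune it on the full problem from there.
big = merge([sub_a, sub_b])
print(round(big.output([1, 1, 1, 1])))  # ~4 once the sub-nets have converged
```

The real thesis version obviously involves recurrent nets and a harder task; the sketch just shows the shape of the strategy — solve pieces, glue the pieces, keep training.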
Back to hack.