It seems that an assertion you're making, Smooth, is that evolutionary algorithms can't produce new complex specified information, because they are programmed in advance by humans, even their 'random' components...
But it's not random. It's programmed. Whatever the time is, and whatever number the generator gets, the result is always the same. It's much more complex than it would be without the random number generator, but it's still the same. Just because it's harder to visualise, and it seems like the robot is acting randomly, that doesn't mean it is, because we know it is being led by its programming.
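To make the determinism point concrete: a pseudorandom generator with a fixed seed produces exactly the same "random" sequence every run. This is just an illustration (the function name and the use of Python's random module are my own choices, not anything from the programs under discussion):

```python
import random

def mutation_sequence(seed, n=5):
    """Return n 'random' digits from a generator with a fixed seed."""
    rng = random.Random(seed)  # seeded generator: fully deterministic
    return [rng.randint(0, 9) for _ in range(n)]

# With the same seed, the 'random' choices are identical every time,
# so the behaviour is fixed in advance even though it looks random.
print(mutation_sequence(42) == mutation_sequence(42))
```

The sequence only looks unpredictable; given the seed, every output is already determined, which is exactly the sense in which the quoted argument says the robot's "random" behaviour is programmed.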
... and therefore the information is already implicit in the algorithm.
Now, I disagree with what you say here, but I'll put that on hold until I understand your position better as it may not be important.
My question: Does this limitation on evolutionary algorithms, in your view, apply to algorithms more generally? i.e. can any algorithm produce new complex specified information? If some can, which ones can and which ones can't? How do we tell the two kinds apart?
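For concreteness, here is a minimal sketch of the kind of evolutionary algorithm at issue: a weasel-style hill climber (in the spirit of Dawkins' well-known example, not anyone's actual program here). The target string, alphabet, and acceptance rule are my own illustrative choices. Note that, per the quoted argument, every "random" step is fixed by the seed:

```python
import random

TARGET = "METHINKS IT IS LIKE A WEASEL"
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "

def score(s):
    # number of positions that match the target string
    return sum(a == b for a, b in zip(s, TARGET))

def evolve(seed=0):
    rng = random.Random(seed)  # seeded, so the whole run is deterministic
    current = "".join(rng.choice(ALPHABET) for _ in range(len(TARGET)))
    generations = 0
    while current != TARGET:
        # mutate one randomly chosen position
        i = rng.randrange(len(TARGET))
        child = current[:i] + rng.choice(ALPHABET) + current[i + 1:]
        # keep the child only if it matches the target at least as well
        if score(child) >= score(current):
            current = child
        generations += 1
    return current, generations
```

Running evolve() reaches the target after some thousands of mutate-and-select steps; the philosophical question is whether the information in the final string was "produced" by the algorithm or was already implicit in the fitness function and the seed.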
If no algorithms can generate CSI, then it would imply that 'complex specified information' is in technical terms non-computable. This would have interesting implications.