Surprisingly little.
The bulk of your computer's processing is spent running stuff it doesn't really need to run... just stuff you want it to be running so that it's ready for your day-to-day tasks at any time.
Strip all that away, design and program it specifically for the task at hand and nothing more, and things get much faster. A lot faster.
The typical 40-robot, 300-motion side-rail example I mentioned above would run off a program less than 1 MB in file size.
The maximum storage of the entire industrial computer might only be in the 2-to-4 MB range.
It's not a very "powerful" computer; it's simply one designed specifically and efficiently for the tasks it's programmed for and nothing else.
Thanks for the feedback.
Been thinking about this a bit - two thoughts come to mind.
First, any consumer smart car would be full of bloatware, since we would also expect it to surf the internet, play music, and so on. But this could be avoided by having two separate systems: one for the user interface and one to actually drive the car.
However, my second question is whether industrial computers are ever required to do tasks of this complexity. You talked about your experience with factory robots, but these operate in very controlled conditions: they have to react to things moving quickly within a fairly narrowly defined set of parameters (the production line), plus presumably there is some mechanism to spot when something goes beyond those parameters (such as things falling off the conveyor belt).
It strikes me that navigating a city while reacting to the natural flow of the world around you is a computational task several orders of magnitude more intensive. Or am I overestimating this, or underestimating what industrial robots do?
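To make the "narrowly defined set of parameters" idea concrete, here is a minimal sketch of the kind of envelope check an industrial controller might run each cycle. All the names and limits below are hypothetical and purely illustrative, not taken from any real controller:

```python
# Hypothetical operating envelope for one station: name -> (min, max).
# Values are illustrative only.
ENVELOPE = {
    "belt_speed_mps": (0.4, 0.6),      # conveyor speed, metres/second
    "part_position_mm": (0.0, 150.0),  # lateral offset of part on the belt
    "gripper_force_n": (5.0, 40.0),    # grip force, newtons
}

def check_envelope(sensors: dict) -> list:
    """Return the names of any readings outside their allowed band
    (or missing entirely), so the controller can fault and stop."""
    faults = []
    for name, (lo, hi) in ENVELOPE.items():
        value = sensors.get(name)
        if value is None or not (lo <= value <= hi):
            faults.append(name)
    return faults

# A reading where the part has drifted off the belt trips a fault:
reading = {"belt_speed_mps": 0.5, "part_position_mm": 180.0,
           "gripper_force_n": 20.0}
print(check_envelope(reading))  # ['part_position_mm']
```

Everything outside those few bands is simply "stop and raise an alarm", which is why such a program can stay tiny: the controller never has to interpret the open world, only detect that it has left its own narrow one.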