As Shannon put it in "A Mathematical Theory of Communication" (1948): "The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have meaning; that is they refer to or are correlated according to some system with certain physical or conceptual entities. These semantic aspects of communication are irrelevant to the engineering problem."
Shannon gave information theory a solid mathematical foundation, in which the amount of information, measured in bits, is the log2 of the number of equally likely states. For example, if you wish to encode one of 16 different colors, you need log2(16) = 4 bits.
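To make that concrete, here is a minimal Python sketch (the function name is mine, chosen for illustration):

```python
import math

def bits_needed(num_states: int) -> float:
    """Bits of information needed to single out one of
    num_states equally likely states: log2(num_states)."""
    return math.log2(num_states)

print(bits_needed(16))  # 4.0  -- one of 16 colors takes 4 bits
print(bits_needed(26))  # ~4.7 -- one letter, if all 26 were equally likely
```

Note that the result need not be a whole number; a single fixed-length code has to round up to the next whole bit, though encoding many symbols at once can approach the fractional value.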
If you wish to know more about the technical side, you could read Shannon's paper itself, the Wikipedia article on information theory, or any of the many other websites that cover it. We can get into actual information theory at any level of detail you like.
In information theory everything is a source of information. For example, there is no fundamental difference between the information a supernova gives off as light (its luminosity) and an astronomer's record of his observations of that supernova in his notebook. Information is merely being translated from one form (intensity of starlight) to another (observational notations). The information about the supernova was created not by the astronomer but by the supernova itself. If you doubt that the information comes from the supernova, ask yourself how the astronomer could take accurate notes without observing it. If he can't see the supernova, he can't transcribe the information from the supernova into his notebook. The astronomer is not the originator of the information.
Discussions of information theory often bog down in a refusal to accept that, within the theory, information has a very formal definition. In everyday use, words like information, meaning, and knowledge are used almost interchangeably, but in information theory, information has a clear and precise mathematical definition.
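That formal definition is Shannon entropy, H(X) = -Σ p(x) log2 p(x), the average information per symbol from a source. A small Python sketch (the helper name is mine) shows how it behaves:

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits.
    This is the formal measure of information per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit per flip...
print(entropy([0.5, 0.5]))  # 1.0
# ...while a biased coin carries less, because its outcome
# is more predictable. Meaning never enters the calculation.
print(entropy([0.9, 0.1]))  # ~0.47
```

Notice that nothing in the formula refers to meaning or knowledge; only the probabilities of the symbols matter.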
To avoid confusion with the actual definition of information provided by information theory, Gitt should instead claim that only intelligence can create knowledge or meaning, but even then he'd be on shaky ground. I can't see anyone successfully arguing that, for example, laboratory mice don't gradually acquire knowledge about a maze.