Thread Details

Author Topic:   What got into Hoyle?
Fosdick 
Suspended Member (Idle past 5616 days)
Posts: 1793
From: Upper Slobovia
Joined: 12-11-2006


Message 8 of 38 (397945)
04-28-2007 12:25 PM
Reply to: Message 7 by Rob
04-28-2007 9:34 AM


Information in communication networks
Percy:
Anyone come across any good explanations?
Rob:
Would that explanation contain information?
Rob, on another thread (see Message 17) Percy was explaining how binary, digital information works. I don't have authority to post on that thread, so I am posting here an alternative explanation of information as it is seen by Shannon and those interested in the structure ("infostructure") of communication networks (or systems). Here’s another way to look at information theory, with the emphasis on network communication.
Let’s say you have a four-party telephone network comprising users John, Sue, Frank, and Carol. There are 16 possible directed communication lines in this network (4 × 4) if each party can call himself or herself, and 12 lines if parties cannot call themselves. Let’s assume the latter. Assume also that each member calls with the same frequency (or probability) as every other. Now suppose that each time any one party initiates a call, it is directed with equal probability to each of the other parties: when John makes a call, according to your sampling, he is just as likely to call Sue as Frank as Carol. And let’s assume the same likelihood of communication applies to every other member of this network.
Given this scenario, you could not predict anything about this communication network except that any call issued within it has the same probability of reaching any given receiver. You could then say, in accordance with information/communication theory, that you have zero information about the system’s structure. Shannon likened this to Boltzmann’s concept of entropy; in this case the network is at maximum entropy.
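This isn't in the original post, but the maximum-entropy case can be checked numerically with Shannon's formula H = −Σ p·log₂(p). With all 12 lines equally likely, the entropy is log₂(12) bits, the highest possible for this network:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits, skipping zero-probability events."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# 12 equally likely directed lines (4 callers x 3 possible receivers each)
lines = 12
p_uniform = [1 / lines] * lines

h_max = shannon_entropy(p_uniform)
print(round(h_max, 3))  # log2(12), about 3.585 bits -- maximum entropy
```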
Now, suppose we have learned that not every call in the communication network amounts to a random event: say, John calls Sue more often than Frank, and Sue calls Carol more often than John. Your sampling also shows that Sue calls more frequently than the others, and that Frank rarely calls at all. Now we can say, stochastically, that this network yields more information about its structure. That information accrues as the network’s entropy is reduced. What you would be estimating, according to many interpretations, is the system’s “average mutual information.”
Now, suppose this communication system is extremely different, as measured by your sampling. Each time a call is issued it always comes from Carol, and it always goes to Sue. Your sampling shows no other calling activity. Therefore, you can predict with a very high degree of certainty how this network will behave in any calling event. You could say, in accordance with information/communication theory, that this network has reached its maximum information content, and that it contains almost no entropy.
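Again as a numerical check on the scenario (not from the original post): if every sampled call is Carol to Sue, one line carries all the probability and the entropy formula gives exactly zero bits:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits, skipping zero-probability events."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Every observed call is Carol -> Sue: one of the 12 lines carries
# probability 1, the other 11 carry probability 0.
deterministic = [1.0] + [0.0] * 11
h_det = shannon_entropy(deterministic)
print(h_det == 0.0)  # True: zero entropy, the network is fully predictable
```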
Hope it helps.
–HM

This message is a reply to:
 Message 7 by Rob, posted 04-28-2007 9:34 AM Rob has not replied

Replies to this message:
 Message 9 by AdminQuetzal, posted 04-28-2007 12:43 PM Fosdick has not replied

Copyright 2001-2023 by EvC Forum, All Rights Reserved
