Van Jacobson (Research Fellow at PARC, former Chief Scientist at Cisco Systems) gave a nice, high-level, visionary Google Tech Talk called "A New Way to Look at Networking".
He says that networking needs to move from focusing on a connection between two computers to focusing on movements of data. He refers to this new type of network as a dissemination-based network.
In this vision, the network is optimized to retrieve named chunks of data from any available resource nearby over any available communication channel (e.g. "Someone, send me X."). This is in contrast with current networks that emphasize setting up a connection between two specific machines over a specific communication channel (e.g. "Connect to machine A.B.C.D.").
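To make the contrast concrete, here is a toy sketch (my own illustration, not anything from the talk) of retrieval by name: a chunk is named by a hash of its content, any node holding a copy can answer a request, and the requester can verify the reply no matter who sent it. The `Node` class and `fetch_anywhere` function are hypothetical, invented just for this example.

```python
import hashlib

def chunk_name(data: bytes) -> str:
    """Name a chunk by the SHA-256 of its content."""
    return hashlib.sha256(data).hexdigest()

class Node:
    """A hypothetical node that may hold copies of named chunks."""
    def __init__(self):
        self.store = {}  # name -> chunk bytes

    def publish(self, data: bytes) -> str:
        name = chunk_name(data)
        self.store[name] = data
        return name

    def fetch(self, name: str):
        return self.store.get(name)

def fetch_anywhere(name: str, nodes):
    """'Someone, send me X': ask any reachable node, verify the reply.

    Because the name is derived from the content, a valid answer is
    valid regardless of which machine supplied it -- the opposite of
    'Connect to machine A.B.C.D and trust whatever it sends.'
    """
    for node in nodes:
        data = node.fetch(name)
        if data is not None and chunk_name(data) == name:
            return data
    return None

# Usage: publish on one node; retrieve through any node holding a replica.
a, b = Node(), Node()
name = a.publish(b"hello, dissemination network")
b.store = dict(a.store)  # a replica appears elsewhere in the mesh
print(fetch_anywhere(name, [b, a]) == b"hello, dissemination network")
```

The point of the sketch is only that naming data (rather than naming machines) makes location irrelevant and verification intrinsic; real systems in this space add routing, caching, and signatures on top.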
Van mentions BitTorrent and other examples as early but ad hoc progress toward this type of network structure. He criticized weaknesses in these early systems, saying, for example, that BitTorrent works well only for very large files.
Far be it from me to disagree with Van about anything related to networking, but I have to admit this is where I started to question the proposal.
It seems to me that it is exactly large data -- big, mostly unchanging video and audio files -- that is amenable to a data sharing infrastructure like the one Van is proposing. I would think that e-mails, dynamic web pages, and other types of rapidly changing, more personal data would get little benefit from the proposed dissemination-based network.
But, I may be thinking too small. This is not merely a way to share files.
I could imagine a world where every machine was part of a global mesh, data chunks encrypted and replicated across the cloud when the data is born, individual machines joining and dropping out on a millisecond basis, data migrating on demand and fading when no longer needed. This seems to be closer to the vision laid out in this talk.
It is a fun and worthwhile talk. If you watch it and have comments on it or what I said above, I would enjoy hearing your thoughts.