I have seen a few mapping projects out there (like the one that maps the Linux kernel) use a program to crawl over the "source" and output some metadata which is then fed into another program which generates an insane amount of PostScript. I could imagine something similar, but I would like to take a simpler approach than having to debug some insane PostScript output. Perhaps having an intermediate human-readable form of the data would be a good idea.

I could see nodes color-coded based on type (person, place, thing, idea, superdoc, user, etc.) and possibly given a scale (in boldness, maybe 5 shades of differentiation or something?) based on that node's highest rep.
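
A rough Python sketch of how that styling lookup might work; the color values, the five-level scale, and the 100-rep cap are all placeholder assumptions on my part, not real e2 numbers:

    # Sketch: map node type to a color and the node's highest rep onto one of
    # five "boldness" levels. Colors, level count, and rep cap are placeholders.

    NODE_COLORS = {
        "person":   "#cc3333",
        "place":    "#33aa33",
        "thing":    "#3333cc",
        "idea":     "#ccaa33",
        "superdoc": "#aa33cc",
        "user":     "#33aaaa",
    }

    def rep_shade(rep, levels=5, max_rep=100):
        # Clamp rep into [0, max_rep] and bucket it into `levels` shades.
        clamped = max(0, min(rep, max_rep))
        return min(levels - 1, clamped * levels // max_rep)

    def node_style(node):
        # `node` is assumed to be a dict like {"type": "idea", "rep": 42}.
        return {
            "color": NODE_COLORS.get(node["type"], "#999999"),
            "shade": rep_shade(node["rep"]),
        }

    print(node_style({"type": "idea", "rep": 42}))  # {'color': '#ccaa33', 'shade': 2}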

One of the hardest bits of this problem is that unlike the internet, e2 is not a hierarchy or even an n-to-n mesh; any two nodes can be linked. Graphically representing that without having some sort of detail threshold could make any representation of e2 get out of hand awfully quickly. So, it would be best to assign weights to links (soft and hard) per node that would demonstrate numerically how connected one node is to another. That would be shown graphically by modifying the thickness of the line.
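
Something like this Python sketch is what I mean by weighting; the 2:1 hard/soft ratio, the cutoff threshold, and the thickness range are arbitrary assumptions to be tuned later:

    # Sketch of per-link weighting: hard links count more than soft links,
    # weak links get filtered out, and weight maps onto line thickness.

    HARD_WEIGHT = 2.0   # a hard link counts double a soft-link hit (assumption)
    SOFT_WEIGHT = 1.0

    def link_weight(hard_links, soft_hits):
        # How connected node A is to node B, numerically.
        return HARD_WEIGHT * hard_links + SOFT_WEIGHT * soft_hits

    def keep_link(weight, threshold=3.0):
        # The detail threshold: drop links too weak to be worth drawing.
        return weight >= threshold

    def line_thickness(weight, min_pt=0.25, max_pt=4.0, max_weight=50.0):
        # Map a link weight onto a line width (in points) for the final output.
        frac = min(weight, max_weight) / max_weight
        return min_pt + frac * (max_pt - min_pt)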

It would probably be best to have a three-pass parser that would:

  1. Step one: Parse nodes into objects that contain node names and links.
  2. Step two: Further process this data:
    1. Assign weighting to the links.
    2. Filter links below a certain weight.
    3. Sort the nodes in geographic space, clustering nodes with the most similar links close to each other.
    4. Output some sort of easily parsable format (XML?); a rough sketch of this output step follows the list.
  3. Step three: Parse the intermediate format into PostScript or some other vector description.
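
A minimal Python sketch of the output side of step two, using the standard xml.etree.ElementTree module; the element and attribute names (graph, node, link, weight) are just a placeholder schema, not a settled format:

    # Sketch: dump the weighted, filtered, positioned graph as XML so step
    # three only has to read this file and emit PostScript.
    import xml.etree.ElementTree as ET

    def write_intermediate(nodes, links, path="e2-graph.xml"):
        root = ET.Element("graph")
        for n in nodes:
            ET.SubElement(root, "node", id=str(n["id"]), title=n["title"],
                          type=n["type"], x=str(n["x"]), y=str(n["y"]))
        for l in links:
            ET.SubElement(root, "link", src=str(l["src"]), dst=str(l["dst"]),
                          weight=str(l["weight"]))
        ET.ElementTree(root).write(path, encoding="utf-8", xml_declaration=True)

Keeping the intermediate form as plain XML also gets me the human-readable checkpoint I mentioned above, so any insane-output debugging stays in the final PostScript pass instead of the crawling code.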

I will post more after I will have thought about it some more and will have talked about it with some of my routing visualization buddies (future perfect tense, wow!).

Changelog:

  • 06/16/2001: Added blurb on how soft/hard links would be graphed.