Visualizing evolutionary history with WebVR


To try out this demo, click the settings button (the one on the right with the gear symbol) and select some data to draw. If you have a compatible device, you can display this visualization in virtual reality by scrolling down (or clicking the button on the right with the eye icon) and clicking the grey goggles button. In a standard web browser, this button will open a full-screen version of the data visualization. However, this view has a more limited range of interaction options, so I recommend just clicking the eye button to center the visualization in your web browser. Have fun! If you have questions or just want to talk about data visualization in VR, feel free to contact me!


FAQ

What is this showing?

This will all make a lot more sense if you read one of the associated papers (pre-prints forthcoming), but the basic idea is that this visualization lets us quickly get a sense of the aggregate effect of the long sequence of small events that make up the process of evolution. Each path that you can draw on the landscape represents the entire chain of ancestry for a single evolved individual. Each individual is an (x, y) coordinate pair, and the populations are evolving on 2-input real-valued functions (represented by the curved surface). Fitness is evaluated by plugging an individual's x and y coordinates into the function. Individuals are then plotted on the surface, with fitness as the z coordinate.
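To make that mapping concrete, here's a minimal TypeScript sketch. The fitness function below is invented purely for illustration (the real landscapes come from the GECCO Niching Competition), and the types are assumptions rather than the actual data format:

```typescript
// A hypothetical 2-input real-valued fitness function (illustrative only;
// the actual landscapes are from the GECCO Niching Competition).
function fitness(x: number, y: number): number {
  return Math.sin(x) * Math.cos(y) - 0.05 * (x * x + y * y);
}

// An individual is just an (x, y) coordinate pair.
interface Individual { x: number; y: number; }

// Its position on the rendered surface uses fitness as the z coordinate.
function surfacePoint(ind: Individual): [number, number, number] {
  return [ind.x, ind.y, fitness(ind.x, ind.y)];
}
```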

The individuals in this data set were each the fittest member of the population at the end of their respective run of the algorithm. By drawing paths from their coordinates to their parents' coordinates, to their grandparents' coordinates, and so on, we can see how evolution traverses this fitness landscape.
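In code terms, a lineage path is just a walk up a chain of parent links. Here's a rough sketch, again with hypothetical field names rather than the real data format:

```typescript
// Illustrative lineage record; field names are assumptions.
interface LineageNode {
  x: number;
  y: number;
  fitness: number;            // z coordinate on the landscape
  parent: LineageNode | null; // null for the founding ancestor
}

// Collect the points from an evolved individual back to the start of the
// run; connecting consecutive points gives the path drawn on the landscape.
function lineagePath(leaf: LineageNode): [number, number, number][] {
  const path: [number, number, number][] = [];
  for (let node: LineageNode | null = leaf; node !== null; node = node.parent) {
    path.push([node.x, node.y, node.fitness]);
  }
  return path.reverse(); // oldest ancestor first, evolved individual last
}
```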

If you've looked at the settings panel, you'll notice there are a lot of options. That's because this data represents a large number of runs of evolution in a wide variety of conditions. The settings panel lets you choose subsets to look at.


Why is nothing showing up?

Probably because you haven't told it to draw anything! Go to settings (gear icon), check the boxes for the annotations you want (e.g. spheres for start and end points, paths for full lineages), and then click the "Draw" button. It might take a bit to render, but data should show up.

Note that at low mutation rates, lineages are often very short and can be hard to see.

If there's still nothing, open an issue on GitHub; this is all still pretty experimental.


What are some good settings to start with?

Try setting "Selection Scheme" to Eco-EA, "Mutation rate" to 0.01, and checking the "Draw lineage paths" box. Leave everything else at the default. This should show you a single lineage traversing the landscape. If you want to look at multiple lineages, try setting "Number of lineage paths to draw" to 2 and moving the Z-separation slider to about halfway. From there, feel free to change things and see what happens! In general, lineages evolved at mutation rates lower than 0.01 tend to stay in a pretty small part of the landscape, and Eco-EA tends to travel farther than other selection schemes.


What VR devices are supported?

Check out webvr.rocks for more thorough information. In general, most browsers will at least render it in WebGL. In this mode, you can use your mouse to click and drag to rotate the visualization and see it from different angles.

To use virtual reality, you need something more advanced.

If you have a phone-mount headset, such as Google Cardboard, and a phone that is compatible with it, you should be able to use that. Navigate to this website on your phone, select and draw the data you want to see, and tap the goggles icon in the lower right corner. This should open it in VR mode (in which your phone displays two images, one for each eye). You can then put the phone in the headset and look at the visualization in three dimensions.

If you have a more advanced VR system, such as an Oculus Rift or HTC Vive, you can open this webpage on the computer connected to the VR system. Clicking the grey goggles icon in the lower right corner should bring up the visualization on your headset. You can then put on the headset and walk around the visualization. If you have two hand controllers, you can use them to zoom in and out with pinch and spread gestures (like you would on a touch screen). Note that this has been tested on a Rift, but not yet on a Vive. If you try it on a Vive (or something else, like Windows Mixed Reality), I'd love to hear how it goes.


What are the different selection schemes?

Tournament selection: a set number of individuals (the tournament size) are chosen randomly from the population, and the fittest of them reproduces. Increasing the tournament size essentially increases the strength of selection.

Roulette selection: Individuals are chosen to reproduce with probability proportional to their fitness.

Eco-EA: Like tournament selection, except that individuals receive fitness bonuses for occupying rare niches (in these problems, that entails being in a part of the landscape where there aren't many other individuals). For more detail, see (Goings and Ofria, 2010).

Drift: Individuals are selected to reproduce entirely at random.
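For readers who find code clearer than prose, here's a minimal sketch of three of these schemes (Eco-EA is omitted; it's essentially tournament selection run after adding niche-based bonuses to each individual's fitness). This is illustrative only, not the code that generated this data:

```typescript
// Illustrative individual record; field names are assumptions.
interface Individual { x: number; y: number; fitness: number; }

function randomIndex(n: number): number {
  return Math.floor(Math.random() * n);
}

// Tournament selection: sample tournamentSize individuals at random;
// the fittest of them reproduces.
function tournamentSelect(pop: Individual[], tournamentSize: number): Individual {
  let best = pop[randomIndex(pop.length)];
  for (let i = 1; i < tournamentSize; i++) {
    const challenger = pop[randomIndex(pop.length)];
    if (challenger.fitness > best.fitness) best = challenger;
  }
  return best;
}

// Roulette selection: probability of reproducing is proportional to
// fitness (assumes fitness values are non-negative).
function rouletteSelect(pop: Individual[]): Individual {
  const total = pop.reduce((sum, ind) => sum + ind.fitness, 0);
  let spin = Math.random() * total;
  for (const ind of pop) {
    spin -= ind.fitness;
    if (spin <= 0) return ind;
  }
  return pop[pop.length - 1]; // guard against floating-point rounding
}

// Drift: reproduction is entirely random.
function driftSelect(pop: Individual[]): Individual {
  return pop[randomIndex(pop.length)];
}
```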

Where did these fitness functions come from?

The GECCO Niching Competition.

I want to know more about using VR for data visualization

Me too! There's still a lot for everyone to learn about how to most effectively take advantage of virtual reality for data visualization. The Knight Foundation has been doing some really great work to figure all of this out. I found this article particularly helpful, and this article has a bunch of really cool examples. Their work is all aimed at journalism, though. I'm planning on writing a blog post soon, detailing what this project has taught me specifically about using VR for data visualization in the context of evolution.

What tools did you use to build this?

For full details, see the GitHub repo for this site. In summary, though, I used:

  • A-Frame: an open-source and very straightforward web framework for virtual reality, with a great community whose members were very helpful.
  • The A-Frame Heatmap-3D component: an A-Frame extension that draws the fitness landscape; I could not have made this without it.
  • The A-Frame camera-transform-controls component: an A-Frame extension which I used to add the hand gestures on the Oculus Rift and HTC Vive.
  • Bootstrap: an open source web framework, which I used to make this website.
  • Open Iconic: open source icons which I used for the buttons on this site.
  • And of course, my trusty laptop running Linux Mint.

Who else helped make this happen?

I'm very grateful to Alex Lalejini, Charles Ofria, and the rest of the MSU Digital Evolution Lab for their input on this project. I also thank the MSU Institute for Cyber-Enabled Research for the computational resources to generate this data in the first place, and the MSU Digital Scholarship Lab for giving me access to (and helping me configure!) an Oculus Rift on which to develop and use this visualization.

Who are you?

I'm a PhD student at Michigan State University studying evolution, ecology, and computer science. To learn more about my research, see my website.

Follow me on Twitter! My GitHub