A recent project required a view of the South Pacific as seen from Earth orbit. I’ve put together and rendered orbital views of Earth before, most recently in Autodesk Maya. The results can be convincing, but the process is involved – requiring several concentric 3D spheres with various types of surfaces and opacity to replicate landmass, ocean, atmosphere, and clouds.
So when given the opportunity to try it with Terragen, I was curious to see how it would work out.
Terragen is a specialized 3D application used primarily to create landscapes. Among its many strengths is the ability to render natural-looking surfaces – land and water – as well as atmospheres and clouds. Most often the point of view is at ground level, but it can also be placed out in space – which makes Terragen a killer app for building planets.
First, we’ll call on NASA to gather some of the parts we’ll need.
NASA’s Blue Marble web page, part of its Visible Earth catalog, includes many high-resolution images of the Earth’s surface. Most are surface image composites representing each month of the year. You can clearly see the seasons change. The polar icecaps advance and retreat, and vegetation morphs from green to brown and back again. Other images, available at the same scale and resolution, depict topography (altitude), bathymetry (ocean depths), city lights, and cloud cover.
These are enormous files. The main composites are 21,600 by 10,800 pixels and, as TIFFs, clock in at over 900MB apiece. Individual tiles, each covering 90 degrees of latitude by 90 degrees of longitude, are also available at 21,600 by 21,600 pixels apiece. (If my math is right, that yields a resolution of about 3.5 pixels per mile at the equator.)
To determine how large a scene’s texture files need to be, I start with the final rendering size and work backwards. In this case the rendering needs to be wide enough to cover a two-page spread in a print publication. The required resolution is 300 pixels per inch, which yields a width of 5,100 pixels. The area to be depicted stretches from West Papua to Tonga in the South Pacific. On NASA’s image, this distance measures about 4,000 pixels – close enough, but it means the image needs to be kept full-size. On the other hand, if we were rendering a 720HD video frame (1280×720) of the entire globe, the images could be resized to 2,048 pixels wide with plenty of resolution to spare.
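The arithmetic above is easy to check. Here's a quick sketch – note that the 17-inch spread width (two 8.5-inch pages) is my assumption, not a figure from the brief:

```python
# Back-of-the-envelope check of the texture math above. Figures are
# approximate; NASA's composites are equirectangular (360 x 180 degrees).

EQUATOR_MILES = 24_901   # Earth's circumference at the equator (approx.)

def pixels_per_mile(image_width_px, degrees_covered=360):
    """Horizontal resolution at the equator of an equirectangular image."""
    miles_covered = EQUATOR_MILES * degrees_covered / 360
    return image_width_px / miles_covered

print(round(pixels_per_mile(21_600), 2))      # 0.87 -- full 360-degree composite
print(round(pixels_per_mile(21_600, 90), 2))  # 3.47 -- one 90-degree tile
print(17 * 300)                               # 5100 -- 17-inch spread at 300 ppi
```

The tile works out to roughly 3.5 pixels per mile, matching the estimate above.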
We’ll download two files – a surface image and the elevation image.
The brightness of the pixels in the elevation image represents altitude: the brighter the pixel, the higher the area it represents.
We also need a water surface mask (or landmass mask, depending on how you look at it). I made one by carefully selecting all of the water surfaces in the first image, above, and filling them with solid white in Photoshop. Landmass areas are filled with solid black.
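With images this large, a rough first pass on the mask can be automated before the hand cleanup in Photoshop. A sketch, assuming the oceans in the composite are dark and blue-dominant – the threshold values here are illustrative guesses, not taken from NASA's files:

```python
import numpy as np

# Rough automated water mask: on the Blue Marble composites the oceans
# are dark and tinted blue, so a simple per-pixel test gets most of the
# way there. Thresholds are guesses; coastlines still need hand cleanup.

def water_mask(rgb):
    """rgb: HxWx3 uint8 array. Returns HxW uint8 mask, 255=water, 0=land."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    dark = (r + g + b) < 240           # oceans are dark in the composite
    blue_dominant = (b > r) & (b > g)  # ...and tinted blue
    return np.where(dark & blue_dominant, 255, 0).astype(np.uint8)

# Tiny synthetic example: one 'ocean' pixel, one 'land' pixel.
sample = np.array([[[10, 20, 60], [80, 120, 40]]], dtype=np.uint8)
print(water_mask(sample))   # first pixel masked as water, second as land
```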
Setting up the scene
The scene requires only a planet object. The camera is positioned about 160,000 miles away, far enough to frame the entire planet. The focal length is set to 200mm to provide some foreshortening of the planet’s disk.
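The framing geometry is simple to sanity-check. This sketch assumes Terragen's focal length behaves like a lens over a standard 36mm-wide (35mm-film) frame – an assumption on my part – and uses an approximate Earth diameter:

```python
# Rough framing math for the orbital camera. Assumes the 200mm focal
# length acts over a 36mm-wide (35mm-film) frame -- an assumption, not
# something the scene file guarantees.
EARTH_DIAMETER_MI = 7_918   # approximate
FOCAL_MM = 200
FRAME_WIDTH_MM = 36

def disk_width_on_frame(distance_miles):
    """Approx. width of the planet's disk on the film plane, in mm."""
    # Small-angle approximation: width = focal * diameter / distance.
    return FOCAL_MM * EARTH_DIAMETER_MI / distance_miles

def fill_distance():
    """Distance at which the disk exactly spans the frame width."""
    return FOCAL_MM * EARTH_DIAMETER_MI / FRAME_WIDTH_MM

print(round(fill_distance()))                  # ~44,000 miles fills the frame
print(round(disk_width_on_frame(100_000), 1))  # 15.8 mm at 100,000 miles
```

Any camera distance comfortably beyond the fill distance leaves margin around the planet's disk.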
Unlike character-modeling applications, Terragen lacks the ability to create parent-child relationships between objects, so building a working planetary system would be a challenge. Our simple scene will be Ptolemaic: Terragen’s built-in light source (the “sun”) will orbit around our Earth.
We’ll hide the planetary atmosphere to give us a clear view of the surface.
Building the surface
We’ll start by creating three Image Map shaders in Terragen’s node editor:
- One for terrain, to map the surface image to the surface of the planet.
- One for elevation, which will create surface displacement.
- And one for the water surface mask.
The first Image Map Shader will handle displacement. The settings position the shader at 0, 0, 0 relative to the planet (at the planet’s center) and project the image onto the surface:
This shader will be connected to a Default Shader, which expects its displacement data as luminosity values – so the Image Map Shader must output color. Make sure that Apply Color is checked under the shader’s Color tab.
The Default Shader’s Displacement Multiplier is set to 10000. Assuming the scale is in meters, this puts the maximum altitude slightly higher than Mount Everest (8,848 meters).
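As arithmetic, the mapping from elevation image to displacement is straightforward. This is a sketch of the idea, not Terragen's internal implementation:

```python
# Sketch of how the elevation image drives displacement: pixel luminosity
# (normalized to 0.0-1.0) is scaled by the Displacement Multiplier.
# Illustrates the mapping only -- not Terragen's actual code.

DISPLACEMENT_MULTIPLIER = 10_000   # meters, as set on the Default Shader

def altitude_m(pixel_value, bit_depth=8):
    """Altitude represented by a grayscale pixel in the elevation image."""
    luminosity = pixel_value / (2 ** bit_depth - 1)
    return luminosity * DISPLACEMENT_MULTIPLIER

print(round(altitude_m(255)))   # 10000 m -- pure white, maximum altitude
print(round(altitude_m(226)))   # 8863 m -- just above Everest's 8,848 m
```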
The Default Shader is connected to the Planet node.
For this rendering, the multiplier was temporarily set to 20000 to make the displacement more apparent:
The terrain Image Map Shader takes the same settings as the displacement shader:
. . . and is connected to the Color Function of the Default Shader:
It renders like this:
The water mask Image Map Shader also takes the same settings:
. . . and is connected to the Default Shader’s Reflectivity Function:
The shader’s Index of Refraction and Roughness settings are adjusted as shown to create a large, soft specular highlight. The mask prevents the highlight from appearing on land:
Adding an Atmosphere
The default atmosphere settings are intended for landscapes, where the atmosphere is viewed horizontally. We’ll be peering through the atmosphere vertically, and at a much larger scale. Haze and color settings are adjusted to compensate:
. . . and the rendering:
This shows that Terragen’s atmosphere behaves, optically, very much like a real one. The oceans, which were very dark before, now reflect the atmosphere’s blue tint. To adjust the ocean color, tweak the Default Shader’s reflectivity settings, the atmosphere’s Blue Sky Density, or both.
And finally, a cloud layer. The cloud node’s density shader settings need to be increased to match our global scale:
. . . and a new rendering:
Alignment and Rotation
Earth is tilted on its axis. Everyone knows this is what creates our seasons, and for realistic lighting (and animation) we should match that here.
But Terragen’s internal rotation order of Bank, Pitch, Heading (ZXY) complicates things somewhat. Because heading is applied last, changing a node’s Y rotation also swings its X and Z axes around with it. If we were to tilt the planet on the X or Z axis and then rotate its heading – to bring a particular part of the globe into view – the tilted axis would rotate along with it. Animated, the Earth would wobble like a top.
To handle this, we’ll break the planet’s tilt (Pitch, or X) and rotation (Heading, or Y) across two separate nodes – and set Y first so it won’t affect X. (And we’ll do the same for the cloud layer.)
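The wobble is easy to demonstrate with plain rotation matrices – illustrative math only, not Terragen code:

```python
import numpy as np

# Why heading must be applied before tilt: compare the planet's pole
# direction under the two possible orderings as the heading animates.

def rot_x(deg):
    a = np.radians(deg)
    return np.array([[1, 0, 0],
                     [0, np.cos(a), -np.sin(a)],
                     [0, np.sin(a),  np.cos(a)]])

def rot_y(deg):
    a = np.radians(deg)
    return np.array([[ np.cos(a), 0, np.sin(a)],
                     [ 0,         1, 0        ],
                     [-np.sin(a), 0, np.cos(a)]])

pole = np.array([0.0, 1.0, 0.0])   # the planet's spin axis
TILT = 23.437                      # Earth's axial tilt, degrees

# Heading first, then tilt (the two-node setup): the pole stays fixed
# as the heading animates -- no wobble.
for heading in (0, 90, 180):
    print(np.round(rot_x(TILT) @ rot_y(heading) @ pole, 3))

# Tilt first, then heading (a single node): the pole sweeps a circle
# as the heading animates -- the planet wobbles like a top.
for heading in (0, 90, 180):
    print(np.round(rot_y(heading) @ rot_x(TILT) @ pole, 3))
```

In the first loop the printed axis never moves; in the second it precesses with each heading change.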
First, we connect the Default Shader – basically a big basket that we used to assemble the planet’s surface, elevation, and reflectivity – to a Transform Input shader, and use that to set the heading:
This in turn is connected to the Planet node, which handles the axis tilt with an X rotation setting of 23.437.
Note that the rotation values are displayed in XYZ order, even though the program’s internal rotation order is ZXY.
Both Translate Textures with Planet and Rotate Textures with Planet are selected, so all of our images will be aligned (and will rotate) properly.
We use two Transform Input nodes to tilt and rotate the cloud layer. Actually, we’ll tilt and rotate the cloud layer’s density shader, which creates the fractal pattern that drives the cloud shapes.
Plug the density shader into the first Transform node. As before, this handles the heading (Y) value:
. . . and the second handles the tilt (X) value:
Note that both of these transform nodes have been translated to the same location as the planet object. If these translation settings don’t match, the density shader and planetary atmosphere will be offset and animating the rotation values will lead to unpredictable results. Simply copy and paste the planet’s transform values into these two shaders.
The second Transform Shader is plugged into the cloud layer’s density shader input:
All done! To rotate the Earth, simply set the Y values in the appropriate Transform shaders. To position the sun, change its heading value.
The final node network: