Before proceeding I should stress that my knowledge of solar physics is pretty elementary, garnered from a bit of Wikipedia browsing and books such as the excellent "Wonders of the Solar System", so don't call me out if anything I say here is imprecise, incorrect or a blatant lie. I don't pretend to be any kind of expert on this subject.
Even my initial research into the subject immediately revealed that the physical processes at work in a star are pretty complex; scientists' understanding of even the one at the centre of our own solar system is not exhaustive. Picking the correct metric(s) to base my graphical representation on is, as ever, a trade-off: the more physically accurate I want the simulation to be, the more work is involved and the more complex (and therefore slower) the code becomes. This is a real-time project and the effect has to render in just a millisecond or two, so it has to be kept simple. In the end I decided temperature was a pretty obvious measure of how a star looks, and of the various temperatures associated with a star (core, surface and corona amongst others) the optical surface temperature seemed like a good fit, as it's largely responsible for the observed appearance of the star. Hereafter I'll refer to this optical surface temperature simply as "temperature".
The temperature of stars is expressed using the Kelvin scale, with the temperature of our own Sun being approximately 5700K (see this page on Effective Temperature or here about Spectrums for more details). There are also various classes and classifications used to categorise stars, but one of the easiest to understand that I found is the Hertzsprung-Russell diagram, which provides a really clear visual representation of the relationship between stars' magnitudes/luminosities and their classifications/temperatures.
(Image from Wikipedia)
Tying this into my existing star shader was relatively straightforward: I just had to pass the temperature of the star in as a pixel shader constant, then use it along with my noise value to look up the Kelvin-to-RGB conversion texture. I did at first think something was wrong though, as at 5700K my lovely yellow and white "Sun" was rendering in a sort of salmon pink shade - not at all sun-like!
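For anyone curious what a Kelvin-to-RGB conversion looks like in practice, here's a minimal sketch of how such a lookup texture could be baked. This isn't the author's actual data; it uses Tanner Helland's well-known curve-fit approximation of black-body colour, and the table size and temperature range are illustrative assumptions.

```python
import math

def kelvin_to_rgb(kelvin):
    """Approximate the RGB colour of a black body at the given temperature,
    using Tanner Helland's curve-fit approximation (roughly valid for
    1000K-40000K). Returns an (r, g, b) tuple of ints in 0-255."""
    t = kelvin / 100.0

    # Red channel: saturated below ~6600K, then falls off.
    if t <= 66:
        r = 255
    else:
        r = 329.698727446 * ((t - 60) ** -0.1332047592)

    # Green channel: rises logarithmically, then falls off.
    if t <= 66:
        g = 99.4708025861 * math.log(t) - 161.1195681661
    else:
        g = 288.1221695283 * ((t - 60) ** -0.0755148492)

    # Blue channel: absent in cool stars, saturated above ~6600K.
    if t >= 66:
        b = 255
    elif t <= 19:
        b = 0
    else:
        b = 138.5177312231 * math.log(t - 10) - 305.0447927307

    clamp = lambda v: int(max(0, min(255, v)))
    return clamp(r), clamp(g), clamp(b)

# Bake a 256-entry 1D lookup table spanning 1000K-40000K, as one might
# upload to the GPU as a 1D texture for the pixel shader to sample.
lut = [kelvin_to_rgb(1000 + i * (39000 // 255)) for i in range(256)]
```

Evaluating this at 5700K gives a warm, slightly pink white (red fully saturated, green and blue just below it) rather than a pure yellow, which matches the "salmon pink" surprise described above.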
After a little more digging it would appear that, along with much of the Earth's populace, I've been sold something of a lie for the last four decades! It turns out that our Sun isn't actually yellow at all but is rather that fetching salmon pink I mentioned earlier; it only looks yellow from the Earth (or at times a lovely red when rising or setting) due to the absorption and dispersion of light as it passes through our atmosphere. Presumably TV and film SFX departments use yellow when rendering the Sun from space for consistency (and so they don't have to keep explaining why it's salmon pink).
Anyway, happy that this wasn’t a bug in my shader, I had a play around with the temperature value I was passing to the shader to change the visible colour of my star, which produced what I think are quite reasonable results:
The temperature of each version is overlaid. An interesting fact I wasn't aware of is how the colour passes from red through yellow and white and finally to pale blue as temperature increases - mentally I had assumed that white would be the hottest, so I learned something there.
Note that it’s not just the colour that varies with temperature here; the size and “fuzziness” of the corona also increase as the temperature rises. There’s no particular scientific basis for this, it just felt like hotter stars would be expected to have larger coronas. The magnitude of the screen-space ray distortion is also linked to temperature, producing more erratic field movement on hotter stars.
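The temperature-to-corona mapping described above could be as simple as a normalise-and-lerp. The sketch below illustrates the idea; the parameter names, ranges and scale factors are all hypothetical, invented for this example rather than taken from the actual shader.

```python
def corona_params(kelvin, t_min=3000.0, t_max=30000.0):
    """Map star temperature to artistic (not physical) corona parameters:
    hotter stars get a larger, fuzzier corona and stronger screen-space
    ray distortion. All constants here are illustrative guesses."""
    # Normalise temperature into [0, 1] over the supported range.
    t = max(0.0, min(1.0, (kelvin - t_min) / (t_max - t_min)))
    return {
        "corona_scale": 1.0 + t * 1.5,  # relative corona radius
        "fuzziness":    0.2 + t * 0.8,  # noise amplitude in the corona
        "distortion":   t * 0.05,       # ray distortion strength
    }
```

These three values would then be passed to the shader alongside the temperature constant, so a single input drives colour, corona and distortion together.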
Take a look at the video below to see the transition occurring in real time. There’s no doubt more that could be done, but for now I think it’s good enough for my purposes: producing a range of interesting stars around which to base my procedural solar systems.