Yes, I know that
For those who don't like the HDR effect:
It's a trend, and a widely misused one. On a normal monitor, with a normal contrast range, the images come out washed out and grey.
Yet for the client, HDR means something else: it means the calculations can take a larger dynamic range into account.
SolarLiner, I'm not a great fan of tonemapping and other "artistic" transfer functions.
I prefer "exposure compensation", keeping things more or less "linear".
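To make that concrete, here's a minimal Python sketch of what "exposure compensation" means as a purely linear operation (the function name and values are made up for illustration):

```python
# "Exposure compensation" as a purely linear operation: scale every
# linear-light value by 2^EV, then clip to the display range. No curve
# is applied, so relative intensities are preserved until clipping.
def expose(linear_value, ev):
    """Scale a linear radiance value by 2^ev and clip to [0, 1]."""
    return min(max(linear_value * (2.0 ** ev), 0.0), 1.0)

# One stop up doubles a value; ratios below the clipping point stay
# intact, unlike with a tonemapping curve.
print(expose(0.25, 1.0))  # 0.5
print(expose(0.8, 1.0))   # 1.0 (clipped: highlight detail is lost)
```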
Here's a (random) example that I like:
https://www.youtube.com/watch?v=ovlI3ko4l1I
Amateur photographer here. HDR is a trend, but it isn't being misused, at least not generally. HDR is useful when the contrast is great, greater than what a picture could normally reproduce linearly. Actually, that's not quite right: strictly speaking, HDR is the process of gathering a high range of luminosity values (real-world photography uses exposure bracketing or special sensors, for example), and tonemapping is the process of preserving as much detail as possible when going from that high(er) dynamic range down to a low dynamic range medium.
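To sketch those two separate steps (the numbers here are made up and the merge is heavily simplified):

```python
# Two separate steps, as described above: (1) merging an exposure
# bracket into one HDR radiance estimate, and (2) tonemapping that
# estimate down to a displayable range.
def merge_bracket(samples):
    """samples: list of (pixel_value, exposure_time) pairs.
    Divide out each exposure time and average, giving one linear
    radiance estimate per pixel (the essence of an HDR merge)."""
    return sum(v / t for v, t in samples) / len(samples)

def reinhard(x):
    """Reinhard's global operator: maps [0, inf) into [0, 1)."""
    return x / (1.0 + x)

# Both samples imply a radiance of ~20 units; the merge recovers it,
# and tonemapping squeezes it under 1.0 for an LDR monitor.
radiance = merge_bracket([(0.2, 0.01), (0.8, 0.04)])
print(radiance, reinhard(radiance))
```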
To take home-crafted examples,
these photos don't need to be taken in HDR due to the relatively low contrast (the first picture was taken in cloudy weather, so all of the incoming light is indirect, and in the second it's mostly diffuse light reflected from a sunny sky), and therefore don't need any tonemapping. But
these photos needed it, because in the first one the contrast between the direct light of the setting sun and the rest of the sky (and the beach) was too high, and in the second one the sea and silhouettes would have been completely lost in the dark (and I'd also have lost detail in the cloud patch, which adds to the "fullness" of the picture here).
Thing is, your own eye is doing the tonemapping that allows you to see details in the shadows as well as in the bright lights. In fact, the eye can register a much higher contrast than practically any current camera.
What that means for Orbiter is that if we go the HDR route, we can't discard tonemapping, because adaptation alone would leave either bright areas completely white or dark areas completely black. Space scenes have a much, much higher contrast than any scene on the ground, because we have the darkness of space and the bright sun at the same time (or, most of the time, a bright planet shining against the really dark cosmic microwave background).
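A toy illustration of that point, with made-up radiance numbers: no single linear exposure can keep both the sun and a dim starfield usable at the same time.

```python
# Pure linear exposure followed by clipping to the display range.
def expose_and_clip(x, scale):
    return min(x * scale, 1.0)

sun, starfield = 1e5, 1e-4  # hypothetical radiances, nine orders apart
# Exposed for the sun, the stars are crushed to an invisible 1e-9;
# exposed for the stars, the sun (and anything near it) clips hard.
print(expose_and_clip(sun, 1e-5), expose_and_clip(starfield, 1e-5))
print(expose_and_clip(sun, 1e3), expose_and_clip(starfield, 1e3))
```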
In fact, the way it was set up before, and the way it is set up in the inline client, is somewhat of a poor man's tonemapping: light intensities are chosen so that everything falls in the right luminosity range for the monitor to display it correctly. For example, the sun's light intensity is constant and doesn't decrease with distance, to prevent the outer planets from being shown too dark.
Even in the newer versions of D3D9 for Beta with the atmospheric scattering, there is a tonemapping pass to bring everything down to the [0,1] range that the monitor can display. And in the game engine that you showed, I'm sure there's at least a basic Reinhard tonemapping pass with eye adaptation.
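For reference, a minimal Python sketch of Reinhard's global operator with log-average "eye adaptation" (0.18 is the usual middle-grey key from the photographic operator; the scene values are made up):

```python
import math

# Reinhard tonemapping with "eye adaptation": the exposure is derived
# from the scene's log-average luminance (the adaptation part), then
# each value is compressed by x / (1 + x) into [0, 1).
def log_average(luminances, eps=1e-6):
    return math.exp(sum(math.log(l + eps) for l in luminances) / len(luminances))

def tonemap(luminances, key=0.18):
    scale = key / log_average(luminances)  # brighter scene -> lower exposure
    return [l * scale / (1.0 + l * scale) for l in luminances]

scene = [0.01, 0.5, 10.0, 1000.0]  # made-up HDR scene luminances
print(tonemap(scene))  # everything lands in [0, 1), order preserved
```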
This is why we have to choose the right processing for the HDR-to-LDR conversion, and why I also wanted to implement CLUTs, so that people could weigh in and make their own "perfect display" of the game.
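The CLUT idea could look something like this, a hypothetical 1D table with linear interpolation (real CLUTs are usually 3D over RGB, but the principle is the same):

```python
# Hypothetical 1D CLUT: a user-supplied table remaps final display
# values, so each player can tune their own look after tonemapping.
def apply_lut(value, lut):
    """Look up a [0, 1] value in the table, interpolating linearly."""
    pos = min(max(value, 0.0), 1.0) * (len(lut) - 1)
    lo = int(pos)
    hi = min(lo + 1, len(lut) - 1)
    frac = pos - lo
    return lut[lo] * (1.0 - frac) + lut[hi] * frac

identity = [0.0, 0.25, 0.5, 0.75, 1.0]  # leaves values unchanged
print(apply_lut(0.5, identity))  # 0.5
```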
---------- Post added at 12:48 ---------- Previous post was at 12:44 ----------
That makes me think: the scattering's own tonemapping still remains in the latest build, whereas it was removed in the test builds, which allowed the post-process tonemapper to behave better.