This post: http://orbiter-forum.com/showthread.php?p=496952&postcount=6437 gave me an idea:
There are shape models for 67P/Churyumov–Gerasimenko, but apparently no global texture maps to go with them (please correct me if I'm wrong). Would it be an interesting project for the Orbiter community to try to generate such a texture map, to create a cool 67P addon for Orbiter?
There are plenty of high-quality images from the Rosetta orbiter from all directions, so the raw data should be available. It's just a matter of processing those data.
I guess the workflow would look something like this:
- Take a shape model and extract the vertex coordinates
- Parameterize the vertex coordinates in a way that allows mapping to texture coordinates
- Take a Rosetta image and figure out the camera position relative to the shape model
- Project the image onto the mesh surface
- Map the projected image into parameter space
- Repeat for the next image, and so on
- Merge the projected images in parameter space (seam removal, lighting adjustment, etc.)
- Assign texture coordinates to the vertices according to the parameterization, and wrap the finished global map around the mesh
- Marvel at the cool Orbiter object
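To make the projection step above a bit more concrete, here is a minimal sketch of projecting shape-model vertices into a Rosetta image, assuming a simple pinhole camera model. All the names and numbers (the intrinsic matrix `K`, the camera pose, etc.) are illustrative placeholders, not values from any real Rosetta data product:

```python
import numpy as np

def project_vertices(vertices, cam_pos, cam_rot, K):
    """Project 3D vertices (N,3) into pixel coordinates (N,2).

    cam_pos: camera position in the shape-model frame
    cam_rot: 3x3 rotation from the shape-model frame to the camera frame
    K:       3x3 pinhole intrinsic matrix
    """
    cam_coords = (vertices - cam_pos) @ cam_rot.T  # world -> camera frame
    pix = cam_coords @ K.T                         # apply intrinsics
    return pix[:, :2] / pix[:, 2:3]                # perspective divide

# Toy example: a camera at the origin looking down +z,
# with a single vertex 10 units away on the optical axis.
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0,   0.0,   1.0]])
verts = np.array([[0.0, 0.0, 10.0]])
uv = project_vertices(verts, np.zeros(3), np.eye(3), K)
print(uv)  # a vertex on the optical axis lands at the principal point (320, 240)
```

Once each vertex has a pixel coordinate, sampling the image at those pixels (and discarding back-facing or occluded vertices) gives the per-image colour data to carry into parameter space.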
One of the tricky bits is finding a suitable parameterization. Because of the complicated two-lobed shape, a direct mapping into longitude/latitude pairs may not be possible (a single direction from the centre can intersect the surface more than once), as discussed in the other thread. Another way could be this: approximate the shape model with a spherical harmonics expansion (which probably wouldn't require a large number of terms). Then define a mapping from the spherical harmonics surface to a sphere, and use the sphere coordinates of a mapped vertex as the parameterization.
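As a toy illustration of the direction-based idea: fit the radius as a low-order function of direction (here just degree-0 and degree-1 real spherical harmonics, i.e. a constant plus a term linear in the unit direction), and take each vertex's longitude/latitude as texture coordinates. This is a sketch under the assumption that the surface is star-shaped with respect to the centre, which is exactly the limitation discussed above; a real 67P mapping would need more care near the neck:

```python
import numpy as np

def fit_degree1_harmonics(vertices):
    """Least-squares fit of r(n) ~ c0 + c1*nx + c2*ny + c3*nz."""
    r = np.linalg.norm(vertices, axis=1)
    n = vertices / r[:, None]                  # unit directions
    A = np.column_stack([np.ones(len(r)), n])  # design matrix
    coeffs, *_ = np.linalg.lstsq(A, r, rcond=None)
    return coeffs

def lonlat_uv(vertices):
    """Map each vertex direction to (u, v) in [0,1]^2 via lon/lat."""
    x, y, z = vertices.T
    lon = np.arctan2(y, x)                                 # [-pi, pi]
    lat = np.arcsin(z / np.linalg.norm(vertices, axis=1))  # [-pi/2, pi/2]
    return np.column_stack([lon / (2 * np.pi) + 0.5, lat / np.pi + 0.5])

# Sanity check: a sphere of radius 2 should fit with c0 ~ 2 and
# negligible linear terms.
rng = np.random.default_rng(0)
dirs = rng.normal(size=(500, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
coeffs = fit_degree1_harmonics(2.0 * dirs)
print(coeffs)  # ~ [2, 0, 0, 0]

uv = lonlat_uv(np.array([[1.0, 0.0, 0.0]]))
print(uv)  # the point on the +x axis maps to the centre of the map, (0.5, 0.5)
```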
Of course there is plenty of other challenging stuff as well (registration of the individual images after mapping, lighting correction/shadow removal, etc.), but if it works, we'd have something that doesn't exist anywhere else. Maybe you could even sell it back to ESA :thumbup:
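For the merging step, one hedged possibility is a weighted per-texel average of the projected images in parameter space, with weights such as the cosine of the viewing angle so that face-on observations dominate and seams soften. This is purely illustrative; real seam removal would also need the lighting correction mentioned above:

```python
import numpy as np

def merge_layers(layers, weights):
    """Weighted per-texel average of projected image layers.

    layers:  (k, H, W) stack of images mapped into texture space (NaN = no data)
    weights: per-texel confidence, broadcastable to (k, H, W),
             e.g. cosine of the viewing angle
    """
    w = np.where(np.isnan(layers), 0.0, weights)  # zero weight where no data
    vals = np.nan_to_num(layers)
    total = w.sum(axis=0)
    # Divide only where at least one layer contributed; leave NaN elsewhere.
    return np.divide((w * vals).sum(axis=0), total,
                     out=np.full(total.shape, np.nan), where=total > 0)

# Two tiny 1x2 "texture" layers: the first texel is covered by both images
# and blends 60/40; the second is only covered by the second image.
a = np.array([[[1.0, np.nan]]])
b = np.array([[[3.0, 5.0]]])
merged = merge_layers(np.concatenate([a, b]),
                      np.array([[[0.6]], [[0.4]]]))
print(merged)  # [[1.8 5.0]]: 0.6*1 + 0.4*3 = 1.8, and 5.0 passes through
```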
Any interest? Suggestions?