Figure 1: International Space Station Rendered in Blender

Team

  • Margaret S Flaum
  • Garrett J Percevault
  • Adam Smith

Mentor

Thomas Chrien

Abstract

Vision

Our goal is to use existing tools and our optical knowledge to build a simulation tool that carries out a rigorous optical analysis and simulates the imaging of a satellite at a range of distances, attitudes, and illumination angles. The simulation tool will import CAD models of various geometries for analysis. The end goal for Millennium Space is to identify satellites in images using machine learning algorithms with the fewest resolved pixels possible. Therefore, the ability to produce images at various resolutions is of particular interest.

Background
Figure 2: Solar Phase Angle (SPA) describes the orientation of the sun, satellite, and camera with respect to one another. This concept is vital to our project because it determines which parts of the object are visible and how strongly they are illuminated.
Figure 3: Three example orientations of the solar phase angle with respect to a spherical satellite, with all other conditions the same, showing how the view from the camera is affected by this factor.
Figure 4: The reflection of light depends strongly on surface type: a metallic (specular) reflection sends the reflected light in one direction, while a diffuse reflection sends it in every direction. To model how a satellite reflects light, we used a combination of the two types.

Due to the complex shapes of satellites, for objects more complicated than a sphere we transitioned to Blender, a 3D modeling and rendering package. Blender provides a simulated image of our satellite with relative illumination (grayscale) at 8 bits. Our camera is 12-bit, so we then needed to convert this relative value into the real grayscale value.

Specifications
  1. Use radiometry to quantify the light reflected off a satellite that will be detected by a camera in space.
  2. Produce images of varying resolution from the simulation.
  3. Import 3D CAD models into Blender.
  4. Include surface properties that are realistic for satellites by combining diffuse and specular reflection.
  5. Input the following parameters into Blender (see the scripted sketch after this list):
    1. Pixel size on CCD
    2. Number of pixels on CCD
    3. Camera lens focal length
    4. Distance between camera and satellite
    5. Size and shape of satellite/CAD model
    6. Solar phase angle
    7. Object surface reflectivity properties 
  6. Input the following parameters into Python:
    1. Photon Flux per pixel per second (Appendix 3)
      1. Blender Relative Grayscale Value 
      2. CMOS Quantum Efficiency
      3. Solar Irradiance 
      4. Pixel pitch
      5. Satellite Distance
      6. Focal Length
      7. Aperture Diameter 
    2. Photons to Grayscale Conversion Factor (Appendix 4)
      1. Gain
      2. Well Capacity
      3. Responsivity 
    3. Expected Noise Equations, based on (Appendix 5)
      1. Dark Current 
      2. Read Noise
      3. Grayscale Value
    4. Exposure time
  7. Generate a library of example simulated images of satellites.
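
To illustrate how the Blender-side parameters in item 5 map onto Blender's Python API, here is a minimal scripted sketch. The numeric values and object names are placeholders for illustration, not the project's actual configuration.

```python
import bpy

# Placeholder parameter values -- illustrative, not the project's settings.
PIXEL_PITCH_M = 2.2e-6         # pixel size on the detector [m]
N_PIX_X, N_PIX_Y = 1280, 960   # number of pixels on the detector
FOCAL_LENGTH_MM = 23.0         # camera lens focal length [mm]
SAT_DISTANCE_M = 1000.0        # distance between camera and satellite [m]

scene = bpy.context.scene

# Create a camera and make it the active render camera.
cam_data = bpy.data.cameras.new("sim_camera")
cam_obj = bpy.data.objects.new("sim_camera", cam_data)
scene.collection.objects.link(cam_obj)
scene.camera = cam_obj

# Focal length and sensor width (Blender expects millimetres for both).
cam_data.lens = FOCAL_LENGTH_MM
cam_data.sensor_width = PIXEL_PITCH_M * N_PIX_X * 1e3

# Render resolution matches the detector's pixel count.
scene.render.resolution_x = N_PIX_X
scene.render.resolution_y = N_PIX_Y

# Place the camera at the satellite distance, looking back at the origin,
# where the imported CAD model would sit.
cam_obj.location = (0.0, -SAT_DISTANCE_M, 0.0)
cam_obj.rotation_euler = (1.5707963, 0.0, 0.0)  # point the -Z view axis along +Y
```

The solar phase angle would then be set by positioning a sun lamp relative to this camera-satellite axis.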
Sample Images

Figure 5: Sample images of the International Space Station at 1000 m (top left), 3000 m (top right), and 10,000 m (bottom)

Calculations

Photon Flux per Pixel

To calculate the expected number of photons detected by our camera, we integrate the product of the solar spectrum and the quantum efficiency of the detector, divided by the energy of a photon. The wavelength range of interest for cameras in this application is 400-1000 nm.
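
In symbols, with \(E_\lambda\) the spectral solar irradiance and \(hc/\lambda\) the photon energy, this integral presumably takes the form:

$$\Phi_0 = \int_{400\,\mathrm{nm}}^{1000\,\mathrm{nm}} \frac{E_\lambda(\lambda)\,QE(\lambda)}{hc/\lambda}\,d\lambda$$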

Figure 6: Solar Spectrum REFERENCE. Since we are working with images taken from space, we will consider the yellow portion of the plot. This plot uses ASTM E-490 AM0 data.

Now, we need to account for the fact that not every photon that reaches the satellite will reach our detector.

Flux Per Pixel

This calculation must be expressed per pixel, as each pixel will have a different grayscale value. Therefore, we should calculate the area one pixel will cover at the satellite distance. 

Ratio of Photons at Satellite to Photons that reach Camera

We need to find what fraction of the total photons emitted from each pixel's footprint our sensor will measure. First, we need to define our pixel area, since the calculation will be done on a per-pixel basis.

p = pixel side length

x = satellite distance

f = focal length
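
By similar triangles through the lens, the side length of the satellite patch imaged by one pixel is presumably \(px/f\), giving a per-pixel footprint area of:

$$A_{\mathrm{sat}} = \left(\frac{p\,x}{f}\right)^{2}$$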

We will start by calculating the photon flux for a pixel in the simplest case. We will assume each pixel's footprint on the satellite is a flat Lambertian surface that emits photons equally in all directions over a half sphere. Furthermore, we will assume the surface recorded by each pixel directly faces the camera, so we will not have to account for any cosine dependence in the calculation.

Figure 7: Illustration of a 100% diffuse surface.

In this simple example, the fraction of photons that reach the camera lens is simply the ratio of the camera aperture area to the surface area of the hemisphere over which the Lambertian surface emits; in other words, a fractional solid angle.

d = aperture diameter 
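
A plausible reconstruction of this ratio, the aperture area divided by the area of the hemisphere of radius x:

$$\frac{\pi (d/2)^{2}}{2\pi x^{2}} = \frac{d^{2}}{8x^{2}}$$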

For a solar irradiance of 1 W/m² and a quantum efficiency based on the curve below, combining these factors gives the overall photon flux per pixel. Simplifying, with:

p = pixel pitch

x = satellite distance

f = effective focal length

d = aperture diameter (clear aperture)

G = relative grayscale value (from Blender)

A = average satellite absorption

T = average lens transmission
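
A plausible reconstruction of the simplified result, multiplying the photon rate \(\Phi_0\) by the pixel footprint \((px/f)^2\) and the hemisphere fraction \(d^2/(8x^2)\) (note that the satellite distance x cancels):

$$\Phi_{\mathrm{pixel}} = \Phi_0\,G\,(1-A)\,T\,\frac{p^{2}d^{2}}{8f^{2}}$$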

Figure 8: Quantum efficiency curve of the CMOS detector being used, provided by our customer.

We do not have exact data points, so we had to estimate them. Using solar spectrum data points and estimated QE points, we approximated the integral with a discrete summation to obtain the total QE-weighted photon rate, and evaluated the result with the following parameter values:

p = 2.2e-6 m

d = 16.4e-3 m

A = 0.5

T = 0.9

f = 23e-3 m
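
As a minimal sketch, the simplified expression can be evaluated in Python with these values. The QE-weighted solar photon rate S below is a placeholder, since the summed total from the previous step is not reproduced here.

```python
# Minimal sketch of the simplified photon-flux equation above.
S = 1.0e21     # QE-weighted solar photon rate [photons / (s * m^2)] -- placeholder value

p = 2.2e-6     # pixel pitch [m]
d = 16.4e-3    # aperture diameter [m]
A = 0.5        # average satellite absorption
T = 0.9        # average lens transmission
f = 23e-3      # effective focal length [m]

def photon_flux_per_pixel(G):
    """Photons per second on one detector pixel for Blender relative grayscale G in [0, 1]."""
    return S * G * (1.0 - A) * T * (p ** 2 * d ** 2) / (8.0 * f ** 2)

print(photon_flux_per_pixel(G=1.0))  # brightest possible pixel
```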

Real Grayscale Value

Blender only gives us relative illumination. In order to determine the actual grayscale value we will need to account for the conversion gain, well capacity, bit depth, and photon flux from the previous section. 

Conversion Gain (electron/dn) CG

Conversion gain is the ratio of electrons to output DN values in the final image. DN stands for digital number, which is a grayscale value. This value can be adjusted on the camera between several settings.

Well Capacity (electron) WC

The number of electrons each pixel can generate per exposure before saturation. 

Bit Depth BD

Determines the range of DN values possible for images taken by the camera. The minimum value is zero and the maximum is given by:
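
$$DN_{\max,\,\mathrm{bits}} = 2^{BD} - 1$$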

For example, an 8-bit camera has DN values 0-255. However, depending on the conversion gain setting of the camera, the maximum DN value may not be reachable.

Solar Flux to DN
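
With \(\Phi_{\mathrm{pixel}}\) the photon flux per pixel from the previous section (quantum efficiency is already included, so \(\Phi_{\mathrm{pixel}}\,t\) is the number of electrons generated over exposure time t), the conversion presumably takes the form:

$$DN = \frac{\Phi_{\mathrm{pixel}}\,t}{CG}$$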

However, this equation does not account for saturation, so the maximum possible DN value must be calculated, which depends on the well capacity:
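
$$DN_{\max} = \frac{WC}{CG}$$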

If the calculated DN value is greater than DN_max, the value is reset to DN_max.

Aptina CMOS Calculation 

WC = 4192 e

CG = 1.8 e/DN
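
Substituting these values gives DN_max = 4192 / 1.8 ≈ 2329, the upper limit of the 0-2329 DN range used in the noise comparison below.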

SNR Methods

We will consider three different methods of dealing with noise in our image.

Method 1

No noise: just calculate the expected DN value per pixel.

Method 2 (Specific to Aptina CMOS)

These noise values were experimentally determined and converted to equations by curve fitting. Each noise term is a standard deviation, expressed in units of DN.

Add the noise terms in quadrature for the total noise:
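
$$\sigma_{\mathrm{total}} = \sqrt{\sum_i \sigma_i^{2}}$$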

Method 3: General Noise Equation 

Calculate the noise in electrons, then use the conversion gain to convert to DN.

e: electrons 

dc: dark current (e/s)
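
A standard form consistent with the terms listed, combining shot noise on the collected electrons, dark-current shot noise over exposure time t, and read noise nr, would be:

$$\sigma_e = \sqrt{e + dc\,t + nr^{2}}$$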

We can convert to DN by dividing the total by the conversion gain. Substituting e = CG · DN lets us compare methods 2 and 3 directly.

Noise Comparison

From Aptina CMOS Specifications:

dc = 0.88 e/s

nr = 4.42 e (read noise)

We can also express the noise in DN so we can compare it to our previous method. Substituting e = CG · DN and dividing by the conversion gain:

$$\sigma_{DN} = \frac{\sqrt{CG \cdot DN + dc\,t + nr^{2}}}{CG}$$

We will assume t = 0, as the dark current is not statistically significant.

Figure 9: Comparison of the noise methods for the Aptina CMOS over the possible DN range of 0-2329

The biggest difference between the two methods occurs for pixel values close to 0.

Applying Noise

To apply the noise, we need a Gaussian random number generator that takes the expected DN value as its mean and the calculated standard deviation. We will use the Python module 'random' to do this, as sketched below.

One issue with this method is that a DN value could go below 0 or above 2329. The low end matters most, since near 0 the noise is large in proportion to the DN value. Therefore, we will add the absolute value of the minimum value to every pixel, and then reset any value above 2329 back to 2329, since clipping at the top does not have a major impact compared to DN values near 0.
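
A minimal sketch of this procedure, assuming the Method 3 noise model in DN reconstructed above (the per-pixel sigma function is our assumption, not quoted from the report):

```python
import math
import random

DN_MAX = 2329                  # well capacity / conversion gain for the Aptina CMOS
CG, DC, NR = 1.8, 0.88, 4.42   # conversion gain [e/DN], dark current [e/s], read noise [e]
T_DARK = 0.0                   # dark-current integration time, taken as 0 as above

def sigma_dn(dn):
    # Assumed Method 3 noise in DN (see the general noise equation above).
    return math.sqrt(CG * dn + DC * T_DARK + NR ** 2) / CG

def apply_noise(image_dn):
    """Draw each pixel from a Gaussian, shift so no pixel is negative, clamp at DN_MAX."""
    noisy = [random.gauss(dn, sigma_dn(dn)) for dn in image_dn]
    lowest = min(noisy)
    if lowest < 0:
        noisy = [v + abs(lowest) for v in noisy]  # add |minimum| to every pixel
    return [min(v, DN_MAX) for v in noisy]        # reset values above DN_MAX

print(apply_noise([0, 10, 500, 2329]))
```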

