UCLA Extension

Applications, Design, and Testing of CMOS and CCD Sensors and Camera Systems

This course benefits scientists, engineers, and hardware managers who are involved in the selection, specification, and design of CMOS/CCD camera systems for a variety of applications. In recent years CMOS sensors have increasingly displaced CCD sensors as the preferred technology in many imaging applications (with “sensor” defined as the light-sensing focal plane chip). The course covers the advantages and disadvantages of CCD and CMOS sensor technologies and their fundamental engineering description. The role of CMOS sensors in providing highly integrated and flexible digital camera systems is highlighted. The basics of visible-sensor camera design and imaging performance are also described in detail, including metrics such as signal-to-noise ratio (SNR), modulation transfer function (MTF), and optical Q. The underlying principles are illustrated with design synthesis examples that span the range from multispectral satellite imaging to high-definition television (HDTV) color scanning. Requirements flow-down, camera and sensor specification, and corresponding verification by testing are also covered. Finally, future trends for CMOS sensor and camera technologies are discussed.

Complete Details

The course can help participants to:

  • Understand the basics of CCD sensor design and operation
  • Understand the basics of CMOS sensor design and operation
  • Compare primary architecture differences for CMOS and CCD sensor technologies
  • Analyze different CMOS imager pixel types, including their advantages and disadvantages, and compare their SNR performance
  • Survey trends and improvements in CMOS circuit techniques and advances in fabrication processes
  • Describe and characterize CMOS sensor performance metrics: quantum efficiency, noise, dark current, full well, modulation transfer function (MTF), and radiation hardness
  • Describe the camera system requirements flowdown process
  • Understand camera design basics and camera signal and noise optimization
  • Review image chain cascading of component modulation transfer functions (MTFs) (optics, pixel, diffusion, motion, etc.)
  • Visualize MTF and SNR trades and aliasing impact directly using processed images
  • Illustrate imaging system design synthesis with two detailed but very different examples: multispectral satellite imager and an HDTV telecine
  • Outline typical visible imager detailed hardware design specifications
  • Review CMOS/CCD experimental characterization methods and associated test hardware used for requirements verification
  • Understand the history of color imaging, including color television standards and the segue to digital television standards
  • Review basics of HDTV and the MPEG-2 compression algorithm
  • Describe the design and signal processing for single-chip color filter array approaches
  • Summarize digital still and video camera formats
  • Understand CMOS/CCD pixel and camera scaling relationships and future trends
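As a rough illustration of the single-pixel SNR comparisons listed above, the following sketch combines photon shot noise, dark-current shot noise, and read noise in quadrature. The function name and all numerical values are illustrative assumptions, not course material.

```python
import math

def pixel_snr(signal_e, dark_e, read_noise_e):
    """Single-pixel SNR: signal electrons over the root-sum-square of
    photon shot noise, dark-current shot noise, and read noise."""
    total_noise = math.sqrt(signal_e + dark_e + read_noise_e ** 2)
    return signal_e / total_noise

# Assumed example values: 10,000 signal e-, 50 dark e-, 5 e- rms read noise.
# Result lands just below the pure shot-noise limit of sqrt(10000) = 100.
snr = pixel_snr(10_000, 50, 5)
```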

Course Materials

The text, CMOS/CCD Sensor and Camera Systems, 2nd Edition, Gerald C. Holst and Terrence S. Lomheim (JCD Publishing and SPIE Press, 2011), and lecture notes are distributed on the first day of the course. The notes are for participants only and are not otherwise available for sale or unauthorized distribution.

Coordinator and Lecturer

Terrence S. Lomheim, PhD, Distinguished Engineer, Sensor Systems Subdivision, The Aerospace Corporation, El Segundo, California. For the past 34 years at The Aerospace Corporation, Dr. Lomheim has held both technical staff and management positions. He has performed detailed experimental evaluations of the electro-optical properties, imaging capabilities, and radiation-effects sensitivities of infrared and visible focal plane devices, and has been involved in the development of modeling tools used to predict instrument-level performance for advanced DoD visible and infrared point-source and imaging sensor systems. Dr. Lomheim has authored and coauthored 60 publications in the areas of visible and infrared focal plane technology, sensor design and performance, and applied optics. He is a part-time instructor in the physics department at California State University, Dominguez Hills, and regularly teaches technical short courses for the International Society for Optical Engineering (SPIE) and for the UCSB and UCLA Extension programs. He is a Fellow of SPIE.

Daily Schedule

Day 1

CMOS and CCD Sensors: Basic Description, Comparisons, Fundamental Performance Metrics

Introduction to the Basics of CCD and CMOS Sensors
Survey of CMOS pixel architectures (3T, 4T, 5T, etc.); CMOS imager architectures and the integration of on-chip analog-to-digital conversion; monolithic and hybrid architectures; front-side versus back-side illuminated CMOS sensors; power dissipation and performance metrics, including quantum efficiency and fill factor, analog signal chain noise, frame rates and line rates, time-delay and integration, dark current, linearity, modulation transfer function, and radiation-hardness/tolerance.

Signal-to-Noise, MTF, Image Quality, Constraints, Detailed Examples

Derivation of single-pixel signal and noise equations; definition of system noise (photons-in to bits-out); review of optical system scaling and sizing; modulation transfer function (MTF) basics: optical system, sensor pixel aperture, diffusion, temporal aperture, and scan velocity mismatch components; sampling and aliasing effects, including visual examples; image quality based on the combined use of MTF and signal-to-noise ratio; imaging figures of merit; flowdown of sensor system requirements to CMOS/CCD specifications; imaging sensor constraints and design trades; example of configuring a visible/near-IR multispectral CMOS/CCD image sensor based on a given set of system constraints; example of configuring an HDTV telecine film-scanning system, including the tri-color visible image sensor.
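The MTF cascade and optical-Q trade outlined above can be sketched numerically: the system MTF at a given spatial frequency is the product of the component MTFs. The snippet below multiplies a diffraction-limited optics MTF by a square-pixel sinc MTF at the sampling Nyquist frequency, using one common convention Q = λF#/p (under which Q = 2 places the optics cutoff at Nyquist). The wavelength, f-number, and pixel pitch are assumed example values.

```python
import math

def optics_mtf(f, wavelength_m, f_number):
    """Diffraction-limited MTF of a circular aperture at spatial
    frequency f (cycles/m)."""
    fc = 1.0 / (wavelength_m * f_number)   # optical cutoff frequency
    x = f / fc
    if x >= 1.0:
        return 0.0
    return (2.0 / math.pi) * (math.acos(x) - x * math.sqrt(1.0 - x * x))

def pixel_mtf(f, pitch_m):
    """Sinc MTF of a square pixel aperture equal to the pitch
    (100% fill factor assumed)."""
    x = math.pi * f * pitch_m
    return 1.0 if x == 0.0 else abs(math.sin(x) / x)

# Assumed example values: 550 nm light, f/8 optics, 5 um pixel pitch
wl, fn, p = 550e-9, 8.0, 5e-6
f_nyquist = 1.0 / (2.0 * p)          # sampling Nyquist frequency
q = wl * fn / p                      # optical Q; Q < 2 implies aliasing risk
system_mtf = optics_mtf(f_nyquist, wl, fn) * pixel_mtf(f_nyquist, p)
```

With these assumed values Q is well below 2, so frequencies above Nyquist still carry optical contrast and can alias, which is exactly the trade the processed-image examples in the course visualize.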

Day 2

Specifying and Measuring CMOS/CCD Image Sensors

Specification of performance parameters for visible CMOS/CCD sensors; methods for experimentally characterizing CMOS/CCD sensors, including quantum efficiency, response uniformity, nonlinearity, and color-dependent MTF using spot-scan and tilted knife-edge techniques; lab optical system constraints and choices; observation of spatial beat patterns; verification of wavelength-dependent effects; test plans and test equipment.
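The knife-edge MTF technique mentioned above can be illustrated in one dimension: differentiate the measured edge spread function (ESF) to get the line spread function (LSF), then take the magnitude of its Fourier transform and normalize at zero frequency. The sketch below uses a synthetic Gaussian-blurred edge as a stand-in for measured sensor data; real measurements would use a tilted edge to build a super-sampled ESF.

```python
import numpy as np

def mtf_from_edge(esf):
    """Estimate MTF from an edge spread function: differentiate to get
    the line spread function, then normalize the magnitude of its
    Fourier transform at zero frequency."""
    lsf = np.diff(esf)
    mtf = np.abs(np.fft.rfft(lsf))
    return mtf / mtf[0]

# Synthetic edge: cumulative sum of a Gaussian LSF (assumed test data)
x = np.arange(-64, 64)
lsf_true = np.exp(-(x / 4.0) ** 2)
esf = np.cumsum(lsf_true)
mtf = mtf_from_edge(esf)   # mtf[0] == 1.0, rolling off with frequency
```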

Color Imaging Using CMOS/CCD Image Sensors: Background and Fundamentals

NTSC color and colorimetry standards; how to use the CIE chromaticity diagram; definition of luminance and chrominance (hue and saturation) signals; brief historical review of TV signal basics and analog color encoding methods.
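The luminance/chrominance decomposition above can be made concrete with the NTSC YIQ transform; the coefficients below are the commonly quoted values, rounded to three decimals.

```python
def rgb_to_yiq(r, g, b):
    """NTSC luminance (Y) and chrominance (I, Q) from gamma-corrected
    RGB components in [0, 1]. Coefficients rounded to three decimals."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    i = 0.596 * r - 0.274 * g - 0.322 * b
    q = 0.211 * r - 0.523 * g + 0.312 * b
    return y, i, q

# White has full luminance and zero chrominance, as expected
y, i, q = rgb_to_yiq(1.0, 1.0, 1.0)
```

Note that the I and Q rows each sum to zero, which is why a neutral (gray) input carries no chrominance, the property that let color broadcasts remain compatible with monochrome receivers.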

Color Imaging Using CMOS/CCD Sensors: Broadcast Applications, Color Filter Arrays, Non-Broadcast Color Imaging, HDTV

Color camera processing for 3- and 2-chip color cameras (analog and digital); stripe and mosaic color filter arrays (CFAs); complementary and primary color filter arrays; CFA implications for camera design and signal processing, and their dependence on the CFA pattern; single-chip CFA cameras (digital and analog); color pixel interpolation; CFA patterns for frequency interleaving of chrominance signals; multi-color commercial, land-management, and biomedical applications; multispectral and hyperspectral imaging approaches and applications; HDTV standards and implications; review of HDTV development; MPEG-2 compression standard; imaging chips for HDTV applications.
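Color pixel interpolation for a mosaic CFA can be illustrated with the simplest bilinear step: estimating the missing green value at a red Bayer site from its four green neighbors. The array values below are arbitrary illustrative numbers, not real sensor output.

```python
import numpy as np

def bilinear_green_at_red(raw, r, c):
    """Estimate the missing green value at a red Bayer site (r, c) by
    averaging the four adjacent green samples (bilinear demosaicking)."""
    return (raw[r - 1, c] + raw[r + 1, c] + raw[r, c - 1] + raw[r, c + 1]) / 4.0

# Tiny synthetic RGGB mosaic (assumed values, just to exercise the function);
# (2, 2) is a red site whose green neighbors are 30, 30, 20, 20.
raw = np.array([[10, 20, 10, 20],
                [30, 40, 30, 40],
                [10, 20, 10, 20],
                [30, 40, 30, 40]], dtype=float)
g = bilinear_green_at_red(raw, 2, 2)
```

Practical demosaicking algorithms are more elaborate (edge-directed weighting, chrominance smoothing), precisely to suppress the color aliasing artifacts that this naive average produces near edges.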

CMOS/CCD Commercial Imager Formats and Trends

Digital still and video formats, optical mount and sensor dimensions, chroma sampling terminology, space-bandwidth product, camera and pixel scaling relationships, technology drivers, trends, yield and relative cost.

For more information contact the Short Course Program Office:
shortcourses@uclaextension.edu | (310) 825-3344 | fax (310) 206-2815