Sensor: an image sensor is a semiconductor chip that converts an optical image into electrical signals; its surface contains several million to tens of millions of photodiodes.
Pixel: the basic unit of a Sensor. An image is composed of pixels, and the pixel count indicates the number of photosensitive elements the camera contains.
Resolution: the maximum number of pixels an image can accommodate in the horizontal and vertical directions.
Pixel size: the physical size a single pixel represents in the length and width directions.
As the figure above illustrates: the pixel count is the total number of black squares in the image, 91; the resolution is the number of black squares along each dimension, here 13*7; and the pixel size is the physical size of each black square, usually measured in micrometers. When the image size is held constant, a larger pixel size means lower resolution and lower clarity.
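The relationships above reduce to simple arithmetic. The sketch below uses the 13*7 grid from the figure; the 3 µm pixel size is an assumed value for illustration:

```python
# Relationship between resolution, pixel count, and sensor size,
# using the 13x7 grid from the figure as an example.

width_px, height_px = 13, 7          # resolution: pixels across and down
pixel_size_um = 3.0                  # assumed pixel size in micrometers

total_pixels = width_px * height_px  # pixel count: 91, as in the figure
sensor_w_um = width_px * pixel_size_um   # physical sensor width
sensor_h_um = height_px * pixel_size_um  # physical sensor height

print(total_pixels)              # 91
print(sensor_w_um, sensor_h_um)  # 39.0 21.0
```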

Background: once people had sensors that could sense the intensity of light, they could only take black-and-white photos (grayscale images), because those sensors sensed only intensity, not color. The most direct way to obtain a color image was to add filters of different colors, and so the Bayer array was developed. It consists of red, green, and blue filters arranged alternately in a regular pattern: a filter of one of the RGB colors sits over each pixel, allowing only light of that color to pass through.
Origin: the Bayer array was invented in 1976 by Bryce Bayer, a scientist at Eastman Kodak, and it is still widely used in digital image processing today.
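As a rough sketch of how the filter array samples color, the snippet below applies an RGGB mask (one common Bayer arrangement) to an RGB image so that each pixel keeps only one channel; `bayer_mosaic` is an illustrative helper, not a real ISP function:

```python
import numpy as np

def bayer_mosaic(rgb):
    """Sample one color per pixel using an RGGB Bayer pattern.

    rgb: (H, W, 3) array. Returns an (H, W) single-channel mosaic.
    Pattern per 2x2 block:  R G
                            G B
    """
    h, w, _ = rgb.shape
    mosaic = np.zeros((h, w), dtype=rgb.dtype)
    mosaic[0::2, 0::2] = rgb[0::2, 0::2, 0]  # red sites
    mosaic[0::2, 1::2] = rgb[0::2, 1::2, 1]  # green sites
    mosaic[1::2, 0::2] = rgb[1::2, 0::2, 1]  # green sites
    mosaic[1::2, 1::2] = rgb[1::2, 1::2, 2]  # blue sites
    return mosaic

# A tiny 2x2 test image that is pure red everywhere: only the R site
# records a signal, because the G and B filters block red light.
img = np.zeros((2, 2, 3), dtype=np.uint8)
img[..., 0] = 255
print(bayer_mosaic(img))  # [[255 0], [0 0]]
```

Reconstructing a full-color image from such a mosaic (demosaicing) is then an interpolation step performed downstream.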



Human eye cells
In the human eye there are two types of visual cells: cone cells and rod cells.
Cone cells are further divided into three types: red-sensitive, green-sensitive (the most sensitive of the three), and blue-sensitive photoreceptors. They respond poorly at low illuminance; only when the light intensity reaches a certain level do cone cells function.
Rod cells are highly sensitive to light and can form images of objects in very dim lighting conditions, but they cannot sense colors.
This also explains why people can see objects at night but cannot effectively distinguish their colors.

The difference between CCD and CMOS
CCD (charge-coupled device): an image sensor built on single-crystal semiconductor material.
CMOS (complementary metal-oxide-semiconductor): an image sensor built on metal-oxide semiconductor material.
At present, camera image sensors in the security market are either CCD or CMOS. In the standard-definition surveillance era, both analog cameras and SD network cameras generally used CCD sensors, but in recent years CMOS has been eating into the CCD market, and in the high-definition era CMOS has gradually replaced CCD.
1. Information reading speed
The charge stored in a CCD must be shifted out step by step under the control of a synchronizing signal and then amplified by a single shared amplifier before ADC conversion; the transfer and readout require a clock-control circuit, so the overall circuit is relatively complex. A CMOS sensor performs amplification and analog-to-digital conversion directly at the light-sensing unit, which makes signal readout very simple and lets it process image information from every unit in parallel. The readout speed of CMOS is therefore faster than that of CCD.
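The scaling argument can be sketched with a toy step count (illustrative only, not real hardware timing): a CCD with one shared output amplifier and ADC needs roughly one transfer-and-convert step per pixel, while a CMOS sensor that digitizes a whole row in parallel (e.g. with per-column ADCs) needs roughly one step per row.

```python
# Toy readout-timing model for a 1080p sensor (illustrative only).

rows, cols = 1080, 1920

# CCD: charge is shifted out and digitized serially through one ADC,
# so the number of steps scales with the total pixel count.
ccd_steps = rows * cols

# CMOS with parallel column readout: one step per row.
cmos_steps = rows

print(ccd_steps)   # 2073600
print(cmos_steps)  # 1080
```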
2. Sensitivity
Because each pixel of a CMOS sensor contains additional circuitry (an amplifier and an A/D conversion circuit), the light-sensitive area occupies only a small part of each pixel's total area. Therefore, at the same pixel size, the sensitivity of a CMOS sensor is lower than that of a CCD sensor.
3. Noise
Since each photodiode in a CMOS sensor requires its own amplifier, a sensor measured in megapixels needs millions of amplifiers. Amplifiers are analog circuits, and it is difficult to keep the gain of every pixel consistent. Compared with a CCD, which has only one amplifier, the noise of a CMOS sensor therefore increases significantly, affecting image quality.
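A minimal simulation of this fixed-pattern effect, assuming a 1% per-pixel gain spread (an arbitrary illustrative figure): a uniform scene read out through one shared gain stays uniform, while slightly mismatched per-pixel gains imprint a fixed pattern on it.

```python
import numpy as np

rng = np.random.default_rng(0)

# Uniform scene: every pixel receives the same signal level.
signal = np.full((100, 100), 100.0)

# CCD-style readout: one shared amplifier -> one gain for all pixels.
ccd_image = signal * 1.0

# CMOS-style readout: one amplifier per pixel, each with a slightly
# different gain (1% spread assumed here for illustration).
cmos_gains = rng.normal(loc=1.0, scale=0.01, size=signal.shape)
cmos_image = signal * cmos_gains

# The per-pixel gain mismatch shows up as fixed-pattern noise.
print(round(ccd_image.std(), 3))   # 0.0
print(round(cmos_image.std(), 3))  # roughly 1.0
```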
4. Power consumption
CMOS sensors acquire the image actively: the charge generated by the photodiode is amplified and converted directly by the adjacent circuit. CCD acquisition is passive: an external voltage, usually 12 to 18 V, must be applied to move the charge out of each pixel, so a CCD also demands careful power-supply design and voltage tolerance. The high drive voltage makes the power consumption of CCD much higher than that of CMOS.
5. Cost
Because CMOS sensors use the MOS process most common in general semiconductor circuits, peripheral circuits (such as timing control, CDS, and ISP) can easily be integrated onto the sensor chip, saving the cost of peripheral chips. A CCD transfers data by charge transfer: if even one pixel fails to operate, the entire row of data cannot be transferred, so CCD yield is relatively low. Moreover, its manufacturing process is complex and mastered by only a few manufacturers. This is why CCD costs are high.
Shutter speed
The shutter is a device used to control the exposure time and is an important component of a camera. Its structure, form and function are important factors in measuring the grade of a camera. Both CCD and CMOS image sensors use electronic shutters, including global shutters and rolling shutters.
Global Shutter: All pixels of the Sensor collect light simultaneously and expose simultaneously. That is, at the beginning of the exposure, the Sensor begins to collect light. At the end of the exposure, the light collection circuit is cut off, and then the Sensor value is read as one frame.
All pixels are exposed at the same moment, similar to freezing a moving object, so it is suitable for shooting fast-moving objects.
Rolling Shutter: the Sensor exposes progressively. At the start of the exposure it scans and exposes line by line until all pixels have been exposed. All of this completes in an extremely short time, but different rows begin their exposure at different moments.
Because exposure is line-by-line and sequential, a rolling shutter is not well suited to shooting moving objects. If the subject or the camera moves rapidly during capture, the result is very likely to show skew, wobble, or partial exposure.
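The skew artifact can be reproduced with a toy simulation (all names here are illustrative): a vertical bar moving sideways is sampled row by row, and because each row is exposed at a later time, the straight bar comes out tilted.

```python
import numpy as np

def rolling_shutter_capture(frame_at, rows=8, cols=16, line_time=1.0):
    """Simulate a rolling shutter: each row is sampled at a later time.

    frame_at(t) returns the full (rows, cols) scene at time t; row r of
    the output is taken from the scene at time r * line_time.
    """
    out = np.zeros((rows, cols), dtype=int)
    for r in range(rows):
        out[r] = frame_at(r * line_time)[r]
    return out

def moving_bar(t, rows=8, cols=16, speed=1):
    """A vertical bar that moves right by `speed` columns per time unit."""
    scene = np.zeros((rows, cols), dtype=int)
    scene[:, int(t * speed) % cols] = 1
    return scene

# A global shutter would see a straight vertical bar; the rolling
# shutter tilts it into a diagonal, because lower rows are exposed
# later, after the bar has already moved on.
print(rolling_shutter_capture(moving_bar))
```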
The development trend of CMOS
1. Low-light effect
The move from the traditional front-side-illuminated (FSI) CMOS sensor to the back-side-illuminated (BSI) CMOS sensor is a major technological leap. The key optimization in a BSI sensor is its internal structure: the orientation of the light-sensitive layer is reversed so that light enters directly from the back. This avoids the wiring and transistors that sit between the microlens and the photodiode in the traditional CMOS structure, significantly raising light-gathering efficiency and greatly improving results in low-light conditions. BSI sensors represent a qualitative leap in sensitivity over traditional CMOS sensors, so their focusing ability and image quality under low illumination are greatly improved.

2. Noise suppression
On the one hand, dedicated noise-detection algorithms are integrated directly into the control logic of the CMOS image sensor; this technique can eliminate fixed-pattern noise. On the other hand, various techniques are adopted in the ISP, such as denoising, to mitigate the noise problems of CMOS.
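As a minimal stand-in for ISP spatial denoising (real ISPs use far more sophisticated filters), even a plain 3x3 mean filter reduces random noise on a flat patch; `box_denoise` below is an illustrative helper, not a production algorithm.

```python
import numpy as np

def box_denoise(img):
    """Minimal 3x3 mean filter as a stand-in for ISP spatial denoising."""
    h, w = img.shape
    padded = np.pad(img.astype(float), 1, mode="edge")
    out = np.zeros((h, w), dtype=float)
    # Sum the 3x3 neighborhood of every pixel via shifted slices.
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            out += padded[1 + dr : 1 + dr + h, 1 + dc : 1 + dc + w]
    return out / 9.0

rng = np.random.default_rng(1)
clean = np.full((64, 64), 50.0)             # flat gray patch
noisy = clean + rng.normal(0, 5, clean.shape)  # additive random noise

# Averaging neighboring pixels lowers the noise standard deviation.
print(round(noisy.std(), 2), round(box_denoise(noisy).std(), 2))
```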
3. High integration
High integration is one of the major advantages of CMOS sensors: circuits with other functions can be integrated into the sensor itself. For example, the OV10633 is a 720p HD wide-dynamic-range sensor that integrates WDR and ISP image-signal-processing functions on the same chip as the image sensor.