An event camera, also known as a neuromorphic camera, silicon retina, or dynamic vision sensor, is an imaging sensor that responds to local changes in brightness. Event cameras do not capture images using a shutter as conventional (frame) cameras do. Instead, each pixel inside an event camera operates independently and asynchronously, reporting changes in brightness as they occur, and staying silent otherwise.


Functional description

Event camera pixels independently respond to changes in brightness as they occur. Each pixel stores a reference brightness level and continuously compares it to the current brightness level. If the difference exceeds a threshold, the pixel resets its reference level and generates an event: a discrete packet that contains the pixel address and a timestamp. Events may also contain the polarity (increase or decrease) of the brightness change, or an instantaneous measurement of the illumination level, depending on the specific sensor model. Event cameras thus output an asynchronous stream of events triggered by changes in scene illumination.

Event cameras typically report timestamps with microsecond temporal resolution, 120 dB dynamic range, and less under/overexposure and motion blur than frame cameras. This allows them to track object and camera movement (optical flow) more accurately. They yield grey-scale information. Initially (2014), resolution was limited to 100 pixels; a later device reached 640x480 resolution in 2019. Because individual pixels fire independently, event cameras appear suitable for integration with asynchronous computing architectures such as neuromorphic computing. Pixel independence also allows these cameras to cope with scenes containing both brightly and dimly lit regions without having to average across them. Although the camera reports events with microsecond resolution, the actual temporal resolution (or, alternatively, the sensing bandwidth) is on the order of tens of microseconds to a few milliseconds, depending on signal contrast, lighting conditions, and sensor design.
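As an illustration of the per-pixel logic described above, the following sketch simulates event generation from a sequence of synchronous intensity frames. It is a simplification under stated assumptions: real pixels operate asynchronously in continuous time and, in most designs, compare log intensity; the function name, event format (t, x, y, polarity), and parameter values are illustrative only.

import numpy as np

def generate_events(frames, timestamps, threshold=0.2, eps=1e-6):
    """Idealized DVS model: each pixel keeps a reference log-brightness
    and emits an event (t, x, y, polarity) whenever the current
    log-brightness differs from the reference by more than `threshold`,
    then resets its reference. Real sensors are asynchronous and have
    noise, latency, and refractory effects not modelled here."""
    log_ref = np.log(frames[0].astype(np.float64) + eps)
    events = []
    for frame, t in zip(frames[1:], timestamps[1:]):
        log_cur = np.log(frame.astype(np.float64) + eps)
        diff = log_cur - log_ref
        fired = np.abs(diff) >= threshold
        ys, xs = np.nonzero(fired)
        for x, y in zip(xs, ys):
            events.append((t, int(x), int(y), 1 if diff[y, x] > 0 else -1))
        log_ref[fired] = log_cur[fired]  # reset reference only where events fired
    return events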


Types

Temporal contrast sensors (such as the DVS (Dynamic Vision Sensor) or sDVS (sensitive DVS)) produce events that indicate polarity (increase or decrease in brightness), while temporal image sensors indicate the instantaneous intensity with each event. The DAVIS (Dynamic and Active-pixel Vision Sensor) contains a global-shutter active pixel sensor (APS) in addition to the dynamic vision sensor (DVS), both sharing the same photosensor array. It can therefore produce image frames alongside events. Many event cameras additionally carry an inertial measurement unit (IMU).
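The differences between these sensor families are reflected directly in the records they emit. A minimal sketch of the corresponding data structures, assuming illustrative field names (actual output formats vary by vendor):

from dataclasses import dataclass

@dataclass
class PolarityEvent:      # temporal contrast sensor (DVS / sDVS)
    t: float              # timestamp in seconds (microsecond resolution)
    x: int                # pixel column
    y: int                # pixel row
    polarity: int         # +1 for brightness increase, -1 for decrease

@dataclass
class IntensityEvent:     # temporal image sensor
    t: float
    x: int
    y: int
    intensity: float      # instantaneous illumination measurement

# A DAVIS-style device interleaves polarity events with conventional
# global-shutter frames, and often IMU samples, on the same output stream.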


Retinomorphic sensors

Another class of event sensors are so-called ''retinomorphic'' sensors. While the term ''retinomorphic'' has been used to describe event sensors generally, in 2020 it was adopted as the name for a specific sensor design based on a resistor and a photosensitive capacitor in series. These capacitors are distinct from photocapacitors, which are used to store solar energy; instead, they are designed to change capacitance under illumination. They (dis)charge slightly when the capacitance changes, but otherwise remain in equilibrium. When a photosensitive capacitor is placed in series with a resistor and an input voltage is applied across the circuit, the result is a sensor that outputs a voltage when the light intensity changes, but otherwise does not. Unlike other event sensors (typically built from a photodiode and additional circuit elements), these sensors produce the signal inherently; they can hence be considered a single device that produces the same result as a small circuit in other event cameras. Retinomorphic sensors have to date only been studied in a research environment.
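The behaviour described above can be made explicit with the circuit equations for the series pair. The following is a sketch, assuming an input voltage V_in applied across the resistor and capacitor in series and the output V_out read across the resistor (symbols and sign conventions are illustrative):

\[
V_{\mathrm{out}} = V_R = R\,\frac{dQ}{dt}
  = R\left(C\,\frac{dV_C}{dt} + V_C\,\frac{dC}{dt}\right),
\qquad Q = C(P)\,V_C, \quad V_{\mathrm{in}} = V_R + V_C,
\]

where the capacitance C(P) depends on the incident light power P. Under steady illumination, dC/dt = 0 and the capacitor reaches equilibrium (dV_C/dt = 0), so V_out = 0; a change in illumination makes dC/dt nonzero and produces a transient output voltage, which constitutes the event signal.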


Algorithms


Image reconstruction

Image reconstruction from events has the potential to create images and video with high dynamic range, high temporal resolution, and reduced motion blur. Image reconstruction can be achieved using temporal smoothing, e.g. a high-pass or complementary filter. Alternative methods include optimization and gradient estimation followed by Poisson integration. It has also been shown that the image of a static scene can be recovered from noise events alone, by analyzing their correlation with scene brightness.
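As an example of the temporal-smoothing approach mentioned above, the following sketch reconstructs a log-intensity image with a per-pixel leaky integrator (a simple high-pass filter). It assumes events of the form (t, x, y, polarity); the function name and parameter values are illustrative.

import numpy as np

def reconstruct_high_pass(events, shape, contrast=0.2, cutoff=5.0):
    """Per-pixel high-pass reconstruction: each event adds a signed
    contrast step to its pixel's log-intensity estimate, which decays
    exponentially toward zero between updates (cutoff in Hz)."""
    log_image = np.zeros(shape, dtype=np.float64)
    last_t = np.zeros(shape, dtype=np.float64)
    for t, x, y, p in events:
        # decay the estimate for the time elapsed since the last update
        log_image[y, x] *= np.exp(-cutoff * (t - last_t[y, x]))
        # integrate the event as a signed contrast step
        log_image[y, x] += contrast * p
        last_t[y, x] = t
    return log_image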


Spatial convolutions

The concept of spatial event-driven convolution was postulated in 1999 (before the DVS), but was later generalized during the EU project CAVIAR (during which the DVS was invented) by projecting, event by event, an arbitrary convolution kernel around the event coordinate in an array of integrate-and-fire pixels. Extension to multi-kernel event-driven convolutions allows for event-driven deep convolutional neural networks.
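A minimal sketch of the event-driven convolution scheme described above: each incoming event stamps the kernel, centred at its coordinate, onto an array of integrate-and-fire accumulators, which in turn emit output events when they cross a threshold. Function and parameter names are illustrative, and the firing check is left unoptimized for clarity.

import numpy as np

def event_driven_convolution(events, kernel, shape, threshold=1.0):
    """Project the kernel around each event coordinate into an
    integrate-and-fire array; accumulators that cross the threshold
    emit an output event and reset."""
    state = np.zeros(shape, dtype=np.float64)
    kh, kw = kernel.shape
    out_events = []
    for t, x, y, p in events:
        # portion of the state array covered by the (clipped) kernel
        y0, y1 = max(0, y - kh // 2), min(shape[0], y + kh // 2 + 1)
        x0, x1 = max(0, x - kw // 2), min(shape[1], x + kw // 2 + 1)
        ky0, kx0 = y0 - (y - kh // 2), x0 - (x - kw // 2)
        state[y0:y1, x0:x1] += p * kernel[ky0:ky0 + (y1 - y0),
                                          kx0:kx0 + (x1 - x0)]
        # integrate-and-fire: emit an output event where the threshold is crossed
        fired = np.abs(state) >= threshold
        for fy, fx in zip(*np.nonzero(fired)):
            out_events.append((t, int(fx), int(fy), int(np.sign(state[fy, fx]))))
        state[fired] = 0.0
    return out_events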


Motion detection and tracking

Segmentation and detection of moving objects viewed by an event camera can seem to be a trivial task, as it is done by the sensor on-chip. However, these tasks are difficult, because events carry little information and do not contain useful visual features like texture and color. They become even more challenging with a moving camera, because events are then triggered everywhere on the image plane, produced both by moving objects and by the static scene (whose apparent motion is induced by the camera's ego-motion). Recent approaches to this problem include motion-compensation models and traditional clustering algorithms.
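One widely used motion-compensation idea is to warp events along a candidate motion to a common reference time and score the candidate by the contrast (sharpness) of the resulting event image; the motion that maximizes the contrast best explains the events. A minimal sketch under that assumption, with a single global flow vector and illustrative names:

import numpy as np

def warped_event_contrast(events, flow, shape, t_ref=0.0):
    """Motion-compensate events (t, x, y, polarity) with a candidate
    optical flow (vx, vy) in pixels per second, accumulate them into an
    image at the reference time, and return the image variance as a
    contrast score to be maximized over `flow`."""
    vx, vy = flow
    image = np.zeros(shape, dtype=np.float64)
    for t, x, y, p in events:
        # warp the event back to the reference time along the flow
        xw = int(round(x - vx * (t - t_ref)))
        yw = int(round(y - vy * (t - t_ref)))
        if 0 <= xw < shape[1] and 0 <= yw < shape[0]:
            image[yw, xw] += p
    return image.var()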


Potential applications

Potential applications include most tasks classically addressed with conventional cameras, with an emphasis on machine vision (such as object recognition, autonomous vehicles, and robotics). The US military is considering infrared and other event cameras because of their lower power consumption and reduced heat generation. Given these advantages over conventional image sensors, event cameras are considered well suited to applications requiring low power consumption and low latency, and to situations where it is difficult to stabilize the camera's line of sight. Such applications include the aforementioned autonomous systems, as well as space imaging, security, defense, and industrial monitoring. Research into color sensing with event cameras is underway, but they are not yet practical for applications requiring color sensing.


See also

* Neuromorphic engineering
* Retinomorphic sensor
* Rolling shutter

