Use of Form, Motion and Context for Object Classification in 3D Point Clouds

A core capability required of an autonomous vehicle is the robust and accurate detection, tracking, and classification of moving objects. In the preceding research project, the form and motion of an object were used for classification: objects were assigned to a class by hand, and a model containing form and motion information was created on this basis. In this follow-up project, the model is to be created automatically, so that different classes of moving objects can be learned quickly. To make this possible, to separate objects into distinct object classes, and at the same time to improve classification and motion prediction, the context in which the objects move is to be considered in addition to form and motion.

How informative context can be is illustrated by humans, who use it naturally through implicit rules known from experience: a pedestrian often moves along a pedestrian crossing, a footpath often runs parallel to the road, and cars move along that road.

In this project, therefore, a method is to be developed that additionally learns the context in which an object is located. A key point is that this should happen automatically, based on the already available knowledge about an object's shape and motion, without fixed rules being imposed on the system. The expected benefit is more robust classification and prediction of moving objects, even in difficult situations, for example when the shape is not fully observable due to occlusion by other objects or self-occlusion, or when motion cues cannot be used because a previously moving object has come to a standstill. By combining context with form and motion, processing should remain possible in these situations as well.
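To make the intended combination of cues concrete, the minimal sketch below fuses simple shape, motion, and context descriptors into one feature vector and trains an off-the-shelf classifier on synthetic data. This is only an illustration under assumptions of ours: the descriptors (bounding-box extents, speed statistics, distances to a road and a crossing), the synthetic class statistics, and the random-forest classifier are not taken from the project, whose stated goal is to learn context automatically rather than hand-pick such features.

# Illustrative sketch only: fuses hypothetical shape, motion, and context
# features for object classification. All descriptors, class statistics,
# and the classifier choice are assumptions for illustration.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

def shape_features(points):
    # Simple shape descriptor: bounding-box extents plus point count.
    extents = points.max(axis=0) - points.min(axis=0)
    return np.append(extents, len(points))

def motion_features(track):
    # Simple motion descriptor: speed statistics and heading variance
    # of a short 2D track.
    vel = np.diff(track, axis=0)
    speed = np.linalg.norm(vel, axis=1)
    heading = np.arctan2(vel[:, 1], vel[:, 0])
    return np.array([speed.mean(), speed.std(), heading.var()])

def context_features(road_dist, crossing_dist):
    # Hypothetical context descriptor: distance to the nearest road
    # and to the nearest pedestrian crossing.
    return np.array([road_dist, crossing_dist])

# Synthetic training data for two classes: pedestrian (0) vs. car (1).
X, y = [], []
for label in (0, 1):
    for _ in range(50):
        n = rng.integers(50, 200)
        scale = 0.7 if label == 0 else 2.0      # pedestrians are smaller
        points = rng.normal(size=(n, 3)) * scale
        speed = 1.4 if label == 0 else 10.0     # rough m/s assumption
        track = np.cumsum(rng.normal(speed, 0.5, size=(20, 2)), axis=0)
        road_dist = rng.uniform(2, 8) if label == 0 else rng.uniform(0, 1)
        crossing_dist = rng.uniform(0, 5) if label == 0 else rng.uniform(5, 30)
        X.append(np.concatenate([
            shape_features(points),
            motion_features(track),
            context_features(road_dist, crossing_dist),
        ]))
        y.append(label)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print("training accuracy:", clf.score(X, y))

The point of the fusion is visible in the feature vector: when occlusion degrades the shape descriptor, or a standstill removes the motion cue, the context components can still separate the classes, which is the robustness argument made above.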

DFG project page