AXIS Object Analytics

Solution overview

About the application

AXIS Object Analytics detects objects that move in the scene and classifies them as, for example, humans or vehicles. You can set up the application to send alarms for either humans or vehicles, or both. You can choose different conditions for sending alarms, such as movement within a predefined area, or line crossing. The alarm can be used by Axis network video devices or third-party software to, for example, record video, play an audio message, or alert security staff.

Mount the camera

This image illustrates an appropriately mounted camera.

  1. Mounting height
  2. Tilt
  3. Detection area
  4. Minimum detection distance
  5. Maximum detection distance
  6. Field of view distance
  7. Field of view elevation

Consider the following when you mount the camera:

Mounting position

If you mount the camera so that it looks down on the scene from directly above, the application has difficulty classifying objects.

Tilt

The camera must be sufficiently oriented towards the ground so that the center of the image is below the horizon. Mount the camera so that the minimum detection distance is longer than half of the camera’s mounting height (minimum detection distance > camera mounting height / 2).
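The mounting rule above can be sketched as a simple check. The function name and the example values are illustrative, not part of the product:

```python
# Sketch: check the tilt recommendation under example values.
# Rule from the text: minimum detection distance > mounting height / 2.

def mounting_ok(mounting_height_m: float, min_detection_distance_m: float) -> bool:
    """Return True if the minimum detection distance satisfies the rule."""
    return min_detection_distance_m > mounting_height_m / 2

# Example: a camera mounted at 6 m needs its closest detections
# to start more than 3 m away.
print(mounting_ok(6.0, 4.0))  # True: 4 m > 3 m
print(mounting_ok(6.0, 2.5))  # False: 2.5 m <= 3 m
```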

Maximum detection distance

The maximum detection distance depends on:

  • Camera type and model

  • Camera lens. A higher focal range allows for a longer detection distance.

  • Weather. For example, heavy rain or snow can affect the detection distance and accuracy.

  • Light. Detection accuracy and range can be affected by insufficient illumination.

  • Camera load

Object height

For a human to be detected, the height must be at least 48 pixels in a video stream with a height of 1080 pixels. This equals about 4% of the total image height for any stream height. For a vehicle, the height must be at least 28 pixels, or about 3% of the total image height.
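The reference figures above (48 px for humans and 28 px for vehicles, relative to a 1080 px high stream) can be scaled to other stream heights. This sketch is illustrative; the function name is not part of the product:

```python
# Sketch: minimum pixel height an object needs at a given stream height,
# scaled from the reference figures in the text (48 px human, 28 px vehicle,
# both defined for a 1080 px high stream).
import math

def min_pixel_height(stream_height_px: int, ref_px: int, ref_height_px: int = 1080) -> int:
    """Scale a reference pixel height (defined at ref_height_px) to another stream height."""
    return math.ceil(stream_height_px * ref_px / ref_height_px)

print(min_pixel_height(1080, 48))  # 48 px — human minimum at 1080p
print(min_pixel_height(720, 48))   # 32 px — human minimum at 720p
print(min_pixel_height(720, 28))   # 19 px — vehicle minimum at 720p
```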

Roll

The camera’s roll angle must be nearly equal to zero, which means that the image should be level with the horizon.

Field of view

The camera’s field of view must be fixed.

Vibrations

The application tolerates small camera vibrations, but you get the best performance when the camera is not subject to vibrations.

Object visibility

Detection accuracy can be affected if objects are only partially visible due to, for example, foliage. It’s particularly important that characteristic features, such as legs or wheels, are visible.

Contrast

  • There needs to be a certain level of contrast between objects and the background. You can increase the level of illumination and adjust the image settings to improve the level of contrast.
  • When you use a day-and-night camera with artificial lighting, we recommend at least 50 lux in the entire detection area.

  • When you use built-in IR lighting, the maximum detection distance depends on the camera and the environment.

Expected movement of objects in the scene

Objects that approach the camera in a straight line need to move for a longer time before the application detects them, compared to objects that move across the field of view.

Recommended image settings

Before you start to use the application, we recommend that you turn on Forensic WDR and barrel distortion correction, if they are available for your camera.

The image to the right is an example of barrel distortion. Barrel distortion is a lens effect where straight lines appear increasingly bent closer to the edges of the frame.

Get started

  1. Log in to the product’s webpage as an administrator and go to Settings > Apps > AXIS Object Analytics.

  2. Select the application.

  3. Start the application and click Open.

  4. In the welcome screen, click Step-by-step to follow the recommended setup procedure.

  5. In step 1, read through the considerations.

  6. In step 2, select if you want the application to trigger alarms for objects classified as humans, vehicles, or both. Read more about Classification of objects.

  7. Select if you want the application to trigger alarms when objects move inside a defined area or when they cross a defined line. To learn more, see Object in area and Line crossing.

  8. For PTZ cameras, you can choose to restrict detection to a specific preset position. Select it from the list.

  9. Adjust the default line or area that triggers alarms.

    To find out how to adjust the default line or include area, see Adjust virtual line or area.

  10. In step 4, verify your settings.

You have now created one scenario. To rename it or change other settings, click Open.

To create more scenarios, click +.

Create scenario: object in area
Create scenario: line crossing

Adjust virtual line or area

  • To reshape a virtual line or area, click and drag one of the anchor points.

  • To move a virtual line or area, click and drag.

  • To remove a corner, right-click the corner.

Virtual line

  • To reset the virtual line to its default size, click Scene > Reset line.

  • To change the direction that objects trigger alarms, click Scene > Change trigger direction. The red arrows next to the line show the current direction. Alarms trigger when objects cross the line in the direction of the arrows.

Area

  • To reset the include area to its default size, click Scene >.

  • To create an area inside the include area where you don’t want alarms to trigger, click Scene > Add exclude area.

Additional settings

Modify a scenario

To modify a scenario, click Scenarios and click Open in the scenario card.

  • To change the scenario name, click .

  • To change which types of objects trigger alarms, click Triggering objects.

  • Note

    If you select Any motion, the application doesn’t classify objects. Instead, alarms trigger whenever something moves in the scene. It may, for example, be animals, swaying foliage, flags, or shadows. To ignore certain object types, you can use filters. For more information, see Filters.

  • To adjust the virtual line or area, click Scene.

Calibrate perspective

If the scene has a significant depth, you need to calibrate the perspective to remove false alarms due to small objects. During calibration, the application compares the height of the objects as they appear in the image with the actual heights of the corresponding physical objects. The application uses the calibrated perspective to calculate the object size.

Place vertical bars in the image to calibrate perspective. The bars represent physical objects at different distances from the camera.

  1. Go to Settings > Advanced > Perspective and click +.

  2. In the live view, choose three objects of known height, for example humans or fence poles, that are located on the ground and at different distances from the camera.

  3. Place one bar at each object and adjust the length of each bar to the height of the object.

  4. For each bar, enter the corresponding object’s height.

  5. Select the scenarios you want to apply the perspective to.

  6. Click Save.

Example

If there is a fence with 2 meter high poles extending from the camera towards the horizon, position the bars at the fence poles, adjust their lengths and enter 200 cm (6 ft 7 in) in the fields.
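The idea behind perspective calibration can be illustrated with a sketch. This is not the application's actual algorithm: each calibration bar yields a scale (pixels per centimeter) at its row in the image, and interpolating between bars estimates how tall a real object of known size should appear at any row. All values are example data:

```python
# Sketch of the idea behind perspective calibration (illustrative only).
# Each tuple: (image row of the bar's base, bar height in px, real height in cm)
bars = sorted([(400, 55, 200), (600, 110, 200), (900, 220, 200)])

def scale_at(row: float) -> float:
    """Pixels-per-cm at a given image row, linearly interpolated between bars."""
    pts = [(r, px / cm) for r, px, cm in bars]
    if row <= pts[0][0]:
        return pts[0][1]   # clamp below the calibrated range
    if row >= pts[-1][0]:
        return pts[-1][1]  # clamp above the calibrated range
    for (r0, s0), (r1, s1) in zip(pts, pts[1:]):
        if r0 <= row <= r1:
            return s0 + (s1 - s0) * (row - r0) / (r1 - r0)

def expected_px_height(row: float, real_height_cm: float) -> float:
    """Expected apparent height of an object of known real size at a given row."""
    return scale_at(row) * real_height_cm

# A 200 cm object at row 750 should appear between 110 and 220 px.
print(round(expected_px_height(750, 200)))  # 165 px with these example bars
```

An object whose apparent height is far below this expectation, for example a small animal close to the camera, can then be dismissed as too small.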

Add burnt-in metadata overlays to video streams

To show what triggered an alarm in the live and recorded video stream, turn on metadata overlay. When you turn on metadata overlay:

  • A rectangle is shown around objects that trigger alarms.

  • The area or line of the scenario that triggers the alarm is shown.

If several scenarios trigger at the same time, overlays are shown for all of them in all streams with the selected resolution.

Important

Metadata overlays are burnt into video streams with the selected resolution. You can’t remove them from recorded video.

Note

If you use view areas, the metadata overlays only appear in the first view area. The default name of the first view area is View area 1.

  1. In the application’s webpage, go to Settings > Metadata overlay.

  2. Select in which resolution burnt-in metadata overlays should appear.

    You can only select one resolution and it will be applied to all scenarios.

Restrict detection to a PTZ preset position

For PTZ cameras, you can restrict detection to a specific preset position.

  1. Go to Scenarios and click Open in a scenario card, or click + to create a new scenario.

  2. Click Scene and select a preset position from the list.

Note

Each time the preset position changes, the application needs to recalibrate. We recommend that you wait at least 15 seconds before you change between preset positions in a guard tour.

Learn more

Classification of objects

The application can classify two types of objects: humans and vehicles. For cameras with deep learning, vehicles can be further categorized into buses, cars, bikes, and trucks. The application shows a rectangle around classified objects.

Objects classified as vehicles get a blue rectangle, and objects classified as humans get a red rectangle.

For the best possible results:

  • at some point, the entire object needs to be visible in the scene

  • the object needs to be in motion within the scene for at least 2 seconds

  • for cameras with machine learning, humans need to move in a somewhat upright position

  • the upper body of a human needs to be visible

  • objects need to stand out from the background

Object in area

When you use the trigger condition Object in area, the application triggers when objects move inside a defined area. This area is called an include area.

Include area

The include area is an area in which selected object types trigger alarms. Objects can trigger alarms even if only a part of the object is inside the include area. The application ignores moving objects that are outside the include area.

Reshape and resize the area so that it only covers the part of the image in which moving objects should be detected. The default rectangle can be changed to a polygon with up to 10 corners.

Recommendation

If there’s a busy road or sidewalk close to the include area, avoid drawing the include area so close to the road or sidewalk that objects outside it accidentally trigger alarms.

Exclude areas

An exclude area is an area inside the include area in which selected object types do not trigger alarms. Use exclude areas if there are areas inside the include area that trigger a lot of unwanted alarms. You can create up to 5 exclude areas.

Move, reshape, and resize the area so that it covers the desired part of the image. The default rectangle can be changed to a polygon with up to 10 corners.

Recommendation

Place exclude areas inside the include area. Use exclude areas to cover areas where you don’t want alarms to trigger.

Line crossing

When you use the trigger condition Line crossing, the application triggers when objects cross a virtually defined line.

The virtual line is a yellow line in the image. Objects of the selected type that cross the line in a certain direction trigger alarms. The red arrows on the line show the current direction. Alarms trigger when objects cross the line in the direction indicated by the arrows.

To trigger an alarm, the object must pass the line. As the image illustrates, the lower part of the object must pass the line for an alarm to trigger. Objects that only touch the line do not trigger alarms.

  • In the left image, the man does not trigger an alarm, as his lower body has not yet passed the line.

  • In the right image, the man triggers an alarm, as his lower body has passed the line.
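The directional crossing check described above can be sketched with 2D cross products. This is an illustration of the concept, not the application's implementation; it tracks the bottom-center point of an object's bounding box:

```python
# Sketch (illustrative only): a directional line-crossing check on the
# bottom-center point of an object's bounding box. The sign of the 2D cross
# product tells which side of the virtual line a point is on; a crossing is
# a side change in the armed direction.

def side(a, b, p):
    """>0 if p is left of the line a->b, <0 if right, 0 if on the line."""
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

def crosses(a, b, prev_pt, cur_pt, trigger_from_right=True):
    """True if the tracked point moved across the line in the armed direction."""
    s_prev, s_cur = side(a, b, prev_pt), side(a, b, cur_pt)
    if trigger_from_right:
        return s_prev < 0 < s_cur  # moved from the right side to the left side
    return s_cur < 0 < s_prev

line_a, line_b = (0, 0), (10, 0)
print(crosses(line_a, line_b, (5, -1), (5, 1)))   # True: crossed right -> left
print(crosses(line_a, line_b, (5, -2), (5, -1)))  # False: still on one side
```

Merely touching the line (the cross product reaching zero) does not flip the sign, which matches the behavior described above.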

Virtual line recommendations

Adjust the virtual line so that objects can’t enter the protected area without passing the line and so that the application can detect objects before they cross the line.

Integration

Set up alarms in AXIS Camera Station

This example explains how to set up a rule in AXIS Camera Station to alert the operator and record video that includes metadata overlays when AXIS Object Analytics triggers an alarm.

Before you start

Add the camera to AXIS Camera Station

  1. In AXIS Camera Station, add the camera. See the user manual for AXIS Camera Station.

Create a device event trigger

  1. Click and go to Configuration > Recording and events > Action rules and click New.

  2. Click Add to add a trigger.

  3. Select Device event from the list of triggers and click Ok.

  4. In the Configure device event trigger section:

    • In Device, select the camera.

    • In Event, select one of the scenarios for AXIS Object Analytics.

    • In Trigger period, set an interval time between two successive triggers. Use this function to reduce the number of successive recordings. If an additional trigger occurs within this interval, the recording continues and the trigger period starts over from that point in time.

  5. In Filters, set active to Yes.

  6. Click Ok.

Create actions to raise alarms and record video

  1. Click Next.

  2. Click Add to add an action.

  3. Select Raise alarm from the list of actions and click Ok.

  Note

    The alarm message is what the operator sees when an alarm is raised.

  4. In the Alarm message section, enter an alarm title and description.

  5. Click Ok.

  6. Click Add to add another action.

  7. Select Record from the list of actions and click Ok.

  8. In the list of cameras, select the camera to use for recording.

  Important

    To include metadata overlays in the recording, make sure you select a profile with the same resolution as the one selected for metadata overlays in the application.

  9. Select a profile and set the prebuffer and postbuffer.

  10. Click Ok.

Specify when the alarm is active

  1. Click Next.

  2. If you only want the alarm to be active during certain hours, select Custom schedule.

  3. Select a schedule from the list.

  4. Click Next.

  5. Enter a name for the rule.

  6. Click Finish.

Note

To see the metadata overlays in the live view, make sure you select the streaming profile that matches the one you set in the application.

Record video when there is an alarm

The following example explains how to set up the Axis device to record video to an SD card when the application triggers an alarm.

  1. In the product’s webpage, go to Settings > Apps and make sure the application is running.

  2. To check that the SD card is mounted, go to Settings > System > Storage.

  3. Go to Settings > System > Events and add a rule.

  4. Type a name for the rule.

  5. In the list of conditions, under Applications, select the application scenario. To trigger the same action for all scenarios, select Any Scenario.

  6. In the list of actions, under Recordings, select Record video.

  7. Select an existing stream profile or create a new one.

    To show metadata overlays, make sure you have turned it on in the application for the same resolution that is in the stream profile.

  8. In the list of storage options, select SD card.

    Make sure the SD card is mounted.

  9. To test the rule, go back to the scenario in the application’s webpage and click Test alarm.

    This generates an event, as if the scenario had triggered for real. If you have turned on metadata overlays, a red or blue rectangle will show.

Troubleshooting

Problems detecting objects

... when image is unstable

Turn on Electronic image stabilization (EIS) in the Image tab of the product’s webpage.

... at image edges, where the image looks distorted

Turn on Barrel distortion correction (BDC) in the Image tab of the product’s webpage.

... immediately

Objects need to be fully visible in the scene before they can be detected by the application.

... in other situations

This could be because the objects blend into a background of a similar color, or because the lighting in the scene is poor. Try to improve the lighting.

Problems with false alarms

... due to small animals that appear large in the image

Calibrate the perspective. See Calibrate perspective.

Problems with metadata overlays

... on a second client

Metadata overlays are only visible for one client at a time.

Filters

If the application is set up to trigger alarms for any motion, you may experience unwanted alarms. You can then use filters.

Short-lived objects – Use this to ignore objects that only appear in the image for a short period of time.

Small objects – Use this to ignore small objects.

Swaying objects – Use this to ignore objects that only move a short distance.

Filter recommendations

  • Filters are applied to all moving objects found by the application and should be set up with care to make sure that no important objects are ignored.

  • Set up one filter at a time and test it before you turn on another filter.

  • Change the filter settings carefully until you’ve reached the desired result.

The short-lived objects filter

Use the short-lived objects filter to avoid alarms for objects that only appear for a short period of time, such as light beams from a passing car or quickly moving shadows.

When you turn on the short-lived objects filter and the application finds a moving object, the object does not trigger an alarm until the set time has passed. If the alarm is used to start a recording, configure the pre-trigger time so that the recording also includes the time the object moved in the scene before triggering the alarm.
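The behavior described above can be sketched as a minimal check. The function name and values are illustrative, not part of the product:

```python
# Sketch of the short-lived objects filter's behavior (illustrative only):
# an object triggers an alarm only after it has been visible at least as
# long as the configured filter time.

def should_alarm(first_seen_s: float, now_s: float, filter_time_s: float) -> bool:
    """True once the object has been tracked for at least filter_time_s seconds."""
    return (now_s - first_seen_s) >= filter_time_s

# With a 2-second filter, a shadow visible for 0.5 s never alarms,
# while a person tracked for 3 s does.
print(should_alarm(10.0, 10.5, 2.0))  # False
print(should_alarm(10.0, 13.0, 2.0))  # True
```

This also shows why a pre-trigger (prebuffer) time matters: the recording should cover the seconds the object moved before the filter time elapsed.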

Set up the short-lived objects filter

  1. Click Scenarios and select an existing scenario or click + to create a new scenario.

  2. Click Triggering objects and make sure Any motion is selected.

  3. Go to Filters > Short-lived objects.

  4. Enter the number of seconds in the field. This is the minimum time that must pass before an object triggers an alarm. Start with a small number.

  5. If the result is not satisfactory, increase the filter time in small steps.

The swaying object filter

The swaying objects filter ignores objects that only move a short distance, for example swaying foliage, flags, and their shadows. If the swaying objects are large, for example large ponds or large trees, use exclude areas instead of the filter. The filter is applied to all detected swaying objects and, if the value is too large, important objects might not trigger alarms.

When the swaying object filter is turned on and the application detects an object, the object does not trigger an alarm until it has moved a distance larger than the filter size.
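The distance check described above can be sketched as follows. The function name and values are illustrative, not part of the product:

```python
# Sketch of the swaying objects filter's behavior (illustrative only):
# an object alarms only once it has moved farther from its starting point
# than the filter distance, given here as a percentage of the image width.
import math

def moved_far_enough(start, current, filter_pct, image_width_px):
    """True if the object moved farther than the filter distance."""
    threshold_px = filter_pct / 100 * image_width_px
    return math.dist(start, current) > threshold_px

# 5% of a 1920 px wide image is 96 px.
print(moved_far_enough((100, 100), (150, 100), 5, 1920))  # False: moved 50 px
print(moved_far_enough((100, 100), (250, 100), 5, 1920))  # True: moved 150 px
```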

Set up the swaying objects filter

The filter ignores any object that moves a shorter distance than that from the center of the ellipse to its edge.

Note
  • The filter applies to all objects in the image, not just objects in the same position as the setup ellipse.
  • We recommend that you begin with a small filter size.
  1. Click Scenarios and select an existing scenario or click + to create a new scenario.

  2. Click Triggering objects and make sure Any motion is selected.

  3. Go to Filters > Swaying objects.

  4. Enter how far objects are allowed to move, as a percentage of the screen, before an alarm triggers.

The small objects filter

The small objects filter reduces false alarms by ignoring objects that are small, for example small animals.

Note
  • The filter applies to all objects in the image, not just objects in the same position as the setup rectangle.
  • The application ignores objects that are smaller than both the entered height and the entered width.
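The rule in the note above, that an object is ignored only if it is smaller in both dimensions, can be sketched as a one-line check. The function name is illustrative:

```python
# Sketch of the small objects filter rule stated above: an object is
# ignored only if it is smaller than BOTH the filter width and height.

def is_ignored(obj_w: float, obj_h: float, filter_w: float, filter_h: float) -> bool:
    """True if the object is smaller than the filter in both dimensions."""
    return obj_w < filter_w and obj_h < filter_h

print(is_ignored(3, 3, 5, 5))  # True: smaller in both dimensions
print(is_ignored(6, 3, 5, 5))  # False: wider than the filter width
```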

Set up the small objects filter

  1. Click Scenarios and select an existing scenario or click + to create a new scenario.

  2. Click Triggering objects and make sure Any motion is selected.

  3. Go to Filters > Small objects.

  4. Note

    If you have calibrated the perspective, enter the width and height of the objects to ignore in centimeters (inches) instead of as a percentage of the image.

  5. Enter the width and height of the objects to ignore as a percentage of the image.