Solution overview
About the application
AXIS Object Analytics detects, classifies, and counts moving objects, specifically humans or vehicles. You can set up scenarios with different conditions for detection, such as objects that move or stay longer than a set time within a predefined area or that cross a defined line. When objects are detected or counted, Axis network devices or third-party software can perform different actions, such as record video, play an audio message, or alert security staff.
Considerations
For best results, the camera must be correctly mounted. There are requirements on the scene, image and objects. The considerations in this chapter are generic. For product-specific considerations, see the user manual for your product at help.axis.com.
This image illustrates a correctly mounted camera.
Mounting position
If you mount the camera so that it looks straight down from above, it's difficult for the application to classify objects.
Tilt
The camera must be sufficiently oriented towards the ground so that the center of the image is below the horizon. Mount the camera so that the minimum detection distance is longer than half of the camera’s mounting height (minimum detection distance > camera mounting height / 2).
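To make the rule of thumb concrete, here is a minimal arithmetic sketch; the mounting height and distances are hypothetical example values:

```python
# Rule of thumb from this section: the minimum detection distance must be
# greater than half of the camera's mounting height.
def mounting_ok(min_detection_distance_m: float, mounting_height_m: float) -> bool:
    return min_detection_distance_m > mounting_height_m / 2

# Hypothetical example: a camera mounted at 4 m needs its nearest
# detections to start more than 2 m out.
print(mounting_ok(2.5, 4.0))  # True
print(mounting_ok(1.5, 4.0))  # False
```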
Detection area
An object’s point of detection must be inside the detection area. The point of detection of a human is at its feet, and of a vehicle it’s at its center.
Maximum detection distance
The maximum detection distance depends on:
- Camera type and model
- Camera lens. A longer focal length allows for a longer detection distance.
- Weather. For example, heavy rain or snow can affect the detection distance and accuracy.
- Light. Detection accuracy and range can be affected by insufficient illumination.
- Camera load
We recommend that you use AXIS Site Designer to determine the maximum detection distance for different camera models at your site.
Roll
The camera’s roll angle must be close to zero, which means that the image should be level with the horizon.
Field of view
The camera’s field of view must be fixed.
Vibrations
The application tolerates small camera vibrations, but you get the best performance when the camera is not subject to vibrations.
Object size
For a human to be detected, the minimum height is 4% of the total image height. For a vehicle, the minimum height is 3% of the total image height. However, this requires perfect image conditions and no obstructions to the view. To minimize the risk of missed detections, we recommend a height of at least 8% for humans and 6% for vehicles.
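As a rough illustration of what these percentages mean in practice, the following sketch converts them to pixel heights. The percentages come from the guidance above; the 1080-pixel stream height is only a hypothetical example value:

```python
# Convert the object-size guidance into pixel heights for a given
# stream resolution.
def min_pixel_height(image_height_px: int, percent: float) -> float:
    return image_height_px * percent / 100

for label, pct in [("human, minimum", 4), ("vehicle, minimum", 3),
                   ("human, recommended", 8), ("vehicle, recommended", 6)]:
    print(f"{label}: {min_pixel_height(1080, pct):.0f} px")
```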
Object visibility
Detection accuracy can be affected:
- if objects are only partially visible due to, for example, foliage. It’s particularly important that characteristic features, such as legs or wheels, are visible.
- when the scene is crowded with objects that frequently overlap each other, for example in traffic congestion or in a parking lot.
Contrast
There needs to be a certain level of contrast between objects and the background. Fog, direct light shining on the camera, or an overly noisy image can cause contrast issues. You can increase the level of illumination and adjust the image settings to improve the level of contrast.
- When you use a day-and-night camera with artificial lighting, we recommend at least 50 lux in the entire detection area.
- When you use built-in IR lighting, the maximum detection distance depends on the camera and the environment.
Expected movement of objects in the scene
Objects that approach the camera in a straight line need to move for a longer time before they get detected compared to objects that move perpendicular to the camera’s field of view.
Human pose
Humans need to move in a somewhat upright position.
Object motion
Objects need to move within the scene for at least 2 seconds.
Recommended image settings
Before you start to use the application, we recommend that you turn on Forensic WDR and barrel distortion correction, if they are available for your camera.
Conditions where detections can be delayed or missed
Note
These conditions are not relevant for radar-video fusion cameras.
- Fog
- Direct light shining on the camera
- Inadequate light
- Overly noisy image
Situations that can trigger false alarms
- Partially hidden people or vehicles. For example, a small van that appears from behind a wall can look like a person, since the vehicle is tall and narrow.
- Insects on the camera lens. Note that day-and-night cameras with infrared spots attract insects and spiders.
- A combination of car headlights and heavy rain.
- Human-size animals.
- Strong light causing shadows.
Get started
Log in to the device interface as an administrator and go to Apps > AXIS Object Analytics.
Start the application and click Open.
In the welcome screen, click Step-by-step to follow the recommended setup procedure.
In Considerations, read through the information.
Click + New scenario.
Select what you want your scenario to do:
- Object in area: Detect and classify objects that move inside a defined area.
- Line crossing: Detect and classify objects that cross a defined line.
- Time in area: Detect and classify objects that stay in an area for too long.
- Crossline counting: Count and classify objects that cross a defined line.
- Occupancy in area: Classify and estimate the number of objects within a defined area at any given time.
- Motion in area: Detect any kind of object that moves inside a defined area.
- Motion line crossing: Detect any kind of object that crosses a defined line.
To learn more about the different scenarios, see Area scenarios and Line crossing scenarios.
Select the type of object you want the application to detect.
Read more about Classification of objects.
For PTZ cameras, you can choose to restrict detection to a specific preset position. Select it from the list.
Configure your scenario.
To find out how to adjust the default line or include area, see Adjust virtual line or area.
Verify your settings and click Finish.
You have now created a scenario. To rename or modify it, click Open.
To create more scenarios, click + New scenario.
Adjust virtual line or area
- To reshape a virtual line or area, click and drag one of the anchor points.
- To move a virtual line or area, click and drag it.
- To remove a corner, right-click the corner.
Virtual line
- To change the direction that objects should move in to be detected, click Scene > Change trigger direction. The red arrows next to the line show the current direction. Actions trigger when objects cross the line in the direction of the arrows.
- To reset the virtual line to its default size, click Scene > Reset line.
- If you have modified the virtual line in an existing scenario, you can copy the shape, position, and trigger direction of the line to a new scenario. To copy a virtual line, go to Copy virtual line from an existing scenario and select a scenario in the drop-down list.
Area
- To create an area inside the include area where you don’t want objects to be detected, click + Add exclude area.
- If you have modified the include area in an existing scenario, you can copy the shape and position of the area to a new scenario. To copy an include area, go to Copy area of interest from an existing scenario and select a scenario in the drop-down list.
Configure the application
Modify a scenario
To modify a scenario, click Scenarios and click Open in the scenario card.
For all scenario types:
- To rename the scenario, click the edit icon.
- To change what type of objects to detect, click Triggering objects.
- To adjust the virtual line or area, click Scene.
For time in area scenarios:
- Use the advanced setting Keep the rule active as long as the object is tracked when you create a rule in the device’s web interface and the rule has an action with the option "...while the rule is active". This makes the rule stay active for as long as the object is tracked and within the include area, and not only for the duration of the alarm.
- For an example of how to set this up, see Record video when a human stays too long in an area.
For crossline counting scenarios:
- To reset counts on a daily basis, click Crossline counting and turn on Reset counts at midnight.
- To reset counts once, click Crossline counting and click Reset counts.
Note
The application stores counting data for 35 days, regardless of your type of storage.
- To send events with counting data at one-minute intervals, turn on Event interval.
For occupancy in area scenarios:
- To trigger actions based on occupancy levels in the area of interest, set up an Occupancy threshold.
- To trigger actions when the occupancy threshold has been met for a set time, set the number of seconds in Trigger action after set time.
- To send events with occupancy data at one-minute intervals, turn on Event interval. The event includes the minimum, maximum, and average occupancy during the interval.
For motion in area and motion line crossing scenarios:
- To reduce false alarms due to short-lived, swaying, or small objects, use filters. For instructions and more information, see Use filters.
Calibrate perspective
It’s not possible to calibrate the perspective on all types of devices, for example certain panoramic cameras.
If the scene has a significant depth, you need to calibrate the perspective to remove false alarms due to small objects. During calibration, the application compares the height of the objects as they appear in the image with the actual heights of the corresponding physical objects. The application uses the calibrated perspective to calculate the object size.
Place vertical bars in the image to calibrate perspective. The bars represent physical objects at different distances from the camera.
Go to Settings > Advanced > Perspective and click +.
In the live view, choose two objects of the same, known height that are located on the ground at different distances from the camera.
You can use, for example, fence poles or a human.
Place the bars by the objects and adjust the length of each bar to the height of the object.
Select the scenarios you want to apply the perspective to.
Enter the height of the objects in Perspective bar height.
Click Save.
Example
If there is a fence with 2-meter-high poles extending from the camera towards the horizon, position the bars at the fence poles, adjust their lengths, and enter 200 cm (6 ft 7 in) in the fields.
Make sure the bars don’t overlap each other in height.
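The application’s internal calculation isn’t documented here, but the principle can be illustrated with a short sketch: given two calibration bars of the same known physical height, the expected pixel height of an object in between can be estimated by interpolating over image rows. All values below are hypothetical, and a real perspective model is more involved:

```python
# Illustration of the perspective-calibration principle, not the
# application's actual algorithm. Two bars of the same known physical
# height stand at different depths; the expected pixel height at any
# image row in between is estimated by linear interpolation.
def expected_pixel_height(y_row: float,
                          bar_near: tuple[float, float],
                          bar_far: tuple[float, float]) -> float:
    """bar_* = (image row of the bar's base, bar height in pixels)."""
    (y_near, h_near), (y_far, h_far) = bar_near, bar_far
    t = (y_row - y_far) / (y_near - y_far)  # 0 at the far bar, 1 at the near bar
    return h_far + t * (h_near - h_far)

# Hypothetical calibration: a 200 cm pole appears 180 px tall near the
# camera (row 900) and 40 px tall far away (row 400).
h = expected_pixel_height(650, bar_near=(900, 180), bar_far=(400, 40))
print(f"Expected height of a 2 m object at row 650: {h:.0f} px")
```

An object that appears much smaller than the expected height at its position can then be dismissed as too small, which is how a calibrated perspective helps remove false alarms caused by small objects.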
Add burnt-in metadata overlays to video streams
To show the event that was detected in the live and recorded video stream, turn on metadata overlay. When you turn on metadata overlay, the application shows:
- A rectangle around detected objects.
- The area or line of the scenario where the object was detected.
- For crossline counting: a table with the accumulated count per object type.
- For occupancy in area: a table with the estimated count per object type at the given time.
If you turn on trajectories, the application also shows a line that outlines the path that an object has taken.
If several scenarios trigger at the same time, overlays are shown for all of them in all streams with the selected resolution.
The metadata overlays are burnt into the video stream at the selected resolution. You can’t remove them from recorded video.
If you use view areas, the metadata overlays only appear in the first view area. The default name of the first view area is View area 1.
In the application’s webpage, go to Settings > Advanced and, depending on your camera:
- Turn on Metadata overlay.
- Under Metadata overlay, select the resolution in which burnt-in metadata overlays should appear. You can only select one resolution, and the setting applies to all scenarios.
To show the path an object has taken, select Trajectories.
Restrict detection to a PTZ preset position
For PTZ cameras, you can restrict detection to a specific preset position.
Go to Scenarios and click Open in a scenario card, or click + to create a new scenario.
Click Scene and select a preset position from the list.
Each time the preset position changes, the application needs to recalibrate. We recommend that you wait at least 15 seconds before you change between preset positions in a guard tour.
Use filters
Use filters to reduce the risk of false alarms in motion in area or motion line crossing scenarios.
- Short-lived objects: Ignores objects that only appear in the image for a short period of time.
- Swaying objects: Ignores objects that only move a short distance.
- Small objects: Ignores small objects.
Filter recommendations
- Filters are applied to all moving objects found by the application and should be set up with care to make sure that no important objects are ignored.
- Set up one filter at a time and test it before you turn on another filter.
- Change the filter settings carefully until you’ve reached the desired result.
Ignore short-lived objects
Use the short-lived objects filter to avoid detecting objects that only appear for a short period of time, such as light beams from a passing car or quickly moving shadows.
When you turn on the short-lived objects filter and the application finds a moving object, the object doesn’t trigger an action until the set time has passed. If the action is to start a recording, configure the pre-trigger time so that the recording also includes the time the object moved in the scene before it triggered the action.
Click Scenarios and click + to create a new scenario.
Select Motion in area or Motion line crossing.
Turn on Short-lived objects.
Enter the number of seconds in the field. The number of seconds is the minimum time that must pass before the object triggers an action in the device’s event management system. Start with a small number.
If the result is not satisfactory, increase the filter time in small steps.
Ignore swaying objects
The swaying objects filter ignores objects that only move a short distance, for example swaying foliage, flags, and their shadows. If the swaying objects are large, for example large ponds or large trees, use exclude areas instead of the filter. The filter is applied to all detected swaying objects and, if the value is too large, important objects might not trigger actions.
When the swaying object filter is turned on and the application detects an object, the object does not trigger an action until it has moved a distance larger than the filter size.
The filter is visualized as an ellipse in the image; it ignores any object that moves a shorter distance than that from the center of the ellipse to its edge.
We recommend that you begin with a small filter size.
Click Scenarios and click + to create a new scenario.
Select Motion in area.
Turn on Swaying objects.
Enter how far objects are allowed to move, as a percentage of the screen, before an action triggers.
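If you want a starting point for the percentage, a rough pinhole-camera estimate can translate a physical sway distance into a share of the image width. This is only a back-of-the-envelope sketch; the field of view and distances are hypothetical example values:

```python
import math

# Rough estimate of a starting value for the swaying-objects filter:
# how large a physical sway (in meters, at a given distance) appears
# as a percentage of the image width. Assumes a simple pinhole model.
def sway_percent_of_image(sway_m: float, distance_m: float,
                          horizontal_fov_deg: float) -> float:
    scene_width_m = 2 * distance_m * math.tan(math.radians(horizontal_fov_deg) / 2)
    return 100 * sway_m / scene_width_m

# A branch swaying 0.5 m, 10 m from a camera with a 90-degree FOV:
print(f"{sway_percent_of_image(0.5, 10, 90):.1f}% of image width")  # ~2.5%
```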
Ignore small objects
The small objects filter reduces false alarms by ignoring objects that are small, for example small animals.
The filter applies to all objects in the image, not just objects in the same position as the setup rectangle.
The application ignores objects that are smaller than both the entered height and the entered width.
Click Scenarios and click + to create a new scenario.
Select Motion in area or Motion line crossing.
Turn on Small objects.
Note
If you have calibrated the perspective, enter the width and height of the objects to ignore in centimeters (inches) instead of as a percentage of the image.
Enter the width and height of the objects to ignore as a percentage of the image.
Set up rules for events
To learn more, check out our guide Get started with rules for events.
Record video when an object gets detected
This example explains how to set up the Axis device to record video to an SD card when the application detects an object.
In the device’s web interface, go to Apps and make sure the application is started.
To check that the SD card is mounted, go to System > Storage.
Go to System > Events and add a rule.
Type a name for the rule.
In the list of conditions, under Application, select the application scenario. To trigger the same action for all scenarios, select Object Analytics: Any Scenario.
In the list of actions, under Recordings, select Record video.
In the list of storage options, select SD-DISK.
Select a Camera and a Stream profile.
To show metadata overlays, make sure that metadata overlay is turned on in the application for the same resolution that the stream profile uses.
Note
We don’t recommend that you use a time in area scenario to trigger recordings if objects are allowed to stay inside the include area for more than 30 seconds. The reason is that it’s difficult to use a prebuffer time longer than 30 seconds, which you need if you want to see what happened before the object was detected.
If you want the recording to include what happened before the object was detected, enter a Prebuffer time.
Click Save.
To test the rule, go to the application’s webpage and open the scenario. Click Test alarm. This generates an event, as if the scenario had triggered for real. If you have turned on metadata overlays, a red or blue rectangle appears.
Record video when a human stays too long in an area
This example explains how to set up an Axis device to record video to an SD card when the application detects a human that stays too long in a defined area.
In the device’s web interface:
Go to Apps and make sure that the application is started.
Go to System > Storage and check that the SD card is mounted.
In AXIS Object Analytics:
In Scenarios, click + New scenario.
Select Time in area and click Next.
Select Human and click Next.
Adjust the area of interest according to your needs.
Under Time in area settings, set the time during which the human is allowed to stay in the area.
Click Finish.
Open the scenario you just created.
Go to Triggering objects > Time in area > Advanced and click Keep the rule active as long as the object is tracked.
This makes it possible to keep the rule that you create in the device’s web interface active as long as the object is tracked, and not only for the duration of the alarm.
In the device’s web interface:
Go to System > Events and add a rule.
Type a name for the rule.
In the list of conditions, under Application, select the application scenario.
In the list of actions, under Recordings, select Record video while the rule is active.
In the list of storage options, select SD-DISK.
Select a Camera and a Stream profile.
To show metadata overlays, make sure that metadata overlay is turned on in the application for the same resolution that the stream profile uses.
Note
We don’t recommend that you use a time in area scenario to trigger recordings if objects are allowed to stay inside the include area for more than 30 seconds. The reason is that it’s difficult to use a prebuffer time longer than 30 seconds, which you need if you want to see what happened before the object was detected.
If you want the recording to include what happened before the object was detected, enter a Prebuffer time.
Click Save.
In AXIS Object Analytics:
To test the rule, open the scenario and click Test alarm. This generates an event, as if the scenario had triggered for real.
Send an email when 100 vehicles have passed
With crossline counting and the passthrough threshold functionality, you can get notified every time a user-defined number of objects have crossed the line.
This example explains how to set up a rule to send an email every time 100 vehicles have passed.
Before you start
Create an email recipient in the device interface.
In AXIS Object Analytics:
In Scenarios, click + New scenario.
Select Crossline counting and click Next.
Clear Human from the listed object types and click Next.
Update the name of the scenario to Count vehicles.
Adjust the virtual line according to your needs.
Turn on Passthrough threshold.
In Number of counts between events, type 100.
Click Finish.
In the device’s web interface:
Go to System > Events and add a rule.
Type a name for the rule.
In the list of conditions, under Application, select Object Analytics: Count vehicles passthrough threshold reached.
In the list of actions, under Notifications, select Send notification to email.
Select a recipient from the list.
Type a subject and a message for the email.
Click Save.
Activate a strobe siren when more than 50 objects are in a defined area
With occupancy in area and the occupancy threshold functionality, you can trigger actions when a user-defined number of objects stay in an area.
This example explains how to connect a camera to AXIS D4100-E Network Strobe Siren over MQTT. When AXIS Object Analytics detects that more than 50 humans have stayed in a defined area for one minute, the camera will trigger an action that activates a profile in the strobe siren.
Before you start:
Create a profile in the strobe siren.
Set up an MQTT broker and get the broker’s IP address, username and password.
In AXIS Object Analytics:
In Scenarios, click + New scenario.
Select Occupancy in area and click Next.
Select Human and click Next.
Update the name of the scenario to Max 50.
Adjust the area of interest according to your needs.
Turn on Occupancy threshold.
Set Number of objects to More than 50.
Set Trigger action after set time to 60 seconds.
Click Finish.
Set up the MQTT client in the camera’s web interface:
Go to System > MQTT > MQTT client > Broker and enter the following information:
- Host: The broker’s IP address
- Client ID: For example, Camera 1
- Protocol: The protocol the broker is set to
- Port: The port number used by the broker
- Username and Password: The broker’s username and password
Click Save and Connect.
Create two rules for MQTT publishing in the camera’s web interface:
Go to System > Events > Rules and add a rule. This rule will activate the strobe siren.
Enter the following information:
- Name: Threshold alarm
- Condition: Applications: Max 50 threshold alarm changed
- Action: MQTT > Send MQTT publish message
- Topic: Threshold
- Payload: On
- QoS: 0, 1 or 2
Click Save.
Add another rule. This rule will deactivate the strobe siren.
Enter the following information:
- Name: No threshold alarm
- Condition: Applications: Max 50 threshold alarm changed
- Select Invert this condition.
- Action: MQTT > Send MQTT publish message
- Topic: Threshold
- Payload: Off
- QoS: 0, 1 or 2
Click Save.
Set up the MQTT client in the strobe siren’s web interface:
Go to System > MQTT > MQTT client > Broker and enter the following information:
- Host: The broker’s IP address
- Client ID: Siren 1
- Protocol: The protocol the broker is set to
- Port: The port number used by the broker
- Username and Password: The broker’s username and password
Click Save and Connect.
Go to MQTT subscriptions and add a subscription. Enter the following information:
- Subscription filter: Threshold
- Subscription type: Stateful
- QoS: 0, 1 or 2
Click Save.
Create a rule for MQTT subscriptions in the strobe siren’s web interface:
Go to System > Events > Rules and add a rule. Enter the following information:
- Name: Motion detected
- Condition: MQTT > Stateful
- Subscription filter: Threshold
- Payload: On
- Action: Light and siren > Run light and siren profile while the rule is active
- Profile: Select the profile you want to be active.
Click Save.
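To verify the strobe siren’s subscription and rule before the camera is in place, you can publish the same messages by hand. Below is a minimal sketch using the paho-mqtt Python client; the broker address, port, and credentials are placeholders for your own setup:

```python
import time
import paho.mqtt.client as mqtt

# Simulate the camera's two rules by publishing the "Threshold"
# messages manually. Broker address, port, and credentials are
# placeholders; note that paho-mqtt 2.x additionally requires a
# CallbackAPIVersion argument to mqtt.Client().
client = mqtt.Client(client_id="Test publisher")
client.username_pw_set("broker-user", "broker-password")
client.connect("192.0.2.10", 1883)
client.loop_start()  # background thread that handles network traffic

client.publish("Threshold", payload="On", qos=1)   # siren profile should start
time.sleep(10)
client.publish("Threshold", payload="Off", qos=1)  # siren profile should stop

client.loop_stop()
client.disconnect()
```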
Learn more
Classification of objects
The application can classify two types of objects: humans and vehicles. The application shows a rectangle around classified objects. Objects classified as humans get a red rectangle, and objects classified as vehicles get a blue rectangle.
For cameras with deep learning, vehicles can be further categorized into trucks, buses, cars, bikes, and other.
If you use the time in area functionality, the rectangle is yellow until the time condition has been fulfilled. If the object then stays inside the include area for another 30 seconds, the rectangle becomes dashed.
Each classified object has a point of detection that the application uses to decide if an object is inside or outside an include area or when it crosses a virtual line. For a human, the point of detection is at its feet, and for a vehicle it's at its center. If a human's feet or a vehicle's center gets obstructed from the camera's view, the application makes an assumption of the location of the point of detection.
We recommend that you take the assumed location of objects’ points of detection into consideration when you draw the include area or virtual line.
For the best possible results:
- At some point, the entire object needs to be visible in the scene.
- The object needs to be in motion within the scene for at least 2 seconds.
- For cameras with machine learning, humans need to move in a somewhat upright position. For cameras with deep learning, this is not a requirement.
- The upper body of a human needs to be visible.
- Objects need to stand out from the background.
- Reduce motion blur.
Area scenarios
When you set up an Object in area scenario, the application detects objects that move inside a defined area. The defined area is called an include area.
With the scenario Time in area, you can set a time limit for how long an object is allowed to stay inside the include area before the application triggers an action. When an object enters the include area, the time counter starts. If the object leaves the include area before the set time limit is reached, the counter resets. It’s the object’s point of detection that must be inside the include area for the counter to keep counting. The time in area feature is suitable for areas where humans or vehicles are only supposed to stay for a short while, like tunnels or school yards after hours.
When you set up an Occupancy in area scenario, the application estimates how many objects are inside the include area at any given time. An object counter displays the estimated number of objects currently in the include area. When an object enters or leaves the area, the object counter adjusts. Occupancy in area is suitable for areas where you want an estimated count of one or several object types, such as parking lots.
When you select a Motion in area scenario, the application doesn’t classify objects. Instead, it detects any object that moves in the scene. It can, for example, be animals, swaying foliage, flags, or shadows. To ignore small objects, swaying objects, or objects that only appear for a short time, you can use filters. For more information, see Use filters.
Include area
The include area is the area where the application detects and counts the selected object types. The application triggers actions for an object if its point of detection is inside the include area. The application ignores objects that are outside the include area.
Reshape and resize the area so that it only covers the part of the scene where you want to detect and count objects. If you use the occupancy in area or time in area functionality, it’s important that the include area only covers parts of the scene that aren’t crowded with objects that frequently overlap each other. The default include area rectangle can be changed to a polygon with up to 10 corners.
Recommendation
If there’s a busy road or sidewalk close to the include area, draw the include area so that objects outside the include area don’t accidentally get detected. This means you should avoid drawing the include area too close to the busy road or sidewalk.
Exclude areas
An exclude area is an area inside the include area in which selected object types don’t get detected or counted. Use exclude areas if there are areas inside the include area that trigger a lot of unwanted actions. You can create up to 5 exclude areas.
Move, reshape, and resize the area so that it covers the desired part of the scene. The default rectangle can be changed to a polygon with up to 10 corners.
Recommendation
Place exclude areas inside the include area. Use exclude areas to cover areas where you don’t want to detect objects.
Line crossing scenarios
When you set up a Line crossing scenario, the application detects objects that cross a virtually defined line.
With the Crossline counting scenario, the application detects and counts the objects that cross the virtual line, and you can see the accumulated count in a table.
When you select a Motion line crossing scenario, the application doesn’t classify objects. Instead, it detects any object that crosses the virtual line. To ignore small objects or objects that only appear for a short time, you can use filters. For more information, see Use filters.
The virtual line is a yellow line in the image. Objects of the selected type that cross the line in a certain direction get detected. The red arrows on the line show the current direction. Actions trigger when objects cross the line in the direction indicated by the arrows.
To trigger an action, the object must cross the line. As shown in the illustration, the object’s point of detection must cross the line for the action to trigger. Objects that only touch the line don’t trigger actions.
In the illustration to the left, the man doesn’t trigger an action, as his point of detection has not yet crossed the line.
In the illustration to the right, the man triggers an action, as his point of detection has crossed the line.
For information about the point of detection, see Classification of objects.
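For illustration only (this is not the application’s published implementation), the crossing test can be sketched with a 2D cross product: its sign tells which side of the line a point of detection is on, and a trigger is a sign change in the configured direction:

```python
# Illustration of the crossing logic described above, not the
# application's actual implementation. The sign of a 2D cross product
# tells which side of the virtual line the point of detection is on;
# a crossing in the trigger direction is a sign change.
def side_of_line(p, a, b):
    """> 0 if p is left of line a->b, < 0 if right, 0 if on the line."""
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

def crosses_in_trigger_direction(prev_point, curr_point, a, b) -> bool:
    return side_of_line(prev_point, a, b) < 0 and side_of_line(curr_point, a, b) > 0

# A detection point moving across a vertical line from right to left:
line_a, line_b = (5, 0), (5, 10)
print(crosses_in_trigger_direction((6, 5), (4, 5), line_a, line_b))  # True
```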
Virtual line recommendations
Adjust the virtual line so that:
- objects are unlikely to be waiting at the line.
- objects are clearly visible in the image before they cross the line.
- an object’s point of detection is likely to cross the line.
Integration
Set up alarms in AXIS Camera Station
This example explains how to set up a rule in AXIS Camera Station to alert the operator and record video that includes metadata overlays when AXIS Object Analytics detects an object.
Before you start
You need:
- an Axis network camera with AXIS Object Analytics set up and running, see Get started.
- metadata overlays turned on in the application, see Add burnt-in metadata overlays to video streams.
- a computer with AXIS Camera Station installed.
Add the camera to AXIS Camera Station
In AXIS Camera Station, add the camera. See the user manual for AXIS Camera Station.
Create a device event trigger
Go to Configuration > Recording and events > Action rules and click New.
Click Add to add a trigger.
Select Device event from the list of triggers and click Ok.
In the Configure device event trigger section:
- In Device, select the camera.
- In Event, select one of the scenarios for AXIS Object Analytics.
- In Trigger period, set an interval time between two successive triggers. Use this function to reduce the number of successive recordings. If an additional trigger occurs within this interval, the recording continues and the trigger period starts over from that point in time.
- In Filters, set Active to Yes.
Click Ok.
Create actions to raise alarms and record video
Click Next.
Click Add to add an action.
Select Raise alarm from the list of actions and click Ok.
Note
The alarm message is what the operator sees when an alarm is raised.
In the Alarm message section, enter an alarm title and description.
Click Ok.
Click Add to add another action.
Select Record from the list of actions and click Ok.
In the list of cameras, select the camera to use for recording.
Important
To include metadata overlays in the recording, make sure you select a profile with the same resolution as the one selected for metadata overlays in the application.
Select a profile and set the prebuffer and postbuffer.
Click Ok.
Specify when the alarm is active
Click Next.
If you only want the alarm to be active during certain hours, select Custom schedule.
Select a schedule from the list.
Click Next.
Enter a name for the rule.
Click Finish.
To see the metadata overlays in the live view, make sure you select the streaming profile that matches the one you set in the application.
Integration of counting data
The crossline counting and occupancy in area scenarios produce metadata about counted objects. To visualize the data and analyze trends over time, you can set up an integration to a third-party application. With this method, it’s possible to present data from one or several cameras. To learn more about how to set up the integration, see the guidelines at Axis Developer Community.
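As one example of such an integration, you could let a device rule publish counting events to an MQTT broker (as in the strobe siren example above) and aggregate them in a small subscriber. The topic name and the JSON payload shape below are assumptions for illustration only; the actual event schema depends on how you configure the rule and on the guidelines referenced above:

```python
import json
import paho.mqtt.client as mqtt

# Hedged sketch: aggregate counting data published over MQTT by a
# device rule. The topic "counting/crossline" and the payload shape
# {"type": ..., "count": ...} are hypothetical; adapt them to your
# own rule configuration.
totals: dict[str, int] = {}

def on_message(client, userdata, msg):
    event = json.loads(msg.payload)  # e.g. {"type": "car", "count": 3}
    totals[event["type"]] = totals.get(event["type"], 0) + event["count"]
    print(totals)

client = mqtt.Client(client_id="Counting aggregator")  # paho-mqtt 1.x style
client.username_pw_set("broker-user", "broker-password")
client.on_message = on_message
client.connect("192.0.2.10", 1883)  # placeholder broker address and port
client.subscribe("counting/crossline", qos=1)
client.loop_forever()
```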
Troubleshooting
Problems detecting objects

| Problem | Solution |
|---|---|
| ... when the image is unstable | Turn on Electronic image stabilization (EIS) in the Image tab in the device’s web interface. |
| ... at image edges, where the image looks distorted | Turn on Barrel distortion correction (BDC) in the Image tab in the device’s web interface. |
| ... immediately | Objects need to be fully visible in the scene before the application can detect them. |
| ... in other situations | Objects can melt into the background if they have a similar color, or the light in the scene can be insufficient. Try to improve the light. |

Problems with false alarms

| Problem | Solution |
|---|---|
| ... due to small animals that appear large in the image | Calibrate the perspective. See Calibrate perspective. |
| ... when you have set up a Motion in area scenario | The application doesn’t classify objects in this scenario. Instead, it detects any object that moves in the scene. Use filters to ignore small, swaying, or short-lived objects. See Use filters. |
| ... when you have set up a Motion line crossing scenario | The application doesn’t classify objects. Instead, it detects any object that crosses the virtual line. Use filters to ignore small or short-lived objects. See Use filters. |

Problems counting objects

| Problem | Solution |
|---|---|
| ... due to stationary objects that look like humans or vehicles when you use occupancy in area | Objects need to be fully visible in the scene. The application counts both moving and stationary objects in occupancy in area scenarios, which increases the risk of false detections. Add an exclude area to ignore stationary objects that look like humans or vehicles. |

Problems with metadata overlays

| Problem | Solution |
|---|---|
| ... on a second client | Metadata overlays are only visible to one client at a time. |

Problems with the video stream

| Problem | Solution |
|---|---|
| ... in the Firefox browser for cameras with high resolutions | Try the Google Chrome™ browser instead. |