Your vehicle has a camera and tracking unit (GPS tracker) installed. When the tracker detects a harsh driving event, a video recording of the incident is made.
The video is analyzed and classified by our software, and is then available to watch in Fleet or the Integrated Video app.
This article covers the following topics:
- How videos are triggered
- Harsh driving events and thresholds
- Analysis and object detection
- How videos are classified
- Videos and collisions
- Help us improve classifications
How videos are triggered
Driving behavior such as sudden acceleration or deceleration exerts a g-force on the vehicle, which the vehicle tracker measures. A video is triggered if the g-force exceeds certain thresholds.
Harsh driving events and thresholds
The vehicle tracker detects the following harsh driving events:
- The vehicle speeds up suddenly (speed increases by more than 5 mph per second).
  - The g-force is above 0.220g for at least 1 second.
- The vehicle makes a harsh turn at speed.
  - The g-force is above 0.4g for at least 1 second.
- The vehicle slows down suddenly (speed decreases by more than 6 mph per second).
  - The g-force is above 0.265g for at least 1 second.
- The vehicle jolts suddenly (caused by impact with a person, animal, or object, for example).
  - The g-force is above 0.7g for 0.1 seconds.
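The thresholds above can be sketched as a simple lookup. This is an illustrative assumption, not the tracker's actual firmware: the event names, the `Reading` type, and the `detect_event` function are invented for the example, while the g-force and duration values come from this article.

```python
# Hypothetical sketch of the harsh-driving trigger logic described above.
# Threshold values are from this article; names are illustrative assumptions.
from dataclasses import dataclass

# Per event type: (g-force threshold in g, minimum duration in seconds)
THRESHOLDS = {
    "harsh_acceleration": (0.220, 1.0),
    "harsh_turn":         (0.4,   1.0),
    "harsh_braking":      (0.265, 1.0),
    "sudden_jolt":        (0.7,   0.1),
}

@dataclass
class Reading:
    g_force: float   # measured g-force magnitude, in g
    duration: float  # seconds the force was sustained

def detect_event(event_type: str, reading: Reading) -> bool:
    """Return True if the reading should trigger a video recording."""
    g_min, t_min = THRESHOLDS[event_type]
    return reading.g_force > g_min and reading.duration >= t_min
```

For example, a 0.75g jolt lasting 0.1 seconds triggers a video, while 0.25g braking does not, because it stays below the 0.265g threshold.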
Analysis and object detection
During the analysis, our AI identifies any objects in the video, such as other vehicles, people, and even animals. The objects are analyzed and then listed as the following events:
Stop sign violation
The vehicle did not fully stop at a stop sign.
Tailgating
The vehicle was dangerously close to the vehicle in front for at least 2 seconds.
Possible collision
The AI calculates the chance of a collision with an object by measuring the object's speed and direction of travel.
Phone distraction
The driver was handling their phone for at least 2 seconds when the harsh driving event happened. For example, they were scrolling, texting, or talking on the phone. Phone distraction can only be detected by a driver-facing camera.
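The collision calculation above can be illustrated with a basic time-to-collision estimate. This is a deliberately simplified stand-in, not the actual AI model; the function name and parameters are assumptions made for the example.

```python
# Illustrative time-to-collision estimate: a simplified stand-in for the
# "chance of collision" analysis described above, not the actual AI model.
from typing import Optional

def time_to_collision(distance_m: float, closing_speed_mps: float) -> Optional[float]:
    """Seconds until impact, assuming the closing speed stays constant.

    Returns None when the object is not approaching (closing speed <= 0).
    """
    if closing_speed_mps <= 0:
        return None
    return distance_m / closing_speed_mps

# Example: an object 20 m ahead, closing at 10 m/s, is 2 seconds from impact.
```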
How videos are classified
The AI classifies videos as Critical, Major, Moderate, or Minor to indicate how urgently you should watch them. The classification levels are generated based on a number of factors, including:
- How severely the driver braked, accelerated, cornered, or encountered a sudden force to the vehicle.
- How close other vehicles, people, animals, or objects were.
- The g-force exerted on the front or rear of the vehicle.
The analyzed videos are classified based on the severity of the harsh driving event that occurred. Phone distraction does not influence the video’s classification level.
Here are some possible scenarios:
Critical
The driver may have lost control of the vehicle and left the road, mounted a curb, spun the vehicle, or collided with a person, object, or another vehicle.
Major
The vehicle may have been involved in a dangerous situation. The driver may have swerved or suddenly braked to avoid a collision with a person, object, or another vehicle.
Moderate
The driving manner was inappropriate to the situation and there was an elevated risk of an accident. The driver may have braked or accelerated more harshly than usual, even if no people, objects, or other vehicles were nearby. For example, the driver may have braked suddenly at a red traffic light.
Minor
The event that triggered the recording was just above the threshold to register as a harsh driving event. The analysis suggests that an incident was unlikely.
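The severity factors above can be sketched as a simple decision rule. This is purely illustrative: the real model weighs many more signals, and the input names and the 0.5g cutoff are assumptions invented for the example, not published values.

```python
# Purely illustrative mapping from severity signals to a classification level.
# Inputs and cutoffs are assumptions; the article does not publish the real
# model's features or weights.

def classify(peak_g: float, near_miss: bool, lost_control: bool) -> str:
    """Return Critical, Major, Moderate, or Minor for an analyzed video."""
    if lost_control:
        return "Critical"   # loss of control or an actual collision
    if near_miss:
        return "Major"      # swerved or braked hard to avoid a collision
    if peak_g > 0.5:        # assumed cutoff for "harsher than usual" driving
        return "Moderate"
    return "Minor"          # just above the harsh-driving trigger threshold
```

Note that, as the article states, phone distraction is deliberately left out of the inputs: it does not influence the classification level.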
How the AI analyzes a scenario
Example trigger: Hard braking
Videos and collisions
If you suspect a collision has occurred, search for videos in Fleet. If the video is not in the list, request videos in Video on Demand or download SD card videos as soon as possible.
Bear in mind that:
- Cameras and trackers can stop recording at the point of collision. This is because either the vehicle’s power is cut as a safety response, or the impact damages the device. If this happens, a video will not be triggered in Fleet.
- Low-force impacts, such as side swipes, are less likely to be detected.
Help us improve classifications
You can improve the accuracy of classifications by checking that the vehicle’s equipment is working properly before starting a journey, and by giving us feedback on the video detail page.
Check the vehicle’s equipment
- The engine must be turned on for the camera to begin recording and detect harsh driving events that trigger videos. The camera continues recording until the engine is turned off.
- The camera should be attached to the windshield and aligned correctly so that it accurately detects objects.
- The vehicle tracker should be mounted securely and not able to move around, otherwise it can generate inaccurate data.
- The camera’s LED light sequence should show that the camera is recording (blue) and connected to the network (green).
  - If the green light is off, this suggests that the area’s GPS coverage is unstable.
- If the vehicle has been involved in a collision, check the camera and vehicle tracker for any signs of damage.
Give us feedback
At the end of each video we ask for your feedback on the accuracy of the video classification and analysis. We use this data to iteratively update and improve our AI. The algorithm is not updated straight away.