
Welcome to smart city traffic monitoring

The Traffic Monitor is an open source platform to capture holistic roadway usage.

Overview

TrafficMonitor.ai, the Traffic Monitor, is open source smart city traffic monitoring software that runs on commodity hardware to capture holistic roadway usage. Using edge machine learning object detection and Doppler radar, it counts pedestrians, bicycles, and cars and measures vehicle speeds.

Find our website at trafficmonitor.ai to sign up for the newsletter.

Be Counted!

Our mission is to democratize the power of AI tools to improve community safety and quality of life through the ethical and transparent use of these capabilities.

Get Started

The Traffic Monitor software and hardware are open source and available for anyone to build, modify, improve, and contribute back. We welcome your ideas and contributions at the Traffic Monitor GitHub repository!

Thank You 🩵

Made possible thanks to these incredible, unaffiliated projects:

  • Frigate NVR project (core Object Detection application)

  • Node-RED (low-code programming environment)

We also have an integration with the open source ThingsBoard IoT platform to monitor your device(s) and share data.


Start counting!

See Getting Started.

Find answers to common issues in Help & FAQ, in addition to discussions, issues, and chat.

Contribute back to the open source project

See Contribute.

Assembly Instructions

Assemble the traffic monitor and get it ready for use

(coming soon)

Dev Environment

Set up to contribute back to the Traffic Monitor project

The Traffic Monitor software is completely open source, so you are welcome to modify your devices to fit your needs. If you think others will benefit from your changes, you are welcome to join the community and contribute back!

The Traffic Monitor OSS repo is set up as a monorepo containing everything to get the TM up and running.

Node-RED

Node-RED provides the primary logic engine for the Traffic Monitor, including:

  • Accepting input from other applications, such as Frigate for object detection, and sensors such as the radar for speed measurement.

  • Enriching events by attaching speed measurements

  • Saving payloads and data internally

  • Sending data to downstream applications

(More instructions coming soon)

Getting Started

Start counting with the Traffic Monitor!

Step 1: Get your Traffic Monitor

The traffic monitor requires the Recommended Hardware and software installation before you can begin capturing roadway data.

There are two ways to get started:

Build It!

See our Build Your Own Device (DIY) guide. The software is open source and the components are available retail. Customize to your heart's content.

Buy It!

(coming soon) Purchase a pre-built traffic monitor or kit from the Roadway Biome Project.

Step 2: Plan your Deployment

The traffic monitor must be placed with a good view of the roadway. See the Deployment and Mounting Guide for more information on the most common deployment types:

  1. Temporary deployments allow you to set up on the right-of-way or sidewalk next to a roadway to get counts for a short time.

  2. Permanent deployments are geared towards setting up a traffic monitor in a location to get 24/7 counts and monitoring.

Step 3: Set up your Device

Before you set up the device, be sure to follow the Software Installation guide.

After you have your traffic monitor built, software installed, and physically set up (mounted), it is time to boot it up! To get the best data and most out of the device, follow these steps:

  1. Frigate Config will guide you in setting up the object detection capabilities by turning on the camera and detection and defining zones.

  2. Node-RED Config will turn on additional data collection capabilities, such as pairing your camera with the radar and other sensors, sharing your de-identified event data with another system or data provider, and more!

Step 4: Collect and Share that Data

The traffic monitor will now run 24/7 collecting the data you set up. It will automatically restart if the power cycles, even if you are not there to reset it.

You can view the on-device dashboards, review event snapshots and clips in Frigate, download the local database, or view data shared with another data provider.
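For example, one way to pull the local database off the device for analysis is with scp. This is a sketch: the path follows the Node-RED environment file default (TM_DATABASE_PATH_TMDB, assumed here to be relative to the home directory), and the username and IP address are placeholders for your own:

scp pi@192.168.1.50:~/code/nodered/db/tmdb.sqlite ./tmdb.sqlite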

There are other projects out there that even pair the Traffic Monitor with physical displays to show daily counts of cars, bikes, and pedestrians. The sky is the limit! Learn more and share via the TM GitHub Show and tell or our TM Zulip chat.

[Image: LED display showing daily counts captured with the traffic monitor]

Config Overview

Traffic Monitor Configuration overview

Traffic Monitor settings and configuration are stored and controlled locally, on each device.

The following configuration files control the most common operations:

  • Frigate Config - to enable and configure object detection

  • Node-RED Environment File - to configure hardware sensors

  • Node-RED Config File - to enable and configure sensors and the ThingsBoard IoT hub connection

It is recommended to start with a minimal configuration and add to it as needed.

Backup and Restore

Backup and restore are built into the Node-RED Device Dashboard but require a connection to an IoT (Internet of Things) hub server.

Build Your Own Device (DIY)

Traffic Monitor details and assembly instructions

Hardware Assembly

Start by collecting the Recommended Hardware.

Assemble the components: see the Assembly Instructions.

Software Setup

See Software Installation.

Contributing

Thank you for your interest in getting involved! We welcome contributions from all skill levels and abilities. Visit our Contributing page to get involved.

Contributions can include any of the following, plus more:

  • 💡 Feature requests and use case ideas

  • 🐞 Bug reports

  • 👩‍💻 Code contributions, see Contributing

  • 👍 Comments on or thumbs-up for milestones or issues

  • 💬 Questions, comments, and videos/images of your deployments on our Traffic Monitor Zulip Chat!

We foster a safe and welcoming community; please review our Code of Conduct.

Frequently Asked Questions

Troubleshooting tips and tricks

How many simultaneous object detections can the traffic monitor handle?

I am interested in using this in a location that is very congested. There could easily be 10 walkers and a few bikes in a single frame. Can it handle that many simultaneous detections? I am primarily looking for counts and direction of travel.

    Short answer

    Yes, 👍 it can handle all that and more than you can throw at it. I think the biggest practical limitation for the Traffic Monitor is likely going to be when you have so many objects they "overlap" each other, so you can't tell if there is a person/bike behind another (think of the start of a marathon).

    Longer answer

    We use the Frigate NVR Project to do the heavy lifting on object detection, decoding video, and motion detection. So, I am going to reference their documentation and discussions.

    A couple of relevant discussions:

    if you have an inference speed of 10 milliseconds then that means you can run (1000 ms in a second / 10 milliseconds per inference) 100 object detection inferences per second. Frigate often runs multiple inferences on a single camera frame, for instance when motion happens in two places at the same time, when an object was partially detected at the edge of a region and a larger region should be used, etc.

    • from https://github.com/blakeblackshear/frigate/discussions/7491 (emphasis mine)

    The maximum number of detections per frame is determined by the largest number of disjoint movement zones, for which the upper bound then is equal to how many tiles of WxH, where WxH is model input dimensions, are needed to tile the full frame (not counting the region-connecting logic).

    • from https://github.com/blakeblackshear/frigate/discussions/18326

    Some illustration of this: I have a quiet road today, but when the sun is casting shadows through the trees, Frigate does a lot of work sending various regions (green boxes) to object detection to "see" if any of that shadow motion is an object. The following image has more than 18 regions (that I can count) being sent to object detection, PLUS the 9(?) labeled objects it is tracking. My inference speed was still sitting around 9 ms, so the object detector could handle many more. The CPU spikes because of the video decoding and motion tracking, but with the RPi5 we still had quite a bit of headroom.

    [Image: Roadway with many regions being sent to object detection because of motion and shadows]

    Where can I get support?

    Get help with the Traffic Monitor

    We are a volunteer-driven open source project relying on the generosity of our community to answer questions and offer support. While we strive to assist as many as possible, our capacity is limited, and your patience is appreciated. Your involvement is crucial, and by engaging with us, you help foster a supportive and thriving environment where everyone can benefit from shared knowledge and solutions.

    We are also an inclusive community and ask that everyone read and follow our Code of Conduct.

    We also welcome your contributions! You make the traffic monitor great. For code, troubleshooting, new ideas and more, see Contributing.

    Read the docs

    The Traffic Monitor docs have a wealth of knowledge and are searchable.

    Search the project repository

    After the docs, the best place to start is the Traffic Monitor project repository. Look for the search bar in the upper right corner, then filter for matching discussions, issues, or even code (the filter is on the left side of the screen).

    Post on GitHub discussions

    The Traffic Monitor repo discussions board is the next best place to interact with the community. Hit New Discussion and there are a variety of category templates:

    • 💬 General - Chat about anything and everything here

    • 💡 Ideas - Share ideas for new features, also vote on ideas

    • 🗳️ Polls - Take a vote from the community

    • 🙏 Q&A - Ask the community for help

    • 👐 Show and tell - Show off something you've made

    Open a GitHub issue

    To open a Traffic Monitor GitHub issue, ensure your topic is specific to a bug, a request for a new feature, or a detailed technical question that hasn't been addressed in existing discussions. Provide enough context and detail to help collaborators address the issue effectively. There are a few issue templates:

    • 🐞 Bug report - Create a report to help us improve

    • 💡 Feature Request - Suggest an idea for this project

    Sometimes, a GitHub issue may be closed without a full resolution. This does not imply a lack of concern or interest from the maintainers. Issues may be closed for various reasons, such as duplicate reports, inactivity, or because they are being addressed elsewhere. We encourage continuous community engagement and collaboration to ensure all significant matters are acknowledged and acted upon.

    Chat with us

    To foster an interactive community, we also have a Traffic Monitor Zulip chat. This is often where project developers discuss code contributions, so it is a good place to get started if you are interested in Contributing.

    Deployment and Mounting Guide

    Permanent and temporary physical placement suggestions.

    Deployment encompasses geographic location and bearing, physical hardware mounting, angle of camera and device to roadway, and configuration to make it ready to detect objects.

    Deployment and Mounting

    Warning: Ensure compliance with all applicable laws and local regulations when installing, mounting, and deploying a traffic monitor, particularly in public spaces. Unauthorized surveillance can lead to legal consequences and infringement of privacy rights. Always consult with legal professionals or local authorities if you are unsure of the requirements. Information in this guide is for educational purposes, and you are responsible for adhering to applicable laws and for the consequences of deploying and using the traffic monitor.

    Ensure you are mounting the traffic monitor in an approved area to comply with local regulations, and avoid attaching it to utility poles without proper authorization.

    Temporary

    The traffic monitor may work very well at head-height mounted anywhere with an unobstructed view of the roadway. A sturdy camera tripod works well for this situation.

    [Image of tripod mount]

    [Image of other temporary equipment]

    Permanent

    Safety Note: When installing the camera, always use proper safety equipment, such as gloves and safety goggles, and follow ladder safety practices to protect yourself. Ensure that the camera is securely mounted, particularly in public spaces, to prevent tampering or accidental damage. Failure to do so could result in injury or damage to property. Verify that all mounting components are tightly fastened and safety tethers are in place, and check for stability to guarantee safe and reliable operation.

    Choose a mounting location

    [link to omnipresence installation]

    [show image of roadway with height, vertical angle, horizontal angle, and calculations]

    Angle the traffic monitor

    This will be dependent on the hardware you have chosen to install:

    • The camera needs an unobstructed view of the roadway for the best performance, but it is able to perform object detection anywhere in the camera frame.

    • The radar has a narrower field-of-view (FOV) than most cameras and requires specific angles to the roadway for the most accurate speed measurements. You can test this by having someone hold the radar unit (outside of the case) while watching the red/blue blinking LEDs on the front of the radar as you move towards and away from the unit. Watch the LEDs as objects move through the view and determine the boundaries for drawing the zone.

    [show image of sample roadway]

    Next Steps

    Power on the Traffic Monitor (once it is plugged into a power source it should automatically start; check for the green LED on the board next to the USB-C power port) and proceed to the Setup Guide to connect and configure it.


    Recommended Hardware

    Commodity hardware to enable object detection and speed/direction measurement.

    Customize the hardware to fit your needs! The core components include the computing device, storage, camera, and co-processor. Feel free to mix-and-match components but most of the documentation and default configuration assumes using the hardware recommended below.

    Sample Configurations

    Here are some sample sensor configurations and the data they collect:

    1. Camera + AI co-processor is the lowest cost and will give you object detection, direction, visual speed measurements, and much more.

    2. Add in a radar for the most accurate speed and direction measurements, plus basic object detection for nighttime conditions.

    3. Include an environmental sensor to also measure air quality, gases, particulate matter, noise, temperature, and much more.

    4. (future feature) Install only the radar for the most privacy-conscious build, capable of basic object detection, speed, and direction.

    5. Add additional camera(s) to monitor more directions using the same AI co-processor.**

    ** The traffic monitor software can support potentially any number of cameras, either connected directly or via a local feed on the same AI co-processor, for monitoring multiple directions or any other configuration (see Camera(s) below for more details). The TM software also supports up to four (4) radars directly connected and paired in any pattern to the cameras.

    Hardware Check List - Bill of Materials (BOM)

    Use the following checklist as a quick guide to the components you need to purchase.

    We are not affiliated with any of the stores or companies linked in this section. These are suggestions that have been used or tested by contributors. If you have used or tested more, post on the TM GitHub discussions!

    Computing Device

    (Required) Raspberry Pi 5 (RPi 5), 4GB or 8GB. The Traffic Monitor is designed around a 4GB memory profile, but if you have many sensors and other applications running, 8GB may be more performant.

    Also pick up the (very cheap) official RPi5 active cooler, which helps prevent overheating on very hot days.

    The Traffic Monitor is based on the Raspberry Pi 5. The Raspberry Pi 4B and earlier units are not recommended: they cannot meet the USB peripheral power requirements for the TPU and radar in this setup, which degrades performance. However, many have been successful with earlier versions of the Raspberry Pi for object detection, so your mileage may vary.

    Storage

    (Required) A high-quality microSD card or an SSD (see alternative). Recommend at least 32GB capacity for system files with minimal (or no) snapshot and video capture.

    1. Option: Setup has been tested and works well with the SanDisk Extreme Pro microSDXC UHS-I Card.

    2. Option: The Raspberry Pi official SD Card performs particularly well, but sizes only range up to 128GB.

    3. Alternative: There are many options on the RPi5 to use a faster, more durable NVME (M.2) drive, including those that pair with the Coral TPU or other AI co-processors.

    Power

    (Required) To run the Traffic Monitor and components.

    The Raspberry Pi 5 is rated for 27 watts (5V at 5A), and using anything with a lower rating, like the older RPi PSUs, will often result in resets and/or throttling. However, the Traffic Monitor typically consumes between 6 and 14 watts when fully operational and inferencing, depending on the number of components in use and how much motion is detected.

    1. Recommended Option: The official 27W USB-C Power Supply for testing and permanent mounts.

    2. Alternative: PoE (Power over Ethernet) HATs are available for the RPi 5.

      1. The Raspberry Pi Foundation has not yet released an official one, but if you have a working solution, suggest it in the TM GitHub Discussion!

      2. The Waveshare PoE HAT (F) has performed well for some contributors.

    3. (Future discussion) Solar panel + battery. There have been working prototypes, with caveats. Discuss it in the TM GitHub Discussion!

    Camera(s)

    (Required) For full object detection capabilities.

    The official Raspberry Pi cameras recommended below are a low-cost, compact option for local object detection; however, any camera that can output H.264 is compatible with the traffic monitor, so you may attach USB or even networked cameras. See Frigate's recommended camera hardware for alternatives.

    1. Recommended: Raspberry Pi Camera Module 3 (wide angle recommended)

      1. Requires an RPi 5 Camera Cable that is sold separately.

    2. Alternative/additional: the Raspberry Pi Global Shutter camera for faster motion capture and custom lenses based on your needs

    The Raspberry Pi 5 has two camera connectors, so you can easily attach 2 native Raspberry Pi cameras.

    See the Frigate camera setup docs for more information on tuning stream configurations based on various goals for your deployment.

    AI Co-processor

    (Required with camera) The AI co-processor is an efficient way to run the object detection model, much more efficient than CPU alone.

    The AI co-processor is used by Frigate to run the object detection model; see Frigate's object detectors docs for more options and details.

    Recommended: the Coral AI Tensor Processing Unit (TPU). The Coral TPU is capable of 100+ FPS with millisecond inference times. Other co-processors may work, but the Coral TPU is fully supported out of the box (see Frigate's supported hardware).

    1. Easiest Option: The Coral USB Accelerator is an easy-to-use co-processor that you can connect to any computing device with a USB interface.

    2. Alternative: Coral HATs (Hardware-Attached-on-Top [of a Raspberry Pi]) are more compact, upgradable, and usually cheaper:

      • The Coral M.2 Accelerator B+M Key pairs nicely with the Raspberry Pi M.2 HAT+ (not the A+E key!).

      • Pineboards offers the Hat AI! Coral TPU bundle that connects via PCIe, a sleek way to add the Coral capabilities with an additional slot for an M.2 SSD.

    3. Alternative (currently testing): The Raspberry Pi AI HAT+ with Hailo-8L offers high-performance, power-efficient processing.

    Radar

    (Recommended) Provides accurate speed and direction measurement.

    1. OmniPreSense OPS243-A Doppler Radar Sensor - certified with the same tests as law enforcement speed radars. Detection up to 100 meters away.

    Other Sensors

    (Optional) For additional environmental data.

    1. Air quality (AQ) sensor: the Enviro+, paired with the (recommended) Particulate Matter (PM) Sensor. Also pick up a longer ribbon cable; we recommend the male-to-female GPIO ribbon cable.

    The TM enclosure attempts to isolate the AQ sensor by physically separating the hardware. This way the heat from the RPi and other components does not interfere with environmental readings.

    Get AQ sensor details and capabilities on the Air Quality (AQ) Payload page.

    Enclosure (weather-resistant box)

    • Print it yourself: We offer a 3D printable model so you can build the truly open source Traffic Monitor. Visit our open source repository (greendormer/tm-enclosure-3d) for details and the parts list.

    • Purchase: (coming soon) Purchase the box or a kit to assemble yourself.

    • Alternative DIY: There are many waterproof electrical junction boxes that may be modified to fit your needs. Rough dimensions to fit the Traffic Monitor components, including the camera and radar, are around 9"x7"x4", such as the TICONN IP67 ABS Enclosure.

    Software Installation

    Software installation for the Traffic Monitor.

    Whether you build your own device or buy a pre-built unit, these are the instructions to perform a fresh install.

    Preparation

    Raspberry Pi OS 64-bit Lite install is the recommended default; however, RPi OS 64-bit Full install will give you a desktop environment and other packages. Find it on RPi Imager by navigating to Operating System > Raspberry Pi OS (other) > Raspberry Pi OS Lite (64-bit)

    1. Install Raspberry Pi OS (64-bit lite) using RPi Imager.

      • Recommended: Use OS customization to set up your WiFi credentials and a username and password so you can access the device headless, i.e. over SSH with no monitor or keyboard.

    2. Insert the microSD card and boot up the Raspberry Pi device.

      • Note: The first boot may take a few minutes before it is fully online.

    Install the Traffic Monitor software

    An internet connection is required to install system dependencies, build the required Docker containers, download drivers, and more. After installation, the traffic monitor can run fully offline.

    The Traffic Monitor software is installed via an Ansible deploy script. This allows you to perform a local installation or remote installation from a host computer to 1 or more devices simultaneously!

    Local installation

    1. Connect to your device and execute the following commands via a terminal:

      1. sudo apt update && sudo apt install -y git

      2. git clone https://github.com/glossyio/traffic-monitor

      3. bash traffic-monitor/script/tmsetup.sh

    2. Follow instructions and restart when prompted.
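    Once the script completes and the device restarts, a quick sanity check is to confirm the containers are up. This is a sketch: the exact container names depend on the install, but you would expect to see the Frigate and Node-RED services running:

    docker ps --format '{{.Names}}\t{{.Status}}'   # list running containers and their status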

    Remote Installation

    Install the TM software to 1 or more devices simultaneously using our Ansible deploy script. This allows you to deploy or update a whole fleet of Traffic Monitors with common configuration files in a single command!

    Before you can perform remote installation, ensure the following:

    1. Your host machine is able to run bash commands (Linux or Mac)

    2. Raspberry Pi OS is installed on each device

    3. The device is running and online

    4. You have the device IP address or a resolvable host name from your host to the device

    1. On your host machine, download the TM OSS software and set up any configurations you want to send:

      1. git clone https://github.com/glossyio/traffic-monitor

    2. Run the install script from your host machine with your device IP address(es) and the username specified during OS setup (this must be the same across all devices); the -k option will prompt for the password. See the example after these steps.

      1. bash traffic-monitor/script/tmsetup.sh -H <ip_address> -l <ssh_username> -k

    3. Continue to the Deployment and Mounting Guide and Setup Guide for each device.
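    For example, a single-device remote install might look like the following; the IP address and username are placeholders for your own:

    bash traffic-monitor/script/tmsetup.sh -H 192.168.1.50 -l pi -k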

    Next Steps

    1. Deploy your device: See the Deployment and Mounting Guide.

    2. Set up zones, location, and enable your sensors: See Setup Guide.

    3. Start capturing roadway usage data!


    Frigate Config

    Configure Frigate object detection on the Traffic Monitor

    Object detection is powered by Frigate NVR, which provides powerful capabilities for tuning and reviewing events. The Traffic Monitor is not directly affiliated with the Frigate NVR project.

    Refer to the Frigate Configuration docs for a full list of available configuration options and descriptions.

    Recommended Traffic Monitor Settings

    The recommended Traffic Monitor settings attempt to optimize the Frigate config for object detection on roadways. Each deployment presents unique scenarios and challenges for accurate and precise object detection.

    View frigate-config.yml for the sample.

    Many settings will need to be uniquely tailored to your specific deployment. See the Deployment and Mounting Guide for optimizing your placement.

    Optimizing Object Detection

    You can more easily determine how your object detection is working through Frigate's Debug interface by going to Frigate > Settings > Debug.

    Fine-tuning object detection can help you with the following:

    • detection (are you missing bikes or pedestrians?)

    • reducing cross-classification (is an ebike being called a motorcycle?)

    • minimizing false positives (is a tree being detected as a person?), see also Defining Masks below

    The object detection model accuracy and detection ability may vary depending on a number of factors including mounting conditions such as height and angles to the roadway, camera quality and settings, and environmental conditions such as clouds, rain, snow, etc.

    The generalized model available in the base version works well at a variety of angles, but is particularly suited to an oblique angle with a good side-view of objects as they pass through the frame. Frigate object filters have a variety of score and threshold parameters that may be set to be more effective for your deployment.

    Sample Object Detection Fine-Tuning

    The most relevant section of the Frigate config for fine-tuning object detection is the following.

    In this sample, the bicycle threshold is set very low to detect most types of bikes encountered on the roadway, while the motorcycle threshold is set high so even large ebikes don't get cross-classified as motorcycles:

    objects:
      track:
      - bicycle
      - person
      - car
      - motorcycle
      - dog
      # Optional: filters to reduce false positives for specific object types
      filters:
        bicycle:
          # Optional: minimum width*height of the bounding box for the detected object (default: 0)
          min_area: 0
          # Optional: maximum width*height of the bounding box for the detected object (default: 24000000)
          max_area: 24000000
          # Optional: minimum width/height of the bounding box for the detected object (default: 0)
          min_ratio: 0.2
          # Optional: maximum width/height of the bounding box for the detected object (default: 24000000)
          max_ratio: 10.0
          # Optional: minimum score for the object to initiate tracking (default: shown below)
          min_score: 0.25
          # Optional: minimum decimal percentage for tracked object's computed score to be considered a true positive (default: shown below)
          threshold: 0.42
        motorcycle:
          min_area: 0
          max_area: 24000000
          min_ratio: 0.2
          max_ratio: 10.0
          min_score: 0.5
          threshold: 0.8

    Defining Masks

    Another tool for reducing false positives, creating private areas, and refining your configuration. To access this capability, log into your Frigate interface and go to Frigate > Masks.

    Use masks sparingly. Over-masking will make it more difficult for objects to be tracked. See Frigate masks for a more detailed explanation of how masks work and how to use them.

    1. Motion Masks: may be designated to prevent unwanted types of motion from triggering detection.

    2. Object filter masks: filter out false positives for a given object type based on location.

    For detailed information visit Frigate > Settings > Motion Masks and Frigate > Masks.

    Node-RED Config

    Traffic Monitor Node-RED configuration logic

    This page covers the Traffic Monitor-specific configuration of the Node-RED flows. This controls much of the logic and flow for the traffic monitor, but does not control other applications such as Frigate or the operating system.

    See Frigate Config for controlling object detection parameters.



    Environment File

    The environment file defines variables that can be used during Node-RED start-up in the settings file and within a flow's node properties.

    A sample script can be found at node-red-project/environment.

    The environment file is loaded by the systemd service node-red.service, which is set up by the Node-RED Raspberry Pi install script. It shall be located in the user node-red directory, by default at ~/.node-red/environment. Changes to the environment file are applied by restarting the Node-RED service, e.g. by executing the command node-red-restart in the terminal.

    ########
    # This file contains node-red environment variables loaded by node-red.service
    #   Read more at https://nodered.org/docs/user-guide/environment-variables
    #     and https://fedoraproject.org/wiki/Packaging:Systemd
    #  
    # This file shall be located at the root node-red directory, usually `~/.node-red`
    #   this file is loaded by `systemd`, changes can be applied 
    #   by running the command `node-red-restart` in the terminal
    #   read more at https://nodered.org/docs/getting-started/raspberrypi
    #
    # Uses:
    #   - variables can be used in settings.js by calling `process.env.ENV_VAR`
    #   - node property can be set by calling `${ENV_VAR}`
    #
    ########
    
    # traffic monitor open source software release version
    TM_VERSION='0.3.0'
    
    # used in settings.js for credentialSecret 
    NODE_RED_CREDENTIAL_SECRET='myNodeRED1234'
    
    # database locations, relative to user directory defined in settings.js
    #  will be relative path to store SQLite databases
    TM_DATABASE_PATH_TMDB='code/nodered/db/tmdb.sqlite'
    
    # mqtt broker for incoming Frigate events 
    #  Settings below set up the aedes broker node
    TM_MQTT_BROKER_HOST='localhost'
    TM_MQTT_BROKER_PORT='1883'
    # mqtt user, leave blank for no authentication
    TM_MQTT_BROKER_USERNAME=''
    # mqtt password, leave blank for no authentication
    TM_MQTT_BROKER_PASSWORD=''
    
    # defines system USB serial port for radar
    # run `ls -lat /sys/class/tty/ttyACM*` to list devices
    TM_RADAR_SERIAL_PORT_00='/dev/ttyACM0'
    TM_RADAR_SERIAL_PORT_01='/dev/ttyACM1'
    TM_RADAR_SERIAL_PORT_02='/dev/ttyACM2'
    TM_RADAR_SERIAL_PORT_03='/dev/ttyACM3'

    Config File

    The Traffic Monitor Node-RED config file changes definitions to various services and functionality.

    The config file is loaded whenever the TM flows restart. It is located in the user node-red directory, at ~/.node-red/config.yml.

    It is not necessary to copy this full configuration file. Default values are specified below.

    ########
    # This file contains configuration settings executed by node-red
    # Note: Comments will be removed by updates from node-red
    ########
    
    # Optional: IoT hub backend integration
    thingsboard:
        # Optional: enable connection to backend thingsboard server (default: shown below)
        enabled: false
        # Required: host name, without protocol or port number
        host: tb.server.com
        # Required: thingsboard telemetry protocol (default: shown below), 
        # NOTE: only http(s) currently supported, mqtt coming soon
        #  see https://thingsboard.io/docs/reference/protocols/
        protocol: http
        # Optional: port, common settings: https=443, http=80, mqtt=1883
        # Check with your ThingsBoard admin for settings
        port:
        # Optional: API key for device 
        # Note: (Future) if already provisioned, will be assigned based on provisionDeviceKey and secret
        access_token:
        # Optional: future use for auto-provisioning (RPiSN)
        # provisionDeviceKey: 
        # Optional: future use for auto-provisioning (manual)
        # provisionDeviceSecret: 
    
    # Optional: deployment location details
    # Note: May be used to determine device placement on maps
    # NOTE: Can be overridden at the sensors level, top-level values will cascade down
    deployment:
        # NOTE: for address-level accuracy, recommend at least 4 digits after the decimal
        # Optional: Latitude in decimal degrees format; e.g. 45.5225
        lat:
        # Optional: Longitude in decimal degrees format; e.g. -122.6919
        lon:
        # Optional: cardinal (N, S, E, W) or ordinal (NE, NW, SE, etc.) direction the camera/radar is facing 
        # Note: For bearing, match the roadway traffic direction
        bearing:
    
    sensors:
        # Optional: if used, must match the Frigate camera name(s)
        # if not set, no cameras will be used
        cameras:
            # camera name must match Frigate configuration camera names
            picam_h264:
                # Optional: Enable/disable the camera (default: shown below).
                # if disabled, any Frigate events for specified camera will be ignored
                # Note: this will not impact Frigate's system
                enabled: false
                # Optional: define the radar co-located with camera to associate speeds
                # camera and radar direction and field of view (FOV) should match
                # Note: name needs to match one defined in `radars` section
                # Note: A single radar may be attached to multiple cameras
                camera_radar: TM_RADAR_SERIAL_PORT_00
    
        # Optional: used to specify what radars are enabled for speed/direction and detection
        radars:
            # Note: Names must match those defined in the node-red environment file
            # Names are used to associate readings with cameras and other sensors
            TM_RADAR_SERIAL_PORT_00:
                # Optional: Enable/disable the radar (default: shown below).
                enabled: false
    
        # Optional: used to specify air quality monitor sensor name(s)
        # Note: air quality configuration file is separate from the node-red config, based on the aq device
        airquality_monitors:
            # Required: aq sensor name must match AQ configuration -defined MQTT topic middle element (second element)
            sensorname01:
                # Optional: Enable/disable the AQ sensor payloads (default: shown below).
                enabled: false
                # Required: mqtt topic to subscribe for incoming full-payload telemetry from AQ sensor
                #  must be last element in mqtt topic defined in AQ configuration
                mqtt_topic_incoming: readings
    
    time:
        # Optional: Set a timezone to use in the UI (default: use browser local time)
        # NOTE: shall be in unix tz format: https://en.wikipedia.org/wiki/List_of_tz_database_time_zones
        #  this will also set the timezone for the entire system
        timezone: America/Los_Angeles
        # Optional: For internet-connected deployments, sync using `timedatectl set-ntp` (default: shown below)
        # Note: for offline deployments, time will stop whenever power is disconnected
        npt_set: true
    Setup Guide

    Steps to connect to and set up your Traffic Monitor.

    At this point you have your traffic monitor and it is running. Nice job!

    This guide will walk you through configuring your device based on your sensors (required), adjusting it for roadway conditions (recommended), optimizing your data capture, and connecting with the ThingsBoard platform (optional).

    Steps

    All sensors are disabled in the default configuration files; no data will be captured until you enable your sensors using the following steps.

    1. Connect to your Device

    Physical Access

    Physical access to the device is a less-convenient method but will allow the most control to address issues.

    Monitor, Keyboard, Mouse

    See Raspberry Pi Getting Started for more information on connecting your Raspberry Pi. It should be as simple as plugging in a USB keyboard, USB mouse, and micro HDMI cable to your monitor. In this case, you can use localhost as the RPi IP address or use the host name.

    SD Card

    If your Traffic Monitor uses the default Raspberry Pi installation method, you will have an SD Card boot media that contains all your system files. If necessary, you can insert the card into a micro-SD card reader to access the entire Raspberry Pi OS directory structure.

    Remote Access

    Remote access allows you to control various parts of your Raspberry Pi without connecting it to a monitor, keyboard, or mouse. This must be done from another computer, e.g. a laptop. See Raspberry Pi's remote access docs for a full rundown of options.

    You will need to know the Traffic Monitor / Raspberry Pi IP address or host name to connect to the various configuration environments.

    Finding Your IP address

    • If you chose the Build Your Own Device (DIY) route, we recommend you set up WiFi credentials by following the Raspberry Pi Imager docs; the device will then automatically be accessible on the network you specified. Find the RPi IP address via your router, or if your router supports DNS host names, you can use the host name set on the RPi.

    • If you received a pre-built device, check with your provider for specific instructions. To get you started, it may be available as a hotspot hosting a wireless network. Connect to it like any WiFi network, looking for the host name as the SSID. The IP address of the Raspberry Pi will be the gateway IP address, or you may use the host name set on the RPi.

    See Find the IP address of your Raspberry Pi for more options.
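    Once you have an address, a minimal connection check from your host machine looks like the following (the username and address are placeholders; substitute your own):

    ssh pi@192.168.1.50
    # or, if your network resolves mDNS/DNS host names:
    ssh pi@raspberrypi.local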

    2. Configure Frigate Zones

    Frigate controls and generates object detection events with the camera.

    This section describes setting up Frigate with the Traffic Monitor Recommended Hardware. If you have alternative or optional camera(s) or other components, you may need additional configuration. Reference the official Frigate Configuration for more details.

    Frigate has a well-developed front-end user interface that can be accessed by visiting http://<TM_IP_ADDRESS>:5000 in a browser.

    The Traffic Monitor expects the following specifically named Frigate zones in order to work properly with all dashboards and workflow logic. These need to be manually drawn based on your deployment.

    Ensure the following Frigate zones are manually configured each time the traffic monitor is re-positioned or relocated, based on your unique deployment and roadway conditions.

    Set up or modify the following zones, overlaying any temporary or permanent stationary objects:

    1. zone_capture - Set to capture the entire roadway, including sidewalks that are clearly in view for counting objects.

    2. zone_near - Paired with zone_far, this will determine if an object moves "outbound" or "inbound". Set this to be roughly the closer half of the zone_capture region.

    3. zone_far - Paired with zone_near, this will determine if an object moves "outbound" or "inbound". Set this to be roughly the further half of the zone_capture region.

    4. zone_radar - (for units equipped with radar) - This should correspond to the field of view for the radar (where it can pick up accurate measurements) on the street. It will roughly make a rectangle in the center of the camera field of view from curb to curb.

    [Image: Properly configured Frigate zones]

    After changes are made, you will need to restart Frigate before they take effect. You can do this via Frigate > Settings > Restart Frigate.

    Define Masks

    Optional step for reducing false-positives, creating private areas, and refining your configuration.

    Use masks sparingly. Over-masking will make it more difficult for objects to be tracked. See Frigate masks for more detailed explanation of how masks work and how to use them.

    • Motion Masks: may be designated to prevent unwanted types of motion from triggering detection.

    • Object filter masks: filter out false positives for a given object type based on location.

    For more information, view Frigate > Settings > Motion Masks and find detailed info at Frigate > Masks.

    Optimize Object Detection

    The object detection model accuracy and detection ability may vary depending on a number of factors including mounting conditions such as height and angles to the roadway, different cameras and camera settings, and environmental conditions.

    The generalized model available in the base version works well at a variety of angles, but is particularly suited for an oblique angle that has a good side-view of objects as they pass through the frame. Frigate object filters have a variety of score and threshold parameters that may be set to be more effective with your deployment.

    3. Configure Node-RED

    Node-RED controls most of the workflow logic and data collection.

    You will need to connect to your device (step 1 above) to edit the Node-RED Config files.

    1. Open the terminal (or connect via SSH) and enter the command nano ~/.node-red/config.yml to begin editing the config file.

    2. Change the deployment location information to represent the current deployment. Get your latitude and longitude from any map service, such as Google Maps, and enter the bearing as the single-letter cardinal direction the traffic monitor is facing:

    deployment:
        lat: 45.5225
        lon: -122.6919
        bearing: n

    3. Modify sensors to reflect the currently installed components. For example, with a single Raspberry Pi camera and radar, it may look like this:

    sensors:
        cameras:
            picam_h264:
                enabled: true
                camera_radar: TM_RADAR_SERIAL_PORT_00
        radars:
            TM_RADAR_SERIAL_PORT_00:
                enabled: true

    4. To save changes, press Ctrl+O (hold Control and press O)

    5. To exit, press Ctrl+X (hold Control and press X)

    You will need to restart Node-RED for settings to take effect. Do this by entering the command systemctl restart nodered into the terminal.

    Use Cases and Scenarios

    Is it possible to do this...

    We are only as powerful as the story we tell. What can we do with the data we collect? What can the traffic monitor do?

    The TrafficMonitor.ai collects an extensive amount of data, depending on the sensors deployed. For details on payloads, check out Events Payload, Radar Payload, and Air Quality (AQ) Payload.

    How can I perform a "near miss" analysis using this hardware?

    This scenario is where there is a driver of a vehicle and a vulnerable road user, like a pedestrian at the crosswalk, at the same time.

    This is an important and dangerous situation that has largely gone un-captured in our current transportation system.

    We need to create definitions of what a "near miss / near hit" is, but we have laws we can follow for this. PortlandBicycleSchool.com highlights many of these:
    • In Oregon, every corner is a crosswalk. ORS 801.220

    • Pedestrians invoke their right to cross when any part or extension of the pedestrian (body, cane, wheelchair, or bicycle) enters the crosswalk. ORS 811.028(4), 814.040(1)(a)

    • A driver must remain stopped until the pedestrian passes the driver's lane (or the lane they intend to turn into) plus one further lane. ORS 811.028

    To accomplish this with the TrafficMonitor, the following would allow for analysis of events that have a driver / pedestrian conflict:

    1. Mount a TM watching an intersection.

    2. Draw zones that represent "pedestrian zones" and "driver zones".

    3. This will create separate events for person and car with relevant event payload fields including start_time, end_time, and entered_zones.

    4. Watch for events where a driver enters their zone at the same time as a pedestrian is in theirs (overlapping start/end times) and the zones are in conflict.

    5. Download the data and create an analysis that looks at those potential conflicts.
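    As a rough sketch of step 5: assuming the events land in an `events` table of the on-device SQLite database (per the Events Payload page) and using the hypothetical zone names from this example, overlapping person/car events could be pulled with the sqlite3 command-line tool:

    sqlite3 tmdb.sqlite "
    -- pair person events with car events whose time windows overlap
    -- and whose entered zones are in potential conflict
    SELECT p.id AS person_event, c.id AS car_event,
           p.start_time AS person_start, c.start_time AS car_start
    FROM events AS p
    JOIN events AS c
      ON p.label = 'person'
     AND c.label = 'car'
     AND c.start_time < p.end_time   -- car arrives before the pedestrian leaves
     AND p.start_time < c.end_time   -- pedestrian arrives before the car leaves
    WHERE p.entered_zones LIKE '%zone_ped_ne%'
      AND c.entered_zones LIKE '%zone_intersection_e%';"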

    Example scenario analysis:

    • if a person was at the zone_ped_ne (bottom left) and wanted to cross S or W

    • and a car simultaneously entered the zone_intersection_e (left) and wanted to turn into zone_intersection_n (right)

    • and the car _turned first_ into the zone_intersection_n before the pedestrian crossed, based on start_time and end_time for both events

    • This would potentially be a "near miss" or at least an illegal maneuver for a driver while a pedestrian was in the crosswalk.

    You can potentially add in other elements to make the requirements more stringent, like seeing if the car was stationary at the stop sign or just blew through it. Or tighten the "pedestrian zone" to represent the very edges of the sidewalk to show the pedestrian had "intent to cross".

    [Image: Example of a potential near-miss scenario zone tracking setup. Zones everywhere!]

    Of course, the more harrowing observations would include those where a person and a car were in the same zone_intersection (on the road) at the same time. That would obviously be a "near miss" if it wasn't an actual driver striking a pedestrian.


    Events Payload

    Object detection event payload

    Overview

    Traffic Monitor object detection events are generated by the connected sensors by performing object detection to identify instances of roadway users such as cars, bikes, pedestrians, and more. Additional data and metadata may also be added via other sensors and processes to create an event payload.

    Hardware and Software

    The camera is the primary detection method, powered by Frigate NVR. Events are created by optical camera observation and machine learning-powered inference on the frames to label objects (see Frigate's object detectors and available objects).

    Future feature: The radar may also generate object detection events. This is particularly useful for nighttime and low-light conditions, or even deployments that do not use a camera. See Radar Payload for more information on radar readings.

    Frigate MQTT Incoming Payload

    Events are recorded via the Frigate MQTT integration on the frigate/events topic, using messages containing type: "end", which indicates a complete capture event. See the Frigate documentation for a full list of available attributes.

    Events Database

    The following attributes are captured, by default, into tmdb.events.sqlite:

    | Attribute | Description | SQLite data type | Valid values |
    | --- | --- | --- | --- |
    | id | UUID for the object detection, generated by `source`. Frigate-generated from MQTT event `end`; radar-generated from flow. | TEXT, PRIMARY KEY | Frigate: Unix timestamp in seconds concatenated with a hyphen and a randomly-generated 6-alphanumeric value; e.g. "1721144705.617111-fg3luy". Radar: Unix timestamp in seconds concatenated with a hyphen, a randomly-generated 8-alphanumeric value, and '-r'; e.g. "1721144705.617111-fg3luy4x-r" |
    | camera | Name of the camera for object detection (defined in Frigate config). Frigate-generated from MQTT event `end`. | TEXT | Free-text |
    | label | Object label assigned by Frigate. Frigate-generated from MQTT event `end`. | TEXT | Assigned by model, based on Frigate available objects |
    | sub_label | Additional information assigned to the Frigate event. Frigate-generated from MQTT event `end`. | TEXT | Assigned via Frigate HTTP API |
    | top_score | Model inference score for the object label; the highest score as the object moved through the field of view. Frigate-generated from MQTT event `end`. | REAL | 0-1 value |
    | score | Model inference score for the object to initiate tracking; computed score as the object moves through the field of view. Frigate-generated from MQTT event `end`. | REAL | 0-1 value |
    | frame_time | Unix timestamp in seconds for when the object was optimally identified by Frigate for the field of view. Frigate-generated from MQTT event `end`. | REAL | Unix timestamp in seconds |
    | start_time | Unix timestamp in seconds for when the object first entered the field of view. Frigate-generated from MQTT event `end`. | REAL | Unix timestamp in seconds |
    | end_time | Unix timestamp in seconds for when the object exited the field of view. Frigate-generated from MQTT event `end`. | REAL | Unix timestamp in seconds |
    | entered_zones | JSON array of zones in the order the object entered each zone. Specified zones are used for various calculations. Frigate-generated from MQTT event `end`. | TEXT | Free-text via Frigate; expected: "zone_capture" (region for counting objects), "zone_radar" (radar detection field of view), "zone_near" (area closest to radar, for determining visual direction), "zone_far" (area furthest from radar, for determining visual direction) |
    | area | Width*height of the bounding box for the detected object. Frigate-generated from MQTT event `end`. | REAL | 0-24000000 |
    | ratio | Width/height of the bounding box for the detected object; e.g. 0.5 is tall (twice as high as wide). Frigate-generated from MQTT event `end`. | REAL | 0-24000000 |
    | motionless_count | Number of frames the object has been motionless. Frigate-generated from MQTT event `end`. | REAL | Integer count; e.g. 0 |
    | position_changes | Number of times the object has changed position. Frigate-generated from MQTT event `end`. | REAL | Integer count; e.g. 2 |
    | attributes | Attributes with top score that have been identified on the object at any point. Frigate-generated from MQTT event `end`. | TEXT, JSON | JSON object with key:value pairs; e.g. {"face": 0.86} |
    | direction_calc | Assigned object moving direction relative to device placement; i.e. "outbound" is moving away from the device. | TEXT | "outbound" or "inbound" |
    | speed_calc | Assigned speed/velocity calculated for the entire time the object was in the camera's field of view. | REAL | Positive or negative, corresponding to inbound and outbound direction, respectively |
    | provenance | Source(s) of event detection: any sensor on the device that captured or created this event. The first item in the array is the primary source. | TEXT, JSON | JSON array with every sensor that confirms the same event; e.g. camera sensor: frigate, radar sensor: radar |
    | radarName | Radar sensor name associated (via config) with the camera during the event, regardless of whether the event was confirmed by radar (see provenance). | TEXT | Free-text, defined from configs |
    | deployment_id | Each ID represents a unique deployment configuration and/or location for the device. Acts as a foreign key link to the `deployment` table, `id` column. | TEXT, FOREIGN KEY | `deployment`.`id` foreign key, may be null |

    Telemetry

    HTTP Telemetry

    ThingsBoard HTTP upload telemetry API requests are sent for each event as a single JSON payload containing:

    • ts: frame_time * 1000 - converted to milliseconds

    • values: {event:values} - contains all attributes in the Events Database
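    As an illustration, such a request might look like the following curl call against ThingsBoard's device HTTP telemetry endpoint; the host, port, and $ACCESS_TOKEN come from your config, and the attribute values shown are made up:

    curl -X POST "http://tb.server.com:80/api/v1/$ACCESS_TOKEN/telemetry" \
      -H "Content-Type: application/json" \
      -d '{"ts": 1721144705617, "values": {"id": "1721144705.617111-fg3luy", "camera": "picam_h264", "label": "car", "top_score": 0.87, "direction_calc": "outbound", "speed_calc": -25.3}}'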

    MQTT Publications

    The primary events are available on-device for downstream subscriptions (e.g. Home Assistant):

    • Topic: tm/event

    • Payload: Same as the Events Database attributes

    Additionally, there is a daily cumulative object count utilized for the on-device dashboard and connected displays:

    • Topic: tm/events

    • Payload: Daily cumulative counts of detected objects (resets at 0400 local time). Note: these can be adjusted in the Frigate config for available objects.

      • car

      • person

      • bicycle

      • motorcycle

      • bicycle_adj - bicycle plus motorcycle - adjusts for eBikes, but this is now addressed by the Frigate configs for each object and is unnecessary for most deployments

      • person_adj - person minus bicycle - essentially represents "pedestrian count", since every bicycle will have [at least one] rider that is detected independently of the bike.

        • Scooters and other transit modes will likely only count as a person using the base model.

        • Cars never have a person identified inside of them with the base model.

      • dog

      • cat
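    A quick way to watch these topics from another machine on the network is with the Mosquitto command-line client (the IP address is a placeholder; port and authentication per the broker settings in the Node-RED environment file):

    mosquitto_sub -h 192.168.1.50 -p 1883 -t 'tm/event' -t 'tm/events' -v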


    Radar Payload

    Doppler radar payloads

    Overview

    The Doppler radar enables speed and direction measurement to be collected and added to events.

    Hardware

    See Recommended Hardware for more information on selecting a radar unit.

    The radar, from OmniPreSense:

    OmniPreSense's OPS243 is a complete short-range radar (SRR) solution providing motion detection, speed, direction, and range reporting. All radar signal processing is done on board and a simple API reports the processed data.

    Software

    The OPS243 radar sensors include an easy-to-use API for returning a variety of radar readings and calculated values in JSON format. By default, we capture all of these values in separate tables:

    • DetectedObjectVelocity (command ON).

      • radar_dov table.

      • The sensor determines if an object is present by looking for 2 consecutive speed reports. If met, the max speed detected is reported. If a faster speed is detected, additional speeds are reported. This lowers the number of speed reports for a given detected object. Use On to turn the mode off.

    • TimedSpeedCounts (command @O).

      • radar_timed_speed_counts table.

      • The sensor counts and reports the cumulative number of objects (defined by DetectedObjectVelocity) that have gone by in a given period. The default TM setting is reporting every 300 seconds.

    • Raw Speed Magnitude (command OS).

      • radar_raw_speed_magnitude table and radar_raw_speed_magnitude_single table.

      • Reports the magnitude and associated speed of each reading. The magnitude is a measure of the size, distance, and reflectivity of the detected object. By default, TM captures the 3-burst speed/magnitude pairs and the single strongest magnitude with its associated speed in separate tables, for deep dives and easier analysis, respectively.

    • Vehicle Length (command OC).

      • Note: Requires Firmware OPS9243.

      • radar_oc_payload table.

      • From the docs: Provides several parameters which can help identify the vehicle type and/or lane in which the vehicle is located. This includes start/end time, frames, min/max MPH, magnitude, and length calculations.
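    To sanity-check that a radar is attached and streaming, you can list the serial devices (the ls command comes from the environment file notes) and read the raw readings directly; the exact output depends on which commands above are enabled, and you may need to stop the TM flows first so the port is free:

    ls -lat /sys/class/tty/ttyACM*    # list attached USB serial devices
    cat /dev/ttyACM0                  # stream raw radar readings (port per TM_RADAR_SERIAL_PORT_00)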

    Radar Database

    The following attributes are captured, by default, into tmdb.events.sqlite:

    radar_dov table

    Attribute
    Description
    SQLite data type
    Valid values

    radar_timed_speed_counts table

    Attribute
    Description
    SQLite data type
    Valid values

    radar_raw_speed_magnitude table

    Attribute
    Description
    SQLite data type
    Valid values

    radar_raw_speed_magnitude_single table

    Attribute
    Description
    SQLite data type
    Valid values

    radar_oc_payload table

    Attribute
    Description
    SQLite data type
    Valid values
    TimedSpeedCounts (command @O).
    • radar_timed_speed_counts table.

    • Sensor counts and reports the cumulative number of objects (defined by DetectedObjectVelocity) that have gone by in a given period. Default TM setting is reporting every 300-seconds.

  • Raw Speed Magnitude (command OS).

    • radar_raw_speed_magnitude table and radar_raw_speed_magnitude_single table

    • Reports magnitude and associated speed of each reading. The magnitude is a measure of the size, distance, and reflectivity of the object detected. By default, TM captures the 3-burst speed/magnitude pairs and the single strongest magnitude and associated speed in separate tables for deep dive and easier analysis, respectively.

  • Vehicle Length (command OC).

    • Note: Requires Firmware OPS9243.

    • radar_oc_payload table

    • From the docs: Provides several parameters which can help identify the vehicle type and/or lane in which the vehicle is located. This includes start/end time, frames, min/max MPH, magnitude, and length calculations.

radar_dov table

| Attribute | Description | SQLite data type | Valid values |
| --- | --- | --- | --- |
| time | Unix timestamp directly from the radar | REAL | Unix timestamp in seconds |
| velocity | Maximum speed/velocity calculated across all measurements while the object was detected in the radar zone; `DetectedObjectVelocity` from the API | REAL | Positive or negative, corresponding to inbound and outbound direction, respectively |
| unit | Unit of measure for velocity/speed set on the radar (configurable) | TEXT | "mph" |
| direction | Object moving direction relative to radar placement; i.e. "outbound" is moving away from the radar | TEXT | "outbound" or "inbound" |
| radarName | Radar sensor that captured the data. This field is used to associate with cameras and other radars. | TEXT | |
| deployment_id | Each ID represents a unique deployment configuration and/or location for the device. This acts as a foreign key link to the `deployment` table, `id` column. | TEXT, FOREIGN KEY | `deployment`.`id` foreign key, may be null |

radar_timed_speed_counts table

| Attribute | Description | SQLite data type | Valid values |
| --- | --- | --- | --- |
| time | Unix timestamp directly from the radar | REAL | Unix timestamp in seconds |
| count | Number of object detection (DOV) measurements; should correspond to the number of objects/vehicles that passed through the radar zone | INTEGER | Positive integer |
| average | Average speed/velocity across all object detections (count) during the measurement interval; units defined by the `units` attribute | REAL | Positive or negative, corresponding to inbound and outbound direction, respectively |
| direction | Object moving direction relative to radar placement; i.e. "outbound" is moving away from the radar | TEXT | "outbound" or "inbound" |
| units | Unit of measure for velocity/speed set on the radar (configurable) | TEXT | "mph" |
| radarName | Radar sensor that captured the data. This field is used to associate with cameras and other radars. | TEXT | |
| deployment_id | Each ID represents a unique deployment configuration and/or location for the device. This acts as a foreign key link to the `deployment` table, `id` column. | TEXT, FOREIGN KEY | `deployment`.`id` foreign key, may be null |

radar_raw_speed_magnitude table

| Attribute | Description | SQLite data type | Valid values |
| --- | --- | --- | --- |
| time | Unix timestamp directly from the radar | REAL | Unix timestamp in seconds |
| unit | Unit of measure for velocity/speed set on the radar (configurable) | TEXT | "mph" |
| speed | Array of individual speed/velocity values for the sampling time, depending on the K+ setting; e.g. one speed represents ~50 ms at 20k samples. Corresponds to the magnitude array's position, in descending order of speed (configurable). | REAL | Positive or negative, corresponding to inbound and outbound direction, respectively |
| magnitude | Array of individual magnitude measurements for the sampling time, depending on the K+ setting; e.g. one reading represents ~50 ms at 20k samples. Corresponds to the speed array's position, in descending order of speed (configurable). | TEXT | Positive |
| radarName | Radar sensor that captured the data. This field is used to associate with cameras and other radars. | TEXT | |
| deployment_id | Each ID represents a unique deployment configuration and/or location for the device. This acts as a foreign key link to the `deployment` table, `id` column. | TEXT, FOREIGN KEY | `deployment`.`id` foreign key, may be null |

radar_raw_speed_magnitude_single table

| Attribute | Description | SQLite data type | Valid values |
| --- | --- | --- | --- |
| time | Unix timestamp directly from the radar | REAL | Unix timestamp in seconds |
| unit | Unit of measure for velocity/speed set on the radar (configurable) | TEXT | "mph" |
| speed | Individual speed/velocity for the sampling time, depending on the K+ setting; e.g. one speed represents ~50 ms at 20k samples. Corresponds to the magnitude array's index==0, in descending order of speed (configurable). | REAL | Positive or negative, corresponding to inbound and outbound direction, respectively |
| magnitude | Individual magnitude measurement for the sampling time, depending on the K+ setting; corresponds to the speed array's index==0, in descending order of speed (configurable). | REAL | Positive |
| radarName | Radar sensor that captured the data. This field is used to associate with cameras and other radars. | TEXT | |
| deployment_id | Each ID represents a unique deployment configuration and/or location for the device. This acts as a foreign key link to the `deployment` table, `id` column. | TEXT, FOREIGN KEY | `deployment`.`id` foreign key, may be null |

radar_oc_payload table

| Attribute | Description | SQLite data type | Valid values |
| --- | --- | --- | --- |
| start_time | Unix timestamp directly from the radar | REAL | Unix timestamp in seconds |
| end_time | Unix timestamp directly from the radar | REAL | Unix timestamp in seconds |
| delta_time_msec | Difference between end and start times in milliseconds, representing the amount of time the object was in the radar zone; i.e. (`end_time` minus `start_time`) * 1000 | REAL | |
| direction | Object moving direction relative to radar placement; i.e. "outbound" is moving away from the radar | TEXT | inbound, outbound |
| frames_count | Number of frames, defined by Doppler pings (equivalent to OS), for an object through the field of view (FOV) | INTEGER | |
| velocity_max | Maximum velocity/speed of the object through the field of view (FOV) | REAL | |
| velocity_min | Minimum velocity/speed of the object through the field of view (FOV) | REAL | |
| magnitude_max | Maximum magnitude of the Doppler radar response of the object through the field of view (FOV) | REAL | |
| magnitude_mean | Average/mean magnitude of the Doppler radar response of the object through the field of view (FOV) | REAL | |
| velocity_change | Delta of max_speed minus min_speed. This can help indicate the lane a vehicle is in: a vehicle farther toward the edge of the FOV will have a higher cosine-error change and therefore a larger delta speed. This should be normalized to speed, so offline we've used (max_speed - min_speed)/max_speed. A lower number tends to indicate a vehicle in the farther lane over. | REAL | |
| frames_per_velocity | Number of frames captured per unit of velocity. This acts as the inverse of speed in order to calculate the length of the object. Calculated as `frames_count / velocity_max` | REAL | |
| object_length | Estimated length of the object, calculated from its speed and time through the field of view (see the worked example below). | REAL | |
| units | Velocity unit of measurement. Length units will match accordingly. | TEXT | mph, mps - for Miles Per Hour (imperial) and Meters Per Second (metric), respectively |
| object_label | RESERVED FOR FUTURE USE. Radar-based classification of the type of object that moved through the field of view, estimated from common roadway-based objects. Payload is a JSON object containing key:value pairs of the top calculated labels and their respective likelihood scores. | TEXT, JSON | JSON object with key:value pairs; e.g. {"car": 0.86, "bike": 0.25, "person": 0.10} |
| radarName | Radar sensor that captured the data. This field is used to associate with cameras and other radars. | TEXT | |
| deployment_id | Each ID represents a unique deployment configuration and/or location for the device. This acts as a foreign key link to the `deployment` table, `id` column. | TEXT, FOREIGN KEY | `deployment`.`id` foreign key, may be null |
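To make the derived radar_oc_payload fields concrete, here is a small worked example using the formulas above. The numbers are invented, and the exact on-device length calculation is not reproduced here, so treat the length line as an approximation only:

```python
# Worked example of the derived Vehicle Length (OC) values.
frames_count = 18           # Doppler pings while the object was in the FOV
velocity_max = 28.4         # mph
start_time = 1721144705.61  # Unix seconds, from the radar
end_time = 1721144706.32

# Time in the radar zone: (end_time minus start_time) * 1000
delta_time_msec = (end_time - start_time) * 1000

# Inverse-speed helper used for length estimation
frames_per_velocity = frames_count / velocity_max

# One plausible length estimate: speed * time in the FOV, converting
# mph to feet per second (1 mph ~= 1.4667 ft/s).
object_length_ft = velocity_max * 1.4667 * (delta_time_msec / 1000.0)
print(round(delta_time_msec), round(frames_per_velocity, 2), round(object_length_ft, 1))
```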

    Air Quality (AQ) Payload

    Environment and air quality sensor for the Traffic Monitor

    Overview

The air quality monitor enables collection of a variety of environmental measurements, including gases commonly associated with pollution, temperature, pressure, humidity, and much more.

    The AQ software is available at greendormer/enviroplus-monitor. It is based on the wonderful work from the roscoe81/enviro-monitor and pimoroni/enviroplus projects.

    Hardware

    The following hardware has been tested and incorporated into the Traffic Monitor.

Although we strive to include high-quality equipment and data collection in our application, we make no warranty on the veracity or quality of the hardware or data. We welcome those with an environmental science background to contribute!

• Enviro for Raspberry Pi – Enviro + Air Quality

  • Air quality (pollutant gases and particulates*), temperature, pressure, humidity, light, and noise

    Software

The AQ software is available at greendormer/enviroplus-monitor as a Python service script that communicates with the Traffic Monitor Node-RED flow via MQTT messages. See the Getting started section of the repository for installation and setup instructions.

    config.json

See the Config Readme for a detailed description of every available key.

    Recommended config settings

The following are important keys for the recommended, Traffic Monitor-specific default configuration (a combined example follows this list):

• "enable_send_data_to_homemanager": true in order to send MQTT payloads to the specified broker

• "mqtt_broker_name": "localhost" to send to the Node-RED MQTT broker (assumes port 1883)

• "indoor_outdoor_function": "Outdoor" to utilize outdoor_mqtt_topic

• "enable_display": false, since the AQ sensor will be in an enclosure

• "outdoor_mqtt_topic": "aq/sensorname01/readings" for sending messages; the topic must start with "aq", and the middle element ("sensorname01") must be defined in your TM config

• "long_update_delay": 300 for the time between sending MQTT messages (default 300 seconds)

    Deployment-specific config settings

The following location-based settings need to be set per deployment for your location. They are utilized by the astral package for calculating the times of various aspects of the sun and phases of the moon (lat/lon, time zone) and for calibrating temperature, humidity, barometer, and gas (altitude) readings.
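For example, for a deployment in Portland:

```json
{
    "altitude": 49,
    "city_name": "Portland",
    "time_zone": "America/Los_Angeles",
    "custom_locations": [
        "Portland, United States of America, America/Los_Angeles, 45.52, -122.681944"
    ]
}
```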

    Air Quality MQTT Incoming Payload

    The TM AQ application sends messages via MQTT integration on the aq/readings topic.

The sensor needs to stabilize (default 5 minutes) after the script initializes before it will send external updates (via MQTT). This is defined by startup_stabilisation_time in config.json.

MQTT attribute details:

| Key | Valid Values | Units | Notes |
| --- | --- | --- | --- |
| gas_calibrated | true/false | | gas_sensors_warmup_time = 6000, or startup_stabilisation_time when reset_gas_sensor_calibration = true |
| Forecast | {OBJECT} | | Calculated forecast based on sensor barometer changes. Valid: true/false; 3 Hour Change is the millibars difference in barometer readings; Forecast is a description calculated from the barometer change. |
| bar | [REAL, TEXT] | hPa; Comfort-level {"0": "Stable", "1": "Fair", "3": "Poorer/Windy", "4": "Rain/Gale/Storm"} | Air pressure, compensated for altitude and temp as Bar / comp_factor, where comp_factor = math.pow(1 - (0.0065 * altitude/(temp + 0.0065 * altitude + 273.15)), -5.257) |
| hum | [REAL, TEXT] | %; Comfort-level {"good": "1", "dry": "2", "wet": "3"} | |
| dew | REAL | C | Calculated from Temp and Hum as (237.7 * (math.log(dew_hum/100)+17.271*dew_temp/(237.7+dew_temp))/(17.271 - math.log(dew_hum/100) - 17.271*dew_temp/(237.7 + dew_temp))) |
| temp | REAL | C | Adjusted for compensation factor set in config.json |
| temp_min | REAL | C | Minimum temperature measured while the sensor was running (only resets on restart) |
| temp_max | REAL | C | Maximum temperature measured while the sensor was running (only resets on restart) |
| temp_raw | REAL | C | Read directly from sensor, absent compensation. |
| bar_raw | REAL | hPa | Read directly from sensor, absent compensation. |
| hum_raw | REAL | % | Read directly from sensor, absent compensation. |
| pm01 | REAL | ug/m3 (micrograms per cubic meter, µg/m³) | Particulate Matter 1 micrometer/micron (PM1). Read directly using the pms5003.pm_ug_per_m3() method from the particulate matter sensor. |
| pm025 | REAL | ug/m3 (micrograms per cubic meter, µg/m³) | Particulate Matter 2.5 micrometers/microns (PM2.5). Read directly using the pms5003.pm_ug_per_m3() method from the particulate matter sensor. |
| pm10 | REAL | ug/m3 (micrograms per cubic meter, µg/m³) | Particulate Matter 10 micrometers/microns (PM10). Read directly using the pms5003.pm_ug_per_m3() method from the particulate matter sensor. |
| gas_red | REAL | ppm | Red PPM calculated as red_in_ppm = math.pow(10, -1.25 * math.log10(red_ratio) + 0.64). red_ratio is the compensated gas value; see Software notes. |
| gas_oxi | REAL | ppm | Oxi PPM calculated as oxi_in_ppm = math.pow(10, math.log10(oxi_ratio) - 0.8129). oxi_ratio is the compensated gas value; see Software notes. |
| nh3 | REAL | ppm | NH3 PPM calculated as nh3_in_ppm = math.pow(10, -1.8 * math.log10(nh3_ratio) - 0.163). nh3_ratio is the compensated gas value; see Software notes. |
| gas_red_raw | REAL | Ohms | Read directly from sensor using the gas_data.reducing method, absent compensation. |
| gas_oxi_raw | REAL | Ohms | Read directly from sensor using the gas_data.oxidising method, absent compensation. |
| gas_nh3_raw | REAL | Ohms | Read directly from sensor using the gas_data.nh3 method, absent compensation. |
| lux | REAL | lux | Read directly using the ltr559.get_lux() method from the light sensor. |
| current_time | REAL | Unix time in seconds | Created by script upon reading values. |
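An example payload published on the readings topic:

```json
{
    "gas_calibrated": false,
    "bar": [
        1009.44,
        "0"
    ],
    "hum": [
        28.7,
        "2"
    ],
    "p025": 0,
    "p10": 0,
    "p01": 0,
    "dew": 2.1,
    "temp": 21,
    "temp_min": 20.9,
    "temp_max": 21.1,
    "gas_red": 4.2,
    "gas_oxi": 0.15,
    "gas_nh3": 0.69,
    "lux": 1.2,
    "proximity": 255,
    "lux_raw": 1.16185,
    "temp_raw": 28.68982029794099,
    "bar_raw": 1003.7122773175154,
    "hum_raw": 18.31337919009301,
    "gas_red_raw": 140805,
    "gas_oxi_raw": 103585,
    "gas_nh3_raw": 227871,
    "current_time": 1738698665.077546
}
```

And a minimal sketch of consuming these readings from another process (assumes the paho-mqtt package, version 2.x, and the broker on localhost:1883; in the Traffic Monitor itself this is handled by the Node-RED flow):

```python
# Sketch: subscribe to AQ readings over MQTT and print a few fields.
import json
import paho.mqtt.client as mqtt

def on_message(client, userdata, msg):
    readings = json.loads(msg.payload)
    print(msg.topic, readings.get("temp"), readings.get("hum"))

client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
client.on_message = on_message
client.connect("localhost", 1883)
client.subscribe("aq/#")  # covers aq/<sensorname>/readings topics
client.loop_forever()
```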

    Air Quality Database

    The following attributes are captured, by default, into tmdb.events.sqlite:

| Attribute | Description | SQLite data type | Valid values |
| --- | --- | --- | --- |
| entryDateTime | Unix timestamp when data capture began on the sensor | REAL | Unix timestamp in seconds |
| gas_calibrated | Indicates if gas sensors are fully "warmed up". Will be false until `gas_sensors_warmup_time` is met (default 10 minutes after the sensor starts) | REAL | BOOLEAN, 1 = true / 0 = false |
| temp | Temperature reading in degrees Celsius measured directly from the BME280 sensor with compensation factor set from device config. | REAL | |
| bar | Barometer air pressure reading in hectopascals (hPa) measured directly from the BME280 sensor with compensation set for altitude from device config. | REAL | |
| hum | Humidity reading in percent (%) measured directly from the BME280 sensor with compensation factor set from device config. | REAL | |
| dew | Calculated dew point in degrees Celsius, based on temperature and humidity using the following calculation (Python): `(237.7 * (math.log(dew_hum/100)+17.271*dew_temp/(237.7+dew_temp))/(17.271 - math.log(dew_hum/100) - 17.271*dew_temp/(237.7 + dew_temp)))` | REAL | |
| temp_raw | Temperature reading in degrees Celsius measured directly from the BME280 sensor absent of any compensation (raw values). | REAL | |
| bar_raw | Barometer air pressure reading in hectopascals (hPa) measured directly from the BME280 sensor absent of any compensation (raw values). | REAL | |
| hum_raw | Humidity reading in percent (%) measured directly from the BME280 sensor absent of any compensation (raw values). | REAL | |
| pm01 | Particulate Matter (PM) at 1 micrometer or greater in diameter, in micrograms per cubic meter (ug/m3), measured directly from the PM sensor. | REAL | 0-infinity |
| pm025 | Particulate Matter (PM) at 2.5 micrometers or greater in diameter, in micrograms per cubic meter (ug/m3), measured directly from the PM sensor. | REAL | 0-infinity |
| pm10 | Particulate Matter (PM) at 10 micrometers or greater in diameter, in micrograms per cubic meter (ug/m3), measured directly from the PM sensor. | REAL | 0-infinity |
| gas_red | Reducing gases (RED) reading in Parts Per Million (PPM) measured directly from the gas sensor with compensation factor set for drift; e.g. hydrogen, carbon monoxide. | REAL | 0-infinity |
| gas_oxi | Oxidising gases (OX) reading in Parts Per Million (PPM) measured directly from the gas sensor with compensation factor set for drift; e.g. chlorine, nitrous oxide. | REAL | 0-infinity |
| gas_nh3 | Ammonia (NH3) reading in Parts Per Million (PPM) measured directly from the gas sensor with compensation factor set for drift. | REAL | 0-infinity |
| gas_red_raw | Reducing gases (RED) reading in Ohms measured directly from the gas sensor absent of any compensation (raw values); e.g. hydrogen, carbon monoxide. | REAL | 0-infinity |
| gas_oxi_raw | Oxidising gases (OX) reading in Ohms measured directly from the gas sensor absent of any compensation (raw values); e.g. chlorine, nitrous oxide. | REAL | 0-infinity |
| gas_nh3_raw | Ammonia (NH3) reading in Ohms measured directly from the gas sensor absent of any compensation (raw values); gas resistance for NH3/ammonia. | REAL | 0-infinity |
| lux | Lux reading in lux measured directly from the optical sensor with proximity-adjusted minimum. | REAL | 0.01 to 64k lux |
| lux_raw | Lux reading in lux measured directly from the optical sensor. | REAL | 0.01 to 64k lux |
| proximity | Proximity reading measured directly from the optical sensor. | REAL | 0-infinity |
| sensorName | Air Quality sensor that captured the data. This field may be used to associate with other sensors. | TEXT | |
| deployment_id | Each ID represents a unique deployment configuration and/or location for the device. This acts as a foreign key link to the `deployment` table, `id` column. | TEXT, FOREIGN KEY | `deployment`.`id` foreign key, may be null |

    Notes on Air Quality readings

    Gas sensor

The analog gas sensor: the MICS6814 is a robust MEMS sensor for the detection of pollution from automobile exhausts and for agricultural/industrial odors.

The sensor includes the ability to detect reductive (RED), oxidative (OXI), and ammonia (NH3) gases. The raw gas readings are measured as Ohms of resistance for their respective gases, but the software compensates for temperature, humidity, altitude, and drift to provide PPM (parts per million) equivalents.*

*See Software notes and additional discussions at pimoroni/enviroplus-python #47 and pimoroni/enviroplus-python #67.

    Software notes

• Gas calibrates using Temp, Humidity, and Barometric Pressure readings.

• Gas sensors (Red, Oxi, NH3) take 100 minutes to warm up before readings become available.

• Raw gas readings also have compensation factors applied, determined by regression analysis.

• To compensate for gas sensor drift over time, the software calibrates the gas sensors daily at the time set by gas_daily_r0_calibration_hour, using an average of daily readings over a week, if not already done in the current day and if warm-up calibration is completed.
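The PPM conversions listed in the MQTT attribute table above are straightforward to reproduce; here is a sketch (the *_ratio inputs are the compensated gas values produced by the monitor software):

```python
# Sketch: the gas PPM conversions from the MQTT attribute table above.
import math

def red_ppm(red_ratio: float) -> float:
    return math.pow(10, -1.25 * math.log10(red_ratio) + 0.64)

def oxi_ppm(oxi_ratio: float) -> float:
    return math.pow(10, math.log10(oxi_ratio) - 0.8129)

def nh3_ppm(nh3_ratio: float) -> float:
    return math.pow(10, -1.8 * math.log10(nh3_ratio) - 0.163)
```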

    Temperature, pressure, and humidity

The BME280 temperature, pressure, and humidity sensor with I2C digital output.

    Software notes

• Temp (temperature) and Hum (humidity) have cubic polynomial compensation factors applied to raw readings.

• Min Temp and Max Temp are calculated over the entire time the script is running.

• Bar (Barometer) reading updates only every 20 minutes.

• Air pressure reading has an altitude compensation factor applied (defined in config.json).

• Dew (Dew Point) is calculated from temperature and humidity using the following calculation: `dewpoint = (237.7 * (math.log(dew_hum/100)+17.271*dew_temp/(237.7+dew_temp))/(17.271 - math.log(dew_hum/100) - 17.271*dew_temp/(237.7 + dew_temp)))`
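As runnable forms of the dew point and altitude-compensation formulas above (a sketch; function and variable names are illustrative):

```python
import math

def dew_point(dew_temp: float, dew_hum: float) -> float:
    # Dew point (deg C) from temperature (deg C) and relative humidity (%)
    a = math.log(dew_hum / 100) + 17.271 * dew_temp / (237.7 + dew_temp)
    return 237.7 * a / (17.271 - a)

def compensated_bar(bar: float, temp: float, altitude: float) -> float:
    # Altitude/temperature compensation applied to the barometer reading
    comp_factor = math.pow(
        1 - (0.0065 * altitude / (temp + 0.0065 * altitude + 273.15)), -5.257
    )
    return bar / comp_factor
```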

    Optical (light, proximity)

The LTR-559 light and proximity sensor.

    Noise

MEMS microphone (see datasheet).

    Particulate matter (PM)

The Plantower PMS5003 Particulate Matter (PM) Sensor.

    PMS5003 Particulate Matter Sensor for Enviro

    • Monitor air pollution cheaply and accurately with this matchbox-sized particulate matter (PM) sensor from Plantower!

    • It senses particulates of various sizes (PM1, PM2.5, PM10) from sources like smoke, dust, pollen, metal and organic particles, and more.

    "enable_display": false since the AQ sensor will be in an enclosure

  • "outdoor_mqtt_topic": "aq/sensorname01/readings" for sending messages, must start with "aq" and the middle element, "sensorname01" must be defined in your TM config

  • "long_update_delay": 300 for time between sending MQTT messages (default 300-seconds)

  • Forecast

    {OBJECT}

    Valid: true/false, 3 Hour Change is millibars difference in barometer readings, Forecast is description calculated from barometer change

    Calculated forecast based on sensor barometer changes

    pm01

    REAL

    ug/m3 (microgram per meter cubed, µg/m³)

    Particulate Matter 1 micrometers / microns (PM1, PM1), Read directly using the pms5003.pm_ug_per_m3() method from the particulate matter sensor.

    pm025

    REAL

    ug/m3 (microgram per meter cubed, µg/m³)

    Particulate Matter 2.5 micrometers / microns (PM2.5, PM2.5), read directly using the pms5003.pm_ug_per_m3() method from the particulate matter sensor.

    pm10

    REAL

    ug/m3 (microgram per meter cubed, µg/m³)

    Particulate Matter 10 micrometers / microns (PM10, PM10), Read directly using the pms5003.pm_ug_per_m3() method from the particulate matter sensor.

    dew

    REAL

    C

    Calculated from Temp and Hum as (237.7 * (math.log(dew_hum/100)+17.271*dew_temp/(237.7+dew_temp))/(17.271 - math.log(dew_hum/100) - 17.271*dew_temp/(237.7 + dew_temp)))

    temp

    REAL

    C

    Adjusted for compensation factor set in config.json

    temp_min

    REAL

    C

    Minimum temperature measured while sensor was running (only resets on restart)

    temp_max

    REAL

    C

    Maximum temperature measured while sensor was running (only resets on restart)

    gas_red

    REAL

    ppm

    Red PPM calculated as red_in_ppm = math.pow(10, -1.25 * math.log10(red_ratio) + 0.64). red_ratio is compensated gas value, see Software notes.

    gas_oxi

    REAL

    ppm

    Oxi PPM calculated as oxi_in_ppm = math.pow(10, math.log10(oxi_ratio) - 0.8129). oxi_ratio is compensated gas value, see Software notes.

    nh3

    REAL

    ppm

    NH3 PPM calculated as nh3_in_ppm = math.pow(10, -1.8 * math.log10(nh3_ratio) - 0.163). nh3_ratio is compensated gas value, see Software notes.

    lux

    REAL

    lux

    Read directly using the ltr559.get_lux() method from the light sensor.

    temp_raw

    REAL

    C

    Read directly from sensor absent compensation.

    bar_raw

    REAL

    C

    Read directly from sensor absent compensation.

    hum_raw

    REAL

    %

    Read directly from sensor absent compensation.

    gas_red_raw

    REAL

    Ohms

    Read directly from sensor using gas_data.reducing method absent compensation.

    gas_oxi_raw

    REAL

    Ohms

    Read directly from sensor using gas_data.oxidising method absent compensation.

    gas_nh3_raw

    REAL

    Ohms

    Read directly from sensor using gas_data.nh3 method absent compensation.

    current_time

    REAL

    Unix time in Seconds

    Created by script upon reading values.

    bar

    Barometer air pressure reading in bars (hPa) measured directly from BME280 sensor with compensation set for altitude from device config.

    REAL

    hum

    Humidity reading in percent (%) measured directly from BME280 sensor with compensation factor set from device config.

    REAL

    dew

    Calculated dew point in degree Celsius, based on temperature and humidity using the following calculation (Python) `(237.7 * (math.log(dew_hum/100)+17.271*dew_temp/(237.7+dew_temp))/(17.271 - math.log(dew_hum/100) - 17.271*dew_temp/(237.7 + dew_temp)))`

    REAL

    temp_raw

    Temperature reading in degree Celsius measured directly from BME280 sensor absent of any compensation (raw values).

    REAL

    bar_raw

    Barometer air pressure reading in bars (hPa) measured directly from BME280 sensor absent of any compensation (raw values).

    REAL

    hum_raw

    Humidity reading in percent (%) measured directly from BME280 sensor absent of any compensation (raw values).

    REAL

    pm01

    Particulate Matter (PM) at 1 micrometers or greater in diameter in micrograms per cubic meter (ug/m3) measured directly from PM sensor.

    REAL

    0-infinity

    pm025

    Particulate Matter (PM) at 2.5 micrometers or greater in diameter in micrograms per cubic meter (ug/m3) measured directly from PM sensor.

    REAL

    0-infinity

    pm10

    Particulate Matter (PM) at 10 micrometers or greater in diameter in micrograms per cubic meter (ug/m3) measured directly from PM sensor.

    REAL

    0-infinity

    gas_red

    Reducing gases (RED) reading in Parts Per Million (PPM) measured directly from gas sensor with compensation factor set for drift. Eg hydrogen, carbon monoxide

    REAL

    0-infinity

    gas_oxi

    Oxidising gases (OX) reading in Parts Per Million (PPM) measured directly from gas sensor with compensation factor set for drift. Eg chlorine, nitrous oxide

    REAL

    0-infinity

    gas_nh3

    Ammonia (NH3) reading in Parts Per Million (PPM) measured directly from gas sensor with compensation factor set for drift. Gas resistance for nh3/ammonia

    REAL

    0-infinity

    gas_red_raw

    Reducing gases (RED) reading in Ohms measured directly from gas sensor absent of any compensation (raw values). Eg hydrogen, carbon monoxide

    REAL

    0-infinity

    gas_oxi_raw

    Oxidising gases (OX) reading in Ohms measured directly from gas sensor absent of any compensation (raw values). Eg chlorine, nitrous oxide

    REAL

    0-infinity

    gas_nh3_raw

    Ammonia (NH3) reading in Ohms measured directly from gas sensor absent of any compensation (raw values). Gas resistance for nh3/ammonia

    REAL

    0-infinity

    lux

    Lux reading in Lux measured directly from optical sensor with proximity-adjusted minimum.

    REAL

    0.01 to 64k lux

    lux_raw

    Lux reading in Lux measured directly from optical sensor.

    REAL

    0.01 to 64k lux

    proximity

    Proximity reading measure directly from optical sensor.

    REAL

    0-infinity

    sensorName

    Air Quality sensor that captured the data. This field may be used to associate with other sensors.

    TEXT

    deployment_id

    Each ID represents a unique deployment configuration and/or location for the device. This acts a foreign key link to the `deployment` table, `id` column.

    TEXT, FOREIGN KEY

    `deployment`.`id` foreign key, may be null
