Best algorithm to detect presence in a room based on input from 2 motion sensors

Assume we have 2 motion sensors in a room. Currently, there are 2 algorithms to detect presence in the room based on input from the 2 motion sensors.

  1. Simple algorithm Wasp in a Box

  2. Complicated algorithm The Phi Accrual Failure Detector

However, the Wasp in a Box algorithm, while very simple, needs a door sensor. It cannot work for an open room that links to other rooms and has no door to open or close.

On the other hand, The Phi Accrual Failure Detector algorithm is complicated to implement in Lua, and its downside is that the time to detect no presence is about 30 minutes with the timeout parameter set at 240 seconds.

This, in fact, is not the case. WIAB can use any sort of detector, it doesn’t even need an actual open/closed door. What’s crucial is that the ‘doorway’ one actually detects people entering or leaving (for example, a downward facing PIR above a doorway.) Once the doorway sensor is triggered, there is ambiguity as to whether it was someone entering or leaving, so it is good to ensure that the ‘room’ sensor (inside the ‘box’) is triggered shortly after someone enters.

There are other potentially ambiguous situations, mostly involving multiple people, but nevertheless it’s far superior to a single motion detector in the room. And, as you point out, it’s really simple to implement.

Was a Plug In ever made for this?

Very easy to do with two scenes and a bit of Lua. See Strategy question on motion control.

Also Plugin suggestion : Automatic Presence Detection.

[quote]This, in fact, is not the case. WIAB can use any sort of detector, it doesn’t even need an actual open/closed door. What’s crucial is that the ‘doorway’ one actually detects people entering or leaving (for example, a downward facing PIR above a doorway.) Once the doorway sensor is triggered, there is ambiguity as to whether it was someone entering or leaving, so it is good to ensure that the ‘room’ sensor (inside the ‘box’) is triggered shortly after someone enters.

There are other potentially ambiguous situations, mostly involving multiple people, but nevertheless it’s far superior to a single motion detector in the room. And, as you point out, it’s really simple to implement.[/quote]

I have 2 open rooms and use 2 motion sensors for each room.

  1. The dining room has 3 doorways: 2 doorways (without a physical door) connect to the living room and the kitchen, and 1 doorway (with a physical door) connects to the bedroom.

  2. The kitchen has 1 doorway (without a physical door) connecting to the dining room.

I think the Wasp in a Box algorithm will not work for my dining room.

For my kitchen, it will fail to detect leaving if I enter and exit the kitchen faster than the motion sensor’s detection interval. For example, if the detection interval of the downward-facing PIR above the doorway is 30 seconds, it will fail to detect leaving whenever I enter and exit the kitchen in less than 30 seconds. To avoid failing to detect leaving, the detection interval needs to be set to 1 or 2 seconds, which drains the battery.
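As a rough sketch of this trade-off (Python; the function name and the numbers are mine, purely for illustration): a doorway PIR whose blind time is longer than an enter-and-exit transit never reports the exit pass.

```python
def transit_detected(transit_seconds, blind_time_seconds):
    """After a PIR trips, it is 'blind' for blind_time_seconds and sends
    no further reports. If a person enters and exits within that window,
    only the entry trip is reported and the exit pass is lost."""
    return transit_seconds >= blind_time_seconds

# A 20-second in-and-out trip with a 30-second blind time is missed;
# dropping the blind time to 2 seconds catches it, but the sensor then
# reports far more often and drains its battery faster.
```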

Clearly, there are constraints.

[ul][li]if you don’t have a detector for each doorway [/li]
[li]if you don’t measure often enough to detect the passage of someone[/li][/ul]

It’s only an algorithm and it can’t work without the right data. Nothing else would either. It’s not magic.

[quote=“akbooer, post:6, topic:196917”]Clearly, there are constraints.

[ul][li]if you don’t have a detector for each doorway [/li]
[li]if you don’t measure often enough to detect the passage of someone[/li][/ul]

It’s only an algorithm and it can’t work without the right data. Nothing else would either. It’s not magic.[/quote]

Have you tried to implement The Phi Accrual Failure Detector algorithm?

Waiting for the Sound Sensor coming from Aeotec in 2017. This 4-in-1 sensor will solve our challenge of detecting presence in a room.

[url=https://aeotec.com/z-wave-glass-break-sound-sensor]https://aeotec.com/z-wave-glass-break-sound-sensor[/url]

  1. z-wave glass break sensor
    Glass break sensor.
    Thieves don't always come in through your doors. Sound Sensor lets you know when intruders are trying to break in by smashing your windows.

  2. z-wave sound sensor
    Presence sensor.
    Knowing if someone is in a room is essential, especially when you're not home. Sound Sensor listens for sounds that shouldn't be there, sending you alerts when they are.

  3. z-wave smoke sensor
    Smoke sensor.
    Monitor your home's most important safety sensors without replacing them. Sound Sensor can make any smoke detector a Z-Wave smoke detector.

  4. z-wave gas sensor
    CO & CO2 sensor.
    Whether it's carbon dioxide or carbon monoxide that you're monitoring, Sound Sensor can integrate CO detectors into your home's Z-Wave network.

While waiting for manufacturers to research and produce real presence sensors, I will share my best algorithm to detect presence in a room based on input from 2 motion sensors.

My best algorithm is based on following observation:

A human will not sit or stand motionless for more than 10 minutes in the dining room or kitchen. For the living room, bedroom and reading room, I use motion sensors plus pressure mat sensors to detect presence.

Based on the above observation, here is my best algorithm for the dining room and kitchen:

  1. If a motion sensor in a room is tripped, then a human is in that room.

  2. If all motion sensors in a room are untripped and the last trip was more than 10 minutes ago, then no human is in that room.

This algorithm is better than the Wasp in a Box algorithm: it works very well for an open room that links to other rooms and has no door to open or close.

This algorithm is also simpler to implement than The Phi Accrual Failure Detector algorithm.
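Before the Vera-specific code, the two rules can be sketched as a small tracker (Python, for illustration only; the class and method names are mine):

```python
import time

class PresenceTracker:
    """Rule 1: any tripped sensor means a human is in the room.
    Rule 2: all sensors untripped and the last trip older than the
    timeout (10 minutes by default) means the room is empty."""

    def __init__(self, timeout_seconds=600):
        self.timeout = timeout_seconds
        self.tripped = set()   # sensors currently reporting motion
        self.last_trip = None  # time of the most recent trip

    def on_trip(self, sensor, now=None):
        self.tripped.add(sensor)
        self.last_trip = time.time() if now is None else now

    def on_untrip(self, sensor):
        self.tripped.discard(sensor)

    def occupied(self, now=None):
        now = time.time() if now is None else now
        if self.tripped:                               # rule 1
            return True
        if self.last_trip is None:
            return False
        return (now - self.last_trip) < self.timeout   # rule 2
```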

Here is the implementation of my best algorithm to detect presence in a room based on input from 2 motion sensors.

  1. Put the following in the Lua startup:

SS_SID = "urn:micasaverde-com:serviceId:SecuritySensor1"
SP_SID = "urn:upnp-org:serviceId:SwitchPower1"

HumanInKitchen = 0
LastTripInKitchen = os.time()
KitchenTimeOut = 600

KitchenLight = 186
KitchenNeoMotionSensor = 152
KitchenFibaroMotionSensor = 158

function getStatus(devID)
    return luup.variable_get(SP_SID, "Status", devID)
end

function isOn(devID)
    return luup.variable_get(SP_SID, "Status", devID) == "1"
end

function isOff(devID)
    return luup.variable_get(SP_SID, "Status", devID) == "0"
end

function getTrip(devID)
    return luup.variable_get(SS_SID, "Tripped", devID)
end

function isTrip(devID)
    return luup.variable_get(SS_SID, "Tripped", devID) == "1"
end

function isNotTrip(devID)
    return luup.variable_get(SS_SID, "Tripped", devID) == "0"
end

function turnOnLight(devID)
    if getStatus(devID) == "0" then
        luup.call_action(SP_SID, "SetTarget", {newTargetValue = 1}, devID)
    end
end

function turnOffLight(devID)
    if getStatus(devID) == "1" then
        luup.call_action(SP_SID, "SetTarget", {newTargetValue = 0}, devID)
    end
end

  2. Create a scene named SomebodyInKitchen. This scene is triggered by the KitchenNeoMotionSensor or KitchenFibaroMotionSensor tripped event. In the Lua code of this scene, put the following:

LastTripInKitchen = os.time()
HumanInKitchen = 1
-- log("Detected movement in kitchen room, turn on kitchen light")
turnOnLight(KitchenLight)

  3. Create a scene named NobodyInKitchen. This scene is triggered by the KitchenNeoMotionSensor or KitchenFibaroMotionSensor untripped event. In the Lua code of this scene, put the following:

function CheckHumanInKitchenStatus()
    if HumanInKitchen == 1 or os.difftime(os.time(), LastTripInKitchen) < KitchenTimeOut then
        -- log("Movement in kitchen room within KitchenTimeOut seconds, do not turn off light")
        return true
    end
    -- log("No movement in kitchen room after KitchenTimeOut seconds, turn off light")
    turnOffLight(KitchenLight)
end

LastTripInKitchen = os.time()
-- log("Detected no movement in kitchen room, check all sensors")
if isNotTrip(KitchenFibaroMotionSensor) and isNotTrip(KitchenNeoMotionSensor) then
    -- log("All sensors are untripped. Clear the HumanInKitchen indicator and schedule a check after KitchenTimeOut seconds to turn off the kitchen light")
    HumanInKitchen = 0
    luup.call_delay("CheckHumanInKitchenStatus", KitchenTimeOut)
end

Adding a time dimension to improve the presence detection algorithm

  1. I defined 2 timeout constants in the startup Lua: one for peak hours and another for non-peak hours.

PeakHourTimeout = 600 -- 10 minutes
NonPeakHourTimeout = 180 -- 3 minutes

  2. After that, I divided the 24 hours of a day into several time segments and put them in the startup Lua.

function breakfasttime()
    local t = os.date('*t')
    return 6 <= t.hour and t.hour < 9
end

function morningtime()
    local t = os.date('*t')
    return 9 <= t.hour and t.hour < 11
end

function lunchtime()
    local t = os.date('*t')
    return 11 <= t.hour and t.hour < 13
end

function afternoontime()
    local t = os.date('*t')
    return 13 <= t.hour and t.hour < 17
end

function dinnertime()
    local t = os.date('*t')
    return 17 <= t.hour and t.hour < 19
end

function eveningtime()
    local t = os.date('*t')
    return 19 <= t.hour and t.hour < 23
end

function sleepingtime()
    local t = os.date('*t')
    return 23 <= t.hour or t.hour < 6
end

function cookingtime()
    local t = os.date('*t')
    return 16 <= t.hour and t.hour < 19
end

  3. The time dimension was added into the NobodyInKitchen scene as follows:

function CheckHumanInKitchenStatus()
    if cookingtime() then
        if HumanInKitchen == 1 or os.difftime(os.time(), LastTripInKitchen) < PeakHourTimeout then
            -- log("Movement in kitchen room within " .. PeakHourTimeout .. " seconds, do not turn off light")
            return true
        end
    else
        if HumanInKitchen == 1 or os.difftime(os.time(), LastTripInKitchen) < NonPeakHourTimeout then
            -- log("Movement in kitchen room within " .. NonPeakHourTimeout .. " seconds, do not turn off light")
            return true
        end
    end

    -- log("No movement in kitchen room after delay Peak/NonPeakHourTimeout, turn off light")
    turnOffLight(KitchenLight)
end

LastTripInKitchen = os.time()
-- log("Detected no movement in kitchen room, check all sensors")
if isNotTrip(KitchenFibaroMotionSensor) and isNotTrip(KitchenNeoMotionSensor) then
    -- log("All sensors are untripped. Clear the HumanInKitchen indicator and schedule the delayed check")
    HumanInKitchen = 0
    if cookingtime() then
        luup.call_delay("CheckHumanInKitchenStatus", PeakHourTimeout)
    else
        luup.call_delay("CheckHumanInKitchenStatus", NonPeakHourTimeout)
    end
end

The time dimension was added into the NobodyInDinning scene as follows:

function CheckHumanInDinningStatus()
    if breakfasttime() or lunchtime() or dinnertime() then
        if HumanInDinning == 1 or os.difftime(os.time(), LastTripInDinning) < PeakHourTimeout then
            -- log("Movement in Dinning room within " .. PeakHourTimeout .. " seconds, do not turn off light")
            return true
        end
    else
        if HumanInDinning == 1 or os.difftime(os.time(), LastTripInDinning) < NonPeakHourTimeout then
            -- log("Movement in Dinning room within " .. NonPeakHourTimeout .. " seconds, do not turn off light")
            return true
        end
    end

    -- log("No movement in Dinning room after delay Peak/NonPeakHourTimeout, turn off light")
    turnOffLight(DinningLight)
end

LastTripInDinning = os.time()
-- log("Detected no movement in Dinning room, check all sensors")
if isNotTrip(DinningFibaroMotionSensor) and isNotTrip(DinningNeoMotionSensor) then
    -- log("All sensors are untripped. Clear the HumanInDinning indicator and schedule the delayed check")
    HumanInDinning = 0
    if breakfasttime() or lunchtime() or dinnertime() then
        luup.call_delay("CheckHumanInDinningStatus", PeakHourTimeout)
    else
        luup.call_delay("CheckHumanInDinningStatus", NonPeakHourTimeout)
    end
end

After adding the time dimension (peak/non-peak hour timeouts and time segments), the presence detection algorithm worked very well; I had almost zero failed detections.

New algorithm to detect presence in an open room that has only one doorframe without physical door

Here is the original Wasp In A Box algorithm:

1. Assume there is no wasp in the box.
2. Have you heard the wasp buzzing in the box?
    YES = There is a wasp in the box until you are told otherwise.
3. Have you opened the box since you last heard buzzing?
    YES = There is no wasp in the box.
    NO = There is still a wasp in the box if you heard one earlier.
Go to step 2.
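The steps above reduce to a two-event state machine; a minimal sketch in Python (the names are mine, for illustration), where `buzz` is the in-room motion sensor and `open_box` is the doorway/door sensor:

```python
class WaspInABox:
    """Wasp-in-a-box presence detection."""

    def __init__(self):
        self.present = False  # step 1: assume there is no wasp in the box

    def buzz(self):
        # step 2: buzzing heard, so there is a wasp until told otherwise
        self.present = True

    def open_box(self):
        # step 3: the box was opened since the last buzz, so it is empty
        # (a buzz after this will latch presence on again)
        self.present = False
```

On entry the door event fires first and clears presence, then the room sensor trips shortly afterwards and sets it; on exit the door event clears presence and no further buzz follows.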

The Wasp In A Box algorithm works very well for a room that has a physical door. However, it does not work well for an open room that has no door to open or close, and implementing it requires a scheduled heartbeat job that runs every minute.

In search of a better algorithm to detect presence in an open room that has only one door frame without a physical door, I was inspired by the Venus flytrap’s snap-trap mechanism:

[url=https://www.youtube.com/watch?v=z5fOsgrAJiU]VENUS FLYTRAP JAWS OF DOOM!! 2016 compilation - YouTube[/url]

The snap-traps of Venus flytraps are thin, modified, v-shaped leaves. Small hairs on the inside are triggers. When an insect touches these hairs twice, the trap closes, using short-term changes in the electrical potential on the surface of cells. Similar cell-to-cell communication, using different ions, is found in muscles and neurons in animals.

The snap-trap movement occurs through a combination of elasticity, cell pressure (called “turgor”) and growth in the cells of both sides of the leaf trap. Cells on the outside of the trap quickly enlarge, allowing stretching and fast growth, by pumping water inside. Meanwhile, cells on the inside, which previously held the trap open, have fluid pumped out. The shape of the leaf facilitates the closing movement, as the concave halves turn inside out in 0.5 seconds.

[url=https://www.sciencenewsforstudents.org/article/eating-venus-flytraps-must-count]https://www.sciencenewsforstudents.org/article/eating-venus-flytraps-must-count[/url]

The Venus flytrap can count how many times the hairs were triggered. The trigger hairs inside “tell” the plant when to spring the trap and when to release digestive enzymes.
[url=https://m.youtube.com/watch?v=aiDskGkeqzo]Carnivorous Plant Counting Prey / Curr. Biol., Jan. 21, 2016 (Vol. 26, Issue 3) - YouTube[/url]

The Venus flytraps’ snap-trap mechanism
The Venus flytrap has a specialized leaf that acts as a trap. Inside this trap are hair-like triggers that tell the plant when to close the leaf. An insect walking across the leaf trap will bend one of those trigger hairs. That will cause cells to “fire”, or send signals, to nearby cells.

  1. One nudge of a trigger hair causes cells to fire - yet the trap stays open. That lack of response helps prevent false alarms. It keeps the trap from closing on a raindrop, for example, or a brushing leaf.

  2. But a second nudge within 20 seconds of the first will spring the trap. Snap! The two sides of the trap slam shut, caging the insect inside.

  3. The fifth nudge of a trigger hair causes the plant’s cells to release digestive enzymes to digest its prey.

By studying the Venus flytrap’s snap-trap mechanism, I developed a new algorithm to detect presence in a room that has only one doorway. I named my new algorithm the Trapped In Venus Box algorithm.

Here is my Trapped In Venus Box algorithm:

1. Assume there is no prey in the Venus box.
2. Have you heard a prey flying/moving outside the Venus box, near its leaf trap?
    YES = Venus opens the leaf trap and waits for the prey to go into the box.
    NO = Venus keeps waiting for the next prey.
3. Do you still hear the prey moving in the Venus box 10 seconds after the leaf trap was opened?
    YES = Venus snaps the leaf trap shut. The prey is trapped in the Venus box until you are told otherwise.
    NO = The prey did not go into the box, or it went in and came out faster than the 10-second lack-of-response window that prevents false alarms. Venus ignores this prey and keeps waiting for the next one.
4. Have you heard the prey flying/moving outside the Venus box, near the leaf trap, after it was trapped in the box?
    YES = The leaf trap was forced open by the prey; the prey probably escaped.
    NO = The prey is still trapped in the Venus box.
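The four steps form an event-driven state machine; a minimal sketch in Python (names are mine, for illustration), where `outside(t)` is a trip of the sensor covering the outside of the doorway and `inside(t)` is a trip of a sensor inside the room, both timestamped in seconds:

```python
class VenusBox:
    """Trapped-in-Venus-box presence for a room with one open doorway.
    An outside trip 'opens the trap' (or signals a probable exit if the
    trap was closed); an inside trip more than LACK_OF_RESPONSE seconds
    after the last outside trip 'snaps the trap shut' (someone is in)."""

    LACK_OF_RESPONSE = 10  # seconds; matches the outside PIR's blind time

    def __init__(self):
        self.trapped = False            # step 1: assume nobody inside
        self.last_open = float("-inf")  # time of the last outside trip

    def outside(self, t):
        # steps 2 and 4: motion at the doorway opens (or re-opens) the trap
        self.trapped = False
        self.last_open = t

    def inside(self, t):
        # step 3: motion inside the room after the lack-of-response window
        # confirms the prey went in, so the trap snaps shut
        if t - self.last_open > self.LACK_OF_RESPONSE:
            self.trapped = True
```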

The Trapped In Venus Box algorithm is event-driven; there is no need to implement a scheduled heartbeat job.

Very nice work. Unfortunately I share my house with several GSDs, so I need to find a way to differentiate between humans and dogs. I haven’t found a sensor yet that sends back a size/weight type response that would allow me to set a cutoff for a certain size/weight to rule them out…

It is quite simple. You can attach all the motion sensors to the wall at the same level, 1 m or 1.5 m from the ground, then attach a piece of cardboard or similar right under each motion sensor to prevent it from detecting movement below that level, i.e. the motion sensor only detects movements above the 1 m or 1.5 m level.

Hmmm. There’s an idea. Mine are the down angled ones. I can mount them lower and upside down so the beam goes up. I gotta mull about some more on that… Thank you!

A motion sensor works best if we are moving across it; it responds poorly if we are moving toward it. So the placement strategy of the motion sensors is very important.

For the dining room, we can strategically place 2 motion sensors as follows:

  1. One motion sensor is placed high on the wall, angled downward. This far sensor monitors human activity when people move across the dining room.
  2. One motion sensor is placed at a strategic location in the center, under the dining table. This near sensor monitors human activity when people sit around the dining table to eat.

A strategic combination of the far sensor and the near sensor increases the chance of detecting human presence in the dining room. The far sensor may fail to detect humans sitting motionless at the dining table, but the near sensor will not fail to detect their legs moving under the table.

Trapped In Venus Box algorithm

1. Assume there is no prey in the Venus box.
2. Have you heard a prey flying/moving outside the Venus box, near its leaf trap?
    YES = Venus opens the leaf trap and waits for the prey to go into the box.
    NO = Venus keeps waiting for the next prey.
3. Do you still hear the prey moving in the Venus box 10 seconds after the leaf trap was opened?
    YES = Venus snaps the leaf trap shut. The prey is trapped in the Venus box until you are told otherwise.
    NO = The prey did not go into the box, or it went in and came out faster than the 10-second lack-of-response window that prevents false alarms. Venus ignores this prey and keeps waiting for the next one.
4. Have you heard the prey flying/moving outside the Venus box, near the leaf trap, after it was trapped in the box?
    YES = The leaf trap was forced open by the prey; the prey probably escaped.
    NO = The prey is still trapped in the Venus box.

To implement the Trapped In Venus Box algorithm, we need to be able to detect movement both inside and outside of a room. I already have 2 motion sensors inside the kitchen; I need another motion sensor outside of the kitchen.

Luckily, I don’t need to buy another motion sensor: I already have 2 motion sensors inside the dining room, and I can use one of them to detect movement outside of the kitchen. That dining room motion sensor becomes a dual-purpose sensor: it monitors movement inside the dining room and movement outside the kitchen.

Here is my Vera smart home implementation of the Trapped In Venus Box algorithm for my kitchen.

  1. Modify the Lua startup to add the following:

TrapInVenusBox = 0
LastVenusOpen = os.time()
VenusLackOfResponseDelay = 10 -- 10 seconds, the default detection interval of my dining room's motion sensor. The detection interval (also called re-trigger interval or blind time) is how long the PIR stays "blind" (insensitive) after it has been triggered: no report is sent during this interval even if movement is present. A shorter interval detects motion more frequently, but drains the battery faster.

Lua startup:

SS_SID = "urn:micasaverde-com:serviceId:SecuritySensor1"
SP_SID = "urn:upnp-org:serviceId:SwitchPower1"

HumanInKitchen = 0
LastTripInKitchen = os.time()

HumanInDinning = 0
LastTripInDinning = os.time()

PeakHourTimeout = 600 -- 10 minutes
NonPeakHourTimeout = 180 -- 3 minutes

TrapInVenusBox = 0
LastVenusOpen = os.time()
VenusLackOfResponseDelay = 10

KitchenLight = 186
KitchenNeoMotionSensor = 152
KitchenFibaroMotionSensor = 158

DinningLight = 82
DinningNeoMotionSensor = 140 -- dual-purpose sensor: monitors movement inside the dining room and movement outside the kitchen
DinningFibaroMotionSensor = 69

function getStatus(devID)
    return luup.variable_get(SP_SID, "Status", devID)
end

function isOn(devID)
    return luup.variable_get(SP_SID, "Status", devID) == "1"
end

function isOff(devID)
    return luup.variable_get(SP_SID, "Status", devID) == "0"
end

function getTrip(devID)
    return luup.variable_get(SS_SID, "Tripped", devID)
end

function isTrip(devID)
    return luup.variable_get(SS_SID, "Tripped", devID) == "1"
end

function isNotTrip(devID)
    return luup.variable_get(SS_SID, "Tripped", devID) == "0"
end

function turnOnLight(devID)
    if getStatus(devID) == "0" then
        luup.call_action(SP_SID, "SetTarget", {newTargetValue = 1}, devID)
    end
end

function turnOffLight(devID)
    if getStatus(devID) == "1" then
        luup.call_action(SP_SID, "SetTarget", {newTargetValue = 0}, devID)
    end
end

function breakfasttime()
    local t = os.date('*t')
    return 6 <= t.hour and t.hour < 9
end

function morningtime()
    local t = os.date('*t')
    return 9 <= t.hour and t.hour < 11
end

function lunchtime()
    local t = os.date('*t')
    return 11 <= t.hour and t.hour < 13
end

function afternoontime()
    local t = os.date('*t')
    return 13 <= t.hour and t.hour < 17
end

function dinnertime()
    local t = os.date('*t')
    return 17 <= t.hour and t.hour < 19
end

function eveningtime()
    local t = os.date('*t')
    return 19 <= t.hour and t.hour < 23
end

function sleepingtime()
    local t = os.date('*t')
    return 23 <= t.hour or t.hour < 6
end

function cookingtime()
    local t = os.date('*t')
    return 16 <= t.hour and t.hour < 19
end

  2. Create a new scene named VenusOpen. This scene is triggered by the DinningNeoMotionSensor tripped event. In the Lua code of this scene, put the following:

if TrapInVenusBox == 0 then
    -- log("A prey is moving outside the Venus box, open the leaf trap and wait for the prey to go inside")
    TrapInVenusBox = 0
else
    -- log("The Venus leaf trap was forced open by the prey, the prey probably escaped")
    TrapInVenusBox = 0
end
LastVenusOpen = os.time()

  3. Modify scene SomebodyInKitchen as below:

if os.difftime(os.time(), LastVenusOpen) > VenusLackOfResponseDelay then
    -- log("The prey is still moving in the Venus box " .. VenusLackOfResponseDelay .. " seconds after the leaf trap was opened, close the leaf trap")
    TrapInVenusBox = 1
end

LastTripInKitchen = os.time()
HumanInKitchen = 1
-- log("Detected movement in kitchen room, turn on kitchen light")
turnOnLight(KitchenLight)

  4. Modify scene NobodyInKitchen as below:

function CheckHumanInKitchenStatus()
    if TrapInVenusBox == 1 then
        -- log("The Venus leaf trap is closed, the prey is in the Venus box, do not turn off light")
        return true
    end

    if cookingtime() then
        if HumanInKitchen == 1 or os.difftime(os.time(), LastTripInKitchen) < PeakHourTimeout then
            -- log("Movement in kitchen room within " .. PeakHourTimeout .. " seconds, do not turn off light")
            return true
        end
    else
        if HumanInKitchen == 1 or os.difftime(os.time(), LastTripInKitchen) < NonPeakHourTimeout then
            -- log("Movement in kitchen room within " .. NonPeakHourTimeout .. " seconds, do not turn off light")
            return true
        end
    end

    -- log("No movement in kitchen room after delay Peak/NonPeakHourTimeout, turn off light")
    turnOffLight(KitchenLight)
end

LastTripInKitchen = os.time()
-- log("Detected no movement in kitchen room, check all sensors")
if isNotTrip(KitchenFibaroMotionSensor) and isNotTrip(KitchenNeoMotionSensor) then
    -- log("All sensors are untripped. Clear the HumanInKitchen indicator and schedule the delayed check")
    HumanInKitchen = 0
    if cookingtime() then
        luup.call_delay("CheckHumanInKitchenStatus", PeakHourTimeout)
    else
        luup.call_delay("CheckHumanInKitchenStatus", NonPeakHourTimeout)
    end
end

If you want to see how the algorithm works, you can create 3 virtual switches for the HumanInDinning, HumanInKitchen and TrapInVenusBox indicators and watch them turn on and off on the Vera dashboard in real time.

Final touch to complete the best algorithm to detect presence in a room

I created a virtual switch for ManualMode and added the following code to CheckHumanInDinningStatus() and CheckHumanInKitchenStatus(). When I have supper in the dining room outside of normal dinner time, I can tell my Echo Dot: “Alexa, turn on manual mode”.

function CheckHumanInDinningStatus()
    if isOn(ManualMode) then
        -- log("Manual mode is on, do not turn off light")
        return true
    end

    if breakfasttime() or lunchtime() or dinnertime() then
        if HumanInDinning == 1 or os.difftime(os.time(), LastTripInDinning) < PeakHourTimeout then
            -- log("Movement in Dinning room within " .. PeakHourTimeout .. " seconds, do not turn off light")
            return true
        end
    else
        if HumanInDinning == 1 or os.difftime(os.time(), LastTripInDinning) < NonPeakHourTimeout then
            -- log("Movement in Dinning room within " .. NonPeakHourTimeout .. " seconds, do not turn off light")
            return true
        end
    end

    -- log("No movement in Dinning room after delay Peak/NonPeakHourTimeout, turn off light")
    turnOffLight(DinningLight)
end

function CheckHumanInKitchenStatus()
    if isOn(ManualMode) then
        -- log("Manual mode is on, do not turn off light")
        return true
    end

    if TrapInVenusBox == 1 then
        -- log("The Venus leaf trap is closed, the prey is in the Venus box, do not turn off light")
        return true
    end

    if cookingtime() then
        if HumanInKitchen == 1 or os.difftime(os.time(), LastTripInKitchen) < PeakHourTimeout then
            -- log("Movement in kitchen room within " .. PeakHourTimeout .. " seconds, do not turn off light")
            return true
        end
    else
        if HumanInKitchen == 1 or os.difftime(os.time(), LastTripInKitchen) < NonPeakHourTimeout then
            -- log("Movement in kitchen room within " .. NonPeakHourTimeout .. " seconds, do not turn off light")
            return true
        end
    end

    -- log("No movement in kitchen room after delay Peak/NonPeakHourTimeout, turn off light")
    turnOffLight(KitchenLight)
end

Thanks for this great example. I’ve taken this example and modified it to fit my needs a little better. My setup has only 1 motion sensor, but I do have other inputs to help determine occupancy. (A network receiver, and a Kodi box). My example requires the usage of XBMCState, as well as the Onkyo receiver plugin. If anyone is curious, I’m working on the code here.