I don't know if it is a core or frontend problem or something else, but I'm seeing higher CPU use in general, especially when clicking on a card in the Lovelace view.
My normal CPU use was 2 to 3%; now it is 6 to 8%.
When I want to show the history, the fans of my NUC immediately start blowing and I always see a spinning circle.
And the CPU use peaks at about 18%.
These are some screenshots from Glances when clicking on history in Lovelace.
I had debug logging on for:
logs:
homeassistant.event: debug
homeassistant.components.logger: debug
homeassistant.components.history: debug
homeassistant.components.recorder: debug
But I'm not seeing anything special.
I've tried the py-spy way, but I cannot get it installed:
Compiling addr2line v0.11.0
Running `rustc --crate-name addr2line /root/.cargo/registry/src/github.com-1ecc6299db9ec823/addr2line-0.11.0/src/lib.rs --error-format=json --json=diagnostic-rendered-ansi,artifacts --crate-type lib --emit=dep-info,metadata,link -C opt-level=3 --cfg 'feature="cpp_demangle"' --cfg 'feature="default"' --cfg 'feature="object"' --cfg 'feature="rustc-demangle"' --cfg 'feature="std"' --cfg 'feature="std-object"' -C metadata=361484a7c1298d46 -C extra-filename=-361484a7c1298d46 --out-dir /tmp/cargo-installWui9CK/release/deps -L dependency=/tmp/cargo-installWui9CK/release/deps --extern cpp_demangle=/tmp/cargo-installWui9CK/release/deps/libcpp_demangle-eb3cfddebaa342f3.rmeta --extern fallible_iterator=/tmp/cargo-installWui9CK/release/deps/libfallible_iterator-a4123992588425b5.rmeta --extern gimli=/tmp/cargo-installWui9CK/release/deps/libgimli-7d5edb096fedfa90.rmeta --extern lazycell=/tmp/cargo-installWui9CK/release/deps/liblazycell-ea0ac196d0b3629c.rmeta --extern object=/tmp/cargo-installWui9CK/release/deps/libobject-477a467cbf075a12.rmeta --extern rustc_demangle=/tmp/cargo-installWui9CK/release/deps/librustc_demangle-1fe779301a7b7227.rmeta --extern smallvec=/tmp/cargo-installWui9CK/release/deps/libsmallvec-9bd83c2162524722.rmeta --cap-lints allow`
Running `rustc --crate-name remoteprocess /root/.cargo/registry/src/github.com-1ecc6299db9ec823/remoteprocess-0.3.4/src/lib.rs --error-format=json --json=diagnostic-rendered-ansi,artifacts --crate-type lib --emit=dep-info,metadata,link -C opt-level=3 --cfg 'feature="default"' --cfg 'feature="unwind"' -C metadata=2427f6658e4e467c -C extra-filename=-2427f6658e4e467c --out-dir /tmp/cargo-installWui9CK/release/deps -L dependency=/tmp/cargo-installWui9CK/release/deps --extern addr2line=/tmp/cargo-installWui9CK/release/deps/libaddr2line-361484a7c1298d46.rmeta --extern benfred_read_process_memory=/tmp/cargo-installWui9CK/release/deps/libbenfred_read_process_memory-e58b01e0194ad895.rmeta --extern goblin=/tmp/cargo-installWui9CK/release/deps/libgoblin-f7d9a003d2b2b860.rmeta --extern lazy_static=/tmp/cargo-installWui9CK/release/deps/liblazy_static-dda545747076ca2e.rmeta --extern libc=/tmp/cargo-installWui9CK/release/deps/liblibc-d6a517a54d66530e.rmeta --extern log=/tmp/cargo-installWui9CK/release/deps/liblog-33247e2f897fd775.rmeta --extern memmap=/tmp/cargo-installWui9CK/release/deps/libmemmap-d558b4e09fe43210.rmeta --extern nix=/tmp/cargo-installWui9CK/release/deps/libnix-003b039a35e9d039.rmeta --extern object=/tmp/cargo-installWui9CK/release/deps/libobject-477a467cbf075a12.rmeta --extern proc_maps=/tmp/cargo-installWui9CK/release/deps/libproc_maps-af79a59c980f3ade.rmeta --extern regex=/tmp/cargo-installWui9CK/release/deps/libregex-bc89e9f6ea47428c.rmeta --cap-lints allow -L native=/usr/local/lib -l static=unwind -l static=unwind-ptrace -l static=unwind-x86_64`
Running `rustc --crate-name py_spy /root/.cargo/registry/src/github.com-1ecc6299db9ec823/py-spy-0.3.3/src/lib.rs --error-format=json --json=diagnostic-rendered-ansi --crate-type lib --emit=dep-info,metadata,link -C opt-level=3 -C metadata=ff12da3be12bb5c3 -C extra-filename=-ff12da3be12bb5c3 --out-dir /tmp/cargo-installWui9CK/release/deps -L dependency=/tmp/cargo-installWui9CK/release/deps --extern clap=/tmp/cargo-installWui9CK/release/deps/libclap-f6bf72e393cc38bc.rmeta --extern console=/tmp/cargo-installWui9CK/release/deps/libconsole-f112d37c0f47569f.rmeta --extern cpp_demangle=/tmp/cargo-installWui9CK/release/deps/libcpp_demangle-eb3cfddebaa342f3.rmeta --extern ctrlc=/tmp/cargo-installWui9CK/release/deps/libctrlc-aad786bac40b3a9d.rmeta --extern env_logger=/tmp/cargo-installWui9CK/release/deps/libenv_logger-e27ea58ddced9d00.rmeta --extern failure=/tmp/cargo-installWui9CK/release/deps/libfailure-3335c4e883f54b39.rmeta --extern goblin=/tmp/cargo-installWui9CK/release/deps/libgoblin-f7d9a003d2b2b860.rmeta --extern indicatif=/tmp/cargo-installWui9CK/release/deps/libindicatif-225faf62a9b3f910.rmeta --extern inferno=/tmp/cargo-installWui9CK/release/deps/libinferno-d5bd4adb491d743b.rmeta --extern lazy_static=/tmp/cargo-installWui9CK/release/deps/liblazy_static-dda545747076ca2e.rmeta --extern libc=/tmp/cargo-installWui9CK/release/deps/liblibc-d6a517a54d66530e.rmeta --extern log=/tmp/cargo-installWui9CK/release/deps/liblog-33247e2f897fd775.rmeta --extern lru=/tmp/cargo-installWui9CK/release/deps/liblru-7c22ae3502192a2e.rmeta --extern memmap=/tmp/cargo-installWui9CK/release/deps/libmemmap-d558b4e09fe43210.rmeta --extern proc_maps=/tmp/cargo-installWui9CK/release/deps/libproc_maps-af79a59c980f3ade.rmeta --extern rand=/tmp/cargo-installWui9CK/release/deps/librand-3b4eb2183b20dcb0.rmeta --extern rand_distr=/tmp/cargo-installWui9CK/release/deps/librand_distr-9984e8ecbf04f38f.rmeta --extern regex=/tmp/cargo-installWui9CK/release/deps/libregex-bc89e9f6ea47428c.rmeta --extern 
remoteprocess=/tmp/cargo-installWui9CK/release/deps/libremoteprocess-2427f6658e4e467c.rmeta --extern serde=/tmp/cargo-installWui9CK/release/deps/libserde-64b415a5891575a4.rmeta --extern serde_derive=/tmp/cargo-installWui9CK/release/deps/libserde_derive-87637e81ef141bef.so --extern serde_json=/tmp/cargo-installWui9CK/release/deps/libserde_json-8b40eb8bd5b4f85c.rmeta --extern tempfile=/tmp/cargo-installWui9CK/release/deps/libtempfile-2085f7c84ee47d2e.rmeta --extern termios=/tmp/cargo-installWui9CK/release/deps/libtermios-776eb0ed1fc2710a.rmeta --cap-lints allow --cfg unwind -L native=/usr/local/lib`
error: could not find native static library `unwind`, perhaps an -L flag is missing?
error: aborting due to previous error
error: could not compile `remoteprocess`.
Caused by:
process didn't exit successfully: `rustc --crate-name remoteprocess /root/.cargo/registry/src/github.com-1ecc6299db9ec823/remoteprocess-0.3.4/src/lib.rs --error-format=json --json=diagnostic-rendered-ansi,artifacts --crate-type lib --emit=dep-info,metadata,link -C opt-level=3 --cfg 'feature="default"' --cfg 'feature="unwind"' -C metadata=2427f6658e4e467c -C extra-filename=-2427f6658e4e467c --out-dir /tmp/cargo-installWui9CK/release/deps -L dependency=/tmp/cargo-installWui9CK/release/deps --extern addr2line=/tmp/cargo-installWui9CK/release/deps/libaddr2line-361484a7c1298d46.rmeta --extern benfred_read_process_memory=/tmp/cargo-installWui9CK/release/deps/libbenfred_read_process_memory-e58b01e0194ad895.rmeta --extern goblin=/tmp/cargo-installWui9CK/release/deps/libgoblin-f7d9a003d2b2b860.rmeta --extern lazy_static=/tmp/cargo-installWui9CK/release/deps/liblazy_static-dda545747076ca2e.rmeta --extern libc=/tmp/cargo-installWui9CK/release/deps/liblibc-d6a517a54d66530e.rmeta --extern log=/tmp/cargo-installWui9CK/release/deps/liblog-33247e2f897fd775.rmeta --extern memmap=/tmp/cargo-installWui9CK/release/deps/libmemmap-d558b4e09fe43210.rmeta --extern nix=/tmp/cargo-installWui9CK/release/deps/libnix-003b039a35e9d039.rmeta --extern object=/tmp/cargo-installWui9CK/release/deps/libobject-477a467cbf075a12.rmeta --extern proc_maps=/tmp/cargo-installWui9CK/release/deps/libproc_maps-af79a59c980f3ade.rmeta --extern regex=/tmp/cargo-installWui9CK/release/deps/libregex-bc89e9f6ea47428c.rmeta --cap-lints allow -L native=/usr/local/lib -l static=unwind -l static=unwind-ptrace -l static=unwind-x86_64` (exit code: 1)
warning: build failed, waiting for other jobs to finish...
error: failed to compile `py-spy v0.3.3`, intermediate artifacts can be found at `/tmp/cargo-installWui9CK`
Caused by:
build failed
bash-5.0# # pip install py-spy
/bin/ash: pip: not found
arch | x86_64
-- | --
chassis | embedded
dev | false
docker | true
docker_version | 19.03.11
hassio | true
host_os | HassOS 4.13
installation_type | Home Assistant OS
os_name | Linux
os_version | 5.4.63
python_version | 3.8.5
supervisor | 245
timezone | Europe/Brussels
version | 0.115.1
virtualenv | false
configuration.yaml
I don't know
None
If you need some more info, ask...
When not using the frontend, once every 50-60 seconds the fans of my NUC blow fast.
This is a screenshot of Glances at the moment this happens.
This could be related to issue #39890
There isn't much we can do to help without a py-spy or interesting logs.
Maybe you can tell me how to install it....
I've tried
https://developers.home-assistant.io/docs/operating-system/debugging/#ssh-access-to-the-host
and then
apk add cargo
cargo install py-spy
There isn't much we can do to help without a py-spy or interesting logs.
Which debug logs do you need?
I've tried
logs:
homeassistant.event: debug
homeassistant.components.logger: debug
homeassistant.components.history: debug
homeassistant.components.recorder: debug
Try turning on full debug mode (default: debug) and see if there is anything interesting. Without a py-spy to narrow the issue down, it's not possible to give any more guidance on which area to look at.
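If "full debug mode" means raising the logger default, a minimal configuration.yaml sketch would be (the per-integration `logs:` entries from earlier can stay alongside it):

```yaml
# Sketch: raise the global log level to debug for everything.
logger:
  default: debug
```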
Is there a way to send you this debug log?
By the way, I've tried another two times to install py-spy, always getting:
error: could not find native static library `unwind`, perhaps an -L flag is missing?
error: aborting due to previous error
error: could not compile `remoteprocess`.
To learn more, run the command again with --verbose.
warning: build failed, waiting for other jobs to finish...
error: failed to compile `py-spy v0.3.3`, intermediate artifacts can be found at `/tmp/cargo-installNtWBUG`
Caused by:
build failed
If you want to send it privately, [email protected] would be fine.
I've sent you a WeTransfer link because the .log is 100 MB.
2020-09-19 16:16:05 DEBUG (MainThread) [homeassistant.helpers.event] Template update {% if states('sensor.uptime_in_uren')|float > 0.05|float %}
{{states|selectattr('state', 'in', ['unavailable','unknown','none'])
|reject('in', expand('group.entity_blacklist'))
|reject('eq', states.group.entity_blacklist)
|list|length}}
{% else %}
0
{% endif %}
Try disabling this template, as it's watching all states and you have a massive number of state changed events happening.
You could replace it with an automation that only runs every few seconds; that should help.
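Such an automation could look something like this (a sketch, untested; `input_number.unavailable_count` is a hypothetical helper you would create first, and the `group.entity_blacklist` rejects from the original template are omitted for brevity):

```yaml
# Sketch: recompute the unavailable-entity count every 30 seconds
# instead of re-rendering on every state change.
automation:
  - alias: Update unavailable count
    trigger:
      - platform: time_pattern
        seconds: "/30"
    action:
      - service: input_number.set_value
        data_template:
          entity_id: input_number.unavailable_count
          value: >-
            {{ states
               | selectattr('state', 'in', ['unavailable', 'unknown', 'none'])
               | list | count }}
```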
Also change
{{ states.sensor | list | length }}
to
{{ states.sensor | count }}
This will avoid creating a list every time a sensor updates.
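In Jinja, `count` is an alias for the `length` filter; the saving comes from not materializing an intermediate list first. A rough Python analogy (hypothetical stand-in objects, not HA's actual state machine):

```python
# Hypothetical stand-in for a domain's states: a generator of entity ids.
def sensor_states():
    for i in range(1000):
        yield f"sensor.s{i}"

# Like `| list | length`: builds a full 1000-element list just to len() it.
n_list = len(list(sensor_states()))

# Counting by iteration, with no intermediate list.
n_iter = sum(1 for _ in sensor_states())

assert n_list == n_iter == 1000
```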
I had already tried that yesterday; there was no difference, so I turned it back on...
But I will make an automation of it...
{{ states.device_tracker | list | length }} as well
I can tell you what I see is inefficient, but I'm pretty much shooting in the dark without a py-spy.
I'll try to change all of that.
As for py-spy, I did try to install it another two times....
bdraco@Js-MacBook-Pro-3 ~ % grep 'state_changed' home-assistant\ \(1\).log | grep '16:14:40'|wc
315 10547 187923
bdraco@Js-MacBook-Pro-3 ~ % grep 'state_changed' home-assistant\ \(1\).log | grep '16:14:41'|wc
241 7817 135811
bdraco@Js-MacBook-Pro-3 ~ % grep 'state_changed' home-assistant\ \(1\).log | grep '16:14:42'|wc
355 16331 251219
bdraco@Js-MacBook-Pro-3 ~ % grep 'state_changed' home-assistant\ \(1\).log | grep '16:14:43'|wc
259 8447 146329
bdraco@Js-MacBook-Pro-3 ~ % grep 'state_changed' home-assistant\ \(1\).log | grep '16:14:44'|wc
291 9860 176954
It looks like you are getting about 300 state changed events per second 😮
I'd focus on reducing the state change events to only what you need.
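The per-second totals above can also be pulled in one pass instead of one grep per second; a sketch, with demo lines standing in for the real home-assistant.log:

```shell
# Count state_changed events per second (field 2 of an HA log line is the
# time), busiest seconds first. Feed the real log in place of the demo input.
printf '%s\n' \
  '2020-09-19 16:14:40 DEBUG ... state_changed ...' \
  '2020-09-19 16:14:40 DEBUG ... state_changed ...' \
  '2020-09-19 16:14:41 DEBUG ... state_changed ...' |
  grep 'state_changed' | awk '{print $2}' | sort | uniq -c | sort -rn
```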
I'd focus on reducing the state change events to only what you need.
How do you mean? Delete some sensors?
In 0.114.x I hadn't seen these problems...
I have disabled my unavailable sensor
and replaced all | list | length }}
with | count }}
and
When not using the frontend, once a 50-60 second the fans of my NUC are blowing fast.
this appears to happen less frequently
I have found this
2020-09-19 16:42:46 23 [Warning] Aborted connection 23 to db: 'homeassistant' user: 'hass' host: '172.30.32.1' (Got an error reading communication packets)
2020-09-19 17:58:12 35 [Warning] Aborted connection 35 to db: 'homeassistant' user: 'hass' host: '172.30.32.1' (Got an error reading communication packets)
in the logs from the MariaDB addon
I'd focus on reducing the state change events to only what you need.
How do you mean? Delete some sensors?
In 0.114.x I haven't seen these problems...
If you disable all your template entities, does the issue go away?
I think I removed all template sensors and template binary sensors.
Restarting HA still takes a lot of time, the CPU is still higher than in 0.114.x, and the same goes for when I click something in the frontend.
So I restored all my templates...
It is likely the mysql load is caused by the processing of all those state changed events.
I'm not sure what has changed between 0.114 and 0.115 that would cause that increase. It may be caused by a change in the integration that is generating these events.
error: could not find native static library `unwind`, perhaps an -L flag is missing?
I get the same error with cargo. However, this worked for me inside a 0.115.1 Container install:
echo 'manylinux1_compatible = True' > /usr/local/lib/python3.8/site-packages/_manylinux.py
pip install py-spy
py-spy record --pid $(pidof python3) --duration 120 --output /config/www/spy.svg
(I did have to add the SYS_PTRACE capability to the container, but I assume this is handled already in the OS setup.)
bash-5.0# py-spy dump --pid 232
Process 232: python3 -m homeassistant --config /config
Python v3.8.5 (/usr/local/bin/python3.8)
Thread 232 (idle): "MainThread"
select (selectors.py:468)
_run_once (asyncio/base_events.py:1823)
run_forever (asyncio/base_events.py:570)
run_until_complete (asyncio/base_events.py:603)
run (asyncio/runners.py:43)
run (homeassistant/runner.py:133)
main (homeassistant/__main__.py:312)
<module> (homeassistant/__main__.py:320)
_run_code (runpy.py:87)
_run_module_as_main (runpy.py:194)
Thread 254 (idle): "Thread-1"
dequeue (logging/handlers.py:1427)
_monitor (logging/handlers.py:1478)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 255 (idle): "SyncWorker_0"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 256 (idle): "SyncWorker_1"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 257 (idle): "SyncWorker_2"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 258 (idle): "SyncWorker_3"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 259 (idle): "SyncWorker_4"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 260 (idle): "SyncWorker_5"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 263 (idle): "SyncWorker_6"
query (MySQLdb/connections.py:259)
_query (MySQLdb/cursors.py:319)
execute (MySQLdb/cursors.py:206)
do_execute (sqlalchemy/engine/default.py:593)
_execute_context (sqlalchemy/engine/base.py:1276)
_execute_clauseelement (sqlalchemy/engine/base.py:1124)
_execute_on_connection (sqlalchemy/sql/elements.py:298)
execute (sqlalchemy/engine/base.py:1011)
_execute_and_instances (sqlalchemy/orm/query.py:3533)
__iter__ (sqlalchemy/orm/query.py:3508)
yield_events (homeassistant/components/logbook/__init__.py:418)
humanify (homeassistant/components/logbook/__init__.py:251)
_get_events (homeassistant/components/logbook/__init__.py:516)
json_events (homeassistant/components/logbook/__init__.py:228)
run (concurrent/futures/thread.py:57)
_worker (concurrent/futures/thread.py:80)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 264 (idle): "SyncWorker_7"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 265 (idle): "SyncWorker_8"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 266 (idle): "SyncWorker_9"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 267 (idle): "SyncWorker_10"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 268 (idle): "SyncWorker_11"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 269 (idle): "SyncWorker_12"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 270 (idle): "SyncWorker_13"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 271 (idle): "SyncWorker_14"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 272 (idle): "SyncWorker_15"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 273 (idle): "SyncWorker_16"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 274 (idle): "SyncWorker_17"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 275 (idle): "SyncWorker_18"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 276 (idle): "SyncWorker_19"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 277 (idle): "SyncWorker_20"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 278 (idle): "SyncWorker_21"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 279 (idle): "SyncWorker_22"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 280 (idle): "Recorder"
run (homeassistant/components/recorder/__init__.py:346)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 281 (idle): "Thread-2"
dequeue (logging/handlers.py:1427)
_monitor (logging/handlers.py:1478)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 284 (idle): "zeroconf-Engine-284"
run (zeroconf/__init__.py:1325)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 285 (idle): "zeroconf-Reaper_285"
wait (threading.py:306)
run (zeroconf/__init__.py:1451)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 286 (idle): "SyncWorker_23"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 287 (idle): "SyncWorker_24"
query (MySQLdb/connections.py:259)
_query (MySQLdb/cursors.py:319)
execute (MySQLdb/cursors.py:206)
do_execute (sqlalchemy/engine/default.py:593)
_execute_context (sqlalchemy/engine/base.py:1276)
_execute_clauseelement (sqlalchemy/engine/base.py:1124)
_execute_on_connection (sqlalchemy/sql/elements.py:298)
execute (sqlalchemy/engine/base.py:1011)
_execute_and_instances (sqlalchemy/orm/query.py:3533)
__iter__ (sqlalchemy/orm/query.py:3508)
yield_events (homeassistant/components/logbook/__init__.py:418)
humanify (homeassistant/components/logbook/__init__.py:251)
_get_events (homeassistant/components/logbook/__init__.py:516)
json_events (homeassistant/components/logbook/__init__.py:228)
run (concurrent/futures/thread.py:57)
_worker (concurrent/futures/thread.py:80)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 288 (idle): "SyncWorker_25"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 289 (idle): "SyncWorker_26"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 290 (idle): "SyncWorker_27"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 291 (idle): "SyncWorker_28"
query (MySQLdb/connections.py:259)
_query (MySQLdb/cursors.py:319)
execute (MySQLdb/cursors.py:206)
do_execute (sqlalchemy/engine/default.py:593)
_execute_context (sqlalchemy/engine/base.py:1276)
_execute_clauseelement (sqlalchemy/engine/base.py:1124)
_execute_on_connection (sqlalchemy/sql/elements.py:298)
execute (sqlalchemy/engine/base.py:1011)
_execute_and_instances (sqlalchemy/orm/query.py:3533)
__iter__ (sqlalchemy/orm/query.py:3508)
yield_events (homeassistant/components/logbook/__init__.py:418)
humanify (homeassistant/components/logbook/__init__.py:251)
_get_events (homeassistant/components/logbook/__init__.py:516)
json_events (homeassistant/components/logbook/__init__.py:228)
run (concurrent/futures/thread.py:57)
_worker (concurrent/futures/thread.py:80)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 292 (idle): "SyncWorker_29"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 294 (idle): "SyncWorker_30"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 295 (idle): "SyncWorker_31"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 296 (idle): "SyncWorker_32"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 297 (idle): "SyncWorker_33"
query (MySQLdb/connections.py:259)
_query (MySQLdb/cursors.py:319)
execute (MySQLdb/cursors.py:206)
do_execute (sqlalchemy/engine/default.py:593)
_execute_context (sqlalchemy/engine/base.py:1276)
_execute_clauseelement (sqlalchemy/engine/base.py:1124)
_execute_on_connection (sqlalchemy/sql/elements.py:298)
execute (sqlalchemy/engine/base.py:1011)
_execute_and_instances (sqlalchemy/orm/query.py:3533)
__iter__ (sqlalchemy/orm/query.py:3508)
yield_events (homeassistant/components/logbook/__init__.py:418)
humanify (homeassistant/components/logbook/__init__.py:251)
_get_events (homeassistant/components/logbook/__init__.py:516)
json_events (homeassistant/components/logbook/__init__.py:228)
run (concurrent/futures/thread.py:57)
_worker (concurrent/futures/thread.py:80)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 298 (idle): "SyncWorker_34"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 299 (idle): "SyncWorker_35"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 300 (idle): "SyncWorker_36"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 301 (idle): "SyncWorker_37"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 302 (idle): "SyncWorker_38"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 303 (idle): "SyncWorker_39"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 304 (idle): "SyncWorker_40"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 305 (idle): "SyncWorker_41"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 306 (idle): "SyncWorker_42"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 307 (idle): "SyncWorker_43"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 308 (idle): "SyncWorker_44"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 309 (idle): "SyncWorker_45"
query (MySQLdb/connections.py:259)
_query (MySQLdb/cursors.py:319)
execute (MySQLdb/cursors.py:206)
do_execute (sqlalchemy/engine/default.py:593)
_execute_context (sqlalchemy/engine/base.py:1276)
_execute_clauseelement (sqlalchemy/engine/base.py:1124)
_execute_on_connection (sqlalchemy/sql/elements.py:298)
execute (sqlalchemy/engine/base.py:1011)
_execute_and_instances (sqlalchemy/orm/query.py:3533)
__iter__ (sqlalchemy/orm/query.py:3508)
yield_events (homeassistant/components/logbook/__init__.py:418)
humanify (homeassistant/components/logbook/__init__.py:251)
_get_events (homeassistant/components/logbook/__init__.py:516)
json_events (homeassistant/components/logbook/__init__.py:228)
run (concurrent/futures/thread.py:57)
_worker (concurrent/futures/thread.py:80)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 310 (idle): "SyncWorker_46"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 311 (idle): "SyncWorker_47"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 312 (idle): "SyncWorker_48"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 313 (idle): "SyncWorker_49"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 314 (idle): "SyncWorker_50"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 315 (idle): "SyncWorker_51"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 320 (idle): "Thread-5"
loop (paho/mqtt/client.py:1163)
loop_forever (paho/mqtt/client.py:1782)
_thread_main (paho/mqtt/client.py:3428)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 326 (idle): "Thread-6"
select (selectors.py:415)
handle_request (socketserver.py:294)
run (pysonos/events.py:149)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 327 (idle): "Thread-7"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 328 (idle): "Thread-8"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 329 (idle): "Thread-9"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 331 (idle): "Thread-11"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 334 (idle): "Thread-14"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 336 (idle): "Thread-16"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 337 (idle): "Thread-17"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 340 (idle): "Thread-20"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 342 (idle): "Thread-22"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 343 (idle): "Thread-23"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 344 (idle): "Thread-24"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 346 (idle): "Thread-26"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 347 (idle): "Thread-27"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 348 (idle): "Thread-28"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 349 (idle): "Thread-29"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 353 (idle): "Thread-33"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 354 (idle): "Thread-34"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 356 (idle): "Thread-36"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 358 (idle): "Thread-38"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 360 (idle): "Thread-40"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 377 (idle): "SyncWorker_52"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 378 (idle): "SyncWorker_53"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 379 (idle): "SyncWorker_54"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 380 (idle): "SyncWorker_55"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 381 (idle): "SyncWorker_56"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 382 (idle): "SyncWorker_57"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 383 (idle): "SyncWorker_58"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 384 (idle): "SyncWorker_59"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 385 (idle): "SyncWorker_60"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 386 (idle): "SyncWorker_61"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 387 (idle): "SyncWorker_62"
query (MySQLdb/connections.py:259)
_query (MySQLdb/cursors.py:319)
execute (MySQLdb/cursors.py:206)
do_execute (sqlalchemy/engine/default.py:593)
_execute_context (sqlalchemy/engine/base.py:1276)
_execute_clauseelement (sqlalchemy/engine/base.py:1124)
_execute_on_connection (sqlalchemy/sql/elements.py:298)
execute (sqlalchemy/engine/base.py:1011)
_execute_and_instances (sqlalchemy/orm/query.py:3533)
__iter__ (sqlalchemy/orm/query.py:3508)
yield_events (homeassistant/components/logbook/__init__.py:418)
humanify (homeassistant/components/logbook/__init__.py:251)
_get_events (homeassistant/components/logbook/__init__.py:516)
json_events (homeassistant/components/logbook/__init__.py:228)
run (concurrent/futures/thread.py:57)
_worker (concurrent/futures/thread.py:80)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 388 (idle): "SyncWorker_63"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 827 (idle): "zeroconf-ServiceBrowser__printer._tcp.local.-_nut._tcp.local.-_miio._udp.local.-_esphomelib._tcp.local.-_ipps._tcp.local.-_plugwise._tcp.local.-_ssh._tcp.local.-_Volumio._tcp.local.-_elg._tcp.local.-_ipp._tcp.local.-_bond._tcp.local.-_googlecast._tcp.local.-_http._tcp.local.-_hap._tcp.local.-_axis-video._tcp.local.-_dkapi._tcp.local.-_api._udp.local.-_daap._tcp.local.-_viziocast._tcp.local.-_wled._tcp.local.-_xbmc-jsonrpc-h._tcp.local.-_spotify-connect._tcp.local._827"
wait (threading.py:306)
wait (zeroconf/__init__.py:2407)
run (zeroconf/__init__.py:1701)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 828 (idle): "Thread-48"
send_events (pyhap/accessory_driver.py:483)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 829 (idle): "Thread-49"
select (selectors.py:415)
serve_forever (socketserver.py:232)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 831 (idle): "Thread-51"
recv (pyhap/hap_server.py:779)
recv_into (pyhap/hap_server.py:758)
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:429)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 832 (idle): "Thread-52"
recv (pyhap/hap_server.py:779)
recv_into (pyhap/hap_server.py:758)
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:429)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 6015 (idle): "Thread-81"
recv (pyhap/hap_server.py:779)
recv_into (pyhap/hap_server.py:758)
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:429)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 25348 (idle): "Thread-170"
recv (pyhap/hap_server.py:779)
recv_into (pyhap/hap_server.py:758)
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:429)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 164849 (idle): "stream_worker"
_stream_worker_internal (homeassistant/components/stream/worker.py:205)
stream_worker (homeassistant/components/stream/worker.py:48)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 259889 (idle): "Thread-624"
recv (pyhap/hap_server.py:779)
recv_into (pyhap/hap_server.py:758)
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:429)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 260522 (idle): "Thread-625"
recv (pyhap/hap_server.py:779)
recv_into (pyhap/hap_server.py:758)
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:429)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Do you have something that is constantly gathering logbook data? There are at least 5 separate logbook api requests running in your py-spy.
There is a lot going on here, but at least https://github.com/home-assistant/core/pull/40250 and https://github.com/home-assistant/core/pull/40272 should help performance of templates.
It looks like the core issue is that MQTT is flooding home assistant with state changed events.
Do you have something that is constantly gathering logbook data? There are at least 5 separate logbook api requests running in your py-spy.
Not that I know
logbook.yaml
exclude:
domains:
- sun
entities:
- sensor.since_last_boot
- sensor.date
- sensor.cpu_use
- sensor.ram_free
- sensor.ram_use
- sensor.time
- sensor.time_date
- sensor.pihole_ads_blocked_today
- sensor.pihole_ads_percentage_blocked_today
- sensor.pihole_dns_queries_today
logger.yaml
default: error
recorder.yaml
db_url: !secret mysql
exclude:
domains:
- camera
- group
- history_graph
- scene
- sun
- weather
- zone
entities:
- binary_sensor.bluetooth_scan
- sensor.dark_sky_cloud_coverage_1
- sensor.dark_sky_cloud_coverage_2
- sensor.dark_sky_cloud_coverage_3
- sensor.dark_sky_cloud_coverage_4
- sensor.dark_sky_cloud_coverage_5
- sensor.dark_sky_cloud_coverage_6
- sensor.dark_sky_cloud_coverage_7
- sensor.dark_sky_daily_summary
- sensor.dark_sky_daytime_high_temperature
- sensor.dark_sky_daytime_high_temperature_1
- sensor.dark_sky_daytime_high_temperature_2
- sensor.dark_sky_daytime_high_temperature_3
- sensor.dark_sky_daytime_high_temperature_4
- sensor.dark_sky_daytime_high_temperature_5
- sensor.dark_sky_daytime_high_temperature_6
- sensor.dark_sky_daytime_high_temperature_7
- sensor.dark_sky_dew_point_1
- sensor.dark_sky_dew_point_2
- sensor.dark_sky_dew_point_3
- sensor.dark_sky_dew_point_4
- sensor.dark_sky_dew_point_5
- sensor.dark_sky_dew_point_6
- sensor.dark_sky_dew_point_7
- sensor.dark_sky_hourly_summary
- sensor.dark_sky_humidity_1
- sensor.dark_sky_humidity_2
- sensor.dark_sky_humidity_3
- sensor.dark_sky_humidity_4
- sensor.dark_sky_humidity_5
- sensor.dark_sky_humidity_6
- sensor.dark_sky_humidity_7
- sensor.dark_sky_overnight_low_temperature
- sensor.dark_sky_overnight_low_temperature_1
- sensor.dark_sky_overnight_low_temperature_2
- sensor.dark_sky_overnight_low_temperature_3
- sensor.dark_sky_overnight_low_temperature_4
- sensor.dark_sky_overnight_low_temperature_5
- sensor.dark_sky_overnight_low_temperature_6
- sensor.dark_sky_overnight_low_temperature_7
- sensor.dark_sky_precip_1
- sensor.dark_sky_precip_2
- sensor.dark_sky_precip_3
- sensor.dark_sky_precip_4
- sensor.dark_sky_precip_5
- sensor.dark_sky_precip_6
- sensor.dark_sky_precip_7
- sensor.dark_sky_precip_intensity_1
- sensor.dark_sky_precip_intensity_2
- sensor.dark_sky_precip_intensity_3
- sensor.dark_sky_precip_intensity_4
- sensor.dark_sky_precip_intensity_5
- sensor.dark_sky_precip_intensity_6
- sensor.dark_sky_precip_intensity_7
- sensor.dark_sky_summary
- sensor.dark_sky_wind_bearing_1
- sensor.dark_sky_wind_bearing_2
- sensor.dark_sky_wind_bearing_3
- sensor.dark_sky_wind_bearing_4
- sensor.dark_sky_wind_bearing_5
- sensor.dark_sky_wind_bearing_6
- sensor.dark_sky_wind_bearing_7
- sensor.dark_sky_wind_speed_1
- sensor.dark_sky_wind_speed_2
- sensor.dark_sky_wind_speed_3
- sensor.dark_sky_wind_speed_4
- sensor.dark_sky_wind_speed_5
- sensor.dark_sky_wind_speed_6
- sensor.dark_sky_wind_speed_7
- sensor.date
- sensor.date_time
- sensor.moon
- sensor.next_rising
- sensor.notification_hour
- sensor.notification_minute
- sensor.notification_time
- sensor.notification_time_long
- sensor.people_in_space
- sensor.pws_weather_1n_metric
- sensor.pws_weather_2d_metric
- sensor.pws_weather_2n_metric
- sensor.pws_weather_3d_metric
- sensor.pws_weather_3n_metric
- sensor.pws_weather_4d_metric
- sensor.pws_weather_4n_metric
- sensor.since_last_boot
- sensor.time
- sensor.time_date
- sensor.cert_expiry
- sensor.custom_card_tracker
- sensor.custom_component_tracker
- sensor.custom_python_script_tracker
- sensor.dummy
- sensor.dummy_badkamer
- sensor.dummy_deze_maand
- sensor.dummy_slaapkamer
- sensor.dummy_slaapkamer_fien
- sensor.dummy_slaapkamer_noor
- sensor.dummy_vandaag
- sensor.dummy_verluchting
- sensor.dummy_warmtepomp
- sensor.last_boot
- sensor.moon
- sensor.notification_hour
- sensor.notification_minute
- sensor.notification_time
- sensor.notification_time_long
- sensor.uptime_in_dagen
- sensor.uptime_in_uren
- sensor.zonne_energie_omvormers_gefilterd_lowpass
history.yaml
exclude:
domains:
- weblink
- automation
- updater
- sun
- group
entities:
- sensor.since_last_boot
- sensor.date
- sensor.pws_dewpoint_c
- sensor.pws_dewpoint_string
- sensor.pws_feelslike_c
- sensor.pws_feelslike_string
- sensor.pws_heat_index_c
- sensor.pws_heat_index_string
- sensor.pws_precip_today_metric
- sensor.pws_precip_today_string
- sensor.pws_pressure_mb
- sensor.pws_relative_humidity
- sensor.pws_solarradiation
- sensor.pws_station_id
- sensor.pws_uv
- sensor.pws_visibility_km
- sensor.pws_weather
- sensor.pws_wind_string
- sensor.ram_use
- sensor.time
- sensor.time_date
- sensor.weer_morgen_condition
- sensor.weer_morgen_temperature_max
- sensor.weer_morgen_temperature_min
- sensor.weer_overmorgen_condition
- sensor.weer_overmorgen_temperature_max
- sensor.weer_overmorgen_temperature_min
- binary_sensor.den_hof_outdoor_animal
# - binary_sensor.nummer_79_den_hof_outdoor_animal
- binary_sensor.den_hof_outdoor_motion
- binary_sensor.den_hof_outdoor_vehicle
# - binary_sensor.nummer_79_den_hof_outdoor_vehicle
- climate.netatmo_living_2
- binary_sensor.binnen_warm
- binary_sensor.buiten_fris_genoeg
- binary_sensor.buiten_warm
- sensor.badkamertemperatuur_mean
- sensor.beweging_badkamer_count
- sensor.beweging_slaapkamer_count
- sensor.beweging_slaapkamer_fien_count
- sensor.beweging_slaapkamer_noor_count
- sensor.buienradar
- sensor.bureau
- sensor.fien
- sensor.noor
- sensor.opladers
- sensor.slaapkamer
- sensor.temperatuur_living_mean
- sensor.verluchting
- sensor.wc
- zone.home
It looks like the core issue is that MQTT is flooding home assistant with state changed events.
I have Smappee configured via MQTT; every second I get a realtime update:
{
"totalPower": 833,
"totalReactivePower": 528,
"totalExportEnergy": 0,
"totalImportEnergy": -675983779,
"monitorStatus": 0,
"utcTimeStamp": 1600610515811,
"channelPowers": [
{
"ctInput": 0,
"power": 146,
"exportEnergy": 0,
"importEnergy": -2136093549,
"phaseId": 1,
"current": 11
},
{
"ctInput": 1,
"power": 569,
"exportEnergy": 0,
"importEnergy": -1238023984,
"phaseId": 0,
"current": 26
},
{
"ctInput": 2,
"power": 118,
"exportEnergy": 0,
"importEnergy": -1596829640,
"phaseId": 2,
"current": 6
},
{
"ctInput": 3,
"power": 1005,
"exportEnergy": 11198469,
"importEnergy": -1071029410,
"phaseId": 1,
"current": 43
},
{
"ctInput": 4,
"power": 2075,
"exportEnergy": 12407578,
"importEnergy": 1014673999,
"phaseId": 0,
"current": 88
},
{
"ctInput": 5,
"power": 1000,
"exportEnergy": 675661,
"importEnergy": -1210750333,
"phaseId": 2,
"current": 42
}
],
"voltages": [
{
"voltage": 236,
"phaseId": 0
}
]
}
Please try changing the commit interval in recorder to 5 seconds and make a new py-spy recording
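For reference, a minimal sketch of what that recorder change could look like (the commit_interval option is in seconds; the db_url secret shown matches the recorder.yaml posted above):

```yaml
# recorder.yaml - batch database writes instead of committing every second
db_url: !secret mysql
commit_interval: 5  # seconds between commits to the database
```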
This is a new one: https://www.dropbox.com/s/elvi42f589055gs/spy.svg?dl=0
That looks better. Is there a way to turn down the Smappee updates to every 5 seconds as well? That should calm things down until we can get https://github.com/home-assistant/core/pull/40345 https://github.com/home-assistant/core/pull/40250 and https://github.com/home-assistant/core/pull/40272 out in 0.116
Is there a way to turn down the Smappee updates to every 5 seconds as well?
I can't change this...
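If the device itself can't be slowed down, one possible workaround (a sketch, not tested against this setup; sensor.smappee_total_power is a hypothetical entity name) is the filter integration's time_throttle filter, which only passes one state change through per window and can feed dashboards in place of the raw sensor:

```yaml
# configuration.yaml - hypothetical throttled copy of a chatty MQTT sensor
sensor:
  - platform: filter
    name: "smappee_total_power_throttled"
    entity_id: sensor.smappee_total_power
    filters:
      # Forward at most one update every 5 seconds
      - filter: time_throttle
        window_size: "00:00:05"
```

Note this only reduces downstream updates; the raw MQTT sensor still fires its own state_changed events unless it is excluded from recorder.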
This is a screenshot of Glances while I'm clicking on different entities in a Lovelace view
The high CPU there is caused by showing the logbook part of the more info panel. You have so many state changes feeding into the database that it's taking a long time to fetch all the events for the past 24 hours each time you click the popup.
cc @zsarnett
Maybe we should only show the last 12 hours or last x entries? Not sure what the best solution would be for this. I wonder how many others have this same issue
It would be very nice to make the logbook on the more-info card optional in the first place.
Secondly, please make it only show for entities included in logbook/history.
Right now it shows on all entities, with the swirling icon spinning for five seconds or so before stating there is nothing to show...
Only posting here since you are discussing optimizing this. Otherwise I would have added a new FR
Maybe we should only show the last 12 hours or last x entries? Not sure what the best solution would be for this. I wonder how many others have this same issue
I'll work on the query on the backend. We may be able to make it faster by excluding all the state change events and then unioning them back in if it doesn't make the query too complex. We may have to add a time_type index
Maybe we should only show the last 12 hours or last x entries? Not sure what the best solution would be for this. I wonder how many others have this same issue
@zsarnett No need to make any changes, I fixed the performance issue looking up single entities in https://github.com/home-assistant/core/pull/40075.
Loading my garage exit door lock status: before, 1.32 s (1324 ms); after, 94 ms
I think the logbook access in the more info is the root cause of some of the other performance issues in 0.115. The problem should just go away after #40075
I made a custom component for logbook with your changes in #40075
This is much better; when I'm clicking around in Lovelace, no more blowing fans
This is a Py-Spy when these changes are applied
https://www.dropbox.com/s/t5zqcq9dzl4v5nk/spy-40075.svg?dl=0
And this is a screenshot of glances
Only my idle CPU use is still higher than on 0.114.x, and I still have occasionally blowing fans
I made another custom component from #40345 (I took the MQTT component from dev and applied your changes)
This is not so good, constantly blowing fans
Two py-spy recordings:
https://www.dropbox.com/s/u095q9djk0txmcr/spy-40075%2B40345.svg?dl=0
https://www.dropbox.com/s/yvcxkmrmrk1xo6f/spy-40075%2B40345_2.svg?dl=0
And a screenshot
I don't know how to test #40250 and #40272 so for now I just leave my custom component for logbook applied
This is another Py-Spy, immediately started when the fans of my NUC started to blow
https://www.dropbox.com/s/iustxvojg8v07gn/spy-40075-blowing.svg?dl=0
This is one when the fans started to blow at around 1min 40sec
https://www.dropbox.com/s/c3kq8lc9m30qu2s/spy-40075-blowing-fan-2.svg?dl=0
These are two py-spy recordings of 300 seconds each
Can you post your template entities (sensors) configuration?
Most of my template sensors are in
sensors
https://github.com/gieljnssns/My-Hassio-config/tree/master/config/devices/sensors/template_sensors
And some are in
packages
https://github.com/gieljnssns/My-Hassio-config/tree/master/config/packages
In here are a lot of binary sensors
binary_sensors
https://github.com/gieljnssns/My-Hassio-config/tree/master/config/devices/binary_sensors
In here are some template switches
switches
https://github.com/gieljnssns/My-Hassio-config/tree/master/config/devices/switch
Try disabling config/devices/sensors/template_sensors/unavailable.yaml
and template_sensors/component_count.yaml
Also, change the sensors in template_sensors/component_count.yaml to use
{{ states.input_text | count }}
instead of
{{ states.input_text | list | length }}
since they then no longer have to generate a list and throw it away.
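The two filters produce the same number; a small standalone Jinja2 demonstration of the difference (outside Home Assistant, using a plain list in place of the states object):

```python
from jinja2 import Environment

env = Environment()
data = {"items": ["a", "b", "c"]}

# "count" is an alias for "length" in Jinja2: it measures the sequence
# directly, while "| list | length" first copies the input into a brand
# new list only to measure it and throw it away.
with_copy = env.from_string("{{ items | list | length }}").render(**data)
direct = env.from_string("{{ items | count }}").render(**data)
print(with_copy, direct)  # -> 3 3
```

The saving per render is small, but these templates re-render on every matching state change, so avoiding the throwaway list adds up.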
I'm sorry, I forgot to mention that config/devices/sensors/template_sensors/unavailable.yaml
has already been disabled for 2 days, until I make an automation for it
And {{ states.input_text | list | length }} is already converted to {{ states.input_text | count }}
And I also replaced all now() by as_local(states.sensor.time.last_changed)
Sorry for the confusion, my repo isn't updated yet
Do templates as trigger or condition also count?
Do templates as trigger or condition also count?
yes
value_template in other sensors?
I have 1373 results in 225 files for template
Anything that is a trigger or condition and has a template has to be watched (its entities listened for) to see if it triggers
Do you know an easy way to extract them all out my config? Regex or something?
Or do you no longer need them?
Anything that is a trigger or condition and has a template has to be watched (its entities listened for) to see if it triggers
I don't understand this. Conditions are only checked when the corresponding trigger fires, not when anything mentioned in the condition changes state (right?).
A value_template of another sensor is only updated when the other sensor updates.
Both of these should happen rarely and thus the efficiency matters less.
Anything that is a trigger or condition and has a template has to be watched (its entities listened for) to see if it triggers
I don't understand this. Conditions are only checked when the corresponding trigger fires, not when anything mentioned in the condition changes state (right?).
A value_template of another sensor is only updated when the other sensor updates. Both of these should happen rarely and thus the efficiency matters less.
I probably shouldn't have used the phrasing "trigger condition".
For clarity:
template trigger - listens for state changes and (re)evaluates the template
template condition - evaluated only when triggered by something.
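As a sketch of that difference (hypothetical entity names): the template trigger below is re-evaluated on every relevant state change, while the condition template is only evaluated at the moment the trigger fires:

```yaml
# automation sketch: the trigger template is watched continuously,
# the condition template is not
automation:
  - alias: "Example: trigger vs condition templates"
    trigger:
      # Re-evaluated whenever sensor.outside_temp changes state
      - platform: template
        value_template: "{{ states('sensor.outside_temp') | float > 25 }}"
    condition:
      # Evaluated once, only when the trigger above fires
      - condition: template
        value_template: "{{ is_state('binary_sensor.someone_home', 'on') }}"
    action:
      - service: switch.turn_on
        entity_id: switch.fan
```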
Leaving this open since we have 4 other PRs related to this linked.
@bdraco
If you need more info or Py-Spy, just ask
@gieljnssns Once everything is merged, it would be very helpful to get a new round of py-spys while running the new code to see what remains.
2 down, 3 to go
Will these PRs be tagged with 0.115.x or are they for 0.116?
0.116 as they are too extensive for 0.115
Also it looks like writing state changes is more expensive than I would expect due to the flush.
I'll see if I can get rid of the flush calls
I took care of the flush calls in #40467
@bdraco I also saw this issue, and I think much of it is indeed related to the new state-history function.
This is the query executed for the entity binary_sensor.videosource_1_motion_alarm
DECLARE "c_7f9a850c10_13" CURSOR WITHOUT HOLD FOR SELECT events.event_type AS events_event_type, events.event_data AS events_event_data, events.time_fired AS events_time_fired, events.context_id AS events_context_id, events.context_user_id AS events_context_user_id, states.state AS states_state, states.entity_id AS states_entity_id, states.domain AS states_domain, states.attributes AS states_attributes
FROM events LEFT OUTER JOIN states ON events.event_id = states.event_id LEFT OUTER JOIN states AS old_state ON states.old_state_id = old_state.state_id
WHERE (events.event_type != 'state_changed' OR states.state_id IS NOT NULL AND old_state.state_id IS NOT NULL AND states.state IS NOT NULL AND states.state != old_state.state) AND (events.event_type != 'state_changed' OR states.domain NOT IN ('proximity', 'sensor') OR (states.attributes NOT LIKE '%' || '"unit_of_measurement":' || '%')) AND events.event_type IN ('state_changed', 'logbook_entry', 'call_service', 'homeassistant_start', 'homeassistant_stop', 'script_started', 'automation_triggered') AND events.time_fired > '2020-09-22T09:49:06.809000+00:00'::timestamptz AND events.time_fired < '2020-09-23T09:49:06.809000+00:00'::timestamptz AND (states.last_updated = states.last_changed AND states.entity_id = 'binary_sensor.videosource_1_motion_alarm' OR states.state_id IS NULL AND (events.event_data LIKE '%' || '"entity_id": "binary_sensor.videosource_1_motion_alarm"' || '%')) ORDER BY events.time_fired
All the OR constructs are signs of trouble.
@Expaso please try the logbook with the latest 0.116dev after https://github.com/home-assistant/core/pull/40075
@bdraco
Restarting HA takes a long time
This is a Py-Spy while starting up
https://www.dropbox.com/s/dx8mgkcz4jc22wq/spy-startup.svg?dl=0
Maybe this is useful...
@bdraco
Restarting HA takes a long time
This is a Py-Spy while starting up
dropbox.com/s/dx8mgkcz4jc22wq/spy-startup.svg?dl=0 Maybe this is useful...
How many seconds does it take to finish startup?
This was a py-spy of 120 seconds; startup had just finished before the py-spy ended, so I think around 2.5 min
It looks like more of the same issues. Hopefully it gets quite a bit faster when all of the linked PRs are merged
Thanks for the additional py-spy
👍 I've made more progress tracking down CPU issues here than on any other GitHub issue ever opened.
What is the point of this kind of template validation tool? Any mistake leads to a restart via Portainer;
basically, the template editor becomes useless
What is the point of this kind of template validation tool? Any mistake leads to a restart via Portainer;
basically, the template editor becomes useless
Please open a separate issue for this as there isn't enough information in your post to understand what is going on.
Sometimes HA suddenly hangs up with the following log:
Sometimes HA suddenly hangs up with the following log:
It might be helpful to get an strace of the python process when this is happening
It might be helpful to get an strace of the python process when this is happening
@bdraco
I will try, but the problem is that at that moment the docker container becomes unresponsive and I can't run the py-spy script.
moving my data from another issue I just submitted and closing that one...
Here is a py-spy recording of the system on V.115.5
Here is a "top" command on the system running v115.5
And for comparison here are the same on v114.2
Thank you for the py-spys. They are helpful and highlight opportunities we have to optimize the template engine. Can you also provide a log with event logging turned on? It would be helpful to know which templates are generating the additional workload in 0.115.
logger:
default: warning
logs:
homeassistant.helpers.event: debug
I'm still running 114.2 right now but I put the logging config above in my configuration and I'm not seeing any event debug logging at all in the home-assistant.log.
I didn't want to update again until I knew what I expected to see in the logs but I'm not seeing anything. Should I see something in V114.2 with that logger config?
Here is my current config:
logger:
default: error
logs:
homeassistant.helpers.event: debug
In 0.115 you'll see something like this
2020-09-29 17:56:55 DEBUG (MainThread) [homeassistant.helpers.event] Template update {{ ((states('sensor.energy_usage') | float) + (states('sensor.energy_usage_2') | float)) / 1000 }} triggered by event: <Event state_changed[L]: entity_id=sensor.energy_usage, old_state=<state sensor.energy_usage=1856; attribution=Data provided by Sense.com, unit_of_measurement=W, friendly_name=Energy Usage, icon=mdi:flash @ 2020-09-29T12:54:55.783374-05:00>, new_state=<state sensor.energy_usage=1909; attribution=Data provided by Sense.com, unit_of_measurement=W, friendly_name=Energy Usage, icon=mdi:flash @ 2020-09-29T12:56:55.890390-05:00>>
2020-09-29 17:56:55 DEBUG (MainThread) [homeassistant.helpers.event] Template group [TrackTemplate(template=Template("{{ ((states('sensor.energy_usage') | float) + (states('sensor.energy_usage_2') | float)) / 1000 }}"), variables=None, rate_limit=None)] listens for {'all': False, 'entities': {'sensor.energy_usage', 'sensor.energy_usage_2'}, 'domains': set()}
here is my log file.
Template group [TrackTemplate(template=Template("{{states|count}}
"), variables=None), TrackTemplate(template=Template("{{states|count}} entities in {{states|groupby('domain')|count}} domains
"), variables=None), TrackTemplate(template=Template("{{states.alert|count}}
"), variables=None), TrackTemplate(template=Template("{{states.automation|count}}
"), variables=None), TrackTemplate(template=Template("{{states.binary_sensor|count}}
"), variables=None), TrackTemplate(template=Template("{{states.camera|count}}
"), variables=None), TrackTemplate(template=Template("{{states.climate|count}}
"), variables=None), TrackTemplate(template=Template("{{states.counter|count}}
"), variables=None), TrackTemplate(template=Template("{{states.cover|count}}
"), variables=None), TrackTemplate(template=Template("{{states.device_tracker|count}}
"), variables=None), TrackTemplate(template=Template("{{states.geo_location|count}}
"), variables=None), TrackTemplate(template=Template("{{states.group|count}}
"), variables=None), TrackTemplate(template=Template("{{states.input_boolean|count}}
"), variables=None), TrackTemplate(template=Template("{{states.input_datetime|count}}
"), variables=None), TrackTemplate(template=Template("{{states.input_number|count}}
"), variables=None), TrackTemplate(template=Template("{{states.input_select|count}}
"), variables=None), TrackTemplate(template=Template("{{states.input_text|count}}
"), variables=None), TrackTemplate(template=Template("{{states.light|count}}
"), variables=None), TrackTemplate(template=Template("{{states.media_player|count}}
"), variables=None), TrackTemplate(template=Template("{{states('sensor.count_persistent_notifications')}}
"), variables=None), TrackTemplate(template=Template("{{states.person|count}}
"), variables=None), TrackTemplate(template=Template("{{states.proximity|count}}
"), variables=None), TrackTemplate(template=Template("{{states.remote|count}}
"), variables=None), TrackTemplate(template=Template("{{states.scene|count}}
"), variables=None), TrackTemplate(template=Template("{{states.script|count}}
"), variables=None), TrackTemplate(template=Template("{{states.sensor|count}}
"), variables=None), TrackTemplate(template=Template("{{states.sun|count}}
"), variables=None), TrackTemplate(template=Template("{{states.switch|count}}
"), variables=None), TrackTemplate(template=Template("{{states.timer|count}}
"), variables=None), TrackTemplate(template=Template("{{states.variable|count}}
"), variables=None), TrackTemplate(template=Template("{{states.weather|count}}
"), variables=None), TrackTemplate(template=Template("{{states.zone|count}}
"), variables=None), TrackTemplate(template=Template("{{states.zwave|count}}
"), variables=None), TrackTemplate(template=Template("mdi:home-assistant"), variables=None)] listens for {'all': True, 'entities': set(), 'domains': set()}
TrackTemplate(template=Template("{% set update = states('sensor.time') %} {{expand(states.binary_sensor, states.sensor, states.switch, states.variable, states.media_player, states.light)|selectattr('state', 'in', ['unavailable','unknown','none'])
|reject('in', expand('group.entity_blacklist'))
|reject('eq', states.group.entity_blacklist)
|list|length}}
"), variables=None), TrackTemplate(template=Template("{% set update = states('sensor.time') %} {{expand(states.binary_sensor, states.sensor, states.switch, states.variable, states.media_player, states.light)|selectattr('state', 'in', ['unavailable','unknown','none'])
|reject('in', expand('group.entity_blacklist'))
|reject('eq' , states.group.entity_blacklist)
|reject('eq' , states.group.battery_status)
|map(attribute='entity_id')|list|join(', ')}}
If you disable these two template groups, does the cpu return to normal?
I completely missed that I had that count template in the system. I had done a search on "{{ states |" (with a space after states) but that counting template didn't have the space.
I've been using a Shelly switch that is integrated using the ESPHome API that then toggles a zigbee light bulb as a kind of "canary in the coalmine" thing since that's what alerted me to the CPU issue in the first place. I created an automation to record the delta-T between the switch toggle and the light responding.
For reference, here are the results of that in V114.2 (ignore those wildly extraneous values):
Here are those results in v115.5 with both of the above templates running:
After removing the counts template here are those results:
As you can see the times have come down significantly removing just that one template. But they are still at least twice what the baseline values were.
Then I removed the other template sensor above and now everything seems to be back to "normal":
However I'm surprised that the second template I removed still has such a huge impact on system resources (especially since I'm not running on a Pi but instead an i3 NUC) as that's the "recommended" solution that was put forth in the "heads up!" thread since it doesn't look at all state changes but just a few domains. But I guess the domains that it is still watching are the really active ones (especially the "sensor" domain) so I guess I shouldn't really be surprised after all.
It seems like a good argument for bringing back some form of explicit control over how frequently the templates get rendered.
Hope it helps...and it wasn't too "wordy"... :)
It seems like a good argument for bringing back some form of explicit control over how frequently the templates get rendered.
That is being discussed here https://github.com/home-assistant/architecture/issues/206
Hope it helps...and it wasn't too "wordy"... :)
Thanks, 0.116 should make things quite a bit better regardless because all of those cases are more performant.
@gieljnssns Can you provide new py-spys with 0.116beta or 0.116 when it comes out?
I will do it tomorrow.
This py-spy is from the second restart of HA after the update to 0.116b0, as soon as it was possible:
https://www.dropbox.com/s/rz9ywoskzf8m2jb/spy-0.116-startup.svg?dl=0
This py-spy is one of 10 min running 0.116b0; about once a minute my fans are spinning up:
https://www.dropbox.com/s/0x86vnva13kuckh/spy-0.116-10min.svg?dl=0
and I had errors from ZHA at startup:
2020-10-01 09:35:35 ERROR (MainThread) [zigpy_deconz.uart] Lost serial connection: read failed: device reports readiness to read but returned no data (device disconnected or multiple access on port?)
2020-10-01 09:35:36 ERROR (MainThread) [zigpy.application] Couldn't start application
2020-10-01 09:35:36 ERROR (MainThread) [homeassistant.components.zha.core.gateway] Couldn't start deCONZ = dresden elektronik deCONZ protocol: ConBee I/II, RaspBee I/II coordinator
Traceback (most recent call last):
File "/usr/src/homeassistant/homeassistant/components/zha/core/gateway.py", line 147, in async_initialize
self.application_controller = await app_controller_cls.new(
File "/usr/local/lib/python3.8/site-packages/zigpy/application.py", line 68, in new
await app.startup(auto_form)
File "/usr/local/lib/python3.8/site-packages/zigpy_deconz/zigbee/application.py", line 66, in startup
self.version = await self._api.version()
File "/usr/local/lib/python3.8/site-packages/zigpy_deconz/api.py", line 435, in version
(self._proto_ver,) = await self[NetworkParameter.protocol_version]
File "/usr/local/lib/python3.8/site-packages/zigpy_deconz/api.py", line 400, in read_parameter
r = await self._command(Command.read_parameter, 1 + len(data), param, data)
File "/usr/local/lib/python3.8/site-packages/zigpy_deconz/api.py", line 304, in _command
return await asyncio.wait_for(fut, timeout=COMMAND_TIMEOUT)
File "/usr/local/lib/python3.8/asyncio/tasks.py", line 490, in wait_for
raise exceptions.TimeoutError()
asyncio.exceptions.TimeoutError
CPU is still around 5 to 6%, while before 0.115.x it was 2 to 3%.
If you need more - just ask....
We probably can reduce the relookups with expire_on_commit set to False and only expiring the cache every 5 minutes.
Actually I think we just need to set expire_on_commit to False for the recorder, as each commit is emptying the cache and causing the data to be refetched from the db every second. In short, https://github.com/home-assistant/core/pull/40467 didn't actually solve the issue.
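To illustrate the idea (a generic sketch, not the actual recorder code): with SQLAlchemy's default expire_on_commit=True, every commit throws away the session's cached objects; the alternative is to keep them and only clear the cache on a timer, e.g. every 5 minutes:

```python
import time

class ExpiringCache:
    """Keep entries until a TTL elapses instead of dropping them on every
    write: analogous to running the session with expire_on_commit=False
    and calling expire_all() only every 5 minutes (illustrative names)."""

    def __init__(self, ttl_seconds=300.0, clock=time.monotonic):
        self._ttl = ttl_seconds
        self._clock = clock
        self._data = {}
        self._last_expire = clock()

    def _maybe_expire(self):
        # Clear everything once per TTL window rather than on every commit.
        now = self._clock()
        if now - self._last_expire >= self._ttl:
            self._data.clear()
            self._last_expire = now

    def get(self, key, default=None):
        self._maybe_expire()
        return self._data.get(key, default)

    def set(self, key, value):
        self._maybe_expire()
        self._data[key] = value
```

The clock parameter exists only so the expiry behaviour can be tested deterministically.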
@gieljnssns Beta 1 has the additional fix for writing the database. Can you give that a shot and provide additional py-spy captures?
@bdraco Some new py-spy's.
This one covers 120 s while starting up (it takes about 2 min to start up):
https://www.dropbox.com/s/1nhyvrbiumlaxae/spy-0.116b1-startup.svg?dl=0
The next one is 600 s; the fans still spin up sometimes:
https://www.dropbox.com/s/2wr3sazjsj5ujmj/spy-0.116b1-10min.svg?dl=0
py-spy dump:
bash-5.0# py-spy dump --pid 232
Process 232: python3 -m homeassistant --config /config
Python v3.8.5 (/usr/local/bin/python3.8)
Thread 232 (idle): "MainThread"
select (selectors.py:468)
_run_once (asyncio/base_events.py:1823)
run_forever (asyncio/base_events.py:570)
run_until_complete (asyncio/base_events.py:603)
run (asyncio/runners.py:43)
run (homeassistant/runner.py:133)
main (homeassistant/__main__.py:312)
<module> (homeassistant/__main__.py:320)
_run_code (runpy.py:87)
_run_module_as_main (runpy.py:194)
Thread 254 (idle): "Thread-1"
dequeue (logging/handlers.py:1427)
_monitor (logging/handlers.py:1478)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 255 (idle): "SyncWorker_0"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 256 (idle): "SyncWorker_1"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 257 (idle): "SyncWorker_2"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 258 (idle): "SyncWorker_3"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 259 (idle): "SyncWorker_4"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 260 (idle): "SyncWorker_5"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 263 (idle): "SyncWorker_6"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 264 (idle): "SyncWorker_7"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 265 (idle): "SyncWorker_8"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 266 (idle): "SyncWorker_9"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 267 (idle): "SyncWorker_10"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 268 (idle): "SyncWorker_11"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 269 (idle): "SyncWorker_12"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 270 (idle): "SyncWorker_13"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 271 (idle): "SyncWorker_14"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 272 (idle): "SyncWorker_15"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 273 (idle): "SyncWorker_16"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 274 (idle): "SyncWorker_17"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 275 (idle): "SyncWorker_18"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 276 (idle): "SyncWorker_19"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 277 (idle): "SyncWorker_20"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 278 (idle): "SyncWorker_21"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 279 (idle): "SyncWorker_22"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 280 (idle): "SyncWorker_23"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 281 (idle): "SyncWorker_24"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 282 (idle): "Recorder"
run (homeassistant/components/recorder/__init__.py:351)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 283 (idle): "Thread-2"
dequeue (logging/handlers.py:1427)
_monitor (logging/handlers.py:1478)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 285 (idle): "zeroconf-Engine-285"
run (zeroconf/__init__.py:1325)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 286 (idle): "zeroconf-Reaper_286"
wait (threading.py:306)
run (zeroconf/__init__.py:1451)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 288 (idle): "Thread-5"
loop (paho/mqtt/client.py:1167)
loop_forever (paho/mqtt/client.py:1779)
_thread_main (paho/mqtt/client.py:3452)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 292 (idle): "SyncWorker_25"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 293 (idle): "SyncWorker_26"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 296 (idle): "influxdb"
wait (threading.py:306)
get (queue.py:179)
get_events_json (homeassistant/components/influxdb/__init__.py:500)
run (homeassistant/components/influxdb/__init__.py:550)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 299 (idle): "SyncWorker_27"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 300 (idle): "SyncWorker_28"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 301 (idle): "SyncWorker_29"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 302 (idle): "SyncWorker_30"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 303 (idle): "SyncWorker_31"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 304 (idle): "SyncWorker_32"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 305 (idle): "SyncWorker_33"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 306 (idle): "SyncWorker_34"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 307 (idle): "SyncWorker_35"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 308 (idle): "SyncWorker_36"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 309 (idle): "SyncWorker_37"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 310 (idle): "SyncWorker_38"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 311 (idle): "SyncWorker_39"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 312 (idle): "SyncWorker_40"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 313 (idle): "SyncWorker_41"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 314 (idle): "SyncWorker_42"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 319 (idle): "SyncWorker_43"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 320 (idle): "SyncWorker_44"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 326 (idle): "Thread-6"
select (selectors.py:415)
handle_request (socketserver.py:294)
run (pysonos/events.py:149)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 327 (idle): "Thread-7"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 328 (idle): "Thread-8"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 330 (idle): "Thread-10"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 331 (idle): "Thread-11"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 334 (idle): "Thread-14"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 335 (idle): "Thread-15"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 338 (idle): "Thread-18"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 339 (idle): "Thread-19"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 341 (idle): "Thread-21"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 343 (idle): "Thread-23"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 344 (idle): "Thread-24"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 346 (idle): "Thread-26"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 348 (idle): "Thread-28"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 349 (idle): "Thread-29"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 351 (idle): "Thread-31"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 353 (idle): "Thread-33"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 354 (idle): "SyncWorker_45"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 355 (idle): "SyncWorker_46"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 356 (idle): "SyncWorker_47"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 357 (idle): "SyncWorker_48"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 359 (idle): "SyncWorker_49"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 360 (idle): "SyncWorker_50"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 361 (idle): "SyncWorker_51"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 366 (idle): "SyncWorker_52"
ping (icmplib/ping.py:133)
ping (homeassistant/components/ping/device_tracker.py:94)
update (homeassistant/components/ping/device_tracker.py:100)
update_interval (homeassistant/components/ping/device_tracker.py:141)
run (concurrent/futures/thread.py:57)
_worker (concurrent/futures/thread.py:80)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 367 (idle): "SyncWorker_53"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 368 (idle): "SyncWorker_54"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 369 (idle): "SyncWorker_55"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 370 (idle): "SyncWorker_56"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 371 (idle): "SyncWorker_57"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 372 (idle): "SyncWorker_58"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 376 (idle): "SyncWorker_59"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 377 (idle): "SyncWorker_60"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 378 (idle): "SyncWorker_61"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 379 (idle): "SyncWorker_62"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 380 (idle): "SyncWorker_63"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 381 (idle): "Thread-39"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 383 (idle): "Thread-41"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 384 (idle): "Thread-42"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 385 (idle): "Thread-43"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 433 (idle): "zeroconf-ServiceBrowser__printer._tcp.local.-_dkapi._tcp.local.-_daap._tcp.local.-_axis-video._tcp.local.-_api._udp.local.-_ipp._tcp.local.-_elg._tcp.local.-_nut._tcp.local.-_ssh._tcp.local.-_xbmc-jsonrpc-h._tcp.local.-_plugwise._tcp.local.-_esphomelib._tcp.local.-_Volumio._tcp.local.-_http._tcp.local.-_hap._tcp.local.-_spotify-connect._tcp.local.-_ipps._tcp.local.-_googlecast._tcp.local.-_bond._tcp.local.-_miio._udp.local.-_viziocast._tcp.local.-_wled._tcp.local.-_homekit._tcp.local._433"
wait (threading.py:306)
wait (zeroconf/__init__.py:2407)
run (zeroconf/__init__.py:1701)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 435 (idle): "Thread-49"
send_events (pyhap/accessory_driver.py:483)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 436 (idle): "Thread-50"
select (selectors.py:415)
serve_forever (socketserver.py:232)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 441 (idle): "Thread-51"
recv (pyhap/hap_server.py:779)
recv_into (pyhap/hap_server.py:758)
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:429)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 442 (idle): "Thread-52"
recv (pyhap/hap_server.py:779)
recv_into (pyhap/hap_server.py:758)
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:429)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 443 (idle): "Thread-53"
recv (pyhap/hap_server.py:779)
recv_into (pyhap/hap_server.py:758)
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:429)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
Collecting samples from 'python3 -m homeassistant --config /config' (python v3.8.5)
Total Samples 7800
That looks a lot better. How is the CPU usage compared to 0.115?
My frontend wasn't accessible anymore; I had to restart HA in the container.
CPU now 12 to 13%; v0.115 was less.
The thing that still stands out is that the number of MQTT messages is overwhelming. Can you send a debug log?
default: error
logs:
homeassistant.mqtt: debug
homeassistant.event: debug
?
default: error
logs:
homeassistant.components.mqtt: debug
homeassistant.helpers.event: debug
homeassistant.core: debug
This is going to be a big file; I'll send a WeTransfer link to your email.
How long: 5 min, 10 min, ...?
5 minutes should be fine.
Startup takes 2 minutes...
Link sent.
Thanks. There wasn't anything helpful in the log. At this point it's probably best to instrument with cProfile so we can see what is using cpu with the profiler. I'll need to write a custom component to do that.
I've put together a profiler.
The profiler can be installed via HACS by adding https://github.com/bdraco/profiler.git as a custom repository.
Then set up the Profiler integration (no restart needed).
Once it's up and running, call the profiler.start service.
This will generate a profile.TIMESTAMP.cprof and a callgrind.out.TIMESTAMP file in your Home Assistant directory after 60 seconds.
I can use those files to get a better idea of what is going on.
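For anyone curious what such a component does under the hood, it is essentially cProfile run for a fixed window and then dumped to disk. A minimal standalone sketch (the real component profiles the live event loop rather than a synthetic workload, and the callgrind output presumably goes through a converter such as pyprof2calltree, omitted here; profile_for and the workload are illustrative names):

```python
import cProfile
import pstats
import time

def profile_for(seconds, workload):
    """Run `workload` under cProfile for roughly `seconds` and dump the
    stats, mimicking what a profiler.start service call produces."""
    # Microsecond timestamp, matching the profile.TIMESTAMP.cprof naming above.
    start_time = int(time.time() * 1e6)
    profiler = cProfile.Profile()
    profiler.enable()
    deadline = time.monotonic() + seconds
    while time.monotonic() < deadline:
        workload()
    profiler.disable()
    out_file = f"profile.{start_time}.cprof"
    profiler.dump_stats(out_file)
    return out_file

# Toy workload standing in for the running event loop.
path = profile_for(0.05, lambda: sum(range(1000)))
# The .cprof file can then be inspected with pstats (or tools like snakeviz).
pstats.Stats(path).sort_stats("cumulative")
```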
I will do this tomorrow, tonight it's soccer-night and 🍺🍻🥃
Can we add this Profiler on Home Assistant OS too? If so, I'd be glad to try and help.
Will this custom component be usable and helpful on 0.115 (the non-dev version)?
If so I can run it and provide info if you want.
It should work on 0.115, as well as on anything that can run core.
OK, I've been able to install and run it on my test instance; I will do so on the production system too. How do we get the files to you?
BTW, this isn't the py-spy test you're using; would you be able to create a CC for that too, so we can really log (spy) as you need?
> OK, I've been able to install and run it on my test instance; I will do so on the production system too. How do we get the files to you?
email is fine (address above)
> BTW, this isn't the py-spy test you're using; would you be able to create a CC for that too, so we can really log (spy) as you need?
py-spy requires a bit more work to install on a lot of systems, so a custom component might generate more frustration than it's worth.
I get this on RPI3+ venv install:
2020-10-03 12:28:05 ERROR (MainThread) [homeassistant.core] Error executing service: <ServiceCall profiler.start (c:99cf28ac055a11eba239c9e9903a3e80): seconds=60.0>
Traceback (most recent call last):
File "/home/pi/homeassistant/lib/python3.7/site-packages/homeassistant/core.py", line 1331, in catch_exceptions
await coro_or_task
File "/home/pi/homeassistant/lib/python3.7/site-packages/homeassistant/core.py", line 1350, in _execute_service
await handler.func(service_call)
File "/home/pi/homeassistant/lib/python3.7/site-packages/homeassistant/helpers/service.py", line 523, in admin_handler
await result
File "/home/pi/.homeassistant/custom_components/profiler/__init__.py", line 43, in _async_generate_profile
profiler.dump_stats(f"profile.{start_time}.cprof")
File "/usr/lib/python3.7/cProfile.py", line 46, in dump_stats
with open(file, 'wb') as f:
PermissionError: [Errno 13] Permission denied: 'profile.1601717225062955.cprof'
Files sent.
The key remaining issue is the number of calls to _mqtt_handle_message and the ~880 events that happened in 60 seconds. There also seems to be a template that is somehow generating json, but that is small in comparison.
> I get this on RPI3+ venv install: PermissionError: [Errno 13] Permission denied: 'profile.1601717225062955.cprof'
I made an adjustment that should take care of that
cd custom_components
cd profiler
git pull
Restart, then add the Profiler integration in the UI.
I know that this is not Discord/forum/helpdesk... but after those steps and removing profiler: from configuration.yaml, I am clueless. The UI?
The profiler can now be installed via HACS by adding https://github.com/bdraco/profiler.git as a custom repository.
@bdraco I've updated to v0.116.0b2; I have 2 new py-spy's for you.
At startup: https://www.dropbox.com/s/i9p0505uvk1v2u2/spy-0.116b2-startup.svg?dl=0
One of 10 min: https://www.dropbox.com/s/91ab3ab1m3g9ey0/spy-0.116b2-10min.svg?dl=0
If you need something else...
The system was a bit less noisy in the last one, so I can see we are converting States to dicts with as_dict multiple times. I think we could cache this.
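One way to do that caching, sketched with a pared-down stand-in for the real State class (the actual object carries more fields, but the idea is the same: states are treated as immutable, so the dict form can be built once and reused):

```python
class State:
    """Pared-down stand-in for Home Assistant's State object,
    illustrating the as_dict caching idea only."""

    __slots__ = ("entity_id", "state", "attributes", "_as_dict")

    def __init__(self, entity_id, state, attributes=None):
        self.entity_id = entity_id
        self.state = state
        self.attributes = attributes or {}
        self._as_dict = None  # lazily-built cache of the dict form

    def as_dict(self):
        # Build the dict on first call only; subsequent callers
        # (websocket serialization, recorder, templates) reuse it.
        if self._as_dict is None:
            self._as_dict = {
                "entity_id": self.entity_id,
                "state": self.state,
                "attributes": self.attributes,
            }
        return self._as_dict
```

Since the state object never changes after creation, handing every caller the same cached dict is safe as long as callers don't mutate it.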
@bdraco Do I have to make another py-spy with v0.116.0b3? Or do I have to wait until #41217 and #41208 get merged?
There likely won't be any difference with b3. Waiting for #41217 and #41208 makes sense.
@gieljnssns Actually #41225 could make a difference, so it would be great to get a fresh py-spy with b3.
> it would be great to get a fresh py-spy with b3
One at startup and one of 10 min?
Just have to wait until the newest beta is visible for me... (I'll hit reload every minute)
Yes please
0.116b3 has landed.
Startup: https://www.dropbox.com/s/ux167osddglaq5h/spy-0.116b3-startup.svg?dl=0
Another of 2 min (I forgot to change the time): https://www.dropbox.com/s/7hw1lmi9vpljis2/spy-0.116b3-2min.svg?dl=0
It's looking better. I think the compression change on the websocket suggested here https://github.com/aio-libs/aiohttp/issues/2856#issuecomment-375944162 will help a bit as well.
async_fire is a bottleneck because it has to call asyncio.iscoroutinefunction and asyncio.iscoroutine before it figures out a function is a callback. I'll see if I can improve that as well.
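A sketch of the kind of improvement meant here: classify each listener once at registration time instead of re-running the asyncio introspection on every async_fire. Home Assistant's @callback decorator marks functions with a _hass_callback attribute, which makes the check a cheap attribute lookup (the classify helper and the "executor" label below are simplified illustrations, not the real core code):

```python
import asyncio

def callback(func):
    """Mark a function as safe to invoke directly in the event loop,
    mirroring Home Assistant's @callback decorator."""
    func._hass_callback = True
    return func

def classify(func):
    """Decide once, at listen time, how a listener must be invoked, so
    the hot event-firing path never calls asyncio.iscoroutinefunction."""
    if getattr(func, "_hass_callback", False):
        return "callback"      # run inline in the event loop
    if asyncio.iscoroutinefunction(func):
        return "coroutine"     # schedule as a task
    return "executor"          # plain sync function: run in a thread pool

@callback
def fast_listener(event):
    pass

async def async_listener(event):
    pass

def blocking_listener(event):
    pass
```

With the result stored alongside the listener at registration, firing an event reduces to a dict lookup plus a dispatch on the precomputed category.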
I think b4 doesn't make a difference? None of your PRs are in this?
There are still a few areas I want to work on:
- [ ] Reducing the overhead of async_fire by avoiding the asyncio.iscoroutinefunction and asyncio.iscoroutine checks. We already have a pattern for this in class Service: https://github.com/home-assistant/core/pull/41304
- [ ] Reducing the overhead of state_changed events. I have a good idea how to make this happen: https://github.com/home-assistant/core/pull/41314
- [x] I missed a bug in the template rate limit that I just fixed in https://github.com/home-assistant/core/pull/41308, so that should help as well.
- [ ] Reducing the overhead of MQTT matching. I don't regularly use MQTT so this may need some help since I'm not getting anywhere after the initial improvement made in #40345.
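For context on what "MQTT matching" costs: every incoming publish has to be tested against each subscribed filter, including the + (single level) and # (multi level) wildcards. A minimal matcher with the filter split cached is sketched below (illustrative only, not HA's actual implementation):

```python
from functools import lru_cache

@lru_cache(maxsize=4096)
def _split_filter(filter_topic):
    # Subscription filters repeat across messages, so cache their split form.
    return tuple(filter_topic.split("/"))

def topic_matches(filter_topic, topic):
    """Return True if an MQTT topic matches a subscription filter,
    honouring + (exactly one level) and # (this level and below)."""
    flevels = _split_filter(filter_topic)
    tlevels = topic.split("/")
    for i, f in enumerate(flevels):
        if f == "#":
            return True          # matches the remainder, however deep
        if i >= len(tlevels):
            return False         # filter is longer than the topic
        if f != "+" and f != tlevels[i]:
            return False         # literal level mismatch
    return len(flevels) == len(tlevels)
```

Even with caching, this is linear in the number of subscriptions per message; the bigger structural win would be indexing filters by their literal prefix so most candidates are never tested at all.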
Let me know if I can help; I use MQTT rather heavily here.
@Mariusthvdb py-spy captures with 0.116.0b5 and a profile would be helpful to identify commonalities between your install and @gieljnssns's install.
I'd be glad to do so; the only issue is I am on Home Assistant OS, so I'm not sure if I can install py-spy?
I've had success installing it via apk add cargo ; cargo install py-spy once I'm in the docker container (https://developers.home-assistant.io/docs/operating-system/debugging/#accessing-the-container-bash) after https://developers.home-assistant.io/docs/operating-system/debugging/#home-assistant-operating-system
This may also work as well:
https://github.com/home-assistant/core/issues/40292#issuecomment-695363148
I believe #41304, #41314, #41217 and #41208 are going to be for v0.117.0. But I've installed v0.116.0b5 with #41308 in it, so 2 new py-spy's:
2 min at startup: https://www.dropbox.com/s/mumxpxtw5j2f5rd/spy-0.116b5-startup.svg?dl=0
and 10 min: https://www.dropbox.com/s/mbw4n52xabh47aq/spy-0.116b5-10min.svg?dl=0
Thanks. Based on these, and the amount of listeners being dispatched, the next change that should have a perceptible difference is https://github.com/home-assistant/core/pull/41304
This is a bit too dangerous for my production system. Couldn't this be set up via the Portainer add-on? https://github.com/hassio-addons/addon-portainer
I don't see why not. As long as you can enter the docker container, you should be able to install py-spy.
And if you up- or downgrade HA you get a new container and have to reinstall py-spy anyway, so I don't see why this is dangerous.
Really sorry, but I can't see how to install it in Portainer. I thought it to be the safest/easiest way to use extra container installs on HA OS, but maybe it isn't available after all.
I am going to leave this closed since 0.116 is hours away. If the problem persists in 0.116 please start a fresh case, as it's unlikely we will get other users to dig through this one and provide helpful data once 0.116 is out.
> Thanks. Based on these, and the amount of listeners being dispatched, the next change that should have a perceptible difference is #41304
Will this be in v0.116.0?
0.117
https://github.com/home-assistant/core/pull/41433 should help as well
@bdraco
My CPU use isn't back to the level of v0.114.x, but I will wait until v0.117bx, when #41304, #41314, #41217, #41208 and #41433 are merged, before opening a new issue (if still necessary).
@gieljnssns https://github.com/home-assistant/core/pull/41513 and https://github.com/home-assistant/core/pull/41523 should help a bit as well, but I suspect #41433 is going to be the big win for you based on your profile.
Also https://github.com/home-assistant/core/pull/41394 may make quite a difference as well
By chance do you have a py-spy from 0.114 for comparison?
Unfortunately not
I have made a custom component for #41433 with this https://github.com/emontnemery/home-assistant/tree/170cd2d17d91016090b5bb03041e661ecc50d20a/homeassistant/components/mqtt
and a custom component for #41394 with this https://github.com/bdraco/home-assistant/blob/19def8547f5c33ed256fc092d89938b11298ff3b/homeassistant/components/recorder/__init__.py
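For reference, the way such a dev-branch integration ends up in /config/custom_components can be sketched like this (the commit hash comes from the link above; the exact copy steps are my assumption):

```shell
# Sketch: shadow the built-in mqtt integration with the dev version linked above
git clone https://github.com/emontnemery/home-assistant.git /tmp/ha-dev
cd /tmp/ha-dev
git checkout 170cd2d17d91016090b5bb03041e661ecc50d20a
mkdir -p /config/custom_components
cp -r homeassistant/components/mqtt /config/custom_components/mqtt
# A folder in custom_components overrides the integration of the same name,
# but it still has to import only core APIs that the installed release provides.
```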
At startup I get this error:
2020-10-09 15:34:22 ERROR (MainThread) [homeassistant.config] Platform error: binary_sensor
Traceback (most recent call last):
File "/usr/src/homeassistant/homeassistant/config.py", line 814, in async_process_component_config
platform = p_integration.get_platform(domain)
File "/usr/src/homeassistant/homeassistant/loader.py", line 401, in get_platform
cache[full_name] = importlib.import_module(
File "/usr/local/lib/python3.8/importlib/__init__.py", line 127, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
File "<frozen importlib._bootstrap>", line 991, in _find_and_load
File "<frozen importlib._bootstrap>", line 961, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
File "<frozen importlib._bootstrap>", line 991, in _find_and_load
File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 671, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 783, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/config/custom_components/mqtt/__init__.py", line 47, in <module>
from . import debug_info, discovery
File "/config/custom_components/mqtt/discovery.py", line 13, in <module>
from homeassistant.loader import async_get_mqtt
ImportError: cannot import name 'async_get_mqtt' from 'homeassistant.loader' (/usr/src/homeassistant/homeassistant/loader.py)
But I still made the 2 py-spys:
startup https://www.dropbox.com/s/f8te1xiak30f596/spy-0.116.1-custom_recorder_and_mqtt-startup.svg?dl=0
10 min https://www.dropbox.com/s/d1bpfjhn08vxklc/spy-0.116.1-custom_recorder_and_mqtt-10min.svg?dl=0
Now I'm going to try without mqtt custom component
The mqtt 0.117 dev version has some other changes in it that won't work on 0.116 since it needs the core changes to support it.
Thanks for the new py-spys. Everything new I'm seeing in them looks like it's covered by one of the 0.117 merges.
I thought so. For mqtt I'm going to wait for the beta, I'm not going to run dev, so only #41394 is in this sampling...
Thanks. The recorder is looking much, much better now. https://github.com/home-assistant/core/pull/41304 should help take a lot of noise out of the profile, so we will get more detail in 0.117.
No, thank you.
I will ping you back when v0.117b0 drops
One more that should help with the high number of state change events https://github.com/home-assistant/core/pull/41736
One more that should help with mqtt https://github.com/home-assistant/core/pull/41796
Thanks a lot. When the beta arrives, I'll send you 2 new py-spys.
https://github.com/home-assistant/core/pull/41820 should help a bit as well
Everything that should help with this except https://github.com/home-assistant/core/pull/41741 has made it into 0.117dev at this point.
@bdraco
Since yesterday evening my CPU use is higher than I'm used to (since 0.115.x).
I made a py-spy of 2min. https://www.dropbox.com/s/i7nhf5lzf0xohxe/spy-0.116.2-cpu_higher.svg?dl=0
But I'm seeing a lot of warnings while recording the py-spy:
py-spy> 1.11s behind in sampling, results may be inaccurate. Try reducing the sampling rate
py-spy> 1.09s behind in sampling, results may be inaccurate. Try reducing the sampling rate
py-spy> 1.14s behind in sampling, results may be inaccurate. Try reducing the sampling rate
py-spy> 1.19s behind in sampling, results may be inaccurate. Try reducing the sampling rate
py-spy> 1.18s behind in sampling, results may be inaccurate. Try reducing the sampling rate
py-spy> 1.15s behind in sampling, results may be inaccurate. Try reducing the sampling rate
py-spy> 1.28s behind in sampling, results may be inaccurate. Try reducing the sampling rate
py-spy> 1.54s behind in sampling, results may be inaccurate. Try reducing the sampling rate
py-spy> 1.54s behind in sampling, results may be inaccurate. Try reducing the sampling rate
py-spy> 1.62s behind in sampling, results may be inaccurate. Try reducing the sampling rate
py-spy> 1.59s behind in sampling, results may be inaccurate. Try reducing the sampling rate
py-spy> 1.59s behind in sampling, results may be inaccurate. Try reducing the sampling rate
py-spy> 1.71s behind in sampling, results may be inaccurate. Try reducing the sampling rate
py-spy> 1.76s behind in sampling, results may be inaccurate. Try reducing the sampling rate
py-spy> 1.86s behind in sampling, results may be inaccurate. Try reducing the sampling rate
py-spy> 1.89s behind in sampling, results may be inaccurate. Try reducing the sampling rate
py-spy> 2.01s behind in sampling, results may be inaccurate. Try reducing the sampling rate
py-spy> 2.16s behind in sampling, results may be inaccurate. Try reducing the sampling rate
....
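Those messages mean py-spy can't keep up at its default 100 samples per second; the sampling frequency can be lowered with the --rate flag (the values here are just examples):

```shell
# Retry the 2-minute capture at half the default sampling rate
py-spy record --pid 233 --rate 50 --duration 120 --output spy-lower-rate.svg
```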
When I do py-spy dump --pid 233 I get this, but my terminal's scrollback isn't long enough:
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 295 (idle): "SyncWorker_27"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 296 (idle): "SyncWorker_28"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 297 (idle): "SyncWorker_29"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 298 (idle): "SyncWorker_30"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 299 (idle): "SyncWorker_31"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 300 (idle): "SyncWorker_32"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 301 (idle): "SyncWorker_33"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 302 (idle): "SyncWorker_34"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 303 (idle): "SyncWorker_35"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 304 (idle): "SyncWorker_36"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 305 (idle): "SyncWorker_37"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 306 (idle): "SyncWorker_38"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 307 (idle): "SyncWorker_39"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 308 (idle): "SyncWorker_40"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 309 (idle): "SyncWorker_41"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 310 (idle): "SyncWorker_42"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 311 (idle): "SyncWorker_43"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 312 (idle): "SyncWorker_44"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 313 (idle): "SyncWorker_45"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 314 (idle): "SyncWorker_46"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 315 (idle): "SyncWorker_47"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 316 (idle): "SyncWorker_48"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 317 (idle): "SyncWorker_49"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 318 (idle): "SyncWorker_50"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 319 (idle): "SyncWorker_51"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 320 (idle): "SyncWorker_52"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 321 (idle): "SyncWorker_53"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 322 (idle): "SyncWorker_54"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 323 (idle): "SyncWorker_55"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 324 (idle): "SyncWorker_56"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 325 (idle): "SyncWorker_57"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 326 (idle): "SyncWorker_58"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 327 (idle): "SyncWorker_59"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 328 (idle): "SyncWorker_60"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 329 (idle): "SyncWorker_61"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 330 (idle): "SyncWorker_62"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 331 (idle): "SyncWorker_63"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 332 (idle): "zeroconf-Engine-332"
run (zeroconf/__init__.py:1325)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 333 (idle): "zeroconf-Reaper_333"
wait (threading.py:306)
run (zeroconf/__init__.py:1451)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 334 (idle): "influxdb"
wait (threading.py:306)
get (queue.py:179)
get_events_json (homeassistant/components/influxdb/__init__.py:500)
run (homeassistant/components/influxdb/__init__.py:550)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 340 (idle): "Thread-6"
select (selectors.py:415)
handle_request (socketserver.py:294)
run (pysonos/events.py:149)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 429 (idle): "zeroconf-ServiceBrowser__printer._tcp.local.-_googlecast._tcp.local.-_Volumio._tcp.local.-_hap._tcp.local.-_homekit._tcp.local.-_nut._tcp.local.-_wled._tcp.local.-_axis-video._tcp.local.-_dkapi._tcp.local.-_viziocast._tcp.local.-_xbmc-jsonrpc-h._tcp.local.-_ipps._tcp.local.-_plugwise._tcp.local.-_elg._tcp.local.-_ssh._tcp.local.-_miio._udp.local.-_http._tcp.local.-_bond._tcp.local.-_esphomelib._tcp.local.-_api._udp.local.-_spotify-connect._tcp.local.-_daap._tcp.local.-_ipp._tcp.local._429"
wait (threading.py:306)
wait (zeroconf/__init__.py:2407)
run (zeroconf/__init__.py:1701)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 430 (idle): "Thread-57"
send_events (pyhap/accessory_driver.py:483)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 431 (idle): "Thread-58"
select (selectors.py:415)
serve_forever (socketserver.py:232)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 433 (idle): "Thread-60"
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:427)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 436 (idle): "Thread-63"
recv (pyhap/hap_server.py:779)
recv_into (pyhap/hap_server.py:758)
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:429)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 437 (idle): "Thread-64"
recv (pyhap/hap_server.py:779)
recv_into (pyhap/hap_server.py:758)
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:429)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 558 (idle): "Thread-72"
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:427)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 1100 (idle): "Thread-79"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 1102 (idle): "Thread-81"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 1103 (idle): "Thread-82"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 1104 (idle): "Thread-83"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 1107 (idle): "Thread-86"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 1109 (idle): "Thread-88"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 1112 (idle): "Thread-91"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 1113 (idle): "Thread-92"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 1114 (idle): "Thread-93"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 1115 (idle): "Thread-94"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 1116 (idle): "Thread-95"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 1118 (idle): "Thread-97"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 1123 (idle): "Thread-102"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 1125 (idle): "Thread-104"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 1127 (idle): "Thread-106"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 1128 (idle): "Thread-107"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 1373 (idle): "Thread-112"
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:427)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 1462 (idle): "Thread-113"
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:427)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 2782 (idle): "Thread-127"
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:427)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 5548 (idle): "Thread-150"
recv (pyhap/hap_server.py:779)
recv_into (pyhap/hap_server.py:758)
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:429)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 6442 (idle): "Thread-164"
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:427)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 8477 (idle): "Thread-195"
recv (pyhap/hap_server.py:779)
recv_into (pyhap/hap_server.py:758)
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:429)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 9855 (idle): "Thread-223"
recv (pyhap/hap_server.py:779)
recv_into (pyhap/hap_server.py:758)
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:429)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 10308 (idle): "Thread-248"
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:427)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 11167 (idle): "Thread-267"
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:427)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 11327 (idle): "Thread-287"
recv (pyhap/hap_server.py:779)
recv_into (pyhap/hap_server.py:758)
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:429)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 11454 (idle): "Thread-290"
recv (pyhap/hap_server.py:779)
recv_into (pyhap/hap_server.py:758)
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:429)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 14387 (idle): "Thread-339"
recv (pyhap/hap_server.py:779)
recv_into (pyhap/hap_server.py:758)
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:429)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 14407 (idle): "Thread-341"
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:427)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 15147 (idle): "Thread-359"
recv (pyhap/hap_server.py:779)
recv_into (pyhap/hap_server.py:758)
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:429)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 15966 (idle): "Thread-406"
recv (pyhap/hap_server.py:779)
recv_into (pyhap/hap_server.py:758)
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:429)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 16045 (idle): "Thread-407"
recv (pyhap/hap_server.py:779)
recv_into (pyhap/hap_server.py:758)
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:429)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 16496 (idle): "Thread-432"
recv (pyhap/hap_server.py:779)
recv_into (pyhap/hap_server.py:758)
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:429)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 18396 (idle): "Thread-443"
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:427)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 18721 (idle): "Thread-466"
recv (pyhap/hap_server.py:779)
recv_into (pyhap/hap_server.py:758)
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:429)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 18944 (idle): "Thread-477"
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:427)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 18955 (idle): "Thread-478"
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:427)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 18970 (idle): "Thread-479"
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:427)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 18981 (idle): "Thread-480"
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:427)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 19022 (idle): "Thread-483"
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:427)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 19034 (idle): "Thread-485"
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:427)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 20282 (idle): "Thread-513"
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:427)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 20409 (idle): "Thread-515"
recv (pyhap/hap_server.py:779)
recv_into (pyhap/hap_server.py:758)
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:429)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 20419 (idle): "Thread-516"
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:427)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 20689 (idle): "Thread-540"
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:427)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 20785 (idle): "Thread-544"
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:427)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 22407 (idle): "Thread-555"
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:427)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 22486 (idle): "Thread-562"
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:427)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 22519 (idle): "Thread-565"
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:427)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 22538 (idle): "Thread-566"
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:427)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 22569 (idle): "Thread-567"
recv (pyhap/hap_server.py:779)
recv_into (pyhap/hap_server.py:758)
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:429)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 23046 (idle): "Thread-578"
recv (pyhap/hap_server.py:779)
recv_into (pyhap/hap_server.py:758)
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:429)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 23529 (idle): "Thread-585"
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:427)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 24320 (idle): "Thread-594"
recv (pyhap/hap_server.py:779)
recv_into (pyhap/hap_server.py:758)
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:429)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 27280 (idle): "Thread-641"
recv (pyhap/hap_server.py:779)
recv_into (pyhap/hap_server.py:758)
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:429)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 30370 (idle): "Thread-668"
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:427)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 32823 (idle): "Thread-682"
recv (pyhap/hap_server.py:779)
recv_into (pyhap/hap_server.py:758)
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:429)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 35732 (idle): "Thread-717"
recv (pyhap/hap_server.py:779)
recv_into (pyhap/hap_server.py:758)
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:429)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 35858 (idle): "Thread-721"
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:427)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 36098 (idle): "Thread-727"
recv (pyhap/hap_server.py:779)
recv_into (pyhap/hap_server.py:758)
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:429)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 38244 (idle): "Thread-761"
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:427)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 39029 (idle): "Thread-782"
recv (pyhap/hap_server.py:779)
recv_into (pyhap/hap_server.py:758)
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:429)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 44418 (idle): "Thread-823"
recv (pyhap/hap_server.py:779)
recv_into (pyhap/hap_server.py:758)
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:429)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 44752 (idle): "Thread-829"
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:427)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 44992 (idle): "Thread-833"
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:427)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 46110 (idle): "Thread-851"
recv (pyhap/hap_server.py:779)
recv_into (pyhap/hap_server.py:758)
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:429)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 46270 (idle): "Thread-857"
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:427)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 46307 (idle): "Thread-860"
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:427)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 46322 (idle): "Thread-861"
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:427)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 46567 (idle): "Thread-867"
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:427)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 47511 (idle): "Thread-871"
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:427)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 47740 (idle): "Thread-874"
recv (pyhap/hap_server.py:779)
recv_into (pyhap/hap_server.py:758)
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:429)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
And this is a screenshot of top
I have 2 additional py-spy recordings
py-spy record --pid 233 --duration 30 --output /config/www/spy-0.116.2-cpu_higher_30s.svg
https://www.dropbox.com/s/vauf4zkokjpqxyz/spy-0.116.2-cpu_higher_30s.svg?dl=0
and
py-spy record --pid 233 --duration 60 --rate 75 --output /config/www/spy-0.116.2-cpu_higher_60s_rate75.svg
https://www.dropbox.com/s/tlfhm5qb0829tn2/spy-0.116.2-cpu_higher_60s_rate75.svg?dl=0
After a restart of HA, my CPU level is back to its post-0.115 level.
If your terminal can't handle all the output, store the output in a text file: py-spy dump --pid 233 > dump.txt
How often do you sample your CPU? The biggest thing showing up on the flame graph is … fetching states from the database which, among other things, are used to render that graph you're showing a screenshot of 😬
Can you send another callgrind.out from the profiler?
I can compare it to the last one and see what changed
Profile sent
How often do you sample your CPU? The biggest thing showing up on the flame graph is … fetching states from the database which, among other things, are used to render that graph you're showing a screenshot of 😬
The graph I'm showing is just cpu from https://www.home-assistant.io/integrations/systemmonitor/
Did you take the profile when the cpu had increased, or after the restart?
It looks functionally identical to the last one.
From the py-spy top it looks like something was fetching history data. Do you have any custom cards that poll history?
Did you take the profile when the cpu had increased, or after the restart?
It was taken after the restart, when this happens again, I will take another profile
Do you have any custom cards that poll history?
I use
custom:auto-entities, 8x
custom:search-card, 1x
custom:vertical-stack-in-card, 17x
custom:fold-entity-row, (a lot 96x)
custom:calendar-card, 5x
custom:state-switch, 1x
custom:bar-card, 22x
custom:mini-graph-card, 32x
custom:weather-card, 1x
custom:mini-media-player, 10x
custom:gauge-card, 1x
custom:power-wheel-card, 1x
custom:simple-thermostat, 5x
I don't know if there is one card that's fetching history data..
When it happens again, try closing all browser windows and the mobile app and see if the cpu goes back to baseline. If it’s a custom card polling history it should stop as soon as the ui is closed
When it happens again, try closing all browser windows and the mobile app and see if the cpu goes back to baseline. If it’s a custom card polling history it should stop as soon as the ui is closed
I'll take a profile first and then I'll close them all...
Your 32 mini graph cards all fetch history! You shouldn't profile with your frontend open while having custom cards loaded because that's not our code running.
0.117b1 is out
When I come home from work, I will update and send new py-spy's.
Are all your commits in the beta?
When I come home from work, I will update and send new py-spy's.
Are all your commits in the beta?
All the core commits are in beta.
There is one more small frontend tweak to reduce the number of database queries coming that I didn't get right the first time around ... https://github.com/home-assistant/frontend/pull/7427 ... this one probably doesn't make that much difference unless you are frequently clicking around the UI
I've updated to 0.117b1
Startup https://www.dropbox.com/s/enq8iimwi5c9y3o/spy-0.117b1-startup.svg?dl=0
10 min https://www.dropbox.com/s/pwmo8ncnkxhykhb/spy-0.117b1-10min.svg?dl=0
I had a lot of issues with the new template breaking changes; these automations do not work anymore:
alias: Netatmo Webhook
initial_state: 'on'
trigger:
  platform: webhook
  webhook_id: netatmo_wh
action:
  - service: mqtt.publish
    data_template:
      topic: 'netatmo/update'
      retain: true
      payload_template: >
        {{ trigger.json | tojson }}
alias: Scan for arrive and depart
initial_state: "on"
trigger:
  - platform: homeassistant
    event: start
  - platform: state
    entity_id: binary_sensor.entity_id_for_bt_scanning
    to: "on"
condition:
  condition: and
  conditions:
    - condition: template
      value_template: "{{as_timestamp(as_local(states.sensor.time.last_changed)) - as_timestamp(state_attr('automation.scan_for_arrive_and_depart', 'last_triggered')) | float > 45 }}"
action:
  - service_template: >
      {% if is_state('script.mqtt_arrive_depart_script', 'on') %}
        script.turn_off
      {% else %}
        script.turn_on
      {% endif %}
    data_template:
      entity_id: >
        {% if is_state('script.mqtt_arrive_depart_script', 'on') %}
          script.mqtt_arrive_depart_script
        {% else %}
          script.dummy
        {% endif %}
  - wait_template: "{{ is_state('script.dummy', 'off') }}"
  - service: script.mqtt_arrive_depart_script
    data_template:
      # {"identity":"{{trigger.entity_id.split(".")[1]}} {{as_local(states.sensor.time.last_changed).strftime("%H:%M")}}"}
      identity: >
        {% if trigger.platform == 'state' %}
          {"identity":"{{states("sensor.entity_id_for_bt_scanning")}} {{as_local(states.sensor.time.last_changed).strftime("%H:%M")}}"}
        {% else %}
          {"identity":"HA-start {{as_local(states.sensor.time.last_changed).strftime("%H:%M")}}"}
        {% endif %}
With legacy_templates: true
startup https://www.dropbox.com/s/acz3umh7q5hzdax/spy-0.117b1-startup-2.svg?dl=0
10 min https://www.dropbox.com/s/n1nkm3qp51wngsn/spy-0.117b1-10-2min.svg?dl=0
CPU isn't back to the pre-0.115.x level.
mqtt looks much better.
Can you send a profile as well?
Profile sent
Do you know how to change these templates for the new breaking changes?
{{ trigger.json | tojson }}
and
identity: >
  {% if trigger.platform == 'state' %}
    {"identity":"{{states("sensor.entity_id_for_bt_scanning")}} {{as_local(states.sensor.time.last_changed).strftime("%H:%M")}}"}
  {% else %}
    {"identity":"HA-start {{as_local(states.sensor.time.last_changed).strftime("%H:%M")}}"}
  {% endif %}
I didn't have a hand in the template changes this go-around, so I'm not up on how to change the templates.
Here is where the profile shows the time being allocated. MQTT is much better.
The key issue is still the massive number of state changed events. Can you send 60 seconds of logs with debug turned on for homeassistant.core?
It might also be helpful to get a profile with all browsers closed, as it should take the websocket noise out since it's still the bulk of the cpu time.
Can you send 60 seconds of logs with debug turned on for homeassistant.core?
Sent by mail...
Profile sent
Thanks. It's odd that there are quite a few forwarded events that are not state_changed even though the log is showing mostly state change events. Do you have any custom cards that are subscribing to the web socket? Have any of them changed since 0.115?
These are the only ones https://github.com/home-assistant/core/issues/40292#issuecomment-709399724
They are all at the latest version, maybe there was an update in between…
Btw startup still takes more than 1 minute
% grep '<Event' 1min.log |awk '{print $8}'|sort|uniq -c
17 automation_triggered[L]:
42 call_service[L]:
1 netatmo_event[L]:
6 script_started[L]:
1649 state_changed[L]:
1 zha_event[L]:
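For reference, the same tally can be computed in Python; the `<Event state_changed[L]: …>` pattern below is inferred from the grep output above and may need adjusting for other log formats:

```python
import re
from collections import Counter

def count_event_types(log_lines):
    """Tally event types from homeassistant.core debug log lines.

    The "<Event state_changed[L]: ...>" pattern is an assumption based
    on the grep output above; adjust the regex if your format differs.
    """
    counts = Counter()
    for line in log_lines:
        match = re.search(r"<Event (\w+)\[", line)
        if match:
            counts[match.group(1)] += 1
    return counts
```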
I'm thinking something is subscribing to state changed events and something else is subscribing to ALL events, which is why lines 70 and 82 are so prevalent in the websocket_api code.
The light.py time I couldn't find before is zha.light.async_get_state
Even with the browser closed, the websocket was still sending events.
Do you use almond cloud?
Even with the browser closed, the websocket was still sending events.
I had closed the browser I use for HA and all mobile apps on the iPhones in our house
Do you use almond cloud?
No
Is this useful?
Yes. It confirms the second all events listener, which is likely from the websocket.
I'm not sure which integration is doing it though, as the only one I know of that connects back to the websocket is almond.
https://github.com/aio-libs/aiohttp/pull/4993 is going to help this but it won't until we upgrade to aiohttp 3.7. It would be great to find out why there is something listening for all events when all browsers are closed.
Isn't there anything I can do to find out?
I just remembered I added logging of the connecting host in #40581
Can you turn on debug logging for homeassistant.components.websocket_api and see if you can find a line in the log from self._logger.debug("Connected from %s", request.remote)?
On a side note, with https://github.com/home-assistant/core/pull/41433 and https://github.com/home-assistant/core/pull/41796 there was a 73% reduction in time processing mqtt messages between 0.116 and 0.117
self._logger.debug("Connected from %s", request.remote)
I can't find a line like this...
Should I send you the whole log?
self._logger.debug("Connected from %s", request.remote)
I can't find a line like this...
Should I send you the whole log?
logger:
  default: warning
  logs:
    homeassistant.components.websocket_api: debug
To confirm, your logger config looks something like ^^ ?
default: warning
logs:
  # homeassistant.core: debug
  homeassistant.components.websocket_api: debug
Go ahead and email over the whole log. I'll dig through it. I might find something else helpful as well
Sent
I noticed that since the second websocket connection is using a different identifier, it can't use the cache so it has to serialize the message again. Let me see if I can improve that.
These are my addons...
Yes. It confirms the second all events listener, which is likely from the websocket.
I think Appdaemon was the culprit
I've turned it off
Sent you a new profile with appdaemon off
Was the appdaemon one with the browsers / mobile apps open?
I didn't have any browser open with appdaemon, the addon was just running
Looks like the additional websocket is NOT appdaemon as its still present.
Looks like the additional websocket is NOT appdaemon as its still present.
But now I have * (1 listener)
I'm sorry, but I have to go right now, I'll be back tomorrow after work...
I've got enough that I can work on improving the json serialize cache
https://github.com/home-assistant/core/pull/42226 improves the cache so it can be used between the state_changed and * listeners.
Something is definitely not right after the last OS update
Something is definitely not right after the last OS update
Please open a new issue and provide a profiler callgrind file and py-spy
0.117b2 is out with the adjusted websocket code
It looks like one of the state changed listeners is creating executor jobs.
Do you have any of these?
I use influxdb as an addon
https://github.com/gieljnssns/My-Hassio-config/blob/master/config/influxdb.yaml
Every state change event is creating an executor job to insert into the queue. I'm pretty sure we can mark these as callbacks so we don't need to jump into the executor to do the queue insert. We could probably also switch to SimpleQueue as it will be more performant now that python 3.7 is the minimum version.
It looks like this pattern has been copied all over the place in many of the integrations listed above.
I'll do some experimenting this weekend.
The unit of work we are processing with some of these state change events is far smaller than the cost of returning control to the event loop. We may be able to swap add/run job now that most listeners are called back from event.py instead of core.py so control is returned to the event loop where we expect larger callbacks to run. I'll do some testing with this on the weekend. It will be something for 0.118.
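A minimal sketch of the callback-plus-queue pattern described above, assuming the listener only needs to enqueue the event (the listener and worker names here are illustrative, not the integration's actual code):

```python
import queue
import threading

results = []
event_queue = queue.SimpleQueue()

def state_changed_listener(event):
    # Safe to run directly in the event loop as a callback:
    # SimpleQueue.put is reentrant and never blocks, so there is no
    # need to jump into the executor just to enqueue the event.
    event_queue.put(event)

def _worker():
    # A single background thread drains the queue and performs the
    # slow I/O (the InfluxDB write in the integration; a list append here).
    while True:
        item = event_queue.get()
        if item is None:  # sentinel to shut the worker down
            return
        results.append(item)

worker = threading.Thread(target=_worker, daemon=True)
worker.start()
```

`queue.SimpleQueue` (Python 3.7+) suits this because, unlike `queue.Queue`, its `put` cannot block and is safe to call from contexts like signal handlers and event-loop callbacks.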
I think I had originally mentioned influxdb as something that had changed in 0.115 but didn't dive in that much initially as we didn't think it was a factor. Will research those changes as well.
3am here so I'll have to wait until after some sleep to continue research
Looks like recorder was switched to a callback in #3812
Looks like lots of calls to adjust thread count are happening in the profile. We don't see the full scope of the impact in the profile because once it jumps into the executor it's not profiled anymore. It's probably maxing out the executor jobs given the number of state change events
I recall you had 64 sync workers so the executor is definitely being maxed out. If influx isn't waiting for a start event to begin processing events it's going to impact startup time.
... and it's not, so that is going to be a drag on startup
influxdb fixed in https://github.com/home-assistant/core/pull/42265
I have one py-spy on 0.117b2
10min https://www.dropbox.com/s/1seegc9cvxp26vj/spy-0.117b2-10min.svg?dl=0
And the profile is sent through mail
0.117b3 isn't visible for me yet
b3 is still building. It should be out shortly. It has the influxdb fix.
The websocket serialize cache is working as expected, and has cut the overhead almost in half 👍
Then I hope I'll find some time later, otherwise it is for tomorrow....
Sorry for the late answer, but this weekend I have to work 12h shifts.
I have some new py-spy recordings
Startup with appdaemon on and off
https://www.dropbox.com/s/3j67pv4vjhwr6v7/spy-0.117b3-startup-appdaemon_off.svg?dl=0
https://www.dropbox.com/s/xi2eprp29owbvht/spy-0.117b3-startup-appdaemon_on.svg?dl=0
10 minutes appdaemon on and off
https://www.dropbox.com/s/8u9xl5a45au71qu/spy-0.117b3-10min-appdaemon_on.svg?dl=0
https://www.dropbox.com/s/yt9sq35oxeaspip/spy-0.117b3-10min-appdaemon_off.svg?dl=0
The last change to the websocket code almost eliminated the difference between appdaemon running and not running. https://github.com/home-assistant/core/pull/42305 should take care of the last of it (I have some changes merged into aiohttp that are only in 3.7.x), so it won't be until 0.118.
hacs is impacting startup time. I've fixed it here https://github.com/hacs/integration/pull/1585 and it should be in the next hacs release.
When you get a chance, can you send a profile as well so I can check that the influxdb change was effective?
Profile sent
Influxdb performance issue confirmed fixed 👍
https://github.com/home-assistant/core/pull/42305 in 0.118 is the next change that will make a difference
I just found that there is sqlite3 db access in the event loop. This shouldn't be happening.
I opened a separate issue for this https://github.com/home-assistant/core/issues/42345
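One generic way to catch blocking calls like this is asyncio's debug mode, which logs a warning for any callback that runs longer than `loop.slow_callback_duration`. A minimal sketch, where the sleep stands in for a synchronous sqlite3 query running in the event loop:

```python
import asyncio
import logging
import time

def blocking_query():
    # Stand-in for a synchronous sqlite3 query executed in the event loop.
    time.sleep(0.2)

async def main():
    loop = asyncio.get_running_loop()
    loop.set_debug(True)
    loop.slow_callback_duration = 0.1  # warn when a callback blocks >100 ms
    loop.call_soon(blocking_query)
    # The "Executing ... took 0.200 seconds" warning is emitted on the
    # "asyncio" logger once the blocking callback finishes.
    await asyncio.sleep(0.3)

if __name__ == "__main__":
    logging.basicConfig(level=logging.WARNING)
    asyncio.run(main())
```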
I found a cache that wasn't working as expected that would impact startup time and fixed it here https://github.com/home-assistant/core/pull/42470
One to watch for 0.118: https://github.com/home-assistant/core/pull/42679
That one made it into 0.117.2
@bdraco
I've installed the new beta 0.118.0b0
I have 2 new py-spy recordings and I've sent you a new profile:
https://www.dropbox.com/s/a4tmcn17o3uckj4/spy-0.118b0-10min.svg?dl=0
https://www.dropbox.com/s/al7yzblykxri5uv/spy-0.118b0-startup.svg?dl=0
Py-spy dump
bash-5.0# py-spy dump --pid 236
Process 236: python3 -m homeassistant --config /config
Python v3.8.6 (/usr/local/bin/python3.8)
Thread 236 (idle): "MainThread"
select (selectors.py:468)
_run_once (asyncio/base_events.py:1823)
run_forever (asyncio/base_events.py:570)
run_until_complete (asyncio/base_events.py:603)
run (asyncio/runners.py:44)
run (homeassistant/runner.py:133)
main (homeassistant/__main__.py:312)
<module> (homeassistant/__main__.py:320)
_run_code (runpy.py:87)
_run_module_as_main (runpy.py:194)
Thread 257 (idle): "Thread-1"
dequeue (logging/handlers.py:1431)
_monitor (logging/handlers.py:1482)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 258 (idle): "SyncWorker_0"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 259 (idle): "SyncWorker_1"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 260 (idle): "SyncWorker_2"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 261 (idle): "SyncWorker_3"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 262 (idle): "SyncWorker_4"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 263 (idle): "SyncWorker_5"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 264 (idle): "SyncWorker_6"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 267 (idle): "Recorder"
run (homeassistant/components/recorder/__init__.py:357)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 268 (idle): "Thread-2"
dequeue (logging/handlers.py:1431)
_monitor (logging/handlers.py:1482)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 270 (idle): "influxdb"
wait (threading.py:306)
get (queue.py:179)
get_events_json (homeassistant/components/influxdb/__init__.py:527)
run (homeassistant/components/influxdb/__init__.py:577)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 271 (idle): "SyncWorker_7"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 272 (idle): "SyncWorker_8"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 273 (idle): "zeroconf-Engine-273"
run (zeroconf/__init__.py:1350)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 274 (idle): "zeroconf-Reaper_274"
wait (threading.py:306)
run (zeroconf/__init__.py:1476)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 277 (idle): "Thread-5"
loop (paho/mqtt/client.py:1167)
loop_forever (paho/mqtt/client.py:1779)
_thread_main (paho/mqtt/client.py:3452)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 278 (idle): "Thread-6"
wait (threading.py:306)
get (queue.py:179)
run (aiosqlite/core.py:92)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 280 (idle): "SyncWorker_9"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 281 (idle): "SyncWorker_10"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 282 (idle): "SyncWorker_11"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 283 (idle): "SyncWorker_12"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 284 (idle): "SyncWorker_13"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 285 (idle): "SyncWorker_14"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 286 (idle): "SyncWorker_15"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 287 (idle): "SyncWorker_16"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 292 (idle): "Thread-7"
select (selectors.py:415)
handle_request (socketserver.py:294)
run (pysonos/events.py:149)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 293 (idle): "Thread-8"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 294 (idle): "Thread-9"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 295 (idle): "Thread-10"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 296 (idle): "Thread-11"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 299 (idle): "Thread-14"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 300 (idle): "Thread-15"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 301 (idle): "Thread-16"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 302 (idle): "Thread-17"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 303 (idle): "Thread-18"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 304 (idle): "Thread-19"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 306 (idle): "Thread-21"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 308 (idle): "Thread-23"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 317 (idle): "Thread-32"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 319 (idle): "Thread-34"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 320 (idle): "Thread-35"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 321 (idle): "Thread-36"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 323 (idle): "Thread-38"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 324 (idle): "Thread-39"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 325 (idle): "Thread-40"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 326 (idle): "Thread-41"
wait (threading.py:306)
wait (threading.py:558)
run (pysonos/events.py:362)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 332 (idle): "SyncWorker_17"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 333 (idle): "SyncWorker_18"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 335 (idle): "SyncWorker_19"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 336 (idle): "SyncWorker_20"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 337 (idle): "SyncWorker_21"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 338 (idle): "SyncWorker_22"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 340 (idle): "SyncWorker_23"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 341 (idle): "SyncWorker_24"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 342 (idle): "SyncWorker_25"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 343 (idle): "SyncWorker_26"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 344 (idle): "SyncWorker_27"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 345 (idle): "SyncWorker_28"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 346 (idle): "SyncWorker_29"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 352 (idle): "SyncWorker_30"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 353 (idle): "SyncWorker_31"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 354 (idle): "SyncWorker_32"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 355 (idle): "SyncWorker_33"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 356 (idle): "SyncWorker_34"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 357 (idle): "SyncWorker_35"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 358 (idle): "SyncWorker_36"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 359 (idle): "SyncWorker_37"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 360 (idle): "SyncWorker_38"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 361 (idle): "SyncWorker_39"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 362 (idle): "SyncWorker_40"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 363 (idle): "SyncWorker_41"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 364 (idle): "SyncWorker_42"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 365 (idle): "SyncWorker_43"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 366 (idle): "SyncWorker_44"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 367 (idle): "SyncWorker_45"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 368 (idle): "SyncWorker_46"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 369 (idle): "SyncWorker_47"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 370 (idle): "SyncWorker_48"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 371 (idle): "SyncWorker_49"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 372 (idle): "SyncWorker_50"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 373 (idle): "SyncWorker_51"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 374 (idle): "SyncWorker_52"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 375 (idle): "SyncWorker_53"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 376 (idle): "SyncWorker_54"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 377 (idle): "SyncWorker_55"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 378 (idle): "SyncWorker_56"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 379 (idle): "SyncWorker_57"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 380 (idle): "SyncWorker_58"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 381 (idle): "SyncWorker_59"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 382 (idle): "SyncWorker_60"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 383 (idle): "SyncWorker_61"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 384 (idle): "SyncWorker_62"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 385 (idle): "SyncWorker_63"
_worker (concurrent/futures/thread.py:78)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 794 (idle): "zeroconf-ServiceBrowser__Volumio._tcp.local.-_http._tcp.local.-_ssh._tcp.local.-_ipp._tcp.local.-_xbmc-jsonrpc-h._tcp.local.-_api._udp.local.-_daap._tcp.local.-_dkapi._tcp.local.-_hap._tcp.local.-_spotify-connect._tcp.local.-_bond._tcp.local.-_ipps._tcp.local.-_plugwise._tcp.local.-_printer._tcp.local.-_elg._tcp.local.-_esphomelib._tcp.local.-_nut._tcp.local.-_viziocast._tcp.local.-_axis-video._tcp.local.-_googlecast._tcp.local.-_miio._udp.local.-_wled._tcp.local.-_homekit._tcp.local._794"
wait (threading.py:306)
wait (zeroconf/__init__.py:2432)
run (zeroconf/__init__.py:1726)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 796 (idle): "Thread-53"
send_events (pyhap/accessory_driver.py:483)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 797 (idle): "Thread-54"
select (selectors.py:415)
serve_forever (socketserver.py:232)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 798 (idle): "Thread-55"
recv (pyhap/hap_server.py:779)
recv_into (pyhap/hap_server.py:758)
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:429)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 799 (idle): "Thread-56"
recv (pyhap/hap_server.py:779)
recv_into (pyhap/hap_server.py:758)
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:429)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 810 (idle): "Thread-58"
recv (pyhap/hap_server.py:779)
recv_into (pyhap/hap_server.py:758)
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:429)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 812 (idle): "Thread-59"
recv (pyhap/hap_server.py:779)
recv_into (pyhap/hap_server.py:758)
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:429)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 857 (idle): "Thread-60"
recv (pyhap/hap_server.py:779)
recv_into (pyhap/hap_server.py:758)
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:429)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 866 (idle): "Thread-61"
recv (pyhap/hap_server.py:779)
recv_into (pyhap/hap_server.py:758)
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:429)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 867 (idle): "Thread-62"
recv (pyhap/hap_server.py:779)
recv_into (pyhap/hap_server.py:758)
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:429)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Thread 876 (idle): "Thread-63"
recv (pyhap/hap_server.py:779)
recv_into (pyhap/hap_server.py:758)
readinto (socket.py:669)
handle_one_request (http/server.py:395)
handle (http/server.py:429)
__init__ (socketserver.py:720)
__init__ (pyhap/hap_server.py:164)
finish_request (pyhap/hap_server.py:943)
process_request_thread (socketserver.py:650)
run (threading.py:870)
_bootstrap_inner (threading.py:932)
_bootstrap (threading.py:890)
Did the higher cpu persist past the initial restart period?
The majority of the time is writing to the websocket. In this profile, 1334 websocket messages were written.
In the last one there were only 402.
The good news is the websocket is definitely faster per message in 0.118dev.
I'm not sure what is causing the large increase in the number of events, which I assume are state_changed events.
Looks like the executor is still being maxed out, which appears to be coming from HACS. I imagine HACS will have a release in the near future with the performance improvements I mentioned above, once https://github.com/hacs/integration/pull/1642 is worked out.
Top callbacks:
814x mqtt https://www.home-assistant.io/integrations/mqtt/
325x utility meter sensor https://www.home-assistant.io/integrations/utility_meter/
350x integration sensors https://www.home-assistant.io/integrations/integration/
352x template sensors https://www.home-assistant.io/integrations/template/
zha running in the event loop is confirmed fixed 👍 (fixed by https://github.com/home-assistant/core/pull/42679)
It looks like there is a ~22% increase in template processing overhead from the native types (calling _parse_result in template.py). Still not too much, as all of those calls combined are still less than one call to async_update of the sun sensor.
This component is using a bit more time than I would expect.
https://github.com/mudape/iphonedetect/blob/master/custom_components/iphonedetect/device_tracker.py
It would be much faster if it switched to using something like pyroute2, since there wouldn't be any fork/exec() overhead.
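The fork/exec() overhead claim is easy to see with a stdlib-only micro-benchmark. This is a rough sketch, not the integration's actual code; `poll_via_subprocess` stands in for shelling out to a tool like `ip neigh` on every poll, and the absolute numbers are machine-dependent.

```python
import subprocess
import sys
import timeit

def poll_via_subprocess() -> int:
    """Spawn a child process per poll (pays fork/exec cost every time)."""
    return subprocess.run([sys.executable, "-c", "pass"]).returncode

def poll_in_process() -> int:
    """Do the equivalent work in-process (no fork/exec)."""
    return 0

# A handful of iterations is enough to show the gap.
sub_time = timeit.timeit(poll_via_subprocess, number=5)
in_time = timeit.timeit(poll_in_process, number=5)
print(f"subprocess: {sub_time:.3f}s, in-process: {in_time:.6f}s")
```

On a typical host the subprocess path is several orders of magnitude slower per call, which is the cost the pyroute2 switch avoids.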
Did the higher cpu persist past the initial restart period?
My Core CPU is now between 0.1% and 6%
The next thing to watch for is the HACS update. There was also some talk about coalescing the inserts in zigpy; however, I don't have a handle on where that went.
Top callbacks:
814x mqtt https://www.home-assistant.io/integrations/mqtt/
325x utility meter sensor https://www.home-assistant.io/integrations/utility_meter/
350x integration sensors https://www.home-assistant.io/integrations/integration/
352x template sensors https://www.home-assistant.io/integrations/template/
It corresponds to my subjective perception - utility meters + integration eat a huge amount of resources.
And I have a lot of them:
integration: 23x, and more coming
utility_meter: 75x, and more coming
Sometimes I'm seeing warnings from the integration sensor in my log:
Logger: homeassistant.components.integration.sensor
Source: components/integration/sensor.py:170
Integration: integration (documentation, issues)
First occurred: 12:22:02 (4 occurrences)
Last logged: 12:43:03
Invalid state (2.0 > ): [<class 'decimal.ConversionSyntax'>]
Invalid state ( > 1.0): [<class 'decimal.ConversionSyntax'>]
Invalid state (0.0 > ): [<class 'decimal.ConversionSyntax'>]
Invalid state ( > 0.0): [<class 'decimal.ConversionSyntax'>]
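The `ConversionSyntax` in those warnings is what the stdlib `decimal` module signals when a state string isn't a valid number; a likely trigger (this is an assumption from the log shape, not confirmed from the sensor code) is a source sensor briefly reporting `unknown`, `unavailable`, or an empty state. A minimal sketch of the failure mode:

```python
from decimal import Decimal, InvalidOperation

# A numeric state string converts cleanly:
print(Decimal("2.0"))

# A non-numeric state does not; decimal reports the ConversionSyntax
# condition by raising InvalidOperation, which matches the log output.
failures = []
for bad_state in ("unknown", "unavailable", ""):
    try:
        Decimal(bad_state)
    except InvalidOperation:
        failures.append(bad_state)
print(failures)
```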
Would you please open a new issue for that one?
Will do tomorrow....
Not 100% sure what is going on here yet, but it seems unexpected that so many of these get called later.
Edit: I guess there really are that many updates happening in 60 seconds. I think we can bypass some of this for groups though since they will never be slow and shouldn't ever throw.
Now I see this in my logs
Logger: aiohttp.server
Source: components/websocket_api/http.py:81
First occurred: 12:24:31 (1 occurrences)
Last logged: 12:24:31
Error handling request
Traceback (most recent call last):
File "/usr/local/lib/python3.8/site-packages/aiohttp/web_protocol.py", line 422, in _handle_request
resp = await self._request_handler(request)
File "/usr/local/lib/python3.8/site-packages/aiohttp/web_app.py", line 499, in _handle
resp = await handler(request)
File "/usr/local/lib/python3.8/site-packages/aiohttp/web_middlewares.py", line 118, in impl
return await handler(request)
File "/usr/src/homeassistant/homeassistant/components/http/forwarded.py", line 71, in forwarded_middleware
return await handler(request)
File "/usr/src/homeassistant/homeassistant/components/http/request_context.py", line 18, in request_context_middleware
return await handler(request)
File "/usr/src/homeassistant/homeassistant/components/http/ban.py", line 72, in ban_middleware
return await handler(request)
File "/usr/src/homeassistant/homeassistant/components/http/auth.py", line 127, in auth_middleware
return await handler(request)
File "/usr/src/homeassistant/homeassistant/components/http/view.py", line 129, in handle
result = await result
File "/usr/src/homeassistant/homeassistant/components/websocket_api/http.py", line 42, in get
return await WebSocketHandler(request.app["hass"], request).async_handle()
File "/usr/src/homeassistant/homeassistant/components/websocket_api/http.py", line 233, in async_handle
await self._writer_task
File "/usr/src/homeassistant/homeassistant/components/websocket_api/http.py", line 81, in _writer
await self.wsock.send_str(message)
File "/usr/local/lib/python3.8/site-packages/aiohttp/web_ws.py", line 300, in send_str
await self._writer.send(data, binary=False, compress=compress)
File "/usr/local/lib/python3.8/site-packages/aiohttp/http_websocket.py", line 687, in send
await self._send_frame(message, WSMsgType.TEXT, compress)
File "/usr/local/lib/python3.8/site-packages/aiohttp/http_websocket.py", line 656, in _send_frame
await self.protocol._drain_helper()
File "/usr/local/lib/python3.8/site-packages/aiohttp/base_protocol.py", line 87, in _drain_helper
await waiter
File "/usr/local/lib/python3.8/asyncio/selector_events.py", line 935, in _write_ready
n = self._sock.send(self._buffer)
BrokenPipeError: [Errno 32] Broken pipe
Do I have to open a new issue?
Yes please. It likely just means the connection wasn't closed cleanly. I doubt it actually affects anything.
Thanks @bdraco
Long thread, this; I don't know who has high CPU usage with iphonedetect.
I've released a beta with pyroute2, please test :)
@gieljnssns Can you give the above a shot ^
@bdraco
0.118.0b2 with the iphonedetect beta:
https://www.dropbox.com/s/ip9w1fucavrmq5k/spy-0.118b2-10min.svg?dl=0
Profile sent by mail
Thanks. The latency in the event loop is better because it doesn't have to suspend to execute the subprocess.
Unfortunately, the overhead of parsing the netlink messages is using more CPU time, since the Python parser is slower than the C version and there must be a large number of neighbors on your host. The overall result was likely better for me since I don't have as many neighbors on mine.
pyroute2 has filtering (https://docs.pyroute2.org/iproute.html#pyroute2.iproute.IPRouteMixin.get_neighbors), so we may be able to use that to avoid some of the parse overhead for a net win.
Potentially something like this might be faster:
ip.get_neighbours(dst=xxxx, match=lambda x: x['state'] == 2)
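The `state == 2` in that match lambda is the kernel's NUD_REACHABLE neighbour state. Since pyroute2 may not be installed everywhere, here is a stand-alone sketch of the same predicate-filtering idea over already-parsed records; the record dicts and addresses are made up for illustration.

```python
NUD_REACHABLE = 2  # kernel neighbour state "reachable" (NUD_REACHABLE = 0x02)

# Hypothetical parsed neighbour records, shaped loosely like pyroute2's output.
neighbours = [
    {"dst": "192.168.208.5", "state": NUD_REACHABLE},
    {"dst": "192.168.208.9", "state": 4},  # NUD_STALE
    {"dst": "192.168.208.17", "state": NUD_REACHABLE},
]

def match(x):
    """Same shape as the match= predicate passed to get_neighbours()."""
    return x["state"] == NUD_REACHABLE

reachable = [n["dst"] for n in neighbours if match(n)]
print(reachable)
```

In pyroute2 the predicate runs after parsing, so the real win comes from combining it with a narrower query where the kernel supports one.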
I'll do some testing today if I don't run out of time.
cc @mudape
from pyroute2 import IPRoute
import cProfile

ip = IPRoute()
pr = cProfile.Profile()
pr.enable()
ip.neigh("dump", dst="192.168.208.5")  # the call under test: dump the neighbour entry
pr.disable()
pr.create_stats()
pr.dump_stats("pyroute.cprof")  # inspect later with pstats
For testing
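The dump file written above can be read back with the stdlib pstats module. A self-contained sketch (using a stand-in function so it runs without pyroute2; the `lookup` name and the temp path are illustrative):

```python
import cProfile
import os
import pstats
import tempfile

def lookup() -> int:
    """Stand-in for the ip.neigh('dump', ...) call being profiled."""
    return sum(range(10_000))

pr = cProfile.Profile()
pr.enable()
lookup()
pr.disable()

# Dump and reload, the same way pyroute.cprof would be inspected.
path = os.path.join(tempfile.mkdtemp(), "pyroute.cprof")
pr.dump_stats(path)
stats = pstats.Stats(path)
stats.sort_stats("cumulative").print_stats(5)  # top 5 entries by cumulative time
```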
It looks like this is what we need to limit the query https://github.com/svinota/pyroute2/pull/723
Well, it looks like that is only supported on newer kernels, so there is no way to get a single record on older kernels, which means the only option is to dump the whole table and filter. That is unfortunate, as I don't think we can make this faster without a kernel upgrade.
Even on 4.4.59, get isn't implemented:
root@ha-dev:~/pyroute2# ip neigh get 192.168.208.1
Sorry, "neigh get" is not implemented :-(
I did find some more polling that shouldn't be happening in the latest profiles. I've cleaned that up in 0.119dev.
HACS beta 1.8.0 has been tagged https://github.com/hacs/integration/releases with the performance cleanups I mentioned above.
py-spy with the HACS beta:
https://www.dropbox.com/s/jbujvcr5z80prot/spy-0.118b2-10min-hacs_beta.svg?dl=0
Profile sent
There are still a lot of calls to async_add_job that don't pre-check the callable type. It's likely this block in mqtt:

def _mqtt_on_message(self, _mqttc, _userdata, msg) -> None:
    """Message received callback."""
    self.hass.add_job(self._mqtt_handle_message, msg)
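"Pre-checking the callable type" means classifying a callback once when it is registered, instead of introspecting it on every dispatched message. A minimal sketch of that idea, roughly what Home Assistant's job wrapper does; `PreCheckedJob` and `dispatch` are made-up names, not the actual API:

```python
import asyncio
from typing import Any, Callable

results = []

class PreCheckedJob:
    """Classify the target once at creation time, not on every dispatch."""

    def __init__(self, target: Callable[..., Any]) -> None:
        self.target = target
        # The introspection cost is paid exactly once, here.
        self.is_coroutine = asyncio.iscoroutinefunction(target)

def dispatch(loop: asyncio.AbstractEventLoop, job: PreCheckedJob, *args: Any) -> None:
    # Per-call cost is now a cheap attribute check, not iscoroutinefunction().
    if job.is_coroutine:
        loop.create_task(job.target(*args))
    else:
        loop.call_soon(job.target, *args)

async def async_cb(x):
    results.append(("async", x))

def sync_cb(x):
    results.append(("sync", x))

async def main():
    loop = asyncio.get_running_loop()
    dispatch(loop, PreCheckedJob(async_cb), 1)
    dispatch(loop, PreCheckedJob(sync_cb), 2)
    await asyncio.sleep(0)  # give the scheduled calls a chance to run

asyncio.run(main())
print(results)
```

With a hot path like an MQTT on-message callback, moving the check out of the per-message path saves that introspection on every single publish.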
Thanks for the additional py-spy 👍 I've made more progress tracking down CPU issues here than in any other GitHub issue ever opened.