Core: miflora sensors reported as 'Unavailable'.

Created on 29 Dec 2019 · 86 Comments · Source: home-assistant/core

Home Assistant release with the issue:

0.103.4

Last working Home Assistant release (if known):
0.102

Operating environment (Hass.io/Docker/Windows/etc.):

raspbian/virtualenv on pi4
python_version 3.7.3

Integration:

https://www.home-assistant.io/integrations/miflora/

Description of problem:
Since https://github.com/home-assistant/home-assistant/pull/29276, miflora sensors are displayed as 'Unavailable' when they can't be reached. Because BtLE is quite unreliable, they are frequently shown as 'Unavailable' even with a good connection.

As values tend to change slowly, a day-old stale value is preferable to 'Unavailable'.


All six of my devices are within 4m of the pi4 which runs home-assistant. Even the closest devices have gaps in the graphs where they are 'Unavailable'.

Problem-relevant configuration.yaml entries (fill out even if it seems unimportant):


Traceback (if applicable):


Additional information:

Suggestion: Allow users to set a threshold after which the sensor is considered 'Unavailable'. Only mark the sensor 'Unavailable' if the user has configured a threshold.
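The suggested behaviour can be sketched in plain Python (the function name and the 24-hour threshold are illustrative only; the integration has no such option in this release):

```python
from datetime import datetime, timedelta

# Hypothetical user-configured threshold; not an actual miflora option.
STALE_AFTER = timedelta(hours=24)

def reported_state(last_value, last_success, now):
    """Keep serving the last known value until it is older than the threshold."""
    if last_value is not None and now - last_success <= STALE_AFTER:
        return last_value
    return "unavailable"
```

With this, a temporarily unreachable sensor keeps showing its day-old reading, and only a sensor silent for longer than the threshold becomes 'Unavailable'.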

miflora

Most helpful comment

Can we await confirmation of the fix before closing the issue please?

All 86 comments

Hey there @danielhiversen, @ChristianKuehnel, mind taking a look at this issue as it's been labeled with an integration (miflora) you are listed as a codeowner for? Thanks!

I have this too; I've also noticed that this issue came about with 0.103.

Some findings:

  • sensors will remain 'Unavailable' until the host is rebooted
  • when attempting to run a Bluetooth scan from the CLI using the bluetoothctl > scan on command, the following is returned:

(screenshot: "Failed to start discovery: org.bluez.Error.InProgress")

workaround in cli:

  • running a bluetoothctl > power off, then power on allows a scan to be performed again, although I've not yet verified whether the sensors come back online; will update when/if this occurs.

I'm facing this issue on Hassio 0.103.4 on a Raspberry Pi 3b.

This is related to the change https://github.com/home-assistant/home-assistant/pull/29276 by @ferbar. I suppose this was his intention.

@ferbar I would recommend only showing the error message after several attempts have failed or after no data has been received for several hours. The current implementation seems a bit confusing...

@ChristianKuehnel I was asking for help in the forum about whether there are intended ways of solving this; no success until now.
https://community.home-assistant.io/t/entity-remember-last-state-for-an-hour-and-then-go-to-unavailable-if-update-fails-for-a-long-period/153351

If everybody agrees I will implement the timeout in the miflora code (sorry, I think this is the wrong place to discuss it).
Another workaround would be to create a template sensor which always returns "available".
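One way to realize that template-sensor workaround (entity names are hypothetical; the self-reference holds the last known value whenever the source entity goes unavailable):

```yaml
sensor:
  - platform: template
    sensors:
      plant_moisture_held:
        friendly_name: "Plant moisture (last known)"
        unit_of_measurement: "%"
        value_template: >-
          {% if states('sensor.my_plant_moisture') not in ['unavailable', 'unknown'] %}
            {{ states('sensor.my_plant_moisture') }}
          {% else %}
            {{ states('sensor.plant_moisture_held') }}
          {% endif %}
```

This keeps the wrapped sensor permanently available, at the cost of never showing when the underlying reading has gone stale.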

I'm facing the exact same issues ("Failed to start discovery: org.bluez.Error.InProgress") as @talondnb.
Running 0.103.4 in docker, and added miflora debugging to my logger with:

logger:
  default: info
  logs:
    homeassistant.components.sensor.miflora: debug
    btlewrap: debug
    miflora: debug

In my Home Assistant container logs via portainer I see the following information:

2020-01-04 12:06:41 INFO (SyncWorker_2) [homeassistant.components.miflora.sensor] Polling error BluetoothBackendException: Could not read data from Mi Flora sensor C4:7C:8D:66:19:4D
2020-01-04 12:06:41 DEBUG (SyncWorker_15) [btlewrap.bluepy] Call to <function BluepyBackend.connect at 0x715a6540> failed, try 1 of 3
2020-01-04 12:06:41 DEBUG (SyncWorker_15) [btlewrap.bluepy] Call to <function BluepyBackend.connect at 0x715a6540> failed, try 2 of 3
2020-01-04 12:06:41 DEBUG (SyncWorker_15) [btlewrap.bluepy] Call to <function BluepyBackend.connect at 0x715a6540> failed, try 3 of 3
2020-01-04 12:06:41 INFO (SyncWorker_15) [homeassistant.components.miflora.sensor] Polling error BluetoothBackendException:
2020-01-04 12:06:41 DEBUG (SyncWorker_4) [miflora.miflora_poller] Using cache (0:15:00.432023 < 0:20:00)
2020-01-04 12:06:41 INFO (SyncWorker_4) [homeassistant.components.miflora.sensor] Polling error BluetoothBackendException: Could not read data from Mi Flora sensor C4:7C:8D:66:19:4D
2020-01-04 12:06:41 DEBUG (SyncWorker_12) [miflora.miflora_poller] Using cache (0:15:00.441903 < 0:20:00)
2020-01-04 12:06:41 INFO (SyncWorker_12) [homeassistant.components.miflora.sensor] Polling error BluetoothBackendException: Could not read data from Mi Flora sensor C4:7C:8D:66:19:4D
2020-01-04 12:16:52 INFO (MainThread) [hacs.factory] Prosessing 11 tasks
2020-01-04 12:17:00 INFO (MainThread) [hacs.factory] Task prosessing of 11 tasks completed in 8 seconds
2020-01-04 12:23:27 ERROR (MainThread) [metno] https://aa015h6buqvih86i1.api.met.no/weatherapi/locationforecast/1.9/ returned
2020-01-04 12:23:27 ERROR (MainThread) [homeassistant.components.met.weather] Retrying in 15 minutes
2020-01-04 12:26:42 DEBUG (SyncWorker_10) [miflora.miflora_poller] Filling cache with new sensor data.
2020-01-04 12:26:42 DEBUG (SyncWorker_10) [btlewrap.bluepy] Call to <function BluepyBackend.connect at 0x715a6540> failed, try 1 of 3
2020-01-04 12:26:42 DEBUG (SyncWorker_10) [btlewrap.bluepy] Call to <function BluepyBackend.connect at 0x715a6540> failed, try 2 of 3
2020-01-04 12:26:42 DEBUG (SyncWorker_10) [btlewrap.bluepy] Call to <function BluepyBackend.connect at 0x715a6540> failed, try 3 of 3
2020-01-04 12:26:42 INFO (SyncWorker_10) [homeassistant.components.miflora.sensor] Polling error BluetoothBackendException:
2020-01-04 12:26:42 DEBUG (SyncWorker_19) [miflora.miflora_poller] Using cache (0:15:00.019490 < 0:20:00)
2020-01-04 12:26:42 INFO (SyncWorker_19) [homeassistant.components.miflora.sensor] Polling error BluetoothBackendException: Could not read data from Mi Flora sensor C4:7C:8D:66:19:4D
2020-01-04 12:26:42 DEBUG (SyncWorker_8) [btlewrap.bluepy] Call to <function BluepyBackend.connect at 0x715a6540> failed, try 1 of 3
2020-01-04 12:26:42 DEBUG (SyncWorker_8) [btlewrap.bluepy] Call to <function BluepyBackend.connect at 0x715a6540> failed, try 2 of 3
2020-01-04 12:26:42 DEBUG (SyncWorker_8) [btlewrap.bluepy] Call to <function BluepyBackend.connect at 0x715a6540> failed, try 3 of 3
2020-01-04 12:26:42 INFO (SyncWorker_8) [homeassistant.components.miflora.sensor] Polling error BluetoothBackendException:
2020-01-04 12:26:42 DEBUG (SyncWorker_4) [miflora.miflora_poller] Using cache (0:15:00.476614 < 0:20:00)
2020-01-04 12:26:42 INFO (SyncWorker_4) [homeassistant.components.miflora.sensor] Polling error BluetoothBackendException: Could not read data from Mi Flora sensor C4:7C:8D:66:19:4D
2020-01-04 12:26:42 DEBUG (SyncWorker_14) [miflora.miflora_poller] Using cache (0:15:00.489555 < 0:20:00)
2020-01-04 12:26:42 INFO (SyncWorker_14) [homeassistant.components.miflora.sensor] Polling error BluetoothBackendException: Could not read data from Mi Flora sensor C4:7C:8D:66:19:4D
2020-01-04 12:46:43 DEBUG (SyncWorker_14) [btlewrap.bluepy] Call to <function BluepyBackend.connect at 0x715a6540> failed, try 1 of 3
2020-01-04 12:46:43 DEBUG (SyncWorker_14) [btlewrap.bluepy] Call to <function BluepyBackend.connect at 0x715a6540> failed, try 2 of 3
2020-01-04 12:46:43 DEBUG (SyncWorker_14) [btlewrap.bluepy] Call to <function BluepyBackend.connect at 0x715a6540> failed, try 3 of 3
2020-01-04 12:46:43 INFO (SyncWorker_14) [homeassistant.components.miflora.sensor] Polling error BluetoothBackendException:
2020-01-04 12:46:43 DEBUG (SyncWorker_0) [miflora.miflora_poller] Filling cache with new sensor data.
2020-01-04 12:46:43 DEBUG (SyncWorker_0) [btlewrap.bluepy] Call to <function BluepyBackend.connect at 0x715a6540> failed, try 1 of 3
2020-01-04 12:46:43 DEBUG (SyncWorker_0) [btlewrap.bluepy] Call to <function BluepyBackend.connect at 0x715a6540> failed, try 2 of 3
2020-01-04 12:46:43 DEBUG (SyncWorker_0) [btlewrap.bluepy] Call to <function BluepyBackend.connect at 0x715a6540> failed, try 3 of 3
2020-01-04 12:46:43 INFO (SyncWorker_0) [homeassistant.components.miflora.sensor] Polling error BluetoothBackendException:
2020-01-04 12:46:43 DEBUG (SyncWorker_8) [miflora.miflora_poller] Using cache (0:15:00.013403 < 0:20:00)
2020-01-04 12:46:43 INFO (SyncWorker_8) [homeassistant.components.miflora.sensor] Polling error BluetoothBackendException: Could not read data from Mi Flora sensor C4:7C:8D:66:19:4D
2020-01-04 12:46:43 DEBUG (SyncWorker_13) [miflora.miflora_poller] Using cache (0:15:00.035718 < 0:20:00)
2020-01-04 12:46:43 INFO (SyncWorker_13) [homeassistant.components.miflora.sensor] Polling error BluetoothBackendException: Could not read data from Mi Flora sensor C4:7C:8D:66:19:4D
2020-01-04 12:46:43 DEBUG (SyncWorker_12) [miflora.miflora_poller] Using cache (0:15:00.047539 < 0:20:00)
2020-01-04 12:46:43 INFO (SyncWorker_12) [homeassistant.components.miflora.sensor] Polling error BluetoothBackendException: Could not read data from Mi Flora sensor C4:7C:8D:66:19:4D
2020-01-04 12:46:53 INFO (MainThread) [hacs.factory] Prosessing 11 tasks
2020-01-04 12:47:01 INFO (MainThread) [hacs.factory] Task prosessing of 11 tasks completed in 9 seconds
2020-01-04 13:06:44 DEBUG (SyncWorker_6) [miflora.miflora_poller] Filling cache with new sensor data.
2020-01-04 13:06:44 DEBUG (SyncWorker_6) [btlewrap.bluepy] Call to <function BluepyBackend.connect at 0x715a6540> failed, try 1 of 3
2020-01-04 13:06:44 DEBUG (SyncWorker_6) [btlewrap.bluepy] Call to <function BluepyBackend.connect at 0x715a6540> failed, try 2 of 3
2020-01-04 13:06:44 DEBUG (SyncWorker_6) [btlewrap.bluepy] Call to <function BluepyBackend.connect at 0x715a6540> failed, try 3 of 3
2020-01-04 13:06:44 INFO (SyncWorker_6) [homeassistant.components.miflora.sensor] Polling error BluetoothBackendException:
2020-01-04 13:06:44 DEBUG (SyncWorker_13) [miflora.miflora_poller] Using cache (0:15:00.013333 < 0:20:00)
2020-01-04 13:06:44 INFO (SyncWorker_13) [homeassistant.components.miflora.sensor] Polling error BluetoothBackendException: Could not read data from Mi Flora sensor C4:7C:8D:66:19:4D
2020-01-04 13:06:44 DEBUG (SyncWorker_3) [btlewrap.bluepy] Call to <function BluepyBackend.connect at 0x715a6540> failed, try 1 of 3
2020-01-04 13:06:44 DEBUG (SyncWorker_3) [btlewrap.bluepy] Call to <function BluepyBackend.connect at 0x715a6540> failed, try 2 of 3
2020-01-04 13:06:44 DEBUG (SyncWorker_3) [btlewrap.bluepy] Call to <function BluepyBackend.connect at 0x715a6540> failed, try 3 of 3
2020-01-04 13:06:44 INFO (SyncWorker_3) [homeassistant.components.miflora.sensor] Polling error BluetoothBackendException:
2020-01-04 13:06:44 DEBUG (SyncWorker_7) [miflora.miflora_poller] Using cache (0:15:00.475562 < 0:20:00)
2020-01-04 13:06:44 INFO (SyncWorker_7) [homeassistant.components.miflora.sensor] Polling error BluetoothBackendException: Could not read data from Mi Flora sensor C4:7C:8D:66:19:4D
2020-01-04 13:06:45 DEBUG (SyncWorker_4) [miflora.miflora_poller] Using cache (0:15:00.485941 < 0:20:00)
2020-01-04 13:06:45 INFO (SyncWorker_4) [homeassistant.components.miflora.sensor] Polling error BluetoothBackendException: Could not read data from Mi Flora sensor C4:7C:8D:66:19:4D
2020-01-04 13:16:54 INFO (MainThread) [hacs.factory] Prosessing 11 tasks
2020-01-04 13:17:02 INFO (MainThread) [hacs.factory] Task prosessing of 11 tasks completed in 9 seconds
2020-01-04 13:26:44 DEBUG (SyncWorker_4) [miflora.miflora_poller] Filling cache with new sensor data.
2020-01-04 13:26:44 DEBUG (SyncWorker_4) [btlewrap.bluepy] Call to <function BluepyBackend.connect at 0x715a6540> failed, try 1 of 3
2020-01-04 13:26:44 DEBUG (SyncWorker_4) [btlewrap.bluepy] Call to <function BluepyBackend.connect at 0x715a6540> failed, try 2 of 3
2020-01-04 13:26:44 DEBUG (SyncWorker_4) [btlewrap.bluepy] Call to <function BluepyBackend.connect at 0x715a6540> failed, try 3 of 3
2020-01-04 13:26:44 INFO (SyncWorker_4) [homeassistant.components.miflora.sensor] Polling error BluetoothBackendException:
2020-01-04 13:26:44 DEBUG (SyncWorker_4) [btlewrap.bluepy] Call to <function BluepyBackend.connect at 0x715a6540> failed, try 1 of 3
2020-01-04 13:26:44 DEBUG (SyncWorker_4) [btlewrap.bluepy] Call to <function BluepyBackend.connect at 0x715a6540> failed, try 2 of 3
2020-01-04 13:26:44 DEBUG (SyncWorker_4) [btlewrap.bluepy] Call to <function BluepyBackend.connect at 0x715a6540> failed, try 3 of 3
2020-01-04 13:26:44 INFO (SyncWorker_4) [homeassistant.components.miflora.sensor] Polling error BluetoothBackendException:
2020-01-04 13:26:44 DEBUG (SyncWorker_15) [miflora.miflora_poller] Using cache (0:15:00.455645 < 0:20:00)
2020-01-04 13:26:44 INFO (SyncWorker_15) [homeassistant.components.miflora.sensor] Polling error BluetoothBackendException: Could not read data from Mi Flora sensor C4:7C:8D:66:19:4D
2020-01-04 13:26:44 DEBUG (SyncWorker_12) [miflora.miflora_poller] Using cache (0:15:00.466251 < 0:20:00)
2020-01-04 13:26:44 INFO (SyncWorker_12) [homeassistant.components.miflora.sensor] Polling error BluetoothBackendException: Could not read data from Mi Flora sensor C4:7C:8D:66:19:4D
2020-01-04 13:26:44 DEBUG (SyncWorker_18) [miflora.miflora_poller] Using cache (0:15:00.473421 < 0:20:00)
2020-01-04 13:26:44 INFO (SyncWorker_18) [homeassistant.components.miflora.sensor] Polling error BluetoothBackendException: Could not read data from Mi Flora sensor C4:7C:8D:66:19:4D
2020-01-04 13:46:45 DEBUG (SyncWorker_15) [btlewrap.bluepy] Call to <function BluepyBackend.connect at 0x715a6540> failed, try 1 of 3
2020-01-04 13:46:45 DEBUG (SyncWorker_15) [btlewrap.bluepy] Call to <function BluepyBackend.connect at 0x715a6540> failed, try 2 of 3
2020-01-04 13:46:45 DEBUG (SyncWorker_15) [btlewrap.bluepy] Call to <function BluepyBackend.connect at 0x715a6540> failed, try 3 of 3
2020-01-04 13:46:45 INFO (SyncWorker_15) [homeassistant.components.miflora.sensor] Polling error BluetoothBackendException:
2020-01-04 13:46:45 DEBUG (SyncWorker_16) [miflora.miflora_poller] Filling cache with new sensor data.
2020-01-04 13:46:45 DEBUG (SyncWorker_16) [btlewrap.bluepy] Call to <function BluepyBackend.connect at 0x715a6540> failed, try 1 of 3
2020-01-04 13:46:45 DEBUG (SyncWorker_16) [btlewrap.bluepy] Call to <function BluepyBackend.connect at 0x715a6540> failed, try 2 of 3
2020-01-04 13:46:45 DEBUG (SyncWorker_16) [btlewrap.bluepy] Call to <function BluepyBackend.connect at 0x715a6540> failed, try 3 of 3
2020-01-04 13:46:45 INFO (SyncWorker_16) [homeassistant.components.miflora.sensor] Polling error BluetoothBackendException:
2020-01-04 13:46:45 DEBUG (SyncWorker_2) [miflora.miflora_poller] Using cache (0:15:00.013082 < 0:20:00)
2020-01-04 13:46:45 INFO (SyncWorker_2) [homeassistant.components.miflora.sensor] Polling error BluetoothBackendException: Could not read data from Mi Flora sensor C4:7C:8D:66:19:4D
2020-01-04 13:46:45 DEBUG (SyncWorker_5) [miflora.miflora_poller] Using cache (0:15:00.081467 < 0:20:00)
2020-01-04 13:46:45 INFO (SyncWorker_5) [homeassistant.components.miflora.sensor] Polling error BluetoothBackendException: Could not read data from Mi Flora sensor C4:7C:8D:66:19:4D
2020-01-04 13:46:45 DEBUG (SyncWorker_11) [miflora.miflora_poller] Using cache (0:15:00.090209 < 0:20:00)
2020-01-04 13:46:45 INFO (SyncWorker_11) [homeassistant.components.miflora.sensor] Polling error BluetoothBackendException: Could not read data from Mi Flora sensor C4:7C:8D:66:19:4D
2020-01-04 13:46:55 INFO (MainThread) [hacs.factory] Prosessing 11 tasks
2020-01-04 13:47:03 INFO (MainThread) [hacs.factory] Task prosessing of 11 tasks completed in 9 seconds
2020-01-04 14:06:46 DEBUG (SyncWorker_0) [miflora.miflora_poller] Filling cache with new sensor data.
2020-01-04 14:06:48 DEBUG (SyncWorker_0) [miflora.miflora_poller] Received result for handle 56: 62 2A 33 2E 32 2E 31
2020-01-04 14:06:49 DEBUG (SyncWorker_0) [miflora.miflora_poller] Received result for handle 53: B1 00 00 15 00 00 00 10 1E 00 02 3C 00 FB 34 9B
2020-01-04 14:06:49 DEBUG (SyncWorker_7) [miflora.miflora_poller] Using cache (0:00:00.015156 < 0:20:00)
2020-01-04 14:06:49 DEBUG (SyncWorker_17) [miflora.miflora_poller] Using cache (0:00:00.025710 < 0:20:00)
2020-01-04 14:06:49 DEBUG (SyncWorker_13) [miflora.miflora_poller] Using cache (0:00:00.035413 < 0:20:00)

With the Flower Care app, I can still sync the data between the sensor and my phone so I'm positive the battery isn't dead. The sensor is running on firmware 3.2.1 (latest).

Edit:
Can confirm that running power off and power on via bluetoothctl does indeed work and my sensor is reporting data again (for now, at least).

Edit 2:
14:06 it synced new data (after turning bluetooth off and back on)
14:26 it synced new data
14:46 it failed to sync new data (same issue as above)

Anyone running https://github.com/home-assistant/hassos/releases/tag/3.8 yet?
It has an update to BlueZ package, maybe a fix?

I'm also getting this after updating to the latest docker image of Home Assistant v0.104.3
I tried restarting my controller via bluetoothctl as others suggested, but wasn't able to get a reading. I restarted the docker container and was able to get one of three sensors to read for a few minutes before it went back to unavailable.
The only error I'm getting in the logs is about the sensors taking longer than 10 seconds to connect.

Edit: 24 hours later, and everything has started working on its own...? :man_shrugging:

The PR above will not fix any bad-connection problems. It only makes them less severe, because you can tell HA to report values even if they are 24 hours old. Before HA 0.103 you would see the last received value forever, even if the battery had been empty for days or weeks.

You can check the following:
In configuration.yaml, under the miflora platform, set
scan_interval: 30
(don't forget to revert this later on).
Remove the battery of the miflora sensor and restart HA. The LED of the BT dongle will blink constantly because HA keeps trying to refresh the values.

root@home-assistant:~# gatttool --device=C4:7C:8D:AA:BB:CC --char-read -a 0x35
connect: Device or resource busy (16)

Set scan_interval: 3600 so HA will only try to scan every hour, and the BT dongle LED will stop flashing.

root@home-assistant:~# time gatttool --device=C4:7C:8D:AA:BB:CC --char-read -a 0x35

the BT dongle led will start flashing

connect error: Connection refused (111)

real    0m42.017s
user    0m0.005s
sys     0m0.012s

Compare the behavior with the battery inserted and the flower sensor next to the BT dongle or at a longer range.
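For reference, the experiment above assumes a platform configuration along these lines (the MAC and monitored conditions are placeholders; scan_interval is Home Assistant's generic polling override, in seconds):

```yaml
sensor:
  - platform: miflora
    mac: "C4:7C:8D:AA:BB:CC"
    monitored_conditions:
      - moisture
      - temperature
    scan_interval: 3600   # 30 for the stress test, 3600 afterwards
```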

Anyone running https://github.com/home-assistant/hassos/releases/tag/3.8 yet?
It has an update to BlueZ package, maybe a fix?

Actually, I believe that this is the culprit of the issue in Hass.io.
Before I updated to 0.104.x and HassOS 3.8, my miFlora worked well enough for almost a year. I had a small automation that checked whether the values were up-to-date, so I didn't need @ferbar's update either. Now everything related to miFlora is broken, which is quite annoying.

I think my issues started with 0.104.x as well, I lowered the baudrate of my Pi 3B last sunday and no issues so far (about 56 hours later), still receiving data which matches with the values from the Flower Care app.

Found it here;
https://github.com/raspberrypi/linux/issues/2264#issuecomment-344911712

My miflora issues started with 0.104 as well; before that it was working well, now it is broken.

@Martinvdm @xPhantomNL @Molodax which hardware do you use? HassOS or something else?

Using intel nuc with docker

@xPhantomNL your suggestion definitely helped (on an RPi 3B rev 1.2)!

Anyone running https://github.com/home-assistant/hassos/releases/tag/3.8 yet?
It has an update to BlueZ package, maybe a fix?

Actually, I believe that this is a culprit of the issue in Hass.io.
Before I updated to 0.104.x and HassOS 3.8, my miFlora worked good enough for almost a year. I had a small automation that was controlling whether the values were up-to-date, so didn't need @ferbar update either. Now everything related to miFlora is broken which is quite annoying.

Have you had any luck getting them working again yet?

Same issues here, very annoying (5 sensors, some with new batteries after less than a year)...

Edit: I completely agree with https://github.com/home-assistant/core/issues/30275#issuecomment-582024419

Same issues here, very annoying (5 sensors, some with new batteries after less than a year)...

I've made this crude workaround which should function until a fix is in place. I'm using a Pi 3B and have had this issue since around 0.103, currently running 0.106.5. Still testing but worth trying.

## Shell Command
shell_command:
  bluetooth_off: 'hciconfig hci0 down'
  bluetooth_on: 'hciconfig hci0 up'

## Automations
automation:
  # Bluetooth Power toggle every 15 minutes (workaround for bluetooth pi 3 issue)
  - alias: 'Toggle bluetooth power every 15 minutes'
    trigger:
      platform: time_pattern
      minutes: "/15"
    action:
      - service: shell_command.bluetooth_off
      - delay: '00:00:05'
      - service: shell_command.bluetooth_on

Edit: working well so far

@talondnb thanks for the workaround! I tried it by logging in via the hassio ssh addon but the command couldn't be found (which makes sense). I then applied it on my server directly (ubuntu 18.04, via ssh) 'sudo hciconfig down/up' but my sensors are still being reported as unavailable. πŸ˜• How do I apply these commands correctly? πŸ™ˆ

PS: I can poll all my sensors very quickly and without any issues from my laptop using the miflora python lib which is also used by this component. So it MUST be related to this component and the way it interacts with Home Assistant. 🤷‍♂️

@talondnb thanks for the workaround! I tried it by logging in via the hassio ssh addon but the command couldn't be found (which makes sense). I then applied it on my server directly (ubuntu 18.04, via ssh) 'sudo hciconfig down/up' but my sensors are still being reported as unavailable. πŸ˜• How do I apply these commands correctly? πŸ™ˆ

PS: I can poll all my sensors very quickly and without any issues from my laptop using the miflora python lib which is also used by this component. So it MUST be related to this component and the way how it interacts with Home Assistant. πŸ€·β€β™‚οΈ

The hciconfig command doesn't exist within the hassio ssh addon, and the shell_command integration runs commands within the hassio docker host itself. To get to the host, disable protection mode in the SSH addon (the one by Frenck) and run 'docker exec -it homeassistant /bin/bash' within an SSH session.

If you want to perform the same within the hassio ssh addon, simply run 'bluetoothctl power off' then 'bluetoothctl power on'. This 'reboots' the bluetooth controller and the sensors will come back within 20 minutes or so - it is not instant unfortunately.

It's definitely related to the software on homeassistant/pi.

BTW, it looks like a 'hciconfig hci0 reset' might be simpler and, unless it was a coincidence, it brought my sensors back sooner:

## Shell Command
shell_command:
  bluetooth_reset: 'hciconfig hci0 reset'

## Automations
automation:
  # Bluetooth reset every 15 minutes (workaround for bluetooth pi 3 issue)
  - alias: 'Reset bluetooth every 15 minutes'
    trigger:
      platform: time_pattern
      minutes: "/15"
    action:
      - service: shell_command.bluetooth_reset

BTW, it looks like a 'hciconfig hci0 reset' might be simpler and unless it was a coincidence, brought my sensors back sooner:

It's a trick I used for a while, and it was better than nothing, but the Bluetooth would eventually lock up and I would have to reboot the Pi (as in, the hci0 reset would be stuck as well, so nothing could be done). Sorry to be a downer, but I wouldn't get my hopes up.

Ahh, didn't realise. Thanks for the insight though, I think for those like myself this is probably the best workaround for now, sigh.

@talondnb thanks for clarifying, but I can simply log into my host since it runs normal Ubuntu (and Hassio + docker on top of it). However, a sudo bluetoothctl followed by power off, power on does not help (it seems to power-cycle the adapter correctly but all miflora sensors are still unavailable).

I just realized after a restart of HA (v0.106.5) that this is all I have (wrt. miflora) in my log (I have 6 sensors):

2020-03-08 20:15:05 INFO (SyncWorker_33) [homeassistant.loader] Loaded miflora from homeassistant.components.miflora
[...]
2020-03-08 20:15:08 INFO (MainThread) [homeassistant.components.sensor] Setting up sensor.miflora
2020-03-08 20:15:08 INFO (MainThread) [homeassistant.components.sensor] Setting up sensor.miflora
2020-03-08 20:15:08 INFO (MainThread) [homeassistant.components.sensor] Setting up sensor.miflora
2020-03-08 20:15:08 INFO (MainThread) [homeassistant.components.sensor] Setting up sensor.miflora
2020-03-08 20:15:08 INFO (MainThread) [homeassistant.components.sensor] Setting up sensor.miflora
2020-03-08 20:15:08 INFO (MainThread) [homeassistant.components.sensor] Setting up sensor.miflora
[...]
2020-03-08 20:15:08 DEBUG (MainThread) [homeassistant.components.miflora.sensor] Miflora is using BluepyBackend backend.
2020-03-08 20:15:08 DEBUG (MainThread) [homeassistant.components.miflora.sensor] Miflora is using BluepyBackend backend.
2020-03-08 20:15:08 DEBUG (MainThread) [homeassistant.components.miflora.sensor] Miflora is using BluepyBackend backend.
2020-03-08 20:15:08 DEBUG (MainThread) [homeassistant.components.miflora.sensor] Miflora is using BluepyBackend backend.
2020-03-08 20:15:08 DEBUG (MainThread) [homeassistant.components.miflora.sensor] Miflora is using BluepyBackend backend.
2020-03-08 20:15:08 DEBUG (MainThread) [homeassistant.components.miflora.sensor] Miflora is using BluepyBackend backend.
[...]
2020-03-08 20:15:24 DEBUG (SyncWorker_21) [homeassistant.components.miflora.sensor] Polling data for spathiphyllum Conductivity

That's it, and according to the code, it seems to stall in https://github.com/home-assistant/core/blob/0.106.5/homeassistant/components/miflora/sensor.py#L164 because every other control-flow case has an output. For me it seems to be stuck polling the first sensor, but the question is why. Does anyone know (maybe @ChristianKuehnel?) if there's a timeout for polling and what value it is set to?

According to my investigations, parameter_value() will be called which will have a cache miss, so it will invoke self.fill_cache(). In that method, there is no loop but a comment saying "wait 5 minutes before retrying": where does this (wait+retry) happen?

To continue, I assume the work is done in connection.write_handle() which calls BluetoothInterface.connect() -> self._backend.connect() -> BluepyBackend.connect() (at least in my case of bluepy). I don't know what Peripheral does (and don't want to track it further down) but at least up to here, there's no waiting or retrying ... :thinking: :sos:
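The "try 1 of 3" lines in the logs above come from a retry loop in btlewrap; a rough, self-contained sketch of that pattern (the function name and signature are illustrative, not btlewrap's actual API):

```python
import time

def retry(fn, attempts=3, delay=0.0):
    """Call fn(); on exception, retry up to `attempts` times, then re-raise."""
    last_exc = None
    for attempt in range(1, attempts + 1):
        try:
            return fn()
        except Exception as exc:  # btlewrap would catch its backend exception here
            last_exc = exc
            time.sleep(delay)
    raise last_exc
```

Note that a retry loop like this bounds the number of attempts but not the total time; if each attempt itself blocks indefinitely, the caller still hangs, which matches the stall described above.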

Edit: I now also tried to do a scan and polled every sensor from the server itself (running HA and the miflora component) and that worked flawlessly. Again, it must be an issue in the HA component.

BTW, it looks like a 'hciconfig hci0 reset' might be simpler and unless it was a coincidence, brought my sensors back sooner:

It's a trick I used for a while, and it was better than nothing, but the Bluetooth would eventually lock up and I would have to reboot the Pi (as in, the hci0 reset would be stuck as well, so nothing could be done). Sorry to be a downer, but I wouldn't get my hopes up.

Looks like I've hit this:

bash-5.0# hciconfig hci0 reset
Can't init device hci0: Operation timed out (110)

And within the SSH addon:

➜  ~ bluetoothctl power off
Changing power off succeeded
➜  ~ bluetoothctl power on 
Failed to set power on: org.bluez.Error.Failed

Hm, interesting: I was able to reproduce this issue with the plantgateway on my server. It worked for the first sensor and failed for the remaining ones (although it correctly timed out, in contrast to this HA component).

Doing:

$ sudo bluetoothctl
[bluetooth]# power off
[bluetooth]# power on
[bluetooth]# exit

made it work again. This makes me think that it's rather a driver issue related to bluez (the Linux Bluetooth protocol stack) combined with a missing timeout in the miflora HA component.

Edit: however, this still does not work in HA; my sensors are still unavailable. Also, a restart of HA (not a reboot) did not help.

@talondnb @aronsky Can you access your host and try to reset the Bluetooth daemon, e.g., on Ubuntu via systemctl restart bluetooth (as root)? Otherwise, I would assume a reboot should also work... (but yeah, very annoying).

@talondnb @aronsky Can you access your host and try to reset the Bluetooth daemon, e.g., on Ubuntu via systemctl restart bluetooth (as root)? Otherwise, I would assume a reboot should also work... (but yeah, very annoying).

I'm running hass.io and don't have systemctl commands in either ssh addon nor the docker HA host itself. The current state of my bluetooth is frozen up, hci0 times out, basically as @aronsky described above in https://github.com/home-assistant/core/issues/30275#issuecomment-596169815.

A reboot will most certainly fix this, but whyyyyy is it happening.

@Danielhiversen, do you have any suggestions on what else this issue could be?

I'm really curious how hass.io sets up Bluetooth and where the baudrate is set. If I get a chance to boot it, I'll look into it.

@talondnb @aronsky Can you access your host and try to reset the Bluetooth daemon, e.g., on Ubuntu via systemctl restart bluetooth (as root)? Otherwise, I would assume a reboot should also work... (but yeah, very annoying).

I'm running hass.io and don't have systemctl commands in either ssh addon nor the docker HA host itself. The current state of my bluetooth is frozen up, hci0 times out, basically as @aronsky described above in #30275 (comment).

A reboot will most certainly fix this, but whyyyyy is it happening.

@Danielhiversen, do you have any suggestions on what else this issue could be?

I tried looking at how Bluetooth works under HassOS, but couldn't find any differences to regular raspbian... Do you mind sharing the output of sudo systemctl status from your HA host? Please go over the output and make sure there's no private data in it (though if you're only using that Raspberry Pi for Hass, it should be fine). Also, the output is paged, so consider redirecting it to a file to copy the full output.


[email protected]'s password: 

 _    _                                         _     _              _
| |  | |                          /\           (_)   | |            | |
| |__| | ___  _ __ ___   ___     /  \   ___ ___ _ ___| |_ __ _ _ __ | |_
|  __  |/ _ \| '_ ` _ \ / _ \   / /\ \ / __/ __| / __| __/ _` | '_ \| __|
| |  | | (_) | | | | | |  __/  / ____ \\__ \__ \ \__ \ || (_| | | | | |_
|_|  |_|\___/|_| |_| |_|\___| /_/    \_\___/___/_|___/\__\__,_|_| |_|\__|


Our command line:
$ ha help
➜  ~ sudo systemctl
sudo: systemctl: command not found
➜  ~ docker exec -it homeassistant /bin/bash
bash-5.0# sudo
bash: sudo: command not found
bash-5.0# systemctl
bash: systemctl: command not found
bash-5.0# 

I can't figure out how to mount the HassOS system partitions, so I'm having a hard time figuring out its architecture, and where the system (not Hass) configuration is located.

I'm having a hard time believing it's inside a docker container. The command line you posted is a bit confusing. I see that you're connecting to the Home Assistant host, but is there any chance you are not actually running on the host, but inside one of the containers? Though how you would be able to docker exec from there is beyond me.

Any chance you can hook up a keyboard, mouse and HDMI cable to your Raspberry Pi, login locally, and see if any of the commands I mentioned behave differently?

I've not yet tried this, but can do so tomorrow some time:

https://developers.home-assistant.io/docs/hassio_debugging/

Awesome. I just noticed that you already mentioned that you don't have access to systemctl, sorry for asking you to run it again :)

Hopefully, if you get physical access to the Pi (or the HassIO debugging works), we'll have a direction for a solution. Let me know how it goes.

Ok I've managed to get host access:

# systemctl status
● hassio
    State: running
     Jobs: 0 queued
   Failed: 0 units
    Since: Thu 1970-01-01 00:00:01 UTC; 50 years 2 months ago
   CGroup: /
           β”œβ”€init.scope
           β”‚ └─1 /sbin/init
           └─system.slice
             β”œβ”€rauc.service
             β”‚ └─252 /usr/bin/rauc --mount=/run/rauc service
             β”œβ”€rngd.service
             β”‚ └─245 /usr/sbin/rngd -f
             β”œβ”€systemd-timesyncd.service
             β”‚ └─241 /usr/lib/systemd/systemd-timesyncd
             β”œβ”€hassos-supervisor.service
             β”‚ β”œβ”€600 /bin/sh /usr/sbin/hassos-supervisor
             β”‚ └─628 docker container start --attach hassos_supervisor
             β”œβ”€avahi-dnsconfd.service
             β”‚ └─247 /usr/sbin/avahi-dnsconfd -s
             β”œβ”€NetworkManager.service
             β”‚ β”œβ”€249 /usr/sbin/NetworkManager --no-daemon
             β”‚ └─338 /sbin/dhclient -d -q -sf /usr/libexec/nm-dhcp-helper -pf /var/run/dhclient-eth0.pid -lf /var/lib/NetworkManager/dhclient-d55162b4-6152-4310-9312-8f4c54d86afa-eth0.lease -cf /var/lib/NetworkManager/dhclient-eth0.conf eth0
             β”œβ”€dbus.service
             β”‚ └─246 /usr/bin/dbus-daemon --system --address=systemd: --nofork --nopidfile --systemd-activation --syslog-only
             β”œβ”€docker.service
             β”‚ β”œβ”€  348 /usr/bin/dockerd -H fd:// --storage-driver=overlay2 --log-driver=journald --data-root /mnt/data/docker
             β”‚ β”œβ”€  359 containerd --config /var/run/docker/containerd/containerd.toml --log-level info
             β”‚ β”œβ”€  643 containerd-shim -namespace moby -workdir /mnt/data/docker/containerd/daemon/io.containerd.runtime.v1.linux/moby/8cecb1fe21b44af7779ca59b3c4aba8a2646e5e72da39294485208beaf13e031 -address /var/run/docker/containerd/containerd.sock -containerd-binary /usr/bin/containerd -runtime-root /var/run/docker/runtime-runc
             β”‚ β”œβ”€  658 s6-svscan -t0 /var/run/s6/services
             β”‚ β”œβ”€  752 s6-supervise s6-fdholderd
             β”‚ β”œβ”€  899 udevd --daemon
             β”‚ β”œβ”€  941 s6-supervise supervisor
             β”‚ β”œβ”€  945 python3 -m supervisor
             β”‚ β”œβ”€ 1007 socat UDP-RECVFROM:53,fork UDP-SENDTO:172.30.32.3:53
             β”‚ β”œβ”€ 1018 containerd-shim -namespace moby -workdir /mnt/data/docker/containerd/daemon/io.containerd.runtime.v1.linux/moby/59127ce6d67635dae5fdbba4b97a0cd5052b47d1142a1042c605b719ceec2f48 -address /var/run/docker/containerd/containerd.sock -containerd-binary /usr/bin/containerd -runtime-root /var/run/docker/runtime-runc
             β”‚ β”œβ”€ 1038 coredns -conf /config/corefile
             β”‚ β”œβ”€ 1046 containerd-shim -namespace moby -workdir /mnt/data/docker/containerd/daemon/io.containerd.runtime.v1.linux/moby/4df11596d52589d56d20753cc2ecb67fa5f922e7b01b05b9bcd9532fc975beaa -address /var/run/docker/containerd/containerd.sock -containerd-binary /usr/bin/containerd -runtime-root /var/run/docker/runtime-runc
             β”‚ β”œβ”€ 1070 s6-svscan -t0 /var/run/s6/services
             β”‚ β”œβ”€ 1181 ping -n -q -c 5 -W1 10.10.10.21
             β”‚ β”œβ”€ 1213 s6-supervise s6-fdholderd
             β”‚ β”œβ”€ 1509 udevd --daemon
             β”‚ β”œβ”€ 1535 s6-supervise bridge
             β”‚ β”œβ”€ 1536 s6-supervise pulseaudio
             β”‚ β”œβ”€ 1539 pulseaudio --system -vvv
             β”‚ β”œβ”€ 1540 socat UNIX-LISTEN:/data/external/pulse.sock,fork,unlink-early,mode=777 UNIX-CONNECT:/data/internal/pulse.sock
             β”‚ β”œβ”€ 1599 /usr/bin/docker-proxy -proto tcp -host-ip 0.0.0.0 -host-port 8884 -container-ip 172.30.33.0 -container-port 8884
             β”‚ β”œβ”€ 1613 /usr/bin/docker-proxy -proto tcp -host-ip 0.0.0.0 -host-port 8883 -container-ip 172.30.33.0 -container-port 8883
             β”‚ β”œβ”€ 1626 /usr/bin/docker-proxy -proto tcp -host-ip 0.0.0.0 -host-port 1884 -container-ip 172.30.33.0 -container-port 1884
             β”‚ β”œβ”€ 1638 /usr/bin/docker-proxy -proto tcp -host-ip 0.0.0.0 -host-port 1883 -container-ip 172.30.33.0 -container-port 1883
             β”‚ β”œβ”€ 1646 containerd-shim -namespace moby -workdir /mnt/data/docker/containerd/daemon/io.containerd.runtime.v1.linux/moby/b7f5cfa2dc4d55dc262f5ceec9ca356cee6123489ef5c55126af36c091fbc628 -address /var/run/docker/containerd/containerd.sock -containerd-binary /usr/bin/containerd -runtime-root /var/run/docker/runtime-runc
             β”‚ β”œβ”€ 1662 /sbin/docker-init -- /run.sh
             β”‚ β”œβ”€ 1720 bash /usr/bin/bashio /run.sh
             β”‚ β”œβ”€ 1773 socat TCP-LISTEN:8080,fork,reuseaddr SYSTEM:/bin/auth_srv.sh
             β”‚ β”œβ”€ 1774 mosquitto -c /etc/mosquitto.conf
             β”‚ β”œβ”€ 2541 containerd-shim -namespace moby -workdir /mnt/data/docker/containerd/daemon/io.containerd.runtime.v1.linux/moby/e13c98d0cc606ac8fde6e79e957af1edab6410acf76a9ba7ff5574080d90deae -address /var/run/docker/containerd/containerd.sock -containerd-binary /usr/bin/containerd -runtime-root /var/run/docker/runtime-runc
             β”‚ β”œβ”€ 2570 /sbin/docker-init -- /run.sh
             β”‚ β”œβ”€ 2601 containerd-shim -namespace moby -workdir /mnt/data/docker/containerd/daemon/io.containerd.runtime.v1.linux/moby/47e91aac76092b5148faa7a5d8be376f277deb22871a33daf47bc195604b6a63 -address /var/run/docker/containerd/containerd.sock -containerd-binary /usr/bin/containerd -runtime-root /var/run/docker/runtime-runc
             β”‚ β”œβ”€ 2645 /sbin/docker-init -- /init
             β”‚ β”œβ”€ 2703 bash /usr/bin/bashio /run.sh
             β”‚ β”œβ”€ 2760 s6-svscan -t0 /var/run/s6/services
             β”‚ β”œβ”€ 2849 s6-supervise s6-fdholderd
             β”‚ β”œβ”€ 3875 nmbd -F -S -s /etc/smb.conf
             β”‚ β”œβ”€ 3876 smbd -F -S -s /etc/smb.conf
             β”‚ β”œβ”€ 3980 smbd -F -S -s /etc/smb.conf
             β”‚ β”œβ”€ 3981 smbd -F -S -s /etc/smb.conf
             β”‚ β”œβ”€ 5710 s6-supervise stdin
             β”‚ β”œβ”€ 5711 s6-supervise ttyd
             β”‚ β”œβ”€ 5712 s6-supervise sshd
             β”‚ β”œβ”€ 5714 bash /usr/bin/bashio ./run
             β”‚ β”œβ”€ 5715 ttyd -d1 -i hassio -p 62252 tmux -u new -A -s homeassistant zsh -l
             β”‚ β”œβ”€ 5716 /usr/sbin/sshd -D -e
             β”‚ β”œβ”€ 6536 containerd-shim -namespace moby -workdir /mnt/data/docker/containerd/daemon/io.containerd.runtime.v1.linux/moby/f5cd5b7aa2a38b1d91253626055c64eeba4f707b5c0d00b67a7228dd5e0b2896 -address /var/run/docker/containerd/containerd.sock -containerd-binary /usr/bin/containerd -runtime-root /var/run/docker/runtime-runc
             β”‚ β”œβ”€ 6537 containerd-shim -namespace moby -workdir /mnt/data/docker/containerd/daemon/io.containerd.runtime.v1.linux/moby/8a555aa80a6235edc91dde7dd290fc547c77ab5f5e9be2d34611e7c6e3eff4b9 -address /var/run/docker/containerd/containerd.sock -containerd-binary /usr/bin/containerd -runtime-root /var/run/docker/runtime-runc
             β”‚ β”œβ”€ 6570 /sbin/docker-init -- /init
             β”‚ β”œβ”€ 6577 /sbin/docker-init -- /bin/sh -c python3 -m dropbox_upload
             β”‚ β”œβ”€ 6609 s6-svscan -t0 /var/run/s6/services
             β”‚ β”œβ”€ 6647 s6-supervise s6-fdholderd
             β”‚ β”œβ”€ 6876 python3 -m dropbox_upload
             β”‚ β”œβ”€ 7068 s6-supervise esphome
             β”‚ β”œβ”€ 7070 s6-supervise nginx
             β”‚ β”œβ”€ 7071 /usr/bin/python3 /usr/local/bin/esphome /config/esphome dashboard --socket /var/run/esphome.sock --hassio
             β”‚ β”œβ”€ 7072 nginx: master process nginx
             β”‚ β”œβ”€ 7199 nginx: worker process
             β”‚ β”œβ”€ 8405 smbd -F -S -s /etc/smb.conf
             β”‚ β”œβ”€21838 containerd-shim -namespace moby -workdir /mnt/data/docker/containerd/daemon/io.containerd.runtime.v1.linux/moby/0be29fea285e5b7b941f801d49b030f48c694153871901c08294baa441c98db1 -address /var/run/docker/containerd/containerd.sock -containerd-binary /usr/bin/containerd -runtime-root /var/run/docker/runtime-runc
             β”‚ β”œβ”€21854 /usr/local/bin/python3 -m homeassistant --config /config
             β”‚ β”œβ”€21880 udevd --daemon
             β”‚ β”œβ”€30476 /usr/bin/docker-proxy -proto tcp -host-ip 0.0.0.0 -host-port 8485 -container-ip 172.30.33.1 -container-port 8485
             β”‚ β”œβ”€30482 containerd-shim -namespace moby -workdir /mnt/data/docker/containerd/daemon/io.containerd.runtime.v1.linux/moby/f7390cc6ad0e3780e3b1b90796adc0601d11bc384bf451ae6db12f0e13d1e683 -address /var/run/docker/containerd/containerd.sock -containerd-binary /usr/bin/containerd -runtime-root /var/run/docker/runtime-runc
             β”‚ β”œβ”€30497 /sbin/docker-init -- /init ./run.sh
             β”‚ β”œβ”€30559 s6-svscan -t0 /var/run/s6/services
             β”‚ β”œβ”€30590 foreground  if   /etc/s6/init/init-stage2-redirfd   foreground    if     if      s6-echo      -n      --      [s6-init] making user provided files available at /var/run/s6/etc...          foreground      backtick      -n      S6_RUNTIME_PROFILE       printcontenv       S6_RUNTIME_PROFILE            importas      -u      S6_RUNTIME_PROFILE      S6_RUNTIME_PROFILE      backtick      -n      S6_RUNTIME_PROFILE_SRC       ifte        s6-echo        /etc/cont-profile.d/${S6_RUNTIME_PROFILE}               s6-echo        /etc              s6-test       -n       ${S6_RUNTIME_PROFILE}            importas      -u      S6_RUNTIME_PROFILE_SRC      S6_RUNTIME_PROFILE_SRC      if       s6-rmrf       /var/run/s6/etc            --Moreif       s6-mkdir       -pm       0755       /var/run/s6/etc            forx      i       fix-attrs.d       cont-init.d       cont-finish.d       services.d            importas      -u      i      i      if       s6-test       -d       ${S6_RUNTIME_PROFILE_SRC}/${i}            ifelse       s6-test       0       -eq       0             s6-ln       -s       ${S6_RUNTIME_PROFILE_SRC}/${i}       /var/run/s6/etc/${i}            if       s6-hiercopy       ${S6_RUNTIME_PROFILE_SRC}/${i}       /var/run/s6/etc/${i}                importas     -u     ?     ?     if      s6-echo      --      exited ${?}.          ifelse      s6-test      2      -eq      0           exit      0          exit     ${?}        if     if      s6-echo      -n      --      [s6-init] ensuring user provided files have correct perms...          foreground      redirfd      -r      0      /etc/s6/init/init-stage2-fixattrs.txt      fix-attrs          importas     -u     ?     ?     if      s6-echo      --      exited ${?}.          
ifelse      s6-test      2      -eq      0           exit      0          exit     ${?}        if     if     -t      s6-test      -d      /var/run/s6/etc/fix-attrs.d          if      s6-echo      [fix-attrs.d] applying ownership & permissions fixes...          if      pipeline       s6-ls       -0       --       /var/run/s6/etc/fix-attrs.d            pipeline       s6-sort       -0       --            forstdin      -0      --      i      importas      -u      i      i      if       s6-echo       --       [fix-attrs.d] ${i}: applying...             foreground       redirfd       -r       0       /var/run/s6/etc/fix-attrs.d/${i}       fix-attrs            importas      -u      ?      ?      if       s6-echo       --       [fix-attrs.d] ${i}: exited ${?}.            ifelse       s6-test       2       -eq       0             exit       0            exit      ${?}          if      s6-echo      --      [fix-attrs.d] done.             if     if     -t      s6-test      -d      /var/run/s6/etc/cont-init.d          if      s6-echo      [cont-init.d] executing container initialization scripts...          if      pipeline       s6-ls       -0       --       /var/run/s6/etc/cont-init.d            pipeline       s6-sort       -0       --            forstdin      -o      0      -0      --      i      importas      -u      i      i      if       s6-echo       --       [cont-init.d] ${i}: executing...             foreground       /var/run/s6/etc/cont-init.d/${i}            importas      -u      ?      ?      if       s6-echo       --       [cont-init.d] ${i}: exited ${?}.            ifelse       s6-test       2       -eq       0             exit       0            exit      ${?}          if      s6-echo      --      [cont-init.d] done.             
if     if     -t      s6-test      -d      /var/run/s6/etc/services.d          if      s6-echo      [services.d] starting services          if      pipeline       s6-ls       -0       --       /var/run/s6/etc/services.d            forstdin      -0      -p      --      i      importas      -u      i      i      if       s6-test       -d       /var/run/s6/etc/services.d/${i}            s6-hiercopy      /var/run/s6/etc/services.d/${i}      /var/run/s6/services/${i}          if      s6-svscanctl      -a      /var/run/s6/services          if      backtick      -D      0      -n      S6_CMD_WAIT_FOR_SERVICES       printcontenv       S6_CMD_WAIT_FOR_SERVICES            importas      -u      S6_CMD_WAIT_FOR_SERVICES      S6_CMD_WAIT_FOR_SERVICES      backtick      -D      5000      -n      S6_CMD_WAIT_FOR_SERVICES_MAXTIME       printcontenv       S6_CMD_WAIT_FOR_SERVICES_MAXTIME            importas      -u      S6_CMD_WAIT_FOR_SERVICES_MAXTIME      S6_CMD_WAIT_FOR_SERVICES_MAXTIME      if      -t       if        s6-test        ${S6_CMD_WAIT_FOR_SERVICES}        -ne        0              s6-test       1       -ne       0            s6-maximumtime      -t      ${S6_CMD_WAIT_FOR_SERVICES_MAXTIME}      pipeline       s6-ls       -0       --       /var/run/s6/etc/services.d            forstdin      -0      -o      0      --      i      importas      -u      i      i      ifelse       s6-test       -f       /var/run/s6/services/${i}/down             exit       0            ifelse       s6-test       -f       /var/run/s6/services/${i}/notification-fd             s6-svwait       -t       ${S6_CMD_WAIT_FOR_SERVICES_MAXTIME}       -U       /var/run/s6/services/${i}            s6-svwait      -t      ${S6_CMD_WAIT_FOR_SERVICES_MAXTIME}      -u      /var/run/s6/services/${i}          if      s6-echo      --      [services.d] done.               importas   -u   ?   ?   
ifelse    s6-test    2    -eq    0       exit    0      foreground    redirfd    -w    1    /var/run/s6/env-stage3/S6_STAGE2_EXITED    s6-echo    -n    --    ${?}      exit   ${?}    if  -t   s6-test   1   -ne   0    foreground   s6-setsid   -gq   --   with-contenv   backtick   -D   0   -n   S6_LOGGING    printcontenv    S6_LOGGING      importas   S6_LOGGING   S6_LOGGING   ifelse    s6-test    ${S6_LOGGING}    -eq    2       redirfd    -w    1    /var/run/s6/uncaught-logs-fifo    fdmove    -c    2    1    ./run.sh      ./run.sh    importas  -u  ?  ?  foreground   /etc/s6/init/init-stage2-redirfd   s6-echo   --   [cmd] ./run.sh exited ${?}    foreground   redirfd   -w   1   /var/run/s6/env-stage3/S6_STAGE2_EXITED   s6-echo   -n   --   ${?}    foreground   s6-svscanctl   -t   /var/run/s6/services    s6-pause  -th importas -u ? ? if  s6-test  ${?}  -ne  0 if  s6-test  2  -ne  0 ifelse  s6-test  2  -ne  1  s6-svscanctl  -t  /var/run/s6/services s6-echo -- !!!!!  init-stage2 failed. !!!!!
             β”‚ β”œβ”€30591 s6-supervise s6-fdholderd
             β”‚ β”œβ”€30602 foreground  s6-setsid  -gq  --  with-contenv  backtick  -D  0  -n  S6_LOGGING   printcontenv   S6_LOGGING    importas  S6_LOGGING  S6_LOGGING  ifelse   s6-test   ${S6_LOGGING}   -eq   2     redirfd   -w   1   /var/run/s6/uncaught-logs-fifo   fdmove   -c   2   1   ./run.sh    ./run.sh importas -u ? ? foreground  /etc/s6/init/init-stage2-redirfd  s6-echo  --  [cmd] ./run.sh exited ${?} foreground  redirfd  -w  1  /var/run/s6/env-stage3/S6_STAGE2_EXITED  s6-echo  -n  --  ${?} foreground  s6-svscanctl  -t  /var/run/s6/services s6-pause -th
             β”‚ β”œβ”€30735 /bin/bash ./run.sh
             β”‚ β”œβ”€30763 node /usr/bin/pm2-runtime start npm -- start
             β”‚ β”œβ”€30774 npm
             β”‚ └─30795 node index.js
             β”œβ”€avahi-daemon.service
             β”‚ β”œβ”€349 avahi-daemon: running [hassio.local]
             β”‚ └─350 avahi-daemon: chroot helper
             β”œβ”€system-getty.slice
             β”‚ └─[email protected]
             β”‚   └─601 /sbin/getty -L tty1 115200 vt100
             β”œβ”€wpa_supplicant.service
             β”‚ └─337 /usr/sbin/wpa_supplicant -u
             β”œβ”€systemd-udevd.service
             β”‚ └─127 /usr/lib/systemd/systemd-udevd
             β”œβ”€bluetooth.service
             β”‚ └─306 /usr/libexec/bluetooth/bluetoothd
             β”œβ”€systemd-journald.service
             β”‚ └─110 /usr/lib/systemd/systemd-journald
             β”œβ”€dropbear.service
             β”‚ β”œβ”€  953 /usr/sbin/dropbear -F -R -E -p 22222 -s
             β”‚ β”œβ”€  955 /bin/sh /usr/sbin/hassos-cli
             β”‚ β”œβ”€ 1137 /bin/ash -l
             β”‚ β”œβ”€ 1182 systemctl status
             β”‚ β”œβ”€ 1183 /bin/more
             β”‚ └─21703 /usr/sbin/dropbear -F -R -E -p 22222 -s
             └─bluetooth-bcm43xx.service
               └─305 /usr/bin/hciattach /dev/serial1 bcm43xx 921600 noflow - b8:27:eb:85:df:19
#  

BTW, I've also gone and disabled wlan0, just to see if it has been interfering with the BT at all. Monitoring for now...

Perfect! Now this is where you have access to the stuff I've been talking about :D

Look at the bottom line - you have hciattach with the high baud rate of 921600. While in that shell, go ahead and edit the /usr/bin/btuart file according to my suggestion in the other thread (https://github.com/home-assistant/core/issues/31657#issuecomment-593071318) - find the line with 921600 in it, and change the number to 115200.

Once you restart, make sure that the output of sudo systemctl status bluetooth-bcm43xx.service (at the host level) gives you the updated command line, with the baudrate of 115200 - and you're golden.
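For reference, that substitution is a one-line sed. The sketch below runs against a throwaway copy rather than the real file, and the sample hciattach line is an assumption modelled on the invocation visible in the systemctl output earlier in the thread (the real file is /usr/bin/btuart on the HassOS host):

```shell
# Work on a throwaway copy first; the real file is /usr/bin/btuart on the host.
# The sample line below is an assumption modelled on the hciattach invocation
# shown in the systemctl status output; only the baud rate matters here.
cat > /tmp/btuart.sample <<'EOF'
/usr/bin/hciattach /dev/serial1 bcm43xx 921600 noflow - $BDADDR
EOF

# Drop the UART baud rate from 921600 to 115200.
sed -i 's/921600/115200/' /tmp/btuart.sample

grep 115200 /tmp/btuart.sample
```

On the real system the same sed command, pointed at /usr/bin/btuart, only works once the filesystem holding it is writable (which, as it turns out below, it isn't on stock HassOS).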


Thanks! I will try this soon. Have you played around with the possibility of wifi causing the issue at all? I've disabled the wlan0 nic (ifconfig wlan0 down) and so far things seem stable, bit early to tell though.

Argh, literally all my sensors changed to unavailable just now after I pressed enter on the above post! Cursed!

I will try the baud rate lowering this afternoon..

Yeah, WiFi doesn't seem related (I disabled it on my Pi long before the problems with miFlora appeared). Don't worry, I'm positive the baudrate change will help.

I've tried this but the file is only read-only, no sudo in sight either.

Which user do you use to log into the box?

root, but it's weird: I ssh to [email protected] -p 22222, which brings me to the Home Assistant command line. I have to type 'login' to get to the host system shell.

$ ssh [email protected] -p 22222
 _    _                                         _     _              _   
| |  | |                          /\           (_)   | |            | |  
| |__| | ___  _ __ ___   ___     /  \   ___ ___ _ ___| |_ __ _ _ __ | |_ 
|  __  |/ _ \| '_ ` _ \ / _ \   / /\ \ / __/ __| / __| __/ _` | '_ \| __|
| |  | | (_) | | | | | |  __/  / ____ \\__ \__ \ \__ \ || (_| | | | | |_ 
|_|  |_|\___/|_| |_| |_|\___| /_/    \_\___/___/_|___/\__\__,_|_| |_|\__|

Welcome on Home Assistant command line.

For more details use 'help' and 'exit' to close.
If you need access to host system use 'login'.

ha > login
# 

I then edit /usr/bin/btuart with vi, but it shows [readonly]. I tried stopping the bluetooth service but it doesn't help.

Well, the hash seems to indicate that you're root... Can you chmod u+w /usr/bin/btuart and see if it helps?

# chmod u+w /usr/bin/btuart 
chmod: /usr/bin/btuart: Read-only file system

I don't think I'm winning this one!

Ah. OK. We'll figure it out eventually. What does mount return?

Thanks for your persistence!

# mount
/dev/mmcblk0p5 on / type squashfs (ro,relatime)
devtmpfs on /dev type devtmpfs (rw,relatime,size=468472k,nr_inodes=117118,mode=755)
sysfs on /sys type sysfs (rw,nosuid,nodev,noexec,relatime)
proc on /proc type proc (rw,nosuid,nodev,noexec,relatime)
securityfs on /sys/kernel/security type securityfs (rw,nosuid,nodev,noexec,relatime)
tmpfs on /dev/shm type tmpfs (rw,nosuid,nodev)
devpts on /dev/pts type devpts (rw,nosuid,noexec,relatime,gid=5,mode=620,ptmxmode=000)
tmpfs on /run type tmpfs (rw,nosuid,nodev,mode=755)
tmpfs on /sys/fs/cgroup type tmpfs (ro,nosuid,nodev,noexec,mode=755)
cgroup2 on /sys/fs/cgroup/unified type cgroup2 (rw,nosuid,nodev,noexec,relatime,nsdelegate)
cgroup on /sys/fs/cgroup/systemd type cgroup (rw,nosuid,nodev,noexec,relatime,xattr,name=systemd)
bpf on /sys/fs/bpf type bpf (rw,nosuid,nodev,noexec,relatime,mode=700)
cgroup on /sys/fs/cgroup/cpuset type cgroup (rw,nosuid,nodev,noexec,relatime,cpuset)
cgroup on /sys/fs/cgroup/freezer type cgroup (rw,nosuid,nodev,noexec,relatime,freezer)
cgroup on /sys/fs/cgroup/memory type cgroup (rw,nosuid,nodev,noexec,relatime,memory)
cgroup on /sys/fs/cgroup/blkio type cgroup (rw,nosuid,nodev,noexec,relatime,blkio)
cgroup on /sys/fs/cgroup/pids type cgroup (rw,nosuid,nodev,noexec,relatime,pids)
cgroup on /sys/fs/cgroup/cpu,cpuacct type cgroup (rw,nosuid,nodev,noexec,relatime,cpu,cpuacct)
cgroup on /sys/fs/cgroup/devices type cgroup (rw,nosuid,nodev,noexec,relatime,devices)
cgroup on /sys/fs/cgroup/net_cls,net_prio type cgroup (rw,nosuid,nodev,noexec,relatime,net_cls,net_prio)
cgroup on /sys/fs/cgroup/perf_event type cgroup (rw,nosuid,nodev,noexec,relatime,perf_event)
tmpfs on /etc/machine-id type tmpfs (ro,mode=755)
debugfs on /sys/kernel/debug type debugfs (rw,relatime)
mqueue on /dev/mqueue type mqueue (rw,relatime)
/dev/mmcblk0p7 on /mnt/overlay type ext4 (rw,relatime)
/dev/mmcblk0p7 on /etc/modules-load.d type ext4 (rw,relatime)
/dev/mmcblk0p7 on /root/.docker type ext4 (rw,relatime)
/dev/mmcblk0p7 on /etc/docker type ext4 (rw,relatime)
/dev/mmcblk0p7 on /etc/udev/rules.d type ext4 (rw,relatime)
/dev/mmcblk0p7 on /root/.ssh type ext4 (rw,relatime)
/dev/mmcblk0p7 on /etc/dropbear type ext4 (rw,relatime)
/dev/mmcblk0p7 on /etc/modprobe.d type ext4 (rw,relatime)
/dev/mmcblk0p1 on /mnt/boot type vfat (rw,relatime,fmask=0022,dmask=0022,codepage=437,iocharset=ascii,shortname=mixed,errors=remount-ro)
/dev/mmcblk0p7 on /etc/NetworkManager/system-connections type ext4 (rw,relatime)
/dev/mmcblk0p7 on /etc/hostname type ext4 (rw,relatime)
/dev/mmcblk0p7 on /etc/hosts type ext4 (rw,relatime)
/dev/mmcblk0p7 on /etc/systemd/timesyncd.conf type ext4 (rw,relatime)
configfs on /sys/kernel/config type configfs (rw,relatime)
/dev/mmcblk0p8 on /mnt/data type ext4 (rw,relatime)
/dev/zram2 on /tmp type ext4 (rw,nosuid,nodev,nobarrier)
/dev/zram1 on /var type ext4 (rw,relatime,nobarrier)
/dev/mmcblk0p7 on /var/lib/systemd type ext4 (rw,relatime)
/dev/mmcblk0p7 on /var/lib/NetworkManager type ext4 (rw,relatime)
/dev/mmcblk0p7 on /var/lib/bluetooth type ext4 (rw,relatime)
/dev/mmcblk0p8 on /var/lib/docker type ext4 (rw,relatime)
/dev/mmcblk0p7 on /var/log/journal type ext4 (rw,relatime)
overlay on /mnt/data/docker/overlay2/5bfe7809d72da52b9afd5107aaab1cacb3ec8504dce4ce97304ea0aca3aa959d/merged type overlay (rw,relatime,lowerdir=/mnt/data/docker/overlay2/l/3KIWSB2RE6AVU6GCAIN3JSX66V:/mnt/data/docker/overlay2/l/HIGJWOYHFHOH6YGLQUYJ44HQ5S:/mnt/data/docker/overlay2/l/W6MRIASS6OFXQAUSHHPHWJZ5C7:/mnt/data/docker/overlay2/l/YBDNOH7L5XAFPT6TBLLAJ2MBAN:/mnt/data/docker/overlay2/l/LQT2K7GK4JT5LJQSVX44R54WAK:/mnt/data/docker/overlay2/l/BVOR2HTBOKBUNHONECSITFFSQM:/mnt/data/docker/overlay2/l/PFDJYTWJKXIWKRH7M6PCEA22MT:/mnt/data/docker/overlay2/l/4LH2MDCE2KBOMDDXHFQ6L6ZXTJ:/mnt/data/docker/overlay2/l/2YDT7HPQVTDJ7NEFMOTAU3JKTN:/mnt/data/docker/overlay2/l/SCZHVBWY7W7LXO25L7T7RUNIRL:/mnt/data/docker/overlay2/l/QLPR2HSBDTZKMOAZL6KWHOZ5ZV:/mnt/data/docker/overlay2/l/3Z7PRDPXVXGJLTF62XLCGNLS66:/mnt/data/docker/overlay2/l/2EDLJ5L4BWNIIJ4XXHMIPU4IIB:/mnt/data/docker/overlay2/l/NGY5DDS3FIV3DG7Q26Z4K67KW4:/mnt/data/docker/overlay2/l/Y6TKKYIX5LIVO5P4QSD37MT2YW,upperdir=/mnt/data/docker/overlay2/5bfe7809d72da52b9afd5107aaab1cacb3ec8504dce4ce97304ea0aca3aa959d/diff,workdir=/mnt/data/docker/overlay2/5bfe7809d72da52b9afd5107aaab1cacb3ec8504dce4ce97304ea0aca3aa959d/work)
overlay on /var/lib/docker/overlay2/5bfe7809d72da52b9afd5107aaab1cacb3ec8504dce4ce97304ea0aca3aa959d/merged type overlay (rw,relatime,lowerdir=/mnt/data/docker/overlay2/l/3KIWSB2RE6AVU6GCAIN3JSX66V:/mnt/data/docker/overlay2/l/HIGJWOYHFHOH6YGLQUYJ44HQ5S:/mnt/data/docker/overlay2/l/W6MRIASS6OFXQAUSHHPHWJZ5C7:/mnt/data/docker/overlay2/l/YBDNOH7L5XAFPT6TBLLAJ2MBAN:/mnt/data/docker/overlay2/l/LQT2K7GK4JT5LJQSVX44R54WAK:/mnt/data/docker/overlay2/l/BVOR2HTBOKBUNHONECSITFFSQM:/mnt/data/docker/overlay2/l/PFDJYTWJKXIWKRH7M6PCEA22MT:/mnt/data/docker/overlay2/l/4LH2MDCE2KBOMDDXHFQ6L6ZXTJ:/mnt/data/docker/overlay2/l/2YDT7HPQVTDJ7NEFMOTAU3JKTN:/mnt/data/docker/overlay2/l/SCZHVBWY7W7LXO25L7T7RUNIRL:/mnt/data/docker/overlay2/l/QLPR2HSBDTZKMOAZL6KWHOZ5ZV:/mnt/data/docker/overlay2/l/3Z7PRDPXVXGJLTF62XLCGNLS66:/mnt/data/docker/overlay2/l/2EDLJ5L4BWNIIJ4XXHMIPU4IIB:/mnt/data/docker/overlay2/l/NGY5DDS3FIV3DG7Q26Z4K67KW4:/mnt/data/docker/overlay2/l/Y6TKKYIX5LIVO5P4QSD37MT2YW,upperdir=/mnt/data/docker/overlay2/5bfe7809d72da52b9afd5107aaab1cacb3ec8504dce4ce97304ea0aca3aa959d/diff,workdir=/mnt/data/docker/overlay2/5bfe7809d72da52b9afd5107aaab1cacb3ec8504dce4ce97304ea0aca3aa959d/work)
nsfs on /run/docker/netns/fad3fd06527e type nsfs (rw)
overlay on /mnt/data/docker/overlay2/4d3c9f66162741eb77a47896b8f6b059d6dadb838d2455bd88d90ddf08a529e3/merged type overlay (rw,relatime,lowerdir=/mnt/data/docker/overlay2/l/MBBRNHIY7S3EM5ZLPEP2K3TAK2:/mnt/data/docker/overlay2/l/AB2UKFFDJNHEPVLVMLOOBFILYW:/mnt/data/docker/overlay2/l/I6MXKITJ2MSCGWPFUPSV4EYGWI:/mnt/data/docker/overlay2/l/E6OFQZLBXHS4IAEWEZSKJICQ34:/mnt/data/docker/overlay2/l/ZFMCKJSOHF2HGLVEUF6PZ7KEBG:/mnt/data/docker/overlay2/l/IHFKM2RDWNGGUWUKWRJ76KCXP6,upperdir=/mnt/data/docker/overlay2/4d3c9f66162741eb77a47896b8f6b059d6dadb838d2455bd88d90ddf08a529e3/diff,workdir=/mnt/data/docker/overlay2/4d3c9f66162741eb77a47896b8f6b059d6dadb838d2455bd88d90ddf08a529e3/work)
overlay on /var/lib/docker/overlay2/4d3c9f66162741eb77a47896b8f6b059d6dadb838d2455bd88d90ddf08a529e3/merged type overlay (rw,relatime,lowerdir=/mnt/data/docker/overlay2/l/MBBRNHIY7S3EM5ZLPEP2K3TAK2:/mnt/data/docker/overlay2/l/AB2UKFFDJNHEPVLVMLOOBFILYW:/mnt/data/docker/overlay2/l/I6MXKITJ2MSCGWPFUPSV4EYGWI:/mnt/data/docker/overlay2/l/E6OFQZLBXHS4IAEWEZSKJICQ34:/mnt/data/docker/overlay2/l/ZFMCKJSOHF2HGLVEUF6PZ7KEBG:/mnt/data/docker/overlay2/l/IHFKM2RDWNGGUWUKWRJ76KCXP6,upperdir=/mnt/data/docker/overlay2/4d3c9f66162741eb77a47896b8f6b059d6dadb838d2455bd88d90ddf08a529e3/diff,workdir=/mnt/data/docker/overlay2/4d3c9f66162741eb77a47896b8f6b059d6dadb838d2455bd88d90ddf08a529e3/work)
overlay on /mnt/data/docker/overlay2/cef46dec4f72ebb33e14da326237ae20ad9940b1ca15412cca106869a7521d66/merged type overlay (rw,relatime,lowerdir=/mnt/data/docker/overlay2/l/NHGPAJP3Z44NBA2RQMG2QQ4IPX:/mnt/data/docker/overlay2/l/UOFFRURXWZR74TPIMBIBKX5NU3:/mnt/data/docker/overlay2/l/CUNBTAZEQSAUNUZ5NTJWS5UWDT:/mnt/data/docker/overlay2/l/2EDLJ5L4BWNIIJ4XXHMIPU4IIB:/mnt/data/docker/overlay2/l/NGY5DDS3FIV3DG7Q26Z4K67KW4:/mnt/data/docker/overlay2/l/Y6TKKYIX5LIVO5P4QSD37MT2YW,upperdir=/mnt/data/docker/overlay2/cef46dec4f72ebb33e14da326237ae20ad9940b1ca15412cca106869a7521d66/diff,workdir=/mnt/data/docker/overlay2/cef46dec4f72ebb33e14da326237ae20ad9940b1ca15412cca106869a7521d66/work)
overlay on /var/lib/docker/overlay2/cef46dec4f72ebb33e14da326237ae20ad9940b1ca15412cca106869a7521d66/merged type overlay (rw,relatime,lowerdir=/mnt/data/docker/overlay2/l/NHGPAJP3Z44NBA2RQMG2QQ4IPX:/mnt/data/docker/overlay2/l/UOFFRURXWZR74TPIMBIBKX5NU3:/mnt/data/docker/overlay2/l/CUNBTAZEQSAUNUZ5NTJWS5UWDT:/mnt/data/docker/overlay2/l/2EDLJ5L4BWNIIJ4XXHMIPU4IIB:/mnt/data/docker/overlay2/l/NGY5DDS3FIV3DG7Q26Z4K67KW4:/mnt/data/docker/overlay2/l/Y6TKKYIX5LIVO5P4QSD37MT2YW,upperdir=/mnt/data/docker/overlay2/cef46dec4f72ebb33e14da326237ae20ad9940b1ca15412cca106869a7521d66/diff,workdir=/mnt/data/docker/overlay2/cef46dec4f72ebb33e14da326237ae20ad9940b1ca15412cca106869a7521d66/work)
nsfs on /run/docker/netns/4b091fa52980 type nsfs (rw)
nsfs on /run/docker/netns/533979c668ce type nsfs (rw)
overlay on /mnt/data/docker/overlay2/4b510acd908a1568eb4c794a567714a32ec0439afc096597dc7b4ea570b44216/merged type overlay (rw,relatime,lowerdir=/mnt/data/docker/overlay2/l/ZWXVNRJCW7GW2IDG7EIBUGB6V2:/mnt/data/docker/overlay2/l/OQ5WSFU76XT3AK4ROI5X6EBZ5X:/mnt/data/docker/overlay2/l/W5GP7YSXIO4BS3RFDLBC4OJ72X:/mnt/data/docker/overlay2/l/J4FLFDKSSGW3ERD7RHGNNQ2PCO:/mnt/data/docker/overlay2/l/2SYJKWOPC56YLJ2W5RCFCWHLAW:/mnt/data/docker/overlay2/l/UCFHF2SBJMMBA2MIY3XBRXCE5M:/mnt/data/docker/overlay2/l/FP3SZ3VCIEXFAEOWLPDV3UQKWN:/mnt/data/docker/overlay2/l/65KPLBGDFB5IWNP6746XN6X76O:/mnt/data/docker/overlay2/l/XST2KHWWFEV6V4D5LMSWWHVEEE,upperdir=/mnt/data/docker/overlay2/4b510acd908a1568eb4c794a567714a32ec0439afc096597dc7b4ea570b44216/diff,workdir=/mnt/data/docker/overlay2/4b510acd908a1568eb4c794a567714a32ec0439afc096597dc7b4ea570b44216/work)
overlay on /var/lib/docker/overlay2/4b510acd908a1568eb4c794a567714a32ec0439afc096597dc7b4ea570b44216/merged type overlay (rw,relatime,lowerdir=/mnt/data/docker/overlay2/l/ZWXVNRJCW7GW2IDG7EIBUGB6V2:/mnt/data/docker/overlay2/l/OQ5WSFU76XT3AK4ROI5X6EBZ5X:/mnt/data/docker/overlay2/l/W5GP7YSXIO4BS3RFDLBC4OJ72X:/mnt/data/docker/overlay2/l/J4FLFDKSSGW3ERD7RHGNNQ2PCO:/mnt/data/docker/overlay2/l/2SYJKWOPC56YLJ2W5RCFCWHLAW:/mnt/data/docker/overlay2/l/UCFHF2SBJMMBA2MIY3XBRXCE5M:/mnt/data/docker/overlay2/l/FP3SZ3VCIEXFAEOWLPDV3UQKWN:/mnt/data/docker/overlay2/l/65KPLBGDFB5IWNP6746XN6X76O:/mnt/data/docker/overlay2/l/XST2KHWWFEV6V4D5LMSWWHVEEE,upperdir=/mnt/data/docker/overlay2/4b510acd908a1568eb4c794a567714a32ec0439afc096597dc7b4ea570b44216/diff,workdir=/mnt/data/docker/overlay2/4b510acd908a1568eb4c794a567714a32ec0439afc096597dc7b4ea570b44216/work)
nsfs on /run/docker/netns/162f4cb2768d type nsfs (rw)
overlay on /mnt/data/docker/overlay2/4fa62b7c91c854c27700eee135eb4f4dafe9e78fe081b2179274298cc0139a06/merged type overlay (rw,relatime,lowerdir=/mnt/data/docker/overlay2/l/A7FHR4BOJAEDHBBX3IMBT5TZPQ:/mnt/data/docker/overlay2/l/FLITDPDFHCCI42WWRV6OEU2E6H:/mnt/data/docker/overlay2/l/TQ4VA3MKP5PEETZOG7LHVQHIXK:/mnt/data/docker/overlay2/l/ENAUNYLQITAO3TL64HI7SOPNSN:/mnt/data/docker/overlay2/l/H4GSZZ4YMH7VDOWN2IBGKKWTBY:/mnt/data/docker/overlay2/l/NNP5MRCHI74L2W25C54O7QBVJE:/mnt/data/docker/overlay2/l/XST2KHWWFEV6V4D5LMSWWHVEEE,upperdir=/mnt/data/docker/overlay2/4fa62b7c91c854c27700eee135eb4f4dafe9e78fe081b2179274298cc0139a06/diff,workdir=/mnt/data/docker/overlay2/4fa62b7c91c854c27700eee135eb4f4dafe9e78fe081b2179274298cc0139a06/work)
overlay on /var/lib/docker/overlay2/4fa62b7c91c854c27700eee135eb4f4dafe9e78fe081b2179274298cc0139a06/merged type overlay (rw,relatime,lowerdir=/mnt/data/docker/overlay2/l/A7FHR4BOJAEDHBBX3IMBT5TZPQ:/mnt/data/docker/overlay2/l/FLITDPDFHCCI42WWRV6OEU2E6H:/mnt/data/docker/overlay2/l/TQ4VA3MKP5PEETZOG7LHVQHIXK:/mnt/data/docker/overlay2/l/ENAUNYLQITAO3TL64HI7SOPNSN:/mnt/data/docker/overlay2/l/H4GSZZ4YMH7VDOWN2IBGKKWTBY:/mnt/data/docker/overlay2/l/NNP5MRCHI74L2W25C54O7QBVJE:/mnt/data/docker/overlay2/l/XST2KHWWFEV6V4D5LMSWWHVEEE,upperdir=/mnt/data/docker/overlay2/4fa62b7c91c854c27700eee135eb4f4dafe9e78fe081b2179274298cc0139a06/diff,workdir=/mnt/data/docker/overlay2/4fa62b7c91c854c27700eee135eb4f4dafe9e78fe081b2179274298cc0139a06/work)
nsfs on /run/docker/netns/default type nsfs (rw)
overlay on /mnt/data/docker/overlay2/d0099bf2cc1e54c9478af31f901a764205adea1b5adc6647d982d70d0007eb34/merged type overlay (rw,relatime,lowerdir=/mnt/data/docker/overlay2/l/WG76BUY2J7H3ZPI2XROSGQ3VFK:/mnt/data/docker/overlay2/l/HANAO7DLI6BBVQTORBLCMMQQ43:/mnt/data/docker/overlay2/l/N6KHXW3W7R6NLCG5GX6ZX5CCE3:/mnt/data/docker/overlay2/l/L2AS26JHVMTNWTGSZQCRQKAR3W:/mnt/data/docker/overlay2/l/QRI63R2OBH6SN77COHUDABDT7I:/mnt/data/docker/overlay2/l/MSMDEOYATAW6FFPFA6VUMPLOTP:/mnt/data/docker/overlay2/l/MCONYROSSFPBN6ORKTUHRCENYE,upperdir=/mnt/data/docker/overlay2/d0099bf2cc1e54c9478af31f901a764205adea1b5adc6647d982d70d0007eb34/diff,workdir=/mnt/data/docker/overlay2/d0099bf2cc1e54c9478af31f901a764205adea1b5adc6647d982d70d0007eb34/work)
overlay on /var/lib/docker/overlay2/d0099bf2cc1e54c9478af31f901a764205adea1b5adc6647d982d70d0007eb34/merged type overlay (rw,relatime,lowerdir=/mnt/data/docker/overlay2/l/WG76BUY2J7H3ZPI2XROSGQ3VFK:/mnt/data/docker/overlay2/l/HANAO7DLI6BBVQTORBLCMMQQ43:/mnt/data/docker/overlay2/l/N6KHXW3W7R6NLCG5GX6ZX5CCE3:/mnt/data/docker/overlay2/l/L2AS26JHVMTNWTGSZQCRQKAR3W:/mnt/data/docker/overlay2/l/QRI63R2OBH6SN77COHUDABDT7I:/mnt/data/docker/overlay2/l/MSMDEOYATAW6FFPFA6VUMPLOTP:/mnt/data/docker/overlay2/l/MCONYROSSFPBN6ORKTUHRCENYE,upperdir=/mnt/data/docker/overlay2/d0099bf2cc1e54c9478af31f901a764205adea1b5adc6647d982d70d0007eb34/diff,workdir=/mnt/data/docker/overlay2/d0099bf2cc1e54c9478af31f901a764205adea1b5adc6647d982d70d0007eb34/work)
overlay on /mnt/data/docker/overlay2/58e901e3298d7a502180f9f811895c7d709fd541abdd6759ef227ff63693a890/merged type overlay (rw,relatime,lowerdir=/mnt/data/docker/overlay2/l/LAXCKGIQMDKFFDCSIVM5YBYQH3:/mnt/data/docker/overlay2/l/S6T55A223MVRQDDHUKLI5M6ILA:/mnt/data/docker/overlay2/l/BYBFK7JDMVMRE6Y3PGUP66VBSS:/mnt/data/docker/overlay2/l/7PUCEZTP7RHQOCHYM7UO7A7DNY:/mnt/data/docker/overlay2/l/VHJMNMJJGSRBIBW7RLCK73EO22:/mnt/data/docker/overlay2/l/FNHI6IZTQRZMWS236R2B4L57HK:/mnt/data/docker/overlay2/l/CG3YT64QPRLNKDBUWUFD2MCFBW:/mnt/data/docker/overlay2/l/BDDHJVXZO42AS5FQTNZWKN2PDD:/mnt/data/docker/overlay2/l/O42BSJAHG4EHU7HQSO5BWDH5RB:/mnt/data/docker/overlay2/l/P5LOJTTJR2NNXSALEZN7K25CB4:/mnt/data/docker/overlay2/l/MBXHJMLVZSD3IOAISEFNPJX7KF:/mnt/data/docker/overlay2/l/FNNUN5QLIDRX4P2DJRVIUWY3TQ:/mnt/data/docker/overlay2/l/5V4W64LWCWH5H6JJHMR3B6ZOY5:/mnt/data/docker/overlay2/l/7XKVBYTNXPNLHHANKMB7XVFHFP:/mnt/data/docker/overlay2/l/B54CCXEQV6Z6SGUYGOMVAF7UIE:/mnt/data/docker/overlay2/l/IVBCKWHFJ5B272DKMH3EU3QRXJ:/mnt/data/docker/overlay2/l/QQNS77ABHBBC6HDTODKMF2R7ZO:/mnt/data/docker/overlay2/l/3CRXCPFQBKDQO7Q2EKUG5SB5AG:/mnt/data/docker/overlay2/l/6BLX7X7VCZVHPNC5XCYH5Y6EJZ:/mnt/data/docker/overlay2/l/5RYJJZZPLAA4DHXQ2UI7FD4Z75:/mnt/data/docker/overlay2/l/6LJ6HWWELHIZAQH52LE57N2DNK:/mnt/data/docker/overlay2/l/GX4BBXVPVATEZYFUF7HSC4N6VC:/mnt/data/docker/overlay2/l/QHIOKRBLCKEMF4CGP6RIFGP3XZ:/mnt/data/docker/overlay2/l/VAXHKIPQHRVNMVMARU6CAWIM7O:/mnt/data/docker/overlay2/l/BHTJONSASM6Q4KKWU3NSZCUE5Y,upperdir=/mnt/data/docker/overlay2/58e901e3298d7a502180f9f811895c7d709fd541abdd6759ef227ff63693a890/diff,workdir=/mnt/data/docker/overlay2/58e901e3298d7a502180f9f811895c7d709fd541abdd6759ef227ff63693a890/work)
overlay on /var/lib/docker/overlay2/58e901e3298d7a502180f9f811895c7d709fd541abdd6759ef227ff63693a890/merged type overlay (rw,relatime,lowerdir=/mnt/data/docker/overlay2/l/LAXCKGIQMDKFFDCSIVM5YBYQH3:/mnt/data/docker/overlay2/l/S6T55A223MVRQDDHUKLI5M6ILA:/mnt/data/docker/overlay2/l/BYBFK7JDMVMRE6Y3PGUP66VBSS:/mnt/data/docker/overlay2/l/7PUCEZTP7RHQOCHYM7UO7A7DNY:/mnt/data/docker/overlay2/l/VHJMNMJJGSRBIBW7RLCK73EO22:/mnt/data/docker/overlay2/l/FNHI6IZTQRZMWS236R2B4L57HK:/mnt/data/docker/overlay2/l/CG3YT64QPRLNKDBUWUFD2MCFBW:/mnt/data/docker/overlay2/l/BDDHJVXZO42AS5FQTNZWKN2PDD:/mnt/data/docker/overlay2/l/O42BSJAHG4EHU7HQSO5BWDH5RB:/mnt/data/docker/overlay2/l/P5LOJTTJR2NNXSALEZN7K25CB4:/mnt/data/docker/overlay2/l/MBXHJMLVZSD3IOAISEFNPJX7KF:/mnt/data/docker/overlay2/l/FNNUN5QLIDRX4P2DJRVIUWY3TQ:/mnt/data/docker/overlay2/l/5V4W64LWCWH5H6JJHMR3B6ZOY5:/mnt/data/docker/overlay2/l/7XKVBYTNXPNLHHANKMB7XVFHFP:/mnt/data/docker/overlay2/l/B54CCXEQV6Z6SGUYGOMVAF7UIE:/mnt/data/docker/overlay2/l/IVBCKWHFJ5B272DKMH3EU3QRXJ:/mnt/data/docker/overlay2/l/QQNS77ABHBBC6HDTODKMF2R7ZO:/mnt/data/docker/overlay2/l/3CRXCPFQBKDQO7Q2EKUG5SB5AG:/mnt/data/docker/overlay2/l/6BLX7X7VCZVHPNC5XCYH5Y6EJZ:/mnt/data/docker/overlay2/l/5RYJJZZPLAA4DHXQ2UI7FD4Z75:/mnt/data/docker/overlay2/l/6LJ6HWWELHIZAQH52LE57N2DNK:/mnt/data/docker/overlay2/l/GX4BBXVPVATEZYFUF7HSC4N6VC:/mnt/data/docker/overlay2/l/QHIOKRBLCKEMF4CGP6RIFGP3XZ:/mnt/data/docker/overlay2/l/VAXHKIPQHRVNMVMARU6CAWIM7O:/mnt/data/docker/overlay2/l/BHTJONSASM6Q4KKWU3NSZCUE5Y,upperdir=/mnt/data/docker/overlay2/58e901e3298d7a502180f9f811895c7d709fd541abdd6759ef227ff63693a890/diff,workdir=/mnt/data/docker/overlay2/58e901e3298d7a502180f9f811895c7d709fd541abdd6759ef227ff63693a890/work)
overlay on /mnt/data/docker/overlay2/1f35ab57bba0e493197bf4ba487a87f653e6b8f96e317c673069b9167969f95e/merged type overlay (rw,relatime,lowerdir=/mnt/data/docker/overlay2/l/G2KWTZF3U74P4X44YNUGLOR2JO:/mnt/data/docker/overlay2/l/LXXGGF7GPKGUWOPO2VWXW5MEZN:/mnt/data/docker/overlay2/l/UHVHVPUVJJYLY5Z7NA66PURYFF:/mnt/data/docker/overlay2/l/UMMCYLYIZ5B5M6D5YJK3JTHJGB:/mnt/data/docker/overlay2/l/4IGB7XD7UYEVO2BUXJPFP74WF3:/mnt/data/docker/overlay2/l/ZQMRBR5A7MMCWY2FTNJ57NPYBD:/mnt/data/docker/overlay2/l/ZANXWFJKX7SCNKFVGV75RNIILI:/mnt/data/docker/overlay2/l/WL2CXYN2A5ZO343LNLMWGFPYLZ,upperdir=/mnt/data/docker/overlay2/1f35ab57bba0e493197bf4ba487a87f653e6b8f96e317c673069b9167969f95e/diff,workdir=/mnt/data/docker/overlay2/1f35ab57bba0e493197bf4ba487a87f653e6b8f96e317c673069b9167969f95e/work)
overlay on /var/lib/docker/overlay2/1f35ab57bba0e493197bf4ba487a87f653e6b8f96e317c673069b9167969f95e/merged type overlay (rw,relatime,lowerdir=/mnt/data/docker/overlay2/l/G2KWTZF3U74P4X44YNUGLOR2JO:/mnt/data/docker/overlay2/l/LXXGGF7GPKGUWOPO2VWXW5MEZN:/mnt/data/docker/overlay2/l/UHVHVPUVJJYLY5Z7NA66PURYFF:/mnt/data/docker/overlay2/l/UMMCYLYIZ5B5M6D5YJK3JTHJGB:/mnt/data/docker/overlay2/l/4IGB7XD7UYEVO2BUXJPFP74WF3:/mnt/data/docker/overlay2/l/ZQMRBR5A7MMCWY2FTNJ57NPYBD:/mnt/data/docker/overlay2/l/ZANXWFJKX7SCNKFVGV75RNIILI:/mnt/data/docker/overlay2/l/WL2CXYN2A5ZO343LNLMWGFPYLZ,upperdir=/mnt/data/docker/overlay2/1f35ab57bba0e493197bf4ba487a87f653e6b8f96e317c673069b9167969f95e/diff,workdir=/mnt/data/docker/overlay2/1f35ab57bba0e493197bf4ba487a87f653e6b8f96e317c673069b9167969f95e/work)
overlay on /mnt/data/docker/overlay2/9317c1dc8e5106e12b69ad43fc9891bd0060cf7ad8dc69d4832bb296053c5674/merged type overlay (rw,relatime,lowerdir=/mnt/data/docker/overlay2/l/P53ATCE7OS22YOWMATPNW6UYSN:/mnt/data/docker/overlay2/l/WIWXMUDD3HP3JQER2JCSXTCSFD:/mnt/data/docker/overlay2/l/OANIVTTZQCLULENVJJVNS2GSHF:/mnt/data/docker/overlay2/l/HCKYV6XE3TA27R4RUFT5VZEXKZ:/mnt/data/docker/overlay2/l/GVKFVVREP4YYELLFM6ISHRYEFS:/mnt/data/docker/overlay2/l/5BGEAAVE5WFRYVTYYEPIRO7QXC:/mnt/data/docker/overlay2/l/WWSRCUBML2ASAMBFRMVT3FAHAQ:/mnt/data/docker/overlay2/l/GE3V2WYKNOAO6ACIGMMR2ET6E2:/mnt/data/docker/overlay2/l/OCN27NUTCAAMYGVI3FPDVAOGL5:/mnt/data/docker/overlay2/l/BLF6667HN5PKSQZT3PUPH7ZAXA:/mnt/data/docker/overlay2/l/KC6CBQWPRYRPH37ZH6COGRJ35R:/mnt/data/docker/overlay2/l/6CTCUFNS2LHX2WQPSVBNTKR336:/mnt/data/docker/overlay2/l/LGHGJ43DHMYMPZQY474VTHNE75:/mnt/data/docker/overlay2/l/WXQB6WSUUMFNJYQC4CKA5YOCDR,upperdir=/mnt/data/docker/overlay2/9317c1dc8e5106e12b69ad43fc9891bd0060cf7ad8dc69d4832bb296053c5674/diff,workdir=/mnt/data/docker/overlay2/9317c1dc8e5106e12b69ad43fc9891bd0060cf7ad8dc69d4832bb296053c5674/work)
overlay on /var/lib/docker/overlay2/9317c1dc8e5106e12b69ad43fc9891bd0060cf7ad8dc69d4832bb296053c5674/merged type overlay (rw,relatime,lowerdir=/mnt/data/docker/overlay2/l/P53ATCE7OS22YOWMATPNW6UYSN:/mnt/data/docker/overlay2/l/WIWXMUDD3HP3JQER2JCSXTCSFD:/mnt/data/docker/overlay2/l/OANIVTTZQCLULENVJJVNS2GSHF:/mnt/data/docker/overlay2/l/HCKYV6XE3TA27R4RUFT5VZEXKZ:/mnt/data/docker/overlay2/l/GVKFVVREP4YYELLFM6ISHRYEFS:/mnt/data/docker/overlay2/l/5BGEAAVE5WFRYVTYYEPIRO7QXC:/mnt/data/docker/overlay2/l/WWSRCUBML2ASAMBFRMVT3FAHAQ:/mnt/data/docker/overlay2/l/GE3V2WYKNOAO6ACIGMMR2ET6E2:/mnt/data/docker/overlay2/l/OCN27NUTCAAMYGVI3FPDVAOGL5:/mnt/data/docker/overlay2/l/BLF6667HN5PKSQZT3PUPH7ZAXA:/mnt/data/docker/overlay2/l/KC6CBQWPRYRPH37ZH6COGRJ35R:/mnt/data/docker/overlay2/l/6CTCUFNS2LHX2WQPSVBNTKR336:/mnt/data/docker/overlay2/l/LGHGJ43DHMYMPZQY474VTHNE75:/mnt/data/docker/overlay2/l/WXQB6WSUUMFNJYQC4CKA5YOCDR,upperdir=/mnt/data/docker/overlay2/9317c1dc8e5106e12b69ad43fc9891bd0060cf7ad8dc69d4832bb296053c5674/diff,workdir=/mnt/data/docker/overlay2/9317c1dc8e5106e12b69ad43fc9891bd0060cf7ad8dc69d4832bb296053c5674/work)
nsfs on /run/docker/netns/3ec69f6445ed type nsfs (rw)
overlay on /mnt/data/docker/overlay2/5c7f4ed436aff980fcddc08428015d5fbd26c65de760acc18e3307e44b76fd54/merged type overlay (rw,relatime,lowerdir=/mnt/data/docker/overlay2/l/QRGL3KX77QYHPEFZCY47V4MSH6:/mnt/data/docker/overlay2/l/XB2JSRMLKSCHF2AHJLYPMHW3UO:/mnt/data/docker/overlay2/l/JTQXJCCK3HYMR7G4FVIQKXK4FT:/mnt/data/docker/overlay2/l/PZK5CO5NG4EKDVGHQ5GGDU7B45:/mnt/data/docker/overlay2/l/YCR2KGQDK5ZY6HZG3JLZNE7NLA:/mnt/data/docker/overlay2/l/YYBIXH2BWV5ZWDXE53SXXSMWDX:/mnt/data/docker/overlay2/l/2EDLJ5L4BWNIIJ4XXHMIPU4IIB:/mnt/data/docker/overlay2/l/NGY5DDS3FIV3DG7Q26Z4K67KW4:/mnt/data/docker/overlay2/l/Y6TKKYIX5LIVO5P4QSD37MT2YW,upperdir=/mnt/data/docker/overlay2/5c7f4ed436aff980fcddc08428015d5fbd26c65de760acc18e3307e44b76fd54/diff,workdir=/mnt/data/docker/overlay2/5c7f4ed436aff980fcddc08428015d5fbd26c65de760acc18e3307e44b76fd54/work)
overlay on /var/lib/docker/overlay2/5c7f4ed436aff980fcddc08428015d5fbd26c65de760acc18e3307e44b76fd54/merged type overlay (rw,relatime,lowerdir=/mnt/data/docker/overlay2/l/QRGL3KX77QYHPEFZCY47V4MSH6:/mnt/data/docker/overlay2/l/XB2JSRMLKSCHF2AHJLYPMHW3UO:/mnt/data/docker/overlay2/l/JTQXJCCK3HYMR7G4FVIQKXK4FT:/mnt/data/docker/overlay2/l/PZK5CO5NG4EKDVGHQ5GGDU7B45:/mnt/data/docker/overlay2/l/YCR2KGQDK5ZY6HZG3JLZNE7NLA:/mnt/data/docker/overlay2/l/YYBIXH2BWV5ZWDXE53SXXSMWDX:/mnt/data/docker/overlay2/l/2EDLJ5L4BWNIIJ4XXHMIPU4IIB:/mnt/data/docker/overlay2/l/NGY5DDS3FIV3DG7Q26Z4K67KW4:/mnt/data/docker/overlay2/l/Y6TKKYIX5LIVO5P4QSD37MT2YW,upperdir=/mnt/data/docker/overlay2/5c7f4ed436aff980fcddc08428015d5fbd26c65de760acc18e3307e44b76fd54/diff,workdir=/mnt/data/docker/overlay2/5c7f4ed436aff980fcddc08428015d5fbd26c65de760acc18e3307e44b76fd54/work)
nsfs on /run/docker/netns/c914d086e678 type nsfs (rw)
# 

Let's try `mount -o remount,rw /dev/mmcblk0p5 /`:

```
# mount -o remount,rw /dev/mmcblk0p5 /
# vi /usr/bin/btuart
# chmod u+w /usr/bin/btuart
chmod: /usr/bin/btuart: Read-only file system
# mount
/dev/mmcblk0p5 on / type squashfs (ro,relatime)
```

OK. So it looks like squashfs is not writable (https://unix.stackexchange.com/questions/205108/remount-squashfs-root-filessytem-read-write). It seems that to make that change, you'd need to reflash that partition on your SD card. Now, that should be possible (as long as your partition table leaves enough space for it). But that would require some preparations:

  1. Having access to a Linux box (a VM should be fine) that can read the SD card
  2. Backing up of the current SD card (in case we mess anything up)
  3. Verifying that the partition table has enough space for an updated squashfs partition (in theory, it shouldn't take more space - we're making a tiny change to a file, not adding new data - but just to make sure we're not wasting our time).

If you're up to it, let me know, I'll be happy to help.
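
For the record, the unpack/edit/repack cycle those preparations lead to might look roughly like this. It's an untested sketch: the image filename, the partition device, and editing `/usr/bin/btuart` to lower the baud rate are assumptions you'd adapt to your own SD card layout, and it runs on the Linux box from step 1, never on the live system:

```shell
#!/bin/sh
# Sketch of editing a file inside a read-only squashfs root partition.
# IMG and DEV are placeholders -- adjust them for your SD card.
set -eu

IMG=rootfs.squashfs   # partition dump, e.g. dd if=/dev/mmcblk0p5 of=rootfs.squashfs
DEV=/dev/mmcblk0p5    # the squashfs root partition (assumption)

if command -v unsquashfs >/dev/null 2>&1 && [ -f "$IMG" ]; then
    unsquashfs -d rootfs "$IMG"            # unpack the filesystem
    ${EDITOR:-vi} rootfs/usr/bin/btuart    # lower the baud rate here
    mksquashfs rootfs new.squashfs -noappend   # repack
    ls -l "$IMG" new.squashfs              # check the new image still fits
    # Only then write it back (the destructive step, step 2's backup first!):
    # dd if=new.squashfs of="$DEV" bs=4M conv=fsync
fi
```

The `dd` write-back is left commented out on purpose: verify the new image is no larger than the partition (step 3 above) before running it.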

I think I had better call it, hehe. Thanks so much for your help so far, but to be honest if I have to go down this path I would rather just move off Hass.io and onto a virtualenv install. Really appreciate your help so far though!

Has anyone done a pull request to reduce the baud rate? Or is this still only deemed a workaround?

It's just a workaround, I'm afraid - the problem is only relevant to certain editions of Pi 3, as far as I understand - since later editions have hardware flow control, and thus avoid overflow (which is the underlying problem that causes the Bluetooth device to halt completely, and is solved by a lower baud rate). Therefore, I don't think they'll merge it into the mainline firmware, and even if they do - it'll take a long time to reach HassOS, I'm afraid...

I'd strongly consider moving to a virtualenv/docker install (that's what I'm running currently), and you could even install Hass directly on Raspbian (should be pretty easy) instead of using HassOS.

@aronsky Sure? I am running it on an ODROID XU4 which would mean it is affected in the same way. I would have tried to modify the baudrate but I cannot find such a (config) file.

How's your ODROID XU4 provisioned - did you flash HassOS, or did you install Hass.io over a regular Linux distro?

Hass.io over Ubuntu 18.04 using a USB Bluetooth stick πŸ€”

I'm not sure how USB Bluetooth sticks are configured, and even whether the underlying problem is the same... However, did you go over the commands I recommended to @talondnb in the other thread, starting at https://github.com/home-assistant/core/issues/31657#issuecomment-593074076?

If the problem is an overflow due to the lack of hardware flow control, we should be able to figure out where the baud rate is set in order to lower it.

@aronsky thanks for your super quick support! Both `ps -ef | grep hciattach` and `ps -ef | grep btattach` are empty. However, dmesg contains (among a lot of other stuff):

```
[149364.467729] usb usb2-port1: disabled by hub (EMI?), re-enabling...
[149364.472994] usb 2-1: USB disconnect, device number 2
[149365.579310] usb 2-1: new full-speed USB device number 3 using exynos-ohci
[149365.814584] usb 2-1: New USB device found, idVendor=0a5c, idProduct=21e8
[149365.814613] usb 2-1: New USB device strings: Mfr=1, Product=2, SerialNumber=3
[149365.814629] usb 2-1: Product: BCM20702A0
[149365.814646] usb 2-1: Manufacturer: Broadcom Corp
[149365.814661] usb 2-1: SerialNumber: 5CF37081D855
[149365.935536] Bluetooth: hci0: BCM: chip id 63
[149365.937530] Bluetooth: hci0: BCM: features 0x07
[149365.954531] Bluetooth: hci0: hassio
[149365.956629] Bluetooth: hci0: BCM20702A1 (001.002.014) build 0000
[149365.962390] bluetooth hci0: Direct firmware load for brcm/BCM20702A1-0a5c-21e8.hcd failed with error -2
[149365.962402] Bluetooth: hci0: BCM: Patch brcm/BCM20702A1-0a5c-21e8.hcd not found
[149366.941727] usb usb2-port1: disabled by hub (EMI?), re-enabling...
[149366.946838] usb 2-1: USB disconnect, device number 3
[149367.671265] usb 2-1: new full-speed USB device number 4 using exynos-ohci
[149367.906499] usb 2-1: New USB device found, idVendor=0a5c, idProduct=21e8
[149367.906533] usb 2-1: New USB device strings: Mfr=1, Product=2, SerialNumber=3
[149367.906550] usb 2-1: Product: BCM20702A0
[149367.906566] usb 2-1: Manufacturer: Broadcom Corp
[149367.906582] usb 2-1: SerialNumber: 5CF37081D855
[149368.027453] Bluetooth: hci0: BCM: chip id 63
[149368.029435] Bluetooth: hci0: BCM: features 0x07
[149368.046475] Bluetooth: hci0: hassio
[149368.048518] Bluetooth: hci0: BCM20702A1 (001.002.014) build 0000
[149368.048629] bluetooth hci0: Direct firmware load for brcm/BCM20702A1-0a5c-21e8.hcd failed with error -2
[149368.048652] Bluetooth: hci0: BCM: Patch brcm/BCM20702A1-0a5c-21e8.hcd not found
[152662.441493] usb usb2-port1: disabled by hub (EMI?), re-enabling...
[152662.446492] usb 2-1: USB disconnect, device number 4
[152663.554320] usb 2-1: new full-speed USB device number 5 using exynos-ohci
[152663.793503] usb 2-1: New USB device found, idVendor=0a5c, idProduct=21e8
[152663.793533] usb 2-1: New USB device strings: Mfr=1, Product=2, SerialNumber=3
[152663.793549] usb 2-1: Product: BCM20702A0
[152663.793565] usb 2-1: Manufacturer: Broadcom Corp
[152663.793581] usb 2-1: SerialNumber: 5CF37081D855
[152663.914614] Bluetooth: hci0: BCM: chip id 63
[152663.916557] Bluetooth: hci0: BCM: features 0x07
[152663.933547] Bluetooth: hci0: hassio
[152663.935538] Bluetooth: hci0: BCM20702A1 (001.002.014) build 0000
[152663.935617] bluetooth hci0: Direct firmware load for brcm/BCM20702A1-0a5c-21e8.hcd failed with error -2
[152663.935634] Bluetooth: hci0: BCM: Patch brcm/BCM20702A1-0a5c-21e8.hcd not found
[152665.430069] usb usb2-port1: disabled by hub (EMI?), re-enabling...
[152665.435381] usb 2-1: USB disconnect, device number 5
[152665.782533] usb 2-1: new full-speed USB device number 6 using exynos-ohci
[152669.494421] usb 2-1: new full-speed USB device number 7 using exynos-ohci
[152669.729383] usb 2-1: New USB device found, idVendor=0a5c, idProduct=21e8
[152669.729405] usb 2-1: New USB device strings: Mfr=1, Product=2, SerialNumber=3
[152669.729418] usb 2-1: Product: BCM20702A0
[152669.729431] usb 2-1: Manufacturer: Broadcom Corp
[152669.729444] usb 2-1: SerialNumber: 5CF37081D855
[152669.846516] Bluetooth: hci0: BCM: chip id 63
[152669.848452] Bluetooth: hci0: BCM: features 0x07
[152669.866388] Bluetooth: hci0: BCM20702A
[152669.868379] Bluetooth: hci0: BCM20702A1 (001.002.014) build 0000
[152669.868428] bluetooth hci0: Direct firmware load for brcm/BCM20702A1-0a5c-21e8.hcd failed with error -2
[152669.868439] Bluetooth: hci0: BCM: Patch brcm/BCM20702A1-0a5c-21e8.hcd not found
[152671.063984] usb usb2-port1: disabled by hub (EMI?), re-enabling...
[152671.069079] usb 2-1: USB disconnect, device number 7
[152675.122480] usb 2-1: new full-speed USB device number 8 using exynos-ohci
[152675.357478] usb 2-1: New USB device found, idVendor=0a5c, idProduct=21e8
[152675.357500] usb 2-1: New USB device strings: Mfr=1, Product=2, SerialNumber=3
[152675.357514] usb 2-1: Product: BCM20702A0
[152675.357527] usb 2-1: Manufacturer: Broadcom Corp
[152675.357540] usb 2-1: SerialNumber: 5CF37081D855
[152675.474595] Bluetooth: hci0: BCM: chip id 63
[152675.476549] Bluetooth: hci0: BCM: features 0x07
[152675.493467] Bluetooth: hci0: BCM20702A
[152675.495535] Bluetooth: hci0: BCM20702A1 (001.002.014) build 0000
[152675.495673] bluetooth hci0: Direct firmware load for brcm/BCM20702A1-0a5c-21e8.hcd failed with error -2
[152675.495702] Bluetooth: hci0: BCM: Patch brcm/BCM20702A1-0a5c-21e8.hcd not found
[154025.510408] udevd[6]: starting version 3.2.8
[154025.534759] udevd[7]: starting eudev-3.2.8
[162350.040770] NET: Registered protocol family 38
[164907.714263] udevd[6]: starting version 3.2.8
[164907.735232] udevd[7]: starting eudev-3.2.8
[172025.250963] udevd[6]: starting version 3.2.8
[172025.274289] udevd[7]: starting eudev-3.2.8
[237749.901263] hassio: port 10(vethdd86874) entered disabled state
[237749.901810] vethe7cad98: renamed from eth0
```

Don't know why but the formatting of the log got screwed up somehow :thinking:

Does this help?

Can't find anything helpful there. Do you have the output of systemctl status?

Do you want the entire output (which is quite a lot: 264 lines)? :see_no_evil:

For a start, here's the output of `systemctl status bluetooth`:

```
● bluetooth.service - Bluetooth service
   Loaded: loaded (/lib/systemd/system/bluetooth.service; enabled; vendor preset: enabled)
   Active: active (running) since Sat 2020-03-07 00:27:56 UTC; 3 days ago
     Docs: man:bluetoothd(8)
 Main PID: 349 (bluetoothd)
   Status: "Running"
   CGroup: /system.slice/bluetooth.service
           └─349 /usr/lib/bluetooth/bluetoothd

Mar 07 00:27:56 hassio systemd[1]: Starting Bluetooth service...
Mar 07 00:27:56 hassio bluetoothd[349]: Bluetooth daemon 5.48
Mar 07 00:27:56 hassio bluetoothd[349]: Starting SDP server
Mar 07 00:27:56 hassio bluetoothd[349]: Bluetooth management interface 1.14 initialized
Mar 07 00:27:56 hassio systemd[1]: Started Bluetooth service.
Mar 09 00:00:05 hassio bluetoothd[349]: No cache for C4:7C:8D:6B:22:5E
Mar 09 00:00:10 hassio bluetoothd[349]: No cache for C4:7C:8D:65:5C:6A
Mar 09 00:00:15 hassio bluetoothd[349]: No cache for C4:7C:8D:65:5C:33
```

Nah, the Bluetooth service itself isn't the one that's interesting to me - are there any other interesting entries there regarding it? For example, in the previously posted output, the service of interest to me was bluetooth-bcm43xx.service - but I suppose you won't have that.

A bit complicated as I am connected from my mobile phone via Termux and a VPN :laughing:

I searched for "blue" and just found

```
β”œβ”€bluetooth.service
β”‚ └─349 /usr/lib/bluetooth/bluetoothd
```

Nothing else about "blue" / "bluetooth", especially no bluetooth-bcm43xx.service :-(

EDIT: Anything specific you want me to search for? :see_no_evil:

Based on a few searches I made, it looks like Bluetooth over USB works differently - not via serial, so baud rate and flow control are probably irrelevant... I'm afraid that might be a different problem, maybe within miflora itself (since as you mentioned earlier, the adapter itself works).

One option you could consider, in case you have a spare ESP8266/ESP32 lying around, is to flash it with ESPHome and let it handle all the BLE connections with miflora, while reporting to Home Assistant over WiFi.

Thanks for your support anyway! The strange thing is that it used to work for nearly a year without issues (with the same adapter), previously on a RPi2 (with that adapter as well due to the lack of bluetooth). I thought about running a script on the system, periodically resetting the bluetooth adapter (power off -> power on) which seemed to work for me (see my post above).

But yes, using dedicated hardware as a bluetooth/plant gateway would be another option, although this would require running another device permanently (which I would like to avoid due to power consumption, already having a lot of devices around, and my ODROID is still close enough to my mi flora sensors) :see_no_evil: .

If resetting the adapter works for you, it's a good workaround.
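
That manual power off / power on workaround could be automated with a small watchdog script along these lines. This is only a sketch (not tested on HassOS); the `Powered: yes` check, the `--watch` flag, and the five-minute interval are assumptions:

```shell
#!/bin/sh
# Hypothetical watchdog: power-cycle the Bluetooth adapter when it stops
# reporting as powered. Assumes BlueZ's bluetoothctl is on PATH.

reset_adapter() {
    # Mirrors the manual workaround from earlier in the thread.
    bluetoothctl power off
    sleep 2
    bluetoothctl power on
}

adapter_ok() {
    # Treat the adapter as healthy when bluetoothctl reports it powered.
    bluetoothctl show | grep -q "Powered: yes"
}

# Guard: only loop when bluetoothctl exists and --watch was requested,
# so the script stays inert otherwise.
if command -v bluetoothctl >/dev/null 2>&1 && [ "${1:-}" = "--watch" ]; then
    while true; do
        adapter_ok || reset_adapter
        sleep 300   # check every five minutes
    done
fi
```

It could be started with `--watch` from a cron `@reboot` entry or a systemd unit.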

Same issue here, worked perfectly for 1 year+, now broken.

Debian, Docker, HASS (USB Bluetooth).

Can we await confirmation of the fix before closing the issue please?

I am also very sceptical whether #31156 really solves this issue...

IMHO there are two different bugs:

  1. since HA 0.103: bad connection which results in the graph described in the original bug message. This should be solved with #31156
  2. since HA 0.104: bluetooth crashes and doesn't recover even if HA is stopped (please correct me if it started in 0.103). This will probably not be solved by my patch. I bet this is caused through:
    a) Update BlueZ 5.52 see: https://github.com/home-assistant/operating-system/releases/tag/3.8
    b) Some changes either in the miflora code or in HA resulting in multiple queries at the same time or too many queries are breaking bluez. related to: https://github.com/home-assistant/core/issues/30147

This works for me:
odroid xu4 + usb bt dongle
HA 0.105
dietpi / kernel 4.14.66+ / bluez 5.43-2+deb9u1
1 miflora sensor 2 meters away, so good connection. I have a 2nd one with a hardware/firmware bug; it takes one new battery every 4 weeks... I will buy another one.

https://github.com/home-assistant/core/pull/31156 did not resolve the issue. I'm running "former HASSIO" on RPI 4.

Still broken here too :(

i have the same issues :-(

I also had the issue yesterday and then did an update to 0.108.5 and supervisor 217.
Using the terminal, `bluetoothctl` works, and `scan on` works.
So it seems to be fixed...

Yes, the issue isn't with the BLE controller in my case (it's always worked), it's the fact that the MiFlora integration does not appear to work.

@DanielXYZ2000
I never used the terminal in HA.
When I type in `bluetoothctl`, just this message comes up:
Bild 2

@Dreamoffice Which terminal add-on are you using? Use the community one, SSH & Web Terminal; that should contain this command.

I'm using this :-(
Bild 3

OK, perfect, with the other terminal it is working, but the sensors are still unavailable.
What does the MAC address of your sensor look like?
Did you also add the BLE prefix,
like BLE_XX:XX:XX:XX?

@DanielXYZ2000

```yaml
- platform: miflora
  mac: C4:7C:8F:61:2C:2A
```

like this?

On the console you should get:

```
~ $ bluetoothctl
Agent registered
[CHG] Controller ..... XX:XX:... Discovering: yes
[bluetooth]# scan on
Discovery started
[CHG] ....
[NEW] Device C4:X:XX:...XX:AC Flower care
```

In my config it looks like:

```yaml
sensor:
  - platform:
    mac: 'C4:XX:XX:XX:XX:AC' #nr: 1
    name: Palme
    force_update: true
    median: 1
    adapter: hci0
    monitored_conditions:
      - moisture
      - light
      - temperature
      - conductivity
      - battery
```
You don't need BLE_ as prefix

@DanielXYZ2000
Is this working in your setup?

```yaml
sensor:
  - platform:
```

Nothing after platform?

I don't know why, but the sensors are still unavailable.
In the Flower Care app I see all values,
and in the bluetooth scan I see the sensors,
but not in Home Assistant.

Oops, yes, platform should be miflora:

```yaml
sensor:
  - platform: miflora
    mac: 'C4:XX:XX:XX:XX:AC' #nr: 1
    ....
```

I also use the plant component.
I also know that it can take a while till the sensor shows up.

```yaml
# https://home-assistant.io/components/plant/
plant:
  Palme:
    sensors:
      moisture: sensor.palme_moisture
      battery: sensor.palme_battery
      temperature: sensor.palme_temperature
      conductivity: sensor.palme_conductivity
      brightness: sensor.palme_brightness
    min_moisture: 20
    max_moisture: 55
    min_battery: 10
    min_conductivity: 100
    min_temperature: 11
```

I am running hass.io on an RPi 3 and have also found my miflora sensors becoming unavailable a few minutes after hass.io is rebooted, starting with 0.104. With 0.108.3 I am still having the problem.
@xPhantomNL as far as I understood, lowering the baud rate worked for you. I read the entire thread, but could not spot how to lower the baud rate in hass.io. Is there an edit I could make to the configuration.yaml?
