Re: Bluetooth mesh sample (mesh) - Hard Fault


laczenJMS
 

Hi Johan,

I increased the stack size to CONFIG_BT_RX_STACK_SIZE=2048 (I don't
understand why the real stack size is then reported as 2348), but it
now gives a new hard fault, this time with assert: '0' failed:

Kernel stacks:
main (real size 512): unused 220 usage 292 / 512 (57 %)
idle (real size 256): unused 200 usage 56 / 256 (21 %)
interrupt (real size 2048): unused 1640 usage 408 / 2048 (19 %)
workqueue (real size 2048): unused 1668 usage 380 / 2048 (18 %)
prio recv thread stack (real size 748): unused 440 usage 308 / 748 (41 %)
recv thread stack (real size 2348): unused 308 usage 2040 / 2348 (86 %)
[bt] [ERR] isr_rx_conn_pkt_ctrl: assert: '0' failed
***** HARD FAULT *****
Executing thread ID (thread): 0x200023dc
Faulting instruction address: 0x12d50
Fatal fault in ISR! Spinning...
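For reference, the numbers in the "Kernel stacks" report above are internally consistent: usage = real size - unused, and the percentage is usage over real size, truncated. A short sketch checking this against the recv thread line (the regex assumes the exact line format printed above):

```python
import re

# One line of the Zephyr "Kernel stacks" report quoted above.
line = "recv thread stack (real size 2348): unused 308 usage 2040 / 2348 (86 %)"

m = re.match(
    r"(.+) \(real size (\d+)\): unused (\d+) usage (\d+) / (\d+) \((\d+) %\)",
    line,
)
name = m.group(1)
real, unused, usage, total, pct = map(int, m.groups()[1:])

# usage = real size - unused; percentage is truncated integer division.
assert total == real
assert usage == real - unused          # 2040 == 2348 - 308
assert pct == usage * 100 // real      # 86 == 2040 * 100 // 2348

print(f"{name}: {usage}/{real} bytes used, {real - usage} bytes headroom")
```

With only 32 bytes of headroom in the earlier log, the report alone makes the overflow diagnosis plausible.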

Kind regards,

Jehudi

2017-09-06 11:35 GMT+02:00 Johan Hedberg <johan.hedberg@...>:

Hi Jehudi,

On Wed, Sep 06, 2017, Laczen JMS wrote:
Running the Bluetooth mesh example (samples/bluetooth/mesh/) on an
nrf51822 results in a hard fault. The provisioner is meshctl (bluez).
After successful provisioning I disconnect from the mesh; when
connecting again, the hard fault appears:

Kernel stacks:
main (real size 512): unused 228 usage 284 / 512 (55 %)
idle (real size 256): unused 200 usage 56 / 256 (21 %)
interrupt (real size 2048): unused 1656 usage 392 / 2048 (19 %)
workqueue (real size 1024): unused 676 usage 348 / 1024 (33 %)
prio recv thread stack (real size 448): unused 144 usage 304 / 448 (67 %)
recv thread stack (real size 1396): unused 32 usage 1364 / 1396 (97 %)
***** HARD FAULT *****
Executing thread ID (thread): 0x20001f04
Faulting instruction address: 0xf63c
Fatal fault in ISR! Spinning...

If needed I can provide more info/logging...

I'd bet the recv thread stack overflowed: it's already at 97%, with
only 32 bytes unused in the log above. Try increasing its size. The
Kconfig variable is called CONFIG_BT_RX_STACK_SIZE.
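The option goes in the application's prj.conf (a sketch; 2048 is simply the value tried earlier in this thread, not a recommended number, and the right value depends on the build):

```
# prj.conf fragment: enlarge the Bluetooth RX thread stack
CONFIG_BT_RX_STACK_SIZE=2048
```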

Johan
