Compare commits


118 Commits

Author SHA1 Message Date
Vinayak Kariappa Chettimada
4ebecae78a Bluetooth: Controller: Fix connection update interval_us variables
Fix the connection update microsecond interval variable data
type to use 32-bit so that a value up to 2000 seconds, i.e.
a 4-second interval with a peripheral latency of 499, can be stored.

Regression in commit abfe5f17a9 ("Bluetooth: Controller:
1 ms connection").

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
(cherry picked from commit 0a4480ce61)
2025-07-11 11:24:10 -10:00
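The arithmetic behind the data-type change in 4ebecae78a, as a minimal standalone C sketch (the names are illustrative, not the controller's actual variables): the worst case named in the commit message is a 4 s interval with a peripheral latency of 499, i.e. 2,000,000,000 µs, which fits a 32-bit unsigned variable but not a 16-bit one.

```c
#include <stdint.h>

/* Illustrative only: worst case from the commit message above. */
#define CONN_INTERVAL_US   (4U * 1000U * 1000U)  /* 4 s expressed in microseconds */
#define PERIPHERAL_LATENCY 499U

static uint32_t max_event_gap_us(void)
{
	/* (1 + 499) * 4,000,000 us = 2,000,000,000 us: fits in uint32_t
	 * (max ~4.29e9) but would truncate in a 16-bit variable.
	 */
	return (1U + PERIPHERAL_LATENCY) * CONN_INTERVAL_US;
}
```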
Glenn Andrews
5e24e48342 boards: disco_l475_iot1: fix arduino_i2c assignment
0f05f58bf5 assigned the `arduino_i2c` to the incorrect pins for the Arduino UNO connector. This assigns them back.

Signed-off-by: Glenn Andrews <glenn.andrews.42@gmail.com>
(cherry picked from commit a2da571172)
2025-07-10 06:27:54 -10:00
Benjamin Cabé
6753a86f27 doc: _templates: show correct version in sidebar in case of releases
Ensure that version indicator in sidebar is correct in case of releases.

Fixes zephyrproject-rtos/zephyr#91799.

Signed-off-by: Benjamin Cabé <benjamin@zephyrproject.org>
(cherry picked from commit bbfaf8adec)
2025-06-23 13:04:19 -07:00
Fabian Blatz
aabd3a1826 modules: lvgl: Register print callback after lv_init
Move initialization of the print callback handler after calling lv_init, as
the latter zeroes the global callback structure.

Signed-off-by: Fabian Blatz <fabianblatz@gmail.com>
(cherry picked from commit 5cffb8e5a6)
2025-06-05 13:07:45 -07:00
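A minimal ordering sketch of the fix in aabd3a1826, assuming LVGL's `lv_init()` and a version-dependent `lv_log_register_print_cb()` registration API with a single-argument callback (LVGL 8 style) and `LV_USE_LOG` enabled; this is not the Zephyr module's actual glue code:

```c
#include <lvgl.h>
#include <zephyr/sys/printk.h>

/* Forward LVGL log output to the Zephyr console. */
static void lvgl_print_cb(const char *buf)
{
	printk("%s", buf);
}

void lvgl_setup(void)
{
	lv_init();                               /* zeroes global state, including log callbacks */
	lv_log_register_print_cb(lvgl_print_cb); /* register only after lv_init() */
}
```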
Henrik Brix Andersen
5bef9327b0 tests: drivers: virtualization: qemu_kvm_arm64.overlay: fix SPDX license
"UNLICENSED" is not a valid SPDX License Identifier. Remove it from the
list and just keep "Apache-2.0".

Fixes: #89413

Signed-off-by: Henrik Brix Andersen <hebad@vestas.com>
(cherry picked from commit b240f11592)
2025-06-03 21:40:31 -07:00
Jukka Rissanen
7c48241e5e net: http: server: Select POSIX_C_LIB_EXT instead of FNMATCH
The CONFIG_POSIX_C_LIB_EXT will get support for fnmatch() function.
The old CONFIG_FNMATCH is deprecated.

Signed-off-by: Jukka Rissanen <jukka.rissanen@nordicsemi.no>
(cherry picked from commit 0f90affcdf)
2025-06-03 21:39:52 -07:00
Jukka Rissanen
63d789ac0b net: http: server: The detail length of wildcard detail was wrong
The path length of the detail resource was not set properly.

Signed-off-by: Jukka Rissanen <jukka.rissanen@nordicsemi.no>
(cherry picked from commit 33cf7dc78a)
2025-06-03 21:39:52 -07:00
Krzysztof Chruściński
c08390df71 drivers: serial: nrfx_uarte: Fix use of PM_DEVICE_ISR_SAFE
PM_DEVICE_ISR_SAFE shall not be used when non-asynchronous API is used
because RX is disabled in suspend action and that takes relatively
long time. In case of PM_DEVICE_ISR_SAFE it is done with interrupts
disabled. RX is not used at all if disable-rx property is set and in
that case PM_DEVICE_ISR_SAFE can be used.

Added macro which determines if ISR safe mode can be used.

Signed-off-by: Krzysztof Chruściński <krzysztof.chruscinski@nordicsemi.no>
(cherry picked from commit e0f5241b28)
2025-05-23 11:44:33 -07:00
Luis Ubieda
d4374ed2cf sensor: adxl345: Disable Sensor Streaming by default
Have the application enable this feature explicitly, so that
simple applications do not need to disable it to get the
expected behavior.

Signed-off-by: Luis Ubieda <luisf@croxel.com>
2025-05-09 14:06:28 -07:00
Luis Ubieda
0f9a0f4d00 sensor: adxl345: Only enable FIFO Stream when Sensor Stream is enabled
Otherwise, with its default configuration (25-Hz, 32-level FIFO),
getting individual samples could be up to 1 second old.

Signed-off-by: Luis Ubieda <luisf@croxel.com>
2025-05-09 14:06:28 -07:00
Luis Ubieda
80b607dd8f sensor: adxl345: Fix decoder for non-streaming mode
The following fixes have been applied to this decoder:
- The Q-scale factor was fixed, both for full-scale and non-full-scale
modes.
- The data type decoded is struct sensor_three_axis_data, as it should
be for the read/decode API.

Signed-off-by: Luis Ubieda <luisf@croxel.com>
2025-05-09 14:06:28 -07:00
Luis Ubieda
e465e6168e sensor: adxl345: Add get_size_info API
Used by the sensor shell in order to retrieve values; without it,
the shell crashes.

Signed-off-by: Luis Ubieda <luisf@croxel.com>
2025-05-09 14:06:28 -07:00
Dong Wang
0fb81dec80 logging: Assign IDs to log backends early during log_core_init
This commit moves the assignment of backend IDs from the 'z_log_init'
function to the earlier 'log_core_init' function. This ensures that
backend IDs are assigned before they are used in the 'log_backend_enable'
function, preventing incorrect settings of log dynamic filters.

Signed-off-by: Dong Wang <dong.d.wang@intel.com>
(cherry picked from commit 3fac83151d)
2025-05-09 11:59:59 -07:00
Benjamin Cabé
1cdd63e5ba doc: Update Graphviz font configuration
Customize Graphviz dot rendering to use same font stack as our Sphinx
theme.

Signed-off-by: Benjamin Cabé <benjamin@zephyrproject.org>
(cherry picked from commit c0f76d9363)
2025-05-09 11:57:23 -07:00
Dong Wang
64451449ac logging: Ensure atomic update of log filter slot in LOG_FILTER_SLOT_SET
Replaced the read-modify-write sequence with a single read and a single
write operation, preventing an intermediate value from being wrongly used
to filter out logs of a higher-priority thread that preempts the current
thread.

Signed-off-by: Dong Wang <dong.d.wang@intel.com>
(cherry picked from commit 872f363696)
2025-05-09 10:45:48 -07:00
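A hedged sketch of the idea behind 64451449ac (the names and slot layout are illustrative, not Zephyr's logging internals): compute the new filter word from one read, then publish it with one aligned 32-bit store, so a preempting thread never observes a half-updated slot.

```c
#include <stdint.h>

#define SLOT_BITS 4U
#define SLOT_MASK 0xFU

static void filter_slot_set(volatile uint32_t *filters, uint32_t slot, uint32_t level)
{
	uint32_t word = *filters;                          /* single read */

	word &= ~(SLOT_MASK << (slot * SLOT_BITS));        /* clear the slot locally */
	word |= (level & SLOT_MASK) << (slot * SLOT_BITS); /* set the new level locally */

	*filters = word;                                   /* single write publishes the result */
}
```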
Chekhov Ma
79696d420e drivers: gpio: adp5585: fix wrong reg during pin configure
ADP5585_GPO_OUT_MODE_A was used when configuring the initial output
during pin configuration, causing pins configured HIGH to be incorrectly
configured as open-drain. Replacing the register with ADP5585_GPO_DATA_OUT
fixes the issue.

Signed-off-by: Chekhov Ma <chekhov.ma@nxp.com>
(cherry picked from commit c87900aa40)
2025-05-09 10:36:07 -07:00
Benjamin Cabé
d5d5a7c7db doc: make graphviz diagrams look good in dark theme
Add filter to invert colors in dark mode.

Signed-off-by: Benjamin Cabé <benjamin@zephyrproject.org>
(cherry picked from commit fd919b5160)
2025-05-09 10:34:47 -07:00
Jukka Rissanen
215737c229 net: if: Release the interface lock early when starting IPv4 ACD
In order to avoid any mutex deadlocks between iface->lock and the
TX lock, release the interface lock before calling a function
that will acquire the TX lock. See the previous commit for a similar
issue in RS timer handling. Here we create a separate list of ACD
addresses that are to be started when the network interface comes up,
so they can be processed without iface->lock held.

Signed-off-by: Jukka Rissanen <jukka.rissanen@nordicsemi.no>
2025-05-09 10:34:15 -07:00
Jukka Rissanen
60190a93f4 net: if: Release the interface lock early when starting IPv6 DAD
In order to avoid any mutex deadlocks between iface->lock and the
TX lock, release the interface lock before calling a function
that will acquire the TX lock. See the previous commit for a similar
issue in RS timer handling.

Signed-off-by: Jukka Rissanen <jukka.rissanen@nordicsemi.no>
2025-05-09 10:34:15 -07:00
Jukka Rissanen
f749a05200 net: if: Release the interface lock early in IPv6 RS timeout handler
The net_if.c:rs_timeout() sends a new IPv6 router solicitation
message to the network by calling net_if_start_rs(). That function
then acquires iface->lock and calls net_ipv6_start_rs(), which tries
to send the RS message and acquires the TX send lock.
During this RS send, we might receive TCP data that could try to
send an ack to the peer. This in turn also causes the TX lock
to be acquired. Depending on timing, the lock ordering between the
rx thread and the system workq might differ, which could lead to a
deadlock.
Fix this issue by releasing the iface->lock before starting the
RS sending process. net_if_start_rs() does not really need to
keep the interface lock for long, as it is the only one sending
the RS message.

Fixes #86499

Signed-off-by: Jukka Rissanen <jukka.rissanen@nordicsemi.no>
2025-05-09 10:34:15 -07:00
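A minimal sketch of the lock-ordering pattern the three net_if commits above describe, with illustrative lock and helper names rather than the actual net_if internals: release the interface lock before calling into a path that takes the TX lock, so the RX thread and the system workqueue cannot take the two locks in opposite orders and deadlock.

```c
#include <zephyr/kernel.h>

extern struct k_mutex iface_lock;          /* illustrative stand-in for iface->lock */
extern int send_rs_taking_tx_lock(void);   /* hypothetical helper that acquires the TX lock */

int start_rs(void)
{
	k_mutex_lock(&iface_lock, K_FOREVER);
	/* ... update per-interface RS state while holding iface_lock ... */
	k_mutex_unlock(&iface_lock);       /* drop it before the TX lock is taken */

	return send_rs_taking_tx_lock();
}
```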
Jukka Rissanen
3af878f68e net: if: Documentation missing for IPv4 ACD timeout variable
Add documentation for ACD timeout variable as it was missing.

Signed-off-by: Jukka Rissanen <jukka.rissanen@nordicsemi.no>
2025-05-09 10:34:15 -07:00
Jukka Rissanen
05dd94d1dd net: if: Documentation missing for IPv6 DAD start time variable
Add documentation for DAD start time variable as it was missing.

Signed-off-by: Jukka Rissanen <jukka.rissanen@nordicsemi.no>
2025-05-09 10:34:15 -07:00
Sudan Landge
0a8f9bb61c modules: tf-m: update to 2.1.2
Update TF-M to 2.1.2 from version 2.1.1.
This is required to use MbedTLS 3.6.3.

Signed-off-by: Sudan Landge <sudan.landge@arm.com>
2025-05-09 10:27:22 -07:00
Sudan Landge
bbe7ab00de modules: mbedtls: update to 3.6.3
Update Mbed TLS to 3.6.3 as it has CVE fixes.

Signed-off-by: Sudan Landge <sudan.landge@arm.com>
2025-05-09 10:27:22 -07:00
Fabio Baltieri
7ff8f6696c ci: update a few workflows to use Python 3.12
The HTML doc build workflow and PR assigner recently started failing to
install Python packages, breaking all backports. They seem to work when
using Python 3.12, which is already done in main in bacb99da6d.

Signed-off-by: Fabio Baltieri <fabiobaltieri@google.com>
2025-05-09 11:19:59 +01:00
Alex An
83e99be7d1 drivers: retained_mem: Fix using multiple nRF retained memory regions
Fixes an error when using multiple nRF retained memory regions caused by
a missing separating comma.

Signed-off-by: Alex An <aza0337@auburn.edu>
(cherry picked from commit fbb75862a1)
2025-05-07 11:02:27 -07:00
Luis Ubieda
94c7905e01 spi_rtio: fix transactions for default handler
This patch fixes transaction op items not being performed within a single
SPI transfer. This is common for Write + Read commands, which depend on
CS being kept asserted until the end; otherwise the context is lost.
A similar fix was applied to i2c_rtio_default on #79890.

Signed-off-by: Luis Ubieda <luisf@croxel.com>
(cherry picked from commit b5ca5b7b77)
2025-05-07 08:14:55 -07:00
Mathieu Choplain
43c79ba72c dts: st: stm32u5: restore correct clocks on multi-bit devices
During the transition to the STM32_CLOCK macro (in 57723cf), the `clocks`
properties of peripherals requiring more than one bit to be set were
mistakenly modified. Commit 2c3294b079
partially fixed these errors, but some nodes for the U5 series are still
wrong.

Restore `clocks` on affected devices in corresponding STM32U5 DTSI.

Fixes: 57723cf405
Signed-off-by: Mathieu Choplain <mathieu.choplain@st.com>
(cherry picked from commit 37bdc38ec6)
2025-03-20 14:25:06 -07:00
Luis Ubieda
7412674a3c rtio: workq: Select Early P4WQ threads init
Otherwise the RTIO Workqueue is not functional for devices during init.

An example of this issue is devices using SPI transfers with default
spi_rtio.

Signed-off-by: Luis Ubieda <luisf@croxel.com>
(cherry picked from commit 2ce2794987)
2025-02-26 09:38:00 -06:00
Luis Ubieda
0c413db7eb p4wq: Add Kconfig to perform early init on threads
This makes them functional for devices during init. The default
behavior is to keep late initialization, as before.

Signed-off-by: Luis Ubieda <luisf@croxel.com>
(cherry picked from commit ec45b29ea3)
2025-02-26 09:38:00 -06:00
Simon Guinot
af81a8ba32 drivers: led: lp50xx: check the number of LED colors
The current code assumes (especially in the lp50xx_set_color function)
that the number of LED colors defined in DT is not greater than 3. But
since this is not checked, that is not necessarily the case.

This patch consolidates the initialization of the lp50xx LED driver by
checking the number of colors for each LED found in DT.

Signed-off-by: Simon Guinot <simon.guinot@sequanux.org>
(cherry picked from commit ffbb8f981a)
2025-02-26 09:31:58 -06:00
Robert Lubos
546386cd1f net: if: Setup DAD timer regardless of DAD query result
On rare occasions when sending the DAD NS packet fails, we should still
set up the DAD timer, unless we implement some kind of more advanced
retry mechanism. If we don't do that, the IPv6 address added to the
interface will never become usable in such cases.

Signed-off-by: Robert Lubos <robert.lubos@nordicsemi.no>
(cherry picked from commit 008a7ca202)
2025-02-26 09:30:50 -06:00
Robert Lubos
aa48023434 net: if: Clear neighbor cache when removing IPv6 addr with active DAD
DAD creates an entry in the neighbor cache for the queried (own)
address. In case the address is removed from the interface while DAD is
still incomplete, we need to remove the corresponding cache entry (just
like in case of DAD timeout) to avoid stale entries in the cache.

Signed-off-by: Robert Lubos <robert.lubos@nordicsemi.no>
(cherry picked from commit a09fd8e97f)
2025-02-26 09:30:50 -06:00
Robert Lubos
3410679046 tests: net: conn_mgr: Use valid LL address in tests
Using an L2 address of length 1 (an invalid/unsupported one) confused the
IPv6 layer during LL address generation: since that length was not valid,
the address was not initialized properly and part of it was left
semi-random. This could, for example, result in filling up the neighbor
tables.

Signed-off-by: Robert Lubos <robert.lubos@nordicsemi.no>
(cherry picked from commit 3089a5d116)
2025-02-26 09:30:50 -06:00
Derek Snell
7eab725dfb soc: nxp: rw: Update system core clock frequency
After updating main_clk, the frequency tracked in the MCUXpresso SDK HAL
framework needs to be updated for other drivers.

Signed-off-by: Derek Snell <derek.snell@nxp.com>
(cherry picked from commit 793e44afdd)
2025-02-26 09:28:45 -06:00
Yong Cong Sin
a50476ca00 logging: log_cmds: init uninitialized backend on log_go()
For backends that do not autostart themselves, initialize
& enable them on `log backend <log_backend_*> go`, so
that they function properly.

Signed-off-by: Yong Cong Sin <ycsin@meta.com>
Signed-off-by: Yong Cong Sin <yongcong.sin@gmail.com>
(cherry picked from commit f840a35be3)
2025-02-20 13:03:28 -06:00
Yong Cong Sin
ae73df5621 logging: init backend id regardless of autostart
The `id` is basically a compile-time constant. Setting it every
time the backend is enabled is unnecessary. Instead, set it in
`z_log_init()` regardless of whether or not it needs to be
`autostart`ed.

Fixes an issue where `filter_get`/`filter_set`
accessed the wrong index and displayed the wrong log level when the
user accesses the status of an uninitialized backend via
`log backend <uninitialized_backend> status`.

Also fixes an issue when the user tries to list the backends via
`log list_backends`, where all uninitialized backends would have
ID = 0.

Signed-off-by: Yong Cong Sin <ycsin@meta.com>
Signed-off-by: Yong Cong Sin <yongcong.sin@gmail.com>
(cherry picked from commit 8dd9d924fe)
2025-02-20 13:03:28 -06:00
Jordan Yates
f61d53d5b5 bluetooth: increment BT_BUF_CMD_TX_COUNT
The extended advertising start procedure can consume both command
buffers in a single API call, resulting in `bt_le_create_conn_cancel`
being unable to claim a buffer to terminate the connection request.

Increase the command count if both extended advertising and Bluetooth
central are enabled in an application.

Signed-off-by: Jordan Yates <jordan@embeint.com>
2025-02-20 10:56:17 -08:00
Jordan Yates
790bd7c44b bluetooth: host: hci_core: add missing NULL check
Add check that the command buffer claimed in `bt_le_create_conn_cancel`
is not `NULL`. Fixes a fault caused by providing the `NULL` buffer to
`bt_hci_cmd_state_set_init`.

Signed-off-by: Jordan Yates <jordan@embeint.com>
2025-02-20 10:56:17 -08:00
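A hedged fragment of the NULL check described in 790bd7c44b, assuming the `bt_hci_cmd_create()`/`bt_hci_cmd_send()` pair available in this Zephyr release; the surrounding function is illustrative, not the actual hci_core.c code:

```c
#include <errno.h>
#include <zephyr/bluetooth/hci.h>

static int create_conn_cancel(void)
{
	struct net_buf *buf = bt_hci_cmd_create(BT_HCI_OP_LE_CREATE_CONN_CANCEL, 0);

	if (buf == NULL) {
		return -ENOBUFS; /* both command buffers in flight: fail instead of faulting */
	}

	/* ... bt_hci_cmd_state_set_init(buf, ...) would be called here ... */
	return bt_hci_cmd_send(BT_HCI_OP_LE_CREATE_CONN_CANCEL, buf);
}
```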
Florian Weber
c2ad663314 rtio: workq: bugfix of memory allocation
This commit fixes a bug where the memory of the work request was
freed in the work handler before the p4wq was done with it.
This is now fixed by using the new done_handler in the p4wq
to free that memory.

Signed-off-by: Florian Weber <Florian.Weber@live.de>
(cherry picked from commit b55b9ae72b)
2025-02-20 10:55:30 -08:00
Florian Weber
f0c05671da lib: os: p4wq: add done handler
Add an optional handler to the p4wq to give the submitting code
(e.g. rtio workq) the possibility to execute code after the work has
been successfully executed.

Signed-off-by: Florian Weber <Florian.Weber@live.de>
(cherry picked from commit 093b29fdb5)
2025-02-20 10:55:30 -08:00
Jukka Rissanen
b40cacb67d net: Update IP address refcount properly when address already exists
If an IP address already exists when an attempt is made to add it to the
network interface, then just return it but update the ref count if it was
not updated. This could happen if the address was added and then removed,
but, for example, an active connection was still using it and keeping the
ref count > 0. In this case we must update the ref count so that the IP
address is not removed when the connection is closed.

Fixes #85380

Signed-off-by: Jukka Rissanen <jukka.rissanen@nordicsemi.no>
(cherry picked from commit ae05221762)
2025-02-20 10:52:59 -08:00
Erwan Gouriou
c1f3c446bc dts: stm32h7: Fix ltdc reset lines
LTDC reset lines were off by 1. Fix it.

Signed-off-by: Erwan Gouriou <erwan.gouriou@st.com>
(cherry picked from commit 6f55a32da8)
2025-02-20 10:46:32 -08:00
Robert Lubos
cd85e0ef82 net: ipv6: Fix Neighbor Advertisement processing w/o TLLA option
According to RFC 4861, ch. 7.2.5:

 "If the Override flag is set, or the supplied link-layer address
  is the same as that in the cache, or no Target Link-Layer Address
  option was supplied, the received advertisement MUST update the
  Neighbor Cache entry as follows

  ...

  If the Solicited flag is set, the state of the entry MUST be
  set to REACHABLE"

This indicates that the Target Link-Layer Address option does not need to
be present in the received solicited Neighbor Advertisement to confirm
reachability. Therefore, remove the `tllao_offset` variable check from the
if condition responsible for updating the cache entry. No further changes
in the logic are required, because if the TLLA option is missing,
`lladdr_changed` will be set to false, so no LL address will be updated.

Signed-off-by: Robert Lubos <robert.lubos@nordicsemi.no>
(cherry picked from commit 02c153c8b1)
2025-02-20 10:44:32 -08:00
Robert Lubos
e47c933b29 net: ipv6: Send Neighbor Solicitations in PROBE state as unicast
According to RFC 4861, ch. 7.3.3:

 "Upon entering the PROBE state, a node sends a unicast Neighbor
  Solicitation message to the neighbor using the cached link-layer
  address."

Zephyr's implementation was not compliant with this behavior: instead of
sending a unicast probe for reachability confirmation, it was sending a
multicast packet. This commit fixes that.

Signed-off-by: Robert Lubos <robert.lubos@nordicsemi.no>
(cherry picked from commit 8cd213e846)
2025-02-20 10:44:32 -08:00
Robert Lubos
c4cb9bd1af net: ipv6: Fix neighbor registration based on received RA message
When a Router Advertisement with the Source Link-Layer Address option is
received, the host should register a new neighbor marked as STALE
(RFC 4861, ch. 6.3.4). This behavior was broken, however, because
we always added a new neighbor in the INCOMPLETE state before processing
the SLLA option. As a result, the entry was not updated to the STALE
state, and a redundant Neighbor Solicitation was sent.

Fix this by moving the code responsible for adding a neighbor in the
INCOMPLETE state after option processing, and running it only as fallback
behavior if the SLLA option was not present.

Signed-off-by: Robert Lubos <robert.lubos@nordicsemi.no>
(cherry picked from commit fce53922ef)
2025-02-20 10:44:32 -08:00
Lingao Meng
f001cf6de9 Bluetooth: Mesh: Fix Assert in bt_mesh_adv_unref when messages to a proxy
Fixes: https://github.com/zephyrproject-rtos/zephyr/issues/83904

The fix is to define a separate variable for each proxy FIFO.

Signed-off-by: Lingao Meng <menglingao@xiaomi.com>
(cherry picked from commit 6371080406)
2025-02-13 09:44:08 -08:00
Nazar Palamar
da7ec06d12 test: arm: irq: Add overlays files for Infineon boards
Changed the interrupt priority for GPIO; the default of 6 is not suitable
for the ZERO_LATENCY_IRQS feature used in this test.

Fixes a problem with arm_irq_zero_latency_levels;
refer to pull/81377.

Signed-off-by: Nazar Palamar <nazar.palamar@infineon.com>
(cherry picked from commit 6172092730)
2025-02-11 10:21:13 -08:00
Nazar Palamar
b0e5803c06 Infineon: board: Add CONFIG_GPIO to defconfigs
Add CONFIG_GPIO to the defconfigs for Infineon boards.

Revert pull/81377, which affected some BLE samples that
use GPIO.

Signed-off-by: Nazar Palamar <nazar.palamar@infineon.com>
(cherry picked from commit 697efe8b50)
2025-02-11 10:21:13 -08:00
Michal Myczkowski
e104bf9a28 dts: atmel sam4s: fix sram address
Changed the Atmel SAM4S series SRAM address from 0x20100000 to 0x20000000.
Now it matches the Atmel SAM4S Series Datasheet, chapter 8,
section 1.1 Internal SRAM:

https://ww1.microchip.com/downloads/aemDocuments/documents/OTH/ProductDocuments/DataSheets/Atmel-11100-32-bitCortex-M4-Microcontroller-SAM4S_Datasheet.pdf#G1.1069257

Fixes #85211.

Signed-off-by: Michal Myczkowski <mmyczkowski@antmicro.com>
(cherry picked from commit 3a53845fad)
2025-02-11 10:19:42 -08:00
Jamie McCrae
41bafc5dbf mgmt: mcumgr: grp: img_mgmt: Fix calling confirm
Fixes the registered image-confirmed callbacks being called
even if the confirmation was not successful

Signed-off-by: Jamie McCrae <jamie.mccrae@nordicsemi.no>
(cherry picked from commit 98d5aa3792)
2025-02-11 10:18:49 -08:00
Mahesh Mahadevan
37321f0702 boards: frdm_mcxn947: Delete enable of GPIO5 clock
There is no bit to enable GPIO5 clock in the clock control
register.

Signed-off-by: Mahesh Mahadevan <mahesh.mahadevan@nxp.com>
(cherry picked from commit bda04093fb)
2025-02-11 10:13:45 -08:00
Dario Binacchi
8c516c6d86 dts: arm: st: re-enable master can gating clock for can2
Commit 57723cf405 ("dts: arm: st: Refactor DTSI files to use macro"),
which replaced raw hex codes with the STM32_CLOCK macro, causes a
regression in the case of the CAN device, where the previous raw value
contained more than one bit set to 1. The macro is in fact correct only
for values with a single bit set. In all other cases, raw values must
continue to be used.

Tested on the STM32F429I-DISC1 board.

Fixes: 57723cf405
Co-authored-by: Michael Trimarchi <michael@amarulasolutions.com>
Signed-off-by: Dario Binacchi <dario.binacchi@amarulasolutions.com>
(cherry picked from commit 2c3294b079)
2025-02-11 09:50:03 -08:00
Erwan Gouriou
e397b1b657 drivers: ethernet: stm32: Use MDIO API only if enabled by DTS
Not all STM32 series support Zephyr MDIO API yet, while the API is enabled
by default.
To preserve compatibility, put MDIO API related code under the condition
of "st,stm32-mdio" compatible enablement.

Signed-off-by: Erwan Gouriou <erwan.gouriou@st.com>
(cherry picked from commit 2d81351517)
2025-02-04 16:30:36 -06:00
Luis Ubieda
b1631a2913 samples: sensor: shell: test all channels
Loop through all of the channels and make sure that each
channel has an output.

Changes in patch originally present in #72972.

Signed-off-by: Yong Cong Sin <ycsin@meta.com>
Signed-off-by: Luis Ubieda <luisf@croxel.com>
(cherry picked from commit 1e64d7bea1)
2025-02-04 16:29:39 -06:00
Luis Ubieda
c4210d749a tests: sensor: generic_test: Remove special-handling for axis values
Since they are now treated as q31 values when individually queried, as
opposed to lumping them into a triplet.

Signed-off-by: Luis Ubieda <luisf@croxel.com>
(cherry picked from commit 19a4261517)
2025-02-04 16:29:39 -06:00
Luis Ubieda
212d0c50f1 sensor: default_rtio_sensor: fix: Treat individual axis data as q31_t
Instead of assuming that an individual axis must contain all
data values. Additionally, to determine that there is XYZ data, all
three channels must be present in the header.

Signed-off-by: Luis Ubieda <luisf@croxel.com>
(cherry picked from commit 5a1dfc82c3)
2025-02-04 16:29:39 -06:00
Luis Ubieda
2b884a703e sensor: shell: Allow individual axis data to be processed
For the supported channels, instead of only allowing 3-axis data to be
printed out, allow individual channels to be processed (e.g. accel_x,
gyro_y, etc.).

Signed-off-by: Luis Ubieda <luisf@croxel.com>
(cherry picked from commit 36dfea121d)
2025-02-04 16:29:39 -06:00
Jukka Rissanen
50c415d3ee net: socket: Release packets in accepted socket in close
If we have received data on the accepted socket, release it
before removing the accepted socket. This is a rare event,
as it requires that we get multiple simultaneous connections
and there is a failure before socket accept is called by
the application.
For example, one such scenario is when an HTTP server receives multiple
connection attempts at the same time, and the server poll fails
before socket accept is called. This leads to a buffer leak, as
close is not called for the accepted socket because, from the
application's point of view, it has not yet been created.
The solution is to flush the receive queue of the accepted socket
before removing the actual accepted socket.

Fixes #84538

Signed-off-by: Jukka Rissanen <jukka.rissanen@nordicsemi.no>
(cherry picked from commit 535e70a298)
2025-02-04 16:27:07 -06:00
Luis Ubieda
0429f17b76 sensor: adxl3xx: Move run-time modification of ODR from cfg to data
Config struct is constant and attempting to modify it triggers a fault.

Signed-off-by: Luis Ubieda <luisf@croxel.com>
(cherry picked from commit 5a9ff03c21)
2025-02-04 16:22:11 -06:00
Chen Shu
5080ba8b0c fs: ext2: Fix nbytes_to_read calculation in ext2_inode_read()
Fix the incorrect nbytes_to_read calculation in the ext2_inode_read()
function. Previously nbytes_to_read was decremented by the read value,
which caused an incorrect calculation of bytes to read in subsequent
iterations. Now nbytes_to_read is decremented by the to_read value, which
represents the actual number of bytes read in the current iteration.

This fixes potential data corruption issues when reading files from
ext2 filesystem.

Signed-off-by: Chen Shu <751541594@qq.com>
(cherry picked from commit 5a5f05ba4e)
2025-02-04 16:21:32 -06:00
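A self-contained sketch of the corrected bookkeeping described in 5080ba8b0c (illustrative only, not the ext2 driver's real code): the remaining count must shrink by the bytes consumed in this iteration (`to_read`), or later iterations read the wrong amount.

```c
#include <stddef.h>
#include <stdint.h>
#include <string.h>

static size_t inode_read(const uint8_t *blocks, size_t block_size,
			 uint8_t *dst, size_t offset, size_t nbytes_to_read)
{
	size_t total = 0;

	while (nbytes_to_read > 0) {
		size_t in_block = offset % block_size;
		size_t to_read = block_size - in_block;

		if (to_read > nbytes_to_read) {
			to_read = nbytes_to_read;
		}

		memcpy(dst + total, blocks + offset, to_read);

		offset += to_read;
		total += to_read;
		nbytes_to_read -= to_read; /* decrement by what was actually read */
	}

	return total;
}
```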
Vinayak Kariappa Chettimada
13e7d6501f Bluetooth: Controller: Fix uninitialized is_aborted in conn done event
Fix uninitialized is_aborted in connection done event.

Relates to commit cadef5a64f ("Bluetooth: Controller:
Introduce BT_CTLR_PERIPHERAL_RESERVE_MAX").

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
(cherry picked from commit f3e398d64c)
2025-02-04 16:20:40 -06:00
Jakub Rzeszutko
4a1ffd8519 shell: fix unsafe API calls and add configurable autoflush behavior
Fixes an issue where the shell API could block indefinitely when called
from threads other than the shell's processing thread, especially when
the transport (e.g. USB CDC ACM) was unavailable or inactive.

Replaced `k_mutex_lock` calls with an indefinite timeout (`K_FOREVER`)
by using a fixed timeout (`K_MSEC(SHELL_TX_MTX_TIMEOUT_MS)`) in shell
API functions to prevent indefinite blocking.

Link: https://github.com/zephyrproject-rtos/zephyr/issues/84274

Signed-off-by: Jakub Rzeszutko <jakub.rzeszutko@verkada.com>
(cherry picked from commit b0a0febe58)
2025-01-30 17:54:30 -08:00
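A hedged sketch of the bounded-lock pattern described in 4a1ffd8519 (the timeout value and surrounding function are illustrative, not the shell's actual code): a fixed timeout keeps API calls from blocking forever when the transport is inactive.

```c
#include <errno.h>
#include <zephyr/kernel.h>

#define SHELL_TX_MTX_TIMEOUT_MS 50 /* assumed value, for illustration only */

K_MUTEX_DEFINE(tx_mtx);

int shell_like_print(const char *msg)
{
	ARG_UNUSED(msg);

	if (k_mutex_lock(&tx_mtx, K_MSEC(SHELL_TX_MTX_TIMEOUT_MS)) != 0) {
		return -EBUSY; /* transport stalled: give up instead of hanging the caller */
	}

	/* ... queue msg for transmission ... */
	k_mutex_unlock(&tx_mtx);
	return 0;
}
```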
Jakub Rzeszutko
98a1a1510c shell: add Kconfig option for configurable autoflush behavior
Introduced a new Kconfig option `SHELL_PRINTF_AUTOFLUSH` to allow
configuring the autoflush behavior of shell printing functions.

Updated `Z_SHELL_FPRINTF_DEFINE` to use the
`CONFIG_SHELL_PRINTF_AUTOFLUSH` setting instead of hardcoding
the autoflush behavior to `true`.

Signed-off-by: Jakub Rzeszutko <jakub.rzeszutko@verkada.com>
(cherry picked from commit 8991b954bc)
2025-01-30 17:54:30 -08:00
Tomislav Milkovic
5fe9a7e6f9 drivers: can: can_tcan4x5x: fix compiler build warning/error
Fix compiler warning when optional property reset-gpios
is not supplied in the ti,tcan4x5x-compatible device tree
node

Signed-off-by: Tomislav Milkovic <tomislav.milkovic95@gmail.com>
(cherry picked from commit fb98387f4d)
2025-01-30 17:52:29 -08:00
Mayank Narang
a1eb85002e drivers: sensor: lis2de12: add length check in spi write incr routine
Added a length check in the spi write incr routine to handle both
single and multi byte write operations properly.

Signed-off-by: Mayank Narang <narang.may77@gmail.com>
(cherry picked from commit cb608812cf)
2025-01-30 17:51:43 -08:00
Mayank Narang
58491c40c0 drivers: sensor: lis2de12: fix read accel via spi
The lis2de12 sensor driver SPI interface was calling the SPI read API.
This led to a single-byte operation when reading acceleration data,
which is a multi-byte operation. Fix it by calling the SPI read incr
API instead. Added a length check to handle both single- and multi-byte
reads properly.

Signed-off-by: Mayank Narang <narang.may77@gmail.com>
(cherry picked from commit 7b062f5b09)
2025-01-30 17:51:43 -08:00
Luca Burelli
adfc5b3863 tests: cmake: run zephyr_get() tests in script mode
Re-run the zephyr_get() testsuite in script mode after the project
mode testsuite has been executed. This is to ensure that the
zephyr_get() function works correctly in script mode as well.

Signed-off-by: Luca Burelli <l.burelli@arduino.cc>
(cherry picked from commit d492c849a3)
2025-01-30 17:49:53 -08:00
Torsten Rasmussen
58b2b82b44 cmake: use GLOBAL property instead TARGET properties for scoping
Targets are not available in script mode.
To support the Zephyr scoping feature used by snippets and the yaml
module, this commit moves from using custom targets to GLOBAL
properties for scopes.

A scope property is prefixed with `<scope>:<property>` to avoid naming
collisions.
A `scope:<scope-name>` global property is used to track created scopes.
Tracking valid scopes ensures that properties are only set on known
scopes and thus catches typos / naming errors.

Add zephyr_scope_exists() and zephyr_get_scoped() to abstract the
implementation details of the scoped property retrieval and refactor
current code to use them.

Signed-off-by: Torsten Rasmussen <Torsten.Rasmussen@nordicsemi.no>
Signed-off-by: Luca Burelli <l.burelli@arduino.cc>
(cherry picked from commit 4e29a35b22)
2025-01-30 17:49:53 -08:00
Pieter De Gendt
2f0e67fee8 drivers: spi: spi_mcux_ecspi: Fix data size when using 16/32 bit transfers
The data size is set using a burst length; the data size for 8/16/32-bit
transfers is always 1 in those cases.

Signed-off-by: Pieter De Gendt <pieter.degendt@basalte.be>
(cherry picked from commit 0e8aed7393)
2025-01-22 14:46:39 -08:00
Pieter De Gendt
7d59e9f11e drivers: spi: spi_mcux_ecspi: Ignore chip select channel with cs-gpios
The internal chip selects are limited to 4; using GPIOs, however,
does not pose this limitation.
Also set the internal channel to 0 if GPIOs are used.

Signed-off-by: Pieter De Gendt <pieter.degendt@basalte.be>
(cherry picked from commit 7250e987e5)
2025-01-22 14:46:39 -08:00
Alberto Escolar Piedras
de825415d6 tests/subsys/lorawan/frag_decoder: Change random seed
These tests are quite sensitive to the exact random sequence.
After a change to the native_sim entropy generation, two of them
started failing. Set a random seed which avoids the failure.

Signed-off-by: Alberto Escolar Piedras <alberto.escolar.piedras@nordicsemi.no>
(cherry picked from commit 64d4fcd8cb)
2025-01-22 14:43:50 -08:00
Alberto Escolar Piedras
fff6ad389e drivers: entropy: fix native_posix driver for more than 3 byte requests
Fix the native_posix fake entropy driver for requests of more than 3
bytes, and especially for native_sim//64 builds.

The host random() provides a number between 0 and 2**31-1 (INT_MAX),
so bit 32 was always 0.
So when filling a buffer with more than 3 bytes, we would fill
every 4th byte with a byte that always had its MSbit as 0.

For LP64 native_sim//64 builds, this was even worse, as the driver had
another bug where we assumed random() returned the whole long filled,
and therefore all 4 upper bytes would be 0.

Signed-off-by: Alberto Escolar Piedras <alberto.escolar.piedras@nordicsemi.no>
(cherry picked from commit b3407f04e7)
2025-01-22 14:43:50 -08:00
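A minimal sketch of the limitation described in fff6ad389e (illustrative, not the driver's actual code): host `random()` yields at most 31 random bits, so copying its raw 4-byte value into the output leaves every 4th byte with a zero top bit; taking only the low byte of each call avoids relying on bits that are never random.

```c
#include <stdint.h>
#include <stdlib.h>

static void fill_entropy(uint8_t *buf, size_t len)
{
	for (size_t i = 0; i < len; i++) {
		/* random() returns 0..INT_MAX; use only bits it actually provides */
		buf[i] = (uint8_t)(random() & 0xFF);
	}
}
```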
Derek Snell
7d2ac02254 drivers: flash: soc_flash_mcux: remove CMD_MARGIN_CHECK
The CMD_BLANK_CHECK can return errors when the flash is readable, and
should only be used after programming, not in is_area_readable().  From
the LPC55S69 datasheet: "As cells age and lose charge, a correctly
programmed address will fail this check, while still being able to be
read successfully for the remaining duration of the data retention time."

Signed-off-by: Derek Snell <derek.snell@nxp.com>
(cherry picked from commit 88b9cb6efc)
2025-01-18 09:00:18 -06:00
Thomas Schranz
72a1c809df soc: sam0: samd5x: xosc32 configurable startup time
Adds Kconfig option to configure the startup time of the external
32KHz crystal oscillator.

Signed-off-by: Thomas Schranz <electronics@wandfluh.com>
(cherry picked from commit cd20154bc7)
2025-01-15 16:32:01 -08:00
Daniel DeGrasse
c861286bc0 drivers: flash: flash_mcux_flexspi_nor: check all 3 bytes of JEDEC ID
The FlexSPI NOR driver should verify all 3 bytes of the JEDEC ID match
the expected value before attempting to use a custom LUT table with a
flash chip. This reduces the odds that an incompatible LUT will be used
with a flash chip, as some flash chips may share the same first byte of
their device ID but not be compatible with the custom LUT table.

Signed-off-by: Daniel DeGrasse <daniel.degrasse@nxp.com>
(cherry picked from commit c12030acb9)
2025-01-06 08:37:43 -08:00
Erik Tamlin
3f4037b951 manifest: update percepio
Update the percepio module to use TraceRecorder v4.10.2

Signed-off-by: Erik Tamlin <erik.tamlin@percepio.com>
(cherry picked from commit 91b2156398)
2024-12-27 12:06:38 -08:00
Jukka Rissanen
763665dcda tests: net: dns_dispatcher: Add tests for dispatcher
Make sure that the socket service is properly unregistered when
dispatcher is unregistered.

Signed-off-by: Jukka Rissanen <jukka.rissanen@nordicsemi.no>
(cherry picked from commit d98fe73684)
2024-12-27 11:32:22 -08:00
Jukka Rissanen
d58e1ea145 net: dns: Close socket service properly from dispatcher
We need to close the socket service context when dispatcher is
unregistered.

Fixes #82652

Signed-off-by: Jukka Rissanen <jukka.rissanen@nordicsemi.no>
(cherry picked from commit da148ab3bc)
2024-12-27 11:32:22 -08:00
Jukka Rissanen
fc639b6b27 net: dns: Avoid errors when DNS dispatcher is already registered
Skip error prints and extra DNS events if DNS dispatcher was already
registered.

Signed-off-by: Jukka Rissanen <jukka.rissanen@nordicsemi.no>
(cherry picked from commit c110331507)
2024-12-27 11:32:22 -08:00
Jukka Rissanen
1d445685bf net: dns: Fix the debug print when dispatcher fails
Depending on DNS type, print the output (mDNS vs DNS) correctly.

Signed-off-by: Jukka Rissanen <jukka.rissanen@nordicsemi.no>
(cherry picked from commit efcfcfe292)
2024-12-27 11:32:22 -08:00
Marcio Ribeiro
6dc5129040 drivers: uart: esp32: reset tx fifo during driver init
Resets the UART TX FIFO during driver initialization to have a well-defined
initial condition, mainly preventing unwanted characters from being sent.

Signed-off-by: Marcio Ribeiro <marcio.ribeiro@espressif.com>
(cherry picked from commit f0516ead27)
2024-12-27 11:31:09 -08:00
DEVER Emiel
d8ab599ecf fs: ext2: Fix ext2 read buffer overflow
Keep track of the number of bytes read so no buffer overflow occurs.

Signed-off-by: DEVER Emiel <emiel.dever@psicontrol.com>
(cherry picked from commit 336c650646)
2024-12-27 11:29:38 -08:00
Djordje Nedic
8abda1d12f fs: littlefs: Fix cache and lookahead size checks
This fixes an issue where wrong values were checked against block device
defaults. Kconfig values are used when no filesystem config values are
provided, and the buffers are sized according to the Kconfig values,
but the checks were only performed against filesystem config variables.

Signed-off-by: Djordje Nedic <nedic.djordje2@gmail.com>
(cherry picked from commit 2750492e86)
2024-12-27 11:27:37 -08:00
Gerson Fernando Budke
578251beac dts: atmel: samr21: Use samd21 as base
The samr21 is a samd21 with a built-in at86rf233 radio. Use the samd21 as
the base for these SoCs and drop all duplicated nodes.

Signed-off-by: Gerson Fernando Budke <nandojve@gmail.com>
(cherry picked from commit 01fc0a750a)
2024-12-27 11:21:47 -08:00
Gerson Fernando Budke
ca7eba2f11 dts: atmel: saml2x: Define USB node
The saml2x series provides USB across all of its SoCs. This moves the USB
node from saml21.dtsi to the base file saml2x.dtsi.

Signed-off-by: Gerson Fernando Budke <nandojve@gmail.com>
(cherry picked from commit 34fc01e008)
2024-12-27 11:21:47 -08:00
Gerson Fernando Budke
3acaf7a069 dts: atmel: saml21: Exclude DMA node
The DMA is already defined in the base header. This excludes the
duplicated node.

Signed-off-by: Gerson Fernando Budke <nandojve@gmail.com>
(cherry picked from commit 2e7599eac6)
2024-12-27 11:21:47 -08:00
Gerson Fernando Budke
ba39f63957 dts: atmel: sam0: Fix aliases and chosen nodes
Reorder and add missing aliases and chosen nodes.

Signed-off-by: Gerson Fernando Budke <nandojve@gmail.com>
(cherry picked from commit 748bbd012b)
2024-12-27 11:21:47 -08:00
Gerson Fernando Budke
503d458893 dts: atmel: sam0: Explicitly disable nodes
When running tests on the sam0 platform it was detected that pinctrl for
the ADC was not defined for some boards. To force an error at build time,
such nodes should be explicitly disabled. This explicitly disables
devicetree nodes that require some user configuration.

In addition, the ADC feature was excluded on some boards and the
samr21-xpro was correctly updated.

Signed-off-by: Gerson Fernando Budke <nandojve@gmail.com>
(cherry picked from commit 162f728787)
2024-12-27 11:21:47 -08:00
Gerson Fernando Budke
8e2f36cfcf dts: atmel: sam0: Normalize dtsi nodes
Keep a consistent order in the node definitions to make it easy to read
across all the SoC series.

Signed-off-by: Gerson Fernando Budke <nandojve@gmail.com>
(cherry picked from commit 6f1f598a72)
2024-12-27 11:21:47 -08:00
Jamie McCrae
d70bbe0eb6 samples: mgmt: mcumgr: smp_svr: Fix re-advertise issue on connection
Fixes an issue introduced with commit
c6ad4a7927, which wrongly restarts
advertising after a device connects, when it should only restart
advertising if a device fails to connect

Signed-off-by: Jamie McCrae <jamie.mccrae@nordicsemi.no>
(cherry picked from commit 655be99fa7)
2024-12-27 11:19:58 -08:00
Dominik Ermel
fe8dc82b16 drivers/flash: Correct flash_erase userspace handler
As the erase callback is optional, the handler should not check
that it is non-NULL.

Fixes #81777

Signed-off-by: Dominik Ermel <dominik.ermel@nordicsemi.no>
(cherry picked from commit 486428951b)
2024-12-27 11:18:30 -08:00
Jilay Pandya
17df0f1748 drivers: auxdisplay: jhd1313: fix Out-of-bounds read
Fix the out-of-bounds read by doing the comparison with ARRAY_SIZE correctly

Signed-off-by: Jilay Pandya <jilay.pandya@outlook.com>
(cherry picked from commit 3202773b11)
2024-12-27 11:17:37 -08:00
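A hedged sketch of the bounds check 17df0f1748 describes (the table and index are illustrative, not the jhd1313 driver's data): comparing the index against `ARRAY_SIZE()` of the table itself, with `>=`, prevents reading one element past the end.

```c
#include <errno.h>
#include <stddef.h>
#include <stdint.h>
#include <zephyr/sys/util.h>

static const uint8_t color_table[] = { 0x00, 0x05, 0x0a, 0x0f };

int color_lookup(size_t idx, uint8_t *out)
{
	if (idx >= ARRAY_SIZE(color_table)) { /* reject out-of-range indices */
		return -EINVAL;
	}

	*out = color_table[idx];
	return 0;
}
```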
Jukka Rissanen
d6ec225075 net: wifi_cred: Decrease flash usage for error print strings
As the error print strings are very similar, construct the final
output at runtime to save some flash space.

Signed-off-by: Jukka Rissanen <jukka.rissanen@nordicsemi.no>
(cherry picked from commit 3c59bd4fb5)
2024-12-09 13:40:54 -08:00
Jukka Rissanen
e60954d54e net: wifi_cred: Remove extra empty lines
Follow the net coding style and remove extra blank lines between
setting a variable and checking its value.

Signed-off-by: Jukka Rissanen <jukka.rissanen@nordicsemi.no>
(cherry picked from commit 8deebcec21)
2024-12-09 13:40:54 -08:00
Jukka Rissanen
18845d2bb5 net: wifi_cred: Introduce variables at the start of function
Follow the net coding style and declare the variables at the start
of the function.

Signed-off-by: Jukka Rissanen <jukka.rissanen@nordicsemi.no>
(cherry picked from commit b2056e0966)
2024-12-09 13:40:54 -08:00
Jukka Rissanen
8fdf221640 net: wifi_cred: Check null before access
We must do a NULL check before trying to access the fields.

Fixes #81980
Coverity-CID: 434549

Signed-off-by: Jukka Rissanen <jukka.rissanen@nordicsemi.no>
(cherry picked from commit 3aa0c8670e)
2024-12-09 13:40:54 -08:00
Marek Matej
b306b24ec4 dts: espressif: Add flash size options to partition tables
Update the partition table list with 16MB and 32MB options.

Signed-off-by: Marek Matej <marek.matej@espressif.com>
(cherry picked from commit 2dc2cdea75)
2024-12-09 13:40:32 -08:00
Jukka Rissanen
2fe01e9400 tests: net: dns: Add test for invalid DNS answer parsing
Make sure we catch invalid answer during parsing.

Signed-off-by: Jukka Rissanen <jukka.rissanen@nordicsemi.no>
(cherry picked from commit 16669ec4d5)
2024-12-09 13:37:22 -08:00
Jukka Rissanen
aec5129706 net: dns: Check DNS answer properly
The dns_unpack_answer() did not check the length of the message
properly, which could cause an out-of-bounds read.

Signed-off-by: Jukka Rissanen <jukka.rissanen@nordicsemi.no>
(cherry picked from commit 6e7fcff579)
2024-12-09 13:37:22 -08:00
Jukka Rissanen
b912e674cf net: dns: Validate source buffer length properly
Make sure that when copying the qname, the source buffer is large
enough for the data.

Signed-off-by: Jukka Rissanen <jukka.rissanen@nordicsemi.no>
(cherry picked from commit 43c2b9cfe8)
2024-12-09 13:37:22 -08:00
Jukka Rissanen
1805b1579f tests: net: dns: Add checking of malformed packet
Make sure we test malformed packet parsing.

Signed-off-by: Jukka Rissanen <jukka.rissanen@nordicsemi.no>
(cherry picked from commit 6f96915a14)
2024-12-09 13:37:22 -08:00
Jukka Rissanen
ce695706c0 net: dns: Check parsing error properly for response
If the packet parsing fails in dns_unpack_response_query(), then
do not continue further but bail out early.

Signed-off-by: Jukka Rissanen <jukka.rissanen@nordicsemi.no>
(cherry picked from commit eb2550a441)
2024-12-09 13:37:22 -08:00
Jukka Rissanen
1e88982fe3 net: ethernet: bridge: Drop the cloned packet in error
We need to drop the cloned packet that was fed to the bridge instead of
returning directly from the function. Without this change we have a
buffer leak.

Signed-off-by: Jukka Rissanen <jukka.rissanen@nordicsemi.no>
(cherry picked from commit c6a1af5c17)
2024-12-09 13:36:03 -08:00
Jukka Rissanen
d4e329483f net: ethernet: bridge: Avoid null pointer access
If the packet cloning failed, bail out in order to avoid
null pointer access.

Fixes #81992
Coverity-CID: 434493

Signed-off-by: Jukka Rissanen <jukka.rissanen@nordicsemi.no>
(cherry picked from commit ed0dcca2fb)
2024-12-09 13:36:03 -08:00
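A hedged sketch of the two cleanup paths the bridge fixes above describe (the feed helper is hypothetical, not the actual bridge code): a failed clone must not be dereferenced, and a clone that could not be delivered must be unreferenced or its buffers leak.

```c
#include <errno.h>
#include <zephyr/net/net_if.h>
#include <zephyr/net/net_pkt.h>

extern int feed_to_bridge(struct net_pkt *clone, struct net_if *iface); /* hypothetical */

static int bridge_feed(struct net_pkt *pkt, struct net_if *iface)
{
	struct net_pkt *clone = net_pkt_clone(pkt, K_NO_WAIT);

	if (clone == NULL) {
		return -ENOMEM;       /* clone failed: bail out, no NULL dereference */
	}

	if (feed_to_bridge(clone, iface) < 0) {
		net_pkt_unref(clone); /* drop the clone instead of leaking its buffers */
		return -EIO;
	}

	return 0;
}
```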
Jamie McCrae
4d3a4eafd6 mgmt: mcumgr: grp: img_mgmt: Fix misplaced #endif
Fixes a misplaced #endif which wrongly excluded functions

Signed-off-by: Jamie McCrae <spam@helper3000.net>
(cherry picked from commit 5437ded36c)
2024-12-09 13:35:23 -08:00
Steven Poon
9c63873db2 net: lib: lwm2m: Fix missing mutex unlock
lwm2m_engine_set() and lwm2m_engine_get() lock
the registry_lock mutex, but it is not unlocked
when setting or getting a time resource whose buffer
length is invalid, resulting in an early return without
unlocking the mutex. This results in a deadlock when
another thread attempts to lock the registry.

Signed-off-by: Steven Poon <steven-github@outlook.com>
(cherry picked from commit 30b30c29a3)
2024-12-09 13:35:07 -08:00
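A minimal sketch of the early-return pattern the fix in 9c63873db2 addresses (the lock name mirrors the commit message; the length check and function are illustrative): every exit path entered with the registry lock held must release it, otherwise the next caller deadlocks.

```c
#include <errno.h>
#include <zephyr/kernel.h>

K_MUTEX_DEFINE(registry_lock);

int engine_set(const void *value, size_t len, size_t max_len)
{
	ARG_UNUSED(value);

	k_mutex_lock(&registry_lock, K_FOREVER);

	if (len > max_len) {
		k_mutex_unlock(&registry_lock); /* unlock before the early return */
		return -EINVAL;
	}

	/* ... store the value ... */
	k_mutex_unlock(&registry_lock);
	return 0;
}
```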
Jakub Rzeszutko
bd1db48d25 lib: shell: replace strtol with strtoul in cmd_load for address parsing
Addresses in cmd_load() should always be unsigned. Previously, strtol()
was used, which is limited to signed long values, causing issues with
addresses >= 0x80000000. This commit replaces strtol() with strtoul(),
ensuring proper handling of the full 32-bit address space.

Fixes #81343

Signed-off-by: Aaron Fontaine <aaron.fontaine@dojofive.com>
Signed-off-by: Jakub Rzeszutko <jakub.rzeszutko@verkada.com>
(cherry picked from commit 1aaf08f7f1)
2024-12-06 13:47:19 -08:00
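A standalone sketch of the parsing change described in bd1db48d25 (illustrative, not the shell's cmd_load() code): on targets with a 32-bit `long`, `strtol()` tops out at LONG_MAX, so an address such as 0x80000000 clamps; `strtoul()` covers the full unsigned 32-bit range.

```c
#include <inttypes.h>
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
	const char *arg = "0x80000000";
	/* base 0 accepts decimal, octal and 0x-prefixed hexadecimal input */
	uint32_t addr = (uint32_t)strtoul(arg, NULL, 0);

	printf("addr = 0x%" PRIx32 "\n", addr);
	return 0;
}
```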
Pavel Vasilyev
8ac7bde2b1 bluetooth: mesh: proxy_msg: Fix extracting role from k_work
Fix extracting role from k_work.
Hot fix for #78914

Signed-off-by: Pavel Vasilyev <pavel.vasilyev@nordicsemi.no>
(cherry picked from commit f5409bd3de)
2024-12-06 13:40:02 -08:00
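A hedged sketch of the container-recovery pattern 8ac7bde2b1 describes (the role struct and field names are illustrative, not the mesh proxy code): when the embedded work item is a `k_work_delayable`, the handler's `struct k_work *` must first be mapped back to the delayable work before `CONTAINER_OF()` can recover the enclosing role.

```c
#include <zephyr/kernel.h>

struct proxy_role {
	struct k_work_delayable sar_timer;
	int id;
};

static void sar_timeout(struct k_work *work)
{
	struct k_work_delayable *dwork = k_work_delayable_from_work(work);
	struct proxy_role *role = CONTAINER_OF(dwork, struct proxy_role, sar_timer);

	/* ... act on role->id ... */
	(void)role;
}
```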
Jerzy Kasenberg
98d9f3e551 i2c: target: eeprom_target: Fix buffer write
When the larger buffer index was introduced, only the
eeprom_target_write_received() function was updated to handle
address-width = 16.

This adds the same functionality when the buffered API is used,
enabled by CONFIG_I2C_TARGET_BUFFER_MODE.

Signed-off-by: Jerzy Kasenberg <jerzy.kasenberg@codecoup.pl>
(cherry picked from commit e6c9e9a2dd)
2024-12-06 09:55:36 -06:00
Yong Cong Sin
2c5fb2d916 arch: riscv: reg: include required header
Include `zephyr/sys/util.h` for the `STRINGIFY()` macro.

Signed-off-by: Yong Cong Sin <ycsin@meta.com>
(cherry picked from commit 45ebd390cf)
2024-12-01 14:49:38 -08:00
Gerson Fernando Budke
3b14973425 drivers: rtc: sam: Add platform on API test coverage
#81456 fixed the driver issue related to issue #81454. This adds
the RTC configuration on the sam_v71_xult board to enable test coverage.

Fixes #81454

Signed-off-by: Gerson Fernando Budke <nandojve@gmail.com>
(cherry picked from commit cffb66f5c6)
2024-12-01 14:44:05 -08:00
Alberto Escolar Piedras
aae851fdb5 bluetooth: CTS: Fix includes to avoid build error with some libCs
Remove unnecessary include in header and source file.

gmtime_r() is an extension to the C library, and therefore one
needs to explicitly ask for its prototype to have it exposed.
This is done by defining _POSIX_C_SOURCE so let's do so.

These two changes fix build errors with some libCs.
Tested with pico, newlib, minimal and the host glibc.

Signed-off-by: Alberto Escolar Piedras <alberto.escolar.piedras@nordicsemi.no>
(cherry picked from commit a785548df6)
2024-12-01 14:43:44 -08:00
Matthias Alleman
f7edec7da6 posix: fpu: Fix compiler error when enabling fpu on posix boards
Enabling CONFIG_FPU and CONFIG_FPU_SHARING requires the
definition of `arch_float_disable` and `arch_float_enable`.

Signed-off-by: Matthias Alleman <matthias.alleman@basalte.be>
(cherry picked from commit e7c353bf14)
2024-12-01 14:43:31 -08:00
Andrej Butok
371e3a6769 boards: nxp: fix s26ks512s0 flash write-block-size
- Sets the s26ks512s0 flash write-block-size to the correct 256KB.
- Optimizes the MCUboot partitions to fit the correct write-block-size.

Fixes #80284

Signed-off-by: Andrej Butok <andrey.butok@nxp.com>
(cherry picked from commit a730d9abad)
2024-11-22 13:03:12 -06:00
Chaitanya Tata
07a0ae1a99 drivers: nrfwifi: Remove passing unused flag
This flag is now unused in OSAL as it takes the input via the API.

Signed-off-by: Chaitanya Tata <Chaitanya.Tata@nordicsemi.no>
(cherry picked from commit f537cf311d)
2024-11-22 13:02:54 -06:00
Kapil Bhatt
966da750e1 drivers: wifi: Fix offloaded raw TX feature flags
Pass passive scan and offloaded raw tx feature flags to OSAL.

Signed-off-by: Kapil Bhatt <kapil.bhatt@nordicsemi.no>
(cherry picked from commit 62e06a5072)
2024-11-22 13:02:54 -06:00
Benjamin Cabé
01a3232eeb doc: doxygen: improve formatting for kconfig alias
\verbatim is not giving the right output as we need an inline literal.
Switch to \c instead.

Fixes #81595.

Signed-off-by: Benjamin Cabé <benjamin@zephyrproject.org>
(cherry picked from commit 83356e924a)
2024-11-22 13:01:56 -06:00
23599 changed files with 297223 additions and 1003446 deletions


@@ -5,6 +5,7 @@
--min-conf-desc-length=1
--typedefsfile=scripts/checkpatch/typedefsfile
--ignore BRACES
--ignore PRINTK_WITHOUT_KERN_LEVEL
--ignore SPLIT_STRING
--ignore VOLATILE


@@ -82,7 +82,6 @@ ForEachMacros:
- 'HTTP_SERVICE_FOREACH_RESOURCE'
- 'I3C_BUS_FOR_EACH_I3CDEV'
- 'I3C_BUS_FOR_EACH_I2CDEV'
- 'MIN_HEAP_FOREACH'
IfMacros:
- 'CHECKIF'
# Disabled for now, see bug https://github.com/zephyrproject-rtos/zephyr/issues/48520
@@ -100,7 +99,6 @@ IndentCaseLabels: false
IndentGotoLabels: false
IndentWidth: 8
InsertBraces: true
InsertNewlineAtEOF: true
SpaceBeforeInheritanceColon: False
SpaceBeforeParens: ControlStatementsExceptControlMacros
SortIncludes: Never
@@ -113,4 +111,3 @@ WhitespaceSensitiveMacros:
- LISTIFY
- STRINGIFY
- Z_STRINGIFY
- DT_FOREACH_PROP_ELEM_SEP


@@ -0,0 +1,71 @@
---
name: Bug report
about: Create a report to help us improve Zephyr
title: ''
labels: bug
assignees: ''
---
<!--
**Notes**
Github Discussions (https://github.com/zephyrproject-rtos/zephyr/discussions)
are available to first verify that the issue is a genuine Zephyr bug and not a
consequence of Zephyr services misuse.
This issue list is only for bugs in the main Zephyr code base
(https://github.com/zephyrproject-rtos/zephyr/). If the bug is for a project
fork (such as NCS) specific feature, please open an issue in the fork project
instead.
-->
**Describe the bug**
<!--
A clear and concise description of what the bug is.
Please also mention any information which could help others to understand
the problem you're facing:
- What target platform are you using?
- What have you tried to diagnose or workaround this issue?
- Is this a regression? If yes, have you been able to "git bisect" it to a
specific commit?
- ...
-->
**To Reproduce**
<!--
Steps to reproduce the behavior:
1. mkdir build; cd build
2. cmake -DBOARD=board\_xyz
3. make
4. See error
-->
**Expected behavior**
<!--
A clear and concise description of what you expected to happen.
-->
**Impact**
<!--
What impact does this issue have on your progress (e.g., annoyance, showstopper)
-->
**Logs and console output**
<!--
If applicable, add console logs or other types of debug information
e.g Wireshark capture or Logic analyzer capture (upload in zip archive).
copy-and-paste text and put a code fence (\`\`\`) before and after, to help
explain the issue. (if unable to obtain text log, add a screenshot)
-->
**Environment (please complete the following information):**
- OS: (e.g. Linux, MacOS, Windows)
- Toolchain (e.g Zephyr SDK, ...)
- Commit SHA or Version used
**Additional context**
<!--
Add any other context that could be relevant to your issue, such as pin setting,
target configuration, ...
-->


@@ -1,85 +0,0 @@
name: Bug Report
description: File a bug report.
labels: ["bug"]
type: "Bug"
assignees: []
body:
- type: markdown
attributes:
value: |
Thanks for taking the time to fill out this bug report!
- type: textarea
id: what-happened
attributes:
label: Describe the bug
description: |
A clear and concise description of what the bug is.
placeholder: |
Please also mention any information which could help others to understand
the problem you're facing:
- What target platform are you using?
- What have you tried to diagnose or workaround this issue?
- Is this a regression? If yes, have you been able to "git bisect" it to a
specific commit?
validations:
required: true
- type: checkboxes
id: regression
attributes:
label: Regression
description: |
Check this box if this is a regression and provide a SHA if you were able to "git bisect" to a specific commit.
options:
- label: This is a regression.
required: false
- type: textarea
id: reproduce
attributes:
label: Steps to reproduce
description: |
Steps to reproduce the behavior.
placeholder: |
Steps to reproduce the behavior:
1. mkdir build; cd build
2. cmake -DBOARD=board\_xyz
3. make
4. See error
validations:
required: false
- type: textarea
id: logs
attributes:
label: Relevant log output
description: Please copy and paste any relevant log output. This will be automatically formatted into code, so no need for backticks.
render: shell
- type: dropdown
attributes:
label: Impact
description: Impact of this bug
multiple: false
options:
- Showstopper: Prevents release or major functionality; system unusable.
- Major: Severely degrades functionality; workaround is difficult or unavailable.
- Functional Limitation: Some features not working as expected, but system usable.
- Annoyance: Minor irritation; no significant impact on usability or functionality.
- Intermittent: Occurs occasionally; hard to reproduce.
- Not sure
default: 3
validations:
required: true
- type: textarea
id: env
attributes:
label: Environment
description: please complete the following information
placeholder: |
- OS: (e.g. Linux, MacOS, Windows)
- Toolchain (e.g Zephyr SDK, ...)
- Commit SHA or Version used
- type: textarea
id: context
attributes:
label: Additional Context
description: Provide other context that could be relevant to the bug, such as pin setting, target configuration, etc.


@@ -0,0 +1,28 @@
---
name: Enhancement
about: Suggest enhancements to existing features
title: ''
labels: Enhancement
assignees: ''
---
**Is your enhancement proposal related to a problem? Please describe.**
<!--
A clear and concise description of what the problem is.
-->
**Describe the solution you'd like**
<!--
A clear and concise description of what you want to happen.
-->
**Describe alternatives you've considered**
<!--
A clear and concise description of any alternative solutions or features you've considered.
-->
**Additional context**
<!--
Add any other context or graphics (drag-and-drop an image) about the feature request here.
-->


@@ -1,42 +0,0 @@
name: Enhancement
description: Submit an Enhancement
labels: ["Enhancement"]
type: "Enhancement"
assignees: []
body:
- type: markdown
attributes:
value: |
Thanks for taking the time to fill out this enhancement proposal.
- type: textarea
id: description
attributes:
label: Summary
description: |
Is your enhancement proposal related to a problem? Please describe.
placeholder: |
A clear and concise description of what the problem is.
validations:
required: true
- type: textarea
id: solution
attributes:
label: Describe the solution you'd like
description: |
Describe the solution you'd like
placeholder: |
A clear and concise description of what you want to happen.
validations:
required: true
- type: textarea
id: alternatives
attributes:
label: Alternatives
description: Describe alternatives you've considered
placeholder: |
A clear and concise description of any alternative solutions or features you've considered.
- type: textarea
id: context
attributes:
label: Additional Context
description: Add any other context or graphics (drag-and-drop an image) about the enhancement here.


@@ -0,0 +1,60 @@
---
name: RFC / Proposal
about: Submit an RFC / Proposal
title: ''
labels: RFC
assignees: ''
---
## Introduction
<!--
This section targets end users, TSC members, maintainers and anyone else that might
need a quick explanation of your proposed change.
-->
### Problem description
<!--
Why do we want this change and what problem are we trying to address?
-->
### Proposed change
<!--
A brief summary of the proposed change - the 10,000 ft view on what it will
change once this change is implemented.
-->
## Detailed RFC
<!--
In this section of the document the target audience is the dev team. Upon
reading this section each engineer should have a rather clear picture of what
needs to be done in order to implement the described feature.
-->
### Proposed change (Detailed)
<!--
This section is freeform - you should describe your change in as much detail
as possible. Please also ensure to include any context or background info here.
For example, do we have existing components which can be reused or altered.
By reading this section, each team member should be able to know what exactly
you're planning to change and how.
-->
### Dependencies
<!--
Highlight how the change may affect the rest of the project (new components,
modifications in other areas), or other teams/projects.
-->
### Concerns and Unresolved Questions
<!--
List any concerns, unknowns, and generally unresolved questions etc.
-->
## Alternatives
<!--
List any alternatives considered, and the reasons for choosing this option
over them.
-->


@@ -1,75 +0,0 @@
name: RFC / Proposal
description: Submit a Proposal (RFC)
labels: ["RFC"]
type: RFC
assignees: []
body:
- type: markdown
attributes:
value: |
## Introduction
This section targets end users, TSC members, maintainers and anyone else
that might need a quick explanation of your proposed change.
- type: textarea
id: problem-description
attributes:
label: Problem Description
description: Why do we want this change and what problem are we trying to address?
placeholder: Explain the problem or limitation this RFC is meant to resolve.
validations:
required: true
- type: textarea
id: proposed-change-summary
attributes:
label: Proposed Change (Summary)
description: A high-level summary of the proposed change.
placeholder: Brief summary of what will change if this RFC is implemented.
validations:
required: true
- type: markdown
attributes:
value: |
## Detailed RFC
This section targets the development team. Upon reading it, each engineer
should understand what must be done to implement the proposed feature.
- type: textarea
id: detailed-change
attributes:
label: Proposed Change (Detailed)
description: Describe the change in as much detail as possible. Include context or background info, and reuse of existing components if applicable.
placeholder: Explain exactly what you're planning to change and how.
validations:
required: true
- type: textarea
id: dependencies
attributes:
label: Dependencies
description: Highlight how this change may affect the rest of the project or other teams/components.
placeholder: List components, modules, or teams affected.
validations:
required: false
- type: textarea
id: concerns
attributes:
label: Concerns and Unresolved Questions
description: List any concerns, unknowns, or unresolved questions related to this proposal.
placeholder: Any areas of uncertainty?
validations:
required: false
- type: textarea
id: alternatives
attributes:
label: Alternatives Considered
description: What alternative solutions were considered? Why was this proposal chosen?
placeholder: List alternatives and explain the rationale behind your choice.
validations:
required: false

View File

@@ -0,0 +1,28 @@
---
name: Feature request
about: Suggest an idea for this project
title: ''
labels: Feature Request
assignees: ''
---
**Is your feature request related to a problem? Please describe.**
<!--
A clear and concise description of what the problem is.
-->
**Describe the solution you'd like**
<!--
A clear and concise description of what you want to happen.
-->
**Describe alternatives you've considered**
<!--
A clear and concise description of any alternative solutions or features you've considered.
-->
**Additional context**
<!--
Add any other context or graphics (drag-and-drop an image) about the feature request here.
-->

View File

@@ -1,29 +0,0 @@
name: Feature Request
description: Suggest a new feature or enhancement
labels: ["Feature Request"]
type: Feature
assignees: []
body:
- type: textarea
id: problem
attributes:
label: Is your feature request related to a problem? Please describe.
description: A clear and concise description of what the problem is.
placeholder: e.g., I'm frustrated when I need to do X manually because Y is missing.
validations:
required: true
- type: textarea
id: solution
attributes:
label: Describe the solution you'd like
description: A clear and concise description of what you want to happen.
placeholder: e.g., It would be great if the system could automatically handle X by doing Y.
validations:
required: true
- type: textarea
id: alternatives
attributes:
label: Describe alternatives you've considered
description: Include any alternative solutions or features

View File

@@ -0,0 +1,42 @@
---
name: Contributor Nomination
about: Nominate a GitHub user for additional rights on the Zephyr Project
title: ''
labels: Role Nomination
assignees: ''
---
# Background
The [TSC Project Roles] defines the main roles for the Zephyr Project, including
Maintainer, Collaborator, and Contributor.
By default, anyone who contributes code or documentation is a Contributor, but
with the lowest [GitHub Permission Level] of Read. For example, Contributors
with Read permission cannot add reviewers to a pull request.
Use this template to nominate a GitHub user for the Contributor role with
Triage permission level, which allows the user to add reviewers to a pull
request and be added as a reviewer by other users.
# Nomination
## GitHub User
Provide the following information about the GitHub user:
1. Full Name
1. GitHub username
1. Organization (optional)
## Supporting Documents
Add links to 3-5 GitHub pull requests, in the Zephyr project, authored or
reviewed by the GitHub user that demonstrate the user's dedication to the
Zephyr project.
[TSC Project Roles]: <https://docs.zephyrproject.org/latest/project/project_roles.html>
[GitHub Permission Level]: <https://docs.github.com/en/organizations/managing-access-to-your-organizations-repositories/repository-roles-for-an-organization>

View File

@@ -1,57 +0,0 @@
name: Contributor Nomination
description: Nominate a GitHub user for the Contributor role with triage permissions
labels: [Role Nomination]
assignees: ['nashif']
body:
- type: markdown
attributes:
value: |
## Background
The [TSC Project Roles](https://docs.zephyrproject.org/latest/project/project_roles.html) defines the main roles for the Zephyr Project, including Maintainer, Collaborator, and Contributor.
By default, anyone who contributes code or documentation is a Contributor, but with the lowest [GitHub Permission Level](https://docs.github.com/en/organizations/managing-access-to-your-organizations-repositories/repository-roles-for-an-organization) of **Read**.
Use this form to nominate a user for the **Contributor** role with **Triage** permission, which allows the user to:
- Add reviewers to pull requests
- Be added as a reviewer by others
- type: input
id: full-name
attributes:
label: Full Name
description: Full name of the nominated contributor.
placeholder: e.g., Jane Doe
validations:
required: true
- type: input
id: github-username
attributes:
label: GitHub Username
description: GitHub handle of the nominated contributor.
placeholder: e.g., @janedoe
validations:
required: true
- type: input
id: organization
attributes:
label: Organization
description: Organization the nominee is affiliated with (optional).
placeholder: e.g., Acme Corp
validations:
required: false
- type: textarea
id: supporting-documents
attributes:
label: Supporting Documents
description: Provide links to 3-5 pull requests authored or reviewed by the nominee that demonstrate their dedication to the Zephyr project.
placeholder: |
e.g.,
- https://github.com/zephyrproject-rtos/zephyr/pull/12345
- https://github.com/zephyrproject-rtos/zephyr/pull/23456
- https://github.com/zephyrproject-rtos/zephyr/pull/34567
validations:
required: true

View File

@@ -0,0 +1,68 @@
---
name: External Source Code
about: Submit a proposal to integrate external source code
title: ''
labels: TSC
assignees: ''
---
## Origin
Name of project hosting the original open source code
Provide a link to the source
## Purpose
Brief description of what this software does
## Mode of integration
Describe whether you'd like to integrate this external component in the main tree
or as a module, and why. If the mode of integration is a module, suggest a
repository name for the module
## Maintainership
List the person(s) that will be maintaining the integration of this external code
for the foreseeable future. Please use GitHub IDs to identify them. You can
choose to identify a single maintainer only or add collaborators as well
## Pull Request
Pull request (if any) with the actual implementation of the integration, be it
in the main tree or as a module (pointing to your own fork for now). Make sure
the PR is correctly labeled as "DNM"
## Description
Long description that will help reviewers discuss suitability of the
component to solve the problem at hand (there may be better options
available).
What is its primary functionality (e.g., SQLite is a lightweight
database)?
What problem are you trying to solve? (e.g., a state store is
required to maintain ...)
Why is this the right component to solve it? (e.g., SQLite is small,
easy to use, and has a very liberal license.)
## Dependencies
What other components does this package depend on?
Will the Zephyr project have a direct dependency on the component, or
will it be included via an abstraction layer with this component as a
replaceable implementation?
## Revision
Version or SHA you would like to integrate initially
## License
Please use an SPDX identifier (https://spdx.org/licenses/), such as
``BSD-3-Clause``

View File

@@ -1,106 +0,0 @@
name: External Component Integration
description: Propose integration of an external open source component
labels: ["TSC"]
assignees: []
body:
- type: textarea
id: origin
attributes:
label: Origin
description: Name of project hosting the original open source code. Provide a link to the source.
placeholder: e.g., SQLite - https://sqlite.org
validations:
required: true
- type: textarea
id: purpose
attributes:
label: Purpose
description: Brief description of what this software does.
placeholder: |
e.g., A small, fast, self-contained SQL database engine.
validations:
required: true
- type: textarea
id: integration-mode
attributes:
label: Mode of Integration
description: Should this be integrated in the main tree or as a module? Explain your choice and suggest a module repo name if applicable.
placeholder: |
e.g., As a module - proposed repo name: zephyr-sqlite
validations:
required: true
- type: textarea
id: maintainership
attributes:
label: Maintainership
description: List maintainers (GitHub IDs) for this integration. Include at least one primary maintainer.
placeholder: |
e.g., @username1 (primary), @username2 (collaborator)
validations:
required: true
- type: input
id: pull-request
attributes:
label: Pull Request
description: Link to the pull request (if any) for this integration. Must be labeled "DNM" (Do Not Merge).
placeholder: |
e.g., https://github.com/zephyrproject-rtos/zephyr/pull/12345
validations:
required: false
- type: textarea
id: description
attributes:
label: Description
description: Long-form description to justify suitability of this component.
placeholder: |
- What is its primary functionality?
- What problem does it solve?
- Why is this the right component?
validations:
required: true
- type: textarea
id: security
attributes:
label: Security
description: Security-related aspects of this component, including cryptographic functions and known vulnerabilities.
placeholder: |
- Does it use cryptography?
- How are vulnerabilities handled?
- Any known CVEs?
validations:
required: false
- type: textarea
id: dependencies
attributes:
label: Dependencies
description: What does this component depend on, and how will it be integrated (directly or via abstraction)?
placeholder: |
- Other external packages?
- Direct or abstracted use in Zephyr?
validations:
required: false
- type: input
id: revision
attributes:
label: Version or SHA
description: Which version or specific commit should be initially integrated?
placeholder: e.g., v3.45.0 or 79cc94d
validations:
required: true
- type: input
id: license
attributes:
label: License (SPDX)
description: Provide the license using a valid SPDX identifier (e.g., BSD-3-Clause).
placeholder: e.g., MIT or BSD-3-Clause
validations:
required: true

.github/ISSUE_TEMPLATE/008_bin-blobs.md
View File

@@ -0,0 +1,41 @@
---
name: Binary blobs
about: Submit a proposal to integrate binary blob(s)
title: ''
labels: TSC
assignees: ''
---
## Origin
Describe where the binary blob(s) originate from
## Type
- [ ] Precompiled library
- [ ] Firmware image
## Module
The Zephyr module that this blob(s) will be referenced from
## Purpose
Brief description of what the blob(s) do. It is especially important to describe
the functionality that the blob(s) provide, to the largest extent possible
## Pull Request
Link to the Pull request with the actual implementation of the integration. If
you are submitting this as part of a new module you may link to the same Pull
Request that introduced the new module.
## Dependencies
What other components do the blob(s) depend on, if any?
## License
Document the license the blob(s) are distributed under

View File

@@ -1,5 +0,0 @@
blank_issues_enabled: false
contact_links:
- name: Zephyr Community Support
url: https://github.com/zephyrproject-rtos/zephyr/discussions
about: Please ask and answer questions here.

.github/SECURITY.md
View File

@@ -8,12 +8,12 @@ updates:
- The most recent release, and the release prior to that.
- Active LTS releases.
At this time, with the latest release of v4.2, the supported
At this time, with the latest release of v3.6, the supported
versions are:
- v4.2: Current release
- v4.1: Prior release
- v3.7: Current LTS
- v3.6: Prior release
- v2.7: Prior LTS
## Reporting process

View File

@@ -1,2 +0,0 @@
paths:
- .github

View File

@@ -1,2 +0,0 @@
paths:
- doc

View File

@@ -1,26 +0,0 @@
version: 2
enable-beta-ecosystems: true
updates:
- package-ecosystem: "github-actions"
directory: "/"
schedule:
interval: "weekly"
commit-message:
prefix: "ci: github: "
labels: []
groups:
actions-deps:
patterns:
- "*"
- package-ecosystem: "uv"
directory: "/doc"
schedule:
interval: "weekly"
commit-message:
prefix: "ci: doc: "
labels: []
groups:
doc-deps:
patterns:
- "*"

View File

@@ -15,36 +15,29 @@ on:
types:
- labeled
permissions:
contents: read
jobs:
assignment:
name: Pull Request Assignment
if: github.event.pull_request.draft == false
runs-on: ubuntu-24.04
permissions:
pull-requests: write # to add assignees to pull requests
issues: write # to add assignees to issues
runs-on: ubuntu-22.04
steps:
- name: Check out source code
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Set up Python
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
uses: actions/setup-python@v5
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
- name: Install Python dependencies
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
sudo pip3 install -U setuptools wheel pip
pip3 install -U PyGithub>=1.55 west
- name: Check out source code
uses: actions/checkout@v4
- name: Run assignment script
env:
GITHUB_TOKEN: ${{ secrets.ZB_PR_ASSIGNER_GITHUB_TOKEN }}
GITHUB_TOKEN: ${{ secrets.ZB_GITHUB_TOKEN }}
run: |
FLAGS="-v"
FLAGS+=" -o ${{ github.event.repository.owner.login }}"

View File

@@ -7,17 +7,10 @@ on:
branches:
- main
permissions:
contents: read
jobs:
backport:
name: Backport
runs-on: ubuntu-24.04
permissions:
contents: write # to create/push backport branches
pull-requests: write # to create backport PRs
issues: write # to add labels to issue created if backport fails
runs-on: ubuntu-22.04
# Only react to merged PRs for security reasons.
# See https://docs.github.com/en/actions/using-workflows/events-that-trigger-workflows#pull_request_target.
if: >
@@ -31,8 +24,8 @@ jobs:
)
steps:
- name: Backport
uses: zephyrproject-rtos/action-backport@7e74f601d11eaca577742445e87775b5651a965f # v2.0.3-3
uses: zephyrproject-rtos/action-backport@v2.0.3-3
with:
github_token: ${{ secrets.GITHUB_TOKEN }}
github_token: ${{ secrets.ZB_GITHUB_TOKEN }}
issue_labels: Backport
labels_template: '["Backport"]'

View File

@@ -10,41 +10,30 @@ on:
branches:
- v*-branch
permissions:
contents: read
jobs:
backport:
name: Backport Issue Check
concurrency:
group: backport-issue-check-${{ github.ref }}
cancel-in-progress: true
runs-on: ubuntu-24.04
runs-on: ubuntu-22.04
if: github.repository == 'zephyrproject-rtos/zephyr'
permissions:
issues: read # to check if associated issue exists for backport
steps:
- name: Check out source code
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
uses: actions/checkout@v4
- name: Set up Python
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
- name: Install Python dependencies
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
sudo pip3 install -U setuptools wheel pip
pip3 install -U pygithub
- name: Run backport issue checker
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
GITHUB_TOKEN: ${{ secrets.ZB_GITHUB_TOKEN }}
run: |
./scripts/release/list_backports.py \
-o ${{ github.event.repository.owner.login }} \
-r ${{ github.event.repository.name }} \
-b ${{ github.event.pull_request.base.ref }} \
-p ${{ github.event.pull_request.number }}
-o ${{ github.event.repository.owner.login }} \
-r ${{ github.event.repository.name }} \
-b ${{ github.event.pull_request.base.ref }} \
-p ${{ github.event.pull_request.number }}

View File

@@ -5,26 +5,20 @@ on:
workflows: ["BabbleSim Tests"]
types:
- completed
permissions:
contents: read
jobs:
bsim-test-results:
name: "Publish BabbleSim Test Results"
runs-on: ubuntu-24.04
runs-on: ubuntu-22.04
if: github.event.workflow_run.conclusion != 'skipped'
permissions:
checks: write # to create the check run entry with test results
steps:
- name: Download artifacts
uses: dawidd6/action-download-artifact@ac66b43f0e6a346234dd65d4d0c8fbb31cb316e5 # v11
uses: dawidd6/action-download-artifact@v6
with:
run_id: ${{ github.event.workflow_run.id }}
- name: Publish BabbleSim Test Results
uses: EnricoMi/publish-unit-test-result-action@3a74b2957438d0b6e2e61d67b05318aa25c9e6c6 # v2.20.0
uses: EnricoMi/publish-unit-test-result-action@v2
with:
check_name: BabbleSim Test Results
comment_mode: off

View File

@@ -10,15 +10,14 @@ on:
- "tests/bsim/**"
- "boards/nordic/nrf5*/*dt*"
- "dts/*/nordic/**"
- "tests/bluetooth/**"
- "tests/bluetooth/common/testlib/**"
- "samples/bluetooth/**"
- "boards/native/**"
- "soc/native/**"
- "boards/posix/**"
- "soc/posix/**"
- "arch/posix/**"
- "include/zephyr/arch/posix/**"
- "scripts/native_simulator/**"
- "samples/net/sockets/echo_*/**"
- "modules/hal_nordic/**"
- "modules/mbedtls/**"
- "modules/openthread/**"
- "subsys/net/l2/openthread/**"
@@ -27,10 +26,6 @@ on:
- "include/zephyr/net/ieee802154*"
- "drivers/serial/*nrfx*"
- "tests/drivers/uart/**"
- '!**.rst'
permissions:
contents: read
concurrency:
group: ${{ github.workflow }}-${{ github.event_name }}-${{ github.head_ref || github.ref }}
@@ -42,16 +37,13 @@ jobs:
runs-on:
group: zephyr-runner-v2-linux-x64-4xlarge
container:
image: ghcr.io/zephyrproject-rtos/ci-repo-cache:v0.28.1.20250624
image: ghcr.io/zephyrproject-rtos/ci-repo-cache:v0.27.4.20241026
options: '--entrypoint /bin/bash'
env:
ZEPHYR_TOOLCHAIN_VARIANT: zephyr
BSIM_OUT_PATH: /opt/bsim/
BSIM_COMPONENTS_PATH: /opt/bsim/components
EDTT_PATH: ../tools/edtt
permissions:
checks: write # to create the check run entry with test results
steps:
- name: Apply container owner mismatch workaround
run: |
@@ -74,7 +66,7 @@ jobs:
git remote set-url origin ${GITHUB_SERVER_URL}/${GITHUB_REPOSITORY}
- name: Checkout
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
uses: actions/checkout@v4
with:
fetch-depth: 0
@@ -85,7 +77,6 @@ jobs:
git config --global user.email "bot@zephyrproject.org"
git config --global user.name "Zephyr Bot"
rm -fr ".git/rebase-apply"
rm -fr ".git/rebase-merge"
git rebase origin/${BASE_REF}
git clean -f -d
git log --pretty=oneline | head -n 10
@@ -98,15 +89,15 @@ jobs:
echo "ZEPHYR_SDK_INSTALL_DIR=/opt/toolchains/zephyr-sdk-$( cat SDK_VERSION )" >> $GITHUB_ENV
- name: Check common triggering files
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
uses: tj-actions/changed-files@v45
id: check-common-files
with:
files: |
.github/workflows/bsim-tests.yaml
.github/workflows/bsim-tests-publish.yaml
west.yml
boards/native/
soc/native/
boards/posix/
soc/posix/
arch/posix/
include/zephyr/arch/posix/
scripts/native_simulator/
@@ -114,20 +105,18 @@ jobs:
boards/nordic/nrf5*/*dt*
dts/*/nordic/
modules/mbedtls/**
modules/hal_nordic/**
- name: Check if Bluetooth files changed
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
uses: tj-actions/changed-files@v45
id: check-bluetooth-files
with:
files: |
tests/bsim/bluetooth/
samples/bluetooth/
subsys/bluetooth/
tests/bluetooth/
tests/bsim/bluetooth/
- name: Check if Networking files changed
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
uses: tj-actions/changed-files@v45
id: check-networking-files
with:
files: |
@@ -140,7 +129,7 @@ jobs:
include/zephyr/net/ieee802154*
- name: Check if UART files changed
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
uses: tj-actions/changed-files@v45
id: check-uart-files
with:
files: |
@@ -180,12 +169,13 @@ jobs:
- name: Merge Test Results
run: |
pip3 install junitparser junit2html
junitparser merge --glob "./bsim_*/*bsim_results.*.xml" "./twister-out/twister.xml" junit.xml
junit2html junit.xml junit.html
- name: Upload Unit Test Results in HTML
if: always()
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
uses: actions/upload-artifact@v4
with:
name: HTML Unit Test Results
if-no-files-found: ignore
@@ -193,7 +183,7 @@ jobs:
junit.html
- name: Publish Unit Test Results
uses: EnricoMi/publish-unit-test-result-action@3a74b2957438d0b6e2e61d67b05318aa25c9e6c6 # v2.20.0
uses: EnricoMi/publish-unit-test-result-action@v2
with:
check_name: Bsim Test Results
files: "junit.xml"
@@ -201,7 +191,7 @@ jobs:
- name: Upload Event Details
if: always()
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
uses: actions/upload-artifact@v4
with:
name: event
path: |

View File

@@ -10,33 +10,23 @@ on:
workflow_dispatch:
branches: [main]
schedule:
# Run daily at 00:05
- cron: '5 00 * * *'
permissions:
contents: read
# Run daily at 14:05
- cron: '5 14 * * *'
jobs:
make_bugs_pickle:
name: Make bugs pickle
runs-on: ubuntu-24.04
runs-on: ubuntu-22.04
if: github.repository_owner == 'zephyrproject-rtos'
steps:
- name: Checkout
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
uses: actions/checkout@v4
- name: Set up Python
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
- name: Install Python dependencies
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
sudo pip3 install -U setuptools wheel pip
pip3 install -U pygithub
- name: Snapshot bugs
env:
@@ -52,7 +42,7 @@ jobs:
echo "BUGS_PICKLE_PATH=${BUGS_PICKLE_PATH}" >> ${GITHUB_ENV}
- name: Configure AWS Credentials
uses: aws-actions/configure-aws-credentials@b47578312673ae6fa5b5096b330d9fbac3d116df # v4.2.1
uses: aws-actions/configure-aws-credentials@v4
with:
aws-access-key-id: ${{ vars.AWS_BUILDS_ZEPHYR_BUG_SNAPSHOT_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_BUILDS_ZEPHYR_BUG_SNAPSHOT_SECRET_ACCESS_KEY }}

View File

@@ -1,12 +1,6 @@
name: Build with Clang/LLVM
on:
push:
branches:
- main
- v*-branch
- collab-*
permissions:
contents: read
on: pull_request_target
concurrency:
group: ${{ github.workflow }}-${{ github.event_name }}-${{ github.head_ref || github.ref }}
@@ -18,19 +12,21 @@ jobs:
runs-on:
group: zephyr-runner-v2-linux-x64-4xlarge
container:
image: ghcr.io/zephyrproject-rtos/ci-repo-cache:v0.28.1.20250624
image: ghcr.io/zephyrproject-rtos/ci-repo-cache:v0.27.4.20241026
options: '--entrypoint /bin/bash'
strategy:
fail-fast: false
matrix:
subset: [1, 2]
platform: ["native_sim"]
env:
CCACHE_DIR: /node-cache/ccache-zephyr
CCACHE_REMOTE_STORAGE: "redis://cache-*.keydb-cache.svc.cluster.local|shards=1,2,3"
CCACHE_REMOTE_ONLY: "true"
CCACHE_IGNOREOPTIONS: '-specs=* --specs=*'
LLVM_TOOLCHAIN_PATH: /usr/lib/llvm-20
LLVM_TOOLCHAIN_PATH: /usr/lib/llvm-16
COMMIT_RANGE: ${{ github.event.pull_request.base.sha }}..${{ github.event.pull_request.head.sha }}
BASE_REF: ${{ github.base_ref }}
outputs:
report_needed: ${{ steps.twister.outputs.report_needed }}
steps:
- name: Apply container owner mismatch workaround
run: |
@@ -53,8 +49,9 @@ jobs:
git remote set-url origin ${GITHUB_SERVER_URL}/${GITHUB_REPOSITORY}
- name: Checkout
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
uses: actions/checkout@v4
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
persist-credentials: false
@@ -64,7 +61,7 @@ jobs:
git config --global user.email "bot@zephyrproject.org"
git config --global user.name "Zephyr Bot"
rm -fr ".git/rebase-apply"
rm -fr ".git/rebase-merge"
git rebase origin/${BASE_REF}
git clean -f -d
git log --pretty=oneline | head -n 10
west init -l . || true
@@ -109,8 +106,18 @@ jobs:
export ZEPHYR_BASE=${PWD}
export ZEPHYR_TOOLCHAIN_VARIANT=llvm
./scripts/twister -p native_sim --no-detailed-test-id --force-color --inline-logs -M -N -v --retry-failed 2 \
-T tests --subset ${{matrix.subset}}/2 -j 16
# check if we need to run a full twister or not based on files changed
python3 ./scripts/ci/test_plan.py --platform ${{ matrix.platform }} -c origin/${BASE_REF}..
# We can limit scope to just what has changed
if [ -s testplan.json ]; then
echo "report_needed=1" >> $GITHUB_OUTPUT
# Full twister but with options based on changes
./scripts/twister --force-color --inline-logs -M -N -v --load-tests testplan.json --retry-failed 2
else
# if nothing is run, skip reporting step
echo "report_needed=0" >> $GITHUB_OUTPUT
fi
- name: Print ccache stats
if: always()
@@ -118,53 +125,31 @@ jobs:
ccache -s -vv
- name: Upload Unit Test Results
if: always()
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
if: always() && steps.twister.outputs.report_needed != 0
uses: actions/upload-artifact@v4
with:
name: Unit Test Results (Subset ${{ matrix.subset }})
path: |
twister-out/twister.xml
twister-out/twister.json
if-no-files-found: ignore
name: Unit Test Results (Subset ${{ matrix.platform }})
path: twister-out/twister.xml
clang-build-results:
name: "Publish Unit Tests Results"
needs: clang-build
runs-on: ubuntu-24.04
permissions:
checks: write # to create GitHub annotations
if: (success() || failure())
runs-on: ubuntu-22.04
if: (success() || failure() ) && needs.clang-build.outputs.report_needed != 0
steps:
- name: Checkout
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
fetch-depth: 0
persist-credentials: false
- name: Download Artifacts
uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
uses: actions/download-artifact@v4
with:
path: artifacts
- name: Set up Python
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Merge Test Results
run: |
pip3 install junitparser junit2html
junitparser merge artifacts/*/twister.xml junit.xml
junit2html junit.xml junit-clang.html
- name: Upload Unit Test Results in HTML
if: always()
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
uses: actions/upload-artifact@v4
with:
name: HTML Unit Test Results
if-no-files-found: ignore
@@ -172,7 +157,7 @@ jobs:
junit-clang.html
- name: Publish Unit Test Results
uses: EnricoMi/publish-unit-test-result-action@3a74b2957438d0b6e2e61d67b05318aa25c9e6c6 # v2.20.0
uses: EnricoMi/publish-unit-test-result-action@v2
if: always()
with:
check_name: Unit Test Results

View File

@@ -4,9 +4,6 @@ on:
schedule:
- cron: '25 06,18 * * *'
permissions:
contents: read
concurrency:
group: ${{ github.workflow }}-${{ github.event_name }}-${{ github.head_ref || github.ref }}
cancel-in-progress: true
@@ -17,7 +14,7 @@ jobs:
runs-on:
group: zephyr-runner-v2-linux-x64-4xlarge
container:
image: ghcr.io/zephyrproject-rtos/ci-repo-cache:v0.28.1.20250624
image: ghcr.io/zephyrproject-rtos/ci-repo-cache:v0.27.4.20241026
options: '--entrypoint /bin/bash'
strategy:
fail-fast: false
@@ -64,21 +61,10 @@ jobs:
git remote set-url origin ${GITHUB_SERVER_URL}/${GITHUB_REPOSITORY}
- name: checkout
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Set up Python
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: west setup
run: |
west init -l . || true
@@ -99,13 +85,24 @@ jobs:
ccache -p
ccache -z -s -vv
- name: Update BabbleSim to manifest revision
run: |
export BSIM_VERSION=$( west list bsim -f {revision} )
echo "Manifest points to bsim sha $BSIM_VERSION"
cd /opt/bsim_west/bsim
git fetch -n origin ${BSIM_VERSION}
git -c advice.detachedHead=false checkout ${BSIM_VERSION}
west update
make everything -s -j 8
- name: Run Tests with Twister (Push)
continue-on-error: true
run: |
export ZEPHYR_BASE=${PWD}
export ZEPHYR_TOOLCHAIN_VARIANT=zephyr
mkdir -p coverage/reports
./scripts/twister --save-tests ${{matrix.normalized}}-testplan.json
pip3 install gcovr==6.0
./scripts/twister -E ${{matrix.normalized}}-testplan.json
ls -la
./scripts/twister \
-i --force-color -N -v --filter runnable -p ${{ matrix.platform }} --coverage \
@@ -124,7 +121,7 @@ jobs:
- name: Upload Coverage Results
if: always()
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
uses: actions/upload-artifact@v4
with:
name: Coverage Data (Subset ${{ matrix.normalized }})
path: |
@@ -134,29 +131,18 @@ jobs:
codecov-results:
name: "Publish Coverage Results"
needs: codecov
runs-on: ubuntu-24.04
runs-on: ubuntu-22.04
# the codecov job might be skipped, we don't need to run this job then
if: success() || failure()
steps:
- name: checkout
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Set up Python
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Download Artifacts
uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
uses: actions/download-artifact@v4
with:
path: coverage/reports
@@ -196,6 +182,7 @@ jobs:
- name: Merge coverage files
run: |
pushd ./coverage/reports
pip3 install gcovr==6.0
gcovr ${{ steps.get-coverage-files.outputs.mergefiles }} --merge-mode-functions=separate --json merged.json
gcovr ${{ steps.get-coverage-files.outputs.mergefiles }} --merge-mode-functions=separate --cobertura merged.xml
popd
@@ -211,6 +198,7 @@ jobs:
- name: Generate Coverage Report
if: always()
run: |
pip install xlsxwriter ijson
python3 ./scripts/ci/coverage/coverage_analysis.py \
-t native_sim-testplan.json \
-m MAINTAINERS.yml \
@@ -221,7 +209,7 @@ jobs:
- name: Upload Merged Coverage Results and Report
if: always()
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
uses: actions/upload-artifact@v4
with:
name: Coverage Data and report
path: |
@@ -232,7 +220,7 @@ jobs:
- name: Upload coverage to Codecov
if: always()
uses: codecov/codecov-action@18283e04ce6e62d37312384ff67231eb8fd56d24 # v5.4.3
uses: codecov/codecov-action@v4
with:
env_vars: OS,PYTHON
fail_ci_if_error: false

View File

@@ -1,58 +0,0 @@
name: "CodeQL"
on:
push:
branches:
- main
- v*-branch
- collab-*
schedule:
- cron: '34 16 * * 6'
pull_request:
branches:
- main
- v*-branch
- collab-*
permissions:
contents: read
jobs:
analyze:
name: Analyze (${{ matrix.language }})
runs-on: ubuntu-24.04
permissions:
security-events: write
strategy:
fail-fast: false
matrix:
include:
- language: python
build-mode: none
- language: actions
build-mode: none
config: ./.github/codeql/codeql-actions-config.yml
- language: javascript-typescript
build-mode: none
config: ./.github/codeql/codeql-js-config.yml
steps:
- name: Checkout
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Initialize CodeQL
uses: github/codeql-action/init@181d5eefc20863364f96762470ba6f862bdef56b # v3.29.2
with:
languages: ${{ matrix.language }}
build-mode: ${{ matrix.build-mode }}
queries: security-extended
config-file: ${{ matrix.config }}
- if: matrix.build-mode == 'manual'
shell: bash
run: |
echo "nothing yet"
exit 0
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@181d5eefc20863364f96762470ba6f862bdef56b # v3.29.2
with:
category: "/language:${{matrix.language}}"

View File

@@ -2,37 +2,35 @@ name: Coding Guidelines
on: pull_request
permissions:
contents: read
jobs:
compliance_job:
runs-on: ubuntu-24.04
runs-on: ubuntu-22.04
name: Run coding guidelines checks on patch series (PR)
steps:
- name: Checkout the code
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
uses: actions/checkout@v4
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
- name: Set up Python
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
- name: cache-pip
uses: actions/cache@v4
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
path: ~/.cache/pip
key: ${{ runner.os }}-pip-${{ hashFiles('.github/workflows/coding_guidelines.yml') }}
- name: Install Python packages
- name: Install python dependencies
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
pip3 install unidiff
pip3 install wheel
pip3 install sh
- name: Install Packages
run: |
sudo apt-get update
sudo apt-get install coccinelle
- name: Run Coding Guidelines Checks
- name: Run Coding Guildeines Checks
continue-on-error: true
id: coding_guidelines
env:
@@ -42,8 +40,6 @@ jobs:
git config --global user.email "actions@zephyrproject.org"
git config --global user.name "Github Actions"
git remote -v
rm -fr ".git/rebase-apply"
rm -fr ".git/rebase-merge"
git rebase origin/${BASE_REF}
git clean -f -d
source zephyr-env.sh

View File

@@ -8,12 +8,9 @@ on:
- reopened
- synchronize
permissions:
contents: read
jobs:
check_compliance:
runs-on: ubuntu-24.04
runs-on: ubuntu-22.04
name: Run compliance checks on patch series (PR)
steps:
- name: Update PATH for west
@@ -21,12 +18,30 @@ jobs:
echo "$HOME/.local/bin" >> $GITHUB_PATH
- name: Checkout the code
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
uses: actions/checkout@v4
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
- name: Rebase onto the target branch
- name: Set up Python
uses: actions/setup-python@v5
with:
python-version: 3.11
- name: cache-pip
uses: actions/cache@v4
with:
path: ~/.cache/pip
key: ${{ runner.os }}-pip-${{ hashFiles('.github/workflows/compliance.yml') }}
- name: Install python dependencies
run: |
pip3 install setuptools
pip3 install wheel
pip3 install python-magic lxml junitparser gitlint pylint pykwalify yamllint clang-format unidiff sphinx-lint
pip3 install west
- name: west setup
env:
BASE_REF: ${{ github.base_ref }}
run: |
@@ -36,30 +51,22 @@ jobs:
# Ensure there's no merge commits in the PR
[[ "$(git rev-list --merges --count origin/${BASE_REF}..)" == "0" ]] || \
(echo "::error ::Merge commits not allowed, rebase instead";false)
rm -fr ".git/rebase-apply"
rm -fr ".git/rebase-merge"
git rebase origin/${BASE_REF}
git clean -f -d
# debug
git log --pretty=oneline | head -n 10
- name: Set up Python
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: west setup
run: |
west init -l . || true
west config manifest.group-filter -- +ci,-optional
west update -o=--depth=1 -n 2>&1 1> west.update.log || west update -o=--depth=1 -n 2>&1 1> west.update2.log
- name: Check for PR description
if: ${{ github.event.pull_request.body == '' }}
continue-on-error: true
id: pr_description
run: |
echo "Pull request description cannot be empty."
exit 1
- name: Run Compliance Tests
continue-on-error: true
id: compliance
@@ -72,15 +79,11 @@ jobs:
git log --pretty=oneline | head -n 10
# Increase rename limit to allow for large PRs
git config diff.renameLimit 10000
excludes="-e KconfigBasic -e SysbuildKconfigBasic -e ClangFormat"
# The signed-off-by check for dependabot should be skipped
if [ "${{ github.actor }}" == "dependabot[bot]" ]; then
excludes="$excludes -e Identity"
fi
./scripts/ci/check_compliance.py --annotate $excludes -c origin/${BASE_REF}..
./scripts/ci/check_compliance.py --annotate -e KconfigBasic \
-c origin/${BASE_REF}..
- name: upload-results
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
uses: actions/upload-artifact@v4
continue-on-error: true
with:
name: compliance.xml

View File

@@ -10,38 +10,28 @@ on:
branches:
- refs/tags/*
permissions:
contents: read
jobs:
get_version:
runs-on: ubuntu-24.04
runs-on: ubuntu-22.04
if: github.repository == 'zephyrproject-rtos/zephyr'
steps:
- name: Configure AWS Credentials
uses: aws-actions/configure-aws-credentials@b47578312673ae6fa5b5096b330d9fbac3d116df # v4.2.1
uses: aws-actions/configure-aws-credentials@v4
with:
aws-access-key-id: ${{ vars.AWS_TESTING_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_TESTING_SECRET_ACCESS_KEY }}
aws-region: us-east-1
- name: install-pip
run: |
pip3 install gitpython
- name: checkout
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Set up Python
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Upload to AWS S3
run: |
python3 scripts/ci/version_mgr.py --update .

View File

@@ -20,9 +20,6 @@ on:
- 'scripts/dts/**'
- '.github/workflows/devicetree_checks.yml'
permissions:
contents: read
jobs:
devicetree-checks:
name: Devicetree script tests
@@ -33,19 +30,40 @@ jobs:
os: [ubuntu-22.04, macos-14, windows-2022]
steps:
- name: checkout
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
uses: actions/checkout@v4
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python-version }}
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
- name: cache-pip-linux
if: startsWith(runner.os, 'Linux')
uses: actions/cache@v4
with:
path: ~/.cache/pip
key: ${{ runner.os }}-pip-${{ matrix.python-version }}
restore-keys: |
${{ runner.os }}-pip-${{ matrix.python-version }}
- name: cache-pip-mac
if: startsWith(runner.os, 'macOS')
uses: actions/cache@v4
with:
path: ~/Library/Caches/pip
# Trailing '-' was just to get a different cache name
key: ${{ runner.os }}-pip-${{ matrix.python-version }}-
restore-keys: |
${{ runner.os }}-pip-${{ matrix.python-version }}-
- name: cache-pip-win
if: startsWith(runner.os, 'Windows')
uses: actions/cache@v4
with:
path: ~\AppData\Local\pip\Cache
key: ${{ runner.os }}-pip-${{ matrix.python-version }}
restore-keys: |
${{ runner.os }}-pip-${{ matrix.python-version }}
- name: install python dependencies
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
pip3 install wheel
pip3 install pytest pyyaml tox
- name: run tox
working-directory: scripts/dts/python-devicetree
run: |

.github/workflows/do_not_merge.yml
View File

@@ -0,0 +1,20 @@
name: Do Not Merge
on:
pull_request:
types: [synchronize, opened, reopened, labeled, unlabeled]
jobs:
do-not-merge:
if: ${{ contains(github.event.*.labels.*.name, 'DNM') ||
contains(github.event.*.labels.*.name, 'TSC') ||
contains(github.event.*.labels.*.name, 'Architecture Review') ||
contains(github.event.*.labels.*.name, 'dev-review') }}
name: Prevent Merging
runs-on: ubuntu-22.04
steps:
- name: Check for label
run: |
echo "Pull request is labeled as 'DNM', 'TSC', 'Architecture Review' or 'dev-review'."
echo "This workflow fails so that the pull request cannot be merged."
exit 1
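
The "Do Not Merge" workflow added above blocks merging purely through a failing check: the job runs only when one of the listed labels is present on the pull request, and then exits non-zero. As a hedged illustration of the same gating pattern (the label name `hold` and the job name below are hypothetical, not taken from this diff), a minimal variant might look like:

```yaml
# Minimal sketch of a label-gated blocking check (assumes a label named "hold").
name: Block merge on hold label
on:
  pull_request:
    types: [opened, reopened, synchronize, labeled, unlabeled]
jobs:
  block-merge:
    # "labels.*.name" is an object filter that expands to the list of label names on the PR.
    if: contains(github.event.pull_request.labels.*.name, 'hold')
    runs-on: ubuntu-latest
    steps:
      - run: |
          echo "This pull request carries the 'hold' label."
          echo "Failing this check so the pull request cannot be merged."
          exit 1
```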

View File

@@ -11,28 +11,33 @@ on:
- v*
pull_request:
permissions:
contents: read
env:
DOXYGEN_VERSION: 1.14.0
DOXYGEN_MD5SUM: e761a5097ae20ecccfd02041925f102a
JOB_COUNT: 4
# NOTE: west docstrings will be extracted from the version listed here
WEST_VERSION: 1.2.0
# The latest CMake available directly with apt is 3.18, but we need >=3.20
# so we fetch that through pip.
CMAKE_VERSION: 3.20.5
DOXYGEN_VERSION: 1.12.0
# Job count is set to 2 less than the vCPU count of 16 because the total available RAM is 32GiB
# and each sphinx-build process may use more than 2GiB of RAM.
JOB_COUNT: 14
jobs:
doc-file-check:
name: Check for doc changes
runs-on: ubuntu-24.04
runs-on: ubuntu-22.04
if: >
github.repository_owner == 'zephyrproject-rtos'
outputs:
file_check: ${{ steps.check-doc-files.outputs.any_modified }}
steps:
- name: checkout
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
uses: actions/checkout@v4
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
- name: Check if Documentation related files changed
uses: tj-actions/changed-files@ed68ef82c095e0d48ec87eccea555d944a631a4c # v46.0.5
uses: tj-actions/changed-files@v45
id: check-doc-files
with:
files: |
@@ -43,6 +48,7 @@ jobs:
kernel/include/kernel_arch_interface.h
lib/libc/**
subsys/testsuite/ztest/include/**
tests/
**/Kconfig*
west.yml
scripts/dts/
@@ -55,34 +61,34 @@ jobs:
name: "Documentation Build (HTML)"
needs: [doc-file-check]
if: >
needs.doc-file-check.outputs.file_check == 'true' || github.event_name != 'pull_request'
runs-on: ubuntu-24.04
github.repository_owner == 'zephyrproject-rtos' &&
( needs.doc-file-check.outputs.file_check == 'true' || github.event_name != 'pull_request' )
runs-on: ubuntu-22.04
timeout-minutes: 90
concurrency:
group: doc-build-html-${{ github.ref }}
cancel-in-progress: true
steps:
- name: Set up Python
uses: actions/setup-python@v5
with:
python-version: 3.12
- name: install-pkgs
run: |
sudo apt-get update
sudo apt-get install -y wget python3-pip git ninja-build graphviz lcov
wget --no-verbose "https://github.com/doxygen/doxygen/releases/download/Release_${DOXYGEN_VERSION//./_}/doxygen-${DOXYGEN_VERSION}.linux.bin.tar.gz"
echo "${DOXYGEN_MD5SUM} doxygen-${DOXYGEN_VERSION}.linux.bin.tar.gz" | md5sum -c
if [ $? -ne 0 ]; then
echo "Failed to verify doxygen tarball"
exit 1
fi
sudo tar xf doxygen-${DOXYGEN_VERSION}.linux.bin.tar.gz -C /opt
echo "/opt/doxygen-${DOXYGEN_VERSION}/bin" >> $GITHUB_PATH
echo "${HOME}/.local/bin" >> $GITHUB_PATH
- name: checkout
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
uses: actions/checkout@v4
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
path: zephyr
- name: Rebase
if: github.event_name == 'pull_request'
@@ -90,37 +96,33 @@ jobs:
env:
BASE_REF: ${{ github.base_ref }}
PR_HEAD: ${{ github.event.pull_request.head.sha }}
working-directory: zephyr
run: |
git config --global user.email "actions@zephyrproject.org"
git config --global user.name "Github Actions"
rm -fr ".git/rebase-apply"
rm -fr ".git/rebase-merge"
git rebase origin/${BASE_REF}
git clean -f -d
git log --graph --oneline HEAD...${PR_HEAD}
- name: Set up Python
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
- name: cache-pip
uses: actions/cache@v4
with:
python-version: 3.12
cache: pip
cache-dependency-path: doc/requirements.txt
- name: Setup Zephyr project
uses: zephyrproject-rtos/action-zephyr-setup@b2453c72966ee67b1433be22b250348d48283286 # v1.0.7
with:
app-path: zephyr
toolchains: 'all'
path: ~/.cache/pip
key: pip-${{ hashFiles('doc/requirements.txt') }}
- name: install-pip
working-directory: zephyr
run: |
pip install -r doc/requirements.txt --require-hashes
sudo pip3 install -U setuptools wheel pip
pip3 install -r doc/requirements.txt
pip3 install west==${WEST_VERSION}
pip3 install cmake==${CMAKE_VERSION}
pip3 install coverxygen
- name: west setup
run: |
west init -l .
- name: build-docs
shell: bash
working-directory: zephyr
run: |
if [[ "$GITHUB_REF" =~ "refs/tags/v" ]]; then
DOC_TAG="release"
@@ -146,23 +148,22 @@ jobs:
genhtml --no-function-coverage --no-branch-coverage new.info -o coverage-report
- name: compress-docs
working-directory: zephyr
run: |
tar --use-compress-program="xz -T0" -cf html-output.tar.xz --exclude html/_sources --exclude html/doxygen/xml --directory=doc/_build html
tar --use-compress-program="xz -T0" -cf html-output.tar.xz --directory=doc/_build html
tar --use-compress-program="xz -T0" -cf api-output.tar.xz --directory=doc/_build html/doxygen/html
tar --use-compress-program="xz -T0" -cf api-coverage.tar.xz coverage-report
- name: upload-build
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
uses: actions/upload-artifact@v4
with:
name: html-output
path: zephyr/html-output.tar.xz
path: html-output.tar.xz
- name: upload-api-coverage
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
uses: actions/upload-artifact@v4
with:
name: api-coverage
path: zephyr/api-coverage.tar.xz
path: api-coverage.tar.xz
- name: process-pr
if: github.event_name == 'pull_request'
@@ -179,7 +180,7 @@ jobs:
echo "API Coverage Report will be available shortly at: ${API_COVERAGE_URL}" >> $GITHUB_STEP_SUMMARY
- name: upload-pr-number
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
uses: actions/upload-artifact@v4
if: github.event_name == 'pull_request'
with:
name: pr_num
@@ -189,56 +190,58 @@ jobs:
name: "Documentation Build (PDF)"
needs: [doc-file-check]
if: |
github.event_name != 'pull_request'
runs-on: ubuntu-24.04
github.event_name != 'pull_request' &&
github.repository_owner == 'zephyrproject-rtos'
runs-on: ubuntu-22.04
container: texlive/texlive:latest
timeout-minutes: 120
concurrency:
group: doc-build-pdf-${{ github.ref }}
cancel-in-progress: true
steps:
- name: Apply container owner mismatch workaround
run: |
git config --global --add safe.directory ${GITHUB_WORKSPACE}
- name: checkout
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
path: zephyr
uses: actions/checkout@v4
- name: Set up Python
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
uses: actions/setup-python@v5
with:
python-version: 3.12
cache: pip
cache-dependency-path: doc/requirements.txt
- name: install-pkgs
run: |
sudo apt-get update
sudo apt-get install --no-install-recommends graphviz librsvg2-bin \
texlive-latex-base texlive-latex-extra latexmk \
texlive-fonts-recommended texlive-fonts-extra texlive-xetex \
imagemagick fonts-noto xindy
wget --no-verbose "https://github.com/doxygen/doxygen/releases/download/Release_${DOXYGEN_VERSION//./_}/doxygen-${DOXYGEN_VERSION}.linux.bin.tar.gz"
echo "${DOXYGEN_MD5SUM} doxygen-${DOXYGEN_VERSION}.linux.bin.tar.gz" | md5sum -c
if [ $? -ne 0 ]; then
echo "Failed to verify doxygen tarball"
exit 1
fi
sudo tar xf doxygen-${DOXYGEN_VERSION}.linux.bin.tar.gz -C /opt
echo "/opt/doxygen-${DOXYGEN_VERSION}/bin" >> $GITHUB_PATH
apt-get update
apt-get install -y python3-pip python3-venv ninja-build doxygen graphviz librsvg2-bin imagemagick
- name: Setup Zephyr project
uses: zephyrproject-rtos/action-zephyr-setup@b2453c72966ee67b1433be22b250348d48283286 # v1.0.7
- name: cache-pip
uses: actions/cache@v4
with:
app-path: zephyr
toolchains: 'arm-zephyr-eabi'
path: ~/.cache/pip
key: pip-${{ hashFiles('doc/requirements.txt') }}
- name: install-pip-pkgs
working-directory: zephyr
- name: setup-venv
run: |
pip install -r doc/requirements.txt --require-hashes
python3 -m venv .venv
. .venv/bin/activate
echo PATH=$PATH >> $GITHUB_ENV
- name: install-pip
run: |
pip3 install -U setuptools wheel pip
pip3 install -r doc/requirements.txt
pip3 install west==${WEST_VERSION}
pip3 install cmake==${CMAKE_VERSION}
- name: west setup
run: |
west init -l .
- name: build-docs
shell: bash
working-directory: zephyr
continue-on-error: true
run: |
if [[ "$GITHUB_REF" =~ "refs/tags/v" ]]; then
@@ -254,13 +257,13 @@ jobs:
- name: upload-build
if: always()
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
uses: actions/upload-artifact@v4
with:
name: pdf-output
if-no-files-found: ignore
path: |
zephyr/doc/_build/latex/zephyr.pdf
zephyr/doc/_build/latex/zephyr.log
doc/_build/latex/zephyr.pdf
doc/_build/latex/zephyr.log
doc-build-status-check:
if: always()

View File

@@ -10,13 +10,10 @@ on:
types:
- completed
permissions:
contents: read
jobs:
doc-publish:
name: Publish Documentation
runs-on: ubuntu-24.04
runs-on: ubuntu-22.04
if: |
github.event.workflow_run.event == 'pull_request' &&
github.event.workflow_run.conclusion == 'success' &&
@@ -25,7 +22,7 @@ jobs:
steps:
- name: Download artifacts
id: download-artifacts
uses: dawidd6/action-download-artifact@ac66b43f0e6a346234dd65d4d0c8fbb31cb316e5 # v11
uses: dawidd6/action-download-artifact@v6
with:
workflow: doc-build.yml
run_id: ${{ github.event.workflow_run.id }}
@@ -33,7 +30,7 @@ jobs:
- name: Load PR number
if: steps.download-artifacts.outputs.found_artifact == 'true'
uses: actions/github-script@60a0d83039c74a4aee543508d2ffcb1c3799cdea # v7.0.1
uses: actions/github-script@v7
with:
script: |
let fs = require("fs");
@@ -43,7 +40,7 @@ jobs:
- name: Check PR number
if: steps.download-artifacts.outputs.found_artifact == 'true'
id: check-pr
uses: carpentries/actions/check-valid-pr@2e20fd5ee53b691e27455ce7ca3b16ea885140e8 # v0.15.0
uses: carpentries/actions/check-valid-pr@v0.14.0
with:
pr: ${{ env.PR_NUM }}
sha: ${{ github.event.workflow_run.head_sha }}
@@ -66,7 +63,7 @@ jobs:
- name: Configure AWS Credentials
if: steps.download-artifacts.outputs.found_artifact == 'true'
uses: aws-actions/configure-aws-credentials@b47578312673ae6fa5b5096b330d9fbac3d116df # v4.2.1
uses: aws-actions/configure-aws-credentials@v4
with:
aws-access-key-id: ${{ vars.AWS_BUILDS_ZEPHYR_PR_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_BUILDS_ZEPHYR_PR_SECRET_ACCESS_KEY }}

View File

@@ -13,13 +13,10 @@ on:
types:
- completed
permissions:
contents: read
jobs:
doc-publish:
name: Publish Documentation
runs-on: ubuntu-24.04
runs-on: ubuntu-22.04
if: |
github.event.workflow_run.event != 'pull_request' &&
github.event.workflow_run.conclusion == 'success' &&
@@ -27,7 +24,7 @@ jobs:
steps:
- name: Download artifacts
uses: dawidd6/action-download-artifact@ac66b43f0e6a346234dd65d4d0c8fbb31cb316e5 # v11
uses: dawidd6/action-download-artifact@v6
with:
workflow: doc-build.yml
run_id: ${{ github.event.workflow_run.id }}
@@ -40,7 +37,7 @@ jobs:
fi
- name: Configure AWS Credentials
uses: aws-actions/configure-aws-credentials@b47578312673ae6fa5b5096b330d9fbac3d116df # v4.2.1
uses: aws-actions/configure-aws-credentials@v4
with:
aws-access-key-id: ${{ vars.AWS_DOCS_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_DOCS_SECRET_ACCESS_KEY }}

View File

@@ -6,14 +6,11 @@ on:
- 'lib/libc/minimal/include/errno.h'
- 'scripts/ci/errno.py'
permissions:
contents: read
jobs:
check-errno:
runs-on: ubuntu-24.04
runs-on: ubuntu-22.04
container:
image: ghcr.io/zephyrproject-rtos/ci:v0.28.0
image: ghcr.io/zephyrproject-rtos/ci:v0.27.4
steps:
- name: Apply container owner mismatch workaround
@@ -25,7 +22,7 @@ jobs:
git config --global --add safe.directory ${GITHUB_WORKSPACE}
- name: checkout
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
uses: actions/checkout@v4
- name: Environment Setup
run: |

View File

@@ -1,8 +1,9 @@
name: Footprint Tracking
# Run every 12 hours and on tags
on:
schedule:
- cron: '50 00 * * *'
- cron: '50 1/12 * * *'
push:
paths:
- 'VERSION'
@@ -15,9 +16,6 @@ on:
# same commit
- 'v*'
permissions:
contents: read
concurrency:
group: ${{ github.workflow }}-${{ github.event_name }}-${{ github.head_ref || github.ref }}
cancel-in-progress: true
@@ -28,7 +26,7 @@ jobs:
group: zephyr-runner-v2-linux-x64-4xlarge
if: github.repository_owner == 'zephyrproject-rtos'
container:
image: ghcr.io/zephyrproject-rtos/ci-repo-cache:v0.28.1.20250624
image: ghcr.io/zephyrproject-rtos/ci-repo-cache:v0.27.4.20241026
options: '--entrypoint /bin/bash'
defaults:
run:
@@ -60,24 +58,14 @@ jobs:
run: |
sudo apt-get update
sudo apt-get install -y python3-venv
sudo pip3 install -U setuptools wheel pip gitpython
- name: checkout
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
uses: actions/checkout@v4
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
- name: Set up Python
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Environment Setup
run: |
echo "ZEPHYR_SDK_INSTALL_DIR=/opt/toolchains/zephyr-sdk-$( cat SDK_VERSION )" >> $GITHUB_ENV
@@ -89,7 +77,7 @@ jobs:
west update 2>&1 1> west.update.log || west update 2>&1 1> west.update2.log
- name: Configure AWS Credentials
uses: aws-actions/configure-aws-credentials@b47578312673ae6fa5b5096b330d9fbac3d116df # v4.2.1
uses: aws-actions/configure-aws-credentials@v4
with:
aws-access-key-id: ${{ vars.AWS_TESTING_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_TESTING_SECRET_ACCESS_KEY }}
@@ -106,6 +94,7 @@ jobs:
run: |
python3 -m venv .venv
. .venv/bin/activate
pip3 install awscli
aws s3 sync --quiet footprint_data/ s3://testing.zephyrproject.org/footprint_data/
- name: Transform Footprint data to Twister JSON reports
@@ -124,6 +113,7 @@ jobs:
ELASTICSEARCH_INDEX: ${{ vars.FOOTPRINT_TRACKING_INDEX }}
run: |
shopt -s globstar
pip3 install -U elasticsearch
run_date=`date --iso-8601=minutes`
python3 ./scripts/ci/upload_test_results_es.py -r ${run_date} \
--flatten footprint \

View File

@@ -6,20 +6,14 @@ on:
pull_request_target:
types: [opened, closed]
permissions:
contents: read
jobs:
check_for_first_interaction:
runs-on: ubuntu-24.04
runs-on: ubuntu-22.04
if: github.repository == 'zephyrproject-rtos/zephyr'
permissions:
pull-requests: write # to comment on pull requests
issues: write # to comment on issues
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: zephyrproject-rtos/action-first-interaction@58853996b1ac504b8e0f6964301f369d2bb22e5c # v1.1.1+zephyr.6
- uses: actions/checkout@v4
- uses: zephyrproject-rtos/action-first-interaction@v1.1.1-zephyr-5
with:
repo-token: ${{ secrets.GITHUB_TOKEN }}

View File

@@ -12,13 +12,11 @@ on:
- v*-branch
- collab-*
paths:
- 'scripts/**'
- 'scripts/build/**'
- 'scripts/requirements*.txt'
- '.github/workflows/hello_world_multiplatform.yaml'
- 'SDK_VERSION'
permissions:
contents: read
concurrency:
group: ${{ github.workflow }}-${{ github.event_name }}-${{ github.head_ref || github.ref }}
cancel-in-progress: true
@@ -32,7 +30,7 @@ jobs:
runs-on: ${{ matrix.os }}
steps:
- name: Checkout
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
uses: actions/checkout@v4
with:
path: zephyr
fetch-depth: 0
@@ -47,22 +45,20 @@ jobs:
run: |
git config --global user.email "actions@zephyrproject.org"
git config --global user.name "Github Actions"
rm -fr ".git/rebase-apply"
rm -fr ".git/rebase-merge"
git rebase origin/${BASE_REF}
git clean -f -d
git log --graph --oneline HEAD...${PR_HEAD}
- name: Set up Python
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
uses: actions/setup-python@v5
with:
python-version: 3.12
python-version: 3.11
- name: Setup Zephyr project
uses: zephyrproject-rtos/action-zephyr-setup@b2453c72966ee67b1433be22b250348d48283286 # v1.0.7
uses: zephyrproject-rtos/action-zephyr-setup@v1
with:
app-path: zephyr
toolchains: aarch64-zephyr-elf:arc-zephyr-elf:arc64-zephyr-elf:arm-zephyr-eabi:mips-zephyr-elf:riscv64-zephyr-elf:sparc-zephyr-elf:x86_64-zephyr-elf:xtensa-dc233c_zephyr-elf:xtensa-sample_controller32_zephyr-elf:rx-zephyr-elf
toolchains: all
- name: Build firmware
working-directory: zephyr
@@ -73,11 +69,11 @@ jobs:
elif [ "${{ runner.os }}" = "Windows" ]; then
EXTRA_TWISTER_FLAGS="-P native_sim --short-build-path -O/tmp/twister-out"
fi
./scripts/twister --runtime-artifact-cleanup --force-color --inline-logs -T samples/hello_world -T samples/cpp/hello_world -v $EXTRA_TWISTER_FLAGS
./scripts/twister --force-color --inline-logs -T samples/hello_world -T samples/cpp/hello_world -v $EXTRA_TWISTER_FLAGS
- name: Upload artifacts
if: failure()
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
uses: actions/upload-artifact@v4
with:
if-no-files-found: ignore
path:

View File

@@ -2,10 +2,7 @@ name: Issue Tracker
on:
schedule:
- cron: '10 1/4 * * *'
permissions:
contents: read
- cron: '*/10 * * * *'
env:
OUTPUT_FILE_NAME: IssuesReport.md
@@ -17,7 +14,7 @@ env:
jobs:
track-issues:
name: "Collect Issue Stats"
runs-on: ubuntu-24.04
runs-on: ubuntu-22.04
if: github.repository == 'zephyrproject-rtos/zephyr'
steps:
@@ -30,7 +27,7 @@ jobs:
sudo apt-get update
sudo apt-get install discount
- uses: brcrista/summarize-issues@54c549b7d38b7db39e5c6e06fd9617e12e5c3491 # v4
- uses: brcrista/summarize-issues@v4
with:
title: 'Issues Report for ${{ github.repository }}'
configPath: 'issues-report-config.json'
@@ -38,14 +35,14 @@ jobs:
token: ${{ secrets.GITHUB_TOKEN }}
- name: upload-stats
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
uses: actions/upload-artifact@v4
continue-on-error: true
with:
name: ${{ env.OUTPUT_FILE_NAME }}
path: ${{ env.OUTPUT_FILE_NAME }}
- name: Configure AWS Credentials
uses: aws-actions/configure-aws-credentials@b47578312673ae6fa5b5096b330d9fbac3d116df # v4.2.1
uses: aws-actions/configure-aws-credentials@v4
with:
aws-access-key-id: ${{ vars.AWS_TESTING_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_TESTING_SECRET_ACCESS_KEY }}

View File

@@ -2,25 +2,22 @@ name: Scancode
on: [pull_request]
permissions:
contents: read
jobs:
scancode_job:
runs-on: ubuntu-24.04
runs-on: ubuntu-22.04
name: Scan code for licenses
steps:
- name: Checkout the code
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Scan the code
id: scancode
uses: zephyrproject-rtos/action_scancode@23ef91ce31cd4b954366a7b71eea47520da9b380 # v4
uses: zephyrproject-rtos/action_scancode@v4
with:
directory-to-scan: 'scan/'
- name: Artifact Upload
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
uses: actions/upload-artifact@v4
with:
name: scancode
path: ./artifacts

View File

@@ -1,43 +0,0 @@
name: Maintainer file check
on:
pull_request_target:
branches:
- main
paths:
- MAINTAINERS.yml
permissions:
contents: read
jobs:
assignment:
name: Check MAINTAINERS.yml changes
runs-on: ubuntu-24.04
steps:
- name: Check out source code
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Set up Python
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Fetch MAINTAINERS.yml from mainline
run: |
git fetch origin main
git show origin/main:MAINTAINERS.yml > mainline_MAINTAINERS.yml
- name: Check maintainer file changes
env:
GITHUB_TOKEN: ${{ secrets.ZB_PR_ASSIGNER_GITHUB_TOKEN }}
run: |
python ./scripts/ci/check_maintainer_changes.py \
--repo zephyrproject-rtos/zephyr mainline_MAINTAINERS.yml MAINTAINERS.yml

View File

@@ -2,49 +2,33 @@ name: Manifest
on:
pull_request_target:
permissions:
contents: read
jobs:
contribs:
runs-on: ubuntu-24.04
permissions:
pull-requests: write # to create/update pull request comments
runs-on: ubuntu-22.04
name: Manifest
steps:
- name: Checkout the code
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
uses: actions/checkout@v4
with:
path: zephyrproject/zephyr
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
persist-credentials: false
- name: Set up Python
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
cd zephyrproject/zephyr
pip install -r scripts/requirements-actions.txt --require-hashes
- name: west setup
env:
BASE_REF: ${{ github.base_ref }}
working-directory: zephyrproject/zephyr
run: |
pip3 install west
git config --global user.email "you@example.com"
git config --global user.name "Your Name"
west init -l . || true
- name: Manifest
uses: zephyrproject-rtos/action-manifest@1729cded3fc798cf0de4a789c596dcb9c40eb14c # v1.9.1
uses: zephyrproject-rtos/action-manifest@v1.3.1
with:
github-token: ${{ secrets.GITHUB_TOKEN }}
github-token: ${{ secrets.ZB_GITHUB_TOKEN }}
manifest-path: 'west.yml'
checkout-path: 'zephyrproject/zephyr'
use-tree-checkout: 'true'
@@ -52,6 +36,4 @@ jobs:
label-prefix: 'manifest-'
verbosity-level: '1'
labels: 'manifest'
dnm-labels: 'DNM (manifest)'
blobs-added-labels: 'Binary Blobs Added'
blobs-modified-labels: 'Binary Blobs Modified'
dnm-labels: 'DNM'
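The manifest workflow above pairs a read-only default token with a job-level `pull-requests: write` grant. A minimal sketch of that least-privilege layout (workflow, job, and step names here are illustrative, not from the repository):

name: least-privilege-example   # hypothetical name, for illustration only
on:
  pull_request_target:
# Default the GITHUB_TOKEN to read-only for every job in the workflow...
permissions:
  contents: read
jobs:
  comment:
    runs-on: ubuntu-24.04
    # ...and widen it only for the job that actually needs to write,
    # e.g. to create or update pull request comments.
    permissions:
      pull-requests: write
    steps:
      - run: echo "token is read-only except for pull-requests in this job"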

View File

@@ -1,19 +0,0 @@
name: Check SHA-pinned GitHub Actions
on:
pull_request:
paths:
- '.github/workflows/**'
permissions:
contents: read
jobs:
check-sha-pinned-actions:
name: Verify GitHub Actions
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Ensure SHA pinned actions
uses: zgosalvez/github-actions-ensure-sha-pinned-actions@fc87bb5b5a97953d987372e74478de634726b3e5 # v3.0.25
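This check rejects any `uses:` reference that is not pinned to a full commit SHA. A hedged sketch of the convention it enforces; the pinned checkout below is the same reference that appears throughout these diffs, while the surrounding workflow is illustrative:

name: pinning-example   # hypothetical name, for illustration only
on: [pull_request]
permissions:
  contents: read
jobs:
  demo:
    runs-on: ubuntu-latest
    steps:
      # A mutable tag such as actions/checkout@v4 can silently change when the
      # tag is moved; a full commit SHA cannot. The trailing comment keeps the
      # human-readable version for reviewers and update tooling.
      - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2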

View File

@@ -1,39 +0,0 @@
name: PR Metadata Check
on:
pull_request:
types:
- synchronize
- opened
- reopened
- labeled
- unlabeled
- edited
permissions:
contents: read
jobs:
do-not-merge:
name: Prevent Merging
runs-on: ubuntu-24.04
steps:
- name: Checkout
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Set up Python
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python dependencies
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Run the check script
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: |
python -u scripts/ci/do_not_merge.py -p "${{ github.event.pull_request.number }}"

View File

@@ -19,9 +19,6 @@ on:
- 'scripts/pylib/build_helpers/**'
- '.github/workflows/pylib_tests.yml'
permissions:
contents: read
jobs:
pylib-tests:
name: Misc. Pylib Unit Tests
@@ -29,21 +26,25 @@ jobs:
strategy:
matrix:
python-version: ['3.10', '3.11', '3.12', '3.13']
os: [ubuntu-24.04]
os: [ubuntu-22.04]
steps:
- name: checkout
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
uses: actions/checkout@v4
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python-version }}
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
- name: cache-pip-linux
if: startsWith(runner.os, 'Linux')
uses: actions/cache@v4
with:
path: ~/.cache/pip
key: ${{ runner.os }}-pip-${{ matrix.python-version }}
restore-keys: |
${{ runner.os }}-pip-${{ matrix.python-version }}
- name: install-packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
pip3 install -r scripts/requirements-base.txt -r scripts/requirements-build-test.txt
- name: Run pytest for build_helpers
env:
ZEPHYR_BASE: ./
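One side of this diff caches pip manually with actions/cache keyed on the OS and Python version, while the other relies on setup-python's built-in `cache: pip`, keyed on the hash of a requirements file. A minimal sketch of the built-in variant (the workflow name is illustrative; the requirements path is the one used above):

name: pip-cache-example   # hypothetical name, for illustration only
on: [pull_request]
permissions:
  contents: read
jobs:
  test:
    runs-on: ubuntu-24.04
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: '3.12'
          # Built-in pip cache: the key is derived from the dependency file
          # below, so the cache is invalidated when requirements change.
          cache: pip
          cache-dependency-path: scripts/requirements-actions.txt
      - run: pip install -r scripts/requirements-actions.txt --require-hashes

This drops the hand-rolled cache keys and restore-keys while still avoiding repeated downloads across runs.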

View File

@@ -7,9 +7,6 @@ on:
type: string
required: true
permissions:
contents: read
jobs:
all_jobs_passed:
name: all jobs passed

View File

@@ -6,16 +6,11 @@ on:
- 'v*'
- '!v*rc*'
permissions:
contents: read
jobs:
release:
runs-on: ubuntu-24.04
permissions:
contents: write # to create GitHub release entry
runs-on: ubuntu-22.04
steps:
- uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- uses: actions/checkout@v4
with:
fetch-depth: 0
@@ -26,12 +21,12 @@ jobs:
echo "TRIMMED_VERSION=${GITHUB_REF#refs/tags/v}" >> $GITHUB_OUTPUT
- name: REUSE Compliance Check
uses: fsfe/reuse-action@bb774aa972c2a89ff34781233d275075cbddf542 # v5.0.0
uses: fsfe/reuse-action@v4
with:
args: spdx -o zephyr-${{ steps.get_version.outputs.VERSION }}.spdx
- name: upload-results
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
uses: actions/upload-artifact@v4
continue-on-error: true
with:
name: zephyr-${{ steps.get_version.outputs.VERSION }}.spdx
@@ -43,7 +38,7 @@ jobs:
- name: Create Release
id: create_release
uses: actions/create-release@0cb9c9b65d5d1901c1f53e5e66eaf4afd303e70e # v1.1.4
uses: actions/create-release@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
@@ -55,7 +50,7 @@ jobs:
- name: Upload Release Assets
id: upload-release-asset
uses: actions/upload-release-asset@e8f9f06c4b078e705bd2ea027f0926603fc9b4d5 # v1.0.2
uses: actions/upload-release-asset@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:

View File

@@ -29,12 +29,12 @@ jobs:
steps:
- name: "Checkout code"
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
uses: actions/checkout@692973e3d937129bcbf40652eb9f2f61becf3332 # v4.1.7
with:
persist-credentials: false
- name: "Run analysis"
uses: ossf/scorecard-action@05b42c624433fc40578a4040d5cf5e36ddca8cde # v2.4.2
uses: ossf/scorecard-action@62b2cac7ed8198b15735ed49ab1e5cf35480ba46 # v2.4.0
with:
results_file: results.sarif
results_format: sarif
@@ -47,7 +47,7 @@ jobs:
# uploads of run results in SARIF format to the repository Actions tab.
# https://docs.github.com/en/actions/advanced-guides/storing-workflow-data-as-artifacts
- name: "Upload artifact"
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
uses: actions/upload-artifact@89ef406dd8d7e03cfd12d9e0a4a378f454709029 # v4.3.5
with:
name: SARIF file
path: results.sarif
@@ -56,6 +56,6 @@ jobs:
# Upload the results to GitHub's code scanning dashboard (optional).
# Commenting out will disable upload of results to your repo's Code Scanning dashboard
- name: "Upload to code-scanning"
uses: github/codeql-action/upload-sarif@181d5eefc20863364f96762470ba6f862bdef56b # v3.29.2
uses: github/codeql-action/upload-sarif@afb54ba388a7dca6ecae48f608c4ff05ff4cc77a # v3.25.15
with:
sarif_file: results.sarif

View File

@@ -19,9 +19,6 @@ on:
- 'scripts/build/**'
- '.github/workflows/scripts_tests.yml'
permissions:
contents: read
jobs:
scripts-tests:
name: Scripts tests
@@ -29,10 +26,10 @@ jobs:
strategy:
matrix:
python-version: ['3.10', '3.11', '3.12', '3.13']
os: [ubuntu-24.04]
os: [ubuntu-20.04]
steps:
- name: checkout
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
uses: actions/checkout@v4
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
@@ -45,22 +42,27 @@ jobs:
run: |
git config --global user.email "actions@zephyrproject.org"
git config --global user.name "Github Actions"
rm -fr ".git/rebase-apply"
rm -fr ".git/rebase-merge"
git rebase origin/${BASE_REF}
git clean -f -d
git log --graph --oneline HEAD...${PR_HEAD}
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python-version }}
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
- name: cache-pip-linux
if: startsWith(runner.os, 'Linux')
uses: actions/cache@v4
with:
path: ~/.cache/pip
key: ${{ runner.os }}-pip-${{ matrix.python-version }}
restore-keys: |
${{ runner.os }}-pip-${{ matrix.python-version }}
- name: install-packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
pip3 install -r scripts/requirements-base.txt -r scripts/requirements-build-test.txt
- name: Run pytest
env:

View File

@@ -7,9 +7,6 @@ on:
# everyday at 15:00
- cron: '0 15 * * *'
permissions:
contents: read
concurrency:
group: stale-workflow-queue-cleanup
cancel-in-progress: true
@@ -17,13 +14,11 @@ concurrency:
jobs:
cleanup:
name: Cleanup
runs-on: ubuntu-24.04
permissions:
actions: write # to delete stale workflow runs
runs-on: ubuntu-22.04
steps:
- name: Delete stale queued workflow runs
uses: MajorScruffy/delete-old-workflow-runs@78b5af714fefaefdf74862181c467b061782719e # v0.3.0
uses: MajorScruffy/delete-old-workflow-runs@v0.3.0
with:
repository: ${{ github.repository }}
# Remove any workflow runs in "queued" state for more than 1 day

View File

@@ -3,20 +3,13 @@ on:
schedule:
- cron: "16 00 * * *"
permissions:
contents: read
jobs:
stale:
name: Find Stale issues and PRs
runs-on: ubuntu-24.04
runs-on: ubuntu-22.04
if: github.repository == 'zephyrproject-rtos/zephyr'
permissions:
pull-requests: write # to comment on stale pull requests
issues: write # to comment on stale issues
steps:
- uses: actions/stale@5bef64f19d7facfb25b37b414482c7164d639639 # v9.1.0
- uses: actions/stale@v9
with:
stale-pr-message: 'This pull request has been marked as stale because it has been open (more
than) 60 days with no activity. Remove the stale label or add a comment saying that you

View File

@@ -6,28 +6,13 @@ on:
- main
- v*-branch
types: [closed]
permissions:
contents: read
jobs:
record_merged:
if: github.event.pull_request.merged == true && github.repository == 'zephyrproject-rtos/zephyr'
runs-on: ubuntu-24.04
runs-on: ubuntu-22.04
steps:
- name: checkout
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
- name: Set up Python
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
uses: actions/checkout@v4
- name: PR event
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
@@ -35,4 +20,5 @@ jobs:
ELASTICSEARCH_SERVER: "https://elasticsearch.zephyrproject.io:443"
PR_STAT_ES_INDEX: ${{ vars.PR_STAT_ES_INDEX }}
run: |
pip3 install pygithub elasticsearch
python3 ./scripts/ci/stats/merged_prs.py --pull-request ${{ github.event.pull_request.number }} --repo ${{ github.repository }}
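The stats job above fires on closed pull request events but only proceeds when the pull request was actually merged. A stripped-down sketch of that guard (the `pull_request` trigger shown here is an assumption, since the event line sits above the visible hunk):

name: merged-only-example   # hypothetical name, for illustration only
on:
  pull_request:
    types: [closed]
permissions:
  contents: read
jobs:
  record_merged:
    # 'closed' also fires for PRs that were abandoned; this condition
    # filters those out so only merged PRs are recorded.
    if: github.event.pull_request.merged == true
    runs-on: ubuntu-24.04
    steps:
      - run: echo "PR ${{ github.event.pull_request.number }} was merged"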

View File

@@ -1,64 +0,0 @@
name: Publish Twister Test Results
on:
workflow_run:
workflows: ["Run tests with twister"]
branches:
- main
types:
- completed
permissions:
contents: read
jobs:
upload-to-elasticsearch:
if: |
github.repository == 'zephyrproject-rtos/zephyr' &&
github.event.workflow_run.event != 'pull_request'
env:
ELASTICSEARCH_KEY: ${{ secrets.ELASTICSEARCH_KEY }}
ELASTICSEARCH_SERVER: "https://elasticsearch.zephyrproject.io:443"
runs-on: ubuntu-24.04
steps:
# Needed for elasticearch and upload script
- name: Checkout
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
fetch-depth: 0
persist-credentials: false
- name: Set up Python
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Download Artifacts
id: download-artifacts
uses: dawidd6/action-download-artifact@ac66b43f0e6a346234dd65d4d0c8fbb31cb316e5 # v11
with:
path: artifacts
workflow: twister.yml
run_id: ${{ github.event.workflow_run.id }}
if_no_artifact_found: ignore
- name: Upload to elasticsearch
if: steps.download-artifacts.outputs.found_artifact == 'true'
run: |
# set run date on upload to get consistent and unified data across the matrix.
run_date=`date --iso-8601=minutes`
if [ "${{github.event.workflow_run.event}}" = "push" ]; then
python3 ./scripts/ci/upload_test_results_es.py -r ${run_date} \
--run-attempt ${{github.run_attempt}} \
--run-branch ${{github.ref_name}} \
--index zephyr-main-ci-push-1 artifacts/*/*/twister.json
elif [ "${{github.event.workflow_run.event}}" = "schedule" ]; then
python3 ./scripts/ci/upload_test_results_es.py -r ${run_date} \
--run-attempt ${{github.run_attempt}} \
--run-branch ${{github.ref_name}} \
--index zephyr-main-ci-weekly-1 artifacts/*/*/twister.json
fi
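This publishing workflow runs after the Twister workflow completes and pulls that run's artifacts by id. A minimal sketch of the same hand-off, reusing the download action and inputs shown above (workflow and job names are illustrative):

name: publish-results-example   # hypothetical name, for illustration only
on:
  workflow_run:
    workflows: ["Run tests with twister"]
    types: [completed]
permissions:
  contents: read
jobs:
  publish:
    runs-on: ubuntu-24.04
    steps:
      # Fetch the artifacts produced by the run that triggered this workflow.
      - uses: dawidd6/action-download-artifact@ac66b43f0e6a346234dd65d4d0c8fbb31cb316e5 # v11
        with:
          workflow: twister.yml
          run_id: ${{ github.event.workflow_run.id }}
          path: artifacts
          if_no_artifact_found: ignore
      - run: ls -R artifacts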

View File

@@ -6,17 +6,14 @@ on:
- main
- v*-branch
- collab-*
pull_request:
pull_request_target:
branches:
- main
- v*-branch
- collab-*
schedule:
# Run at 02:00 UTC on every Sunday
- cron: '0 2 * * 0'
permissions:
contents: read
# Run at 03:00 UTC on every Sunday
- cron: '0 3 * * 0'
concurrency:
group: ${{ github.workflow }}-${{ github.event_name }}-${{ github.head_ref || github.ref }}
@@ -25,7 +22,11 @@ concurrency:
jobs:
twister-build-prep:
if: github.repository_owner == 'zephyrproject-rtos'
runs-on: ubuntu-24.04
runs-on:
group: zephyr-runner-v2-linux-x64-4xlarge
container:
image: ghcr.io/zephyrproject-rtos/ci-repo-cache:v0.27.4.20241026
options: '--entrypoint /bin/bash'
outputs:
subset: ${{ steps.output-services.outputs.subset }}
size: ${{ steps.output-services.outputs.size }}
@@ -33,63 +34,66 @@ jobs:
env:
MATRIX_SIZE: 10
PUSH_MATRIX_SIZE: 20
WEEKLY_MATRIX_SIZE: 200
DAILY_MATRIX_SIZE: 80
BSIM_OUT_PATH: /opt/bsim/
BSIM_COMPONENTS_PATH: /opt/bsim/components
TESTS_PER_BUILDER: 900
TESTS_PER_BUILDER: 700
COMMIT_RANGE: ${{ github.event.pull_request.base.sha }}..${{ github.event.pull_request.head.sha }}
BASE_REF: ${{ github.base_ref }}
steps:
- name: Apply container owner mismatch workaround
run: |
# FIXME: The owner UID of the GITHUB_WORKSPACE directory may not
# match the container user UID because of the way GitHub
# Actions runner is implemented. Remove this workaround when
# GitHub comes up with a fundamental fix for this problem.
git config --global --add safe.directory ${GITHUB_WORKSPACE}
- name: Print cloud service information
run: |
echo "ZEPHYR_RUNNER_CLOUD_PROVIDER = ${ZEPHYR_RUNNER_CLOUD_PROVIDER}"
echo "ZEPHYR_RUNNER_CLOUD_NODE = ${ZEPHYR_RUNNER_CLOUD_NODE}"
echo "ZEPHYR_RUNNER_CLOUD_POD = ${ZEPHYR_RUNNER_CLOUD_POD}"
- name: Clone cached Zephyr repository
if: github.event_name == 'pull_request_target'
continue-on-error: true
run: |
git clone --shared /repo-cache/zephyrproject/zephyr .
git remote set-url origin ${GITHUB_SERVER_URL}/${GITHUB_REPOSITORY}
- name: Checkout
if: github.event_name == 'pull_request'
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
if: github.event_name == 'pull_request_target'
uses: actions/checkout@v4
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
path: zephyr
persist-credentials: false
- name: Set up Python
if: github.event_name == 'pull_request'
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: install-packages
working-directory: zephyr
if: github.event_name == 'pull_request'
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Setup Zephyr project
if: github.event_name == 'pull_request'
uses: zephyrproject-rtos/action-zephyr-setup@b2453c72966ee67b1433be22b250348d48283286 # v1.0.7
with:
app-path: zephyr
toolchains: all
- name: Environment Setup
working-directory: zephyr
if: github.event_name == 'pull_request'
if: github.event_name == 'pull_request_target'
run: |
git config --global user.email "bot@zephyrproject.org"
git config --global user.name "Zephyr Bot"
rm -fr ".git/rebase-apply"
rm -fr ".git/rebase-merge"
git rebase origin/${BASE_REF}
git clean -f -d
git log --pretty=oneline | head -n 10
west init -l . || true
west config manifest.group-filter -- +ci,+optional
west config --global update.narrow true
west update --path-cache /repo-cache/zephyrproject 2>&1 1> west.update.log || west update --path-cache /repo-cache/zephyrproject 2>&1 1> west.update.log || ( rm -rf ../modules ../bootloader ../tools && west update --path-cache /repo-cache/zephyrproject)
west forall -c 'git reset --hard HEAD'
echo "ZEPHYR_SDK_INSTALL_DIR=/opt/toolchains/zephyr-sdk-$( cat SDK_VERSION )" >> $GITHUB_ENV
- name: Generate Test Plan with Twister
working-directory: zephyr
if: github.event_name == 'pull_request'
if: github.event_name == 'pull_request_target'
id: test-plan
run: |
export ZEPHYR_BASE=${PWD}
export ZEPHYR_TOOLCHAIN_VARIANT=zephyr
python3 ./scripts/ci/test_plan.py -c origin/${BASE_REF}.. --no-detailed-test-id --pull-request -t $TESTS_PER_BUILDER
python3 ./scripts/ci/test_plan.py -c origin/${BASE_REF}.. --pull-request -t $TESTS_PER_BUILDER
if [ -s .testplan ]; then
cat .testplan >> $GITHUB_ENV
else
@@ -100,23 +104,22 @@ jobs:
- name: Determine matrix size
id: output-services
run: |
if [ "${{github.event_name}}" = "push" ]; then
subset="[$(seq -s',' 1 ${PUSH_MATRIX_SIZE})]"
size=${MATRIX_SIZE}
elif [ "${{github.event_name}}" = "pull_request" ]; then
if [ "${{github.event_name}}" = "pull_request_target" ]; then
if [ -n "${TWISTER_NODES}" ]; then
subset="[$(seq -s',' 1 ${TWISTER_NODES})]"
else
subset="[$(seq -s',' 1 ${MATRIX_SIZE})]"
fi
size=${TWISTER_NODES}
elif [ "${{github.event_name}}" = "push" ]; then
subset="[$(seq -s',' 1 ${PUSH_MATRIX_SIZE})]"
size=${MATRIX_SIZE}
elif [ "${{github.event_name}}" = "schedule" -a "${{github.repository}}" = "zephyrproject-rtos/zephyr" ]; then
subset="[$(seq -s',' 1 ${WEEKLY_MATRIX_SIZE})]"
size=${WEEKLY_MATRIX_SIZE}
subset="[$(seq -s',' 1 ${DAILY_MATRIX_SIZE})]"
size=${DAILY_MATRIX_SIZE}
else
size=0
fi
echo "subset=${subset}" >> $GITHUB_OUTPUT
echo "size=${size}" >> $GITHUB_OUTPUT
echo "fullrun=${TWISTER_FULL}" >> $GITHUB_OUTPUT
@@ -127,7 +130,7 @@ jobs:
needs: twister-build-prep
if: needs.twister-build-prep.outputs.size != 0
container:
image: ghcr.io/zephyrproject-rtos/ci-repo-cache:v0.28.1.20250624
image: ghcr.io/zephyrproject-rtos/ci-repo-cache:v0.27.4.20241026
options: '--entrypoint /bin/bash'
strategy:
fail-fast: false
@@ -142,13 +145,12 @@ jobs:
CCACHE_IGNOREOPTIONS: '-specs=* --specs=*'
BSIM_OUT_PATH: /opt/bsim/
BSIM_COMPONENTS_PATH: /opt/bsim/components
TWISTER_COMMON: ' --test-config tests/test_config_ci.yaml --no-detailed-test-id --force-color --inline-logs -v -N -M --retry-failed 3 --timeout-multiplier 2 '
WEEKLY_OPTIONS: ' -M --build-only --all --show-footprint --report-filtered -j 32'
PR_OPTIONS: ' --clobber-output --integration -j 16'
PUSH_OPTIONS: ' --clobber-output -M --show-footprint --report-filtered -j 16'
TWISTER_COMMON: ' --force-color --inline-logs -v -N -M --retry-failed 3 --timeout-multiplier 2 '
DAILY_OPTIONS: ' -M --build-only --all --show-footprint'
PR_OPTIONS: ' --clobber-output --integration'
PUSH_OPTIONS: ' --clobber-output -M --show-footprint --report-filtered'
COMMIT_RANGE: ${{ github.event.pull_request.base.sha }}..${{ github.event.pull_request.head.sha }}
BASE_REF: ${{ github.base_ref }}
LLVM_TOOLCHAIN_PATH: /usr/lib/llvm-20
steps:
- name: Print cloud service information
run: |
@@ -171,7 +173,7 @@ jobs:
git remote set-url origin ${GITHUB_SERVER_URL}/${GITHUB_REPOSITORY}
- name: Checkout
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
uses: actions/checkout@v4
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
@@ -179,11 +181,10 @@ jobs:
- name: Environment Setup
run: |
if [ "${{github.event_name}}" = "pull_request" ]; then
if [ "${{github.event_name}}" = "pull_request_target" ]; then
git config --global user.email "bot@zephyrproject.org"
git config --global user.name "Zephyr Builder"
rm -fr ".git/rebase-apply"
rm -fr ".git/rebase-merge"
git rebase origin/${BASE_REF}
git clean -f -d
git log --pretty=oneline | head -n 10
@@ -210,10 +211,6 @@ jobs:
echo "github.base_ref: ${{ github.base_ref }}"
echo "github.ref_name: ${{ github.ref_name }}"
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Set up ccache
run: |
mkdir -p ${CCACHE_DIR}
@@ -233,7 +230,6 @@ jobs:
- if: github.event_name == 'push'
name: Run Tests with Twister (Push)
id: run_twister
run: |
export ZEPHYR_BASE=${PWD}
export ZEPHYR_TOOLCHAIN_VARIANT=zephyr
@@ -245,14 +241,13 @@ jobs:
fi
fi
- if: github.event_name == 'pull_request'
- if: github.event_name == 'pull_request_target'
name: Run Tests with Twister (Pull Request)
id: run_twister_pr
run: |
rm -f testplan.json
export ZEPHYR_BASE=${PWD}
export ZEPHYR_TOOLCHAIN_VARIANT=zephyr
python3 ./scripts/ci/test_plan.py -c origin/${BASE_REF}.. --pull-request --no-detailed-test-id
python3 ./scripts/ci/test_plan.py -c origin/${BASE_REF}.. --pull-request
./scripts/twister --subset ${{matrix.subset}}/${{ strategy.job-total }} --load-tests testplan.json ${TWISTER_COMMON} ${PR_OPTIONS}
if [ "${{matrix.subset}}" = "1" -a ${{needs.twister-build-prep.outputs.fullrun}} = 'True' ]; then
./scripts/zephyr_module.py --twister-out module_tests.args
@@ -262,16 +257,15 @@ jobs:
fi
- if: github.event_name == 'schedule'
name: Run Tests with Twister (Weekly)
id: run_twister_sched
name: Run Tests with Twister (Daily)
run: |
export ZEPHYR_BASE=${PWD}
export ZEPHYR_TOOLCHAIN_VARIANT=zephyr
./scripts/twister --subset ${{matrix.subset}}/${{ strategy.job-total }} ${TWISTER_COMMON} ${WEEKLY_OPTIONS}
./scripts/twister --subset ${{matrix.subset}}/${{ strategy.job-total }} ${TWISTER_COMMON} ${DAILY_OPTIONS}
if [ "${{matrix.subset}}" = "1" ]; then
./scripts/zephyr_module.py --twister-out module_tests.args
if [ -s module_tests.args ]; then
./scripts/twister +module_tests.args --outdir module_tests ${TWISTER_COMMON} ${WEEKLY_OPTIONS}
./scripts/twister +module_tests.args --outdir module_tests ${TWISTER_COMMON} ${DAILY_OPTIONS}
fi
fi
@@ -282,7 +276,7 @@ jobs:
- name: Upload Unit Test Results
if: always()
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
uses: actions/upload-artifact@v4
with:
name: Unit Test Results (Subset ${{ matrix.subset }})
if-no-files-found: ignore
@@ -300,11 +294,11 @@ jobs:
timestamp="$(date)"
version="$(git describe --abbrev=12 --always)"
echo -e "# Generated at $timestamp ($version)\n" > $FREEZE_FILE
pip freeze | tee -a $FREEZE_FILE
pip3 freeze | tee -a $FREEZE_FILE
- if: matrix.subset == 1 && github.event_name == 'push'
name: Upload the list of Python packages
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
uses: actions/upload-artifact@v4
with:
name: Frozen PIP package set
path: |
@@ -312,78 +306,63 @@ jobs:
twister-test-results:
name: "Publish Unit Tests Results"
needs:
- twister-build
runs-on: ubuntu-24.04
permissions:
checks: write # to create the check run entry with Twister test results
env:
ELASTICSEARCH_KEY: ${{ secrets.ELASTICSEARCH_KEY }}
ELASTICSEARCH_SERVER: "https://elasticsearch.zephyrproject.io:443"
needs: twister-build
runs-on: ubuntu-22.04
# the build-and-test job might be skipped, we don't need to run this job then
if: success() || failure()
steps:
- name: Check out source code
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
# Needed for elasticearch and upload script
- if: github.event_name == 'push' || github.event_name == 'schedule'
name: Checkout
uses: actions/checkout@v4
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
persist-credentials: false
- name: Set up Python
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Download Artifacts
uses: actions/download-artifact@d3f86a106a0bac45b974a628896c90dbdf5c8093 # v4.3.0
uses: actions/download-artifact@v4
with:
path: artifacts
- if: github.event_name == 'push' || github.event_name == 'schedule'
name: Upload to elasticsearch
run: |
pip3 install elasticsearch
# set run date on upload to get consistent and unified data across the matrix.
run_date=`date --iso-8601=minutes`
if [ "${{github.event_name}}" = "push" ]; then
python3 ./scripts/ci/upload_test_results_es.py -r ${run_date} \
--index zephyr-main-ci-push-1 artifacts/*/*/twister.json
elif [ "${{github.event_name}}" = "schedule" ]; then
python3 ./scripts/ci/upload_test_results_es.py -r ${run_date} \
--index zephyr-main-ci-weekly-1 artifacts/*/*/twister.json
fi
- name: Merge Test Results
run: |
pip3 install junitparser junit2html
junitparser merge artifacts/*/*/twister.xml junit.xml
junit2html junit.xml junit.html
- name: Upload Unit Test Results
- name: Upload Unit Test Results in HTML
if: always()
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
uses: actions/upload-artifact@v4
with:
name: Unit Test Results
name: HTML Unit Test Results
if-no-files-found: ignore
path: |
junit.html
junit.xml
- name: Publish Unit Test Results
uses: EnricoMi/publish-unit-test-result-action@3a74b2957438d0b6e2e61d67b05318aa25c9e6c6 # v2.20.0
uses: EnricoMi/publish-unit-test-result-action@v2
with:
check_name: Unit Test Results
files: "**/twister.xml"
comment_mode: off
- name: Analyze Twister Reports
if: needs.twister-build.result == 'failure'
run: |
./scripts/ci/twister_report_analyzer.py artifacts/*/*/twister.json --long-summary --platforms --output-md errors.md
if [[ -s "errors.md" ]]; then
echo '### Error Summary! 🚀' >> $GITHUB_STEP_SUMMARY
cat errors.md >> $GITHUB_STEP_SUMMARY
fi
- name: Upload Twister Analysis Results
if: needs.twister-build.result == 'failure'
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
with:
name: Twister Analysis Results
if-no-files-found: ignore
path: |
twister_report_summary.json
twister-status-check:
if: always()
name: "Check Twister Status"

View File

@@ -26,9 +26,6 @@ on:
- '.github/workflows/twister_tests.yml'
- 'scripts/schemas/twister/'
permissions:
contents: read
jobs:
twister-tests:
name: Twister Unit Tests
@@ -36,22 +33,25 @@ jobs:
strategy:
matrix:
python-version: ['3.10', '3.11', '3.12', '3.13']
os: [ubuntu-24.04]
os: [ubuntu-22.04]
steps:
- name: checkout
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
uses: actions/checkout@v4
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python-version }}
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
- name: cache-pip-linux
if: startsWith(runner.os, 'Linux')
uses: actions/cache@v4
with:
path: ~/.cache/pip
key: ${{ runner.os }}-pip-${{ matrix.python-version }}
restore-keys: |
${{ runner.os }}-pip-${{ matrix.python-version }}
- name: install-packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
pip3 install -r scripts/requirements-base.txt -r scripts/requirements-build-test.txt -r scripts/requirements-run-test.txt
- name: Run pytest for twisterlib
env:
ZEPHYR_BASE: ./

View File

@@ -15,172 +15,65 @@ on:
- 'scripts/tests/twister_blackbox/**'
- '.github/workflows/twister_tests_blackbox.yml'
permissions:
contents: read
env:
PYTHONIOENCODING: utf-8
jobs:
twister-tests:
name: Twister Black Box Tests
runs-on: ${{ matrix.os }}
strategy:
matrix:
python-version: ['3.10', '3.11', '3.12', '3.13']
os: [ubuntu-24.04, macos-14, windows-2022]
fail-fast: false
runs-on: ${{ matrix.os }}
os: [ubuntu-22.04]
container:
image: ghcr.io/zephyrproject-rtos/ci:v0.27.4
steps:
- name: Apply Container Owner Mismatch Workaround
run: |
# FIXME: The owner UID of the GITHUB_WORKSPACE directory may not
# match the container user UID because of the way GitHub
# Actions runner is implemented. Remove this workaround when
# GitHub comes up with a fundamental fix for this problem.
git config --global --add safe.directory ${GITHUB_WORKSPACE}
- name: Checkout
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
with:
path: zephyr
fetch-depth: 0
uses: actions/checkout@v4
- name: Environment Setup
run: |
echo "$HOME/.local/bin" >> $GITHUB_PATH
west init -l . || true
# we do not depend on any hals, tools or bootloader, save some time and space...
west config manifest.group-filter -- -hal,-tools,-bootloader
west config --global update.narrow true
west update --path-cache /github/cache/zephyrproject 2>&1 1> west.update.log || west update --path-cache /github/cache/zephyrproject 2>&1 1> west.update.log || ( rm -rf ../modules ../bootloader ../tools && west update --path-cache /github/cache/zephyrproject)
west forall -c 'git reset --hard HEAD'
echo "ZEPHYR_SDK_INSTALL_DIR=/opt/toolchains/zephyr-sdk-$( cat SDK_VERSION )" >> $GITHUB_ENV
- name: Set Up Python ${{ matrix.python-version }}
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python-version }}
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Setup Zephyr project
uses: zephyrproject-rtos/action-zephyr-setup@b2453c72966ee67b1433be22b250348d48283286 # v1.0.7
with:
app-path: zephyr
toolchains: all
- name: Go Into Venv
shell: bash
run: |
python3 -m pip install --user virtualenv
python3 -m venv env
source env/bin/activate
echo "$(which python)"
- name: Install Packages
run: |
python3 -m pip install -U -r scripts/requirements-base.txt -r scripts/requirements-build-test.txt -r scripts/requirements-run-test.txt
- name: Run Pytest For Twister Black Box Tests
if: ${{ startsWith(runner.os, 'ubuntu') }}
working-directory: zephyr
shell: bash
env:
ZEPHYR_BASE: ./
ZEPHYR_TOOLCHAIN_VARIANT: zephyr
run: |
export ZEPHYR_SDK_INSTALL_DIR=${{ github.workspace }}/zephyr-sdk
echo "Run twister tests"
source zephyr-env.sh
PYTHONPATH="./scripts/tests" pytest ./scripts/tests/twister_blackbox/
- name: Build firmware No. 1 - basic
working-directory: zephyr
shell: bash
run: |
if [ "${{ runner.os }}" = "macOS" ]; then
EXTRA_TWISTER_FLAGS="-P native_sim --build-only"
elif [ "${{ runner.os }}" = "Windows" ]; then
EXTRA_TWISTER_FLAGS="-P native_sim --short-build-path -O /tmp/twister-out"
fi
./scripts/twister --runtime-artifact-cleanup --force-color --inline-logs -T samples/hello_world -T samples/cpp/hello_world -v $EXTRA_TWISTER_FLAGS
- name: Build firmware No. 2 - save and load with emulation only
working-directory: zephyr
shell: bash
run: |
if [ "${{ runner.os }}" = "macOS" ]; then
EXTRA_TWISTER_FLAGS="-P native_sim --build-only"
elif [ "${{ runner.os }}" = "Windows" ]; then
EXTRA_TWISTER_FLAGS="-P native_sim --short-build-path -O /tmp/twister-out"
fi
BASIC_FLAGS="--runtime-artifact-cleanup --force-color --inline-logs -T samples/hello_world -T samples/cpp/hello_world -v $EXTRA_TWISTER_FLAGS"
./scripts/twister --save-tests tests.file $BASIC_FLAGS
./scripts/twister --load-tests tests.file --emulation-only $BASIC_FLAGS
rm tests.file
- name: Build firmware No. 3 - print out test plan
working-directory: zephyr
shell: bash
run: |
if [ "${{ runner.os }}" = "macOS" ]; then
EXTRA_TWISTER_FLAGS="-P native_sim --build-only"
elif [ "${{ runner.os }}" = "Windows" ]; then
EXTRA_TWISTER_FLAGS="-P native_sim --short-build-path -O /tmp/twister-out"
fi
BASIC_FLAGS="--runtime-artifact-cleanup --force-color --inline-logs -v $EXTRA_TWISTER_FLAGS"
./scripts/twister --test-tree -T tests/kernel/spinlock $BASIC_FLAGS
- name: Build firmware No. 4 - integration, exclude tag, filter, shuffle, dry run
working-directory: zephyr
shell: bash
run: |
if [ "${{ runner.os }}" = "macOS" ]; then
EXTRA_TWISTER_FLAGS="-P native_sim --build-only"
elif [ "${{ runner.os }}" = "Windows" ]; then
EXTRA_TWISTER_FLAGS="-P native_sim --short-build-path -O /tmp/twister-out"
fi
BASIC_FLAGS="--runtime-artifact-cleanup --force-color --inline-logs -v $EXTRA_TWISTER_FLAGS"
./scripts/twister --dry-run --integration --subset 1/3 --shuffle-tests --shuffle-tests-seed 1 --filter runnable --exclude-tag audio --exclude-tag driver $BASIC_FLAGS
- name: Build firmware No. 5 - test, arch, vendor, exclude-platform, platform-reports
working-directory: zephyr
shell: bash
run: |
if [ "${{ runner.os }}" = "macOS" ]; then
EXTRA_TWISTER_FLAGS="-P native_sim --build-only"
elif [ "${{ runner.os }}" = "Windows" ]; then
EXTRA_TWISTER_FLAGS="-P native_sim --short-build-path -O /tmp/twister-out"
fi
BASIC_FLAGS="--runtime-artifact-cleanup --force-color --inline-logs -v $EXTRA_TWISTER_FLAGS"
./scripts/twister --test kernel.multiprocessing.spinlock --arch x86 --exclude-platform qemu_x86_64 --vendor qemu --platform-reports $BASIC_FLAGS
- name: Build firmware No. 6 - subtest, platform, rom-ram report, ROM footprint report from buildlog, size report
working-directory: zephyr
shell: bash
run: |
if [ "${{ runner.os }}" = "macOS" ]; then
EXTRA_TWISTER_FLAGS="-P native_sim --build-only"
elif [ "${{ runner.os }}" = "Windows" ]; then
EXTRA_TWISTER_FLAGS="-P native_sim --short-build-path -O /tmp/twister-out"
fi
BASIC_FLAGS="--runtime-artifact-cleanup --force-color --inline-logs -v $EXTRA_TWISTER_FLAGS"
./scripts/twister --sub-test kernel.multiprocessing.spinlock.minimallibc.spinlock.spinlock_basic --platform qemu_x86 --create-rom-ram-report --footprint-report ROM --enable-size-report --footprint-from-buildlog $BASIC_FLAGS
- name: Build firmware No. 7 - list tags
working-directory: zephyr
shell: bash
run: |
if [ "${{ runner.os }}" = "macOS" ]; then
EXTRA_TWISTER_FLAGS="-P native_sim --build-only"
elif [ "${{ runner.os }}" = "Windows" ]; then
EXTRA_TWISTER_FLAGS="-P native_sim --short-build-path -O /tmp/twister-out"
fi
BASIC_FLAGS="--runtime-artifact-cleanup --force-color --inline-logs -v $EXTRA_TWISTER_FLAGS"
./scripts/twister --sub-test kernel.multiprocessing.spinlock.minimallibc.spinlock.spinlock_basic --list-tags $BASIC_FLAGS
- name: Build firmware No. 8 - list tests
working-directory: zephyr
shell: bash
run: |
if [ "${{ runner.os }}" = "macOS" ]; then
EXTRA_TWISTER_FLAGS="-P native_sim --build-only"
elif [ "${{ runner.os }}" = "Windows" ]; then
EXTRA_TWISTER_FLAGS="-P native_sim --short-build-path -O /tmp/twister-out"
fi
BASIC_FLAGS="--runtime-artifact-cleanup --force-color --inline-logs -v $EXTRA_TWISTER_FLAGS"
./scripts/twister -T tests/posix/common --list-tests $BASIC_FLAGS
- name: Build firmware No. 9 - report flags - dir, name, suffix, summary, all-options, filtered
working-directory: zephyr
shell: bash
run: |
if [ "${{ runner.os }}" = "macOS" ]; then
EXTRA_TWISTER_FLAGS="-P native_sim --build-only"
elif [ "${{ runner.os }}" = "Windows" ]; then
EXTRA_TWISTER_FLAGS="-P native_sim --short-build-path -O /tmp/twister-out"
fi
BASIC_FLAGS="--runtime-artifact-cleanup --force-color --inline-logs -v $EXTRA_TWISTER_FLAGS"
./scripts/twister --sub-test kernel.multiprocessing.spinlock.minimallibc.spinlock.spinlock_basic --platform qemu_x86 --report-dir . --report-name test_name --report-suffix suffix --report-summary 0 --report-all-options --report-filtered $BASIC_FLAGS
- name: Build firmware No. 10 - force platform and toolchain, log level, timestamps, logfile
working-directory: zephyr
shell: bash
run: |
if [ "${{ runner.os }}" = "macOS" ]; then
EXTRA_TWISTER_FLAGS="-P native_sim --build-only"
elif [ "${{ runner.os }}" = "Windows" ]; then
EXTRA_TWISTER_FLAGS="-P native_sim --short-build-path -O /tmp/twister-out"
fi
BASIC_FLAGS="--runtime-artifact-cleanup --force-color --inline-logs -v $EXTRA_TWISTER_FLAGS"
./scripts/twister --sub-test kernel.multiprocessing.spinlock.minimallibc.spinlock.spinlock_basic --force-platform --platform qemu_x86 --force-toolchain --log-level WARNING --log-file log.file $BASIC_FLAGS
rm log.file
PYTHONPATH="./scripts/tests" pytest ./scripts/tests/twister_blackbox

View File

@@ -23,11 +23,8 @@ on:
- 'scripts/west_commands/**'
- '.github/workflows/west_cmds.yml'
permissions:
contents: read
jobs:
west-commands:
west-commnads:
name: West Command Tests
runs-on: ${{ matrix.os }}
strategy:
@@ -36,24 +33,44 @@ jobs:
os: [ubuntu-22.04, macos-14, windows-2022]
steps:
- name: checkout
uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2
uses: actions/checkout@v4
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@a26af69be951a213d495a4c3e4e4022e16d87065 # v5.6.0
uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python-version }}
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
- name: cache-pip-linux
if: startsWith(runner.os, 'Linux')
uses: actions/cache@v4
with:
path: ~/.cache/pip
key: ${{ runner.os }}-pip-${{ matrix.python-version }}
restore-keys: |
${{ runner.os }}-pip-${{ matrix.python-version }}
- name: cache-pip-mac
if: startsWith(runner.os, 'macOS')
uses: actions/cache@v4
with:
path: ~/Library/Caches/pip
# Trailing '-' was just to get a different cache name
key: ${{ runner.os }}-pip-${{ matrix.python-version }}-
restore-keys: |
${{ runner.os }}-pip-${{ matrix.python-version }}-
- name: cache-pip-win
if: startsWith(runner.os, 'Windows')
uses: actions/cache@v4
with:
path: ~\AppData\Local\pip\Cache
key: ${{ runner.os }}-pip-${{ matrix.python-version }}
restore-keys: |
${{ runner.os }}-pip-${{ matrix.python-version }}
- name: install pytest
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
pip3 install wheel
pip3 install pytest west pyelftools canopen natsort progress mypy intelhex psutil ply pyserial anytree
- name: run pytest-win
if: runner.os == 'Windows'
run: |
python ./scripts/west_commands/run_tests.py
- name: run pytest-mac-linux
if: runner.os != 'Windows'
run: |

.gitignore
View File

@@ -102,11 +102,6 @@ MaintainersFormat.txt
ModulesMaintainers.txt
Nits.txt
Pylint.txt
PythonCompat.txt
Ruff.txt
SphinxLint.txt
SysbuildKconfig.txt
SysbuildKconfigBasic.txt
SysbuildKconfigBasicNoModules.txt
TextEncoding.txt
YAMLLint.txt

View File

@@ -59,7 +59,3 @@ ignore-merge-commits=false
# By specifying this rule, developers can only change the file when they explicitly reference
# it in the commit message.
#files=gitlint/rules.py,README.md
[ignore-by-author-name]
regex=^dependabot\[bot\]$
ignore=all

View File

@@ -1,14 +1,14 @@
Alexandr Kolosov <rikorsev@gmail.com>
Alexandre d'Alton <alexandre.dalton@intel.com>
Amir Kaplan <amir.kaplan@intel.com>
Anas Nashif <anas.nashif@intel.com>
Amir Kaplan <amir.kaplan@intel.com> <amir.kaplan@intel.com>
Anas Nashif <anas.nashif@intel.com> <anas.nashif@intel.com>
Andrzej Kuroś <andrzej.kuros@nordicsemi.no>
Anthony Smigielski <thebasti0ncode@gmail.com>
Armand Ciejak <armand@riedonetworks.com> <armandciejak@users.noreply.github.com>
Aska Wu <aska.wu@linaro.org>
Bit Pathe <bitpathe@gmail.com>
Bit Pathe <bitpathe@gmail.com> <bitpathe@gmail.com>
Bjarki Arge Andreasen <baa@trackunit.com>
Carles Cufi <carles.cufi@nordicsemi.no>
Carles Cufi <carles.cufi@nordicsemi.no> <carles.cufi@nordicsemi.no>
chao an <anchao@xiaomi.com>
Charles E. Youse <charles.youse@intel.com>
Chen Xingyu <hi@xingrz.me>
@@ -16,32 +16,32 @@ Christoph Schnetzler <christoph.schnetzler@husqvarnagroup.com>
Christoph Schramm <schramm@makaio.com>
Christopher Friedt <cfriedt@meta.com>
Christopher Friedt <cfriedt@meta.com> <cfriedt@fb.com>
Chuck Jordan <cjordan@synopsys.com>
Chuck Jordan <cjordan@synopsys.com> <cjordan@synopsys.com>
Chunlin Han <chunlin.han@linaro.org> <chunlin.han@acer.com>
David B. Kinder <david.b.kinder@intel.com>
David Komel <a8961713@gmail.com>
David Leach <david.leach@nxp.com>
Dirk Brandewie <dirk.j.brandewie@intel.com>
Douglas Su <d0u9.su@outlook.com>
Dirk Brandewie <dirk.j.brandewie@intel.com> <dirk.j.brandewie@intel.com>
Douglas Su <d0u9.su@outlook.com> <d0u9.su@outlook.com>
Enjia Mai <enjia.mai@intel.com>
Enjia Mai <enjia.mai@intel.com>
Evan Couzens <evanx.couzens@intel.com>
Enjia Mai <enjia.mai@intel.com> <enjiax.mai@intel.com>
Evan Couzens <evanx.couzens@intel.com> <evanx.couzens@intel.com>
Evgeniy Paltsev <PaltsevEvgeniy@gmail.com>
Evgeniy Paltsev <PaltsevEvgeniy@gmail.com> <Eugeniy.Paltsev@synopsys.com>
Felipe Neves <ryukokki.felipe@gmail.com>
Felipe Neves <ryukokki.felipe@gmail.com> <ryukokki.felipe@gmail.com>
Findlay Feng <i@fengch.me>
Flavio Arieta Netto <flavio@exati.com.br>
Francois Ramu <francois.ramu@st.com>
Gerardo Aceves <gerardo.aceves@intel.com>
Gerardo Aceves <gerardo.aceves@intel.com> <gerardo.aceves@intel.com>
Gregory Shue <gregory.shue@legrand.com>
Gregory Shue <gregory.shue@legrand.com> <gregory.shue@legrand.us>
HaiLong Yang <cameledyang@pm.me>
James Johnson <james.johnson672@t-mobile.com>
Jarno Lämsä <jarno.lamsa@nordicsemi.no>
Jędrzej Ciupis <jedrzej.ciupis@nordicsemi.no>
Jeremie Garcia <jeremie.garcia@intel.com>
Jeremie Garcia <jeremie.garcia@intel.com> <jeremie.garcia@intel.com>
Jim Benjamin Luther <jilu@oticon.com>
Johan Kruger <johan.kruger@windriver.com>
Johan Kruger <johan.kruger@windriver.com> <johan.kruger@windriver.com>
Johann Fischer <j.fischer@phytec.de>
Jørgen Kvalvaag <jorgen.kvalvaag@nordicsemi.no>
Juan Manuel Cruz Alcaraz <juan.m.cruz.alcaraz@intel.com>
@@ -51,11 +51,11 @@ Jun Li <jun.r.li@intel.com>
Justin Watson <jwatson5@gmail.com>
Kamil Sroka <kamil.sroka@nordicsemi.no>
Katarzyna Giądła <katarzyna.giadla@nordicsemi.no>
Keren Siman-Tov <keren.siman-tov@intel.com>
Keren Siman-Tov <keren.siman-tov@intel.com> <keren.siman-tov@intel.com>
Krzysztof Chruściński <krzysztof.chruscinski@nordicsemi.no>
Kuo-Lang Tseng <kuo-lang.tseng@intel.com>
Lei Liu <lei.a.liu@intel.com>
Leona Cook <leonax.cook@intel.com>
Kuo-Lang Tseng <kuo-lang.tseng@intel.com> <kuo-lang.tseng@intel.com>
Lei Liu <lei.a.liu@intel.com> <lei.a.liu@intel.com>
Leona Cook <leonax.cook@intel.com> <leonax.cook@intel.com>
Leona Cook <leonax.cook@intel.com> <lsc@hackeress.com>
Lixin Guo <lixinx.guo@intel.com>
Łukasz Mazur <lukasz.mazur@hidglobal.com>
@@ -72,14 +72,14 @@ Martí Bolívar <marti.bolivar@nordicsemi.no> <marti.bolivar@linaro.org>
Martí Bolívar <marti.bolivar@nordicsemi.no> <marti.f.bolivar@gmail.com>
Martí Bolívar <marti.bolivar@nordicsemi.no> <marti@foundries.io>
Martí Bolívar <marti.bolivar@nordicsemi.no> <marti@opensourcefoundries.com>
Martin Jäger <martin@libre.solar> <17674105+martinjaeger@users.noreply.github.com>
Martin Jäger <martin@libre.solar> <17674105+martinjaeger@users.noreply.github.com>
Mateusz Hołenko <mholenko@antmicro.com>
Michael Rosen <michael.r.rosen@intel.com>
Michal Narajowski <michal.narajowski@codecoup.pl>
Mike Hirst <michael.hirst@windriver.com>
Mike Hirst <michael.hirst@windriver.com> <michael.hirst@windriver.com>
Ming Shao <ming.shao@intel.com>
Mohan Kumar Kumar <mohankm@fb.com>
Naga Raja Rao Tulasi <tulasi.r@tcs.com>
Naga Raja Rao Tulasi <tulasi.r@tcs.com> <tulasi.r@tcs.com>
Navin Sankar Velliangiri <navin@linumiz.com>
NingX Zhao <ningx.zhao@intel.com>
Nishikant Nayak <nishikantax.nayak@intel.com>
@@ -100,23 +100,21 @@ Radosław Koppel <radoslaw.koppel@nordicsemi.no> <r.koppel@k-el.com>
Raja D. Singh <rdsingh@iotwizards.com>
Ricardo Salveti <ricardo@opensourcefoundries.com>
Ricardo Salveti <ricardo@opensourcefoundries.com> <ricardo.salveti@linaro.org>
Ruud Derwig <Ruud.Derwig@synopsys.com>
Ruud Derwig <Ruud.Derwig@synopsys.com> <Ruud.Derwig@synopsys.com>
Saku Rautio <saku.rautio@nordicsemi.no>
Scott Worley <scott.worley@microchip.com>
Sean Nyekjaer <sean@geanix.com> <sean.nyekjaer@prevas.dk>
Sean Nyekjaer <sean@geanix.com> <sean@nyekjaer.dk>
Sharron Liu <sharron.liu@intel.com>
Shilpashree L C <shilpashree.lc@intel.com>
Shuang He <shuang.he@intel.com>
Shuang He <shuang.he@intel.com> <shuang.he@intel.com>
Sigvart Hovland <sigvart.hovland@nordicsemi.no>
Stéphane D'Alu <sdalu@sdalu.com>
Stine Åkredalen <stine.akredalen@nordicsemi.no>
Thomas Heeley <thomas.heeley@intel.com>
Thomas Heeley <thomas.heeley@intel.com> <thomas.heeley@intel.com>
Tim Sørensen <tims@demant.com>
Tim Sørensen <tims@demant.com> <tims@oticon.com>
Vinayak Kariappa Chettimada <vinayak.kariappa.chettimada@nordicsemi.no>
Vinayak Kariappa Chettimada <vinayak.kariappa.chettimada@nordicsemi.no> <vich@nordicsemi.no>
Vinayak Kariappa Chettimada <vinayak.kariappa.chettimada@nordicsemi.no> <vinayak.kariappa@gmail.com>
Vinayak Kariappa Chettimada <vinayak.kariappa.chettimada@nordicsemi.no> <vinayak.kariappa.chettimada@nordicsemi.no> <vich@nordicsemi.no> <vinayak.kariappa@gmail.com>
Xiaorui Hu <xiaorui.hu@linaro.org>
Yannis Damigos <giannis.damigos@gmail.com> <ydamigos@iccs.gr>
Yonattan Louise <yonattan.a.louise.mendoza@intel.com>

File diff suppressed because it is too large.

View File

@@ -1,30 +0,0 @@
# Copyright (c) 2024 Basalte bv
# SPDX-License-Identifier: Apache-2.0
extend = ".ruff-excludes.toml"
line-length = 100
target-version = "py310"
[lint]
select = [
# zephyr-keep-sorted-start
"B", # flake8-bugbear
"E", # pycodestyle
"F", # pyflakes
"I", # isort
"SIM", # flake8-simplify
"UP", # pyupgrade
"W", # pycodestyle warnings
# zephyr-keep-sorted-stop
]
ignore = [
# zephyr-keep-sorted-start
"SIM108", # Allow if-else blocks instead of forcing ternary operator
# zephyr-keep-sorted-stop
]
[format]
quote-style = "preserve"
line-ending = "lf"

View File

@@ -71,15 +71,6 @@ set(ZEPHYR_CURRENT_LINKER_PASS 0)
set(ZEPHYR_CURRENT_LINKER_CMD linker_zephyr_pre${ZEPHYR_CURRENT_LINKER_PASS}.cmd)
set(ZEPHYR_LINK_STAGE_EXECUTABLE zephyr_pre${ZEPHYR_CURRENT_LINKER_PASS})
# Make kconfig variables available to the linker script generator
zephyr_linker_include_generated(KCONFIG ${CMAKE_CURRENT_BINARY_DIR}/.config)
# The linker generator also needs sections.h to be able to access e.g. _APP_SMEM_SECTION_NAME
# for linkerscripts that do not support c-preprocessing.
zephyr_linker_include_generated(HEADER ${ZEPHYR_BASE}/include/zephyr/linker/sections.h)
zephyr_linker_include_var(VAR CMAKE_VERBOSE_MAKEFILE VALUE ${CMAKE_VERBOSE_MAKEFILE})
# ZEPHYR_PREBUILT_EXECUTABLE is used outside of this file, therefore keep the
# existing variable to allow slowly cleanup of linking stage handling.
# Three stage linking active: pre0 -> pre1 -> final, this will correspond to `pre1`
@@ -96,7 +87,7 @@ set(OFFSETS_H_TARGET offsets_h)
set(SYSCALL_LIST_H_TARGET syscall_list_h_target)
set(DRIVER_VALIDATION_H_TARGET driver_validation_h_target)
set(KOBJ_TYPES_H_TARGET kobj_types_h_target)
set(DEVICE_API_LD_TARGET device_api_ld_target)
set(PARSE_SYSCALLS_TARGET parse_syscalls_target)
define_property(GLOBAL PROPERTY PROPERTY_OUTPUT_FORMAT BRIEF_DOCS " " FULL_DOCS " ")
set_property( GLOBAL PROPERTY PROPERTY_OUTPUT_FORMAT elf32-little${ARCH}) # BFD format
@@ -159,19 +150,13 @@ zephyr_compile_options($<$<COMPILE_LANGUAGE:C>:$<TARGET_PROPERTY:compiler,no_str
zephyr_compile_options($<$<COMPILE_LANGUAGE:CXX>:$<TARGET_PROPERTY:compiler-cpp,no_strict_aliasing>>)
# Extra warnings options for twister run
if(CONFIG_COMPILER_WARNINGS_AS_ERRORS)
if (CONFIG_COMPILER_WARNINGS_AS_ERRORS)
zephyr_compile_options($<$<COMPILE_LANGUAGE:C>:$<TARGET_PROPERTY:compiler,warnings_as_errors>>)
zephyr_compile_options($<$<COMPILE_LANGUAGE:CXX>:$<TARGET_PROPERTY:compiler,warnings_as_errors>>)
zephyr_compile_options($<$<COMPILE_LANGUAGE:ASM>:$<TARGET_PROPERTY:asm,warnings_as_errors>>)
zephyr_link_libraries($<TARGET_PROPERTY:linker,warnings_as_errors>)
endif()
if(CONFIG_DEPRECATION_TEST)
zephyr_compile_options($<$<COMPILE_LANGUAGE:C>:$<TARGET_PROPERTY:compiler,no_deprecation_warning>>)
zephyr_compile_options($<$<COMPILE_LANGUAGE:CXX>:$<TARGET_PROPERTY:compiler,no_deprecation_warning>>)
zephyr_compile_options($<$<COMPILE_LANGUAGE:ASM>:$<TARGET_PROPERTY:asm,no_deprecation_warning>>)
endif()
# @Intent: Set compiler flags to enable buffer overflow checks in libc functions
# @details:
# Kconfig.zephyr "Detect buffer overflows in libc calls" is a kconfig choice,
@@ -188,17 +173,7 @@ endif()
# @Intent: Set compiler flags to detect general stack overflows across all functions
if(CONFIG_STACK_CANARIES)
zephyr_compile_options("$<$<COMPILE_LANGUAGE:C>:$<TARGET_PROPERTY:compiler,security_canaries>>")
zephyr_compile_options("$<$<COMPILE_LANGUAGE:CXX>:$<TARGET_PROPERTY:compiler,security_canaries>>")
elseif(CONFIG_STACK_CANARIES_STRONG)
zephyr_compile_options("$<$<COMPILE_LANGUAGE:C>:$<TARGET_PROPERTY:compiler,security_canaries_strong>>")
zephyr_compile_options("$<$<COMPILE_LANGUAGE:CXX>:$<TARGET_PROPERTY:compiler,security_canaries_strong>>")
elseif(CONFIG_STACK_CANARIES_ALL)
zephyr_compile_options("$<$<COMPILE_LANGUAGE:C>:$<TARGET_PROPERTY:compiler,security_canaries_all>>")
zephyr_compile_options("$<$<COMPILE_LANGUAGE:CXX>:$<TARGET_PROPERTY:compiler,security_canaries_all>>")
elseif(CONFIG_STACK_CANARIES_EXPLICIT)
zephyr_compile_options("$<$<COMPILE_LANGUAGE:C>:$<TARGET_PROPERTY:compiler,security_canaries_explicit>>")
zephyr_compile_options("$<$<COMPILE_LANGUAGE:CXX>:$<TARGET_PROPERTY:compiler,security_canaries_explicit>>")
zephyr_compile_options($<TARGET_PROPERTY:compiler,security_canaries>)
endif()
# @Intent: Obtain compiler optimizations flags and store in variables
@@ -245,9 +220,7 @@ SOC_* symbol.")
endif()
# Apply the final optimization flag(s)
zephyr_compile_options($<$<COMPILE_LANGUAGE:ASM>:${OPTIMIZATION_FLAG}>)
zephyr_compile_options($<$<COMPILE_LANGUAGE:C>:${OPTIMIZATION_FLAG}>)
zephyr_compile_options($<$<COMPILE_LANGUAGE:CXX>:${OPTIMIZATION_FLAG}>)
zephyr_compile_options(${OPTIMIZATION_FLAG})
if(CONFIG_LTO)
zephyr_compile_options($<TARGET_PROPERTY:compiler,optimization_lto>)
@@ -335,9 +308,7 @@ if(CONFIG_CODING_GUIDELINE_CHECK)
endif()
# @Intent: Set compiler specific macro inclusion of AUTOCONF_H
zephyr_compile_options("SHELL: $<$<COMPILE_LANGUAGE:C>:$<TARGET_PROPERTY:compiler,imacros> ${AUTOCONF_H}>")
zephyr_compile_options("SHELL: $<$<COMPILE_LANGUAGE:CXX>:$<TARGET_PROPERTY:compiler,imacros> ${AUTOCONF_H}>")
zephyr_compile_options("SHELL: $<$<COMPILE_LANGUAGE:ASM>:$<TARGET_PROPERTY:asm,imacros> ${AUTOCONF_H}>")
zephyr_compile_options("SHELL: $<TARGET_PROPERTY:compiler,imacros> ${AUTOCONF_H}")
if(CONFIG_COMPILER_FREESTANDING)
# @Intent: Set compiler specific flag for bare metal freestanding option
@@ -350,18 +321,6 @@ if (CONFIG_PICOLIBC AND NOT CONFIG_PICOLIBC_IO_FLOAT)
zephyr_compile_options($<$<COMPILE_LANGUAGE:C>:$<TARGET_PROPERTY:compiler,no_printf_return_value>>)
endif()
if(CONFIG_UBSAN)
zephyr_compile_options($<$<COMPILE_LANGUAGE:C>:$<TARGET_PROPERTY:compiler,sanitizer_undefined>>)
zephyr_link_libraries($<TARGET_PROPERTY:linker,sanitizer_undefined>)
if(CONFIG_UBSAN_LIBRARY)
zephyr_compile_options($<$<COMPILE_LANGUAGE:C>:$<TARGET_PROPERTY:compiler,sanitizer_undefined_library>>)
zephyr_link_libraries($<TARGET_PROPERTY:linker,sanitizer_undefined_library>)
elseif(CONFIG_UBSAN_TRAP)
zephyr_compile_options($<$<COMPILE_LANGUAGE:C>:$<TARGET_PROPERTY:compiler,sanitizer_undefined_trap>>)
zephyr_link_libraries($<TARGET_PROPERTY:linker,sanitizer_undefined_trap>)
endif()
endif()
# @Intent: Set compiler specific flag for tentative definitions, no-common
zephyr_compile_options($<TARGET_PROPERTY:compiler,no_common>)
@@ -394,9 +353,7 @@ zephyr_compile_options($<$<COMPILE_LANGUAGE:ASM>:$<TARGET_PROPERTY:asm,required>
# @Intent: Enforce standard integer type correspondence to match Zephyr usage.
# (must be after compiler specific flags)
if(CONFIG_ENFORCE_ZEPHYR_STDINT)
zephyr_compile_options("SHELL:$<$<COMPILE_LANGUAGE:C>:$<TARGET_PROPERTY:compiler,imacros> ${ZEPHYR_BASE}/include/zephyr/toolchain/zephyr_stdint.h>")
zephyr_compile_options("SHELL:$<$<COMPILE_LANGUAGE:CXX>:$<TARGET_PROPERTY:compiler,imacros> ${ZEPHYR_BASE}/include/zephyr/toolchain/zephyr_stdint.h>")
zephyr_compile_options("SHELL:$<$<COMPILE_LANGUAGE:ASM>:$<TARGET_PROPERTY:asm,imacros> ${ZEPHYR_BASE}/include/zephyr/toolchain/zephyr_stdint.h>")
zephyr_compile_options("SHELL: $<TARGET_PROPERTY:compiler,imacros> ${ZEPHYR_BASE}/include/zephyr/toolchain/zephyr_stdint.h")
endif()
# Common toolchain-agnostic assembly flags
@@ -840,7 +797,7 @@ add_custom_command(
--file-list ${syscalls_file_list_output}
$<$<BOOL:${CONFIG_EMIT_ALL_SYSCALLS}>:--emit-all-syscalls>
DEPENDS ${syscalls_subdirs_trigger} ${PARSE_SYSCALLS_HEADER_DEPENDS}
${syscalls_file_list_output} syscalls_interface
${syscalls_file_list_output} ${syscalls_interface}
)
# Make sure Picolibc is built before the rest of the system; there's no explicit
@@ -858,6 +815,12 @@ set_property(TARGET ${SYSCALL_LIST_H_TARGET}
${CMAKE_CURRENT_BINARY_DIR}/include/generated/zephyr/syscalls
)
add_custom_target(${PARSE_SYSCALLS_TARGET}
DEPENDS
${syscalls_json}
${struct_tags_json}
)
# 64-bit systems do not require special handling of 64-bit system call
# parameters or return values, indicate this to the system call boilerplate
# generation script.
@@ -878,12 +841,7 @@ if(CONFIG_LEGACY_GENERATED_INCLUDE_PATH)
${CMAKE_CURRENT_BINARY_DIR}/include/generated/syscall_list.h)
endif()
add_custom_command(
OUTPUT
include/generated/zephyr/syscall_dispatch.c
include/generated/zephyr/syscall_exports_llext.c
syscall_weakdefs_llext.c
${syscall_list_h}
add_custom_command(OUTPUT include/generated/zephyr/syscall_dispatch.c ${syscall_list_h}
# Also, some files are written to include/generated/zephyr/syscalls/
COMMAND
${PYTHON_EXECUTABLE}
@@ -891,8 +849,7 @@ add_custom_command(
--json-file ${syscalls_json} # Read this file
--base-output include/generated/zephyr/syscalls # Write to this dir
--syscall-dispatch include/generated/zephyr/syscall_dispatch.c # Write this file
--syscall-exports-llext include/generated/zephyr/syscall_exports_llext.c
--syscall-weakdefs-llext syscall_weakdefs_llext.c # compiled in CMake library 'syscall_weakdefs'
--syscall-export-llext include/generated/zephyr/syscall_export_llext.c
--syscall-list ${syscall_list_h}
$<$<BOOL:${CONFIG_USERSPACE}>:--gen-mrsh-files>
${SYSCALL_LONG_REGISTERS_ARG}
@@ -900,57 +857,30 @@ add_custom_command(
COMMAND
${LEGACY_SYSCALL_LIST_H_ARGS}
WORKING_DIRECTORY ${CMAKE_CURRENT_BINARY_DIR}
DEPENDS ${syscalls_json}
DEPENDS ${PARSE_SYSCALLS_TARGET}
)
include(${ZEPHYR_BASE}/cmake/kobj.cmake)
# This is passed into all calls to the gen_kobject_list.py script.
set(gen_kobject_list_include_args --include-subsystem-list ${struct_tags_json})
set(DRV_VALIDATION ${PROJECT_BINARY_DIR}/include/generated/zephyr/driver-validation.h)
gen_kobject_list(
TARGET ${DRIVER_VALIDATION_H_TARGET}
OUTPUTS ${DRV_VALIDATION}
SCRIPT_ARGS --validation-output ${DRV_VALIDATION}
INCLUDES ${struct_tags_json}
DEPENDS ${struct_tags_json}
)
gen_kobject_list_headers(
INCLUDES ${struct_tags_json}
DEPENDS ${struct_tags_json}
)
# Generate sections for kernel device subsystems
set(
DEVICE_API_LD_SECTIONS
${CMAKE_CURRENT_BINARY_DIR}/include/generated/device-api-sections.ld
)
set(DEVICE_API_LINKER_SECTIONS_CMAKE
${CMAKE_CURRENT_BINARY_DIR}/include/generated/device-api-sections.cmake
)
add_custom_command(
OUTPUT ${DEVICE_API_LD_SECTIONS} ${DEVICE_API_LINKER_SECTIONS_CMAKE}
OUTPUT ${DRV_VALIDATION}
COMMAND
${PYTHON_EXECUTABLE}
${ZEPHYR_BASE}/scripts/build/gen_iter_sections.py
--alignment ${CONFIG_LINKER_ITERABLE_SUBALIGN}
--input ${struct_tags_json}
--tag __subsystem
--ld-output ${DEVICE_API_LD_SECTIONS}
--cmake-output ${DEVICE_API_LINKER_SECTIONS_CMAKE}
${PYTHON_EXECUTABLE}
${ZEPHYR_BASE}/scripts/build/gen_kobject_list.py
--validation-output ${DRV_VALIDATION}
${gen_kobject_list_include_args}
$<$<BOOL:${CMAKE_VERBOSE_MAKEFILE}>:--verbose>
DEPENDS
${ZEPHYR_BASE}/scripts/build/gen_iter_sections.py
${struct_tags_json}
${ZEPHYR_BASE}/scripts/build/gen_kobject_list.py
${PARSE_SYSCALLS_TARGET}
WORKING_DIRECTORY ${CMAKE_CURRENT_BINARY_DIR}
)
add_custom_target(${DRIVER_VALIDATION_H_TARGET} DEPENDS ${DRV_VALIDATION})
add_custom_target(${DEVICE_API_LD_TARGET}
DEPENDS ${DEVICE_API_LD_SECTIONS}
${DEVICE_API_LINKER_SECTIONS_CMAKE}
)
zephyr_linker_include_generated(CMAKE ${DEVICE_API_LINKER_SECTIONS_CMAKE})
include(${ZEPHYR_BASE}/cmake/kobj.cmake)
gen_kobj(KOBJ_INCLUDE_PATH)
# Add a pseudo-target that is up-to-date when all generated headers
# are up-to-date.
@@ -982,7 +912,6 @@ add_dependencies(zephyr_interface
${SYSCALL_LIST_H_TARGET}
${DRIVER_VALIDATION_H_TARGET}
${KOBJ_TYPES_H_TARGET}
${DEVICE_API_LD_TARGET}
)
add_custom_command(
@@ -1047,16 +976,6 @@ else()
set(NO_WHOLE_ARCHIVE_LIBS kernel)
endif()
if(CONFIG_LLEXT)
# LLEXT exports symbols for all syscalls, including unimplemented ones.
# Weak definitions for these must be added at the end of the link order
# to avoid shadowing actual implementations.
add_library(syscall_weakdefs syscall_weakdefs_llext.c)
add_dependencies(syscall_weakdefs zephyr_generated_headers)
target_link_libraries(syscall_weakdefs zephyr_interface)
list(APPEND NO_WHOLE_ARCHIVE_LIBS syscall_weakdefs)
endif()
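For readers unfamiliar with the weak-definition technique the removed comment above relies on, here is a minimal stand-alone C sketch of the underlying GCC/Clang linker behavior. It is illustrative only: the symbol name is hypothetical and this is not the content of the generated syscall_weakdefs_llext.c.

#include <stdio.h>

/* Weak fallback, conceptually placed at the end of the link order. */
__attribute__((weak)) int z_impl_hypothetical_syscall(int arg)
{
	(void)arg;
	return -1; /* stub for an unimplemented syscall */
}

/*
 * If any other object in the link provides a strong definition of
 * z_impl_hypothetical_syscall(), the linker selects it and this weak
 * stub is discarded; otherwise the stub satisfies the reference.
 */
int main(void)
{
	printf("syscall stub returned %d\n", z_impl_hypothetical_syscall(0));
	return 0;
}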
get_property(OUTPUT_FORMAT GLOBAL PROPERTY PROPERTY_OUTPUT_FORMAT)
if (CONFIG_CODE_DATA_RELOCATION)
@@ -1111,23 +1030,23 @@ if(CONFIG_CODE_DATA_RELOCATION)
endif()
if(CONFIG_USERSPACE)
# Go for raw properties here since zephyr_get_compile_options_for_lang()
# processes the list of options, and wraps it in a $<JOIN thing. When
# generating the build systems this leads to some interesting command lines,
# with SHELL: not being present and other "random" list-join related issues
# (e.g. for IAR toolchains the lists were joined with "gnu" postfixed on a
# bunch of entries).
get_property(compiler_flags_priv TARGET zephyr_interface PROPERTY INTERFACE_COMPILE_OPTIONS)
zephyr_get_compile_options_for_lang_as_string(C compiler_flags_priv)
string(REPLACE "$<TARGET_PROPERTY:compiler,coverage>" ""
KOBJECT_HASH_COMPILE_OPTIONS "${compiler_flags_priv}")
list(APPEND KOBJECT_HASH_COMPILE_OPTIONS
$<TARGET_PROPERTY:compiler,no_function_sections>
$<TARGET_PROPERTY:compiler,no_data_sections>)
NO_COVERAGE_FLAGS "${compiler_flags_priv}"
)
set(GEN_KOBJ_LIST ${ZEPHYR_BASE}/scripts/build/gen_kobject_list.py)
set(PROCESS_GPERF ${ZEPHYR_BASE}/scripts/build/process_gperf.py)
endif()
get_property(GLOBAL_CSTD GLOBAL PROPERTY CSTD)
if(DEFINED GLOBAL_CSTD)
message(DEPRECATION
"Global CSTD property is deprecated, see Kconfig.zephyr for C Standard options.")
set(CSTD ${GLOBAL_CSTD})
list(APPEND CMAKE_C_COMPILE_FEATURES ${compile_features_${CSTD}})
endif()
# @Intent: Obtain compiler specific flag for specifying the c standard
zephyr_compile_options(
$<$<COMPILE_LANGUAGE:C>:$<TARGET_PROPERTY:compiler,cstd>${CSTD}>
@@ -1143,15 +1062,17 @@ set_ifndef( TOPT "${COMPILER_TOPT}")
set_ifndef( TOPT -Wl,-T) # Use this if the compiler driver doesn't set a value
if(CONFIG_HAVE_CUSTOM_LINKER_SCRIPT)
string(CONFIGURE ${APPLICATION_SOURCE_DIR}/${CONFIG_CUSTOM_LINKER_SCRIPT} LINKER_SCRIPT)
set(LINKER_SCRIPT ${APPLICATION_SOURCE_DIR}/${CONFIG_CUSTOM_LINKER_SCRIPT})
if(NOT EXISTS ${LINKER_SCRIPT})
string(CONFIGURE ${CONFIG_CUSTOM_LINKER_SCRIPT} LINKER_SCRIPT)
assert_exists(LINKER_SCRIPT)
set(LINKER_SCRIPT ${CONFIG_CUSTOM_LINKER_SCRIPT})
assert_exists(CONFIG_CUSTOM_LINKER_SCRIPT)
endif()
elseif(DEFINED BOARD_LINKER_SCRIPT)
set(LINKER_SCRIPT ${BOARD_LINKER_SCRIPT})
elseif(DEFINED SOC_LINKER_SCRIPT)
set(LINKER_SCRIPT ${SOC_LINKER_SCRIPT})
else()
find_package(Deprecated COMPONENTS SEARCHED_LINKER_SCRIPT)
endif()
if(NOT EXISTS ${LINKER_SCRIPT})
@@ -1159,20 +1080,14 @@ if(NOT EXISTS ${LINKER_SCRIPT})
endif()
if(CONFIG_USERSPACE)
if(CONFIG_CMAKE_LINKER_GENERATOR)
set(APP_SMEM_LD_EXT "cmake")
else()
set(APP_SMEM_LD_EXT "ld")
endif()
set(APP_SMEM_ALIGNED_LD "${PROJECT_BINARY_DIR}/include/generated/app_smem_aligned.${APP_SMEM_LD_EXT}")
set(APP_SMEM_UNALIGNED_LD "${PROJECT_BINARY_DIR}/include/generated/app_smem_unaligned.${APP_SMEM_LD_EXT}")
set(APP_SMEM_ALIGNED_LD "${PROJECT_BINARY_DIR}/include/generated/app_smem_aligned.ld")
set(APP_SMEM_UNALIGNED_LD "${PROJECT_BINARY_DIR}/include/generated/app_smem_unaligned.ld")
if(CONFIG_LINKER_USE_PINNED_SECTION)
set(APP_SMEM_PINNED_ALIGNED_LD
"${PROJECT_BINARY_DIR}/include/generated/app_smem_pinned_aligned.${APP_SMEM_LD_EXT}")
"${PROJECT_BINARY_DIR}/include/generated/app_smem_pinned_aligned.ld")
set(APP_SMEM_PINNED_UNALIGNED_LD
"${PROJECT_BINARY_DIR}/include/generated/app_smem_pinned_unaligned.${APP_SMEM_LD_EXT}")
"${PROJECT_BINARY_DIR}/include/generated/app_smem_pinned_unaligned.ld")
if(NOT CONFIG_LINKER_GENERIC_SECTIONS_PRESENT_AT_BOOT)
# The libc partition may hold symbols that are required during boot process,
@@ -1237,11 +1152,9 @@ if(CONFIG_USERSPACE)
set(APP_SMEM_UNALIGNED_LIB app_smem_unaligned_output_obj_renamed_lib)
list(APPEND LINKER_PASS_${ZEPHYR_CURRENT_LINKER_PASS}_DEFINE "LINKER_APP_SMEM_UNALIGNED")
endif()
foreach(dep ${APP_SMEM_UNALIGNED_LD} ${APP_SMEM_PINNED_UNALIGNED_LD})
zephyr_linker_include_generated(CMAKE ${dep} PASS LINKER_APP_SMEM_UNALIGNED)
endforeach()
if (CONFIG_USERSPACE)
add_custom_command(
OUTPUT ${APP_SMEM_ALIGNED_LD} ${APP_SMEM_PINNED_ALIGNED_LD}
COMMAND ${PYTHON_EXECUTABLE}
@@ -1261,9 +1174,6 @@ if(CONFIG_USERSPACE)
COMMAND_EXPAND_LISTS
COMMENT "Generating app_smem_aligned linker section"
)
foreach(dep ${APP_SMEM_ALIGNED_LD} ${APP_SMEM_PINNED_ALIGNED_LD})
zephyr_linker_include_generated(CMAKE ${dep} PASS NOT LINKER_APP_SMEM_UNALIGNED)
endforeach()
endif()
if(CONFIG_USERSPACE)
@@ -1278,13 +1188,23 @@ if(CONFIG_USERSPACE)
set(KOBJECT_PREBUILT_HASH_OUTPUT_SRC_PRE kobject_prebuilt_hash_preprocessed.c)
set(KOBJECT_PREBUILT_HASH_OUTPUT_SRC kobject_prebuilt_hash.c)
gen_kobject_list_gperf(
TARGET kobj_prebuilt_hash_list
add_custom_command(
OUTPUT ${KOBJECT_PREBUILT_HASH_LIST}
KERNEL_TARGET ${ZEPHYR_LINK_STAGE_EXECUTABLE}
INCLUDES ${struct_tags_json}
DEPENDS ${struct_tags_json}
COMMAND
${PYTHON_EXECUTABLE}
${GEN_KOBJ_LIST}
--kernel $<TARGET_FILE:${ZEPHYR_LINK_STAGE_EXECUTABLE}>
--gperf-output ${KOBJECT_PREBUILT_HASH_LIST}
${gen_kobject_list_include_args}
$<$<BOOL:${CMAKE_VERBOSE_MAKEFILE}>:--verbose>
DEPENDS
${ZEPHYR_LINK_STAGE_EXECUTABLE}
WORKING_DIRECTORY ${CMAKE_CURRENT_BINARY_DIR}
)
add_custom_target(
kobj_prebuilt_hash_list
DEPENDS ${CMAKE_CURRENT_BINARY_DIR}/${KOBJECT_PREBUILT_HASH_LIST}
)
add_custom_command(
OUTPUT ${KOBJECT_PREBUILT_HASH_OUTPUT_SRC_PRE}
@@ -1321,13 +1241,11 @@ if(CONFIG_USERSPACE)
add_library(
kobj_prebuilt_hash_output_lib
OBJECT ${CMAKE_CURRENT_BINARY_DIR}/${KOBJECT_PREBUILT_HASH_OUTPUT_SRC}
)
)
# set_target_properties sets ALL properties, target_compile_options() adds
# and KOBJECT_HASH_COMPILE_OPTIONS contains all the options.
set_target_properties(kobj_prebuilt_hash_output_lib PROPERTIES
COMPILE_OPTIONS "${KOBJECT_HASH_COMPILE_OPTIONS}"
)
set_source_files_properties(${KOBJECT_PREBUILT_HASH_OUTPUT_SRC}
PROPERTIES COMPILE_FLAGS
"${NO_COVERAGE_FLAGS} -fno-function-sections -fno-data-sections")
target_compile_definitions(kobj_prebuilt_hash_output_lib
PRIVATE $<TARGET_PROPERTY:zephyr_interface,INTERFACE_COMPILE_DEFINITIONS>
@@ -1363,13 +1281,6 @@ if(CONFIG_USERSPACE)
DEPENDS
${KOBJECT_LINKER_HEADER_DATA}
)
# gen_kobject_placeholders.py generates linker-kobject-prebuild-data.h,
# linker-kobject-prebuild-priv-stacks.h and linker-kobject-prebuild-rodata.h
foreach(ext "-data.h" "-priv-stacks.h" "-rodata.h")
string(REGEX REPLACE "-data.h$" ${ext} file ${KOBJECT_LINKER_HEADER_DATA})
zephyr_linker_include_generated(HEADER ${file} PASS LINKER_ZEPHYR_PREBUILT LINKER_ZEPHYR_FINAL)
endforeach()
endif()
if(CONFIG_USERSPACE OR CONFIG_DEVICE_DEPS)
@@ -1477,13 +1388,23 @@ if(CONFIG_USERSPACE)
# Use the script GEN_KOBJ_LIST to scan the kernel binary's
# (${ZEPHYR_LINK_STAGE_EXECUTABLE}) DWARF information to produce a table of kernel
# objects (KOBJECT_HASH_LIST) which we will then pass to gperf
gen_kobject_list_gperf(
TARGET kobj_hash_list
add_custom_command(
OUTPUT ${KOBJECT_HASH_LIST}
KERNEL_TARGET ${ZEPHYR_LINK_STAGE_EXECUTABLE}
INCLUDES ${struct_tags_json}
DEPENDS ${struct_tags_json}
COMMAND
${PYTHON_EXECUTABLE}
${GEN_KOBJ_LIST}
--kernel $<TARGET_FILE:${ZEPHYR_LINK_STAGE_EXECUTABLE}>
--gperf-output ${KOBJECT_HASH_LIST}
${gen_kobject_list_include_args}
$<$<BOOL:${CMAKE_VERBOSE_MAKEFILE}>:--verbose>
DEPENDS
${ZEPHYR_LINK_STAGE_EXECUTABLE}
WORKING_DIRECTORY ${CMAKE_CURRENT_BINARY_DIR}
)
add_custom_target(
kobj_hash_list
DEPENDS ${CMAKE_CURRENT_BINARY_DIR}/${KOBJECT_HASH_LIST}
)
# Use gperf to generate C code (KOBJECT_HASH_OUTPUT_SRC_PRE) which implements a
# perfect hashtable based on KOBJECT_HASH_LIST
@@ -1532,9 +1453,9 @@ if(CONFIG_USERSPACE)
OBJECT ${CMAKE_CURRENT_BINARY_DIR}/${KOBJECT_HASH_OUTPUT_SRC}
)
set_target_properties(kobj_hash_output_lib PROPERTIES
COMPILE_OPTIONS "${KOBJECT_HASH_COMPILE_OPTIONS}"
)
set_source_files_properties(${KOBJECT_HASH_OUTPUT_SRC}
PROPERTIES COMPILE_FLAGS
"${NO_COVERAGE_FLAGS} -fno-function-sections -fno-data-sections")
target_compile_definitions(kobj_hash_output_lib
PRIVATE $<TARGET_PROPERTY:zephyr_interface,INTERFACE_COMPILE_DEFINITIONS>
@@ -1701,12 +1622,9 @@ list(APPEND
)
list(APPEND post_build_byproducts ${KERNEL_MAP_NAME})
# Use ';' as separator to get proper space in resulting command.
# Ignore gapfill for RX target as some RX toolchains generate wrong output image when running
# gapfill for binary format.
if(NOT CONFIG_RX)
set(gap_fill_prop "$<TARGET_PROPERTY:bintools,elfconvert_flag_gapfill>")
set(gap_fill "$<$<BOOL:${gap_fill_prop}>:${gap_fill_prop}${CONFIG_BUILD_GAP_FILL_PATTERN}>")
if(NOT CONFIG_BUILD_NO_GAP_FILL)
# Use ';' as separator to get proper space in resulting command.
set(GAP_FILL "$<TARGET_PROPERTY:bintools,elfconvert_flag_gapfill>0xff")
endif()
if(CONFIG_OUTPUT_PRINT_MEMORY_USAGE)
@@ -1750,8 +1668,13 @@ if(CONFIG_BUILD_OUTPUT_ADJUST_LMA)
)
endif()
if(NOT CONFIG_CPP_EXCEPTIONS)
set(eh_frame_section ".eh_frame")
else()
set(eh_frame_section "")
endif()
set(remove_sections_argument_list "")
foreach(section .comment COMMON)
foreach(section .comment COMMON ${eh_frame_section})
list(APPEND remove_sections_argument_list
$<TARGET_PROPERTY:bintools,elfconvert_flag_section_remove>${section})
endforeach()
@@ -1763,7 +1686,7 @@ if(CONFIG_BUILD_OUTPUT_HEX OR BOARD_FLASH_RUNNER STREQUAL openocd)
post_build_commands
COMMAND $<TARGET_PROPERTY:bintools,elfconvert_command>
$<TARGET_PROPERTY:bintools,elfconvert_flag>
$<$<BOOL:${CONFIG_BUILD_OUTPUT_HEX_GAP_FILL}>:${gap_fill}>
${GAP_FILL}
$<TARGET_PROPERTY:bintools,elfconvert_flag_outtarget>ihex
${remove_sections_argument_list}
$<TARGET_PROPERTY:bintools,elfconvert_flag_infile>${KERNEL_ELF_NAME}
@@ -1785,7 +1708,7 @@ if(CONFIG_BUILD_OUTPUT_BIN)
post_build_commands
COMMAND $<TARGET_PROPERTY:bintools,elfconvert_command>
$<TARGET_PROPERTY:bintools,elfconvert_flag>
${gap_fill}
${GAP_FILL}
$<TARGET_PROPERTY:bintools,elfconvert_flag_outtarget>binary
${remove_sections_argument_list}
$<TARGET_PROPERTY:bintools,elfconvert_flag_infile>${KERNEL_ELF_NAME}
@@ -1832,28 +1755,6 @@ if(CONFIG_BUILD_OUTPUT_BIN AND CONFIG_BUILD_OUTPUT_UF2)
set(BYPRODUCT_KERNEL_UF2_NAME "${PROJECT_BINARY_DIR}/${KERNEL_UF2_NAME}" CACHE FILEPATH "Kernel uf2 file" FORCE)
endif()
if(CONFIG_BUILD_OUTPUT_MOT)
get_property(elfconvert_formats TARGET bintools PROPERTY elfconvert_formats)
if(srec IN_LIST elfconvert_formats)
list(APPEND
post_build_commands
COMMAND $<TARGET_PROPERTY:bintools,elfconvert_command>
$<TARGET_PROPERTY:bintools,elfconvert_flag>
${GAP_FILL}
$<TARGET_PROPERTY:bintools,elfconvert_flag_outtarget>srec
$<TARGET_PROPERTY:bintools,elfconvert_flag_intarget>${OUTPUT_FORMAT}
$<TARGET_PROPERTY:bintools,elfconvert_flag_infile>${KERNEL_ELF_NAME}
$<TARGET_PROPERTY:bintools,elfconvert_flag_outfile>${KERNEL_MOT_NAME}
$<TARGET_PROPERTY:bintools,elfconvert_flag_final>
)
list(APPEND
post_build_byproducts
${KERNEL_MOT_NAME}
)
set(BYPRODUCT_KERNEL_MOT_NAME "${PROJECT_BINARY_DIR}/${KERNEL_MOT_NAME}" CACHE FILEPATH "Kernel mot file" FORCE)
endif()
endif()
set(KERNEL_META_PATH ${PROJECT_BINARY_DIR}/${KERNEL_META_NAME} CACHE INTERNAL "")
if(CONFIG_BUILD_OUTPUT_META)
list(APPEND
@@ -1894,7 +1795,7 @@ if(CONFIG_BUILD_OUTPUT_S19)
post_build_commands
COMMAND $<TARGET_PROPERTY:bintools,elfconvert_command>
$<TARGET_PROPERTY:bintools,elfconvert_flag>
$<$<BOOL:${CONFIG_BUILD_OUTPUT_S19_GAP_FILL}>:${gap_fill}>
${GAP_FILL}
$<TARGET_PROPERTY:bintools,elfconvert_flag_outtarget>srec
$<TARGET_PROPERTY:bintools,elfconvert_flag_srec_len>1
$<TARGET_PROPERTY:bintools,elfconvert_flag_infile>${KERNEL_ELF_NAME}
@@ -1910,14 +1811,11 @@ if(CONFIG_BUILD_OUTPUT_S19)
endif()
if(CONFIG_OUTPUT_DISASSEMBLY)
if(CONFIG_OUTPUT_DISASSEMBLE_ALL)
if(CONFIG_OUTPUT_DISASSEMBLE_ALL)
set(disassembly_type "$<TARGET_PROPERTY:bintools,disassembly_flag_all>")
elseif(CONFIG_OUTPUT_DISASSEMBLY_WITH_SOURCE)
elseif (CONFIG_OUTPUT_DISASSEMBLY_WITH_SOURCE)
set(disassembly_type "$<TARGET_PROPERTY:bintools,disassembly_flag_inline_source>")
endif()
if(CONFIG_OUTPUT_DISASSEMBLY_NO_ALIASES)
list(APPEND disassembly_type "$<TARGET_PROPERTY:bintools,disassembly_flag_no_aliases>")
endif()
list(APPEND
post_build_commands
COMMAND $<TARGET_PROPERTY:bintools,disassembly_command>
@@ -2046,6 +1944,7 @@ if (CONFIG_LLEXT AND CONFIG_LLEXT_EXPORT_BUILTINS_BY_SLID)
--elf-file ${PROJECT_BINARY_DIR}/${KERNEL_ELF_NAME}
--slid-listing ${PROJECT_BINARY_DIR}/slid_listing.txt
)
endif()
if(NOT CMAKE_C_COMPILER_ID STREQUAL "ARMClang")
@@ -2276,57 +2175,51 @@ if((CMAKE_BUILD_TYPE IN_LIST build_types) AND (NOT NO_BUILD_TYPE_WARNING))
endif()
# Extension Development Kit (EDK) generation.
if(CONFIG_LLEXT_EDK)
if(CONFIG_LLEXT_EDK_FORMAT_TAR_XZ)
set(llext_edk_extension "tar.xz")
elseif(CONFIG_LLEXT_EDK_FORMAT_TAR_ZSTD)
set(llext_edk_extension "tar.Z")
elseif(CONFIG_LLEXT_EDK_FORMAT_ZIP)
set(llext_edk_extension "zip")
else()
message(FATAL_ERROR "Unsupported LLEXT_EDK_FORMAT choice")
endif()
set(llext_edk_file ${PROJECT_BINARY_DIR}/${CONFIG_LLEXT_EDK_NAME}.${llext_edk_extension})
set(llext_edk_file ${PROJECT_BINARY_DIR}/${CONFIG_LLEXT_EDK_NAME}.tar.xz)
# TODO maybe generate flags for C CXX ASM
zephyr_get_compile_definitions_for_lang(C zephyr_defs)
zephyr_get_compile_options_for_lang(C zephyr_flags)
# TODO maybe generate flags for C CXX ASM
zephyr_get_compile_definitions_for_lang(C zephyr_defs)
zephyr_get_compile_options_for_lang(C zephyr_flags)
# Filter out non LLEXT and LLEXT_EDK flags - and add required ones
llext_filter_zephyr_flags(LLEXT_REMOVE_FLAGS ${zephyr_flags} llext_filt_flags)
llext_filter_zephyr_flags(LLEXT_EDK_REMOVE_FLAGS ${llext_filt_flags} llext_filt_flags)
# Filter out non LLEXT and LLEXT_EDK flags - and add required ones
llext_filter_zephyr_flags(LLEXT_REMOVE_FLAGS ${zephyr_flags} llext_filt_flags)
llext_filter_zephyr_flags(LLEXT_EDK_REMOVE_FLAGS ${llext_filt_flags} llext_filt_flags)
set(llext_edk_cflags ${zephyr_defs} -DLL_EXTENSION_BUILD)
list(APPEND llext_edk_cflags ${llext_filt_flags})
list(APPEND llext_edk_cflags ${LLEXT_APPEND_FLAGS})
list(APPEND llext_edk_cflags ${LLEXT_EDK_APPEND_FLAGS})
set(llext_edk_cflags ${zephyr_defs} -DLL_EXTENSION_BUILD)
list(APPEND llext_edk_cflags ${llext_filt_flags})
list(APPEND llext_edk_cflags ${LLEXT_APPEND_FLAGS})
list(APPEND llext_edk_cflags ${LLEXT_EDK_APPEND_FLAGS})
build_info(llext-edk file PATH ${llext_edk_file})
build_info(llext-edk cflags VALUE ${llext_edk_cflags})
build_info(llext-edk include-dirs VALUE "$<TARGET_PROPERTY:zephyr_interface,INTERFACE_INCLUDE_DIRECTORIES>")
add_custom_command(
add_custom_command(
OUTPUT ${llext_edk_file}
# Regenerate syscalls in case CONFIG_LLEXT_EDK_USERSPACE_ONLY
COMMAND ${CMAKE_COMMAND}
-E make_directory edk/include/generated/zephyr
-E make_directory edk/include/generated/zephyr
COMMAND
${PYTHON_EXECUTABLE}
${ZEPHYR_BASE}/scripts/build/gen_syscalls.py
--json-file ${syscalls_json} # Read this file
--base-output edk/include/generated/zephyr/syscalls # Write to this dir
--syscall-dispatch edk/include/generated/zephyr/syscall_dispatch.c # Write this file
--syscall-list ${edk_syscall_list_h}
$<$<BOOL:${CONFIG_LLEXT_EDK_USERSPACE_ONLY}>:--userspace-only>
${SYSCALL_LONG_REGISTERS_ARG}
${SYSCALL_SPLIT_TIMEOUT_ARG}
${PYTHON_EXECUTABLE}
${ZEPHYR_BASE}/scripts/build/gen_syscalls.py
--json-file ${syscalls_json} # Read this file
--base-output edk/include/generated/zephyr/syscalls # Write to this dir
--syscall-dispatch edk/include/generated/zephyr/syscall_dispatch.c # Write this file
--syscall-list ${edk_syscall_list_h}
$<$<BOOL:${CONFIG_LLEXT_EDK_USERSPACE_ONLY}>:--userspace-only>
${SYSCALL_LONG_REGISTERS_ARG}
${SYSCALL_SPLIT_TIMEOUT_ARG}
COMMAND ${CMAKE_COMMAND}
-DPROJECT_BINARY_DIR=${PROJECT_BINARY_DIR}
-DAPPLICATION_SOURCE_DIR=${APPLICATION_SOURCE_DIR}
-DINTERFACE_INCLUDE_DIRECTORIES="$<TARGET_PROPERTY:zephyr_interface,INTERFACE_INCLUDE_DIRECTORIES>"
-Dllext_edk_file=${llext_edk_file}
-Dllext_edk_cflags="${llext_edk_cflags}"
-Dllext_edk_name=${CONFIG_LLEXT_EDK_NAME}
-DWEST_TOPDIR=${WEST_TOPDIR}
-DZEPHYR_BASE=${ZEPHYR_BASE}
-DCONFIG_LLEXT_EDK_USERSPACE_ONLY=${CONFIG_LLEXT_EDK_USERSPACE_ONLY}
-P ${ZEPHYR_BASE}/cmake/llext-edk.cmake
DEPENDS ${logical_target_for_zephyr_elf} ${syscalls_json} build_info_yaml_saved
DEPENDS ${logical_target_for_zephyr_elf}
COMMAND_EXPAND_LISTS
)
add_custom_target(llext-edk DEPENDS ${llext_edk_file})
endif()
)
add_custom_target(llext-edk DEPENDS ${llext_edk_file})
# @Intent: Set compiler specific flags for standard C/C++ includes
# Done at the very end, so any other system includes which may
@@ -2347,8 +2240,9 @@ add_subdirectory_ifdef(
toolchain_linker_finalize()
# export build information
build_info(zephyr version VALUE ${PROJECT_VERSION_STR})
build_info(zephyr zephyr-base VALUE ${ZEPHYR_BASE})
yaml_save(NAME build_info)
yaml_context(EXISTS NAME build_info result)
if(result)
build_info(zephyr version VALUE ${PROJECT_VERSION_STR})
build_info(zephyr zephyr-base VALUE ${ZEPHYR_BASE})
yaml_save(NAME build_info)
endif()


@@ -1,4 +1,491 @@
# Instead of the CODEOWNERS file, The Zephyr Project uses a custom format.
# See MAINTAINERS.yml for details.
# CODEOWNERS for autoreview assigning in github
# https://help.github.com/en/articles/about-code-owners#codeowners-syntax
# Order is important; for each modified file, the last matching
# pattern takes the most precedence.
# That is, with the last pattern being
# *.rst @nashif
# if only .rst files are being modified, only nashif is
# automatically requested for review, but you can manually
# add others as needed.
# Do not use wildcard on all source yet
#
# DO NOT EDIT THIS FILE.
# +++++++++++ NOTE ++++++++++++++++
#
# Please use the MAINTAINERS file to add yourself in an area or to add a new
# component or code. This file is going to be deprecated and currently only has
# entries that are not covered by the MAINTAINERS file.
/soc/arm/aspeed/ @aspeeddylan
/soc/atmel/ @nandojve
/soc/arm/bcm*/ @sbranden
/soc/arm/ene/ @ene-steven
/soc/arm/infineon_cat1/ @ifyall @npal-cy
/soc/arm/infineon_xmc/ @parthitce
/soc/arm/silabs_exx32/efm32pg1b/ @rdmeneze
/soc/arm/silabs_exx32/efr32mg21/ @l-alfred
/soc/arm/st_stm32/stm32mp1/ @arnopo
/soc/arm/st_stm32/stm32h7/*stm32h735* @benediktibk
/soc/arm/st_stm32/stm32l4/*stm32l451* @benediktibk
/soc/arm/ti_simplelink/cc13x2_cc26x2/ @bwitherspoon
/soc/arm/ti_simplelink/cc32xx/ @vanti
/soc/arm/ti_simplelink/msp432p4xx/ @Mani-Sadhasivam
/soc/arm/xilinx_zynq7000/ @ibirnbaum
/soc/arm/xilinx_zynqmp/ @stephanosio
/soc/arm/renesas_rcar/ @aaillet
/soc/riscv/openisa*/ @dleach02
/soc/riscv/riscv-privileged/andes_v5/ @cwshu @kevinwang821020 @jimmyzhe
/soc/riscv/riscv-privileged/neorv32/ @henrikbrixandersen
/soc/riscv/riscv-privileged/gd32vf103/ @soburi
/soc/starfive/jh71xx/ @pfarwsi
/soc/riscv/riscv-privileged/niosv/ @sweeaun
/boards/adafruit/feather_nrf52840/ @jacobw
/boards/ene/ @ene-steven
/boards/arm/96b_argonkey/ @avisconti
/boards/arm/96b_avenger96/ @Mani-Sadhasivam
/boards/arm/96b_carbon/ @idlethread
/boards/arm/96b_meerkat96/ @Mani-Sadhasivam
/boards/arm/96b_nitrogen/ @idlethread
/boards/arm/96b_neonkey/ @Mani-Sadhasivam
/boards/arm/96b_stm32_sensor_mez/ @Mani-Sadhasivam
/boards/arm/96b_wistrio/ @Mani-Sadhasivam
/boards/arm/acn52832/ @sven-hm
/boards/arm/arduino_mkrzero/ @soburi
/boards/arm/bbc_microbit_v2/ @LingaoM
/boards/arm/blackpill_f401ce/ @coderkalyan
/boards/arm/blackpill_f411ce/ @coderkalyan
/boards/arm/bt*10/ @greg-leach
/boards/arm/cc1352r1_launchxl/ @bwitherspoon
/boards/arm/cc26x2r1_launchxl/ @bwitherspoon
/boards/arm/cc3220sf_launchxl/ @vanti
/boards/arm/cy8ckit_062_ble/ @ifyall @npal-cy
/boards/arm/cy8ckit_062s4/ @DaWei8823
/boards/arm/cy8ckit_062_wifi_bt/ @ifyall @npal-cy
/boards/arm/cy8cproto_062_4343w/ @ifyall @npal-cy
/boards/arm/efm32pg_stk3401a/ @rdmeneze
/boards/arm/faze/ @mbittan @simonguinot
/boards/arm/frdm*/ @mmahadevan108 @dleach02
/boards/arm/gd32*/ @nandojve
/boards/arm/google_*/ @jackrosenthal
/boards/arm/hexiwear*/ @mmahadevan108 @dleach02
/boards/arm/ip_k66f/ @parthitce @lmajewski
/boards/arm/legend/ @mbittan @simonguinot
/boards/arm/lpcxpresso*/ @mmahadevan108 @dleach02
/boards/arm/mimx8mm_evk/ @Mani-Sadhasivam
/boards/arm/mimx8mm_phyboard_polis @pefech
/boards/arm/mimxrt*/ @mmahadevan108 @dleach02
/boards/arm/mps2_an385/ @fvincenzo
/boards/arm/msp_exp432p401r_launchxl/ @Mani-Sadhasivam
/boards/arm/npcx7m6fb_evb/ @MulinChao @ChiHuaL
/boards/arm/nrf*/ @carlescufi @lemrey
/boards/arm/nucleo_f401re/ @idlethread
/boards/arm/nuvoton_pfm_m487/ @ssekar15
/boards/arm/qemu_cortex_a9/ @ibirnbaum
/boards/arm/qemu_cortex_r*/ @stephanosio
/boards/arm/qemu_cortex_m*/ @ioannisg @stephanosio
/boards/arm/quick_feather/ @fkokosinski @kgugala
/boards/arm/rak4631_nrf52840/ @gpaquet85
/boards/arm/rak5010_nrf52840/ @gpaquet85
/boards/arm/rpi_pico/ @yonsch
/boards/arm/ronoth_lodev/ @NorthernDean
/boards/arm/xmc45_relax_kit/ @parthitce
/boards/atmel/ @nandojve
/boards/arm/scobc_module1/ @yashi
/boards/arm/v2m_beetle/ @fvincenzo
/boards/arm/olimexino_stm32/ @ydamigos
/boards/arm/s32*/ @manuargue
/boards/arm/sensortile_box/ @avisconti
/boards/arm/steval_fcu001v1/ @Navin-Sankar
/boards/arm/stm32l1_disco/ @karlp
/boards/arm/stm32h735g_disco/ @benediktibk
/boards/arm/stm32f3_disco/ @ydamigos
/boards/arm/rcar_*/ @aaillet
/boards/arm/ubx_bmd345eval_nrf52840/ @Navin-Sankar @brec-u-blox
/boards/arm/nrf5340_audio_dk_nrf5340 @koffes @alexsven @erikrobstad @rick1082 @gWacey
/boards/arm/stm32_min_dev/ @sidcha
/boards/ezurio/* @rerickson1
/boards/riscv/rv32m1_vega/ @dleach02
/boards/riscv/adp_xc7k_ae350/ @cwshu @kevinwang821020 @jimmyzhe
/boards/riscv/longan_nano/ @soburi
/boards/riscv/neorv32/ @henrikbrixandersen
/boards/riscv/niosv*/ @sweeaun
/boards/riscv/sparkfun_red_v_things_plus/ @soburi
/boards/riscv/stamp_c3/ @soburi
/boards/starfive/visionfive2/ @kanakshilledar @pfarwsi
/boards/shields/atmel_rf2xx/ @nandojve
/boards/shields/esp_8266/ @nandojve
/boards/shields/inventek_eswifi/ @nandojve
/boards/xtensa/odroid_go/ @ydamigos
/boards/xtensa/nxp_adsp_imx8/ @iuliana-prodan @dbaluta
/boards/xtensa/kincony_kc868_a32/ @bbilas
/boards/arm64/qemu_cortex_a53/ @carlocaione
/boards/arm64/bcm958402m2_a72/ @abhishek-brcm
/boards/arm64/mimx8mm_evk/ @MrVan @JiafeiPan
/boards/arm64/mimx8mp_evk/ @MrVan @JiafeiPan
/boards/arm64/mimx8mn_evk/ @JiafeiPan
/boards/arm64/mimx93_evk/ @JiafeiPan
/boards/arm64/nxp_ls1046ardb/ @JiafeiPan
/boards/arm64/xenvm/ @lorc @firscity
/boards/arm64/fvp_baser_aemv8r/ @povergoing
/boards/arm64/fvp_base_revc_2xaemv8a/ @carlocaione
/boards/arm64/intel_socfpga_agilex_socdk/ @siclim @ngboonkhai
/boards/arm64/intel_socfpga_agilex5_socdk/ @teikheng @gdengi
/boards/arm64/rcar_*/ @lorc @xakep-amatop
# All cmake related files
/doc/develop/tools/coccinelle.rst @himanshujha199640 @JuliaLawall
/doc/services/device_mgmt/smp_protocol.rst @de-nordic @nordicjm
/doc/services/device_mgmt/smp_groups/ @de-nordic @nordicjm
/doc/services/sensing/ @lixuzha @ghu0510 @qianruh
/doc/CMakeLists.txt @carlescufi
/doc/_scripts/ @carlescufi
/drivers/*/*sam4l* @nandojve
/drivers/*/*cc13xx_cc26xx* @bwitherspoon
/drivers/*/*gd32* @nandojve
/drivers/*/*mcux* @mmahadevan108 @dleach02
/drivers/*/*native_posix* @aescolar @daor-oti
/drivers/*/*lpc11u6x* @mbittan @simonguinot
/drivers/*/*npcx* @MulinChao @ChiHuaL
/drivers/*/*andes* @cwshu @kevinwang821020 @jimmyzhe
/drivers/*/*ifx_cat1* @ifyall @npal-cy
/drivers/*/*neorv32* @henrikbrixandersen
/drivers/*/*_s32* @manuargue
/drivers/adc/adc_ads1x1x.c @XenuIsWatching
/drivers/adc/adc_stm32.c @cybertale
/drivers/adc/adc_rpi_pico.c @soburi
/drivers/adc/*ads114s0x* @benediktibk
/drivers/adc/*max11102_17* @benediktibk
/drivers/adc/*kb1200* @ene-steven
/drivers/adc/adc_ad559x.c @bbilas
/drivers/audio/*nrfx* @anangl
/drivers/auxdisplay/*pt6314* @xingrz
/drivers/auxdisplay/* @thedjnK
/drivers/bbram/* @yperess @sjg20 @jackrosenthal
/drivers/bluetooth/ @alwa-nordic @jhedberg @Vudentz
/drivers/bluetooth/hci/hci_esp32.c @sylvioalves
/drivers/can/*mcp2515* @karstenkoenig
/drivers/can/*rcar* @aaillet
/drivers/clock_control/*agilex* @siclim @gdengi
/drivers/clock_control/*nrf* @nordic-krch
/drivers/clock_control/*esp32* @extremegtx @sylvioalves
/drivers/clock_control/*cpg_mssr* @aaillet
/drivers/console/ipm_console.c @finikorg
/drivers/console/semihost_console.c @luozhongyao
/drivers/console/jailhouse_debug_console.c @MrVan
/drivers/counter/counter_cmos.c @dcpleung
/drivers/counter/counter_ll_stm32_timer.c @kentjhall
/drivers/counter/*esp32* @sylvioalves
/drivers/counter/dw_timer.c @pbalsundar
/drivers/counter/counter_timer_shell.c @pbalsundar
/drivers/crypto/*nrf_ecb* @maciekfabia @anangl
/drivers/display/*rm68200* @mmahadevan108
/drivers/display/display_ili9342c.* @extremegtx
/drivers/dac/*ad56xx* @benediktibk
/drivers/dac/dac_ad559x.c @bbilas
/drivers/dai/ @kv2019i @marcinszkudlinski @abonislawski
/drivers/dai/intel/ @kv2019i @marcinszkudlinski @abonislawski
/drivers/dai/intel/ssp/ @kv2019i @marcinszkudlinski @abonislawski
/drivers/dai/intel/dmic/ @marcinszkudlinski @abonislawski
/drivers/dai/intel/alh/ @abonislawski
/drivers/dma/dma_dw_axi.c @pbalsundar
/drivers/dma/*dw* @tbursztyka
/drivers/dma/*dw_common* @abonislawski
/drivers/dma/*sam0* @Sizurka
/drivers/dma/dma_stm32* @cybertale @lowlander
/drivers/dma/*pl330* @raveenp
/drivers/dma/*iproc_pax* @raveenp
/drivers/dma/*intel_adsp* @teburd @abonislawski
/drivers/dma/*rpi_pico* @soburi
/drivers/dma/*xmc4xxx* @talih0
/drivers/eeprom/eeprom_stm32.c @KwonTae-young
/drivers/entropy/*b91* @andy-liu-telink
/drivers/entropy/*bt_hci* @JordanYates
/drivers/entropy/*rv32m1* @dleach02
/drivers/ethernet/*dwmac* @npitre
/drivers/ethernet/*stm32* @Nukersson @lochej
/drivers/ethernet/*w5500* @parthitce
/drivers/ethernet/*xlnx_gem* @ibirnbaum
/drivers/ethernet/*smsc91x* @sgrrzhf
/drivers/ethernet/*adin2111* @GeorgeCGV
/drivers/ethernet/*oa_tc6* @lmajewski
/drivers/ethernet/*lan865x* @lmajewski
/drivers/ethernet/dwc_xgmac @Smale-12048867
/drivers/ethernet/dwc_xgmac/dwc_xgmac @Smale-12048867
/drivers/ethernet/phy/ @rlubos @tbursztyka @arvinf @jukkar
/drivers/ethernet/phy/*adin2111* @GeorgeCGV
/drivers/mdio/*adin2111* @GeorgeCGV
/drivers/flash/*stm32_qspi* @lmajewski
/drivers/flash/*b91* @andy-liu-telink
/drivers/flash/*cadence* @ngboonkhai
/drivers/flash/*cc13xx_cc26xx* @pepe2k
/drivers/flash/*nrf* @de-nordic
/drivers/flash/*esp32* @sylvioalves
/drivers/flash/flash_cadence_nand* @nbalabak
/drivers/gpio/*b91* @andy-liu-telink
/drivers/gpio/*lmp90xxx* @henrikbrixandersen
/drivers/gpio/*nct38xx* @MulinChao @ChiHuaL
/drivers/gpio/*eos_s3* @fkokosinski @kgugala
/drivers/gpio/*rcar* @aaillet
/drivers/gpio/*esp32* @sylvioalves
/drivers/gpio/*rpi_pico* @yonsch
/drivers/gpio/*xlnx_ps* @ibirnbaum
/drivers/gpio/*ads114s0x* @benediktibk
/drivers/gpio/*bd8lb600fs* @benediktibk
/drivers/gpio/*pcal64xxa* @benediktibk
/drivers/gpio/*kb1200* @ene-steven
/drivers/gpio/gpio_altera_pio.c @shilinte
/drivers/gpio/gpio_ad559x.c @bbilas
/drivers/i2c/i2c_common.c @sjg20
/drivers/i2c/i2c_emul.c @sjg20
/drivers/i2c/i2c_ite_enhance.c @GTLin08
/drivers/i2c/i2c_ite_it8xxx2.c @GTLin08
/drivers/i2c/i2c_shell.c @nashif
/drivers/i2c/Kconfig.i2c_emul @sjg20
/drivers/i2c/Kconfig.it8xxx2 @GTLin08
/drivers/i2c/target/*eeprom* @henrikbrixandersen
/drivers/i2c/Kconfig.test @mbolivar-ampere
/drivers/i2c/i2c_test.c @mbolivar-ampere
/drivers/i2c/*rcar* @aaillet
/drivers/i2c/*kb1200* @ene-steven
/drivers/i2s/i2s_ll_stm32* @avisconti
/drivers/i2s/*nrfx* @anangl
/drivers/i3c/i3c_cdns.c @XenuIsWatching
/drivers/ieee802154/ @rlubos @tbursztyka @jukkar @fgrandel
/drivers/ieee802154/*b91* @andy-liu-telink
/drivers/ieee802154/ieee802154_nrf5* @ankuns
/drivers/ieee802154/ieee802154_rf2xx* @tbursztyka @nandojve
/drivers/ieee802154/ieee802154_cc13xx* @bwitherspoon @cfriedt @vaishnavachath
/drivers/interrupt_controller/ @dcpleung @nashif
/drivers/interrupt_controller/intc_gic.c @stephanosio
/drivers/interrupt_controller/*esp32* @sylvioalves
/drivers/interrupt_controller/intc_nuclei_eclic.c @soburi
/drivers/ipm/ipm_mhu* @karl-zh
/drivers/ipm/Kconfig.nrfx @masz-nordic
/drivers/ipm/Kconfig.nrfx_ipc_channel @masz-nordic
/drivers/ipm/ipm_cavs* @dcpleung @andyross
/drivers/ipm/ipm_nrfx_ipc.c @masz-nordic
/drivers/ipm/ipm_nrfx_ipc.h @masz-nordic
/drivers/ipm/ipm_stm32_ipcc.c @arnopo
/drivers/ipm/ipm_stm32_hsem.c @cameled
/drivers/ipm/ipm_esp32.c @uLipe
/drivers/ipm/ipm_ivshmem.c @uLipe
/drivers/kscan/*xec* @franciscomunoz @sjvasanth1
/drivers/kscan/*ft5336* @MaureenHelm
/drivers/kscan/*ht16k33* @henrikbrixandersen
/drivers/led_strip/ @mbolivar-ampere
/drivers/mfd/mfd_ad559x.c @bbilas
/drivers/mfd/mfd_max20335.c @bbilas
/drivers/misc/ft8xx/ @hubertmis
/drivers/modem/hl7800.c @rerickson1
/drivers/modem/simcom-sim7080.c @lgehreke
/drivers/modem/simcom-sim7080.h @lgehreke
/drivers/modem/Kconfig.hl7800 @rerickson1
/drivers/modem/Kconfig.simcom-sim7080 @lgehreke
/drivers/pinctrl/*esp32* @sylvioalves
/drivers/pinctrl/*it8xxx2* @ite
/drivers/pinctrl/*kb1200* @ene-steven
/drivers/pm_cpu_ops/psci_shell.c @nbalabak @gdengi
/drivers/power_domain/ @ceolin
/drivers/ps2/*xec* @franciscomunoz @sjvasanth1
/drivers/ps2/*npcx* @MulinChao @ChiHuaL
/drivers/pwm/*b91* @andy-liu-telink
/drivers/pwm/*pca9685* @nixward
/drivers/pwm/*rpi_pico* @burumaj
/drivers/pwm/*rv32m1* @henrikbrixandersen
/drivers/pwm/*sam0* @nzmichaelh
/drivers/pwm/*test* @JordanYates
/drivers/pwm/*xlnx* @henrikbrixandersen
/drivers/pwm/pwm_capture.c @henrikbrixandersen
/drivers/pwm/pwm_shell.c @henrikbrixandersen
/drivers/pwm/*gecko* @sun681
/drivers/pwm/*it8xxx2* @RuibinChang
/drivers/pwm/*esp32* @LucasTambor
/drivers/pwm/*rcar* @aaillet
/drivers/pwm/*max31790* @benediktibk
/drivers/pwm/*kb1200* @ene-steven
/drivers/regulator/* @gmarull
/drivers/regulator/regulator_max20335.c @bbilas
/drivers/regulator/regulator_pca9420.c @danieldegrasse
/drivers/regulator/regulator_rpi_pico.c @soburi
/drivers/regulator/regulator_shell.c @danieldegrasse
/drivers/reset/reset_intel_socfpga.c @nbalabak
/drivers/reset/Kconfig.intel_socfpga @nbalabak
/drivers/sensor/ams_iAQcore/ @alexanderwachter
/drivers/sensor/ens210/ @alexanderwachter
/drivers/sensor/grow_r502a/ @DineshDK03
/drivers/sensor/hts*/ @avisconti
/drivers/sensor/ina23*/ @bbilas
/drivers/sensor/lis*/ @avisconti
/drivers/sensor/lps*/ @avisconti
/drivers/sensor/lsm*/ @avisconti
/drivers/sensor/mpr/ @sven-hm
/drivers/sensor/qdec_stm32/ @valeriosetti
/drivers/sensor/rpi_pico_temp/ @soburi
/drivers/sensor/st*/ @avisconti
/drivers/sensor/veaa_x_3/ @jeppenodgaard @MaureenHelm
/drivers/sensor/ene_tack_kb1200/ @ene-steven
/drivers/serial/*b91* @andy-liu-telink
/drivers/serial/uart_altera_jtag.c @nashif @gohshunjing
/drivers/serial/uart_altera.c @gohshunjing
/drivers/serial/*ns16550* @dcpleung @nashif @gdengi
/drivers/serial/*nrfx* @anangl
/drivers/serial/Kconfig.mcux_iuart @Mani-Sadhasivam
/drivers/serial/uart_mcux_iuart.c @Mani-Sadhasivam
/drivers/serial/Kconfig.rtt @carlescufi @pkral78
/drivers/serial/uart_rtt.c @carlescufi @pkral78
/drivers/serial/*rpi_pico* @yonsch
/drivers/serial/Kconfig.xlnx @wjliang
/drivers/serial/uart_xlnx_ps.c @wjliang
/drivers/serial/uart_xlnx_uartlite.c @henrikbrixandersen
/drivers/serial/*xmc4xxx* @parthitce
/drivers/serial/*numicro* @ssekar15
/drivers/serial/*apbuart* @julius-barendt
/drivers/serial/*rcar* @aaillet
/drivers/serial/Kconfig.xen @lorc @firscity
/drivers/serial/uart_hvc_xen.c @lorc @firscity
/drivers/serial/uart_hvc_xen_consoleio.c @lorc @firscity
/drivers/serial/Kconfig.it8xxx2 @GTLin08
/drivers/serial/uart_ite_it8xxx2.c @GTLin08
/drivers/serial/*intel_lw* @shilinte
/drivers/serial/*kb1200* @ene-steven
/drivers/disk/sdmmc_stm32.c @anthonybrandon
/drivers/ptp_clock/ @tbursztyka @jukkar
/drivers/spi/*b91* @andy-liu-telink
/drivers/spi/spi_rv32m1_lpspi* @karstenkoenig
/drivers/spi/*esp32* @sylvioalves
/drivers/spi/*pl022* @soburi
/drivers/sdhc/ @danieldegrasse
/drivers/sdhc/sdhc_cdns* @roymurlidhar @tanmaykathpalia
/drivers/timer/*arm_arch* @carlocaione
/drivers/timer/*cortex_m_systick* @anangl
/drivers/timer/*altera_avalon* @nashif
/drivers/timer/*riscv_machine* @kgugala @pgielda
/drivers/timer/*ite_it8xxx2* @ite
/drivers/timer/*xlnx_psttc* @wjliang @stephanosio
/drivers/timer/*cc13xx_cc26xx_rtc* @vanti
/drivers/timer/*cavs* @dcpleung
/drivers/timer/*leon_gptimer* @julius-barendt
/drivers/timer/*mips_cp0* @frantony
/drivers/timer/*rcar_cmt* @aaillet
/drivers/timer/*esp32_sys* @uLipe
/drivers/timer/*sam0_rtc* @bendiscz
/drivers/timer/*xtensa* @dcpleung
/drivers/timer/*rv32m1_lptmr* @mbolivar
/drivers/timer/*nrf_rtc* @anangl
/drivers/timer/*hpet* @dcpleung
/drivers/usb/device/usb_dc_stm32.c @ydamigos @loicpoulain
/drivers/i2c/*b91* @andy-liu-telink
/drivers/i2c/i2c_ll_stm32* @ydamigos
/drivers/i2c/i2c_rv32m1_lpi2c* @henrikbrixandersen
/drivers/i2c/*sam0* @Sizurka
/drivers/i2c/i2c_dw* @dcpleung
/drivers/i2c/*tca954x* @kurddt
/drivers/*/*xec* @franciscomunoz @albertofloyd @sjvasanth1
/drivers/watchdog/*gecko* @oanerer
/drivers/watchdog/*sifive* @katsuster
/drivers/watchdog/wdt_handlers.c @dcpleung @nashif
/drivers/watchdog/*cc32xx* @pavlohamov
/drivers/watchdog/wdt_ite_it8xxx2.c @RuibinChang
/drivers/watchdog/Kconfig.it8xxx2 @RuibinChang
/drivers/watchdog/wdt_counter.c @nordic-krch
/drivers/watchdog/*rpi_pico* @thedjnK
/drivers/watchdog/*dw* @softwarecki @pbalsundar
/drivers/watchdog/*ifx* @sreeramIfx
/drivers/watchdog/*kb1200* @ene-steven
/drivers/wifi/esp_at/ @mniestroj
/drivers/wifi/eswifi/ @loicpoulain @nandojve
/drivers/wifi/winc1500/ @kludentwo
/drivers/virtualization/ @tbursztyka
/dts/arm/acsip/ @NorthernDean
/dts/arm/aspeed/ @aspeeddylan
/dts/arm/atmel/ @galak @nandojve
/dts/arm/broadcom/ @sbranden
/dts/arm/cypress/ @ifyall @npal-cy
/dts/arm/ene/kb1200 @ene-steven
/dts/arm/gd/ @nandojve
/dts/arm/infineon/xmc4* @parthitce @ifyall @npal-cy
/dts/arm/infineon/psoc6/ @ifyall @npal-cy
/dts/arm64/armv8-r.dtsi @povergoing
/dts/arm64/intel/*intel_socfpga* @siclim
/dts/arm64/nxp/ @JiafeiPan
/dts/arm64/renesas/ @lorc @xakep-amatop
/dts/arm/quicklogic/ @fkokosinski @kgugala
/dts/arm/seeed_studio/ @str4t0m
/dts/arm/st/h7/*stm32h735* @benediktibk
/dts/arm/st/l4/*stm32l451* @benediktibk
/dts/arm/ti/cc13?2* @bwitherspoon
/dts/arm/ti/cc26?2* @bwitherspoon
/dts/arm/ti/cc3235* @vanti
/dts/arm/nordic/ @anangl @carlescufi
/dts/arm/nuvoton/ @ssekar15 @MulinChao @ChiHuaL
/dts/arm/nxp/ @mmahadevan108 @dleach02
/dts/arm/nxp/nxp_s32* @manuargue
/dts/arm/microchip/ @franciscomunoz @albertofloyd @sjvasanth1
/dts/arm/rpi_pico/ @yonsch
/dts/arm/silabs/efm32_pg_1b.dtsi @rdmeneze
/dts/arm/silabs/efm32gg11b* @oanerer
/dts/arm/silabs/efr32bg13p* @mnkp
/dts/arm/silabs/efr32bg22* @kgugala @fkokosinski @pczarnecki
/dts/arm/silabs/efr32xg13p* @mnkp
/dts/arm/silabs/efm32pg1b* @rdmeneze
/dts/arm/silabs/efr32mg21* @l-alfred
/dts/arm/silabs/efr32fg13* @yonsch
/dts/riscv/ite/ @ite
/dts/riscv/microchip/microchip-miv.dtsi @galak
/dts/riscv/openisa/rv32m1* @dleach02
/dts/riscv/starfive/ @rajnesh-kanwal @pfarwsi
/dts/riscv/andes/andes_v5* @cwshu @kevinwang821020 @jimmyzhe
/dts/riscv/niosv/ @sweeaun
/dts/arm/armv*m.dtsi @galak @ioannisg
/dts/arm/armv7-a.dtsi @ibirnbaum
/dts/arm/armv7-r.dtsi @bbolen @stephanosio
/dts/arm/xilinx/ @bbolen @stephanosio
/dts/arm/renesas/rcar/ @aaillet
/dts/xtensa/xtensa.dtsi @ydamigos
/dts/xtensa/intel/ @dcpleung
/dts/xtensa/espressif/ @sylvioalves
/dts/xtensa/nxp/ @iuliana-prodan @dbaluta
/dts/bindings/can/ @alexanderwachter @henrikbrixandersen
/dts/bindings/i2c/zephyr*i2c-emul*.yaml @sjg20
/dts/bindings/adc/st*stm32-adc.yaml @cybertale
/dts/bindings/adc/*ads114s08.yaml @benediktibk
/dts/bindings/adc/*max111* @benediktibk
/dts/bindings/modem/*hl7800.yaml @rerickson1
/dts/bindings/serial/ns16550.yaml @dcpleung @nashif
/dts/bindings/counter/snps,dw-timers.yaml @pbalsundar
/dts/bindings/wifi/*esp-at.yaml @mniestroj
/dts/bindings/*/*gd32* @nandojve
/dts/bindings/*/*sam* @nandojve
/dts/bindings/*/*npcx* @MulinChao @ChiHuaL
/dts/bindings/*/*psoc6* @ifyall @npal-cy
/dts/bindings/*/*infineon*cat1* @ifyall @npal-cy
/dts/bindings/*/nordic* @anangl
/dts/bindings/*/nxp* @mmahadevan108 @dleach02
/dts/bindings/*/nxp*s32* @manuargue
/dts/bindings/*/openisa* @dleach02
/dts/bindings/*/raspberrypi*pico* @yonsch
/dts/bindings/sensor/ams* @alexanderwachter
/dts/bindings/*/sifive* @mateusz-holenko @kgugala @pgielda
/dts/bindings/*/andes* @cwshu @kevinwang821020 @jimmyzhe
/dts/bindings/*/neorv32* @henrikbrixandersen
/dts/bindings/*/*lan91c111* @sgrrzhf
/dts/bindings/i3c/ @dcpleung
/dts/bindings/pm_cpu_ops/* @carlocaione
/dts/bindings/ethernet/*gem.yaml @ibirnbaum
/dts/bindings/auxdisplay/*pt6314.yaml @xingrz
/dts/bindings/auxdisplay/* @thedjnK
/dts/bindings/sensor/*bme680* @BoschSensortec
/dts/bindings/sensor/*ina23* @bbilas
/dts/bindings/sensor/st* @avisconti
/dts/bindings/sensor/zephyr,sensing.yaml @lixuzha @ghu0510 @qianruh
/dts/bindings/sensor/zephyr,sensing*.yaml @lixuzha @ghu0510 @qianruh
/dts/bindings/smbus/ @finikorg
/dts/bindings/sip_svc/ @maheshraotm
/dts/bindings/cpu/intel,niosv.yaml @sweeaun
/dts/bindings/reset/intel,socfpga-reset.yaml @nbalabak
/dts/bindings/gpio/*pcal64* @benediktibk
/dts/bindings/gpio/*bd8lb600fs* @benediktibk
/dts/bindings/gpio/*ads114s0x* @benediktibk
/dts/bindings/pwm/*max31790* @benediktibk
/dts/bindings/dac/*ad56* @benediktibk


@@ -215,6 +215,12 @@ config LINKER_SORT_BY_ALIGNMENT
in decreasing size of symbols. This helps to minimize
padding between symbols.
config SRAM_VECTOR_TABLE
bool "Place the vector table in SRAM instead of flash"
help
The option specifies that the vector table should be placed at the
start of SRAM instead of the start of flash.
config HAS_SRAM_OFFSET
bool
help
@@ -442,7 +448,6 @@ config CODING_GUIDELINE_CHECK
config NATIVE_LIBC
bool
select FULL_LIBC_SUPPORTED
select TC_PROVIDES_POSIX_C_LANG_SUPPORT_R
help
Zephyr will use the host system C library.
@@ -465,11 +470,9 @@ config NATIVE_APPLICATION
default y if ARCH_POSIX
depends on !NATIVE_LIBRARY
select NATIVE_BUILD
select DEPRECATED
help
Build as a native application that can run on the host and using
resources and libraries provided by the host. This option is deprecated
and will be removed in Zephyr v4.3
resources and libraries provided by the host.
config NATIVE_LIBRARY
bool
@@ -507,7 +510,7 @@ config SIZE_OPTIMIZATIONS
config SIZE_OPTIMIZATIONS_AGGRESSIVE
bool "Aggressively optimize for size"
help
Compiler optimizations will be set to -Oz independently of other
Compiler optimizations wil be set to -Oz independently of other
options.
config SPEED_OPTIMIZATIONS
@@ -545,14 +548,6 @@ config COMPILER_WARNINGS_AS_ERRORS
help
Turn on "warning as error" toolchain flags
config DEPRECATION_TEST
bool "Indicate test for deprecated feature"
help
This option is selected by tests which check functionality of
deprecated features. It ensures that COMPILER_WARNINGS_AS_ERRORS
is not selected as that would generate errors when the deprecated
features are used.
config COMPILER_SAVE_TEMPS
bool "Save temporary object files"
help
@@ -623,13 +618,6 @@ config MISRA_SANE
standard for safety reasons. Specifically variable length
arrays are not permitted (and gcc will enforce this).
config TOOLCHAIN_SUPPORTS_VLA_IN_STATEMENTS
bool
default y
help
Hidden symbol to state if the toolchain can handle vla in
statements.
endmenu
choice
@@ -697,13 +685,6 @@ config OUTPUT_DISASSEMBLY_WITH_SOURCE
the .lst file that can vary across platforms because
of reasons such as having ".." include paths.
config OUTPUT_DISASSEMBLY_NO_ALIASES
bool "Disassemble with raw instruction mnemonics"
depends on OUTPUT_DISASSEMBLY
help
The .lst file will contain raw instruction mnemonics instead of
pseudo instructions.
config OUTPUT_PRINT_MEMORY_USAGE
bool "Print memory usage to stdout"
default y
@@ -725,21 +706,8 @@ config CLEANUP_INTERMEDIATE_FILES
from the build process. Note this breaks incremental builds, west spdx
(Software Bill of Material generation), and maybe others.
config BUILD_GAP_FILL_PATTERN
hex "Gap fill pattern"
default 0xFF
help
Pattern used for gap filling of output files.
This value should be set to the value of a clean flash as this can
significantly reduce flash write times.
This setting only defines the gap fill pattern and doesn't enable gap
filling.
Note: binary files are always gap filled as they contain no address
information.
config BUILD_NO_GAP_FILL
bool "Don't fill gaps in generated hex/s19 files [DEPRECATED]."
select DEPRECATED
bool "Don't fill gaps in generated hex/bin/s19 files."
config BUILD_OUTPUT_HEX
bool "Build a binary in HEX format"
@@ -747,12 +715,6 @@ config BUILD_OUTPUT_HEX
Build an Intel HEX binary zephyr/zephyr.hex in the build directory.
The name of this file can be customized with CONFIG_KERNEL_BIN_NAME.
config BUILD_OUTPUT_HEX_GAP_FILL
bool "Fill gaps in hex files"
depends on !BUILD_NO_GAP_FILL
help
Fill gaps in hex based files.
config BUILD_OUTPUT_BIN
bool "Build a binary in BIN format"
default y
@@ -787,12 +749,6 @@ config BUILD_OUTPUT_S19
Build an S19 binary zephyr/zephyr.s19 in the build directory.
The name of this file can be customized with CONFIG_KERNEL_BIN_NAME.
config BUILD_OUTPUT_S19_GAP_FILL
bool "Fill gaps in s19 files"
depends on !BUILD_NO_GAP_FILL
help
Fill gaps in s19 based files.
config BUILD_OUTPUT_UF2
bool "Build a binary in UF2 format"
depends on BUILD_OUTPUT_BIN
@@ -800,12 +756,6 @@ config BUILD_OUTPUT_UF2
Build a UF2 binary zephyr/zephyr.uf2 in the build directory.
The name of this file can be customized with CONFIG_KERNEL_BIN_NAME.
config BUILD_OUTPUT_MOT
bool "Build a binary in MOT format"
help
Build a MOT binary zephyr/zephyr.mot in the build directory.
The name of this file can be customized with CONFIG_KERNEL_BIN_NAME.
if BUILD_OUTPUT_UF2
config BUILD_OUTPUT_UF2_FAMILY_ID
@@ -815,8 +765,7 @@ config BUILD_OUTPUT_UF2_FAMILY_ID
default "0xada52840" if SOC_NRF52840_QIAA
default "0x4fb2d5bd" if SOC_SERIES_IMXRT10XX || SOC_SERIES_IMXRT11XX
default "0x2abc77ec" if SOC_SERIES_LPC55XXX
default "0xe48bff56" if SOC_SERIES_RP2040
default "0xe48bff57" if SOC_SERIES_RP2350
default "0xe48bff56" if SOC_SERIES_RP2XXX
default "0x68ed2b88" if SOC_SERIES_SAMD21
default "0x55114460" if SOC_SERIES_SAMD51
default "0x647824b6" if SOC_SERIES_STM32F0X
@@ -1037,12 +986,6 @@ config WARN_EXPERIMENTAL
Print a warning when the Kconfig tree is parsed if any experimental
features are enabled.
config NOT_SECURE
bool
help
Symbol to be selected by a feature to indicate that the feature is
not secure.
config TAINT
bool
help
@@ -1078,6 +1021,32 @@ config IS_BOOTLOADER
This option indicates that Zephyr will act as a bootloader to execute
a separate Zephyr image payload.
config BOOTLOADER_SRAM_SIZE
int "SRAM reserved for bootloader [DEPRECATED]"
default 0
depends on !XIP || IS_BOOTLOADER
depends on ARM || XTENSA
help
This option specifies the amount of SRAM (measured in kB) reserved for
a bootloader image, when either:
- the Zephyr image itself is to act as the bootloader, or
- Zephyr is a !XIP image, which implicitly assumes existence of a
bootloader that loads the Zephyr !XIP image onto SRAM.
This option is deprecated, users should transition to using DTS to set this, if needed.
To be removed after Zephyr 3.7 release.
config BOOTLOADER_SRAM_SIZE_DEPRECATED
bool
default y
select DEPRECATED
depends on BOOTLOADER_SRAM_SIZE != 0
depends on !XIP || IS_BOOTLOADER
depends on ARM || XTENSA
help
Non-prompt symbol to indicate that the deprecated BOOTLOADER_SRAM_SIZE Kconfig has a
non-0 value. Please transition to using devicetree.
config BOOTLOADER_BOSSA
bool "BOSSA bootloader support"
select USE_DT_CODE_PARTITION

File diff suppressed because it is too large


@@ -24,7 +24,7 @@ resource-constrained systems: from simple embedded environmental sensors and
LED wearables to sophisticated smart watches and IoT wireless gateways.
The Zephyr kernel supports multiple architectures, including ARM (Cortex-A,
Cortex-R, Cortex-M), Intel x86, ARC, Tensilica Xtensa, and RISC-V,
Cortex-R, Cortex-M), Intel x86, ARC, Nios II, Tensilica Xtensa, and RISC-V,
SPARC, MIPS, and a large number of `supported boards`_.
.. below included in doc/introduction/introduction.rst


@@ -1 +1 @@
0.17.2
0.17.0


@@ -1,5 +1,5 @@
VERSION_MAJOR = 4
VERSION_MINOR = 2
VERSION_MINOR = 0
PATCHLEVEL = 0
VERSION_TWEAK = 0
EXTRAVERSION =


@@ -8,7 +8,7 @@
# Include these first so that any properties (e.g. defaults) below can be
# overridden (by defining symbols in multiple locations)
source "$(KCONFIG_BINARY_DIR)/arch/Kconfig"
source "$(ARCH_DIR)/Kconfig.$(HWM_SCHEME)"
# ToDo: Generate a Kconfig.arch for loading of additional arch in HWMv2.
osource "$(KCONFIG_BINARY_DIR)/Kconfig.arch"
@@ -33,7 +33,6 @@ config ARM
select ARCH_IS_SET
select ARCH_SUPPORTS_COREDUMP if CPU_CORTEX_M
select ARCH_SUPPORTS_COREDUMP_THREADS if CPU_CORTEX_M
select ARCH_SUPPORTS_COREDUMP_STACK_PTR if CPU_CORTEX_M
# FIXME: current state of the code for all ARM requires this, but
# is really only necessary for Cortex-M with ARM MPU!
select GEN_PRIV_STACKS
@@ -51,12 +50,11 @@ config ARM64
select ARCH_HAS_THREAD_LOCAL_STORAGE
select USE_SWITCH
select USE_SWITCH_SUPPORTED
select IRQ_OFFLOAD_NESTED if IRQ_OFFLOAD
select BARRIER_OPERATIONS_ARCH
select ARCH_HAS_DIRECTED_IPIS
select ARCH_HAS_DEMAND_PAGING
select ARCH_HAS_DEMAND_MAPPING
select ARCH_SUPPORTS_EVICTION_TRACKING
select EVICTION_TRACKING if DEMAND_PAGING
help
ARM64 (AArch64) architecture
@@ -95,6 +93,7 @@ config X86
select ARCH_HAS_THREAD_LOCAL_STORAGE
select ARCH_HAS_DEMAND_PAGING if !X86_64
select ARCH_HAS_DEMAND_MAPPING if ARCH_HAS_DEMAND_PAGING
select IRQ_OFFLOAD_NESTED if IRQ_OFFLOAD
select NEED_LIBC_MEM_PARTITION if USERSPACE && TIMING_FUNCTIONS \
&& !BOARD_HAS_TIMING_FUNCTIONS \
&& !SOC_HAS_TIMING_FUNCTIONS
@@ -104,17 +103,25 @@ config X86
help
x86 architecture
config NIOS2
bool
select ARCH_IS_SET
select ATOMIC_OPERATIONS_C
imply XIP
select ARCH_HAS_TIMING_FUNCTIONS
help
Nios II Gen 2 architecture
config RISCV
bool
select ARCH_IS_SET
select ATOMIC_OPERATIONS_C if !RISCV_ISA_EXT_A
select ATOMIC_OPERATIONS_BUILTIN if RISCV_ISA_EXT_A
select ARCH_SUPPORTS_COREDUMP
select ARCH_SUPPORTS_COREDUMP_PRIV_STACKS
select ARCH_SUPPORTS_ROM_START if !SOC_FAMILY_ESPRESSIF_ESP32
select ARCH_SUPPORTS_EMPTY_IRQ_SPURIOUS
select ARCH_HAS_CODE_DATA_RELOCATION
select ARCH_HAS_THREAD_LOCAL_STORAGE
select IRQ_OFFLOAD_NESTED if IRQ_OFFLOAD
select USE_SWITCH_SUPPORTED
select USE_SWITCH
select SCHED_IPI_SUPPORTED if SMP
@@ -129,6 +136,7 @@ config XTENSA
select ARCH_IS_SET
select USE_SWITCH
select USE_SWITCH_SUPPORTED
select IRQ_OFFLOAD_NESTED if IRQ_OFFLOAD
select ARCH_HAS_CODE_DATA_RELOCATION
select ARCH_HAS_TIMING_FUNCTIONS
select ARCH_MEM_DOMAIN_DATA if USERSPACE
@@ -160,15 +168,6 @@ config ARCH_POSIX
help
POSIX (native) architecture
config RX
bool
select ARCH_IS_SET
select ATOMIC_OPERATIONS_C
select USE_SWITCH
select USE_SWITCH_SUPPORTED
help
Renesas RX architecture
config ARCH_IS_SET
bool
help
@@ -230,7 +229,7 @@ config SRAM_BASE_ADDRESS
/chosen/zephyr,sram in devicetree. The user should generally avoid
changing it via menuconfig or in configuration files.
if ARC || ARM || ARM64 || X86 || RISCV || RX
if ARC || ARM || ARM64 || NIOS2 || X86 || RISCV
# Workaround for not being able to have commas in macro arguments
DT_CHOSEN_Z_FLASH := zephyr,flash
@@ -253,7 +252,7 @@ config FLASH_BASE_ADDRESS
normally set by the board's defconfig file and the user should generally
avoid modifying it via the menu configuration.
endif # ARM || ARM64 || ARC || X86 || RISCV || RX
endif # ARM || ARM64 || ARC || NIOS2 || X86 || RISCV
if ARCH_HAS_TRUSTED_EXECUTION
@@ -320,7 +319,7 @@ config PRIVILEGED_STACK_SIZE
int "Size of privileged stack"
default 2048 if EMUL
default 1024
depends on USERSPACE
depends on ARCH_HAS_USERSPACE
help
This option sets the privileged stack region size that will be used
in addition to the user mode thread stack. During normal execution,
@@ -330,19 +329,18 @@ config PRIVILEGED_STACK_SIZE
config KOBJECT_TEXT_AREA
int "Size of kobject text area"
default 1024 if UBSAN
default 512 if COVERAGE_GCOV
default 512 if NO_OPTIMIZATIONS
default 512 if STACK_CANARIES && RISCV
default 256
depends on USERSPACE
depends on ARCH_HAS_USERSPACE
help
Size of kernel object text area. Used in linker script.
config KOBJECT_DATA_AREA_RESERVE_EXTRA_PERCENT
int "Reserve extra kobject data area (in percentage)"
default 100
depends on USERSPACE
depends on ARCH_HAS_USERSPACE
help
Multiplication factor used to calculate the size of placeholder to
reserve space for kobject metadata hash table. The hash table is
@@ -356,7 +354,7 @@ config KOBJECT_DATA_AREA_RESERVE_EXTRA_PERCENT
config KOBJECT_RODATA_AREA_EXTRA_BYTES
int "Reserve extra bytes for kobject rodata area"
default 16
depends on USERSPACE
depends on ARCH_HAS_USERSPACE
help
Reserve a few more bytes for the RODATA region for kobject metadata.
This is to account for the uncertainty of tables generated by gperf.
@@ -442,25 +440,20 @@ config ARCH_STACKWALK_MAX_FRAMES
menu "Interrupt Configuration"
config TOOLCHAIN_SUPPORTS_ISR_TABLES_LOCAL_DECLARATION
bool
help
Hidden option to signal that toolchain supports local declaration of
interrupt tables.
config ISR_TABLES_LOCAL_DECLARATION_SUPPORTED
bool
default y
# Userspace is currently not supported
depends on !USERSPACE
# List of currently supported architectures
depends on ARM || ARM64 || RISCV
depends on ARM || ARM64
# List of currently supported toolchains
depends on "$(ZEPHYR_TOOLCHAIN_VARIANT)" = "zephyr" || "$(ZEPHYR_TOOLCHAIN_VARIANT)" = "gnuarmemb" || "$(ZEPHYR_TOOLCHAIN_VARIANT)" = "llvm" || TOOLCHAIN_SUPPORTS_ISR_TABLES_LOCAL_DECLARATION
depends on "$(ZEPHYR_TOOLCHAIN_VARIANT)" = "zephyr" || "$(ZEPHYR_TOOLCHAIN_VARIANT)" = "gnuarmemb"
config ISR_TABLES_LOCAL_DECLARATION
bool "ISR tables created locally and placed by linker"
bool "ISR tables created locally and placed by linker [EXPERIMENTAL]"
depends on ISR_TABLES_LOCAL_DECLARATION_SUPPORTED
select EXPERIMENTAL
help
Enable the new scheme of interrupt table generation.
This is a totally different generator that creates table entries locally
@@ -523,12 +516,6 @@ config ARCH_IRQ_VECTOR_TABLE_ALIGN
to be aligned to architecture specific size. The default
size is 0 for no alignment.
config ARCH_DEVICE_STATE_ALIGN
int "Alignment size of device state"
default 4
help
This option controls alignment size of device state.
choice IRQ_VECTOR_TABLE_TYPE
prompt "IRQ vector table type"
depends on GEN_IRQ_VECTOR_TABLE
@@ -589,24 +576,14 @@ config IRQ_OFFLOAD
run in interrupt context. Only useful for test cases that need
to validate the correctness of kernel objects in IRQ context.
config SRAM_VECTOR_TABLE
bool "Place the vector table in SRAM instead of flash"
depends on ARCH_HAS_VECTOR_TABLE_RELOCATION
depends on XIP
depends on !ROMSTART_RELOCATION_ROM
help
When XiP is enabled, this option will result in the vector table being
relocated from Flash to SRAM.
config IRQ_OFFLOAD_NESTED
bool "irq_offload() supports nested IRQs"
depends on IRQ_OFFLOAD
default y if ARM64 || X86 || RISCV || XTENSA
help
When set by the platform layers, indicates that
irq_offload() may legally be called in interrupt context to
cause a synchronous nested interrupt on the current CPU.
Not all hardware is capable.
When set by the arch layer, indicates that irq_offload() may
legally be called in interrupt context to cause a
synchronous nested interrupt on the current CPU. Not all
hardware is capable.
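To illustrate what this help text describes, a hedged sketch of typical irq_offload() usage follows (it assumes the irq_offload() API from <zephyr/irq_offload.h>; the handler name and message are made up):

#include <zephyr/kernel.h>
#include <zephyr/irq_offload.h>
#include <zephyr/sys/printk.h>

/* Runs synchronously in interrupt context on the current CPU. */
static void offload_handler(const void *param)
{
	printk("in ISR context: %d, param=%p\n", k_is_in_isr(), param);
}

int main(void)
{
	/*
	 * Calling this from thread context always works. With
	 * IRQ_OFFLOAD_NESTED=y the same call is also legal from inside
	 * another ISR and produces a synchronous nested interrupt.
	 */
	irq_offload(offload_handler, NULL);
	return 0;
}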
config EXCEPTION_DEBUG
bool "Unhandled exception debugging"
@@ -696,9 +673,6 @@ config ARCH_HAS_NOCACHE_MEMORY_SUPPORT
config ARCH_HAS_RAMFUNC_SUPPORT
bool
config ARCH_HAS_VECTOR_TABLE_RELOCATION
bool
config ARCH_HAS_NESTED_EXCEPTION_DETECTION
bool
@@ -711,9 +685,6 @@ config ARCH_SUPPORTS_COREDUMP_THREADS
config ARCH_SUPPORTS_COREDUMP_PRIV_STACKS
bool
config ARCH_SUPPORTS_COREDUMP_STACK_PTR
bool
config ARCH_SUPPORTS_ARCH_HW_INIT
bool
@@ -723,12 +694,6 @@ config ARCH_SUPPORTS_ROM_START
config ARCH_SUPPORTS_EMPTY_IRQ_SPURIOUS
bool
config ARCH_SUPPORTS_EVICTION_TRACKING
bool
help
Architecture code supports page tracking for eviction algorithms
when demand paging is enabled.
config ARCH_HAS_EXTRA_EXCEPTION_INFO
bool
@@ -1165,9 +1130,9 @@ config ARCH_HAS_CUSTOM_CPU_ATOMIC_IDLE
config ARCH_HAS_CUSTOM_SWAP_TO_MAIN
bool
help
It's possible that an architecture port cannot use z_swap_unlocked()
to swap to the main thread (bg_thread_main), but instead must do
something custom. It must enable this option in that case.
It's possible that an architecture port cannot use _Swap() to swap to
the _main() thread, but instead must do something custom. It must
enable this option in that case.
config ARCH_HAS_CUSTOM_BUSY_WAIT
bool
@@ -1175,9 +1140,3 @@ config ARCH_HAS_CUSTOM_BUSY_WAIT
It's possible that an architecture port cannot or does not want to use
the provided k_busy_wait(), but instead must do something custom. It must
enable this option in that case.
config ARCH_HAS_CUSTOM_CURRENT_IMPL
bool
help
Select when architecture implements arch_current_thread() &
arch_current_thread_set().

arch/Kconfig.v1 Normal file

@@ -0,0 +1,5 @@
# Copyright (c) 2023 Nordic Semiconductor ASA
# SPDX-License-Identifier: Apache-2.0
# Note: $ARCH might be a glob pattern
source "$(ARCH_DIR)/$(ARCH)/Kconfig"

arch/Kconfig.v2 Normal file

@@ -0,0 +1,5 @@
# Copyright (c) 2023 Nordic Semiconductor ASA
# SPDX-License-Identifier: Apache-2.0
source "$(KCONFIG_BINARY_DIR)/arch/Kconfig"


@@ -258,11 +258,11 @@ config ARC_CURRENT_THREAD_USE_NO_TLS
select CURRENT_THREAD_USE_NO_TLS
default y if (RGF_NUM_BANKS > 1) || ("$(ZEPHYR_TOOLCHAIN_VARIANT)" = "arcmwdt")
help
Disable current Thread Local Storage for ARC. For cores with more than one
RGF_NUM_BANKS the parameter is disabled by-default because banks synchronization
Disable current Thread Local Storage for ARC. For cores with more then one
RGF_NUM_BANKS the parameter is disabled by-default because banks syncronization
requires significant time, and it slows down performance.
ARCMWDT works with TLS pointer in different way then GCC. Optimized access to
TLS pointer via the _current symbol does not provide significant advantages
ARCMWDT works with tls pointer in different way then GCC. Optimized access to
TLS pointer via _current variable does not provide significant advantages
in case of MetaWare.
config GEN_ISR_TABLES
@@ -274,7 +274,7 @@ config GEN_IRQ_START_VECTOR
config HARVARD
bool "Harvard Architecture"
help
The ARC CPU can be configured to have two buses;
The ARC CPU can be configured to have two busses;
one for instruction fetching and another that serves as a data bus.
config CODE_DENSITY


@@ -8,6 +8,14 @@
__weak void *__dso_handle;
int __cxa_atexit(void (*destructor)(void *), void *objptr, void *dso)
{
ARG_UNUSED(destructor);
ARG_UNUSED(objptr);
ARG_UNUSED(dso);
return 0;
}
int atexit(void (*function)(void))
{
return 0;


@@ -34,5 +34,3 @@ add_subdirectory_ifdef(CONFIG_ARC_CORE_MPU mpu)
add_subdirectory_ifdef(CONFIG_ARC_SECURE_FIRMWARE secureshield)
zephyr_linker_sources(ROM_START SORT_KEY 0x0vectors vector_table.ld)
zephyr_library_sources_ifdef(CONFIG_LLEXT elf.c)


@@ -39,7 +39,7 @@ config ARC_XY_ENABLE
bool "ARC address generation unit registers"
help
Processors with XY memory and AGU registers can configure this
option to accelerate DSP instructions.
option to accelerate DSP instrctions.
config ARC_AGU_SHARING
bool "ARC address generation unit register sharing"


@@ -1,101 +0,0 @@
/*
* Copyright (c) 2024 Intel Corporation
*
* SPDX-License-Identifier: Apache-2.0
*/
#include <zephyr/llext/elf.h>
#include <zephyr/llext/llext.h>
#include <zephyr/llext/llext_internal.h>
#include <zephyr/llext/loader.h>
#include <zephyr/logging/log.h>
#include <zephyr/sys/util.h>
LOG_MODULE_REGISTER(elf, CONFIG_LLEXT_LOG_LEVEL);
#define R_ARC_32 4
#define R_ARC_B26 5 /* AKA R_ARC_64 */
#define R_ARC_S25W_PCREL 17
#define R_ARC_32_ME 27
/* ARCompact insns packed in memory have Middle Endian encoding */
#define ME(x) (((x & 0xffff0000) >> 16) | ((x & 0xffff) << 16))
/**
* @brief Architecture specific function for relocating shared elf
*
* ELF files contain a series of relocations described in multiple sections.
* These relocation instructions are architecture specific, and each
* architecture that supports loadable modules must implement this.
*
* The relocation codes are well documented:
* https://github.com/foss-for-synopsys-dwc-arc-processors/arc-ABI-manual/blob/master/ARCv2_ABI.pdf
* https://github.com/zephyrproject-rtos/binutils-gdb
*/
int arch_elf_relocate(struct llext_loader *ldr, struct llext *ext, elf_rela_t *rel,
const elf_shdr_t *shdr)
{
int ret = 0;
uint32_t value;
const uintptr_t loc = llext_get_reloc_instruction_location(ldr, ext, shdr->sh_info, rel);
uint32_t insn = UNALIGNED_GET((uint32_t *)loc);
elf_sym_t sym;
uintptr_t sym_base_addr;
const char *sym_name;
ret = llext_read_symbol(ldr, ext, rel, &sym);
if (ret != 0) {
LOG_ERR("Could not read symbol from binary!");
return ret;
}
sym_name = llext_symbol_name(ldr, ext, &sym);
ret = llext_lookup_symbol(ldr, ext, &sym_base_addr, rel, &sym, sym_name, shdr);
if (ret != 0) {
LOG_ERR("Could not find symbol %s!", sym_name);
return ret;
}
sym_base_addr += rel->r_addend;
int reloc_type = ELF32_R_TYPE(rel->r_info);
switch (reloc_type) {
case R_ARC_32:
case R_ARC_B26:
UNALIGNED_PUT(sym_base_addr, (uint32_t *)loc);
break;
case R_ARC_S25W_PCREL:
/* ((S + A) - P) >> 2
* S = symbol address
* A = addend
* P = relative offset to PCL
*/
value = (sym_base_addr + rel->r_addend - (loc & ~0x3)) >> 2;
insn = ME(insn);
/* disp25w */
insn = insn & ~0x7fcffcf;
insn |= ((value >> 0) & 0x01ff) << 18;
insn |= ((value >> 9) & 0x03ff) << 6;
insn |= ((value >> 19) & 0x000f) << 0;
insn = ME(insn);
UNALIGNED_PUT(insn, (uint32_t *)loc);
break;
case R_ARC_32_ME:
UNALIGNED_PUT(ME(sym_base_addr), (uint32_t *)loc);
break;
default:
LOG_ERR("unknown relocation: %u\n", reloc_type);
ret = -ENOEXEC;
break;
}
return ret;
}
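For context on the ME() macro used in the file above: ARCompact instructions are stored middle-endian, and ME() simply swaps the two 16-bit halves of a 32-bit word, so applying it twice restores the original value. A small standalone check, purely illustrative:

/* Illustrative only: demonstrates the ME() half-word swap used for
 * ARCompact middle-endian instruction encoding in arch_elf_relocate().
 */
#include <stdint.h>
#include <stdio.h>

#define ME(x) (((x & 0xffff0000) >> 16) | ((x & 0xffff) << 16))

int main(void)
{
	uint32_t insn = 0x12345678;

	printf("ME(insn)     = 0x%08x\n", (unsigned int)ME(insn));     /* 0x56781234 */
	printf("ME(ME(insn)) = 0x%08x\n", (unsigned int)ME(ME(insn))); /* 0x12345678 */
	return 0;
}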

View File

@@ -346,7 +346,7 @@ static void dump_exception_info(uint32_t vector, uint32_t cause, uint32_t parame
* invokes the user provided routine k_sys_fatal_error_handler() which is
* responsible for implementing the error handling policy.
*/
void z_arc_fault(struct arch_esf *esf, uint32_t old_sp)
void _Fault(struct arch_esf *esf, uint32_t old_sp)
{
uint32_t vector, cause, parameter;
uint32_t exc_addr = z_arc_v2_aux_reg_read(_ARC_V2_EFA);

View File

@@ -19,7 +19,7 @@
#include <zephyr/syscall.h>
#include <zephyr/arch/arc/asm-compat/assembler.h>
GTEXT(z_arc_fault)
GTEXT(_Fault)
GTEXT(__reset)
GTEXT(__memory_error)
GTEXT(__instruction_error)
@@ -99,11 +99,11 @@ _exc_entry:
_save_exc_regs_into_stack
/* sp is parameter of z_arc_fault */
/* sp is parameter of _Fault */
MOVR r0, sp
/* ilink is the thread's original sp */
MOVR r1, ilink
jl z_arc_fault
jl _Fault
_exc_return:
/* the exception cause must be fixed in exception handler when exception returns

View File

@@ -26,6 +26,49 @@
#include <zephyr/platform/hooks.h>
#include <zephyr/arch/cache.h>
/* XXX - keep for future use in full-featured cache APIs */
#if 0
/**
* @brief Disable the i-cache if present
*
* For those ARC CPUs that have a i-cache present,
* invalidate the i-cache and then disable it.
*/
static void disable_icache(void)
{
unsigned int val;
val = z_arc_v2_aux_reg_read(_ARC_V2_I_CACHE_BUILD);
val &= 0xff; /* version field */
if (val == 0) {
return; /* skip if i-cache is not present */
}
z_arc_v2_aux_reg_write(_ARC_V2_IC_IVIC, 0);
__builtin_arc_nop();
z_arc_v2_aux_reg_write(_ARC_V2_IC_CTRL, 1);
}
/**
* @brief Invalidate the data cache if present
*
* For those ARC CPUs that have a data cache present,
* invalidate the data cache.
*/
static void invalidate_dcache(void)
{
unsigned int val;
val = z_arc_v2_aux_reg_read(_ARC_V2_D_CACHE_BUILD);
val &= 0xff; /* version field */
if (val == 0) {
return; /* skip if d-cache is not present */
}
z_arc_v2_aux_reg_write(_ARC_V2_DC_IVDC, 1);
}
#endif
#ifdef CONFIG_ISA_ARCV3
/* NOTE: it will be called from early C code - we must NOT use global / static variables in it! */
static void arc_cluster_scm_enable(void)
@@ -49,8 +92,8 @@ static void arc_cluster_scm_enable(void)
/* Invalidate SCM before enabling. */
arc_cln_write_reg_nolock(ARC_CLN_CACHE_CMD,
ARC_CLN_CACHE_CMD_OP_REG_INV | ARC_CLN_CACHE_CMD_INCR);
while (arc_cln_read_reg_nolock(ARC_CLN_CACHE_STATUS) & ARC_CLN_CACHE_STATUS_BUSY) {
}
while (arc_cln_read_reg_nolock(ARC_CLN_CACHE_STATUS) & ARC_CLN_CACHE_STATUS_BUSY)
;
arc_cln_write_reg_nolock(ARC_CLN_CACHE_STATUS, ARC_CLN_CACHE_STATUS_EN);
}

View File

@@ -163,20 +163,20 @@ hw_pf_setup_done:
#if defined(CONFIG_SMP) || CONFIG_MP_MAX_NUM_CPUS > 1
_get_cpu_id r0
breq r0, 0, _primary_core_startup
breq r0, 0, _master_core_startup
/*
* Non-primary cores wait for primary core (core 0) to boot enough
* Non-masters wait for master core (core 0) to boot enough
*/
_secondary_core_wait:
_slave_core_wait:
#if CONFIG_MP_MAX_NUM_CPUS == 1
kflag 1
#endif
ld r1, [arc_cpu_wake_flag]
brne r0, r1, _secondary_core_wait
brne r0, r1, _slave_core_wait
LDR sp, arc_cpu_sp
/* signal primary core that secondary core runs */
/* signal master core that slave core runs */
st 0, [arc_cpu_wake_flag]
#if defined(CONFIG_ARC_FIRQ_STACK)
@@ -186,7 +186,7 @@ _secondary_core_wait:
#endif
j arch_secondary_cpu_init
_primary_core_startup:
_master_core_startup:
#endif
#ifdef CONFIG_INIT_STACKS

View File

@@ -16,7 +16,6 @@
#include <ipi.h>
#include <zephyr/init.h>
#include <zephyr/irq.h>
#include <zephyr/platform/hooks.h>
#include <arc_irq_offload.h>
volatile struct {
@@ -25,10 +24,10 @@ volatile struct {
} arc_cpu_init[CONFIG_MP_MAX_NUM_CPUS];
/*
* arc_cpu_wake_flag is used to sync up primary core and secondary cores
* Secondary core will spin for arc_cpu_wake_flag until primary core sets
* it to the core id of secondary core. Then, secondary core clears it to notify
* primary core that it's waken
* arc_cpu_wake_flag is used to sync up master core and slave cores
* Slave core will spin for arc_cpu_wake_flag until master core sets
* it to the core id of slave core. Then, slave core clears it to notify
* master core that it's waken
*
*/
volatile uint32_t arc_cpu_wake_flag;
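Read as a sketch, the handshake this comment describes is roughly the following. This is illustrative C only: the waiting-core half really lives in the reset.S assembly shown earlier in this diff, and the helper names here are hypothetical.

/* Hedged sketch of the arc_cpu_wake_flag handshake; variable names mirror
 * this file, but the functions are invented for illustration.
 */
#include <stdint.h>

extern volatile uint32_t arc_cpu_wake_flag;
extern volatile char *arc_cpu_sp;

static void primary_release_core(uint32_t cpu_num, char *stack_top)
{
	arc_cpu_sp = stack_top;        /* publish the target core's initial sp */
	arc_cpu_wake_flag = cpu_num;   /* release exactly that core */
	while (arc_cpu_wake_flag != 0U) {
		/* wait until the released core acknowledges */
	}
}

static void secondary_wait_for_release(uint32_t my_cpu_id)
{
	while (arc_cpu_wake_flag != my_cpu_id) {
		/* spin until the primary core targets this core */
	}
	/* ...load sp from arc_cpu_sp (done in reset.S assembly)... */
	arc_cpu_wake_flag = 0;         /* acknowledge: this core is running */
}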
@@ -50,13 +49,13 @@ void arch_cpu_start(int cpu_num, k_thread_stack_t *stack, int sz,
/* set the initial sp of target sp through arc_cpu_sp
* arc_cpu_wake_flag will protect arc_cpu_sp that
* only one secondary cpu can read it per time
* only one slave cpu can read it per time
*/
arc_cpu_sp = K_KERNEL_STACK_BUFFER(stack) + sz;
arc_cpu_wake_flag = cpu_num;
/* wait secondary cpu to start */
/* wait slave cpu to start */
while (arc_cpu_wake_flag != 0U) {
;
}
@@ -90,7 +89,7 @@ static void arc_connect_debug_mask_update(int cpu_num)
void arc_core_private_intc_init(void);
/* the C entry of secondary cores */
/* the C entry of slave cores */
void arch_secondary_cpu_init(int cpu_num)
{
arch_cpustart_t fn;
@@ -116,11 +115,6 @@ void arch_secondary_cpu_init(int cpu_num)
DT_IRQ(DT_NODELABEL(ici), priority), 0);
irq_enable(DT_IRQN(DT_NODELABEL(ici)));
#endif
#ifdef CONFIG_SOC_PER_CORE_INIT_HOOK
soc_per_core_init_hook();
#endif /* CONFIG_SOC_PER_CORE_INIT_HOOK */
/* call the function set by arch_cpu_start */
fn = arc_cpu_init[cpu_num].fn;
@@ -162,7 +156,7 @@ int arch_smp_init(void)
{
struct arc_connect_bcr bcr;
/* necessary primary core init */
/* necessary master core init */
_curr_cpu[0] = &(_kernel.cpus[0]);
bcr.val = z_arc_v2_aux_reg_read(_ARC_V2_CONNECT_BCR);
@@ -173,7 +167,7 @@ int arch_smp_init(void)
}
if (bcr.ipi) {
/* register ici interrupt, just need primary core to register once */
/* register ici interrupt, just need master core to register once */
z_arc_connect_ici_clear();
IRQ_CONNECT(DT_IRQN(DT_NODELABEL(ici)),
DT_IRQ(DT_NODELABEL(ici), priority),

View File

@@ -47,7 +47,7 @@ struct vector_table {
uintptr_t unused_2;
};
const struct vector_table _VectorTable Z_GENERIC_SECTION(.exc_vector_table) = {
struct vector_table _VectorTable Z_GENERIC_SECTION(.exc_vector_table) = {
(uintptr_t)__reset,
(uintptr_t)__memory_error,
(uintptr_t)__instruction_error,

View File

@@ -26,8 +26,6 @@
#include <v2/irq.h>
#include <zephyr/platform/hooks.h>
#ifdef __cplusplus
extern "C" {
#endif
@@ -35,10 +33,6 @@ extern "C" {
static ALWAYS_INLINE void arch_kernel_init(void)
{
z_irq_setup();
#ifdef CONFIG_SOC_PER_CORE_INIT_HOOK
soc_per_core_init_hook();
#endif /* CONFIG_SOC_PER_CORE_INIT_HOOK */
}

View File

@@ -1,31 +1,21 @@
archs:
- name: arc
path: arc
full_name: Synopsys DesignWare ARC
- name: arm
path: arm
full_name: ARM
- name: arm64
path: arm64
full_name: ARM 64
- name: mips
path: mips
full_name: MIPS
- name: nios2
path: nios2
- name: posix
path: posix
full_name: POSIX
- name: riscv
path: riscv
full_name: RISC-V
- name: sparc
path: sparc
full_name: SPARC
- name: xtensa
path: xtensa
full_name: Xtensa
- name: x86
path: x86
full_name: x86
- name: rx
path: rx
full_name: Renesas RX

View File

@@ -49,7 +49,7 @@ config ROMSTART_RELOCATION_ROM
Most SOCs include an alias for the boot-vector at address 0x00000000
so a default which might be supported by the corresponding Linux rproc driver.
If it is not, additional options allow to specify the addresses.
If it is not, additionnal options allows to specify the addresses.
In general this option should be chosen if the zephyr,flash chosen node
is not placed into the boot-vector memory area.
@@ -155,12 +155,12 @@ config CPU_HAS_ARM_MPU
This option is enabled when the CPU has a Memory Protection Unit (MPU)
in ARM flavor.
config CPU_HAS_NXP_SYSMPU
config CPU_HAS_NXP_MPU
bool
select CPU_HAS_MPU
help
This option is enabled when the CPU has an NXP System Memory Protection
Unit (SYSMPU).
This option is enabled when the CPU has a Memory Protection Unit (MPU)
in NXP flavor.
config CPU_HAS_CUSTOM_FIXED_SOC_MPU_REGIONS
bool "Custom fixed SoC MPU region definition"

View File

@@ -16,7 +16,6 @@ config CPU_CORTEX_M
select ARCH_HAS_USERSPACE if ARM_MPU
select ARCH_HAS_NOCACHE_MEMORY_SUPPORT if ARM_MPU && CPU_HAS_ARM_MPU && CPU_HAS_DCACHE
select ARCH_HAS_RAMFUNC_SUPPORT
select ARCH_HAS_VECTOR_TABLE_RELOCATION if CPU_CORTEX_M_HAS_VTOR
select ARCH_HAS_NESTED_EXCEPTION_DETECTION
select SWAP_NONATOMIC
select ARCH_HAS_EXTRA_EXCEPTION_INFO

View File

@@ -109,17 +109,6 @@ config VFP_U_DP_D16_FP16_FMAC
fused multiply-accumulate) and floating-point exception trapping with 16
double-word registers.
config VFP_DP_D32
bool
select CPU_HAS_VFP
select VFP_FEATURE_SINGLE_PRECISION
select VFP_FEATURE_DOUBLE_PRECISION
select VFP_FEATURE_REGS_S64_D32
help
This option signifies the use of a VFP floating-point coprocessor
that supports single- and double-precision operations
with 32 double-word registers.
config VFP_DP_D32_FP16_FMAC
bool
select CPU_HAS_VFP

View File

@@ -24,4 +24,4 @@ zephyr_library_sources_ifdef(CONFIG_SEMIHOST semihost.c)
zephyr_library_sources_ifdef(CONFIG_THREAD_LOCAL_STORAGE __aeabi_read_tp.S)
zephyr_library_sources_ifdef(CONFIG_ARCH_CACHE cache.c)
zephyr_library_sources_ifdef(CONFIG_USE_SWITCH switch.S)
zephyr_library_sources_ifndef(CONFIG_USE_SWITCH swap_helper.S exc_exit.S)
zephyr_library_sources_ifndef(CONFIG_USE_SWITCH swap.c swap_helper.S exc_exit.S)

View File

@@ -13,15 +13,6 @@
# CPU_AARCH32_CORTEX_A / if CPU_AARCH32_CORTEX_R blocks so they are not
# exposed if one selects a different ARM Cortex Family (Cortex-M).
config CPU_CORTEX_A7
bool
select CPU_AARCH32_CORTEX_A
select ARMV7_A
select CPU_HAS_ICACHE
select CPU_HAS_DCACHE
help
This option signifies the use of a Cortex-A7 CPU.
config CPU_CORTEX_A9
bool
select CPU_AARCH32_CORTEX_A
@@ -91,8 +82,6 @@ config CPU_CORTEX_R5
select CPU_AARCH32_CORTEX_R
select ARMV7_R
select ARMV7_R_FP if CPU_HAS_FPU
select CPU_HAS_ICACHE
select CPU_HAS_DCACHE
help
This option signifies the use of a Cortex-R5 CPU
@@ -122,32 +111,6 @@ config CPU_CORTEX_R52
help
This option signifies the use of a Cortex-R52 CPU
config CPU_CORTEX_R52_CACHE_SEGREGATION
bool "Control segregation of L1 I/D-Cache ways between Flash and AXIM"
depends on CPU_CORTEX_R52
help
Control segregation of L1 I/D-Cache ways between Flash and AXIM.
Updates to the cache segregation controls are only permitted before the caches
have ever been enabled, following a system reset, otherwise the update is ignored.
config CPU_CORTEX_R52_ICACHE_FLASH_WAY
int "L1 I-Cache Flash way"
depends on CPU_CORTEX_R52_CACHE_SEGREGATION
range 0 4
default 0
help
Configure L1 I-Cache ways for Flash interface. Default is reset value, all
I-Cache ways are allocated for AXIM interface.
config CPU_CORTEX_R52_DCACHE_FLASH_WAY
int "L1 D-Cache Flash way"
depends on CPU_CORTEX_R52_CACHE_SEGREGATION
range 0 4
default 0
help
Configure L1 D-Cache ways for Flash interface. Default is reset value,
all D-Cache ways are allocated for AXIM interface.
if CPU_AARCH32_CORTEX_R
config ARMV7_R

View File

@@ -63,13 +63,13 @@ void arch_dcache_disable(void)
{
uint32_t val;
L1C_CleanInvalidateDCacheAll();
val = __get_SCTLR();
val &= ~SCTLR_C_Msk;
barrier_dsync_fence_full();
__set_SCTLR(val);
barrier_isync_fence_full();
arch_dcache_flush_and_invd_all();
}
int arch_dcache_flush_all(void)
@@ -203,30 +203,16 @@ int arch_icache_flush_and_invd_all(void)
int arch_icache_flush_range(void *start_addr, size_t size)
{
ARG_UNUSED(start_addr);
ARG_UNUSED(size);
return -ENOTSUP;
}
int arch_icache_invd_range(void *start_addr, size_t size)
{
ARG_UNUSED(start_addr);
ARG_UNUSED(size);
/* Cortex A/R do have the ICIMVAU operation to selectively invalidate
* the instruction cache, but not currently supported by CMSIS.
* For now, invalidate the entire cache.
*/
L1C_InvalidateICacheAll();
return 0;
return -ENOTSUP;
}
int arch_icache_flush_and_invd_range(void *start_addr, size_t size)
{
ARG_UNUSED(start_addr);
ARG_UNUSED(size);
return -ENOTSUP;
}

View File

@@ -237,28 +237,6 @@ SECTION_SUBSEC_FUNC(TEXT, __exc, z_arm_data_abort)
b z_arm_exc_exit
#else
GTEXT(z_arm_cortex_ar_exit_exc)
SECTION_SUBSEC_FUNC(TEXT, _HandlerModeExit, z_arm_cortex_ar_exit_exc)
/* Note:
* This function is expected to be *always* called with
* processor mode set to MODE_SYS.
*/
/* decrement exception depth */
get_cpu r2
ldrb r1, [r2, #_cpu_offset_to_exc_depth]
sub r1, r1, #1
strb r1, [r2, #_cpu_offset_to_exc_depth]
/*
* Restore r0-r3, r12, lr, lr_und and spsr_und from the exception stack
* and return to the current thread.
*/
pop {r0-r3, r12, lr}
rfeia sp!
/**
* @brief Undefined instruction exception handler
*

View File

@@ -100,59 +100,6 @@ static uint32_t dump_fault(uint32_t status, uint32_t addr)
reason = K_ERR_ARM_UNSUPPORTED_EXCLUSIVE_ACCESS_FAULT;
LOG_ERR("Unsupported Exclusive Access Fault @ 0x%08x", addr);
break;
#elif defined(CONFIG_ARMV7_A)
case FSR_FS_PERMISSION_FAULT_2ND_LEVEL:
reason = K_ERR_ARM_PERMISSION_FAULT_2ND_LEVEL;
LOG_ERR("2nd Level Permission Fault @ 0x%08x", addr);
break;
case FSR_FS_ACCESS_FLAG_FAULT_1ST_LEVEL:
reason = K_ERR_ARM_ACCESS_FLAG_FAULT_1ST_LEVEL;
LOG_ERR("1st Level Access Flag Fault @ 0x%08x", addr);
break;
case FSR_FS_ACCESS_FLAG_FAULT_2ND_LEVEL:
reason = K_ERR_ARM_ACCESS_FLAG_FAULT_2ND_LEVEL;
LOG_ERR("2nd Level Access Flag Fault @ 0x%08x", addr);
break;
case FSR_FS_CACHE_MAINTENANCE_INSTRUCTION_FAULT:
reason = K_ERR_ARM_CACHE_MAINTENANCE_INSTRUCTION_FAULT;
LOG_ERR("Cache Maintenance Instruction Fault @ 0x%08x", addr);
break;
case FSR_FS_TRANSLATION_FAULT:
reason = K_ERR_ARM_TRANSLATION_FAULT;
LOG_ERR("1st Level Translation Fault @ 0x%08x", addr);
break;
case FSR_FS_TRANSLATION_FAULT_2ND_LEVEL:
reason = K_ERR_ARM_TRANSLATION_FAULT_2ND_LEVEL;
LOG_ERR("2nd Level Translation Fault @ 0x%08x", addr);
break;
case FSR_FS_DOMAIN_FAULT_1ST_LEVEL:
reason = K_ERR_ARM_DOMAIN_FAULT_1ST_LEVEL;
LOG_ERR("1st Level Domain Fault @ 0x%08x", addr);
break;
case FSR_FS_DOMAIN_FAULT_2ND_LEVEL:
reason = K_ERR_ARM_DOMAIN_FAULT_2ND_LEVEL;
LOG_ERR("2nd Level Domain Fault @ 0x%08x", addr);
break;
case FSR_FS_SYNC_EXTERNAL_ABORT_TRANSLATION_TABLE_1ST_LEVEL:
reason = K_ERR_ARM_SYNC_EXTERNAL_ABORT_TRANSLATION_TABLE_1ST_LEVEL;
LOG_ERR("1st Level Synchronous External Abort Translation Table @ 0x%08x", addr);
break;
case FSR_FS_SYNC_EXTERNAL_ABORT_TRANSLATION_TABLE_2ND_LEVEL:
reason = K_ERR_ARM_SYNC_EXTERNAL_ABORT_TRANSLATION_TABLE_2ND_LEVEL;
LOG_ERR("2nd Level Synchronous External Abort Translation Table @ 0x%08x", addr);
break;
case FSR_FS_TLB_CONFLICT_ABORT:
reason = K_ERR_ARM_TLB_CONFLICT_ABORT;
LOG_ERR("TLB Conflict Abort @ 0x%08x", addr);
break;
case FSR_FS_SYNC_PARITY_ERROR_TRANSLATION_TABLE_1ST_LEVEL:
reason = K_ERR_ARM_SYNC_PARITY_ERROR_TRANSLATION_TABLE_1ST_LEVEL;
LOG_ERR("1st Level Synchronous Parity Error Translation Table @ 0x%08x", addr);
break;
case FSR_FS_SYNC_PARITY_ERROR_TRANSLATION_TABLE_2ND_LEVEL:
reason = K_ERR_ARM_SYNC_PARITY_ERROR_TRANSLATION_TABLE_2ND_LEVEL;
LOG_ERR("2nd Level Synchronous Parity Error Translation Table @ 0x%08x", addr);
break;
#else
case FSR_FS_BACKGROUND_FAULT:
reason = K_ERR_ARM_BACKGROUND_FAULT;

View File

@@ -33,14 +33,6 @@ extern void z_arm_reserved(void);
* Generic Interrupt Controller (GIC) and therefore the architecture interrupt
* control functions are mapped to the GIC driver interface.
*
* When GIC is used together with other interrupt controller for
* multi-level interrupts support (i.e. CONFIG_MULTI_LEVEL_INTERRUPTS
* is enabled), the architecture interrupt control functions are mapped
* to the SoC layer in `include/arch/arm/irq.h`.
* The exported arm interrupt control functions which are wrappers of
* GIC control could be used for SoC to do level 1 irq control to implement SoC
* layer interrupt control functions.
*
* When a custom interrupt controller is used (i.e.
* CONFIG_ARM_CUSTOM_INTERRUPT_CONTROLLER is enabled), the architecture
* interrupt control functions are mapped to the SoC layer in
@@ -48,17 +40,17 @@ extern void z_arm_reserved(void);
*/
#if !defined(CONFIG_ARM_CUSTOM_INTERRUPT_CONTROLLER)
void arm_irq_enable(unsigned int irq)
void arch_irq_enable(unsigned int irq)
{
arm_gic_irq_enable(irq);
}
void arm_irq_disable(unsigned int irq)
void arch_irq_disable(unsigned int irq)
{
arm_gic_irq_disable(irq);
}
int arm_irq_is_enabled(unsigned int irq)
int arch_irq_is_enabled(unsigned int irq)
{
return arm_gic_irq_is_enabled(irq);
}
@@ -73,11 +65,10 @@ int arm_irq_is_enabled(unsigned int irq)
* priority levels which are reserved: three for various types of exceptions,
* and possibly one additional to support zero latency interrupts.
*/
void arm_irq_priority_set(unsigned int irq, unsigned int prio, uint32_t flags)
void z_arm_irq_priority_set(unsigned int irq, unsigned int prio, uint32_t flags)
{
arm_gic_irq_set_priority(irq, prio, flags);
}
#endif /* !CONFIG_ARM_CUSTOM_INTERRUPT_CONTROLLER */
void z_arm_fatal_error(unsigned int reason, const struct arch_esf *esf);
@@ -132,7 +123,7 @@ static inline void z_arm_irq_dynamic_direct_isr_dispatch(void)
uint32_t irq = __get_IPSR() - 16;
if (irq < IRQ_TABLE_SIZE) {
const struct _isr_table_entry *isr_entry = &_sw_isr_table[irq];
struct _isr_table_entry *isr_entry = &_sw_isr_table[irq];
isr_entry->isr(isr_entry->arg);
}

View File

@@ -28,13 +28,16 @@ static inline void relocate_vector_table(void)
#else
#if defined(__GNUC__)
/*
* GCC can detect if memcpy is passed a NULL argument, however one of
* the cases of relocate_vector_table() it is valid to pass NULL, so we
* suppress the warning for this case. We need to do this before
* string.h is included to get the declaration of memcpy.
*/
TOOLCHAIN_DISABLE_WARNING(TOOLCHAIN_WARNING_NONNULL)
#pragma GCC diagnostic push
#pragma GCC diagnostic ignored "-Wnonnull"
#endif /* __GNUC__ */
#include <string.h>
@@ -50,7 +53,9 @@ void __weak relocate_vector_table(void)
#endif
}
TOOLCHAIN_ENABLE_WARNING(TOOLCHAIN_WARNING_NONNULL)
#if defined(__GNUC__)
#pragma GCC diagnostic pop
#endif
#endif /* !CONFIG_AARCH32_ARMV8_R */

View File

@@ -56,12 +56,9 @@ SECTION_SUBSEC_FUNC(TEXT, _reset_section, __start)
cmp r0, #MODE_HYP
bne EL1_Reset_Handler
/*
* The HSCTLR register provides top-level control of system operation in Hyp mode.
* Since the OS is not running in Hyp mode, and considering the Armv8-R AArch32
* architecture profile, there's no need to modify HSCTLR configuration unless
* Fast Interrupts need to be enabled.
*/
/* Init HSCTLR see Armv8-R AArch32 architecture profile */
ldr r0, =(HSCTLR_RES1 | SCTLR_I_BIT | SCTLR_C_BIT)
mcr p15, 4, r0, c1, c0, 0
/* Init HACTLR: Enable EL1 access to all IMP DEF registers */
ldr r0, =HACTLR_INIT
@@ -203,12 +200,6 @@ EL1_Reset_Handler:
#endif /* CONFIG_DCLS */
#if defined(CONFIG_CPU_CORTEX_R52_CACHE_SEGREGATION)
ldr r0, =IMP_CSCTLR(CONFIG_CPU_CORTEX_R52_ICACHE_FLASH_WAY,
CONFIG_CPU_CORTEX_R52_DCACHE_FLASH_WAY)
mcr p15, 1, r0, c9, c1, 0
#endif
ldr r0, =arm_cpu_boot_params
#if CONFIG_MP_MAX_NUM_CPUS > 1
@@ -227,12 +218,12 @@ EL1_Reset_Handler:
/* signal our desire to vote */
mov r5, #1
strb r5, [r4, r2]
mov r7, #0
ldr r3, [r0, #BOOT_PARAM_MPID_OFFSET]
cmn r3, #1
beq 1f
/* some core already won, release */
mov r7, #0
strb r7, [r4, r2]
b _secondary_core

View File

@@ -19,10 +19,10 @@ long semihost_exec(enum semihost_instr instr, void *args)
register long ret __asm__ ("r0");
if (IS_ENABLED(CONFIG_ISA_THUMB2)) {
__asm__ volatile ("svc 0xab"
__asm__ __volatile__ ("svc 0xab"
: "=r" (ret) : "r" (r0), "r" (r1) : "memory");
} else {
__asm__ volatile ("svc 0x123456"
__asm__ __volatile__ ("svc 0x123456"
: "=r" (ret) : "r" (r0), "r" (r1) : "memory");
}
return ret;
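As a usage note for the wrapper above (only the asm qualifier spelling changes in this hunk): a caller passes a semihosting operation number and a pointer to its argument. A minimal hedged example, assuming SEMIHOST_WRITEC names the SYS_WRITEC (0x03) operation in Zephyr's enum semihost_instr:

/* Hedged usage sketch for semihost_exec(); SEMIHOST_WRITEC is assumed to be
 * the SYS_WRITEC entry in <zephyr/arch/common/semihost.h>. SYS_WRITEC takes a
 * pointer to the character to print on the host debugger console.
 */
#include <zephyr/arch/common/semihost.h>

static void host_putc(char c)
{
	(void)semihost_exec(SEMIHOST_WRITEC, &c);
}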

Some files were not shown because too many files have changed in this diff.