Compare commits


129 Commits

Thomas Stranger
091f5956b7 github: workflows: disable workflows
Disable workflows because the branch is no longer maintained.

Signed-off-by: Thomas Stranger <thomas.stranger@outlook.com>
2025-11-24 13:09:08 -05:00
Henrik Brix Andersen
c49ce6725d ci: build samples/cpp/hello_world as part of the multiplatform test
Build the C++ version of the Hello, World sample as part of the
multiplatform (build) test in CI.

Signed-off-by: Henrik Brix Andersen <hebad@vestas.com>
(cherry picked from commit cd543887f4)
2024-11-06 06:44:42 +01:00
Henrik Brix Andersen
1040cd6023 SDK_VERSION: Use Zephyr SDK 0.16.9
This commit updates ZEPHYR_SDK to point to the Zephyr SDK 0.16.9 release.

Signed-off-by: Henrik Brix Andersen <hebad@vestas.com>
2024-11-06 06:44:42 +01:00
Dominik Ermel
03bc6e0d70 storage/stream_flash: Fix range check in stream_flash_erase_page
Added a check so that stream_flash_erase_page verifies that the requested
offset is actually within the stream flash designated area.

Fixes #79800

Signed-off-by: Dominik Ermel <dominik.ermel@nordicsemi.no>
(cherry picked from commit 8714c172ed)
2024-11-06 06:44:30 +01:00
Torsten Rasmussen
6aeb7a2b96 cmake: limit Zephyr SDK to 0.16.x series for Zephyr 3.6 LTS
Copy of 6da35a811a

Signed-off-by: Torsten Rasmussen <Torsten.Rasmussen@nordicsemi.no>
Signed-off-by: Pieter De Gendt <pieter.degendt@basalte.be>
2024-10-29 22:11:21 +01:00
Torsten Rasmussen
6e7db36176 cmake: support range for find_package(Zephyr-sdk)
Fixes: #80200

CMake `find_package(<package> <version>)` supports the use of ranges,
like `1.0.0...4.0.0`.

Update the FindZephyr-sdk.cmake module to support this.
This allows looking up the Zephyr SDK with an upper boundary, for example
`find_package(Zephyr-sdk 0.16...<0.17)`.

Signed-off-by: Torsten Rasmussen <Torsten.Rasmussen@nordicsemi.no>
2024-10-29 22:11:21 +01:00
Abram Early
ee4c40c507 modbus: reset wait semaphore before tx
A response returned after a request times out would increment the
semaphore and remain pending until the next request is made; k_sem_take
would then return immediately, before the new response has arrived, and
the same problem would recur when the actual response does arrive.
So the wait semaphore just needs to be reset before transmitting.

Signed-off-by: Abram Early <abram.early@gmail.com>
(cherry picked from commit 583f4956dc)
2024-10-22 20:41:04 +02:00
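
As a rough illustration of that pattern (Zephyr's k_sem_reset()/k_sem_take() are real APIs; the client structure and transmit helper are hypothetical):

    /* Hypothetical sketch: discard any stale "give" left by a late response
     * before starting a new transaction.
     */
    k_sem_reset(&client->wait_sem);
    modbus_transmit_request(client);   /* hypothetical TX helper */

    if (k_sem_take(&client->wait_sem, K_MSEC(client->rx_timeout_ms)) != 0) {
            return -ETIMEDOUT;         /* no response within the timeout */
    }
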
Ibe Van de Veire
7f478311e6 test: net: igmp: Add extra IGMPv3 testcase
Added extra testcases for the IGMPv3 protocol. The IGMP driver is
supposed to send an IGMPv3 report when joining a group.

Signed-off-by: Ibe Van de Veire <ibe.vandeveire@basalte.be>
(cherry picked from commit e6dd4cda89)
2024-10-22 09:30:41 +02:00
Ibe Van de Veire
d2ba008b0a net: ip: igmp: Add igmp.h for definitions
Add igmp.h file to declare definitions for IGMP that are not meant to be
included by the application but can be used in e.g. tests.

Signed-off-by: Ibe Van de Veire <ibe.vandeveire@basalte.be>
(cherry picked from commit ba9eca3181)
2024-10-22 09:30:41 +02:00
Ibe Van de Veire
8e2419fce4 net: ip: igmp: Remove too strict length check
According to RFC2236 section 2.5, the IGMP message may be longer than 8
bytes. The rest of the bytes should be ignored.

Signed-off-by: Ibe Van de Veire <ibe.vandeveire@basalte.be>
(cherry picked from commit c646dd37e5)
2024-10-22 09:30:41 +02:00
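
Illustrative sketch of the relaxed check (the header struct name and return value are assumptions, not the exact igmp.c code):

    /* Only reject messages that are too short; extra trailing bytes are
     * ignored, per RFC 2236 section 2.5.
     */
    if (len < sizeof(struct net_igmp_hdr)) {   /* struct name assumed */
            return NET_DROP;
    }
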
Ibe Van de Veire
73372783ce net: ip: igmp: Fix wrong header length
The header length of the net ip packet was calculated using only the
net_pkt_ip_hdr_len function. The correct header length should be
calculated by adding net_pkt_ip_hdr_len and net_pkt_ipv4_opts_len. This
resulted in an incorrect IGMP version type in the case of an IGMPv2 message
(when IGMPv3 was enabled). The IGMP message was not parsed correctly and
therefore dropped.

Signed-off-by: Ibe Van de Veire <ibe.vandeveire@basalte.be>
(cherry picked from commit f852c12360)
2024-10-22 09:30:41 +02:00
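
In other words, the payload offset has to account for any IPv4 options; a minimal sketch using the real net_pkt accessors:

    /* IPv4 options must be included when computing where the IGMP payload starts. */
    size_t hdr_len = net_pkt_ip_hdr_len(pkt) + net_pkt_ipv4_opts_len(pkt);
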
Daekeun Kang
6937dabf26 net: fix handle unaligned memory access in net_context_bind()
This commit addresses an issue in net_context_bind() where unaligned
memory access was not properly handled when checking for INADDR_ANY.
The problem primarily affected MCUs based on architectures like ARMv6 that
don't support unaligned memory access.

- Use UNALIGNED_GET() to safely access the sin_addr.s_addr field
- Ensures correct behavior on architectures with alignment restrictions

This fix improves compatibility and prevents potential crashes or
unexpected behavior on affected platforms.

Signed-off-by: Daekeun Kang <dkkang@huconn.com>
(cherry picked from commit b24c5201a0)
2024-10-02 10:06:12 +02:00
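
A hedged sketch of the pattern (UNALIGNED_GET() is the real Zephyr helper; the surrounding variable names are illustrative):

    const struct sockaddr_in *addr4 = (const struct sockaddr_in *)addr;

    /* A direct read of addr4->sin_addr.s_addr may fault on cores without
     * unaligned-access support; UNALIGNED_GET() performs a safe access.
     */
    if (UNALIGNED_GET(&addr4->sin_addr.s_addr) == INADDR_ANY) {
            /* bind to the wildcard address */
    }
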
Jamie McCrae
25d2970266 doc: services: device_mgmt: mcumgr: Correct license for tool
Corrects an incorrect license for a tool

Fixes #78927

Signed-off-by: Jamie McCrae <jamie.mccrae@nordicsemi.no>
2024-09-30 17:08:57 -05:00
Lyle Zhu
097ed18842 Bluetooth: SDP: Fix stack override issue
Check the remaining space of the local variable
`filter` to avoid overrunning the stack.

Signed-off-by: Lyle Zhu <lyle.zhu@nxp.com>
(cherry picked from commit f3a1cf2782)
2024-09-26 11:31:34 +02:00
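
Generic sketch of the kind of bound check described (the array, index, and return value are hypothetical):

    /* Stop filling the on-stack array once it is full instead of writing past it. */
    if (count >= ARRAY_SIZE(filter)) {
            return -ENOMEM;
    }
    filter[count++] = uuid;
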
Lyle Zhu
f5f9872138 Bluetooth: RFCOMM: check the validity of received frame
Check whether the received frame is complete by
comparing the length of the received data with
the length indicated in the frame.

Signed-off-by: Lyle Zhu <lyle.zhu@nxp.com>
(cherry picked from commit 37d62c6a16)
2024-09-26 11:16:26 +02:00
Emil Gydesen
2c2540ee46 zephyr: Add zero-len check for utf8_trunc
The function did not check if the provided string had a zero
length before starting to truncate, which meant that last_byte_p
could possibly have pointed to the value before the string.

Signed-off-by: Emil Gydesen <emil.gydesen@nordicsemi.no>
2024-09-18 12:51:33 +02:00
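
A minimal sketch of the missing guard (the surrounding truncation logic is omitted; only the zero-length check is shown):

    size_t len = strlen(utf8_str);

    if (len == 0) {
            return utf8_str;   /* empty string: nothing to truncate */
    }

    char *last_byte_p = utf8_str + len - 1;   /* now guaranteed to point inside the string */
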
Eunkyu Lee
a0c8a437c1 Bluetooth: Classic: Add length check in bluetooth classic
Added length checks for user input in `sdp_client_receive` and
`l2cap_br_info_rsp`.

Signed-off-by: Eunkyu Lee <mochaccino.00.00@gmail.com>
(cherry picked from commit 88881257ab)
2024-09-06 10:05:42 -05:00
Eunkyu Lee
ecfc6e1e89 Bluetooth: Host: Add missing buffer length check
Modified to check the length of the remaining data in the buffer
before processing the next report. The length check was missing
in the continuation routine.

Signed-off-by: Eunkyu Lee <mochaccino.00.00@gmail.com>
(cherry picked from commit e491f220d8)
2024-09-06 10:05:19 -05:00
Emil Gydesen
6567d6e370 Bluetooth: ASCS: Validate num_ases in CP requests
Add validation of the number of ASEs in control point
write requests.

This validates that the number of ASEs
in the control point is not greater than the total number
of ASEs we support.

This also validates that the GATT MTU is large enough to
hold all the responses from the write since those can only be
sent as notifications and never be truncated.

Finally this validates and updates the size of the buffer used to
hold the responses, which may be an optimization for some builds.

Signed-off-by: Emil Gydesen <emil.gydesen@nordicsemi.no>
(cherry picked from commit 7b0784c1f6)
2024-09-06 10:04:50 -05:00
Emil Gydesen
9ab5868c91 Bluetooth: OTS: Add len validation in olcp_ind_handler
Verify the length of the indication before we pull from the
buffer.

Signed-off-by: Emil Gydesen <emil.gydesen@nordicsemi.no>
(cherry picked from commit 044f8aaeb3)
2024-09-06 10:04:31 -05:00
Emil Gydesen
5afc636af9 Bluetooth: BAP: Add check for num_subgroups in parse_recv_state
In the parse_recv_state we did not verify that we can handle all
the subgroups before we started parsing them.

Signed-off-by: Emil Gydesen <emil.gydesen@nordicsemi.no>
(cherry picked from commit edbe34eaf2)
2024-09-06 10:04:09 -05:00
Anas Nashif
6874cf56ab ci: rerun issue check on PR edit
Re-run issue check when a PR is updated, i.e. when someone adds
'Fixes...' to the PR body.

This is mostly for release branches and has no effect on main branch.

Also, add concurrency check in the workflow.

Signed-off-by: Anas Nashif <anas.nashif@intel.com>
(cherry picked from commit 4077249cc7)
2024-08-29 22:09:03 +02:00
Benjamin Cabé
cc23222d25 doc: fix issue with keydown/keyup being ignored
Ensure initial search menu visibility is properly set so that key
events don't get trapped preventing arrow key navigation on the
document.

Signed-off-by: Benjamin Cabé <benjamin@zephyrproject.org>
(cherry picked from commit b3776ca97c)
2024-08-26 17:01:45 -05:00
Aleksander Wasaznik
d5a127ad2f Bluetooth: host: sched-lock bt_recv()
`bt_recv` is invoked from the BT long work queue, which is preemptible.
The host relies on cooperative scheduling for thread safety, so lock the
scheduler while `bt_recv` runs.

Signed-off-by: Aleksander Wasaznik <aleksander.wasaznik@nordicsemi.no>
2024-08-14 13:27:50 -05:00
Georges Oates_Larsen
c684dabf6a net: net_if: fix net_if_send_data for offloaded ifaces
Some offloaded ifaces have an L2, but lack support for
net_l2->send. This edge case is not handled by
net_if_send_data, resulting in a NULL dereference under
rare circumstances.

This patch expands the offloaded iface guard in
net_if_send_data to handle this edge case.

Signed-off-by: Georges Oates_Larsen <georges.larsen@nordicsemi.no>
(cherry picked from commit 1c79445059)
2024-08-13 22:03:24 +02:00
Francois Ramu
7c5e43aea6 doc: release 3.6: no CACHE_MANAGEMENT by default on stm32h7/f7
Mention that the stm32h7 and stm32f7 series do not have
CONFIG_CACHE_MANAGEMENT enabled by default. The application must
explicitly set CONFIG_CACHE_MANAGEMENT=y to activate
the cache (Icache, Dcache) on those stm32 series.

Signed-off-by: Francois Ramu <francois.ramu@st.com>
(cherry picked from commit 1c59fa5231)
2024-08-05 22:12:18 +02:00
Flavio Ceolin
850601b055 tests: dynamic_thread_stack: Check thread stack permission
Check if a user thread is capable of freeing another thread's stack.

Signed-off-by: Flavio Ceolin <flavio.ceolin@intel.com>
(cherry picked from commit 95dde52b1e)
2024-08-01 21:36:06 +02:00
Flavio Ceolin
28107c9b54 userspace: dynamic: Fix k_thread_stack_free verification
k_thread_stack_free syscall was not checking if the caller
had permission to the given stack object.

Signed-off-by: Flavio Ceolin <flavio.ceolin@intel.com>
(cherry picked from commit c12f0507b6)
2024-08-01 21:36:06 +02:00
Jukka Rissanen
c9473ca717 net: context: Check null pointer in V6ONLY getsockopt
Make sure we are not accessing NULL pointer when checking
if the IPv4 mapping to IPv6 is enabled.

Signed-off-by: Jukka Rissanen <jukka.rissanen@nordicsemi.no>
(cherry picked from commit cf552905f4)
2024-08-01 10:12:54 +02:00
Vinayak Kariappa Chettimada
2cddfcef89 Bluetooth: Controller: Add explicit LLCP error code check
Add unit tests to cover explicit LLCP error code check and
cover the same in the Controller implementation.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
(cherry picked from commit d6f2bc9669)
Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2024-07-21 00:34:05 +02:00
Vinayak Kariappa Chettimada
90225887db Bluetooth: Controller: Use BT_HCI_ERR_UNSPECIFIED as needed
A Host shall consider any error code that it does not
explicitly understand equivalent to the error code
Unspecified Error (0x1F).

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
(cherry picked from commit 78466c8f52)
Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2024-07-21 00:34:05 +02:00
Vinayak Kariappa Chettimada
e31a6ef555 Bluetooth: Controller: Refactor BT_CTLR_LE_ENC implementation
Refactor reused function in BT_CTLR_LE_ENC feature.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
(cherry picked from commit fe205a598e)
Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2024-07-21 00:34:05 +02:00
Vinayak Kariappa Chettimada
9d03b8f1d3 Bluetooth: Controller: Fix missing conn update ind PDU validation
Fix missing validation of Connection Update Ind PDU. Ignore
invalid connection update parameters and force a silent
local connection termination.

Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
(cherry picked from commit 4b6d3f1e16)
Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2024-07-21 00:34:05 +02:00
Erik Brockhoff
ae70d76bee Bluetooth: controller: minor cleanup and a fix-up re. LLCP
Only perform retention if not already done.
Ensure 'sched' is performed on phy ntf even if dle is not.

Signed-off-by: Erik Brockhoff <erbr@oticon.com>
(cherry picked from commit 9d8059b6e5)
Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2024-07-21 00:34:05 +02:00
Erik Brockhoff
d1a10bc004 Bluetooth: controller: fix node_rx retention mechanism
Ensure that in LLCP the reference to node_rx is cleared when
retention is NOT used, to avoid corruption of a node_rx that is
later re-allocated.

Signed-off-by: Erik Brockhoff <erbr@oticon.com>
(cherry picked from commit 806a4fcf92)
Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2024-07-21 00:34:05 +02:00
Erik Brockhoff
a3ca8d09f2 Bluetooth: controller: fixing rx node leak on CPR reject of parallel CPR
In case a CPR is initiated but rejected due to a CPR being active on
another connection, rx nodes are leaked because the retained node is
not properly released.

Signed-off-by: Erik Brockhoff <erbr@oticon.com>
(cherry picked from commit edef1b7cf4)
Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2024-07-21 00:34:05 +02:00
Morten Priess
e8cc179b6e Bluetooth: controller: Prevent invalid compiler code reordering
In ull_disable, it is imperative that the callback is set up before a
second reference counter check, otherwise it may happen that an LLL done
event has already passed when the disable callback and semaphore are
assigned.

This causes the HCI thread to wait until timeout and assert after
ull_ticker_stop_with_mark.

For certain compilers, due to compiler optimizations, it can be seen
from the assembler code that the callback is assigned after the second
reference counter check.

Adding memory barriers forces the generated code into the expected
sequence.

Signed-off-by: Morten Priess <mtpr@oticon.com>
(cherry picked from commit 7f82b6a219)
Signed-off-by: Vinayak Kariappa Chettimada <vich@nordicsemi.no>
2024-07-21 00:34:05 +02:00
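
A generic illustration of the ordering problem and its fix (the names are hypothetical, and a plain compiler barrier is used here; the actual Controller code uses Zephyr's memory-barrier helpers):

    disable_cb = cb;                      /* 1: publish the done callback            */
    __asm__ volatile ("" ::: "memory");   /* barrier: forbid reordering of 1 and 2   */
    if (atomic_get(&ref_count) == 0) {    /* 2: second reference-counter check       */
            cb(param);                    /* event already done; notify immediately  */
    }
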
Jordan Yates
a025a65d1f net: lib: dhcpv4: goto INIT on IF down, not RENEWING
When the interface goes down, the safest thing to do is to return to
the INIT state, as there is no guarantee that any state is preserved
upon the interface coming back up again.

This is particularly the case with WiFi.

Signed-off-by: Jordan Yates <jordan@embeint.com>
(cherry picked from commit 0f56974c9d)
2024-07-15 10:04:46 -05:00
Robert Lubos
ace6e447cf net: mqtt: Fix possible socket leak with websocket transport
In case underlying TCP/TLS connection is already down, the
websocket_disconnect() call is expected to fail, as it involves
communication. Therefore, mqtt_client_websocket_disconnect() should not
quit early in such cases, as it could lead to an underlying socket leak.

Signed-off-by: Robert Lubos <robert.lubos@nordicsemi.no>
(cherry picked from commit 4625fa713f)
2024-07-09 18:58:50 +02:00
Jared Kangas
4769dbb650 drivers: adc: lmp90xxx: fix checksum mismatch return value
During channel reads, zero is returned on CRC mismatches: the returned
error variable is not written to after a previous non-zero check. Return
-EIO to mirror other drivers' checksum validation behaviors.

Signed-off-by: Jared Kangas <kangas.jd@gmail.com>
(cherry picked from commit 8ec3c045f8)
2024-07-05 08:32:16 +02:00
Jamie McCrae
b27d3a61b7 cmake: modules: extensions: Fix dts watch file processing
Fixes and simplifies the handling of how the dts watch file is
processed

Signed-off-by: Grzegorz Swiderski <grzegorz.swiderski@nordicsemi.no>
Signed-off-by: Jamie McCrae <jamie.mccrae@nordicsemi.no>
(cherry picked from commit 11c1f3de61)
2024-07-01 13:43:12 +02:00
Jamie McCrae
388717ecf1 cmake: modules: extension: Fix dts file watch
Fixes an issue in the code that processes the compiler's output file
to determine which files should be watched. The compiler can combine
multiple files into a single line, instead of putting each on a
separate line, when the file paths are short; account for this and
split such lines into multiple elements.

Signed-off-by: Jamie McCrae <jamie.mccrae@nordicsemi.no>
(cherry picked from commit f4cfb8cd96)
2024-07-01 13:43:12 +02:00
Sven Ginka
6866bbbf05 soc: atmel: sam: Add invalidate d-cache at z_arm_platform_init
Before this fix, the SoC was unable to boot properly;
startup went directly into z_arm_usage_fault().
Fixes zephyrproject-rtos#73485

Signed-off-by: Sven Ginka <sven.ginka@gmail.com>
(cherry picked from commit c3d7b1c978)
2024-06-25 22:32:07 +02:00
Johan Hedberg
fee892052a Bluetooth: Host: Avoid processing "no change" encryption changes
If the new encryption state is the same as the old one, there's no point in
doing additional processing or callbacks. Simply log a warning and ignore
the HCI event in such a case.

Signed-off-by: Johan Hedberg <johan.hedberg@silabs.com>
(cherry picked from commit bf363d7c3e)
2024-06-25 17:43:37 +02:00
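
For illustration, the kind of early-return guard described (the field and event names are assumptions, not the exact hci_core.c code):

    if (conn->encrypt == evt->encrypt) {
            LOG_WRN("No change to encryption state (%u)", evt->encrypt);
            return;                    /* skip callbacks and further processing */
    }
    conn->encrypt = evt->encrypt;
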
Sean Nyekjaer
8cd7337d69 dts: arm: st: mp1: fix exti interrupt numbering
Align interrupt numbering with RM0436 for STM32MP157.
This will allow EXTI interrupt for line 6, 7, 8, 9, 10 and 11.

Fixes: ff231fa20a ("dts: stm32: Populate new properties for exti nodes")
Signed-off-by: Sean Nyekjaer <sean@geanix.com>
(cherry picked from commit ede866440d)
2024-06-05 16:06:37 +02:00
Abderrahmane Jarmouni
999c9a3b36 drivers: counter: stm32_rtc: fix clk disable for WBAX
clock_control_on() was called instead of clock_control_off().

Signed-off-by: Abderrahmane Jarmouni <abderrahmane.jarmouni-ext@st.com>
(cherry picked from commit c09f1ec91a)
2024-06-05 16:06:21 +02:00
Henrik Brix Andersen
1c606f5528 drivers: can: shell: print raw DLC when sending frame, not bytes
Print the raw DLC when enqueuing a CAN frame for sending, not the
corresponding number of bytes.

Fixes: #73309

Signed-off-by: Henrik Brix Andersen <hebad@vestas.com>
(cherry picked from commit 6a070ee165)
2024-05-29 14:53:17 +02:00
Henrik Brix Andersen
a5d72fec6e drivers: can: shell: fully initialize frame before sending
Zero-initialize the CAN frame before filling in data to ensure all data bytes are
initialized.

Fixes: #73309

Signed-off-by: Henrik Brix Andersen <hebad@vestas.com>
(cherry picked from commit fb4f67b775)
2024-05-29 14:53:17 +02:00
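
A combined sketch of the two shell fixes above (struct can_frame, can_bytes_to_dlc(), can_dlc_to_bytes(), and shell_print() are real Zephyr APIs; the surrounding variables are illustrative):

    struct can_frame frame = {0};            /* zero-init: no stale bytes in frame.data */

    frame.id = 0x123;
    frame.dlc = can_bytes_to_dlc(data_len);  /* data/data_len are illustrative */
    memcpy(frame.data, data, data_len);

    /* Report the raw DLC, not the converted byte count. */
    shell_print(sh, "enqueuing frame with DLC %u (%u data bytes)",
                frame.dlc, can_dlc_to_bytes(frame.dlc));
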
Gerson Fernando Budke
899e9e5320 mgmt: updatehub: Fix mark for update
This fixes compatibility with the recent bootutil API.

Fixes #69297

Signed-off-by: Gerson Fernando Budke <gerson.budke@ossystems.com.br>
(cherry picked from commit 94cd46d6ef)
2024-05-29 14:53:08 +02:00
Gerson Fernando Budke
73d6a10f54 mgmt: updatehub: Fix json arrays
After the changes introduced by #50816, UpdateHub could no longer decode
the JSON object. This introduces the missing parsing definitions to allow
the JSON parser to understand the correct UpdateHub probe object.

Fixes #69297

Signed-off-by: Gerson Fernando Budke <gerson.budke@ossystems.com.br>
(cherry picked from commit 5fb62ca960)
2024-05-29 14:53:08 +02:00
Henrik Brix Andersen
c5156632a5 drivers: sensor: nxp: kinetis: temp: fix memset() length
Use the correct buffer size when calling memset().

Fixes: #73093

Signed-off-by: Henrik Brix Andersen <hebad@vestas.com>
(cherry picked from commit 59402fd82e)
2024-05-23 12:43:06 +02:00
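
A generic illustration of the bug class (not the actual Kinetis driver code): passing the size of a pointer to memset() clears only a handful of bytes, whereas sizeof applied to the array clears the whole buffer.

    uint16_t samples[SAMPLE_COUNT];          /* SAMPLE_COUNT is hypothetical */

    memset(samples, 0, sizeof(samples));     /* correct: clears the whole array */
    /* memset(p, 0, sizeof(p)) on a pointer would clear only 4 or 8 bytes. */
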
Henrik Brix Andersen
4fe22b0f4e drivers: sensors: nxp: kinetis: temp: select CONFIG_ADC
The NXP Kinetis temperature sensor depends on CONFIG_ADC. Make the driver
Kconfig select CONFIG_ADC to get better CI coverage (enabling the driver
when CONFIG_SENSOR is enabled without depending on CONFIG_ADC=y).

Signed-off-by: Henrik Brix Andersen <hebad@vestas.com>
(cherry picked from commit 0a49b788a4)
2024-05-23 12:43:06 +02:00
Aleksander Wasaznik
f743d124b8 Bluetooth: Check buffer length in GATT rsp functions
Add length checks local to the parsing function. This removes the need
for a separate data validation step.

Signed-off-by: Aleksander Wasaznik <aleksander.wasaznik@nordicsemi.no>
(cherry picked from commit 3eeb8f8d18)
2024-05-05 21:25:26 +02:00
Alberto Escolar Piedras
a9fbf481f3 tests bsim edtt: Kill stuck processes in the same way as other tests
This test keeps its own partial way of running tests.
Let's have it kill stuck processes in the same way as
the rest (sending another kill 5 seconds after, and printing
a message about what happened)

Signed-off-by: Alberto Escolar Piedras <alberto.escolar.piedras@nordicsemi.no>
(cherry picked from commit 693ae8635a)
Signed-off-by: Alberto Escolar Piedras <alberto.escolar.piedras@nordicsemi.no>
2024-05-05 21:25:26 +02:00
Alberto Escolar Piedras
29976389f4 tests/bsim/bt l2cap/stress: Increase runtime timeout
This test has been seen failing in the new runners
due to a (realtime) timeout.
Let's double the timeout so it does not.

Signed-off-by: Alberto Escolar Piedras <alberto.escolar.piedras@nordicsemi.no>
(cherry picked from commit f4972347d4)
Signed-off-by: Alberto Escolar Piedras <alberto.escolar.piedras@nordicsemi.no>
2024-05-05 21:25:26 +02:00
Alberto Escolar Piedras
e946ab2cc0 tests/bsim broadcast_audio: Increase realtime timeout
This test has been seen failing in older slower computers
due to timeouts, let's increase the timeout so we don't
break in those cases.
Note this timeout is just a safety to eventually kill
hung simulations even if nobody presses Ctrl+C.

Signed-off-by: Alberto Escolar Piedras <alberto.escolar.piedras@nordicsemi.no>
(cherry picked from commit 0888882262)
Signed-off-by: Alberto Escolar Piedras <alberto.escolar.piedras@nordicsemi.no>
2024-05-05 21:25:26 +02:00
Alberto Escolar Piedras
89e601ec2d tests/bsim audio mcs_mcc: Increase realtime timeout
This test has been seen failing in older slower computers
due to timeouts, let's increase the timeout so we don't
break in those cases.
Note this timeout is just a safety to eventually kill
hung simulations even if nobody presses Ctrl+C.

Signed-off-by: Alberto Escolar Piedras <alberto.escolar.piedras@nordicsemi.no>
(cherry picked from commit e9c8856165)
Signed-off-by: Alberto Escolar Piedras <alberto.escolar.piedras@nordicsemi.no>
2024-05-05 21:25:26 +02:00
Alberto Escolar Piedras
09b84c709a tests/bsim mesh: Increase realtime timeout
One of these tests has been seen failing in older slower
computers due to timeouts, let's increase the timeout so
we don't break in those cases.
Note this timeout is just a safety to eventually kill
hung simulations even if nobody presses Ctrl+C.

Signed-off-by: Alberto Escolar Piedras <alberto.escolar.piedras@nordicsemi.no>
(cherry picked from commit d592455cc0)
Signed-off-by: Alberto Escolar Piedras <alberto.escolar.piedras@nordicsemi.no>
2024-05-05 21:25:26 +02:00
Alberto Escolar Piedras
313a062cbe tests bsim/bt/ll: Split build script in 2
The CIS tests build quite a few
apps on their own.
Let's split them into their own sub-script,
so we don't parallelize too many builds,
and users have more granularity if they only
want to build a subset.

Signed-off-by: Alberto Escolar Piedras <alberto.escolar.piedras@nordicsemi.no>
(cherry picked from commit 940d53e839)
Signed-off-by: Alberto Escolar Piedras <alberto.escolar.piedras@nordicsemi.no>
2024-05-05 21:25:26 +02:00
Alberto Escolar Piedras
00b117f430 tests/bsim/bt/host: Split build script in 6
There are quite a few BT host test images being built.
Today these are all built in parallel, causing quite a
high load.

Let's split them into their own sub-scripts,
so we don't parallelize too many builds,
and users have more granularity if they only
want to build a subset.

Signed-off-by: Alberto Escolar Piedras <alberto.escolar.piedras@nordicsemi.no>
(cherry picked from commit 65d49cd434)
Signed-off-by: Alberto Escolar Piedras <alberto.escolar.piedras@nordicsemi.no>
2024-05-05 21:25:26 +02:00
Takuya Sasaki
4c45c3e1b2 net: context: Fix the ICMP error on raw
This commit applies the fix for the issues detected in UDP to recv_raw()
as well. Please refer to the previous commit log for details.

Signed-off-by: Takuya Sasaki <takuya.sasaki@spacecubics.com>
(cherry picked from commit 7d1edd1fcb)
2024-05-01 10:14:51 -05:00
Takuya Sasaki
4c54123941 net: context: Fix the ICMP error on udp
When receiving a UDP packet, net_conn_input() searches for a
matching connection within `conn_used`.

However, when receiving UDP packets simultaneously from multiple
clients, we may encounter a situation where the connection that was
supposed to be bound cannot be found within `conn_used`, and an ICMP
error is raised.

This is because, within recv_udp(), to avoid the failure of
bind_default(), we temporarily remove it from `conn_used` using
net_conn_unregister().

If the context already has a connection handler, it means it's
already registered. In that case, all we have to do is 1) update
the callback registered in the net_context and 2) update the
user_data and remote address and port using net_conn_update().

Fixes #70020

Signed-off-by: Takuya Sasaki <takuya.sasaki@spacecubics.com>
(cherry picked from commit 4f802e1197)
2024-05-01 10:14:51 -05:00
Takuya Sasaki
bc1af7a997 net: conn: Add internal function for update connection
This commit adds a new internal function for updating the callback,
user data, remote address, and port for a registered connection
handle.

Signed-off-by: Takuya Sasaki <takuya.sasaki@spacecubics.com>
(cherry picked from commit 46ca624be4)
2024-05-01 10:14:51 -05:00
Takuya Sasaki
17c98d3316 net: conn: Add static function for changing remote
This commit adds a new static function for changing the remote
address and port of a connection, and replaces the corresponding
handling of the remote address and port in net_conn_register().

Signed-off-by: Takuya Sasaki <takuya.sasaki@spacecubics.com>
(cherry picked from commit ef18518e91)
2024-05-01 10:14:51 -05:00
Takuya Sasaki
efdad10cc6 net: conn: Move net_conn_change_callback() to static
The net_conn_change_callback() is not currently being called by
anyone, so this commit makes it a static function and replaces
the callback-change handling in net_conn_register().

Signed-off-by: Takuya Sasaki <takuya.sasaki@spacecubics.com>
(cherry picked from commit 49c6da51ce)
2024-05-01 10:14:51 -05:00
Torsten Rasmussen
ed1f5b7a20 cmake: fix issue with parsing version file located in /VERSION
Fixes: #71384

A VERSION file placed in `/` or `<drive>:\` was accidentally being
picked up during `find_package(Zephyr)`.

This happened because Zephyr loads the VERSION file to determine if it
is the correct Zephyr to use.

During the initial phase of find_package(), APPLICATION_SOURCE_DIR is
not defined, causing a VERSION file to be picked up from `/VERSION`
instead of `<app>/VERSION`. `/VERSION` is outside any Zephyr repo, west
workspace, or the application itself and therefore should not be picked
up accidentally.

Fix this by checking that APPLICATION_SOURCE_DIR is defined, and only
when defined, look for a VERSION file there.

Signed-off-by: Torsten Rasmussen <Torsten.Rasmussen@nordicsemi.no>
(cherry picked from commit 4959a0241e)
2024-05-01 10:12:08 -05:00
Grzegorz Swiderski
df2c9863dc cmake: configuration_files: Add missing FILE_SUFFIX handling
Add missing FILE_SUFFIX handling for Kconfig fragments in the `boards`
application subdirectory.

Signed-off-by: Grzegorz Swiderski <grzegorz.swiderski@nordicsemi.no>
(cherry picked from commit 3cff550180)
2024-04-26 14:10:03 -04:00
Jamie McCrae
d731d424fb cmake: modules: extensions: Rename prefixes in functions
Renames prefixes in some functions to avoid clashes with global
variables that have the same name

Signed-off-by: Jamie McCrae <jamie.mccrae@nordicsemi.no>
2024-04-26 14:10:03 -04:00
Jamie McCrae
d5b083f98a cmake: modules: extensions: Fix missing board revision overlays
Fixes an issue whereby board overlays for board revisions were not
included when configuring applications

Signed-off-by: Jamie McCrae <jamie.mccrae@nordicsemi.no>
2024-04-26 14:10:03 -04:00
Stephanos Ioannidis
597ab8302d tests: kernel: thread_runtime_stats: Relax precision test for QEMU
This commit relaxes the idle event statistics test precision requirement
for emulated QEMU targets because the cycle counts may be inaccurate when
the host CPU is overloaded (e.g. when running tests with twister) and a
high failure rate is observed for this test in the CI.

Signed-off-by: Stephanos Ioannidis <root@stephanos.io>
(cherry picked from commit 2b2dd01c38)
2024-04-24 20:11:32 +09:00
Stephanos Ioannidis
18d4edcffb testsuite: ztest: Increase ZTEST_TEST_DELAY_MS to 5000
This commit increases the default value of `ZTEST_TEST_DELAY_MS` from 3000
to 5000 milliseconds because the current value of 3000ms may not be
sufficient for the hosts with lower CPU clock frequency (e.g. new Zephyr CI
runners with cost-effective processors).

Signed-off-by: Stephanos Ioannidis <root@stephanos.io>
(cherry picked from commit 1bf751074d)
2024-04-24 20:11:32 +09:00
Stephanos Ioannidis
05da0d6823 ci: bsim-tests: Use zephyr-runner v2
This commit updates the bsim-tests workflow to use the new zephyr-runner v2
CI runner deployment.

It also updates the workflow to use the `ci-repo-cache` Docker image, which
includes the Zephyr repository cache, because the node level repository
cache is no longer available in the zephyr-runner v2.

Signed-off-by: Stephanos Ioannidis <root@stephanos.io>
(cherry picked from commit 9f9a6c547b)
2024-04-24 20:11:32 +09:00
Stephanos Ioannidis
49fbbf0494 ci: clang: Prioritise remote Redis cache storage
This commit updates the clang workflow such that ccache only uses remote
Redis cache storage when available.

The purpose of this is to reduce the individual runner local disk IOPS
requirement, thereby reducing the overall load on the SAN.

Signed-off-by: Stephanos Ioannidis <root@stephanos.io>
(cherry picked from commit 95e7eb31e6)
2024-04-24 20:11:32 +09:00
Stephanos Ioannidis
e1f1922a33 ci: clang: Use Redis remote storage for ccache
This commit updates the clang workflow to, when available, use Redis remote
storage backend for the ccache compilation cache data.

The Redis cache server is hosted in the Kubernetes cluster in which the
zephyr-runner pods run -- the Redis remote storage backend will be ignored
if the server is unavailable.

Signed-off-by: Stephanos Ioannidis <root@stephanos.io>
(cherry picked from commit 4a2884c652)
2024-04-24 20:11:32 +09:00
Stephanos Ioannidis
914ac45778 ci: clang: Store ccache data in node cache
This commit updates the clang workflow to store ccache data in the
zephyr-runner v2 node cache.

Signed-off-by: Stephanos Ioannidis <root@stephanos.io>
(cherry picked from commit cd83f0724b)
2024-04-24 20:11:32 +09:00
Stephanos Ioannidis
1f9fde3dc2 ci: clang: Use zephyr-runner v2
This commit updates the clang workflow to use the new zephyr-runner v2 CI
runner deployment.

It also updates the workflow to use the `ci-repo-cache` Docker image, which
includes the Zephyr repository cache, because the node level repository
cache is no longer available in the zephyr-runner v2.

Signed-off-by: Stephanos Ioannidis <root@stephanos.io>
(cherry picked from commit 64ca699fc8)
2024-04-24 20:11:32 +09:00
Stephanos Ioannidis
e25ff072ee ci: codecov: Set twister timeout multiplier to 2
This commit sets the codecov workflow twister timeout multiplier to 2,
which effectively increases the default test timeout from 60 to 120
seconds, because the new cost-effective Zephyr runners may take longer to
execute tests and the default timeout is not sufficient for some tests to
complete.

Signed-off-by: Stephanos Ioannidis <root@stephanos.io>
(cherry picked from commit 550bb4e4a6)
2024-04-24 20:11:32 +09:00
Stephanos Ioannidis
4f8ccc3122 ci: codecov: Prioritise remote Redis cache storage
This commit updates the codecov workflow such that ccache only uses remote
Redis cache storage when available.

The purpose of this is to reduce the individual runner local disk IOPS
requirement, thereby reducing the overall load on the SAN.

Signed-off-by: Stephanos Ioannidis <root@stephanos.io>
(cherry picked from commit a636c52b6a)
2024-04-24 20:11:32 +09:00
Stephanos Ioannidis
1cef4cba1a ci: codecov: Add --specs to ccache ignore option list
This commit adds the compiler `--specs=*` flag to the ccache ignore option
list because ccache is unable to resolve the toolchain-provided specs file
path and will consider source files to be uncacheable if it is unable to
read the specified specs file.

Note that adding `--specs=*` to the ignore option list is not a problem
because it is unlikely for the content of the toolchain libc spec file to
change without the compiler executable itself changing.

Signed-off-by: Stephanos Ioannidis <root@stephanos.io>
(cherry picked from commit ab9f6b456b)
2024-04-24 20:11:32 +09:00
Stephanos Ioannidis
ce208dbe6d ci: codecov: Use Redis remote storage for ccache
This commit updates the codecov workflow to, when available, use Redis
remote storage backend for the ccache compilation cache data.

The Redis cache server is hosted in the Kubernetes cluster in which the
zephyr-runner pods run -- the Redis remote storage backend will be ignored
if the server is unavailable.

Signed-off-by: Stephanos Ioannidis <root@stephanos.io>
(cherry picked from commit b57f1b5a15)
2024-04-24 20:11:32 +09:00
Stephanos Ioannidis
ee7617ccad ci: codecov: Store ccache data in node cache
This commit updates the codecov workflow to store ccache data in the
zephyr-runner v2 node cache.

Signed-off-by: Stephanos Ioannidis <root@stephanos.io>
(cherry picked from commit 36b0b101d4)
2024-04-24 20:11:32 +09:00
Stephanos Ioannidis
27f2c02f8d ci: codecov: Use zephyr-runner v2
This commit updates the codecov workflow to use the new zephyr-runner v2 CI
runner deployment.

It also updates the workflow to use the `ci-repo-cache` Docker image, which
includes the Zephyr repository cache, because the node level repository
cache is no longer available in the zephyr-runner v2.

Signed-off-by: Stephanos Ioannidis <root@stephanos.io>
(cherry picked from commit 354e290a23)
2024-04-24 20:11:32 +09:00
Stephanos Ioannidis
f5eaee5d13 ci: codecov: Run on all zephyrproject-rtos organisation repositories
This commit updates the codecov workflow to run on all forks under the
zephyrproject-rtos organisation.

The purpose of this is mainly to simplify the process of testing this
workflow under the zephyr-testing repository.

Signed-off-by: Stephanos Ioannidis <root@stephanos.io>
(cherry picked from commit c1bd5a613f)
2024-04-24 20:11:32 +09:00
Stephanos Ioannidis
902fb9b4b0 ci: footprint-tracking: Use zephyr-runner v2
This commit updates the footprint-tracking workflow to use the new
zephyr-runner v2 CI runner deployment.

Signed-off-by: Stephanos Ioannidis <root@stephanos.io>
(cherry picked from commit 9838633c0e)
2024-04-24 20:11:32 +09:00
Stephanos Ioannidis
1a45a764a6 ci: doc-build: Use zephyr-runner v2
This commit updates the doc-build workflow to use the new zephyr-runner v2
CI runner deployment.

It also installs additional system packages that are not available by
default in the zephyr-runner v2.

Signed-off-by: Stephanos Ioannidis <root@stephanos.io>
(cherry picked from commit 2819c3526a)
2024-04-24 20:11:32 +09:00
Anas Nashif
7b47e9cd7f ci: twister: increase matrix size for push jobs
Increase matrix size to 20 from 15 on push events.

Signed-off-by: Anas Nashif <anas.nashif@intel.com>
(cherry picked from commit 9970724652)
2024-04-24 20:11:32 +09:00
Stephanos Ioannidis
8aa0204a74 ci: twister: Set build job timeout to 24 hours
This commit increases the twister build job timeout from the default value
of 6 hours to 24 hours because scheduled (weekly) build runs take longer
than 6 hours to complete.

Signed-off-by: Stephanos Ioannidis <root@stephanos.io>
(cherry picked from commit 7df7e834d9)
2024-04-24 20:11:32 +09:00
Stephanos Ioannidis
d9fdd258b1 ci: twister: Set twister timeout multiplier to 2
This commit sets the twister timeout multiplier to 2, which effectively
increases the default test timeout from 60 to 120 seconds, because the new
cost-effective Zephyr runners may take longer to execute tests and the
default timeout is not sufficient for some tests to complete.

Signed-off-by: Stephanos Ioannidis <root@stephanos.io>
(cherry picked from commit de68ea7ce0)
2024-04-24 20:11:32 +09:00
Stephanos Ioannidis
47ed96c9f3 ci: twister: Prioritise remote Redis cache storage
This commit updates the twister workflow such that ccache only uses remote
Redis cache storage when available.

The purpose of this is to reduce the individual runner local disk IOPS
requirement, thereby reducing the overall load on the SAN.

Signed-off-by: Stephanos Ioannidis <root@stephanos.io>
(cherry picked from commit 527435d642)
2024-04-24 20:11:32 +09:00
Stephanos Ioannidis
1660ff0bb5 ci: twister: Add --specs to ccache ignore option list
This commit adds the compiler `--specs=*` flag to the ccache ignore option
list because ccache is unable to resolve the toolchain-provided specs file
path and will consider source files to be uncacheable if it is unable to
read the specified specs file.

Note that adding `--specs=*` to the ignore option list is not a problem
because it is unlikely for the content of the toolchain libc spec file to
change without the compiler executable itself changing.

Signed-off-by: Stephanos Ioannidis <root@stephanos.io>
(cherry picked from commit d3f9f391ad)
2024-04-24 20:11:32 +09:00
Stephanos Ioannidis
6eb54b420d ci: twister: Use Redis remote storage for ccache
This commit updates the twister workflow to, when available, use Redis
remote storage backend for the ccache compilation cache data.

The Redis cache server is hosted in the Kubernetes cluster in which the
zephyr-runner pods run -- the Redis remote storage backend will be ignored
if the server is unavailable.

Signed-off-by: Stephanos Ioannidis <root@stephanos.io>
(cherry picked from commit 3823f1f0db)
2024-04-24 20:11:32 +09:00
Stephanos Ioannidis
d8f9eb7c93 ci: twister: Store ccache data in node cache
This commit updates the twister workflow to store ccache data in the
zephyr-runner v2 node cache.

Signed-off-by: Stephanos Ioannidis <root@stephanos.io>
(cherry picked from commit 1be3aacad3)
2024-04-24 20:11:32 +09:00
Stephanos Ioannidis
5a75516c9f ci: twister: Print cloud service information
This commit updates the twister workflow jobs that run on the zephyr-runner
v2 to print the underlying cloud service information in the logs to help
trace and debug potential runner issues.

Signed-off-by: Stephanos Ioannidis <root@stephanos.io>
(cherry picked from commit 9a9bebb9f9)
2024-04-24 20:11:32 +09:00
Stephanos Ioannidis
8d6c22bb5c ci: twister: Use zephyr-runner v2
This commit updates the twister workflow to use the new zephyr-runner v2 CI
runner deployment.

It also updates the workflow to use the `ci-repo-cache` Docker image, which
includes the Zephyr repository cache, because the node level repository
cache is no longer available in the zephyr-runner v2.

Signed-off-by: Stephanos Ioannidis <root@stephanos.io>
(cherry picked from commit 7c19bc70bf)
2024-04-24 20:11:32 +09:00
Celina Sophie Kalus
33b431c841 tests: posix: eventfd: Add ioctl F_SETFL test
Add a test to protect against future regressions in the ioctl F_SETFL
operation of eventfd. Flags are set and unset and validity of the file
descriptor is checked by reading and writing.

Signed-off-by: Celina Sophie Kalus <hello@celinakalus.de>
(cherry picked from commit 325f22a16f)
2024-04-23 10:11:26 +02:00
Celina Sophie Kalus
02aa8a246d posix: eventfd: Fix unsetting internal flags in ioctl
Commit e6eb0a705b ("posix: eventfd: revise locking, signaling, and
allocation") introduced a regression where the internal flags of an
event file descriptor would be erased when calling the F_SETFL ioctl
operation.

This includes the flag EFD_IN_USE_INTERNAL which determines whether
this file descriptor has been opened, thus effectively closing the
eventfd whenever one tries to change a flag.

Signed-off-by: Celina Sophie Kalus <hello@celinakalus.de>
(cherry picked from commit 5bd86eaddb)
2024-04-23 10:11:26 +02:00
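
In rough terms, an F_SETFL handler has to preserve the internal bookkeeping bits; a hypothetical sketch (only EFD_IN_USE_INTERNAL is taken from the commit, the rest is illustrative):

    /* Replace only the user-visible flags; keep internal state bits intact. */
    efd->flags = (efd->flags & EFD_IN_USE_INTERNAL) |
                 (new_flags & ~EFD_IN_USE_INTERNAL);
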
Marco Widmer
1ef57ab277 pm: runtime: fix race when waiting for suspended event
To wait for the asynchronous suspending work item to complete, a
combination of semaphores and events is used. First, the semaphore is
released, then the events are cleared (through the boolean argument to
k_event_wait), then events are awaited.

However, if the event flag happens to be set by the work handler in the
short time between k_sem_give and k_event_wait, it is then cleared by
k_event_wait and k_event_wait blocks forever waiting for the event.

Make sure that we clear the event flag before releasing the semaphore.

Signed-off-by: Marco Widmer <marco.widmer@bytesatwork.ch>
(cherry picked from commit d83c63ecce)
2024-04-21 03:27:40 -07:00
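
An illustrative sketch of the corrected ordering (k_event/k_sem are real Zephyr APIs; the object names and event bit are assumptions):

    /* Clear the completion bit BEFORE releasing the async work, so a fast
     * completion can never be wiped out afterwards.
     */
    k_event_clear(&pm->events, PM_EVT_SUSPENDED);
    k_sem_give(&pm->async_sem);
    /* Do not ask k_event_wait() to reset: the bit was already cleared above. */
    k_event_wait(&pm->events, PM_EVT_SUSPENDED, false, K_FOREVER);
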
Francois Ramu
d0af8d80de samples: drivers: spi flash testcase check the erased data
After erasing the sector, compare the data read with the expected 0xFF
pattern to decide whether erasing succeeded, instead of relying on the
return code of the flash_erase function.

Signed-off-by: Francois Ramu <francois.ramu@st.com>
(cherry picked from commit 325ae4d32a)
2024-04-12 07:49:25 -05:00
Francois Ramu
6d9dee38f4 drivers: flash: stm32 ospi flash driver the correct erase instruction
Use the most suitable instruction for the sector erase command.
It can be 0x20, 0x21, or 0x21DE in octo-DTR mode.
The value is given by the SFDP table and filled into the erase_types
table during the SFDP discovery process.

Signed-off-by: Francois Ramu <francois.ramu@st.com>
(cherry picked from commit 01684df03a)
2024-04-12 07:49:25 -05:00
Henrik Brix Andersen
719b338055 drivers: can: mcan: enable transmitter delay compensation when possible
Enable Transmitter Delay Compensation whenever the data phase timing
parameters allow it.

Fixes: #70447

Signed-off-by: Henrik Brix Andersen <hebad@vestas.com>
(cherry picked from commit bfad7bc00e)
2024-04-05 08:42:26 -05:00
Henrik Brix Andersen
d17fedb5f8 drivers: can: add utility macro for calculating TDC offset
Add a utility macro for calculating the Transmitter Delay Compensation
(TDC) Offset using the sample point and CAN core clock prescaler specified
by a set of data phase timing parameters.

Signed-off-by: Henrik Brix Andersen <hebad@vestas.com>
(cherry picked from commit a3631264d1)
2024-04-05 08:42:26 -05:00
Henrik Brix Andersen
1b610be502 dts: bindings: can: can-fd-controller: remove tx-delay-comp-offset prop
Remove the unused "tx-delay-comp-offset" property from the base CAN FD
controller devicetree binding.

Having a static Transmitter Delay Compensation (TDC) offset is useless.
The offset needs to match the data phase timing parameters in order to
properly configure the second sample point when transmitting CAN FD frames
with BRS enabled.

Signed-off-by: Henrik Brix Andersen <hebad@vestas.com>
(cherry picked from commit 744f20d005)
2024-04-05 08:42:26 -05:00
Henrik Brix Andersen
89356355d8 drivers: can: mcan: remove broken transmitter delay compensation support
Remove broken support for Transmitter Delay Compensation from the Bosch
M_CAN backend driver.

Even if this was enabled via Kconfig, the TDC bit in the DBTP register set
during driver initialization is overwritten in can_mcan_set_timing_data(),
turning TDC off.

Signed-off-by: Henrik Brix Andersen <hebad@vestas.com>
(cherry picked from commit ec75dc2232)
2024-04-05 08:42:26 -05:00
Reto Schneider
c719aedb0d doc: releases: Fix GH link in migration guide v3.6
Putting a space between :github: and the PR number causes the rendered
text to not link to the issue.

Signed-off-by: Reto Schneider <reto.schneider@husqvarnagroup.com>
(cherry picked from commit 5a49b7c90d)
2024-04-03 09:28:55 -05:00
Krzysztof Chruściński
8380437a8c shell: rtt: Add detection of host presence
If the host is not reading RTT data (because there is no PC connection
or the RTT reading application is not running on the host), the thread
will get stuck continuously trying to write to RTT, and all threads with
equal or lower priority are then blocked. Add detection of that case:
if the host does not read data for a configurable period, data is
dropped until the host accepts new data again.

A similar solution is used in the RTT logging backend.

Signed-off-by: Krzysztof Chruściński <krzysztof.chruscinski@nordicsemi.no>
(cherry picked from commit 04a74ce107)
2024-04-03 09:25:06 -05:00
Flavio Ceolin
b6391efe58 intel_adsp/ace: power: Lock interruption when power gate fails
In case the core is not power gated, waiti will restore intlevel. In
this case we lock interrupts after it.

In the bug scenario, the host starts streaming and via SOF APIs, keeps a
lock to prevent Zephyr from entering PM_STATE_RUNTIME_IDLE. During the
test case, host removes this block and core0 is allowed to enter IDLE
state.

When core0 enters the power gated state, interrupts are left enabled (so
the core can be woken up when something happens). This leaves a race
where a suitably timed interrupt will actually block entry to the power
gated state and k_cpu_idle() in power_gate_entry() will return. This is rare,
but happens often enough that the relatively short test plan run on SOF
pull-requests will trigger this case.

Fixes #69807

Signed-off-by: Flavio Ceolin <flavio.ceolin@intel.com>
Signed-off-by: Anas Nashif <anas.nashif@intel.com>
(cherry picked from commit 07426a800c)
2024-03-29 09:22:24 +01:00
Fin Maaß
c3ff6548f3 mgmt: hawkbit: remove hb_context.status_buffer_size
remove hb_context.status_buffer_size and replace it with
sizeof(hb_context.status_buffer), because hb_context.status_buffer_size
is never set.

Signed-off-by: Fin Maaß <f.maass@vogl-electronic.com>
(cherry picked from commit 1bea938c9f)
2024-03-29 09:22:10 +01:00
Fabio Baltieri
83466c2ac1 input: gpio_keys: fix suspend race condition
Change the suspend/resume code to ensure that the interrupts are disabled
before changing the pin configuration. The current sequence has been
reported to cause spurious readouts on some platforms. This change takes
the existing code and duplicates it for the suspend and resume cases, but
swaps the interrupt disable and pin configure steps for the suspend case.

Signed-off-by: Fabio Baltieri <fabiobaltieri@google.com>
(cherry picked from commit d0a8c4158c)
2024-03-18 23:55:26 +01:00
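
Illustrative ordering for the suspend path (the gpio_*_dt calls are the real Zephyr GPIO API; the gpio_dt_spec variable is an assumption):

    /* Suspend: disable the interrupt first, then reconfigure the pin, so the
     * reconfiguration cannot fire a spurious callback.
     */
    gpio_pin_interrupt_configure_dt(&key->gpio, GPIO_INT_DISABLE);
    gpio_pin_configure_dt(&key->gpio, GPIO_DISCONNECTED);
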
Daniel DeGrasse
78aaf1d527 drivers: mipi_dbi: mipi_dbi_spi: do not take spinlock
Taking a spinlock will result in interrupts being blocked in the MIPI
DBI driver, which is not desired behavior while issuing SPI transfers,
since the driver may use interrupts to drive the transfer

Fixes #68815

Signed-off-by: Daniel DeGrasse <daniel.degrasse@nxp.com>
(cherry picked from commit 5b6fadc10d)
2024-03-18 22:11:30 +01:00
Sylvio Alves
b3e4eef8b5 hal_espressif: update to include bugfixes
Added longjmp patch.
Fixes build warning in phy driver.
Fixes runtime missing rom function.
Fixes missing mcuboot assertion implementation.
Added function to retrieve uart port num

Signed-off-by: Sylvio Alves <sylvio.alves@espressif.com>
(cherry picked from commit f7eac8ab8d)
2024-03-17 12:55:25 +01:00
Yasushi SHOJI
a25087f9e5 drivers: sensor: ams_as5600: Fix calculation of fractional part
commit 98903d48c3 upstream.

The original calculation has two bugs. One is the calculated value, and the
other is that the value is not in one-millionth parts.

What the original calculation does is compute a scaled position value by
multiplying the raw sensor value (`dev_data->position`) by
`AS5600_FULL_ANGLE`, which represents a full rotation in degrees. It then
subtracts the product of the whole number of pulses (`val->val1`) and
`AS5600_PULSES_PER_REV` from this scaled position value.

    ((int32_t)dev_data->position * AS5600_FULL_ANGLE)
    - (val->val1 * AS5600_PULSES_PER_REV);

What you actually need is to extract the fractional part of the value by
taking the modulo of AS5600_PULSES_PER_REV from the scaled value of the
position.

   (((int32_t)dev_data->position * AS5600_FULL_ANGLE)
   % AS5600_PULSES_PER_REV)

Then convert the value to one-millionth part.

   * (AS5600_MILLION_UNIT / AS5600_PULSES_PER_REV);

Signed-off-by: Yasushi SHOJI <yashi@spacecubics.com>
2024-03-17 12:54:44 +01:00
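
Putting those pieces together, the corrected fractional part would read roughly as follows (using the same constants as above):

    val->val2 = (((int32_t)dev_data->position * AS5600_FULL_ANGLE)
                 % AS5600_PULSES_PER_REV)
                * (AS5600_MILLION_UNIT / AS5600_PULSES_PER_REV);
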
Alberto Escolar Piedras
492eeeac31 drivers/counter nrfx: Fix with DT instance not matching device instance
478530ec0a introduced a bug
where, if the DT index used while iterating the DT structure
initialization does not match the actual peripheral instance,
or if the device instance string is not just a simple integer
but a more complex string like "00" or "02", either
the wrong peripheral address would be used or the file
would fail to compile.

Let's fix this by reverting that change and instead, for
simulation, converting the hardcoded DT/real HW address
to the valid simulation address on the fly.

Signed-off-by: Alberto Escolar Piedras <alberto.escolar.piedras@nordicsemi.no>
(cherry picked from commit f9d5e8458c)
2024-03-14 09:47:45 +01:00
Alberto Escolar Piedras
72113c183b manifest: Update nrf hw models to latest
* Update the HW models module to
3925b7030736f25f45ceedc3621219125a2d4685

Including the following:
* 3925b70: Add new API to convert real peripheral addr to simulated one
* 319e3eb: nhw_convert_periph_base_addr: Fix include for nrf5340

Signed-off-by: Alberto Escolar Piedras <alberto.escolar.piedras@nordicsemi.no>
(cherry picked from commit f6435e012e)
2024-03-14 09:47:45 +01:00
Peter Mitsis
cd21e04ec5 kernel: Update k_wakeup()
This commit does two things to k_wakeup():

1. It locks the scheduler before marking the thread as not suspended.
As the clearing of the _THREAD_SUSPENDED bit is not atomic, this
helps ensure that neither another thread nor ISR interrupts this
action (resulting in a corrupted thread_state).

2. The call to flag_ipi() has been removed as it is already being
made within ready_thread().

Signed-off-by: Peter Mitsis <peter.mitsis@intel.com>
(cherry picked from commit 51ae993c12)
2024-03-12 16:02:04 +01:00
Jamie McCrae
fe92cf677d scripts: snippets: Fix path output on windows
Fixes an issue with paths being output in Windows style with
backslashes, which causes issues for certain escape sequences when
cmake interprets them. Replace these paths with POSIX paths so
that they are not treated as possible escape sequences.

Signed-off-by: Jamie McCrae <jamie.mccrae@nordicsemi.no>
(cherry picked from commit b680a6ec72)
2024-03-12 14:56:55 +01:00
Guillaume Gautier
517880c664 samples: boards: stm32: pm: s2ram: enable standby mode
Add Standby mode to the overlay since here we want to test Suspend to RAM.

Signed-off-by: Guillaume Gautier <guillaume.gautier-ext@st.com>
(cherry picked from commit cfa7e38378)
2024-03-11 15:09:06 +01:00
Guillaume Gautier
b0bf5d3e09 dts: arm: st: wba: do not enable standby pm mode
Do not enable Standby low-power mode by default since the associated
Kconfig PM_S2RAM is disabled by default. Otherwise we could enter an
unsupported low-power state.

Signed-off-by: Guillaume Gautier <guillaume.gautier-ext@st.com>
(cherry picked from commit f1bfbb0627)
2024-03-11 15:09:06 +01:00
Yuval Peress
be48697fa2 it82xx2: Add missing ISRs for gpioj
Without this we can't take advantage of pins 6 and 7.

Fixes #69503

Signed-off-by: Yuval Peress <peress@google.com>
(cherry picked from commit 375aa90c09)
2024-03-08 19:01:46 +01:00
Robert Lubos
658a52be20 tests: net: sockets: tls: Add test verifying send() after close
Add test case verifying that send() returns an error when called after
TLS session has been closed.

Signed-off-by: Robert Lubos <robert.lubos@nordicsemi.no>
(cherry picked from commit b920793e9b)
2024-03-08 19:01:30 +01:00
Robert Lubos
0383b7fb1c net: sockets: tls: Return an error on send() after session is closed
It was an oversight to return 0 from a TLS send() call after detecting
that the TLS session has been closed by the peer; such behavior is only
valid for recv(). Instead, an error should be returned.

Signed-off-by: Robert Lubos <robert.lubos@nordicsemi.no>
(cherry picked from commit dc52b20705)
2024-03-08 19:01:30 +01:00
Maochen Wang
169020a824 net: zperf: Fix TOS option not working in zperf
When the zperf command is called with the '-S' option, which means IP_TOS
for IPv4 and IPV6_TCLASS for IPv6, an error is printed and the
setting does not work. The socket option handling was changed by
commit 77e522a5a243 ('net: context: Refactor option setters'), but the
callers of the option setters were not changed. This causes setting the
IP_TOS or IPV6_TCLASS option to fail. The fix is to use uint8_t to
store the value of the -S option.

Signed-off-by: Maochen Wang <maochen.wang@nxp.com>
(cherry picked from commit e7444dcf42)
2024-03-08 19:01:10 +01:00
Laurentiu Mihalcea
da2a9af2f2 tests: devicetree: check L2 interrupt encoding
Add some assert statements meant to check if L2 interrupts
are encoded right when dealing with nodes that consume interrupts
from multiple aggregators. For this to work, also add another
interrupt controller node which extends a different L1 interrupt
from `test_intc`.

Signed-off-by: Laurentiu Mihalcea <laurentiu.mihalcea@nxp.com>
(cherry picked from commit 8fec5f361c)
2024-03-08 09:34:09 +01:00
Laurentiu Mihalcea
634b41780e devicetree.h: switch to DT_IRQ_INTC_BY_IDX for L2 and L3 INTID encodings
Using `DT_IRQ_INTC()` to fetch the interrupt controller associated
with a node works well for nodes which consume interrupts from a
single aggregator. However, when specifying multiple (and different)
interrupt aggregators via the `interrupts-extended` property,
the L2 and L3 interrupts will no longer be encoded properly. This
is because `DT_IRQ_INTC(node_id)` uses `DT_IRQ_INTC_BY_IDX(node_id, 0)`
so all the interrupts will use the first aggregator as their parent.
To fix this, switch from using `DT_IRQ_INTC()` to `DT_IRQ_INTC_BY_IDX()`.

Signed-off-by: Laurentiu Mihalcea <laurentiu.mihalcea@nxp.com>
(cherry picked from commit 3799e0f498)
2024-03-08 09:34:09 +01:00
Marcin Gasiorek
a8af8ae2eb net: ip: Fix for improper offset return by net_pkt_find_offset()
The original packet's link-layer destination and source address can be
stored in separately allocated memory. This allocated memory can be
placed just after the pkt data buffers.
When net_pkt_find_offset() uses the condition
`if (buf->data <= ptr && ptr <= (buf->data + buf->len)) {`
the offset is set outside the packet's buffer and the function returns
an incorrect offset instead of an error code.
Finally, the offset is used to set the ll address in the cloned packet,
and this can have unexpected behavior (e.g. a crash when the cursor is
set to empty memory).

Signed-off-by: Marcin Gasiorek <marcin.gasiorek@nordicsemi.no>
(cherry picked from commit fb99f65fe9)
2024-03-07 09:15:30 +01:00
Mathieu Choplain
f9576a7713 drivers: can: add missing argument to LOG_ERR call
PR #64399 introduced checks for out-of-bounds filter IDs
in CAN drivers, along with logging of said IDs; however,
the call to LOG_ERR in the native POSIX/Linux driver is
missing the 'filter_id' argument.

This commit adds the missing argument to ensure proper
data is printed when the LOG_ERR call is performed.

Signed-off-by: Mathieu Choplain <mathieu.choplain@st.com>
(cherry picked from commit 6ff47b15db)
2024-03-05 18:38:24 +01:00
Sylvio Alves
b90726cd10 drivers: dac: esp32: fix clock control subsys argument
Currently cfg->clock_subsys is passed as an address, which is
causing a driver assertion.

Fixes #69198

Signed-off-by: Sylvio Alves <sylvio.alves@espressif.com>
(cherry picked from commit a79c54dc43)
2024-03-05 18:38:07 +01:00
Jukka Rissanen
3a7d43abba net: sockets: Do not start service thread if too little resources
If the CONFIG_NET_SOCKETS_POLL_MAX is smaller than what is needed
for the socket service API to work properly, then we should not
start the service thread as the service API cannot work and might
cause a memory overwrite in the ctx.events[] array.

Fixes #69233

Signed-off-by: Jukka Rissanen <jukka.rissanen@nordicsemi.no>
(cherry picked from commit ea189d5aee)
2024-03-05 18:37:51 +01:00
Pisit Sawangvonganan
8710af1838 wifi: shell: removed NULL check to net_mgmt callback
Since PR, PR_SHELL, PR_ERROR, PR_INFO, and PR_WARNING already have
an embedded `sh` NULL check, we can remove the change from PR #68809.

Signed-off-by: Pisit Sawangvonganan <pisit@ndrsolution.com>
(cherry picked from commit b6f51edd6c)
2024-03-05 18:37:24 +01:00
Pisit Sawangvonganan
720b33ebf3 net: shell: ensure the shell sh is valid before call shell_printf
It is possible that the `sh` was not set before use.
This change adds a NULL check for `sh` in the following macros:
PR, PR_SHELL, PR_ERROR, PR_INFO, and PR_WARNING.
In case `sh` is NULL, the above macros will call `printk` instead.

Fixes #68793

Signed-off-by: Pisit Sawangvonganan <pisit@ndrsolution.com>
(cherry picked from commit 7b8a9e1818)
2024-03-05 18:37:24 +01:00
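
A hedged sketch of the macro pattern described (shell_fprintf() and printk() are real APIs; the macro body is illustrative, not the exact net shell code):

    #define PR(fmt, ...)                                                      \
            do {                                                              \
                    if (sh != NULL) {                                         \
                            shell_fprintf(sh, SHELL_NORMAL, fmt, ##__VA_ARGS__); \
                    } else {                                                  \
                            printk(fmt, ##__VA_ARGS__);                       \
                    }                                                         \
            } while (0)
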
53699 changed files with 838871 additions and 2770834 deletions


@@ -5,6 +5,7 @@
--min-conf-desc-length=1
--typedefsfile=scripts/checkpatch/typedefsfile
--ignore BRACES
--ignore PRINTK_WITHOUT_KERN_LEVEL
--ignore SPLIT_STRING
--ignore VOLATILE
@@ -28,5 +29,3 @@
--ignore MULTISTATEMENT_MACRO_USE_DO_WHILE
--ignore ENOSYS
--ignore IS_ENABLED_CONFIG
--ignore EXPORT_SYMBOL
--ignore COMPARISON_TO_NULL

View File

@@ -32,21 +32,17 @@ ColumnLimit: 100
ConstructorInitializerIndentWidth: 8
ContinuationIndentWidth: 8
ForEachMacros:
- 'ARRAY_FOR_EACH'
- 'ARRAY_FOR_EACH_PTR'
- 'FOR_EACH'
- 'FOR_EACH_FIXED_ARG'
- 'FOR_EACH_IDX'
- 'FOR_EACH_IDX_FIXED_ARG'
- 'FOR_EACH_NONEMPTY_TERM'
- 'FOR_EACH_FIXED_ARG_NONEMPTY_TERM'
- 'RB_FOR_EACH'
- 'RB_FOR_EACH_CONTAINER'
- 'SYS_DLIST_FOR_EACH_CONTAINER'
- 'SYS_DLIST_FOR_EACH_CONTAINER_SAFE'
- 'SYS_DLIST_FOR_EACH_NODE'
- 'SYS_DLIST_FOR_EACH_NODE_SAFE'
- 'SYS_SEM_LOCK'
- 'SYS_SFLIST_FOR_EACH_CONTAINER'
- 'SYS_SFLIST_FOR_EACH_CONTAINER_SAFE'
- 'SYS_SFLIST_FOR_EACH_NODE'
@@ -70,19 +66,8 @@ ForEachMacros:
- 'Z_GENLIST_FOR_EACH_NODE'
- 'Z_GENLIST_FOR_EACH_NODE_SAFE'
- 'STRUCT_SECTION_FOREACH'
- 'STRUCT_SECTION_FOREACH_ALTERNATE'
- 'TYPE_SECTION_FOREACH'
- 'K_SPINLOCK'
- 'COAP_RESOURCE_FOREACH'
- 'COAP_SERVICE_FOREACH'
- 'COAP_SERVICE_FOREACH_RESOURCE'
- 'HTTP_RESOURCE_FOREACH'
- 'HTTP_SERVER_CONTENT_TYPE_FOREACH'
- 'HTTP_SERVICE_FOREACH'
- 'HTTP_SERVICE_FOREACH_RESOURCE'
- 'I3C_BUS_FOR_EACH_I3CDEV'
- 'I3C_BUS_FOR_EACH_I2CDEV'
- 'MIN_HEAP_FOREACH'
IfMacros:
- 'CHECKIF'
# Disabled for now, see bug https://github.com/zephyrproject-rtos/zephyr/issues/48520
@@ -97,20 +82,11 @@ IncludeCategories:
- Regex: '.*'
Priority: 3
IndentCaseLabels: false
IndentGotoLabels: false
IndentWidth: 8
InsertBraces: true
InsertNewlineAtEOF: true
SpaceBeforeInheritanceColon: False
SpaceBeforeParens: ControlStatementsExceptControlMacros
SortIncludes: Never
UseTab: ForContinuationAndIndentation
WhitespaceSensitiveMacros:
- COND_CODE_0
- COND_CODE_1
- IF_DISABLED
- IF_ENABLED
- LISTIFY
- STRINGIFY
- Z_STRINGIFY
- DT_FOREACH_PROP_ELEM_SEP

View File

@@ -1,21 +0,0 @@
# SPDX-License-Identifier: Apache-2.0
#
# Copyright (c) 2024, Basalte bv
analyzer:
# Start by disabling all
- --disable-all
# Enable the sensitive profile
- --enable=sensitive
# Disable unused cases
- --disable=boost
- --disable=mpi
# Many identifiers in zephyr start with _
- --disable=clang-diagnostic-reserved-identifier
- --disable=clang-diagnostic-reserved-macro-identifier
# Cleanup
- --clean

View File

@@ -86,10 +86,6 @@ indent_size = 8
[COMMIT_EDITMSG]
max_line_length = 75
# Patches
[{*.patch,*.diff}]
trim_trailing_whitespace = false
# Kconfig
[Kconfig*]
indent_style = tab

View File

@@ -0,0 +1,57 @@
---
name: Bug report
about: Create a report to help us improve Zephyr
title: ''
labels: bug
assignees: ''
---
**Notes (delete this)**
Github Discussions (https://github.com/zephyrproject-rtos/zephyr/discussions)
are available to first verify that the issue is a genuine Zephyr bug and not a
consequence of Zephyr services misuse.
This issue list is only for bugs in the main Zephyr code base
(https://github.com/zephyrproject-rtos/zephyr/). If the bug is for a project
fork (such as NCS) specific feature, please open an issue in the fork project
instead.
**Describe the bug**
A clear and concise description of what the bug is.
Please also mention any information which could help others to understand
the problem you're facing:
- What target platform are you using?
- What have you tried to diagnose or workaround this issue?
- Is this a regression? If yes, have you been able to "git bisect" it to a
specific commit?
- ...
**To Reproduce**
Steps to reproduce the behavior:
1. mkdir build; cd build
2. cmake -DBOARD=board\_xyz
3. make
4. See error
**Expected behavior**
A clear and concise description of what you expected to happen.
**Impact**
What impact does this issue have on your progress (e.g., annoyance, showstopper)
**Logs and console output**
If applicable, add console logs or other types of debug information
e.g Wireshark capture or Logic analyzer capture (upload in zip archive).
copy-and-paste text and put a code fence (\`\`\`) before and after, to help
explain the issue. (if unable to obtain text log, add a screenshot)
**Environment (please complete the following information):**
- OS: (e.g. Linux, MacOS, Windows)
- Toolchain (e.g Zephyr SDK, ...)
- Commit SHA or Version used
**Additional context**
Add any other context that could be relevant to your issue, such as pin setting,
target configuration, ...

View File

@@ -1,85 +0,0 @@
name: Bug Report
description: File a bug report.
labels: ["bug"]
type: "Bug"
assignees: []
body:
- type: markdown
attributes:
value: |
Thanks for taking the time to fill out this bug report!
- type: textarea
id: what-happened
attributes:
label: Describe the bug
description: |
A clear and concise description of what the bug is.
placeholder: |
Please also mention any information which could help others to understand
the problem you're facing:
- What target platform are you using?
- What have you tried to diagnose or workaround this issue?
- Is this a regression? If yes, have you been able to "git bisect" it to a
specific commit?
validations:
required: true
- type: checkboxes
id: regression
attributes:
label: Regression
description: |
Check this box if this is a regression and provide a SHA if you were able to "git bisect" to a specific commit.
options:
- label: This is a regression.
required: false
- type: textarea
id: reproduce
attributes:
label: Steps to reproduce
description: |
Steps to reproduce the behavior.
placeholder: |
Steps to reproduce the behavior:
1. mkdir build; cd build
2. cmake -DBOARD=board\_xyz
3. make
4. See error
validations:
required: false
- type: textarea
id: logs
attributes:
label: Relevant log output
description: Please copy and paste any relevant log output. This will be automatically formatted into code, so no need for backticks.
render: shell
- type: dropdown
attributes:
label: Impact
description: Impact of this bug
multiple: false
options:
- Showstopper - Prevents release or major functionality; system unusable.
- Major - Severely degrades functionality; workaround is difficult or unavailable.
- Functional Limitation - Some features not working as expected, but system usable.
- Annoyance - Minor irritation; no significant impact on usability or functionality.
- Intermittent - Occurs occasionally; hard to reproduce.
- Not sure
default: 3
validations:
required: true
- type: textarea
id: env
attributes:
label: Environment
description: please complete the following information
placeholder: |
- OS: (e.g. Linux, MacOS, Windows)
- Toolchain (e.g Zephyr SDK, ...)
- Commit SHA or Version used
- type: textarea
id: context
attributes:
label: Additional Context
description: Provide other context that could be relevant to the bug, such as pin setting, target configuration, etc.

View File

@@ -0,0 +1,20 @@
---
name: Enhancement
about: Suggest enhancements to existing features
title: ''
labels: Enhancement
assignees: ''
---
**Is your enhancement proposal related to a problem? Please describe.**
A clear and concise description of what the problem is.
**Describe the solution you'd like**
A clear and concise description of what you want to happen.
**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.
**Additional context**
Add any other context or graphics (drag-and-drop an image) about the feature request here.

View File

@@ -1,42 +0,0 @@
name: Enhancement
description: Submit an Enhancement
labels: ["Enhancement"]
type: "Enhancement"
assignees: []
body:
- type: markdown
attributes:
value: |
Thanks for taking the time to fill out this enhancement proposal.
- type: textarea
id: description
attributes:
label: Summary
description: |
Is your enhancement proposal related to a problem? Please describe.
placeholder: |
A clear and concise description of what the problem is.
validations:
required: true
- type: textarea
id: solution
attributes:
label: Describe the solution you'd like
description: |
Describe the solution you'd like
placeholder: |
A clear and concise description of what you want to happen.
validations:
required: true
- type: textarea
id: alternatives
attributes:
label: Alternatives
description: Describe alternatives you've considered
placeholder: |
A clear and concise description of any alternative solutions or features you've considered.
- type: textarea
id: context
attributes:
label: Additional Context
description: Add any other context or graphics (drag-and-drop an image) about the enhancement here.

View File

@@ -0,0 +1,51 @@
---
name: RFC / Proposal
about: Submit an RFC / Proposal
title: ''
labels: RFC
assignees: ''
---
## Introduction
This section targets end users, TSC members, maintainers and anyone else that might
need a quick explanation of your proposed change.
### Problem description
Why do we want this change and what problem are we trying to address?
### Proposed change
A brief summary of the proposed change - the 10,000 ft view on what it will
change once this change is implemented.
## Detailed RFC
In this section of the document the target audience is the dev team. Upon
reading this section each engineer should have a rather clear picture of what
needs to be done in order to implement the described feature.
### Proposed change (Detailed)
This section is freeform - you should describe your change in as much detail
as possible. Please also ensure to include any context or background info here.
For example, do we have existing components which can be reused or altered.
By reading this section, each team member should be able to know what exactly
you're planning to change and how.
### Dependencies
Highlight how the change may affect the rest of the project (new components,
modifications in other areas), or other teams/projects.
### Concerns and Unresolved Questions
List any concerns, unknowns, and generally unresolved questions etc.
## Alternatives
List any alternatives considered, and the reasons for choosing this option
over them.

View File

@@ -1,75 +0,0 @@
name: RFC / Proposal
description: Submit a Proposal (RFC)
labels: ["RFC"]
type: RFC
assignees: []
body:
- type: markdown
attributes:
value: |
## Introduction
This section targets end users, TSC members, maintainers and anyone else
that might need a quick explanation of your proposed change.
- type: textarea
id: problem-description
attributes:
label: Problem Description
description: Why do we want this change and what problem are we trying to address?
placeholder: Explain the problem or limitation this RFC is meant to resolve.
validations:
required: true
- type: textarea
id: proposed-change-summary
attributes:
label: Proposed Change (Summary)
description: A high-level summary of the proposed change.
placeholder: Brief summary of what will change if this RFC is implemented.
validations:
required: true
- type: markdown
attributes:
value: |
## Detailed RFC
This section targets the development team. Upon reading it, each engineer
should understand what must be done to implement the proposed feature.
- type: textarea
id: detailed-change
attributes:
label: Proposed Change (Detailed)
description: Describe the change in as much detail as possible. Include context or background info, and reuse of existing components if applicable.
placeholder: Explain exactly what you're planning to change and how.
validations:
required: true
- type: textarea
id: dependencies
attributes:
label: Dependencies
description: Highlight how this change may affect the rest of the project or other teams/components.
placeholder: List components, modules, or teams affected.
validations:
required: false
- type: textarea
id: concerns
attributes:
label: Concerns and Unresolved Questions
description: List any concerns, unknowns, or unresolved questions related to this proposal.
placeholder: Any areas of uncertainty?
validations:
required: false
- type: textarea
id: alternatives
attributes:
label: Alternatives Considered
description: What alternative solutions were considered? Why was this proposal chosen?
placeholder: List alternatives and explain the rationale behind your choice.
validations:
required: false

View File

@@ -0,0 +1,20 @@
---
name: Feature request
about: Suggest an idea for this project
title: ''
labels: Feature Request
assignees: ''
---
**Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is.
**Describe the solution you'd like**
A clear and concise description of what you want to happen.
**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.
**Additional context**
Add any other context or graphics (drag-and-drop an image) about the feature request here.

View File

@@ -1,29 +0,0 @@
name: Feature Request
description: Suggest a new feature or enhancement
labels: ["Feature Request"]
type: Feature
assignees: []
body:
- type: textarea
id: problem
attributes:
label: Is your feature request related to a problem? Please describe.
description: A clear and concise description of what the problem is.
placeholder: e.g., I'm frustrated when I need to do X manually because Y is missing.
validations:
required: true
- type: textarea
id: solution
attributes:
label: Describe the solution you'd like
description: A clear and concise description of what you want to happen.
placeholder: e.g., It would be great if the system could automatically handle X by doing Y.
validations:
required: true
- type: textarea
id: alternatives
attributes:
label: Describe alternatives you've considered
description: Include any alternative solutions or features

View File

@@ -0,0 +1,42 @@
---
name: Contributor Nomination
about: Nominate a GitHub user for additional rights on the Zephyr Project
title: ''
labels: Role Nomination
assignees: ''
---
# Background
The [TSC Project Roles] defines the main roles for the Zephyr Project, including
Maintainer, Collaborator, and Contributor.
By default anyone that contributes code or documentation is a Contributor, but
with the lowest [GitHub Permission Level] of Read. For example, Contributors
with Read permission do not have the permission to add reviewers to a pull
request.
Use this template to nominate a GitHub user for the Contributor role with
Triage permission level, which allows the user to add reviewers to a pull
request and be added as a reviewer by other users.
# Nomination
## GitHub User
Provide the following information about the GitHub user:
1. Full Name
1. GitHub username
1. Organization (optional)
## Supporting Documents
Add links to 3-5 GitHub pull requests, in the Zephyr project, authored or
reviewed by the GitHub user that demonstrate the user's dedication to the
Zephyr project.
[TSC Project Roles]: <https://docs.zephyrproject.org/latest/project/project_roles.html>
[GitHub Permission Level]: <https://docs.github.com/en/organizations/managing-access-to-your-organizations-repositories/repository-roles-for-an-organization>

View File

@@ -1,57 +0,0 @@
name: Contributor Nomination
description: Nominate a GitHub user for the Contributor role with triage permissions
labels: [Role Nomination]
assignees: ['nashif']
body:
- type: markdown
attributes:
value: |
## Background
The [TSC Project Roles](https://docs.zephyrproject.org/latest/project/project_roles.html) defines the main roles for the Zephyr Project, including Maintainer, Collaborator, and Contributor.
By default, anyone who contributes code or documentation is a Contributor, but with the lowest [GitHub Permission Level](https://docs.github.com/en/organizations/managing-access-to-your-organizations-repositories/repository-roles-for-an-organization) of **Read**.
Use this form to nominate a user for the **Contributor** role with **Triage** permission, which allows the user to:
- Add reviewers to pull requests
- Be added as a reviewer by others
- type: input
id: full-name
attributes:
label: Full Name
description: Full name of the nominated contributor.
placeholder: e.g., Jane Doe
validations:
required: true
- type: input
id: github-username
attributes:
label: GitHub Username
description: GitHub handle of the nominated contributor.
placeholder: e.g., @janedoe
validations:
required: true
- type: input
id: organization
attributes:
label: Organization
description: Organization the nominee is affiliated with (optional).
placeholder: e.g., Acme Corp
validations:
required: false
- type: textarea
id: supporting-documents
attributes:
label: Supporting Documents
description: Provide links to 3-5 pull requests authored or reviewed by the nominee that demonstrate their dedication to the Zephyr project.
placeholder: |
e.g.,
- https://github.com/zephyrproject-rtos/zephyr/pull/12345
- https://github.com/zephyrproject-rtos/zephyr/pull/23456
- https://github.com/zephyrproject-rtos/zephyr/pull/34567
validations:
required: true

View File

@@ -0,0 +1,68 @@
---
name: External Source Code
about: Submit a proposal to integrate external source code
title: ''
labels: TSC
assignees: ''
---
## Origin
Name of project hosting the original open source code
Provide a link to the source
## Purpose
Brief description of what this software does
## Mode of integration
Describe whether you'd like to integrate this external component in the main tree
or as a module, and why. If the mode of integration is a module, suggest a
repository name for the module
## Maintainership
List the person(s) that will be maintaining the integration of this external code
for the foreseeable future. Please use GitHub IDs to identify them. You can
choose to identify a single maintainer only or add collaborators as well
## Pull Request
Pull request (if any) with the actual implementation of the integration, be it
in the main tree or as a module (pointing to your own fork for now). Make sure
the PR is correctly labeled as "DNM"
## Description
Long description that will help reviewers discuss suitability of the
component to solve the problem at hand (there may be better options
available.)
What is its primary functionality (e.g., SQLite is a lightweight
database)?
What problem are you trying to solve? (e.g., a state store is
required to maintain ...)
Why is this the right component to solve it (e.g., SQLite is small,
easy to use, and has a very liberal license.)
## Dependencies
What other components does this package depend on?
Will the Zephyr project have a direct dependency on the component, or
will it be included via an abstraction layer with this component as a
replaceable implementation?
## Revision
Version or SHA you would like to integrate initially
## License
Please use an SPDX identifier (https://spdx.org/licenses/), such as
``BSD-3-Clause``

View File

@@ -1,106 +0,0 @@
name: External Component Integration
description: Propose integration of an external open source component
labels: ["TSC"]
assignees: []
body:
- type: textarea
id: origin
attributes:
label: Origin
description: Name of project hosting the original open source code. Provide a link to the source.
placeholder: e.g., SQLite - https://sqlite.org
validations:
required: true
- type: textarea
id: purpose
attributes:
label: Purpose
description: Brief description of what this software does.
placeholder: |
e.g., A small, fast, self-contained SQL database engine.
validations:
required: true
- type: textarea
id: integration-mode
attributes:
label: Mode of Integration
description: Should this be integrated in the main tree or as a module? Explain your choice and suggest a module repo name if applicable.
placeholder: |
e.g., As a module - proposed repo name: zephyr-sqlite
validations:
required: true
- type: textarea
id: maintainership
attributes:
label: Maintainership
description: List maintainers (GitHub IDs) for this integration. Include at least one primary maintainer.
placeholder: |
e.g., @username1 (primary), @username2 (collaborator)
validations:
required: true
- type: input
id: pull-request
attributes:
label: Pull Request
description: Link to the pull request (if any) for this integration. Must be labeled "DNM" (Do Not Merge).
placeholder: |
e.g., https://github.com/zephyrproject-rtos/zephyr/pull/12345
validations:
required: false
- type: textarea
id: description
attributes:
label: Description
description: Long-form description to justify suitability of this component.
placeholder: |
- What is its primary functionality?
- What problem does it solve?
- Why is this the right component?
validations:
required: true
- type: textarea
id: security
attributes:
label: Security
description: Security-related aspects of this component, including cryptographic functions and known vulnerabilities.
placeholder: |
- Does it use cryptography?
- How are vulnerabilities handled?
- Any known CVEs?
validations:
required: false
- type: textarea
id: dependencies
attributes:
label: Dependencies
description: What does this component depend on, and how will it be integrated (directly or via abstraction)?
placeholder: |
- Other external packages?
- Direct or abstracted use in Zephyr?
validations:
required: false
- type: input
id: revision
attributes:
label: Version or SHA
description: Which version or specific commit should be initially integrated?
placeholder: e.g., v3.45.0 or 79cc94d
validations:
required: true
- type: input
id: license
attributes:
label: License (SPDX)
description: Provide the license using a valid SPDX identifier (e.g., BSD-3-Clause).
placeholder: e.g., MIT or BSD-3-Clause
validations:
required: true

.github/ISSUE_TEMPLATE/008_bin-blobs.md
View File

@@ -0,0 +1,41 @@
---
name: Binary blobs
about: Submit a proposal to integrate binary blob(s)
title: ''
labels: TSC
assignees: ''
---
## Origin
Describe where the binary blob(s) originate from
## Type
- [ ] Precompiled library
- [ ] Firmware image
## Module
The Zephyr module that this blob(s) will be referenced from
## Purpose
Brief description of what the blob(s) do. It is especially important to describe
the functionality that the blob(s) provide, to the largest extent possible
## Pull Request
Link to the Pull request with the actual implementation of the integration. If
you are submitting this as part of a new module you may link to the same Pull
Request that introduced the new module.
## Dependencies
What other components do the blob(s) depend on, if any?
## License
Document the license the blob(s) are distributed under

View File

@@ -1,5 +0,0 @@
blank_issues_enabled: false
contact_links:
- name: Zephyr Community Support
url: https://github.com/zephyrproject-rtos/zephyr/discussions
about: Please ask and answer questions here.

.github/SECURITY.md
View File

@@ -8,12 +8,12 @@ updates:
- The most recent release, and the release prior to that.
- Active LTS releases.
-At this time, with the latest release of v4.3, the supported
+At this time, with the latest release of v3.5, the supported
versions are:
-- v4.3: Current release
-- v4.2: Prior release
-- v3.7: Current LTS
+- v2.7: Current LTS
+- v3.4: Prior release
+- v3.5: Current release
## Reporting process

View File

@@ -1,2 +0,0 @@
paths:
- .github

View File

@@ -1,2 +0,0 @@
paths:
- doc

View File

@@ -1,26 +0,0 @@
version: 2
enable-beta-ecosystems: true
updates:
- package-ecosystem: "github-actions"
directory: "/"
schedule:
interval: "weekly"
commit-message:
prefix: "ci: github: "
labels: []
groups:
actions-deps:
patterns:
- "*"
- package-ecosystem: "uv"
directory: "/doc"
schedule:
interval: "weekly"
commit-message:
prefix: "ci: doc: "
labels: []
groups:
doc-deps:
patterns:
- "*"

View File

@@ -1,88 +0,0 @@
name: Pull Request/Issue Assigner
on:
pull_request_target:
types:
- opened
- synchronize
- reopened
- ready_for_review
branches:
- main
- collab-*
- v*-branch
issues:
types:
- labeled
permissions:
contents: read
jobs:
assignment:
name: Pull Request/Issue Assignment
if: github.event.pull_request.draft == false
runs-on: ubuntu-24.04
permissions:
issues: write # to add assignees to issues
steps:
- name: Check out source code
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
fetch-depth: 0
persist-credentials: false
- name: Set up Python
uses: zephyrproject-rtos/action-python-env@32e53bef090c33d53aa94f1d9a9d29c93cfdc5f7 # main
with:
python-version: 3.12
- name: Fetch west.yml/Maintainer.yml from pull request
if: >
github.event_name == 'pull_request_target' && github.base_ref == 'main'
run: |
git fetch origin pull/${{ github.event.pull_request.number }}/merge
git show FETCH_HEAD:west.yml > pr_west.yml
git show FETCH_HEAD:MAINTAINERS.yml > pr_MAINTAINERS.yml
- name: west setup
if: >
github.event_name == 'pull_request_target'
run: |
git config --global user.email "you@example.com"
git config --global user.name "Your Name"
west init -l . || true
- name: Run assignment script
env:
GITHUB_TOKEN: ${{ secrets.ZB_PR_ASSIGNER_GITHUB_TOKEN }}
run: |
FLAGS="-v"
FLAGS+=" -o ${{ github.event.repository.owner.login }}"
FLAGS+=" -r ${{ github.event.repository.name }}"
FLAGS+=" -M MAINTAINERS.yml"
if [ "${{ github.event_name }}" = "pull_request_target" ]; then
if [ "${{ github.base_ref }}" != "main" ]; then
FLAGS+=" -P ${{ github.event.pull_request.number }} --updated-manifest pr_west.yml --updated-maintainer-file pr_MAINTAINERS.yml"
else
FLAGS+=" -P ${{ github.event.pull_request.number }}"
fi
elif [ "${{ github.event_name }}" = "issues" ]; then
FLAGS+=" -I ${{ github.event.issue.number }}"
elif [ "${{ github.event_name }}" = "schedule" ]; then
FLAGS+=" --modules"
else
echo "Unknown event: ${{ github.event_name }}"
exit 1
fi
python3 scripts/ci/set_assignees.py $FLAGS
- name: Check maintainer file changes
if: >
github.event_name == 'pull_request_target' && github.base_ref == 'main'
env:
GITHUB_TOKEN: ${{ secrets.ZB_PR_ASSIGNER_GITHUB_TOKEN }}
run: |
python ./scripts/ci/check_maintainer_changes.py \
--repo zephyrproject-rtos/zephyr MAINTAINERS.yml pr_MAINTAINERS.yml

.github/workflows/assigner.yml.disabled
View File

@@ -0,0 +1,52 @@
name: Pull Request Assigner
on:
pull_request_target:
types:
- opened
- synchronize
- reopened
- ready_for_review
branches:
- main
- collab-*
- v*-branch
issues:
types:
- labeled
jobs:
assignment:
name: Pull Request Assignment
if: github.event.pull_request.draft == false
runs-on: ubuntu-22.04
steps:
- name: Install Python dependencies
run: |
sudo pip3 install -U setuptools wheel pip
pip3 install -U PyGithub>=1.55 west
- name: Check out source code
uses: actions/checkout@v4
- name: Run assignment script
env:
GITHUB_TOKEN: ${{ secrets.ZB_GITHUB_TOKEN }}
run: |
FLAGS="-v"
FLAGS+=" -o ${{ github.event.repository.owner.login }}"
FLAGS+=" -r ${{ github.event.repository.name }}"
FLAGS+=" -M MAINTAINERS.yml"
if [ "${{ github.event_name }}" = "pull_request_target" ]; then
FLAGS+=" -P ${{ github.event.pull_request.number }}"
elif [ "${{ github.event_name }}" = "issues" ]; then
FLAGS+=" -I ${{ github.event.issue.number }}"
elif [ "${{ github.event_name }}" = "schedule" ]; then
FLAGS+=" --modules"
else
echo "Unknown event: ${{ github.event_name }}"
exit 1
fi
python3 scripts/set_assignees.py $FLAGS

View File

@@ -1,38 +0,0 @@
name: Backport
on:
pull_request_target:
types:
- closed
- labeled
branches:
- main
permissions:
contents: read
jobs:
backport:
name: Backport
runs-on: ubuntu-24.04
permissions:
contents: write # to create/push backport branches
pull-requests: write # to create backport PRs
issues: write # to add labels to issue created if backport fails
# Only react to merged PRs for security reasons.
# See https://docs.github.com/en/actions/using-workflows/events-that-trigger-workflows#pull_request_target.
if: >
github.event.pull_request.merged &&
(
github.event.action == 'closed' ||
(
github.event.action == 'labeled' &&
contains(github.event.label.name, 'backport')
)
)
steps:
- name: Backport
uses: zephyrproject-rtos/action-backport@7e74f601d11eaca577742445e87775b5651a965f # v2.0.3-3
with:
github_token: ${{ secrets.GITHUB_TOKEN }}
issue_labels: Backport
labels_template: '["Backport"]'

.github/workflows/backport.yml.disabled
View File

@@ -0,0 +1,31 @@
name: Backport
on:
pull_request_target:
types:
- closed
- labeled
branches:
- main
jobs:
backport:
name: Backport
runs-on: ubuntu-22.04
# Only react to merged PRs for security reasons.
# See https://docs.github.com/en/actions/using-workflows/events-that-trigger-workflows#pull_request_target.
if: >
github.event.pull_request.merged &&
(
github.event.action == 'closed' ||
(
github.event.action == 'labeled' &&
contains(github.event.label.name, 'backport')
)
)
steps:
- name: Backport
uses: zephyrproject-rtos/action-backport@v2.0.3-3
with:
github_token: ${{ secrets.ZB_GITHUB_TOKEN }}
issue_labels: Backport
labels_template: '["Backport"]'

View File

@@ -1,50 +0,0 @@
name: Backport Issue Check
on:
pull_request_target:
types:
- edited
- opened
- reopened
- synchronize
branches:
- v*-branch
permissions:
contents: read
jobs:
backport:
name: Backport Issue Check
concurrency:
group: backport-issue-check-${{ github.ref }}
cancel-in-progress: true
runs-on: ubuntu-24.04
if: github.repository == 'zephyrproject-rtos/zephyr'
permissions:
issues: read # to check if associated issue exists for backport
steps:
- name: Check out source code
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
- name: Set up Python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Run backport issue checker
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: |
./scripts/release/list_backports.py \
-o ${{ github.event.repository.owner.login }} \
-r ${{ github.event.repository.name }} \
-b ${{ github.event.pull_request.base.ref }} \
-p ${{ github.event.pull_request.number }}

View File

@@ -0,0 +1,39 @@
name: Backport Issue Check
on:
pull_request_target:
types:
- edited
- opened
- reopened
- synchronize
branches:
- v*-branch
jobs:
backport:
name: Backport Issue Check
concurrency:
group: backport-issue-check-${{ github.ref }}
cancel-in-progress: true
runs-on: ubuntu-22.04
if: github.repository == 'zephyrproject-rtos/zephyr'
steps:
- name: Check out source code
uses: actions/checkout@v4
- name: Install Python dependencies
run: |
sudo pip3 install -U setuptools wheel pip
pip3 install -U pygithub
- name: Run backport issue checker
env:
GITHUB_TOKEN: ${{ secrets.ZB_GITHUB_TOKEN }}
run: |
./scripts/release/list_backports.py \
-o ${{ github.event.repository.owner.login }} \
-r ${{ github.event.repository.name }} \
-b ${{ github.event.pull_request.base.ref }} \
-p ${{ github.event.pull_request.number }}

View File

@@ -1,34 +0,0 @@
name: Publish BabbleSim Tests Results
on:
workflow_run:
workflows: ["BabbleSim Tests"]
types:
- completed
permissions:
contents: read
jobs:
bsim-test-results:
name: "Publish BabbleSim Test Results"
runs-on: ubuntu-24.04
if: github.event.workflow_run.conclusion != 'skipped'
permissions:
checks: write # to create the check run entry with test results
steps:
- name: Download artifacts
uses: dawidd6/action-download-artifact@ac66b43f0e6a346234dd65d4d0c8fbb31cb316e5 # v11
with:
run_id: ${{ github.event.workflow_run.id }}
- name: Publish BabbleSim Test Results
uses: EnricoMi/publish-unit-test-result-action@34d7c956a59aed1bfebf31df77b8de55db9bbaaf # v2.21.0
with:
check_name: BabbleSim Test Results
comment_mode: off
commit: ${{ github.event.workflow_run.head_sha }}
event_file: event/event.json
event_name: ${{ github.event.workflow_run.event }}
files: "bsim-test-results/**/bsim_results.xml"

View File

@@ -0,0 +1,28 @@
name: Publish BabbleSim Tests Results
on:
workflow_run:
workflows: ["BabbleSim Tests"]
types:
- completed
jobs:
bsim-test-results:
name: "Publish BabbleSim Test Results"
runs-on: ubuntu-22.04
if: github.event.workflow_run.conclusion != 'skipped'
steps:
- name: Download artifacts
uses: dawidd6/action-download-artifact@v2
with:
run_id: ${{ github.event.workflow_run.id }}
- name: Publish BabbleSim Test Results
uses: EnricoMi/publish-unit-test-result-action@v2
with:
check_name: BabbleSim Test Results
comment_mode: off
commit: ${{ github.event.workflow_run.head_sha }}
event_file: event/event.json
event_name: ${{ github.event.workflow_run.event }}
files: "bsim-test-results/**/bsim_results.xml"

View File

@@ -1,212 +0,0 @@
name: BabbleSim Tests
on:
pull_request:
paths:
- ".github/workflows/bsim-tests.yaml"
- ".github/workflows/bsim-tests-publish.yaml"
- "west.yml"
- "subsys/bluetooth/**"
- "tests/bsim/**"
- "boards/nordic/nrf5*/*dt*"
- "dts/*/nordic/**"
- "tests/bluetooth/**"
- "samples/bluetooth/**"
- "boards/native/**"
- "soc/native/**"
- "arch/posix/**"
- "include/zephyr/arch/posix/**"
- "scripts/native_simulator/**"
- "samples/net/sockets/echo_*/**"
- "modules/hal_nordic/**"
- "modules/mbedtls/**"
- "modules/openthread/**"
- "subsys/net/l2/openthread/**"
- "include/zephyr/net/openthread.h"
- "drivers/ieee802154/**"
- "include/zephyr/net/ieee802154*"
- "drivers/serial/*nrfx*"
- "tests/drivers/uart/**"
- '!**.rst'
permissions:
contents: read
concurrency:
group: ${{ github.workflow }}-${{ github.event_name }}-${{ github.head_ref || github.ref }}
cancel-in-progress: true
jobs:
bsim-test:
if: github.repository_owner == 'zephyrproject-rtos'
runs-on:
group: zephyr-runner-v2-linux-x64-4xlarge
container:
image: ghcr.io/zephyrproject-rtos/ci-repo-cache:v0.28.7.20251127
options: '--entrypoint /bin/bash'
env:
ZEPHYR_TOOLCHAIN_VARIANT: zephyr
BSIM_OUT_PATH: /opt/bsim/
BSIM_COMPONENTS_PATH: /opt/bsim/components
EDTT_PATH: ../tools/edtt
permissions:
checks: write # to create the check run entry with test results
steps:
- name: Apply container owner mismatch workaround
run: |
# FIXME: The owner UID of the GITHUB_WORKSPACE directory may not
# match the container user UID because of the way GitHub
# Actions runner is implemented. Remove this workaround when
# GitHub comes up with a fundamental fix for this problem.
git config --global --add safe.directory ${GITHUB_WORKSPACE}
- name: Print cloud service information
run: |
echo "ZEPHYR_RUNNER_CLOUD_PROVIDER = ${ZEPHYR_RUNNER_CLOUD_PROVIDER}"
echo "ZEPHYR_RUNNER_CLOUD_NODE = ${ZEPHYR_RUNNER_CLOUD_NODE}"
echo "ZEPHYR_RUNNER_CLOUD_POD = ${ZEPHYR_RUNNER_CLOUD_POD}"
- name: Clone cached Zephyr repository
continue-on-error: true
run: |
git clone --shared /repo-cache/zephyrproject/zephyr .
git remote set-url origin ${GITHUB_SERVER_URL}/${GITHUB_REPOSITORY}
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
fetch-depth: 0
- name: Environment Setup
env:
BASE_REF: ${{ github.base_ref }}
run: |
git config --global user.email "bot@zephyrproject.org"
git config --global user.name "Zephyr Bot"
rm -fr ".git/rebase-apply"
rm -fr ".git/rebase-merge"
git rebase origin/${BASE_REF}
git clean -f -d
git log --pretty=oneline | head -n 10
west init -l . || true
west config manifest.group-filter -- +ci
west config --global update.narrow true
west update --path-cache /repo-cache/zephyrproject 2>&1 1> west.update.log || west update --path-cache /repo-cache/zephyrproject 2>&1 1> west.update.log || ( rm -rf ../modules ../bootloader ../tools && west update --path-cache /repo-cache/zephyrproject)
west forall -c 'git reset --hard HEAD'
echo "ZEPHYR_SDK_INSTALL_DIR=/opt/toolchains/zephyr-sdk-$( cat SDK_VERSION )" >> $GITHUB_ENV
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Check common triggering files
uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
id: check-common-files
with:
files: |
.github/workflows/bsim-tests.yaml
.github/workflows/bsim-tests-publish.yaml
west.yml
boards/native/
soc/native/
arch/posix/
include/zephyr/arch/posix/
scripts/native_simulator/
tests/bsim/*
boards/nordic/nrf5*/*dt*
dts/*/nordic/
modules/mbedtls/**
modules/hal_nordic/**
- name: Check if Bluetooth files changed
uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
id: check-bluetooth-files
with:
files: |
samples/bluetooth/
subsys/bluetooth/
tests/bluetooth/
tests/bsim/bluetooth/
- name: Check if Networking files changed
uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
id: check-networking-files
with:
files: |
tests/bsim/net/
samples/net/sockets/echo_*/
modules/openthread/
subsys/net/l2/openthread/
include/zephyr/net/openthread.h
drivers/ieee802154/
include/zephyr/net/ieee802154*
- name: Check if UART files changed
uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
id: check-uart-files
with:
files: |
tests/bsim/drivers/uart/
drivers/serial/*nrfx*
tests/drivers/uart/
- name: Update BabbleSim to manifest revision
if: >
steps.check-bluetooth-files.outputs.any_modified == 'true'
|| steps.check-networking-files.outputs.any_modified == 'true'
|| steps.check-uart-files.outputs.any_modified == 'true'
|| steps.check-common-files.outputs.any_modified == 'true'
run: |
export BSIM_VERSION=$( west list bsim -f {revision} )
echo "Manifest points to bsim sha $BSIM_VERSION"
cd /opt/bsim_west/bsim
git fetch -n origin ${BSIM_VERSION}
git -c advice.detachedHead=false checkout ${BSIM_VERSION}
west update
make everything -s -j 8
- name: Run Bluetooth Tests with BSIM
if: steps.check-bluetooth-files.outputs.any_modified == 'true' || steps.check-common-files.outputs.any_modified == 'true'
run: |
tests/bsim/ci.bt.sh
- name: Run Networking Tests with BSIM
if: steps.check-networking-files.outputs.any_modified == 'true' || steps.check-common-files.outputs.any_modified == 'true'
run: |
tests/bsim/ci.net.sh
- name: Run UART Tests with BSIM
if: steps.check-uart-files.outputs.any_modified == 'true' || steps.check-common-files.outputs.any_modified == 'true'
run: |
tests/bsim/ci.uart.sh
- name: Merge Test Results
run: |
junitparser merge --glob "./bsim_*/*bsim_results.*.xml" "./twister-out/twister.xml" junit.xml
junit2html junit.xml junit.html
- name: Upload Unit Test Results in HTML
if: always()
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
with:
name: HTML Unit Test Results
if-no-files-found: ignore
path: |
junit.html
- name: Publish Unit Test Results
uses: EnricoMi/publish-unit-test-result-action@34d7c956a59aed1bfebf31df77b8de55db9bbaaf # v2.21.0
with:
check_name: Bsim Test Results
files: "junit.xml"
comment_mode: off
- name: Upload Event Details
if: always()
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
with:
name: event
path: |
${{ github.event_path }}

View File

@@ -0,0 +1,219 @@
name: BabbleSim Tests
on:
pull_request:
paths:
- ".github/workflows/bsim-tests.yaml"
- ".github/workflows/bsim-tests-publish.yaml"
- "west.yml"
- "subsys/bluetooth/**"
- "tests/bsim/**"
- "samples/bluetooth/**"
- "boards/posix/**"
- "soc/posix/**"
- "arch/posix/**"
- "include/zephyr/arch/posix/**"
- "scripts/native_simulator/**"
- "samples/net/sockets/echo_*/**"
- "modules/openthread/**"
- "subsys/net/l2/openthread/**"
- "include/zephyr/net/openthread.h"
- "drivers/ieee802154/**"
- "include/zephyr/net/ieee802154*"
- "drivers/serial/*nrfx*"
- "tests/drivers/uart/**"
concurrency:
group: ${{ github.workflow }}-${{ github.event_name }}-${{ github.head_ref || github.ref }}
cancel-in-progress: true
jobs:
bsim-test:
if: github.repository_owner == 'zephyrproject-rtos'
runs-on:
group: zephyr-runner-v2-linux-x64-4xlarge
container:
image: ghcr.io/zephyrproject-rtos/ci-repo-cache:v0.26.7.20240313
options: '--entrypoint /bin/bash'
env:
ZEPHYR_TOOLCHAIN_VARIANT: zephyr
BSIM_OUT_PATH: /opt/bsim/
BSIM_COMPONENTS_PATH: /opt/bsim/components
EDTT_PATH: ../tools/edtt
bsim_bt_52_test_results_file: ./bsim_bt/52_bsim_results.xml
bsim_bt_53_test_results_file: ./bsim_bt/53_bsim_results.xml
bsim_bt_53split_test_results_file: ./bsim_bt/53_bsim_split_results.xml
bsim_net_52_test_results_file: ./bsim_net/52_bsim_results.xml
bsim_uart_test_results_file: ./bsim_uart/uart_bsim_results.xml
steps:
- name: Apply container owner mismatch workaround
run: |
# FIXME: The owner UID of the GITHUB_WORKSPACE directory may not
# match the container user UID because of the way GitHub
# Actions runner is implemented. Remove this workaround when
# GitHub comes up with a fundamental fix for this problem.
git config --global --add safe.directory ${GITHUB_WORKSPACE}
- name: Print cloud service information
run: |
echo "ZEPHYR_RUNNER_CLOUD_PROVIDER = ${ZEPHYR_RUNNER_CLOUD_PROVIDER}"
echo "ZEPHYR_RUNNER_CLOUD_NODE = ${ZEPHYR_RUNNER_CLOUD_NODE}"
echo "ZEPHYR_RUNNER_CLOUD_POD = ${ZEPHYR_RUNNER_CLOUD_POD}"
- name: Clone cached Zephyr repository
continue-on-error: true
run: |
git clone --shared /repo-cache/zephyrproject/zephyr .
git remote set-url origin ${GITHUB_SERVER_URL}/${GITHUB_REPOSITORY}
- name: Checkout
uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Environment Setup
env:
BASE_REF: ${{ github.base_ref }}
run: |
git config --global user.email "bot@zephyrproject.org"
git config --global user.name "Zephyr Bot"
rm -fr ".git/rebase-apply"
git rebase origin/${BASE_REF}
git log --pretty=oneline | head -n 10
west init -l . || true
west config manifest.group-filter -- +ci
west config --global update.narrow true
west update --path-cache /repo-cache/zephyrproject 2>&1 1> west.update.log || west update --path-cache /repo-cache/zephyrproject 2>&1 1> west.update.log || ( rm -rf ../modules ../bootloader ../tools && west update --path-cache /repo-cache/zephyrproject)
west forall -c 'git reset --hard HEAD'
echo "ZEPHYR_SDK_INSTALL_DIR=/opt/toolchains/zephyr-sdk-$( cat SDK_VERSION )" >> $GITHUB_ENV
- name: Check common triggering files
uses: tj-actions/changed-files@v41
id: check-common-files
with:
files: |
.github/workflows/bsim-tests.yaml
.github/workflows/bsim-tests-publish.yaml
west.yml
boards/posix/**
soc/posix/**
arch/posix/**
include/zephyr/arch/posix/**
scripts/native_simulator/**
tests/bsim/*
- name: Check if Bluetooth files changed
uses: tj-actions/changed-files@v41
id: check-bluetooth-files
with:
files: |
tests/bsim/bluetooth/**
samples/bluetooth/**
subsys/bluetooth/**
- name: Check if Networking files changed
uses: tj-actions/changed-files@v41
id: check-networking-files
with:
files: |
tests/bsim/net/**
samples/net/sockets/echo_*/**
modules/openthread/**
subsys/net/l2/openthread/**
include/zephyr/net/openthread.h
drivers/ieee802154/**
include/zephyr/net/ieee802154*
- name: Check if UART files changed
uses: tj-actions/changed-files@v41
id: check-uart-files
with:
files: |
tests/bsim/drivers/uart/**
drivers/serial/*nrfx*
tests/drivers/uart/**
- name: Update BabbleSim to manifest revision
if: >
steps.check-bluetooth-files.outputs.any_changed == 'true'
|| steps.check-networking-files.outputs.any_changed == 'true'
|| steps.check-uart-files.outputs.any_changed == 'true'
|| steps.check-common-files.outputs.any_changed == 'true'
run: |
export BSIM_VERSION=$( west list bsim -f {revision} )
echo "Manifest points to bsim sha $BSIM_VERSION"
cd /opt/bsim_west/bsim
git fetch -n origin ${BSIM_VERSION}
git config --global advice.detachedHead false
git checkout ${BSIM_VERSION}
west update
make everything -s -j 8
- name: Run Bluetooth Tests with BSIM
if: steps.check-bluetooth-files.outputs.any_changed == 'true' || steps.check-common-files.outputs.any_changed == 'true'
run: |
export ZEPHYR_BASE=${PWD}
export WORK_DIR=${ZEPHYR_BASE}/bsim_bt
# Build and run the BT tests for nrf52_bsim:
nice tests/bsim/bluetooth/compile.sh
RESULTS_FILE=${ZEPHYR_BASE}/${bsim_bt_52_test_results_file} \
TESTS_FILE=tests/bsim/bluetooth/tests.nrf52bsim.txt tests/bsim/run_parallel.sh
# Build and run the BT controller tests also for the nrf5340bsim_nrf5340_cpunet
BOARD=nrf5340bsim_nrf5340_cpunet \
nice tests/bsim/bluetooth/compile.nrf5340bsim_nrf5340_cpunet.sh
BOARD=nrf5340bsim_nrf5340_cpunet \
RESULTS_FILE=${ZEPHYR_BASE}/${bsim_bt_53_test_results_file} \
TESTS_FILE=tests/bsim/bluetooth/tests.nrf5340bsim_nrf5340_cpunet.txt \
tests/bsim/run_parallel.sh
# Build and run the nrf5340 split stack tests set
BOARD=nrf5340bsim_nrf5340_cpuapp \
nice tests/bsim/bluetooth/compile.nrf5340bsim_nrf5340_cpuapp.sh
BOARD=nrf5340bsim_nrf5340_cpuapp \
RESULTS_FILE=${ZEPHYR_BASE}/${bsim_bt_53split_test_results_file} \
TESTS_FILE=tests/bsim/bluetooth/tests.nrf5340bsim_nrf5340_cpuapp.txt \
tests/bsim/run_parallel.sh
- name: Run Networking Tests with BSIM
if: steps.check-networking-files.outputs.any_changed == 'true' || steps.check-common-files.outputs.any_changed == 'true'
run: |
export ZEPHYR_BASE=${PWD}
WORK_DIR=${ZEPHYR_BASE}/bsim_net nice tests/bsim/net/compile.sh
RESULTS_FILE=${ZEPHYR_BASE}/${bsim_net_52_test_results_file} \
SEARCH_PATH=tests/bsim/net/ tests/bsim/run_parallel.sh
- name: Run UART Tests with BSIM
if: steps.check-uart-files.outputs.any_changed == 'true' || steps.check-common-files.outputs.any_changed == 'true'
run: |
echo "UART: Single device tests"
./scripts/twister -T tests/drivers/uart/ --force-color --inline-logs -v -M -p nrf52_bsim \
--fixture gpio_loopback -- -uart0_loopback
echo "UART: Multi device tests"
export ZEPHYR_BASE=${PWD}
WORK_DIR=${ZEPHYR_BASE}/bsim_uart nice tests/bsim/drivers/uart/compile.sh
RESULTS_FILE=${ZEPHYR_BASE}/${bsim_uart_test_results_file} \
SEARCH_PATH=tests/bsim/drivers/uart/ tests/bsim/run_parallel.sh
- name: Upload Test Results
if: always()
uses: actions/upload-artifact@v4
with:
name: bsim-test-results
path: |
./bsim_bt/52_bsim_results.xml
./bsim_bt/53_bsim_results.xml
./bsim_bt/53_bsim_split_results.xml
./bsim_net/52_bsim_results.xml
./bsim_uart/uart_bsim_results.xml
./twister-out/twister.xml
./twister-out/twister.json
${{ github.event_path }}
if-no-files-found: warn
- name: Upload Event Details
if: always()
uses: actions/upload-artifact@v4
with:
name: event
path: |
${{ github.event_path }}

View File

@@ -1,76 +0,0 @@
# Copyright (c) 2021, 2022 Nordic Semiconductor ASA
# SPDX-License-Identifier: Apache-2.0
# Make a snapshot of open bugs as a python pickle file, compressed
# using xz. Upload the xz file to Amazon S3.
name: Bug Snapshot
on:
workflow_dispatch:
schedule:
# Run daily at 00:05
- cron: '5 00 * * *'
permissions:
contents: read
jobs:
make_bugs_pickle:
name: Make bugs pickle
runs-on: ubuntu-24.04
if: github.repository_owner == 'zephyrproject-rtos' && github.ref == 'refs/heads/main'
steps:
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
- name: Set up Python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Snapshot bugs
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: |
BUGS_PICKLE_FILENAME="zephyr-bugs-$(date -I).pickle.xz"
BUGS_PICKLE_PATH="${RUNNER_TEMP}/${BUGS_PICKLE_FILENAME}"
set -euo pipefail
python3 scripts/make_bugs_pickle.py | xz > ${BUGS_PICKLE_PATH}
echo "BUGS_PICKLE_FILENAME=${BUGS_PICKLE_FILENAME}" >> ${GITHUB_ENV}
echo "BUGS_PICKLE_PATH=${BUGS_PICKLE_PATH}" >> ${GITHUB_ENV}
- name: Configure AWS Credentials
uses: aws-actions/configure-aws-credentials@00943011d9042930efac3dcd3a170e4273319bc8 # v5.1.0
with:
aws-access-key-id: ${{ vars.AWS_BUILDS_ZEPHYR_BUG_SNAPSHOT_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_BUILDS_ZEPHYR_BUG_SNAPSHOT_SECRET_ACCESS_KEY }}
aws-region: us-east-1
- name: Upload to AWS S3
run: |
REPOSITORY_NAME="${GITHUB_REPOSITORY#*/}"
PUBLISH_BUCKET="builds.zephyrproject.org"
PUBLISH_DOMAIN="builds.zephyrproject.io"
if [ "${{github.event_name}}" = "schedule" ]; then
PUBLISH_ROOT="${REPOSITORY_NAME}/bug-snapshot/daily"
else
PUBLISH_ROOT="${REPOSITORY_NAME}/bug-snapshot"
fi
PUBLISH_UPLOAD_URI="s3://${PUBLISH_BUCKET}/${PUBLISH_ROOT}/${BUGS_PICKLE_FILENAME}"
PUBLISH_DOWNLOAD_URI="https://${PUBLISH_DOMAIN}/${PUBLISH_ROOT}/${BUGS_PICKLE_FILENAME}"
aws s3 cp --quiet ${BUGS_PICKLE_PATH} ${PUBLISH_UPLOAD_URI}
echo "Bug pickle is available at: ${PUBLISH_DOWNLOAD_URI}" >> ${GITHUB_STEP_SUMMARY}

View File

@@ -0,0 +1,67 @@
# Copyright (c) 2021, 2022 Nordic Semiconductor ASA
# SPDX-License-Identifier: Apache-2.0
# Make a snapshot of open bugs as a python pickle file, compressed
# using xz. Upload the xz file to Amazon S3.
name: Bug Snapshot
on:
workflow_dispatch:
branches: [main]
schedule:
# Run daily at 14:05
- cron: '5 14 * * *'
jobs:
make_bugs_pickle:
name: Make bugs pickle
runs-on: ubuntu-22.04
if: github.repository_owner == 'zephyrproject-rtos'
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Install Python dependencies
run: |
sudo pip3 install -U setuptools wheel pip
pip3 install -U pygithub
- name: Snapshot bugs
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: |
BUGS_PICKLE_FILENAME="zephyr-bugs-$(date -I).pickle.xz"
BUGS_PICKLE_PATH="${RUNNER_TEMP}/${BUGS_PICKLE_FILENAME}"
set -euo pipefail
python3 scripts/make_bugs_pickle.py | xz > ${BUGS_PICKLE_PATH}
echo "BUGS_PICKLE_FILENAME=${BUGS_PICKLE_FILENAME}" >> ${GITHUB_ENV}
echo "BUGS_PICKLE_PATH=${BUGS_PICKLE_PATH}" >> ${GITHUB_ENV}
- name: Configure AWS Credentials
uses: aws-actions/configure-aws-credentials@v2
with:
aws-access-key-id: ${{ vars.AWS_BUILDS_ZEPHYR_BUG_SNAPSHOT_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_BUILDS_ZEPHYR_BUG_SNAPSHOT_SECRET_ACCESS_KEY }}
aws-region: us-east-1
- name: Upload to AWS S3
run: |
REPOSITORY_NAME="${GITHUB_REPOSITORY#*/}"
PUBLISH_BUCKET="builds.zephyrproject.org"
PUBLISH_DOMAIN="builds.zephyrproject.io"
if [ "${{github.event_name}}" = "schedule" ]; then
PUBLISH_ROOT="${REPOSITORY_NAME}/bug-snapshot/daily"
else
PUBLISH_ROOT="${REPOSITORY_NAME}/bug-snapshot"
fi
PUBLISH_UPLOAD_URI="s3://${PUBLISH_BUCKET}/${PUBLISH_ROOT}/${BUGS_PICKLE_FILENAME}"
PUBLISH_DOWNLOAD_URI="https://${PUBLISH_DOMAIN}/${PUBLISH_ROOT}/${BUGS_PICKLE_FILENAME}"
aws s3 cp --quiet ${BUGS_PICKLE_PATH} ${PUBLISH_UPLOAD_URI}
echo "Bug pickle is available at: ${PUBLISH_DOWNLOAD_URI}" >> ${GITHUB_STEP_SUMMARY}

View File

@@ -1,184 +0,0 @@
name: Build with Clang/LLVM
on:
push:
branches:
- main
- v*-branch
- collab-*
permissions:
contents: read
concurrency:
group: ${{ github.workflow }}-${{ github.event_name }}-${{ github.head_ref || github.ref }}
cancel-in-progress: true
jobs:
clang-build:
if: github.repository_owner == 'zephyrproject-rtos'
runs-on:
group: zephyr-runner-v2-linux-x64-4xlarge
container:
image: ghcr.io/zephyrproject-rtos/ci-repo-cache:v0.28.7.20251127
options: '--entrypoint /bin/bash'
strategy:
fail-fast: false
matrix:
subset: [1, 2]
env:
CCACHE_DIR: /node-cache/ccache-zephyr
CCACHE_REMOTE_STORAGE: "redis://cache-*.keydb-cache.svc.cluster.local|shards=1,2,3"
CCACHE_REMOTE_ONLY: "true"
CCACHE_IGNOREOPTIONS: '-specs=* --specs=*'
LLVM_TOOLCHAIN_PATH: /usr/lib/llvm-20
BASE_REF: ${{ github.base_ref }}
steps:
- name: Apply container owner mismatch workaround
run: |
# FIXME: The owner UID of the GITHUB_WORKSPACE directory may not
# match the container user UID because of the way GitHub
# Actions runner is implemented. Remove this workaround when
# GitHub comes up with a fundamental fix for this problem.
git config --global --add safe.directory ${GITHUB_WORKSPACE}
- name: Print cloud service information
run: |
echo "ZEPHYR_RUNNER_CLOUD_PROVIDER = ${ZEPHYR_RUNNER_CLOUD_PROVIDER}"
echo "ZEPHYR_RUNNER_CLOUD_NODE = ${ZEPHYR_RUNNER_CLOUD_NODE}"
echo "ZEPHYR_RUNNER_CLOUD_POD = ${ZEPHYR_RUNNER_CLOUD_POD}"
- name: Clone cached Zephyr repository
continue-on-error: true
run: |
git clone --shared /repo-cache/zephyrproject/zephyr .
git remote set-url origin ${GITHUB_SERVER_URL}/${GITHUB_REPOSITORY}
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
fetch-depth: 0
persist-credentials: false
- name: Environment Setup
run: |
echo "$HOME/.local/bin" >> $GITHUB_PATH
git config --global user.email "bot@zephyrproject.org"
git config --global user.name "Zephyr Bot"
rm -fr ".git/rebase-apply"
rm -fr ".git/rebase-merge"
git clean -f -d
git log --pretty=oneline | head -n 10
west init -l . || true
west config --global update.narrow true
west config manifest.group-filter -- +ci,+optional
# In some cases modules are left in a state where they can't be
# updated (i.e. when we cancel a job and the builder is killed),
# So first retry to update, if that does not work, remove all modules
# and start over. (Workaround until we implement more robust module
# west caching).
west update --path-cache /repo-cache/zephyrproject 2>&1 1> west.log || west update --path-cache /repo-cache/zephyrproject 2>&1 1> west2.log || ( rm -rf ../modules ../bootloader ../tools && west update --path-cache /repo-cache/zephyrproject)
echo "ZEPHYR_SDK_INSTALL_DIR=/opt/toolchains/zephyr-sdk-$( cat SDK_VERSION )" >> $GITHUB_ENV
- name: Check Environment
run: |
cmake --version
${LLVM_TOOLCHAIN_PATH}/bin/clang --version
gcc --version
ls -la
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Set up ccache
run: |
mkdir -p ${CCACHE_DIR}
ccache -M 10G
ccache -p
ccache -z -s -vv
- name: Update BabbleSim to manifest revision
run: |
export BSIM_VERSION=$( west list bsim -f {revision} )
echo "Manifest points to bsim sha $BSIM_VERSION"
cd /opt/bsim_west/bsim
git fetch -n origin ${BSIM_VERSION}
git -c advice.detachedHead=false checkout ${BSIM_VERSION}
west update
make everything -s -j 8
- name: Run Tests with Twister
id: twister
run: |
export ZEPHYR_BASE=${PWD}
export ZEPHYR_TOOLCHAIN_VARIANT=llvm
./scripts/twister -p native_sim --force-color --inline-logs -M -N -v --retry-failed 2 \
-T tests --subset ${{matrix.subset}}/2 -j 16
- name: Print ccache stats
if: always()
run: |
ccache -s -vv
- name: Upload Unit Test Results
if: always()
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
with:
name: Unit Test Results (Subset ${{ matrix.subset }})
path: |
twister-out/twister.xml
twister-out/twister.json
if-no-files-found: ignore
clang-build-results:
name: "Publish Unit Tests Results"
needs: clang-build
runs-on: ubuntu-24.04
permissions:
checks: write # to create GitHub annotations
if: (success() || failure())
steps:
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
fetch-depth: 0
persist-credentials: false
- name: Download Artifacts
uses: actions/download-artifact@018cc2cf5baa6db3ef3c5f8a56943fffe632ef53 # v6.0.0
with:
path: artifacts
- name: Set up Python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Merge Test Results
run: |
junitparser merge --glob 'artifacts/*/twister.xml' 'artifacts/*/*/twister.xml' junit.xml
junit2html junit.xml junit-clang.html
- name: Upload Unit Test Results in HTML
if: always()
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
with:
name: HTML Unit Test Results
if-no-files-found: ignore
path: |
junit-clang.html
- name: Publish Unit Test Results
uses: EnricoMi/publish-unit-test-result-action@34d7c956a59aed1bfebf31df77b8de55db9bbaaf # v2.21.0
if: always()
with:
check_name: Unit Test Results
files: "**/twister.xml"
comment_mode: off

.github/workflows/clang.yaml.disabled
View File

@@ -0,0 +1,154 @@
name: Build with Clang/LLVM
on: pull_request_target
concurrency:
group: ${{ github.workflow }}-${{ github.event_name }}-${{ github.head_ref || github.ref }}
cancel-in-progress: true
jobs:
clang-build:
if: github.repository_owner == 'zephyrproject-rtos'
runs-on:
group: zephyr-runner-v2-linux-x64-4xlarge
container:
image: ghcr.io/zephyrproject-rtos/ci-repo-cache:v0.26.7.20240313
options: '--entrypoint /bin/bash'
strategy:
fail-fast: false
matrix:
platform: ["native_sim"]
env:
CCACHE_DIR: /node-cache/ccache-zephyr
CCACHE_REMOTE_STORAGE: "redis://cache-*.keydb-cache.svc.cluster.local|shards=1,2,3"
CCACHE_REMOTE_ONLY: "true"
LLVM_TOOLCHAIN_PATH: /usr/lib/llvm-16
COMMIT_RANGE: ${{ github.event.pull_request.base.sha }}..${{ github.event.pull_request.head.sha }}
BASE_REF: ${{ github.base_ref }}
outputs:
report_needed: ${{ steps.twister.outputs.report_needed }}
steps:
- name: Apply container owner mismatch workaround
run: |
# FIXME: The owner UID of the GITHUB_WORKSPACE directory may not
# match the container user UID because of the way GitHub
# Actions runner is implemented. Remove this workaround when
# GitHub comes up with a fundamental fix for this problem.
git config --global --add safe.directory ${GITHUB_WORKSPACE}
- name: Print cloud service information
run: |
echo "ZEPHYR_RUNNER_CLOUD_PROVIDER = ${ZEPHYR_RUNNER_CLOUD_PROVIDER}"
echo "ZEPHYR_RUNNER_CLOUD_NODE = ${ZEPHYR_RUNNER_CLOUD_NODE}"
echo "ZEPHYR_RUNNER_CLOUD_POD = ${ZEPHYR_RUNNER_CLOUD_POD}"
- name: Clone cached Zephyr repository
continue-on-error: true
run: |
git clone --shared /repo-cache/zephyrproject/zephyr .
git remote set-url origin ${GITHUB_SERVER_URL}/${GITHUB_REPOSITORY}
- name: Checkout
uses: actions/checkout@v4
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
persist-credentials: false
- name: Environment Setup
run: |
echo "$HOME/.local/bin" >> $GITHUB_PATH
git config --global user.email "bot@zephyrproject.org"
git config --global user.name "Zephyr Bot"
rm -fr ".git/rebase-apply"
git rebase origin/${BASE_REF}
git log --pretty=oneline | head -n 10
west init -l . || true
west config --global update.narrow true
west config manifest.group-filter -- +ci,+optional
# In some cases modules are left in a state where they can't be
# updated (i.e. when we cancel a job and the builder is killed),
# so first retry the update; if that does not work, remove all modules
# and start over. (Workaround until we implement more robust module
# west caching).
west update --path-cache /repo-cache/zephyrproject 2>&1 1> west.log || west update --path-cache /repo-cache/zephyrproject 2>&1 1> west2.log || ( rm -rf ../modules ../bootloader ../tools && west update --path-cache /repo-cache/zephyrproject)
echo "ZEPHYR_SDK_INSTALL_DIR=/opt/toolchains/zephyr-sdk-$( cat SDK_VERSION )" >> $GITHUB_ENV
- name: Check Environment
run: |
cmake --version
${LLVM_TOOLCHAIN_PATH}/bin/clang --version
gcc --version
ls -la
- name: Set up ccache
run: |
mkdir -p ${CCACHE_DIR}
ccache -M 10G
ccache -p
ccache -z -s -vv
- name: Run Tests with Twister
id: twister
run: |
export ZEPHYR_BASE=${PWD}
export ZEPHYR_TOOLCHAIN_VARIANT=llvm
# check if we need to run a full twister or not based on files changed
python3 ./scripts/ci/test_plan.py --platform ${{ matrix.platform }} -c origin/${BASE_REF}..
# We can limit scope to just what has changed
if [ -s testplan.json ]; then
echo "report_needed=1" >> $GITHUB_OUTPUT
# Full twister but with options based on changes
./scripts/twister --force-color --inline-logs -M -N -v --load-tests testplan.json --retry-failed 2
else
# if nothing is run, skip reporting step
echo "report_needed=0" >> $GITHUB_OUTPUT
fi
- name: Print ccache stats
if: always()
run: |
ccache -s -vv
- name: Upload Unit Test Results
if: always() && steps.twister.outputs.report_needed != 0
uses: actions/upload-artifact@v4
with:
name: Unit Test Results (Subset ${{ matrix.platform }})
path: twister-out/twister.xml
clang-build-results:
name: "Publish Unit Tests Results"
needs: clang-build
runs-on: ubuntu-22.04
if: (success() || failure() ) && needs.clang-build.outputs.report_needed != 0
steps:
- name: Download Artifacts
uses: actions/download-artifact@v4
with:
path: artifacts
- name: Merge Test Results
run: |
pip3 install junitparser junit2html
junitparser merge artifacts/*/twister.xml junit.xml
junit2html junit.xml junit-clang.html
- name: Upload Unit Test Results in HTML
if: always()
uses: actions/upload-artifact@v4
with:
name: HTML Unit Test Results
if-no-files-found: ignore
path: |
junit-clang.html
- name: Publish Unit Test Results
uses: EnricoMi/publish-unit-test-result-action@v2
if: always()
with:
check_name: Unit Test Results
files: "**/twister.xml"
comment_mode: off


@@ -1,275 +0,0 @@
name: Code Coverage with codecov
on:
push:
branches:
- main
- v*-branch
- collab-*
permissions:
contents: read
concurrency:
group: ${{ github.workflow }}-${{ github.event_name }}-${{ github.head_ref || github.ref }}
cancel-in-progress: true
jobs:
codecov:
if: github.repository_owner == 'zephyrproject-rtos'
runs-on:
group: zephyr-runner-v2-linux-x64-4xlarge
container:
image: ghcr.io/zephyrproject-rtos/ci-repo-cache:v0.28.7.20251127
options: '--entrypoint /bin/bash'
strategy:
fail-fast: false
matrix:
platform: ["mps2/an385", "native_sim", "qemu_x86", "unit_testing"]
include:
- platform: 'mps2/an385'
normalized: 'mps2_an385'
- platform: 'native_sim'
normalized: 'native_sim'
- platform: 'qemu_x86'
normalized: 'qemu_x86'
- platform: 'unit_testing'
normalized: 'unit_testing'
env:
CCACHE_DIR: /node-cache/ccache-zephyr
CCACHE_REMOTE_STORAGE: "redis://cache-*.keydb-cache.svc.cluster.local|shards=1,2,3"
CCACHE_REMOTE_ONLY: "true"
# `--specs` is ignored because ccache is unable to resolve the toolchain specs file path.
CCACHE_IGNOREOPTIONS: '-specs=* --specs=*'
steps:
- name: Apply container owner mismatch workaround
run: |
# FIXME: The owner UID of the GITHUB_WORKSPACE directory may not
# match the container user UID because of the way GitHub
# Actions runner is implemented. Remove this workaround when
# GitHub comes up with a fundamental fix for this problem.
git config --global --add safe.directory ${GITHUB_WORKSPACE}
- name: Print cloud service information
run: |
echo "ZEPHYR_RUNNER_CLOUD_PROVIDER = ${ZEPHYR_RUNNER_CLOUD_PROVIDER}"
echo "ZEPHYR_RUNNER_CLOUD_NODE = ${ZEPHYR_RUNNER_CLOUD_NODE}"
echo "ZEPHYR_RUNNER_CLOUD_POD = ${ZEPHYR_RUNNER_CLOUD_POD}"
- name: Update PATH for west
run: |
echo "$HOME/.local/bin" >> $GITHUB_PATH
- name: Clone cached Zephyr repository
continue-on-error: true
run: |
git clone --shared /repo-cache/zephyrproject/zephyr .
git remote set-url origin ${GITHUB_SERVER_URL}/${GITHUB_REPOSITORY}
- name: checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
fetch-depth: 0
- name: Set up Python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: west setup
run: |
west init -l . || true
west update 1> west.update.log || west update 1> west.update-2.log
- name: Environment Setup
run: |
cmake --version
gcc --version
ls -la
echo "ZEPHYR_SDK_INSTALL_DIR=/opt/toolchains/zephyr-sdk-$( cat SDK_VERSION )" >> $GITHUB_ENV
- name: Set up ccache
run: |
mkdir -p ${CCACHE_DIR}
ccache -M 10G
ccache -p
ccache -z -s -vv
- name: Run Tests with Twister (Push)
continue-on-error: true
run: |
export ZEPHYR_BASE=${PWD}
export ZEPHYR_TOOLCHAIN_VARIANT=zephyr
mkdir -p coverage/reports
./scripts/twister --save-tests ${{matrix.normalized}}-testplan.json
ls -la
./scripts/twister \
-i --force-color -N -v --filter runnable -p ${{ matrix.platform }} --coverage \
-T tests --coverage-tool gcovr -xCONFIG_TEST_EXTRA_STACK_SIZE=4096 -e nano \
--timeout-multiplier 2
- name: Build Doxygen Coverage
if: matrix.platform == 'unit_testing'
run: |
pip install -r doc/requirements.txt --require-hashes
sudo apt-get update
sudo apt-get install -y graphviz # dot is needed but currently missing from the Docker image
cmake -B doc/_build -S doc
cmake --build doc/_build --target doxygen-coverage
- name: Upload Doxygen Coverage Results
if: matrix.platform == 'unit_testing'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
with:
name: doxygen-coverage-results
path: |
doc/_build/new.info
doc/_build/coverage-report
- name: Print ccache stats
if: always()
run: |
ccache -s -vv
- name: Rename coverage files
if: always()
run: |
mv twister-out/coverage.json coverage/reports/${{matrix.normalized}}.json
- name: Upload Coverage Results
if: always()
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
with:
name: Coverage Data (Subset ${{ matrix.normalized }})
path: |
coverage/reports/${{ matrix.normalized }}.json
${{ matrix.normalized }}-testplan.json
codecov-results:
name: "Publish Coverage Results"
needs: codecov
runs-on: ubuntu-24.04
# the codecov job might be skipped, we don't need to run this job then
if: success() || failure()
steps:
- name: checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
fetch-depth: 0
- name: Set up Python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Download Artifacts
uses: actions/download-artifact@018cc2cf5baa6db3ef3c5f8a56943fffe632ef53 # v6.0.0
with:
path: coverage/reports
- name: Move coverage files
run: |
ls -lRt ./coverage/reports
mv ./coverage/reports/*/*testplan.json .
mv ./coverage/reports/*/coverage/reports/*.json ./coverage/reports
ls -la ./coverage/reports
- name: Generate list of coverage files
id: get-coverage-files
shell: cmake -P {0}
run: |
file(GLOB INPUT_FILES_LIST "coverage/reports/*.json")
set(MERGELIST "")
set(FILELIST "")
foreach(ITEM ${INPUT_FILES_LIST})
get_filename_component(f ${ITEM} NAME)
if(FILELIST STREQUAL "")
set(FILELIST "${f}")
else()
set(FILELIST "${FILELIST},${f}")
endif()
endforeach()
foreach(ITEM ${INPUT_FILES_LIST})
get_filename_component(f ${ITEM} NAME)
if(MERGELIST STREQUAL "")
set(MERGELIST "--add-tracefile ${f}")
else()
set(MERGELIST "${MERGELIST} -a ${f}")
endif()
endforeach()
file(APPEND $ENV{GITHUB_OUTPUT} "mergefiles=${MERGELIST}\n")
file(APPEND $ENV{GITHUB_OUTPUT} "covfiles=${FILELIST}\n")
- name: Merge coverage files
run: |
pushd ./coverage/reports
gcovr ${{ steps.get-coverage-files.outputs.mergefiles }} --merge-mode-functions=separate --json merged.json
gcovr ${{ steps.get-coverage-files.outputs.mergefiles }} --merge-mode-functions=separate --cobertura merged.xml
popd
- name: Get current date
id: run_date
run: |
echo "run_date=$(date --iso-8601=minutes)" >> "$GITHUB_OUTPUT"
echo "run_date_short=$(date +'%Y-%m-%d')" >> "$GITHUB_OUTPUT"
echo "run_date_year=$(date +'%Y')" >> "$GITHUB_OUTPUT"
echo "run_date_month=$(date +'%m')" >> "$GITHUB_OUTPUT"
- name: Generate Coverage Report
if: always()
run: |
python3 ./scripts/ci/coverage/coverage_analysis.py \
-t native_sim-testplan.json \
-m MAINTAINERS.yml \
-c coverage/reports/merged.json \
-o coverage-report-${{ steps.run_date.outputs.run_date_short }} \
-f all
cp coverage-report-* coverage/reports/
- name: Upload Merged Coverage Results and Report
if: always()
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
with:
name: Coverage Data and report
path: |
coverage/reports/merged.json
coverage/reports/merged.xml
coverage/reports/coverage-report-${{ steps.run_date.outputs.run_date_short }}.json
coverage/reports/coverage-report-${{ steps.run_date.outputs.run_date_short }}.xlsx
- name: Upload test coverage to Codecov
if: always()
uses: codecov/codecov-action@5a1091511ad55cbe89839c7260b706298ca349f7 # v5.5.1
with:
env_vars: OS,PYTHON
fail_ci_if_error: false
verbose: true
token: ${{ secrets.CODECOV_TOKEN }}
files: coverage/reports/merged.xml
flags: unittests-coverage
- name: Upload Doxygen coverage to Codecov
if: always()
uses: codecov/codecov-action@5a1091511ad55cbe89839c7260b706298ca349f7 # v5.5.1
with:
env_vars: OS,PYTHON
fail_ci_if_error: false
verbose: true
token: ${{ secrets.CODECOV_TOKEN }}
files: coverage/reports/doxygen-coverage-results/new.info
disable_search: true
flags: doxygen-coverage

.github/workflows/codecov.yaml.disabled

@@ -0,0 +1,179 @@
name: Code Coverage with codecov
on:
schedule:
- cron: '25 06,18 * * 1-5'
concurrency:
group: ${{ github.workflow }}-${{ github.event_name }}-${{ github.head_ref || github.ref }}
cancel-in-progress: true
jobs:
codecov:
if: github.repository_owner == 'zephyrproject-rtos'
runs-on:
group: zephyr-runner-v2-linux-x64-4xlarge
container:
image: ghcr.io/zephyrproject-rtos/ci-repo-cache:v0.26.7.20240313
options: '--entrypoint /bin/bash'
strategy:
fail-fast: false
matrix:
platform: ["mps2_an385", "native_sim", "qemu_x86", "unit_testing"]
env:
CCACHE_DIR: /node-cache/ccache-zephyr
CCACHE_REMOTE_STORAGE: "redis://cache-*.keydb-cache.svc.cluster.local|shards=1,2,3"
CCACHE_REMOTE_ONLY: "true"
# `--specs` is ignored because ccache is unable to resolve the toolchain specs file path.
CCACHE_IGNOREOPTIONS: '--specs=*'
steps:
- name: Apply container owner mismatch workaround
run: |
# FIXME: The owner UID of the GITHUB_WORKSPACE directory may not
# match the container user UID because of the way GitHub
# Actions runner is implemented. Remove this workaround when
# GitHub comes up with a fundamental fix for this problem.
git config --global --add safe.directory ${GITHUB_WORKSPACE}
- name: Print cloud service information
run: |
echo "ZEPHYR_RUNNER_CLOUD_PROVIDER = ${ZEPHYR_RUNNER_CLOUD_PROVIDER}"
echo "ZEPHYR_RUNNER_CLOUD_NODE = ${ZEPHYR_RUNNER_CLOUD_NODE}"
echo "ZEPHYR_RUNNER_CLOUD_POD = ${ZEPHYR_RUNNER_CLOUD_POD}"
- name: Update PATH for west
run: |
echo "$HOME/.local/bin" >> $GITHUB_PATH
- name: Clone cached Zephyr repository
continue-on-error: true
run: |
git clone --shared /repo-cache/zephyrproject/zephyr .
git remote set-url origin ${GITHUB_SERVER_URL}/${GITHUB_REPOSITORY}
- name: checkout
uses: actions/checkout@v4
with:
fetch-depth: 0
- name: west setup
run: |
west init -l . || true
west update 1> west.update.log || west update 1> west.update-2.log
- name: Environment Setup
run: |
cmake --version
gcc --version
ls -la
echo "ZEPHYR_SDK_INSTALL_DIR=/opt/toolchains/zephyr-sdk-$( cat SDK_VERSION )" >> $GITHUB_ENV
- name: Set up ccache
run: |
mkdir -p ${CCACHE_DIR}
ccache -M 10G
ccache -p
ccache -z -s -vv
- name: Run Tests with Twister (Push)
continue-on-error: true
run: |
export ZEPHYR_BASE=${PWD}
export ZEPHYR_TOOLCHAIN_VARIANT=zephyr
mkdir -p coverage/reports
pip3 install gcovr==6.0
./scripts/twister -i --force-color -N -v --filter runnable -p ${{ matrix.platform }} \
--coverage -T tests --coverage-tool gcovr -xCONFIG_TEST_EXTRA_STACK_SIZE=4096 -e nano \
--timeout-multiplier 2
- name: Print ccache stats
if: always()
run: |
ccache -s -vv
- name: Rename coverage files
if: always()
run: |
cp twister-out/coverage.json coverage/reports/${{ matrix.platform }}.json
- name: Upload Coverage Results
if: always()
uses: actions/upload-artifact@v4
with:
name: Coverage Data (Subset ${{ matrix.platform }})
path: coverage/reports/${{ matrix.platform }}.json
codecov-results:
name: "Publish Coverage Results"
needs: codecov
runs-on: ubuntu-22.04
# the codecov job might be skipped, we don't need to run this job then
if: success() || failure()
steps:
- name: checkout
uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Download Artifacts
uses: actions/download-artifact@v4
with:
path: coverage/reports
- name: Move coverage files
run: |
mv ./coverage/reports/*/*.json ./coverage/reports
ls -la ./coverage/reports
- name: Generate list of coverage files
id: get-coverage-files
shell: cmake -P {0}
run: |
file(GLOB INPUT_FILES_LIST "coverage/reports/*.json")
set(MERGELIST "")
set(FILELIST "")
foreach(ITEM ${INPUT_FILES_LIST})
get_filename_component(f ${ITEM} NAME)
if(FILELIST STREQUAL "")
set(FILELIST "${f}")
else()
set(FILELIST "${FILELIST},${f}")
endif()
endforeach()
foreach(ITEM ${INPUT_FILES_LIST})
get_filename_component(f ${ITEM} NAME)
if(MERGELIST STREQUAL "")
set(MERGELIST "--add-tracefile ${f}")
else()
set(MERGELIST "${MERGELIST} -a ${f}")
endif()
endforeach()
file(APPEND $ENV{GITHUB_OUTPUT} "mergefiles=${MERGELIST}\n")
file(APPEND $ENV{GITHUB_OUTPUT} "covfiles=${FILELIST}\n")
- name: Merge coverage files
run: |
cd ./coverage/reports
pip3 install gcovr==6.0
gcovr ${{ steps.get-coverage-files.outputs.mergefiles }} --merge-mode-functions=separate --json merged.json
gcovr ${{ steps.get-coverage-files.outputs.mergefiles }} --merge-mode-functions=separate --cobertura merged.xml
- name: Upload Merged Coverage Results
if: always()
uses: actions/upload-artifact@v4
with:
name: Merged Coverage Data
path: |
coverage/reports/merged.json
coverage/reports/merged.xml
- name: Upload coverage to Codecov
if: always()
uses: codecov/codecov-action@v3
with:
directory: ./coverage/reports
env_vars: OS,PYTHON
fail_ci_if_error: false
verbose: true
files: merged.xml


@@ -1,58 +0,0 @@
name: "CodeQL"
on:
push:
branches:
- main
- v*-branch
- collab-*
schedule:
- cron: '34 16 * * 6'
pull_request:
branches:
- main
- v*-branch
- collab-*
permissions:
contents: read
jobs:
analyze:
name: Analyze (${{ matrix.language }})
runs-on: ubuntu-24.04
permissions:
security-events: write
strategy:
fail-fast: false
matrix:
include:
- language: python
build-mode: none
- language: actions
build-mode: none
config: ./.github/codeql/codeql-actions-config.yml
- language: javascript-typescript
build-mode: none
config: ./.github/codeql/codeql-js-config.yml
steps:
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
- name: Initialize CodeQL
uses: github/codeql-action/init@0499de31b99561a6d14a36a5f662c2a54f91beee # v4.31.2
with:
languages: ${{ matrix.language }}
build-mode: ${{ matrix.build-mode }}
queries: security-extended
config-file: ${{ matrix.config }}
- if: matrix.build-mode == 'manual'
shell: bash
run: |
echo "nothing yet"
exit 0
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@0499de31b99561a6d14a36a5f662c2a54f91beee # v4.31.2
with:
category: "/language:${{matrix.language}}"


@@ -1,64 +0,0 @@
name: Coding Guidelines
on: pull_request
permissions:
contents: read
jobs:
compliance_job:
runs-on: ubuntu-24.04
name: Run coding guidelines checks on patch series (PR)
steps:
- name: Checkout the code
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
- name: Set up Python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Install Packages
run: |
sudo apt-get update
sudo apt-get install coccinelle
- name: Run Coding Guidelines Checks
continue-on-error: true
id: coding_guidelines
env:
BASE_REF: ${{ github.base_ref }}
run: |
export ZEPHYR_BASE=$PWD
git config --global user.email "actions@zephyrproject.org"
git config --global user.name "Github Actions"
git remote -v
rm -fr ".git/rebase-apply"
rm -fr ".git/rebase-merge"
git rebase origin/${BASE_REF}
git clean -f -d
source zephyr-env.sh
# debug
ls -la
git log --pretty=oneline | head -n 10
./scripts/ci/guideline_check.py --output output.txt -c origin/${BASE_REF}..
- name: check-warns
run: |
if [[ -s "output.txt" ]]; then
errors=$(cat output.txt)
errors="${errors//'%'/'%25'}"
errors="${errors//$'\n'/'%0A'}"
errors="${errors//$'\r'/'%0D'}"
echo "::error file=output.txt::$errors"
exit 1;
fi


@@ -0,0 +1,59 @@
name: Coding Guidelines
on: pull_request
jobs:
compliance_job:
runs-on: ubuntu-22.04
name: Run coding guidelines checks on patch series (PR)
steps:
- name: Checkout the code
uses: actions/checkout@v4
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
- name: cache-pip
uses: actions/cache@v4
with:
path: ~/.cache/pip
key: ${{ runner.os }}-pip-${{ hashFiles('.github/workflows/coding_guidelines.yml') }}
- name: Install python dependencies
run: |
pip3 install unidiff
pip3 install wheel
pip3 install sh
- name: Install Packages
run: |
sudo apt-get update
sudo apt-get install coccinelle
- name: Run Coding Guidelines Checks
continue-on-error: true
id: coding_guidelines
env:
BASE_REF: ${{ github.base_ref }}
run: |
export ZEPHYR_BASE=$PWD
git config --global user.email "actions@zephyrproject.org"
git config --global user.name "Github Actions"
git remote -v
git rebase origin/${BASE_REF}
source zephyr-env.sh
# debug
ls -la
git log --pretty=oneline | head -n 10
./scripts/ci/guideline_check.py --output output.txt -c origin/${BASE_REF}..
- name: check-warns
run: |
if [[ -s "output.txt" ]]; then
errors=$(cat output.txt)
errors="${errors//'%'/'%25'}"
errors="${errors//$'\n'/'%0A'}"
errors="${errors//$'\r'/'%0D'}"
echo "::error file=output.txt::$errors"
exit 1;
fi


@@ -1,138 +0,0 @@
name: Compliance Checks
on:
pull_request:
types:
- edited
- opened
- reopened
- synchronize
permissions:
contents: read
jobs:
check_compliance:
runs-on: ubuntu-24.04
name: Run compliance checks on patch series (PR)
steps:
- name: Update PATH for west
run: |
echo "$HOME/.local/bin" >> $GITHUB_PATH
- name: Checkout the code
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
- name: Rebase onto the target branch
env:
BASE_REF: ${{ github.base_ref }}
run: |
git config --global user.email "you@example.com"
git config --global user.name "Your Name"
git remote -v
# Ensure there's no merge commits in the PR
[[ "$(git rev-list --merges --count origin/${BASE_REF}..)" == "0" ]] || \
(echo "::error ::Merge commits not allowed, rebase instead";false)
rm -fr ".git/rebase-apply"
rm -fr ".git/rebase-merge"
git rebase origin/${BASE_REF}
git clean -f -d
# debug
git log --pretty=oneline | head -n 10
- name: Set up Python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: west setup
run: |
west init -l . || true
west config manifest.group-filter -- +ci,-optional
west update -o=--depth=1 -n 2>&1 1> west.update.log || west update -o=--depth=1 -n 2>&1 1> west.update2.log
- name: Setup Node.js
uses: actions/setup-node@2028fbc5c25fe9cf00d9f06a71cc4710d4507903 # v6.0.0
with:
node-version: "lts/*"
cache: npm
check-latest: true
cache-dependency-path: ./scripts/ci/package-lock.json
- name: Install Node dependencies
run: npm --prefix ./scripts/ci ci
- name: Run Compliance Tests
continue-on-error: true
id: compliance
env:
BASE_REF: ${{ github.base_ref }}
run: |
export ZEPHYR_BASE=$PWD
# debug
ls -la
git log --pretty=oneline | head -n 10
# Increase rename limit to allow for large PRs
git config diff.renameLimit 10000
excludes="-e KconfigBasic -e SysbuildKconfigBasic -e ClangFormat"
# The signed-off-by check for dependabot should be skipped
if [ "${{ github.actor }}" == "dependabot[bot]" ]; then
excludes="$excludes -e Identity"
fi
./scripts/ci/check_compliance.py --annotate $excludes -c origin/${BASE_REF}..
- name: upload-results
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
continue-on-error: true
with:
name: compliance.xml
path: compliance.xml
- name: Upload dts linter patch
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
continue-on-error: true
if: hashFiles('dts_linter.patch') != ''
with:
name: dts_linter.patch
path: dts_linter.patch
- name: check-warns
run: |
if [[ ! -s "compliance.xml" ]]; then
exit 1;
fi
warns=("ClangFormat" "LicenseAndCopyrightCheck")
files=($(./scripts/ci/check_compliance.py -l))
for file in "${files[@]}"; do
f="${file}.txt"
if [[ -s $f ]]; then
results=$(cat $f)
results="${results//'%'/'%25'}"
results="${results//$'\n'/'%0A'}"
results="${results//$'\r'/'%0D'}"
if [[ "${warns[@]}" =~ "${file}" ]]; then
echo "::warning file=${f}::$results"
else
echo "::error file=${f}::$results"
exit=1
fi
fi
done
if [ "${exit}" == "1" ]; then
echo "Compliance error, check for error messages in the \"Run Compliance Tests\" step"
echo "You can run this step locally with the ./scripts/ci/check_compliance.py script."
exit 1;
fi


@@ -0,0 +1,91 @@
name: Compliance Checks
on: pull_request
jobs:
check_compliance:
runs-on: ubuntu-22.04
name: Run compliance checks on patch series (PR)
steps:
- name: Update PATH for west
run: |
echo "$HOME/.local/bin" >> $GITHUB_PATH
- name: Checkout the code
uses: actions/checkout@v4
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
- name: cache-pip
uses: actions/cache@v4
with:
path: ~/.cache/pip
key: ${{ runner.os }}-pip-${{ hashFiles('.github/workflows/compliance.yml') }}
- name: Install python dependencies
run: |
pip3 install setuptools
pip3 install wheel
pip3 install python-magic lxml junitparser gitlint pylint pykwalify yamllint
pip3 install west
- name: west setup
env:
BASE_REF: ${{ github.base_ref }}
run: |
git config --global user.email "you@example.com"
git config --global user.name "Your Name"
git remote -v
# Ensure there's no merge commits in the PR
[[ "$(git rev-list --merges --count origin/${BASE_REF}..)" == "0" ]] || \
(echo "::error ::Merge commits not allowed, rebase instead";false)
git rebase origin/${BASE_REF}
# debug
git log --pretty=oneline | head -n 10
west init -l . || true
west config manifest.group-filter -- +ci,-optional
west update -o=--depth=1 -n 2>&1 1> west.update.log || west update -o=--depth=1 -n 2>&1 1> west.update2.log
- name: Run Compliance Tests
continue-on-error: true
id: compliance
env:
BASE_REF: ${{ github.base_ref }}
run: |
export ZEPHYR_BASE=$PWD
# debug
ls -la
git log --pretty=oneline | head -n 10
./scripts/ci/check_compliance.py --annotate -e KconfigBasic \
-c origin/${BASE_REF}..
- name: upload-results
uses: actions/upload-artifact@v4
continue-on-error: true
with:
name: compliance.xml
path: compliance.xml
- name: check-warns
run: |
if [[ ! -s "compliance.xml" ]]; then
exit 1;
fi
files=($(./scripts/ci/check_compliance.py -l))
for file in "${files[@]}"; do
f="${file}.txt"
if [[ -s $f ]]; then
errors=$(cat $f)
errors="${errors//'%'/'%25'}"
errors="${errors//$'\n'/'%0A'}"
errors="${errors//$'\r'/'%0D'}"
echo "::error file=${f}::$errors"
exit=1
fi
done
if [ "${exit}" == "1" ]; then
exit 1;
fi


@@ -1,48 +0,0 @@
# Copyright (c) 2020 Intel Corp.
# SPDX-License-Identifier: Apache-2.0
name: Publish commit for daily testing
on:
schedule:
- cron: '50 22 * * *'
push:
branches:
- refs/tags/*
permissions:
contents: read
jobs:
get_version:
runs-on: ubuntu-24.04
if: github.repository == 'zephyrproject-rtos/zephyr'
steps:
- name: Configure AWS Credentials
uses: aws-actions/configure-aws-credentials@00943011d9042930efac3dcd3a170e4273319bc8 # v5.1.0
with:
aws-access-key-id: ${{ vars.AWS_TESTING_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_TESTING_SECRET_ACCESS_KEY }}
aws-region: us-east-1
- name: checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
fetch-depth: 0
- name: Set up Python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Upload to AWS S3
run: |
python3 scripts/ci/version_mgr.py --update .
aws s3 cp versions.json s3://testing.zephyrproject.org/daily_tests/versions.json


@@ -0,0 +1,38 @@
# Copyright (c) 2020 Intel Corp.
# SPDX-License-Identifier: Apache-2.0
name: Publish commit for daily testing
on:
schedule:
- cron: '50 22 * * *'
push:
branches:
- refs/tags/*
jobs:
get_version:
runs-on: ubuntu-22.04
if: github.repository == 'zephyrproject-rtos/zephyr'
steps:
- name: Configure AWS Credentials
uses: aws-actions/configure-aws-credentials@v2
with:
aws-access-key-id: ${{ vars.AWS_TESTING_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_TESTING_SECRET_ACCESS_KEY }}
aws-region: us-east-1
- name: install-pip
run: |
pip3 install gitpython
- name: checkout
uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Upload to AWS S3
run: |
python3 scripts/ci/version_mgr.py --update .
aws s3 cp versions.json s3://testing.zephyrproject.org/daily_tests/versions.json


@@ -1,52 +0,0 @@
# Copyright (c) 2020 Linaro Limited.
# Copyright (c) 2020 Nordic Semiconductor ASA
# SPDX-License-Identifier: Apache-2.0
name: Devicetree script tests
on:
push:
branches:
- main
- v*-branch
paths:
- 'scripts/dts/**'
- '.github/workflows/devicetree_checks.yml'
pull_request:
branches:
- main
- v*-branch
paths:
- 'scripts/dts/**'
- '.github/workflows/devicetree_checks.yml'
permissions:
contents: read
jobs:
devicetree-checks:
name: Devicetree script tests
runs-on: ${{ matrix.os }}
strategy:
matrix:
python-version: ['3.10', '3.11', '3.12', '3.13']
os: [ubuntu-22.04, macos-14, windows-2022]
steps:
- name: checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: ${{ matrix.python-version }}
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: run tox
working-directory: scripts/dts/python-devicetree
run: |
tox


@@ -0,0 +1,75 @@
# Copyright (c) 2020 Linaro Limited.
# Copyright (c) 2020 Nordic Semiconductor ASA
# SPDX-License-Identifier: Apache-2.0
name: Devicetree script tests
on:
push:
branches:
- main
- v*-branch
paths:
- 'scripts/dts/**'
- '.github/workflows/devicetree_checks.yml'
pull_request:
branches:
- main
- v*-branch
paths:
- 'scripts/dts/**'
- '.github/workflows/devicetree_checks.yml'
jobs:
devicetree-checks:
name: Devicetree script tests
runs-on: ${{ matrix.os }}
strategy:
matrix:
python-version: [3.8, 3.9, '3.10', '3.11', '3.12']
os: [ubuntu-22.04, macos-11, windows-2022]
exclude:
- os: macos-11
python-version: 3.6
- os: windows-2022
python-version: 3.6
steps:
- name: checkout
uses: actions/checkout@v4
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v4
with:
python-version: ${{ matrix.python-version }}
- name: cache-pip-linux
if: startsWith(runner.os, 'Linux')
uses: actions/cache@v4
with:
path: ~/.cache/pip
key: ${{ runner.os }}-pip-${{ matrix.python-version }}
restore-keys: |
${{ runner.os }}-pip-${{ matrix.python-version }}
- name: cache-pip-mac
if: startsWith(runner.os, 'macOS')
uses: actions/cache@v4
with:
path: ~/Library/Caches/pip
# Trailing '-' was just to get a different cache name
key: ${{ runner.os }}-pip-${{ matrix.python-version }}-
restore-keys: |
${{ runner.os }}-pip-${{ matrix.python-version }}-
- name: cache-pip-win
if: startsWith(runner.os, 'Windows')
uses: actions/cache@v4
with:
path: ~\AppData\Local\pip\Cache
key: ${{ runner.os }}-pip-${{ matrix.python-version }}
restore-keys: |
${{ runner.os }}-pip-${{ matrix.python-version }}
- name: install python dependencies
run: |
pip3 install wheel
pip3 install pytest pyyaml tox
- name: run tox
working-directory: scripts/dts/python-devicetree
run: |
tox


@@ -0,0 +1,20 @@
name: Do Not Merge
on:
pull_request:
types: [synchronize, opened, reopened, labeled, unlabeled]
jobs:
do-not-merge:
if: ${{ contains(github.event.*.labels.*.name, 'DNM') ||
contains(github.event.*.labels.*.name, 'TSC') ||
contains(github.event.*.labels.*.name, 'Architecture Review') ||
contains(github.event.*.labels.*.name, 'dev-review') }}
name: Prevent Merging
runs-on: ubuntu-22.04
steps:
- name: Check for label
run: |
echo "Pull request is labeled as 'DNM', 'TSC', 'Architecture Review' or 'dev-review'."
echo "This workflow fails so that the pull request cannot be merged."
exit 1


@@ -1,279 +0,0 @@
# Copyright (c) 2020 Linaro Limited.
# SPDX-License-Identifier: Apache-2.0
name: Documentation Build
on:
push:
branches:
- main
tags:
- v*
pull_request:
permissions:
contents: read
env:
DOXYGEN_VERSION: 1.15.0
DOXYGEN_SHA256SUM: 0ec2e5b2c3cd82b7106d19cb42d8466450730b8cb7a9e85af712be38bf4523a1
JOB_COUNT: 8
jobs:
doc-file-check:
name: Check for doc changes
runs-on: ubuntu-24.04
outputs:
file_check: ${{ steps.check-doc-files.outputs.any_modified }}
steps:
- name: checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
- name: Check if Documentation related files changed
uses: tj-actions/changed-files@24d32ffd492484c1d75e0c0b894501ddb9d30d62 # v47.0.0
id: check-doc-files
with:
files: |
doc/
boards/**/doc/
**.rst
include/
kernel/include/kernel_arch_interface.h
lib/libc/**
subsys/testsuite/ztest/include/**
**/Kconfig*
west.yml
scripts/dts/
doc/requirements.txt
.github/workflows/doc-build.yml
scripts/pylib/pytest-twister-harness/src/twister_harness/device/device_adapter.py
scripts/pylib/pytest-twister-harness/src/twister_harness/helpers/shell.py
doc-build-html:
name: "Documentation Build (HTML)"
needs: [doc-file-check]
if: >
needs.doc-file-check.outputs.file_check == 'true' || github.event_name != 'pull_request'
runs-on:
group: zephyr-runner-v2-linux-x64-4xlarge
container:
image: ghcr.io/zephyrproject-rtos/ci-repo-cache:v0.28.7.20251127
options: '--entrypoint /bin/bash'
timeout-minutes: 60
concurrency:
group: doc-build-html-${{ github.ref }}
cancel-in-progress: true
env:
BASE_REF: ${{ github.base_ref }}
steps:
- name: Print cloud service information
run: |
echo "ZEPHYR_RUNNER_CLOUD_PROVIDER = ${ZEPHYR_RUNNER_CLOUD_PROVIDER}"
echo "ZEPHYR_RUNNER_CLOUD_NODE = ${ZEPHYR_RUNNER_CLOUD_NODE}"
echo "ZEPHYR_RUNNER_CLOUD_POD = ${ZEPHYR_RUNNER_CLOUD_POD}"
- name: Apply container owner mismatch workaround
run: |
# FIXME: The owner UID of the GITHUB_WORKSPACE directory may not
# match the container user UID because of the way GitHub
# Actions runner is implemented. Remove this workaround when
# GitHub comes up with a fundamental fix for this problem.
git config --global --add safe.directory ${GITHUB_WORKSPACE}
- name: Clone cached Zephyr repository
continue-on-error: true
run: |
git clone --shared /repo-cache/zephyrproject/zephyr .
git remote set-url origin ${GITHUB_SERVER_URL}/${GITHUB_REPOSITORY}
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
- name: Environment Setup
run: |
if [ "${{github.event_name}}" = "pull_request" ]; then
git config --global user.email "bot@zephyrproject.org"
git config --global user.name "Zephyr Builder"
rm -fr ".git/rebase-apply"
rm -fr ".git/rebase-merge"
git rebase origin/${BASE_REF}
git clean -f -d
git log --pretty=oneline | head -n 10
fi
echo "$HOME/.local/bin" >> $GITHUB_PATH
echo "$HOME/.cargo/bin" >> $GITHUB_PATH
west init -l . || true
west config --global update.narrow true
west update --path-cache /repo-cache/zephyrproject 2>&1 1> west.update.log || west update --path-cache /repo-cache/zephyrproject 2>&1 1> west.update.log || ( rm -rf ../modules ../bootloader ../tools && west update --path-cache /repo-cache/zephyrproject)
west forall -c 'git reset --hard HEAD'
echo "ZEPHYR_SDK_INSTALL_DIR=/opt/toolchains/zephyr-sdk-$( cat SDK_VERSION )" >> $GITHUB_ENV
- name: Install Python packages required for documentation build
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
pip install -r doc/requirements.txt --require-hashes
- name: Build HTML documentation
shell: bash
run: |
if [[ "$GITHUB_REF" =~ "refs/tags/v" ]]; then
DOC_TAG="release"
else
DOC_TAG="development"
fi
if [[ "${{ github.event_name }}" == "pull_request" ]]; then
DOC_TARGET="html-fast"
else
DOC_TARGET="html"
fi
DOC_TAG=${DOC_TAG} \
SPHINXOPTS="-j ${JOB_COUNT} -W --keep-going -T" \
SPHINXOPTS_EXTRA="-q -t publish" \
make -C doc ${DOC_TARGET}
# API documentation coverage
python3 -m coverxygen --xml-dir doc/_build/html/doxygen/xml/ --src-dir include/ --output doc-coverage.info
# deprecated page causing issues
lcov --remove doc-coverage.info \*/deprecated > new.info
genhtml --no-function-coverage --no-branch-coverage new.info -o coverage-report
- name: Compress documentation build artifacts
run: |
tar --use-compress-program="xz -T0" -cf html-output.tar.xz --exclude html/_sources --exclude html/doxygen/xml --directory=doc/_build html
tar --use-compress-program="xz -T0" -cf api-output.tar.xz --directory=doc/_build html/doxygen/html
tar --use-compress-program="xz -T0" -cf api-coverage.tar.xz coverage-report
- name: Upload HTML output
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
with:
name: html-output
path: html-output.tar.xz
- name: Upload Doxygen coverage artifacts
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
with:
name: api-coverage
path: api-coverage.tar.xz
- name: Summarize PR documentation URLs
if: github.event_name == 'pull_request'
run: |
REPO_NAME="${{ github.event.repository.name }}"
PR_NUM="${{ github.event.pull_request.number }}"
DOC_URL="https://builds.zephyrproject.io/${REPO_NAME}/pr/${PR_NUM}/docs/"
API_DOC_URL="https://builds.zephyrproject.io/${REPO_NAME}/pr/${PR_NUM}/docs/doxygen/html/"
API_COVERAGE_URL="https://builds.zephyrproject.io/${REPO_NAME}/pr/${PR_NUM}/api-coverage/"
echo "${PR_NUM}" > pr_num
echo "Documentation will be available shortly at: ${DOC_URL}" >> $GITHUB_STEP_SUMMARY
echo "API Documentation will be available shortly at: ${API_DOC_URL}" >> $GITHUB_STEP_SUMMARY
echo "API Coverage Report will be available shortly at: ${API_COVERAGE_URL}" >> $GITHUB_STEP_SUMMARY
- name: Upload PR number
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
if: github.event_name == 'pull_request'
with:
name: pr_num
path: pr_num
doc-build-pdf:
name: "Documentation Build (PDF)"
needs: [doc-file-check]
if: |
github.event_name != 'pull_request'
runs-on: ubuntu-24.04
timeout-minutes: 120
concurrency:
group: doc-build-pdf-${{ github.ref }}
cancel-in-progress: true
steps:
- name: checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
path: zephyr
- name: Set up Python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: doc/requirements.txt
- name: install-pkgs
run: |
sudo apt-get update
sudo apt-get install --no-install-recommends graphviz librsvg2-bin \
texlive-latex-base texlive-latex-extra latexmk \
texlive-fonts-recommended texlive-fonts-extra texlive-xetex \
imagemagick fonts-noto xindy
wget --no-verbose "https://github.com/doxygen/doxygen/releases/download/Release_${DOXYGEN_VERSION//./_}/doxygen-${DOXYGEN_VERSION}.linux.bin.tar.gz"
echo "${DOXYGEN_SHA256SUM} doxygen-${DOXYGEN_VERSION}.linux.bin.tar.gz" | sha256sum -c
if [ $? -ne 0 ]; then
echo "Failed to verify doxygen tarball"
exit 1
fi
sudo tar xf doxygen-${DOXYGEN_VERSION}.linux.bin.tar.gz -C /opt
echo "/opt/doxygen-${DOXYGEN_VERSION}/bin" >> $GITHUB_PATH
- name: Setup Zephyr project
uses: zephyrproject-rtos/action-zephyr-setup@c125c5ebeeadbd727fa740b407f862734af1e52a # v1.0.9
with:
app-path: zephyr
toolchains: 'arm-zephyr-eabi'
enable-ccache: false
- name: install-pip-pkgs
working-directory: zephyr
run: |
pip install -r doc/requirements.txt --require-hashes
- name: build-docs
shell: bash
working-directory: zephyr
continue-on-error: true
run: |
if [[ "$GITHUB_REF" =~ "refs/tags/v" ]]; then
DOC_TAG="release"
else
DOC_TAG="development"
fi
DOC_TAG=${DOC_TAG} \
SPHINXOPTS="-q -j ${JOB_COUNT}" \
LATEXMKOPTS="-quiet -halt-on-error" \
make -C doc pdf
- name: upload-build
if: always()
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
with:
name: pdf-output
if-no-files-found: ignore
path: |
zephyr/doc/_build/latex/zephyr.pdf
zephyr/doc/_build/latex/zephyr.log
doc-build-status-check:
if: always()
name: "Documentation Build Status"
needs:
- doc-build-pdf
- doc-file-check
- doc-build-html
uses: ./.github/workflows/ready-to-merge.yml
with:
needs_context: ${{ toJson(needs) }}

.github/workflows/doc-build.yml.disabled

@@ -0,0 +1,234 @@
# Copyright (c) 2020 Linaro Limited.
# SPDX-License-Identifier: Apache-2.0
name: Documentation Build
on:
schedule:
- cron: '0 */3 * * *'
push:
tags:
- v*
pull_request:
paths:
- 'doc/**'
- '**.rst'
- 'include/**'
- 'kernel/include/kernel_arch_interface.h'
- 'lib/libc/**'
- 'subsys/testsuite/ztest/include/**'
- 'tests/**'
- '**/Kconfig*'
- 'west.yml'
- '.github/workflows/doc-build.yml'
- 'scripts/dts/**'
- 'doc/requirements.txt'
env:
# NOTE: west docstrings will be extracted from the version listed here
WEST_VERSION: 1.2.0
# The latest CMake available directly with apt is 3.18, but we need >=3.20
# so we fetch that through pip.
CMAKE_VERSION: 3.20.5
DOXYGEN_VERSION: 1.9.6
jobs:
doc-build-html:
name: "Documentation Build (HTML)"
if: github.repository_owner == 'zephyrproject-rtos'
runs-on:
group: zephyr-runner-v2-linux-x64-4xlarge
timeout-minutes: 45
concurrency:
group: doc-build-html-${{ github.ref }}
cancel-in-progress: true
steps:
- name: Print cloud service information
run: |
echo "ZEPHYR_RUNNER_CLOUD_PROVIDER = ${ZEPHYR_RUNNER_CLOUD_PROVIDER}"
echo "ZEPHYR_RUNNER_CLOUD_NODE = ${ZEPHYR_RUNNER_CLOUD_NODE}"
echo "ZEPHYR_RUNNER_CLOUD_POD = ${ZEPHYR_RUNNER_CLOUD_POD}"
- name: install-pkgs
run: |
sudo apt-get update
sudo apt-get install -y wget python3-pip git ninja-build graphviz lcov
wget --no-verbose "https://github.com/doxygen/doxygen/releases/download/Release_${DOXYGEN_VERSION//./_}/doxygen-${DOXYGEN_VERSION}.linux.bin.tar.gz"
sudo tar xf doxygen-${DOXYGEN_VERSION}.linux.bin.tar.gz -C /opt
echo "/opt/doxygen-${DOXYGEN_VERSION}/bin" >> $GITHUB_PATH
echo "${HOME}/.local/bin" >> $GITHUB_PATH
- name: checkout
uses: actions/checkout@v4
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
- name: Rebase
if: github.event_name == 'pull_request'
continue-on-error: true
env:
BASE_REF: ${{ github.base_ref }}
PR_HEAD: ${{ github.event.pull_request.head.sha }}
run: |
git config --global user.email "actions@zephyrproject.org"
git config --global user.name "Github Actions"
git rebase origin/${BASE_REF}
git log --graph --oneline HEAD...${PR_HEAD}
- name: cache-pip
uses: actions/cache@v4
with:
path: ~/.cache/pip
key: pip-${{ hashFiles('doc/requirements.txt') }}
- name: install-pip
run: |
sudo pip3 install -U setuptools wheel pip
pip3 install -r doc/requirements.txt
pip3 install west==${WEST_VERSION}
pip3 install cmake==${CMAKE_VERSION}
pip3 install coverxygen
- name: west setup
run: |
west init -l .
- name: build-docs
shell: bash
run: |
if [[ "$GITHUB_REF" =~ "refs/tags/v" ]]; then
DOC_TAG="release"
else
DOC_TAG="development"
fi
if [[ "${{ github.event_name }}" == "pull_request" ]]; then
DOC_TARGET="html-fast"
else
DOC_TARGET="html"
fi
DOC_TAG=${DOC_TAG} SPHINXOPTS_EXTRA="-q -t publish" make -C doc ${DOC_TARGET}
# API documentation coverage
python3 -m coverxygen --xml-dir doc/_build/html/doxygen/xml/ --src-dir include/ --output doc-coverage.info
# deprecated page causing issues
lcov --remove doc-coverage.info \*/deprecated > new.info
genhtml --no-function-coverage --no-branch-coverage new.info -o coverage-report
- name: compress-docs
run: |
tar cfJ html-output.tar.xz --directory=doc/_build html
tar cfJ api-output.tar.xz --directory=doc/_build html/doxygen/html
tar cfJ api-coverage.tar.xz coverage-report
- name: upload-build
uses: actions/upload-artifact@v4
with:
name: html-output
path: html-output.tar.xz
- name: upload-api-coverage
uses: actions/upload-artifact@v4
with:
name: api-coverage
path: api-coverage.tar.xz
- name: process-pr
if: github.event_name == 'pull_request'
run: |
REPO_NAME="${{ github.event.repository.name }}"
PR_NUM="${{ github.event.pull_request.number }}"
DOC_URL="https://builds.zephyrproject.io/${REPO_NAME}/pr/${PR_NUM}/docs/"
API_DOC_URL="https://builds.zephyrproject.io/${REPO_NAME}/pr/${PR_NUM}/docs/doxygen/html/"
API_COVERAGE_URL="https://builds.zephyrproject.io/${REPO_NAME}/pr/${PR_NUM}/api-coverage/"
echo "${PR_NUM}" > pr_num
echo "Documentation will be available shortly at: ${DOC_URL}" >> $GITHUB_STEP_SUMMARY
echo "API Documentation will be available shortly at: ${API_DOC_URL}" >> $GITHUB_STEP_SUMMARY
echo "API Coverage Report will be available shortly at: ${API_COVERAGE_URL}" >> $GITHUB_STEP_SUMMARY
- name: upload-pr-number
uses: actions/upload-artifact@v4
if: github.event_name == 'pull_request'
with:
name: pr_num
path: pr_num
doc-build-pdf:
name: "Documentation Build (PDF)"
if: |
github.event_name != 'pull_request' &&
github.repository_owner == 'zephyrproject-rtos'
runs-on:
group: zephyr-runner-v2-linux-x64-4xlarge
container: texlive/texlive:latest
timeout-minutes: 60
concurrency:
group: doc-build-pdf-${{ github.ref }}
cancel-in-progress: true
steps:
- name: Apply container owner mismatch workaround
run: |
git config --global --add safe.directory ${GITHUB_WORKSPACE}
- name: Print cloud service information
run: |
echo "ZEPHYR_RUNNER_CLOUD_PROVIDER = ${ZEPHYR_RUNNER_CLOUD_PROVIDER}"
echo "ZEPHYR_RUNNER_CLOUD_NODE = ${ZEPHYR_RUNNER_CLOUD_NODE}"
echo "ZEPHYR_RUNNER_CLOUD_POD = ${ZEPHYR_RUNNER_CLOUD_POD}"
- name: checkout
uses: actions/checkout@v4
- name: install-pkgs
run: |
apt-get update
apt-get install -y python3-pip python3-venv ninja-build doxygen graphviz librsvg2-bin
- name: cache-pip
uses: actions/cache@v4
with:
path: ~/.cache/pip
key: pip-${{ hashFiles('doc/requirements.txt') }}
- name: setup-venv
run: |
python3 -m venv .venv
. .venv/bin/activate
echo PATH=$PATH >> $GITHUB_ENV
- name: install-pip
run: |
pip3 install -U setuptools wheel pip
pip3 install -r doc/requirements.txt
pip3 install west==${WEST_VERSION}
pip3 install cmake==${CMAKE_VERSION}
- name: west setup
run: |
west init -l .
- name: build-docs
shell: bash
continue-on-error: true
run: |
if [[ "$GITHUB_REF" =~ "refs/tags/v" ]]; then
DOC_TAG="release"
else
DOC_TAG="development"
fi
DOC_TAG=${DOC_TAG} SPHINXOPTS="-q -j auto" LATEXMKOPTS="-quiet -halt-on-error" make -C doc pdf
- name: upload-build
if: always()
uses: actions/upload-artifact@v4
with:
name: pdf-output
if-no-files-found: ignore
path: |
doc/_build/latex/zephyr.pdf
doc/_build/latex/zephyr.log


@@ -1,87 +0,0 @@
# Copyright (c) 2020 Linaro Limited.
# Copyright (c) 2021 Nordic Semiconductor ASA
# SPDX-License-Identifier: Apache-2.0
name: Documentation Publish (Pull Request)
on:
workflow_run:
workflows: ["Documentation Build"]
types:
- completed
permissions:
contents: read
jobs:
doc-publish:
name: Publish Documentation
runs-on: ubuntu-24.04
if: |
github.event.workflow_run.event == 'pull_request' &&
github.event.workflow_run.conclusion == 'success' &&
github.repository == 'zephyrproject-rtos/zephyr'
steps:
- name: Download artifacts
id: download-artifacts
uses: dawidd6/action-download-artifact@ac66b43f0e6a346234dd65d4d0c8fbb31cb316e5 # v11
with:
workflow: doc-build.yml
run_id: ${{ github.event.workflow_run.id }}
if_no_artifact_found: ignore
- name: Load PR number
if: steps.download-artifacts.outputs.found_artifact == 'true'
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # v8.0.0
with:
script: |
let fs = require("fs");
let pr_number = Number(fs.readFileSync("./pr_num/pr_num"));
core.exportVariable("PR_NUM", pr_number);
- name: Check PR number
if: steps.download-artifacts.outputs.found_artifact == 'true'
id: check-pr
uses: carpentries/actions/check-valid-pr@2e20fd5ee53b691e27455ce7ca3b16ea885140e8 # v0.15.0
with:
pr: ${{ env.PR_NUM }}
sha: ${{ github.event.workflow_run.head_sha }}
- name: Validate PR number
if: |
steps.download-artifacts.outputs.found_artifact == 'true' &&
steps.check-pr.outputs.VALID != 'true'
run: |
echo "ABORT: PR number validation failed!"
exit 1
- name: Uncompress HTML docs
if: steps.download-artifacts.outputs.found_artifact == 'true'
run: |
tar xf html-output/html-output.tar.xz -C html-output
if [ -f api-coverage/api-coverage.tar.xz ]; then
tar xf api-coverage/api-coverage.tar.xz -C api-coverage
fi
- name: Configure AWS Credentials
if: steps.download-artifacts.outputs.found_artifact == 'true'
uses: aws-actions/configure-aws-credentials@00943011d9042930efac3dcd3a170e4273319bc8 # v5.1.0
with:
aws-access-key-id: ${{ vars.AWS_BUILDS_ZEPHYR_PR_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_BUILDS_ZEPHYR_PR_SECRET_ACCESS_KEY }}
aws-region: us-east-1
- name: Upload to AWS S3
if: steps.download-artifacts.outputs.found_artifact == 'true'
env:
HEAD_BRANCH: ${{ github.event.workflow_run.head_branch }}
run: |
aws s3 sync --quiet html-output/html \
s3://builds.zephyrproject.org/${{ github.event.repository.name }}/pr/${PR_NUM}/docs \
--delete
if [ -d api-coverage/coverage-report ]; then
aws s3 sync --quiet api-coverage/coverage-report/ \
s3://builds.zephyrproject.org/${{ github.event.repository.name }}/pr/${PR_NUM}/api-coverage \
--delete
fi


@@ -0,0 +1,67 @@
# Copyright (c) 2020 Linaro Limited.
# Copyright (c) 2021 Nordic Semiconductor ASA
# SPDX-License-Identifier: Apache-2.0
name: Documentation Publish (Pull Request)
on:
workflow_run:
workflows: ["Documentation Build"]
types:
- completed
jobs:
doc-publish:
name: Publish Documentation
runs-on: ubuntu-22.04
if: |
github.event.workflow_run.event == 'pull_request' &&
github.event.workflow_run.conclusion == 'success' &&
github.repository == 'zephyrproject-rtos/zephyr'
steps:
- name: Download artifacts
uses: dawidd6/action-download-artifact@v2
with:
workflow: doc-build.yml
run_id: ${{ github.event.workflow_run.id }}
- name: Load PR number
run: |
echo "PR_NUM=$(<pr_num/pr_num)" >> $GITHUB_ENV
- name: Check PR number
id: check-pr
uses: carpentries/actions/check-valid-pr@v0.14.0
with:
pr: ${{ env.PR_NUM }}
sha: ${{ github.event.workflow_run.head_sha }}
- name: Validate PR number
if: steps.check-pr.outputs.VALID != 'true'
run: |
echo "ABORT: PR number validation failed!"
exit 1
- name: Uncompress HTML docs
run: |
tar xf html-output/html-output.tar.xz -C html-output
tar xf api-coverage/api-coverage.tar.xz -C api-coverage
- name: Configure AWS Credentials
uses: aws-actions/configure-aws-credentials@v2
with:
aws-access-key-id: ${{ vars.AWS_BUILDS_ZEPHYR_PR_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_BUILDS_ZEPHYR_PR_SECRET_ACCESS_KEY }}
aws-region: us-east-1
- name: Upload to AWS S3
env:
HEAD_BRANCH: ${{ github.event.workflow_run.head_branch }}
run: |
aws s3 sync --quiet html-output/html \
s3://builds.zephyrproject.org/${{ github.event.repository.name }}/pr/${PR_NUM}/docs \
--delete
aws s3 sync --quiet api-coverage/coverage-report/ \
s3://builds.zephyrproject.org/${{ github.event.repository.name }}/pr/${PR_NUM}/api-coverage \
--delete


@@ -1,64 +0,0 @@
# Copyright (c) 2020 Linaro Limited.
# Copyright (c) 2021 Nordic Semiconductor ASA
# SPDX-License-Identifier: Apache-2.0
name: Documentation Publish
on:
workflow_run:
workflows: ["Documentation Build"]
branches:
- main
- v*
types:
- completed
permissions:
contents: read
jobs:
doc-publish:
name: Publish Documentation
runs-on: ubuntu-24.04
if: |
github.event.workflow_run.event != 'pull_request' &&
github.event.workflow_run.conclusion == 'success' &&
github.repository == 'zephyrproject-rtos/zephyr'
steps:
- name: Download artifacts
uses: dawidd6/action-download-artifact@ac66b43f0e6a346234dd65d4d0c8fbb31cb316e5 # v11
with:
workflow: doc-build.yml
run_id: ${{ github.event.workflow_run.id }}
- name: Uncompress HTML docs
run: |
tar xf html-output/html-output.tar.xz -C html-output
if [ -f api-coverage/api-coverage.tar.xz ]; then
tar xf api-coverage/api-coverage.tar.xz -C api-coverage
fi
- name: Configure AWS Credentials
uses: aws-actions/configure-aws-credentials@00943011d9042930efac3dcd3a170e4273319bc8 # v5.1.0
with:
aws-access-key-id: ${{ vars.AWS_DOCS_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_DOCS_SECRET_ACCESS_KEY }}
aws-region: us-east-1
- name: Upload to AWS S3
env:
HEAD_BRANCH: ${{ github.event.workflow_run.head_branch }}
run: |
if [ "${HEAD_BRANCH:0:1}" == "v" ]; then
VERSION=${HEAD_BRANCH:1}
else
VERSION="latest"
fi
aws s3 sync --quiet html-output/html s3://docs.zephyrproject.org/${VERSION} --delete
aws s3 sync --quiet html-output/html/doxygen/html s3://docs.zephyrproject.org/apidoc/${VERSION} --delete
if [ -d api-coverage/coverage-report ]; then
aws s3 sync --quiet api-coverage/coverage-report/ s3://docs.zephyrproject.org/api-coverage/${VERSION} --delete
fi
aws s3 cp --quiet pdf-output/zephyr.pdf s3://docs.zephyrproject.org/${VERSION}/zephyr.pdf


@@ -0,0 +1,57 @@
# Copyright (c) 2020 Linaro Limited.
# Copyright (c) 2021 Nordic Semiconductor ASA
# SPDX-License-Identifier: Apache-2.0
name: Documentation Publish
on:
workflow_run:
workflows: ["Documentation Build"]
branches:
- main
- v*
types:
- completed
jobs:
doc-publish:
name: Publish Documentation
runs-on: ubuntu-22.04
if: |
github.event.workflow_run.event != 'pull_request' &&
github.event.workflow_run.conclusion == 'success' &&
github.repository == 'zephyrproject-rtos/zephyr'
steps:
- name: Download artifacts
uses: dawidd6/action-download-artifact@v2
with:
workflow: doc-build.yml
run_id: ${{ github.event.workflow_run.id }}
- name: Uncompress HTML docs
run: |
tar xf html-output/html-output.tar.xz -C html-output
tar xf api-coverage/api-coverage.tar.xz -C api-coverage
- name: Configure AWS Credentials
uses: aws-actions/configure-aws-credentials@v2
with:
aws-access-key-id: ${{ vars.AWS_DOCS_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_DOCS_SECRET_ACCESS_KEY }}
aws-region: us-east-1
- name: Upload to AWS S3
env:
HEAD_BRANCH: ${{ github.event.workflow_run.head_branch }}
run: |
if [ "${HEAD_BRANCH:0:1}" == "v" ]; then
VERSION=${HEAD_BRANCH:1}
else
VERSION="latest"
fi
aws s3 sync --quiet html-output/html s3://docs.zephyrproject.org/${VERSION} --delete
aws s3 sync --quiet html-output/html/doxygen/html s3://docs.zephyrproject.org/apidoc/${VERSION} --delete
aws s3 sync --quiet api-coverage/coverage-report/ s3://docs.zephyrproject.org/api-coverage/${VERSION} --delete
aws s3 cp --quiet pdf-output/zephyr.pdf s3://docs.zephyrproject.org/${VERSION}/zephyr.pdf


@@ -1,36 +0,0 @@
name: Error numbers
on:
pull_request:
paths:
- '.github/workflows/errno.yml'
- 'lib/libc/minimal/include/errno.h'
- 'scripts/ci/errno.py'
- 'SDK_VERSION'
permissions:
contents: read
jobs:
check-errno:
runs-on: ubuntu-24.04
steps:
- name: checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
path: zephyr
- name: Setup Zephyr project
uses: zephyrproject-rtos/action-zephyr-setup@c125c5ebeeadbd727fa740b407f862734af1e52a # v1.0.9
with:
app-path: zephyr
toolchains: 'arm-zephyr-eabi'
west-group-filter: -hal,-tools,-bootloader,-babblesim
west-project-filter: -nrf_hw_models
enable-ccache: false
- name: Run errno.py
working-directory: zephyr
run: |
export ZEPHYR_SDK_INSTALL_DIR=${{ github.workspace }}/zephyr-sdk
export ZEPHYR_BASE=${PWD}
./scripts/ci/errno.py

.github/workflows/errno.yml.disabled

@@ -0,0 +1,34 @@
name: Error numbers
on:
pull_request:
paths:
- '.github/workflows/errno.yml'
- 'lib/libc/minimal/include/errno.h'
- 'scripts/ci/errno.py'
jobs:
check-errno:
runs-on: ubuntu-22.04
container:
image: ghcr.io/zephyrproject-rtos/ci:v0.26.7
steps:
- name: Apply container owner mismatch workaround
run: |
# FIXME: The owner UID of the GITHUB_WORKSPACE directory may not
# match the container user UID because of the way GitHub
# Actions runner is implemented. Remove this workaround when
# GitHub comes up with a fundamental fix for this problem.
git config --global --add safe.directory ${GITHUB_WORKSPACE}
- name: checkout
uses: actions/checkout@v4
- name: Environment Setup
run: |
echo "ZEPHYR_SDK_INSTALL_DIR=/opt/toolchains/zephyr-sdk-$( cat SDK_VERSION )" >> $GITHUB_ENV
- name: Run errno.py
run: |
export ZEPHYR_BASE=${PWD}
./scripts/ci/errno.py


@@ -1,138 +0,0 @@
name: Footprint Tracking
on:
schedule:
- cron: '50 00 * * *'
push:
paths:
- 'VERSION'
- '.github/workflows/footprint-tracking.yml'
branches:
- main
- v*-branch
tags:
# only publish v* tags, do not care about zephyr-v* which point to the
# same commit
- 'v*'
permissions:
contents: read
concurrency:
group: ${{ github.workflow }}-${{ github.event_name }}-${{ github.head_ref || github.ref }}
cancel-in-progress: true
jobs:
footprint-tracking:
runs-on:
group: zephyr-runner-v2-linux-x64-4xlarge
if: github.repository_owner == 'zephyrproject-rtos'
container:
image: ghcr.io/zephyrproject-rtos/ci-repo-cache:v0.28.7.20251127
options: '--entrypoint /bin/bash'
defaults:
run:
shell: bash
strategy:
fail-fast: false
env:
ZEPHYR_TOOLCHAIN_VARIANT: zephyr
steps:
- name: Apply container owner mismatch workaround
run: |
# FIXME: The owner UID of the GITHUB_WORKSPACE directory may not
# match the container user UID because of the way GitHub
# Actions runner is implemented. Remove this workaround when
# GitHub comes up with a fundamental fix for this problem.
git config --global --add safe.directory ${GITHUB_WORKSPACE}
- name: Print cloud service information
run: |
echo "ZEPHYR_RUNNER_CLOUD_PROVIDER = ${ZEPHYR_RUNNER_CLOUD_PROVIDER}"
echo "ZEPHYR_RUNNER_CLOUD_NODE = ${ZEPHYR_RUNNER_CLOUD_NODE}"
echo "ZEPHYR_RUNNER_CLOUD_POD = ${ZEPHYR_RUNNER_CLOUD_POD}"
- name: Update PATH for west
run: |
echo "$HOME/.local/bin" >> $GITHUB_PATH
- name: Install packages
run: |
sudo apt-get update
sudo apt-get install -y python3-venv
- name: checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
- name: Set up Python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Environment Setup
run: |
echo "ZEPHYR_SDK_INSTALL_DIR=/opt/toolchains/zephyr-sdk-$( cat SDK_VERSION )" >> $GITHUB_ENV
- name: west setup
run: |
west init -l . || true
west config --global update.narrow true
west update 2>&1 1> west.update.log || west update 2>&1 1> west.update2.log
- name: Configure AWS Credentials
uses: aws-actions/configure-aws-credentials@00943011d9042930efac3dcd3a170e4273319bc8 # v5.1.0
with:
aws-access-key-id: ${{ vars.AWS_TESTING_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_TESTING_SECRET_ACCESS_KEY }}
aws-region: us-east-1
- name: Record Footprint
env:
BASE_REF: ${{ github.base_ref }}
run: |
export ZEPHYR_BASE=${PWD}
./scripts/footprint/track.py -p scripts/footprint/plan.txt
- name: Upload footprint data
run: |
python3 -m venv .venv
. .venv/bin/activate
aws s3 sync --quiet footprint_data/ s3://testing.zephyrproject.org/footprint_data/
- name: Transform Footprint data to Twister JSON reports
run: |
shopt -s globstar
export ZEPHYR_BASE=${PWD}
python3 ./scripts/footprint/pack_as_twister.py -vvv \
--plan ./scripts/footprint/plan.txt \
--test-name='name.feature' \
./footprint_data/**/
- name: Upload to ElasticSearch
env:
ELASTICSEARCH_KEY: ${{ secrets.ELASTICSEARCH_KEY }}
ELASTICSEARCH_SERVER: "https://elasticsearch.zephyrproject.io:443"
ELASTICSEARCH_INDEX: ${{ vars.FOOTPRINT_TRACKING_INDEX }}
run: |
shopt -s globstar
run_date=`date --iso-8601=minutes`
python3 ./scripts/ci/upload_test_results_es.py -r ${run_date} \
--flatten footprint \
--flatten-list-names "{'children':'name'}" \
--transform "{ 'footprint_name': '^(?P<footprint_area>([^\/]+\/){0,2})(?P<footprint_path>([^\/]*\/)*)(?P<footprint_symbol>[^\/]*)$' }" \
--run-id "${{ github.run_id }}" \
--run-attempt "${{ github.run_attempt }}" \
--run-workflow "footprint-tracking:${{ github.event_name }}" \
--run-branch "${{ github.ref_name }}" \
-i ${ELASTICSEARCH_INDEX} \
./footprint_data/**/twister_footprint.json
#
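
For reference, the `--transform` expression passed to upload_test_results_es.py above splits each footprint record name into area, path, and symbol components via named groups. A minimal Python sketch of the same parsing, using a hypothetical record name for illustration (the escaped slashes in the YAML are optional in Python and dropped here):

import re

# Same named-group pattern as the --transform argument above:
# footprint_name -> footprint_area / footprint_path / footprint_symbol
PATTERN = re.compile(
    r"^(?P<footprint_area>([^/]+/){0,2})"
    r"(?P<footprint_path>([^/]*/)*)"
    r"(?P<footprint_symbol>[^/]*)$"
)

# Hypothetical footprint record name, for illustration only.
name = "ROM/drivers/serial/uart_ns16550.c/uart_poll_out"

m = PATTERN.match(name)
if m:
    print(m.group("footprint_area"))    # ROM/drivers/   (first two path components)
    print(m.group("footprint_path"))    # serial/uart_ns16550.c/
    print(m.group("footprint_symbol"))  # uart_poll_out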


@@ -0,0 +1,95 @@
name: Footprint Tracking
# Run every 12 hours and on tags
on:
schedule:
- cron: '50 1/12 * * *'
push:
paths:
- 'VERSION'
- '.github/workflows/footprint-tracking.yml'
branches:
- main
- v*-branch
tags:
# only publish v* tags, do not care about zephyr-v* which point to the
# same commit
- 'v*'
concurrency:
group: ${{ github.workflow }}-${{ github.event_name }}-${{ github.head_ref || github.ref }}
cancel-in-progress: true
jobs:
footprint-tracking:
runs-on:
group: zephyr-runner-v2-linux-x64-4xlarge
if: github.repository_owner == 'zephyrproject-rtos'
container:
image: ghcr.io/zephyrproject-rtos/ci-repo-cache:v0.26.7.20240313
options: '--entrypoint /bin/bash'
strategy:
fail-fast: false
env:
ZEPHYR_TOOLCHAIN_VARIANT: zephyr
steps:
- name: Apply container owner mismatch workaround
run: |
# FIXME: The owner UID of the GITHUB_WORKSPACE directory may not
# match the container user UID because of the way GitHub
# Actions runner is implemented. Remove this workaround when
# GitHub comes up with a fundamental fix for this problem.
git config --global --add safe.directory ${GITHUB_WORKSPACE}
- name: Print cloud service information
run: |
echo "ZEPHYR_RUNNER_CLOUD_PROVIDER = ${ZEPHYR_RUNNER_CLOUD_PROVIDER}"
echo "ZEPHYR_RUNNER_CLOUD_NODE = ${ZEPHYR_RUNNER_CLOUD_NODE}"
echo "ZEPHYR_RUNNER_CLOUD_POD = ${ZEPHYR_RUNNER_CLOUD_POD}"
- name: Update PATH for west
run: |
echo "$HOME/.local/bin" >> $GITHUB_PATH
- name: Install packages
run: |
sudo apt-get update
sudo apt-get install -y python3-venv
sudo pip3 install -U setuptools wheel pip gitpython
- name: checkout
uses: actions/checkout@v4
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
- name: Environment Setup
run: |
echo "ZEPHYR_SDK_INSTALL_DIR=/opt/toolchains/zephyr-sdk-$( cat SDK_VERSION )" >> $GITHUB_ENV
- name: west setup
run: |
west init -l . || true
west config --global update.narrow true
west update 2>&1 1> west.update.log || west update 2>&1 1> west.update2.log
- name: Configure AWS Credentials
uses: aws-actions/configure-aws-credentials@v2
with:
aws-access-key-id: ${{ vars.AWS_TESTING_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_TESTING_SECRET_ACCESS_KEY }}
aws-region: us-east-1
- name: Record Footprint
env:
BASE_REF: ${{ github.base_ref }}
run: |
export ZEPHYR_BASE=${PWD}
./scripts/footprint/track.py -p scripts/footprint/plan.txt
- name: Upload footprint data
run: |
python3 -m venv .venv
. .venv/bin/activate
pip3 install awscli
aws s3 sync --quiet footprint_data/ s3://testing.zephyrproject.org/footprint_data/


@@ -1,64 +0,0 @@
name: Greet first time contributor
on:
issues:
types: [opened]
pull_request_target:
types: [opened, closed]
permissions:
contents: read
jobs:
check_for_first_interaction:
runs-on: ubuntu-24.04
if: github.repository == 'zephyrproject-rtos/zephyr'
permissions:
pull-requests: write # to comment on pull requests
issues: write # to comment on issues
steps:
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
- uses: zephyrproject-rtos/action-first-interaction@58853996b1ac504b8e0f6964301f369d2bb22e5c # v1.1.1+zephyr.6
with:
repo-token: ${{ secrets.GITHUB_TOKEN }}
issue-message: >
Hi @${{github.event.issue.user.login}}! We appreciate you submitting your first issue
for our open-source project. 🌟
Even though I'm a bot, I can assure you that the whole community is genuinely grateful
for your time and effort. 🤖💙
pr-opened-message: >
Hello @${{ github.event.pull_request.user.login }}, and thank you very much for your
first pull request to the Zephyr project!
Our Continuous Integration pipeline will execute a series of checks on your Pull Request
commit messages and code, and you are expected to address any failures by updating the PR.
Please take a look at [our commit message guidelines](https://docs.zephyrproject.org/latest/contribute/guidelines.html#commit-message-guidelines)
to find out how to format your commit messages, and at [our contribution workflow](https://docs.zephyrproject.org/latest/contribute/guidelines.html#contribution-workflow)
to understand how to update your Pull Request.
If you haven't already, please make sure to review the project's [Contributor
Expectations](https://docs.zephyrproject.org/latest/contribute/contributor_expectations.html)
and update (by amending and force-pushing the commits) your pull request if necessary.
If you are stuck or need help please join us on [Discord](https://chat.zephyrproject.org/)
and ask your question there. Additionally, you can [escalate the review](https://docs.zephyrproject.org/latest/contribute/contributor_expectations.html#pr-technical-escalation)
when applicable. 😊
pr-merged-message: >
Hi @${{ github.event.pull_request.user.login }}!
Congratulations on getting your very first Zephyr pull request merged 🎉🥳. This is a
fantastic achievement, and we're thrilled to have you as part of our community!
To celebrate this milestone and showcase your contribution, we'd love to award you the
Zephyr Technical Contributor badge. If you're interested, please claim your badge by
filling out this form: [Claim Your Zephyr Badge](https://forms.gle/oCw9iAPLhUsHTapc8).
Thank you for your valuable input, and we look forward to seeing more of your
contributions in the future! 🪁


@@ -0,0 +1,58 @@
name: Greet first time contributor
on:
issues:
types: [opened]
pull_request_target:
types: [opened, closed]
jobs:
check_for_first_interaction:
runs-on: ubuntu-22.04
if: github.repository == 'zephyrproject-rtos/zephyr'
steps:
- uses: actions/checkout@v4
- uses: zephyrproject-rtos/action-first-interaction@v1.1.1-zephyr-5
with:
repo-token: ${{ secrets.GITHUB_TOKEN }}
issue-message: >
Hi @${{github.event.issue.user.login}}! We appreciate you submitting your first issue
for our open-source project. 🌟
Even though I'm a bot, I can assure you that the whole community is genuinely grateful
for your time and effort. 🤖💙
pr-opened-message: >
Hello @${{ github.event.pull_request.user.login }}, and thank you very much for your
first pull request to the Zephyr project!
Our Continuous Integration pipeline will execute a series of checks on your Pull Request
commit messages and code, and you are expected to address any failures by updating the PR.
Please take a look at [our commit message guidelines](https://docs.zephyrproject.org/latest/contribute/guidelines.html#commit-message-guidelines)
to find out how to format your commit messages, and at [our contribution workflow](https://docs.zephyrproject.org/latest/contribute/guidelines.html#contribution-workflow)
to understand how to update your Pull Request.
If you haven't already, please make sure to review the project's [Contributor
Expectations](https://docs.zephyrproject.org/latest/contribute/contributor_expectations.html)
and update (by amending and force-pushing the commits) your pull request if necessary.
If you are stuck or need help please join us on [Discord](https://chat.zephyrproject.org/)
and ask your question there. Additionally, you can [escalate the review](https://docs.zephyrproject.org/latest/contribute/contributor_expectations.html#pr-review-escalation)
when applicable. 😊
pr-merged-message: >
Hi @${{ github.event.pull_request.user.login }}!
Congratulations on getting your very first Zephyr pull request merged 🎉🥳. This is a
fantastic achievement, and we're thrilled to have you as part of our community!
To celebrate this milestone and showcase your contribution, we'd love to award you the
Zephyr Technical Contributor badge. If you're interested, please claim your badge by
filling out this form: [Claim Your Zephyr Badge](https://forms.gle/oCw9iAPLhUsHTapc8).
Thank you for your valuable input, and we look forward to seeing more of your
contributions in the future! 🪁


@@ -1,88 +0,0 @@
name: Hello World (Multiplatform)
on:
push:
branches:
- main
- v*-branch
- collab-*
pull_request:
branches:
- main
- v*-branch
- collab-*
paths:
- 'scripts/**'
- '.github/workflows/hello_world_multiplatform.yaml'
- 'SDK_VERSION'
permissions:
contents: read
concurrency:
group: ${{ github.workflow }}-${{ github.event_name }}-${{ github.head_ref || github.ref }}
cancel-in-progress: true
jobs:
build:
strategy:
fail-fast: false
matrix:
os: [ubuntu-22.04, ubuntu-24.04, ubuntu-24.04-arm, macos-14, windows-2022]
runs-on: ${{ matrix.os }}
steps:
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
path: zephyr
fetch-depth: 0
- name: Rebase
if: github.event_name == 'pull_request'
env:
BASE_REF: ${{ github.base_ref }}
PR_HEAD: ${{ github.event.pull_request.head.sha }}
working-directory: zephyr
shell: bash
run: |
git config --global user.email "actions@zephyrproject.org"
git config --global user.name "Github Actions"
rm -fr ".git/rebase-apply"
rm -fr ".git/rebase-merge"
git rebase origin/${BASE_REF}
git clean -f -d
git log --graph --oneline HEAD...${PR_HEAD}
- name: Set up Python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: 3.12
- name: Setup Zephyr project
uses: zephyrproject-rtos/action-zephyr-setup@c125c5ebeeadbd727fa740b407f862734af1e52a # v1.0.9
with:
app-path: zephyr
toolchains: aarch64-zephyr-elf:arc-zephyr-elf:arc64-zephyr-elf:arm-zephyr-eabi:mips-zephyr-elf:riscv64-zephyr-elf:sparc-zephyr-elf:x86_64-zephyr-elf:xtensa-dc233c_zephyr-elf:xtensa-sample_controller32_zephyr-elf:rx-zephyr-elf
ccache-cache-key: hw-${{ matrix.os }}
- name: Build firmware
working-directory: zephyr
shell: bash
run: |
if [ "${{ runner.os }}" = "macOS" ]; then
EXTRA_TWISTER_FLAGS="-P native_sim --build-only"
elif [ "${{ runner.os }}" = "Windows" ]; then
EXTRA_TWISTER_FLAGS="-P native_sim --short-build-path -O/tmp/twister-out"
elif [ "${{ runner.os }}-${{ runner.arch }}" == "Linux-ARM64" ]; then
EXTRA_TWISTER_FLAGS="--exclude-platform native_sim/native"
fi
west twister --runtime-artifact-cleanup --force-color --inline-logs -T samples/hello_world -T samples/cpp/hello_world -v $EXTRA_TWISTER_FLAGS
- name: Upload artifacts
if: failure()
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
with:
if-no-files-found: ignore
path:
zephyr/twister-out/*/samples/hello_world/sample.basic.helloworld/build.log
zephyr/twister-out/*/samples/cpp/hello_world/sample.cpp.helloworld/build.log


@@ -0,0 +1,80 @@
name: Hello World (Multiplatform)
on:
push:
branches:
- main
- v*-branch
- collab-*
pull_request:
branches:
- main
- v*-branch
- collab-*
paths:
- 'scripts/build/**'
- 'scripts/requirements*.txt'
- '.github/workflows/hello_world_multiplatform.yaml'
- 'SDK_VERSION'
concurrency:
group: ${{ github.workflow }}-${{ github.event_name }}-${{ github.head_ref || github.ref }}
cancel-in-progress: true
jobs:
build:
strategy:
fail-fast: false
matrix:
os: [ubuntu-22.04, macos-12, macos-14, windows-2022]
runs-on: ${{ matrix.os }}
steps:
- name: Checkout
uses: actions/checkout@v4
with:
path: zephyr
fetch-depth: 0
- name: Rebase
if: github.event_name == 'pull_request'
env:
BASE_REF: ${{ github.base_ref }}
PR_HEAD: ${{ github.event.pull_request.head.sha }}
working-directory: zephyr
shell: bash
run: |
git config --global user.email "actions@zephyrproject.org"
git config --global user.name "Github Actions"
git rebase origin/${BASE_REF}
git log --graph --oneline HEAD...${PR_HEAD}
- name: Set up Python
uses: actions/setup-python@v4
with:
python-version: 3.11
- name: Setup Zephyr project
uses: zephyrproject-rtos/action-zephyr-setup@v1
with:
app-path: zephyr
toolchains: all
- name: Build firmware
working-directory: zephyr
shell: bash
run: |
if [ "${{ runner.os }}" = "macOS" ]; then
EXTRA_TWISTER_FLAGS="-P native_sim --build-only"
elif [ "${{ runner.os }}" = "Windows" ]; then
EXTRA_TWISTER_FLAGS="-P native_sim --short-build-path -O/tmp/twister-out"
fi
./scripts/twister --force-color --inline-logs -T samples/hello_world -T samples/cpp/hello_world -v $EXTRA_TWISTER_FLAGS
- name: Upload artifacts
if: failure()
uses: actions/upload-artifact@v4
with:
if-no-files-found: ignore
path:
zephyr/twister-out/*/samples/hello_world/sample.basic.helloworld/build.log
zephyr/twister-out/*/samples/cpp/hello_world/sample.cpp.helloworld/build.log


@@ -1,57 +0,0 @@
name: Issue Tracker
on:
schedule:
- cron: '10 1/4 * * *'
permissions:
contents: read
env:
OUTPUT_FILE_NAME: IssuesReport.md
COMMITTER_EMAIL: actions@github.com
COMMITTER_NAME: github-actions
COMMITTER_USERNAME: github-actions
jobs:
track-issues:
name: "Collect Issue Stats"
runs-on: ubuntu-24.04
if: github.repository == 'zephyrproject-rtos/zephyr'
steps:
- name: Download configuration file
run: |
wget -q https://raw.githubusercontent.com/$GITHUB_REPOSITORY/main/.github/workflows/issues-report-config.json
- name: install-packages
run: |
sudo apt-get update
sudo apt-get install discount
- uses: brcrista/summarize-issues@54c549b7d38b7db39e5c6e06fd9617e12e5c3491 # v4
with:
title: 'Issues Report for ${{ github.repository }}'
configPath: 'issues-report-config.json'
outputPath: ${{ env.OUTPUT_FILE_NAME }}
token: ${{ secrets.GITHUB_TOKEN }}
- name: upload-stats
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
continue-on-error: true
with:
name: ${{ env.OUTPUT_FILE_NAME }}
path: ${{ env.OUTPUT_FILE_NAME }}
- name: Configure AWS Credentials
uses: aws-actions/configure-aws-credentials@00943011d9042930efac3dcd3a170e4273319bc8 # v5.1.0
with:
aws-access-key-id: ${{ vars.AWS_TESTING_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_TESTING_SECRET_ACCESS_KEY }}
aws-region: us-east-1
- name: Post Results
run: |
mkd2html IssuesReport.md IssuesReport.html
aws s3 cp --quiet IssuesReport.html s3://testing.zephyrproject.org/issues/$GITHUB_REPOSITORY/index.html


@@ -0,0 +1,54 @@
name: Issue Tracker
on:
schedule:
- cron: '*/10 * * * *'
env:
OUTPUT_FILE_NAME: IssuesReport.md
COMMITTER_EMAIL: actions@github.com
COMMITTER_NAME: github-actions
COMMITTER_USERNAME: github-actions
jobs:
track-issues:
name: "Collect Issue Stats"
runs-on: ubuntu-22.04
if: github.repository == 'zephyrproject-rtos/zephyr'
steps:
- name: Download configuration file
run: |
wget -q https://raw.githubusercontent.com/$GITHUB_REPOSITORY/main/.github/workflows/issues-report-config.json
- name: install-packages
run: |
sudo apt-get update
sudo apt-get install discount
- uses: brcrista/summarize-issues@v3
with:
title: 'Issues Report for ${{ github.repository }}'
configPath: 'issues-report-config.json'
outputPath: ${{ env.OUTPUT_FILE_NAME }}
token: ${{ secrets.GITHUB_TOKEN }}
- name: upload-stats
uses: actions/upload-artifact@v4
continue-on-error: true
with:
name: ${{ env.OUTPUT_FILE_NAME }}
path: ${{ env.OUTPUT_FILE_NAME }}
- name: Configure AWS Credentials
uses: aws-actions/configure-aws-credentials@v2
with:
aws-access-key-id: ${{ vars.AWS_TESTING_ACCESS_KEY_ID }}
aws-secret-access-key: ${{ secrets.AWS_TESTING_SECRET_ACCESS_KEY }}
aws-region: us-east-1
- name: Post Results
run: |
mkd2html IssuesReport.md IssuesReport.html
aws s3 cp --quiet IssuesReport.html s3://testing.zephyrproject.org/issues/$GITHUB_REPOSITORY/index.html


@@ -1,37 +0,0 @@
name: Scancode
on: [pull_request]
permissions:
contents: read
jobs:
scancode_job:
runs-on: ubuntu-24.04
name: Scan code for licenses
steps:
- name: Checkout the code
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
fetch-depth: 0
- name: Scan the code
id: scancode
uses: zephyrproject-rtos/action_scancode@23ef91ce31cd4b954366a7b71eea47520da9b380 # v4
with:
directory-to-scan: 'scan/'
- name: Artifact Upload
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
with:
name: scancode
path: ./artifacts
- name: Verify
run: |
if [ -s ./artifacts/report.txt ]; then
report=$(cat ./artifacts/report.txt)
report="${report//'%'/'%25'}"
report="${report//$'\n'/'%0A'}"
report="${report//$'\r'/'%0D'}"
echo "::error file=./artifacts/report.txt::$report"
exit 1
fi


@@ -0,0 +1,34 @@
name: Scancode
on: [pull_request]
jobs:
scancode_job:
runs-on: ubuntu-22.04
name: Scan code for licenses
steps:
- name: Checkout the code
uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Scan the code
id: scancode
uses: zephyrproject-rtos/action_scancode@v4
with:
directory-to-scan: 'scan/'
- name: Artifact Upload
uses: actions/upload-artifact@v4
with:
name: scancode
path: ./artifacts
- name: Verify
run: |
if [ -s ./artifacts/report.txt ]; then
report=$(cat ./artifacts/report.txt)
report="${report//'%'/'%25'}"
report="${report//$'\n'/'%0A'}"
report="${report//$'\r'/'%0D'}"
echo "::error file=./artifacts/report.txt::$report"
exit 1
fi


@@ -1,45 +0,0 @@
name: Manifest
on:
pull_request_target:
permissions:
contents: read
jobs:
contribs:
runs-on: ubuntu-24.04
permissions:
pull-requests: write # to create/update pull request comments
name: Manifest
steps:
- name: Checkout the code
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
path: zephyrproject/zephyr
fetch-depth: 0
persist-credentials: false
- name: west setup
env:
BASE_REF: ${{ github.base_ref }}
working-directory: zephyrproject/zephyr
run: |
pip install -r scripts/requirements-west.txt --require-hashes
git config --global user.email "you@example.com"
git config --global user.name "Your Name"
west init -l . || true
- name: Manifest
uses: zephyrproject-rtos/action-manifest@09983f53d3d878791aa37a7755ae44d695f4c1e5 # v2.0.0
with:
github-token: ${{ secrets.GITHUB_TOKEN }}
manifest-path: 'west.yml'
checkout-path: 'zephyrproject/zephyr'
use-tree-checkout: 'true'
check-impostor-commits: 'true'
label-prefix: 'manifest-'
verbosity-level: '1'
labels: 'manifest'
dnm-labels: 'DNM (manifest)'
blobs-added-labels: 'Binary Blobs Added'
blobs-modified-labels: 'Binary Blobs Modified'

.github/workflows/manifest.yml.disabled

@@ -0,0 +1,38 @@
name: Manifest
on:
pull_request_target:
jobs:
contribs:
runs-on: ubuntu-22.04
name: Manifest
steps:
- name: Checkout the code
uses: actions/checkout@v4
with:
path: zephyrproject/zephyr
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
persist-credentials: false
- name: west setup
env:
BASE_REF: ${{ github.base_ref }}
working-directory: zephyrproject/zephyr
run: |
pip3 install west
git config --global user.email "you@example.com"
git config --global user.name "Your Name"
west init -l . || true
- name: Manifest
uses: zephyrproject-rtos/action-manifest@v1.2.2
with:
github-token: ${{ secrets.ZB_GITHUB_TOKEN }}
manifest-path: 'west.yml'
checkout-path: 'zephyrproject/zephyr'
use-tree-checkout: 'true'
label-prefix: 'manifest-'
verbosity-level: '1'
labels: 'manifest'
dnm-labels: 'DNM'


@@ -1,19 +0,0 @@
name: Check SHA-pinned GitHub Actions
on:
pull_request:
paths:
- '.github/workflows/**'
permissions:
contents: read
jobs:
check-sha-pinned-actions:
name: Verify GitHub Actions
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
- name: Ensure SHA pinned actions
uses: zgosalvez/github-actions-ensure-sha-pinned-actions@9e9574ef04ea69da568d6249bd69539ccc704e74 # v4.0.0


@@ -1,47 +0,0 @@
name: PR Metadata Check
on:
pull_request:
types:
- synchronize
- opened
- reopened
- labeled
- unlabeled
- edited
permissions:
contents: read
concurrency:
group: ${{ github.workflow }}-${{ github.event_name }}-${{ github.head_ref || github.ref }}
cancel-in-progress: true
jobs:
do-not-merge:
name: Prevent Merging
runs-on: ubuntu-24.04
timeout-minutes: 30
steps:
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
- name: Set up Python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python dependencies
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Run the check script
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
run: |
python -u scripts/ci/do_not_merge.py \
-p "${{ github.event.pull_request.number }}" \
-o "${{ github.event.repository.owner.login }}" \
-r "${{ github.event.repository.name }}"


@@ -1,53 +0,0 @@
# Copyright (c) 2023 Intel Corporation.
# SPDX-License-Identifier: Apache-2.0
name: Misc. Pylib Scripts TestSuite
on:
push:
branches:
- main
- v*-branch
paths:
- 'scripts/pylib/build_helpers/**'
- '.github/workflows/pylib_tests.yml'
pull_request:
branches:
- main
- v*-branch
paths:
- 'scripts/pylib/build_helpers/**'
- '.github/workflows/pylib_tests.yml'
permissions:
contents: read
jobs:
pylib-tests:
name: Misc. Pylib Unit Tests
runs-on: ${{ matrix.os }}
strategy:
matrix:
python-version: ['3.10', '3.11', '3.12', '3.13']
os: [ubuntu-24.04]
steps:
- name: checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: ${{ matrix.python-version }}
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Run pytest for build_helpers
env:
ZEPHYR_BASE: ./
ZEPHYR_TOOLCHAIN_VARIANT: zephyr
run: |
echo "Run build_helpers tests"
PYTHONPATH=./scripts/tests pytest ./scripts/tests/build_helpers


@@ -0,0 +1,54 @@
# Copyright (c) 2023 Intel Corporation.
# SPDX-License-Identifier: Apache-2.0
name: Misc. Pylib Scripts TestSuite
on:
push:
branches:
- main
- v*-branch
paths:
- 'scripts/pylib/build_helpers/**'
- '.github/workflows/pylib_tests.yml'
pull_request:
branches:
- main
- v*-branch
paths:
- 'scripts/pylib/build_helpers/**'
- '.github/workflows/pylib_tests.yml'
jobs:
pylib-tests:
name: Misc. Pylib Unit Tests
runs-on: ${{ matrix.os }}
strategy:
matrix:
python-version: [3.8, 3.9, '3.10', '3.11', '3.12']
os: [ubuntu-22.04]
steps:
- name: checkout
uses: actions/checkout@v4
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v4
with:
python-version: ${{ matrix.python-version }}
- name: cache-pip-linux
if: startsWith(runner.os, 'Linux')
uses: actions/cache@v4
with:
path: ~/.cache/pip
key: ${{ runner.os }}-pip-${{ matrix.python-version }}
restore-keys: |
${{ runner.os }}-pip-${{ matrix.python-version }}
- name: install-packages
run: |
pip3 install -r scripts/requirements-base.txt -r scripts/requirements-build-test.txt
- name: Run pytest for build_helpers
env:
ZEPHYR_BASE: ./
ZEPHYR_TOOLCHAIN_VARIANT: zephyr
run: |
echo "Run build_helpers tests"
PYTHONPATH=./scripts/tests pytest ./scripts/tests/build_helpers


@@ -1,31 +0,0 @@
name: ready to merge
on:
workflow_call:
inputs:
needs_context:
type: string
required: true
permissions:
contents: read
jobs:
all_jobs_passed:
name: all jobs passed
runs-on: ubuntu-latest
steps:
- name: "Check status of all required jobs"
run: |-
NEEDS_CONTEXT='${{ inputs.needs_context }}'
JOB_IDS=$(echo "$NEEDS_CONTEXT" | jq -r 'keys[]')
for JOB_ID in $JOB_IDS; do
RESULT=$(echo "$NEEDS_CONTEXT" | jq -r ".[\"$JOB_ID\"].result")
echo "$JOB_ID job result: $RESULT"
if [[ $RESULT != "success" && $RESULT != "skipped" ]]; then
echo "***"
echo "Error: The $JOB_ID job did not pass."
exit 1
fi
done
echo "All jobs passed or were skipped."


@@ -1,65 +0,0 @@
name: Create a Release
on:
push:
tags:
- 'v*'
- '!v*rc*'
permissions:
contents: read
jobs:
release:
runs-on: ubuntu-24.04
permissions:
contents: write # to create GitHub release entry
steps:
- uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
fetch-depth: 0
- name: Get the version
id: get_version
run: |
echo "VERSION=${GITHUB_REF#refs/tags/}" >> $GITHUB_OUTPUT
echo "TRIMMED_VERSION=${GITHUB_REF#refs/tags/v}" >> $GITHUB_OUTPUT
- name: REUSE Compliance Check
uses: fsfe/reuse-action@676e2d560c9a403aa252096d99fcab3e1132b0f5 # v6.0.0
with:
args: spdx -o zephyr-${{ steps.get_version.outputs.VERSION }}.spdx
- name: upload-results
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
continue-on-error: true
with:
name: zephyr-${{ steps.get_version.outputs.VERSION }}.spdx
path: zephyr-${{ steps.get_version.outputs.VERSION }}.spdx
- name: Create empty release notes body
run: |
echo "TODO: add release overview and notes link" > release-notes.txt
- name: Create Release
id: create_release
uses: actions/create-release@0cb9c9b65d5d1901c1f53e5e66eaf4afd303e70e # v1.1.4
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
tag_name: ${{ github.ref }}
release_name: Zephyr ${{ steps.get_version.outputs.TRIMMED_VERSION }}
body_path: release-notes.txt
draft: true
prerelease: true
- name: Upload Release Assets
id: upload-release-asset
uses: actions/upload-release-asset@e8f9f06c4b078e705bd2ea027f0926603fc9b4d5 # v1.0.2
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.create_release.outputs.upload_url }}
asset_path: zephyr-${{ steps.get_version.outputs.VERSION }}.spdx
asset_name: zephyr-${{ steps.get_version.outputs.VERSION }}.spdx
asset_content_type: text/plain

.github/workflows/release.yml.disabled

@@ -0,0 +1,60 @@
name: Create a Release
on:
push:
tags:
- 'v*'
- '!v*rc*'
jobs:
release:
runs-on: ubuntu-22.04
steps:
- uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Get the version
id: get_version
run: |
echo "VERSION=${GITHUB_REF#refs/tags/}" >> $GITHUB_OUTPUT
echo "TRIMMED_VERSION=${GITHUB_REF#refs/tags/v}" >> $GITHUB_OUTPUT
- name: REUSE Compliance Check
uses: fsfe/reuse-action@v1
with:
args: spdx -o zephyr-${{ steps.get_version.outputs.VERSION }}.spdx
- name: upload-results
uses: actions/upload-artifact@v4
continue-on-error: true
with:
name: zephyr-${{ steps.get_version.outputs.VERSION }}.spdx
path: zephyr-${{ steps.get_version.outputs.VERSION }}.spdx
- name: Create empty release notes body
run: |
echo "TODO: add release overview and notes link" > release-notes.txt
- name: Create Release
id: create_release
uses: actions/create-release@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
tag_name: ${{ github.ref }}
release_name: Zephyr ${{ steps.get_version.outputs.TRIMMED_VERSION }}
body_path: release-notes.txt
draft: true
prerelease: true
- name: Upload Release Assets
id: upload-release-asset
uses: actions/upload-release-asset@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.create_release.outputs.upload_url }}
asset_path: zephyr-${{ steps.get_version.outputs.VERSION }}.spdx
asset_name: zephyr-${{ steps.get_version.outputs.VERSION }}.spdx
asset_content_type: text/plain


@@ -1,61 +0,0 @@
# This workflow uses actions that are not certified by GitHub. They are provided
# by a third-party and are governed by separate terms of service, privacy
# policy, and support documentation.
name: Scorecards supply-chain security
on:
# For Branch-Protection check. Only the default branch is supported. See
# https://github.com/ossf/scorecard/blob/main/docs/checks.md#branch-protection
branch_protection_rule:
# To guarantee Maintained check is occasionally updated. See
# https://github.com/ossf/scorecard/blob/main/docs/checks.md#maintained
schedule:
- cron: '43 7 * * 6'
push:
branches:
- main
permissions: read-all
jobs:
analysis:
name: Scorecard analysis
runs-on: ubuntu-latest
permissions:
# Needed for Code scanning upload
security-events: write
# Needed for GitHub OIDC token if publish_results is true
id-token: write
steps:
- name: "Checkout code"
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- name: "Run analysis"
uses: ossf/scorecard-action@4eaacf0543bb3f2c246792bd56e8cdeffafb205a # v2.4.3
with:
results_file: results.sarif
results_format: sarif
# Publish results to OpenSSF REST API for easy access by consumers.
# - Allows the repository to include the Scorecard badge.
# - See https://github.com/ossf/scorecard-action#publishing-results.
publish_results: true
# Upload the results as artifacts (optional). Commenting out will disable
# uploads of run results in SARIF format to the repository Actions tab.
# https://docs.github.com/en/actions/advanced-guides/storing-workflow-data-as-artifacts
- name: "Upload artifact"
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
with:
name: SARIF file
path: results.sarif
retention-days: 5
# Upload the results to GitHub's code scanning dashboard (optional).
# Commenting out will disable upload of results to your repo's Code Scanning dashboard
- name: "Upload to code-scanning"
uses: github/codeql-action/upload-sarif@0499de31b99561a6d14a36a5f662c2a54f91beee # v4.31.2
with:
sarif_file: results.sarif


@@ -1,70 +0,0 @@
# Copyright 2023 Google LLC
# SPDX-License-Identifier: Apache-2.0
name: Scripts tests
on:
push:
branches:
- main
- v*-branch
paths:
- 'scripts/build/**'
- '.github/workflows/scripts_tests.yml'
pull_request:
branches:
- main
- v*-branch
paths:
- 'scripts/build/**'
- '.github/workflows/scripts_tests.yml'
permissions:
contents: read
jobs:
scripts-tests:
name: Scripts tests
runs-on: ${{ matrix.os }}
strategy:
matrix:
python-version: ['3.10', '3.11', '3.12', '3.13']
os: [ubuntu-24.04]
steps:
- name: checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
- name: Rebase
continue-on-error: true
env:
BASE_REF: ${{ github.base_ref }}
PR_HEAD: ${{ github.event.pull_request.head.sha }}
run: |
git config --global user.email "actions@zephyrproject.org"
git config --global user.name "Github Actions"
rm -fr ".git/rebase-apply"
rm -fr ".git/rebase-merge"
git rebase origin/${BASE_REF}
git clean -f -d
git log --graph --oneline HEAD...${PR_HEAD}
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: ${{ matrix.python-version }}
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Run pytest
env:
ZEPHYR_BASE: ./
run: |
echo "Run script tests"
pytest ./scripts/build


@@ -0,0 +1,71 @@
# Copyright 2023 Google LLC
# SPDX-License-Identifier: Apache-2.0
name: Scripts tests
on:
push:
branches:
- main
- v*-branch
paths:
- 'scripts/build/**'
- '.github/workflows/scripts_tests.yml'
pull_request:
branches:
- main
- v*-branch
paths:
- 'scripts/build/**'
- '.github/workflows/scripts_tests.yml'
jobs:
scripts-tests:
name: Scripts tests
runs-on: ${{ matrix.os }}
strategy:
matrix:
python-version: [3.8, 3.9, '3.10', '3.11', '3.12']
os: [ubuntu-20.04]
steps:
- name: checkout
uses: actions/checkout@v4
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
- name: Rebase
continue-on-error: true
env:
BASE_REF: ${{ github.base_ref }}
PR_HEAD: ${{ github.event.pull_request.head.sha }}
run: |
git config --global user.email "actions@zephyrproject.org"
git config --global user.name "Github Actions"
git rebase origin/${BASE_REF}
git log --graph --oneline HEAD...${PR_HEAD}
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v4
with:
python-version: ${{ matrix.python-version }}
- name: cache-pip-linux
if: startsWith(runner.os, 'Linux')
uses: actions/cache@v4
with:
path: ~/.cache/pip
key: ${{ runner.os }}-pip-${{ matrix.python-version }}
restore-keys: |
${{ runner.os }}-pip-${{ matrix.python-version }}
- name: install-packages
run: |
pip3 install -r scripts/requirements-base.txt -r scripts/requirements-build-test.txt
- name: Run pytest
env:
ZEPHYR_BASE: ./
run: |
echo "Run script tests"
pytest ./scripts/build


@@ -1,33 +0,0 @@
name: Stale Workflow Queue Cleanup
on:
workflow_dispatch:
schedule:
# everyday at 15:00
- cron: '0 15 * * *'
permissions:
contents: read
concurrency:
group: stale-workflow-queue-cleanup
cancel-in-progress: true
jobs:
cleanup:
name: Cleanup
runs-on: ubuntu-24.04
if: github.ref == 'refs/heads/main'
permissions:
actions: write # to delete stale workflow runs
steps:
- name: Delete stale queued workflow runs
uses: MajorScruffy/delete-old-workflow-runs@78b5af714fefaefdf74862181c467b061782719e # v0.3.0
with:
repository: ${{ github.repository }}
# Remove any workflow runs in "queued" state for more than 1 day
older-than-seconds: 86400
status: queued
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}


@@ -0,0 +1,28 @@
name: Stale Workflow Queue Cleanup
on:
workflow_dispatch:
branches: [main]
schedule:
# everyday at 15:00
- cron: '0 15 * * *'
concurrency:
group: stale-workflow-queue-cleanup
cancel-in-progress: true
jobs:
cleanup:
name: Cleanup
runs-on: ubuntu-22.04
steps:
- name: Delete stale queued workflow runs
uses: MajorScruffy/delete-old-workflow-runs@v0.3.0
with:
repository: ${{ github.repository }}
# Remove any workflow runs in "queued" state for more than 1 day
older-than-seconds: 86400
status: queued
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}


@@ -1,35 +0,0 @@
name: "Close stale pull requests/issues"
on:
schedule:
- cron: "16 00 * * *"
permissions:
contents: read
jobs:
stale:
name: Find Stale issues and PRs
runs-on: ubuntu-24.04
if: github.repository == 'zephyrproject-rtos/zephyr'
permissions:
pull-requests: write # to comment on stale pull requests
issues: write # to comment on stale issues
steps:
- uses: actions/stale@5f858e3efba33a5ca4407a664cc011ad407f2008 # v10.1.0
with:
stale-pr-message: 'This pull request has been marked as stale because it has been open (more
than) 60 days with no activity. Remove the stale label or add a comment saying that you
would like to have the label removed otherwise this pull request will automatically be
closed in 14 days. Note, that you can always re-open a closed pull request at any time.'
stale-issue-message: 'This issue has been marked as stale because it has been open (more
than) 60 days with no activity. Remove the stale label or add a comment saying that you
would like to have the label removed otherwise this issue will automatically be closed in
14 days. Note, that you can always re-open a closed issue at any time.'
days-before-stale: 60
days-before-close: 14
stale-issue-label: 'Stale'
stale-pr-label: 'Stale'
exempt-pr-labels: 'Blocked,In progress'
exempt-issue-labels: 'In progress,Enhancement,Feature,Feature Request,RFC,Meta,Process,Coverity'
operations-per-run: 400


@@ -0,0 +1,28 @@
name: "Close stale pull requests/issues"
on:
schedule:
- cron: "16 00 * * *"
jobs:
stale:
name: Find Stale issues and PRs
runs-on: ubuntu-22.04
if: github.repository == 'zephyrproject-rtos/zephyr'
steps:
- uses: actions/stale@v8
with:
stale-pr-message: 'This pull request has been marked as stale because it has been open (more
than) 60 days with no activity. Remove the stale label or add a comment saying that you
would like to have the label removed otherwise this pull request will automatically be
closed in 14 days. Note, that you can always re-open a closed pull request at any time.'
stale-issue-message: 'This issue has been marked as stale because it has been open (more
than) 60 days with no activity. Remove the stale label or add a comment saying that you
would like to have the label removed otherwise this issue will automatically be closed in
14 days. Note, that you can always re-open a closed issue at any time.'
days-before-stale: 60
days-before-close: 14
stale-issue-label: 'Stale'
stale-pr-label: 'Stale'
exempt-pr-labels: 'Blocked,In progress'
exempt-issue-labels: 'In progress,Enhancement,Feature,Feature Request,RFC,Meta,Process,Coverity'
operations-per-run: 400


@@ -1,38 +0,0 @@
name: Merged PR stats
on:
pull_request_target:
branches:
- main
- v*-branch
types: [closed]
permissions:
contents: read
jobs:
record_merged:
if: github.event.pull_request.merged == true && github.repository == 'zephyrproject-rtos/zephyr'
runs-on: ubuntu-24.04
steps:
- name: checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
- name: Set up Python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: PR event
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
ELASTICSEARCH_KEY: ${{ secrets.ELASTICSEARCH_KEY }}
ELASTICSEARCH_SERVER: "https://elasticsearch.zephyrproject.io:443"
PR_STAT_ES_INDEX: ${{ vars.PR_STAT_ES_INDEX }}
run: |
python3 ./scripts/ci/stats/merged_prs.py --pull-request ${{ github.event.pull_request.number }} --repo ${{ github.repository }}


@@ -0,0 +1,24 @@
name: Merged PR stats
on:
pull_request_target:
branches:
- main
- v*-branch
types: [closed]
jobs:
record_merged:
if: github.event.pull_request.merged == true && github.repository == 'zephyrproject-rtos/zephyr'
runs-on: ubuntu-22.04
steps:
- name: checkout
uses: actions/checkout@v4
- name: PR event
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
ELASTICSEARCH_KEY: ${{ secrets.ELASTICSEARCH_KEY }}
ELASTICSEARCH_SERVER: "https://elasticsearch.zephyrproject.io:443"
PR_STAT_ES_INDEX: ${{ vars.PR_STAT_ES_INDEX }}
run: |
pip3 install pygithub elasticsearch
python3 ./scripts/ci/stats/merged_prs.py --pull-request ${{ github.event.pull_request.number }} --repo ${{ github.repository }}


@@ -1,64 +0,0 @@
name: Publish Twister Test Results
on:
workflow_run:
workflows: ["Run tests with twister"]
branches:
- main
types:
- completed
permissions:
contents: read
jobs:
upload-to-elasticsearch:
if: |
github.repository == 'zephyrproject-rtos/zephyr' &&
github.event.workflow_run.event != 'pull_request'
env:
ELASTICSEARCH_KEY: ${{ secrets.ELASTICSEARCH_KEY }}
ELASTICSEARCH_SERVER: "https://elasticsearch.zephyrproject.io:443"
runs-on: ubuntu-24.04
steps:
# Needed for elasticsearch and upload script
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
fetch-depth: 0
persist-credentials: false
- name: Set up Python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Download Artifacts
id: download-artifacts
uses: dawidd6/action-download-artifact@ac66b43f0e6a346234dd65d4d0c8fbb31cb316e5 # v11
with:
path: artifacts
workflow: twister.yml
run_id: ${{ github.event.workflow_run.id }}
if_no_artifact_found: ignore
- name: Upload to elasticsearch
if: steps.download-artifacts.outputs.found_artifact == 'true'
run: |
# set run date on upload to get consistent and unified data across the matrix.
run_date=`date --iso-8601=minutes`
if [ "${{github.event.workflow_run.event}}" = "push" ]; then
python3 ./scripts/ci/upload_test_results_es.py -r ${run_date} \
--run-attempt ${{github.run_attempt}} \
--run-branch ${{github.ref_name}} \
--index zephyr-main-ci-push-1 artifacts/*/*/twister.json
elif [ "${{github.event.workflow_run.event}}" = "schedule" ]; then
python3 ./scripts/ci/upload_test_results_es.py -r ${run_date} \
--run-attempt ${{github.run_attempt}} \
--run-branch ${{github.ref_name}} \
--index zephyr-main-ci-weekly-1 artifacts/*/*/twister.json
fi


@@ -1,402 +0,0 @@
name: Run tests with twister
on:
push:
branches:
- main
- v*-branch
- collab-*
pull_request:
branches:
- main
- v*-branch
- collab-*
schedule:
# Run at 02:00 UTC on every Sunday
- cron: '0 2 * * 0'
permissions:
contents: read
concurrency:
group: ${{ github.workflow }}-${{ github.event_name }}-${{ github.head_ref || github.ref }}
cancel-in-progress: true
jobs:
twister-build-prep:
if: github.repository_owner == 'zephyrproject-rtos'
runs-on: ubuntu-24.04
outputs:
subset: ${{ steps.output-services.outputs.subset }}
size: ${{ steps.output-services.outputs.size }}
fullrun: ${{ steps.output-services.outputs.fullrun }}
env:
MATRIX_SIZE: 10
PUSH_MATRIX_SIZE: 20
WEEKLY_MATRIX_SIZE: 200
BSIM_OUT_PATH: /opt/bsim/
BSIM_COMPONENTS_PATH: /opt/bsim/components
TESTS_PER_BUILDER: 900
COMMIT_RANGE: ${{ github.event.pull_request.base.sha }}..${{ github.event.pull_request.head.sha }}
BASE_REF: ${{ github.base_ref }}
steps:
- name: Checkout
if: github.event_name == 'pull_request'
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
path: zephyr
persist-credentials: false
- name: Set up Python
if: github.event_name == 'pull_request'
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: install-packages
working-directory: zephyr
if: github.event_name == 'pull_request'
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Setup Zephyr project
if: github.event_name == 'pull_request'
uses: zephyrproject-rtos/action-zephyr-setup@c125c5ebeeadbd727fa740b407f862734af1e52a # v1.0.9
with:
app-path: zephyr
enable-ccache: false
- name: Environment Setup
working-directory: zephyr
if: github.event_name == 'pull_request'
run: |
git config --global user.email "bot@zephyrproject.org"
git config --global user.name "Zephyr Bot"
rm -fr ".git/rebase-apply"
rm -fr ".git/rebase-merge"
git rebase origin/${BASE_REF}
git clean -f -d
git log --pretty=oneline | head -n 10
- name: Generate Test Plan with Twister
working-directory: zephyr
if: github.event_name == 'pull_request'
id: test-plan
run: |
export ZEPHYR_BASE=${PWD}
export ZEPHYR_TOOLCHAIN_VARIANT=zephyr
python3 ./scripts/ci/test_plan.py -c origin/${BASE_REF}.. --pull-request -t $TESTS_PER_BUILDER
if [ -s .testplan ]; then
cat .testplan >> $GITHUB_ENV
else
echo "TWISTER_NODES=${MATRIX_SIZE}" >> $GITHUB_ENV
fi
rm -f testplan.json .testplan
- name: Determine matrix size
id: output-services
run: |
if [ "${{github.event_name}}" = "push" ]; then
subset="[$(seq -s',' 1 ${PUSH_MATRIX_SIZE})]"
size=${MATRIX_SIZE}
elif [ "${{github.event_name}}" = "pull_request" ]; then
if [ -n "${TWISTER_NODES}" ]; then
subset="[$(seq -s',' 1 ${TWISTER_NODES})]"
else
subset="[$(seq -s',' 1 ${MATRIX_SIZE})]"
fi
size=${TWISTER_NODES}
elif [ "${{github.event_name}}" = "schedule" -a "${{github.repository}}" = "zephyrproject-rtos/zephyr" ]; then
subset="[$(seq -s',' 1 ${WEEKLY_MATRIX_SIZE})]"
size=${WEEKLY_MATRIX_SIZE}
else
size=0
fi
echo "subset=${subset}" >> $GITHUB_OUTPUT
echo "size=${size}" >> $GITHUB_OUTPUT
echo "fullrun=${TWISTER_FULL}" >> $GITHUB_OUTPUT
twister-build:
runs-on:
group: zephyr-runner-v2-linux-x64-4xlarge
needs: twister-build-prep
if: needs.twister-build-prep.outputs.size != 0
container:
image: ghcr.io/zephyrproject-rtos/ci-repo-cache:v0.28.7.20251127
options: '--entrypoint /bin/bash'
strategy:
fail-fast: false
matrix:
subset: ${{fromJSON(needs.twister-build-prep.outputs.subset)}}
timeout-minutes: 1440
env:
CCACHE_DIR: /node-cache/ccache-zephyr
CCACHE_REMOTE_STORAGE: "redis://cache-*.keydb-cache.svc.cluster.local|shards=1,2,3"
CCACHE_REMOTE_ONLY: "true"
# `--specs` is ignored because ccache is unable to resolve the toolchain specs file path.
CCACHE_IGNOREOPTIONS: '-specs=* --specs=*'
BSIM_OUT_PATH: /opt/bsim/
BSIM_COMPONENTS_PATH: /opt/bsim/components
TWISTER_COMMON: ' --test-config tests/test_config_ci.yaml --force-color --inline-logs -v -N -M --retry-failed 3 --timeout-multiplier 2 '
WEEKLY_OPTIONS: ' -M --build-only --all --show-footprint --report-filtered -j 32'
PR_OPTIONS: ' --clobber-output --integration -j 16'
PUSH_OPTIONS: ' --clobber-output -M --show-footprint --report-filtered -j 16'
COMMIT_RANGE: ${{ github.event.pull_request.base.sha }}..${{ github.event.pull_request.head.sha }}
BASE_REF: ${{ github.base_ref }}
LLVM_TOOLCHAIN_PATH: /usr/lib/llvm-20
steps:
- name: Print cloud service information
run: |
echo "ZEPHYR_RUNNER_CLOUD_PROVIDER = ${ZEPHYR_RUNNER_CLOUD_PROVIDER}"
echo "ZEPHYR_RUNNER_CLOUD_NODE = ${ZEPHYR_RUNNER_CLOUD_NODE}"
echo "ZEPHYR_RUNNER_CLOUD_POD = ${ZEPHYR_RUNNER_CLOUD_POD}"
- name: Apply container owner mismatch workaround
run: |
# FIXME: The owner UID of the GITHUB_WORKSPACE directory may not
# match the container user UID because of the way GitHub
# Actions runner is implemented. Remove this workaround when
# GitHub comes up with a fundamental fix for this problem.
git config --global --add safe.directory ${GITHUB_WORKSPACE}
- name: Clone cached Zephyr repository
continue-on-error: true
run: |
git clone --shared /repo-cache/zephyrproject/zephyr .
git remote set-url origin ${GITHUB_SERVER_URL}/${GITHUB_REPOSITORY}
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
persist-credentials: false
- name: Environment Setup
run: |
if [ "${{github.event_name}}" = "pull_request" ]; then
git config --global user.email "bot@zephyrproject.org"
git config --global user.name "Zephyr Builder"
rm -fr ".git/rebase-apply"
rm -fr ".git/rebase-merge"
git rebase origin/${BASE_REF}
git clean -f -d
git log --pretty=oneline | head -n 10
fi
echo "$HOME/.local/bin" >> $GITHUB_PATH
echo "$HOME/.cargo/bin" >> $GITHUB_PATH
west init -l . || true
west config manifest.group-filter -- +ci,+optional,+testing
west config --global update.narrow true
west update --path-cache /repo-cache/zephyrproject 2>&1 1> west.update.log || west update --path-cache /repo-cache/zephyrproject 2>&1 1> west.update.log || ( rm -rf ../modules ../bootloader ../tools && west update --path-cache /repo-cache/zephyrproject)
west forall -c 'git reset --hard HEAD'
echo "ZEPHYR_SDK_INSTALL_DIR=/opt/toolchains/zephyr-sdk-$( cat SDK_VERSION )" >> $GITHUB_ENV
- name: Check Environment
run: |
cmake --version
gcc --version
cargo --version
rustup target list --installed
ls -la
echo "github.ref: ${{ github.ref }}"
echo "github.base_ref: ${{ github.base_ref }}"
echo "github.ref_name: ${{ github.ref_name }}"
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
west packages pip --install
- name: Set up ccache
run: |
mkdir -p ${CCACHE_DIR}
ccache -M 10G
ccache -p
ccache -z -s -vv
- name: Update BabbleSim to manifest revision
run: |
export BSIM_VERSION=$( west list bsim -f {revision} )
echo "Manifest points to bsim sha $BSIM_VERSION"
cd /opt/bsim_west/bsim
git fetch -n origin ${BSIM_VERSION}
git -c advice.detachedHead=false checkout ${BSIM_VERSION}
west update
make everything -s -j 8
- if: github.event_name == 'push'
name: Run Tests with Twister (Push)
id: run_twister
run: |
export ZEPHYR_BASE=${PWD}
export ZEPHYR_TOOLCHAIN_VARIANT=zephyr
./scripts/twister --subset ${{matrix.subset}}/${{ strategy.job-total }} ${TWISTER_COMMON} ${PUSH_OPTIONS}
if [ "${{matrix.subset}}" = "1" ]; then
./scripts/zephyr_module.py --twister-out module_tests.args
if [ -s module_tests.args ]; then
./scripts/twister +module_tests.args --outdir module_tests ${TWISTER_COMMON} ${PUSH_OPTIONS}
fi
fi
- if: github.event_name == 'pull_request'
name: Run Tests with Twister (Pull Request)
id: run_twister_pr
run: |
rm -f testplan.json
export ZEPHYR_BASE=${PWD}
export ZEPHYR_TOOLCHAIN_VARIANT=zephyr
python3 ./scripts/ci/test_plan.py -c origin/${BASE_REF}.. --pull-request
./scripts/twister --subset ${{matrix.subset}}/${{ strategy.job-total }} --load-tests testplan.json ${TWISTER_COMMON} ${PR_OPTIONS}
if [ "${{matrix.subset}}" = "1" -a ${{needs.twister-build-prep.outputs.fullrun}} = 'True' ]; then
./scripts/zephyr_module.py --twister-out module_tests.args
if [ -s module_tests.args ]; then
./scripts/twister +module_tests.args --outdir module_tests ${TWISTER_COMMON} ${PR_OPTIONS}
fi
fi
- if: github.event_name == 'schedule'
name: Run Tests with Twister (Weekly)
id: run_twister_sched
run: |
export ZEPHYR_BASE=${PWD}
export ZEPHYR_TOOLCHAIN_VARIANT=zephyr
./scripts/twister --subset ${{matrix.subset}}/${{ strategy.job-total }} ${TWISTER_COMMON} ${WEEKLY_OPTIONS}
if [ "${{matrix.subset}}" = "1" ]; then
./scripts/zephyr_module.py --twister-out module_tests.args
if [ -s module_tests.args ]; then
./scripts/twister +module_tests.args --outdir module_tests ${TWISTER_COMMON} ${WEEKLY_OPTIONS}
fi
fi
- name: Print ccache stats
if: always()
run: |
ccache -s -vv
- name: Upload Unit Test Results
if: always()
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
with:
name: Unit Test Results (Subset ${{ matrix.subset }})
if-no-files-found: ignore
path: |
twister-out/twister.xml
twister-out/twister.json
module_tests/twister.xml
testplan.json
- if: matrix.subset == 1 && github.event_name == 'push'
name: Save the list of Python packages
shell: bash
run: |
FREEZE_FILE="frozen-requirements.txt"
timestamp="$(date)"
version="$(git describe --abbrev=12 --always)"
echo -e "# Generated at $timestamp ($version)\n" > $FREEZE_FILE
pip freeze | tee -a $FREEZE_FILE
- if: matrix.subset == 1 && github.event_name == 'push'
name: Upload the list of Python packages
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
with:
name: Frozen PIP package set
path: |
frozen-requirements.txt
twister-test-results:
name: "Publish Unit Tests Results"
needs:
- twister-build
runs-on: ubuntu-24.04
permissions:
checks: write # to create the check run entry with Twister test results
# the build-and-test job might be skipped, we don't need to run this job then
if: success() || failure()
steps:
- name: Check out source code
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
persist-credentials: false
- name: Set up Python
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: 3.12
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Download Artifacts
uses: actions/download-artifact@018cc2cf5baa6db3ef3c5f8a56943fffe632ef53 # v6.0.0
with:
path: artifacts
- name: Merge Test Results
run: |
junitparser merge --glob 'artifacts/*/twister.xml' 'artifacts/*/*/twister.xml' junit.xml
junit2html junit.xml junit.html
- name: Upload Unit Test Results
if: always()
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
with:
name: Unit Test Results
if-no-files-found: ignore
path: |
junit.html
junit.xml
- name: Upload test results to Codecov
if: ${{ !cancelled() && (github.event_name == 'push') }}
uses: codecov/test-results-action@47f89e9acb64b76debcd5ea40642d25a4adced9f # v1.1.1
with:
token: ${{ secrets.CODECOV_TOKEN }}
- name: Publish Unit Test Results
uses: EnricoMi/publish-unit-test-result-action@34d7c956a59aed1bfebf31df77b8de55db9bbaaf # v2.21.0
with:
check_name: Unit Test Results
files: "**/twister.xml"
comment_mode: off
- name: Analyze Twister Reports
if: needs.twister-build.result == 'failure'
run: |
./scripts/ci/twister_report_analyzer.py artifacts/*/*/twister.json --long-summary --platforms --output-md errors.md
if [[ -s "errors.md" ]]; then
echo '### Error Summary! 🚀' >> $GITHUB_STEP_SUMMARY
cat errors.md >> $GITHUB_STEP_SUMMARY
fi
- name: Upload Twister Analysis Results
if: needs.twister-build.result == 'failure'
uses: actions/upload-artifact@330a01c490aca151604b8cf639adc76d48f6c5d4 # v5.0.0
with:
name: Twister Analysis Results
if-no-files-found: ignore
path: |
twister_report_summary.json
twister-status-check:
if: always()
name: "Check Twister Status"
needs:
- twister-build-prep
- twister-build
uses: ./.github/workflows/ready-to-merge.yml
with:
needs_context: ${{ toJson(needs) }}
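
The subset matrix in the twister workflow above is driven by a plain JSON list built in shell: the prep job writes something like subset=[1,2,...,10] to $GITHUB_OUTPUT, and the build job expands it with fromJSON() so each runner invokes twister with --subset <n>/<total>. A minimal Python sketch of that expansion, assuming a matrix size of 10 (the MATRIX_SIZE used for pull requests above):

# Sketch only (not part of the workflow): how the "Determine matrix size"
# step feeds the build matrix, assuming MATRIX_SIZE=10.
matrix_size = 10

# Equivalent of: subset="[$(seq -s',' 1 ${MATRIX_SIZE})]"
subset = "[" + ",".join(str(i) for i in range(1, matrix_size + 1)) + "]"
print(subset)  # -> [1,2,3,4,5,6,7,8,9,10]

# The build job expands this with fromJSON(), so each matrix entry n
# ends up running roughly:
#   ./scripts/twister --subset n/10 ${TWISTER_COMMON} ${PR_OPTIONS}
for n in range(1, matrix_size + 1):
    print(f"./scripts/twister --subset {n}/{matrix_size} ...")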

.github/workflows/twister.yaml.disabled

@@ -0,0 +1,350 @@
name: Run tests with twister
on:
push:
branches:
- main
- v*-branch
- collab-*
pull_request_target:
branches:
- main
- v*-branch
- collab-*
schedule:
# Run at 03:00 UTC on every Sunday
- cron: '0 3 * * 0'
concurrency:
group: ${{ github.workflow }}-${{ github.event_name }}-${{ github.head_ref || github.ref }}
cancel-in-progress: true
jobs:
twister-build-prep:
if: github.repository_owner == 'zephyrproject-rtos'
runs-on:
group: zephyr-runner-v2-linux-x64-4xlarge
container:
image: ghcr.io/zephyrproject-rtos/ci-repo-cache:v0.26.7.20240313
options: '--entrypoint /bin/bash'
outputs:
subset: ${{ steps.output-services.outputs.subset }}
size: ${{ steps.output-services.outputs.size }}
fullrun: ${{ steps.output-services.outputs.fullrun }}
env:
MATRIX_SIZE: 10
PUSH_MATRIX_SIZE: 20
DAILY_MATRIX_SIZE: 80
BSIM_OUT_PATH: /opt/bsim/
BSIM_COMPONENTS_PATH: /opt/bsim/components
TESTS_PER_BUILDER: 700
COMMIT_RANGE: ${{ github.event.pull_request.base.sha }}..${{ github.event.pull_request.head.sha }}
BASE_REF: ${{ github.base_ref }}
steps:
- name: Apply container owner mismatch workaround
run: |
# FIXME: The owner UID of the GITHUB_WORKSPACE directory may not
# match the container user UID because of the way GitHub
# Actions runner is implemented. Remove this workaround when
# GitHub comes up with a fundamental fix for this problem.
git config --global --add safe.directory ${GITHUB_WORKSPACE}
- name: Print cloud service information
run: |
echo "ZEPHYR_RUNNER_CLOUD_PROVIDER = ${ZEPHYR_RUNNER_CLOUD_PROVIDER}"
echo "ZEPHYR_RUNNER_CLOUD_NODE = ${ZEPHYR_RUNNER_CLOUD_NODE}"
echo "ZEPHYR_RUNNER_CLOUD_POD = ${ZEPHYR_RUNNER_CLOUD_POD}"
- name: Clone cached Zephyr repository
if: github.event_name == 'pull_request_target'
continue-on-error: true
run: |
git clone --shared /repo-cache/zephyrproject/zephyr .
git remote set-url origin ${GITHUB_SERVER_URL}/${GITHUB_REPOSITORY}
- name: Checkout
if: github.event_name == 'pull_request_target'
uses: actions/checkout@v4
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
persist-credentials: false
- name: Environment Setup
if: github.event_name == 'pull_request_target'
run: |
git config --global user.email "bot@zephyrproject.org"
git config --global user.name "Zephyr Bot"
rm -fr ".git/rebase-apply"
git rebase origin/${BASE_REF}
git log --pretty=oneline | head -n 10
west init -l . || true
west config manifest.group-filter -- +ci,+optional
west config --global update.narrow true
west update --path-cache /repo-cache/zephyrproject 2>&1 1> west.update.log || west update --path-cache /repo-cache/zephyrproject 2>&1 1> west.update.log || ( rm -rf ../modules ../bootloader ../tools && west update --path-cache /repo-cache/zephyrproject)
west forall -c 'git reset --hard HEAD'
echo "ZEPHYR_SDK_INSTALL_DIR=/opt/toolchains/zephyr-sdk-$( cat SDK_VERSION )" >> $GITHUB_ENV
- name: Generate Test Plan with Twister
if: github.event_name == 'pull_request_target'
id: test-plan
run: |
export ZEPHYR_BASE=${PWD}
export ZEPHYR_TOOLCHAIN_VARIANT=zephyr
python3 ./scripts/ci/test_plan.py -c origin/${BASE_REF}.. --pull-request -t $TESTS_PER_BUILDER
if [ -s .testplan ]; then
cat .testplan >> $GITHUB_ENV
else
echo "TWISTER_NODES=${MATRIX_SIZE}" >> $GITHUB_ENV
fi
rm -f testplan.json .testplan
- name: Determine matrix size
id: output-services
run: |
if [ "${{github.event_name}}" = "pull_request_target" ]; then
if [ -n "${TWISTER_NODES}" ]; then
subset="[$(seq -s',' 1 ${TWISTER_NODES})]"
else
subset="[$(seq -s',' 1 ${MATRIX_SIZE})]"
fi
size=${TWISTER_NODES}
elif [ "${{github.event_name}}" = "push" ]; then
subset="[$(seq -s',' 1 ${PUSH_MATRIX_SIZE})]"
size=${MATRIX_SIZE}
elif [ "${{github.event_name}}" = "schedule" -a "${{github.repository}}" = "zephyrproject-rtos/zephyr" ]; then
subset="[$(seq -s',' 1 ${DAILY_MATRIX_SIZE})]"
size=${DAILY_MATRIX_SIZE}
else
size=0
fi
echo "subset=${subset}" >> $GITHUB_OUTPUT
echo "size=${size}" >> $GITHUB_OUTPUT
echo "fullrun=${TWISTER_FULL}" >> $GITHUB_OUTPUT
twister-build:
runs-on:
group: zephyr-runner-v2-linux-x64-4xlarge
needs: twister-build-prep
if: needs.twister-build-prep.outputs.size != 0
container:
image: ghcr.io/zephyrproject-rtos/ci-repo-cache:v0.26.7.20240313
options: '--entrypoint /bin/bash'
strategy:
fail-fast: false
matrix:
subset: ${{fromJSON(needs.twister-build-prep.outputs.subset)}}
timeout-minutes: 1440
env:
CCACHE_DIR: /node-cache/ccache-zephyr
CCACHE_REMOTE_STORAGE: "redis://cache-*.keydb-cache.svc.cluster.local|shards=1,2,3"
CCACHE_REMOTE_ONLY: "true"
# `--specs` is ignored because ccache is unable to resolve the toolchain specs file path.
CCACHE_IGNOREOPTIONS: '--specs=*'
BSIM_OUT_PATH: /opt/bsim/
BSIM_COMPONENTS_PATH: /opt/bsim/components
TWISTER_COMMON: ' --force-color --inline-logs -v -N -M --retry-failed 3 --timeout-multiplier 2 '
DAILY_OPTIONS: ' -M --build-only --all --show-footprint'
PR_OPTIONS: ' --clobber-output --integration'
PUSH_OPTIONS: ' --clobber-output -M --show-footprint'
COMMIT_RANGE: ${{ github.event.pull_request.base.sha }}..${{ github.event.pull_request.head.sha }}
BASE_REF: ${{ github.base_ref }}
steps:
- name: Print cloud service information
run: |
echo "ZEPHYR_RUNNER_CLOUD_PROVIDER = ${ZEPHYR_RUNNER_CLOUD_PROVIDER}"
echo "ZEPHYR_RUNNER_CLOUD_NODE = ${ZEPHYR_RUNNER_CLOUD_NODE}"
echo "ZEPHYR_RUNNER_CLOUD_POD = ${ZEPHYR_RUNNER_CLOUD_POD}"
- name: Apply container owner mismatch workaround
run: |
# FIXME: The owner UID of the GITHUB_WORKSPACE directory may not
# match the container user UID because of the way GitHub
# Actions runner is implemented. Remove this workaround when
# GitHub comes up with a fundamental fix for this problem.
git config --global --add safe.directory ${GITHUB_WORKSPACE}
- name: Clone cached Zephyr repository
continue-on-error: true
run: |
git clone --shared /repo-cache/zephyrproject/zephyr .
git remote set-url origin ${GITHUB_SERVER_URL}/${GITHUB_REPOSITORY}
- name: Checkout
uses: actions/checkout@v4
with:
ref: ${{ github.event.pull_request.head.sha }}
fetch-depth: 0
persist-credentials: false
- name: Environment Setup
run: |
if [ "${{github.event_name}}" = "pull_request_target" ]; then
git config --global user.email "bot@zephyrproject.org"
git config --global user.name "Zephyr Builder"
rm -fr ".git/rebase-apply"
git rebase origin/${BASE_REF}
git log --pretty=oneline | head -n 10
fi
echo "$HOME/.local/bin" >> $GITHUB_PATH
west init -l . || true
west config manifest.group-filter -- +ci,+optional
west config --global update.narrow true
west update --path-cache /repo-cache/zephyrproject 2>&1 1> west.update.log || west update --path-cache /repo-cache/zephyrproject 2>&1 1> west.update.log || ( rm -rf ../modules ../bootloader ../tools && west update --path-cache /repo-cache/zephyrproject)
west forall -c 'git reset --hard HEAD'
echo "ZEPHYR_SDK_INSTALL_DIR=/opt/toolchains/zephyr-sdk-$( cat SDK_VERSION )" >> $GITHUB_ENV
- name: Check Environment
run: |
cmake --version
gcc --version
ls -la
echo "github.ref: ${{ github.ref }}"
echo "github.base_ref: ${{ github.base_ref }}"
echo "github.ref_name: ${{ github.ref_name }}"
- name: Set up ccache
run: |
mkdir -p ${CCACHE_DIR}
ccache -M 10G
ccache -p
ccache -z -s -vv
- if: github.event_name == 'push'
name: Run Tests with Twister (Push)
run: |
export ZEPHYR_BASE=${PWD}
export ZEPHYR_TOOLCHAIN_VARIANT=zephyr
./scripts/twister --subset ${{matrix.subset}}/${{ strategy.job-total }} ${TWISTER_COMMON} ${PUSH_OPTIONS}
if [ "${{matrix.subset}}" = "1" ]; then
./scripts/zephyr_module.py --twister-out module_tests.args
if [ -s module_tests.args ]; then
./scripts/twister +module_tests.args --outdir module_tests ${TWISTER_COMMON} ${PUSH_OPTIONS}
fi
fi
- if: github.event_name == 'pull_request_target'
name: Run Tests with Twister (Pull Request)
run: |
rm -f testplan.json
export ZEPHYR_BASE=${PWD}
export ZEPHYR_TOOLCHAIN_VARIANT=zephyr
python3 ./scripts/ci/test_plan.py -c origin/${BASE_REF}.. --pull-request
./scripts/twister --subset ${{matrix.subset}}/${{ strategy.job-total }} --load-tests testplan.json ${TWISTER_COMMON} ${PR_OPTIONS}
if [ "${{matrix.subset}}" = "1" -a ${{needs.twister-build-prep.outputs.fullrun}} = 'True' ]; then
./scripts/zephyr_module.py --twister-out module_tests.args
if [ -s module_tests.args ]; then
./scripts/twister +module_tests.args --outdir module_tests ${TWISTER_COMMON} ${PR_OPTIONS}
fi
fi
- if: github.event_name == 'schedule'
name: Run Tests with Twister (Daily)
run: |
export ZEPHYR_BASE=${PWD}
export ZEPHYR_TOOLCHAIN_VARIANT=zephyr
./scripts/twister --subset ${{matrix.subset}}/${{ strategy.job-total }} ${TWISTER_COMMON} ${DAILY_OPTIONS}
if [ "${{matrix.subset}}" = "1" ]; then
./scripts/zephyr_module.py --twister-out module_tests.args
if [ -s module_tests.args ]; then
./scripts/twister +module_tests.args --outdir module_tests ${TWISTER_COMMON} ${DAILY_OPTIONS}
fi
fi
- name: Print ccache stats
if: always()
run: |
ccache -s -vv
- name: Upload Unit Test Results
if: always()
uses: actions/upload-artifact@v4
with:
name: Unit Test Results (Subset ${{ matrix.subset }})
if-no-files-found: ignore
path: |
twister-out/twister.xml
twister-out/twister.json
module_tests/twister.xml
testplan.json
- if: matrix.subset == 1 && github.event_name == 'push'
name: Save the list of Python packages
shell: bash
run: |
FREEZE_FILE="frozen-requirements.txt"
timestamp="$(date)"
version="$(git describe --abbrev=12 --always)"
echo -e "# Generated at $timestamp ($version)\n" > $FREEZE_FILE
pip3 freeze | tee -a $FREEZE_FILE
- if: matrix.subset == 1 && github.event_name == 'push'
name: Upload the list of Python packages
uses: actions/upload-artifact@v4
with:
name: Frozen PIP package set
path: |
frozen-requirements.txt
twister-test-results:
name: "Publish Unit Tests Results"
env:
ELASTICSEARCH_KEY: ${{ secrets.ELASTICSEARCH_KEY }}
ELASTICSEARCH_SERVER: "https://elasticsearch.zephyrproject.io:443"
needs: twister-build
runs-on: ubuntu-22.04
# the build-and-test job might be skipped; we don't need to run this job then
if: success() || failure()
steps:
# Needed for opensearch and upload script
- if: github.event_name == 'push' || github.event_name == 'schedule'
name: Checkout
uses: actions/checkout@v4
with:
fetch-depth: 0
persist-credentials: false
- name: Download Artifacts
uses: actions/download-artifact@v4
with:
path: artifacts
- if: github.event_name == 'push' || github.event_name == 'schedule'
name: Upload to opensearch
run: |
pip3 install elasticsearch
# set run date on upload to get consistent and unified data across the matrix.
run_date=`date --iso-8601=minutes`
if [ "${{github.event_name}}" = "push" ]; then
python3 ./scripts/ci/upload_test_results_es.py -r ${run_date} \
--index zephyr-main-ci-push-1 artifacts/*/*/twister.json
elif [ "${{github.event_name}}" = "schedule" ]; then
python3 ./scripts/ci/upload_test_results_es.py -r ${run_date} \
--index zephyr-main-ci-weekly-1 artifacts/*/*/twister.json
fi
- name: Merge Test Results
run: |
pip3 install junitparser junit2html
junitparser merge artifacts/*/*/twister.xml junit.xml
junit2html junit.xml junit.html
- name: Upload Unit Test Results in HTML
if: always()
uses: actions/upload-artifact@v4
with:
name: HTML Unit Test Results
if-no-files-found: ignore
path: |
junit.html
- name: Publish Unit Test Results
uses: EnricoMi/publish-unit-test-result-action@v2
with:
check_name: Unit Test Results
files: "**/twister.xml"
comment_mode: off


@@ -1,69 +0,0 @@
# Copyright (c) 2020 Intel Corporation.
# SPDX-License-Identifier: Apache-2.0
name: Twister TestSuite
on:
push:
branches:
- main
- v*-branch
- collab-*
paths:
- 'scripts/pylib/**'
- 'scripts/twister'
- 'scripts/tests/twister/**'
- '.github/workflows/twister_tests.yml'
- 'scripts/schemas/twister/'
pull_request:
branches:
- main
- v*-branch
paths:
- 'scripts/pylib/**'
- 'scripts/twister'
- 'scripts/tests/twister/**'
- '.github/workflows/twister_tests.yml'
- 'scripts/schemas/twister/'
permissions:
contents: read
jobs:
twister-tests:
name: Twister Unit Tests
runs-on: ${{ matrix.os }}
strategy:
matrix:
python-version: ['3.10', '3.11', '3.12', '3.13']
os: [ubuntu-24.04]
steps:
- name: checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: ${{ matrix.python-version }}
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: Run pytest for twisterlib
env:
ZEPHYR_BASE: ./
ZEPHYR_TOOLCHAIN_VARIANT: zephyr
run: |
echo "Run twister tests"
PYTHONPATH=./scripts/tests pytest ./scripts/tests/twister
- name: Run pytest for pytest-twister-harness
env:
ZEPHYR_BASE: ./
ZEPHYR_TOOLCHAIN_VARIANT: zephyr
PYTHONPATH: ./scripts/pylib/pytest-twister-harness/src:${PYTHONPATH}
run: |
echo "Run twister tests"
pytest ./scripts/pylib/pytest-twister-harness/tests


@@ -0,0 +1,66 @@
# Copyright (c) 2020 Intel Corporation.
# SPDX-License-Identifier: Apache-2.0
name: Twister TestSuite
on:
push:
branches:
- main
- v*-branch
paths:
- 'scripts/pylib/**'
- 'scripts/twister'
- 'scripts/tests/twister/**'
- '.github/workflows/twister_tests.yml'
pull_request:
branches:
- main
- v*-branch
paths:
- 'scripts/pylib/**'
- 'scripts/twister'
- 'scripts/tests/twister/**'
- '.github/workflows/twister_tests.yml'
jobs:
twister-tests:
name: Twister Unit Tests
runs-on: ${{ matrix.os }}
strategy:
matrix:
python-version: [3.8, 3.9, '3.10', '3.11', '3.12']
os: [ubuntu-22.04]
steps:
- name: checkout
uses: actions/checkout@v4
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v4
with:
python-version: ${{ matrix.python-version }}
- name: cache-pip-linux
if: startsWith(runner.os, 'Linux')
uses: actions/cache@v4
with:
path: ~/.cache/pip
key: ${{ runner.os }}-pip-${{ matrix.python-version }}
restore-keys: |
${{ runner.os }}-pip-${{ matrix.python-version }}
- name: install-packages
run: |
pip3 install -r scripts/requirements-base.txt -r scripts/requirements-build-test.txt -r scripts/requirements-run-test.txt
- name: Run pytest for twisterlib
env:
ZEPHYR_BASE: ./
ZEPHYR_TOOLCHAIN_VARIANT: zephyr
run: |
echo "Run twister tests"
PYTHONPATH=./scripts/tests pytest ./scripts/tests/twister
- name: Run pytest for pytest-twister-harness
env:
ZEPHYR_BASE: ./
ZEPHYR_TOOLCHAIN_VARIANT: zephyr
PYTHONPATH: ./scripts/pylib/pytest-twister-harness/src:${PYTHONPATH}
run: |
echo "Run twister tests"
pytest ./scripts/pylib/pytest-twister-harness/tests


@@ -1,68 +0,0 @@
# Copyright (c) 2023 Intel Corporation.
# SPDX-License-Identifier: Apache-2.0
name: Twister BlackBox TestSuite
on:
pull_request:
branches:
- main
- collab-*
- v*-branch
paths:
- 'scripts/pylib/twister/**'
- 'scripts/twister'
- 'scripts/tests/twister_blackbox/**'
- '.github/workflows/twister_tests_blackbox.yml'
permissions:
contents: read
env:
PYTHONIOENCODING: utf-8
jobs:
twister-tests:
name: Twister Black Box Tests
strategy:
matrix:
python-version: ['3.10', '3.11', '3.12', '3.13']
os: [ubuntu-24.04, macos-14, windows-2022]
fail-fast: false
runs-on: ${{ matrix.os }}
steps:
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
path: zephyr
fetch-depth: 0
- name: Set Up Python ${{ matrix.python-version }}
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: ${{ matrix.python-version }}
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Setup Zephyr project
uses: zephyrproject-rtos/action-zephyr-setup@c125c5ebeeadbd727fa740b407f862734af1e52a # v1.0.9
with:
app-path: zephyr
toolchains: all
enable-ccache: false
west-group-filter: -tools,-bootloader,-babblesim,-hal
west-project-filter: -nrf_hw_models,+cmsis,+hal_xtensa,+cmsis_6
- name: Run Pytest For Twister Black Box Tests
if: ${{ runner.os == 'Linux' }}
working-directory: zephyr
shell: bash
env:
ZEPHYR_BASE: ./
ZEPHYR_TOOLCHAIN_VARIANT: zephyr
run: |
export ZEPHYR_SDK_INSTALL_DIR=${{ github.workspace }}/zephyr-sdk
sudo apt-get install -y lcov
echo "Run twister tests"
source zephyr-env.sh
PYTHONPATH="./scripts/tests" pytest ./scripts/tests/twister_blackbox/


@@ -0,0 +1,91 @@
# Copyright (c) 2023 Intel Corporation.
# SPDX-License-Identifier: Apache-2.0
name: Twister BlackBox TestSuite
on:
pull_request:
branches:
- main
paths:
- 'scripts/pylib/twister/**'
- 'scripts/twister'
- 'scripts/tests/twister_blackbox/**'
- '.github/workflows/twister_tests_blackbox.yml'
jobs:
twister-tests:
name: Twister Black Box Tests
runs-on: ${{ matrix.os }}
strategy:
matrix:
python-version: [3.8, 3.9, '3.10', '3.11', '3.12']
os: [ubuntu-22.04]
container:
image: ghcr.io/zephyrproject-rtos/ci:v0.26.7
steps:
- name: Apply Container Owner Mismatch Workaround
run: |
# FIXME: The owner UID of the GITHUB_WORKSPACE directory may not
# match the container user UID because of the way GitHub
# Actions runner is implemented. Remove this workaround when
# GitHub comes up with a fundamental fix for this problem.
git config --global --add safe.directory ${GITHUB_WORKSPACE}
- name: Checkout
uses: actions/checkout@v4
- name: Environment Setup
run: |
echo "$HOME/.local/bin" >> $GITHUB_PATH
west init -l . || true
west config --global update.narrow true
west update --path-cache /github/cache/zephyrproject 2>&1 1> west.update.log || west update --path-cache /github/cache/zephyrproject 2>&1 1> west.update.log || ( rm -rf ../modules ../bootloader ../tools && west update --path-cache /github/cache/zephyrproject)
west forall -c 'git reset --hard HEAD'
echo "ZEPHYR_SDK_INSTALL_DIR=/opt/toolchains/zephyr-sdk-$( cat SDK_VERSION )" >> $GITHUB_ENV
- name: Set Up Python ${{ matrix.python-version }}
uses: actions/setup-python@v4
with:
python-version: ${{ matrix.python-version }}
- name: Go Into Venv
shell: bash
run: |
python3 -m pip install --user virtualenv
python3 -m venv env
source env/bin/activate
echo "$(which python)"
- name: Install Packages
run: |
python3 -m pip install -U -r scripts/requirements-base.txt -r scripts/requirements-build-test.txt -r scripts/requirements-run-test.txt
- name: Run Pytest For Twister Black Box Tests
shell: bash
env:
ZEPHYR_BASE: ./
ZEPHYR_TOOLCHAIN_VARIANT: zephyr
run: |
echo "Run twister tests"
source zephyr-env.sh
PYTHONPATH="./scripts/tests" pytest ./scripts/tests/twister_blackbox
- name: Upload Unit Test Results
if: success() || failure()
uses: actions/upload-artifact@v2
with:
name: Black Box Test Results (Python ${{ matrix.python-version }})
path: |
twister-out*/twister.log
twister-out*/twister.json
twister-out*/testplan.log
retention-days: 14
- name: Clear Workspace
if: success() || failure()
run: |
rm -rf twister-out*/


@@ -1,60 +0,0 @@
# Copyright (c) 2020 Linaro Limited.
# SPDX-License-Identifier: Apache-2.0
name: Zephyr West Command Tests
on:
push:
branches:
- main
- v*-branch
- collab-*
paths:
- 'scripts/west-commands.yml'
- 'scripts/west_commands/**'
- '.github/workflows/west_cmds.yml'
pull_request:
branches:
- main
- v*-branch
- collab-*
paths:
- 'scripts/west-commands.yml'
- 'scripts/west_commands/**'
- '.github/workflows/west_cmds.yml'
permissions:
contents: read
jobs:
west-commands:
name: West Command Tests
runs-on: ${{ matrix.os }}
strategy:
matrix:
python-version: ['3.10', '3.11', '3.12', '3.13']
os: [ubuntu-22.04, macos-14, windows-2022]
steps:
- name: checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@e797f83bcb11b83ae66e0230d6156d7c80228e7c # v6.0.0
with:
python-version: ${{ matrix.python-version }}
cache: pip
cache-dependency-path: scripts/requirements-actions.txt
- name: Install Python packages
run: |
pip install -r scripts/requirements-actions.txt --require-hashes
- name: run pytest-win
if: runner.os == 'Windows'
run: |
python ./scripts/west_commands/run_tests.py
- name: run pytest-mac-linux
if: runner.os != 'Windows'
run: |
./scripts/west_commands/run_tests.py


@@ -0,0 +1,80 @@
# Copyright (c) 2020 Linaro Limited.
# SPDX-License-Identifier: Apache-2.0
name: Zephyr West Command Tests
on:
push:
branches:
- main
- v*-branch
paths:
- 'scripts/west-commands.yml'
- 'scripts/west_commands/**'
- '.github/workflows/west_cmds.yml'
pull_request:
branches:
- main
- v*-branch
paths:
- 'scripts/west-commands.yml'
- 'scripts/west_commands/**'
- '.github/workflows/west_cmds.yml'
jobs:
west-commands:
name: West Command Tests
runs-on: ${{ matrix.os }}
strategy:
matrix:
python-version: [3.8, 3.9, '3.10', '3.11', '3.12']
os: [ubuntu-22.04, macos-11, windows-2022]
exclude:
- os: macos-11
python-version: 3.6
- os: windows-2022
python-version: 3.6
steps:
- name: checkout
uses: actions/checkout@v4
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v4
with:
python-version: ${{ matrix.python-version }}
- name: cache-pip-linux
if: startsWith(runner.os, 'Linux')
uses: actions/cache@v4
with:
path: ~/.cache/pip
key: ${{ runner.os }}-pip-${{ matrix.python-version }}
restore-keys: |
${{ runner.os }}-pip-${{ matrix.python-version }}
- name: cache-pip-mac
if: startsWith(runner.os, 'macOS')
uses: actions/cache@v4
with:
path: ~/Library/Caches/pip
# Trailing '-' was just to get a different cache name
key: ${{ runner.os }}-pip-${{ matrix.python-version }}-
restore-keys: |
${{ runner.os }}-pip-${{ matrix.python-version }}-
- name: cache-pip-win
if: startsWith(runner.os, 'Windows')
uses: actions/cache@v4
with:
path: ~\AppData\Local\pip\Cache
key: ${{ runner.os }}-pip-${{ matrix.python-version }}
restore-keys: |
${{ runner.os }}-pip-${{ matrix.python-version }}
- name: install pytest
run: |
pip3 install wheel
pip3 install pytest west pyelftools canopen natsort progress mypy intelhex psutil ply pyserial
- name: run pytest-win
if: runner.os == 'Windows'
run: |
python ./scripts/west_commands/run_tests.py
- name: run pytest-mac-linux
if: runner.os != 'Windows'
run: |
./scripts/west_commands/run_tests.py

.gitignore

@@ -1,6 +1,3 @@
# SPDX-License-Identifier: Apache-2.0
# SPDX-FileCopyrightText: Copyright The Zephyr Project Contributors
*.o
*.a
*.d
@@ -10,10 +7,8 @@
*.swp
*.swo
*~
# Emacs
.\#*
\#*\#
build*/
!doc/build/
!scripts/build
@@ -32,8 +27,6 @@ outdir
outdir-*
scripts/basic/fixdep
scripts/gen_idt/gen_idt
coverage-report
doc-coverage.info
doc/_build
doc/doxygen
doc/xml
@@ -46,7 +39,6 @@ sanity-out*
twister-out*
bsim_out
bsim_bt_out
myresults.xml
tests/RunResults.xml
scripts/grub
doc/reference/kconfig/*.rst
@@ -58,25 +50,11 @@ doc/doc.warnings
hide-defaults-note
venv
.venv
.envrc
.DS_Store
.clangd
new.info
# Cargo drops lock files in projects to capture resolved dependencies.
# We don't want to record these.
Cargo.lock
# Cargo encourages a .cargo/config.toml file to symlink to a generated file. Don't save these.
.cargo/
# Normal west builds will place the Rust target directory under the build directory. However,
# sometimes IDEs and such will litter these target directories as well.
target/
# CI output
compliance.xml
dts_linter.patch
_error.types
# Tag files
@@ -89,37 +67,18 @@ tags
.idea
# from check_compliance.py
# zephyr-keep-sorted-start
BinaryFiles.txt
BoardYml.txt
Checkpatch.txt
ClangFormat.txt
DevicetreeBindings.txt
DevicetreeLinting.txt
GitDiffCheck.txt
Gitlint.txt
Identity.txt
ImageSize.txt
Kconfig.txt
KconfigBasic.txt
KconfigBasicNoModules.txt
KconfigHWMv2.txt
KeepSorted.txt
LicenseAndCopyrightCheck.txt
MaintainersFormat.txt
ModulesMaintainers.txt
Nits.txt
Pylint.txt
PythonCompat.txt
Ruff.txt
SphinxLint.txt
SysbuildKconfig.txt
SysbuildKconfigBasic.txt
SysbuildKconfigBasicNoModules.txt
TextEncoding.txt
YAMLLint.txt
ZephyrModuleFile.txt
# zephyr-keep-sorted-stop
# Node dependencies
node_modules


@@ -1,7 +1,6 @@
# All these sections are optional, edit this file as you like.
# Zephyr-specific defaults are located in scripts/gitlint/zephyr_commit_rules.py
[general]
regex-style-search=true
ignore=title-trailing-punctuation, T3, title-max-length, T1, body-hard-tab, B3, B1
# verbosity should be a value between 1 and 3, the commandline -v flags take precedence over this
verbosity = 3
@@ -27,7 +26,7 @@ extra-path=scripts/gitlint
# max-line-count=200
[title-starts-with-subsystem]
regex = ^(?!subsys:)(?!treewide:)(([^:]+):)(\s([^:]+):)*\s(.+)$
regex = ^(?!subsys:)(([^:]+):)(\s([^:]+):)*\s(.+)$
[title-must-not-contain-word]
# Comma-separated list of words that should not occur in the title. Matching is case
@@ -60,7 +59,3 @@ ignore-merge-commits=false
# By specifying this rule, developers can only change the file when they explicitly reference
# it in the commit message.
#files=gitlint/rules.py,README.md
[ignore-by-author-name]
regex=^dependabot\[bot\]$
ignore=all


@@ -1,14 +1,14 @@
Alexandr Kolosov <rikorsev@gmail.com>
Alexandre d'Alton <alexandre.dalton@intel.com>
Amir Kaplan <amir.kaplan@intel.com>
Anas Nashif <anas.nashif@intel.com>
Amir Kaplan <amir.kaplan@intel.com> <amir.kaplan@intel.com>
Anas Nashif <anas.nashif@intel.com> <anas.nashif@intel.com>
Andrzej Kuroś <andrzej.kuros@nordicsemi.no>
Anthony Smigielski <thebasti0ncode@gmail.com>
Armand Ciejak <armand@riedonetworks.com> <armandciejak@users.noreply.github.com>
Aska Wu <aska.wu@linaro.org>
Bit Pathe <bitpathe@gmail.com>
Bit Pathe <bitpathe@gmail.com> <bitpathe@gmail.com>
Bjarki Arge Andreasen <baa@trackunit.com>
Carles Cufi <carles.cufi@nordicsemi.no>
Carles Cufi <carles.cufi@nordicsemi.no> <carles.cufi@nordicsemi.no>
chao an <anchao@xiaomi.com>
Charles E. Youse <charles.youse@intel.com>
Chen Xingyu <hi@xingrz.me>
@@ -16,32 +16,32 @@ Christoph Schnetzler <christoph.schnetzler@husqvarnagroup.com>
Christoph Schramm <schramm@makaio.com>
Christopher Friedt <cfriedt@meta.com>
Christopher Friedt <cfriedt@meta.com> <cfriedt@fb.com>
Chuck Jordan <cjordan@synopsys.com>
Chuck Jordan <cjordan@synopsys.com> <cjordan@synopsys.com>
Chunlin Han <chunlin.han@linaro.org> <chunlin.han@acer.com>
David B. Kinder <david.b.kinder@intel.com>
David Komel <a8961713@gmail.com>
David Leach <david.leach@nxp.com>
Dirk Brandewie <dirk.j.brandewie@intel.com>
Douglas Su <d0u9.su@outlook.com>
Dirk Brandewie <dirk.j.brandewie@intel.com> <dirk.j.brandewie@intel.com>
Douglas Su <d0u9.su@outlook.com> <d0u9.su@outlook.com>
Enjia Mai <enjia.mai@intel.com>
Enjia Mai <enjia.mai@intel.com>
Evan Couzens <evanx.couzens@intel.com>
Enjia Mai <enjia.mai@intel.com> <enjiax.mai@intel.com>
Evan Couzens <evanx.couzens@intel.com> <evanx.couzens@intel.com>
Evgeniy Paltsev <PaltsevEvgeniy@gmail.com>
Evgeniy Paltsev <PaltsevEvgeniy@gmail.com> <Eugeniy.Paltsev@synopsys.com>
Felipe Neves <ryukokki.felipe@gmail.com>
Felipe Neves <ryukokki.felipe@gmail.com> <ryukokki.felipe@gmail.com>
Findlay Feng <i@fengch.me>
Flavio Arieta Netto <flavio@exati.com.br>
Francois Ramu <francois.ramu@st.com>
Gerardo Aceves <gerardo.aceves@intel.com>
Gerardo Aceves <gerardo.aceves@intel.com> <gerardo.aceves@intel.com>
Gregory Shue <gregory.shue@legrand.com>
Gregory Shue <gregory.shue@legrand.com> <gregory.shue@legrand.us>
HaiLong Yang <cameledyang@pm.me>
James Johnson <james.johnson672@t-mobile.com>
Jarno Lämsä <jarno.lamsa@nordicsemi.no>
Jędrzej Ciupis <jedrzej.ciupis@nordicsemi.no>
Jeremie Garcia <jeremie.garcia@intel.com>
Jeremie Garcia <jeremie.garcia@intel.com> <jeremie.garcia@intel.com>
Jim Benjamin Luther <jilu@oticon.com>
Johan Kruger <johan.kruger@windriver.com>
Johan Kruger <johan.kruger@windriver.com> <johan.kruger@windriver.com>
Johann Fischer <j.fischer@phytec.de>
Jørgen Kvalvaag <jorgen.kvalvaag@nordicsemi.no>
Juan Manuel Cruz Alcaraz <juan.m.cruz.alcaraz@intel.com>
@@ -51,17 +51,16 @@ Jun Li <jun.r.li@intel.com>
Justin Watson <jwatson5@gmail.com>
Kamil Sroka <kamil.sroka@nordicsemi.no>
Katarzyna Giądła <katarzyna.giadla@nordicsemi.no>
Keren Siman-Tov <keren.siman-tov@intel.com>
Keren Siman-Tov <keren.siman-tov@intel.com> <keren.siman-tov@intel.com>
Krzysztof Chruściński <krzysztof.chruscinski@nordicsemi.no>
Kuo-Lang Tseng <kuo-lang.tseng@intel.com>
Lei Liu <lei.a.liu@intel.com>
Leona Cook <leonax.cook@intel.com>
Kuo-Lang Tseng <kuo-lang.tseng@intel.com> <kuo-lang.tseng@intel.com>
Lei Liu <lei.a.liu@intel.com> <lei.a.liu@intel.com>
Leona Cook <leonax.cook@intel.com> <leonax.cook@intel.com>
Leona Cook <leonax.cook@intel.com> <lsc@hackeress.com>
Lixin Guo <lixinx.guo@intel.com>
Łukasz Mazur <lukasz.mazur@hidglobal.com>
Manuel Argüelles <manuel.arguelles@nxp.com>
Manuel Argüelles <manuel.arguelles@nxp.com> <manuel.arguelles@coredumplabs.com>
Manuel Argüelles <manuel.arguelles@nxp.com> <marguelles.dev@gmail.com>
Marc Herbert <marc.herbert@intel.com> <46978960+marc-hb@users.noreply.github.com>
Marin Jurjević <marin.jurjevic@hotmail.com>
Mariusz Ryndzionek <mariusz.ryndzionek@firmwave.com>
@@ -72,14 +71,14 @@ Martí Bolívar <marti.bolivar@nordicsemi.no> <marti.bolivar@linaro.org>
Martí Bolívar <marti.bolivar@nordicsemi.no> <marti.f.bolivar@gmail.com>
Martí Bolívar <marti.bolivar@nordicsemi.no> <marti@foundries.io>
Martí Bolívar <marti.bolivar@nordicsemi.no> <marti@opensourcefoundries.com>
Martin Jäger <martin@libre.solar> <17674105+martinjaeger@users.noreply.github.com>
Martin Jäger <martin@libre.solar> <17674105+martinjaeger@users.noreply.github.com>
Mateusz Hołenko <mholenko@antmicro.com>
Michael Rosen <michael.r.rosen@intel.com>
Michal Narajowski <michal.narajowski@codecoup.pl>
Mike Hirst <michael.hirst@windriver.com>
Mike Hirst <michael.hirst@windriver.com> <michael.hirst@windriver.com>
Ming Shao <ming.shao@intel.com>
Mohan Kumar Kumar <mohankm@fb.com>
Naga Raja Rao Tulasi <tulasi.r@tcs.com>
Naga Raja Rao Tulasi <tulasi.r@tcs.com> <tulasi.r@tcs.com>
Navin Sankar Velliangiri <navin@linumiz.com>
NingX Zhao <ningx.zhao@intel.com>
Nishikant Nayak <nishikantax.nayak@intel.com>
@@ -100,23 +99,21 @@ Radosław Koppel <radoslaw.koppel@nordicsemi.no> <r.koppel@k-el.com>
Raja D. Singh <rdsingh@iotwizards.com>
Ricardo Salveti <ricardo@opensourcefoundries.com>
Ricardo Salveti <ricardo@opensourcefoundries.com> <ricardo.salveti@linaro.org>
Ruud Derwig <Ruud.Derwig@synopsys.com>
Ruud Derwig <Ruud.Derwig@synopsys.com> <Ruud.Derwig@synopsys.com>
Saku Rautio <saku.rautio@nordicsemi.no>
Scott Worley <scott.worley@microchip.com>
Sean Nyekjaer <sean@geanix.com> <sean.nyekjaer@prevas.dk>
Sean Nyekjaer <sean@geanix.com> <sean@nyekjaer.dk>
Sharron Liu <sharron.liu@intel.com>
Shilpashree L C <shilpashree.lc@intel.com>
Shuang He <shuang.he@intel.com>
Shuang He <shuang.he@intel.com> <shuang.he@intel.com>
Sigvart Hovland <sigvart.hovland@nordicsemi.no>
Stéphane D'Alu <sdalu@sdalu.com>
Stine Åkredalen <stine.akredalen@nordicsemi.no>
Thomas Heeley <thomas.heeley@intel.com>
Thomas Heeley <thomas.heeley@intel.com> <thomas.heeley@intel.com>
Tim Sørensen <tims@demant.com>
Tim Sørensen <tims@demant.com> <tims@oticon.com>
Vinayak Kariappa Chettimada <vinayak.kariappa.chettimada@nordicsemi.no>
Vinayak Kariappa Chettimada <vinayak.kariappa.chettimada@nordicsemi.no> <vich@nordicsemi.no>
Vinayak Kariappa Chettimada <vinayak.kariappa.chettimada@nordicsemi.no> <vinayak.kariappa@gmail.com>
Vinayak Kariappa Chettimada <vinayak.kariappa.chettimada@nordicsemi.no> <vinayak.kariappa.chettimada@nordicsemi.no> <vich@nordicsemi.no> <vinayak.kariappa@gmail.com>
Xiaorui Hu <xiaorui.hu@linaro.org>
Yannis Damigos <giannis.damigos@gmail.com> <ydamigos@iccs.gr>
Yonattan Louise <yonattan.a.louise.mendoza@intel.com>

File diff suppressed because it is too large.


@@ -1,30 +0,0 @@
# Copyright (c) 2024 Basalte bv
# SPDX-License-Identifier: Apache-2.0
extend = ".ruff-excludes.toml"
line-length = 100
target-version = "py310"
[lint]
select = [
# zephyr-keep-sorted-start
"B", # flake8-bugbear
"E", # pycodestyle
"F", # pyflakes
"I", # isort
"SIM", # flake8-simplify
"UP", # pyupgrade
"W", # pycodestyle warnings
# zephyr-keep-sorted-stop
]
ignore = [
# zephyr-keep-sorted-start
"SIM108", # Allow if-else blocks instead of forcing ternary operator
# zephyr-keep-sorted-stop
]
[format]
quote-style = "preserve"
line-ending = "lf"

File diff suppressed because it is too large.


@@ -1,4 +1,512 @@
# Instead of the CODEOWNERS file, The Zephyr Project uses a custom format.
# See MAINTAINERS.yml for details.
# CODEOWNERS for autoreview assigning in github
# https://help.github.com/en/articles/about-code-owners#codeowners-syntax
# Order is important; for each modified file, the last matching
# pattern takes the most precedence.
# That is, with the last pattern being
# *.rst @nashif
# if only .rst files are being modified, only nashif is
# automatically requested for review, but you can manually
# add others as needed.
# Do not use wildcard on all source yet
#
# DO NOT EDIT THIS FILE.
# +++++++++++ NOTE ++++++++++++++++
#
# Please use the MAINTAINERS file to add yourself in an area or to add a new
# component or code. This file is going to be deprecated and currently only has
# entries that are not covered by the MAINTAINERS file.
/soc/arm/aspeed/ @aspeeddylan
/soc/arm/atmel_sam/common/*_sam4l_*.c @nandojve
/soc/arm/atmel_sam/sam3x/ @ioannisg
/soc/arm/atmel_sam/sam4e/ @nandojve
/soc/arm/atmel_sam/sam4l/ @nandojve
/soc/arm/atmel_sam/sam4s/ @fallrisk
/soc/arm/atmel_sam/same70/ @nandojve
/soc/arm/atmel_sam/samv71/ @nandojve
/soc/arm/bcm*/ @sbranden
/soc/arm/infineon_cat1/ @ifyall @npal-cy
/soc/arm/infineon_xmc/ @parthitce
/soc/arm/silabs_exx32/efm32pg1b/ @rdmeneze
/soc/arm/silabs_exx32/efr32mg21/ @l-alfred
/soc/arm/st_stm32/ @erwango
/soc/arm/st_stm32/*/power.c @FRASTM
/soc/arm/st_stm32/stm32mp1/ @arnopo
/soc/arm/st_stm32/stm32h7/*stm32h735* @benediktibk
/soc/arm/st_stm32/stm32l4/*stm32l451* @benediktibk
/soc/arm/ti_simplelink/cc13x2_cc26x2/ @bwitherspoon
/soc/arm/ti_simplelink/cc32xx/ @vanti
/soc/arm/ti_simplelink/msp432p4xx/ @Mani-Sadhasivam
/soc/arm/xilinx_zynq7000/ @ibirnbaum
/soc/arm/xilinx_zynqmp/ @stephanosio
/soc/arm/renesas_rcar/ @aaillet
/soc/riscv/openisa*/ @dleach02
/soc/riscv/riscv-privileged/andes_v5/ @cwshu @kevinwang821020 @jimmyzhe
/soc/riscv/riscv-privileged/neorv32/ @henrikbrixandersen
/soc/riscv/riscv-privileged/gd32vf103/ @soburi
/soc/riscv/riscv-privileged/niosv/ @sweeaun
/boards/arm/96b_argonkey/ @avisconti
/boards/arm/96b_avenger96/ @Mani-Sadhasivam
/boards/arm/96b_carbon/ @idlethread
/boards/arm/96b_meerkat96/ @Mani-Sadhasivam
/boards/arm/96b_nitrogen/ @idlethread
/boards/arm/96b_neonkey/ @Mani-Sadhasivam
/boards/arm/96b_stm32_sensor_mez/ @Mani-Sadhasivam
/boards/arm/96b_wistrio/ @Mani-Sadhasivam
/boards/arm/acn52832/ @sven-hm
/boards/arm/arduino_mkrzero/ @soburi
/boards/arm/bbc_microbit_v2/ @LingaoM
/boards/arm/bl5340_dvk/ @lairdjm
/boards/arm/bl65*/ @lairdjm
/boards/arm/blackpill_f401ce/ @coderkalyan
/boards/arm/blackpill_f411ce/ @coderkalyan
/boards/arm/bt*10/ @greg-leach
/boards/arm/cc1352r1_launchxl/ @bwitherspoon
/boards/arm/cc26x2r1_launchxl/ @bwitherspoon
/boards/arm/cc3220sf_launchxl/ @vanti
/boards/arm/cy8ckit_062_ble/ @ifyall @npal-cy
/boards/arm/cy8ckit_062s4/ @DaWei8823
/boards/arm/cy8ckit_062_wifi_bt/ @ifyall @npal-cy
/boards/arm/cy8cproto_062_4343w/ @ifyall @npal-cy
/boards/arm/disco_l475_iot1/ @erwango
/boards/arm/efm32pg_stk3401a/ @rdmeneze
/boards/arm/faze/ @mbittan @simonguinot
/boards/arm/frdm*/ @mmahadevan108 @dleach02
/boards/arm/gd32*/ @nandojve
/boards/arm/google_*/ @jackrosenthal
/boards/arm/hexiwear*/ @mmahadevan108 @dleach02
/boards/arm/ip_k66f/ @parthitce @lmajewski
/boards/arm/legend/ @mbittan @simonguinot
/boards/arm/lpcxpresso*/ @mmahadevan108 @dleach02
/boards/arm/mg100/ @rerickson1
/boards/arm/mimx8mm_evk/ @Mani-Sadhasivam
/boards/arm/mimx8mm_phyboard_polis @pefech
/boards/arm/mimxrt*/ @mmahadevan108 @dleach02
/boards/arm/mps2_an385/ @fvincenzo
/boards/arm/msp_exp432p401r_launchxl/ @Mani-Sadhasivam
/boards/arm/npcx7m6fb_evb/ @MulinChao @ChiHuaL
/boards/arm/nrf*/ @carlescufi @lemrey
/boards/arm/nucleo*/ @erwango @ABOSTM @FRASTM
/boards/arm/nucleo_f401re/ @idlethread
/boards/arm/nuvoton_pfm_m487/ @ssekar15
/boards/arm/pinnacle_100_dvk/ @rerickson1
/boards/arm/qemu_cortex_a9/ @ibirnbaum
/boards/arm/qemu_cortex_r*/ @stephanosio
/boards/arm/qemu_cortex_m*/ @ioannisg @stephanosio
/boards/arm/quick_feather/ @fkokosinski @kgugala
/boards/arm/rak4631_nrf52840/ @gpaquet85
/boards/arm/rak5010_nrf52840/ @gpaquet85
/boards/arm/rpi_pico/ @yonsch
/boards/arm/ronoth_lodev/ @NorthernDean
/boards/arm/xmc45_relax_kit/ @parthitce
/boards/arm/sam4e_xpro/ @nandojve
/boards/arm/sam4l_ek/ @nandojve
/boards/arm/sam4s_xplained/ @fallrisk
/boards/arm/sam_e70_xplained/ @nandojve
/boards/arm/sam_v71_xult/ @nandojve
/boards/arm/scobc_module1/ @yashi
/boards/arm/v2m_beetle/ @fvincenzo
/boards/arm/olimexino_stm32/ @ydamigos
/boards/arm/s32*/ @manuargue
/boards/arm/sensortile_box/ @avisconti
/boards/arm/steval_fcu001v1/ @Navin-Sankar
/boards/arm/stm32l1_disco/ @karlp
/boards/arm/stm32*_disco/ @erwango @ABOSTM @FRASTM
/boards/arm/stm32h735g_disco/ @benediktibk
/boards/arm/stm32f3_disco/ @ydamigos
/boards/arm/stm32*_eval/ @erwango @ABOSTM @FRASTM
/boards/arm/rcar_*/ @aaillet
/boards/arm/ubx_bmd345eval_nrf52840/ @Navin-Sankar @brec-u-blox
/boards/arm/nrf5340_audio_dk_nrf5340 @koffes @alexsven @erikrobstad @rick1082 @gWacey
/boards/arm/stm32_min_dev/ @sidcha
/boards/riscv/rv32m1_vega/ @dleach02
/boards/riscv/adp_xc7k_ae350/ @cwshu @kevinwang821020 @jimmyzhe
/boards/riscv/longan_nano/ @soburi
/boards/riscv/neorv32/ @henrikbrixandersen
/boards/riscv/niosv*/ @sweeaun
/boards/riscv/sparkfun_red_v_things_plus/ @soburi
/boards/riscv/stamp_c3/ @soburi
/boards/shields/atmel_rf2xx/ @nandojve
/boards/shields/esp_8266/ @nandojve
/boards/shields/inventek_eswifi/ @nandojve
/boards/xtensa/odroid_go/ @ydamigos
/boards/xtensa/nxp_adsp_imx8/ @iuliana-prodan @dbaluta
/boards/xtensa/kincony_kc868_a32/ @bbilas
/boards/arm64/qemu_cortex_a53/ @carlocaione
/boards/arm64/bcm958402m2_a72/ @abhishek-brcm
/boards/arm64/mimx8mm_evk/ @MrVan @JiafeiPan
/boards/arm64/mimx8mp_evk/ @MrVan @JiafeiPan
/boards/arm64/mimx8mn_evk/ @JiafeiPan
/boards/arm64/mimx93_evk/ @JiafeiPan
/boards/arm64/nxp_ls1046ardb/ @JiafeiPan
/boards/arm64/xenvm/ @lorc @firscity
/boards/arm64/fvp_baser_aemv8r/ @povergoing
/boards/arm64/fvp_base_revc_2xaemv8a/ @carlocaione
/boards/arm64/intel_socfpga_agilex_socdk/ @siclim @ngboonkhai
/boards/arm64/intel_socfpga_agilex5_socdk/ @teikheng @gdengi
/boards/arm64/rcar_*/ @lorc @xakep-amatop
# All cmake related files
/doc/develop/tools/coccinelle.rst @himanshujha199640 @JuliaLawall
/doc/services/device_mgmt/smp_protocol.rst @de-nordic @nordicjm
/doc/services/device_mgmt/smp_groups/ @de-nordic @nordicjm
/doc/services/sensing/ @lixuzha @ghu0510 @qianruh
/doc/CMakeLists.txt @carlescufi
/doc/_scripts/ @carlescufi
/drivers/*/*sam4l* @nandojve
/drivers/*/*cc13xx_cc26xx* @bwitherspoon
/drivers/*/*gd32* @nandojve
/drivers/*/*litex* @mateusz-holenko @kgugala @pgielda
/drivers/*/*mcux* @mmahadevan108 @dleach02
/drivers/*/*stm32* @erwango @ABOSTM @FRASTM
/drivers/*/*native_posix* @aescolar @daor-oti
/drivers/*/*lpc11u6x* @mbittan @simonguinot
/drivers/*/*npcx* @MulinChao @ChiHuaL
/drivers/*/*andes* @cwshu @kevinwang821020 @jimmyzhe
/drivers/*/*ifx_cat1* @ifyall @npal-cy
/drivers/*/*neorv32* @henrikbrixandersen
/drivers/*/*_s32* @manuargue
/drivers/adc/adc_ads1x1x.c @XenuIsWatching
/drivers/adc/adc_stm32.c @cybertale
/drivers/adc/adc_rpi_pico.c @soburi
/drivers/adc/*ads114s0x* @benediktibk
/drivers/adc/*max11102_17* @benediktibk
/drivers/adc/adc_ad5592.c @bbilas
/drivers/audio/*nrfx* @anangl
/drivers/auxdisplay/*pt6314* @xingrz
/drivers/auxdisplay/* @thedjnK
/drivers/bbram/* @yperess @sjg20 @jackrosenthal
/drivers/bluetooth/ @alwa-nordic @jhedberg @Vudentz
/drivers/bluetooth/hci/hci_esp32.c @sylvioalves
/drivers/can/*mcp2515* @karstenkoenig
/drivers/can/*rcar* @aaillet
/drivers/clock_control/*agilex* @siclim @gdengi
/drivers/clock_control/*nrf* @nordic-krch
/drivers/clock_control/*esp32* @extremegtx @sylvioalves
/drivers/clock_control/*cpg_mssr* @aaillet
/drivers/console/ipm_console.c @finikorg
/drivers/console/semihost_console.c @luozhongyao
/drivers/console/jailhouse_debug_console.c @MrVan
/drivers/counter/counter_cmos.c @dcpleung
/drivers/counter/counter_ll_stm32_timer.c @kentjhall
/drivers/counter/*esp32* @sylvioalves
/drivers/counter/dw_timer.c @pbalsundar
/drivers/counter/counter_timer_shell.c @pbalsundar
/drivers/crypto/*nrf_ecb* @maciekfabia @anangl
/drivers/display/*rm68200* @mmahadevan108
/drivers/display/display_ili9342c.* @extremegtx
/drivers/dac/*ad56xx* @benediktibk
/drivers/dac/dac_ad5592.c @bbilas
/drivers/dai/ @kv2019i @marcinszkudlinski @abonislawski
/drivers/dai/intel/ @kv2019i @marcinszkudlinski @abonislawski
/drivers/dai/intel/ssp/ @kv2019i @marcinszkudlinski @abonislawski
/drivers/dai/intel/dmic/ @marcinszkudlinski @abonislawski
/drivers/dai/intel/alh/ @abonislawski
/drivers/dma/*dw* @tbursztyka
/drivers/dma/*dw_common* @abonislawski
/drivers/dma/*sam0* @Sizurka
/drivers/dma/dma_stm32* @cybertale @lowlander
/drivers/dma/*pl330* @raveenp
/drivers/dma/*iproc_pax* @raveenp
/drivers/dma/*intel_adsp* @teburd @abonislawski
/drivers/dma/*rpi_pico* @soburi
/drivers/dma/*xmc4xxx* @talih0
/drivers/eeprom/eeprom_stm32.c @KwonTae-young
/drivers/entropy/*b91* @andy-liu-telink
/drivers/entropy/*bt_hci* @JordanYates
/drivers/entropy/*rv32m1* @dleach02
/drivers/entropy/*litex* @mateusz-holenko @kgugala @pgielda
/drivers/ethernet/*dwmac* @npitre
/drivers/ethernet/*stm32* @Nukersson @lochej
/drivers/ethernet/*w5500* @parthitce
/drivers/ethernet/*xlnx_gem* @ibirnbaum
/drivers/ethernet/*smsc91x* @sgrrzhf
/drivers/ethernet/*adin2111* @GeorgeCGV
/drivers/ethernet/*oa_tc6* @lmajewski
/drivers/ethernet/*lan865x* @lmajewski
/drivers/ethernet/phy/ @rlubos @tbursztyka @arvinf @jukkar
/drivers/ethernet/phy/*adin2111* @GeorgeCGV
/drivers/mdio/*adin2111* @GeorgeCGV
/drivers/flash/*stm32_qspi* @lmajewski
/drivers/flash/*b91* @andy-liu-telink
/drivers/flash/*cadence* @ngboonkhai
/drivers/flash/*cc13xx_cc26xx* @pepe2k
/drivers/flash/*nrf* @de-nordic
/drivers/flash/*esp32* @sylvioalves
/drivers/flash/flash_cadence_nand* @nbalabak
/drivers/gpio/*b91* @andy-liu-telink
/drivers/gpio/*lmp90xxx* @henrikbrixandersen
/drivers/gpio/*nct38xx* @MulinChao @ChiHuaL
/drivers/gpio/*stm32* @erwango
/drivers/gpio/*eos_s3* @fkokosinski @kgugala
/drivers/gpio/*rcar* @aaillet
/drivers/gpio/*esp32* @sylvioalves
/drivers/gpio/*rpi_pico* @yonsch
/drivers/gpio/*xlnx_ps* @ibirnbaum
/drivers/gpio/*ads114s0x* @benediktibk
/drivers/gpio/*bd8lb600fs* @benediktibk
/drivers/gpio/*pcal64xxa* @benediktibk
/drivers/gpio/gpio_altera_pio.c @shilinte
/drivers/gpio/gpio_ad5592.c @bbilas
/drivers/i2c/i2c_common.c @sjg20
/drivers/i2c/i2c_emul.c @sjg20
/drivers/i2c/i2c_ite_enhance.c @GTLin08
/drivers/i2c/i2c_ite_it8xxx2.c @GTLin08
/drivers/i2c/i2c_shell.c @nashif
/drivers/i2c/Kconfig.i2c_emul @sjg20
/drivers/i2c/Kconfig.it8xxx2 @GTLin08
/drivers/i2c/target/*eeprom* @henrikbrixandersen
/drivers/i2c/Kconfig.test @mbolivar-ampere
/drivers/i2c/i2c_test.c @mbolivar-ampere
/drivers/i2c/*rcar* @aaillet
/drivers/i2s/*litex* @mateusz-holenko @kgugala @pgielda
/drivers/i2s/i2s_ll_stm32* @avisconti
/drivers/i2s/*nrfx* @anangl
/drivers/i3c/i3c_cdns.c @XenuIsWatching
/drivers/ieee802154/ @rlubos @tbursztyka @jukkar @fgrandel
/drivers/ieee802154/*b91* @andy-liu-telink
/drivers/ieee802154/ieee802154_nrf5* @jciupis
/drivers/ieee802154/ieee802154_rf2xx* @tbursztyka @nandojve
/drivers/ieee802154/ieee802154_cc13xx* @bwitherspoon @cfriedt @vaishnavachath
/drivers/interrupt_controller/ @dcpleung @nashif
/drivers/interrupt_controller/intc_gic.c @stephanosio
/drivers/interrupt_controller/*esp32* @sylvioalves
/drivers/interrupt_controller/intc_nuclei_eclic.c @soburi
/drivers/ipm/ipm_mhu* @karl-zh
/drivers/ipm/Kconfig.nrfx @masz-nordic
/drivers/ipm/Kconfig.nrfx_ipc_channel @masz-nordic
/drivers/ipm/ipm_cavs* @dcpleung @andyross
/drivers/ipm/ipm_nrfx_ipc.c @masz-nordic
/drivers/ipm/ipm_nrfx_ipc.h @masz-nordic
/drivers/ipm/ipm_stm32_ipcc.c @arnopo
/drivers/ipm/ipm_stm32_hsem.c @cameled
/drivers/ipm/ipm_esp32.c @uLipe
/drivers/ipm/ipm_ivshmem.c @uLipe
/drivers/kscan/*xec* @franciscomunoz @sjvasanth1
/drivers/kscan/*ft5336* @MaureenHelm
/drivers/kscan/*ht16k33* @henrikbrixandersen
/drivers/led_strip/ @mbolivar-ampere
/drivers/mfd/mfd_ad5592.c @bbilas
/drivers/mfd/mfd_max20335.c @bbilas
/drivers/misc/ft8xx/ @hubertmis
/drivers/modem/hl7800.c @rerickson1
/drivers/modem/simcom-sim7080.c @lgehreke
/drivers/modem/simcom-sim7080.h @lgehreke
/drivers/modem/Kconfig.hl7800 @rerickson1
/drivers/modem/Kconfig.simcom-sim7080 @lgehreke
/drivers/pinctrl/*esp32* @sylvioalves
/drivers/pinctrl/*it8xxx2* @ite
/drivers/pm_cpu_ops/psci_shell.c @nbalabak @gdengi
/drivers/power_domain/ @ceolin
/drivers/ps2/*xec* @franciscomunoz @sjvasanth1
/drivers/ps2/*npcx* @MulinChao @ChiHuaL
/drivers/pwm/*b91* @andy-liu-telink
/drivers/pwm/*pca9685* @nixward
/drivers/pwm/*rpi_pico* @burumaj
/drivers/pwm/*rv32m1* @henrikbrixandersen
/drivers/pwm/*sam0* @nzmichaelh
/drivers/pwm/*test* @JordanYates
/drivers/pwm/*xlnx* @henrikbrixandersen
/drivers/pwm/pwm_capture.c @henrikbrixandersen
/drivers/pwm/pwm_shell.c @henrikbrixandersen
/drivers/pwm/*gecko* @sun681
/drivers/pwm/*it8xxx2* @RuibinChang
/drivers/pwm/*esp32* @LucasTambor
/drivers/pwm/*rcar* @aaillet
/drivers/pwm/*max31790* @benediktibk
/drivers/regulator/* @gmarull
/drivers/regulator/regulator_max20335.c @bbilas
/drivers/regulator/regulator_pca9420.c @danieldegrasse
/drivers/regulator/regulator_rpi_pico.c @soburi
/drivers/regulator/regulator_shell.c @danieldegrasse
/drivers/reset/reset_intel_socfpga.c @nbalabak
/drivers/reset/Kconfig.intel_socfpga @nbalabak
/drivers/sensor/ams_iAQcore/ @alexanderwachter
/drivers/sensor/ens210/ @alexanderwachter
/drivers/sensor/grow_r502a/ @DineshDK03
/drivers/sensor/hts*/ @avisconti
/drivers/sensor/ina23*/ @bbilas
/drivers/sensor/lis*/ @avisconti
/drivers/sensor/lps*/ @avisconti
/drivers/sensor/lsm*/ @avisconti
/drivers/sensor/mpr/ @sven-hm
/drivers/sensor/qdec_stm32/ @valeriosetti
/drivers/sensor/rpi_pico_temp/ @soburi
/drivers/sensor/st*/ @avisconti
/drivers/serial/*b91* @andy-liu-telink
/drivers/serial/uart_altera_jtag.c @nashif @gohshunjing
/drivers/serial/uart_altera.c @gohshunjing
/drivers/serial/*ns16550* @dcpleung @nashif @gdengi
/drivers/serial/*nrfx* @anangl
/drivers/serial/uart_liteuart.c @mateusz-holenko @kgugala @pgielda
/drivers/serial/Kconfig.mcux_iuart @Mani-Sadhasivam
/drivers/serial/uart_mcux_iuart.c @Mani-Sadhasivam
/drivers/serial/Kconfig.rtt @carlescufi @pkral78
/drivers/serial/uart_rtt.c @carlescufi @pkral78
/drivers/serial/*rpi_pico* @yonsch
/drivers/serial/Kconfig.xlnx @wjliang
/drivers/serial/uart_xlnx_ps.c @wjliang
/drivers/serial/uart_xlnx_uartlite.c @henrikbrixandersen
/drivers/serial/*xmc4xxx* @parthitce
/drivers/serial/*numicro* @ssekar15
/drivers/serial/*apbuart* @julius-barendt
/drivers/serial/*rcar* @aaillet
/drivers/serial/Kconfig.xen @lorc @firscity
/drivers/serial/uart_hvc_xen.c @lorc @firscity
/drivers/serial/uart_hvc_xen_consoleio.c @lorc @firscity
/drivers/serial/Kconfig.it8xxx2 @GTLin08
/drivers/serial/uart_ite_it8xxx2.c @GTLin08
/drivers/serial/*intel_lw* @shilinte
/drivers/disk/sdmmc_sdhc.h @JunYangNXP
/drivers/disk/sdmmc_stm32.c @anthonybrandon
/drivers/ptp_clock/ @tbursztyka @jukkar
/drivers/spi/*b91* @andy-liu-telink
/drivers/spi/spi_rv32m1_lpspi* @karstenkoenig
/drivers/spi/*esp32* @sylvioalves
/drivers/spi/*pl022* @soburi
/drivers/sdhc/ @danieldegrasse
/drivers/sdhc/sdhc_cdns* @roymurlidhar @tanmaykathpalia
/drivers/timer/*arm_arch* @carlocaione
/drivers/timer/*cortex_m_systick* @anangl
/drivers/timer/*altera_avalon* @nashif
/drivers/timer/*riscv_machine* @kgugala @pgielda
/drivers/timer/*ite_it8xxx2* @ite
/drivers/timer/*xlnx_psttc* @wjliang @stephanosio
/drivers/timer/*cc13xx_cc26xx_rtc* @vanti
/drivers/timer/*cavs* @dcpleung
/drivers/timer/*stm32_lptim* @FRASTM
/drivers/timer/*leon_gptimer* @julius-barendt
/drivers/timer/*mips_cp0* @frantony
/drivers/timer/*rcar_cmt* @aaillet
/drivers/timer/*esp32c3_sys* @uLipe
/drivers/timer/*sam0_rtc* @bendiscz
/drivers/timer/*arcv2* @ruuddw
/drivers/timer/*xtensa* @dcpleung
/drivers/timer/*rv32m1_lptmr* @mbolivar
/drivers/timer/*nrf_rtc* @anangl
/drivers/timer/*hpet* @dcpleung
/drivers/usb/device/usb_dc_stm32.c @ydamigos @loicpoulain
/drivers/i2c/*b91* @andy-liu-telink
/drivers/i2c/i2c_ll_stm32* @ydamigos
/drivers/i2c/i2c_rv32m1_lpi2c* @henrikbrixandersen
/drivers/i2c/*sam0* @Sizurka
/drivers/i2c/i2c_dw* @dcpleung
/drivers/i2c/*tca954x* @kurddt
/drivers/*/*xec* @franciscomunoz @albertofloyd @sjvasanth1
/drivers/watchdog/*gecko* @oanerer
/drivers/watchdog/*sifive* @katsuster
/drivers/watchdog/wdt_handlers.c @dcpleung @nashif
/drivers/watchdog/*cc32xx* @pavlohamov
/drivers/watchdog/wdt_ite_it8xxx2.c @RuibinChang
/drivers/watchdog/Kconfig.it8xxx2 @RuibinChang
/drivers/watchdog/wdt_counter.c @nordic-krch
/drivers/watchdog/*rpi_pico* @thedjnK
/drivers/watchdog/*dw* @softwarecki @pbalsundar
/drivers/watchdog/*ifx* @sreeramIfx
/drivers/wifi/esp_at/ @mniestroj
/drivers/wifi/eswifi/ @loicpoulain @nandojve
/drivers/wifi/winc1500/ @kludentwo
/drivers/virtualization/ @tbursztyka
/dts/arc/ @abrodkin @ruuddw @iriszzw @evgeniy-paltsev
/dts/arm/acsip/ @NorthernDean
/dts/arm/aspeed/ @aspeeddylan
/dts/arm/atmel/sam4e* @nandojve
/dts/arm/atmel/sam4l* @nandojve
/dts/arm/atmel/samr21.dtsi @benpicco
/dts/arm/atmel/sam*5*.dtsi @benpicco
/dts/arm/atmel/same70* @nandojve
/dts/arm/atmel/samv71* @nandojve
/dts/arm/atmel/ @galak
/dts/arm/broadcom/ @sbranden
/dts/arm/cypress/ @ifyall @npal-cy
/dts/arm/gd/ @nandojve
/dts/arm/infineon/xmc4* @parthitce @ifyall @npal-cy
/dts/arm/infineon/psoc6/ @ifyall @npal-cy
/dts/arm64/armv8-r.dtsi @povergoing
/dts/arm64/intel/*intel_socfpga* @siclim
/dts/arm64/nxp/ @JiafeiPan
/dts/arm64/renesas/ @lorc @xakep-amatop
/dts/arm/quicklogic/ @fkokosinski @kgugala
/dts/arm/seeed/ @str4t0m
/dts/arm/st/ @erwango
/dts/arm/st/h7/*stm32h735* @benediktibk
/dts/arm/st/l4/*stm32l451* @benediktibk
/dts/arm/ti/cc13?2* @bwitherspoon
/dts/arm/ti/cc26?2* @bwitherspoon
/dts/arm/ti/cc3235* @vanti
/dts/arm/nordic/ @anangl @carlescufi
/dts/arm/nuvoton/ @ssekar15 @MulinChao @ChiHuaL
/dts/arm/nxp/ @mmahadevan108 @dleach02
/dts/arm/nxp/nxp_s32* @manuargue
/dts/arm/microchip/ @franciscomunoz @albertofloyd @sjvasanth1
/dts/arm/rpi_pico/ @yonsch
/dts/arm/silabs/efm32_pg_1b.dtsi @rdmeneze
/dts/arm/silabs/efm32gg11b* @oanerer
/dts/arm/silabs/efr32bg13p* @mnkp
/dts/arm/silabs/efr32bg22* @kgugala @fkokosinski @pczarnecki
/dts/arm/silabs/efr32xg13p* @mnkp
/dts/arm/silabs/efm32pg1b* @rdmeneze
/dts/arm/silabs/efr32mg21* @l-alfred
/dts/arm/silabs/efr32fg13* @yonsch
/dts/riscv/ite/ @ite
/dts/riscv/microchip/microchip-miv.dtsi @galak
/dts/riscv/openisa/rv32m1* @dleach02
/dts/riscv/riscv32-litex-vexriscv.dtsi @mateusz-holenko @kgugala @pgielda
/dts/riscv/starfive/ @rajnesh-kanwal
/dts/riscv/andes/andes_v5* @cwshu @kevinwang821020 @jimmyzhe
/dts/riscv/niosv/ @sweeaun
/dts/arm/armv*m.dtsi @galak @ioannisg
/dts/arm/armv7-a.dtsi @ibirnbaum
/dts/arm/armv7-r.dtsi @bbolen @stephanosio
/dts/arm/xilinx/ @bbolen @stephanosio
/dts/arm/renesas/rcar/ @aaillet
/dts/xtensa/xtensa.dtsi @ydamigos
/dts/xtensa/intel/ @dcpleung
/dts/xtensa/espressif/ @sylvioalves
/dts/xtensa/nxp/ @iuliana-prodan @dbaluta
/dts/bindings/can/ @alexanderwachter @henrikbrixandersen
/dts/bindings/i2c/zephyr*i2c-emul*.yaml @sjg20
/dts/bindings/adc/st*stm32-adc.yaml @cybertale
/dts/bindings/adc/*ads114s08.yaml @benediktibk
/dts/bindings/adc/*max111* @benediktibk
/dts/bindings/modem/*hl7800.yaml @rerickson1
/dts/bindings/serial/ns16550.yaml @dcpleung @nashif
/dts/bindings/counter/snps,dw-timers.yaml @pbalsundar
/dts/bindings/wifi/*esp-at.yaml @mniestroj
/dts/bindings/*/*gd32* @nandojve
/dts/bindings/*/*npcx* @MulinChao @ChiHuaL
/dts/bindings/*/*psoc6* @ifyall @npal-cy
/dts/bindings/*/*infineon*cat1* @ifyall @npal-cy
/dts/bindings/*/nordic* @anangl
/dts/bindings/*/nxp* @mmahadevan108 @dleach02
/dts/bindings/*/nxp*s32* @manuargue
/dts/bindings/*/openisa* @dleach02
/dts/bindings/*/raspberrypi*pico* @yonsch
/dts/bindings/*/st* @erwango
/dts/bindings/sensor/ams* @alexanderwachter
/dts/bindings/*/sifive* @mateusz-holenko @kgugala @pgielda
/dts/bindings/*/litex* @mateusz-holenko @kgugala @pgielda
/dts/bindings/*/vexriscv* @mateusz-holenko @kgugala @pgielda
/dts/bindings/*/andes* @cwshu @kevinwang821020 @jimmyzhe
/dts/bindings/*/neorv32* @henrikbrixandersen
/dts/bindings/*/*lan91c111* @sgrrzhf
/dts/bindings/i3c/ @dcpleung
/dts/bindings/pm_cpu_ops/* @carlocaione
/dts/bindings/ethernet/*gem.yaml @ibirnbaum
/dts/bindings/auxdisplay/*pt6314.yaml @xingrz
/dts/bindings/auxdisplay/* @thedjnK
/dts/bindings/sensor/*bme680* @BoschSensortec
/dts/bindings/sensor/*ina23* @bbilas
/dts/bindings/sensor/st* @avisconti
/dts/bindings/sensor/zephyr,sensing.yaml @lixuzha @ghu0510 @qianruh
/dts/bindings/sensor/zephyr,sensing*.yaml @lixuzha @ghu0510 @qianruh
/dts/bindings/smbus/ @finikorg
/dts/bindings/sip_svc/ @maheshraotm
/dts/bindings/cpu/intel,niosv.yaml @sweeaun
/dts/bindings/reset/intel,socfpga-reset.yaml @nbalabak
/dts/bindings/gpio/*pcal64* @benediktibk
/dts/bindings/gpio/*bd8lb600fs* @benediktibk
/dts/bindings/gpio/*ads114s0x* @benediktibk
/dts/bindings/pwm/*max31790* @benediktibk
/dts/bindings/dac/*ad56* @benediktibk

Some files were not shown because too many files have changed in this diff.